CN115190280B - Device and method for determining area of fused projection image - Google Patents

Info

Publication number
CN115190280B
CN115190280B (application CN202210760219.6A)
Authority
CN
China
Prior art keywords
vertex
image
quadrangle
determining
projection image
Prior art date
Legal status
Active
Application number
CN202210760219.6A
Other languages
Chinese (zh)
Other versions
CN115190280A
Inventor
王小路
Current Assignee
Beijing Eswin Computing Technology Co Ltd
Haining Eswin IC Design Co Ltd
Original Assignee
Beijing Eswin Computing Technology Co Ltd
Haining Eswin IC Design Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Eswin Computing Technology Co Ltd, Haining Eswin IC Design Co Ltd filed Critical Beijing Eswin Computing Technology Co Ltd
Priority to CN202210760219.6A
Publication of CN115190280A
Application granted
Publication of CN115190280B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/12 - Picture reproducers
    • H04N 9/31 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 - Video signal processing therefor
    • H04N 9/3185 - Geometric adjustment, e.g. keystone or convergence
    • H04N 9/3141 - Constructional details thereof
    • H04N 9/3147 - Multi-projection systems

Abstract

The application discloses a device and a method for determining the area of a fused projection image, and relates to the technical field of image projection. The device of the application comprises: an acquisition unit configured to acquire a first image, a second image, and a third image, wherein the first image includes a first projection image, the second image includes a second projection image, and the third image includes a fused projection image; a first determining unit, configured to determine a position coordinate corresponding to each vertex included in the first projection image and a position coordinate corresponding to each vertex included in the second projection image; the second determining unit is used for determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image; and the third determining unit is used for determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.

Description

Device and method for determining area of fused projection image
Technical Field
The application relates to the technical field of image projection, in particular to a device and a method for determining the area of a fused projection image.
Background
Projection stitching and fusion technology fuses the images projected by multiple projectors so that a seamless fused projection image with a larger area and higher resolution is formed on the projection screen. Before the projectors are formally used, the projection position of each projector must be adjusted to ensure that the fused projection image projected by the projectors meets the requirements.
At present, workers adjust the projection positions of the projectors manually. During this adjustment, a worker must manually measure the area of the fused projection image projected by two adjacent projectors; if that area does not meet the requirements, the worker must keep adjusting the projection positions of the two adjacent projectors until it does. Because manually measuring the area of the fused projection image projected by two adjacent projectors is inefficient, adjusting the projection positions of multiple projectors is inefficient as well.
Disclosure of Invention
The embodiment of the application provides a device and a method for determining the area of a fused projection image, which mainly aim to improve the efficiency of adjusting the projection positions of a plurality of projectors by a worker.
In order to solve the technical problems, the embodiment of the application provides the following technical scheme:
in a first aspect, the present application provides an apparatus for determining a fused projection image area, the apparatus comprising:
the device comprises an acquisition unit, wherein the acquisition unit is used for acquiring a first image, a second image and a third image, the first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fusion projection image, the first projection image is an image obtained after a first projector projects to the target projection area, the second projection image is an image obtained after a second projector projects to the target projection area, and the fusion projection image is an image obtained after the first projector and the second projector project to the target projection area at the same time;
A first determining unit, configured to determine a position coordinate corresponding to each vertex included in the first projection image and a position coordinate corresponding to each vertex included in the second projection image, where the position coordinate corresponding to each vertex included in the first projection image is a position coordinate of each vertex included in the first projection image in the first image, and the position coordinate corresponding to each vertex included in the second projection image is a position coordinate of each vertex included in the second projection image in the second image;
the second determining unit is used for determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image;
and the third determining unit is used for determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.
Optionally, the second determining unit includes:
the generating module is used for generating a fourth image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image, wherein the fourth image has the same image area as the first image, the second image and the third image, and the fourth image contains a first quadrangle and a second quadrangle;
A first determining module, configured to determine a positional relationship between each vertex included in the first quadrilateral and the second quadrilateral, and a positional relationship between each vertex included in the second quadrilateral and the first quadrilateral, according to a positional coordinate corresponding to each vertex included in the first quadrilateral and a positional coordinate corresponding to each vertex included in the second quadrilateral;
a second determining module, configured to determine, as a first vertex, a vertex that is included in the first quadrilateral and that is located in the second quadrilateral, determine, as a second vertex, a vertex that is included in the first quadrilateral and that is located outside the second quadrilateral, determine, as a third vertex, a vertex that is included in the second quadrilateral and that is located in the first quadrilateral, and determine, as a fourth vertex, a vertex that is included in the second quadrilateral and that is located outside the first quadrilateral;
the third determining module is used for determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex;
and the fourth determining module is used for determining the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle as the position coordinates corresponding to each vertex contained in the fused projection image.
Optionally, the first determining module is specifically configured to:
drawing a target direction ray corresponding to each vertex contained in the first quadrangle and a target direction ray corresponding to each vertex contained in the second quadrangle in the fourth image;
determining the number of intersection points of the target direction rays corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points of the target direction rays corresponding to each vertex contained in the second quadrangle and the first quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the first quadrangle and the second quadrangle is odd, determining that the vertex contained in the first quadrangle is inside the second quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the first quadrangle and the second quadrangle is even, determining that the vertex contained in the first quadrangle is outside the second quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the second quadrangle and the first quadrangle is odd, determining that the vertex contained in the second quadrangle is inside the first quadrangle;
And if the number of intersection points between the target direction ray corresponding to a vertex contained in the second quadrangle and the first quadrangle is even, determining that the vertex contained in the second quadrangle is outside the first quadrangle.
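The ray-casting (even-odd) rule described in these steps can be sketched as follows. This is an illustration rather than the patent's literal implementation: the function name is our own, and we assume the "target direction" is a horizontal ray cast to the right from the vertex.

```python
def point_in_polygon(point, polygon):
    """Even-odd test: cast a horizontal ray to the right from `point` and
    count how many polygon edges it crosses; an odd count means the point
    is inside, an even count means it is outside.
    `polygon` is a list of (x, y) vertices in boundary order."""
    px, py = point
    crossings = 0
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Half-open test on y so a vertex shared by two edges is not counted twice
        if (y1 > py) != (y2 > py):
            # x coordinate where this edge crosses the horizontal line y = py
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > px:  # the crossing lies on the rightward ray
                crossings += 1
    return crossings % 2 == 1
```

Cast against a square with vertices (0,0), (2,0), (2,2), (0,2), the point (1,1) yields one crossing (odd, inside) while (3,1) yields none (even, outside), matching the rule stated above.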
Optionally, the third determining module is specifically configured to:
if the first vertex exists, judging whether the edge where the first vertex is located intersects each edge of the second quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the first vertex is located and that edge of the second quadrangle;
if the third vertex exists, judging whether the edge where the third vertex is located intersects each edge of the first quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the third vertex is located and that edge of the first quadrangle.
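The edge-edge intersection coordinates can be computed by solving the two parametric line equations; the patent does not prescribe a formula, so the following is one standard way to do it, with our own function name:

```python
def segment_intersection(p1, p2, p3, p4):
    """Intersection point of segment p1-p2 with segment p3-p4, or None if
    the segments are parallel or do not cross. Writes each segment in
    parametric form and solves for the parameters t and u."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if denom == 0:  # parallel or collinear: no single crossing point
        return None
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    # The crossing counts only if it lies within both segments
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None
```

For example, the diagonals of the unit square from (0,0) to (2,2) and from (0,2) to (2,0) intersect at (1.0, 1.0).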
Optionally, the third determining unit includes:
a fifth determining module, configured to determine an image area corresponding to the fused projection image according to a position coordinate corresponding to each vertex included in the fused projection image;
the acquisition module is used for acquiring the actual area corresponding to the target projection area and the image area corresponding to the third image;
And a sixth determining module, configured to determine an actual area corresponding to the fused projection image according to the image area corresponding to the fused projection image, the actual area corresponding to the target projection area, and the image area corresponding to the third image.
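The image area that the fifth determining module computes from the ordered vertex coordinates can be obtained with the shoelace formula. The patent does not name a specific formula, so this is a sketch under the assumption that the vertices are given in boundary order (clockwise or counter-clockwise):

```python
def polygon_area(vertices):
    """Shoelace formula: area of a simple polygon in image coordinates,
    given its vertices in boundary order as (x, y) pairs."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the polygon
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

A 2x2 square gives 4.0 and the right triangle (0,0), (3,0), (0,3) gives 4.5, as expected.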
In a second aspect, the present application further provides a method for determining an area of a fused projection image, the method comprising:
acquiring a first image, a second image and a third image, wherein the first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fusion projection image, the first projection image is an image obtained after a first projector projects onto the target projection area, the second projection image is an image obtained after a second projector projects onto the target projection area, and the fusion projection image is an image obtained after the first projector and the second projector project onto the target projection area at the same time;
determining position coordinates corresponding to each vertex contained in the first projection image and position coordinates corresponding to each vertex contained in the second projection image, wherein the position coordinates corresponding to each vertex contained in the first projection image are position coordinates of each vertex contained in the first projection image in the first image, and the position coordinates corresponding to each vertex contained in the second projection image are position coordinates of each vertex contained in the second projection image in the second image;
Determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image;
and determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.
Optionally, the determining, according to the position coordinate corresponding to each vertex included in the first projection image and the position coordinate corresponding to each vertex included in the second projection image, the position coordinate corresponding to each vertex included in the fused projection image includes:
generating a fourth image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image, wherein the fourth image has the same image area as the first image, the second image and the third image, and the fourth image contains a first quadrangle and a second quadrangle;
determining the position relation between each vertex contained in the first quadrangle and the second quadrangle and the position relation between each vertex contained in the second quadrangle and the first quadrangle according to the position coordinates corresponding to each vertex contained in the first quadrangle and the position coordinates corresponding to each vertex contained in the second quadrangle;
Determining a vertex which is included in the first quadrangle and is positioned in the second quadrangle as a first vertex, determining a vertex which is included in the first quadrangle and is positioned outside the second quadrangle as a second vertex, determining a vertex which is included in the second quadrangle and is positioned in the first quadrangle as a third vertex, and determining a vertex which is included in the second quadrangle and is positioned outside the first quadrangle as a fourth vertex;
determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex;
and determining the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle as the position coordinates corresponding to each vertex contained in the fused projection image.
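Putting the steps above together, the fused image's vertex set can be sketched as follows. The function names are our own, and the final angular ordering about the centroid is our addition (the patent does not prescribe an ordering step); it is adequate when the union of the two quadrangles is convex, which holds for typical side-by-side projector layouts:

```python
import math

def _inside(pt, quad):
    # Even-odd ray cast: horizontal ray to the right, odd crossings = inside
    px, py = pt
    crossings = 0
    for i in range(len(quad)):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % len(quad)]
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > px:
                crossings += 1
    return crossings % 2 == 1

def _edge_intersection(p1, p2, p3, p4):
    # Parametric segment-segment intersection; None if parallel or disjoint
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if denom == 0:
        return None
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def fused_polygon_vertices(quad1, quad2):
    """Vertices of the fused projection image, per the steps above: the
    second vertices (quad1 vertices outside quad2), the fourth vertices
    (quad2 vertices outside quad1), and every edge-edge intersection
    point, ordered by angle about their centroid."""
    pts = [v for v in quad1 if not _inside(v, quad2)]
    pts += [v for v in quad2 if not _inside(v, quad1)]
    for i in range(4):
        for j in range(4):
            p = _edge_intersection(quad1[i], quad1[(i + 1) % 4],
                                   quad2[j], quad2[(j + 1) % 4])
            if p is not None and p not in pts:  # skip duplicate corner hits
                pts.append(p)
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    pts.sort(key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    return pts
```

For two overlapping 2x2 squares shifted horizontally by 1, the result is the eight boundary points of the 3x2 union rectangle (four original corners plus four crossing points on the shared edges).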
Optionally, the determining the positional relationship between each vertex included in the first quadrangle and the second quadrangle and the positional relationship between each vertex included in the second quadrangle and the first quadrangle according to the positional coordinates corresponding to each vertex included in the first quadrangle and the positional coordinates corresponding to each vertex included in the second quadrangle includes:
Drawing a target direction ray corresponding to each vertex contained in the first quadrangle and a target direction ray corresponding to each vertex contained in the second quadrangle in the fourth image;
determining the number of intersection points of the target direction rays corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points of the target direction rays corresponding to each vertex contained in the second quadrangle and the first quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the first quadrangle and the second quadrangle is odd, determining that the vertex contained in the first quadrangle is inside the second quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the first quadrangle and the second quadrangle is even, determining that the vertex contained in the first quadrangle is outside the second quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the second quadrangle and the first quadrangle is odd, determining that the vertex contained in the second quadrangle is inside the first quadrangle;
and if the number of intersection points between the target direction ray corresponding to a vertex contained in the second quadrangle and the first quadrangle is even, determining that the vertex contained in the second quadrangle is outside the first quadrangle.
Optionally, the determining, according to the position coordinate corresponding to each first vertex or the position coordinate corresponding to each third vertex, the position coordinate corresponding to each intersection point of the first quadrangle and the second quadrangle includes:
if the first vertex exists, judging whether the edge where the first vertex is located intersects each edge of the second quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the first vertex is located and that edge of the second quadrangle;
if the third vertex exists, judging whether the edge where the third vertex is located intersects each edge of the first quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the third vertex is located and that edge of the first quadrangle.
Optionally, the determining, according to the position coordinates corresponding to each vertex included in the fused projection image, an actual area corresponding to the fused projection image includes:
determining the image area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image;
acquiring an actual area corresponding to the target projection area and an image area corresponding to the third image;
And determining the actual area corresponding to the fused projection image according to the image area corresponding to the fused projection image, the actual area corresponding to the target projection area and the image area corresponding to the third image.
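The final proportional step can be sketched as below. This assumes the third image frames exactly the target projection area, so that areas scale by the ratio of the target's real area to its area in the image; the patent does not state the formula explicitly, and the function name is our own:

```python
def actual_fused_area(fused_image_area, actual_target_area, third_image_area):
    """Scale an image-space area (in pixels) to a physical area: the fused
    projection image occupies the same fraction of the real target
    projection area as it does of the third image."""
    return fused_image_area * actual_target_area / third_image_area
```

For instance, a fused region of 518,400 pixels inside a 1920x1080 photo of an 8 m2 projection screen corresponds to one quarter of the frame, i.e. 2.0 m2.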
In a third aspect, an embodiment of the present application provides a storage medium, where the storage medium includes a stored program, where the program, when executed, controls a device where the storage medium is located to perform the method for determining a fused projection image area according to the second aspect.
In a fourth aspect, embodiments of the present application provide an apparatus for determining a fused projected image area, the apparatus comprising a storage medium; and one or more processors coupled to the storage medium, the processors configured to execute the program instructions stored in the storage medium; the program instructions, when executed, perform the method for determining a fused projected image area of the second aspect.
By means of the technical scheme, the technical scheme provided by the application has the following advantages:
the application provides a device and a method for determining the area of a fused projection image, wherein the device for determining the area of the fused projection image comprises the following components: an acquisition unit, a first determination unit, a second determination unit, and a third determination unit; firstly, acquiring a first image, a second image and a third image in a local storage space of target terminal equipment by an acquisition unit; secondly, determining, by a first determining unit, position coordinates corresponding to each vertex contained in the first projection image according to the first image, and determining position coordinates corresponding to each vertex contained in the second projection image according to the second image; thirdly, the second determining unit determines the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image; and finally, determining the actual area corresponding to the fused projection image by a third determining unit according to the position coordinates corresponding to each vertex contained in the fused projection image. 
In this application, a worker no longer needs to manually measure the actual area of the fused projection image projected by two adjacent projectors: the device for determining the fused projection image area can determine that actual area from the first image containing the first projection image and the second image containing the second projection image. Once the device has determined the actual area of the fused projection image, the target terminal equipment can output and display it, so that the worker knows the actual area of the fused projection image and can decide whether the projection positions of the first projector and the second projector need to be adjusted, which effectively improves the efficiency with which the worker adjusts the projection positions of two adjacent projectors.
The foregoing is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer, so that it may be implemented according to the content of the specification, and to make the above and other objects, features and advantages of the present application easier to understand, a detailed description of the application is given below.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like reference numerals refer to similar or corresponding parts and in which:
FIG. 1 shows a block diagram of an apparatus for determining an area of a fused projection image according to an embodiment of the present application;
FIG. 2 illustrates a block diagram of another apparatus for determining the area of a fused projection image provided by an embodiment of the present application;
fig. 3 is a schematic diagram of drawing, in a fourth image, a target direction ray corresponding to each vertex included in a first quadrangle and a target direction ray corresponding to each vertex included in a second quadrangle according to an embodiment of the present application;
fig. 4 shows a flowchart of a method for determining an area of a fused projection image according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
It is noted that unless otherwise indicated, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
The embodiment of the application provides a device for determining the area of a fused projection image, applied to target terminal equipment that includes a preset camera. The target terminal equipment may be, but is not limited to, a computer, a projection fusion device, or the like. As shown in fig. 1, the device specifically includes: an obtaining unit 11, configured to obtain a first image, a second image and a third image, where the first image, the second image and the third image are all obtained by photographing a target projection area with the preset camera and have the same image area. The target projection area may be, but is not limited to, a complete planar projection screen, part of a planar projection screen, or the like. The first image comprises a first projection image, the second image comprises a second projection image, and the third image comprises a fused projection image, where the first projection image is the image obtained after the first projector projects onto the target projection area, the second projection image is the image obtained after the second projector projects onto the target projection area, and the fused projection image is the image obtained after the first projector and the second projector project onto the target projection area at the same time; a first determining unit 12, configured to determine the position coordinates corresponding to each vertex included in the first projection image and the position coordinates corresponding to each vertex included in the second projection image, where the position coordinates corresponding to each vertex included in the first projection image are the position coordinates of each vertex included in the first projection image in the first image, and the position coordinates corresponding to each vertex included in the second projection image are the position coordinates of each vertex included in the second
projection image in the second image; a second determining unit 13, configured to determine, according to the position coordinate corresponding to each vertex included in the first projection image and the position coordinate corresponding to each vertex included in the second projection image, the position coordinate corresponding to each vertex included in the fused projection image; and a third determining unit 14, configured to determine an actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex included in the fused projection image.
The following describes a detailed procedure for determining the area size of the fused projection image by the fused projection image area determining device in conjunction with fig. 1:
The first projector and the second projector are two adjacent projectors whose projection positions need to be adjusted; that is, they are adjacent vertically or horizontally. The first image, the second image and the third image are captured in advance as follows: the worker first turns on the first projector and controls it to project onto the target projection area, so that the first projector projects the first projection image in the target projection area, and controls the target terminal equipment to capture the first image of the target projection area with the preset camera; next, the worker turns off the first projector, turns on the second projector and controls it to project onto the target projection area, so that the second projector projects the second projection image in the target projection area, and controls the target terminal equipment to capture the second image of the target projection area with the preset camera; finally, the worker turns the first projector back on and controls it to project onto the target projection area again, so that the first projection image projected by the first projector and the second projection image projected by the second projector together form the fused projection image, and controls the target terminal equipment to capture the third image of the target projection area with the preset camera. The first projection image and the second projection image may be, but are not limited to, pure red images, pure blue images, pure white images, and the like.
In the embodiment of the application, the worker controls the target terminal equipment in advance to capture the first image, the second image and the third image with the preset camera, and the captured images are stored in the local storage space of the target terminal equipment. When the worker wants to determine the area of the fused projection image, the worker sends a corresponding instruction to the fused projection image area determining device through an input device of the target terminal equipment. After the device receives the instruction, the acquiring unit 11 acquires the first image, the second image and the third image from the local storage space of the target terminal equipment; the first determining unit 12 may then determine, from the first image, the position coordinates corresponding to each vertex included in the first projection image, and determine, from the second image, the position coordinates corresponding to each vertex included in the second projection image, that is, the position coordinates of each vertex of the first projection image in the first image and of each vertex of the second projection image in the second image; after that, the second determining unit 13 may determine the position coordinates corresponding to each vertex included in the fused projection image, that is, the position coordinates of each vertex of the fused projection image in the third image, according to the position coordinates corresponding to each vertex included
in the first projection image and the position coordinates corresponding to each vertex included in the second projection image; after the second determining unit 13 determines the position coordinates corresponding to each vertex included in the fused projection image, the third determining unit 14 may determine the actual area corresponding to the fused projection image, that is, determine the area of the fused projection image in the target projection area, according to the position coordinates corresponding to each vertex included in the fused projection image, and after the third determining unit 14 determines the actual area of the fused projection image, the target terminal device may output and display the actual area corresponding to the fused projection image, so that a worker knows the actual area corresponding to the fused projection image, thereby determining whether the projection positions of the first projector and the second projector need to be adjusted. The first determining unit 12 may determine, using an edge extraction algorithm, a position coordinate corresponding to each vertex included in the first projection image and a position coordinate corresponding to each vertex included in the second projection image, but is not limited thereto.
It should be noted that, since the position coordinate of each vertex included in the fused projection image in the third image cannot be accurately determined according to the third image including the fused projection image, and the fused projection image is an image formed by the first projection image and the second projection image, and the first image, the second image and the third image are images with the same image area size, the position coordinate of each vertex included in the first projection image in the first image and the position coordinate of each vertex included in the second projection image in the second image can be determined first, and then the position coordinate of each vertex included in the fused projection image in the third image can be determined according to the position coordinate of each vertex included in the first projection image in the first image and the position coordinate of each vertex included in the second projection image in the second image.
The embodiment of the application provides a device for determining the area of a fused projection image, which comprises: an acquisition unit, a first determination unit, a second determination unit, and a third determination unit; firstly, acquiring a first image, a second image and a third image in a local storage space of target terminal equipment by an acquisition unit; secondly, determining, by a first determining unit, position coordinates corresponding to each vertex contained in the first projection image according to the first image, and determining position coordinates corresponding to each vertex contained in the second projection image according to the second image; thirdly, the second determining unit determines the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image; and finally, determining the actual area corresponding to the fused projection image by a third determining unit according to the position coordinates corresponding to each vertex contained in the fused projection image. 
In this embodiment of the present application, the actual area of the fused projection image is determined by the device for determining the area of the fused projection image according to the first image including the first projection image and the second image including the second projection image, without a worker manually measuring the actual area of the fused projection image projected by the two adjacent projectors; after the device determines the actual area of the fused projection image, the target terminal device may output and display it, so that the worker knows the actual area of the fused projection image and can thereby decide whether to adjust the projection positions of the first projector and the second projector, which effectively improves the efficiency with which the worker adjusts the projection positions of the two adjacent projectors.
The embodiment of the application also provides another device for determining the area of the fused projection image, which is applied to the target terminal device; the device is shown in fig. 2 and described below in connection with fig. 2:
further, as shown in fig. 2, the second determination unit 13 includes: a generating module 131, configured to generate a fourth image according to the position coordinates corresponding to each vertex included in the first projection image and the position coordinates corresponding to each vertex included in the second projection image, where the fourth image has the same image area size as the first image, the second image, and the third image, and the fourth image includes a first quadrangle and a second quadrangle; a first determining module 132, configured to determine a positional relationship between each vertex included in the first quadrilateral and the second quadrilateral, and a positional relationship between each vertex included in the second quadrilateral and the first quadrilateral, according to a positional coordinate corresponding to each vertex included in the first quadrilateral and a positional coordinate corresponding to each vertex included in the second quadrilateral; a second determining module 133, configured to determine, as a first vertex, a vertex that is included in the first quadrilateral and that is located in the second quadrilateral, determine, as a second vertex, a vertex that is included in the first quadrilateral and that is located outside the second quadrilateral, determine, as a third vertex, a vertex that is included in the second quadrilateral and that is located in the first quadrilateral, and determine, as a fourth vertex, a vertex that is included in the second quadrilateral and that is located outside the first quadrilateral; a third determining module 134, configured to determine, according to the position coordinate corresponding to each first vertex or the position coordinate corresponding to each third vertex, the position coordinate corresponding to each intersection of the first quadrangle and the second quadrangle; the fourth determining module 135 is configured to determine, as the position coordinates 
corresponding to each vertex included in the fused projection image, the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex, and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle.
In this embodiment of the present application, the specific process of determining, by the second determining unit 13, the position coordinate corresponding to each vertex included in the fused projection image according to the position coordinate corresponding to each vertex included in the first projection image and the position coordinate corresponding to each vertex included in the second projection image is:
(1) Generating, by the generating module 131, a fourth image according to the position coordinates of each vertex contained in the first projection image and the position coordinates of each vertex contained in the second projection image; that is, generating a blank image with the same image area size as the first image, the second image and the third image, and drawing a first quadrangle and a second quadrangle in the blank image according to the position coordinates of each vertex contained in the first projection image and the position coordinates of each vertex contained in the second projection image, so as to obtain the fourth image, wherein the position coordinates of each vertex contained in the first quadrangle in the fourth image correspond one-to-one with the position coordinates of each vertex contained in the first projection image in the first image, and the position coordinates of each vertex contained in the second quadrangle in the fourth image correspond one-to-one with the position coordinates of each vertex contained in the second projection image in the second image; for example, if the position coordinates of the vertices contained in the first projection image in the first image are (30,120), (375,135), (45,600) and (380,620), then the position coordinates of the vertices contained in the first quadrangle in the fourth image are also (30,120), (375,135), (45,600) and (380,620);
(2) The first determining module 132 determines a positional relationship between each vertex included in the first quadrangle and the second quadrangle according to the positional coordinates corresponding to each vertex included in the first quadrangle and the positional coordinates corresponding to each vertex included in the second quadrangle, and determines a positional relationship between each vertex included in the second quadrangle and the first quadrangle, that is, determines whether each vertex included in the first quadrangle is inside or outside the second quadrangle, and determines whether each vertex included in the second quadrangle is outside or inside the first quadrangle;
(3) Next, the second determining module 133 determines, as a first vertex, a vertex which is included in the first quadrangle and is located in the second quadrangle, determines, as a second vertex, a vertex which is included in the first quadrangle and is located outside the second quadrangle, determines, as a third vertex, a vertex which is included in the second quadrangle and is located in the first quadrangle, and determines, as a fourth vertex, a vertex which is included in the second quadrangle and is located outside the first quadrangle;
(4) Then, determining, by the third determining module 134, the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex, that is, determining the position coordinates of each intersection point of the first quadrangle and the second quadrangle in the fourth image;
(5) Finally, the fourth determining module 135 determines, as the position coordinates corresponding to each vertex included in the fused projection image, the position coordinates corresponding to each second vertex, the position coordinates corresponding to each fourth vertex, and the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle; since the third image and the fourth image have the same image area size, and the second vertices, the fourth vertices and the intersection points of the first quadrangle and the second quadrangle in the fourth image correspond one-to-one with the vertices contained in the fused projection image in the third image, the position coordinates of each second vertex in the fourth image, the position coordinates of each fourth vertex in the fourth image, and the position coordinates of each intersection point of the first quadrangle and the second quadrangle in the fourth image can be determined as the position coordinates of each vertex contained in the fused projection image in the third image.
Further, as shown in fig. 2, in the embodiment of the present application, the specific process of determining the positional relationship between each vertex included in the first quadrilateral and the second quadrilateral and determining the positional relationship between each vertex included in the second quadrilateral and the first quadrilateral by the first determining module 132 according to the positional coordinates corresponding to each vertex included in the first quadrilateral and the positional coordinates corresponding to each vertex included in the second quadrilateral is:
First, a target direction ray corresponding to each vertex included in the first quadrangle and a target direction ray corresponding to each vertex included in the second quadrangle are drawn in the fourth image, wherein, for any vertex included in the first quadrangle, the target direction ray corresponding to the vertex is a ray made in a target direction from the vertex, and for any vertex included in the second quadrangle, the target direction ray corresponding to the vertex is a ray made in a target direction from the vertex, and the target direction may be, but is not limited to: the y-axis positive direction, the y-axis negative direction, the x-axis positive direction, the x-axis negative direction, and the like, for example, when the target direction is the y-axis positive direction, the target direction ray corresponding to each vertex included in the first quadrangle ABCD drawn in the fourth image and the target direction ray corresponding to each vertex included in the second quadrangle WXYZ are specifically shown in fig. 3;
secondly, determining the number of intersection points between the target direction ray corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points between the target direction ray corresponding to each vertex contained in the second quadrangle and the first quadrangle; for example, suppose the position coordinate of the vertex A of the first quadrangle is (a, b), the target direction ray corresponding to the vertex A is the ray made from the vertex A in the positive y-axis direction, and the two end points of a certain side of the second quadrangle are X (x1, y1) and Y (x2, y2) with x1 ≤ x2; the slope of the side XY is k = (y2 - y1)/(x2 - x1), so any point (x, y) on the straight line where the side XY is located satisfies y = k(x - x1) + y1; if a < x1 or a > x2, the target direction ray corresponding to the vertex A does not intersect the side XY; if x1 ≤ a ≤ x2 and b ≤ k(a - x1) + y1, the target direction ray corresponding to the vertex A intersects the side XY; otherwise, it does not intersect the side XY;
Finally, if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is odd, determining that the vertexes contained in the first quadrangle are in the second quadrangle; if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is even, determining that the vertexes contained in the first quadrangle are outside the second quadrangle; if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the second quadrangle and the first quadrangle is an odd number, determining that the vertexes contained in the second quadrangle are in the first quadrangle; and if the number of the intersection points of the target direction ray corresponding to the vertexes contained in the second quadrangle and the first quadrangle is even, determining that the vertexes contained in the second quadrangle are outside the first quadrangle.
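The ray-casting parity test described above can be sketched in Python as follows. This is a minimal illustration, not the claimed device: the function names are hypothetical, the ray direction is fixed to the positive y-axis as in the example, and a half-open interval [x1, x2) is used for each side so that a ray passing through a shared vertex of two sides is counted only once.

```python
def ray_crossings(point, quad):
    """Count crossings of the upward (+y) ray from `point` with the sides of
    `quad`, a list of four (x, y) vertices given in boundary order."""
    a, b = point
    n = len(quad)
    crossings = 0
    for i in range(n):
        (x1, y1), (x2, y2) = quad[i], quad[(i + 1) % n]
        if x1 > x2:  # order the endpoints so that x1 <= x2, as in the text
            (x1, y1), (x2, y2) = (x2, y2), (x1, y1)
        if x1 == x2:  # vertical side: the upward ray is parallel to it
            continue
        # half-open test [x1, x2) so a shared vertex is not counted twice
        if x1 <= a < x2:
            k = (y2 - y1) / (x2 - x1)      # slope of the side XY
            if b <= k * (a - x1) + y1:     # the side lies at or above the point
                crossings += 1
    return crossings

def inside(point, quad):
    # an odd number of crossings means the point lies inside the quadrilateral
    return ray_crossings(point, quad) % 2 == 1
```

For an axis-aligned square with corners (0,0) and (10,10), the point (5,5) yields one crossing (odd, inside) while (15,5) yields none (even, outside).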
Further, as shown in fig. 2, in the embodiment of the present application, the specific process of determining, by the third determining module 134 according to the position coordinate corresponding to each first vertex or the position coordinate corresponding to each third vertex, the position coordinate corresponding to each intersection of the first quadrangle and the second quadrangle is:
in an actual application scenario, there are three cases: (1) there is only a first vertex and no third vertex; (2) there is only a third vertex and no first vertex; (3) there is a first vertex and a third vertex;
(1) If only the first vertexes exist and the third vertexes do not exist, the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle can be determined according to the position coordinates corresponding to each first vertex, namely, for any one first vertex, whether the edges of the first vertexes and the second quadrangle intersect or not is judged, and if so, the position coordinates corresponding to the intersection point of the edges of the first vertexes and the edges of the second quadrangle are calculated;
the specific process of judging whether a certain edge (edge AB) where the first vertex is located and a certain edge (edge CD) of the second quadrangle intersect is as follows: firstly, judging whether the edge AB straddles the straight line where the edge CD is located; if not, determining that the edge AB and the edge CD do not intersect; if so, judging whether the edge CD straddles the straight line where the edge AB is located; if not, determining that the edge AB and the edge CD do not intersect; if so, determining that the edge AB and the edge CD intersect. The specific process of judging whether the edge AB straddles the straight line where the edge CD is located may be: judging whether the end point A and the end point B are on two sides of the straight line where the edge CD is located; similarly, the specific process of judging whether the edge CD straddles the straight line where the edge AB is located may be: judging whether the end point C and the end point D are on two sides of the straight line where the edge AB is located. For example, let the position coordinate corresponding to the end point A be (xa, ya), the position coordinate corresponding to the end point B be (xb, yb), the position coordinate corresponding to the end point C be (xc, yc), and the position coordinate corresponding to the end point D be (xd, yd); compute ta = (xc - xa) × (yd - ya) - (yc - ya) × (xd - xa) and tb = (xc - xb) × (yd - yb) - (yc - yb) × (xd - xb); when any one of the following six preset conditions is satisfied, it is determined that the end point A and the end point B are on two sides of the straight line where the edge CD is located; the six preset conditions are specifically: (1) ta < 0, tb = 0; (2) ta < 0, tb > 0; (3) ta = 0, tb < 0; (4) ta = 0, tb > 0; (5) ta > 0, tb = 0; (6) ta > 0, tb < 0.
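The straddle test based on ta and tb can be sketched as follows. This is an illustrative Python sketch with hypothetical names; it collapses the six preset conditions into the equivalent check that ta and tb do not have the same sign and are not both zero.

```python
def side_value(px, py, xc, yc, xd, yd):
    """Cross product (C - P) x (D - P): its sign says on which side of the
    straight line through C and D the point P lies (0 means P is on it)."""
    return (xc - px) * (yd - py) - (yc - py) * (xd - px)

def straddles(ax, ay, bx, by, cx, cy, dx, dy):
    """True when end points A and B are on two sides of the line through CD,
    i.e. when one of the six preset conditions holds."""
    ta = side_value(ax, ay, cx, cy, dx, dy)
    tb = side_value(bx, by, cx, cy, dx, dy)
    # opposite signs, or exactly one of them zero
    return ta * tb <= 0 and (ta != 0 or tb != 0)

def segments_intersect(a, b, c, d):
    """Edge AB and edge CD intersect when each straddles the other's line."""
    return straddles(*a, *b, *c, *d) and straddles(*c, *d, *a, *b)
```

For instance, the diagonals of a square, (0,0)-(10,10) and (0,10)-(10,0), straddle each other's lines and therefore intersect, while (0,0)-(1,1) never reaches the vertical segment (5,0)-(5,10).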
When it is determined that the edge AB where the first vertex is located intersects the edge CD of the second quadrangle, the specific process of calculating the position coordinates corresponding to the intersection point of the edge AB and the edge CD is: firstly, calculating the slope k1 corresponding to the edge AB and the slope k2 corresponding to the edge CD according to the position coordinate (xa, ya) corresponding to the end point A, the position coordinate (xb, yb) corresponding to the end point B, the position coordinate (xc, yc) corresponding to the end point C and the position coordinate (xd, yd) corresponding to the end point D, wherein k1 = (yb - ya)/(xb - xa) and k2 = (yd - yc)/(xd - xc); secondly, calculating the abscissa xm and the ordinate ym corresponding to the intersection point of the edge AB and the edge CD according to the position coordinate (xb, yb) corresponding to the end point B, the position coordinate (xd, yd) corresponding to the end point D, the slope k1 and the slope k2, wherein xm = (k1 × xb - k2 × xd + yd - yb)/(k1 - k2) and ym = k1 × (xm - xb) + yb, but not limited thereto;
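The slope-based intersection formulas above can be sketched as follows. This is an illustrative Python sketch (function name hypothetical); like the formulas themselves, it assumes neither edge is vertical and that k1 ≠ k2.

```python
def intersection_point(a, b, c, d):
    """Intersection of the lines through edge AB and edge CD, using the
    slope formulas in the text; assumes non-vertical edges with k1 != k2."""
    (xa, ya), (xb, yb), (xc, yc), (xd, yd) = a, b, c, d
    k1 = (yb - ya) / (xb - xa)   # slope of edge AB
    k2 = (yd - yc) / (xd - xc)   # slope of edge CD
    xm = (k1 * xb - k2 * xd + yd - yb) / (k1 - k2)
    ym = k1 * (xm - xb) + yb
    return xm, ym
```

For the crossing diagonals (0,0)-(10,10) and (0,10)-(10,0), the formulas give the center (5, 5).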
(2) If only third vertexes exist and no first vertexes exist, determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each third vertex, namely judging whether the edge of any third vertex is intersected with each edge of the first quadrangle or not for any third vertex, and if so, calculating the position coordinates corresponding to the intersection point of the edge of the third vertex and the edge of the first quadrangle;
The specific process of determining whether the certain edge where the third vertex is located is intersected with the certain edge of the first quadrangle may refer to the specific process of determining whether the certain edge where the certain first vertex is located is intersected with the certain edge of the second quadrangle, which is not described in detail in this embodiment of the present application;
the specific process of calculating the position coordinate corresponding to the intersection point of the edge where the third vertex is located and the edge of the first quadrangle may refer to the specific process of calculating the position coordinate corresponding to the intersection point of the edge where the certain first vertex is located and the certain edge of the second quadrangle, which is not described in detail in the embodiment of the present application;
(3) If there are both the first vertices and the third vertices, the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle may be determined according to the position coordinates corresponding to each first vertex, or the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle may be determined according to the position coordinates corresponding to each third vertex, which is not particularly limited in the embodiment of the present application.
Further, as shown in fig. 2, the third determination unit 14 includes: a fifth determining module 141, configured to determine an image area corresponding to the fused projection image according to the position coordinates corresponding to each vertex included in the fused projection image, where the image area corresponding to the fused projection image is an area of the fused projection image in the third image; the obtaining module 142 is configured to obtain an actual area corresponding to the target projection area and an image area corresponding to the third image, where the actual area corresponding to the target projection area is obtained by a worker measuring the target projection area in advance, and the actual area corresponding to the target projection area and the image area corresponding to the third image are stored in a local storage space of the target terminal device in advance; the sixth determining module 143 is configured to determine an actual area corresponding to the fused projection image according to an image area corresponding to the fused projection image, an actual area corresponding to the target projection area, and an image area corresponding to the third image.
In the embodiment of the present application, the specific process of determining the actual area corresponding to the fused projection image by the third determining unit 14 according to the position coordinates corresponding to each vertex included in the fused projection image is:
(1) The fifth determining module 141 determines the image area corresponding to the fused projection image according to the position coordinates corresponding to each vertex included in the fused projection image, that is, substitutes the position coordinates corresponding to each vertex included in the fused projection image into a preset formula, so as to calculate and obtain the image area corresponding to the fused projection image, wherein the preset formula is specifically as follows:
S = (1/2) × |Σ_{i=1}^{n} (x_i × y_{i+1} - x_{i+1} × y_i)|

wherein n is the number of vertices contained in the fused projection image, and the n vertices are taken in order along the boundary of the fused projection image as vertex 1, vertex 2, ..., vertex n; S is the image area corresponding to the fused projection image, x_i is the abscissa corresponding to the i-th vertex, y_i is the ordinate corresponding to the i-th vertex, x_{i+1} is the abscissa corresponding to the (i+1)-th vertex, and y_{i+1} is the ordinate corresponding to the (i+1)-th vertex; it should be noted that, when i = n, x_{i+1} is the abscissa corresponding to vertex 1 and y_{i+1} is the ordinate corresponding to vertex 1.
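The preset formula described here is the shoelace formula for the area of a simple polygon; a minimal Python sketch follows (function name hypothetical; vertices assumed in boundary order, as the formula requires):

```python
def polygon_area(vertices):
    """Shoelace formula: `vertices` is a list of (x, y) pairs in boundary
    order; the area is half the absolute sum of x_i*y_{i+1} - x_{i+1}*y_i."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x_i, y_i = vertices[i]
        x_j, y_j = vertices[(i + 1) % n]  # vertex n wraps around to vertex 1
        s += x_i * y_j - x_j * y_i
    return abs(s) / 2.0
```

For a 10 × 10 axis-aligned square the sketch returns 100.0.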
(2) Next, the acquiring module 142 acquires, in the local storage space of the target terminal device, the actual area corresponding to the target projection area and the image area corresponding to the third image.
(3) Finally, the sixth determining module 143 determines the actual area corresponding to the fused projection image according to the image area corresponding to the fused projection image, the actual area corresponding to the target projection area, and the image area corresponding to the third image, that is, calculates the ratio of the image area corresponding to the fused projection image to the image area corresponding to the third image to obtain a first calculation result, calculates the product of the first calculation result and the actual area corresponding to the target projection area to obtain a second calculation result, and determines the second calculation result as the actual area corresponding to the fused projection image.
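The scaling step performed by the sixth determining module 143 is simple proportional arithmetic; it can be sketched as follows (illustrative Python, hypothetical names):

```python
def actual_fused_area(image_area_fused, image_area_third, actual_area_region):
    """Scale the pixel area of the fused projection image to a physical area:
    the ratio of the two image areas (first calculation result) times the
    measured physical area of the target projection area."""
    ratio = image_area_fused / image_area_third   # first calculation result
    return ratio * actual_area_region             # second calculation result
```

For example, if the fused projection image occupies 50,000 of the third image's 200,000 pixels and the target projection area measures 6.0 square meters, the actual area is 1.5 square meters.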
Further, as an implementation of the apparatus shown in fig. 1 and fig. 2, another embodiment of the present application further provides a method for determining the area of a fused projection image, where the method is applied to a target terminal device, the target terminal device includes a preset camera, and the target terminal device may be, but is not limited to: a computer, a projection fusion device, and the like. The method embodiment corresponds to the foregoing apparatus embodiment, and for convenience of reading, details of the foregoing apparatus embodiment are not repeated one by one in this method embodiment, but it should be clear that the method in this embodiment can correspondingly implement all the contents of the foregoing apparatus embodiment. The method is used to improve the efficiency with which a worker adjusts the projection positions of a plurality of projectors; as shown in fig. 4, the method comprises the following steps:
201. A first image, a second image, and a third image are acquired.
The first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fusion projection image, the first projection image is an image obtained after the first projector projects the target projection area, the second projection image is an image obtained after the second projector projects the target projection area, and the fusion projection image is an image obtained after the first projector and the second projector project the target projection area simultaneously.
202. And determining the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image.
The position coordinates of each vertex contained in the first projection image are the position coordinates of each vertex contained in the first projection image in the first image, and the position coordinates of each vertex contained in the second projection image are the position coordinates of each vertex contained in the second projection image in the second image.
203. And determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image.
204. And determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.
Further, step 203, determining the position coordinates corresponding to each vertex included in the fused projection image according to the position coordinates corresponding to each vertex included in the first projection image and the position coordinates corresponding to each vertex included in the second projection image, includes:
generating a fourth image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image, wherein the fourth image has the same image area as the first image, the second image and the third image, and the fourth image contains a first quadrangle and a second quadrangle;
determining the position relation between each vertex contained in the first quadrangle and the second quadrangle and the position relation between each vertex contained in the second quadrangle according to the position coordinates corresponding to each vertex contained in the first quadrangle and the position coordinates corresponding to each vertex contained in the second quadrangle;
determining the vertex which is contained by the first quadrangle and is positioned in the second quadrangle as a first vertex, determining the vertex which is contained by the first quadrangle and is positioned outside the second quadrangle as a second vertex, determining the vertex which is contained by the second quadrangle and is positioned in the first quadrangle as a third vertex, and determining the vertex which is contained by the second quadrangle and is positioned outside the first quadrangle as a fourth vertex;
Determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex;
and determining the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle as the position coordinates corresponding to each vertex contained in the fused projection image.
Further, determining a positional relationship between each vertex included in the first quadrangle and the second quadrangle and a positional relationship between each vertex included in the second quadrangle and the first quadrangle according to the positional coordinates corresponding to each vertex included in the first quadrangle and the positional coordinates corresponding to each vertex included in the second quadrangle includes:
drawing a target direction ray corresponding to each vertex contained in the first quadrangle and a target direction ray corresponding to each vertex contained in the second quadrangle in the fourth image;
determining the number of intersection points of the target direction rays corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points of the target direction rays corresponding to each vertex contained in the second quadrangle and the first quadrangle;
If the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is an odd number, determining that the vertexes contained in the first quadrangle are in the second quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is even, determining that the vertexes contained in the first quadrangle are outside the second quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the second quadrangle and the first quadrangle is an odd number, determining that the vertexes contained in the second quadrangle are in the first quadrangle;
and if the number of the intersection points of the target direction ray corresponding to the vertexes contained in the second quadrangle and the first quadrangle is even, determining that the vertexes contained in the second quadrangle are outside the first quadrangle.
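The odd/even rule above is the classic ray-casting (even–odd) point-in-polygon test. A minimal Python sketch follows; note the patent only specifies a "target direction" for the ray, so the horizontal rightward direction and the coordinates here are illustrative assumptions:

```python
def point_in_quad(point, quad):
    """Ray-casting test: cast a horizontal ray from `point` to the right
    and count how many of the quadrilateral's edges it crosses.
    An odd count means the point is inside; an even count, outside."""
    px, py = point
    crossings = 0
    n = len(quad)
    for i in range(n):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % n]
        # Does this edge straddle the horizontal line through the point?
        if (y1 > py) != (y2 > py):
            # x-coordinate where the edge crosses that horizontal line
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > px:  # crossing lies on the rightward ray
                crossings += 1
    return crossings % 2 == 1  # odd number of crossings -> inside

quad = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_quad((2, 2), quad))  # True (inside)
print(point_in_quad((5, 2), quad))  # False (outside)
```

The `(y1 > py) != (y2 > py)` comparison also handles the vertex-on-ray corner case consistently, counting each edge endpoint on the ray at most once.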
Further, determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex includes:
if the first vertex exists, judging whether the edge where the first vertex is located intersects each edge of the second quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the first vertex is located and the edge of the second quadrangle;
If the third vertex exists, judging whether the edge where the third vertex is located intersects each edge of the first quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the third vertex is located and the edge of the first quadrangle.
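The intersection computation used in both branches above is a standard segment–segment intersection. A sketch in parametric form (the function name and sample coordinates are illustrative, not taken from the patent):

```python
def segment_intersection(p1, p2, p3, p4):
    """Intersection point of segment p1-p2 with segment p3-p4, or None.
    Solves p1 + t*(p2-p1) = p3 + u*(p4-p3) for t, u in [0, 1]."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    # Cross product of the two direction vectors
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if denom == 0:
        return None  # parallel or collinear: no single crossing point
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None  # the infinite lines cross outside the segments

print(segment_intersection((0, 0), (4, 4), (0, 4), (4, 0)))  # (2.0, 2.0)
```

In the patent's procedure, each edge incident to a first vertex (or third vertex) would be tested against each edge of the other quadrangle with such a routine, keeping the non-`None` results as the boundary intersection points.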
Further, step 204, determining an actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex included in the fused projection image, includes:
determining the image area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image;
acquiring an actual area corresponding to the target projection area and an image area corresponding to the third image;
and determining the actual area corresponding to the fused projection image according to the image area corresponding to the fused projection image, the actual area corresponding to the target projection area and the image area corresponding to the third image.
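A minimal sketch of this area step, assuming the shoelace formula for the image-space polygon area and a simple pixel-to-physical proportion between the target projection area and the third image (all numeric values and the exact correspondence are illustrative assumptions; the patent does not fix them):

```python
def polygon_area(vertices):
    """Shoelace formula: area of a simple polygon from ordered vertices."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Image-space area of the fused polygon (pixel^2), illustrative vertices
fused_px = polygon_area([(0, 0), (300, 0), (300, 200), (0, 200)])

# Proportion: actual area of the target projection area over the image
# area of the third image that pictures it (assumed values).
target_actual_m2 = 6.0          # e.g. a measured 3 m x 2 m wall region
third_image_px = 1920 * 1080    # pixel area of the third image
actual_fused_m2 = fused_px * target_actual_m2 / third_image_px
print(round(actual_fused_m2, 4))  # 0.1736
```

The shoelace formula requires the fused-image vertices to be ordered along the polygon boundary; the vertex set collected from the second vertices, fourth vertices, and intersection points would need to be sorted into boundary order first.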
The embodiment of the application provides a device and a method for determining the area of a fused projection image. The device for determining the area of the fused projection image includes an acquisition unit, a first determining unit, a second determining unit and a third determining unit. First, the acquisition unit acquires a first image, a second image and a third image from a local storage space of a target terminal device; second, the first determining unit determines the position coordinates corresponding to each vertex contained in the first projection image according to the first image, and determines the position coordinates corresponding to each vertex contained in the second projection image according to the second image; third, the second determining unit determines the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image; finally, the third determining unit determines the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.
In this embodiment of the application, the device for determining the area of the fused projection image determines the actual area of the fused projection image from the first image containing the first projection image and the second image containing the second projection image, so a worker does not need to manually measure the actual area of the fused projection image projected by two adjacent projectors. After the device determines the actual area of the fused projection image, the target terminal device may output and display it, so that the worker knows the actual area of the fused projection image and can decide whether to adjust the projection positions of the first projector and the second projector, thereby effectively improving the efficiency with which the worker adjusts the projection positions of the two adjacent projectors.
The embodiment of the application provides a storage medium including a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the above method for determining the area of the fused projection image.
The storage medium may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
The embodiment of the application also provides a device for determining the area of the fused projection image, including a storage medium and one or more processors coupled to the storage medium, the processors being configured to execute program instructions stored in the storage medium, wherein the program instructions, when run, execute the above method for determining the area of the fused projection image.
The embodiment of the application provides a device including a processor, a memory, and a program stored on the memory and runnable on the processor, wherein the processor implements the following steps when executing the program:
acquiring a first image, a second image and a third image, wherein the first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fusion projection image, the first projection image is an image obtained after a first projector projects onto the target projection area, the second projection image is an image obtained after a second projector projects onto the target projection area, and the fusion projection image is an image obtained after the first projector and the second projector project onto the target projection area at the same time;
Determining position coordinates corresponding to each vertex contained in the first projection image and position coordinates corresponding to each vertex contained in the second projection image, wherein the position coordinates corresponding to each vertex contained in the first projection image are position coordinates of each vertex contained in the first projection image in the first image, and the position coordinates corresponding to each vertex contained in the second projection image are position coordinates of each vertex contained in the second projection image in the second image;
determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image;
and determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.
Further, the determining, according to the position coordinate corresponding to each vertex included in the first projection image and the position coordinate corresponding to each vertex included in the second projection image, the position coordinate corresponding to each vertex included in the fused projection image includes:
Generating a fourth image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image, wherein the fourth image has the same image area as the first image, the second image and the third image, and the fourth image contains a first quadrangle and a second quadrangle;
determining the position relation between each vertex contained in the first quadrangle and the second quadrangle, and the position relation between each vertex contained in the second quadrangle and the first quadrangle, according to the position coordinates corresponding to each vertex contained in the first quadrangle and the position coordinates corresponding to each vertex contained in the second quadrangle;
determining a vertex which is included in the first quadrangle and is positioned in the second quadrangle as a first vertex, determining a vertex which is included in the first quadrangle and is positioned outside the second quadrangle as a second vertex, determining a vertex which is included in the second quadrangle and is positioned in the first quadrangle as a third vertex, and determining a vertex which is included in the second quadrangle and is positioned outside the first quadrangle as a fourth vertex;
Determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex;
and determining the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle as the position coordinates corresponding to each vertex contained in the fused projection image.
Further, the determining, according to the position coordinate corresponding to each vertex included in the first quadrangle and the position coordinate corresponding to each vertex included in the second quadrangle, the position relationship between each vertex included in the first quadrangle and the second quadrangle, and the position relationship between each vertex included in the second quadrangle and the first quadrangle includes:
drawing a target direction ray corresponding to each vertex contained in the first quadrangle and a target direction ray corresponding to each vertex contained in the second quadrangle in the fourth image;
determining the number of intersection points of the target direction rays corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points of the target direction rays corresponding to each vertex contained in the second quadrangle and the first quadrangle;
If the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is odd, determining that the vertexes contained in the first quadrangle are in the second quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is even, determining that the vertexes contained in the first quadrangle are outside the second quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the second quadrangle and the first quadrangle is odd, determining that the vertexes contained in the second quadrangle are in the first quadrangle;
and if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the second quadrangle and the first quadrangle is even, determining that the vertexes contained in the second quadrangle are outside the first quadrangle.
Further, the determining, according to the position coordinate corresponding to each first vertex or the position coordinate corresponding to each third vertex, the position coordinate corresponding to each intersection of the first quadrangle and the second quadrangle includes:
if the first vertex exists, judging whether the edge where the first vertex is located intersects each edge of the second quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the first vertex is located and the edge of the second quadrangle;
If the third vertex exists, judging whether the edge where the third vertex is located intersects each edge of the first quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the third vertex is located and the edge of the first quadrangle.
Further, the determining, according to the position coordinates corresponding to each vertex included in the fused projection image, an actual area corresponding to the fused projection image includes:
determining the image area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image;
acquiring an actual area corresponding to the target projection area and an image area corresponding to the third image;
and determining the actual area corresponding to the fused projection image according to the image area corresponding to the fused projection image, the actual area corresponding to the target projection area and the image area corresponding to the third image.
The present application also provides a computer program product which, when executed on a data processing device, is adapted to execute program code initialized with the following method steps: acquiring a first image, a second image and a third image, wherein the first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fused projection image, the first projection image is an image obtained after a first projector projects onto the target projection area, the second projection image is an image obtained after a second projector projects onto the target projection area, and the fused projection image is an image obtained after the first projector and the second projector project onto the target projection area at the same time; determining position coordinates corresponding to each vertex contained in the first projection image and position coordinates corresponding to each vertex contained in the second projection image, wherein the position coordinates corresponding to each vertex contained in the first projection image are position coordinates of each vertex contained in the first projection image in the first image, and the position coordinates corresponding to each vertex contained in the second projection image are position coordinates of each vertex contained in the second projection image in the second image; determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image; and determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Random Access Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (8)

1. An apparatus for determining the area of a fused projection image, said apparatus comprising:
an acquisition unit, configured to acquire a first image, a second image and a third image, wherein the first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fused projection image, the first projection image is an image obtained after a first projector projects to the target projection area, the second projection image is an image obtained after a second projector projects to the target projection area, and the fused projection image is an image obtained after the first projector and the second projector project to the target projection area at the same time;
A first determining unit, configured to determine a position coordinate corresponding to each vertex included in the first projection image and a position coordinate corresponding to each vertex included in the second projection image, where the position coordinate corresponding to each vertex included in the first projection image is a position coordinate of each vertex included in the first projection image in the first image, and the position coordinate corresponding to each vertex included in the second projection image is a position coordinate of each vertex included in the second projection image in the second image;
the second determining unit is used for determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image;
the third determining unit is used for determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image;
the second determination unit includes:
the generating module is used for generating a fourth image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image, wherein the fourth image has the same image area as the first image, the second image and the third image, and the fourth image contains a first quadrangle and a second quadrangle;
A first determining module, configured to determine a positional relationship between each vertex included in the first quadrilateral and the second quadrilateral, and a positional relationship between each vertex included in the second quadrilateral and the first quadrilateral, according to a positional coordinate corresponding to each vertex included in the first quadrilateral and a positional coordinate corresponding to each vertex included in the second quadrilateral;
a second determining module, configured to determine, as a first vertex, a vertex that is included in the first quadrilateral and that is located in the second quadrilateral, determine, as a second vertex, a vertex that is included in the first quadrilateral and that is located outside the second quadrilateral, determine, as a third vertex, a vertex that is included in the second quadrilateral and that is located in the first quadrilateral, and determine, as a fourth vertex, a vertex that is included in the second quadrilateral and that is located outside the first quadrilateral;
the third determining module is used for determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex;
and the fourth determining module is used for determining the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle as the position coordinates corresponding to each vertex contained in the fused projection image.
2. The apparatus of claim 1, wherein
the first determining module is specifically configured to:
drawing a target direction ray corresponding to each vertex contained in the first quadrangle and a target direction ray corresponding to each vertex contained in the second quadrangle in the fourth image;
determining the number of intersection points of the target direction rays corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points of the target direction rays corresponding to each vertex contained in the second quadrangle and the first quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is odd, determining that the vertexes contained in the first quadrangle are in the second quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the first quadrangle and the second quadrangle is even, determining that the vertexes contained in the first quadrangle are outside the second quadrangle;
if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the second quadrangle and the first quadrangle is odd, determining that the vertexes contained in the second quadrangle are in the first quadrangle;
And if the number of the intersection points of the target direction rays corresponding to the vertexes contained in the second quadrangle and the first quadrangle is even, determining that the vertexes contained in the second quadrangle are outside the first quadrangle.
3. The apparatus of claim 1, wherein
the third determining module is specifically configured to:
if the first vertex exists, judging whether the edge where the first vertex is located intersects each edge of the second quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the first vertex is located and the edge of the second quadrangle;
if the third vertex exists, judging whether the edge where the third vertex is located intersects each edge of the first quadrangle; if they intersect, calculating the position coordinates corresponding to the intersection point of the edge where the third vertex is located and the edge of the first quadrangle.
4. The apparatus according to any one of claims 1-3, wherein the third determining unit comprises:
a fifth determining module, configured to determine an image area corresponding to the fused projection image according to a position coordinate corresponding to each vertex included in the fused projection image;
the acquisition module is used for acquiring the actual area corresponding to the target projection area and the image area corresponding to the third image;
And a sixth determining module, configured to determine an actual area corresponding to the fused projection image according to the image area corresponding to the fused projection image, the actual area corresponding to the target projection area, and the image area corresponding to the third image.
5. A method of determining the area of a fused projection image, the method comprising:
acquiring a first image, a second image and a third image, wherein the first image, the second image and the third image are all obtained by shooting a target projection area through a preset camera, the first image comprises a first projection image, the second image comprises a second projection image, the third image comprises a fusion projection image, the first projection image is an image obtained after a first projector projects onto the target projection area, the second projection image is an image obtained after a second projector projects onto the target projection area, and the fusion projection image is an image obtained after the first projector and the second projector project onto the target projection area at the same time;
determining position coordinates corresponding to each vertex contained in the first projection image and position coordinates corresponding to each vertex contained in the second projection image, wherein the position coordinates corresponding to each vertex contained in the first projection image are position coordinates of each vertex contained in the first projection image in the first image, and the position coordinates corresponding to each vertex contained in the second projection image are position coordinates of each vertex contained in the second projection image in the second image;
Determining the position coordinates corresponding to each vertex contained in the fused projection image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image;
determining the actual area corresponding to the fused projection image according to the position coordinates corresponding to each vertex contained in the fused projection image;
the determining the position coordinate corresponding to each vertex included in the fused projection image according to the position coordinate corresponding to each vertex included in the first projection image and the position coordinate corresponding to each vertex included in the second projection image includes:
generating a fourth image according to the position coordinates corresponding to each vertex contained in the first projection image and the position coordinates corresponding to each vertex contained in the second projection image, wherein the fourth image has the same image area as the first image, the second image and the third image, and the fourth image contains a first quadrangle and a second quadrangle;
determining the position relation between each vertex contained in the first quadrangle and the second quadrangle, and the position relation between each vertex contained in the second quadrangle and the first quadrangle, according to the position coordinates corresponding to each vertex contained in the first quadrangle and the position coordinates corresponding to each vertex contained in the second quadrangle;
Determining a vertex which is included in the first quadrangle and is positioned in the second quadrangle as a first vertex, determining a vertex which is included in the first quadrangle and is positioned outside the second quadrangle as a second vertex, determining a vertex which is included in the second quadrangle and is positioned in the first quadrangle as a third vertex, and determining a vertex which is included in the second quadrangle and is positioned outside the first quadrangle as a fourth vertex;
determining the position coordinates corresponding to each intersection point of the first quadrangle and the second quadrangle according to the position coordinates corresponding to each first vertex or the position coordinates corresponding to each third vertex;
and determining the position coordinates corresponding to the second vertex, the position coordinates corresponding to the fourth vertex and the position coordinates corresponding to the intersection point of the first quadrangle and the second quadrangle as the position coordinates corresponding to each vertex contained in the fused projection image.
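The fused-area computation in the claim above can be sketched in code. The following Python sketch is illustrative, not taken from the patent: the function names (`seg_intersection`, `shoelace_area`, `fused_vertices`) are assumptions, and the final angular sort assumes the fused region is star-shaped about its centroid, which holds for typical overlapping projector quadrangles but is not guaranteed in general. It collects the second and fourth vertices (those of each quadrangle lying outside the other) together with all edge intersection points, orders them, and applies the shoelace formula.

```python
import math
from itertools import product

def seg_intersection(p1, p2, p3, p4):
    """Intersection point of segments p1-p2 and p3-p4, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:                      # parallel or collinear edges
        return None
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def shoelace_area(pts):
    """Polygon area from vertices given in boundary order (shoelace formula)."""
    s = 0.0
    for i in range(len(pts)):
        (x1, y1), (x2, y2) = pts[i], pts[(i + 1) % len(pts)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def fused_vertices(quad_a, quad_b, inside):
    """Vertices of the fused projection region: the second and fourth
    vertices (those of each quadrangle outside the other) plus all edge
    intersection points. `inside(pt, quad)` is a point-in-polygon test."""
    pts = [p for p in quad_a if not inside(p, quad_b)]
    pts += [p for p in quad_b if not inside(p, quad_a)]
    for i, j in product(range(4), repeat=2):
        p = seg_intersection(quad_a[i], quad_a[(i + 1) % 4],
                             quad_b[j], quad_b[(j + 1) % 4])
        if p is not None:
            pts.append(p)
    # order by angle about the centroid; valid when the fused region is
    # star-shaped with respect to its centroid
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return sorted(pts, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
```

For two overlapping axis-aligned squares of area 4 whose overlap is a unit square, `shoelace_area(fused_vertices(...))` yields the expected union area of 7.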
6. The method according to claim 5, wherein the determining the positional relationship between each vertex contained in the first quadrangle and the second quadrangle, and the positional relationship between each vertex contained in the second quadrangle and the first quadrangle, according to the position coordinates corresponding to each vertex contained in the first quadrangle and the position coordinates corresponding to each vertex contained in the second quadrangle includes:
drawing, in the fourth image, a target direction ray corresponding to each vertex contained in the first quadrangle and a target direction ray corresponding to each vertex contained in the second quadrangle;
determining the number of intersection points between the target direction ray corresponding to each vertex contained in the first quadrangle and the second quadrangle, and determining the number of intersection points between the target direction ray corresponding to each vertex contained in the second quadrangle and the first quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the first quadrangle and the second quadrangle is odd, determining that the vertex contained in the first quadrangle is inside the second quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the first quadrangle and the second quadrangle is even, determining that the vertex contained in the first quadrangle is outside the second quadrangle;
if the number of intersection points between the target direction ray corresponding to a vertex contained in the second quadrangle and the first quadrangle is odd, determining that the vertex contained in the second quadrangle is inside the first quadrangle;
and if the number of intersection points between the target direction ray corresponding to a vertex contained in the second quadrangle and the first quadrangle is even, determining that the vertex contained in the second quadrangle is outside the first quadrangle.
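The odd/even test of claim 6 is the classic ray-casting point-in-polygon rule, and can be sketched as follows in Python. The claim does not fix the target direction; the +x direction and the function name `point_in_quadrangle` are assumptions for illustration.

```python
def point_in_quadrangle(pt, quad):
    """Cast a ray from pt in a fixed target direction (here +x) and count
    its intersections with the quadrangle's edges: an odd count means the
    point is inside, an even count means it is outside."""
    x, y = pt
    crossings = 0
    for i in range(len(quad)):
        (x1, y1), (x2, y2) = quad[i], quad[(i + 1) % len(quad)]
        if (y1 > y) != (y2 > y):          # edge straddles the ray's line
            # x-coordinate where the edge meets the horizontal line through pt
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:               # crossing lies along the +x ray
                crossings += 1
    return crossings % 2 == 1             # odd -> inside, even -> outside
```

The half-open comparison `(y1 > y) != (y2 > y)` counts an edge only when it genuinely straddles the ray's supporting line, so a ray passing exactly through a shared vertex is not double-counted.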
7. A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the method for determining a fused projection image area of claim 5 or 6.
8. An apparatus for determining the area of a fused projection image, said apparatus comprising a storage medium; and one or more processors coupled to the storage medium, the processors being configured to execute the program instructions stored in the storage medium; wherein the program instructions, when executed, perform the method for determining a fused projection image area of claim 5 or 6.
CN202210760219.6A 2022-06-30 2022-06-30 Device and method for determining area of fused projection image Active CN115190280B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210760219.6A CN115190280B (en) 2022-06-30 2022-06-30 Device and method for determining area of fused projection image

Publications (2)

Publication Number Publication Date
CN115190280A CN115190280A (en) 2022-10-14
CN115190280B true CN115190280B (en) 2024-02-20

Family

ID=83515992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210760219.6A Active CN115190280B (en) 2022-06-30 2022-06-30 Device and method for determining area of fused projection image

Country Status (1)

Country Link
CN (1) CN115190280B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1434434A1 (en) * 2002-10-31 2004-06-30 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Self calibration of a projection system using multiple projectors
JP2012142669A (en) * 2010-12-28 2012-07-26 Seiko Epson Corp Projection controller, projection system, test chart, and projection area determination method
CN108200416A (en) * 2017-12-15 2018-06-22 歌尔科技有限公司 Coordinate mapping method, device and the projection device of projected image in projection device
CN111601093A (en) * 2020-05-13 2020-08-28 宜宾市极米光电有限公司 Method and device for acquiring projection standard brightness and readable storage medium
CN111698491A (en) * 2020-06-24 2020-09-22 杭州爱科科技股份有限公司 Multi-projection image display method and device, electronic equipment and storage medium
WO2020220832A1 (en) * 2019-04-30 2020-11-05 成都极米科技股份有限公司 Method and apparatus for achieving projection picture splicing, and projection system
CN112734860A (en) * 2021-01-15 2021-04-30 中国传媒大学 Arc-screen prior information-based pixel-by-pixel mapping projection geometric correction method
WO2022105276A1 (en) * 2020-11-18 2022-05-27 成都极米科技股份有限公司 Method and apparatus for determining projection area, projection device, and readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107995478B (en) * 2017-12-13 2019-08-27 歌尔股份有限公司 Projecting method and projection device
CN114449234B (en) * 2020-10-30 2024-01-09 扬智科技股份有限公司 Projection device and projection picture correction method thereof


Similar Documents

Publication Publication Date Title
US20220286654A1 (en) Projector Keystone Correction Method, Apparatus And System, And Readable Storage Medium
US10789672B2 (en) Method and device for performing mapping on spherical panoramic image
US8025414B2 (en) Projector, projected image adjustment method, and program for projecting an image while correcting the shape of the image
CN103051841B (en) The control method of time of exposure and device
CN100478775C (en) Projector and image correction method
US20110216288A1 (en) Real-Time Projection Management
US20210329203A1 (en) Projection method and projection device
CN104750443A (en) Display control method and electronic equipment
CN105516597A (en) Method and device for processing panoramic photography
CN104349068A (en) Shooting method and electronic equipment
WO2020164044A1 (en) Free-viewpoint image synthesis method, device, and apparatus
US11652967B2 (en) Projection device and projection image correction method thereof
CN109685721B (en) Panoramic picture splicing method, device, terminal and corresponding storage medium
US20020009699A1 (en) Data receiving device and image forming apparatus using same
US10573277B2 (en) Display device, display system, and non-transitory recording medium, to adjust position of second image in accordance with adjusted zoom ratio of first image
JP2008211355A (en) Projector, program, and information storage medium
US11877103B2 (en) Projection device and projection picture correction method thereof
CN113380184B (en) Correction method and device, LED display screen and computer equipment
CN115190280B (en) Device and method for determining area of fused projection image
EP2034444B1 (en) Method for rotating images
JP5187480B2 (en) Projector, program, information storage medium, and image generation method
JP2006074805A (en) Multi-projection video display device
CN114760451B (en) Projection image correction prompting method, projection image correction prompting device, projection equipment and storage medium
EP4283986A1 (en) Electronic apparatus and control method thereof
CN115190281B (en) Device and method for adjusting projection position of projector

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant