WO2024080234A1 - Projection device, correction device, projection system, correction method, and computer program - Google Patents

Info

Publication number
WO2024080234A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection
image
lens
projection surface
optical path
Prior art date
Application number
PCT/JP2023/036495
Other languages
English (en)
Japanese (ja)
Inventor
将秀 重野
類 松本
賢悟 林
Original Assignee
パナソニックIpマネジメント株式会社
Priority date
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2024080234A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 — Details of television systems
    • H04N 5/74 — Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present disclosure relates to a projection device, a correction device, a projection system, a correction method, and a computer program that correct parameters when projecting an image by the projection device.
  • one method for correcting distortion in an image projected onto a screen by a projection device involves photographing the projected image and the screen, detecting the four corners of each, and making corrections (see, for example, patent documents).
  • the present disclosure provides a projection device, a correction device, a projection system, a correction method, and a computer program that realizes correction of various parameters of a projection device placed in a target space in which a screen is installed, without the need for actual projection by the projection device.
  • the projection device of the present disclosure projects an image onto a projection surface.
  • the projection device includes a light source that emits light used to project the image, a lens on an optical path from the light source to the projection surface, an image sensor used to capture an image of the projection surface, and an optical element that guides the light for projection from the light source to the lens and guides the light from the lens obtained by capturing the image to the image sensor.
  • a first partial optical path, from the optical element to the projection surface, of a first optical path from the light source to the projection surface coincides with a second partial optical path, from the projection surface to the optical element, of a second optical path from the projection surface to the image sensor.
  • the projection device, correction device, projection system, correction method, and computer program disclosed herein can correct parameters of a projection device without the need to actually perform projection using the projection device.
  • FIG. 1A is a conceptual diagram illustrating a projection system according to an embodiment.
  • FIG. 1B is an example showing an overlap between a projection area and a shooting area.
  • FIG. 1C is another example showing the overlap of the projection area and the shooting area.
  • FIG. 1D is another example showing the overlap of the projection area and the shooting area.
  • FIG. 2 is a block diagram showing a configuration of a projection device according to an embodiment.
  • FIG. 3 is a conceptual diagram showing projection and shooting by the projection device of FIG. 2.
  • FIG. 4 is a block diagram showing a configuration of a correction device according to an embodiment.
  • FIG. 5A is a schematic diagram showing an example of image projection by a projection device.
  • FIG. 5B is a schematic diagram illustrating an example of projection of a corrected image.
  • FIG. 5C is a schematic diagram showing an example of the shape of an image projected before correction and markers placed on a projection surface.
  • FIG. 5D is a schematic diagram showing the shape of an image projected after correction.
  • FIG. 6A is an example of the shape of a marker on a projection surface that specifies an image projection range.
  • FIG. 6B is another example of the shape of a marker on the projection surface that specifies the image projection range.
  • FIG. 6C is another example of the shape of a marker on the projection surface that specifies the image projection range.
  • FIG. 7A is a captured image of a projection surface on which no markers are placed.
  • FIG. 7B is an example in which a user places markers on the captured image of FIG. 7A.
  • FIG. 8A is an example of a marker on the projection surface that specifies a zoom range.
  • FIG. 8B is an example of a projected image zoom-corrected according to the markers of FIG. 8A.
  • FIG. 9A is an example of markers on a projection surface that specify the center of an image.
  • FIG. 9B is an example of a projected image shift-corrected according to the markers of FIG. 9A.
  • FIG. 10 is a flowchart illustrating an example of processing executed in the projection device.
  • FIG. 11 is a flowchart illustrating an example of processing executed in the correction device.
  • FIG. 12A is a flowchart showing an example of a projection-range correction process executed in the correction device.
  • FIG. 12B is a flowchart illustrating an example of a shift correction process executed in the correction device.
  • FIG. 12C is a flowchart illustrating an example of a zoom correction process executed in the correction device.
  • the projection device, correction device, projection system, correction method, and computer program disclosed herein correct projection parameters even when an image is not being projected.
  • projection parameters refer to parameter values, etc., used for projection, such as parameter values for geometric correction, parameter values for zoom adjustment, or parameter values for shift adjustment.
  • “Shooting area” refers to the largest area that can be mechanically photographed using the projection device's shooting function.
  • “Projection area” refers to the maximum area that can be mechanically projected onto a projection surface by a projection device.
  • “Projection range” refers to the range in which the actual image is projected onto the projection surface after adjustment by the projection device.
  • Geometric correction refers to the correction of geometric distortions that occur when an image is displayed on a display device.
  • geometric correction is a correction that displays an image that has trapezoidal, barrel, or pincushion distortion as a rectangular image.
  • Trapezoidal distortion is a distortion in which an image that should be rectangular is projected onto the screen as a trapezoid.
  • Barrel distortion is a distortion in which the image is displayed as if it bulges in the center.
  • Pincushion distortion is a distortion in which the image is displayed as if it is pinched toward the center, with its edges bowing inward.
  • Geometric correction is achieved by adjusting the parameter values of the displayed image.
  • the projection system 1 includes a projection device 2 and a correction device 3 that are connected to each other.
  • the projection device 2 is a projector that projects an image onto a screen 4, which is a projection surface.
  • the correction device 3 corrects the parameters used by the projection device 2 in the projection.
  • the projection device 2 has a built-in mechanism for shooting in addition to a mechanism for projection.
  • the optical axis L1 for projection and the optical axis L2 for shooting coincide with each other.
  • the projection region R1 and the shooting region R2 at least partially overlap with each other.
  • the optical axis L1 for projection and the optical axis L2 for shooting coincide with each other, the relationship between the projection region R1 and the shooting region R2 is unchanged. In other words, when a certain point P1 is specified in the projection region R1, the position of the point P2 corresponding to the point P1 can be uniquely obtained in the shooting region R2.
  • FIG. 1B is an example where the entire shooting area R2 is included within the projection area R1.
  • FIG. 1C is an example where the projection area R1 and the shooting area R2 coincide.
  • FIG. 1D is an example where the projection area R1 is included within the shooting area R2. Which case applies depends on conditions such as the specifications of the light modulation element 22, the optical element 23, and the image sensor 25.
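Because the two optical axes coincide, the mapping from a point P1 in the projection area to the corresponding point P2 in the shooting area is fixed. A minimal sketch of this idea in Python (the function name and the scale/offset parameters are hypothetical; in practice they are determined by the specifications of the light modulation element 22, the optical element 23, and the image sensor 25):

```python
def projection_to_shooting(p1, scale=(1.0, 1.0), offset=(0.0, 0.0)):
    """Map a point P1 in the projection area R1 to its fixed
    counterpart P2 in the shooting area R2. When the two areas
    coincide (FIG. 1C) the mapping is the identity; otherwise a
    fixed scale and offset relate the two coordinate systems."""
    return (p1[0] * scale[0] + offset[0], p1[1] * scale[1] + offset[1])
```

Since the relationship never changes, such a mapping can be fixed once at design time rather than recalibrated for each installation.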
  • the projection device 2 has a light source 21, a light modulation element 22, an optical element 23, a lens 24, and an image sensor 25.
  • the light source 21 emits light used for projecting an image.
  • the light modulation element 22 modulates the red, green, and blue light emitted from the light source 21 based on the image to be projected.
  • the lens 24 is located on the optical path from the light source 21 to the projection surface, which is used for projecting and photographing the image.
  • the image sensor 25 is used for photographing the image on the projection surface.
  • the optical element 23 guides the light for projection modulated by the light modulation element 22 to the lens 24, and guides the light from the lens 24 obtained by photographing to the image sensor 25.
  • the optical element 23 may be, for example, a prism or a half mirror. Note that other optical elements such as lenses and prisms are arranged between the light source 21 and the lens 24 and between the lens 24 and the image sensor 25, but these are omitted from the illustration.
  • the light source 21, the light modulation element 22, the optical element 23, and the lens 24 are used as a projection mechanism.
  • the lens 24, the optical element 23, and the image sensor 25 are used as an image capturing mechanism.
  • as a preprocessing step before projecting an image, the projection device 2 captures an image of the projection surface with the image capturing mechanism in order to adjust the projection parameters.
  • the first partial optical path, from the optical element 23 to the screen 4, of the first optical path from the light source 21 to the screen 4 coincides with the second partial optical path, from the screen 4 to the optical element 23, of the second optical path from the screen 4 to the image sensor 25.
  • Photographing process: in the photographing process executed as preprocessing, the light received by the lens 24 of the projection device 2 is guided to the image sensor 25 via the optical element 23. This allows the projection device 2 to photograph the screen 4.
  • the photographed image data obtained by the image sensor 25 is transmitted to the correction device 3 by a communication circuit (not shown).
  • the projection device 2 has an autofocus function (not shown) and can automatically adjust the focus.
  • the communication circuit transmits various parameters used in the photographing process of the image to the correction device 3 together with the photographed image data.
  • the parameter transmitted from the projection device 2 to the correction device 3 is, for example, a value indicating the focal length.
  • the projection device 2 may also perform shift processing and zoom processing as necessary, and transmit them as parameters used in the photographing process.
  • FIG. 3 is a schematic diagram showing how, in the projection device 2, a photographing mechanism acquires a photographed image Im of the screen 4, and a projection mechanism projects the image onto the screen 4.
  • the first partial optical path, from the optical element 23 to the projection surface, of the first optical path from the light source 21 to the projection surface coincides with the second partial optical path, from the projection surface to the optical element 23, of the second optical path from the projection surface to the image sensor 25.
  • since the projection area R1 and the photographing area R2 are the same, the projection device 2 does not need to perform coordinate conversion to match the coordinate system of the projection area R1 with that of the photographing area R2, and therefore does not need to calibrate the projected image against the photographed image.
  • since the lens 24 is shared between shooting and projection and its focus was already adjusted in the shooting process performed as preprocessing, the image is already in focus at the time of the subsequent projection process. Specifically, the focus is adjusted, and maintained in the adjusted state, by changing the distances between the optical element 23, the lens 24, and one or more optical elements between them.
  • a contrast method is used for focus adjustment. Specifically, as described above, the positional relationship between the optical element 23, the lens 24, and one or more optical elements between the optical element 23 and the lens 24 is adjusted multiple times to capture an image, and the positional relationship with the highest contrast among the multiple images is selected to adjust the focus. This eliminates the need to adjust the focus after the light source 21 emits the projection light, and the light source 21 emits light to project a focused image immediately at the time of starting projection.
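The contrast method above can be sketched as follows. This is an illustrative reconstruction rather than the patent's own implementation: the variance of a Laplacian response is one common contrast metric, and the function names are hypothetical.

```python
import numpy as np

def contrast_score(image):
    """Contrast metric: variance of a simple 4-neighbour Laplacian
    response. Higher variance means sharper edges, i.e. better focus."""
    img = image.astype(float)
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def pick_focus_position(captures):
    """captures: list of (lens_position, image) pairs obtained while
    sweeping the positional relationship of the optical elements.
    Returns the position whose image has the highest contrast."""
    return max(captures, key=lambda pc: contrast_score(pc[1]))[0]
```

A sweep of candidate positions is photographed once, the best-scoring position is latched, and no further focusing is needed when the light source 21 later turns on.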
  • in the projection process, the light emitted by the light source 21 of the projection device 2 is modulated by the light modulation element 22. The modulated light is guided to the lens 24 via the optical element 23 and projected onto the screen 4 through the lens 24 as projection light.
  • the projection device 2 may correct the projection range of the image, such as geometric correction, the shift amount and zoom amount of the lens 24, and parameters of the lens 24 and other optical elements, under the control of the correction device 3. Also, since the focus is adjusted in the shooting process, there is no need to perform focus adjustment again during the projection process. Note that the shift amount and zoom amount of the lens 24 are corrected in the shooting process, which is the pre-processing described above, and may also be corrected in the preparation process performed prior to the shooting process.
  • FIG. 3 shows a schematic representation of a captured image Im.
  • This captured image Im may be stored, for example, in a storage device (not shown) built into the projection device 2, or may be transmitted to an external information processing device (for example, correction device 3) connected to the projection device.
  • the correction device 3 is an information processing device including an arithmetic circuit 30, an input device 31, an output device 32, a communication circuit 33, and a storage device 34.
  • the correction device 3 transmits a control signal for correcting the projection parameters.
  • the arithmetic circuit 30 is a controller that controls the entire correction device 3.
  • the arithmetic circuit 30 reads and executes a correction program P stored in the storage device 34 to realize various correction processes during projection in the projection device 2.
  • the arithmetic circuit 30 is not limited to a circuit that realizes a predetermined function through the cooperation of hardware and software, but may be a hardware circuit designed specifically to realize a predetermined function.
  • the arithmetic circuit 30 can be realized by various processors such as a CPU, MPU, GPU, FPGA, DSP, ASIC, etc.
  • the input device 31 is an input means such as an operation button, keyboard, mouse, touch panel, microphone, etc. that is used for operations and data input.
  • the output device 32 is an output means such as a display, speaker, etc. that is used for outputting processing results and data.
  • the communication circuit 33 is a communication means for enabling data communication with an external device (e.g., the projection device 2).
  • the above-mentioned data communication may be wired and/or wireless, and may be performed according to known communication standards.
  • wired data communication is performed by using a communication controller of a semiconductor integrated circuit that operates in accordance with the Ethernet (registered trademark) standard and/or the USB (registered trademark) standard as the communication circuit 33.
  • Wireless data communication is performed by using a communication controller of a semiconductor integrated circuit that operates in accordance with the IEEE 802.11 standard for LANs (Local Area Networks) and/or the fourth/fifth generation mobile communication systems, known as 4G/5G, for mobile communications, as the communication circuit 33.
  • the storage device 34 is a recording medium that records various information.
  • the storage device 34 is realized, for example, by a RAM, a ROM, a flash memory, an SSD (Solid State Drive), a hard disk drive, or other storage device, or an appropriate combination of these.
  • the storage device 34 stores the correction program P, which is a computer program executed by the arithmetic circuit 30, and various data such as the marker data D1 used to execute the correction.
  • Marker data D1 is data that includes a reference point indicating a specific position on the projection surface that is used for correction when projecting an image.
  • the projection position when projecting an image can be determined based on the reference point indicated by marker data D1.
  • the degree of zoom for zoom correction can be determined based on the reference point indicated by marker data D1.
  • the amount of shift for image shifting can be determined based on the reference point indicated by marker data D1.
  • the corrections include correction of the projection range of an image, such as geometric correction, as well as parameter corrections such as zoom correction and shift correction.
  • when correcting the projection range, such as in geometric correction, the arithmetic circuit 30 obtains from the projection device 2 a captured image of a projection surface on which a predetermined marker is arranged, and detects the predetermined marker from the captured image.
  • the predetermined marker corresponds to a reference point indicating the projection range of an ideal image.
  • the arithmetic circuit 30 sets the range indicated by the detected marker as the projection range.
  • the shape of the marker used here is stored, for example, as marker data D1 in the storage device 34, and the arithmetic circuit 30 can detect the predetermined marker from the captured image obtained from the projection device 2 by using the marker data D1 as a reference. An example of the shape of the marker will be described later with reference to FIGS. 6A to 6C.
  • FIG. 5A is a schematic diagram showing the projection area and shooting area when the projection surface is not perpendicular to the projection axis of the projection device 2 and is projected obliquely onto the projection surface, and shows the state of the projection surface being photographed from the front.
  • the original image is rectangular, but because the projection device 2 projects obliquely, the projection area and shooting area form a trapezoidal quadrangle defined by the four points A to D.
  • FIG. 5A shows an example in which four points A' to D' are markers indicating the projection range of the ideal image, and the quadrangle defined by points A' to D' is set as the projection range.
  • the arithmetic circuit 30 detects the coordinates of the markers and performs coordinate conversion so that points A to D of the projected image are mapped to points A' to D', respectively; the image can then be displayed within the desired projection range, as shown in FIG. 5B.
  • coordinate transformation for example, homography transformation, which will be described later with reference to FIG. 5C and FIG. 5D, is used.
  • the arithmetic circuit 30 controls the projection device 2 to project only the light that projects the transformed image corresponding to the projection range, among the light corresponding to each pixel of the image of the light source 21, and not to project the light that corresponds to the outside of the projection range.
  • the arithmetic circuit 30 controls the projection device 2 to mask the part that corresponds to the outside of the projection range with black. This allows the correction device 3 to correct the projection image so that it is projected in the set projection range. By correcting the projection range in this way, the correction device 3 can correct the projection range by the projection device 2 to a desired shape. This allows the correction device 3 to perform geometric correction on the projection image of the projection device 2, for example.
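Masking the region outside the projection range can be sketched as below. An axis-aligned rectangular range is assumed for brevity (the actual range is an arbitrary quadrangle after coordinate conversion), and the function name is illustrative:

```python
import numpy as np

def mask_outside(image, rect):
    """Black out every pixel outside the set projection range.
    rect = (x0, y0, x1, y1) in image coordinates; pixels outside
    this rectangle receive no projection light (masked with black)."""
    out = np.zeros_like(image)
    x0, y0, x1, y1 = rect
    out[y0:y1, x0:x1] = image[y0:y1, x0:x1]
    return out
```

Zeroing a pixel corresponds to not emitting the light for that pixel, so only the transformed image inside the projection range reaches the screen.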
  • FIG. 5C shows an example of an original image projected by the projection device 2 from the state of FIG. 5A.
  • points A to D indicate the four corners of the projection area and the capture area. Therefore, for example, the four corners of the projected image projected by the projection device 2 are points A to D. Also, points A to D correspond to points A' to D', which indicate the projection range, respectively. Note that each point in the projected image and the captured image is represented using X and Y coordinates, as shown in FIG. 5C.
  • the four corners of the image projected from the projection device 2, in other words of the original image, are represented by the coordinates A(xa, ya), B(xb, yb), C(xc, yc), and D(xd, yd). The four corners of the projection range are represented by A'(xa', ya'), B'(xb', yb'), C'(xc', yc'), and D'(xd', yd'). In this way, coordinates before transformation are written (xi, yi) and coordinates after transformation (xi', yi').
  • the arithmetic circuit 30 calculates the homography matrix H using the coordinates of the corresponding points and the following equation (1). Furthermore, after calculating the homography matrix H, the arithmetic circuit 30 performs homography transformation on the entire pre-transformation image, i.e., on each pixel of the image, using the calculated homography matrix H to generate a projected image, which is the image after transformation. In this manner, by using the homography transformation, a projected image is obtained as shown in Fig. 5D.
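Equation (1) itself is not reproduced in this text; it denotes the standard homography relation, in which each transformed point satisfies (xi', yi', 1)^T ∝ H (xi, yi, 1)^T for the 3×3 homography matrix H. A sketch of computing H from the four corner correspondences by the usual direct linear method (NumPy assumed; this is an illustrative reconstruction, not the patent's own code):

```python
import numpy as np

def homography(src, dst):
    """Compute the 3x3 homography H mapping src[i] -> dst[i] from
    four point correspondences (direct linear method, h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); b.append(xp)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); b.append(yp)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, pt):
    """Map one (x, y) point through H with the homogeneous divide."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[0] / v[2], v[1] / v[2]
```

Applying `apply_h` to every pixel of the pre-transformation image yields the transformed projection image; OpenCV's `cv2.getPerspectiveTransform` computes the same matrix from four correspondences.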
  • FIG. 6A shows an example of a captured image in which four specific shapes are each designated as a marker M1, and a rectangle defined by each marker M1 is the projection range.
  • each marker M1 is mountain-shaped, and the projection range is a rectangular area connecting the centers of gravity of each marker.
  • when the arithmetic circuit 30 detects four markers M1 from the captured image, it sets the rectangular area connecting the centers of gravity of the markers M1 as the projection range. Note that the arithmetic circuit 30 may set the projection range using a feature point other than the center of gravity.
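The center-of-gravity computation for the markers M1 might look like this, assuming each marker has already been segmented into a boolean pixel mask (the helper names are hypothetical):

```python
import numpy as np

def marker_centroid(mask):
    """Center of gravity (x, y) of the pixels belonging to one marker.
    mask: 2-D boolean array, True where the marker was detected."""
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

def projection_range(masks):
    """Rectangle (min_x, min_y, max_x, max_y) whose corners connect
    the centroids of the four detected markers M1."""
    pts = [marker_centroid(m) for m in masks]
    xs, ys = zip(*pts)
    return min(xs), min(ys), max(xs), max(ys)
```

Swapping `marker_centroid` for another feature-point extractor covers the case where a feature point other than the center of gravity is used.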
  • FIG. 6B shows an example of a captured image in which four points are markers M2 and the projection range is a rectangle defined by the markers M2.
  • when the arithmetic circuit 30 detects four points as markers M2 from the captured image, it sets the rectangular area connecting the markers M2 as the projection range.
  • FIG. 6C shows an example of a captured image in which the rectangular frame itself is the marker M3, and the frame indicated by the marker M3 is the projection range.
  • when the arithmetic circuit 30 detects the rectangular frame as the marker M3 from the captured image, it sets the rectangular area indicated by the marker M3 as the projection range.
  • the shapes of the markers M1 to M3 shown in Figures 6A to 6C are examples, and the shape and arrangement of the markers are not limited as long as they are capable of identifying the projection range. Furthermore, these markers may be temporarily attached to the projection surface, or may utilize a configuration that is originally present on the projection surface.
  • the shape indicated by the markers can be determined according to the purpose of the projection. Furthermore, the shape indicated by the markers may be a shape intended for geometric correction.
  • the arithmetic circuit 30 may control the projection device 2 to perform geometric correction using a marker identified by something other than the captured image. For example, the arithmetic circuit 30 sets a predetermined range or a range specified via the input device 31 as the projection range, and controls the projection device 2 to adjust the light emitted from the light source 21 so that the image is projected into that projection range.
  • FIG. 7A shows an example of a captured image that does not include a marker.
  • FIG. 7B shows four markers M4 specified by the user.
  • the markers M4 are cross-shaped, and a rectangular area connecting the intersections of each of the markers M4 is set as the projection range.
  • when the arithmetic circuit 30 receives a zoom request, it detects a predetermined marker indicating the zoom range from the captured image of the projection surface acquired from the projection device 2, and sets the range indicated by the detected marker as the projection range. Next, the arithmetic circuit 30 generates a control signal so that the set projection range occupies a larger proportion of the shooting area, and controls the projection device 2 accordingly. Specifically, the arithmetic circuit 30 generates a control signal that changes the distances between the multiple lenses included in the lens 24. This allows the correction device 3 to match the projection area of the lens 24 to the projection range, so that the projection device 2 zooms the image and increases the number of effective pixels used for the projection range. In the following description, “zooming an image” means changing the projected image from the wide side to the telephoto side.
  • the arithmetic circuit 30 compares the size of the current projection area and shooting area with the size of the projection range determined from the markers M11.
  • the arithmetic circuit 30 also generates a control signal to control the projection device 2 according to the calculated value. Specifically, it generates a control signal that changes the distances between the multiple lenses included in the lens 24. This allows the projection area and shooting area of the projection device 2 to be adjusted to the area specified by the markers M11, as shown in FIG. 8B.
  • the projection area and shooting area may be controlled to zoom as far as possible toward the area identified by the multiple markers M11. Note that when the projection device 2 captures an image of the screen 4 from an oblique direction and projects onto it, the markers M11 are not captured in positions forming a rectangle as shown in FIG. 8A, but rather, for example, in positions forming a trapezoid.
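The size comparison above can be reduced to a single zoom factor; a sketch under the assumption of axis-aligned (width, height) sizes follows. How such a factor maps onto the lens-element distances is device-specific and not shown:

```python
def zoom_factor(area_size, range_size):
    """Zoom factor that makes the marker-specified projection range
    fill as much of the projection/shooting area as possible without
    exceeding it. Both arguments are (width, height) tuples."""
    aw, ah = area_size
    rw, rh = range_size
    # The smaller of the two per-axis ratios keeps the whole range visible.
    return min(aw / rw, ah / rh)
```

A factor above 1 moves the lens toward the telephoto side so the range occupies a larger proportion of the shooting area.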
  • when the arithmetic circuit 30 receives a request to change the image center position, it detects a predetermined marker indicating the center position of the image from the captured image of the projection surface, and sets the position specified by the detected marker as the center position of the shooting area. Next, the arithmetic circuit 30 shifts the lens 24 along a plane perpendicular to the optical axis so that shooting by the projection device 2 is centered on the set center position. This allows the correction device 3 to shift the projection range so that projection is centered on the desired position.
  • FIG. 9A shows a captured image including a marker M12 indicating a reference position when shifting an image.
  • the image includes four mountain-shaped markers M12, and the center of a rectangle connecting the vertices of each marker is set as the center C1 of the desired projection range.
  • the center C2 of the current projection area and shooting area deviates from the center C1 of the desired projection range. The arithmetic circuit 30 therefore controls the projection device 2 so that the center C2 of the projection area and shooting area coincides with the center C1 of the desired projection range.
  • the arithmetic circuit 30 calculates the x-axis and y-axis deviations of the current center C2 from the center C1 of the desired projection range obtained from the markers M12. It then calculates the amounts by which the lens 24 must be moved along the x-axis and y-axis according to the obtained values, and controls the projection device 2. As a result, as shown in FIG. 9B, the center C2 of the projection area and shooting area of the projection device 2 is matched to the center C1 specified by the markers M12. Even if no markers are included, the center of the projection range may be found, for example, as the intersection of the diagonals of the quadrangle in the captured image, and the image shifted to match that center.
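The per-axis deviation reduces to a simple difference; a sketch (the conversion from this pixel-space deviation to a physical lens travel depends on the optics and is omitted, and the function name is hypothetical):

```python
def shift_amounts(current_center, desired_center):
    """Deviation (dx, dy) of the desired center C1 (from markers M12)
    relative to the current center C2 of the projection/shooting area.
    The lens 24 is moved in the plane perpendicular to the optical
    axis so as to cancel this deviation."""
    cx, cy = current_center
    tx, ty = desired_center
    return tx - cx, ty - cy
```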
  • the projection device 2 arranged facing the projection surface adjusts the shooting parameters (S101).
  • the projection device 2 adjusts the shooting parameters such as the focal length, shift correction, zoom, etc.
  • the projection device 2 captures an image of the projection surface (S102).
  • the projection device 2 transmits the shooting parameters adjusted in step S101 and the shooting image data obtained in step S102 to the correction device 3 (S103).
  • the projection device 2 receives the control signal generated by the correction device 3 for the shooting parameters and captured image data sent in step S103 (S104).
  • the projection device 2 adjusts the projection parameters according to the control signal received in step S104 (S105). At this time, the focus parameters may not be adjusted, and the parameters adjusted in step S101 may be used.
  • the projection device 2 turns on the light source 21 and projects an image onto the projection surface (S106).
  • focus adjustment is performed in the shooting process, which is a pre-processing step, so there is no need to adjust the focus in the projection process.
  • by adjusting the projection parameters using a control signal generated by the correction device 3 from the captured image and shooting parameters obtained in the shooting process, and then projecting an image, no adjustment is needed after projection.
  • the correction device 3 receives the shooting parameters and the shot image data of the projection surface transmitted from the projection device 2 (S201).
  • the correction device 3 executes a correction process for the projection range and generates a control signal for correcting the projection range (S202).
  • the correction process for the projection range will be described later with reference to FIG. 12A.
  • the correction device 3 also executes a shift correction process to generate a control signal for the shift correction (S203).
  • the shift correction process will be described later with reference to FIG. 12B.
  • the correction device 3 then executes a zoom correction process and generates a zoom correction control signal (S204).
  • the zoom correction process will be described later with reference to FIG. 12C.
  • the correction device 3 transmits the control signal generated in steps S202 to S204 to the projection device 2 (S205).
  • the correction device 3 generates the control signals for the projection range correction process, the shift correction process, and the zoom correction process using the shooting parameters and the captured image data of the projection surface transmitted from the projection device 2. This allows the correction device 3 to correct the projection performed by the projection device 2.
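The correction-device flow S201–S205 can be sketched as a pipeline that turns the received shooting parameters and captured image into one control signal per correction. The three `correct_*` callables are placeholders for the processes of FIGS. 12A–12C, injected here so the sketch stays self-contained.

```python
# Correction-device pipeline S201-S205: build the range, shift, and zoom
# control signals from the received parameters and image, then return the
# bundle that would be transmitted back to the projection device (S205).

def correction_pipeline(shoot_params, image,
                        correct_range, correct_shift, correct_zoom):
    return {
        "range": correct_range(image),                # S202 (FIG. 12A)
        "shift": correct_shift(shoot_params, image),  # S203 (FIG. 12B)
        "zoom":  correct_zoom(shoot_params, image),   # S204 (FIG. 12C)
    }
```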
  • <<Projection range correction process>> the projection range correction process executed by the correction device 3 will now be described with reference to FIG. 12A.
  • the correction device 3 detects a marker indicating the projection range from the captured image data transmitted from the projection device 2 (S301).
  • the correction device 3 determines whether or not a marker was included in the captured image data (S302).
  • the correction device 3 sets the projection range of the image based on the marker (S303). On the other hand, when the captured image data does not include a marker, the correction device 3 receives the projection range from the user via the input device 31 (S304).
  • the correction device 3 performs coordinate transformation to project the image onto the projection range set in step S303 or S304 (S305).
  • the correction device 3 generates a control signal for the projection device 2 to correct the projection range based on the result of the coordinate conversion in step S305 (S306).
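Steps S301–S306 can be sketched as follows, under the simplifying assumption that the projection range is an axis-aligned rectangle `(x, y, w, h)` in captured-image coordinates; with a rectangle, the coordinate transformation of S305 reduces to a scale plus an offset. The function names are illustrative, not from the disclosure.

```python
# Projection-range steps: choose the range from the marker if one was
# detected (S302-S303), otherwise from user input (S304), then build the
# coordinate transformation that maps image pixels into that range (S305).

def set_projection_range(detected_marker_rect, user_rect=None):
    """S302-S304: prefer the marker-derived range, else the user-given one."""
    if detected_marker_rect is not None:
        return detected_marker_rect
    if user_rect is None:
        raise ValueError("no marker detected and no range supplied by the user")
    return user_rect

def range_transform(img_w, img_h, range_rect):
    """S305: return a function mapping image pixel (x, y) into the range."""
    rx, ry, rw, rh = range_rect
    sx, sy = rw / img_w, rh / img_h
    return lambda x, y: (rx + x * sx, ry + y * sy)
```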
  • the correction device 3 detects a marker that identifies the center of the projection range when shifting from the captured image data transmitted from the projection device 2 (S402).
  • the correction device 3 sets the center of the projection range after the shift using the marker as a reference (S403).
  • the correction device 3 calculates the difference between the center of the shooting area of the captured image data used in step S402 and the center of the projection range set in step S403 (S404).
  • the correction device 3 uses the difference calculated in step S404 to calculate the movement distance of the lens 24 along the lens plane (S405).
  • the correction device 3 generates a control signal for the projection device 2 to correct the lens shift based on the movement distance calculated in step S405 (S406).
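Steps S404–S405 can be sketched under a deliberately simplified model: the on-screen displacement is assumed to be `throw / focal` times the lens displacement. The `px_to_mm`, `throw_mm`, and `focal_mm` inputs stand in for shooting parameters; the real conversion would come from the lens design, not this thin-lens approximation.

```python
# Shift-correction sketch: convert the pixel difference between the
# captured-image center and the marker-set center (S404) into lens travel
# along the lens plane (S405), under an illustrative linear model.

def lens_travel(center_diff_px, px_to_mm, throw_mm, focal_mm):
    k = throw_mm / focal_mm               # assumed screen-shift per lens-shift
    dx_mm = center_diff_px[0] * px_to_mm  # difference on the screen (mm)
    dy_mm = center_diff_px[1] * px_to_mm
    return (dx_mm / k, dy_mm / k)         # movement of the lens 24 (mm)
```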
  • the correction device 3 detects a marker indicating the projection range after zooming from the captured image data transmitted from the projection device 2 (S502).
  • the correction device 3 sets the projection range of the zoomed image based on the marker (S503).
  • the correction device 3 calculates the ratio between the shooting area of the captured image data used in step S502 and the projection area set in step S503 (S504).
  • the correction device 3 uses the ratio calculated in step S504 to calculate the distance between the multiple lenses included in the lens 24 (S505).
  • the correction device 3 generates a control signal for the projection device 2 to correct the zoom based on the distance calculated in step S505 (S506).
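Steps S504–S505 can be sketched with the same simplified model: since the projected image size scales roughly like `throw / focal`, enlarging the image by a factor `ratio` corresponds to dividing the effective focal length by `ratio`. Mapping that focal length to a spacing between the individual lenses inside the lens 24 is lens-specific and is not modeled here.

```python
# Zoom-correction sketch: the ratio between the target range and the
# current shooting area (S504) determines the new effective focal length
# of the zoom group (S505), under an illustrative thin-lens model.

def zoom_ratio(current_w, target_w):
    """S504: ratio between the target projection range and the shooting area."""
    return target_w / current_w

def focal_for_zoom(current_focal_mm, ratio):
    """S505 (illustrative): effective focal length realizing the zoom ratio."""
    return current_focal_mm / ratio
```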
  • the projection device is able to correct the projection parameters according to the shooting parameters used when capturing an image. This allows the projection parameters to be corrected simply by photographing the projection surface, without turning on the projection light source. Also, since there is no need to turn on the light source for projection, the projection surface can be photographed and the projection parameters corrected even in bright places.
  • the projection system 1 includes a projection device 2 that projects an image and a correction device 3, connected to the projection device 2, that corrects the parameters used to project the image.
  • the form of the projection system 1 is not limited as long as the projection device 2 can capture an image before projection and correct parameters at the time of projection using the captured image and parameters at the time of capture.
  • the projection device 2 and the correction device 3 may be integrated.
  • the projection device of the present disclosure is a projection device that projects an image onto a projection surface, and includes a light source that emits light used to project the image, a lens on an optical path from the light source to the projection surface, an image sensor used to capture an image of the projection surface, and an optical element that guides the light for projection from the light source to the lens and guides the light from the lens obtained by capture to the image sensor, wherein a first partial optical path from the optical element to the projection surface of a first optical path from the light source to the projection surface coincides with a second partial optical path from the projection surface to the optical element of a second optical path from the projection surface to the image sensor.
  • the correction device of the present disclosure is a correction device that includes an arithmetic circuit and corrects projection parameters used by a projection device that projects an image onto a projection surface, the projection device including a light source that emits light used to project an image, a lens on an optical path from the light source to the projection surface, an image sensor used to capture an image of the projection surface, and an optical element that guides the light for projection from the light source to the lens and guides the light from the lens obtained by capturing the image to the image sensor, wherein a first partial optical path from the optical element to the projection surface of a first optical path from the light source to the projection surface coincides with a second partial optical path from the projection surface to the optical element of a second optical path from the projection surface to the image sensor, and the arithmetic circuit may acquire from the projection device a captured image including a reference point indicating the projection position by the projection device, acquired using the image sensor, and parameters for capturing when the captured image was captured, and generate a control signal for correcting the parameters for projection by the projection device according to the position of the reference point included in the captured image and the parameters for capturing.
  • the projection system of the present disclosure is a projection system that includes an arithmetic circuit and projects an image onto a projection surface, the projection system including a light source that emits light used to project the image, a lens on an optical path from the light source to the projection surface, an image sensor used to capture an image of the projection surface, and an optical element that guides the light for projection from the light source to the lens and guides the light from the lens obtained by capturing the image to the image sensor, wherein a first partial optical path from the optical element to the projection surface of a first optical path from the light source to the projection surface coincides with a second partial optical path from the projection surface to the optical element of a second optical path from the projection surface to the image sensor, and the arithmetic circuit acquires a captured image including a reference point indicating the projection position, acquired using the image sensor, and parameters for capturing when the captured image was captured, and may generate a control signal for correcting the parameters for projection according to the position of the reference point included in the captured image and the parameters for capturing.
  • the arithmetic circuit may adjust the parameters of the optical system during shooting, and the lens may project an image during projection using the parameters of the optical system adjusted during shooting.
  • the parameter of the optical system during shooting may be the focus of the lens.
  • the calculation circuit may set the range indicated by the reference point as the projection range, perform coordinate transformation of the image so that the image is projected into the projection range, and generate a control signal that causes the light source to project an image of the coordinate-transformed shape.
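The coordinate transformation described above can be illustrated for the general case where the reference points define an arbitrary quadrilateral rather than a rectangle. A bilinear map is used here for brevity; in practice a full projective homography (e.g. as estimated from four point pairs) would typically be used for geometric correction.

```python
# Bilinear sketch of warping normalized image coordinates (u, v) in
# [0, 1] x [0, 1] onto the quadrilateral indicated by the reference
# points, so the coordinate-transformed image fits the projection range.

def warp_to_quad(u, v, quad):
    """quad = (p0, p1, p2, p3): corners listed clockwise from top-left."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = quad
    x = (1-u)*(1-v)*x0 + u*(1-v)*x1 + u*v*x2 + (1-u)*v*x3
    y = (1-u)*(1-v)*y0 + u*(1-v)*y1 + u*v*y2 + (1-u)*v*y3
    return (x, y)
```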
  • the projected image can be corrected to a desired shape.
  • the projected image can be geometrically corrected.
  • the control signal may be a signal that adjusts the light emitted from the light source so that the image is projected within the projection range and no image is displayed outside the projection range.
  • the calculation circuit may set a range of the captured image specified by a user via an input device or a predetermined range as the projection range.
  • the shooting parameters include coordinates on a plane perpendicular to the optical axis of the lens, the reference point indicates the center position of the projected image, and when the calculation circuit receives a request to change the image center position, it may set the position indicated by the reference point in the captured image as the center position of the projection range, and generate a control signal for projecting the image centered on the center position using the difference between the center position of the captured image and the center position of the projected image and the shooting parameters.
  • the shooting parameters may include a position along the optical axis of the lens, the reference point may indicate the zoom range, and when the calculation circuit receives a zoom request, it may set the range indicated by the reference point from the captured image as the projection range, and may generate a control signal using the ratio between the range of the captured image and the projection range and the shooting parameters to adjust the distance to the projection surface of the lens so that the image is projected into the projection range.
  • the correction method disclosed herein is a correction method executed by a projection system that includes a calculation circuit and projects an image onto a projection surface, the projection system including a light source that emits light used to project the image, a lens on an optical path from the light source to the projection surface, an image sensor used to capture an image of the projection surface, and an optical element that guides the light for projection from the light source to the lens and guides the light from the lens obtained by capture to the image sensor, wherein a first partial optical path from the optical element to the projection surface of a first optical path from the light source to the projection surface coincides with a second partial optical path from the projection surface to the optical element of a second optical path from the projection surface to the image sensor, and the calculation circuit acquires a captured image including a reference point indicating the projection position, which is acquired using the image sensor, and parameters for capture when the captured image was acquired, and generates a control signal that corrects the parameters for projection according to the position of the reference point included in the captured image and the parameters for capture.
  • the optical axis of projection and the optical axis of photography coincide with each other, eliminating the need for coordinate conversion to match the coordinate systems of the projection area and the photography area, and also eliminating the need for calibration between the projected image and the photography image.
  • the computer program disclosed herein causes the calculation circuit of the projection system to execute the method of (11).
  • the optical axis of projection and the optical axis of photography coincide with each other, eliminating the need for coordinate conversion to match the coordinate systems of the projection area and the photography area, and also eliminating the need for calibration between the projected image and the photography image.
  • the projection device, correction device, projection system, and correction method described in all claims of the present disclosure are realized through the cooperation of hardware resources, such as a processor and memory, with a computer program.
  • the projection device, correction device, projection system, correction method, and computer program disclosed herein are useful for making adjustments when projecting an image using a projection device.


Abstract

This projection device projects an image onto a projection surface. The projection device comprises: a light source that emits light used to project an image; a lens on an optical path from the light source to a projection surface; an image sensor used to capture an image of the projection surface; and an optical element that guides the light for projection from the light source to the lens, and that guides the light from the lens obtained by the capture to the image sensor. A first partial optical path, from the optical element to the projection surface, of a first optical path from the light source to the projection surface and a second partial optical path, from the projection surface to the optical element, of a second optical path from the projection surface to the image sensor are aligned.
PCT/JP2023/036495 2022-10-11 2023-10-06 Dispositif de projection, dispositif de correction, système de projection, procédé de correction et programme informatique WO2024080234A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-163451 2022-10-11
JP2022163451 2022-10-11

Publications (1)

Publication Number Publication Date
WO2024080234A1 true WO2024080234A1 (fr) 2024-04-18

Family

ID=90669248

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/036495 WO2024080234A1 (fr) 2022-10-11 2023-10-06 Dispositif de projection, dispositif de correction, système de projection, procédé de correction et programme informatique

Country Status (1)

Country Link
WO (1) WO2024080234A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08304751A (ja) * 1995-05-01 1996-11-22 Nec Corp 液晶プロジェクション装置
JP2000284363A (ja) * 1999-03-30 2000-10-13 Telecommunication Advancement Organization Of Japan 画像投影装置及び画像投影方法
JP2005318268A (ja) * 2004-04-28 2005-11-10 Fuji Electric Systems Co Ltd 投射表示装置および投射表示システム
JP2008287171A (ja) * 2007-05-21 2008-11-27 Funai Electric Co Ltd 投射型映像表示装置
JP2012151670A (ja) * 2011-01-19 2012-08-09 Renesas Electronics Corp 画像投影システム及び半導体集積回路
JP2015232583A (ja) * 2012-09-27 2015-12-24 三菱電機株式会社 画像投影システムおよび投影位置調整方法
