CN111684793A - Image processing device, image processing method, program, and projection system - Google Patents

Image processing device, image processing method, program, and projection system

Info

Publication number
CN111684793A
Authority
CN
China
Prior art keywords
image
pickup device
image pickup
captured
projector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980011189.5A
Other languages
Chinese (zh)
Inventor
染谷清登
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of CN111684793A

Classifications

    • H04N 5/74 - Projection arrangements for image reproduction, e.g. using eidophor
    • H04N 9/3194 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]: testing thereof, including sensor feedback
    • G03B 17/54 - Details of cameras or camera bodies; accessories therefor, adapted for combination with a projector
    • G03B 37/04 - Panoramic or wide-screen photography with cameras or projectors providing touching or overlapping fields of view
    • G09G 5/36 - Control arrangements or circuits for visual indicators, characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H04N 9/3147 - Multi-projection systems
    • H04N 9/3185 - Video signal processing for geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The present technology relates to an image processing apparatus, an image processing method, a program, and a projection system that enable easy adjustment of the position and inclination of an image pickup apparatus. An image processing apparatus according to an aspect of the present technology estimates, based on a captured image taken by an image pickup apparatus mounted at a predetermined inclination, the inclination of that apparatus, which photographs the projection surface of a screen on which a pattern image including an image of a predetermined pattern is projected from a projector, and displays the captured image rotated according to the estimated inclination. The present technology is applicable to a computer that projects images from a plurality of projectors.

Description

Image processing device, image processing method, program, and projection system
Technical Field
The present technology relates to an image processing apparatus, an image processing method, a program, and a projection system, and more particularly, to an image processing apparatus, an image processing method, a program, and a projection system that facilitate adjustment of the position and the inclination of an imaging apparatus.
Background
Depending on the screen shape and the orientation of the projector, the image projected on the screen using the projector may appear distorted.
As a typical distortion, there is the phenomenon in which an image projected by a projector mounted at an elevation or depression angle with respect to the screen is distorted into a trapezoid. Keystone correction is necessary to restore such a trapezoidally distorted image to its correct shape.
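Keystone correction of this kind is commonly implemented by pre-warping the source image with a projective (homography) transform. As an illustrative sketch only (the function names are assumptions, and this is not a method disclosed in this patent), the 3x3 homography relating four measured corner correspondences can be solved with the direct linear transform:

```python
import numpy as np

def homography(src, dst):
    # Direct linear transform (DLT): solve for the 3x3 matrix H that maps
    # each src point (x, y) to the corresponding dst point (u, v) up to
    # scale. Four non-degenerate correspondences determine H uniquely.
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography vector is the null space of the stacked system,
    # i.e. the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def apply_h(h, point):
    # Apply H in homogeneous coordinates and dehomogenize.
    p = h @ np.array([point[0], point[1], 1.0])
    return p[:2] / p[2]
```

For keystone correction, `src` would be the corners of the intended rectangular image and `dst` the corresponding corners measured in the distorted projection (or the inverse mapping, depending on whether the warp is applied before or after projection).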
Reference list
Patent document
[PTL 1] Japanese Patent Laid-Open No. 2014-238601
Disclosure of Invention
Technical problem
In the case of projecting an image using a plurality of projectors, there are various mounting conditions.
For example, it is necessary to match images in an overlap region where projection light beams from a plurality of projectors overlap each other, and match the luminance of the overlap region with the luminance of the surrounding environment. The correction for matching the images in the overlapping area is called warping (geometric correction), and the correction for making the luminance uniform is called blending (optical correction).
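As a minimal sketch of the blending idea (the linear ramp and the names below are assumptions, not the patent's method), the two projectors' contributions can be crossfaded across the overlap region with weights that sum to 1 at every pixel, so that the overlap is no brighter than its surroundings:

```python
import numpy as np

def crossfade_weights(overlap_px):
    # Linear blend ramps across the overlap region: the first projector
    # fades out while the second fades in. Because the two weights sum
    # to 1 at every pixel, the combined luminance stays uniform.
    w_first = np.linspace(1.0, 0.0, overlap_px)
    w_second = 1.0 - w_first
    return w_first, w_second
```

In practice the ramps would also account for the projectors' nonlinear response (gamma) before being applied to pixel values, since output luminance is not linear in the input signal.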
Further, in the case of projecting an image on a dome-shaped screen, the screen is not flat, so a correct image cannot be projected without geometric correction. One method of preparing parameters for geometric correction is to mount a measurement image pickup device at a predetermined position and analyze the image it captures.
The present technology has been proposed in view of such a situation, and contributes to adjustment of the position and inclination of the imaging apparatus.
Solution to the problem
According to an aspect of the present technology, there is provided an image processing apparatus including: an estimation unit configured to estimate an inclination of an image pickup device configured to take a picture of a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector, based on a taken image taken by the image pickup device mounted at a predetermined inclination; and a display control unit configured to perform control to display the captured image rotated depending on the inclination of the capturing device.
In an aspect of the present technology, the inclination of an image pickup device, mounted at a predetermined inclination and configured to photograph the projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector, is estimated based on a captured image taken by the image pickup device, and the captured image, rotated depending on the inclination of the image pickup device, is displayed.
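The text does not specify how the inclination is estimated from the captured image. One plausible sketch (the PCA approach and all names below are assumptions) detects feature points of the projected pattern in the captured image and takes the angle of their dominant axis as the camera roll:

```python
import numpy as np

def estimate_roll_deg(points):
    # points: (N, 2) array of (x, y) centroids of pattern features
    # detected in the captured image. The eigenvector of the covariance
    # matrix with the largest eigenvalue is the direction along which
    # the pattern extends most; its angle to the image's horizontal
    # axis approximates the camera's roll.
    centered = points - points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    major = eigvecs[:, np.argmax(eigvals)]
    return np.degrees(np.arctan2(major[1], major[0]))
```

Note that this estimate is ambiguous by 180 degrees (an eigenvector has no preferred sign), so a real system would disambiguate it using an asymmetric feature of the projected pattern.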
Advantageous effects of the invention
According to the present technology, the position and inclination of the image pickup apparatus can be easily adjusted.
Note that the effects described herein are not necessarily limiting, and may be any effects described in the present disclosure.
Drawings
Fig. 1 is a diagram showing a configuration example of a multi-projection system.
Fig. 2 is a diagram showing an example in which the positional relationship between the projector and the image pickup device is viewed from above.
Fig. 3 is a diagram showing an example of the viewpoint position.
Fig. 4 is a diagram showing a projection state.
Fig. 5 is a diagram showing an example of how the projector and the image pickup device are mounted.
Fig. 6 is a diagram showing an example of the inclination of the projector.
Fig. 7 is a diagram showing another example of how the projector and the image pickup device are mounted.
Fig. 8 is a diagram showing another example of the inclination of the projector.
Fig. 9 is a diagram showing an example of the angle of view of the image pickup apparatus.
Fig. 10 is a diagram showing an example of a preview image.
Fig. 11 is a diagram showing another example of the angle of view of the image pickup apparatus.
Fig. 12 is a diagram showing another example of a preview image.
Fig. 13 is a diagram showing an example of how an image captured by an image capturing apparatus is rotated.
Fig. 14 is a diagram showing another example of how an image captured by an image capturing apparatus is rotated.
Fig. 15 is a diagram showing an example of a projection range of a projector.
Fig. 16 is a diagram showing an example of a projection range of a projector.
Fig. 17 is a diagram showing an example of a pattern image.
Fig. 18 is a diagram showing an example of a captured image.
Fig. 19 is a diagram showing an example of a preview image.
Fig. 20 is a front view showing a state in which a pattern image has been projected.
Fig. 21 is another front view showing a state in which a pattern image has been projected.
Fig. 22 is a diagram showing an example of a preview image.
Fig. 23 is a diagram showing an example of a preview image.
Fig. 24 is a diagram showing an example of display of guidance information.
Fig. 25 is a diagram showing another display example of guidance information.
Fig. 26 is a block diagram showing an example of a hardware configuration of an image processing apparatus.
Fig. 27 is a block diagram showing a functional configuration example of an image processing apparatus.
Fig. 28 is a flowchart showing an adjustment process performed by the image processing apparatus.
Fig. 29 is a flowchart showing the global adjustment process executed in step S1 of fig. 28.
Fig. 30 is a flowchart showing a detailed adjustment process executed in step S2 of fig. 28.
Detailed Description
Now, a mode for implementing the present technology is described. The description will be made in the following order.
1. Arrangement of multiple projection systems
2. Arrangement example of projector and image pickup apparatus
3. Example of angle of view of image pickup apparatus
4. Example of projection image
5. Global adjustment
6. Detailed adjustment
7. Arrangement of image processing apparatus
8. Operation of an image processing apparatus
9. Modified examples
< configuration of multiple projection System >
Fig. 1 is a diagram showing a configuration example of a multi-projection system according to an embodiment of the present technology.
The multi-projection system 1 of fig. 1 includes a dome-shaped screen 11 mounted on a mounting frame 12. The dome-shaped screen 11 has a dome-shaped (hemispherical) projection surface 11A with a diameter of about 2 m. The dome-shaped screen 11 is installed at a height of about 1 m, with its opening directed downward.
As shown in fig. 1, a chair is prepared in front of the dome-shaped screen 11. The user sits on a chair to view the content projected on the projection surface 11A.
Further, the multi-projection system 1 includes projectors 13L and 13R, surround speakers 14, woofers 15, image pickup devices 16L and 16R, and an image processing device 21. The projectors 13L and 13R, the surround speaker 14, the woofer 15, and the image pickup devices 16L and 16R are connected to the image processing device 21 via wired or wireless communication.
Projectors 13L and 13R are mounted on the left and right sides of the dome-shaped screen 11 with their projection units facing the dome-shaped screen 11. For example, the projectors 13L and 13R are fixed to the mounting frame 12 with a metal member.
The image pickup devices 16L and 16R are also mounted on the left and right sides of the dome-shaped screen 11 with their lenses facing the dome-shaped screen 11. For example, the image pickup devices 16L and 16R are mounted on the edge of the dome-shaped screen 11 by a predetermined member so that the position and inclination of the image pickup devices are adjustable. The administrator of the multi-projection system 1 can adjust the shooting range by moving the housings of the image pickup devices 16L and 16R with his/her hands. Instead of mounting the image pickup devices 16L and 16R on the edge of the dome-shaped screen 11, the image pickup devices 16L and 16R may be mounted on the mounting frame 12 using a predetermined member.
Fig. 2 is a diagram showing an example of a positional relationship between the projector and the image pickup device as viewed from above.
As shown in fig. 2, the projector 13L is installed at a position where the projector 13L can project an image on a substantially right half area of the dome-shaped screen 11. The projector 13R is installed at a position where the projector 13R can project an image on a substantially left half area of the dome-shaped screen 11. The projection units of the projectors 13L and 13R each include a wide-angle lens, such as a fisheye lens.
In fig. 2, a range indicated by a broken line represents a projection range of the projector 13L, and a range indicated by a chain line represents a projection range of the projector 13R. Details of the projection range are described later.
The projectors 13L and 13R project the images assigned thereto to display content images on the entire projection surface 11A, thereby presenting the content images to the user. An image from each projector is generated based on the content image so that one image can be viewed from the viewpoint of the user without distortion. When the content is reproduced, images assigned to the respective projectors 13L and 13R are supplied from the image processing apparatus 21.
As shown in fig. 2, the image pickup device 16L is disposed in the vicinity of the projection unit of the projector 13L and the edge portion of the dome-shaped screen 11. Further, the imaging device 16R is disposed in the vicinity of the projection unit of the projector 13R and the edge portion of the dome-shaped screen 11.
The imaging device 16L is used to photograph the projection surface 11A on which the images have been projected from the projector 13L and the projector 13R.
For example, a preview image is generated in the image processing apparatus 21 based on an image captured by the image capturing apparatus 16L to be displayed on a display connected to the image processing apparatus 21. The preview image is an image displayed when the position and inclination of the imaging device 16L are adjusted, and the preview image represents the state of the projection surface 11A.
The imaging device 16R is used to photograph the projection surface 11A on which the images have been projected from the projector 13L and the projector 13R.
For example, a preview image is generated in the image processing apparatus 21 based on an image captured by the image capturing apparatus 16R to be displayed on a display connected to the image processing apparatus 21. The preview image is an image displayed when the position and inclination of the imaging device 16R are adjusted, and the preview image represents the state of the projection surface 11A.
The user adjusts the positions and inclinations of the image pickup devices 16L and 16R while viewing the preview images at a predetermined timing (for example, when installing the multi-projection system 1).
The image captured by the image pickup device 16L after its position and inclination have been adjusted is used to generate parameters for, for example, geometric correction of the images to be projected from the projector 13L. Similarly, the image captured by the adjusted image pickup device 16R is used to generate parameters for, for example, geometric correction of the images to be projected from the projector 13R.
Now, without the need to distinguish the projector 13L from the projector 13R, the projector 13L and the projector 13R are appropriately collectively referred to as "projector 13". In addition, in the case where there is no need to distinguish the imaging device 16L from the imaging device 16R, the imaging device 16L and the imaging device 16R are collectively referred to as "imaging device 16".
Returning to the description of fig. 1, the surround speaker 14 and the woofer 15 disposed below the dome-shaped screen 11 output the sound of the content reproduced by the image processing apparatus 21.
The image processing device 21 reproduces the content and generates a projection image projected from the projector 13 based on each frame of the moving image of the content. The image processing device 21 outputs the projection image to the projector 13, and controls the projector 13 to project the projection image on the projection surface 11A.
Further, the image processing apparatus 21 outputs sound data obtained by reproducing the content to the surround speakers 14 and the woofer 15, and controls the surround speakers 14 and the woofer 15 to output the sound of the content.
The image processing apparatus 21 is, for example, a PC. The image processing apparatus 21 need not be a single PC; it may include a plurality of PCs. Further, the image processing apparatus 21 need not be provided in the vicinity of the dome-shaped screen 11 as shown in fig. 1; it may be provided in a room different from the one in which the dome-shaped screen 11 is installed.
Note that two projectors are provided in the example of fig. 1, but one projector may be provided or three or more projectors may be provided. The number of projectors provided in the multi-projection system 1 is optional.
Fig. 3 is a diagram showing an example of the viewpoint position.
A user sitting on a chair in front of the dome-shaped screen 11 looks slightly upward at the image projected on the projection surface 11A, as indicated by the dotted line; the viewpoint position is a position P1 near the center of the sphere of which the projection surface 11A forms part. The position of the innermost portion of the projection surface 11A (the dome zenith), indicated by the dotted arrow in fig. 3, is the center position of the projection surface 11A.
Fig. 4 is a diagram showing a projection state.
As illustrated in fig. 4, when the user looks up at the projected image from the viewpoint position P1, the user's field of view is almost entirely covered by the image projected on the projection surface 11A. Since nearly the entire field of view is covered by the image, the user has the impression of being surrounded by the image, and can feel a greater sense of realism and immersion in the content.
For example, moving image content including an entire celestial sphere image is provided. Other moving image content such as movies, television programs, and games, and still image content such as landscape photographs may be provided.
< arrangement example of projector and image pickup apparatus >
In this way, the multi-projection system 1 can project a high-resolution, wide-angle image using a plurality of projectors, each of which includes a fisheye lens in its projection unit. The projector 13L and the projector 13R are, for example, projectors capable of projecting a 4K image (3840 × 2160 pixels).
Fig. 5 is a diagram showing an example of how the projector 13L and the image pickup device 16L are mounted.
In the example of fig. 5, the positional relationship among the dome-shaped screen 11, the projector 13L, and the image pickup device 16L when the multi-projection system 1 is viewed from the left side is schematically shown.
As described above, the dome-shaped screen 11 is installed tilted downward at a predetermined angle; in fig. 5, however, the dome-shaped screen 11 is shown horizontal for ease of illustration. An axis A connecting the sphere center (position P1) of the spherical projection surface 11A to the zenith of the projection surface 11A is shown as a horizontal axis. The axis is shown similarly in fig. 7.
In the case where the multi-projection system 1 is viewed from the left side, as shown in fig. 5, the projector 13L has an upper surface 13L-1 facing the user and a rear surface 13L-2 visible on the drawing sheet. The projector 13L has a substantially flat, box-like housing.
Specifically, as illustrated in fig. 6, the projector 13L is mounted in a 90-degree vertical position with reference to the direction of the axis A. The projector 13L is a projector that, when its upper surface 13L-1 faces upward as shown in the left part of fig. 6, can project an image having an aspect ratio of, for example, 16:9 in the normal orientation (upper side up, lower side down).
In the case where the projector 13L is installed at a 90-degree vertical position, the horizontal direction of the image projected from the projector 13L corresponds to the vertical direction of fig. 5.
As indicated by the dotted circle, the projection unit of the projector 13L is disposed on the front surface of the housing, on the opposite side from the rear surface 13L-2. In fig. 5, the bottom surface, the left side surface, and the right side surface of the housing face the dome-shaped screen 11, downward, and upward, respectively.
In the case where the multi-projection system 1 is viewed from the left side, the image pickup device 16L similarly has an upper surface 16L-1 facing the user and a rear surface 16L-2 visible on the drawing sheet. In the example of fig. 5, the rear surface 16L-2 of the image pickup device 16L has a vertically long, substantially rounded rectangular shape.
Similar to the projector 13L, the horizontal direction of the image captured by the image pickup device 16L mounted with the upper surface facing the user corresponds to the vertical direction of fig. 5.
Fig. 7 is a diagram showing an example of how the projector 13R and the image pickup device 16R are mounted.
In the example of fig. 7, the positional relationship among the dome-shaped screen 11, the projector 13R, and the image pickup device 16R when the multi-projection system 1 is viewed from the right side is schematically shown.
In the case where the multi-projection system 1 is viewed from the right side, as shown in fig. 7, the projector 13R has an upper surface 13R-1 facing the user and a rear surface 13R-2 visible on the drawing sheet. The projector 13R also has a substantially flat, box-like housing.
Specifically, as illustrated in fig. 8, the projector 13R is mounted in a 90-degree vertical position with reference to the direction of the axis A. The projector 13R is a projector that, when its upper surface 13R-1 faces upward as shown in the left part of fig. 8, can project an image having an aspect ratio of, for example, 16:9 in the normal orientation.
In the case where the projector 13R is installed at the 90-degree vertical position, the horizontal direction of the image projected from the projector 13R corresponds to the vertical direction of fig. 7.
As indicated by the dotted circle, the projection unit of the projector 13R is disposed on the front surface of the housing, on the opposite side from the rear surface 13R-2. In fig. 7, the bottom surface, the left side surface, and the right side surface of the housing face the dome-shaped screen 11, upward, and downward, respectively.
In the case where the multi-projection system 1 is viewed from the right side, the image pickup device 16R similarly has an upper surface 16R-1 facing the user and a rear surface 16R-2 visible on the drawing sheet. In the example of fig. 7, the rear surface 16R-2 of the image pickup device 16R has a vertically long, substantially rounded rectangular shape.
Similar to the projector 13R, the horizontal direction of the image captured by the image pickup device 16R mounted with the upper surface facing the user corresponds to the vertical direction of fig. 7.
In this way, the projector 13L and the projector 13R are vertically installed by being rotated by 90 °.
In general, in video display devices such as projectors and televisions, the horizontal resolution is higher than the vertical resolution. The two projectors are installed on the left and right sides of the dome-shaped screen 11 so that the higher-resolution direction, i.e., the horizontal direction, corresponds to the vertical direction of the dome-shaped screen 11; as a result, a high-resolution image can be projected on the entire projection surface 11A.
Further, the image pickup device 16L and the image pickup device 16R are installed at positions where the image pickup devices do not interfere with the user's viewing of the content and the image pickup devices can be permanently installed.
As described above, the imaging device 16L and the imaging device 16R are used to photograph the projection surface 11A to measure the state of the image projected on the projection surface 11A. From the viewpoint of measurement accuracy, the image pickup device 16L and the image pickup device 16R are preferably installed as close as possible to the dome-shaped screen 11. It can be said that the positions of the imaging devices 16L and 16R are positions that satisfy such conditions.
As described with reference to figs. 5 and 6, since the projector 13L is installed vertically by being rotated 90° rightward (clockwise) as viewed from the rear surface, the image pickup device 16L is similarly installed rotated 90° rightward, for example.
Further, as described with reference to figs. 7 and 8, since the projector 13R is installed vertically by being rotated 90° leftward (counterclockwise) as viewed from the rear surface, the image pickup device 16R is similarly installed rotated 90° leftward, for example.
Similarly, regarding the angle of view of the image pickup devices 16L and 16R, the horizontal resolution is higher than the vertical resolution. When the image pickup device 16L is adjusted to the same inclination as the projector 13L and the image pickup device 16R to the same inclination as the projector 13R, a wide range including the images projected from the projectors 13L and 13R falls within the angle of view to be photographed.
< example of angle of view of image pickup apparatus >
Fig. 9 is a diagram illustrating an example of the angle of view of the image pickup device 16L.
Here, the angle of view in the case of capturing a specific landscape is described; in reality, a range including the projection surface 11A is captured by the image pickup device 16L. The same applies to fig. 11.
As shown in the upper part of fig. 9, in the case where the image pickup device 16L is mounted in front of a specific landscape in the normal orientation (with the upper surface 16L-1 facing upward and the front surface facing the subject), a horizontally long rectangular range of the entire subject, surrounded by the frame F1, is photographed.
In fig. 9, a small dashed circle shown at a position near the upper surface 16L-1 on the rear surface 16L-2 indicates a lens provided on the front surface of the image pickup device 16L.
Meanwhile, as shown in the lower part of fig. 9, in the case where the image pickup device 16L is installed on its side by being rotated 90° to the right, a vertically long rectangular range of the entire subject, surrounded by the frame F1', is photographed. In this case, as illustrated in fig. 10, a preview image in which the right side of fig. 9 corresponds to the upper side and the left side of fig. 9 corresponds to the lower side is displayed on the adjustment screen. The adjustment screen is a screen for adjusting the position and inclination of the image pickup device 16.
The user adjusts the position and inclination of the image pickup device 16L while viewing the preview image, but in the case where the preview image is displayed in the orientation shown in fig. 10, the orientation of the preview image is different from the orientation of the object in the real space.
Fig. 11 is a diagram illustrating an example of the angle of view of the image pickup device 16R.
As shown in the upper part of fig. 11, in the case where the image pickup device 16R is mounted in front of a specific landscape in the normal orientation (with the upper surface 16R-1 facing upward and the front surface facing the subject), a horizontally long rectangular range of the entire subject, surrounded by the frame F1, is photographed.
In fig. 11, a small dashed circle shown at a position near the upper surface 16R-1 on the rear surface 16R-2 indicates a lens provided on the front surface of the image pickup device 16R.
Meanwhile, as shown in the lower part of fig. 11, in the case where the image pickup device 16R is installed on its side by being rotated 90° to the left, a vertically long rectangular range of the entire subject, surrounded by the frame F1', is photographed. In this case, as illustrated in fig. 12, a preview image in which the left side of fig. 11 corresponds to the upper side and the right side of fig. 11 corresponds to the lower side is displayed on the adjustment screen.
The user adjusts the position and inclination of the image pickup device 16R while viewing the preview image, but in the case where the preview image is displayed in the orientation shown in fig. 12, the orientation of the preview image is different from the orientation of the object in the real space.
In order to match the orientation of the preview image with the orientation in the real space, in the image processing device 21, processing of rotating the captured image depending on the inclination of the image capturing devices 16L and 16R is performed.
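A minimal sketch of this rotation step (the device labels and the use of NumPy are assumptions for illustration; the patent does not prescribe an implementation): each camera's mounting rotation is mapped to the compensating 90-degree rotation of its captured frame.

```python
import numpy as np

def preview_frame(captured, device):
    # The left camera (16L) is mounted rotated 90 degrees to the right,
    # so its frame is rotated 90 degrees to the right (clockwise, k=-1)
    # for display; the right camera (16R) is mounted rotated 90 degrees
    # to the left, so its frame is rotated counterclockwise (k=1).
    if device == "16L":
        return np.rot90(captured, k=-1)
    if device == "16R":
        return np.rot90(captured, k=1)
    return captured  # camera mounted in the normal orientation
```

Either rotation turns a portrait capture back into a landscape preview whose left and right match the real space, which is what makes the adjustment described with reference to figs. 13 and 14 intuitive.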
Fig. 13 is a diagram showing an example of how to rotate an image captured by the image pickup device 16L.
The image as shown in the left part of fig. 13 is an image captured with the image pickup device 16L horizontally installed as described with reference to fig. 10. In the case where such an image is captured by the image capturing device 16L, processing of rotating the image by 90 ° to the right is performed in the image processing device 21 as indicated by the outlined arrow, and the rotated image is displayed as a preview image. The inclination of the image pickup device 16L is estimated based on, for example, a captured image.
When the horizontally mounted image pickup device 16L is moved rightward in the real space as shown in the lower part of fig. 9, the display range of the preview image in the right part of fig. 13 is switched to the right range, and when the image pickup device 16L is moved leftward, the display range of the preview image is switched to the left range. Even in the case where the image pickup device 16L is horizontally installed, the moving direction of the image pickup device 16L in the real space matches the switching direction of the display range, and therefore, the user can intuitively adjust the position and the inclination of the image pickup device 16L while viewing the preview image.
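The 90° rotation applied to the captured image can be sketched as follows. This is an illustrative sketch only, not the implementation of the image processing device 21; the image is represented as a plain Python list of rows, and the function names are hypothetical.

```python
def rotate_90_right(img):
    """Rotate an image (a list of rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def rotate_90_left(img):
    """Rotate an image (a list of rows) 90 degrees counter-clockwise."""
    return [list(row) for row in zip(*img)][::-1]

# A tiny 2x3 "image": 2 rows, 3 columns.
img = [[1, 2, 3],
       [4, 5, 6]]

# Rotating right turns rows into columns: the result is 3x2,
# and the original bottom-left pixel (4) moves to the top-left.
assert rotate_90_right(img) == [[4, 1], [5, 2], [6, 3]]
# Rotating left undoes rotating right.
assert rotate_90_left(rotate_90_right(img)) == img
```

The preview for the image pickup device 16L would use the clockwise rotation and that for the image pickup device 16R the counter-clockwise one, per the description above.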
Fig. 14 is a diagram showing an example of how to rotate an image captured by the image pickup device 16R.
The image as shown in the left part of fig. 14 is an image captured with the image pickup device 16R horizontally installed as described with reference to fig. 12. In the case where such an image is captured by the imaging device 16R, processing of rotating the image by 90 ° to the left is performed in the image processing device 21 as indicated by an outline arrow, and the rotated image is displayed as a preview image. The inclination of the image pickup device 16R is estimated based on, for example, a captured image.
When the horizontally installed image pickup device 16R is moved rightward in the real space as shown in the lower part of fig. 11, the display range of the preview image in the right part of fig. 14 is switched to the right range, and when the image pickup device 16R is moved leftward, the display range of the preview image is switched to the left range. Even in the case where the image pickup device 16R is installed horizontally, the moving direction of the image pickup device 16R in the real space matches the switching direction of the display range, and therefore, the user can intuitively adjust the position and inclination of the image pickup device 16R while viewing the preview image.
In this way, when the image pickup device 16L or 16R is adjusted, an image in which the moving direction of the image pickup device 16L or 16R matches the switching direction of the display range is displayed as a preview image.
In the case where an image captured by an image pickup device mounted horizontally is displayed as it is as a preview image, the moving direction of the image pickup device 16L or 16R does not match the switching direction of the display range of the preview image, resulting in troublesome adjustment. The present disclosure can prevent such troublesome adjustment.
Hereinafter, the adjustment by which the administrator of the multi-projection system 1 matches the orientation of the image pickup device 16 with the orientation of the projector 13 is referred to as "global adjustment". The process in which the image processing device 21 rotates an image depending on the inclination of the image pickup device 16L or 16R and displays the rotated image as a preview image is referred to as "global adjustment process".
The adjustment of the image pickup devices 16L and 16R includes global adjustment and detailed adjustment described later.
< example of projection image >
Fig. 15 is a diagram showing an example of the projection range of the projector 13.
The circle shown in fig. 15 represents the state of the projection surface 11A when viewed from the front surface. The display shown in fig. 15 is realized by projecting an image having no design or pattern and having a constant luminance on the entire image by the projector 13L and the projector 13R. Since the multi-projection system 1 is in a dark environment, the area corresponding to the edge of the projection surface 11A on the outer side of the edge portion 11B is filled with black.
The projection range of the projector 13L is the range indicated by the diagonal lines of A of fig. 16. As indicated by the broken line, the image projected from the projector 13L is partially projected on the outer side of the arc connecting the position p1 to the position p2 on the edge portion 11B of the dome-shaped screen 11.
In the image projected from the projector 13L, the region projected outside the projection surface 11A is a black region (black light is projected outside the projection surface 11A).
Meanwhile, the projection range of the projector 13R is a range indicated by a diagonal line of B of fig. 16. As indicated by the broken line, the image projected from the projector 13R is partially projected on the outer side of the arc connecting the position p11 to the position p12 on the edge portion 11B of the dome-shaped screen 11.
In the image projected from the projector 13R, the region projected outside the projection surface 11A is also a black region.
Images are projected from the projector 13L and the projector 13R onto these respective ranges of the projection surface 11A. The range in which the range indicated by the diagonal lines of A of fig. 16 and the range indicated by the diagonal lines of B of fig. 16 overlap each other is the overlapping region, in which the image from the projector 13L and the image from the projector 13R overlap each other.
The process of matching the image from the projector 13L with the image from the projector 13R to prevent image blurring in the overlapping area (prevent resolution from being degraded) is geometric correction.
Further, as shown near the center of fig. 15, the brightness of the overlapping region is higher than that of the surrounding area. The process of correcting the brightness of the overlapping region is optical correction.
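Such optical correction is commonly realized with cross-fade (blend) weights that sum to one at every pixel across the overlap, so that the doubled brightness is cancelled. The following is a minimal sketch of linear weights under that assumption; the function name and the linear ramp are illustrative, not the correction actually used by the image processing device 21.

```python
def crossfade_weights(overlap_width):
    """Linear cross-fade weights across an overlap that is
    overlap_width pixels wide (overlap_width >= 2).  wL falls from
    1 to 0 for the left projector while wR rises from 0 to 1, so
    wL + wR == 1 at every pixel."""
    weights = []
    for x in range(overlap_width):
        wR = x / (overlap_width - 1)
        weights.append((1.0 - wR, wR))
    return weights

w = crossfade_weights(5)
assert w[0] == (1.0, 0.0)    # left projector alone at the left edge
assert w[-1] == (0.0, 1.0)   # right projector alone at the right edge
assert all(abs(wL + wR - 1.0) < 1e-9 for wL, wR in w)
```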
< Global adjustment >
As a method of displaying the preview image used in global adjustment, which is the adjustment for matching the orientation of the image pickup device 16 with the orientation of the projector 13, for example, the following methods are given.
(1) A method in which a user manually rotates a preview image by 90 °.
(2) A method of rotating a preview image according to an inclination detected by a sensor built in an image pickup apparatus, for example, an IMU.
(3) A method of rotating the preview image based on a captured image obtained by capturing a pattern image projected from a projector by an imaging device.
Global adjustment of the above-mentioned item (3) using a pattern image is described.
Fig. 17 is a diagram illustrating an example of a pattern image.
The image shown in the left part of fig. 17 is a pattern image projected from the projector 13L at the time of global adjustment. The pattern image projected from the projector 13L is referred to as "pattern image L". The upper left corner, upper right corner, lower right corner, and lower left corner of the pattern image L are appropriately referred to as "point A", "point B", "point C", and "point D", respectively.
The pattern image L is an image in which the area other than the lower left and lower right corners is displayed as gray with constant brightness. A marker MY including yellow pixels is placed at the lower left corner of the pattern image L, and a marker MB including blue pixels is placed at the lower right corner. The markers MY and MB each include a predetermined number of pixels.
An image representing a pattern in which the marker MY is placed at the lower left corner and the marker MB is placed at the lower right corner is the pattern image L. The pattern image L is subjected to geometric correction (for example, geometric correction using default parameters) and is then projected from the projector 13L.
Meanwhile, the image shown in the right part of fig. 17 is a pattern image projected from the projector 13R at the time of global adjustment. The pattern image projected from the projector 13R is referred to as "pattern image R". The upper left corner, upper right corner, lower right corner, and lower left corner of the pattern image R are appropriately referred to as "point AA", "point BB", "point CC", and "point DD", respectively.
The pattern image R is an image in which the area other than the lower left and lower right corners is displayed as gray with constant brightness. A marker MG including green pixels is placed at the lower left corner of the pattern image R, and a marker MR including red pixels is placed at the lower right corner. The markers MG and MR each include a predetermined number of pixels.
An image representing a pattern in which the marker MG is placed at the lower left corner and the marker MR is placed at the lower right corner is the pattern image R. The pattern image R is subjected to geometric correction and is then projected from the projector 13R.
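The construction of such a pattern image can be sketched as follows. This is an illustrative sketch only: the image size, the marker size, and the exact RGB values are assumptions not specified above, and the function name is hypothetical.

```python
# RGB triples; the marker colors follow the description above.
YELLOW, BLUE, GRAY = (255, 255, 0), (0, 0, 255), (128, 128, 128)

def make_pattern_image(width, height, left_color, right_color, marker=8):
    """Build a constant-gray pattern image, as a list of rows of RGB
    tuples, with a colored square of `marker` pixels on a side in the
    lower-left and lower-right corners."""
    img = [[GRAY] * width for _ in range(height)]
    for y in range(height - marker, height):
        for x in range(marker):
            img[y][x] = left_color                   # lower-left marker
        for x in range(width - marker, width):
            img[y][x] = right_color                  # lower-right marker
    return img

# Pattern image L: yellow (MY) at the lower left, blue (MB) at the lower right.
pat_L = make_pattern_image(64, 48, YELLOW, BLUE)
assert pat_L[0][0] == GRAY        # area away from the corners stays gray
assert pat_L[47][0] == YELLOW     # MY at the lower-left corner
assert pat_L[47][63] == BLUE      # MB at the lower-right corner
```

Pattern image R would be produced the same way with green and red markers.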
Fig. 18 is a diagram showing an example of a captured image.
The image shown in the left part of fig. 18 is a captured image obtained by capturing the state of the pattern image L projected from the projector 13L and the pattern image R projected from the projector 13R by the imaging device 16L. A captured image obtained by capturing the pattern image L and the pattern image R by the imaging device 16L is referred to as a "captured image L".
The distorted pattern image L and pattern image R projected on the projection surface 11A appear in the captured image L. The region A1, which appears at a higher brightness than its surroundings, is the overlapping region.
A boundary line L11 appearing as a gentle curve in the lower part of the captured image L corresponds to the lower side connecting the point C of the pattern image L to the point D. A boundary line L12 appearing as a gentle curve in the left part of the captured image L corresponds to the left side connecting the point A of the pattern image L to the point D. A boundary line L13 appearing as a gentle curve in the right part of the captured image L corresponds to the right side connecting the point B of the pattern image L to the point C.
The marker MY appears at the intersection between the boundary line L11 and the boundary line L12. Further, the marker MB appears at the intersection between the boundary line L11 and the boundary line L13.
The arc-shaped boundary line L21 appearing near the center of the captured image L corresponds to the lower side connecting the point CC of the pattern image R to the point DD. A boundary line L22 appearing as a gentle curve at the upper left of the captured image L corresponds to the right side connecting the point BB of the pattern image R to the point CC. The parabolic boundary line L23 appearing at the upper right of the captured image L corresponds to the left side connecting the point AA of the pattern image R to the point DD.
The marker MR appears at the intersection between the boundary line L21 and the boundary line L22. Further, the marker MG appears at the intersection between the boundary line L21 and the boundary line L23.
Meanwhile, an image shown in the right part of fig. 18 is a captured image obtained by capturing the state of the pattern image L projected from the projector 13L and the pattern image R projected from the projector 13R by the imaging device 16R. A captured image obtained by capturing the pattern image L and the pattern image R by the image capturing device 16R is referred to as a "captured image R".
The distorted pattern image L and pattern image R projected on the projection surface 11A appear in the captured image R. The region A2, which appears at a higher brightness than its surroundings, is the overlapping region.
The substantially straight boundary line L31 appearing in the lower part of the captured image R corresponds to the lower side connecting the point CC of the pattern image R to the point DD. A boundary line L32 appearing as a gentle curve in the left part of the captured image R corresponds to the left side connecting the point AA of the pattern image R to the point DD. A boundary line L33 appearing as a gentle curve in the right part of the captured image R corresponds to the right side connecting the point BB of the pattern image R to the point CC.
The marker MG appears at the intersection between the boundary line L31 and the boundary line L32. Further, the marker MR appears at the intersection between the boundary line L31 and the boundary line L33.
The arc-shaped boundary line L41 appearing near the center of the captured image R corresponds to the lower side connecting the point C of the pattern image L to the point D. The parabolic boundary line L42 appearing at the upper left of the captured image R corresponds to the right side connecting the point B of the pattern image L to the point C. A boundary line L43 appearing as a gentle curve at the upper right of the captured image R corresponds to the left side connecting the point A of the pattern image L to the point D.
The marker MB appears at the intersection between the boundary line L41 and the boundary line L42. Further, the marker MY appears at the intersection between the boundary line L41 and the boundary line L43.
In the image processing device 21, the markers MY, MB, MG, and MR are detected by analyzing the captured image L. The inclination of the image pickup device 16L is estimated based on the positions of the markers MY, MB, MG, and MR in the captured image L, and the captured image L is rotated depending on the estimated inclination to be displayed as a preview image.
Similarly, the markers MG, MR, MY, and MB are detected by analyzing the captured image R. The inclination of the image pickup device 16R is estimated based on the positions of the markers MG, MR, MY, and MB in the captured image R, and the captured image R is rotated depending on the estimated inclination to be displayed as a preview image.
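One way to turn detected marker positions into a rotation amount is to look at the direction from MY to MB, which points to the right along the bottom edge when the camera is upright. The sketch below illustrates the idea under simplifying assumptions (only two markers considered, image coordinates with x growing rightward and y growing downward, rotations restricted to multiples of 90°); it is a hypothetical function, not the estimation algorithm of the image processing device 21.

```python
def required_rotation(my_pos, mb_pos):
    """Return the clockwise rotation (0, 90, 180, or 270 degrees)
    that makes the MY -> MB direction point to the right, i.e. makes
    the preview image upright.  my_pos / mb_pos are (x, y) marker
    centroids; x grows right and y grows down."""
    dx = mb_pos[0] - my_pos[0]
    dy = mb_pos[1] - my_pos[1]
    if abs(dx) >= abs(dy):
        return 0 if dx > 0 else 180   # already horizontal (or upside down)
    if dy < 0:
        return 90    # MY -> MB points up: image turned left, rotate right
    return 270       # MY -> MB points down: image turned right, rotate left

# Upright capture: MY is left of MB, no rotation needed.
assert required_rotation((10, 90), (80, 90)) == 0
# Camera rotated 90 degrees to the right: MY appears below MB.
assert required_rotation((90, 80), (90, 10)) == 90
```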
Fig. 19 is a diagram showing an example of a preview image.
As shown in the left part of fig. 19, an image obtained by rotating the captured image L of fig. 18 by 90 ° to the right is displayed as a preview image of the captured image L.
Further, as shown in the right part of fig. 19, an image obtained by rotating the photographed image R of fig. 18 by 90 ° to the left is displayed as a preview image of the photographed image R.
Each preview image is displayed at a predetermined position in an adjustment screen displayed on a display connected to the image processing apparatus 21.
The administrator of the multi-projection system 1 matches the orientation of the image pickup device 16 with the orientation of the projector 13 while viewing the preview images shown in fig. 19. The administrator can determine that the image pickup device 16 faces the same orientation as the projector 13 when the four markers appear in each preview image in the arrangement shown in fig. 19, for example.
Fig. 20 is a front view showing a state in which a pattern image has been projected.
In the case where the state of the projection surface 11A, on which the pattern image L has been projected from the projector 13L and the pattern image R has been projected from the projector 13R, is photographed from the front of the projection surface 11A, an image as shown in the left part of fig. 20 is obtained. The vertical direction of fig. 20 corresponds to the vertical direction of the dome-shaped screen 11, and the horizontal direction of fig. 20 corresponds to the horizontal direction of the dome-shaped screen 11.
For example, the arc-shaped boundary line L61 appearing on the left side of the center of the projection surface 11A corresponds to the boundary line L11 appearing on the captured image L and the boundary line L41 appearing on the captured image R of fig. 18. Further, the arc-shaped boundary line L62 appearing on the right side of the center corresponds to the boundary line L21 appearing on the captured image L and the boundary line L31 appearing on the captured image R of fig. 18.
The right part of fig. 20 is an enlargement of the range of the frame F11. The marker MY appears at the upper left, and the marker MB appears at the lower left. The marker MG appears at the lower right, and the marker MR appears at the upper right.
Note that the vicinities of the upper left and upper right corners of the pattern image L and of the pattern image R are projected outside the projection surface 11A as described with reference to fig. 16. Of the light of each pattern image, the light of each marker is irradiated on a predetermined position on the projection surface 11A, while the light of the upper left and upper right corners is irradiated on positions outside the projection surface 11A.
Fig. 21 is another front view showing a state in which a pattern image has been projected.
For example, in the case where the pattern images are projected from the projectors 13L and 13R mounted upside down, as shown in fig. 21, the marks placed on the pattern images are not projected on the projection surface 11A (projected outside the projection surface 11A).
The fact that no markers appear in the images captured by the image pickup devices 16L and 16R therefore means that the projectors 13L and 13R are mounted upside down.
In the case where no mark appears in the image captured by the image pickup device 16, information indicating that the projector 13 is mounted upside down may be displayed to notify the administrator.
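The decision described above reduces to a simple rule; the sketch below (hypothetical function name and message text) only illustrates it.

```python
def check_projector_orientation(markers_found):
    """markers_found: the number of pattern-image markers detected in
    the captured image.  Zero markers means they were projected outside
    the projection surface, which, per the setup described above,
    indicates an upside-down projector."""
    if markers_found == 0:
        return "warning: projector may be mounted upside down"
    return "ok"

assert check_projector_orientation(0).startswith("warning")
assert check_projector_orientation(4) == "ok"
```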
With such global adjustment, the orientation of the image pickup device 16 is adjusted to match the orientation of the projector 13 so that the upper surfaces of the two devices face the user.
< detailed adjustment >
In order to prevent image blurring in the overlapping region, geometric correction is required to match the image from projector 13L with the image from projector 13R.
In order to perform geometric correction with high accuracy, it is necessary to adjust the positions and inclinations of the image pickup devices 16L and 16R to positions and inclinations at which the image pickup devices can photograph the entire overlapping region.
Further, in order to detect the edge portion 11B and thereby reduce the light irradiated outside the dome-shaped screen 11 (the light projected outside the projection surface 11A), it is necessary to adjust the positions and inclinations of the image pickup devices 16L and 16R to positions and inclinations at which the image pickup devices can photograph the entire edge portion 11B of the dome-shaped screen 11.
As described above, in the image processing device 21, the region of the image projected from the projector 13 that falls outside the projection surface 11A is set as a black region. By accurately detecting the entire edge portion 11B and correctly setting the region projected outside the projection surface 11A, the light irradiated outside the dome-shaped screen 11 can be reduced.
Fig. 22 is a diagram showing an example of a preview image.
In the example of fig. 22, a preview image is schematically shown. As shown in the left part of fig. 22, when the overlap area A11 indicated by diagonal lines and the edge portion 11B indicated by the thick line L101, both included in the projection range of the projector 13L, appear in the preview image of the image pickup device 16L, it can be said that the image pickup device 16L has been adjusted to an appropriate position and inclination. The arc indicated by the thick line L101 corresponds to the arc connecting the position p1 to the position p2 described with reference to A of fig. 16.
Further, as shown in the right part of fig. 22, when the overlap area A12 indicated by diagonal lines and the edge portion 11B indicated by the thick line L102, both included in the projection range of the projector 13R, appear in the preview image of the image pickup device 16R, it can be said that the image pickup device 16R has been adjusted to an appropriate position and inclination. The arc indicated by the thick line L102 corresponds to the arc connecting the position p11 to the position p12 described with reference to B of fig. 16.
When the edge portion 11B corresponding to the arc connecting the position p1 to the position p2 of A of fig. 16 appears in the preview image of the image pickup device 16L, and the edge portion 11B corresponding to the arc connecting the position p11 to the position p12 of B of fig. 16 appears in the preview image of the image pickup device 16R, the entire edge portion 11B has been photographed.
Whether the overlapping region appears and whether the edge portion 11B of the dome-shaped screen 11 appears in its entirety are the criteria for determining whether the image pickup devices 16L and 16R have been adjusted to appropriate positions and inclinations.
The adjustment by which the administrator of the multi-projection system 1 adjusts the positions and inclinations of the image pickup devices 16L and 16R while viewing the preview image in this manner is referred to as "detailed adjustment". The process in which the image processing device 21 performs analysis on the images captured by the image capturing devices 16L and 16R to determine whether the image capturing devices have been adjusted to the appropriate positions and inclinations is referred to as "detailed adjustment process".
In the case where the image pickup devices have not been adjusted to appropriate positions and inclinations, information for guiding the moving directions of the image pickup devices 16L and 16R is displayed, for example, by being superimposed on the preview image. The administrator of the multi-projection system 1 can adjust the respective positions and inclinations of the image pickup devices 16L and 16R by moving them in accordance with the displayed information.
Fig. 23 is a diagram showing an example of a preview image.
In the preview image of the image pickup device 16L shown in the left part of fig. 23, the edge portion 11B is cut off. In the case where the preview image shown in the left part of fig. 23 is displayed, in the image processing apparatus 21, it is determined that the image pickup apparatus 16L is at an inappropriate position and inclination.
In such a case, information for guiding the moving direction of the imaging device 16L as shown in the left part of fig. 24 is displayed. In the example of fig. 24, an arrow image is displayed as information for guiding the moving direction.
The administrator of the multi-projection system 1 can adjust the image pickup device 16L to an appropriate position and inclination by moving it to the right following such guidance.
Further, in the preview image of the image pickup device 16R shown in the right part of fig. 23, the overlapping area is cut off. In the case where the preview image shown in the right part of fig. 23 is displayed, in the image processing device 21, it is determined that the image pickup device 16R is at an inappropriate position and inclination.
In such a case, information for guiding the moving direction of the image pickup device 16R as shown in the right part of fig. 24 is displayed. The administrator of the multi-projection system 1 can move the image pickup device 16R downward by following such guidance to adjust the image pickup device 16R to an appropriate position and inclination.
The moving direction of the image pickup devices 16L and 16R is detected based on, for example, the positions of the markers of the pattern images.
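As a rough illustration of deriving a guidance direction from marker positions, one could compare the centroid of the detected markers with the borders of the captured image: markers crowded against one border suggest that the region of interest is cut off on that side. Everything below is a hypothetical sketch (function name, margin value, and the mapping from centroid position to guidance text are all assumptions), not the detection method of the image processing device 21.

```python
def guidance_direction(marker_xs, marker_ys, width, height, margin=0.1):
    """Return a guidance string ('move left', 'move right', 'move up',
    'move down', or 'ok') from the marker centroid position within a
    width x height captured image.  A centroid within `margin` of a
    border is taken to mean the view is cut off on that side."""
    cx = sum(marker_xs) / len(marker_xs)
    cy = sum(marker_ys) / len(marker_ys)
    if cx < width * margin:
        return "move left"
    if cx > width * (1 - margin):
        return "move right"
    if cy < height * margin:
        return "move up"
    if cy > height * (1 - margin):
        return "move down"
    return "ok"

# Markers pressed against the left border of a 100x100 capture.
assert guidance_direction([5, 8], [50, 60], 100, 100) == "move left"
# Markers near the bottom border.
assert guidance_direction([50, 60], [96, 98], 100, 100) == "move down"
```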
Fig. 25 is a diagram showing an example of guidance information.
In the case where it has been determined that the image pickup devices 16L and 16R have been adjusted to the appropriate positions and inclinations, as shown in fig. 25, information for notifying that the image pickup devices 16L and 16R are at the appropriate positions and inclinations is displayed. In the example of fig. 25, as information for notifying that the image pickup devices 16L and 16R are in the appropriate positions and inclinations, the term "OK" is displayed in the upper left of each preview image.
Since the guidance information is displayed, even an administrator of the multi-projection system 1 who is not accustomed to the adjustment can adjust the image pickup devices 16L and 16R to appropriate positions and inclinations. Even a person who does not know the above-mentioned criteria can adjust the image pickup devices 16L and 16R, and therefore, the operability of the multi-projection system 1 can be improved.
Since the moving direction of the image pickup device 16 matches the switching direction of the display range of the preview image, the administrator of the multi-projection system 1 can intuitively adjust the positions and inclinations of the image pickup devices 16L and 16R.
< arrangement of image processing apparatus >
Fig. 26 is a block diagram showing an example of the hardware configuration of the image processing apparatus 21.
A CPU (central processing unit) 101, a ROM (read only memory) 102, and a RAM (random access memory) 103 are connected to each other by a bus 104.
An input/output expansion bus 105 is also connected to the bus 104. A GPU (graphics processing unit) 106, a UI (user interface) I/F109, a communication I/F112, and a recording I/F113 are connected to the input/output expansion bus 105.
The GPU 106 renders the projection images projected from the projectors 13L and 13R using the VRAM 107. For example, the GPU 106 generates projection images projected from the respective projectors 13L and 13R, and outputs the projection images to the display I/F108.
The display I/F108 is a projection image output interface. The display I/F108 is configured as an interface conforming to a predetermined standard such as HDMI (registered trademark) (high definition multimedia interface). The display I/F108 outputs the projection image supplied from the GPU 106 to the projector 13L and the projector 13R, and controls the projector 13L and the projector 13R to project the projection image.
A display such as an LCD or an organic EL display is also connected to the display I/F108. The display I/F108 controls the display to display an adjustment screen including a preview image.
The UI I/F109 is an operation detection interface. The UI I/F109 detects an operation performed using the keyboard 110 or the mouse 111, and outputs information indicating the content of the operation to the CPU 101. For example, the operation is performed by an administrator of the multi-projection system 1 using the keyboard 110 or the mouse 111.
The communication I/F112 is an interface for communicating with an external device. The communication I/F112 is configured by a network interface such as a wireless LAN or a wired LAN. The communication I/F112 communicates with an external apparatus via a network such as the internet, thereby transmitting or receiving various data. The content reproduced in the multi-projection system 1 may be provided from a server via a network.
For example, the communication I/F112 transmits data relating to the sound of the content to the surround speakers 14 and the woofer 15 as appropriate, and receives data relating to images captured by the image capturing devices 16L and 16R and then transmitted from the image capturing devices 16L and 16R. In the case where a sensor or the like configured to detect the motion of the user is provided to the chair, the communication I/F112 also receives sensor data transmitted from the sensor.
The recording I/F113 is a recording medium interface. Recording media such as an HDD 114 and a removable medium 115 are mounted on the recording I/F113. The recording I/F113 reads out data recorded on the mounted recording medium and writes the data to the recording medium. On the HDD 114, various data such as pattern image data and programs executed by the CPU 101 are recorded in addition to contents.
Fig. 27 is a block diagram showing a functional configuration example of the image processing apparatus 21.
As illustrated in fig. 27, in the image processing apparatus 21, an information processing unit 131 is implemented. The information processing unit 131 includes a pattern image generation unit 151, a projection control unit 152, a pattern detection unit 153, a screen edge detection unit 154, and a display control unit 155. At least part of the functional units shown in fig. 27 is realized by the CPU 101 of fig. 26 executing a predetermined program.
The image captured by the image pickup device 16 is supplied to the pattern detection unit 153 and the display control unit 155 at the time of global adjustment, and is supplied to the pattern detection unit 153, the screen edge detection unit 154, and the display control unit 155 at the time of detailed adjustment. In the captured image supplied to each unit, a pattern image projected on the projection surface 11A appears.
The pattern image generation unit 151 generates a pattern image at the time of global adjustment. The pattern image generation unit 151 also generates a pattern image at the time of detailed adjustment. The pattern image generated by the pattern image generation unit 151 is supplied to the projection control unit 152.
At the time of global adjustment, the projection control unit 152 performs geometric correction on the pattern image generated by the pattern image generation unit 151 using, for example, default parameters, thereby generating a projection image for the projector 13L and a projection image for the projector 13R. The parameters used in the geometric correction are used as information for associating the pixels of the pattern image with the pixels on the projection surface 11A.
The projection control unit 152 controls the display I/F108 to output each projection image to the projector 13, and controls the projector 13 to project the projection image. The pattern image may also be projected upon detailed adjustment.
The pattern detection unit 153 analyzes the captured image at the time of global adjustment to detect the markers placed at the lower left and lower right corners of the pattern image. The pattern detection unit 153 estimates the inclination of the image pickup device 16 based on the position of each mark on the captured image, and outputs information indicating the estimated inclination to the display control unit 155. The pattern detection unit 153 functions as an estimation unit configured to estimate the inclination of the image pickup device 16 based on the position of each mark on the captured image.
Further, the pattern detection unit 153 analyzes the captured image at the time of detailed adjustment to detect the marks placed at the lower left and lower right corners of the pattern image. The pattern detection unit 153 outputs information indicating the position of each mark on the captured image to the screen edge detection unit 154.
The screen edge detection unit 154 performs edge detection based on the brightness of each pixel of the captured image at the time of detailed adjustment to detect an overlapping area. In the edge detection, for example, the position of the mark detected by the pattern detection unit 153 is used as a start point.
Further, the screen edge detection unit 154 performs edge detection based on the luminance of each pixel at the time of detailed adjustment to detect the edge portion 11B of the dome-shaped screen 11. The area on the outer side of the edge portion 11B appears as a dark area in the captured image as described above. For example, a position where the luminance suddenly drops is detected as the position of the edge portion 11B. In the case where the overlapping region has been detected, the edge portion 11B is detected using the boundary line of the overlapping region as a start point.
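The luminance-drop criterion can be sketched as a one-dimensional scan along a row of pixels. This is an illustrative sketch only; the threshold value and function name are assumptions, and the actual detection would scan outward from the overlap boundary in two dimensions.

```python
def find_screen_edge(luma_row, drop=60):
    """Scan a row of luminance values from left to right, starting
    from a bright on-screen pixel, and return the index where the
    brightness first falls sharply -- taken here as the screen edge.
    Returns None if no sharp drop is found."""
    for i in range(1, len(luma_row)):
        if luma_row[i - 1] - luma_row[i] >= drop:
            return i
    return None

# Bright projection surface up to index 5, dark surroundings after it.
row = [200, 198, 201, 199, 197, 30, 25, 22]
assert find_screen_edge(row) == 5
assert find_screen_edge([200] * 8) is None   # no edge in a uniform row
```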
In a case where the overlapped area and the entire edge portion 11B appear in the captured image, the screen edge detection unit 154 outputs information indicating this fact to the display control unit 155. Further, in a case where one of the overlapping area and the edge portion 11B does not appear in the captured image, the screen edge detection unit 154 outputs information indicating the moving direction of the image pickup device 16 to the display control unit 155.
The display control unit 155 controls display on an adjustment screen displayed by the display 161. The display control unit 155 rotates the captured image supplied from the image pickup device 16 depending on the inclination of the image pickup device 16 estimated by the pattern detection unit 153 at the time of the global adjustment, and performs control to display the rotated captured image on the adjustment screen as a preview image.
Further, similar to the global adjustment, the display control unit 155 rotates the captured image supplied from the image pickup device 16 at the time of the detailed adjustment, and controls the display 161 to display the rotated captured image as a preview image.
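The rotation applied to the preview can be sketched at the level of a single pixel coordinate; the function name is an assumption, and a real implementation would resample the whole bitmap (for example, with an image-processing library) rather than map individual points:

```python
import math

def rotate_point(x, y, cx, cy, angle_deg):
    # Rotate one pixel coordinate (x, y) about the image centre (cx, cy).
    # Applying this with the negated estimated inclination to every pixel
    # of the captured image levels the preview shown on the adjustment
    # screen.
    a = math.radians(angle_deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(a) - dy * math.sin(a),
            cy + dx * math.sin(a) + dy * math.cos(a))
```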
At the time of detailed adjustment, in a case where the screen edge detection unit 154 has detected that the overlapping area and the entire edge portion 11B appear in the captured image, the display control unit 155 performs control to display, on the adjustment screen, information indicating that the image pickup device 16 is in an appropriate position and inclination. Further, at the time of detailed adjustment, in a case where the screen edge detection unit 154 has detected that either the overlapping area or the entire edge portion 11B does not appear in the captured image, the display control unit 155 performs control to display, on the adjustment screen, information for guiding the moving direction of the image pickup device 16 based on the information supplied from the screen edge detection unit 154.
< operation of image processing apparatus >
Here, with reference to the flowchart of fig. 28, the adjustment processing performed by the image processing apparatus 21 having the configuration as described above is described.
In step S1, the information processing unit 131 executes the global adjustment processing. The administrator of the multi-projection system 1 matches the inclination of the image pickup device 16 with the inclination of the projector 13 while viewing the preview image displayed as a result of the global adjustment processing. Details of the global adjustment processing are described later with reference to the flowchart of fig. 29.
In step S2, the information processing unit 131 executes detailed adjustment processing. The administrator of the multi-projection system 1 adjusts the image pickup device 16 to an appropriate position and inclination while viewing the preview image displayed as a result of the detailed adjustment processing. Details of the detailed adjustment processing are described later with reference to the flowchart of fig. 30.
Next, with reference to the flowchart of fig. 29, the global adjustment processing executed in step S1 of fig. 28 is described.
In step S11, the pattern image generation unit 151 generates a pattern image.
In step S12, the projection control unit 152 generates the projection image of the projector 13L and the projection image of the projector 13R based on the pattern image generated by the pattern image generation unit 151, and controls the projector 13L and the projector 13R to project the respective projection images.
In step S13, the image pickup device 16 photographs the projection surface 11A on which the pattern images have been projected from the projector 13L and the projector 13R.
In step S14, the pattern detection unit 153 analyzes the captured image to detect markers placed at the lower left and lower right corners of the pattern image, thereby estimating the inclination of the image pickup device 16.
In step S15, the display control unit 155 rotates the captured image as necessary depending on the inclination of the image pickup device 16 estimated by the pattern detection unit 153, and performs control to display the rotated captured image on the adjustment screen as a preview image.
After the preview image has been displayed, the process returns to step S1 of fig. 28, and the subsequent process is executed.
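The S11 to S15 sequence above can be sketched as one pass of the global adjustment; all callback names are assumptions standing in for the units described in the text:

```python
def global_adjustment_pass(project_pattern, capture, detect_markers,
                           estimate_roll, show_preview):
    # S11/S12: generate the pattern images and project them from the
    # projectors 13L and 13R.
    project_pattern()
    # S13: photograph the projection surface with the image pickup device.
    image = capture()
    # S14: detect the lower-left and lower-right markers and estimate the
    # inclination of the image pickup device from their positions.
    left, right = detect_markers(image)
    roll = estimate_roll(left, right)
    # S15: display the captured image rotated to compensate for the roll.
    show_preview(image, -roll)
    return roll
```

The administrator would repeat such passes while physically matching the inclination of the image pickup device to that of the projector.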
Next, with reference to the flowchart of fig. 30, the detailed adjustment processing executed in step S2 of fig. 28 is described.
In step S21, the pattern image generation unit 151 generates a pattern image.
In step S22, the projection control unit 152 generates the projection image of the projector 13L and the projection image of the projector 13R based on the pattern image generated by the pattern image generation unit 151, and controls the projector 13L and the projector 13R to project the respective projection images.
In step S23, the image pickup device 16 photographs the projection surface 11A on which the pattern images have been projected from the projector 13L and the projector 13R.
In step S24, the pattern detection unit 153 analyzes the captured image to detect markers placed at the lower left and lower right corners of the pattern image.
In step S25, the screen edge detection unit 154 detects the edge of the captured image to detect the overlapping area and the edge of the screen, i.e., the edge portion 11B of the dome-shaped screen 11.
Here, similarly to the global adjustment, a process of estimating the inclination of the image pickup device 16 based on the position of the marker and appropriately rotating the captured image depending on the inclination of the image pickup device 16 is performed. On the adjustment screen, preview images are continuously displayed.
In step S26, the screen edge detecting unit 154 determines whether the overlap area and the edge portion 11B of the dome-shaped screen 11 have been successfully detected.
In a case where it is determined in step S26 that the overlap area and the edge portion 11B of the dome-shaped screen 11 have not been successfully detected, in step S27, the display control unit 155 performs control to display information for guiding the moving direction of the image pickup device 16 on the adjustment screen based on the information supplied from the screen edge detection unit 154. After that, the process returns to step S23, and the above-described process is repeated.
Meanwhile, in a case where it is determined in step S26 that both the overlap area and the edge portion 11B of the dome-shaped screen 11 have been successfully detected, the process proceeds to step S28.
In step S28, the display control unit 155 performs control to display information indicating that the image pickup device 16 is in an appropriate position and inclination on the adjustment screen. After that, the process returns to step S2 of fig. 28, and the subsequent process is executed.
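The detailed adjustment flow of steps S23 to S28 can be sketched as a loop; the callback names and the `(ok, direction)` return shape are assumptions made for illustration:

```python
def detailed_adjustment_loop(capture, detect_markers, detect_overlap_and_edge,
                             guide_operator, confirm_pose):
    # Repeat S23-S27 until both the overlapping area and the entire edge
    # portion of the screen appear in the captured image, then run S28.
    while True:
        image = capture()                                        # S23
        markers = detect_markers(image)                          # S24
        ok, direction = detect_overlap_and_edge(image, markers)  # S25/S26
        if ok:
            confirm_pose()       # S28: report appropriate position/tilt
            return
        guide_operator(direction)  # S27: guide the moving direction
```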
After the position and inclination of the image pickup device 16 have been adjusted, a predetermined image is projected from the projector 13, and the state of the projection surface 11A on which the image has been projected is photographed by the image pickup device 16. Based on the image captured by the image pickup device 16, parameters used in geometric correction when projecting content images are calculated.
Such parameter calculation is performed periodically at a predetermined cycle. The parameter calculation is also performed at predetermined timings (for example, when the position of the projector 13 has been moved).
As described above, the plurality of image pickup devices can be permanently installed at locations different from the position from which the user views, that is, locations where the image pickup devices do not interfere with the user. Further, the parameters for geometric correction can be calculated at any timing based on the images captured by the permanently installed image pickup devices, without adjusting the image pickup devices each time.
Since the captured image rotated depending on the inclination of the image pickup device is displayed as the preview image, the administrator of the multi-projection system 1 can easily adjust the image pickup device.
Further, since information for guiding the moving direction of the image pickup device is displayed, even a person unfamiliar with the adjustment can adjust the image pickup device. Even in a case where the image pickup device has been moved due to interference, it is not necessary for a person experienced in adjusting the image pickup device to perform the adjustment every time, and operability can therefore be improved.
< modified example >
Images each having a pattern using markers of a predetermined color are used as the pattern images, but images having patterns using predetermined shapes such as circles, squares, triangles, and crosses may be used instead.
The pattern image is projected from the projector 13, but the above-mentioned processing may be performed based on an image obtained by photographing the mark provided on the projection surface 11A.
In the detailed adjustment, the administrator manually adjusts the image pickup device, but the image pickup device 16 may be automatically adjusted to the appropriate position and inclination detected based on the pattern image. In such a case, a driving unit configured to adjust the position and inclination of the image pickup device 16L and a driving unit configured to adjust the position and inclination of the image pickup device 16R are respectively provided.
The series of processing procedures mentioned above may be performed by hardware or software. In the case where a series of processing procedures are executed by software, a program configuring the software is installed, for example, from a program storage medium on a computer configuring the image processing apparatus 21 of fig. 26.
For example, the program executed by the CPU 101 is provided by being recorded on the removable medium 115, or the program executed by the CPU 101 is provided via a wired or wireless transmission medium (e.g., a local area network, the internet, or digital broadcasting), and is installed on the HDD 114.
With respect to the computer-executed program, the processing procedures of the program may be executed in chronological order in the order described herein or in parallel. Alternatively, the processing of the program may be executed at an appropriate timing, for example, when the program is called.
Note that in this document, "system" refers to a collection of multiple components (devices, modules (components), etc.), and it does not matter whether all of the components are in the same enclosure. Therefore, a plurality of devices accommodated in separate cabinets and connected to each other via a network and one device including a plurality of modules accommodated in one cabinet are both a "system".
The effects described herein are merely exemplary and not limiting, and other effects may be provided.
The embodiments of the present technology are not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present technology.
For example, the present technology may employ a configuration of cloud computing in which a plurality of apparatuses share one function via a network to cooperatively process the function.
Further, each step described in the above-mentioned flowcharts may be executed by being shared among a plurality of apparatuses and being executed by one apparatus.
Further, in the case where a plurality of processes are included in one step, the plurality of processes included in one step may be executed by being shared among a plurality of apparatuses and being executed by one apparatus.
The present technology can also adopt the following configuration.
(1) An image processing apparatus comprising:
an estimation unit configured to estimate an inclination of an image pickup device installed at a predetermined inclination based on a captured image captured by the image pickup device, the image pickup device being configured to capture a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector; and
a display control unit configured to perform control to display the captured image rotated depending on an inclination of the image pickup device.
(2) The image processing apparatus according to item (1), wherein the display control unit rotates the captured image such that a moving direction in which the image capturing apparatus moves in a real space matches a switching direction of a display range of the captured image.
(3) The image processing apparatus according to item (1) or (2), wherein the screen includes a dome-shaped screen.
(4) The image processing apparatus according to item (3), wherein the image pickup device is mounted on each of the left and right sides of the screen.
(5) The image processing apparatus according to item (4), further comprising:
a generating unit configured to generate the pattern image having predetermined marks placed at lower left and lower right corners of the pattern image; and
a projection control unit configured to control each of the plurality of projectors to project the pattern image.
(6) The image processing apparatus according to item (5), wherein two of the projectors are mounted on the left and right sides of the screen at a predetermined inclination.
(7) The image processing apparatus according to item (6), wherein the two projectors are each mounted such that a horizontal side of the pattern image is projected in an arc shape on the projection surface.
(8) The image processing apparatus according to any one of the items (5) to (7), wherein the light of the pattern image projected from the projector includes light of positions on the projection surface illuminated by the markers and light of positions outside the projection surface illuminated by upper left and right corners of the pattern image.
(9) The image processing apparatus according to any one of the items (6) to (8), further comprising:
a detection unit configured to detect an overlap region included in a pattern image projected from each of the two projectors on the projection surface and an edge portion of the screen based on each of captured images captured by two of the image capturing devices.
(10) The image processing apparatus according to item (9), wherein the display control unit performs control to display information for guiding the moving direction of the image pickup apparatus together with the captured image, based on the overlapping area of the captured image and the position of the edge portion.
(11) The image processing apparatus according to item (9) or (10), wherein in a case where the overlap area and the edge portion appear in each of the captured images captured by the two image capturing apparatuses, the display control unit performs control to display information indicating that the image capturing apparatuses are in an appropriate position and inclination.
(12) An image processing method, the method comprising controlling an image processing apparatus to:
estimating an inclination of an image pickup device installed at a predetermined inclination based on a captured image captured by the image pickup device, the image pickup device being configured to capture a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector; and
performing control to display the captured image rotated depending on an inclination of the image pickup device.
(13) A program for causing a computer to execute:
estimating an inclination of an image pickup device installed at a predetermined inclination based on a captured image captured by the image pickup device, the image pickup device being configured to capture a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector; and
performing control to display the captured image rotated depending on an inclination of the image pickup device.
(14) A projection system, comprising:
a dome-shaped screen;
a projector installed at a predetermined inclination and configured to project a pattern image including an image having a predetermined pattern on the screen;
an image pickup device installed with the projector at a predetermined inclination; and
an image processing apparatus includes:
an estimation unit configured to estimate a tilt of the image pickup device based on a captured image obtained by capturing a projection surface of the screen by the image pickup device, the pattern image having been projected on the projection surface from the projector; and
a display control unit configured to perform control to display the captured image rotated depending on an inclination of the image pickup device.
List of reference numerals
1 multi-projection system, 11 dome-shaped screen, 11A projection surface, 13L projector, 13R projector, 14 surround speaker, 15 woofer, 16L camera, 16R camera, 21 image processing device, 131 information processing unit, 151 pattern image generating unit, 152 projection control unit, 153 pattern detecting unit, 154 screen edge detecting unit, 155 display control unit

Claims (14)

1. An image processing apparatus comprising:
an estimation unit configured to estimate an inclination of an image pickup device installed at a predetermined inclination based on a captured image captured by the image pickup device, the image pickup device being configured to capture a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector; and
a display control unit configured to perform control to display the captured image rotated depending on an inclination of the image pickup device.
2. The image processing apparatus according to claim 1, wherein the display control unit rotates the captured image such that a moving direction in which the image capturing apparatus moves in a real space matches a switching direction of a display range of the captured image.
3. The image processing apparatus according to claim 1, wherein the screen comprises a dome-shaped screen.
4. The image processing apparatus according to claim 3, wherein the image pickup device is mounted on each of left and right sides of the screen.
5. The image processing apparatus according to claim 4, further comprising:
a generating unit configured to generate the pattern image having predetermined marks placed at lower left and lower right corners of the pattern image; and
a projection control unit configured to control each of the plurality of projectors to project the pattern image.
6. The image processing apparatus according to claim 5, wherein two of the projectors are installed on left and right sides of the screen at a predetermined inclination.
7. The image processing apparatus according to claim 6, wherein the two projectors are each mounted such that a horizontal side of the pattern image is projected in an arc shape on the projection surface.
8. The image processing apparatus according to claim 5, wherein the light of the pattern image projected from the projector includes light of positions on the projection surface illuminated by the markers and light of positions outside the projection surface illuminated by upper left and right corners of the pattern image.
9. The image processing apparatus according to claim 6, further comprising:
a detection unit configured to detect an overlap region included in a pattern image projected from each of the two projectors on the projection surface and an edge portion of the screen based on each of captured images captured by two of the image capturing devices.
10. The image processing apparatus according to claim 9, wherein the display control unit performs control to display information for guiding the moving direction of the image pickup apparatus together with the captured image, based on the overlapping area of the captured image and the position of the edge portion.
11. The image processing apparatus according to claim 9, wherein the display control unit performs control to display information indicating that the image pickup apparatus is in an appropriate position and inclination, in a case where the overlap region and the edge portion appear in each of the captured images captured by the two image pickup apparatuses.
12. An image processing method, the method comprising controlling an image processing apparatus to:
estimating an inclination of an image pickup device installed at a predetermined inclination based on a captured image captured by the image pickup device, the image pickup device being configured to capture a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector; and
performing control to display the captured image rotated depending on an inclination of the image pickup device.
13. A program for causing a computer to execute:
estimating an inclination of an image pickup device installed at a predetermined inclination based on a captured image captured by the image pickup device, the image pickup device being configured to capture a projection surface of a screen on which a pattern image including an image having a predetermined pattern has been projected from a projector; and
performing control to display the captured image rotated depending on an inclination of the image pickup device.
14. A projection system, comprising:
a dome-shaped screen;
a projector installed at a predetermined inclination and configured to project a pattern image including an image having a predetermined pattern on the screen;
an image pickup device installed with the projector at a predetermined inclination; and
an image processing apparatus includes:
an estimation unit configured to estimate a tilt of the image pickup device based on a captured image obtained by capturing a projection surface of the screen by the image pickup device, the pattern image having been projected on the projection surface from the projector; and
a display control unit configured to perform control to display the captured image rotated depending on an inclination of the image pickup device.
CN201980011189.5A 2018-02-08 2019-01-25 Image processing device, image processing method, program, and projection system Pending CN111684793A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018020696 2018-02-08
JP2018-020696 2018-09-24
PCT/JP2019/002390 WO2019155904A1 (en) 2018-02-08 2019-01-25 Image processing device, image processing method, program, and projection system

Publications (1)

Publication Number Publication Date
CN111684793A true CN111684793A (en) 2020-09-18

Family

ID=67547928

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980011189.5A Pending CN111684793A (en) 2018-02-08 2019-01-25 Image processing device, image processing method, program, and projection system

Country Status (3)

Country Link
US (1) US11218662B2 (en)
CN (1) CN111684793A (en)
WO (1) WO2019155904A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111464795A (en) * 2020-05-22 2020-07-28 联想(北京)有限公司 Method and device for realizing configuration of monitoring equipment and electronic equipment

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
US20240069424A1 (en) * 2022-08-23 2024-02-29 Applied Physics, Inc. Light sphere dome

Citations (5)

Publication number Priority date Publication date Assignee Title
US6141034A (en) * 1995-12-15 2000-10-31 Immersive Media Co. Immersive imaging method and apparatus
US20020024640A1 (en) * 2000-08-29 2002-02-28 Olympus Optical Co., Ltd. Image projection display apparatus using plural projectors and projected image compensation apparatus
US20030142883A1 (en) * 2002-01-17 2003-07-31 Olympus Optical Co., Ltd. Image correction data calculation method, image correction data calculation apparatus, and multi-projection system
WO2005084017A1 (en) * 2004-02-27 2005-09-09 Olympus Corporation Multiprojection system
US20130070094A1 (en) * 2011-09-20 2013-03-21 The Regents Of The University Of California, A California Corporation Automatic registration of multi-projector dome images

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
JP3735158B2 (en) * 1996-06-06 2006-01-18 オリンパス株式会社 Image projection system and image processing apparatus
US6333826B1 (en) * 1997-04-16 2001-12-25 Jeffrey R. Charles Omniramic optical system having central coverage means which is associated with a camera, projector, or similar article
US20100073468A1 (en) * 2002-02-12 2010-03-25 Ben Kutner Simulation of attendance at a live event
JP4945578B2 (en) * 2007-01-04 2012-06-06 肇 鳴川 Information processing method
US8994757B2 (en) 2007-03-15 2015-03-31 Scalable Display Technologies, Inc. System and method for providing improved display quality by display adjustment and image processing using optical feedback
FR2919934B1 (en) * 2007-08-09 2010-09-17 Univ Clermont Auvergne PANORAMIC PROJECTION DEVICE AND METHOD USED THEREIN
US8013904B2 (en) * 2008-12-09 2011-09-06 Seiko Epson Corporation View projection matrix based high performance low latency display pipeline
CN101939703B (en) * 2008-12-25 2011-08-31 深圳市泛彩溢实业有限公司 Hologram three-dimensional image information collecting device and method, reproduction device and method
JP2012177676A (en) * 2011-01-31 2012-09-13 Sony Corp Information processor and method, and program
JP5592834B2 (en) * 2011-05-26 2014-09-17 日本電信電話株式会社 Optical projection control apparatus, optical projection control method, and program
US9497447B2 (en) * 2011-06-15 2016-11-15 Scalable Display Technologies, Inc. System and method for color and intensity calibrating of a display system for practical usage
JP2014003586A (en) * 2012-05-22 2014-01-09 Ricoh Co Ltd Image processing system, image processing method and program
US9128366B2 (en) * 2012-05-22 2015-09-08 Ricoh Company, Ltd. Image processing system, image processing method, and computer program product
GB2512621A (en) * 2013-04-04 2014-10-08 Sony Corp A method and apparatus
JPWO2017159063A1 (en) * 2016-03-14 2019-01-17 ソニー株式会社 Display device and information processing terminal device
WO2018100800A1 (en) * 2016-11-29 2018-06-07 ソニー株式会社 Information processing device, information processing method, and computer program

Also Published As

Publication number Publication date
US20210067732A1 (en) 2021-03-04
WO2019155904A1 (en) 2019-08-15
US11218662B2 (en) 2022-01-04

Similar Documents

Publication Publication Date Title
US7234819B2 (en) Projector system and camera system
US8605174B2 (en) Apparatus, method, and system of image processing, and recording medium storing image processing control program
WO2022206527A1 (en) Method for projection image correction and laser projection device
KR20160034847A (en) System and method for calibrating a display system using a short throw camera
JP6645687B2 (en) Display device and control method
JP2006060447A (en) Keystone correction using partial side of screen
WO2005002240A1 (en) Method for calculating display characteristic correction data, program for calculating display characteristic correction data, and device for calculating display characteristic correction data
JP5839785B2 (en) Projection system, projection apparatus, and imaging apparatus
US20190297242A1 (en) Image capturing apparatus, image capturing method, and image processing apparatus
JP2014003586A (en) Image processing system, image processing method and program
CN109644248B (en) Projection type image display device and method for adjusting projection image
CN111684793A (en) Image processing device, image processing method, program, and projection system
KR20130043300A (en) Apparatus and method for correcting image projected by projector
JP2022529417A (en) Image acquisition projection system, use of the system and image acquisition projection insertion method
JP2013083755A (en) Display device, method of controlling display device, and program
JP2003348500A (en) Projection image adjustment method, image projection method, and projector
JP2014176053A (en) Image signal processor
CN116055696A (en) Projection equipment and projection method
CN111630849A (en) Image processing apparatus, image processing method, program, and projection system
JP2013145949A (en) Projection system, and alignment adjusting method for superposed image
CN111630848B (en) Image processing apparatus, image processing method, program, and projection system
JP2014160998A (en) Image processing system, image processing method, image processing program and recording medium
US11831976B2 (en) Display apparatus
CN108881872B (en) Multi-dimensional image projection device and multi-dimensional image correction method thereof
KR20160089263A (en) Automatic horizontal keeping camera apparatus and method for the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200918