WO2005084017A1 - Système multiprojection - Google Patents

Système multiprojection

Info

Publication number
WO2005084017A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
screen
correction data
marker
projected
Prior art date
Application number
PCT/JP2005/001337
Other languages
English (en)
Japanese (ja)
Inventor
Takeyuki Ajito
Kazuo Yamaguchi
Takahiro Toyama
Takafumi Kumano
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation filed Critical Olympus Corporation
Publication of WO2005084017A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/26Projecting separately subsidiary matter simultaneously with main image
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to a multi-projection system that forms one large image by joining together images projected onto a screen by a plurality of image projection devices such as projectors, and in particular to a multi-projection system in which positional shift, distortion, and color shift of the images projected on the screen are detected with digital cameras and corrected automatically.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2002-72359
  • Patent Document 2: JP 2003-18503
  • In these prior arts, an image taken by a digital camera is converted onto a cylinder or a sphere according to the screen shape, so that distortion is accurately corrected.
  • However, they presuppose that the three-dimensional shape of the screen is known in advance and that the digital camera is installed at the same position as the observation position.
  • Accordingly, an object of the present invention, made in view of the above, is to provide a multi-projection system that corrects the displacement, distortion, and color shift of the projected image in real time for an arbitrary observation position, even when the shape of the screen is not known. Disclosure of the invention
  • To achieve the above object, the invention according to claim 1 provides a multi-projection system for forming one large image by combining images projected on a screen by a plurality of image projection devices, comprising:
  • Image acquisition means for photographing the images projected on the screen by the image projection devices from different positions having a known relative positional relationship to acquire parallax image data;
  • Image correction data calculating means for estimating the three-dimensional position of each point of the image projected on the screen based on the parallax image data acquired by the image acquisition means and the information on the relative positional relationship, and for calculating correction data for correcting the images input to the image projection devices; and
  • Image correction means for correcting the images input to the image projection devices based on the correction data calculated by the image correction data calculating means.
  • The invention according to claim 2 is the multi-projection system according to claim 1, characterized in that the image acquisition means has two digital cameras installed at different positions having a known relative positional relationship, and the parallax image data is obtained by photographing the images projected on the screen with the two digital cameras.
  • The invention according to claim 3 is the multi-projection system according to claim 1, characterized in that the image acquisition means has one digital camera and a moving mechanism that translates the digital camera, and the parallax image data is acquired by translating the digital camera with the moving mechanism and photographing the image projected on the screen from different positions having a known relative positional relationship.
  • The invention according to claim 4 provides a multi-projection system for forming one large image by joining together images projected on a screen by a plurality of image projection devices, comprising:
  • Marker projecting means for projecting markers at predetermined angles onto the screen;
  • Image acquisition means for photographing the images projected on the screen by the image projection devices and the markers projected on the screen by the marker projecting means to acquire image data;
  • Support means for fixing the relative positional relationship between the marker projecting means and the image acquisition means;
  • Image correction data calculating means for estimating the three-dimensional position of each point of the image projected on the screen based on the image data acquired by the image acquisition means and the information on the relative positional relationship, and for calculating correction data for correcting the images input to the image projection devices; and
  • Image correction means for correcting the images input to the image projection devices based on the correction data calculated by the image correction data calculating means.
  • The invention according to claim 5 provides a multi-projection system for forming one large image by joining together images projected on a screen by a plurality of image projection devices, comprising:
  • A plurality of image acquisition means for acquiring image data by photographing the images projected on the screen by the image projection devices from positions whose relative positional relationships to the image projection devices are known;
  • Image correction data calculating means for estimating the three-dimensional position of each point of the image projected on the screen based on the image data acquired by the plurality of image acquisition means and the information on the relative positional relationships, and for calculating correction data for correcting the images input to the image projection devices; and
  • Image correction means for correcting the images input to the image projection devices based on the correction data calculated by the image correction data calculating means.
  • The invention according to claim 6 provides a multi-projection system for forming one large image by joining together images projected on a screen by a plurality of image projection devices, comprising:
  • Marker projecting means for projecting markers onto the screen;
  • Image acquisition means for photographing the images projected on the screen by the image projection devices and the markers projected on the screen by the marker projecting means to acquire image data;
  • Image correction data calculating means for estimating the three-dimensional position of each point of the image projected on the screen based on the image data acquired by the image acquisition means and the information on the relative positional relationship, and for calculating correction data for correcting the images input to the image projection devices; and
  • Image correction means for correcting the images input to the image projection devices based on the correction data calculated by the image correction data calculating means.
  • The invention according to claim 7 is the multi-projection system according to claim 6, characterized in that the marker projecting means has two laser pointers installed at different positions having a known relative positional relationship.
  • The invention according to claim 8 is the multi-projection system according to claim 6, characterized in that the marker projecting means has one laser pointer and a moving mechanism that translates the laser pointer, and the markers are projected onto the screen from different positions whose relative positional relationship is known by translating the laser pointer with the moving mechanism.
  • The invention according to claim 9 is characterized in that the correction data calculating means calculates the correction data based on position information of an observer with respect to the screen.
  • The invention according to claim 10 is the multi-projection system according to claim 9, characterized by further comprising an observation position detection sensor for detecting the viewpoint position of the observer to obtain the position information of the observer.
  • FIG. 1 is a diagram showing an overall configuration of a multi-projection system according to a first embodiment of the present invention.
  • FIG. 2 is a view showing a test pattern image input to the projector shown in FIG. 1.
  • FIG. 3 is a block diagram showing a configuration of a correction data calculation unit shown in FIG. 1.
  • FIG. 4 is a block diagram showing a configuration of an image conversion unit shown in FIG. 1.
  • FIG. 5 is a flowchart for explaining a geometric correction data calculation process in the first embodiment.
  • FIG. 6 is a view showing marker projection by the projector shown in FIG. 1 and marker imaging by two cameras.
  • FIG. 7 is a conceptual diagram showing a screen three-dimensional shape estimation method according to the first embodiment.
  • FIG. 8 is a conceptual diagram showing a method for estimating a screen shape with a small number of markers.
  • FIG. 9 is a diagram illustrating a state in which a marker projection image centering on the observation viewpoint is created from the estimated screen shape.
  • FIG. 10 is a diagram showing a state in which a marker projection image is created on a projection plane having a wide viewing angle centered on an observation viewpoint from an estimated screen shape.
  • FIG. 11 is a diagram showing an overall configuration of a multi-projection system according to a second embodiment of the present invention.
  • FIG. 12 is a block diagram showing a configuration of a correction data calculation unit shown in FIG. 11.
  • FIG. 13 is a diagram showing an overall configuration of a multi-projection system according to a third embodiment of the present invention.
  • FIG. 14 is a block diagram showing a configuration of a correction data calculation unit shown in FIG. 13.
  • FIG. 15 is a flowchart for explaining a geometric correction data calculation process in the third embodiment.
  • FIG. 16 is a conceptual diagram showing a screen shape estimation method in the third embodiment.
  • FIG. 17 is a diagram showing an overall configuration of a multi-projection system according to a fourth embodiment of the present invention.
  • FIG. 18 is a block diagram showing a configuration of a correction data calculation unit shown in FIG. 17.
  • FIG. 19 is a diagram showing an overall configuration of a multi-projection system according to a fifth embodiment of the present invention.
  • FIG. 20 is a block diagram showing a configuration of a correction data calculation unit shown in FIG. 19.
  • FIG. 21 is a diagram illustrating an overall configuration of a multi-projection system according to a sixth embodiment of the present invention.
  • FIG. 22 is a diagram for explaining an example of a screen shape calculation method according to the sixth embodiment.
  • FIG. 23 is a diagram showing an overall configuration of a multi-projection system according to a seventh embodiment of the present invention.
  • FIG. 24 is a diagram showing an overall configuration of a multi-projection system according to an eighth embodiment.
  • FIG. 25 is a diagram showing an overall configuration of a multi-projection system according to a ninth embodiment.
  • FIG. 26 is a diagram for describing the screen shape estimation processing in the ninth embodiment.
  • FIG. 27 is a flowchart for explaining the screen shape estimation process.
  • FIG. 28 is a flowchart for explaining the geometric correction data calculation process.
  • FIG. 29 is a block diagram illustrating a configuration of a correction data calculation unit illustrated in FIG. 25.
  • FIG. 30 is a diagram showing an overall configuration of a multi-projection system according to a tenth embodiment of the present invention.
  • FIG. 31 is a perspective view showing observation glasses used in the tenth embodiment.
  • FIG. 32 is a view showing a modification of the present invention.
  • FIG. 33 is a diagram showing details of each marker shown in FIG. 2 (a).
  • FIG. 1 shows the overall configuration of the multi-projection system according to the first embodiment;
  • Figs. 2(a) and 2(b) show the test patterns input to the projectors;
  • FIG. 3 is a block diagram showing the configuration of the correction data calculation unit shown in FIG. 1;
  • FIG. 4 is a block diagram showing the configuration of the image conversion unit shown in FIG. 1;
  • FIG. 5 is a flowchart for explaining the geometric correction data calculation process;
  • FIG. 6 shows marker projection by the projector and marker shooting by the two cameras;
  • FIG. 7 is a conceptual diagram showing the method of estimating the three-dimensional shape of the screen;
  • FIG. 8 is a conceptual diagram showing the method of estimating the screen shape with a small number of markers;
  • FIG. 9 is a diagram showing how a marker projection image centered on the observation viewpoint is created from the estimated screen shape (marker positions);
  • Figs. 10(a) and 10(b) are diagrams showing how a marker projection image is created from the estimated screen shape (marker positions) on a wide-viewing-angle projection plane centered on the observation viewpoint; and
  • Figs. 33(a) and 33(b) are diagrams showing details of each marker shown in Fig. 2(a).
  • Images are partially overlapped and projected onto a dome-shaped or arch-shaped screen 2 by the projectors 1A and 1B, which are the image projection devices.
  • As the projectors serving as the image projection devices, a transmissive liquid crystal projector, a reflective liquid crystal projector, a DLP projector using a DMD (digital micromirror device), a CRT projection tube display, a laser scan projector, or the like can be used.
  • However, the images projected by the projector 1A and the projector 1B cannot simply be joined as they are, because of differences in the color characteristics of the projectors 1A and 1B and deviations in their installation positions.
  • Therefore, in the present embodiment, test pattern image data is input to the projector 1A and the projector 1B so that test pattern images are projected on the screen 2, and the projected test pattern images are photographed by cameras (digital cameras) 3A and 3B serving as image acquisition means.
  • The test pattern images projected here include marker images regularly arranged on the screen as shown in Fig. 2(a), and color signal images having different signal levels for each of the colors R (red), G (green), and B (blue), as shown in Fig. 2(b), for acquiring the color characteristics of the projectors 1A and 1B.
  • Each marker shown in Fig. 2(a) is point-symmetric on the XY space coordinates, as shown in Fig. 33.
  • As the digital cameras serving as the image acquisition means, monochrome or multi-band digital cameras can be used, and CCD or CMOS image sensors can be employed.
  • These captured test pattern images are sent to a correction data calculation unit 4, where correction data for correcting the images input to the projectors 1A and 1B is calculated based on the captured test pattern images.
  • The calculated correction data is sent to an image conversion unit 5, which corrects the content image data input to the projectors 1A and 1B based on the correction data calculated by the correction data calculation unit 4.
  • The camera 3A and the camera 3B are fixed to a support member 6 at a predetermined distance d, as shown in FIG. 1.
  • The cameras 3A and 3B each use a wide-angle lens so that the entire area of the screen can be captured, or a fisheye lens when an even wider angle of view is required.
  • By photographing the test pattern images projected on the screen 2 with the cameras 3A and 3B from these different viewpoints, parallax images of the test patterns can be obtained. Further, since the relative positional relationship between the viewpoints is known, the three-dimensional position of each point of the image projected on the screen 2 can be estimated from the parallax images, as sketched below.
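  • For reference, the estimation in question is ordinary two-view triangulation. The following is a minimal sketch of how one marker's three-dimensional position could be recovered under simplifying assumptions (rectified pinhole cameras separated by the baseline d along the x axis, with a known focal length in pixels); the function and parameter names and the sample values are illustrative, not taken from the patent.

```python
import numpy as np

def triangulate_marker(x_left, x_right, y, baseline_d, focal_px):
    """Recover a marker's 3D position from its pixel coordinates in two
    rectified views whose optical centers are baseline_d apart.

    x_left, x_right: horizontal offsets of the same marker from each image
    center; y: vertical offset; focal_px: focal length in pixels.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    z = focal_px * baseline_d / disparity   # depth from disparity
    x = x_left * z / focal_px               # back-project into 3D
    y3 = y * z / focal_px
    return np.array([x, y3, z])

# One 3D point per detected marker samples the screen surface; the full
# screen shape is then estimated by interpolating these points.
left = [(120.0, -40.0), (80.0, -38.0)]
right = [(100.0, -40.0), (62.0, -38.0)]
points = [triangulate_marker(xl, xr, yl, baseline_d=0.5, focal_px=1400.0)
          for (xl, yl), (xr, _) in zip(left, right)]
```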
  • As shown in Fig. 3, the correction data calculation unit 4 in the present embodiment includes camera-captured image data storage units 11A and 11B, a marker position detection storage unit 12, a screen shape/camera position estimation storage unit 13, an observation position setting unit 14, a marker position coordinate conversion unit 15, a projector geometric correction data calculation unit 16, a projector gamma correction data calculation unit 17, and a projector color correction matrix calculation unit 18.
  • The camera-captured image data storage units 11A and 11B store the captured images of the various test patterns photographed by the camera 3A and the camera 3B, respectively.
  • The marker position detection storage unit 12 detects, on the captured images of the test patterns taken by the cameras 3A and 3B, the positions of the markers projected by each projector, and stores the detected position information.
  • The screen shape/camera position estimation storage unit 13 estimates the three-dimensional position of each marker from the position information of each marker on the captured images corresponding to the cameras 3A and 3B, and calculates the three-dimensional shape of the screen 2 and the camera positions with respect to the screen 2.
  • The observation position setting unit 14 stores three-dimensional position information of an observation position set in advance or arbitrarily set by the user, and sends the information to the marker position coordinate conversion unit 15 at the subsequent stage.
  • The marker position coordinate conversion unit 15 calculates, based on the three-dimensional marker positions calculated in the screen shape/camera position estimation storage unit 13 and the three-dimensional position information of the observation position set in the observation position setting unit 14, the two-dimensional coordinate position of each marker when the markers are projected onto a two-dimensional plane with the observation position as the viewpoint.
  • The projector geometric correction data calculation unit 16 uses the two-dimensional marker coordinates with the observation position as the viewpoint, created by the marker position coordinate conversion unit 15, to derive the geometric coordinate relationship between the projection plane centered on the observation position and the image planes of the projectors 1A and 1B, calculates geometric correction data for correcting the displacement and distortion of the projector images based on the derived relationship, and outputs the calculated geometric correction data to the image conversion unit 5 at the subsequent stage.
  • the projector gamma correction data calculation unit 17 corrects color unevenness and gamma characteristic unevenness in the screens of the projectors 1A and 1B based on various color signal images captured by the camera 3A (or camera 3B). Gamma correction data is calculated, and the calculated gamma correction data is output to the image conversion unit 5 at the subsequent stage.
  • The projector color correction matrix calculation unit 18 calculates a color correction matrix for correcting color differences between the projectors 1A and 1B based on the various color signal images captured by the camera 3A (or camera 3B), and outputs the calculated color correction matrix to the image conversion unit 5 at the subsequent stage.
  • the image conversion unit 5 is roughly divided into a correction data storage unit 21 and a correction data operation unit 22.
  • The correction data storage unit 21 includes a geometric correction data storage unit 23, a gamma correction data storage unit 24, and a color correction matrix storage unit 25. The geometric correction data calculated by the projector geometric correction data calculation unit 16 of the correction data calculation unit 4 is stored in the geometric correction data storage unit 23, the gamma correction data calculated by the projector gamma correction data calculation unit 17 is stored in the gamma correction data storage unit 24, and the color correction matrix calculated by the projector color correction matrix calculation unit 18 is stored in the color correction matrix storage unit 25.
  • the correction data operation unit 22 includes a gamma conversion unit 26, a geometric correction data operation unit 27, a color correction matrix operation unit 28, a gamma correction unit 29, and a gamma correction data operation unit 30.
  • The gamma conversion unit 26 first corrects the nonlinear gamma characteristic of the input image (content image data).
  • The geometric correction data operation unit 27 then performs geometric correction of the input image using the per-projector geometric correction data input from the geometric correction data storage unit 23.
  • Next, the color correction matrix operation unit 28 performs color matrix conversion of the RGB signals of the input image using the per-projector color correction matrix input from the color correction matrix storage unit 25.
  • The gamma correction unit 29 then applies a gamma correction that is uniform over the entire projector screen, after which the gamma correction data operation unit 30 corrects the deviation (difference) from this uniform gamma for each pixel of the projector screen, based on the gamma correction data stored in the gamma correction data storage unit 24.
  • Because the gamma correction unit 29 first performs a rough gamma correction over the entire screen, the per-pixel gamma correction data used in the gamma correction data operation unit 30 can be provided as differences; this compresses the amount of correction data and memory required, so costs can be reduced. The overall chain of operations is sketched below.
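  • As a concrete illustration of the ordering of these stages, the following sketch applies the chain (input gamma conversion, geometric correction, color matrix conversion, uniform gamma correction, per-pixel gamma difference) to one projector's frame. The array layouts, the fixed input gamma of 2.2, and the way the per-pixel difference is applied to the exponent are assumptions made for this example, not the patent's exact formulation.

```python
import numpy as np

def correct_frame(img, warp_map, color_matrix, base_gamma, gamma_delta):
    """Apply the correction chain to one projector's frame.

    img:          HxWx3 float RGB content image in [0, 1]
    warp_map:     HxWx2 integer source coordinates (geometric correction data)
    color_matrix: 3x3 color correction matrix for this projector
    base_gamma:   single gamma value applied uniformly over the screen
    gamma_delta:  HxWx3 per-pixel deviation (difference) from the uniform gamma
    """
    linear = img ** 2.2                                   # gamma conversion unit 26
    warped = linear[warp_map[..., 1], warp_map[..., 0]]   # geometric correction 27
    colored = np.clip(warped @ color_matrix.T, 0.0, 1.0)  # color matrix operation 28
    out = colored ** (1.0 / base_gamma)                   # uniform gamma correction 29
    return np.clip(out ** (1.0 + gamma_delta), 0.0, 1.0)  # per-pixel difference 30
```

  • Storing only `gamma_delta` rather than a full per-pixel gamma table is one way to realize the data compression described above, since the deviations from the uniform gamma are small.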
  • the content image data corrected by the image conversion unit 5 is output to the subsequent projector 1A and projector 1B.
  • Next, the geometric correction data calculation process will be described with reference to FIG. 5. First, the image conversion unit 5 is set to a state (through state) in which the test pattern images are output to the projector 1A and the projector 1B without correction (step S1), and the marker image shown in Fig. 2(a) is input to and displayed by the projectors 1A and 1B (step S2).
  • Next, the images projected on the screen 2 by the projectors 1A and 1B are photographed by the cameras 3A and 3B (steps S3 and S4), and the photographed parallax images are stored in the camera-captured image data storage units 11A and 11B of the correction data calculation unit 4.
  • FIG. 6 shows this state, in which a marker is projected by one of the projectors 1B and a marker is photographed by the two cameras 3A and 3B.
  • First, the marker position detection storage unit 12 detects the marker positions. Thereafter, the screen shape/camera position estimation storage unit 13 detects the positions of marker points (corresponding points) on the captured image planes that correspond to the same marker between the parallax images (step S5), estimates the overall screen shape by interpolating from the detected positions, and estimates the camera positions (step S6).
  • Figure 7 illustrates this situation, conceptually showing the method of estimating the three-dimensional shape of the screen 2 from the marker images captured by the two cameras 3A and 3B. As shown in Fig. 8, even when the number of marker points is somewhat small, the screen shape can be estimated with accuracy comparable to estimation from many marker points by providing rough prior knowledge of the screen shape.
  • Next, in step S7, the user sets the position from which the projected image is actually observed. The observation position need not be specified by the user; for example, it may be set to the center position of the dome, or determined in advance as a default.
  • Next, the marker position coordinate conversion unit 15 calculates the marker position coordinates on a two-dimensional projection plane centered on the viewpoint at the observation position (step S8).
  • Figure 9 illustrates this situation.
  • the angle of view of the projection plane is the same as the angle of view of the content image input to the multi-projection system.
  • When the content image is a wide-field image captured by a fisheye lens, as shown in Fig. 10(a), the marker position coordinates need only be calculated on a two-dimensional projection plane represented in a coordinate system with a correspondingly wide viewing angle (for example, 110 to 360 degrees).
  • As shown in Fig. 10(b), when the observation position changes arbitrarily, the projection onto the plane at the new observation position may simply be recalculated.
  • Next, the projector geometric correction data calculation unit 16 calculates geometric correction data for each projector (step S9). Specifically, from the correspondence between the marker position coordinates on the viewpoint image plane at the observation position and the marker position coordinates on the test pattern images input to the projectors 1A and 1B, the geometric relationship between coordinates on the viewpoint image plane and coordinates on the projector image planes is obtained, and geometric correction data for correcting the input image is calculated based on this relationship so that the image is output without displacement or distortion on the viewpoint image plane (one way to realize such a mapping is sketched below). The calculated geometric correction data for each projector is then output to the image conversion unit 5 (step S10), and the geometric correction data calculation processing ends.
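  • One way to turn the sparse marker correspondence into per-pixel geometric correction data is to interpolate it over the projector image plane. The sketch below does this with SciPy's griddata; the data layout and the linear interpolation scheme are assumptions for illustration, not the patent's prescribed method.

```python
import numpy as np
from scipy.interpolate import griddata

def geometric_correction_map(markers_viewpoint, markers_projector, proj_w, proj_h):
    """Build a dense warp map from N marker correspondences.

    markers_viewpoint: (N, 2) array of marker coordinates on the viewpoint image plane
    markers_projector: (N, 2) array of the same markers on the projector image plane
    Returns an (H, W, 2) array giving, for every projector pixel, the
    viewpoint-plane coordinate whose content it should display.
    """
    ys, xs = np.mgrid[0:proj_h, 0:proj_w]
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    # Interpolate each viewpoint coordinate over all projector pixels.
    wx = griddata(markers_projector, markers_viewpoint[:, 0], grid, method="linear")
    wy = griddata(markers_projector, markers_viewpoint[:, 1], grid, method="linear")
    return np.stack([wx, wy], axis=1).reshape(proj_h, proj_w, 2)
```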
  • For photographing the various color signal images, either the camera 3A or the camera 3B can be used; it suffices for only one of them to take the pictures.
  • The color signal images are not limited to the single-color signal data of R (red), G (green), and B (blue) shown in Fig. 2(b) used as the test patterns. For example, the camera 3A can photograph the color signal image of the blue component while the camera 3B simultaneously photographs the color signal image of the red component. If the color signal images are shared between the cameras 3A and 3B in this way, the shooting time can be reduced.
  • Figs. 11 and 12 show a second embodiment of the present invention.
  • FIG. 11 is a diagram showing the overall configuration of the multi-projection system.
  • FIG. 12 is a block diagram showing the configuration of the correction data calculation unit shown in FIG. 11.
  • In the present embodiment, one camera (digital camera) 3 is supported on a moving stage 31, which is a moving mechanism, so as to be translatable. The camera 3 is translated and photographs are taken sequentially at both ends of a distance d whose relative positional relationship is known, thereby obtaining parallax images as in the first embodiment.
  • Further, the correction data calculation unit 4 is provided with a switching switch 32 that switches the images photographed by the camera 3 and supplies them to the camera-captured image data storage units 11A and 11B.
  • With this configuration, the shooting time increases because of the sequential shooting while moving the camera 3, but the same correction can be realized with less equipment, so costs can be reduced.
  • Figs. 13 to 16 show a third embodiment of the present invention.
  • FIG. 13 is a diagram showing the overall configuration of the multi-projection system.
  • FIG. 14 is a block diagram showing the configuration of the correction data calculation unit shown in FIG. 13.
  • FIG. 15 is a flowchart for explaining the geometric correction data calculation processing
  • FIG. 16 is a conceptual diagram showing a method for estimating a screen shape (marker position).
  • In the present embodiment, markers are projected at equal angles over the entire screen by a laser pointer 35 serving as marker projecting means, and the screen shape is estimated using one camera 3.
  • the camera 3 and the laser pointer 35 are fixed to a support member 36 as a support means, with their relative positional relationship fixed.
  • As shown in Fig. 14, the correction data calculation unit 4 includes, as in Fig. 3, a camera-captured image data storage unit 11, a marker position detection storage unit 12, a screen shape/camera position estimation storage unit 13, an observation position setting unit 14, a marker position coordinate conversion unit 15, a projector geometric correction data calculation unit 16, a projector gamma correction data calculation unit 17, and a projector color correction matrix calculation unit 18.
  • Among these, the functions of the camera-captured image data storage unit 11, the marker position detection storage unit 12, the screen shape/camera position estimation storage unit 13, and the marker position coordinate conversion unit 15 differ from those of the first embodiment; the remaining functions are the same as in the first embodiment.
  • That is, the camera-captured image data storage unit 11 stores the captured image of the markers projected on the screen 2 by the laser pointer 35, as well as the captured images of the test patterns (the marker images and the color signal images) projected by the projectors 1A and 1B.
  • The marker position detection storage unit 12 detects, from the captured images, the positions of the markers projected on the screen 2 by the laser pointer 35 and the positions of the markers projected by the projectors 1A and 1B, respectively.
  • The screen shape/camera position estimation storage unit 13 estimates the screen shape and the camera position from the detected positions of the markers projected by the laser pointer 35. Further, the marker position coordinate conversion unit 15 converts the marker position coordinates of the projectors 1A and 1B detected in the marker position detection storage unit 12 into the marker position coordinates viewed from the observation viewpoint, using the estimated screen shape and camera position and the observation position set in the observation position setting unit 14.
  • projector geometric correction data calculation unit 16 calculates geometric correction data in the same manner as in the first embodiment.
  • First, markers are projected on the screen 2 by the laser pointer 35 (step S11), and the projected marker image is photographed by the camera 3 (step S12) and stored in the camera-captured image data storage unit 11.
  • the marker images are projected onto the screen 2 by the projectors 1A and 1B (step S13), and the projected marker images are similarly photographed by the camera 3 (step S14).
  • Next, the marker positions by the laser pointer 35 and the marker positions by the projectors 1A and 1B are detected in the marker position detection storage unit 12 from the respective captured marker images (steps S15 and S16).
  • Next, based on the detected positions of the markers projected by the laser pointer 35, the shape of the screen 2 and the position of the camera 3 are estimated in the screen shape/camera position estimation storage unit 13 (step S17). Specifically, as shown in Fig. 16, the three-dimensional position of each marker is calculated from the predetermined projection angle of the marker from the laser pointer 35 and the position of the marker on the captured image, and the screen shape and the camera position are estimated; a sketch of this calculation follows.
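  • In a two-dimensional slice of the Fig. 16 geometry, the marker position follows from intersecting the laser pointer's ray (whose angle is known by design) with the camera's viewing ray (whose angle follows from the marker's pixel position). The sketch below places the pointer at the origin and the camera at baseline_d along the x axis with a parallel optical axis; these conventions, the planar simplification, and all names are assumptions for illustration.

```python
import math

def marker_position_2d(theta_pointer, x_pixel, focal_px, baseline_d):
    """Triangulate a marker in the plane containing pointer, camera, and marker.

    theta_pointer: known projection angle of the laser pointer (radians)
    x_pixel:       marker offset on the camera image (pixels)
    focal_px:      camera focal length in pixels
    baseline_d:    pointer-to-camera distance along the x axis
    """
    theta_camera = math.atan2(x_pixel, focal_px)  # camera ray angle to the marker
    denom = math.sin(theta_pointer - theta_camera)
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; position is unconstrained")
    # Pointer ray: (t*sin(tp), t*cos(tp)); camera ray: (d + s*sin(tc), s*cos(tc)).
    t = baseline_d * math.cos(theta_camera) / denom
    return (t * math.sin(theta_pointer), t * math.cos(theta_pointer))
```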
  • Thereafter, the marker position coordinate conversion unit 15 first calculates the coordinate positions of the markers projected by the laser pointer 35 on the projection plane with the observation position as the viewpoint (steps S18 and S19). Then, from the marker coordinate positions at the observation position and the marker coordinate positions on the camera image, the coordinate relationship between the observation position and the camera image is calculated (step S20). Using this coordinate relationship, the coordinate positions of the markers projected by the projectors 1A and 1B on the captured image are converted into the marker coordinate positions of the projectors 1A and 1B on the projection plane with the observation position as the viewpoint (step S21).
  • Thereafter, the projector geometric correction data calculation unit 16 calculates the geometric correction data for each projector as in the first embodiment (step S22) and outputs the calculated geometric correction data to the image conversion unit 5 (step S23), which converts the input image based on the geometric correction data. This allows the projectors 1A and 1B to display the input image without displacement or distortion as seen from the observation position.
  • Figs. 17 and 18 show a fourth embodiment of the present invention.
  • FIG. 17 is a diagram showing the overall configuration of the multi-projection system.
  • FIG. 18 is a block diagram showing the configuration of the correction data calculation unit shown in FIG. 17.
  • The present embodiment differs from the first embodiment in that, in addition to the cameras 3A and 3B for acquiring parallax images of the entire screen, cameras (digital cameras) 3C and 3D for photographing small areas of the screen 2 are provided. The geometric correction data for correcting the displacement and distortion of the projectors 1A and 1B is calculated as in the first embodiment, using the parallax images taken by the cameras 3A and 3B, and the color correction matrices and gamma correction data of the projectors 1A and 1B are calculated as well.
  • In addition, color unevenness is corrected with finer accuracy by having the cameras 3C and 3D capture color signal images for detecting color unevenness in small areas of the screen 2.
  • As shown in Fig. 18, in addition to the camera-captured image data storage units 11A and 11B, the marker position detection storage unit 12, the screen shape/camera position estimation storage unit 13, the observation position setting unit 14, the marker position coordinate conversion unit 15, and the projector geometric correction data calculation unit 16 shown in Fig. 3, the correction data calculation unit 4 is provided with camera-captured image data storage units 11C and 11D and a projector color correction matrix/gamma correction data calculation unit 19.
  • The camera-captured image data storage units 11C and 11D store the captured images of the test patterns (marker images and color signal images) photographed by the cameras 3C and 3D.
  • The marker position detection storage unit 12 detects the marker coordinate positions in the marker images captured by the cameras 3A and 3B, and also detects the marker coordinate positions in the marker images captured by the cameras 3C and 3D.
  • the detected marker positions in the captured images of the cameras 3C and 3D are supplied to the projector color correction matrix / gamma correction data calculation unit 19.
  • The projector color correction matrix/gamma correction data calculation unit 19 detects the color unevenness corresponding to each pixel of the projectors 1A and 1B, using the marker positions in the captured images of the cameras 3C and 3D detected by the marker position detection storage unit 12 and the color signal images stored in the camera-captured image data storage units 11C and 11D, calculates a color correction matrix and gamma correction data for correcting the color unevenness, and outputs the calculated color correction matrix and gamma correction data to the image conversion unit 5 to convert the input image. As a result, color unevenness can be corrected with finer accuracy, and an image without color unevenness can be displayed on the screen 2.
  • The geometric correction data of the projectors 1A and 1B are calculated in the same manner as in the first embodiment.
  • Figs. 19 and 20 show a fifth embodiment of the present invention.
  • FIG. 19 is a diagram showing the overall configuration of the multi-projection system.
  • FIG. 20 is a block diagram showing the configuration of the correction data calculation unit shown in FIG. 19.
  • This embodiment differs from the third embodiment in that the camera 3 is rotatably supported on the support member 36 and is rotated by a rotation control unit 41 so that the test pattern images are photographed sequentially for each small area of the screen 2, and in that the correction data is calculated using the test pattern images photographed separately for each small area.
  • the correction data calculation unit 4 includes a rotation angle storage unit 20 for storing rotation angle information of the camera 3 from the rotation control unit 41.
  • The rotation angle information corresponding to each captured image stored in the rotation angle storage unit 20 is supplied to the screen shape/camera position estimation storage unit 13 and the marker position coordinate conversion unit 15, where it is used together with each piece of captured image data when estimating the screen shape and camera position and when converting to the marker position coordinates at the observation position; the respective correction data are thereby calculated in the same manner as in the third embodiment.
  • Figs. 21 and 22 show a sixth embodiment of the present invention.
  • FIG. 21 is a diagram showing the overall configuration of the multi-projection system.
  • FIG. 22 is a diagram for explaining an example of a screen shape calculation method according to the present embodiment.
  • This embodiment differs from the fifth embodiment in that the laser pointer 35 and the camera 3 are fixed to the support member 36, and the support member 36 is rotated by the rotation control unit 41 so that marker projection by the laser pointer 35 and photographing by the camera 3 are performed for each small area of the screen 2. Calculating the correction data using the test pattern images photographed separately for each small area is the same as in the fifth embodiment. With such a configuration, the entire screen can be covered even when the marker projection angle range of the laser pointer 35 is narrow, so the configuration of the laser pointer 35 can be simplified.
  • The rotation angle information of each image is obtained from the rotation control unit 41 in the same manner as in the fifth embodiment, and the shape of the entire screen can be estimated from the relative positional relationship between the rotation angles of the captured images.
  • Alternatively, the projectors 1A and 1B may project markers onto the overlapping portions of the photographing areas and these markers may be photographed, and the screen shapes estimated from the individual captured images may then be synthesized based on the marker positions (identical points) of the projectors 1A and 1B in those images. In this way, even if there is some error in the rotation angle information, the positional relationships can be synthesized accurately.
  • FIG. 23 is a diagram showing the overall configuration of the multi-projection system according to the seventh embodiment of the present invention.
  • In this embodiment, a plurality of sets, each consisting of a camera and a laser pointer whose relative positions are fixed, are used to project and photograph the test pattern images for each small area of the screen 2 so that each small area overlaps its adjacent small areas, and the correction data is calculated using the test pattern images photographed separately for each small area.
  • The other configurations are the same as in the fifth and sixth embodiments. Fig. 23 shows the case where two sets are used: one with the camera 3A and the laser pointer 35A fixed to a support member 36A at a distance d1, and the other with the camera 3B and the laser pointer 35B fixed to a support member 36B at a distance d2.
  • The distances d1 and d2 can be set arbitrarily; they may be equal or different.
  • In the screen shape estimation according to the present embodiment, the shape of each small area of the screen 2 is estimated from each captured marker image as described in the sixth embodiment, and the estimated shapes are then synthesized as shown in Fig. 22 to estimate the overall screen shape.
  • FIG. 24 is a diagram showing the overall configuration of the multi-projection system according to the eighth embodiment of the present invention.
  • In the present embodiment, instead of providing a plurality of sets of cameras and laser pointers as in the seventh embodiment, the same number of cameras 3A and 3B as the projectors 1A and 1B are used to estimate the screen shape. For this purpose, the projector 1A and the camera 3A are fixed to a support member 36A at a distance d1, and the projector 1B and the camera 3B are fixed to a support member 36B at a distance d2, so that the camera 3A can photograph the projection area of the screen 2 by the projector 1A, that is, the image projected by the projector 1A and the part of the image projected by the projector 1B that overlaps it, and the camera 3B can photograph the projection area of the screen 2 by the projector 1B, that is, the image projected by the projector 1B and the part of the image projected by the projector 1A that overlaps it. Note that the distances d1 and d2 can be set arbitrarily, equal or different, as long as the cameras 3A and 3B can photograph the projection areas of the corresponding projectors 1A and 1B.
  • Then, the marker images are projected by the projectors 1A and 1B and photographed by the cameras 3A and 3B, and the screen shape is estimated from the captured images using the information indicating the relative positional relationships between the projectors 1A and 1B and the cameras 3A and 3B.
  • Other configurations and operations are the same as those of the seventh embodiment.
  • FIGS. 25 to 29 show a ninth embodiment of the present invention.
  • FIG. 25 is a diagram showing the overall configuration of a multi-projection system
  • FIGS. 26 (a) to (c) explain screen shape estimation processing.
  • FIG. 27 is a flowchart for explaining the screen shape estimation process
  • FIG. 28 is a flowchart for explaining the geometric correction data calculation process
  • FIG. 29 is a block diagram showing the configuration of the correction data calculation unit shown in FIG. 25.
  • In this embodiment, two laser pointers 35A and 35B are fixed to the support member 36 at a distance d, and markers are projected onto the entire screen 2 from different positions by the laser pointers 35A and 35B, whose relative positional relationship is fixed. Each projected marker is photographed separately for each small area of the screen 2 by the cameras 3A and 3B, and the shape of the screen 2 is estimated from these captured images.
  • Specifically, the projection angle θ′A at which the laser pointer 35A would project a marker onto the same position as the marker projected by the laser pointer 35B is calculated by interpolation.
  • For example, with linear interpolation, θ′A = (x′/x) × (θA2 − θA1) + θA1, where θA1 and θA2 are the projection angles of the two markers of the laser pointer 35A adjacent to the target position, x is the distance between those two markers on the captured image, and x′ is the distance from the first of them to the target position.
  • When the screen is a curved surface, the angle can be obtained with higher accuracy by a higher-order interpolation formula.
  • Using this projection angle and the known distance d between the laser pointers, the three-dimensional position of the i-th marker projected by the laser pointer 35B is calculated, as sketched below.
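  • Under a planar simplification of the Fig. 26 geometry, this amounts to the interpolation above followed by an intersection of the two pointers' rays across the baseline d. The following sketch illustrates both steps; the coordinate conventions, names, and sample values are assumptions for illustration, not the patent's exact computation.

```python
import math

def interpolate_angle(x_target, x1, x2, theta1, theta2):
    """Linearly interpolate the 35A projection angle that would hit the image
    position x_target of a 35B marker, from the two adjacent 35A markers at
    image positions x1, x2 with known projection angles theta1, theta2."""
    w = (x_target - x1) / (x2 - x1)
    return theta1 + w * (theta2 - theta1)

def marker_3d_from_two_pointers(theta_a, theta_b, baseline_d):
    """Intersect the ray of pointer A (at the origin) with the ray of
    pointer B (at x = baseline_d) in their common plane."""
    denom = math.sin(theta_a - theta_b)
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel")
    t = baseline_d * math.cos(theta_b) / denom
    return (t * math.sin(theta_a), t * math.cos(theta_a))

# Example: a 35B marker observed between two 35A markers on the camera image.
theta_a = interpolate_angle(0.42, 0.30, 0.55, math.radians(18), math.radians(24))
print(marker_3d_from_two_pointers(theta_a, math.radians(12), baseline_d=0.8))
```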
  • FIG. 27 summarizes the above processing; a detailed description is omitted because it duplicates the description above.
  • Regarding the projection angle θ′A for projecting a marker from the laser pointer 35A onto the same position as the marker projected by the laser pointer 35B, the marker may actually be projected by the laser pointer 35A at the calculated angle and photographed again by the camera; if the result is misaligned with the marker of the laser pointer 35B, the angle is corrected again, whereby a more accurate projection angle of the laser pointer 35A is obtained.
  • FIG. 28 shows the process of calculating the geometric correction data in the present embodiment; a detailed description is omitted because it is the same as described above. Note that, in FIG. 28, the process indicated by reference symbol S corresponds to the process in FIG. 27.
  • FIG. 29 is a diagram showing detailed blocks of the correction data calculation unit 4 in the present embodiment.
  • This correction data calculation unit 4 is obtained from the correction data calculation unit of the third embodiment shown in FIG. 14 by providing, instead of the camera-captured image data storage unit 11, a camera-captured image data storage unit 11A for storing captured image data from the camera 3A and a camera-captured image data storage unit 11B for storing captured image data from the camera 3B; the other configuration is the same as in the third embodiment.
  • The correction data calculation unit 4 then calculates the geometric correction data by performing the processing shown in Figs. 27 and 28 using the images for each small area of the screen 2 stored in the camera-captured image data storage units 11A and 11B, and also calculates, in the same manner as described in the first embodiment, a color correction matrix and gamma correction data that make the color characteristic unevenness and gamma characteristic unevenness of the projectors 1A and 1B uniform over the entire screen, and outputs the correction data to the image conversion unit 5.
  • As described above, in this embodiment, markers are projected onto the entire screen 2 from different positions by the laser pointers 35A and 35B having a fixed relative positional relationship, each projected marker is photographed for each small area of the screen 2 by the cameras 3A and 3B, and the shape of the screen 2 is estimated from the captured images. Therefore, the three-dimensional position of the screen 2 can be estimated even if the positions of the cameras 3A and 3B are neither known nor fixed, which increases the degree of freedom in installing the cameras 3A and 3B.
  • Instead of using the two laser pointers 35A and 35B, a single laser pointer may be supported on a moving stage so as to be translatable; by translating this laser pointer and projecting markers onto the entire screen 2 at both ends of a distance d whose relative positional relationship is known, the correction data can be calculated in the same manner.
  • FIGS. 30 and 31 show a tenth embodiment of the present invention.
  • FIG. 30 is a diagram showing the overall configuration of a multi-projection system
  • FIG. 31 is a perspective view showing an observation scope used in the present embodiment.
  • In this embodiment, observation position detection sensors 45 are provided at a plurality of positions on the screen 2, the viewpoint position of the observer 46 is detected by the observation position detection sensors 45, and the detected position is used as the observation position in the observation position setting unit 14 of the correction data calculation unit 4.
  • Distortion is then automatically corrected for the set observation position, and an image without distortion as seen from the observation point is displayed on the screen 2.
  • For this purpose, the observer 46 wears observation glasses 48 equipped with an infrared LED 47 as shown in FIG. 31, for example, and the observation position detection sensors 45 are infrared detection sensors or the like.
  • the observation position detection sensor 45 detects infrared rays from the infrared LED 47 to detect the viewpoint position of the observer 46.
  • The infrared rays emitted from the infrared LED 47 have directivity in the viewing direction of the observer 46.
  • The shape of the screen 2 and the projection position relationship of the projectors 1A and 1B may be calculated in advance using the cameras according to any one of the first to ninth embodiments, so that distortion correction for an arbitrary observation position can always be performed.
  • In this way, the observation position is detected in real time as the observer 46 moves, and the image is automatically corrected so as to be free of distortion; thus, even in such a display system, an image can always be observed without distortion caused by the screen shape.
  • the present invention is not limited to the above-described embodiment, but can be variously modified or changed.
  • For example, although the above embodiments project and display images on the hemispherical dome screen 2, the present invention can likewise be applied to other curved screens, such as an arch screen or a spherical screen covering all 360° directions. The present invention can also be effectively applied to the case where an image is projected onto a flat screen 2a by front projection as shown in Fig. 32(a), or where an image is displayed on a flat screen 2a by rear projection as shown in Fig. 32(b).
  • the number of projectors is not limited to two, and the present invention can be applied to a case where there are three or more projectors.
  • As described above, according to the present invention, the three-dimensional position of each point of the image projected on the screen, that is, the screen position and shape, is estimated, and the positional deviation, distortion, and color deviation of the projected image are corrected; therefore, an image can be displayed well even if the shape of the screen is not known.

Abstract

A multi-projection system for forming one large image by joining together images projected on a screen (2) by image projection devices (1A, 1B). The multi-projection system comprises image acquisition means (3A, 3B) for photographing the images projected on the screen (2) by the image projection devices (1A, 1B) from different positions whose relative positional relationship is known and acquiring parallax image data; image correction data calculation means (4) for estimating the three-dimensional position of each point of the image projected on the screen (2) based on the acquired parallax image data and the information on the relative positional relationship, and for calculating correction data for correcting the images input to the image projection devices (1A, 1B); and image correction means (5) for correcting the images input to the image projection devices (1A, 1B) according to the calculated correction data.
PCT/JP2005/001337 2004-02-27 2005-01-31 Système multiprojection WO2005084017A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004054717A JP2005244835A (ja) 2004-02-27 2004-02-27 マルチプロジェクションシステム
JP2004-054717 2004-02-27

Publications (1)

Publication Number Publication Date
WO2005084017A1 (fr)

Family

ID=34908803

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/001337 WO2005084017A1 (fr) 2004-02-27 2005-01-31 Système multiprojection

Country Status (2)

Country Link
JP (1) JP2005244835A (fr)
WO (1) WO2005084017A1 (fr)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4857732B2 (ja) * 2005-11-24 2012-01-18 パナソニック電工株式会社 仮想現実感生成システム
JP4696018B2 (ja) * 2006-04-13 2011-06-08 日本電信電話株式会社 観察位置追従式映像提示装置及び観察位置追従式映像提示プログラム,映像提示装置及び映像提示プログラム
JP4973009B2 (ja) * 2006-05-29 2012-07-11 セイコーエプソン株式会社 プロジェクタ及び画像投写方法
JP2008017347A (ja) * 2006-07-07 2008-01-24 Matsushita Electric Works Ltd 映像表示装置、映像信号の歪み補正処理方法
JP2008017348A (ja) * 2006-07-07 2008-01-24 Matsushita Electric Works Ltd 映像表示装置、映像信号の歪み補正処理方法
JP2008015381A (ja) * 2006-07-07 2008-01-24 Matsushita Electric Works Ltd 映像表示装置、映像信号の歪み補正処理方法
JP4965967B2 (ja) * 2006-10-30 2012-07-04 株式会社日立製作所 映像表示システムの調整システム
US8994757B2 (en) 2007-03-15 2015-03-31 Scalable Display Technologies, Inc. System and method for providing improved display quality by display adjustment and image processing using optical feedback
JP5298545B2 (ja) * 2008-01-31 2013-09-25 セイコーエプソン株式会社 画像形成装置
JP5955003B2 (ja) * 2012-01-26 2016-07-20 キヤノン株式会社 画像処理装置および画像処理方法、プログラム
JP2016019194A (ja) * 2014-07-09 2016-02-01 株式会社東芝 画像処理装置、画像処理方法、および画像投影装置
CN104457616A (zh) * 2014-12-31 2015-03-25 苏州江奥光电科技有限公司 一种360度三维成像的投影装置
JP6543067B2 (ja) * 2015-03-31 2019-07-10 株式会社メガチップス 投影システム、プロジェクター装置、撮像装置、および、プログラム
JP6390032B2 (ja) * 2015-09-02 2018-09-19 カルソニックカンセイ株式会社 ヘッドアップディスプレイの歪補正方法とそれを用いたヘッドアップディスプレイの歪補正装置
US11483528B2 (en) 2018-02-08 2022-10-25 Sony Corporation Information processing apparatus and information processing method
WO2020218028A1 (fr) * 2019-04-25 2020-10-29 ソニー株式会社 Dispositif, procédé, programme et système de traitement d'images
JP2020187227A (ja) * 2019-05-13 2020-11-19 株式会社アーキジオ 全天周画像用撮影装置
JP7099497B2 (ja) * 2020-07-28 2022-07-12 セイコーエプソン株式会社 画像生成方法、画像生成システム、及びプログラム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001061121A (ja) * 1999-08-23 2001-03-06 Nec Corp プロジェクタ装置
JP2001083949A (ja) * 1999-09-16 2001-03-30 Japan Science & Technology Corp 映像投影装置
JP2002532795A (ja) * 1998-12-07 2002-10-02 ユニバーサル シティ スタジオズ インコーポレイテッド 視点画像ゆがみを補償するための画像補正方法
JP2004015205A (ja) * 2002-06-04 2004-01-15 Olympus Corp マルチプロジェクションシステム及びマルチプロジェクションシステムにおける補正データ取得方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012249009A (ja) * 2011-05-26 2012-12-13 Nippon Telegr & Teleph Corp <Ntt> 光学投影制御装置、光学投影制御方法、及びプログラム
CN111684793A (zh) * 2018-02-08 2020-09-18 索尼公司 图像处理装置、图像处理方法、程序以及投影系统
US11218662B2 (en) 2018-02-08 2022-01-04 Sony Corporation Image processing device, image processing method, and projection system
US11676241B2 (en) 2020-01-31 2023-06-13 Seiko Epson Corporation Control method for image projection system, and image projection system

Also Published As

Publication number Publication date
JP2005244835A (ja) 2005-09-08

Similar Documents

Publication Publication Date Title
WO2005084017A1 (fr) Système multiprojection
JP4108609B2 (ja) カメラ付きプロジェクタをキャリブレーションする方法
US9892488B1 (en) Multi-camera frame stitching
US9195121B2 (en) Markerless geometric registration of multiple projectors on extruded surfaces using an uncalibrated camera
KR100796849B1 (ko) 휴대 단말기용 파노라마 모자이크 사진 촬영 방법
JP6037375B2 (ja) 画像投影装置および画像処理方法
WO2018076154A1 (fr) Étalonnage de positionnement spatial d&#39;un procédé de génération de séquences vidéo panoramiques fondé sur une caméra ultra-grand-angulaire
KR20160034847A (ko) 단초점 카메라를 이용하여 디스플레이 시스템을 캘리브레이팅하기 위한 시스템 및 방법
US10063792B1 (en) Formatting stitched panoramic frames for transmission
KR20160118868A (ko) 하나의 룩업 테이블을 이용한 파노라마 영상 출력 시스템 및 방법
JP2015056834A (ja) 投影システム、画像処理装置、投影方法およびプログラム
CN103685917A (zh) 图像处理器、图像处理方法和成像系统
WO2006025191A1 (fr) Procédé de correction géométrique pour système de multiprojection
US20040169827A1 (en) Projection display apparatus
KR100790887B1 (ko) 영상 처리장치 및 방법
CN110505468B (zh) 一种增强现实显示设备的测试标定及偏差修正方法
JP2004015205A (ja) マルチプロジェクションシステム及びマルチプロジェクションシステムにおける補正データ取得方法
JP2004228824A (ja) スタックプロジェクション装置及びその調整方法
CN113259642B (zh) 一种影片视角调节方法及系统
JP2004228619A (ja) プロジェクタの映像の歪み調整方法
JP2012078490A (ja) 投写型映像表示装置及び画像調整方法
KR20140121345A (ko) 감시카메라 유닛 및 그 구동방법
JP4230839B2 (ja) マルチカメラシステム及びその調整装置
Johnson et al. A distributed cooperative framework for continuous multi-projector pose estimation
JP2011228832A (ja) 画像処理装置、画像表示システムおよび画像処理方法

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase