WO2020218028A1 - Image processing device, image processing method, program, and image processing system

Image processing device, image processing method, program, and image processing system

Info

Publication number
WO2020218028A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
corresponding point
projected
correction
deviation
Application number
PCT/JP2020/016067
Other languages
French (fr)
Japanese (ja)
Inventor
都夢 田原
小林 直樹
Original Assignee
Sony Corporation (ソニー株式会社)
Application filed by Sony Corporation
Publication of WO2020218028A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B 21/14 Details
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present technology relates to an image processing device, an image processing method, a program, and an image processing system, and in particular to an image processing device, an image processing method, a program, and an image processing system that enable easy re-correction when a deviation occurs between corresponding points on the projected image projected by a projection device and the captured image captured by a photographing device.
  • A projector/camera system is a system provided with a projector and a camera; the posture of each device can be estimated by photographing, with the camera, a predetermined pattern projected from the projector.
  • Correction deviation may occur due to changes of the projector over time or physical contact with the device.
  • Correction deviation means that a deviation arises at the corresponding points, once detected by the initial correction, on the respective images of the image projected by the projector and the image photographed by the camera.
  • The present technology was made in view of such a situation, and makes it easy to perform re-correction when a deviation occurs between the corresponding points on the projected image projected by the projection device and the captured image captured by the photographing device.
  • An image processing device according to one aspect of the present technology includes a detection unit that detects a deviation of corresponding points occurring after an initial correction, the initial correction including corresponding point detection that detects corresponding points on each image based on a projected image projected by a projection device and a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection; a generation unit that generates a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image, using the corresponding point information, which is information indicating the relationship between the corresponding points obtained by the initial correction; and an update unit that detects the deviation of the corresponding points on the captured image based on a captured image obtained by photographing the re-correction pattern projected from the projection device, and updates the corresponding point information.
  • In one aspect of the present technology, a deviation of corresponding points occurring after an initial correction is detected, the initial correction including corresponding point detection that detects corresponding points on each image based on a projected image projected by a projection device and a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection.
  • A re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image is generated from the corresponding point information, which is information indicating the relationship between the corresponding points obtained by the initial correction; the deviation of the corresponding points on the captured image is detected based on a captured image obtained by photographing the re-correction pattern projected from the projection device; and the corresponding point information is updated.
  • FIG. 1 is a diagram showing a configuration example of a projector / camera system as an image processing system according to an embodiment of the present technology.
  • The projector/camera system of FIG. 1 is configured by connecting a plurality of projectors, which are projection devices, and a plurality of cameras, which are photographing devices, to an image processing device 1 such as a PC.
  • In the example of FIG. 1, two projectors, projectors 2-1 and 2-2, and two cameras, cameras 3-1 and 3-2, are provided.
  • a screen 11 serving as a projection surface is provided in front of the projectors 2-1 and 2-2.
  • the projector 2-1 and the projector 2-2 are installed so that the projection range of the projector 2-1 and the projection range of the projector 2-2 partially overlap.
  • An overlapping region is formed on the screen 11 in which the image projected from the projector 2-1 and the image projected from the projector 2-2 overlap.
  • the camera 3-1 is installed so that the shooting range includes the projection range of the projector 2-1. At least, the projected image of the projector 2-1 projected on the screen 11 is captured in the captured image captured by the camera 3-1.
  • the camera 3-2 is installed so that the shooting range includes the projection range of the projector 2-2. At least, the projected image of the projector 2-2 projected on the screen 11 is captured in the captured image captured by the camera 3-2.
  • the camera 3-1 is provided near the projector 2-1 and the camera 3-2 is provided near the projector 2-2.
  • The projector 2-1 and the camera 3-1, and the projector 2-2 and the camera 3-2, may each be configured as devices sharing the same housing. Further, the cameras 3-1 and 3-2 may be provided at positions away from the projectors 2-1 and 2-2, respectively.
  • The projectors 2-1 and 2-2 and the cameras 3-1 and 3-2 have optical systems whose distortion can be modeled, such as a general perspective projection optical system or an optical system having a fisheye lens.
  • the shape of the screen 11 may be any shape that can be modeled, such as a flat surface, a curved surface, and a hemispherical surface (dome shape).
  • The image processing device 1 is provided with a projection control unit 21 that controls projection by the projectors 2-1 and 2-2, and a shooting control unit 22 that controls shooting by the cameras 3-1 and 3-2. As will be described later, the image processing device 1 is provided with various configurations in addition to the projection control unit 21 and the shooting control unit 22.
  • FIG. 2 is a diagram showing another configuration example of the projector / camera system.
  • In the projector/camera system of FIG. 2, sensors 4-1 and 4-2, each composed of an acceleration sensor and a gyro sensor, are provided.
  • the sensor data processing unit 23 of the image processing device 1 processes the sensor data supplied from the sensor 4-1 and the sensor data supplied from the sensor 4-2. Based on the processing result by the sensor data processing unit 23, the correction deviation as described later is detected.
  • When it is not necessary to distinguish between the projectors 2-1 and 2-2, they are collectively referred to as the projector 2.
  • When it is not necessary to distinguish between the cameras 3-1 and 3-2, they are collectively referred to as the camera 3.
  • When it is not necessary to distinguish between the sensors 4-1 and 4-2, they are collectively referred to as the sensor 4.
  • The image processing device 1 has a three-dimensional measurement function that automatically measures (estimates) the posture of each device and calculates parameters, and a geometric correction function that corrects the image projected from the projector 2 based on the calculated parameters. By performing geometric correction using the parameters calculated by the posture estimation, the projected image projected on the screen 11 becomes a high-quality one with little distortion.
  • the parameters calculated by the image processing device 1 by posture estimation include internal parameters and external parameters.
  • the internal parameters include information such as the focal length and principal point of the projector 2 and the camera 3. Other information, such as the distortion factor of the lens, may be estimated as an internal parameter.
  • the external parameters include information on the postures (rotation / translation) of the projector 2 and the camera 3.
  • Posture estimation for calculating such parameters is performed by inputting correspondence point information.
  • the corresponding point information is information representing corresponding points on the respective images of the projector image, which is an image projected by the projector 2, and the camera image, which is an image obtained by shooting with the camera 3.
  • the posture estimation for estimating the posture of the projector 2 and the camera 3 is performed, and the parameters are calculated. Further, the shape and position of the screen 11 are estimated based on the parameters calculated by the posture estimation.
  • Corresponding point detection is performed using a technique called structured light projection: a sensing pattern is projected from the projector, and the pattern projected on the screen is photographed by the camera to obtain the correspondence between the pixel positions of the projector image and the camera image.
  • initial correction including corresponding point detection and posture estimation is performed in advance at a predetermined timing such as before projecting an image of the content.
  • Typical sensing patterns include the binary code and the Gray code.
  • Corresponding point detection using these codes encodes each pixel of the projector image by sequentially projecting black-and-white (binary) patterns; the projected state is photographed and decoded, whereby the correspondence between each pixel of the projector image and each pixel of the camera image is obtained.
  • FIG. 3 is a diagram showing an example of coding using a Gray code.
  • the coding of each pixel of the projector image is performed by projecting a black and white pattern while switching in time series in each of the horizontal direction and the vertical direction.
  • eight patterns are used in each of the horizontal direction and the vertical direction.
  • For example, when the horizontal-direction patterns are projected, the pixel in the upper left corner is projected as "black, black, black, ...". Assuming that white is "1" and black is "0", the pixel in the upper left corner is encoded as "000...".
  • a striped pattern is projected in the horizontal direction and the vertical direction so that all the pixels of the projector image can be coded.
  • Decoding is performed by sequentially photographing the patterns projected on the screen and, as shown in B of FIG. 3, reading the black-and-white change of each pixel on the camera image as "1" (white) and "0" (black).
  • For example, the pixel at the position surrounded by a rectangle in B of FIG. 3 is represented by the code "00011000".
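  • As an illustration of the Gray-code scheme just described, the following Python sketch (illustrative code assumed for this description, not taken from the patent) generates the horizontal stripe patterns and decodes the black/white sequence observed at one camera pixel back into a projector column index; vertical stripes are handled the same way.

```python
import numpy as np

def gray_code_patterns(width: int, n_bits: int) -> np.ndarray:
    """Generate n_bits horizontal Gray-code stripe patterns.

    Returns an array of shape (n_bits, width); each row holds the
    0 (black) / 1 (white) value of one pattern for every projector column.
    """
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)               # binary-reflected Gray code
    shifts = np.arange(n_bits - 1, -1, -1)  # most significant bit first
    return ((gray[None, :] >> shifts[:, None]) & 1).astype(np.uint8)

def decode_gray(bit_sequence) -> int:
    """Decode the MSB-first black/white sequence observed at one camera
    pixel back into the projector column index."""
    g = 0
    for b in bit_sequence:
        g = (g << 1) | int(b)
    col, mask = g, g >> 1                   # convert Gray code to binary
    while mask:
        col ^= mask
        mask >>= 1
    return col
```

  • For a projector 1920 pixels wide, for example, 11 patterns per direction suffice, since 2^11 = 2048 ≥ 1920.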
  • FIG. 4 is a diagram showing an example of other coding.
  • the coding method shown in FIG. 4 is a method using a dot pattern in which a pattern representing a gray code is arranged at a position of a specific pixel group.
  • The checker pattern is a pattern in which substantially square black areas and substantially square white areas are arranged in a checkered pattern.
  • A dot pattern in which dots representing the Gray code are arranged at the positions of the corner points of the checker is projected, and coding is performed.
  • The projected dot pattern is photographed by the camera, and, as shown in FIG. 5, decoding is performed by reading the white/black change of each pixel on the camera image as "1" and "0".
  • Hereinafter, corresponding points on the projector image are appropriately referred to as projector corresponding points.
  • Corresponding points on the camera image are referred to as camera corresponding points.
  • In step S1, the image processing device 1 generates a coding pattern.
  • The coding pattern generated here is, for example, a pattern covering all the pixels of the image, as described with reference to FIG. 3.
  • In step S2, the image processing device 1 controls the projector 2 to project the coding pattern. Further, the image processing device 1 controls the camera 3 to photograph the coding pattern projected on the screen 11. The projection and photographing of the coding patterns are repeated (number of coding bits) × (number of projectors) × (number of cameras) times.
  • In step S3, the image processing device 1 performs the decoding process described above based on each camera image, and detects the corresponding points on the respective images of the projector image and the camera image.
  • FIG. 7 is a diagram showing an example of the result of corresponding point detection.
  • Corresponding point detection is performed by associating pixels on the projector image and pixels on the camera image having the same code as corresponding points.
  • Pixel P1 at the coordinates (X1, Y1) on the projector image and the pixel p1 at the coordinates (x1, y1) on the camera image are detected as corresponding points.
  • The pixel P1 is a projector corresponding point.
  • The pixel p1 is the camera corresponding point corresponding to the pixel P1 as a projector corresponding point.
  • Pixel P2 at the coordinates (X2, Y2) on the projector image and the pixel p2 at the coordinates (x2, y2) on the camera image are detected as corresponding points.
  • Pixels P3 to PN on the projector image and pixels p3 to pn on the camera image are also detected as corresponding points, respectively.
  • In this way, corresponding point information, which is information indicating the corresponding points on the respective images of the projector image and the camera image, is generated.
  • The corresponding point information is information representing correspondence relationships such as those shown in FIG. 7.
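  • As a sketch of how this correspondence table can be assembled, the following Python fragment (hypothetical helper names, not from the patent) pairs projector pixels and camera pixels whose decoded codes match, yielding corresponding point information of the kind shown in FIG. 7.

```python
import numpy as np

def match_corresponding_points(proj_codes: np.ndarray, cam_codes: np.ndarray):
    """Associate projector and camera pixels that decoded to the same code.

    proj_codes, cam_codes: 2-D integer arrays holding the decoded code of
    every pixel (-1 where decoding failed). Returns a list of
    ((X, Y) projector pixel, (x, y) camera pixel) pairs, i.e. the
    corresponding point information.
    """
    code_to_proj = {}
    for Y, X in np.argwhere(proj_codes >= 0):       # argwhere yields (row, col)
        code_to_proj[int(proj_codes[Y, X])] = (int(X), int(Y))

    pairs = []
    for y, x in np.argwhere(cam_codes >= 0):
        proj_px = code_to_proj.get(int(cam_codes[y, x]))
        if proj_px is not None:
            pairs.append((proj_px, (int(x), int(y))))
    return pairs
```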
  • In step S4 of FIG. 6, the image processing device 1 performs the posture estimation process.
  • The posture estimation process is performed based on the corresponding point information obtained by the corresponding point detection.
  • The posture estimation process calculates parameter estimated values, which are parameters representing the postures of the projector 2 and the camera 3.
  • In step S11, the image processing device 1 estimates the camera parameters based on the corresponding point information.
  • In step S12, the image processing device 1 estimates the projector parameters based on the corresponding point information. The camera parameter estimates are also used to estimate the projector parameters.
  • In step S13, the image processing device 1 optimizes the projector parameter estimated values and the camera parameter estimated values, and calculates the projector/camera parameter estimated values.
  • The parameter estimated values are optimized so that, for example, when ray tracing is performed from each projector corresponding point and each camera corresponding point, the traced rays meet at a single point on the screen 11; the projector parameter estimates and the camera parameter estimates are adjusted accordingly.
  • After the parameter estimated values are optimized, the process returns to step S4 in FIG. 6 and the subsequent processing is performed.
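  • A minimal sketch of this optimization step follows, under simplifying assumptions that are not stated in the patent: the camera defines the world frame, the internal parameter matrices K_cam and K_proj are already known, and only the projector pose (rotation vector plus translation) is refined by minimizing the gap between the projector ray and the camera ray of each corresponding point pair.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pixel_to_ray(pixel, K):
    """Back-project a pixel through the intrinsics K into a unit ray
    expressed in the device's own coordinate frame."""
    x, y = pixel
    d = np.linalg.inv(K) @ np.array([x, y, 1.0])
    return d / np.linalg.norm(d)

def ray_gap(c1, d1, c2, d2):
    """Shortest distance between two rays with centers c and directions d."""
    n = np.cross(d1, d2)
    nn = np.linalg.norm(n)
    if nn < 1e-12:                            # near-parallel rays
        return np.linalg.norm(np.cross(c2 - c1, d1))
    return abs(np.dot(c2 - c1, n)) / nn

def refine_projector_pose(pose0, K_proj, K_cam, pairs):
    """Adjust the 6-parameter projector pose so that projector and camera
    rays through each corresponding point pair come as close as possible
    to a single intersection point (on the screen surface)."""
    def residuals(pose):
        R = Rotation.from_rotvec(pose[:3]).as_matrix()
        t = pose[3:]
        return [ray_gap(t, R @ pixel_to_ray(pp, K_proj),
                        np.zeros(3), pixel_to_ray(cp, K_cam))
                for pp, cp in pairs]          # camera sits at the world origin
    return least_squares(residuals, pose0).x
```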
  • In step S5 of FIG. 6, the image processing device 1 performs geometric correction on the image projected from the projector 2, based on the parameter estimated values calculated by the posture estimation process.
  • the initial correction is a process that includes at least corresponding point detection and attitude estimation.
  • Corresponding point detection performed at the time of initial correction requires a large amount of processing because a plurality of patterns must be projected. For example, when coding in the time direction such as the Gray code is used, a pattern must be projected for each projector for every coding bit. Specifically, (number of coding bits) × (number of projectors) patterns must be projected, and as many shots must be taken as there are cameras; with 16 coding bits (8 horizontal and 8 vertical) and two projectors, for instance, 32 patterns are projected, each photographed by every camera.
  • correction deviations may occur due to physical contact with the device.
  • the correction deviation is a deviation between the projector corresponding point and the camera corresponding point detected by the corresponding point detection at the time of initial correction.
  • The re-correction for correcting the correction deviation does not repeat the initial correction described above; it is performed based on the information obtained by the initial correction.
  • FIG. 9 is a block diagram showing a configuration example of the hardware of the image processing device 1.
  • A CPU (Central Processing Unit) 51, a ROM (Read Only Memory) 52, and a RAM (Random Access Memory) 53 are connected to one another by a bus 54.
  • An input / output expansion bus 55 is further connected to the bus 54.
  • a GPU (Graphics Processing Unit) 56, a UI (User Interface) I / F 59, a communication I / F 62, and a recording I / F 63 are connected to the input / output expansion bus 55.
  • the GPU 56 renders the image projected from the projector 2 using the VRAM 57.
  • the GPU 56 generates a pattern to be projected from the projector 2 and outputs it to the display I / F 58.
  • the display I / F 58 is an interface for outputting projector images and the like.
  • the display I / F 58 is configured as an interface of a predetermined standard such as HDMI (registered trademark) (High-Definition Multimedia Interface).
  • The UI I/F 59 is an interface for detecting operations.
  • the UI I / F 59 detects a user's operation performed by using the keyboard 60 or the mouse 61, and outputs information indicating the content of the operation to the CPU 51.
  • the communication I / F 62 is an interface for communication with an external device.
  • the communication I / F 62 is composed of network interfaces such as wireless LAN and wired LAN.
  • the communication I / F 62 communicates with an external device via a network such as the Internet, and transmits / receives various data.
  • the content played back in the projector / camera system may be provided from the server via the network.
  • the communication I / F 62 receives the captured image captured by the camera 3 and transmitted from the camera 3.
  • the communication I / F 62 receives various data transmitted from an external device, such as sensor data transmitted from the sensor 4.
  • the recording I / F 63 is an interface for a recording medium.
  • a recording medium such as an HDD 64 or a removable medium 65 is attached to the recording I / F 63.
  • the recording I / F 63 reads the data recorded on the mounted recording medium and writes the data to the recording medium.
  • various data such as programs executed by the CPU 51 are recorded in the HDD 64.
  • FIG. 10 is a block diagram showing a functional configuration example of the image processing device 1.
  • A control unit 101 is realized in the image processing device 1 by the CPU 51 executing a predetermined program.
  • As shown in FIG. 10, the control unit 101 is composed of a coding pattern generation unit 111, a corresponding point detection unit 112, a posture estimation unit 113, a correction information storage unit 114, a geometric correction unit 115, a correction deviation detection unit 116, a re-correction pattern generation unit 117, and an update unit 118.
  • At the time of initial correction, the projection control unit 21 projects the plurality of coding patterns generated by the coding pattern generation unit 111 from each projector 2.
  • At the time of re-correction, the projection control unit 21 projects the re-correction pattern generated by the re-correction pattern generation unit 117 from the projector 2 that caused the correction deviation.
  • The re-correction pattern is a pattern for re-correcting the correction deviation; only one pattern is generated for each projector 2 that caused the correction deviation.
  • The shooting control unit 22 controls the camera 3 at the time of initial correction to photograph the coding pattern projected on the screen 11.
  • The shooting control unit 22 acquires a captured image in which the coding pattern appears and outputs it as a camera image to the corresponding point detection unit 112.
  • The shooting control unit 22 controls the camera 3 at the time of re-correction to photograph the re-correction pattern projected on the screen 11.
  • The shooting control unit 22 acquires a captured image in which the re-correction pattern appears and outputs it as a camera image to the update unit 118.
  • At the time of initial correction, the coding pattern generation unit 111 generates a plurality of coding patterns as sensing patterns, as described with reference to FIGS. 3 and 4.
  • the process of step S1 in FIG. 6 is a process performed by the coding pattern generation unit 111.
  • the coded pattern generated by the coded pattern generation unit 111 is supplied to the projection control unit 21 and the corresponding point detection unit 112 as a projector image.
  • The corresponding point detection unit 112 detects corresponding points based on the coding pattern generated by the coding pattern generation unit 111 and the camera image, supplied from the shooting control unit 22, in which the coding pattern appears.
  • the process of step S3 in FIG. 6 is a process performed by the corresponding point detection unit 112.
  • each pixel of the camera image as described with reference to FIG. 3 and the like is decoded, and the corresponding point is detected.
  • the correspondence point information obtained by the correspondence point detection by the correspondence point detection unit 112 is supplied to the posture estimation unit 113 and the correction information storage unit 114.
  • the posture estimation unit 113 estimates the posture based on the correspondence point information supplied from the correspondence point detection unit 112, and calculates the parameter estimation value.
  • the process of step S4 in FIG. 6 is a process performed by the posture estimation unit 113.
  • the parameter estimated value calculated by the posture estimation unit 113 is supplied to the correction information storage unit 114 and the geometric correction unit 115.
  • the correction information storage unit 114 stores the corresponding point information supplied from the corresponding point detecting unit 112 and the parameter estimated value supplied from the posture estimation unit 113.
  • the information stored in the correction information storage unit 114 is read out at the time of re-correction.
  • the correction information storage unit 114 stores the updated corresponding point information and the parameter estimated value updated by the update unit 118 at the time of re-correction.
  • the geometric correction by the geometric correction unit 115 after the re-correction is performed based on the updated information stored in the correction information storage unit 114.
  • the geometric correction unit 115 performs geometric correction of the image projected from each projector 2 based on the parameter estimation value supplied from the posture estimation unit 113.
  • the process of step S5 in FIG. 6 is a process performed by the geometric correction unit 115.
  • the corrected image that has been geometrically corrected by the geometric correction unit 115 is supplied to the projection control unit 21 via a path (not shown) and projected.
  • The correction deviation detection unit 116 detects correction deviation based on information input from the outside. For example, when an impact is applied to a given projector 2 and the sensor 4 detects that its posture has shifted, this is detected as an occurrence of correction deviation.
  • When the correction deviation detection unit 116 detects that a correction deviation has occurred, it compares the deviation amount of the correction deviation with a threshold value. When the deviation amount is smaller than the threshold value (minute), the correction deviation detection unit 116 outputs information indicating that re-correction is to be performed to the re-correction pattern generation unit 117 and the update unit 118.
  • When the deviation amount is larger than the threshold value, the correction deviation detection unit 116 outputs information indicating that the same correction as the initial correction is to be performed again to the coding pattern generation unit 111.
  • the re-correction pattern generation unit 117 reads out the corresponding point information obtained by the initial correction from the correction information storage unit 114 and acquires it. Further, the re-correction pattern generation unit 117 generates a re-correction pattern based on the corresponding point information before the occurrence of the correction deviation read from the correction information storage unit 114.
  • the re-correction pattern is generated based on the corresponding point information obtained at the time of initial correction.
  • the recorrection pattern generated by the recorrection pattern generation unit 117 is supplied to the projection control unit 21 and projected from the projector 2 that causes the correction deviation.
  • the update unit 118 reads out the corresponding point information and the parameter estimated value obtained by the initial correction from the correction information storage unit 114 and acquires them.
  • The update unit 118 detects the positions of the camera corresponding points after the correction deviation has occurred based on the camera image, and updates the corresponding point information according to the detected positions.
  • the update unit 118 updates the parameter estimated value by referring to the corresponding point information after the update.
  • the corresponding point information and the parameter estimated value updated by the update unit 118 are stored in the correction information storage unit 114 and used for the subsequent geometric correction.
  • The process of FIG. 11 is started after the initial correction process described with reference to FIG. 6 has been performed and the projector/camera system is in a corrected state.
  • In step S31, the correction deviation detection unit 116 detects a correction deviation based on information input from the outside.
  • In step S32, the correction deviation detection unit 116 determines whether or not the deviation amount of the correction deviation is minute.
  • When the deviation amount is less than the threshold value, it is determined to be minute.
  • When the deviation is determined to be minute, re-correction is performed using the corresponding point information from before the occurrence of the correction deviation and the parameter estimated values obtained by the initial correction.
  • the corresponding point information read from the correction information storage unit 114 is supplied to the re-correction pattern generation unit 117 and the update unit 118, and the parameter estimated value is supplied to the update unit 118.
  • In step S33, the corresponding point information update process is performed.
  • In the corresponding point information update process, the corresponding point information from before the correction deviation occurred is updated, and updated corresponding point information is generated.
  • The corresponding point information update process will be described later with reference to the flowchart of FIG. 14.
  • In step S34, the parameter estimated value update process is performed.
  • In the parameter estimated value update process, the parameter estimated values from before the correction deviation occurred are updated, and updated parameter estimated values are generated.
  • The parameter estimated value update process will be described later with reference to the flowchart of FIG. 20.
  • In step S35, the geometric correction unit 115 performs geometric correction based on the updated parameter estimated values.
  • The geometrically corrected image is supplied to the projection control unit 21 and projected from the projector 2. As a result, an image that has undergone geometric correction taking the minute correction deviation into account is projected.
  • If it is determined in step S32 that the deviation amount of the correction deviation is not minute, that is, the deviation amount is larger than the threshold value, the process proceeds to step S36 and the initial correction process is performed.
  • In the initial correction process, the same processing as described with reference to FIG. 6 is performed: corresponding point information is obtained by corresponding point detection using a plurality of coding patterns, and posture estimation is performed based on the corresponding point information.
  • After the initial correction process is performed in step S36, the process proceeds to step S35, and the geometric correction unit 115 performs geometric correction using the parameter estimated values obtained by the initial correction process.
  • The re-correction thus consists of the detection of the correction deviation (change in corresponding points) in step S31, the corresponding point information update process in step S33, the parameter estimated value update process in step S34, and the geometric correction in step S35.
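  • Expressed as control flow, the branch between re-correction and full initial correction can be sketched as follows; the object and method names are hypothetical, and only the step structure comes from the flowchart of FIG. 11.

```python
def run_recorrection(system, threshold):
    """Schematic control flow of steps S31 to S36 of FIG. 11."""
    deviation = system.detect_correction_deviation()          # step S31
    if deviation.amount < threshold:                          # step S32: minute?
        corr = system.stored_correspondence()                 # from initial correction
        corr = system.update_correspondence(corr, deviation)  # step S33
        params = system.update_parameter_estimates(corr)      # step S34
    else:
        params = system.run_initial_correction()              # step S36: full redo
    system.apply_geometric_correction(params)                 # step S35
```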
  • the correction deviation occurs due to physical contact with the device (projector 2, camera 3, screen 11), aging of the device, adjustment of the zoom / shift function of the projector 2, and the like.
  • FIG. 12 is a diagram showing the relationship of each device before the occurrence of correction deviation.
  • the shooting ranges of the cameras 3-1 and 3-2 include the projection range of the projector 2-1 and the projection range of the projector 2-2, respectively.
  • the images projected from the projector 2-1 are taken by the cameras 3-1 and 3-2, respectively, and the images projected from the projector 2-2 are taken by the cameras 3-1 and 3-2, respectively.
  • For the pixel at the position p1 on the projector image projected by the projector 2-1, the pixel at the position p11-1 on the camera image of the camera 3-1 is the corresponding point, and on the camera image of the camera 3-2, the pixel at the position p12-1 is the corresponding point. As shown by the broken lines passing through each corresponding point, the rays of the pixels serving as corresponding points intersect at the position P1 on the screen 11, and there is no deviation of the corresponding points.
  • Similarly, for the pixel at the position p2 on the projector image projected by the projector 2-2, the pixel at the position p11-2 on the camera image of the camera 3-1 is the corresponding point, and on the camera image of the camera 3-2, the pixel at the position p12-2 is the corresponding point. As shown by the broken lines passing through each corresponding point, the rays of the pixels serving as corresponding points intersect at the position P2 on the screen 11, and there is no deviation of the corresponding points.
  • FIG. 13 is a diagram showing the relationship between each device after the correction deviation occurs.
  • In the example of FIG. 13, the positions of the projector corresponding points and the camera corresponding points between the projector 2-2 and the camera 3-1 and between the projector 2-2 and the camera 3-2 are displaced as shown by the arrow A2.
  • In this way, the correction deviation is defined as a state in which the position of the corresponding point on the camera image has changed with respect to the corresponding point on the projector image due to some change in the projector 2, the camera 3, or the screen 11.
  • A minute correction deviation means a small deviation such that the projection position of the projector 2 deviates slightly on the camera image.
  • the correction deviation due to physical contact with the device is detected, for example, based on the measurement result by the sensor 4 provided as shown in FIG.
  • The correction deviation detection unit 116 identifies the device that caused the correction deviation based on the measurement results of the sensors 4, each including an acceleration sensor and a gyro sensor, supplied from the sensor data processing unit 23, and starts re-correction with this as a trigger.
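  • A crude version of such a sensor-based trigger might look like the sketch below; the thresholds, sampling layout, and function name are assumptions of this description.

```python
import numpy as np

def posture_disturbed(gyro_samples: np.ndarray, accel_samples: np.ndarray,
                      gyro_thresh: float, accel_thresh: float) -> bool:
    """Flag a possible correction deviation when the angular rate or the
    deviation of the measured acceleration from gravity exceeds a threshold.

    gyro_samples: (N, 3) angular rates in rad/s.
    accel_samples: (N, 3) accelerations in m/s^2.
    """
    peak_rate = np.linalg.norm(gyro_samples, axis=1).max()
    accel_dev = np.abs(np.linalg.norm(accel_samples, axis=1) - 9.81).max()
    return peak_rate > gyro_thresh or accel_dev > accel_thresh
```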
  • Correction deviation due to changes of the projector 2 over time or adjustment of the zoom/shift function is detected using, for example, a pattern for detecting the deviation of corresponding points.
  • the image in which the pattern is embedded is projected from the projector 2, and the projected image is captured by the camera 3.
  • the projection of the projector 2 and the shooting of the camera 3 are performed in synchronization.
  • The correction deviation detection unit 116 detects the correction deviation based on the result of corresponding point detection performed by the corresponding point detection unit 112 using, for example, the pattern embedded in the image projected by the projector 2 and the pattern appearing in the camera image.
  • The correction deviation detection unit 116 identifies the device that caused the correction deviation, and starts re-correction with this as a trigger.
  • the occurrence of correction deviation may be manually input by the user who is the administrator of the projector / camera system.
  • For example, the user checks for a change in the overlap region where the projected images overlap, or a change in the corresponding points, while a reference image such as a crosshatch or a coding pattern for sensing is projected.
  • The correction deviation detection unit 116 detects the correction deviation based on the result of the user's check, identifies the device that caused the correction deviation, and starts re-correction with this as a trigger.
  • the device that caused the correction deviation may be input by the user.
  • In step S51, the re-correction pattern generation unit 117 reads out from the correction information storage unit 114 and acquires the corresponding point information from before the occurrence of the correction deviation, obtained by the initial correction. Further, the update unit 118 reads out from the correction information storage unit 114 and acquires the corresponding point information and the parameter estimated values from before the occurrence of the correction deviation.
  • In step S52, the re-correction pattern generation unit 117 identifies the projector 2 and the camera 3 involved in the correction deviation, based on the information supplied from the correction deviation detection unit 116 and the like.
  • The projector 2 that caused the correction deviation is set as the target projector. Further, the camera 3 affected by the correction deviation is set as the target camera.
  • In the example of FIG. 13, the projector 2-2 is set as the target projector.
  • The cameras 3-1 and 3-2, which capture the image projected by the projector 2-2 as the target projector, are each set as target cameras.
  • In step S53, the re-correction pattern generation unit 117 generates a re-correction pattern for each target projector.
  • Basically, one re-correction pattern is generated for each target projector.
  • Depending on the situation, a plurality of re-correction patterns may be generated for each target projector.
  • The re-correction pattern is generated by identifying the positions of the projector corresponding points based on the corresponding point information from before the correction deviation occurred, and arranging dots of a predetermined pattern based on the identified positions.
  • FIG. 15 is a diagram showing an example of a recorrection pattern.
  • Each dot has a circular shape with a predetermined diameter and is arranged so that, for example, the position of the projector corresponding point identified from the corresponding point information coincides with its centroid.
  • That is, the centroid position of each dot constituting the re-correction pattern is the position of one of the pixels P1 to PN on the projector image.
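  • The generation of such a dot pattern can be sketched as follows, assuming OpenCV for rasterization and white dots on a black background; the function name and parameters are illustrative.

```python
import numpy as np
import cv2

def make_recorrection_pattern(proj_size, proj_points, radius=5):
    """Render a re-correction pattern: one filled circular dot whose
    centroid lies on each stored projector corresponding point.

    proj_size: (width, height) of the projector image.
    proj_points: iterable of (X, Y) projector corresponding points taken
    from the corresponding point information of the initial correction.
    """
    w, h = proj_size
    pattern = np.zeros((h, w), dtype=np.uint8)
    for X, Y in proj_points:
        cv2.circle(pattern, (int(round(X)), int(round(Y))), radius, 255, -1)
    return pattern
```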
  • the range projected on the screen 11 in the entire projector image is specified, for example, based on the shape of the screen 11 estimated at the time of initial correction.
  • the shape of the screen 11 is estimated at the time of geometric correction in the initial correction based on the parameter estimated value calculated by the posture estimation.
  • Sensing points that are not projected on the screen 11 may become targets of corresponding point detection and the like, which causes a decrease in correction accuracy. Further, depending on the shape of the screen 11, the projected pattern may be greatly distorted and high-precision sensing becomes impossible, so that correction accuracy may decrease. Sensing points that adversely affect the final geometric correction need to be removed as outliers when decoding the coding pattern or estimating the posture.
  • By arranging dots only within the range projected on the screen 11, the corresponding point information can be updated using only effective dots, and a decrease in correction accuracy can be prevented.
  • Since processing such as outlier removal is not required, efficient processing is possible.
  • In step S54, the projection control unit 21 controls the target projector and projects the re-correction pattern.
  • In step S55, the shooting control unit 22 controls the target camera to photograph the re-correction pattern projected on the screen 11.
  • The camera image obtained by photographing the re-correction pattern projected on the screen 11 is supplied to the update unit 118.
  • In step S56, the shooting control unit 22 determines whether or not the re-correction pattern has been photographed by all the target cameras, switching target cameras and repeating the process of step S55 until it determines that it has.
  • If it is determined in step S56 that the re-correction pattern has been photographed by all the target cameras, the process proceeds to step S57.
  • In step S57, the projection control unit 21 determines whether or not the re-correction pattern has been projected from all the target projectors, switching target projectors and repeating the above processing until it determines that it has.
  • If it is determined in step S57 that the re-correction pattern has been projected from all the target projectors, the process proceeds to step S58.
  • In step S58, the update unit 118 detects the position of each camera corresponding point after the correction deviation has occurred, based on the camera images supplied from the shooting control unit 22.
  • The position of a camera corresponding point after the correction deviation has occurred is represented as the centroid position of a dot appearing in the camera image.
  • The update unit 118 detects the centroid positions of the dots appearing in the image of each camera after the correction deviation has occurred.
  • In step S59, the update unit 118 updates the position information of the camera corresponding points included in the pre-deviation corresponding point information, based on the detected post-deviation positions of the camera corresponding points.
  • FIG. 17 is a diagram showing an example of a camera image in which a pattern for re-correction is captured.
  • the position of the dot D after the correction deviation occurs is shifted from the position before the correction deviation occurs in the direction indicated by the white arrow.
  • FIG. 18 is a diagram schematically showing the deviation of the dot D.
  • the center of the circle indicated by the broken line represents the position of the dot D before the correction deviation occurs, that is, the position of the camera corresponding point before the correction deviation occurs.
  • the circle shown by the solid line represents the position of the dot D after the correction deviation occurs.
  • The update unit 118 performs a nearest neighbor search on the camera image in which the re-correction pattern appears, starting from the position of each camera corresponding point before the correction deviation occurred, and thereby identifies the dot located at the post-deviation position of that camera corresponding point.
  • The update unit 118 updates the position information of the camera corresponding points included in the corresponding point information by setting the position (centroid position) of each dot after the correction deviation as the new position of the camera corresponding point.
  • In other words, the position information of the camera corresponding points is updated by performing a nearest neighbor search from each pre-deviation position so as to track the post-deviation position of each camera corresponding point (a sketch of this tracking follows below).
  • The minute correction deviation that triggers re-correction is, for example, a deviation small enough that the post-deviation position of each camera corresponding point can be tracked by a nearest neighbor search from its pre-deviation position; that is, the post-deviation position of a given camera corresponding point does not move beyond the pre-deviation position of an adjacent camera corresponding point.
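  • The dot detection and nearest neighbor tracking described here can be sketched as follows, assuming OpenCV for centroid extraction and a k-d tree for the search; the binarization choice and names are assumptions.

```python
import numpy as np
import cv2
from scipy.spatial import cKDTree

def track_camera_points(cam_image: np.ndarray, prev_points: np.ndarray):
    """Detect dot centroids in the captured re-correction image and return,
    for each pre-deviation camera corresponding point, the nearest new
    centroid as its updated position.

    cam_image: grayscale capture of the projected re-correction pattern.
    prev_points: (N, 2) array of (x, y) camera corresponding points from
    before the correction deviation occurred.
    """
    _, binary = cv2.threshold(cam_image, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    _, _, _, centroids = cv2.connectedComponentsWithStats(binary)
    dots = centroids[1:]                  # index 0 is the background component
    tree = cKDTree(dots)
    _, idx = tree.query(prev_points)      # nearest new dot for each old point
    return dots[idx]                      # updated (x', y') positions
```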
  • FIG. 19 is a diagram showing an example of updating the corresponding point information.
  • the projector correspondence points shown on the left side of FIG. 19 are the same as those described with reference to FIG. 7.
  • The camera corresponding point corresponding to the pixel P1 as a projector corresponding point has been updated from the pixel p1 at the coordinates (x1, y1) on the camera image (FIG. 7) to the pixel p'1 at the coordinates (x'1, y'1).
  • Similarly, the camera corresponding point corresponding to the pixel P2 as a projector corresponding point has been updated from the pixel p2 at the coordinates (x2, y2) on the camera image to the pixel p'2 at the coordinates (x'2, y'2).
  • Other camera correspondence points are also updated as pixels p'3 to p'n, respectively.
  • the process returns to step S33 in FIG. 11 and the subsequent processing is performed.
  • the updated correspondence point information is supplied to and stored in the correction information storage unit 114.
  • Since the uniqueness resolution by the coding/decoding process using a time-direction coding pattern such as the Gray code is completed at the time of the initial correction, before the correction deviation occurs, it is not necessary at re-correction time to project a number of patterns corresponding to the number of coding bits.
  • Here, uniqueness resolution means identifying the correspondence between feature points on the projector image and feature points on the camera image.
  • In the parameter estimated value update process, the parameter estimated values are updated using the updated corresponding point information and the parameter estimated values from before the correction deviation occurred. Based on the updated corresponding point information, the posture of each device after the correction deviation can be estimated.
  • In step S71, the update unit 118 determines whether or not the deviation amount of the correction deviation is less than a threshold value.
  • If the deviation amount is not less than the threshold value, the update unit 118 estimates the posture of the target camera based on the updated corresponding point information in step S72.
  • This posture estimation is performed only for the cameras 3 affected by the correction deviation.
  • In step S73, the update unit 118 estimates the posture of the target projector based on the updated corresponding point information and the target camera parameter estimated values calculated by the posture estimation of the target camera.
  • This posture estimation is performed only for the projectors 2 that caused the correction deviation.
  • In step S74, the update unit 118 optimizes the target camera parameter estimated values calculated by the posture estimation of the target camera and the target projector parameter estimated values calculated by the posture estimation of the target projector.
  • The optimization of the parameter estimated values is the same as the optimization performed in the posture estimation at the time of initial correction.
  • On the other hand, if it is determined in step S71 that the deviation amount of the correction deviation is less than the threshold value, the posture estimation of steps S72 and S73 is skipped, and in step S74 the parameter estimated values from before the correction deviation occurred are optimized.
  • The update unit 118 replaces the pre-deviation parameter estimated values with the parameter estimated values obtained by the optimization performed according to the deviation amount of the correction deviation.
  • After the parameter estimated values are optimized in step S74, the process returns to step S34 in FIG. 11 and the subsequent processing is performed.
  • The projector corresponding points and the camera corresponding points can be calculated as points in three-dimensional space by the principle of triangulation.
  • The positions of the projector corresponding points and the camera corresponding points in three-dimensional space are specified by the parameter estimated values (internal parameters and external parameters) calculated by the parameter estimated value update process as described above.
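  • As a sketch of this triangulation, the midpoint method below recovers the 3D point from a projector ray and a camera ray already expressed in a common world frame (the centers and directions would be derived from the updated internal and external parameters); this is one standard construction, assumed here rather than prescribed by the patent.

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Return the 3-D point halfway between the closest points of two rays,
    given ray centers c1, c2 and unit directions d1, d2 in world coordinates."""
    # Solve for t1, t2 minimizing |(c1 + t1 d1) - (c2 + t2 d2)| in least squares
    A = np.stack([d1, -d2], axis=1)
    t, *_ = np.linalg.lstsq(A, c2 - c1, rcond=None)
    p1 = c1 + t[0] * d1
    p2 = c2 + t[1] * d2
    return (p1 + p2) / 2
```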
  • As described above, updating of the corresponding point information after a correction deviation caused by physical contact or aging can be performed efficiently by using the re-correction pattern, a sensing pattern generated based on the information from before the correction deviation occurred.
  • The number of patterns that must be projected equals the number of projectors 2 that caused the correction deviation. Even when the number of projectors 2 increases, the increase in the number of projections and shots required for re-correction can be suppressed.
  • In the above description, a dot pattern is projected as the re-correction pattern, but any pattern may be used for re-correction as long as it is a sparse pattern from which corresponding points can be obtained with accuracy sufficient for posture estimation.
  • For example, a checker pattern that allows the projector corresponding points from before the correction deviation to be detected as corner points may be used as the re-correction pattern.
  • An equilateral-triangle tiling pattern or a ChArUco marker may also be used as the re-correction pattern.
  • the re-correction pattern may be generated for each combination of the projector that caused the correction deviation and the camera that captures the image projected by the projector.
  • pixels of a plurality of camera images are associated with pixels of one projector image.
  • the camera image of one camera includes corresponding points corresponding to the pixels of the camera image of another camera.
  • By using one re-correction pattern for each combination of the projector that caused the correction deviation and a camera that captures the image projected by that projector, corresponding points for the pixels of another camera's image need not be included as dots of the re-correction pattern, so the corresponding points can be updated efficiently.
  • In this case, the number of patterns generated is (number of projectors) × (number of cameras), but the total number of shots does not change.
  • FIG. 21 is a diagram showing another configuration example of the projector / camera system.
  • the above re-correction process can also be applied to a system in which three or more projectors 2 are provided as shown in FIG.
  • The projector 2-1 and the camera 3-1 are provided in the projection/photographing device 201-1, and the projector 2-2 and the camera 3-2 are provided in the projection/photographing device 201-2; one projector 2 and one camera 3 are provided as a device having the same housing.
  • Similarly, the projector 2-3 and the camera 3-3 are provided in the projection/photographing device 201-3, and the projector 2-4 and the camera 3-4 are provided in the projection/photographing device 201-4.
  • In this way, three or more projectors 2 may be provided, and the projector 2 and the camera 3 may be provided in the same housing.
  • the frames # 1 to # 4 shown on the screen 11 represent the projection areas of the projectors 2-1 to 2-4, respectively.
  • the re-correction process can also be applied when one composite image is projected using such four projectors 2.
  • the overlap region may be formed by a combination of projection regions of any projector 2.
  • FIG. 22 is a diagram showing still another configuration example of the projector / camera system.
  • The projector 2 may be configured as a fisheye projector, that is, a projector equipped with a fisheye lens.
  • The camera 3 may be configured as a fisheye camera, that is, a camera equipped with a fisheye lens.
  • the screen 11 may be a dome-shaped screen whose projection surface is a hemispherical surface.
  • the series of processes described above can be executed by hardware or software.
  • the programs constituting the software are installed from the program recording medium on a computer embedded in dedicated hardware or a general-purpose personal computer.
  • The program to be installed is provided recorded on the removable media 65 shown in FIG. 9, consisting of an optical disc (a CD-ROM (Compact Disc-Read Only Memory), a DVD (Digital Versatile Disc), or the like) or a semiconductor memory, or is provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting.
  • the program can be installed in ROM 52 or HDD 64 in advance.
  • The program executed by the computer may be a program in which the processing is performed in chronological order in the order described in this specification, or a program in which the processing is performed in parallel or at a necessary timing, such as when a call is made.
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • this technology can have a cloud computing configuration in which one function is shared by a plurality of devices via a network and processed jointly.
  • each step described in the above flowchart can be executed by one device or can be shared and executed by a plurality of devices.
  • When one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared and executed by a plurality of devices.
  • the present technology can also have the following configurations.
  • An image processing device including: a detection unit that detects a deviation of corresponding points occurring after an initial correction, the initial correction including corresponding point detection that detects corresponding points on each image based on a projected image projected by a projection device and a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection; a generation unit that generates a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image, using the corresponding point information, which is information indicating the relationship between the corresponding points obtained by the initial correction; and an update unit that detects the deviation of the corresponding points on the captured image based on a captured image obtained by photographing the re-correction pattern projected from the projection device, and updates the corresponding point information.
  • The image processing device described above, wherein the update unit updates the parameter estimated values by adjusting, according to the deviation of the corresponding points, the parameter estimated values from before the deviation of the corresponding points occurred.
  • The image processing device according to any one of (1) to (8) above, further including a projection control unit that controls projection by the projection device, and a shooting control unit that controls shooting by the photographing device.
  • The image processing device described above, wherein the projection control unit controls projection by a plurality of the projection devices.
  • The image processing device according to (9) or (10) above, wherein the projection control unit projects the re-correction pattern from the projection device that caused the deviation of the corresponding points, and the shooting control unit causes the projected re-correction pattern to be photographed by the photographing device that captures the projected image projected by the projection device that projects the re-correction pattern.
  • (12) The image processing device according to any one of (1) to (11) above, wherein the projection surface of the projected image is a flat screen, a curved screen, or a spherical screen.
  • The image processing device according to any one of (1) to (12) above, wherein the corresponding point detection at the time of the initial correction uses the projected images representing a plurality of predetermined patterns and the plurality of captured images obtained by photographing each of the projected predetermined patterns.
  • The image processing device according to any one of (1) to (13) above, further including a control unit that updates the corresponding point information using the re-correction pattern when the deviation amount of the corresponding points is smaller than the threshold value, and performs the same processing as the initial correction when the deviation amount of the corresponding points is larger than the threshold value.
  • An image processing method in which an image processing device detects a deviation of corresponding points occurring after an initial correction, the initial correction including corresponding point detection that detects corresponding points on each image based on a projected image projected by a projection device and a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection; generates a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image, using the corresponding point information, which is information indicating the relationship between the corresponding points obtained by the initial correction; and detects the deviation of the corresponding points on the captured image based on a captured image obtained by photographing the re-correction pattern projected from the projection device, and updates the corresponding point information.
  • A program causing a computer to execute processing of: detecting a deviation of corresponding points occurring after initial correction, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection; generating a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image, on the basis of corresponding point information indicating the relationship between the corresponding points obtained by the initial correction; and detecting the deviation of the corresponding points on the captured image based on the captured image obtained by photographing the projected re-correction pattern, and updating the corresponding point information.
  • An image processing system including: an image processing device equipped with a detection unit that detects a deviation of corresponding points occurring after initial correction, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection, a generation unit that generates a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image, on the basis of corresponding point information indicating the relationship between the corresponding points obtained by the initial correction, and an update unit that detects the deviation of the corresponding points on the captured image based on the captured image obtained by photographing, with the photographing device, the re-correction pattern projected from the projection device, and updates the corresponding point information; a plurality of the projection devices; and a plurality of the photographing devices.
  • 1 Image processing device, 2-1, 2-2 Projector, 3-1, 3-2 Camera, 21 Projection control unit, 22 Shooting control unit, 101 Control unit, 111 Coding pattern generation unit, 112 Corresponding point detection unit, 113 Posture estimation unit, 114 Correction information storage unit, 115 Geometric correction unit, 116 Correction deviation detection unit, 117 Re-correction pattern generation unit, 118 Update unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Projection Apparatus (AREA)

Abstract

The present technology relates to an image processing device, an image processing method, a program, and an image processing system which enable re-correction to be easily performed when a mismatch occurs between corresponding points on a projection image projected by a projection device and a captured image captured by an image-capturing device. An image processing device according to an aspect of the present technology detects a mismatch between corresponding points which occurs after initial correction has been performed. The initial correction includes corresponding point detection, in which the corresponding points on the projection image and the captured image are detected, and orientation estimation, in which the orientations of the projection device and the image-capturing device are estimated on the basis of the result of the corresponding point detection. In addition, on the basis of corresponding point information obtained by the initial correction, the image processing device generates a re-correction pattern based on the positions of the corresponding points on the projection image, detects the mismatch between the corresponding points on the basis of a captured image obtained by capturing an image of the re-correction pattern projected from the projection device, and updates the corresponding point information. The present technology can be applied to an image processing device which controls the projection of a projector.

Description

Image processing device, image processing method, program, and image processing system
The present technology relates to an image processing device, an image processing method, a program, and an image processing system, and in particular to an image processing device, an image processing method, a program, and an image processing system that make it easy to perform re-correction when a deviation occurs between corresponding points on a projected image projected by a projection device and on a captured image captured by a photographing device.
There is a technique for projecting a single image onto a screen of a predetermined shape, such as a flat, curved, or dome-shaped screen, using a plurality of projectors. Projecting a single image with a plurality of projectors requires distortion correction that takes the shape of the screen into account, as well as estimation of the overlap region where the images projected from the projectors overlap.
This can be achieved by calibration using the three-dimensional shape measurement function of a projector/camera system. A projector/camera system is a system provided with projectors and cameras, in which the posture of each device can be estimated by photographing, with a camera, an image of a predetermined pattern projected from a projector.
By estimating the parameters representing the postures of the projectors and the cameras and geometrically correcting the images projected from the projectors according to the estimation results, high-quality images can be projected.
Japanese Unexamined Patent Publication No. 2016-14720
After initial correction including calibration has been performed once, a correction deviation may occur due to changes in the projectors over time or physical contact with the devices. A correction deviation means that a deviation arises at the corresponding points, once detected by the initial correction, on the image projected by a projector and the image captured by a camera.
Even a slight correction deviation has a large effect on the quality of the viewing experience.
The present technology has been made in view of such a situation, and makes it possible to easily perform re-correction when a deviation occurs between corresponding points on a projected image projected by a projection device and on a captured image captured by a photographing device.
An image processing device according to one aspect of the present technology includes: a detection unit that detects a deviation of corresponding points occurring after initial correction, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection; a generation unit that generates a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image, on the basis of corresponding point information indicating the relationship between the corresponding points obtained by the initial correction; and an update unit that detects the deviation of the corresponding points on the captured image based on the captured image obtained by photographing, with the photographing device, the re-correction pattern projected from the projection device, and updates the corresponding point information.
In one aspect of the present technology, a deviation of corresponding points occurring after initial correction is detected, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by photographing the projected image with a photographing device, and posture estimation that estimates the postures of the projection device and the photographing device based on the result of the corresponding point detection. Further, a re-correction pattern composed of a predetermined pattern based on the positions of the corresponding points on the projected image is generated on the basis of the corresponding point information obtained by the initial correction, the deviation of the corresponding points on the captured image is detected based on the captured image obtained by photographing, with the photographing device, the re-correction pattern projected from the projection device, and the corresponding point information is updated.
Fig. 1 is a diagram showing a configuration example of a projector/camera system according to an embodiment of the present technology.
Fig. 2 is a diagram showing another configuration example of the projector/camera system.
Fig. 3 is a diagram showing an example of encoding using Gray codes.
Fig. 4 is a diagram showing another example of encoding.
Fig. 5 is a diagram showing an example of decoding.
Fig. 6 is a flowchart explaining the initial correction process.
Fig. 7 is a diagram showing an example of the result of corresponding point detection.
Fig. 8 is a flowchart explaining the posture estimation process.
Fig. 9 is a block diagram showing a hardware configuration example of the image processing device.
Fig. 10 is a block diagram showing a functional configuration example of the image processing device.
Fig. 11 is a flowchart explaining the re-correction process.
Fig. 12 is a diagram showing the relationship between the devices before a correction deviation occurs.
Fig. 13 is a diagram showing the relationship between the devices after a correction deviation occurs.
Fig. 14 is a flowchart explaining the corresponding point information update process performed in step S33 of Fig. 11.
Fig. 15 is a diagram showing an example of a re-correction pattern.
Fig. 16 is a diagram showing an example of a re-correction pattern.
Fig. 17 is a diagram showing an example of a camera image in which a re-correction pattern appears.
Fig. 18 is a diagram schematically showing the deviation of dots.
Fig. 19 is a diagram showing an example of updating the corresponding point information.
Fig. 20 is a flowchart explaining the parameter estimated value update process performed in step S34 of Fig. 11.
Fig. 21 is a diagram showing another configuration example of the projector/camera system.
Fig. 22 is a diagram showing another configuration example of the projector/camera system.
Hereinafter, modes for implementing the present technology will be described. The description will be given in the following order.
1. Projector/camera system
2. Corresponding point detection at the time of initial correction
3. Initial correction processing
4. Configuration of the image processing device
5. Operation of the image processing device
6. Detection of correction deviation (change of corresponding points)
7. Corresponding point information update processing
8. Parameter estimated value update processing
9. Geometric correction
10. Modifications
<Projector / camera system>
Fig. 1 is a diagram showing a configuration example of a projector/camera system as an image processing system according to an embodiment of the present technology.
The projector/camera system of Fig. 1 is configured by connecting a plurality of projectors, which are projection devices, and a plurality of cameras, which are photographing devices, to an image processing device 1 such as a PC. In the example of Fig. 1, two projectors 2-1 and 2-2 and two cameras 3-1 and 3-2 are provided. A screen 11 serving as the projection surface is provided in front of the projectors 2-1 and 2-2.
The projector 2-1 and the projector 2-2 are installed so that their projection ranges partially overlap. An overlap region, in which the image projected from the projector 2-1 and the image projected from the projector 2-2 overlap, is formed on the screen 11.
The camera 3-1 is installed so that its shooting range includes the projection range of the projector 2-1. The image captured by the camera 3-1 therefore contains at least the projected image of the projector 2-1 on the screen 11.
Similarly, the camera 3-2 is installed so that its shooting range includes the projection range of the projector 2-2. The image captured by the camera 3-2 therefore contains at least the projected image of the projector 2-2 on the screen 11.
For example, the camera 3-1 is provided near the projector 2-1, and the camera 3-2 near the projector 2-2. The projector 2-1 and the camera 3-1, and likewise the projector 2-2 and the camera 3-2, may each be configured as devices in a single housing. The cameras 3-1 and 3-2 may also be provided at positions away from the projectors 2-1 and 2-2, respectively.
Although two projectors and two cameras are provided in the example of Fig. 1, the numbers of projectors and cameras are both arbitrary.
The projectors 2-1 and 2-2 and the cameras 3-1 and 3-2 have optical systems whose distortion can be modeled, such as a general perspective projection optical system or an optical system with a fisheye lens. The shape of the screen 11 may be any shape that can be modeled, such as a flat surface, a curved surface, or a hemisphere (dome shape).
The image processing device 1 is provided with a projection control unit 21 that controls projection by the projectors 2-1 and 2-2, and a shooting control unit 22 that controls shooting by the cameras 3-1 and 3-2. As will be described later, the image processing device 1 is provided with various components other than the projection control unit 21 and the shooting control unit 22.
Fig. 2 is a diagram showing another configuration example of the projector/camera system.
As shown in Fig. 2, sensors 4-1 and 4-2, each composed of an acceleration sensor and a gyro sensor, can also be provided.
The sensor data processing unit 23 of the image processing device 1 processes the sensor data supplied from the sensor 4-1 and the sensor data supplied from the sensor 4-2. A correction deviation, described later, is detected based on the processing results of the sensor data processing unit 23.
Hereinafter, when the projectors 2-1 and 2-2 do not need to be distinguished from each other, they are collectively referred to as the projector 2. Likewise, the cameras 3-1 and 3-2 are collectively referred to as the camera 3, and the sensors 4-1 and 4-2 as the sensor 4.
The image processing device 1 has a three-dimensional measurement function that automatically measures (estimates) the posture of each device and calculates parameters, and a geometric correction function that corrects the images projected from the projector 2 based on the calculated parameters. By applying geometric correction using the parameters calculated by posture estimation, the projected image on the screen 11 becomes a high-quality image with little distortion.
Here, the parameters calculated by the image processing device 1 through posture estimation include internal parameters and external parameters.
The internal parameters include information such as the focal lengths and principal points of the projector 2 and the camera 3. Other information, such as lens distortion coefficients, may also be estimated as internal parameters. The external parameters include information on the postures (rotation and translation) of the projector 2 and the camera 3.
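As a rough illustration only (the publication itself defines no code), the internal and external parameters described above can be held as a pinhole-model intrinsic matrix together with a rotation/translation pair. The sketch below assumes a simple pinhole model without lens distortion; all names are hypothetical.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class DevicePose:
    # Internal parameters: focal lengths and principal point (pinhole model)
    fx: float
    fy: float
    cx: float
    cy: float
    # External parameters: rotation matrix (3x3) and translation vector (3,)
    R: np.ndarray
    t: np.ndarray

    @property
    def K(self) -> np.ndarray:
        """Intrinsic matrix assembled from the focal lengths and principal point."""
        return np.array([[self.fx, 0.0, self.cx],
                         [0.0, self.fy, self.cy],
                         [0.0, 0.0, 1.0]])

    def project(self, X: np.ndarray) -> np.ndarray:
        """Project a 3D point in world coordinates to pixel coordinates."""
        x = self.K @ (self.R @ X + self.t)
        return x[:2] / x[2]
```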
Posture estimation for calculating such parameters is performed with corresponding point information as input. The corresponding point information is information representing the corresponding points on the projector image, which is the image projected by the projector 2, and the camera image, which is the image obtained by shooting with the camera 3.
For the parameters used in geometric correction to be calculated appropriately by posture estimation, the corresponding points must be detected accurately. Based on the result of the corresponding point detection, posture estimation is performed to estimate the postures of the projector 2 and the camera 3, and the parameters are calculated. The shape and position of the screen 11 are also estimated based on the parameters calculated by the posture estimation.
In general, corresponding point detection is performed using a technique called structured lighting (structured light projection): a sensing pattern is projected from the projector, the pattern projected on the screen is photographed by the camera, and the correspondence between the pixel positions of the projector image and the camera image is obtained.
In the projector/camera system of Figs. 1 and 2, initial correction including corresponding point detection and posture estimation is performed in advance at a predetermined timing, such as before projecting the images of the content.
<Detection of corresponding points during initial correction>
Typical sensing patterns include binary codes and Gray codes. Corresponding point detection using these codes encodes each pixel of the projector image by sequentially projecting black-and-white (binary) patterns, then photographs the projected patterns and decodes them, thereby obtaining the correspondence between each pixel of the projector image and each pixel of the camera image.
Fig. 3 is a diagram showing an example of encoding using Gray codes.
As shown in A of Fig. 3, each pixel of the projector image is encoded by projecting black-and-white patterns while switching them in time series, for each of the horizontal and vertical directions. For 8-bit encoding, eight patterns are used for each of the horizontal and vertical directions.
For example, the pixel in the upper left corner is projected with the pattern "black, black, black, ..." during the horizontal projection. With white as "1" and black as "0", the pixel in the upper left corner is encoded as "000...". In typical Gray code encoding, striped patterns are projected in the horizontal and vertical directions so that all pixels of the projector image can be encoded.
Decoding, on the other hand, is performed by sequentially photographing the patterns projected on the screen and, as shown in B of Fig. 3, expressing the black-and-white change of each pixel on the camera image as "1" (white) and "0" (black). For example, the pixel at the position surrounded by a rectangle in B of Fig. 3 is represented by the code "00011000".
In this way, by detecting the code unique to each pixel of the camera image and associating it with the pixel of the projector image represented by the same code, the corresponding points on the projector image and the camera image are detected.
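The following is a minimal sketch of the binary-reflected Gray code scheme described above for the horizontal direction, assuming 8-bit codes and ideal black/white thresholding of the captured frames; the function names are illustrative and not part of this publication.

```python
import numpy as np

BITS = 8

def gray(n: int) -> int:
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Inverse of gray(): recover the plain binary value."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def horizontal_patterns(width: int, height: int) -> np.ndarray:
    """One bit-plane image per bit: column x is white where that bit of gray(x) is 1."""
    codes = np.array([gray(x) for x in range(width)])
    planes = np.zeros((BITS, height, width), dtype=np.uint8)
    for k in range(BITS):
        bit = (codes >> (BITS - 1 - k)) & 1              # most significant bit first
        planes[k] = np.tile(bit * 255, (height, 1)).astype(np.uint8)
    return planes

def decode_horizontal(captured: np.ndarray, threshold: int = 128) -> np.ndarray:
    """captured: (BITS, H, W) camera frames of the projected patterns.
    Returns the decoded horizontal projector coordinate for every camera pixel."""
    bits = (captured > threshold).astype(np.int64)
    codes = np.zeros(captured.shape[1:], dtype=np.int64)
    for k in range(BITS):
        codes = (codes << 1) | bits[k]                   # reassemble MSB first
    return np.vectorize(gray_to_binary)(codes)
```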
Fig. 4 is a diagram showing another example of encoding.
The encoding method shown in Fig. 4 uses a dot pattern in which patterns representing Gray codes are arranged at the positions of a specific group of pixels.
First, by projecting a checker pattern as shown in A of Fig. 4, the positions of the corner points of the checkers are detected using the camera image. The checker pattern is formed by arranging approximately square black regions and approximately square white regions in a checkered layout.
After that, as shown in B of Fig. 4, dot patterns in which dots representing Gray codes are placed at the positions of the checker corner points are projected, and encoding is performed. The projected dot patterns are photographed by the camera and, as shown in Fig. 5, decoding is performed by expressing the white/black change of each pixel on the camera image as "1" and "0".
By using such dot patterns, the corresponding points on the projector image and the camera image can be acquired with high accuracy at the checker intervals.
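As a sketch of the per-corner decoding this implies, assuming the checker corner positions in the camera image were already found in the first step and that one camera frame is captured per code bit (most significant bit first); the names are illustrative.

```python
import numpy as np

def decode_corner_codes(frames, corners, threshold=128):
    """frames: list of grayscale camera images, one per Gray-code bit (MSB first).
    corners: (N, 2) array of detected (x, y) checker corner positions.
    Returns an (N,) array of code words, one per corner."""
    xs = corners[:, 0].round().astype(int)
    ys = corners[:, 1].round().astype(int)
    codes = np.zeros(len(corners), dtype=np.int64)
    for frame in frames:
        bit = (frame[ys, xs] > threshold).astype(np.int64)  # sample the dot at each corner
        codes = (codes << 1) | bit
    return codes
```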
Hereinafter, the corresponding points on the projector image are referred to as projector corresponding points, and the corresponding points on the camera image as camera corresponding points.
<Initial correction processing>
A series of initial correction processes including the corresponding point detection described above will be explained with reference to the flowchart of Fig. 6.
In step S1, the image processing device 1 generates coding patterns. The coding patterns generated here target the pixels of the entire image, as described with reference to Fig. 3, for example.
In step S2, the image processing device 1 controls the projector 2 to project the coding patterns, and controls the camera 3 to photograph the coding patterns projected on the screen 11. The projection and shooting of the coding patterns are repeated (number of coding bits) × (number of projectors) × (number of cameras) times. For example, with 16 coded patterns per projector (8 bits each for the horizontal and vertical directions), two projectors, and two cameras, 16 × 2 × 2 = 64 captured images are obtained.
In step S3, the image processing device 1 performs the decoding process described above based on each camera image, and detects the corresponding points on the projector image and the camera image.
Fig. 7 is a diagram showing an example of the result of corresponding point detection.
Corresponding point detection is performed by associating, as corresponding points, pixels on the projector image and pixels on the camera image that have the same code.
In the example of Fig. 7, the pixel P1 at the coordinates (X1, Y1) on the projector image and the pixel p1 at the coordinates (x1, y1) on the camera image are detected as corresponding points. The pixel P1 is a projector corresponding point, and the pixel p1 is the camera corresponding point that corresponds to the pixel P1.
Likewise, the pixel P2 at the coordinates (X2, Y2) on the projector image and the pixel p2 at the coordinates (x2, y2) on the camera image are detected as corresponding points. The pixels P3 to PN on the projector image and the pixels p3 to pn on the camera image are also detected as corresponding points.
Corresponding point detection as described above generates corresponding point information, which is information representing the corresponding points on the projector image and the camera image, that is, the correspondence shown in Fig. 7.
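Continuing the sketch, matching decoded codes across the two images yields exactly this kind of corresponding point information; the dictionary layout below is illustrative only.

```python
def build_correspondences(projector_codes, camera_codes):
    """projector_codes: dict mapping code -> (X, Y) on the projector image,
    known from how the patterns were generated.
    camera_codes: dict mapping (x, y) on the camera image -> decoded code.
    Returns a list of ((X, Y), (x, y)) corresponding point pairs."""
    pairs = []
    for cam_px, code in camera_codes.items():
        proj_px = projector_codes.get(code)
        if proj_px is not None:  # the same code was seen on both images
            pairs.append((proj_px, cam_px))
    return pairs
```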
In step S4 of Fig. 6, the image processing device 1 performs the posture estimation process. The posture estimation process is performed based on the corresponding point information obtained by the corresponding point detection, and calculates parameter estimated values, which are parameters representing the postures of the projector 2 and the camera 3.
The posture estimation process will be explained with reference to the flowchart of Fig. 8.
In step S11, the image processing device 1 estimates the camera parameters based on the corresponding point information.
In step S12, the image processing device 1 estimates the projector parameters based on the corresponding point information. The camera parameter estimated values are also used for estimating the projector parameters.
In step S13, the image processing device 1 optimizes the projector parameter estimated values and the camera parameter estimated values, and calculates the projector/camera parameter estimated values.
The optimization of the parameter estimated values is performed, for example, by adjusting the projector parameter estimated values and the camera parameter estimated values so that, when ray tracing is performed for a projector corresponding point and its camera corresponding point, both rays arrive at a single point on the screen 11.
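A minimal sketch of this kind of ray-consistency objective, assuming pinhole models, a camera fixed at the world origin, and SciPy's least-squares solver; this is only one way to realize the adjustment described above, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pixel_ray(K, rvec, tvec, uv):
    """Back-project a pixel into a world-space ray (origin, unit direction)."""
    R = Rotation.from_rotvec(rvec).as_matrix()
    d = R.T @ (np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0]))
    o = -R.T @ np.asarray(tvec, dtype=float)  # device center in world coordinates
    return o, d / np.linalg.norm(d)

def ray_gap(o1, d1, o2, d2):
    """Shortest distance between two rays; zero when they meet at one point."""
    n = np.cross(d1, d2)
    nn = np.linalg.norm(n)
    if nn < 1e-12:  # nearly parallel rays
        return np.linalg.norm(np.cross(o2 - o1, d1))
    return abs(np.dot(o2 - o1, n)) / nn

def residuals(params, K_proj, K_cam, proj_pts, cam_pts):
    """One ray-gap residual per corresponding point pair. The camera is fixed
    at the world origin here; params hold the projector pose (rvec, tvec)."""
    rvec, tvec = params[:3], params[3:]
    return [ray_gap(*pixel_ray(K_proj, rvec, tvec, P),
                    *pixel_ray(K_cam, np.zeros(3), np.zeros(3), p))
            for P, p in zip(proj_pts, cam_pts)]

# x0 = np.zeros(6)  # initial projector pose guess
# result = least_squares(residuals, x0, args=(K_proj, K_cam, proj_pts, cam_pts))
```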
After the parameter estimated values are optimized, the process returns to step S4 of Fig. 6 and the subsequent processing is performed.
In step S5 of Fig. 6, the image processing device 1 applies geometric correction to the images projected from the projector 2, based on the parameter estimated values calculated by the posture estimation process.
As described above, the initial correction is a process that includes at least corresponding point detection and posture estimation.
The corresponding point detection performed at the time of initial correction requires a large amount of processing because a plurality of patterns must be projected. For example, when coding in the time direction such as Gray codes is used, each projector must project as many patterns as the number of coding bits. Specifically, (number of coding bits) × (number of projectors) patterns must be projected, and each must be photographed by every camera.
That is, each additional projector increases the total number of projections by the number of coding bits. At the same time, the number of corresponding points to be sensed also increases, so the posture estimation after the corresponding point detection takes longer.
After a series of initial corrections has been performed, a correction deviation may occur depending on the actual operating environment, for example due to physical contact with a device. A correction deviation means that the projector corresponding points and the camera corresponding points detected by the corresponding point detection at the time of initial correction become misaligned.
In the projector/camera system of Figs. 1 and 2, when the amount of the correction deviation is small, the re-correction for correcting it is performed based on the information obtained by the initial correction, instead of performing the initial correction described above all over again.
This makes it possible to easily perform re-correction even when a correction deviation occurs. The series of re-correction processes will be described later.
<Configuration of image processing device>
Fig. 9 is a block diagram showing a hardware configuration example of the image processing device 1.
A CPU (Central Processing Unit) 51, a ROM (Read Only Memory) 52, and a RAM (Random Access Memory) 53 are connected to one another by a bus 54.
An input/output expansion bus 55 is further connected to the bus 54. A GPU (Graphics Processing Unit) 56, a UI (User Interface) I/F 59, a communication I/F 62, and a recording I/F 63 are connected to the input/output expansion bus 55.
The GPU 56 renders the images to be projected from the projector 2 using a VRAM 57. For example, the GPU 56 generates the patterns to be projected from the projector 2 and outputs them to a display I/F 58.
The display I/F 58 is an interface for outputting projector images and the like. The display I/F 58 is configured as an interface of a predetermined standard such as HDMI (registered trademark) (High-Definition Multimedia Interface). The display I/F 58 outputs the projector images supplied from the GPU 56 to the projector 2 for projection.
The UI I/F 59 is an interface for detecting operations. The UI I/F 59 detects user operations performed with a keyboard 60 or a mouse 61, and outputs information representing the content of the operations to the CPU 51.
The communication I/F 62 is an interface for communicating with external devices. The communication I/F 62 is composed of network interfaces such as a wireless LAN or a wired LAN. The communication I/F 62 communicates with external devices via a network such as the Internet and transmits and receives various data. The content to be reproduced in the projector/camera system may be provided from a server via the network.
The communication I/F 62 also receives the captured images taken by and transmitted from the camera 3, as well as various data transmitted from external devices, such as the sensor data transmitted from the sensor 4.
The recording I/F 63 is an interface for recording media. Recording media such as an HDD 64 and removable media 65 are attached to the recording I/F 63. The recording I/F 63 reads data recorded on the attached recording media and writes data to them. In addition to content, various data such as the programs executed by the CPU 51 are recorded on the HDD 64.
Fig. 10 is a block diagram showing a functional configuration example of the image processing device 1.
As shown in Fig. 10, a control unit 101 is realized in the image processing device 1. In addition to the projection control unit 21 and the shooting control unit 22, the control unit 101 is composed of a coding pattern generation unit 111, a corresponding point detection unit 112, a posture estimation unit 113, a correction information storage unit 114, a geometric correction unit 115, a correction deviation detection unit 116, a re-correction pattern generation unit 117, and an update unit 118.
At the time of initial correction, the projection control unit 21 causes each projector 2 to project the plurality of coding patterns generated by the coding pattern generation unit 111.
At the time of re-correction, the projection control unit 21 causes the projector 2 that caused the correction deviation to project the re-correction pattern generated by the re-correction pattern generation unit 117. As will be described in detail later, the re-correction pattern is a pattern for re-correcting the correction deviation, and only one is generated for each projector 2 that caused the correction deviation.
At the time of initial correction, the shooting control unit 22 controls the camera 3 to photograph the coding patterns projected on the screen 11. By controlling the communication I/F 62, the shooting control unit 22 acquires the captured images in which the coding patterns appear and outputs them to the corresponding point detection unit 112 as camera images.
At the time of re-correction, the shooting control unit 22 controls the camera 3 to photograph the re-correction pattern projected on the screen 11. By controlling the communication I/F 62, the shooting control unit 22 acquires the captured image in which the re-correction pattern appears and outputs it to the update unit 118 as a camera image.
At the time of initial correction, the coding pattern generation unit 111 generates a plurality of coding patterns as sensing patterns, as described with reference to Figs. 3 and 4. The process of step S1 in Fig. 6 is performed by the coding pattern generation unit 111. The coding patterns generated by the coding pattern generation unit 111 are supplied to the projection control unit 21 and the corresponding point detection unit 112 as projector images.
At the time of initial correction, the corresponding point detection unit 112 performs corresponding point detection based on the coding patterns generated by the coding pattern generation unit 111 and the camera images, supplied from the shooting control unit 22, in which the coding patterns appear. The process of step S3 in Fig. 6 is performed by the corresponding point detection unit 112.
That is, each pixel of the camera image is decoded as described with reference to Fig. 3 and the like, and the corresponding points are detected. The corresponding point information obtained by the corresponding point detection unit 112 is supplied to the posture estimation unit 113 and the correction information storage unit 114.
At the time of initial correction, the posture estimation unit 113 performs posture estimation based on the corresponding point information supplied from the corresponding point detection unit 112, and calculates the parameter estimated values. The process of step S4 in Fig. 6 is performed by the posture estimation unit 113. The parameter estimated values calculated by the posture estimation unit 113 are supplied to the correction information storage unit 114 and the geometric correction unit 115.
The correction information storage unit 114 stores the corresponding point information supplied from the corresponding point detection unit 112 and the parameter estimated values supplied from the posture estimation unit 113. The information stored in the correction information storage unit 114 is read out at the time of re-correction.
The correction information storage unit 114 also stores the updated corresponding point information and parameter estimated values produced by the update unit 118 at the time of re-correction. The geometric correction by the geometric correction unit 115 after the re-correction is performed based on the updated information stored in the correction information storage unit 114.
The geometric correction unit 115 performs geometric correction of the images to be projected from each projector 2, based on the parameter estimated values supplied from the posture estimation unit 113. The process of step S5 in Fig. 6 is performed by the geometric correction unit 115. The corrected images geometrically corrected by the geometric correction unit 115 are supplied to the projection control unit 21 via a path (not shown) and projected.
The correction deviation detection unit 116 detects a correction deviation based on information input from the outside. For example, when the sensor 4 detects that an impact has been applied to a certain projector 2 and its posture has shifted, this is detected as the occurrence of a correction deviation.
When the correction deviation detection unit 116 detects that a correction deviation has occurred, it compares the amount of the deviation with a threshold amount. When the amount of the correction deviation is smaller than the threshold (when it is slight), the correction deviation detection unit 116 outputs information indicating that re-correction is to be performed to the re-correction pattern generation unit 117 and the update unit 118.
On the other hand, when the amount of the correction deviation is larger than the threshold, the correction deviation detection unit 116 outputs information indicating that the same correction as the initial correction is to be performed again to the coding pattern generation unit 111.
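The branching described in the last two paragraphs might be sketched as follows; the callables are placeholders for the units described above, not names from this publication.

```python
def on_correction_deviation(amount, threshold, recorrect, initial_correct):
    """Dispatch by the correction deviation detection unit: a slight deviation
    is handled by re-correction from the stored correction information, while
    a large one triggers the full initial correction again."""
    if amount < threshold:
        recorrect()         # one re-correction pattern per affected projector
    else:
        initial_correct()   # full multi-pattern corresponding point detection
```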
When re-correction is selected by the correction deviation detection unit 116, the re-correction pattern generation unit 117 reads out and acquires the corresponding point information obtained by the initial correction from the correction information storage unit 114, and generates a re-correction pattern based on this corresponding point information from before the correction deviation occurred.
In this way, the generation of the re-correction pattern is based on the corresponding point information obtained at the time of initial correction. The re-correction pattern generated by the re-correction pattern generation unit 117 is supplied to the projection control unit 21 and projected from the projector 2 that caused the correction deviation.
When re-correction is selected by the correction deviation detection unit 116, the update unit 118 reads out and acquires the corresponding point information and the parameter estimated values obtained by the initial correction from the correction information storage unit 114.
When a camera image in which the re-correction pattern appears is supplied from the shooting control unit 22, the update unit 118 detects, based on the camera image, the positions of the camera corresponding points after the correction deviation occurred, and updates the corresponding point information according to these positions.
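As an illustration of the update described here, assuming the corresponding point information is held as a mapping from projector pixels to camera pixels (a layout this publication does not prescribe):

```python
def update_correspondences(stored, remeasured):
    """stored: dict mapping projector pixel (X, Y) -> camera pixel (x, y),
    obtained by the initial correction.
    remeasured: dict with the newly detected camera positions of the
    re-correction pattern points, keyed by the same projector pixels.
    Returns the updated corresponding point information."""
    updated = dict(stored)
    for proj_px, new_cam_px in remeasured.items():
        updated[proj_px] = new_cam_px  # projector side is kept; camera side moves
    return updated
```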
The update unit 118 also updates the parameter estimated values, for example by referring to the updated corresponding point information. The corresponding point information and the parameter estimated values updated by the update unit 118 are stored in the correction information storage unit 114 and used for subsequent geometric correction.
<Operation of image processing device>
Here, the re-correction process of the image processing device 1 configured as described above will be explained with reference to the flowchart of Fig. 11.
The process of Fig. 11 is started when the initial correction process described with reference to Fig. 6 has been performed and the projector/camera system is in the corrected state.
In step S31, the correction deviation detection unit 116 detects a correction deviation based on information input from the outside.
In step S32, the correction deviation detection unit 116 determines whether the amount of the correction deviation is slight. Here, the amount is determined to be slight when it is smaller than the threshold amount.
When the amount of the correction deviation is slight, re-correction is performed using the corresponding point information and the parameter estimated values from before the correction deviation occurred, obtained by the initial correction. The corresponding point information read from the correction information storage unit 114 is supplied to the re-correction pattern generation unit 117 and the update unit 118, and the parameter estimated values are supplied to the update unit 118.
In step S33, the corresponding point information update process is performed. This process updates the corresponding point information from before the correction deviation occurred and generates updated corresponding point information. The corresponding point information update process will be described later with reference to the flowchart of Fig. 14.
In step S34, the parameter estimated value update process is performed. This process updates the parameter estimated values from before the correction deviation occurred and generates updated parameter estimated values. The parameter estimated value update process will be described later with reference to the flowchart of Fig. 20.
In step S35, the geometric correction unit 115 performs geometric correction based on the updated parameter estimated values. The geometrically corrected images are supplied to the projection control unit 21 and projected from the projector 2. In this way, images geometrically corrected in consideration of the slight correction deviation are projected.
On the other hand, when it is determined in step S32 that the amount of the correction deviation is not slight because it is larger than the threshold amount, the process proceeds to step S36 and the initial correction process is performed. Here, the same process as described with reference to Fig. 6 is performed: the corresponding point information is obtained by corresponding point detection using a plurality of coding patterns, and posture estimation is performed based on the corresponding point information.
After the initial correction process is performed in step S36, the process proceeds to step S35, and the geometric correction unit 115 performs geometric correction using the parameter estimated values obtained by the initial correction process.
The details of each process constituting the re-correction process described above will be explained in the following order.
- Detection of correction deviation (change of corresponding points) (step S31)
- Corresponding point information update processing (step S33)
- Parameter estimated value update processing (step S34)
- Geometric correction (step S35)
<Detection of correction deviation (change of corresponding points)>
A correction deviation occurs due to physical contact with a device (the projector 2, the camera 3, or the screen 11), changes in a device over time, adjustment of the zoom/shift function of the projector 2, and the like.
Fig. 12 is a diagram showing the relationship between the devices before a correction deviation occurs.
In the example of Fig. 12, the shooting ranges of the cameras 3-1 and 3-2 each include the projection range of the projector 2-1 and the projection range of the projector 2-2. The image projected from the projector 2-1 is photographed by both the cameras 3-1 and 3-2, and the image projected from the projector 2-2 is likewise photographed by both cameras.
Since the system is in the corrected state with no correction deviation, there is no deviation at the corresponding points between the projector images and the camera images.
In Fig. 12, for the pixel at the position p1 on the projector image projected by the projector 2-1, the pixel at the position p11-1 on the camera image of the camera 3-1 and the pixel at the position p12-1 on the camera image of the camera 3-2 are the corresponding points. As indicated by the broken lines passing through the corresponding points, the rays of the pixels serving as corresponding points intersect at the position P1 on the screen 11, and the corresponding points are not misaligned.
Similarly, for the pixel at the position p2 on the projector image projected by the projector 2-2, the pixel at the position p11-2 on the camera image of the camera 3-1 and the pixel at the position p12-2 on the camera image of the camera 3-2 are the corresponding points. As indicated by the broken lines passing through the corresponding points, the rays of the pixels serving as corresponding points intersect at the position P2 on the screen 11, and the corresponding points are not misaligned.
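For illustration, the screen point where two corresponding rays meet (P1 or P2 in the figure) can be computed as the midpoint of the shortest segment between the rays; in the corrected state this segment has zero length. A sketch, with illustrative names:

```python
import numpy as np

def ray_meeting_point(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays given by origin o and
    unit direction d; equals the common screen point when the rays meet."""
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:  # parallel rays: no unique meeting point
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((o1 + s * d1) + (o2 + t * d2)) / 2.0
```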
Fig. 13 is a diagram showing the relationship between the devices after a correction deviation occurs.
For example, when the position of the projector 2-2 shifts as indicated by the arrow A1, the projector corresponding points and the camera corresponding points become misaligned, as indicated by the arrow A2, both between the projector 2-2 and the camera 3-1 and between the projector 2-2 and the camera 3-2.
A correction deviation is thus defined as a state in which the positions of the corresponding points on the camera images have changed with respect to the corresponding points on the projector image, due to a change in the projector 2, the camera 3, or the screen 11.
When a slight correction deviation occurs, re-correction is performed based on the pre-deviation information obtained by the initial correction. A slight correction deviation means a deviation small enough that the projection position from the projector 2 shifts only slightly on the camera image.
Such a slight correction deviation is detected by the correction deviation detection unit 116 in step S31 of FIG. 11.
A correction deviation caused by physical contact with a device is detected, for example, based on the measurement results of the sensors 4 provided as shown in FIG. 2. Based on the measurement results of the sensors 4, which include an acceleration sensor and a gyro sensor and are supplied from the sensor data processing unit 23, the correction deviation detection unit 116 identifies the device that caused the correction deviation and, using this as a trigger, starts the re-correction.
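As a rough illustration of this trigger, the following sketch flags a device whose accelerometer reports a burst of motion. It is only a minimal sketch: the function name, data layout, and threshold are assumptions for illustration, not taken from the present disclosure.

```python
# Minimal sketch of sensor-triggered deviation detection; the data format
# and threshold are assumed, not specified in the present disclosure.
import numpy as np

ACCEL_THRESHOLD = 0.5  # m/s^2 above the gravity-compensated baseline (assumed)

def detect_moved_devices(sensor_samples):
    """Return the IDs of devices whose recent acceleration suggests movement.

    sensor_samples: dict mapping device_id -> (N, 3) array of recent,
    gravity-compensated acceleration vectors.
    """
    moved = []
    for device_id, accel in sensor_samples.items():
        # A burst of non-zero acceleration indicates physical contact.
        peak = np.max(np.linalg.norm(accel, axis=1))
        if peak > ACCEL_THRESHOLD:
            moved.append(device_id)
    return moved  # used as the trigger to start re-correction
```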
A correction deviation caused by aging of the projector 2 or by a change in its zoom/shift setting is detected using, for example, a pattern for detecting the deviation of corresponding points. In this case, an image in which the pattern is embedded is projected from the projector 2, and the projected image is captured by the camera 3. The projection by the projector 2 and the capture by the camera 3 are performed in synchronization.
The correction deviation detection unit 116 detects the correction deviation based on, for example, the result of the corresponding point detection performed by the corresponding point detection unit 112 using the pattern embedded in the image projected by the projector 2 and the pattern appearing in the camera image. The correction deviation detection unit 116 identifies the device that caused the correction deviation and, using this as a trigger, starts the re-correction.
The occurrence of a correction deviation may instead be input manually by the user who administers the projector/camera system. In this case, the user calculates the change in the overlap region where the projected images overlap when a reference image such as a crosshatch or a coding pattern for sensing is projected, or the change in the corresponding points.
Based on the user's calculation result, the correction deviation detection unit 116 detects the correction deviation, identifies the device that caused it, and uses this as a trigger to start the re-correction. The device that caused the correction deviation may also be input by the user.
<Corresponding point information update processing>
The corresponding point information update processing performed in step S33 of FIG. 11 will be described with reference to the flowchart of FIG. 14.
In step S51, the re-correction pattern generation unit 117 reads and acquires, from the correction information storage unit 114, the pre-deviation corresponding point information obtained by the initial correction. The update unit 118 likewise reads and acquires the pre-deviation corresponding point information and the parameter estimates from the correction information storage unit 114.
In step S52, the re-correction pattern generation unit 117 identifies the projector 2 and the camera 3 that caused the correction deviation, based on information supplied from the correction deviation detection unit 116 and the like. The projector 2 that caused the correction deviation is set as a target projector, and the camera 3 that caused the correction deviation is set as a target camera.
For example, when a correction deviation occurs because the position of the projector 2-2 has shifted as described with reference to FIG. 13, the projector 2-2 is set as the target projector. In addition, the cameras 3-1 and 3-2, which capture the image projected by the projector 2-2 as the target projector, are each set as target cameras.
In step S53, the re-correction pattern generation unit 117 generates a re-correction pattern for each target projector. Here, for example, one re-correction pattern is generated per target projector, although a plurality of re-correction patterns may be generated per target projector.
The re-correction pattern is generated by identifying the positions of the projector corresponding points based on the pre-deviation corresponding point information and arranging dots of a predetermined pattern with the identified positions as references.
FIG. 15 is a diagram showing an example of the re-correction pattern.
As shown in FIG. 15, each dot has a circular shape with a predetermined diameter and is arranged so that, for example, its center of gravity coincides with the position of the projector corresponding point identified from the corresponding point information. When the corresponding point information has been obtained as described with reference to FIG. 7, the center-of-gravity positions of the dots constituting the re-correction pattern are the positions of the pixels P1 to PN on the projector image.
In this way, a sparse pattern whose corresponding points, which serve as feature points when projected from the projector 2, can be detected from the camera image is generated as the re-correction pattern.
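The generation step lends itself to a short sketch: draw one filled dot per stored projector corresponding point. This is a minimal sketch under the assumption that the points are available as pixel coordinates; the function name and dot radius are illustrative, not from the present disclosure.

```python
# Minimal sketch of re-correction pattern generation (step S53); names and
# the dot radius are assumptions for illustration.
import numpy as np
import cv2

def generate_recorrection_pattern(proj_points, proj_size, dot_radius=6):
    """Draw one white dot per projector corresponding point.

    proj_points: (N, 2) array of (x, y) positions on the projector image
                 (the pixels P1 to PN).
    proj_size:   (width, height) of the projector image.
    """
    w, h = proj_size
    pattern = np.zeros((h, w), dtype=np.uint8)  # black background
    for x, y in proj_points:
        # Center each dot so that its center of gravity lies exactly on
        # the corresponding point.
        cv2.circle(pattern, (int(round(x)), int(round(y))), dot_radius, 255, -1)
    return pattern
```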
Instead of arranging dots over the entire image, only the dots included in the range projected onto the screen 11 may be arranged, as shown on the right side of FIG. 16. The range enclosed by the ellipse #1 in FIG. 16 is the range actually projected onto the screen 11.
The range of the entire projector image that is projected onto the screen 11 is identified, for example, based on the shape of the screen 11 estimated at the time of the initial correction. The shape of the screen 11 is estimated, based on the parameter estimates calculated by the posture estimation, during the geometric correction of the initial correction or the like.
This makes it possible to use only the dots actually projected onto the screen 11 for updating the corresponding point information.
In the initial correction, since the shape and position of the screen 11 are unknown, the range of the entire projector image actually projected onto the screen 11 is also unknown. Therefore, the coding pattern for sensing projected at the time of the initial correction must cover the entire projector image.
This means that sensing points not projected onto the screen 11 may become targets of corresponding point detection and the like, which degrades the correction accuracy. Furthermore, depending on the shape of the screen 11, the projected pattern may be greatly distorted, preventing high-precision sensing and lowering the correction accuracy. Sensing points that would adversely affect the final geometric correction must be removed as outliers when decoding the coding pattern or estimating the posture.
By using a re-correction pattern consisting only of dots actually projected onto the screen 11, the corresponding point information can be updated using only valid dots, preventing a drop in correction accuracy. Processing such as outlier removal also becomes unnecessary, enabling efficient processing.
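The restriction to the projected range can be expressed as a simple filter over the stored points. The sketch below assumes, purely for illustration, that the screen shape estimated at the initial correction has been rendered as a boolean mask in projector-image coordinates; the present disclosure does not prescribe this representation.

```python
# Minimal sketch of restricting dots to the projected range (FIG. 16); the
# boolean-mask representation of the estimated screen shape is an assumption.
import numpy as np

def filter_points_to_screen(proj_points, screen_mask):
    """Keep only the corresponding points that actually land on the screen.

    proj_points: (N, 2) array of (x, y) positions on the projector image.
    screen_mask: (H, W) boolean array, True where the projector pixel is
                 actually projected onto the screen 11.
    """
    xs = np.round(proj_points[:, 0]).astype(int)
    ys = np.round(proj_points[:, 1]).astype(int)
    h, w = screen_mask.shape
    in_bounds = (xs >= 0) & (xs < w) & (ys >= 0) & (ys < h)
    valid = np.zeros(len(proj_points), dtype=bool)
    valid[in_bounds] = screen_mask[ys[in_bounds], xs[in_bounds]]
    return proj_points[valid]  # only these dots are drawn into the pattern
```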
Returning to the description of FIG. 14, in step S54 the projection control unit 21 controls a target projector to project the re-correction pattern.
In step S55, the shooting control unit 22 controls a target camera to capture the re-correction pattern projected on the screen 11. The camera image obtained by capturing the projected re-correction pattern is supplied to the update unit 118.
In step S56, the shooting control unit 22 determines whether the re-correction pattern has been captured by all the target cameras, and repeats the processing of step S55 while switching the target camera until it determines that all the target cameras have captured the pattern.
When it is determined in step S56 that the re-correction pattern has been captured by all the target cameras, the processing proceeds to step S57.
In step S57, the projection control unit 21 determines whether the re-correction pattern has been projected from all the target projectors, and repeats the processing from step S54 onward while switching the target projector until it determines that all the target projectors have projected the pattern.
When it is determined in step S57 that the re-correction pattern has been projected from all the target projectors, the processing proceeds to step S58.
In step S58, the update unit 118 detects the post-deviation position of each camera corresponding point based on the camera images supplied from the shooting control unit 22.
On a camera image obtained by capturing the re-correction pattern, the post-deviation position of a camera corresponding point is represented by the center-of-gravity position of a dot appearing in the camera image. The update unit 118 detects the post-deviation center-of-gravity positions of the dots arranged at the camera corresponding points.
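One straightforward way to recover the centers of gravity is connected-component analysis of the thresholded camera image, as in the sketch below. The Otsu threshold and the minimum area are assumptions for illustration; the present disclosure does not specify how the centroids are extracted.

```python
# Minimal sketch of dot centroid detection (step S58); thresholding details
# are assumptions, not specified in the text.
import cv2

def detect_dot_centroids(camera_image_gray, min_area=4):
    """Return the (x, y) centers of gravity of the dots in a camera image.

    camera_image_gray: single-channel uint8 image of the projected pattern.
    """
    _, binary = cv2.threshold(camera_image_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Label 0 is the background; keep components large enough to be dots.
    keep = stats[1:, cv2.CC_STAT_AREA] >= min_area
    return centroids[1:][keep]  # (M, 2) array of centroid positions
```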
In step S59, the update unit 118 updates the position information of the camera corresponding points included in the pre-deviation corresponding point information, based on the detected post-deviation positions of the camera corresponding points.
FIG. 17 is a diagram showing an example of a camera image in which the re-correction pattern appears.
If the same pattern as the re-correction pattern were projected from the projector 2 before the correction deviation occurred and captured by the camera 3, a camera image showing the same pattern as the re-correction pattern, as on the left side of FIG. 17, would be obtained. Note that, since the re-correction pattern is generated after the correction deviation occurs, a camera image such as the one on the left side of FIG. 17 is not actually captured.
When the re-correction pattern is projected after the correction deviation, from the projector 2 that caused it, and captured by the camera 3, a camera image showing the re-correction pattern as on the right side of FIG. 17 is obtained.
For example, focusing on the single dot D at the right end, the post-deviation position of the dot D appears shifted from its pre-deviation position in the direction indicated by the white arrow.
FIG. 18 is a diagram schematically showing the shift of the dot D.
In FIG. 18, the center of the circle indicated by the broken line represents the pre-deviation position of the dot D, that is, the position of the camera corresponding point before the correction deviation occurred. The circle indicated by the solid line represents the post-deviation position of the dot D.
The update unit 118 identifies the dot located at each camera corresponding point after the correction deviation by, for example, performing a nearest neighbor search over the camera image showing the re-correction pattern, using the pre-deviation position of the camera corresponding point as the reference.
The update unit 118 updates the position information of the camera corresponding points included in the corresponding point information by setting the post-deviation position (center-of-gravity position) of each dot as the new position of the corresponding camera corresponding point.
In this way, the position information of the camera corresponding points included in the corresponding point information is updated by performing a nearest neighbor search referenced to the pre-deviation positions of the camera corresponding points, thereby tracking the post-deviation positions of those points. The slight correction deviation that triggers the re-correction is, for example, a deviation small enough that the camera corresponding points can still be tracked from their pre-deviation positions by the nearest neighbor search; that is, the post-deviation position of a given camera corresponding point does not go beyond the pre-deviation position of an adjacent camera corresponding point.
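Under that small-deviation assumption, the tracking reduces to a nearest neighbor query from each stored point into the set of newly detected centroids. The sketch below uses a k-d tree for the query; the function names are illustrative.

```python
# Minimal sketch of the nearest neighbor update (steps S58/S59), assuming the
# deviation is small enough that each old point's nearest detected centroid
# is its own displaced dot.
from scipy.spatial import cKDTree

def update_camera_points(old_points, detected_centroids):
    """Track each pre-deviation point to its post-deviation dot centroid.

    old_points:         (N, 2) pre-deviation camera corresponding points.
    detected_centroids: (M, 2) dot centroids found in the new camera image.
    Returns an (N, 2) array of updated camera corresponding points.
    """
    tree = cKDTree(detected_centroids)
    _, nearest = tree.query(old_points, k=1)
    return detected_centroids[nearest]  # the new positions p'1 to p'n
```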
FIG. 19 is a diagram showing an example of updating the corresponding point information.
The projector corresponding points shown on the left side of FIG. 19 are the same as those described with reference to FIG. 7. In the example of FIG. 19, the camera corresponding point corresponding to the pixel P1 as a projector corresponding point has been updated from the pixel p1 at the coordinates (x1, y1) on the camera image (FIG. 7) to the pixel p'1 at the coordinates (x'1, y'1).
Similarly, the camera corresponding point corresponding to the pixel P2 as a projector corresponding point has been updated from the pixel p2 at the coordinates (x2, y2) on the camera image to the pixel p'2 at the coordinates (x'2, y'2). The other camera corresponding points have likewise been updated to the pixels p'3 to p'n.
After the corresponding point information is updated as described above, the processing returns to step S33 of FIG. 11 and the subsequent processing is performed. The updated corresponding point information is supplied to and stored in the correction information storage unit 114.
In the re-correction, the uniqueness resolution by the encoding/decoding processing using time-direction coding patterns such as Gray codes has already been completed at the initial correction, before the deviation occurred, so there is no need to newly project a number of patterns corresponding to the number of coding bits. Uniqueness resolution means identifying the correspondence between the feature points on the projector image and the feature points on the camera image.
This makes it possible to reduce the number of patterns projected during the re-correction.
<Parameter estimate update processing>
The parameter estimate update processing performed in step S34 of FIG. 11 will be described with reference to the flowchart of FIG. 20.
The parameter estimates are updated using the updated corresponding point information and the pre-deviation parameter estimates. Based on the updated corresponding point information, the posture of each device after the correction deviation can be estimated.
In step S71, the update unit 118 evaluates the amount of the correction deviation by determining whether it is less than a threshold amount.
When it is determined in step S71 that the amount of the correction deviation is greater than the threshold amount, in step S72 the update unit 118 estimates the posture of the target camera based on the updated corresponding point information. Here, the posture estimation is performed only for the camera 3 that caused the correction deviation.
In step S73, the update unit 118 estimates the posture of the target projector based on the updated corresponding point information and the target camera parameter estimates calculated by the posture estimation of the target camera. Here, the posture estimation is performed only for the projector 2 that caused the correction deviation.
By performing the posture estimation only for the devices that caused the correction deviation in this way, the processing required for the re-correction can be reduced. For the parameter estimates of devices that did not cause the correction deviation, the pre-deviation values are used as they are.
In step S74, the update unit 118 optimizes the target camera parameter estimates calculated by the posture estimation of the target camera and the target projector parameter estimates calculated by the posture estimation of the target projector. This optimization is the same as the optimization performed in the posture estimation at the time of the initial correction.
On the other hand, when it is determined in step S71 that the amount of the correction deviation is less than the threshold amount, the posture estimation of steps S72 and S73 is skipped, and in step S74 the pre-deviation parameter estimates are optimized.
When the amount of the correction deviation is slight, the parameter estimates calculated by the initial correction can be regarded as retaining a certain accuracy even after the deviation. The update unit 118 then updates the pre-deviation parameter estimates with the parameter estimates obtained by an optimization performed in accordance with the amount of the correction deviation.
In this way, when the parameter estimates are considered to retain a certain accuracy even though a correction deviation has occurred, the posture estimation is not performed, and the update of the pre-deviation parameter estimates can be completed by an optimization that amounts to fine adjustment.
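The branch in FIG. 20 can be illustrated roughly as follows. The sketch uses OpenCV's PnP solver to stand in for the posture estimation of a single target device and its Levenberg-Marquardt refinement to stand in for the optimization; both substitutions, the threshold, and the function names are assumptions, since the present disclosure does not tie the processing to a particular solver.

```python
# Minimal sketch of the threshold branch in the parameter estimate update
# (FIG. 20); solvePnP stands in for the posture estimation and the LM
# refinement for the optimization, as assumptions for illustration.
import numpy as np
import cv2

def update_parameter_estimates(shift_px, object_pts, image_pts, K, dist,
                               prev_rvec, prev_tvec, threshold=2.0):
    """shift_px: per-point deviation magnitudes in camera pixels.
    object_pts/image_pts: 3D-2D correspondences built from the updated
    corresponding point information. threshold is an assumed value."""
    if np.mean(shift_px) >= threshold:
        # Steps S72/S73: re-estimate the posture of the target device only.
        _, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
    else:
        # Slight deviation: skip posture estimation; keep previous values.
        rvec, tvec = prev_rvec, prev_tvec
    # Step S74: fine-adjust the estimates by minimizing reprojection error,
    # as in the initial correction's optimization.
    rvec, tvec = cv2.solvePnPRefineLM(object_pts, image_pts, K, dist,
                                      rvec, tvec)
    return rvec, tvec
```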
After the parameter estimates are optimized in step S74, the processing returns to step S34 of FIG. 11 and the subsequent processing is performed.
<Geometric correction>
After the parameter estimates of the projectors 2 and the cameras 3 are obtained by the posture estimation, the projector corresponding points and the camera corresponding points can be calculated as points in three-dimensional space by the principle of triangulation. The positions of the projector corresponding points and the camera corresponding points in three-dimensional space are specified by the parameter estimates (internal parameters and external parameters) calculated by the parameter estimate update processing described above.
That is, the points on the screen 11 can be restored at the same density as the corresponding points. By fitting the calculated point cloud to a plane or quadric-surface model representing the screen 11, the shape of the entire screen 11 can be estimated.
Since a geometric correction vector indicating which position on the screen 11 each pixel of the projector image corresponds to can be generated on a per-pixel basis, geometric correction that takes the shape of the screen 11 into account can be realized.
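As a rough illustration of this reconstruction, the sketch below triangulates the matched points from a camera and a projector (treated as an inverse camera) and fits a plane to the recovered point cloud. Treating the projector as an inverse camera and using a least-squares plane fit are standard choices assumed here for illustration; a quadric model would replace the plane fit for curved screens.

```python
# Minimal sketch of restoring screen points by triangulation and fitting a
# plane model of the screen 11; the inverse-camera treatment of the
# projector is an assumption for illustration.
import numpy as np
import cv2

def triangulate_screen_points(P_cam, P_proj, cam_pts, proj_pts):
    """P_cam, P_proj: 3x4 projection matrices (K [R | t]).
    cam_pts, proj_pts: (N, 2) matched corresponding points.
    Returns (N, 3) points on the screen 11.
    """
    pts4 = cv2.triangulatePoints(P_cam, P_proj,
                                 cam_pts.T.astype(np.float64),
                                 proj_pts.T.astype(np.float64))
    return (pts4[:3] / pts4[3]).T  # dehomogenize

def fit_plane(points):
    """Least-squares plane through the point cloud: centroid and normal."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]  # the normal is the direction of least variance
```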
By performing the re-correction as described above, the corresponding point information invalidated by a correction deviation caused by physical contact, aging, or the like can be updated efficiently, using the re-correction pattern, that is, a sensing pattern generated from the pre-deviation information.
Furthermore, by using a re-correction pattern generated from the pre-deviation information, the decoding processing of the coding patterns can be omitted, and the total numbers of projected and captured images required for the re-correction can be reduced. The number of patterns that must be projected is only the number of projectors 2 that caused the correction deviation. Even when the number of projectors 2 increases, the increase in the numbers of projected and captured images required for the re-correction can be suppressed.
In this way, since only the positions of the corresponding points are updated while reusing the uniqueness established before the correction deviation occurred, the encoding/decoding processing can be omitted and the corresponding point information can be updated by simple processing.
Furthermore, by using the updated corresponding point information, the pre-deviation parameter estimates can be updated easily and quickly.
<Modifications>
・Modifications of the pattern
The sensing in the present technology is premised on two points: spatial encoding is performed by projecting patterns such as Gray codes in time series, and the corresponding points to be acquired are not dense but can be detected as sparse feature points in the projection region.
In the example described above, a dot pattern is projected as the re-correction pattern, but the re-correction pattern may be any sparse pattern from which corresponding points can be acquired with accuracy sufficient for the posture estimation.
For example, when the distortion of the screen is small, a checker pattern whose corners are detected at the pre-deviation projector corresponding points can be used as the re-correction pattern. An equilateral-triangle tiling pattern or ChArUco markers may also be used as the re-correction pattern.
The re-correction pattern may instead be generated for each combination of a projector that caused the correction deviation and a camera that captures the image projected by that projector.
In a system composed of a plurality of projectors and cameras, pixels of a plurality of camera images may be associated with a single pixel of a projector image.
In the example described above, one pattern is generated for each projector that caused the correction deviation. In this case, the camera image of one camera includes corresponding points that are associated with pixels of the camera image of another camera. When tracking the post-deviation positions of corresponding points, searching the neighborhood of each pre-deviation corresponding point separately for each camera 3 eliminates the possibility of a false match with such a point, but it makes the corresponding point update processing somewhat redundant.
By using one re-correction pattern for each combination of a projector that caused the correction deviation and a camera that captures the image projected by that projector, corresponding points associated with pixels of another camera's image need not be generated (they need not be included as dots in the re-correction pattern), so the corresponding point update processing can be performed efficiently. In this case, the number of patterns generated becomes (number of projectors) × (number of cameras), but the total number of captured images does not change.
・Modifications of the device configuration
FIG. 21 is a diagram showing another configuration example of the projector/camera system.
The re-correction processing described above is also applicable to a system provided with three or more projectors 2, as shown in FIG. 21.
In the projector/camera system shown in FIG. 21, four projectors 2, namely the projectors 2-1 to 2-4, are provided, and four cameras 3, namely the cameras 3-1 to 3-4, are provided.
One projector 2 and one camera 3 are provided as a device in a single housing: the projector 2-1 and the camera 3-1 are provided in the projection/photographing device 201-1, and the projector 2-2 and the camera 3-2 are provided in the projection/photographing device 201-2.
Similarly, the projector 2-3 and the camera 3-3 are provided in the projection/photographing device 201-3, and the projector 2-4 and the camera 3-4 are provided in the projection/photographing device 201-4.
In this way, three or more projectors 2 may be provided, and a projector 2 and a camera 3 may be provided in the same housing.
The frames #1 to #4 shown on the screen 11 represent the projection regions of the projectors 2-1 to 2-4, respectively. The re-correction processing is also applicable when one composite image is projected using these four projectors 2. An overlap region may be formed by any combination of the projection regions of the projectors 2.
FIG. 22 is a diagram showing still another configuration example of the projector/camera system.
As shown in FIG. 22, the projector 2 may be configured as a fisheye projector, that is, a projector equipped with a fisheye lens, and the camera 3 may be configured as a fisheye camera, that is, a camera equipped with a fisheye lens.
As also shown in FIG. 22, the screen 11 may be a dome-shaped screen whose projection surface is a hemisphere.
・Regarding the program
The series of processing described above can be executed by hardware or by software. When the series of processing is executed by software, the programs constituting the software are installed from a program recording medium onto a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.
The programs to be installed are provided by being recorded on the removable medium 65 shown in FIG. 9, which is an optical disc (a CD-ROM (Compact Disc-Read Only Memory), a DVD (Digital Versatile Disc), or the like), a semiconductor memory, or the like. They may also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting. The programs can be installed in the ROM 52 or the HDD 64 in advance.
The programs executed by the computer may be programs whose processing is performed in time series in the order described in this specification, or programs whose processing is performed in parallel or at necessary timings, such as when a call is made.
In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
The embodiments of the present technology are not limited to those described above, and various modifications are possible without departing from the gist of the present technology.
For example, the present technology can adopt a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
Each step described in the above flowcharts can be executed by one device or can be shared and executed by a plurality of devices.
Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or can be shared and executed by a plurality of devices.
<Examples of configuration combinations>
The present technology can also take the following configurations.
(1)
An image processing device including:
a detection unit that detects a deviation of corresponding points occurring after an initial correction has been performed, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by capturing the projected image with a photographing device, and posture estimation that estimates postures of the projection device and the photographing device based on a result of the corresponding point detection;
a generation unit that generates a re-correction pattern composed of a predetermined pattern referenced to positions of the corresponding points on the projected image, based on corresponding point information representing a relationship between the corresponding points obtained by the initial correction; and
an update unit that detects the deviation of the corresponding points on the captured image based on the captured image obtained by capturing, with the photographing device, the re-correction pattern projected from the projection device, and updates the corresponding point information.
(2)
The image processing device according to (1), in which the generation unit generates the re-correction pattern in which dots are arranged at the positions of the corresponding points on the projected image.
(3)
The image processing device according to (1), in which the generation unit generates the re-correction pattern in which dots are arranged at positions of the corresponding points on the projected image that are within a range projected onto a projection surface.
(4)
The image processing device according to any one of (1) to (3), in which the generation unit generates one re-correction pattern for each projection device that caused the deviation of the corresponding points.
(5)
The image processing device according to any one of (1) to (4), in which the update unit updates the corresponding point information based on the deviation of the corresponding points on the captured image.
(6)
The image processing device according to any one of (1) to (5), in which the update unit updates parameter estimates obtained by the posture estimation at the time of the initial correction, based on the updated corresponding point information.
(7)
The image processing device according to (6), in which, when an amount of deviation of the corresponding points is greater than a threshold amount, the update unit updates the parameter estimates by estimating the postures of the projection device and the photographing device that caused the deviation of the corresponding points.
(8)
The image processing device according to (6), in which, when an amount of deviation of the corresponding points is less than a threshold amount, the update unit updates the parameter estimates by adjusting the parameter estimates from before the deviation of the corresponding points occurred, in accordance with the deviation of the corresponding points.
(9)
The image processing device according to any one of (1) to (8), further including:
a projection control unit that controls projection by the projection device; and
a shooting control unit that controls capturing by the photographing device.
(10)
The image processing device according to (9), in which the projection control unit controls projection by a plurality of the projection devices, and the shooting control unit controls capturing by a plurality of the photographing devices.
(11)
The image processing device according to (9) or (10), in which the projection control unit causes the re-correction pattern to be projected from the projection device that caused the deviation of the corresponding points, and the shooting control unit causes the projected re-correction pattern to be captured by the photographing device that captures the projected image projected by that projection device.
(12)
The image processing device according to any one of (1) to (11), in which a projection surface of the projected image is a flat screen, a curved screen, or a spherical screen.
(13)
The image processing device according to any one of (1) to (12), in which the corresponding point detection at the time of the initial correction is performed using a plurality of the projected images representing predetermined patterns and a plurality of the captured images obtained by capturing each of the projected predetermined patterns.
(14)
The image processing device according to any one of (1) to (13), further including a control unit that causes the corresponding point information to be updated using the re-correction pattern when an amount of deviation of the corresponding points is less than a threshold amount, and causes the same processing as the initial correction to be performed when the amount of deviation of the corresponding points is greater than the threshold amount.
(15)
An image processing method in which an image processing device:
detects a deviation of corresponding points occurring after an initial correction has been performed, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by capturing the projected image with a photographing device, and posture estimation that estimates postures of the projection device and the photographing device based on a result of the corresponding point detection;
generates a re-correction pattern composed of a predetermined pattern referenced to positions of the corresponding points on the projected image, based on corresponding point information representing a relationship between the corresponding points obtained by the initial correction; and
detects the deviation of the corresponding points on the captured image based on the captured image obtained by capturing, with the photographing device, the re-correction pattern projected from the projection device, and updates the corresponding point information.
(16)
A program for causing a computer to execute processing of:
detecting a deviation of corresponding points occurring after an initial correction has been performed, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by capturing the projected image with a photographing device, and posture estimation that estimates postures of the projection device and the photographing device based on a result of the corresponding point detection;
generating a re-correction pattern composed of a predetermined pattern referenced to positions of the corresponding points on the projected image, based on corresponding point information representing a relationship between the corresponding points obtained by the initial correction; and
detecting the deviation of the corresponding points on the captured image based on the captured image obtained by capturing, with the photographing device, the re-correction pattern projected from the projection device, and updating the corresponding point information.
(17)
An image processing system configured to include:
an image processing device including:
a detection unit that detects a deviation of corresponding points occurring after an initial correction has been performed, the initial correction including corresponding point detection that detects corresponding points on a projected image projected by a projection device and on a captured image obtained by capturing the projected image with a photographing device, and posture estimation that estimates postures of the projection device and the photographing device based on a result of the corresponding point detection;
a generation unit that generates a re-correction pattern composed of a predetermined pattern referenced to positions of the corresponding points on the projected image, based on corresponding point information representing a relationship between the corresponding points obtained by the initial correction; and
an update unit that detects the deviation of the corresponding points on the captured image based on the captured image obtained by capturing, with the photographing device, the re-correction pattern projected from the projection device, and updates the corresponding point information;
a plurality of the projection devices; and
a plurality of the photographing devices.
1 image processing device, 2-1, 2-2 projector, 3-1, 3-2 camera, 21 projection control unit, 22 shooting control unit, 101 control unit, 111 coding pattern generation unit, 112 corresponding point detection unit, 113 posture estimation unit, 114 correction information storage unit, 115 geometric correction unit, 116 correction deviation detection unit, 117 re-correction pattern generation unit, 118 update unit

Claims (17)

  1.  投影装置が投影する投影画像と、投影された前記投影画像を撮影装置によって撮影して得られた撮影画像とに基づいてそれぞれの画像上の対応点を検出する対応点検出と、前記投影装置と前記撮影装置の姿勢を前記対応点検出の結果に基づいて推定する姿勢推定とを含む初期補正が行われた後に発生した前記対応点のずれを検出する検出部と、
     前記初期補正によって得られた前記対応点の関係を表す情報である対応点情報に基づいて、前記投影画像上の前記対応点の位置を基準とした、所定のパターンから構成される再補正用パターンを生成する生成部と、
     前記投影装置から投影された前記再補正用パターンを前記撮影装置によって撮影して得られた前記撮影画像に基づいて、前記撮影画像上の前記対応点のずれを検出し、前記対応点情報を更新する更新部と
     を備える画像処理装置。
    Correspondence point detection that detects corresponding points on each image based on the projected image projected by the projection device and the captured image obtained by photographing the projected projected image by the photographing device, and the projection device A detection unit that detects the deviation of the corresponding points that occurs after the initial correction including the posture estimation that estimates the posture of the photographing device based on the result of the corresponding point detection.
    A re-correction pattern composed of a predetermined pattern based on the position of the corresponding point on the projected image based on the corresponding point information which is the information indicating the relationship of the corresponding point obtained by the initial correction. And the generator that generates
    Based on the photographed image obtained by photographing the recorrection pattern projected from the projection device by the photographing device, the deviation of the corresponding point on the photographed image is detected, and the corresponding point information is updated. An image processing device including an update unit.
  2.  前記生成部は、前記投影画像上の前記対応点の位置にドットが配置された前記再補正用パターンを生成する
     請求項1に記載の画像処理装置。
    The image processing apparatus according to claim 1, wherein the generation unit generates the recorrection pattern in which dots are arranged at the positions of the corresponding points on the projected image.
  3.  前記生成部は、前記投影画像上の前記対応点の位置であり、投影面に投影される範囲内の位置にドットが配置された前記再補正用パターンを生成する
     請求項1に記載の画像処理装置。
    The image processing according to claim 1, wherein the generation unit is a position of the corresponding point on the projected image, and generates the recorrection pattern in which dots are arranged at positions within a range projected on the projection surface. apparatus.
  4.  前記生成部は、前記対応点のずれの要因となった前記投影装置毎に1枚の前記再補正用パターンを生成する
     請求項1に記載の画像処理装置。
    The image processing device according to claim 1, wherein the generation unit generates one recorrection pattern for each projection device that causes the deviation of the corresponding points.
  5.  前記更新部は、前記撮影画像上の前記対応点のずれに基づいて、前記対応点情報を更新する
     請求項1に記載の画像処理装置。
    The image processing device according to claim 1, wherein the updating unit updates the corresponding point information based on the deviation of the corresponding points on the captured image.
  6.  前記更新部は、前記初期補正時の前記姿勢推定によって得られたパラメータ推定値を、更新済みの前記対応点情報に基づいて更新する
     請求項1に記載の画像処理装置。
    The image processing device according to claim 1, wherein the updating unit updates the parameter estimated value obtained by the posture estimation at the time of the initial correction based on the updated corresponding point information.
  7.  前記更新部は、前記対応点のずれ量が閾値となる量より多い場合、前記対応点のずれの要因となった前記投影装置と前記撮影装置の姿勢を推定することによって前記パラメータ推定値を更新する
     請求項6に記載の画像処理装置。
    When the amount of deviation of the corresponding points is larger than the amount that becomes the threshold value, the updating unit updates the parameter estimated value by estimating the postures of the projection device and the photographing device that caused the deviation of the corresponding points. The image processing apparatus according to claim 6.
  8.  前記更新部は、前記対応点のずれ量が閾値となる量より少ない場合、前記対応点のずれの発生前の前記パラメータ推定値を前記対応点のずれに応じて調整することによって、前記パラメータ推定値を更新する
     請求項6に記載の画像処理装置。
    When the amount of deviation of the corresponding point is less than the amount that becomes the threshold value, the updating unit estimates the parameter by adjusting the estimated value of the parameter before the deviation of the corresponding point occurs according to the deviation of the corresponding point. The image processing apparatus according to claim 6, wherein the value is updated.
  9.  前記投影装置による投影を制御する投影制御部と、
     前記撮影装置による撮影を制御する撮影制御部と
     をさらに備える請求項1に記載の画像処理装置。
    A projection control unit that controls projection by the projection device,
    The image processing device according to claim 1, further comprising a shooting control unit that controls shooting by the shooting device.
  10.  前記投影制御部は、複数の前記投影装置による投影を制御し、
     前記撮影制御部は、複数の前記撮影装置による撮影を制御する
     請求項9に記載の画像処理装置。
    The projection control unit controls projection by the plurality of projection devices, and controls projection.
    The image processing device according to claim 9, wherein the imaging control unit controls imaging by the plurality of imaging devices.
  11.  前記投影制御部は、前記対応点のずれの要因となった前記投影装置から前記再補正用パターンを投影させ、
     前記撮影制御部は、投影された前記再補正用パターンを、前記再補正用パターンを投影する前記投影装置が投影する前記投影画像の撮影を行う前記撮影装置によって撮影させる
     請求項9に記載の画像処理装置。
    The projection control unit projects the recorrection pattern from the projection device that causes the deviation of the corresponding points.
    The image according to claim 9, wherein the imaging control unit captures the projected recorrection pattern by the imaging device that captures the projected image projected by the projection device that projects the recorrection pattern. Processing equipment.
  12.  前記投影画像の投影面は、平面のスクリーン、曲面のスクリーン、または球面のスクリーンである
     請求項1に記載の画像処理装置。
    The image processing apparatus according to claim 1, wherein the projection surface of the projected image is a flat screen, a curved screen, or a spherical screen.
  13.  前記初期補正時の前記対応点検出は、複数枚の所定のパターンを表す前記投影画像と、投影されたそれぞれの前記所定のパターンを撮影して得られた複数枚の前記撮影画像とを用いて行われる
     請求項1に記載の画像処理装置。
    The corresponding point detection at the time of the initial correction uses the projected image representing a plurality of predetermined patterns and the plurality of captured images obtained by photographing each of the projected predetermined patterns. The image processing apparatus according to claim 1.
  14.  前記対応点のずれ量が閾値となる量より少ない場合、前記再補正用パターンを用いた前記対応点情報の更新を行わせ、前記対応点のずれ量が閾値となる量より多い場合、前記初期補正と同じ処理を行わせる制御部をさらに備える
     請求項1に記載の画像処理装置。
    When the deviation amount of the corresponding point is smaller than the threshold value, the corresponding point information is updated using the recorrection pattern, and when the deviation amount of the corresponding point is larger than the threshold value, the initial stage The image processing apparatus according to claim 1, further comprising a control unit that performs the same processing as the correction.
  15.  画像処理装置が、
     投影装置が投影する投影画像と、投影された前記投影画像を撮影装置によって撮影して得られた撮影画像とに基づいてそれぞれの画像上の対応点を検出する対応点検出と、前記投影装置と前記撮影装置の姿勢を前記対応点検出の結果に基づいて推定する姿勢推定とを含む初期補正が行われた後に発生した前記対応点のずれを検出し、
     前記初期補正によって得られた前記対応点の関係を表す情報である対応点情報に基づいて、前記投影画像上の前記対応点の位置を基準とした、所定のパターンから構成される再補正用パターンを生成し、
     前記投影装置から投影された前記再補正用パターンを前記撮影装置によって撮影して得られた前記撮影画像に基づいて、前記撮影画像上の前記対応点のずれを検出し、前記対応点情報を更新する
     画像処理方法。
    The image processing device
    Correspondence point detection that detects corresponding points on each image based on the projected image projected by the projection device and the captured image obtained by photographing the projected projected image by the photographing device, and the projection device The deviation of the corresponding points generated after the initial correction including the posture estimation for estimating the posture of the photographing device based on the result of the corresponding point detection is detected.
    A re-correction pattern composed of a predetermined pattern based on the position of the corresponding point on the projected image based on the corresponding point information which is the information indicating the relationship of the corresponding point obtained by the initial correction. To generate
    Based on the photographed image obtained by photographing the recorrection pattern projected from the projection device by the photographing device, the deviation of the corresponding point on the photographed image is detected, and the corresponding point information is updated. Image processing method to be performed.
16.  A program for causing a computer to execute processing of:
     detecting a deviation of corresponding points that occurs after an initial correction has been performed, the initial correction including corresponding point detection that detects the corresponding points on each image based on a projected image projected by a projection device and a captured image obtained by capturing the projected image with an imaging device, and posture estimation that estimates the postures of the projection device and the imaging device based on a result of the corresponding point detection;
     generating, based on corresponding point information that represents the relationship between the corresponding points obtained by the initial correction, a recorrection pattern composed of a predetermined pattern referenced to the positions of the corresponding points on the projected image; and
     detecting the deviation of the corresponding points on the captured image and updating the corresponding point information, based on the captured image obtained by capturing, with the imaging device, the recorrection pattern projected from the projection device.
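    The final step, detecting the deviation on the captured image and updating the corresponding point information, might look like the following sketch. It assumes a grayscale capture, represents the corresponding point information as a dictionary mapping projector points to camera points, and matches each stored point to the nearest detected marker centroid; the data layout and search radius are assumptions.

```python
import numpy as np
import cv2

def update_corresponding_points(captured, corr_info, search_radius=15.0):
    # Binarize the captured recorrection image and extract marker centroids.
    _, binary = cv2.threshold(captured, 127, 255, cv2.THRESH_BINARY)
    _, _, _, centroids = cv2.connectedComponentsWithStats(binary)
    detected = centroids[1:]  # row 0 is the background component
    if len(detected) == 0:
        return dict(corr_info)  # nothing recovered; keep the old information
    updated = {}
    for proj_pt, cam_pt in corr_info.items():
        dists = np.linalg.norm(detected - np.asarray(cam_pt), axis=1)
        j = int(np.argmin(dists))
        # A marker found near the old camera position becomes the new
        # corresponding point; otherwise the previous value is kept.
        updated[proj_pt] = tuple(detected[j]) if dists[j] <= search_radius else cam_pt
    return updated
```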
17.  An image processing system including:
     an image processing device including
        a detection unit that detects a deviation of corresponding points that occurs after an initial correction has been performed, the initial correction including corresponding point detection that detects the corresponding points on each image based on a projected image projected by a projection device and a captured image obtained by capturing the projected image with an imaging device, and posture estimation that estimates the postures of the projection device and the imaging device based on a result of the corresponding point detection,
        a generation unit that generates, based on corresponding point information that represents the relationship between the corresponding points obtained by the initial correction, a recorrection pattern composed of a predetermined pattern referenced to the positions of the corresponding points on the projected image, and
        an update unit that detects the deviation of the corresponding points on the captured image and updates the corresponding point information, based on the captured image obtained by capturing, with the imaging device, the recorrection pattern projected from the projection device;
     a plurality of the projection devices; and
     a plurality of the imaging devices.
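    A minimal wiring of such a system, reusing the make_recorrection_pattern and update_corresponding_points sketches above, might look like this; the projector and camera interfaces (resolution, project, capture) and the per-pair dictionary layout are assumed for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ProjectorCameraSystem:
    projectors: list   # objects assumed to expose .resolution and .project(image)
    cameras: list      # objects assumed to expose .capture() -> grayscale image
    corr_info: dict = field(default_factory=dict)  # (proj_id, cam_id) -> {proj_pt: cam_pt}

    def recorrect_pair(self, proj_id, cam_id):
        # Project markers at the stored corresponding point positions of this
        # projector, capture them with the paired camera, and refresh the map.
        info = self.corr_info[(proj_id, cam_id)]
        width, height = self.projectors[proj_id].resolution
        pattern = make_recorrection_pattern(width, height, list(info.keys()))
        self.projectors[proj_id].project(pattern)
        captured = self.cameras[cam_id].capture()
        self.corr_info[(proj_id, cam_id)] = update_corresponding_points(captured, info)
```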
PCT/JP2020/016067 2019-04-25 2020-04-10 Image processing device, image processing method, program, and image processing system WO2020218028A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-083726 2019-04-25
JP2019083726 2019-04-25

Publications (1)

Publication Number Publication Date
WO2020218028A1 (en)

Family

ID=72941957

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016067 WO2020218028A1 (en) 2019-04-25 2020-04-10 Image processing device, image processing method, program, and image processing system

Country Status (1)

Country Link
WO (1) WO2020218028A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005244835A (en) * 2004-02-27 2005-09-08 Olympus Corp Multiprojection system
JP2016224172A (en) * 2015-05-28 2016-12-28 株式会社リコー Projection system, image processing device, calibration method and program
WO2017006779A1 (en) * 2015-07-08 2017-01-12 ソニー株式会社 Information processing device and method, and projection imaging device and information processing method

Similar Documents

Publication Publication Date Title
US9344695B2 (en) Automatic projection image correction system, automatic projection image correction method, and non-transitory storage medium
CN110913188B (en) Apparatus and method for generating updated camera-projector correspondence from reduced set of test patterns
US9774837B2 (en) System for performing distortion correction and calibration using pattern projection, and method using the same
JP6394005B2 (en) Projection image correction apparatus, method and program for correcting original image to be projected
JP5961945B2 (en) Image processing apparatus, projector and projector system having the image processing apparatus, image processing method, program thereof, and recording medium recording the program
CN112399158B (en) Projection image calibration method and device and projection equipment
CN102714695A (en) Image processing device, image processing method and program
JP6953188B2 (en) Image processing system, image processing system control method, and program
JP2014521262A (en) Method and apparatus for calibrating an imaging device
JP2005033703A (en) System and method for processing image, projector, program, and information storage medium
JP2019079487A (en) Parameter optimization device, parameter optimization method and program
CN106537908A (en) Camera calibration
CN108702450A (en) Stablize the method for image sequence
JP2011049733A (en) Camera calibration device and video distortion correction device
WO2020255766A1 (en) Information processing device, information processing method, program, projection device, and information processing system
JP2011160344A (en) Apparatus and method for correcting stereoscopic image
WO2020218028A1 (en) Image processing device, image processing method, program, and image processing system
WO2017094536A1 (en) Image-processing device and image-processing method
JP2011103031A (en) Perspective conversion parameter generation device, image correction device, perspective conversion parameter generation method, image correction method and program
EP3899870A1 (en) Cloud-based camera calibration
US11166005B2 (en) Three-dimensional information acquisition system using pitching practice, and method for calculating camera parameters
KR100945555B1 (en) Apparatus and method for providing augmented reality space
WO2019244667A1 (en) Information processing device, information processing method, and program
KR101845612B1 (en) 3d information acquisition system using practice of pitching and method for calculation of camera parameter
JP2021127998A (en) Distance information acquisition device and distance information acquisition method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20794733
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 20794733
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP