WO2012175029A1 - 多投影拼接几何校正方法及校正装置 - Google Patents

多投影拼接几何校正方法及校正装置

Info

Publication number
WO2012175029A1
WO2012175029A1 (PCT/CN2012/077294)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
arc
arc screen
screen
Prior art date
Application number
PCT/CN2012/077294
Other languages
English (en)
French (fr)
Inventor
李凯
王静
赵光耀
刘源
Original Assignee
华为终端有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为终端有限公司 filed Critical 华为终端有限公司
Publication of WO2012175029A1 publication Critical patent/WO2012175029A1/zh

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0693Calibration of display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/002Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39Control of the bit-mapped memory
    • G09G5/395Arrangements specially adapted for transferring the contents of the bit-mapped memory to the screen

Definitions

  • Multi-projection splicing geometric correction method and correction device. This application claims priority to Chinese Patent Application No. 201110169655.8, filed with the Chinese Patent Office on June 22, 2011 and entitled "Multi-projection splicing geometric correction method and correction device", the entire contents of which are incorporated herein by reference.
  • The present invention relates to image processing technology, and in particular to a multi-projection splicing geometric correction method and a correction device. Background
  • A large-screen display system is a multi-channel-input display system with a single very large screen. In a large-screen splicing system, one overall picture is displayed, composed of the images of the individual signal sources, and each device displays only a portion of the image.
  • Arc-screen rear projection means projecting the images onto the arc screen from behind it. To obtain a good display effect, the images in the projectors need to be calibrated. Some geometric calibration schemes exist in the prior art, but they either require building a relatively complex mathematical model or involve too many image mapping transformations. Summary of the invention
  • Embodiments of the present invention provide a multi-projection splicing geometric correction method and a correction device, which achieve geometric calibration using fewer image mappings.
  • An embodiment of the present invention provides a multi-projection splicing geometric correction method, including: acquiring a mapping relationship between a camera and a 3D arc screen; obtaining a projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen, and the mapping relationship between the 3D arc screen and the input super-resolution image; and performing geometric registration correction, according to the projector frame-buffer image conversion mapping table, on the image to be projected by the projector.
  • An embodiment of the present invention provides a correction apparatus, including:
  • an acquiring module, configured to acquire the mapping relationship between the camera and the 3D arc screen; a determining module, configured to obtain the projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen obtained by the acquiring module, and the mapping relationship between the 3D arc screen and the input super-resolution image; and
  • a correction module, configured to perform geometric registration correction on the image to be projected by the projector according to the projector frame-buffer image conversion mapping table.
  • When the projector frame-buffer image conversion mapping table is solved, fewer image mapping relationships are required, so geometric calibration during multi-projection can be achieved with fewer image mappings, and the method is simple and easy to implement. Brief description of the drawings
  • FIG. 1 is a schematic flow chart of a method according to a first embodiment of the present invention
  • FIG. 2 is a schematic flow chart of a method according to a second embodiment of the present invention.
  • FIG. 3 is a schematic diagram of the 3D arc screen in the world coordinate system and of the image of the 3D arc screen captured by the camera, in the second embodiment of the present invention;
  • FIG. 4 is a schematic diagram of a camera pinhole imaging model in a second embodiment of the present invention.
  • FIG. 5 is a schematic flow chart of an implementation manner 1 for solving internal and external parameters of a camera according to a second embodiment of the present invention
  • FIG. 6 is a schematic flow chart of implementation manner 2 for solving the internal and external parameters of the camera in the second embodiment of the present invention;
  • FIG. 7 is a schematic diagram showing a 2D parametric representation of a 3D arc screen in a second embodiment of the present invention.
  • FIG. 8 is a schematic diagram of the identical 2D parameterization of the full input image coordinates and the 3D arc screen coordinates in the second embodiment of the present invention;
  • FIG. 9 is a schematic flow chart of a method according to a third embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a 3D arc screen and a virtual 2D flat screen according to a third embodiment of the present invention;
  • FIG. 11 is a schematic diagram of a camera image according to a third embodiment of the present invention.
  • FIG. 12 is a schematic diagram of a virtual camera image in a third embodiment of the present invention.
  • FIG. 13 is a schematic structural diagram of a device according to a fourth embodiment of the present invention. Detailed description of the embodiments
  • FIG. 1 is a schematic flowchart of a method according to a first embodiment of the present invention, including:
  • Step 11: The correction device acquires the mapping relationship between the camera and the 3D arc screen;
  • The mapping relationship between the camera and the 3D arc screen can be determined by using two virtual flat screens as intermediate parameters, which may include: obtaining the mapping relationship between the camera and the virtual 2D flat screen according to the mapping relationship between the camera image and the virtual camera image and the mapping relationship between the virtual camera image and the virtual 2D flat screen; and obtaining the mapping relationship between the camera and the 3D arc screen according to the mapping relationship between the camera and the virtual 2D flat screen and the mapping relationship between the virtual 2D flat screen and the 3D arc screen.
  • Alternatively, the mapping relationship between the camera and the 3D arc screen can be determined by using the internal and external parameters of the camera as intermediate parameters, which may include: solving the internal and external parameters of the camera according to the captured image of the 3D arc screen and the geometric information of the 3D arc screen; and obtaining the mapping relationship between the camera and the 3D arc screen according to the internal and external parameters of the camera and the 2D coordinates of the 3D arc screen.
  • Step 12: The correction device obtains a projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen, and the mapping relationship between the 3D arc screen and the input super-resolution image;
  • the mapping relationship between the camera and the 3D arc screen can be determined through step 11.
  • In addition, the projector-to-camera mapping relationship F^i_P→C (i = 1, 2, 3) (taking splicing with three projectors as an example) can be obtained with existing techniques, for example: a feature blob template image is projected by each of the three projectors onto the 3D arc screen and captured by a camera with fixed position and parameters, giving three camera images; feature detection on these camera images establishes a sparse point-to-point mapping to the template image with known blob positions; and Rational Bezier Patch surface interpolation then yields a dense projector-to-camera image mapping, which is used as the above projector-to-camera mapping relationship F^i_P→C (i = 1, 2, 3).
  • The mapping relationship between the 3D arc screen and the input super-resolution image is (m_x, m_y) = F_D→Su(s, t) = (s, t), where D denotes the projection screen (Display), Su denotes the input super-resolution (SuperImage) image, (m_x, m_y) is a point of the input super-resolution image and (s, t) is the 2D coordinate of a point on the 3D arc screen. Because the coordinate meaning of the input super-resolution image coincides with the 2D coordinate definition of the 3D arc screen, F_D→Su is the identity transformation.
  • With the projector-to-camera mapping relationship F^i_P→C (i = 1, 2, 3), the camera-to-3D-arc-screen mapping relationship F_C→D and the 3D-arc-screen-to-super-resolution-image mapping relationship F_D→Su, the projector frame-buffer image conversion mapping table can be obtained as follows. First, the three mappings are cascaded: (m_x, m_y) = F_D→Su(F_C→D(F^i_P→C(x, y))) = F^i_P→Su(x, y), where (x, y) is the coordinate of a point of the projector frame image.
  • Then, the point of the input super-resolution image corresponding to each point of the projector frame image is obtained through F^i_P→Su, and a color value is computed from the neighbouring points of the super-resolution image (the number of neighbours can be set) by interpolation and assigned to the point of the frame image; for example, if (x1, y1) corresponds to (m_x1, m_y1), the color value obtained by interpolating the points adjacent to (m_x1, m_y1) is assigned to (x1, y1).
  • Finally, the projector frame-buffer image conversion mapping table is determined by color comparison: letting RGB(x, y) = RGB(m_PW_x, m_PW_y), the mapping from the point (x, y) to the point (m_PW_x, m_PW_y) gives F^i_P→PW (i = 1, 2, 3), i.e. (m_PW_x, m_PW_y) = F^i_P→PW(x, y). The above RGB(·) represents the RGB color value of a coordinate point or pixel.
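  • The following Python sketch is an illustration only, not the patented implementation; the callables F_p2c and F_c2d and all variable names are assumptions standing in for the dense mapping tables F^i_P→C and F_C→D described above. It shows how the cascade and the bilinear color assignment could be realised to build one projector's warp table and warped frame image.

```python
import numpy as np

def build_warp_table(F_p2c, F_c2d, super_image, frame_shape):
    """Cascade projector->camera (F_p2c) and camera->screen (F_c2d); the
    screen->super-image mapping is the identity. Returns a per-pixel warp
    table and the warped frame image for one projector."""
    h, w = frame_shape
    warp = np.full((h, w, 2), np.inf)           # default value for unmapped points
    frame = np.zeros((h, w, 3), dtype=np.uint8)
    sh, sw, _ = super_image.shape
    for y in range(h):
        for x in range(w):
            u, v = F_p2c(x, y)                  # projector frame -> camera image
            s, t = F_c2d(u, v)                  # camera image -> 3D arc screen (= super image)
            if not (0 <= s < sw - 1 and 0 <= t < sh - 1):
                continue                        # leave the default (infinity) value
            warp[y, x] = (s, t)
            # bilinear interpolation of the 4 neighbouring super-image pixels
            x0, y0 = int(s), int(t)
            dx, dy = s - x0, t - y0
            c = ((1 - dx) * (1 - dy) * super_image[y0, x0]
                 + dx * (1 - dy) * super_image[y0, x0 + 1]
                 + (1 - dx) * dy * super_image[y0 + 1, x0]
                 + dx * dy * super_image[y0 + 1, x0 + 1])
            frame[y, x] = c.astype(np.uint8)
    return warp, frame
```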
  • Step 13: The correction device performs geometric registration correction on the image to be projected by the projector according to the projector frame-buffer image conversion mapping table.
  • That is, after the projector frame-buffer image conversion mapping table F^i_P→PW (i = 1, 2, 3) is obtained, geometric registration correction can be performed on the buffered image in the projector: the coordinates (m_PW_x, m_PW_y) of a corrected point are related to the coordinates (x, y) of the point before correction by (m_PW_x, m_PW_y) = F^i_P→PW(x, y) (i = 1, 2, 3).
  • The above correction device may be an independent device that performs the geometric registration correction on the image to be buffered in the projector and then sends the corrected image to the projector for buffering and projection; alternatively, the correction device may be embedded in the projector to perform the correction before buffering.
  • When the projector frame-buffer image conversion mapping table is solved in this embodiment, fewer image mapping relationships are required, so geometric calibration of multi-projection can be achieved with fewer image mappings, and the method is simple and easy to implement.
  • FIG. 2 is a schematic flowchart of a method according to a second embodiment of the present invention.
  • a mapping relationship between a camera and a 3D arc screen is determined by using internal and external parameters of the camera as an intermediate parameter.
  • the embodiment includes: Step 21: Solving the internal and external parameters of the camera according to the captured image of the 3D arc screen and the geometric information of the 3D arc screen;
  • Preferably, the geometric information of the 3D arc screen in the embodiment of the present invention is known.
  • In practical multipoint video-conference applications, a 3D arc screen, and in particular a cylindrical-arc-screen multi-projection system, gives the user an immersive experience: local and remote users feel as if they were in the same venue.
  • Moreover, cylindrical arc screens have been productized, mass customization is easier, and the specifications of a cylindrical arc screen are easy to fix by customization.
  • When a regular 3D cylindrical arc screen with known geometric information is used in the multi-projection splicing system, there is no need to solve for the projection-screen geometry when solving the various transformation mappings; the known geometric information can be used directly.
  • While the prior art usually solves for the geometric information of the 3D arc screen in various complex ways, with the advent of customized 3D arc screens the embodiment of the present invention can adopt a customized 3D arc screen, i.e. obtain the known geometric information of the 3D arc screen at the time of geometric calibration.
  • Of course, if very high accuracy is required for the geometric correction, the known 3D arc screen information can also be used as an initial estimate of the 3D arc screen geometry, and various methods can then be used to obtain more accurate geometric information.
  • First, regarding the 3D arc screen: FIG. 3 is a schematic diagram of the 3D arc screen in the world coordinate system and of the image of the 3D arc screen captured by the camera, according to an embodiment of the present invention. Referring to FIG. 3, the 3D arc screen has four vertices A, B, C and D; its upper and lower curves are the arc AB and the arc CD respectively, and its left and right edges are the line segments AD and BC respectively.
  • When the geometric information of the 3D arc screen is known, the lengths of the segments AB, CD, AD and BC, the length of the arc AB, the length of the arc CD, the radius of the arc AB and the radius of the arc CD are all known.
  • FIG. 4 is a schematic diagram of a camera pinhole imaging model according to an embodiment of the present invention.
  • Under this pinhole model, the camera maps, by perspective projection, a point of the 3D world space P³ with homogeneous expression M = (x, y, z, 1)^T to a point of the 2D image space P² with homogeneous expression m = (u, v, 1)^T. With calibration this can be written as:
  • m ≅ HM = K(R T)M
  • where H is the 3 × 4 camera projection matrix and ≅ means that the two sides are equal up to a scale factor; (R T) is the matrix formed by R and T. Assuming the optical center of the camera does not lie on the plane at infinity, H can be decomposed as H = K(R T).
  • K, R and T are the camera internal and external parameters to be solved, where K is the internal parameter matrix of the camera, R is the rotation matrix of the camera relative to the world coordinate system, and T is the translation of the camera relative to the world coordinate system.
  • The internal and external parameters of the camera can be described as follows:
  • The purpose of camera calibration is to obtain the internal parameters K and the external parameters (the rotation matrix R and the translation vector T) of the camera; the internal geometric and optical characteristics of the camera are the internal parameters, and the positional relationship of the camera coordinate system relative to the world coordinate system constitutes the external parameters.
  • K is an upper triangular matrix representing the internal parameters of the camera:
  • K = [ f_u  s  c0_u ; 0  f_v  c0_v ; 0  0  1 ]
  • where f_u is the magnification of the image in the u (horizontal) direction in pixels, f_v is the magnification in the v (vertical) direction in pixels, s is a skew factor corresponding to the skew of the camera coordinate axes, and (c0_u, c0_v) are the coordinates of the principal point in pixels.
  • The parameters f_u and f_v are closely related to the focal length of the camera. If the sensor array of the camera contains square pixels (f_u = f_v) and s = 0, then f_u and f_v are simply the camera focal length in pixels; if the sensor contains non-square pixels (for example a CCD camera), then f_u is the ratio of the focal length f to the pixel size in the u direction and f_v is the ratio of the focal length f to the pixel size in the v direction.
  • In practice one may set f_u = f_v = f (the camera focal length), s = 0, and c0_u, c0_v equal to half of the image resolution; with the Tsai and Zhang Zhengyou chessboard calibration methods, these assumptions are practical for standard- or high-definition cameras of common brands. Since the image resolution is known, the parameter of K that remains to be solved is f.
  • The rotation matrix R is the 3 × 3 rotation of the camera coordinate system relative to the world coordinate system, with three parameters to be solved; the translation vector T is the 3 × 1 translation of the camera coordinate system relative to the world coordinate system, with three parameters to be solved.
  • In total, seven parameters need to be solved for the internal and external parameters of the camera.
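  • As a worked illustration of the pinhole relation m ≅ K(R T)M under the simplifications above (square pixels, zero skew, principal point at the image centre), the following Python sketch projects 3D points with the seven parameters; parameterizing R by a Rodrigues rotation vector is an assumption of the sketch, not a requirement of the text.

```python
import numpy as np
import cv2  # used only for the Rodrigues formula

def project_points(points_3d, f, image_size, rvec, tvec):
    """Project Nx3 world points with a simplified pinhole camera:
    K has focal length f, zero skew and the principal point at the centre."""
    w, h = image_size
    K = np.array([[f, 0, w / 2.0],
                  [0, f, h / 2.0],
                  [0, 0, 1.0]])
    R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=float).reshape(3, 1))
    P = K @ np.hstack([R, np.asarray(tvec, dtype=float).reshape(3, 1)])  # 3x4 projection matrix
    M = np.hstack([np.asarray(points_3d, dtype=float), np.ones((len(points_3d), 1))])
    m = (P @ M.T).T
    return m[:, :2] / m[:, 2:3]   # (u, v) pixel coordinates
```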
  • In addition, according to the 3D-to-2D relationship above, once initial values of the internal and external parameters of the camera are set, the 2D image information corresponding to the geometric information of the 3D arc screen can be computed. Next, regarding the image of the 3D arc screen captured by the camera:
  • the first step is to determine the 2D coordinates of the four corners of the captured image:
  • To obtain the four corner points of the 3D arc screen more robustly, an image-segmentation-based detection algorithm is used to detect the 3D arc screen contour, and the sampling points of the closed contour are then examined as follows. (1) For each sampling point of the closed contour, compute the Difference-of-Gaussian (DoG) at the position of the point; the DoG operator is D(x, y, σ) = (G(x, y, kσ) - G(x, y, σ)) * I(x, y) = L(x, y, kσ) - L(x, y, σ).
  • (2) Compute the gradient orientation at the position of the point: θ(x, y) = tan⁻¹((L(x, y + 1) - L(x, y - 1)) / (L(x + 1, y) - L(x - 1, y))).
  • (3) Analyse the DoG of all sampling points and take a threshold T; the threshold needs to be determined by measurement, similarly to the thresholds of the feature point detectors SIFT and SURF.
  • (4) Use the threshold T to judge whether a sampling point belongs to the four corner points: sampling points whose DoG is larger than T are taken as corner candidates; if more than four points satisfy the condition, it is further judged whether they lie at corner positions close to the four sides of the image; if there are still more satisfying points, these points are clustered and the cluster means are taken as the corner points. The corner points of the camera image are thereby obtained.
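  • A hedged Python sketch of this corner test follows; the values of sigma, k and the threshold T are placeholders (the text states that T is tuned by measurement), and the clustering of surplus candidates is left as a post-processing step.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def corner_candidates(image, contour_points, sigma=1.6, k=1.6, T=0.03):
    """Keep the contour sampling points whose DoG magnitude exceeds T."""
    img = image.astype(float)
    L_k = gaussian_filter(img, k * sigma)      # L(x, y, k*sigma)
    L_s = gaussian_filter(img, sigma)          # L(x, y, sigma)
    dog = L_k - L_s                            # D(x, y, sigma)
    h, w = img.shape
    picked = []
    for x, y in contour_points:
        if 1 <= x < w - 1 and 1 <= y < h - 1 and abs(dog[y, x]) > T:
            # gradient orientation of step (2), computed on the smoothed image
            theta = np.arctan2(L_s[y + 1, x] - L_s[y - 1, x],
                               L_s[y, x + 1] - L_s[y, x - 1])
            picked.append((x, y, theta))
    return picked   # cluster / keep the four candidates near the image sides afterwards
```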
  • The second step is to determine the 2D coordinates of the curves of the captured image: (1) apply image segmentation to the input image, e.g. graph cut (min-cut) or MeanShift color segmentation, to obtain the cylindrical arc screen contour; (2) detect the four corner points of the arc screen (this detection can also be performed manually); (3) within the image coordinate range bounded by the four corner points, determine all dense sampling points on the curves and count them; (4) based on (3), fit the upper and lower curve contours by polynomial least-squares curve fitting.
  • Through the above steps, the information of the image of the 3D arc screen captured by the camera, and the 2D coordinates corresponding to the geometric information of the 3D arc screen (which depend on the internal and external parameters of the camera), can be acquired; the internal and external parameters of the camera can then be solved from these two inputs.
  • Specifically, an equally-spaced sampling-point reprojection approach or a line-to-line reprojection approach may be used for the solution.
  • the following are described separately:
  • FIG. 5 is a schematic flowchart of an implementation manner 1 for solving internal and external parameters of a camera according to an embodiment of the present invention.
  • An equally-spaced sampling-point reprojection approach is taken as an example; referring to FIG. 5, the method includes:
  • Step 51: According to the number of points of the upper and lower curves in the captured image, sample the upper and lower curves of the 3D arc screen at equal intervals, and establish the correspondence between the points of the captured image and the sampling points of the 3D arc screen.
  • Equally-spaced sampling means dividing the curve into a set number of curve segments of equal length; the end points of the segments are the sampling points.
  • Taking the curve AB as an example, if the number of points of the captured curve AB is n + 1, the curve AB of the 3D arc screen is sampled to obtain n + 1 sampling points with the same curve spacing between every two adjacent sampling points, and the first point of the captured image corresponds to the first sampling point of the 3D arc screen.
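  • The sketch below illustrates equally-spaced sampling for the special case of a circular arc (equal angular steps give equal arc length); the circular-arc assumption and all parameter names are illustrative.

```python
import numpy as np

def sample_arc_equally(radius, angle_start, angle_end, n_points, height=0.0):
    """Return n_points 3D samples of a circular arc at a fixed height Y,
    with equal arc-length spacing between neighbouring samples."""
    angles = np.linspace(angle_start, angle_end, n_points)
    return np.stack([radius * np.sin(angles),
                     np.full(n_points, float(height)),
                     radius * np.cos(angles)], axis=1)   # (X, Y, Z) world points
```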
  • Step 52: Calculate the error sum E of the 2D coordinates of the points in the captured image and the 2D coordinates corresponding to the sampling points of the 3D arc screen.
  • For example, through the formula m'_i ≅ K(R T)M_i and the 3D coordinates M_i of the sampling points of the 3D arc screen, the corresponding 2D coordinates can be obtained; when computing the 2D coordinates corresponding to the 3D points, the initially set internal and external parameters of the camera are used, and the 2D coordinates of the points in the captured image are known, so the error sum can be obtained as E = Σ_i ||m_i - m'_i||², where m_i is the coordinate of point i in the camera-captured image and m'_i is the 2D coordinate corresponding to sampling point i of the 3D arc screen.
  • Step 53: Determine whether the error sum E satisfies the accuracy requirement; if yes, go to step 54, otherwise go to step 55.
  • That is, whether E approaches 0 (whether its absolute value is less than or equal to a preset threshold, which can be chosen as a value close to 0).
  • Step 54: If E approaches 0, the seven unknown parameters that were sought are determined, i.e. the internal and external parameters of the camera are obtained.
  • Step 55: Update the internal and external parameters of the camera, and then repeat step 52.
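  • One possible realisation of this loop is to hand the residuals m_i - m'_i to a Levenberg-Marquardt solver, as in the hedged sketch below; the 7-vector packing of (f, rotation, translation) and the use of scipy are assumptions of the sketch, and 'project' is a projection routine with the signature of the project_points() sketch shown earlier.

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_camera_7params(image_points, arc_points_3d, image_size, x0, project):
    """Estimate the 7 unknowns (f, 3 rotation, 3 translation) by minimising
    E = sum_i ||m_i - m'_i||^2; x0 is the initial 7-parameter guess."""
    image_points = np.asarray(image_points, dtype=float)

    def residuals(p):
        reproj = project(arc_points_3d, p[0], image_size, p[1:4], p[4:7])
        return (reproj - image_points).ravel()

    # method='lm' selects Levenberg-Marquardt, as suggested in the text
    res = least_squares(residuals, np.asarray(x0, dtype=float), method="lm")
    return res.x[0], res.x[1:4], res.x[4:7]   # f, rotation vector, translation
```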
  • FIG. 6 is a schematic flowchart of implementation manner 2 for solving the internal and external parameters of the camera in the embodiment of the present invention.
  • The line-to-line reprojection approach is taken as an example. Referring to FIG. 6, the method includes:
  • Step 61: Calculate the 2D curve corresponding to a curve of the 3D arc screen;
  • Specifically, the 3D arc AB (or the arc CD) may be sampled at equal intervals and, using the initially set internal and external parameters of the camera, reprojected point by point into the image to obtain the corresponding 2D image points; for equally-spaced sampling see the related description in step 51.
  • These 2D image points can then be fitted, e.g. by least squares or polynomial curve fitting, to obtain the corresponding 2D curve.
  • Step 62: Calculate the distance E between the 2D curve corresponding to the curve of the 3D arc screen and the curve of the captured image.
  • Step 63: Determine whether the distance E satisfies the accuracy requirement; if yes, go to step 64, otherwise go to step 65.
  • Step 64: If E approaches 0, the seven unknown parameters are obtained, i.e. the internal and external parameters of the camera are obtained.
  • Step 65: Update the internal and external parameters of the camera, and then repeat step 61.
  • the Levenberg-Marquardt optimization algorithm can be used when setting and updating the parameters inside and outside the camera.
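  • For the line-to-line variant, one simple way to evaluate the curve-to-curve distance E of step 62 is sketched below; the polynomial degree and the vertical-distance metric are assumptions, since the text does not fix a particular fitting method or metric.

```python
import numpy as np

def curve_distance(reprojected_pts, captured_pts, degree=4):
    """Fit a polynomial to the reprojected arc points and return its mean
    vertical distance to the detected curve points of the captured image."""
    coeffs = np.polyfit(reprojected_pts[:, 0], reprojected_pts[:, 1], degree)
    poly = np.poly1d(coeffs)
    return float(np.mean(np.abs(poly(captured_pts[:, 0]) - captured_pts[:, 1])))
```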
  • Step 22: Obtain the mapping relationship between the camera and the 3D arc screen according to the internal and external parameters of the camera and the 2D coordinates of the 3D arc screen;
  • According to the 3D-to-2D correspondence of the pinhole model above and the internal and external parameters of the camera, the mapping relationship between the camera and the 3D arc screen can be obtained. The details can be as follows:
  • Input: the camera internal parameters K and external parameters R and T (obtained as in FIG. 5 or FIG. 6), and the 2D coordinates (s, t) of the 3D arc screen, where (X, Y, Z) are the coordinates of a point of the 3D arc screen in the world coordinate system, Z = f(X) is the 3D curve function of the screen profile, t = Y is the height and s = ∫√(1 + f'(x)²) dx is the arc length.
  • Output: F_C→D, the mapping from the camera image to the 3D arc screen.
  • The imaging geometry relating an image point m = (u, v) to a 3D arc screen point M = (X, Y, Z) is α·m̃ = K(R T)M̃, where α is the constant that makes the two sides equal, and m̃ = (u, v, 1)^T and M̃ = (X, Y, Z, 1)^T are the homogeneous coordinates of m and M. From this relation, the point M̃ corresponding to m̃ can be solved, and M̃ is then converted to the 2D parameterization (s, t) using t = Y and the arc-length formula; since (s, t) is thereby associated with (u, v), the mapping relationship F_C→D with (s, t) = F_C→D(u, v) is obtained.
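  • The sketch below illustrates one way to evaluate F_C→D for the special case of a circular-cylinder screen profile (an assumption of the sketch; the text only requires Z = f(X) to be known): the pixel is back-projected to a ray and intersected with the cylinder, and the intersection is expressed as (s, t).

```python
import numpy as np

def camera_to_arc(u, v, K, R, T, radius, center=(0.0, 0.0)):
    """Map pixel (u, v) to (s, t) = (arc length, height) on a circular-cylinder
    arc screen of known radius whose axis is the Y axis through (cx, cz)."""
    cx, cz = center
    origin = -R.T @ np.asarray(T, dtype=float).reshape(3)   # camera centre in world frame
    d = R.T @ np.linalg.solve(K, np.array([u, v, 1.0]))     # ray direction in world frame
    ox, oz, dx, dz = origin[0] - cx, origin[2] - cz, d[0], d[2]
    a = dx * dx + dz * dz
    b = 2.0 * (ox * dx + oz * dz)
    c = ox * ox + oz * oz - radius ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                                          # ray misses the screen
    # which root is the screen surface depends on where the camera sits;
    # here the farther (forward) intersection is taken
    lam = (-b + np.sqrt(disc)) / (2.0 * a)
    if lam <= 0:
        return None
    P = origin + lam * d
    s = radius * np.arctan2(P[0] - cx, P[2] - cz)            # arc length along the screen
    t = P[1]                                                 # height, t = Y
    return s, t
```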
  • Step 23: Obtain a projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen, and the mapping relationship between the 3D arc screen and the input super-resolution image;
  • Step 24: Perform geometric registration correction on the image to be projected by the projector according to the projector frame-buffer image conversion mapping table;
  • For the contents of steps 23 to 24, refer to steps 12 to 13.
  • Specifically, the calculation of each of the above mapping relationships may be as follows. The coordinate systems are (all units in pixels):
  • Projector frame-buffer image coordinates (x^i, y^i): the pixel coordinates of the i-th projector frame-buffer image, with x ∈ [0, 1919] and y ∈ [0, 1079];
  • Camera image coordinates (u, v): the pixel coordinates of the camera image, with u ∈ [0, 1919] and v ∈ [0, 1079];
  • 2D parameter coordinates (s, t) of the 3D arc screen (see FIG. 7): t is the height on the screen and s is the arc length, s = ∫√(1 + f'(x)²) dx; taking pixels as the unit of s and t keeps them consistent with the input super-resolution image;
  • Input super-resolution image coordinates (m_x, m_y) corresponding to the 3D arc screen (see FIG. 8): their meaning is essentially the same as the 2D parameters s and t of the 3D arc screen, and s = m_x, t = m_y.
  • In a practical implementation: (1) the input super-resolution original image size is 5760 × 1080; (2) the height of the 3D arc screen can be taken as 1 unit, represented as 1080 pixels, so that the total length of the arc curve is 1920 × 3 = 5760 pixels, matching the 48:9 aspect ratio of the unrolled screen. The input super-resolution image and the 3D arc screen then have exactly the same size, which avoids interpolation when mapping between them and fits the "wallpaper" principle.
  • Notation: P denotes the projector (Projector) image before transformation; PW denotes the projector image after transformation (ProjectorWarped); C denotes the camera (Camera) image; D denotes the 3D arc screen (Display); Su denotes the input super-resolution (SuperImage) image; RGB(·) denotes the RGB color value of a coordinate point or pixel (·).
  • The flow for generating the projector frame-buffer conversion mapping tables can be as follows. The input is the three projector-to-camera mappings F^i_P→C (i = 1, 2, 3), the camera-to-arc-screen mapping F_C→D and the arc-screen-to-super-resolution-image mapping F_D→Su; the output is the three projector frame-buffer image conversion mapping tables F^i_P→PW (i = 1, 2, 3). The numbers 1, 2, 3 denote the 1st, 2nd and 3rd projector (or camera image, or part of the super-resolution image).
  • 1) F^i_P→C: the mapping from the i-th projector frame-buffer image to the camera image, (u^i, v^i) = F^i_P→C(x^i, y^i), where (x^i, y^i) is a point of the projector frame-buffer image to be projected and (u^i, v^i) is the corresponding camera image point. The transformed camera image is obtained as follows: for any point (x^i, y^i) of the i-th projector image, (a) find the corresponding floating-point position in the camera image from the table F^i_P→C; (b) take the four neighbouring pixels of that floating-point position and bilinearly interpolate to obtain the color value of (u^i, v^i); (c) repeat until the last point of the projector image; (d) if step (a) finds no corresponding floating-point position in the pre-transform frame-buffer image, the point is assigned a default value (e.g. an infinity value).
  • 2) F_C→D: the mapping from the camera image to the 3D arc screen, (s, t) = F_C→D(u, v), where (s, t) is a 2D parameter point of the 3D arc screen and (u, v) is an image point captured by the camera.
  • 3) F_D→Su: the mapping from the 3D arc screen to the input super-resolution image, (m_x, m_y) = F_D→Su(s, t) = (s, t); since the coordinate meaning of the input super-resolution image coincides with the 2D parameterization of the 3D arc screen, F_D→Su is the identity transformation.
  • 4) F^i_P→Su: the mapping from the i-th projector frame-buffer image to the super-resolution image, obtained by cascading F^i_P→C, F_C→D and F_D→Su: (m_x, m_y) = F_D→Su(F_C→D(F^i_P→C(x, y))) = F^i_P→Su(x, y). This transformation cannot be expressed by a fixed mathematical formula; it is represented by an image-coordinate mapping table. The corresponding super-resolution image points are obtained as follows: for any point (x^i, y^i) of the i-th projector image, (a) find the floating-point position (u, v) in the camera image through F^i_P→C; (b) find the floating-point position (s, t) on the 3D arc screen corresponding to (u, v) through F_C→D; (c) this is also the corresponding floating-point position (m_x, m_y) in the super-resolution image; (d) take the four neighbouring pixels, bilinearly interpolate to obtain the color value, and assign it to (x^i, y^i); (e) repeat until the last point of the projector image.
  • 5) F^i_P→PW (solved on the basis of F^i_P→Su): letting RGB(x, y) = RGB(m_PW_x, m_PW_y), the mapping from the point (x, y) to the point (m_PW_x, m_PW_y) is F^i_P→PW, i.e. (m_PW_x, m_PW_y) = F^i_P→PW(x, y) (i = 1, 2, 3). Here (x, y) is a point before the projector frame-buffer image transformation, (m_PW_x, m_PW_y) is the point after the transformation, RGB(·) is the RGB color value of a coordinate point or pixel, and PW (ProjectorWarped) refers to the image after the projector transformation.
  • In this embodiment, solving the internal and external parameters of the camera only requires the 2D coordinates corresponding to the geometric information of the 3D arc screen, i.e. the required mathematical model is only the 3D-to-2D correspondence, which is simple; and the geometric correction only requires the projector-to-camera mapping relationship and the camera-to-3D-arc-screen mapping relationship, i.e. fewer image mappings. Therefore, geometric calibration of multi-projection can be achieved with a simpler mathematical model and fewer image mappings.
  • FIG. 9 is a schematic flowchart of a method according to a third embodiment of the present invention.
  • two virtual flat screens are used as intermediate parameters to determine a mapping relationship between a camera and a 3D arc screen.
  • Referring to FIG. 9, the embodiment includes: Step 91: According to the mapping relationship between the camera image and the virtual camera image and the mapping relationship between the virtual camera image and the virtual 2D flat screen, obtain the mapping relationship between the camera and the virtual 2D flat screen; here, the virtual 2D flat screen is the virtual 2D flat screen corresponding to the 3D arc screen.
  • For example, referring to FIG. 10, assume the 3D arc screen is unrolled; the resulting flattened 2D screen is the virtual 2D flat screen.
  • The mapping relationship between the camera image and the virtual camera image can be obtained from a cylinder-to-plane distortion algorithm, for example a keystone (trapezoidal) distortion algorithm.
  • For example, referring to FIG. 11, the camera captures the template image projected onto the 3D arc screen to obtain the camera image corresponding to the template image; then, referring to FIG. 12, the camera image is warped into the virtual camera image using the distortion algorithm.
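  • One simple stand-in for such a distortion algorithm is a keystone-style perspective rectification that maps the four detected screen corners onto a rectangle, as sketched below; this is only an assumed realisation, not the specific algorithm of the patent.

```python
import cv2
import numpy as np

def unwarp_to_virtual_camera(camera_img, corners_ABCD, out_size=(1920, 1080)):
    """Warp the camera image so that corners A, B, C, D (top-left, top-right,
    bottom-right, bottom-left, as in FIG. 3) land on a rectangle."""
    w, h = out_size
    src = np.float32(corners_ABCD)
    dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(camera_img, H, (w, h))
```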
  • The mapping relationship between the virtual camera image and the virtual 2D flat screen can be determined from the mapping relationship between two planes, specifically as follows.
  • Compute the 2D projective transformation H: suppose there are M pairs of matched feature points between image O and image J; a 2D projective transformation H_J is to be determined that maps the M feature points of image O to the corresponding M feature points of image J. Determining the transformations can be divided into two steps: first compute each H_J with a linear method, then iteratively refine it with the Levenberg-Marquardt optimization method.
  • Step 1) Linear determination of each H_J by the DLT method (see page 88 of R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, ISBN 0521540518, second edition, 2004). Let u_j^O = (u_j^O, v_j^O, 1)^T and u_j^J = (u_j^J, v_j^J, 1)^T be the j-th pair of matched feature points in image O and image J; then u_j^J ≅ H_J u_j^O holds for this pair. Writing the nine entries of H_J as unknowns and eliminating the unknown scale factor yields, for each pair of matched points, two linear equations in the nine entries of H_J. Hence four pairs of matched points between image O and image J produce eight equations, which determine H_J up to a scale factor; for M ≥ 5 pairs of matched points, an overdetermined system A h = 0 is obtained, where A is a 2M × 9 matrix and h is the column vector of the entries of H_J. The solution is the h with ||h|| = 1 that minimizes ||A h||, which is exactly the eigenvector corresponding to the smallest eigenvalue of A^T A and can be found conveniently by performing an SVD of A.
  • Step 2) Iterative refinement of each H_J with the Levenberg-Marquardt optimization method: the goal is to compute, by iterative refinement, the calibrated values Ĥ_J and û_j^O of H_J and u_j^O that minimize the reprojection error; the initial value of H_J is the value determined linearly in step 1), and the initial value of û_j^O is u_j^O. For the normalization, see chapter 4, section 4 of Hartley and Zisserman (2004).
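  • The DLT step can be written compactly as below; the sketch omits the coordinate normalization recommended by Hartley and Zisserman and the subsequent Levenberg-Marquardt refinement.

```python
import numpy as np

def dlt_homography(pts_src, pts_dst):
    """Estimate the 3x3 projective transformation H (up to scale) from M >= 4
    matched point pairs, taking the right singular vector of A associated
    with the smallest singular value."""
    A = []
    for (x, y), (u, v) in zip(pts_src, pts_dst):
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
    A = np.asarray(A, dtype=float)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# usage: H maps points of image O onto image J, p_J ~ H @ [x_O, y_O, 1]
```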
  • At this point the mapping relationship F_C→virtualC between the camera image and the virtual camera image, and the mapping relationship F_virtualC→virtual2D between the virtual camera image and the virtual 2D flat screen, have been obtained; the mapping relationship between the camera and the virtual 2D flat screen is then obtained by cascading: F_C→virtual2D = F_virtualC→virtual2D(F_C→virtualC).
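  • The cascading used here (and throughout the embodiments) is plain function composition, as the trivial sketch below illustrates; the names are placeholders.

```python
def cascade(f_outer, f_inner):
    """Compose two mapping functions, e.g.
    F_C->virtual2D = F_virtualC->virtual2D o F_C->virtualC."""
    return lambda *args: f_outer(*f_inner(*args))

# usage (placeholder names): F_c_to_virtual2d = cascade(F_virtualc_to_virtual2d, F_c_to_virtualc)
```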
  • Step 92: According to the mapping relationship between the camera and the virtual 2D flat screen and the mapping relationship between the virtual 2D flat screen and the 3D arc screen, obtain the mapping relationship between the camera and the 3D arc screen;
  • Step 91 provides the mapping relationship F_C→virtual2D between the camera and the virtual 2D flat screen. The mapping relationship between the virtual 2D flat screen and the 3D arc screen can be obtained by inverting the mapping relationship between the 3D arc screen and the virtual 2D flat screen, and the latter can be determined as follows.
  • The 2D coordinates of the 3D arc screen are (s, t), where t is the height on the screen and s is the arc length, s = ∫√(1 + f'(x)²) dx (in pixels); Z = f(X) is the 3D curve function of the screen profile and Y corresponds to t, so one can set t = Y. Here (X, Y, Z) are the coordinates of a point in the world coordinate system. This establishes the relation between the world coordinate system (X, Y, Z) and the 2D parameterization (s, t) of the 3D arc screen.
  • In this embodiment the parameterization is expressed in cylindrical-arc form: if the 3D coordinate of any point on the arc is (Xw, Yw, Zw), the corresponding point (S, T) on the straightened (unrolled) segment is solved as follows: the arc length S is computed from (Xw, Zw), and T = Yw.
  • At this point the mapping relationship F_C→virtual2D between the camera and the virtual 2D flat screen and the mapping relationship F_virtual2D→D between the virtual 2D flat screen and the 3D arc screen have been determined, and cascading them gives the mapping relationship between the camera and the 3D arc screen: F_C→D = F_virtual2D→D(F_C→virtual2D).
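  • For a circular-cylinder screen (again an assumed special case of Z = f(X)), the unrolling of a 3D arc point to the virtual 2D flat screen reduces to an arc-length computation, as sketched below.

```python
import numpy as np

def arc_to_flat(Xw, Yw, Zw, radius, center=(0.0, 0.0)):
    """Map a 3D point (Xw, Yw, Zw) on the arc to (S, T) on the unrolled
    2D flat screen: S is the arc length measured from the screen mid-line,
    and T = Yw (the height is preserved)."""
    cx, cz = center
    S = radius * np.arctan2(Xw - cx, Zw - cz)   # arc length from (Xw, Zw)
    T = Yw
    return S, T
```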
  • Step 93: Obtain a projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen, and the mapping relationship between the 3D arc screen and the input super-resolution image;
  • Step 94: Perform geometric registration correction on the image to be projected by the projector according to the projector frame-buffer image conversion mapping table. For the contents of steps 93 to 94, refer to steps 12 to 13.
  • This embodiment does not take the approach of equally dividing the 3D arc screen into flat screens, i.e. discretizing the 3D arc screen, treating each small segment as a flat screen, and then applying a flat-screen projection geometric correction method to correct the 3D arc screen.
  • Instead, this embodiment transforms the 3D arc screen into a virtual 2D flat screen and transforms the image of the 3D arc screen captured by the camera into a virtual planar image, and then transplants existing flat-screen projection geometric correction methods onto the virtual 2D flat screen and the virtual flat-screen image to correct the 3D arc-screen projection.
  • That is, the arc-screen splicing geometric correction is performed as far as possible on the basis of existing flat-screen correction techniques, ingeniously avoiding the construction of a complex 3D arc-screen model and avoiding solving the mapping from the 3D arc screen to a 2D image.
  • FIG. 13 is a schematic structural diagram of a device according to a fourth embodiment of the present invention.
  • The device may be an independently arranged device, which sends the corrected image to the projector for buffering after correction, or a device arranged inside the projector, which performs the correction before buffering.
  • the device includes an obtaining module 131, a determining module 132, and a correcting module 133.
  • the obtaining module 131 is configured to acquire a mapping relationship between the camera and the 3D arc screen.
  • The determining module 132 is configured to obtain the projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen obtained by the acquiring module, and the mapping relationship between the 3D arc screen and the input super-resolution image.
  • The correction module 133 is configured to perform geometric registration correction on the image to be projected by the projector according to the projector frame-buffer image conversion mapping table.
  • The acquiring module may include a first unit, and the first unit is configured to: obtain the mapping relationship between the camera and the virtual 2D flat screen according to the mapping relationship between the camera image and the virtual camera image and the mapping relationship between the virtual camera image and the virtual 2D flat screen; and obtain the mapping relationship between the camera and the 3D arc screen according to the mapping relationship between the camera and the virtual 2D flat screen and the mapping relationship between the virtual 2D flat screen and the 3D arc screen.
  • Alternatively, the acquiring module includes a second unit, and the second unit is configured to: solve the internal and external parameters of the camera according to the captured image of the 3D arc screen and the geometric information of the 3D arc screen; and obtain the mapping relationship between the camera and the 3D arc screen according to the internal and external parameters of the camera and the 2D coordinates of the 3D arc screen.
  • the geometric information of the 3D arc screen acquired by the second unit is known information.
  • Specifically, the second unit may be configured to: sample the upper and lower curves of the 3D arc screen at equal intervals according to the number of points of the upper and lower curves in the captured image, and establish the correspondence between the points of the captured image and the sampling points of the 3D arc screen; calculate the error sum of the 2D coordinates of the points in the captured image and the 2D coordinates corresponding to the sampling points of the 3D arc screen, the latter being obtained from the set initial internal and external parameters of the camera and the geometric information of the 3D arc screen; if the error sum does not satisfy the accuracy requirement, update the initial internal and external parameters of the camera and recalculate the 2D coordinates corresponding to the sampling points of the 3D arc screen with the updated parameters, until the error sum satisfies the accuracy requirement; and take the internal and external parameters of the camera corresponding to the error sum that satisfies the accuracy requirement.
  • Alternatively, the second unit may be configured to: calculate the 2D curve corresponding to a curve of the 3D arc screen according to the geometric information of the 3D arc screen and the set initial internal and external parameters of the camera; calculate the distance between the 2D curve corresponding to the curve of the 3D arc screen and the curve of the captured image; if the distance does not satisfy the accuracy requirement, update the initial internal and external parameters of the camera and recalculate the 2D curve corresponding to the curve of the 3D arc screen with the updated parameters, until the distance satisfies the accuracy requirement; and take the internal and external parameters of the camera corresponding to the distance that satisfies the accuracy requirement.
  • When the projector frame-buffer image conversion mapping table is solved in this embodiment, fewer image mapping relationships are required, so geometric calibration of multi-projection can be achieved with fewer image mappings, and the method is simple and easy to implement.
  • A person of ordinary skill in the art can understand that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions; the foregoing program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; and the foregoing storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

Provided are a multi-projection splicing geometric correction method and a correction device. The method includes: acquiring a mapping relationship between a camera and a 3D arc screen; obtaining a projector frame-buffer image conversion mapping table according to the mapping relationship between the projector and the camera, the mapping relationship between the camera and the 3D arc screen, and the mapping relationship between the 3D arc screen and the input super-resolution image; and performing geometric registration correction, according to the projector frame-buffer image conversion mapping table, on the image to be projected by the projector. The correction can be implemented simply and effectively.

Description

多投影拼接几何校正方法及校正装置 本申请要求于 2011 年 06 月 22 日提交中国专利局、 申请号为 201110169655.8、 发明名称为"多投影拼接几何校正方法及校正装置"的中国 专利申请的优先权, 其全部内容通过引用结合在本申请中。 技术领域
本发明涉及图像处理技术, 尤其涉及一种多投影拼接几何校正方法及校 正装置。 背景技术
大屏幕显示系统是一个多路输入、 超大单屏的显示系统。 在大屏幕拼接 系统中, 显示出来的是一幅整体的画面, 它由各信号源的图像拼接而成。 每 台设备只显示图像的一部分。 弧幕背投是指从弧幕背面, 分别投影图像到弧 幕中。 为了得到较好的显示效果, 需要对投影仪中的图像进行校准。 现有技 术存在一些几何校准方案, 但是, 现有方案中需要较复杂的数学模型建立或 者需要过多的图像映射变换过程。 发明内容
本发明实施例提供一种多投影拼接几何校正方法及校正装置, 采用较少 的图像映射实现几何校准。
本发明实施例提供一种多投影拼接几何校正方法, 包括: 获取摄像机到 3D弧幕的映射关系;
根据投影仪到摄像机的映射关系、 所述摄像机到 3D弧幕的映射关系以及 3D弧幕到输入超分辨率图像映射关系 , 得到投影仪帧緩存图像变换映射表; 根据所述投影仪帧緩存图像变换映射表, 对投影仪要投影的图像进行几 何配准校正。 本发明实施例提供一种校正装置, 包括:
获取模块, 用于获取摄像机到 3D弧幕的映射关系; 确定模块, 用于根据投影仪到摄像机的映射关系、 所述获取模块得到的 所述摄像机到 3D弧幕的映射关系以及 3D弧幕到输入超分辨率图像映射关系, 得到投影仪帧緩存图像变换映射表;
校正模块, 用于根据所述投影仪帧緩存图像变换映射表, 对投影仪要投 影的图像进行几何配准校正。
由上述技术方案可知, 本发明实施例在求解投影仪帧緩存图像变换映射 表时, 所需的图像映射关系较少, 因此可以采用较少的图像映射实现多投影 时的几何校准, 实现方法简便易行。 附图说明
为了更清楚地说明本发明实施例中的技术方案, 下面将对实施例描述中 所需要使用的附图作一简单地介绍, 显而易见地, 下面描述中的附图是本发 明的一些实施例, 对于本领域普通技术人员来讲, 在不付出创造性劳动性的 前提下, 还可以根据这些附图获得其他的附图。
图 1为本发明第一实施例的方法流程示意图;
图 2为本发明第二实施例的方法流程示意图;
图 3为本发明第二实施例中 3D弧幕在世界坐标系中的图像及摄像机对 该
3D弧幕的拍摄图像的示意图;
图 4为本发明第二实施例中摄像机针孔成像模型的示意图;
图 5为本发明第二实施例中求解摄像机内外参数的实现方式一的流程示 意图;
图 6为本发明第二实施例中求解摄像机内外参数的实现方式二的流程示 意图;
图 7为本发明第二实施例中 3D弧幕的 2D参数化表示示意图;
图 8为本发明第二实施例中完整输入图像坐标与 3D弧幕坐标相同的 2D参 数化表示示意图;
图 9为本发明第三实施例的方法流程示意图; 图 10为本发明第三实施例中 3D弧幕与虚拟 2D平幕的示意图; 图 11为本发明第三实施例中摄像机图像的示意图;
图 12为本发明第三实施例中虚拟摄像机图像的示意图;
图 13为本发明第四实施例的装置结构示意图。 具体实施方式
为使本发明实施例的目的、 技术方案和优点更加清楚, 下面将结合本发 明实施例中的附图, 对本发明实施例中的技术方案进行清楚、 完整地描述, 显然, 所描述的实施例是本发明一部分实施例, 而不是全部的实施例。 基于 本发明中的实施例, 本领域普通技术人员在没有做出创造性劳动前提下所获 得的所有其他实施例, 都属于本发明保护的范围。
图 1为本发明第一实施例的方法流程示意图, 包括:
步骤 11: 校正装置获取摄像机到 3D弧幕的映射关系;
其中, 摄像机到 3D弧幕的映射关系可以通过两个虚拟平幕作为中间参数 确定, 此时可以包括: 根据摄像机图像与虚拟摄像机图像的映射关系, 以及 虚拟摄像机图像与虚拟 2D平幕的映射关系, 得到摄像机与虚拟 2D平幕的映射 关系;根据所述摄像机与虚拟 2D平幕的映射关系, 以及虚拟 2D平幕与 3D弧幕 的映射关系, 得到摄像机到 3D弧幕的映射关系。
或者, 摄像机到 3D弧幕的映射关系也可以通过摄像机内外参数作为中间 参数确定, 此时可以包括: 根据对 3D弧幕的拍摄图像, 及所述 3D弧幕的几何 信息, 求解摄像机内外参数; 根据所述摄像机内外参数及所述 3D弧幕的 2D坐 标, 得到摄像机到 3D弧幕的映射关系。
上述两种方法可以具体参见下述实施例。
步骤 12: 校正装置根据投影仪到摄像机的映射关系、 所述摄像机到 3D弧 幕的映射关系以及 3D弧幕到输入超分辨率图像映射关系 , 得到投影仪帧緩存 图像变换映射表; 其中, 通过步骤 1 1可以确定摄像机到 3D弧幕的映射关系 。
另外, 投影仪到摄像机的映射关 F^→c('' = i,23) (以 3个投影仪进行拼接 为例)可以采用现有技术实现, 例如, 可以采用如下方式确定:
1 )制作特征 blob模板图;
2 ) 3个投影仪分别投影到 3D弧幕;
3 )分别由固定位置与参数的摄像机拍摄, 获得 3个摄像机图像;
4 )对该 3个摄像机图像进行特征检测, 并建立与已知特征 blob位置信息 的模板图间的点对映射关系; 此时的映射是稀疏的。
5 )应用 Rational Bezier Patch曲面插值算法, 建立致密的投影仪到摄像机 图像的映射。
至此, 将得到的致密的投影仪到摄像机图像的映射作为上述的投影仪到 摄像机图像的映射关系 F ^123)。
再者, 3D弧幕到输入超分辨率图像映射关系 FD→ 可以采用如下方式确 定:
此映射变换公式如下:
(mx , my) = FD→Su (s, t) = (s, t)
其中,
D表示投影幕 (Display) , Su表示输入超分辨率 (Superlmage)图像;
(m_x, m_y)、 (s, t) 分别为输入超分辨率图像中的点, 3D弧幕的 2D坐标点; 由于输入超分辨率图像的坐标含义与 3D弧幕的 2D坐标定义一致, 则显然有 (m_x, m_y) = (s, t)。
因此, F_D→Su 为单位变换。
通过上述计算,得到了投影仪到摄像机的映射关系 F'p→e( = l,2,3)、摄像机 到 3D弧幕的映射关系 3D弧幕到输入超分辨率图像映射关系 FD→S", 之后, 可以根据上述 3 个映射关系, 得到投影仪帧緩存图像变换映射表。 具 体可以如下:
首先, 根据上述 3 个映射关系级联得到 F^i_P→Su (i = 1,2,3), 即,
(m_x, m_y) = F_D→Su(F_C→D(F^i_P→C(x, y)))
= F_C→D(F^i_P→C(x, y))
= F^i_P→Su(x, y)
其中, (x, y) 为投影仪帧图像点的坐标。
之后, 根据 F^i_P→Su (i = 1,2,3) 得到投影仪帧图像中某一点对应的输入超分辨率图像中的点, 根据输入超分辨率图像中的点计算颜色值并赋给该投影仪帧图像中的点, 例如, (x1, y1) 对应 (m_x1, m_y1), 则可以将 (m_x1, m_y1) 邻近的点 (点的个数可以设定) 进行插值计算得到的颜色值赋给 (x1, y1)。
之后, 通过颜色比对确定投影仪帧緩存图像变换映射表: 令 RGB(x, y) = RGB(m_PW_x, m_PW_y), 则得到点 (x, y) 到点 (m_PW_x, m_PW_y) 的映射 F^i_P→PW (i = 1,2,3); 点 (x, y) 的值为输入超分辨率图像中对应点的颜色值。 即选取一个点 (x, y), 计算得到该点的颜色值, 之后找到具有相同颜色值 RGB(m_PW_x, m_PW_y) 的点 (m_PW_x, m_PW_y), 即得到变换关系: (m_PW_x, m_PW_y) = F^i_P→PW(x, y) (i = 1,2,3)。
上述的 RGB(·) 表示颜色值。
至此则得到了投影仪帧緩存图像变换映射表 F^i_P→PW (i = 1,2,3)。
步骤 13: 校正装置根据所述投影仪帧緩存图像变换映射表, 对投影仪要 投影的图像进行几何配准校正。
即, 在得到投影仪帧緩存图像变换映射表 ^^^ 123)之后, 可以对投 影仪中的緩存图像进行几何配准校正, 即校正后的点的坐标 ― 与校 正前点的坐标 (, 的关系为: , MY ) = F'P→PW y) j = i, 2, 3)。
另外, 上述的校正装置可以为独立的装置, 其对投影仪中待緩存的图像 进行上述几何配准校正后 , 将校正后的图像发送给投影仪进行緩存后投影。 也可以是, 该校正设备内嵌在投影仪中以在緩存前进行校正。
本实施例在求解投影仪帧緩存图像变换映射表时, 所需的图像映射关系 较少, 因此可以采用较少的图像映射实现多投影时的几何校准, 实现方法简 便易行。
图 2为本发明第二实施例的方法流程示意图, 本实施例以摄像机内外参数 作为中间参数确定摄像机到 3D弧幕的映射关系。 参见图 2, 本实施例包括: 步骤 21 : 根据对 3D弧幕的拍摄图像, 及所述 3D弧幕的几何信息, 求解摄 像机内外参数;
优选的, 本发明实施例的 3D弧幕的几何信息是已知的。
在实际多点视频会议应用中, 3D弧幕, 特别是柱面弧幕多投影系统更能 给用户带来沉浸式体验, 本地与远端用户更有身处同一会场的感觉。
并且, 柱面弧幕产品化, 大批量定制化更容易, 柱面弧幕的规格容易实 现定制下来。 规则化的、 已知信息的 3D柱面弧幕用于多投影拼接系统中, 在 求解各种变换映射时, 就没有必要再去求解投影幕几何信息了, 直接使用已 知几何信息即可。
而现有技术中通常使用各种复杂的方式求解该 3D弧幕的几何信息,但是, 随着 3D弧幕定制化的出现, 本发明实施例可以采用定制化的 3D弧幕, 即可以 在几何校准时获取已知的 3D弧幕的几何信息。 当然, 如果对几何校正有非常 高的精度要求, 也可以将已知的 3D弧幕信息作为 3D弧幕几何信息的初始估 计, 之后再采用各种方法获取更为精确的 3D弧幕几何信息。
求解摄像机内外参数可以如下:
首先, 关于 3D弧幕:
图 3为本发明实施例中 3D弧幕在世界坐标系中的图像及摄像机对该 3D弧 幕的拍摄图像的示意图, 参见图 3, 3D弧幕包括 4个顶点 A、 B、 C和 D, 3D弧 幕的上下两个曲线分别为圓弧 AB和圓弧 CD , 左右线段分别为线段 AD和线段 BC。
当 3D弧幕的几何信息是已知的, 则线段 AB、 CD、 AD、 BC的长度、 圓弧 AB的长度、 圓弧 CD的长度、 圓弧 AB的半径及圓弧 CD的半径均是已知的。
其次, 关于 3D与 2D的对应关系:
摄像机采用的是针孔模型, 图 4为本发明实施例中摄像机针孔成像模型的示意图, 在该针孔模型下, 摄像机按透视射影变换将 3D世界空间 P³ 的点的齐次表达式 M = (x, y, z, 1)^T, 投影到 2D图像空间 P² 的点的齐次表达式 m = (u, v, 1)^T。 用公式标定可写成:
m ≅ HM = K(R T)M
其中, H表示 3 x 4的摄像机投影矩阵, ≅ 表示方程两边在相差一个比例因子的意义下相等。 (R T) 为由 R和 T组成的一个大矩阵。 假设摄像机光心不在无穷远平面上, 摄像机投影矩阵 H可做如下分解:
H = K(R T)
K、 R、 T为待求解的摄像机内外参数, 其中, Κ为摄像机的内部参数, R 为摄像机相对于世界坐标系的旋转矩阵, Τ为摄像机相对于世界坐标系的平移 矩阵。
摄像机内外参数可以参照如下描述:
摄像机标定的目的是获得摄像机的内部参数 Κ和外部参数(包括旋转矩阵 R和平移向量 Τ ) , 摄像机内部的几何和光学特性, 即为内部参数; 摄像机坐 标系相对于空间坐标系的位置关系, 即为外部参数。
Κ为一个表示摄像机内部参数的上三角矩阵, 如下:
K = [ f_u  s  c0_u ; 0  f_v  c0_v ; 0  0  1 ]
其中, f_u 是图像 u方向 (横向) 以像素为单位的放大倍数; f_v 是图像 v方向 (纵向) 以像素为单位的放大倍数;
S是相应于相机坐标轴扭曲的畸变因子;
c。u、 c。v是以像素为单位的主点的坐标。
参数 和 /v与摄像机的焦距有密切的联系。 在摄像机的感光阵列中包含 的像素是正方形像素的情况下 (即/ M = /v ) , 若 s=0, 则 和/:即是以像素为 单位的摄像机焦距; 若感光阵列中包含的像素是非正方形像素 (比如 CCD摄 像机) , 则/ M是焦距 f与 u方向像素的大小的比值, 是焦距 f与 V方向像素的大 小的比值。
在实际运算中, 可令 = /ν = /, f为摄像机焦距;
可令: s=0;
可令: C。M和 c。v均为图像分辨率的一半。
采用 Tsai以及张正友棋盘标定法,上述假设在通用品牌的标清或高清摄像 机中是切实可行的。
由于图像分辨率是已知的, 在上述假设条件下, 需要求解的参数为 f。 旋转矩阵 R是三个轴向的、 摄像机坐标系相对于世界坐标系的 3 X 3阶旋转 向量, 包括 3个需要求解的参数; 平移向量 T是摄像机坐标系相对于世界坐标 系的 3 x 1阶平移向量, 包括 3个需要求解的参数。
综上所述, 在求解摄像机的内外参时, 需要求解的参数为 7个。
另外, 根据上述的 3D与 2D的关系, 在设定摄像机内外参数初始值后, 可 以获取 3D弧幕的几何信息对应的 2 D图像信息。
再次, 关于摄像机对 3D弧幕的拍摄图像:
第一步, 确定拍摄图像的四个角点的 2D坐标:
四个角点在摄像机图像中的 2D坐标, 分别为
为了更加鲁棒地获得 3D弧幕的四个角点, 采取基于图像分割检测算法检 测 3D弧幕轮廓, 再进一步对封闭的轮廓的采样点进行研判。 ①任取封闭轮廓中采样点 , 求该点位置处的 高斯核差分
( Difference-of-Gaussian, DoG ) , DoG算子如下式所示:
D(x, y, σ) = (G(x, y, kσ) - G(x, y, σ)) * I(x, y) = L(x, y, kσ) - L(x, y, σ)
②计算该点位置处的梯度方向
θ(x, y) = tan⁻¹((L(x, y + 1) - L(x, y - 1)) / (L(x + 1, y) - L(x - 1, y)))
③取所有采样点的 DoG进行分析, 取一个阈值 T
该阈值需要实测确定, 类同于特征点检测法 SIFT、 SURF阔值。
④由该阈值 T研判采样点是否属于四个角点。
将大于 T的采样点作为角点, 若符合条件的点多于四个, 进一步判断是否 属于接近于图像四边的角点位置, 若还是有更多的满足的点, 则取这些点聚 类的平均值作为角点。
至此, 求得了摄像机图像点
第二步, 确定拍摄图像的曲线的 2D坐标:
①对输入图像, 采取图割, 如最小割、 MeanShift颜色分割方法获得柱面 弧幕轮廓;
②检测弧幕的四个角点, 该检测也可手动进行;
③基于四个角点所规范的电子坐标范围, 确定曲线上所有致密采样点, 统计点数;
④基于③, 多项式最小曲线拟合, 采样上下曲线轮廓。
综上,通过上述步骤, 可以获取摄像机对所述 3D弧幕的拍摄图像的信息, 及所述 3D弧幕的几何信息对应的 2D坐标(与摄像机内外参数相关) , 之后, 根据上述两个参数, 可以求解摄像机内外参数。
具体地, 可以采用等间隔采样点重投方式, 或者采用线线重投方式进行 求解。 下面分别描述:
图 5为本发明实施例中求解摄像机内外参数的实现方式一的流程示意图, 本实施例以等间隔采样点重投方式为例, 参见图 5, 包括: 步骤 51 : 根据拍摄图像中上下曲线的点的数目, 对 3D弧幕的上下曲线进 行等间隔采样, 并建立拍摄图像的点与 3D弧幕的采样点之间的对应关系。
其中, 等间隔采样是指将曲线划分为设定个数的等间隔的曲线段, 每个 曲线段的端点即为等间隔采样的采样点。
例如, 以曲线 AB为例, 如果拍摄得到的曲线 AB的点的个数为 n+1 , 则对 3D弧幕的曲线 AB进行采样, 得到 n+1个采样点且每两个采样点之间的曲线间 距相同, 并且拍摄图像的第 1个点与 3D弧幕的第一个采样点是对应的。
步骤 52:计算拍摄图像中点的 2D坐标 m与 3D弧幕的采样点对应的 2D坐标 的误差和。
例如, 通过公式 m'_i ≅ K(R T)M_i 以及 3D弧幕的采样点的 3D坐标 M_i, 可以获取对应的 2D坐标, 而在计算 3D对应的 2D坐标时, 可以采用初始设定的摄像机内外参数进行求解, 而在拍摄图像中各点的 2D坐标是已知的, 因此, 可以获取上述的误差和:
E = Σ_i ||m_i - m'_i||²
其中, m i是点 i的摄像机拍摄图像的坐标, m' i是采样点 i的 3D弧幕对应的 2D坐标。
步骤 53: 判断该误差和 E是否满足精度要求, 若是, 执行步骤 54, 否则, 执行步骤 55。
即: 是否趋近于 0 (该绝对值是否小于等于一个预设的阈值, 该阈 值可以选为接近于 0的值) 。
步骤 54: 若趋近于 0, 则将所求的 7个未知参数确定为设定的摄像机内外 参数, 即获取了摄像机内外参数。
步骤 55: 更新摄像机内外参数。 之后, 重复执行步骤 52。
其中, 在设定及更新摄像机内外参数时可以采用 Levenberg-Marquardt 优化算法。 图 6为本发明实施例中求解摄像机内外参数的实现方式二的流程示意图, 本实施例以线线重投方式为例, 参见图 6, 包括:
步骤 61 : 计算 3D弧幕的曲线对应的 2D曲线;
具体地, 可以等间隔采样 3D圓弧 AB (或圓弧 CD), 根据初始设定的摄像机 内外参数, 逐点重投该圓弧 AB到图像中获得对应的 2D图像点; 等间隔采样可 以参见步骤 51中的相关描述。
之后, 可以采用最小二乘法, 多项式曲线拟合等对这些 2D图像点进行拟 合得到对应的 2D曲线。 ;
步骤 62:计算 3D弧幕的曲线对应的 2D曲线与拍摄图像的曲线之间的距离 步骤 63: 判断该距离 E是否满足精度要求, 若是, 执行步骤 64, 否则, 执 行步骤 65。
即: |E - |是否趋近于 0 (该绝对值是否小于等于一个预设的阈值, 该阈 值可以选为接近于 0的值) 。
步骤 64: 若趋近于 0, 则所求的 7个未知参数获得了, 即获取了摄像机内 外参数。
步骤 65: 更新摄像机内外参数。 之后, 重复执行步骤 61。
其中, 在设定及更新摄像机内外参数时可以采用 Levenberg-Marquardt 优化算法。
步骤 22: 根据所述摄像机内外参数及所述 3D弧幕的 2D坐标, 得到摄像机 到 3D弧幕的映射关系;
其中, 根据上述的针孔模型得到的 3D与 2D的对应关系, 根据该摄像机内 外参数, 可以得到摄像机到 3D弧幕的映射关系。 具体可以如下:
输入: 摄像机内部参数 K、 外部参数 R 和 T, 以及 3D弧幕的 2D坐标。 其中, 上述的 K、 R、 T 可以通过图 5 或图 6 得到;
3D弧幕的 2D坐标 (s, t) 满足: t = Y, s = ∫√(1 + f'(x)²) dx;
其中: ①、 (X, Y, Z) 为 3D弧幕在世界坐标系下的点的坐标;
②、 Z = f(X);
t 为高度, s 为弧长。
输出: F_C→D (摄像机图像到 3D弧幕映射关系)。
具体计算过程可以如下:
图像点 m = (u, v) 与 3D弧幕点 M = (X, Y, Z) 的成像几何关系如下: α·m̃ = K(R T)M̃, α 为使得两侧对齐的常数; 其中, m̃ 为图像点 m 的齐次表示, m̃ = (u, v, 1)^T; 同理, M̃ 为 M 的齐次坐标, M̃ = (X, Y, Z, 1)^T;
K 为摄像机内部参数; R、 T 分别为摄像机外部参数。 由 α·m̃ = K(R T)M̃, 可求得与 m̃ = (u, v, 1)^T 对应的 M̃;
之后, 将 M̃ 转化成 2D参数化 (s, t), 即采用 t = Y、 s = ∫√(1 + f'(x)²) dx;
(s, t) 可以由 M̃ = (X, Y, Z, 1)^T 得到, 另外, M̃ 也与 m̃ = (u, v, 1)^T 对应, 因此, 可以得到摄像机到 3D弧幕的映射关系 F_C→D, 其中, (s, t) = F_C→D(u, v)。
步骤 23: 根据投影仪到摄像机的映射关系、 所述摄像机到 3D弧幕的映 射关系以及 3D弧幕到输入超分辨率图像映射关系 ,得到投影仪帧緩存图像变 换映射表;
步骤 24: 根据所述投影仪帧緩存图像变换映射表, 对投影仪要投影的图 像进行几何配准校正;
其中, 步骤 23~24的内容可以参见步骤 12~13所示。 具体地, 上述各映射 关系的计算过程可以如下:
投影仪帧緩存图像坐标: (x^i, y^i), i = 1, 2, 3
其中:
①、 (x^i, y^i) 为第 i 个投影仪帧緩存图像像素点坐标, 单位为 pixel;
②、 x ∈ [0, 1919], y ∈ [0, 1079]。
摄像机图像坐标: (u, v)
其中,
①、 (u, v) 为摄像机图像的像素点坐标, 单位为 pixel;
②、 u ∈ [0, 1919], v ∈ [0, 1079]。
参见图 7, 3D弧幕的 2D参数坐标点: (s, t)
其中,
①、 t 表示幕的高度;
②、 s 表示弧长, s = ∫√(1 + f'(x)²) dx, 单位为 pixel。
注: 令 s 或 t 的单位为 pixel, 方便与输入超分辨率图像统一。
参见图 8, 与 3D弧幕所对应的输入超分辨率图像坐标: (m_x, m_y)
其中, m_x、 m_y 的本质含义同 3D弧幕的 2D参数 s 和 t; m_x、 m_y 单位为 pixel, 且 s = m_x, t = m_y。
在实际实现中,
①输入的超分辨率原始图像大小为 5760 1080; ②可令 3D弧幕的高度为 1个单位, 该 1个单位表示为 1080个 pixel。 弧幕曲 线的总长度为 1920 x 3 pixel=5760 pixel, 满足弧幕实际尺寸 (弧幕平面展开 后的矩形 ) 比率 48: 9。
这样, 输入的超分辨率原始图像大小和 3D弧幕的大小就一模一样, 避免 在二者映射时的插值运算, 也恰到好处地符合了 "贴墙纸" 原理。
各种映射参数含义:
F^i_P→C (i = 1,2,3): 3 个 Projector 帧緩存图像 → Camera 图像映射
F_C→D: Camera 图像 → Display 弧幕映射
F_D→Su: Display 弧幕 → SuperImage (输入超分辨率图像) 变换映射
F^i_P→Su (i = 1,2,3): 3 个 Projector 帧緩存图像 → SuperImage 图像变换映射
F^i_P→PW (i = 1,2,3): 生成 3 个 Projector 帧緩存图像变换 (Warped) 映射表
P: 表示投影仪 (Projector) 变换之前图像
PW: 表示投影仪图像变换之后的图像 (ProjectorWarped)
C: 表示由摄像机 (Camera) 拍摄的图像
D: 表示 3D弧幕 (Display)
Su: 表示输入超分辨率 (SuperImage) 图像
RGB(·): 表示某一坐标点或像素点 (·) 的 RGB 颜色值
细化 3 个投影仪帧緩存图像变换映射表求解:
输入:
3 个投影仪到摄像机映射 F^i_P→C (i = 1,2,3)、
摄像机到弧幕映射 F_C→D、
弧幕到输入超分辨率图像映射 F_D→Su
输出:
3 个投影仪帧緩存图像变换映射表 F^i_P→PW (i = 1,2,3)
生成投影仪帧緩存变换映射表的流程可以如下,
1) F^i_P→C (i = 1,2,3):
3 个 Projector 帧緩存图像 → Camera 图像映射 F^i_P→C (i = 1,2,3)
此 3 个映射变换公式如下:
(u^i, v^i) = F^i_P→C(x^i, y^i) (i = 1, 2, 3)
其中:
数字 1、 2、 3 表示第 1、 2、 3 个投影仪或第 1、 2、 3 个摄像机;
①、 C 表示由摄像机 (Camera) 拍摄的图像, P 表示投影仪 (Projector) 帧緩存图像;
②、 (x^i, y^i)、 (u^i, v^i) 的定义分别为投影仪帧緩存待投影图像点, 由摄像机拍摄的图像点;
③、 F^i_P→C 分别表示第 i (i=1,2,3) 个投影仪帧緩存图像到摄像机图像变换的函数。
获得变换后的摄像机图像
步骤: 任取第 i 个投影仪中的图像任一点 (x^i, y^i) (i = 1, 2, 3)
①、 由映射表 F^i_P→C (i = 1,2,3), 找到在摄像机图像中对应的浮点;
②、 取该浮点邻近 4 个像素点, 双线性插值, 获得点 (u^i, v^i) 的颜色值;
③、 重复①②, 直至待投影仪图像中最后一点。
④、 若由步骤①, 找不到对应的变换之前帧緩存图像中的浮点, 则该点赋值为默认值 (如: 无穷大值)。
2) F_C→D:
Camera 图像 → Display 弧幕映射, 此映射变换公式如下:
(s, t) = F_C→D(u, v)
其中,
①、 C 表示由摄像机 (Camera) 拍摄的图像, D 表示投影幕 (Display);
②、 (s, t)、 (u, v) 的定义分别为 3D弧幕中 2D参数点, 由摄像机拍摄的图像点;
③、 F_C→D 表示摄像机图像到 3D弧幕变换的函数。
3) F_D→Su:
Display 弧幕 → SuperImage (输入原始超分辨率图像) 变换映射, 此映射变换公式如下:
(m_x, m_y) = F_D→Su(s, t) = (s, t)
其中,
①、 D 表示投影幕 (Display), Su 表示原始超分辨率 (SuperImage) 图像;
②、 (m_x, m_y)、 (s, t) 的定义分别为原始超分辨率图像中的点, 3D弧幕中 2D参数点; 并且, 参数 m_x、 m_y 与 s、 t 有相同的含义。
③、 F_D→Su 表示 3D弧幕点到原始超分辨率图像点变换的函数。
原始输入大图的坐标含义与 3D弧幕的 2D参数化定义一致, 显然有 (m_x, m_y) = (s, t), 故而 F_D→Su 为单位变换。
4) F^i_P→Su (i = 1,2,3):
3 个 Projector 帧緩存图像 → SuperImage 图像变换映射。
级联应用映射变换 F^i_P→C (i = 1,2,3)、 F_C→D、 F_D→Su, 可获得映射变换 F^i_P→Su (i = 1,2,3), 有:
(m_x, m_y) = F_D→Su(F_C→D(F^i_P→C(x, y))) = F_C→D(F^i_P→C(x, y)), 等价于 (m_x, m_y) = F^i_P→Su(x, y)。
其中:
①、 数字 1、 2、 3 表示第 1、 2、 3 个投影仪或原始超分辨率图像第 1、 2、 3 部分; Su 表示超 (Super) 分辨率图像, P 表示投影仪 (Projector) 图像;
②、 (x, y)、 (m_x, m_y) 的定义分别为投影仪帧緩存待投影图像点, 输入原始超分辨率大图像点;
③、 F^i_P→Su 分别表示第 i (i=1,2,3) 个投影仪帧緩存图像变换函数, 表示由原始超分辨大图像变换到最终要投影的图像之变换函数, 该变换函数不能由固定的数学表达式所表达, 是由图像坐标映射表所表示。
获得变换后的对应的超分辨率图像点
步骤: 任取第 i 个投影仪中的图像任一点 (x^i, y^i) (i = 1,2,3)
①、 由映射 F^i_P→C (i = 1,2,3), 找到摄像机图像中的浮点 (u, v);
②、 由 F_C→D, 找到 (u, v) 所对应的 3D弧幕中的浮点 (s, t);
③、 也就找到了超分辨率图像中对应的浮点 (m_x, m_y);
④、 取该浮点邻近 4 个像素点, 双线性插值, 得到颜色值, 并把该值赋给 (x^i, y^i);
⑤、 重复①②③④, 直至待投影仪图像中最后一点。
5) F^i_P→PW (i = 1,2,3):
此变换基于 F^i_P→Su (i = 1,2,3) 来求解。 已知我们已经获得了变换 F^i_P→Su (i = 1,2,3), 待求变换 F^i_P→PW (i = 1,2,3), 也就是: 令 RGB(x, y) = RGB(m_PW_x, m_PW_y) 时, 求点 (x, y) 到点 (m_PW_x, m_PW_y) 的映射 F^i_P→PW (i = 1,2,3), 即:
(m_PW_x, m_PW_y) = F^i_P→PW(x, y) (i = 1,2,3)。
其中, ①、 (x, y) 表示投影仪帧緩存图像变换之前的点;
②、 (m_PW_x, m_PW_y) 表示投影仪帧緩存图像变换之后的点;
③、 RGB(·) 表示某一坐标点或像素点 (·) 的 RGB 颜色值;
④、 PW (ProjectorWarped) 表示投影仪变换之后的意思。
本实施例在求解摄像机内外参数时, 需要 3D弧幕的几何信息对应的 2D 坐标, 即需要的数学模型只是 3D与 2D的对应关系, 所需数学模型简单, 并 且在几何校正时需要的是投影仪到摄像机的映射关系和摄像机到 3D 弧幕的 映射关系, 所需的图像映射关系较少, 因此可以采用较简单的数学模型及较 少的图像映射实现多投影时的几何校准。
图 9 为本发明第三实施例的方法流程示意图, 本实施例以两个虚拟平幕 作为中间参数确定摄像机到 3D弧幕的映射关系。 参见图 9, 本实施例包括: 步骤 91: 根据摄像机图像与虚拟摄像机图像的映射关系, 以及虚拟摄像 机图像与虚拟 2D平幕的映射关系 , 得到摄像机与虚拟 2D平幕的映射关系; 其中, 上述的虚拟 2D平幕为与 3D弧幕对应的虚拟 2D平幕, 例如, 参见图 10, 假设将 3D弧幕展开, 得到的展开后的 2D平幕即为虚拟 2D平幕。
摄像机图像与虚拟摄像机图像的映射关系可以根据柱面到平面的畸变算 法得到, 例如采用梯形畸变算法得到。 例如, 参见图 11, 摄像机对投影到 3D 弧幕上的模板图进行拍摄, 得到模板图对应的摄像机图像, 之后, 参见图 12, 采用畸变算法将摄像机图像畸变为虚拟摄像机图像。
虚拟摄像机图像与虚拟 2D平幕的映射关系可以根据两个平幕的映射关系 确定, 具体可以如下: 计算 2D投影变换 H. 设图像 0和图像 J之间有 M对匹配的特征点, 现在需要确定 2D投影变换 , 将图像 0上的 M个特征点分别映射到图像 J上的 M个特征点。 N - 1个 的确定的步骤可分为两步:其一是使用线性方法分别计算 N - 1 个 , 其二是使用 Levenberg-Marquardt优化方法迭代求精这 N - 1个
H
步骤 1) DLT法线性确定各个 H_J, 可以参见 R. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, ISBN: 0521540518, second edition, 2004. 中的第 88页。
设 u_j^O = (u_j^O, v_j^O, 1)^T、 u_j^J = (u_j^J, v_j^J, 1)^T 是第 j 对分别位于图像 O 和图像 J 的匹配的特征点。 如下的方程在这对匹配点是成立的:
u_j^J ≅ H_J u_j^O
其中的 H_J 未知, 设
H_J = [ h_1 h_2 h_3 ; h_4 h_5 h_6 ; h_7 h_8 h_9 ]
则可形成如下的方程组:
u^J = (h_1 u^O + h_2 v^O + h_3) / (h_7 u^O + h_8 v^O + h_9)
v^J = (h_4 u^O + h_5 v^O + h_6) / (h_7 u^O + h_8 v^O + h_9)
消去未知量 (比例因子) 后整理得,
u^O·h_1 + v^O·h_2 + 1·h_3 + 0·h_4 + 0·h_5 + 0·h_6 + (-u^J u^O)·h_7 + (-u^J v^O)·h_8 + (-u^J)·h_9 = 0
0·h_1 + 0·h_2 + 0·h_3 + u^O·h_4 + v^O·h_5 + 1·h_6 + (-v^J u^O)·h_7 + (-v^J v^O)·h_8 + (-v^J)·h_9 = 0
该方程是以 H_J 的 9 个项为未知量的由 2 个方程组成的方程组, 也就是一对匹配点之间存在 2 个以 H_J 的 9 个项为未知量的方程, 于是, 图像 O 和图像 J 之间 4 对匹配点就能产生 8 个以 H_J 的 9 个项为未知量的方程, 从而就能在相差一个比例因子的意义上确定出 H_J (令 h_9 = 1)。 对于图像 O 和图像 J 之间存在 M 对 (M ≥ 5) 以上匹配点的情形, 可以形成一个具备如下形式的超定方程组: A h = 0。
其中 A 是 2M × 9 的矩阵, 而 h = (h_1, h_2, h_3, h_4, h_5, h_6, h_7, h_8, h_9)^T 是 H_J 的项组成的列向量。 对于超定方程组 A h = 0, 需要求解出这样的 h, 使得 h 的模 ||h|| = 1 并且 A h 的模 ||A h|| 最小化, 这样的 h 正好是相应于矩阵 A^T A 的最小特征值的特征向量, 这可以通过对 A 进行 SVD 分解方便地找到。
步骤 2) 用 Levenberg-Marquardt 优化方法迭代求精这 N - 1 个 H_J: 由方程 (u^J, v^J, 1)^T ≅ H_J (u^O, v^O, 1)^T 可引入向量 û_j^O = (û^O, v̂^O, 1)^T, 并令:
û_j^J = Ĥ_J û_j^O
Ĥ_J 和 û_j^O 分别是 H_J 和 u_j^O 的校准值, 这里 Levenberg-Marquardt 优化方法的目标是, 通过迭代求精, 计算出 Ĥ_J 和 û_j^O, 使得如下的重投影误差最小:
Σ_j ( d(u_j^O, û_j^O)² + d(u_j^J, Ĥ_J û_j^O)² )
Ĥ_J 的初始值可取由步骤 1) 线性确定的值, 而 û_j^O 的初始值可取 u_j^O。
规范化可参见 R. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, ISBN: 0521540518, second edition, 2004. 中的第四章第四节。
至此可以得到摄像机图像与虚拟摄像机图像的映射关系 F_C→虚拟C, 以及虚拟摄像机图像与虚拟 2D平幕的映射关系 F_虚拟C→虚拟2D, 之后, 采用级联的方式即可以得到摄像机与虚拟 2D平幕的映射关系 F_C→虚拟2D = F_虚拟C→虚拟2D(F_C→虚拟C)。 步骤 92: 根据所述摄像机与虚拟 2D平幕的映射关系, 以及虚拟 2D平幕与
3D弧幕的映射关系, 得到摄像机到 3D弧幕的映射关系;
其中, 步骤 91可以得到摄像机与虚拟 2D平幕的映射关系 F_C→虚拟2D。
虚拟 2D平幕与 3D弧幕的映射关系可以对 3D弧幕与虚拟 2D平幕的映射关 系进行逆变换得到, 而 3D弧幕与虚拟 2D平幕的映射关系可以如下确定:
3D弧幕的 2D坐标: (s, t)
其中: ①、 t 表示幕的高度;
②、 s 表示弧长, s = ∫√(1 + f'(x)²) dx, 单位为 pixel;
Z = f(X) 为 3D曲线函数, Y 与 t 对应, 显然可令:
t = Y
s = ∫√(1 + f'(x)²) dx
其中: ①、 (X, Y, Z) 为世界坐标系下的点的坐标;
②、 Z = f(X)。
这样也建立了世界坐标系 (X, Y, Z) 与 3D弧幕的 2D参数化表示 (s, t)。
本实施例中要改成柱面圓弧形式的参数坐标表示, 例如, 如果圓弧上任一点的 3D坐标点为 (Xw, Yw, Zw), 求解拉直的线段上的对应点 (S, T), 求解步骤可以包括: 由 (Xw, Zw) 求解圓弧弧长 S, 而 T = Yw。
至此, 可以确定摄像机与虚拟 2D平幕的映射关系 F_C→虚拟2D, 以及虚拟 2D平幕与 3D弧幕的映射关系 F_虚拟2D→D, 之后, 根据级联可以得到摄像机到 3D弧幕的映射关系 F_C→D = F_虚拟2D→D(F_C→虚拟2D)。
步骤 93: 根据投影仪到摄像机的映射关系、 所述摄像机到 3D弧幕的映射 关系以及 3D弧幕到输入超分辨率图像映射关系 , 得到投影仪帧緩存图像变换 映射表;
步骤 94: 根据所述投影仪帧緩存图像变换映射表, 对投影仪要投影的图 像进行几何配准校正。 步骤 93~94的内容可以参见步骤 12~ 13。
本实施例不采取等分 3D弧幕成平面幕, 即, 通过离散化 3D弧幕, 每一 个小段弧幕假设为一个平幕, 进而采取平幕投影几何校正方法来校正 3D弧 幕。 本实施例把 3D弧幕变换到虚拟的 2D平幕中, 把摄像机拍摄到的 3D弧幕 变换到虚拟的平面图像中, 在虚拟的 2D平幕和虚拟的平幕的图像上, 移植现 有平幕投影几何校正方法, 校正 3D弧幕投影。 也就是, 最大程度基于现有平 面幕校正技术, 进行弧幕拼接几何校正。 巧妙规避了复杂的 3D弧幕模型的建 立, 以及规避求解 3D弧幕到 2D图像的映射。
图 13为本发明第四实施例的装置结构示意图, 该装置可以为独立设置的 装置, 之后将校正后的图像发送给投影仪緩存, 也可以为设置在投影仪内的 装置, 在緩存前进行校正。 该装置包括获取模块 131、 确定模块 132和校正模 块 133; 获取模块 131用于获取摄像机到 3D弧幕的映射关系; 确定模块 132用 于根据投影仪到摄像机的映射关系、 所述获取模块得到的所述摄像机到 3D弧 幕的映射关系以及 3D弧幕到输入超分辨率图像映射关系 , 得到投影仪帧緩存 图像变换映射表; 校正模块 133用于根据所述投影仪帧緩存图像变换映射表, 对投影仪要投影的图像进行几何配准校正。
可以是, 所述获取模块包括第一单元, 所述第一单元用于: 根据摄像机 图像与虚拟摄像机图像的映射关系, 以及虚拟摄像机图像与虚拟 2D平幕的映 射关系, 得到摄像机与虚拟 2D平幕的映射关系; 根据所述摄像机与虚拟 2D平 幕的映射关系, 以及虚拟 2D平幕与 3D弧幕的映射关系,得到摄像机到 3D弧幕 的映射关系。
或者, 所述获取模块包括第二单元, 所述第二单元用于: 根据对 3D弧幕 的拍摄图像, 及所述 3D弧幕的几何信息, 求解摄像机内外参数; 根据所述摄 像机内外参数及所述 3D弧幕的 2D坐标, 得到摄像机到 3D弧幕的映射关系。
进一步可以是,所述第二单元获取的所述 3D弧幕的几何信息为已知信息。 具体地, 所述第二单元具体用于: 根据所述拍摄图像中上下曲线的点的 数目, 对所述 3D弧幕的上下曲线进行等间隔采样, 并建立拍摄图像的点与 3D 弧幕的采样点之间的对应关系; 计算拍摄图像中点的 2D坐标与 3D弧幕的采样 点对应的 2D坐标的误差和,所述 3D弧幕的采样点对应的 2D坐标为根据设定的 初始的摄像机的内外参数及所述 3D弧幕的几何信息得到的; 如果所述误差和 不满足精度要求, 则更新所述初始的摄像机的内外参数, 并采用更新后的摄 像机的内外参数重新计算所述 3D弧幕的采样点对应的 2D坐标, 直至所述误差 和满足精度要求; 获取误差和满足精度要求时对应的摄像机的内外参数。
或者, 所述第二单元具体用于: 根据所述 3D弧幕的几何信息及设定的初 始的摄像机的内外参数,计算 3D弧幕的曲线对应的 2D曲线; 计算 3D弧幕的曲 线对应的 2D曲线与拍摄图像的曲线之间的距离; 如果所述距离不满足精度要 求, 则更新所述初始的摄像机的内外参数, 并采用更新后的摄像机的内外参 数重新计算 3D弧幕的曲线对应的 2D曲线, 直至所述距离满足精度要求; 获取 距离满足精度要求时对应的摄像机的内外参数。
本实施例在求解投影仪帧緩存图像变换映射表时, 所需的图像映射关系 较少, 因此可以采用较少的图像映射实现多投影时的几何校准, 实现方法简 便易行。
本领域普通技术人员可以理解: 实现上述方法实施例的全部或部分步骤 可以通过程序指令相关的硬件来完成, 前述的程序可以存储于计算机可读取 存储介质中, 该程序在执行时, 执行包括上述方法实施例的步骤; 而前述的 存储介质包括: ROM, RAM ,磁碟或者光盘等各种可以存储程序代码的介质。
最后应说明的是: 以上实施例仅用以说明本发明的技术方案, 而非对其 限制; 尽管参照前述实施例对本发明进行了详细的说明, 本领域的普通技术 人员应当理解: 其依然可以对前述各实施例所记载的技术方案进行修改, 或 者对其中部分技术特征进行等同替换; 而这些修改或者替换, 并不使相应技 术方案的本质脱离本发明各实施例技术方案的精神和范围。

Claims

权利 要求 书
1、 一种多投影拼接几何校正方法, 其特征在于, 包括: 获取摄像机到 3D弧幕的映射关系;
根据投影仪到摄像机的映射关系、 所述摄像机到 3D弧幕的映射关系以及 3D 弧幕到输入超分辨率图像映射关系, 得到投影仪帧緩存图像变换映射表; 根据所述投影仪帧緩存图像变换映射表, 对投影仪要投影的图像进行几何 配准校正。
2、 根据权利要求 1所述的方法, 其特征在于, 所述获取摄像机到 3D弧幕的 映射关系, 包括: 根据摄像机图像与虚拟摄像机图像的映射关系, 以及虚拟摄像机图像与虚 拟 2D平幕的映射关系, 得到摄像机与虚拟 2D平幕的映射关系; 根据所述摄像机与虚拟 2D平幕的映射关系,以及虚拟 2D平幕与 3D弧幕的映 射关系, 得到摄像机到 3D弧幕的映射关系。
3、 根据权利要求 1所述的方法, 其特征在于, 所述获取摄像机到 3D弧幕的 映射关系, 包括:
根据对 3D弧幕的拍摄图像, 及所述 3D弧幕的几何信息, 获取摄像机内外参 数;
根据所述摄像机内外参数及所述 3D弧幕的 2D坐标,得到摄像机到 3D弧幕的 映射关系。
4、 根据权利要求 3所述的方法, 其特征在于, 所述 3D弧幕的几何信息为已 知信息。
5、 根据权利要求 3或 4所述的方法, 其特征在于, 所述根据对 3D弧幕的拍 摄图像, 及所述 3D弧幕的几何信息, 获取摄像机内外参数, 包括:
根据所述拍摄图像中上下曲线的点的数目, 对所述 3D弧幕的上下曲线进行 等间隔采样, 并建立拍摄图像的点与 3D弧幕的采样点之间的对应关系; 根据设定的初始的摄像机的内外参数及所述 3D弧幕的几何信息得到所述
3D弧幕采样点对应的 2D坐标, 并获取拍摄图像中点的 2D坐标与 3D弧幕的采样 点对应的 2 D坐标的误差和;
如果所述误差和不满足精度要求, 则更新所述初始的摄像机的内外参数, 并采用更新后的摄像机的内外参数重新计算所述 3D弧幕的采样点对应的 2D坐 标, 直至所述误差和满足精度要求;
获取误差和满足精度要求时对应的摄像机的内外参数。
6、 根据权利要求 3或 4所述的方法, 其特征在于, 所述根据对 3D弧幕的拍 摄图像, 及所述 3D弧幕的几何信息, 获取摄像机内外参数, 包括:
根据所述 3D弧幕的几何信息及设定的初始的摄像机的内外参数, 计算 3D弧 幕的曲线对应的 2D曲线;
计算 3D弧幕的曲线对应的 2D曲线与拍摄图像的曲线之间的距离;
如果所述距离不满足精度要求, 则更新所述初始的摄像机的内外参数, 并 采用更新后的摄像机的内外参数重新计算 3D弧幕的曲线对应的 2D曲线, 直至所 述距离满足精度要求;
获取距离满足精度要求时对应的摄像机的内外参数。
7、 一种校正装置, 其特征在于, 包括:
获取模块, 用于获取摄像机到 3D弧幕的映射关系;
确定模块, 用于根据投影仪到摄像机的映射关系、 所述获取模块得到的所 述摄像机到 3D弧幕的映射关系, 以及 3D弧幕到输入超分辨率图像映射关系, 得 到投影仪帧緩存图像变换映射表;
校正模块, 用于根据所述投影仪帧緩存图像变换映射表, 对投影仪要投影 的图像进行几何配准校正。
8、 根据权利要求 7所述的装置, 其特征在于, 所述获取模块包括第一单元, 所述第一单元用于: 根据摄像机图像与虚拟摄像机图像的映射关系, 以及虚拟摄像机图像与虚 拟 2D平幕的映射关系, 得到摄像机与虚拟 2D平幕的映射关系;
根据所述摄像机与虚拟 2D平幕的映射关系,以及虚拟 2D平幕与 3D弧幕的映 射关系, 得到摄像机到 3D弧幕的映射关系。
9、 根据权利要求 7所述的装置, 其特征在于, 所述获取模块包括第二单元, 所述第二单元用于:
根据对 3D弧幕的拍摄图像, 及所述 3D弧幕的几何信息, 获取摄像机内外参 数;
根据所述摄像机内外参数及所述 3D弧幕的 2D坐标,得到摄像机到 3D弧幕的 映射关系。
10、根据权利要求 9所述的装置, 其特征在于, 所述第二单元获取的所述 3D 弧幕的几何信息为已知信息。
1 1、根据权利要求 9或 10所述的装置,其特征在于,所述第二单元具体用于: 根据所述拍摄图像中上下曲线的点的数目, 对所述 3D弧幕的上下曲线进行 等间隔采样, 并建立拍摄图像的点与 3D弧幕的采样点之间的对应关系;
根据设定的初始的摄像机的内外参数及所述 3D弧幕的几何信息得到所述
3D弧幕采样点对应的 2D坐标, 并获取拍摄图像中点的 2D坐标与 3D弧幕的采样 点对应的 2 D坐标的误差和;
如果所述误差和不满足精度要求, 则更新所述初始的摄像机的内外参数, 并采用更新后的摄像机的内外参数重新计算所述 3D弧幕的采样点对应的 2D坐 标, 直至所述误差和满足精度要求;
获取误差和满足精度要求时对应的摄像机的内外参数; 并
根据所述摄像机内外参数及所述 3D弧幕的 2D坐标,得到摄像机到 3D弧幕的 映射关系。
12、根据权利要求 9或 10所述的装置,其特征在于,所述第二单元具体用于: 根据所述 3D弧幕的几何信息及设定的初始的摄像机的内外参数, 计算 3D弧 幕的曲线对应的 2D曲线;
计算 3D弧幕的曲线对应的 2D曲线与拍摄图像的曲线之间的距离; 如果所述距离不满足精度要求, 则更新所述初始的摄像机的内外参数, 并 采用更新后的摄像机的内外参数重新计算 3D弧幕的曲线对应的 2D曲线, 直至所 述距离满足精度要求;
获取距离满足精度要求时对应的摄像机的内外参数; 并
根据所述摄像机内外参数及所述 3D弧幕的 2D坐标,得到摄像机到 3D弧幕的 映射关系。
PCT/CN2012/077294 2011-06-22 2012-06-21 多投影拼接几何校正方法及校正装置 WO2012175029A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201110169655.8 2011-06-22
CN201110169655.8A CN102841767B (zh) 2011-06-22 2011-06-22 多投影拼接几何校正方法及校正装置

Publications (1)

Publication Number Publication Date
WO2012175029A1 true WO2012175029A1 (zh) 2012-12-27

Family

ID=47369177

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/077294 WO2012175029A1 (zh) 2011-06-22 2012-06-21 多投影拼接几何校正方法及校正装置

Country Status (2)

Country Link
CN (1) CN102841767B (zh)
WO (1) WO2012175029A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110717936A (zh) * 2019-10-15 2020-01-21 哈尔滨工业大学 一种基于相机姿态估计的图像拼接方法
CN111062869A (zh) * 2019-12-09 2020-04-24 北京东方瑞丰航空技术有限公司 一种面向曲面幕的多通道校正拼接的方法
CN111429516A (zh) * 2020-03-23 2020-07-17 上海眼控科技股份有限公司 车架号的角点定位方法、装置、计算机设备及存储介质
CN115314690A (zh) * 2022-08-09 2022-11-08 北京淳中科技股份有限公司 一种图像融合带处理方法、装置、电子设备及存储介质

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9582864B2 (en) * 2014-01-10 2017-02-28 Perkinelmer Cellular Technologies Germany Gmbh Method and system for image correction using a quasiperiodic grid
CN104778694B (zh) * 2015-04-10 2017-11-14 北京航空航天大学 一种面向多投影拼接显示的参数化自动几何校正方法
CN105043251B (zh) * 2015-06-01 2017-09-29 河北工业大学 一种基于机械运动的线结构光传感器的标定方法与装置
CN107277380B (zh) * 2017-08-16 2020-10-30 成都极米科技股份有限公司 一种变焦方法及装置
CN107657658A (zh) * 2017-09-26 2018-02-02 安徽美图信息科技有限公司 一种基于web与三维模型相结合的虚拟展厅展示系统
CN112118435B (zh) * 2020-08-04 2021-06-25 山东大学 面向异形金属屏幕的多投影融合方法及系统
CN112184662B (zh) * 2020-09-27 2023-12-15 成都数之联科技股份有限公司 应用于无人机图像拼接中的相机外参数初始方法及系统
CN112734860B (zh) * 2021-01-15 2021-09-21 中国传媒大学 一种基于弧形幕先验信息的逐像素映射投影几何校正方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101116049A (zh) * 2005-02-10 2008-01-30 有限会社策划设计工程 指示器光跟踪方法、程序及其记录媒体
JP2008145594A (ja) * 2006-12-07 2008-06-26 Sony Ericsson Mobilecommunications Japan Inc 画像表示処理方法および画像表示装置
CN101572787A (zh) * 2009-01-04 2009-11-04 四川川大智胜软件股份有限公司 基于计算机视觉精密测量多投影视景自动几何校正和拼接方法
CN101968890A (zh) * 2009-07-27 2011-02-09 西安费斯达自动化工程有限公司 基于球面显示的360°全景仿真系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101188020A (zh) * 2007-12-20 2008-05-28 四川川大智胜软件股份有限公司 投影仪投影图象与计算机帧缓存图象之间象素几何位置对应关系精确获取方法
CN101621701B (zh) * 2009-01-04 2010-12-08 四川川大智胜软件股份有限公司 独立于几何校正的任意光滑曲面屏幕多投影仪显示墙色彩校正方法
CN101815188A (zh) * 2009-11-30 2010-08-25 四川川大智胜软件股份有限公司 一种非规则光滑曲面显示墙多投影仪图像画面校正方法
CN101916175B (zh) * 2010-08-20 2012-05-02 浙江大学 自适应于投影表面的智能投影方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101116049A (zh) * 2005-02-10 2008-01-30 有限会社策划设计工程 指示器光跟踪方法、程序及其记录媒体
JP2008145594A (ja) * 2006-12-07 2008-06-26 Sony Ericsson Mobilecommunications Japan Inc 画像表示処理方法および画像表示装置
CN101572787A (zh) * 2009-01-04 2009-11-04 四川川大智胜软件股份有限公司 基于计算机视觉精密测量多投影视景自动几何校正和拼接方法
CN101968890A (zh) * 2009-07-27 2011-02-09 西安费斯达自动化工程有限公司 基于球面显示的360°全景仿真系统

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110717936A (zh) * 2019-10-15 2020-01-21 哈尔滨工业大学 一种基于相机姿态估计的图像拼接方法
CN110717936B (zh) * 2019-10-15 2023-04-28 哈尔滨工业大学 一种基于相机姿态估计的图像拼接方法
CN111062869A (zh) * 2019-12-09 2020-04-24 北京东方瑞丰航空技术有限公司 一种面向曲面幕的多通道校正拼接的方法
CN111429516A (zh) * 2020-03-23 2020-07-17 上海眼控科技股份有限公司 车架号的角点定位方法、装置、计算机设备及存储介质
CN115314690A (zh) * 2022-08-09 2022-11-08 北京淳中科技股份有限公司 一种图像融合带处理方法、装置、电子设备及存储介质
CN115314690B (zh) * 2022-08-09 2023-09-26 北京淳中科技股份有限公司 一种图像融合带处理方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN102841767B (zh) 2015-05-27
CN102841767A (zh) 2012-12-26

Similar Documents

Publication Publication Date Title
WO2012175029A1 (zh) 多投影拼接几何校正方法及校正装置
WO2020001168A1 (zh) 三维重建方法、装置、设备和存储介质
US9195121B2 (en) Markerless geometric registration of multiple projectors on extruded surfaces using an uncalibrated camera
Sajadi et al. Auto-calibration of cylindrical multi-projector systems
TWI719493B (zh) 投影系統、投影裝置以及其顯示影像的校正方法
TWI476729B (zh) Dimensional image and three - dimensional model of the combination of the system and its computer program products
CN110070598B (zh) 用于3d扫描重建的移动终端及其进行3d扫描重建方法
JP5558973B2 (ja) 画像補正装置、補正画像生成方法、補正テーブル生成装置、補正テーブル生成方法、補正テーブル生成プログラムおよび補正画像生成プログラム
TW201808000A (zh) 投影機的影像校正方法及影像校正系統
JP2019008286A (ja) 投影システム及び表示画像の補正方法
KR101988372B1 (ko) 사진 이미지를 이용한 3차원 건축물 모델 역설계 장치 및 방법
WO2019089822A1 (en) Modeling indoor scenes based on digital images
JPWO2018235163A1 (ja) キャリブレーション装置、キャリブレーション用チャート、チャートパターン生成装置、およびキャリブレーション方法
WO2021082264A1 (zh) 一种基于双目视觉的投影图像自动校正方法及系统
CN112734860B (zh) 一种基于弧形幕先验信息的逐像素映射投影几何校正方法
WO2016018392A1 (en) Three dimensional scanning system and framework
Sajadi et al. Autocalibrating tiled projectors on piecewise smooth vertically extruded surfaces
JP2015219679A (ja) 画像処理システム、情報処理装置、プログラム
WO2023035841A1 (zh) 用于图像处理的方法、装置、设备、存储介质和程序产品
Sajadi et al. Markerless view-independent registration of multiple distorted projectors on extruded surfaces using an uncalibrated camera
US20200294269A1 (en) Calibrating cameras and computing point projections using non-central camera model involving axial viewpoint shift
TW202101372A (zh) 基於全景影像內控制區的紋理座標調整方法
WO2018113339A1 (zh) 一种投影图构建方法及装置
TWI517094B (zh) 影像校正方法及影像校正電路
US8149260B2 (en) Methods and systems for producing seamless composite images without requiring overlap of source images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12801806

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12801806

Country of ref document: EP

Kind code of ref document: A1