US20080285843A1 - Camera-Projector Duality: Multi-Projector 3D Reconstruction - Google Patents
- Publication number
- US20080285843A1 (application US 12/121,056)
- Authority
- US
- United States
- Prior art keywords
- projector
- camera
- projectors
- calibration data
- calibrating
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
Definitions
- the present invention relates generally to computer vision methods and systems of three-dimensional scene reconstruction, and in particular to a projector-based three-dimensional scene reconstruction system using camera-projector duality.
- Traditional three-dimensional (3D) scene reconstruction algorithms measure distance to objects in the scene by analyzing the triangles constructed by the projection rays of two optical systems.
- Traditional stereo systems use two cameras to find two projection rays from the two camera centers which meet at the object surface being measured.
- Space-time stereo systems use two cameras and a projector or a time-varying light resource for 3D scene reconstruction.
- Multi-view stereo systems use multiple cameras for 3D scene reconstruction.
- Traditional stereo systems, e.g., space-time stereo and multi-view stereo systems, require solving a correspondence problem: determining which pairs of points in two images captured by the camera(s) are projections of the same point on the object being processed. The process of solving the correspondence problem is generally referred to as epipolar search, and it is complex and computationally expensive.
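The scanline search that the epipolar problem reduces to in the rectified two-camera case can be sketched as follows; the window size, the sum-of-squared-differences cost, and all names here are illustrative assumptions, not details from the patent:

```python
# Illustrative sketch of the correspondence search described above, for the
# rectified case where epipolar lines coincide with image rows: each pixel in
# the left row is matched by scanning the entire right row, which is what
# makes the search computationally expensive.

def best_match(left_row, right_row, x, half_window=1):
    """Find the column in right_row whose window best matches left_row around x (SSD)."""
    def ssd(xr):
        return sum((left_row[x + k] - right_row[xr + k]) ** 2
                   for k in range(-half_window, half_window + 1))
    candidates = range(half_window, len(right_row) - half_window)
    return min(candidates, key=ssd)

# Toy rows: the right row is the left row shifted by a disparity of 2
left = [0, 0, 9, 5, 7, 0, 0, 0]
right = [9, 5, 7, 0, 0, 0, 0, 0]
x = 3                      # pixel of interest in the left row
xr = best_match(left, right, x)
print(x - xr)  # recovered disparity: 2
```

Every left-row pixel triggers a full sweep of the right row; a structured-light or projector-based system avoids this sweep because the projected pattern identifies the correspondence directly.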
- the camera calibration process estimates the internal and/or external parameters of a camera that affect the 3D scene reconstruction.
- the internal parameters of a camera such as camera focal length, optical center location, skewness of image pixels, and radial distortion of the lens, are often called intrinsic camera parameters.
- the external parameters of a camera define the location and orientation of a camera with respect to some world coordinate system.
- the external parameters of a camera are often called extrinsic camera parameters.
- Two calibrated optical systems with respect to a known reference coordinate system such as two calibrated cameras, or one calibrated camera and one calibrated projector, are related to each other by a translation matrix and a rotation matrix which map a pixel point in one optical system to a corresponding pixel point in the other optical system.
- Camera-camera calibration calculates the translation and rotation matrices between the calibrated cameras.
- camera-projector calibration calculates the translation and rotation matrices between the calibrated projector and camera.
- FIG. 1A is a block diagram illustrating a traditional space-time stereo 3D scene reconstruction system.
- the reconstruction system illustrated in FIG. 1A comprises a pair of cameras, 102 a and 102 b, a projector 104 and a camera-camera calibration module 106 .
- the projector 104 projects light patterns to disambiguate correspondence between a pair of projection rays, resulting in a unique match between pixels captured in both cameras 102 .
- a reference coordinate system for reconstruction is defined in one of the cameras 102 .
- the projector 104 need not be calibrated with the cameras 102 nor defined in the stereo rig of the two cameras 102 .
- the camera calibration module 106 calibrates the camera 102 and generates extrinsic camera parameters with respect to the reference coordinate system.
- the camera-camera calibration module 106 also calculates the rotation and translation matrices between the cameras 102 using the intrinsic and extrinsic camera parameters of the cameras 102 .
- the camera calibration module 106 may use any existing camera calibration algorithm which is readily known to a person of ordinary skill in the art, such as photogrammetric calibration or self-calibration.
- the rotation and translation matrices between the cameras 102 obtained by camera calibration are needed. Additionally, the computationally expensive epipolar search for correspondence described above is needed for the reconstruction.
- FIG. 1C is a block diagram illustrating a traditional multi-view stereo 3D scene reconstruction system.
- the reconstruction system in FIG. 1C includes three calibrated cameras 102 a - 102 c and two camera-camera calibration modules 106 a and 106 b. More calibrated cameras 102 and camera-camera calibration modules 106 may be used for multi-view reconstruction.
- the camera-camera calibration module 106 a calibrates the cameras 102 a and 102 b and generates the rotation and translation matrices between the calibrated cameras 102 a and 102 b.
- the camera-camera calibration module 106 b calibrates the cameras 102 b and 102 c.
- Both the traditional space-time stereo reconstruction system illustrated in FIG. 1A and the multi-view stereo reconstruction system in FIG. 1C require the computationally expensive epipolar search for correspondence for 3D scene reconstruction.
- Traditional structured light systems for 3D scene reconstruction employ one camera and a projector. Light travels from the projector, reflects off the surface of the object, and reaches the camera. Instead of finding two corresponding incoming projection rays, a structured light algorithm projects specific light pattern(s) onto the surface of an object and, from the observed patterns, figures out which projection ray is reflected on the surface of the object in the scene and reaches the camera.
- Traditional structured light systems require the projected patterns to be well differentiated from the other objects and ambient light falling on the surface of the object being processed. This requirement often translates into a requirement for a high powered and well focused projected light.
- FIG. 3A is a block diagram illustrating a structured light reconstruction method running in a traditional structured light 3D scene reconstruction system.
- the reconstruction system in FIG. 3A includes a camera 102 , a reference coordinate system 308 defined in the camera 102 , a projector 104 , a camera-projector calibration module 116 a and an object being processed 302 .
- the camera-projector calibration module 116 a calibrates the camera 102 and the projector 104 .
- the camera-projector module 116 a may use any existing projector calibration method that is known to a person of ordinary skill in the art, such as P. Song, "A theory for photometric self-calibration of multiple overlapping projectors and cameras," IEEE International Workshop on Projector-Camera Systems, 2005, which is incorporated by reference herein in its entirety.
- the projector 104 projects a plane 306 a of a calibration pattern onto the object 302 , and the camera 102 observes a reflected ray 306 b from the object 302 .
- the reconstruction system in FIG. 3A identifies which projected ray is reflected on the surface of the object 302 and reached the camera 102 .
- the system identifies that an observed point 304 on the surface of the object 302 is the intersection point from the projection plane 306 a projected from the projector 104 and the viewing ray 306 b from the camera 102 .
- the reconstruction system in FIG. 3A needs to know the camera 102 position relative to the projector 104 , i.e., the rotation and translation matrices between the camera 102 and the projector 104 , for the reconstruction.
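The geometric core of this structured-light step, intersecting a viewing ray with a projected plane, can be sketched as follows; this is a minimal illustration with hypothetical names and a hypothetical frame choice, not the patent's implementation:

```python
# Sketch of intersecting a viewing ray with a projected light plane, the
# operation that recovers the observed surface point (e.g., point 304).
# The plane is given by a point on it and its normal; both must already be
# expressed in the same coordinate system via the calibration data.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def intersect_ray_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the point where the ray meets the plane, or None if parallel."""
    denom = dot(plane_normal, ray_dir)
    if abs(denom) < 1e-12:
        return None  # ray never crosses the projected plane
    t = dot(plane_normal, [p - o for p, o in zip(plane_point, ray_origin)]) / denom
    return [o + t * d for o, d in zip(ray_origin, ray_dir)]

# Example: camera at the origin viewing along (1, 0, 1); projected plane x = 1
point = intersect_ray_plane([0, 0, 0], [1, 0, 1], [1, 0, 0], [1, 0, 0])
print(point)  # [1.0, 0.0, 1.0]
```

The calibration data (rotation and translation between camera and projector) is exactly what puts the ray and the plane into one common frame so this intersection is well defined.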
- a projector-based 3D scene reconstruction system exploits camera-projector duality and can use existing 3D scene reconstruction algorithms, e.g., structured light.
- a projector-based 3D scene reconstruction system calibrates the projectors in the system, and eliminates the need for the computationally expensive epipolar search for correspondence required by traditional 3D scene reconstruction algorithms.
- the projector-based reconstruction system has a wide range of applications to real-world problems, such as three-dimensional scanning systems, high-resolution multi-projector systems, natural and effective human-computer interfaces, and augmented reality.
- One embodiment of a disclosed system includes calibrating a plurality of projectors using camera-projector duality.
- the system includes a camera-projector calibration module and a projector-projector calibration module.
- the camera-projector calibration module is configured to calibrate a first projector with the camera and generate first camera-projector calibration data using camera-projector duality.
- the camera-projector calibration module is also configured to calibrate a second projector with the camera and generate second camera-projector calibration data.
- the projector-projector calibration module is configured to calibrate the first and the second projector using the first and the second camera-projector calibration data.
- FIG. 1A is a block diagram illustrating a traditional space-time stereo 3D scene reconstruction system.
- FIG. 1B is a block diagram illustrating a stereo-projector 3D scene reconstruction system according to one embodiment.
- FIG. 1C is a block diagram illustrating a traditional multi-view stereo 3D scene reconstruction system.
- FIG. 1D is a block diagram illustrating an exemplary multi-projector 3D scene reconstruction system according to one embodiment.
- FIG. 2 is a flowchart illustrating a projector-based 3D scene reconstruction method in a projector-based 3D scene reconstruction system according to one embodiment.
- FIG. 3A is a block diagram illustrating a structured light reconstruction method running in a traditional structured light 3D scene reconstruction system.
- FIG. 3B is an exemplary block diagram illustrating a projector-based reconstruction method running in a stereo-projector 3D scene reconstruction system according to one embodiment.
- FIG. 4 is a flowchart illustrating a camera-projector calibration method using camera-projector duality according to one embodiment.
- FIG. 5 is a flowchart illustrating a projector-projector calibration method using camera-projector duality according to one embodiment.
- FIG. 6 is an exemplary block diagram illustrating a projector-based reconstruction method running in a multi-projector 3D scene reconstruction system according to one embodiment.
- FIG. 7 is a flowchart illustrating a projector-based 3D scene reconstruction method in a multi-projector 3D scene reconstruction system according to one embodiment.
- Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by a variety of operating systems.
- the present invention also relates to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a computer readable storage medium, such as, but is not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
- the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- FIG. 1B is a block diagram illustrating a stereo-projector 3D scene reconstruction system according to one embodiment.
- the reconstruction system illustrated in FIG. 1B is the projector-based counterpart of the traditional space-time stereo reconstruction system illustrated in FIG. 1A .
- the two calibrated cameras 102 a and 102 b in FIG. 1A are replaced by a pair of calibrated projectors 104 a and 104 b, while the projector 104 in FIG. 1A is replaced by a camera 102 .
- the reconstruction system in FIG. 1B also includes a projector-projector calibration module 126 , which calibrates the projectors 104 using camera-projector duality.
- the result from the projector-projector calibration by the calibration module 126 is the rotation and translation matrices between the projectors 104 .
- the projector-projector calibration module 126 includes a memory unit 20 , a microprocessor 22 , a camera-projector calibration unit 24 and a projector-projector calibration unit 26 .
- the memory unit 20 and microprocessor 22 may reside outside the projector-projector calibration module 126 .
- the rotation and translation matrices between two calibrated optical systems are referred to as calibration data between the two calibrated optical systems from herein and throughout the entire specification.
- the camera 102 in the projector-based reconstruction system illustrated in FIG. 1B is used to identify the projected rays reflected on an object surface in a projector-based reconstruction system, and the camera 102 can move freely around the scene to be reconstructed after projector calibration.
- the reference coordinate system for reconstruction is defined in one of the projectors 104 , not in the camera 102 as illustrated in FIG. 1A . Consequently, in a projector-based reconstruction system as in FIG. 1B , neither extrinsic camera parameters nor the computationally expensive epipolar search for correspondence is required for reconstruction. Further detail on projector calibration using camera-projector duality is presented in connection with the discussion of FIGS. 3B , 4 and 5 .
- FIG. 1D is a block diagram illustrating an exemplary multi-projector 3D scene reconstruction system according to one embodiment.
- the reconstruction system illustrated in FIG. 1D is the projector-based counterpart of the traditional multi-view stereo reconstruction system illustrated in FIG. 1C .
- the three calibrated cameras 102 a, 102 b and 102 c in FIG. 1C are replaced by three calibrated projectors 104 a, 104 b and 104 c.
- the reconstruction system in FIG. 1D also includes one or more cameras 102 and projector-projector calibration modules 126 as described in FIG. 1B .
- only one camera 102 is included for simplicity of description of an embodiment, and more cameras can be used in other embodiments.
- the projector-projector calibration module 126 a calibrates the projector 104 a and the projector 104 b using camera projector duality, and generates the calibration data between the projectors 104 a - b.
- the projector-projector calibration module 126 b similarly calibrates the projectors 104 b - c and generates the calibration data between the projectors 104 b - c.
- the projector-based reconstruction system illustrated in FIG. 1D does not need the intrinsic and extrinsic parameters of the camera 102 for the reconstruction process after projector calibration.
- the camera 102 freely moves around the scene to read the projected patterns from a plurality of directions in the reconstruction system.
- the reference coordinate system for reconstruction can be attached to one of the projectors 104 . Consequently, reconstruction is simple and fast because all projected ray intersection occurs in the same reference coordinate system defined in a projector 104 .
- the computationally expensive epipolar search for correspondence required for reconstruction in a traditional multi-view stereo reconstruction system is no longer needed in the multi-projector based reconstruction system because the intrinsic and extrinsic parameters of the camera 102 are not needed for the reconstruction, and projection rays and planes are all from the projectors 104 .
- FIG. 2 is a flowchart illustrating a projector-based 3D scene reconstruction method in a projector-based 3D scene reconstruction system according to one embodiment.
- the projector-based reconstruction system calibrates 202 projectors used in the system using camera-projector duality. Each calibrated projector projects 204 calibration patterns, such as known checkerboard patterns, onto an object in the scene being reconstructed.
- a camera captures 206 the projected calibration patterns from a plurality of directions.
- the reconstruction system identifies 208 the projection rays reflected from the object surface in the captured images. Once two projection rays from different projectors are identified, the projection rays are put in the reference coordinate system defined in one of the projectors in the system.
- the projector-based method recovers 210 the signature, i.e., identification, of each individual projection ray.
- the intersection point between the two projection rays becomes the reconstructed surface point of the object in the scene being reconstructed. Since the correspondence information between two rays is directly read from the captured images of the projected calibration patterns, the projector-based method does not need to perform the computationally expensive epipolar search for correspondence as in a conventional 3D scene reconstruction system. From the information obtained from the above steps, the projector-based reconstruction system reconstructs 212 the 3D scene using any existing 3D scene reconstruction algorithm, such as structured light. Further detail on the steps of identifying projection rays and recovering the signatures of projection rays is presented in connection with the discussion of FIG. 3B .
- the projector-based reconstruction system described above needs to calibrate two or more projectors using camera-projector duality (e.g., step 202 in FIG. 2 ).
- Cameras and projectors are dual to each other because they closely resemble each other in structure and function, except that light travels in opposite directions.
- a camera measures the amount of light arriving on a pixel of a scene from a direction, whereas a projector projects a specified amount of light to the pixel of the scene in the opposite direction.
- the same format (or types) of intrinsic parameters can be used for both the camera and the projector because both devices use a 2D pixel array and a lens assembly. Therefore, the projector-based reconstruction system modifies the conventional camera calibration mechanisms used in stereo camera systems for calibrating projectors.
- the projector-projector calibration may be implemented in two sequential steps, first by calibrating each individual projector with a camera by camera-projector calibration using camera-projector duality, followed by projector-projector calibration using camera-projector calibration data.
- FIG. 4 is a flowchart illustrating a camera-projector calibration method using camera-projector duality according to one embodiment.
- FIG. 5 is a flowchart of a projector-projector calibration method using camera-projector calibration data according to one embodiment.
- the camera-projector calibration unit 24 of the projector-projector calibration module 126 obtains 402 the intrinsic parameters of the camera using a conventional camera calibration mechanism known to a person of ordinary skill in the art.
- a projector in the stereo-projector reconstruction system projects 404 calibration patterns with a selected color channel, such as red color, onto a calibration board.
- the calibration pattern is a checkerboard pattern and the checkerboard pattern is printed on the calibration board in yellow color.
- the camera used in the reconstruction system captures 406 the images of the scene being reconstructed.
- Each image captured by the camera includes the projected patterns onto the calibration board. For example, a sample captured image is an image with red projected calibration pattern on the calibration board where yellow calibration pattern is printed.
- the camera-projector calibration unit 24 detects 408 in parallel the corner points of the calibration patterns, one for the printed patterns, i.e., 408 a, and the other for the projected patterns, i.e., 408 b. From the printed patterns, the calibration unit 24 computes 410 the extrinsic parameters of the calibration board, i.e., the position and orientation of the calibration board in a captured image. From the projected patterns and the extrinsic parameters of the calibration board, the calibration unit 24 computes 412 the camera-projector calibration data between the camera and the projector in addition to the intrinsic parameters of the projector being calibrated. The calibration unit 24 repeats the above steps to calibrate each projector used in the system. The result of such camera-projector calibration is a plurality of camera-projector calibration data.
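One plausible way to perform the parallel detection of steps 408 a - b, assuming the red projected pattern registers mainly in the red channel while the yellow printed pattern also appears in the green channel, is a simple per-channel threshold. The thresholds, helper names, and the channel-separation rule itself are illustrative assumptions, not the patent's stated method:

```python
# Separating the two overlaid checkerboards of a captured calibration image:
# with a red projected pattern on a yellow printed pattern, the green channel
# responds mainly to the printed pattern, while red minus green responds
# mainly to the projected light (yellow reflects red and green about equally).

def split_patterns(rgb_image):
    """rgb_image: rows of (r, g, b) tuples. Returns (printed, projected) boolean masks."""
    printed, projected = [], []
    for row in rgb_image:
        printed.append([g > 128 for (r, g, b) in row])          # printed yellow pattern
        projected.append([(r - g) > 64 for (r, g, b) in row])   # projected red pattern
    return printed, projected

# Toy 1x4 image: bare board, yellow print, red light on board, red light on print
image = [[(250, 250, 250), (240, 230, 40), (250, 60, 40), (250, 180, 40)]]
printed, projected = split_patterns(image)
print(printed[0])    # [True, True, False, True]
print(projected[0])  # [False, False, True, True]
```

Corner detection would then run independently on each mask, which is what lets the printed and projected patterns be processed in parallel.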
- FIG. 5 is a flowchart illustrating a projector-projector calibration method using camera-projector duality in a stereo-projector reconstruction system according to one embodiment.
- the projector-projector calibration unit 26 of the projector-projector calibration module 126 receives 502 / 504 the camera-projector calibration data between the camera and projector 1 , and between the camera and projector 2 .
- the calibration unit 26 then calibrates the projector 1 and projector 2 using the received calibration data.
- the result of the calibration from the projector-projector calibration module 126 is the calibration data between the projector 1 and projector 2 , e.g., the rotation and translation matrices between the projector 1 and projector 2 .
- Let R 1 and R 2 be the rotation matrices which relate projector 1 and projector 2 to the camera, and t 1 and t 2 the corresponding translation vectors:
- x c =R 1 x 1 +t 1
- x c =R 2 x 2 +t 2
- where x 1 and x 2 are points in the coordinate systems of the respective projectors and x c is a point in the camera coordinate system.
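Eliminating x c between the two camera-projector relations x c = R1 x1 + t1 and x c = R2 x2 + t2 gives the projector-projector calibration data directly: x2 = R2^T (R1 x1 + t1 - t2), i.e. R12 = R2^T R1 and t12 = R2^T (t1 - t2). A dependency-free sketch of this computation follows; the helper names are illustrative:

```python
# Projector-projector calibration from two camera-projector calibrations:
# given x_c = R1 x1 + t1 and x_c = R2 x2 + t2, the pose of projector 1 in
# projector 2's frame is R12 = R2^T R1 and t12 = R2^T (t1 - t2).
# Plain-list 3x3 helpers keep the sketch free of external dependencies.

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def projector_to_projector(R1, t1, R2, t2):
    """Rotation and translation mapping projector-1 points into projector-2's frame."""
    R2t = transpose(R2)
    R12 = matmul(R2t, R1)
    t12 = matvec(R2t, [a - b for a, b in zip(t1, t2)])
    return R12, t12

# Example: projector 1 coincides with the camera; projector 2 is shifted by (1, 0, 0)
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
R12, t12 = projector_to_projector(I, [0, 0, 0], I, [1, 0, 0])
print(t12)  # [-1, 0, 0]: projector 1 sits at x = -1 in projector 2's frame
```

This is the step performed by the projector-projector calibration unit 26: once each projector is calibrated against the same camera, the camera drops out and only the projector-projector rotation and translation remain.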
- the projector-based reconstruction system described above identifies the projection rays reflected from an object surface to the captured images by a camera (e.g., step 208 in FIG. 2 ).
- identifying projection rays is the process of tagging a projection ray with a distinguishable signature or identification and, from the image captured by a camera, recovering the signature of the projection ray that arrived at the surface point of the object corresponding to that pixel in the captured image.
- a projection ray identification module uses time sequential binary codes with phase shift to find the projection ray signature.
- a projector can generate many levels of brightness per pixel of an image, but it is not desirable to rely on the magnitude of the projected light for ray identification, especially when a camera may capture the reflected light on an object surface whose reflectance property is unknown.
- the projector-based reconstruction system uses binary codes, i.e., two or three levels of brightness of the projected light, for robust detection of projection rays.
- the reconstruction system multiplexes the ray signature to multiple channels.
- the multiplexed ray signatures are then reassembled after capturing.
- the channels may be multiple consecutive frames.
- Other embodiments may use spatial neighborhoods or spectral bands, such as colors, as multiple channels.
- the projector-based reconstruction system also multiplexes the ray signatures in time. Time-multiplexing is simple but very effective if the scene to be reconstructed is static. Bits of signatures for all projection rays are built as patterns and projected onto the scene being reconstructed. From the captured images, the projector-based reconstruction system detects the bits and assembles the bits into ray signatures. This approach allows spatially dense coding for ray identification. To deal with reconstruction of a scene that is changing, in one embodiment, the projector-based reconstruction system uses time-multiplexing with phase shift for ray identification detection.
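One common realization of such time-sequential binary signatures is a Gray-code scheme, in which neighboring projector columns differ in exactly one projected frame; the patent does not mandate Gray codes, so this sketch is an illustrative assumption:

```python
# Time-multiplexed binary ray signatures: each projector column is tagged by
# a Gray-coded bit sequence projected over successive frames; a camera pixel's
# observed bit sequence is reassembled into the column index that lit it.

def gray_encode(column):
    return column ^ (column >> 1)

def gray_decode(code):
    column = 0
    while code:
        column ^= code
        code >>= 1
    return column

def patterns(num_columns, num_bits):
    """One binary pattern per frame; patterns[f][x] is the bit projected by column x in frame f."""
    return [[(gray_encode(x) >> bit) & 1 for x in range(num_columns)]
            for bit in reversed(range(num_bits))]

def recover_signature(observed_bits):
    """Reassemble the per-frame bits seen at one camera pixel into a column index."""
    code = 0
    for b in observed_bits:
        code = (code << 1) | b
    return gray_decode(code)

frames = patterns(num_columns=8, num_bits=3)
seen = [frames[f][5] for f in range(3)]  # bits observed at a pixel lit by column 5
print(recover_signature(seen))  # 5
```

Because only two light levels are used per frame, the decoding is robust to unknown surface reflectance, which is exactly the motivation given above for binary codes.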
- FIG. 3B is an exemplary block diagram illustrating a projector-based reconstruction method running in a stereo-projector 3D scene reconstruction system according to one embodiment.
- the reconstruction system in FIG. 3B includes two calibrated projectors 104 a and 104 b, a reference coordinate system 308 defined in the projector 104 b, a camera 102 and an object 302 being processed.
- the reconstruction system in FIG. 3B also includes a projector-projector calibration module 126 and a 3D reconstruction module 118 .
- the projector-projector calibration module 126 calibrates the projectors 104 using camera-projector duality, and calculates the calibration data between the projector 104 a and 104 b.
- the 3D reconstruction module 118 identifies the intersection point 304 from the ray 306 b projected by the projector 104 b and the projector plane 306 a from the projector 104 a on the surface of the object 302 .
- the projector-based reconstruction system in FIG. 3B treats rays in the projected plane 306 a as identical to each other, and only one ray in the plane 306 a is used for the reconstruction. It is noted that the projector-based reconstruction system does not need to know a priori which ray in the plane 306 a is to be used for reconstruction.
- A further analysis of projection ray identification in a projector-based reconstruction method is presented in FIG. 3B using the time-sequential binary codes with phase shift.
- the projector-based reconstruction method in FIG. 3B runs on one camera 102 and two rigidly attached projectors 104 a and 104 b which have been calibrated using camera-projector duality.
- the camera 102 reads the projected calibration patterns onto the scene being reconstructed.
- the 3D reconstruction module 118 identifies the intersection 304 of the projection ray 306 b from the projector 104 b and the projection plane 306 a from the other projector 104 a.
- Since the reference coordinate system 308 is defined in the projector 104 b, the projector 104 b is referred to as the reference projector.
- the reference projector 104 b projects ray-tagging pattern(s) and the other projector 104 a projects the plane-tagging pattern(s) onto the surface of the object 302 .
- the 3D reconstruction module 118 identifies the x- and y-coordinates of the ray-tagging pattern from the projector 104 b, e.g., the projection ray 306 b.
- For the projector 104 a, the 3D reconstruction module 118 only uses the x-coordinate of the plane-tagging pattern, e.g., the projection plane 306 a. From the image of the scene with the projected ray-tagging and plane-tagging patterns captured by the camera 102 , the 3D reconstruction module 118 recovers the signatures of two projection rays, one ray being 306 b and the other being a ray in the projection plane 306 a. For the pixels on the surface of the object 302 for which both projection rays are detected, the intersection of the two rays, such as the intersection point 304 in FIG. 3B , is computed using the calibration data between the two projectors 104 a and 104 b. The intersection point 304 is the recovered surface point of the object being reconstructed. The 3D reconstruction module 118 repeats the above steps to reconstruct each surface point of the object.
- the camera 102 can be used for reading the projected calibration patterns from the projectors 104 .
- the camera 102 does not need to be calibrated at all (i.e., neither intrinsic nor extrinsic parameters of the camera 102 are needed). Thus, the calibration data between the projectors 104 and the camera 102 are no longer needed in the reconstruction process after calibrating the projectors 104 , and the camera 102 can move freely around the object 302 as needed to capture the projected calibration patterns. All reconstruction results will reside in the reference coordinate system within a projector 104 .
- Another advantage of the projector-based reconstruction system illustrated in FIG. 3B is that the correspondence search over an epipolar line between two projected rays, or between a projection ray and a projection plane, is not necessary.
- Reading projected calibration patterns by the projector-based reconstruction system indicates which projection ray or plane from the projector 104 arrives at the surface of the object 302 , and the intersection of two projection rays or one projection ray and one projection plane, e.g., intersection point 304 , can be directly computed since the reference coordinate system for reconstruction 308 is defined in the projector 104 b.
- the projector-based reconstruction system needs the calibration data between projectors 104 including both intrinsic and extrinsic parameters of each projector 104 .
- the projector-based reconstruction system needs the projector 104 extrinsic parameters to locate the optical center and axis of the projector 104 , and the projector 104 intrinsic parameters to determine which direction the ray is going from the optical center of the projector 104 .
- the ID of the ray is decoded from the captured images according to the ray-tagging scheme described above.
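The ray geometry described above can be sketched in a few lines. The following is a minimal illustration, not the patent's implementation: a hypothetical `projector_ray` back-projects a decoded pixel ID through a projector's intrinsic matrix K and extrinsics (R, t), and `triangulate_rays` intersects two such rays in the shared reference frame. With noisy data the rays rarely meet exactly, so the midpoint of their common perpendicular stands in for the intersection point 304.

```python
import numpy as np

def projector_ray(K, R, t, u, v):
    """Back-project projector pixel (u, v) into a 3D ray.

    K is the 3x3 intrinsic matrix; R, t are the extrinsics mapping projector
    coordinates into the reference frame (x_ref = R @ x_proj + t), so t is
    the projector's optical center in the reference frame.
    """
    direction = R @ (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    return t, direction / np.linalg.norm(direction)

def triangulate_rays(o1, d1, o2, d2):
    """Return the midpoint of the shortest segment between two rays."""
    o1, d1, o2, d2 = (np.asarray(x, float) for x in (o1, d1, o2, d2))
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b            # zero only for parallel rays
    s = (b * e - c * d) / denom      # parameter along ray 1
    t = (a * e - b * d) / denom      # parameter along ray 2
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```

For two rays that truly intersect, e.g. one along the x-axis from the origin and one along the y-axis from (1, -1, 0), the midpoint reduces to the common point (1, 0, 0).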
- FIG. 6 is an exemplary block diagram illustrating a projector-based reconstruction method running in a multi-projector reconstruction system according to one embodiment.
- the multi-projector reconstruction system illustrated in FIG. 6 extends the stereo-projector system depicted in FIG. 3B to use additional projectors.
- projectors may project projection planes rather than projection rays as described in FIG. 3B. This is because projection patterns for projection planes are generally simpler and more robust to noise than patterns for projection rays in a multi-projector reconstruction environment.
- the reconstruction method is able to uniquely determine a point in 3D space.
- the projector-based reconstruction method illustrated in FIG. 6 calibrates each projector using the camera-projector duality described above. After the projector calibration, the reconstruction method no longer needs the extrinsic parameters of the camera or the computationally expensive epipolar line search for 3D correspondence required in conventional space-time multi-view 3D scene reconstruction systems, or the camera motion tracking required in conventional structured-light multi-view 3D scene reconstruction systems.
- the reconstruction system in FIG. 6, in one embodiment, includes three calibrated projectors 104 a, 104 b, and 104 c, one camera 102 and an object 302 being reconstructed.
- the reconstruction system also includes two projector-projector calibration modules 126 a - b and a 3D reconstruction module 118.
- the calibration data between the projector 104 a and the projector 104 c is omitted for simplicity of description of an embodiment; the calibration data between the projectors 104 a and 104 c can be similarly computed as for other projector pairs.
- the multi-projector reconstruction system may include multiple cameras 102 and more than three projectors 104 .
- the reference coordinate system for reconstruction 308 is defined in the projector 104 b in the embodiment described in FIG. 6 . In other embodiments, the reference coordinate system 308 may be defined in other projectors 104 .
- the calibrated projectors 104 a - 104 c project projection planes onto the surface of the object 302. From the intersection point on the surface of the object 302, the 3D reconstruction module 118 identifies each individual plane projected from the corresponding projector 104.
- each projector 104 projects a projection plane 606 onto the surface of the object 302 .
- the projector 104 a projects the plane 606 a, the projector 104 b projects the plane 606 b, and the projector 104 c projects the plane 606 c.
- the three planes 606 a - 606 c intersect at the intersection point 304 on the surface of the object 302.
- the 3D reconstruction module 118 in FIG. 6 identifies each projection plane 606 and its corresponding projector 104 from the intersection point 304.
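The intersection of the three tagged planes reduces to a small linear solve. This is an illustrative sketch rather than the patent's code: each plane 606 is assumed to be expressed in the reference coordinate system as a normal n and offset d with n . x = d, and the hypothetical `intersect_planes` recovers the surface point 304.

```python
import numpy as np

def intersect_planes(planes):
    """Intersect planes given as (n, d) pairs with n . x = d.

    Three independent planes determine the point exactly; with more
    planes (or noisy tags) this returns the least-squares intersection.
    """
    N = np.array([n for n, _ in planes], dtype=float)    # stacked normals
    d = np.array([dd for _, dd in planes], dtype=float)  # stacked offsets
    x, *_ = np.linalg.lstsq(N, d, rcond=None)
    return x
```

For example, the three axis-aligned planes x = 1, y = 2 and z = 3 intersect at the point (1, 2, 3).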
- FIG. 7 is a flowchart showing the use of a projector-based 3D scene reconstruction method in a multi-projector reconstruction system as described in FIG. 6 according to one embodiment.
- the projector-projector calibration module 126 calibrates 702 the camera 102 and a selected projector 104 , e.g., the projector 104 a, following the calibration steps such as those described in FIG. 4 , and stores 704 the camera-projector calibration data in a storage medium, such as a local cache memory.
- the projector-projector calibration module 126 checks 706 whether all projectors 104 in the reconstruction system have been calibrated with the camera 102.
- the projector-projector calibration module 126 calibrates 708 a pair of projectors as described in FIG. 5. In response to not all the projectors 104 being calibrated with the camera 102, the projector-projector calibration module 126 repeats the camera-projector calibration steps 702 - 706. For each projector-projector calibration, the projector-projector calibration module 126 further checks 710 whether all the projectors 104 have been calibrated with each other. Responsive to all the projectors 104 being calibrated, the 3D reconstruction module 118 constructs 712 the 3D scene following the 3D reconstruction steps, e.g., steps 202 - 214 described in FIG.
- the projector-projector calibration module 126 continues to calibrate 708 the projectors 104, followed by constructing 712 the 3D scene by the 3D reconstruction module 118.
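The calibrate-then-reconstruct flow above can be sketched as a short driver loop. All three callables below (`calibrate_with_camera`, `calibrate_pair`, `reconstruct`) are hypothetical placeholders standing in for the modules 126 and 118, not APIs from the patent:

```python
from itertools import combinations

def calibrate_and_reconstruct(projectors, calibrate_with_camera,
                              calibrate_pair, reconstruct):
    """Sketch of the FIG. 7 flow: camera-projector calibration for every
    projector (steps 702-706), pairwise projector-projector calibration
    chained from the stored data (708-710), then reconstruction (712)."""
    # Calibrate each projector against the camera and cache the result.
    cam_data = {p: calibrate_with_camera(p) for p in projectors}
    # Derive projector-projector calibration data for every pair.
    pair_data = {(p, q): calibrate_pair(cam_data[p], cam_data[q])
                 for p, q in combinations(projectors, 2)}
    # Reconstruct the scene in one projector's reference coordinate system.
    return reconstruct(pair_data)
```

The pairwise step visits each unordered projector pair exactly once, mirroring how the system only needs calibration data between pairs (and can omit redundant pairs, as noted for projectors 104 a and 104 c).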
- the projector-based 3D scene reconstruction system calibrates the projectors in the system, and eliminates the need for the computationally expensive epipolar search for correspondence required by traditional 3D scene reconstruction algorithms.
- the projector-based reconstruction system has a wide range of applications to real-world problems, such as three-dimensional scanning systems, self-calibrating multi-projector systems with higher resolution, natural and effective human-computer interfaces and augmented reality.
Abstract
Description
- This application claims priority from U.S. Patent Application No. 60/938,323, entitled “Camera-Projector Duality: Multi-projector 3D Reconstruction”, filed on May 16, 2007, which is hereby incorporated by reference in its entirety.
- The present invention relates generally to computer vision methods and systems of three-dimensional scene reconstruction, and in particular to a projector-based three-dimensional scene reconstruction system using camera-projector duality.
- Traditional three-dimensional (3D) scene reconstruction algorithms, such as stereo, space-time stereo, multi-view stereo or structured light, measures distance to the objects of the scene by analyzing the triangles constructed by the projection rays of two optical systems. Traditional stereo systems use two cameras to find two projection rays from the two camera centers which meet at the object surface being measured. Space-time stereo systems use two cameras and a projector or a time-varying light resource for 3D scene reconstruction. Multi-view stereo systems use multiple cameras for 3D scene reconstruction. Traditional stereo systems, e.g., space-time stereo and multi-view stereo, require solving a correspondence problem which determines which pairs of points in two images captured by camera(s) are projections of the same point in an object being processed. The process to solving the correspondence problem is generally referred to as epipolar search. This is a very complex problem and computationally expensive.
- The camera calibration process estimates the internal and/or external parameters of a camera that affect the 3D scene reconstruction. The internal parameters of a camera, such as camera focal length, optical center location, skewness of image pixels, and radial distortion of the lens, are often called intrinsic camera parameters. The external parameters of a camera define the location and orientation of a camera with respect to some world coordinate system. The external parameters of a camera are often called extrinsic camera parameters. Two calibrated optical systems with respect to a known reference coordinate system, such as two calibrated cameras, or one calibrated camera and one calibrated projector, are related to each other by a translation matrix and a rotation matrix which map a pixel point in one optical system to a corresponding pixel point in the other optical system. Camera-camera calibration calculates the translation and rotation matrices between the calibrated cameras. Similarly, camera-projector calibration calculates the translation and rotation matrices between the calibrated projector and camera.
-
FIG. 1A is a block diagram illustrating a traditional space-time stereo 3D scene reconstruction system. The reconstruction system illustrated in FIG. 1A comprises a pair of cameras, 102 a and 102 b, a projector 104 and a camera-camera calibration module 106. The projector 104 projects light patterns to disambiguate correspondence between a pair of projection rays, resulting in a unique match between pixels captured in both cameras 102. A reference coordinate system for reconstruction is defined in one of the cameras 102. The projector 104 need not be calibrated with the cameras 102 nor defined in the stereo rig of the two cameras 102. For each camera 102, the camera calibration module 106 calibrates the camera 102 and generates extrinsic camera parameters with respect to the reference coordinate system. The camera-camera calibration module 106 also calculates the rotation and translation matrices between the cameras 102 using the intrinsic and extrinsic camera parameters of the cameras 102. The camera calibration module 106 may use any existing camera calibration algorithm readily known to a person of ordinary skill in the art, such as photogrammetric calibration or self-calibration. To reconstruct the 3D scene, the rotation and translation matrices between the cameras 102 obtained by camera calibration are needed. Additionally, the computationally expensive epipolar search for correspondence described above is needed for the reconstruction. - To reconstruct a 360-degree view of a scene, cameras in a traditional stereo or structured light reconstruction system need to go around the scene, and generate depthmaps or point clouds from many views. The reconstruction system then links the reconstructed pieces together.
FIG. 1C is a block diagram illustrating a traditional multi-view stereo 3D scene reconstruction system. The reconstruction system in FIG. 1C includes three calibrated cameras 102 a - 102 c and two camera-camera calibration modules 106 a - b. In other embodiments, more calibrated cameras 102 and camera calibration modules 106 may be used for multi-view reconstruction. The camera-camera calibration module 106 a calibrates the cameras 102 a and 102 b, and the camera-camera calibration module 106 b calibrates the cameras 102 b and 102 c. Similar to FIG. 1A, the multi-view stereo reconstruction system in FIG. 1C requires a computationally expensive epipolar search for correspondence for 3D scene reconstruction. - Traditional structured light systems for 3D scene reconstruction employ one camera and a projector. Light travels from the projector and is reflected from the surface of the object to the camera. Instead of finding two corresponding incoming projection rays, a structured light algorithm projects specific light pattern(s) onto the surface of an object, and from the observed patterns, the algorithm figures out which projection ray is reflected on the surface of the object in the scene and reaches the camera. Traditional structured light systems require the projected patterns to be well differentiated from the other objects and ambient light falling on the surface of the object being processed. This requirement often translates into a requirement for a high-powered and well-focused projected light.
-
FIG. 3A is a block diagram illustrating a structured light reconstruction method running in a traditional structured light 3D scene reconstruction system. The reconstruction system in FIG. 3A includes a camera 102, a reference coordinate system 308 defined in the camera 102, a projector 104, a camera-projector calibration module 116 a and an object being processed 302. The camera-projector calibration module 116 a calibrates the camera 102 and the projector 104. To calibrate the projector 104 with respect to the reference coordinate system 308, the camera-projector module 116 a may use any existing projector calibration method that is known to a person of ordinary skill in the art, such as P. Song, "A theory for photometric self-calibration of multiple overlapping projectors and cameras," IEEE International Workshop on Projector-Camera Systems, 2005, which is incorporated by reference herein in its entirety. - To reconstruct the 3D image of the object 302, the projector 104 projects a plane 306 a of a calibration pattern onto the object 302, and the camera 102 observes a reflected ray 306 b from the object 302. The reconstruction system in FIG. 3A identifies which projected ray is reflected on the surface of the object 302 and reaches the camera 102. For example, the system identifies that an observed point 304 on the surface of the object 302 is the intersection point of the projection plane 306 a projected from the projector 104 and the viewing ray 306 b from the camera 102. The reconstruction system illustrated in FIG. 3A requires the projected patterns to be well differentiated from the other objects in the scene and ambient light falling on the surface of the object 302 being processed. This requirement often translates into a requirement for a high-powered and well-focused projected light. In addition, the reconstruction system in FIG. 3A needs to know the camera 102 position relative to the projector 104, i.e., the rotation and translation matrices between the camera 102 and the projector 104, for the reconstruction. - In a conventional camera-projector system using traditional 3D scene reconstruction algorithms, camera(s) need to be attached rigidly to a projector rig, and the rotation and translation matrices between a camera and a projector are required for the reconstruction. Consequently, the reconstruction process is often very computationally expensive, or requires camera motion tracking or mesh stitching to integrate multiple snapshots into a full 3D model.
- To provide flexibility and computational efficiency for 3D scene reconstruction, a projector-based 3D scene reconstruction system exploits camera-projector duality and can use existing 3D scene reconstruction algorithms, e.g., structured light. A projector-based 3D scene reconstruction system calibrates the projectors in the system, and eliminates the need for the computationally expensive epipolar search for correspondence required by traditional 3D scene reconstruction algorithms. The projector-based reconstruction system has a wide range of applications to real-world problems, such as three-dimensional scanning systems, multi-projector systems with high resolution, natural and effective human-computer interfaces and augmented reality.
- One embodiment of a disclosed system (and method) includes calibrating a plurality of projectors using camera-projector duality. The system includes a camera-projector calibration module and a projector-projector calibration module. The camera-projector calibration module is configured to calibrate a first projector with the camera and generate a first camera-projector calibration data using camera-projector duality. The camera-projector calibration module is also configured to calibrate a second projector with the camera and generate a second camera-projector calibration data. The projector-projector calibration module is configured to calibrate the first and the second projector using the first and the second camera-projector calibration data.
- The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
-
FIG. 1A is a block diagram illustrating a traditional space-time stereo 3D scene reconstruction system. -
FIG. 1B is a block diagram illustrating a stereo-projector 3D scene reconstruction system according to one embodiment. -
FIG. 1C is a block diagram illustrating a traditional multi-view stereo 3D scene reconstruction system. -
FIG. 1D is a block diagram illustrating an exemplary multi-projector 3D scene reconstruction system according to one embodiment. -
FIG. 2 is a flowchart illustrating a projector-based 3D scene reconstruction method in a projector-based 3D scene reconstruction system according to one embodiment. -
FIG. 3A is a block diagram illustrating a structured light reconstruction method running in a traditional structured light 3D scene reconstruction system. -
FIG. 3B is an exemplary block diagram illustrating a projector-based reconstruction method running in a stereo-projector 3D scene reconstruction system according to one embodiment. -
FIG. 4 is a flowchart illustrating a camera-projector calibration method using camera-projector duality according to one embodiment. -
FIG. 5 is a flowchart illustrating a projector-projector calibration method using camera-projector duality according to one embodiment. -
FIG. 6 is an exemplary block diagram illustrating a projector-based reconstruction method running in a multi-projector 3D scene reconstruction system according to one embodiment. -
FIG. 7 is a flowchart showing the use of a projector-based 3D scene reconstruction method in a multi-projector 3D scene reconstruction system according to one embodiment. - A preferred embodiment of the present invention is now described with reference to the figures where like reference numbers indicate identical or functionally similar elements.
- Reference in the specification to “one embodiment” or to “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some portions of the detailed description that follows are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps (instructions) leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. Furthermore, it is also convenient at times, to refer to certain arrangements of steps requiring physical manipulations of physical quantities as modules or code devices, without loss of generality.
- However, all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
- Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by a variety of operating systems.
- The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but is not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any references below to specific languages are provided for disclosure of enablement and best mode of the present invention.
- In addition, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the claims.
-
FIG. 1B is a block diagram illustrating a stereo-projector 3D scene reconstruction system according to one embodiment. The reconstruction system illustrated in FIG. 1B is the projector-based counterpart of the traditional space-time stereo reconstruction system illustrated in FIG. 1A. The two calibrated cameras 102 a and 102 b of FIG. 1A are replaced by a pair of calibrated projectors 104 a and 104 b, and the projector 104 in FIG. 1A is replaced by a camera 102. The reconstruction system in FIG. 1B also includes a projector-projector calibration module 126, which calibrates the projectors 104 using camera-projector duality. The result from the projector-projector calibration by the calibration module 126 is the rotation and translation matrices between the projectors 104. In one embodiment, the projector-projector calibration module 126 includes a memory unit 20, a microprocessor 22, a camera-projector calibration unit 24 and a projector-projector calibration unit 26. In other embodiments, the memory unit 20 and microprocessor 22 may reside outside the projector-projector calibration module 126. To simplify the description of an embodiment, the rotation and translation matrices between two calibrated optical systems (e.g., camera-camera or camera-projector) are referred to as the calibration data between the two calibrated optical systems herein and throughout the entire specification. - The
camera 102 in the projector-based reconstruction system illustrated in FIG. 1B is used to identify the projected rays reflected on an object surface, and the camera 102 can move freely around the scene to be reconstructed after projector calibration. In addition, the reference coordinate system for reconstruction is defined in one of the projectors 104, not in the camera 102 as illustrated in FIG. 1A. Consequently, in a projector-based reconstruction system as in FIG. 1B, neither extrinsic camera parameters nor a computationally expensive epipolar search for correspondence is required for reconstruction. Further detail on projector calibration using camera-projector duality is presented in connection with the discussion of FIGS. 3B, 4 and 5. -
FIG. 1D is a block diagram illustrating an exemplary multi-projector 3D scene reconstruction system according to one embodiment. The reconstruction system illustrated in FIG. 1D is the projector-based counterpart of the traditional multi-view stereo reconstruction system illustrated in FIG. 1C. The three calibrated cameras 102 a - 102 c of FIG. 1C are replaced by three calibrated projectors 104 a - 104 c. The reconstruction system in FIG. 1D also includes one or more cameras 102 and projector-projector calibration modules 126 as described in FIG. 1B. In FIG. 1D, only one camera 102 is included for simplicity of description of an embodiment, and more cameras can be used in other embodiments. - In one embodiment, the projector-projector calibration module 126 a calibrates the projector 104 a and the projector 104 b using camera-projector duality, and generates the calibration data between the projectors 104 a - b. The projector-projector calibration module 126 a similarly calibrates the projectors 104 b - c. In other embodiments, the projectors 104 b - c are calibrated by the projector-projector calibration module 126 b. Thus, the projector-based reconstruction system illustrated in FIG. 1D does not need the intrinsic and extrinsic parameters of the camera 102 for the reconstruction process after projector calibration. The camera 102 moves freely around the scene to read the projected patterns from a plurality of directions in the reconstruction system. In addition, the reference coordinate system for reconstruction can be attached to one of the projectors 104. Consequently, reconstruction is simple and fast because all projected ray intersection occurs in the same reference coordinate system defined in a projector 104. The computationally expensive epipolar search for correspondence required for reconstruction in a traditional multi-view stereo reconstruction system is no longer needed in the multi-projector based reconstruction system because the intrinsic and extrinsic parameters of the camera 102 are not needed for the reconstruction, and projection rays and planes are all from the projectors 104. -
FIG. 2 is a flowchart illustrating a projector-based 3D scene reconstruction method in a projector-based 3D scene reconstruction system according to one embodiment. Initially, the projector-based reconstruction system calibrates 202 the projectors used in the system using camera-projector duality. Each calibrated projector projects 204 calibration patterns, such as known checkerboard patterns, onto an object in the scene being reconstructed. A camera captures 206 the projected calibration patterns from a plurality of directions. The reconstruction system identifies 208 the projection rays reflected from the object surface in the captured images. Once two projection rays from different projectors are identified, the projection rays are put in the reference coordinate system defined in one of the projectors in the system. The projector-based method recovers 210 the signature, i.e., identification, of each individual projection ray. The intersection point between the two projection rays becomes the reconstructed surface point of the object in the scene being reconstructed. Since the correspondence information between two rays is directly read from the captured images of the projected calibration patterns, the projector-based method does not need to perform the computationally expensive epipolar search for correspondence as in a conventional 3D scene reconstruction system. From the information obtained from the above steps, the projector-based reconstruction system reconstructs 212 the 3D scene using any existing 3D scene reconstruction algorithm, such as structured light. Further detail on the steps of identifying projection rays and recovering the signatures of projection rays is presented in connection with the discussion of FIG. 3B. - The projector-based reconstruction system described above needs to calibrate two or more projectors using camera-projector duality (e.g.,
step 202 in FIG. 2 ). Cameras and projectors are dual to each other: they closely resemble each other in structure and function, except that light travels in opposite directions. A camera measures the amount of light arriving at a pixel from a direction in the scene, whereas a projector projects a specified amount of light toward the pixel of the scene in the opposite direction. For example, the same format (or types) of intrinsic parameters can be used for both a camera and a projector, since both use 2D pixel arrays and a lens assembly. Therefore, the projector-based reconstruction system modifies the conventional camera calibration mechanisms used in stereo camera systems for calibrating projectors. - In one embodiment, the projector-projector calibration may be implemented in two sequential steps, first by calibrating each individual projector with a camera by camera-projector calibration using camera-projector duality, followed by projector-projector calibration using the camera-projector calibration data.
FIG. 4 is a flowchart illustrating a camera-projector calibration method using camera-projector duality according to one embodiment, and FIG. 5 is a flowchart of a projector-projector calibration method using camera-projector calibration data according to one embodiment. - In
FIG. 4, initially, the camera-projector calibration unit 24 of the projector-projector calibration module 126 obtains 402 the intrinsic parameters of the camera using a conventional camera calibration mechanism known to a person of ordinary skill in the art. A projector in the stereo-projector reconstruction system projects 404 calibration patterns with a selected color channel, such as red, onto a calibration board. In one embodiment, the calibration pattern is a checkerboard pattern, and the checkerboard pattern is printed on the calibration board in yellow. The camera used in the reconstruction system captures 406 the images of the scene being reconstructed. Each image captured by the camera includes the patterns projected onto the calibration board. For example, a sample captured image is an image with the red projected calibration pattern on the calibration board where the yellow calibration pattern is printed. The red channel of this captured image shows the projected checkerboard pattern, and the blue channel shows the printed checkerboard pattern. In one embodiment, the camera-projector calibration unit 24 detects 408 in parallel the corner points of the calibration patterns, one detection for the printed patterns, i.e., 408 a, and the other for the projected patterns, i.e., 408 b. From the printed patterns, the calibration unit 24 computes 410 the extrinsic parameters of the calibration board, i.e., the position and orientation of the calibration board in a captured image. From the projected patterns and the extrinsic parameters of the calibration board, the calibration unit 24 computes 412 the camera-projector calibration data between the camera and the projector in addition to the intrinsic parameters of the projector being calibrated. The calibration unit 24 repeats the above steps to calibrate each projector used in the system. The result of such camera-projector calibration is a plurality of camera-projector calibration data. -
FIG. 5 is a flowchart illustrating a projector-projector calibration method using camera-projector duality in a stereo-projector reconstruction system according to one embodiment. In this embodiment, assuming that two projectors, projector1 and projector2, are each calibrated with the camera used in the system using the camera-projector calibration method described in FIG. 4, the projector-projector calibration unit 26 of the projector-projector calibration module 126 receives 502 / 504 the camera-projector1 calibration data between the camera and projector1, and the camera-projector2 calibration data between the camera and projector2. The calibration unit 26 then calibrates projector1 and projector2 using the received calibration data. The result of the calibration from the projector-projector calibration module 126 is the calibration data between projector1 and projector2, e.g., the rotation and translation matrices between projector1 and projector2. - To further illustrate the projector-projector calibration method using camera-projector duality described above, let R1, R2 be the rotation matrices which relate projector1 and projector2 to the camera, and t1, t2 be the corresponding translation vectors, so that xc = R1x1 + t1 and xc = R2x2 + t2, where x1 and x2 are points in the respective projector coordinate systems and xc is a point in the camera coordinate system. Then the R12 and t12 that describe the transformation between projector1 and projector2 can be written as R12 = R1^T R2 and t12 = R1^T (t2 − t1), since x1 = R1^T (R2x2 + t2 − t1) holds.
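The chaining of the two camera-projector calibrations above is mechanical enough to state as code. A minimal sketch (the function name and the numpy representation are illustrative, not from the patent):

```python
import numpy as np

def projector_to_projector(R1, t1, R2, t2):
    """Chain camera-projector calibrations into projector1-projector2 data.

    R1, t1 satisfy xc = R1 @ x1 + t1 (projector1 to camera), and likewise
    R2, t2 for projector2. Returns (R12, t12) with x1 = R12 @ x2 + t12.
    """
    return R1.T @ R2, R1.T @ (t2 - t1)
```

Substituting xc = R2x2 + t2 into x1 = R1^T (xc − t1) confirms the identity for any test pose.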
- To reconstruct a 3D scene, the projector-based reconstruction system described above identifies, in the images captured by a camera, the projection rays reflected from an object surface (e.g.,
step 208 in FIG. 2). Conceptually, identifying projection rays is the process of tagging a projection ray with a distinguishable signature or identification and, from the image captured by a camera, recovering the signature of the projection ray that has arrived at the surface point of an object corresponding to the pixel in the captured image. In one embodiment, a projection ray identification module uses time-sequential binary codes with phase shift to find the projection ray signature. Other embodiments may use other ray/plane identification coding schemes known to a person of ordinary skill in the art, e.g., 1D (horizontal or vertical) plane tagging, and 2D ray identification as described in U.S. patent application No. 61/016,304, entitled “Optimized Projection Pattern for Long-Range Depth Sensing”, which is hereby incorporated by reference in its entirety. - Generally, a projector can generate many levels of brightness per pixel of an image, but it is not desirable to rely on the magnitude of the projected light for ray identification, especially when the camera may capture light reflected from an object surface whose reflectance property is unknown. Thus, in one embodiment, the projector-based reconstruction system uses binary codes, i.e., two or three levels of brightness of the projected light, for robust detection of projection rays. The reconstruction system multiplexes the ray signature over multiple channels. The multiplexed ray signatures are then reassembled after capture. In one embodiment, the channels may be multiple consecutive frames. Other embodiments may use spatial neighborhoods or spectral bands, such as colors, as the channels.
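As a concrete illustration of binary, frame-multiplexed ray signatures (a simplified sketch only: plain bit-plane codes are shown here, whereas the embodiment described above adds a phase shift on top of the binary codes):

```python
import numpy as np

def bitplane_patterns(num_rays, num_frames):
    """One binary pattern per frame: frame k carries bit k of each ray index."""
    rays = np.arange(num_rays)
    return np.array([(rays >> k) & 1 for k in range(num_frames)])

def decode_signature(observed_bits):
    """Reassemble a ray index from its per-frame binary observations."""
    return sum(int(b) << k for k, b in enumerate(observed_bits))

patterns = bitplane_patterns(num_rays=8, num_frames=3)  # 3 frames tag 8 rays
# A camera pixel lit by ray 5 observes, frame by frame, the bits of 5:
observed = [patterns[k][5] for k in range(3)]
ray_id = decode_signature(observed)
```

Three frames suffice for eight rays because each frame contributes one bit of the signature, which is what makes the time-multiplexed coding spatially dense.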
- To simplify the binary codes described above while maintaining their effectiveness, the projector-based reconstruction system also multiplexes the ray signatures over time. Time-multiplexing is simple but very effective if the scene to be reconstructed is static. The bits of the signatures for all projection rays are built into patterns and projected onto the scene being reconstructed. From the captured images, the projector-based reconstruction system detects the bits and assembles them into ray signatures. This approach allows spatially dense coding for ray identification. To handle reconstruction of a changing scene, in one embodiment, the projector-based reconstruction system uses time-multiplexing with phase shift for ray identification.
-
FIG. 3B is an exemplary block diagram illustrating a projector-based reconstruction method running in a stereo-projector 3D scene reconstruction system according to one embodiment. The reconstruction system in FIG. 3B includes two calibrated projectors 104 a and 104 b, a reference coordinate system 308 defined in the projector 104 b, a camera 102 and an object 302 being processed. The reconstruction system in FIG. 3B also includes a projector-projector calibration module 126 and a 3D reconstruction module 118. The projector-projector calibration module 126 calibrates the projectors 104 using camera-projector duality, and calculates the calibration data between the projectors 104 a and 104 b. The 3D reconstruction module 118 identifies the intersection point 304, on the surface of the object 302, of the ray 306 b projected by the projector 104 b and the projection plane 306 a from the projector 104 a. The projector-based reconstruction system in FIG. 3B treats the rays in the projected plane 306 a as identical to each other, and only one ray in the plane 306 a is used for the reconstruction. It is noted that the projector-based reconstruction system does not need to know a priori which ray in the plane 306 a is to be used for reconstruction. - With respect to the processing steps as described in
FIG. 2, a further analysis of projection ray identification in a projector-based reconstruction method is presented in FIG. 3B with the time-sequential binary codes with phase shift. The projector-based reconstruction method in FIG. 3B runs on one camera 102 and two rigidly attached projectors 104 a and 104 b. The camera 102 reads the calibration patterns projected onto the scene being reconstructed. The 3D reconstruction module 118 identifies the intersection 304 of the projection ray 306 b from the projector 104 b and the projection plane 306 a from the other projector 104 a. Since the reference coordinate system 308 is defined in the projector 104 b, the projector 104 b is thus referred to as the reference projector. In one embodiment, the reference projector 104 b projects ray-tagging pattern(s) and the other projector 104 a projects plane-tagging pattern(s) onto the surface of the object 302. For a ray-tagging pattern projected by the projector 104 b, the 3D reconstruction module 118 identifies the x- and y-coordinates of the ray-tagging pattern from the projector 104 b, e.g., the projection ray 306 b. For the projector 104 a, the 3D reconstruction module 118 only uses the x-coordinate of the plane-tagging pattern, e.g., the projection plane 306 a. From the image of the scene with the projected ray-tagging and plane-tagging patterns captured by the camera 102, the 3D reconstruction module 118 recovers the signatures of two projection rays, one ray being 306 b, and the other being a ray in the projection plane 306 a. For the pixels on the surface of the object 302 for which both projection rays are detected, the intersection of the two rays, such as the intersection point 304 in FIG. 3B, is computed using the calibration data between the two projectors 104 a and 104 b. The intersection point 304 is the recovered surface point of the object being reconstructed. The 3D reconstruction module 118 repeats the above steps to reconstruct each surface point of the object. - In one embodiment, the
camera 102 can be used for reading the projected calibration patterns from the projectors 104. The camera 102 does not need to be calibrated at all (i.e., neither intrinsic nor extrinsic parameters of the camera 102 are needed); thus, the calibration data between the projectors 104 and the camera 102 are no longer needed in the reconstruction process after calibrating the projectors 104, and the camera 102 can move freely around the object 302 as needed to capture the projected calibration patterns. All reconstruction results reside in the reference coordinate system within a projector 104. Another advantage of the projector-based reconstruction system illustrated in FIG. 3B is that the correspondence search over an epipolar line, between two projection rays or between a projection ray and a projection plane, is not necessary. Reading the projected calibration patterns indicates to the projector-based reconstruction system which projection ray or plane from the projectors 104 arrives at the surface of the object 302, and the intersection of two projection rays, or of one projection ray and one projection plane, e.g., the intersection point 304, can be directly computed since the reference coordinate system for reconstruction 308 is defined in the projector 104 b. - It is noted that the projector-based reconstruction system described above, in one embodiment, needs the calibration data between
projectors 104, including both intrinsic and extrinsic parameters of each projector 104. To locate each projected ray in the reference coordinate system 308, the projector-based reconstruction system needs the extrinsic parameters of a projector 104 to locate its optical center and axis, and the intrinsic parameters of the projector 104 to determine in which direction the ray travels from the optical center of the projector 104. The ID of the ray is decoded from the captured images according to the ray-tagging scheme described above. -
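The two steps just described, using a projector's intrinsics and extrinsics to locate a ray in the reference coordinate system and then intersecting that ray with an identified projection plane, can be sketched as follows. This is a minimal pinhole-model illustration, assuming (R, t) maps projector coordinates into the reference coordinate system; the function names are illustrative:

```python
import numpy as np

def projector_pixel_ray(K, R, t, pixel):
    """Ray of projector pixel (u, v) in the reference coordinate system.

    The intrinsics K give the ray direction in the projector's own frame;
    the extrinsics (R, t) place the optical center and rotate the direction.
    """
    u, v = pixel
    d = np.linalg.inv(K) @ np.array([u, v, 1.0])   # direction, projector frame
    origin = t                                     # optical center: R @ 0 + t
    direction = R @ d
    return origin, direction / np.linalg.norm(direction)

def intersect_ray_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Point where a projection ray meets a projection plane (or None)."""
    denom = float(plane_normal @ ray_dir)
    if abs(denom) < 1e-12:
        return None                                # ray parallel to the plane
    s = float(plane_normal @ (plane_point - ray_origin)) / denom
    return ray_origin + s * ray_dir

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
# Reference projector at the origin: its principal-point pixel looks along +z,
# and that ray meets the plane z = 4 at (0, 0, 4).
o, d = projector_pixel_ray(K, np.eye(3), np.zeros(3), (320.0, 240.0))
point = intersect_ray_plane(o, d, np.array([0.0, 0.0, 4.0]),
                            np.array([0.0, 0.0, 1.0]))
```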
FIG. 6 is an exemplary block diagram illustrating a projector-based reconstruction method running in a multi-projector reconstruction system according to one embodiment. The multi-projector reconstruction system illustrated in FIG. 6 extends the stereo-projector system depicted in FIG. 3B to use additional projectors. In a multi-projector reconstruction system, projectors may project projection planes rather than projection rays as described in FIG. 3B, because projection patterns for projection planes are generally simpler and more robust to noise than patterns for projection rays in a multi-projector reconstruction environment. In response to three projection planes from two or three different projectors being detected, the reconstruction method is able to uniquely determine a point in 3D space. To configure the multiple projectors for 3D scene reconstruction, the projector-based reconstruction method illustrated in FIG. 6 calibrates each projector using the camera-projector duality described above. After the projector calibration, the reconstruction method no longer needs the extrinsic parameters of the camera or the computationally expensive epipolar line search for 3D correspondence required in conventional space-time multi-view 3D scene reconstruction systems, or the camera motion tracking required in conventional structured-light multi-view 3D scene reconstruction systems. - The reconstruction system in
FIG. 6, in one embodiment, includes three calibrated projectors 104 a-104 c, a camera 102 and an object 302 being reconstructed. The reconstruction system also includes two projector-projector calibration modules 126 a-b and a 3D reconstruction module 118. In FIG. 6, the calibration data between the projector 104 a and the projector 104 c is omitted for simplicity of description of the embodiment; that calibration data can be obtained in the same manner as described above. Other embodiments may use multiple cameras 102 and more than three projectors 104. The reference coordinate system for reconstruction 308 is defined in the projector 104 b in the embodiment described in FIG. 6. In other embodiments, the reference coordinate system 308 may be defined in other projectors 104. The calibrated projectors 104 a-104 c project projection planes onto the surface of the object 302. From the intersection point on the surface of the object 302, the 3D reconstruction module 118 identifies each individual plane projected from the corresponding projector 104. - For example, in the embodiment illustrated in
FIG. 6, each projector 104 projects a projection plane 606 onto the surface of the object 302. Specifically, the projector 104 a projects the plane 606 a, the projector 104 b projects the plane 606 b, and the projector 104 c projects the plane 606 c. The three planes 606 a-606 c intersect at the intersection point 304 on the surface of the object 302. The 3D reconstruction module 118 in FIG. 6 identifies each projection plane 606 and its corresponding projector 104 from the intersection point 304. -
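Once each identified plane is expressed in the reference coordinate system in the form n_i · x = d_i, the surface point where the three planes meet is the solution of a 3x3 linear system; a minimal sketch (the function name is illustrative), which has a unique solution exactly when the three plane normals are linearly independent:

```python
import numpy as np

def intersect_three_planes(normals, offsets):
    """Unique intersection point of planes n_i . x = d_i for i = 0, 1, 2,
    provided the three normals are linearly independent."""
    N = np.asarray(normals, dtype=float)   # one plane normal per row
    d = np.asarray(offsets, dtype=float)
    return np.linalg.solve(N, d)

# Planes x = 1, y = 2, z = 3 meet at the single point (1, 2, 3).
point = intersect_three_planes([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [1, 2, 3])
```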
FIG. 7 is a flowchart showing the use of a projector-based 3D scene reconstruction method in a multi-projector reconstruction system as described in FIG. 6 according to one embodiment. Initially, the projector-projector calibration module 126 calibrates 702 the camera 102 and a selected projector 104, e.g., the projector 104 a, following calibration steps such as those described in FIG. 4, and stores 704 the camera-projector calibration data in a storage medium, such as a local cache memory. The projector-projector calibration module 126 checks 706 whether all projectors 104 in the reconstruction system have been calibrated with the camera 102. In response to all the projectors 104 being calibrated with the camera 102, the projector-projector calibration module 126 calibrates 708 a pair of projectors as described in FIG. 5. In response to not all the projectors 104 being calibrated with the camera 102, the projector-projector calibration module 126 repeats the camera-projector calibration steps 702-706. For each projector-projector calibration, the projector-projector calibration module 126 further checks 710 whether all the projectors 104 have been calibrated with each other. Responsive to all the projectors 104 being calibrated, the 3D reconstruction module 118 constructs 712 the 3D scene following the 3D reconstruction steps, e.g., steps 202-214 described in FIG. 2. If not all the projectors 104 have been calibrated with each other, the projector-projector calibration module 126 continues to calibrate 708 the projectors 104, followed by constructing 712 the 3D scene by the 3D reconstruction module 118. - To provide flexibility and computational efficiency for 3D scene reconstruction, it is possible to have a projector-based 3D scene reconstruction system which exploits the camera-projector duality and existing 3D scene reconstruction algorithms, e.g., structured light.
The projector-based 3D scene reconstruction system calibrates the projectors in the system, and eliminates the need for the computationally expensive epipolar search for correspondence required by traditional 3D scene reconstruction algorithms. The projector-based reconstruction system has a wide range of applications to real-world problems, such as three-dimensional scanning systems, self-calibrating multi-projector systems with higher resolution, natural and effective human-computer interfaces, and augmented reality.
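The calibration flow of FIG. 7 can be sketched as a small driver loop; everything here (the callables and their signatures) is a placeholder standing in for the calibration module's actual steps:

```python
from itertools import combinations

def calibrate_system(camera, projectors, calibrate_with_camera, calibrate_pair):
    """FIG.-7-style flow: camera-projector calibration for every projector,
    then projector-projector calibration for every pair of projectors."""
    cam_proj = {p: calibrate_with_camera(camera, p) for p in projectors}
    return {(p, q): calibrate_pair(cam_proj[p], cam_proj[q])
            for p, q in combinations(projectors, 2)}

# Dummy stand-ins just to show the control flow: three projectors yield
# three projector-projector calibrations.
pairs = calibrate_system("cam", ["p1", "p2", "p3"],
                         lambda cam, p: (cam, p),
                         lambda a, b: (a, b))
```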
- While particular embodiments and applications of the present invention have been illustrated and described herein, it is to be understood that the invention is not limited to the precise construction and components disclosed herein and that various modifications, changes, and variations may be made in the arrangement, operation, and details of the methods and apparatuses of the present invention without departing from the spirit and scope of the invention as it is defined in the appended claims.
Claims (32)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/121,056 US8172407B2 (en) | 2007-05-16 | 2008-05-15 | Camera-projector duality: multi-projector 3D reconstruction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US93832307P | 2007-05-16 | 2007-05-16 | |
US12/121,056 US8172407B2 (en) | 2007-05-16 | 2008-05-15 | Camera-projector duality: multi-projector 3D reconstruction |
Publications (2)
Publication Number | Publication Date |
---|---|
US20080285843A1 true US20080285843A1 (en) | 2008-11-20 |
US8172407B2 US8172407B2 (en) | 2012-05-08 |
Family
ID=40027533
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/121,056 Active 2031-03-09 US8172407B2 (en) | 2007-05-16 | 2008-05-15 | Camera-projector duality: multi-projector 3D reconstruction |
Country Status (2)
Country | Link |
---|---|
US (1) | US8172407B2 (en) |
WO (1) | WO2008144370A1 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100315412A1 (en) * | 2009-06-15 | 2010-12-16 | Microsoft Corporation | Piecewise planar reconstruction of three-dimensional scenes |
US20110034176A1 (en) * | 2009-05-01 | 2011-02-10 | Lord John D | Methods and Systems for Content Processing |
US20110205341A1 (en) * | 2010-02-23 | 2011-08-25 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction. |
US20120150573A1 (en) * | 2010-12-13 | 2012-06-14 | Omar Soubra | Real-time site monitoring design |
US9025860B2 (en) | 2012-08-06 | 2015-05-05 | Microsoft Technology Licensing, Llc | Three-dimensional object browsing in documents |
US20150138349A1 (en) * | 2012-07-04 | 2015-05-21 | Creaform Inc. | 3-d scanning and positioning system |
US9087408B2 (en) | 2011-08-16 | 2015-07-21 | Google Inc. | Systems and methods for generating depthmaps |
US9329469B2 (en) | 2011-02-17 | 2016-05-03 | Microsoft Technology Licensing, Llc | Providing an interactive experience using a 3D depth camera and a 3D projector |
US9372552B2 (en) | 2008-09-30 | 2016-06-21 | Microsoft Technology Licensing, Llc | Using physical objects in conjunction with an interactive surface |
US9480907B2 (en) | 2011-03-02 | 2016-11-01 | Microsoft Technology Licensing, Llc | Immersive display with peripheral illusions |
US9597587B2 (en) | 2011-06-08 | 2017-03-21 | Microsoft Technology Licensing, Llc | Locational node device |
US10176594B2 (en) * | 2014-10-09 | 2019-01-08 | Denso Corporation | Progressive in-vehicle camera calibrator, image generator, in-vehicle camera calibration method, and image generation method |
CN109556534A (en) * | 2017-09-26 | 2019-04-02 | 海克斯康计量(以色列)有限公司 | Global localization of the sensor relative to the different splicing blocks of global three-dimensional surface rebuilding |
US20190180475A1 (en) * | 2017-12-08 | 2019-06-13 | Qualcomm Incorporated | Dynamic camera calibration |
US10401142B2 (en) | 2012-07-18 | 2019-09-03 | Creaform Inc. | 3-D scanning and positioning interface |
CN111782854A (en) * | 2020-07-07 | 2020-10-16 | 科珑诗菁生物科技(上海)有限公司 | Multi-view projection makeup method and multi-view projection makeup dressing wearing equipment |
US20210172732A1 (en) * | 2019-12-09 | 2021-06-10 | Industrial Technology Research Institute | Projecting apparatus and projecting calibration method |
US20210187736A1 (en) * | 2013-03-15 | 2021-06-24 | X Development Llc | Determining a Virtual Representation of an Environment By Projecting Texture Patterns |
US20210302152A1 (en) * | 2018-08-01 | 2021-09-30 | Shining3D Tech Co., Ltd. | Three-Dimensional Scanning Method and System |
US11185697B2 (en) | 2016-08-08 | 2021-11-30 | Deep Brain Stimulation Technologies Pty. Ltd. | Systems and methods for monitoring neural activity |
US11298070B2 (en) | 2017-05-22 | 2022-04-12 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
US11503259B2 (en) * | 2018-07-31 | 2022-11-15 | Coretronic Corporation | Projector calibration method and projection system using the same |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE112011102991B4 (en) * | 2010-09-08 | 2016-09-22 | Canon Kabushiki Kaisha | Method and device for 3D measurement by detecting a predetermined pattern |
US8736674B2 (en) | 2010-09-23 | 2014-05-27 | Dolby Laboratories Licensing Corporation | Method and system for 3D display calibration with feedback determined by a camera device |
US9723293B1 (en) | 2011-06-21 | 2017-08-01 | Amazon Technologies, Inc. | Identifying projection surfaces in augmented reality environments |
US20130083997A1 (en) * | 2011-10-04 | 2013-04-04 | Alcatel-Lucent Usa Inc. | Temporally structured light |
US8840250B1 (en) * | 2012-01-11 | 2014-09-23 | Rawles Llc | Projection screen qualification and selection |
EP2939423A1 (en) | 2012-12-28 | 2015-11-04 | Metaio GmbH | Method of and system for projecting digital information on a real object in a real environment |
US9052186B2 (en) | 2013-01-31 | 2015-06-09 | Hewlett-Packard Development Company, L.P. | Correspondence mapping between an imaging system and a directional rendering system |
US9547222B2 (en) * | 2013-02-08 | 2017-01-17 | University Of South Australia | Method and apparatus for calibration of multiple projector systems |
TW201520673A (en) * | 2013-11-26 | 2015-06-01 | Automotive Res & Testing Ct | Information display system with automatic viewable range adjustment and display method thereof |
US10257498B2 (en) | 2015-12-04 | 2019-04-09 | Empire Technology Development Llc | Coordination of multiple structured light-based 3D image detectors |
JPWO2018167999A1 (en) * | 2017-03-17 | 2020-01-16 | パナソニックIpマネジメント株式会社 | Projector and projector system |
AU2017251725A1 (en) * | 2017-10-24 | 2019-05-09 | Canon Kabushiki Kaisha | Calibration of projection systems |
US11074700B2 (en) | 2018-04-23 | 2021-07-27 | Cognex Corporation | Systems, methods, and computer-readable storage media for determining saturation data for a temporal pixel |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6520647B2 (en) * | 2000-08-17 | 2003-02-18 | Mitsubishi Electric Research Laboratories Inc. | Automatic keystone correction for projectors with arbitrary orientation |
US6527395B1 (en) * | 2001-12-10 | 2003-03-04 | Mitsubishi Electric Research Laboratories, Inc. | Method for calibrating a projector with a camera |
US6707444B1 (en) * | 2000-08-18 | 2004-03-16 | International Business Machines Corporation | Projector and camera arrangement with shared optics and optical marker for use with whiteboard systems |
US6754370B1 (en) * | 2000-08-14 | 2004-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Real-time structured light range scanning of moving scenes |
US20040150835A1 (en) * | 2003-01-28 | 2004-08-05 | Beat Frick | Profiling device, electronic projector equipped therewith and process for the profiling of an electronic display device |
US6813035B2 (en) * | 1999-12-27 | 2004-11-02 | Siemens Aktiengesellschaft | Method for determining three-dimensional surface coordinates |
US20040257540A1 (en) * | 2003-04-16 | 2004-12-23 | Sebastien Roy | Single or multi-projector for arbitrary surfaces without calibration nor reconstruction |
US20060192925A1 (en) * | 2005-02-28 | 2006-08-31 | Chang Nelson L A | Multi-projector geometric calibration |
US20060210145A1 (en) * | 2005-02-16 | 2006-09-21 | Sungkyunkwan University Foundation For Corporate Collaboration | Method and system of structural light-based 3d depth imaging using signal separation coding and error correction thereof |
US7149544B2 (en) * | 2002-03-05 | 2006-12-12 | Microsoft Corporation | Detachable radio module |
US7182465B2 (en) * | 2004-02-25 | 2007-02-27 | The University Of North Carolina | Methods, systems, and computer program products for imperceptibly embedding structured light patterns in projected color images for display on planar and non-planar surfaces |
US20070057946A1 (en) * | 2003-07-24 | 2007-03-15 | Dan Albeck | Method and system for the three-dimensional surface reconstruction of an object |
US20090169095A1 (en) * | 2008-01-02 | 2009-07-02 | Spatial Integrated Systems, Inc. | System and method for generating structured light for 3-dimensional image rendering |
US20090245690A1 (en) * | 2008-03-26 | 2009-10-01 | City University Of Hong Kong | Auto-calibration method for a projector-camera system |
US7724379B2 (en) * | 2005-05-12 | 2010-05-25 | Technodream21, Inc. | 3-Dimensional shape measuring method and device thereof |
US7773827B2 (en) * | 2006-02-15 | 2010-08-10 | Mersive Technologies, Inc. | Hybrid system for multi-projector geometry calibration |
US7901093B2 (en) * | 2006-01-24 | 2011-03-08 | Seiko Epson Corporation | Modeling light transport in complex display systems |
US7942530B2 (en) * | 2006-10-31 | 2011-05-17 | The Regents Of The University Of California | Apparatus and method for self-calibrating multi-projector displays via plug and play projectors |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2350242T3 (en) | 2005-11-28 | 2011-01-20 | 3Shape A/S | STRUCTURED LIGHT CODED. |
-
2008
- 2008-05-15 WO PCT/US2008/063665 patent/WO2008144370A1/en active Application Filing
- 2008-05-15 US US12/121,056 patent/US8172407B2/en active Active
Patent Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6813035B2 (en) * | 1999-12-27 | 2004-11-02 | Siemens Aktiengesellschaft | Method for determining three-dimensional surface coordinates |
US6754370B1 (en) * | 2000-08-14 | 2004-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Real-time structured light range scanning of moving scenes |
US6520647B2 (en) * | 2000-08-17 | 2003-02-18 | Mitsubishi Electric Research Laboratories Inc. | Automatic keystone correction for projectors with arbitrary orientation |
US6707444B1 (en) * | 2000-08-18 | 2004-03-16 | International Business Machines Corporation | Projector and camera arrangement with shared optics and optical marker for use with whiteboard systems |
US6527395B1 (en) * | 2001-12-10 | 2003-03-04 | Mitsubishi Electric Research Laboratories, Inc. | Method for calibrating a projector with a camera |
US7149544B2 (en) * | 2002-03-05 | 2006-12-12 | Microsoft Corporation | Detachable radio module |
US20040150835A1 (en) * | 2003-01-28 | 2004-08-05 | Beat Frick | Profiling device, electronic projector equipped therewith and process for the profiling of an electronic display device |
US20040257540A1 (en) * | 2003-04-16 | 2004-12-23 | Sebastien Roy | Single or multi-projector for arbitrary surfaces without calibration nor reconstruction |
US20070057946A1 (en) * | 2003-07-24 | 2007-03-15 | Dan Albeck | Method and system for the three-dimensional surface reconstruction of an object |
US7182465B2 (en) * | 2004-02-25 | 2007-02-27 | The University Of North Carolina | Methods, systems, and computer program products for imperceptibly embedding structured light patterns in projected color images for display on planar and non-planar surfaces |
US20060210145A1 (en) * | 2005-02-16 | 2006-09-21 | Sungkyunkwan University Foundation For Corporate Collaboration | Method and system of structural light-based 3d depth imaging using signal separation coding and error correction thereof |
US7916932B2 (en) * | 2005-02-16 | 2011-03-29 | In-G Co., Ltd. | Method and system of structural light-based 3D depth imaging using signal separation coding and error correction thereof |
US20060192925A1 (en) * | 2005-02-28 | 2006-08-31 | Chang Nelson L A | Multi-projector geometric calibration |
US7306341B2 (en) * | 2005-02-28 | 2007-12-11 | Hewlett-Packard Development Company, L.P. | Multi-projector geometric calibration |
US7724379B2 (en) * | 2005-05-12 | 2010-05-25 | Technodream21, Inc. | 3-Dimensional shape measuring method and device thereof |
US7901093B2 (en) * | 2006-01-24 | 2011-03-08 | Seiko Epson Corporation | Modeling light transport in complex display systems |
US7773827B2 (en) * | 2006-02-15 | 2010-08-10 | Mersive Technologies, Inc. | Hybrid system for multi-projector geometry calibration |
US7942530B2 (en) * | 2006-10-31 | 2011-05-17 | The Regents Of The University Of California | Apparatus and method for self-calibrating multi-projector displays via plug and play projectors |
US20090169095A1 (en) * | 2008-01-02 | 2009-07-02 | Spatial Integrated Systems, Inc. | System and method for generating structured light for 3-dimensional image rendering |
US20090245690A1 (en) * | 2008-03-26 | 2009-10-01 | City University Of Hong Kong | Auto-calibration method for a projector-camera system |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9372552B2 (en) | 2008-09-30 | 2016-06-21 | Microsoft Technology Licensing, Llc | Using physical objects in conjunction with an interactive surface |
US10346529B2 (en) | 2008-09-30 | 2019-07-09 | Microsoft Technology Licensing, Llc | Using physical objects in conjunction with an interactive surface |
US20110034176A1 (en) * | 2009-05-01 | 2011-02-10 | Lord John D | Methods and Systems for Content Processing |
US8886206B2 (en) * | 2009-05-01 | 2014-11-11 | Digimarc Corporation | Methods and systems for content processing |
US8933925B2 (en) * | 2009-06-15 | 2015-01-13 | Microsoft Corporation | Piecewise planar reconstruction of three-dimensional scenes |
US20100315412A1 (en) * | 2009-06-15 | 2010-12-16 | Microsoft Corporation | Piecewise planar reconstruction of three-dimensional scenes |
US20110205341A1 (en) * | 2010-02-23 | 2011-08-25 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction. |
US8730309B2 (en) | 2010-02-23 | 2014-05-20 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction |
US9509981B2 (en) | 2010-02-23 | 2016-11-29 | Microsoft Technology Licensing, Llc | Projectors and depth cameras for deviceless augmented reality and interaction |
US20120150573A1 (en) * | 2010-12-13 | 2012-06-14 | Omar Soubra | Real-time site monitoring design |
US9329469B2 (en) | 2011-02-17 | 2016-05-03 | Microsoft Technology Licensing, Llc | Providing an interactive experience using a 3D depth camera and a 3D projector |
US9480907B2 (en) | 2011-03-02 | 2016-11-01 | Microsoft Technology Licensing, Llc | Immersive display with peripheral illusions |
US9597587B2 (en) | 2011-06-08 | 2017-03-21 | Microsoft Technology Licensing, Llc | Locational node device |
US9087408B2 (en) | 2011-08-16 | 2015-07-21 | Google Inc. | Systems and methods for generating depthmaps |
US9816809B2 (en) * | 2012-07-04 | 2017-11-14 | Creaform Inc. | 3-D scanning and positioning system |
US20150138349A1 (en) * | 2012-07-04 | 2015-05-21 | Creaform Inc. | 3-d scanning and positioning system |
US10401142B2 (en) | 2012-07-18 | 2019-09-03 | Creaform Inc. | 3-D scanning and positioning interface |
US10928183B2 (en) | 2012-07-18 | 2021-02-23 | Creaform Inc. | 3-D scanning and positioning interface |
US9025860B2 (en) | 2012-08-06 | 2015-05-05 | Microsoft Technology Licensing, Llc | Three-dimensional object browsing in documents |
US20210187736A1 (en) * | 2013-03-15 | 2021-06-24 | X Development Llc | Determining a Virtual Representation of an Environment By Projecting Texture Patterns |
US10176594B2 (en) * | 2014-10-09 | 2019-01-08 | Denso Corporation | Progressive in-vehicle camera calibrator, image generator, in-vehicle camera calibration method, and image generation method |
US11185697B2 (en) | 2016-08-08 | 2021-11-30 | Deep Brain Stimulation Technologies Pty. Ltd. | Systems and methods for monitoring neural activity |
US11278726B2 (en) | 2016-08-08 | 2022-03-22 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
US11890478B2 (en) | 2016-08-08 | 2024-02-06 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
US11298070B2 (en) | 2017-05-22 | 2022-04-12 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
US10832441B2 (en) * | 2017-09-26 | 2020-11-10 | Hexagon Metrology (Israel) Ltd. | Global positioning of a sensor with respect to different tiles for a global three-dimensional surface reconstruction |
CN109556534A (en) * | 2017-09-26 | 2019-04-02 | 海克斯康计量(以色列)有限公司 | Global localization of the sensor relative to the different splicing blocks of global three-dimensional surface rebuilding |
US20190180475A1 (en) * | 2017-12-08 | 2019-06-13 | Qualcomm Incorporated | Dynamic camera calibration |
US11503259B2 (en) * | 2018-07-31 | 2022-11-15 | Coretronic Corporation | Projector calibration method and projection system using the same |
US20210302152A1 (en) * | 2018-08-01 | 2021-09-30 | Shining3D Tech Co., Ltd. | Three-Dimensional Scanning Method and System |
US20210172732A1 (en) * | 2019-12-09 | 2021-06-10 | Industrial Technology Research Institute | Projecting apparatus and projecting calibration method |
US11549805B2 (en) * | 2019-12-09 | 2023-01-10 | Industrial Technology Research Institute | Projecting apparatus and projecting calibration method |
CN111782854A (en) * | 2020-07-07 | 2020-10-16 | 科珑诗菁生物科技(上海)有限公司 | Multi-view projection makeup method and multi-view projection makeup dressing wearing equipment |
Also Published As
Publication number | Publication date |
---|---|
WO2008144370A1 (en) | 2008-11-27 |
US8172407B2 (en) | 2012-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8172407B2 (en) | | Camera-projector duality: multi-projector 3D reconstruction |
US7146036B2 (en) | | Multiframe correspondence estimation |
Alhwarin et al. | | IR stereo kinect: improving depth images by combining structured light with IR stereo |
US7733404B2 (en) | | Fast imaging system calibration |
CN103069250B (en) | | 3-D measuring apparatus, method for three-dimensional measurement |
US9514537B2 (en) | | System and method for adaptive depth map reconstruction |
US20040257540A1 (en) | | Single or multi-projector for arbitrary surfaces without calibration nor reconstruction |
CA2786436C (en) | | Depth camera compatibility |
JP2017223687A (en) | | Design of code in affine-invariant spatial mask |
WO2013009662A2 (en) | | Calibration between depth and color sensors for depth cameras |
US10540784B2 (en) | | Calibrating texture cameras using features extracted from depth images |
JP2011253376A (en) | | Image processing device, image processing method and program |
JP2005072888A (en) | | Image projection method and image projection device |
US20080319704A1 (en) | | Device and Method for Determining Spatial Co-Ordinates of an Object |
JP2004515832A (en) | | Apparatus and method for spatio-temporal normalization matching of image sequences |
Fiala et al. | | Panoramic stereo reconstruction using non-SVP optics |
KR20230065978A (en) | | Systems, methods and media for directly repairing planar surfaces in a scene using structured light |
Martinez et al. | | Kinect Unleashed: Getting Control over High Resolution Depth Maps |
KR20200049958A (en) | | Apparatus and method for measuring three-dimensional depth |
Svoboda | | Quick guide to multi-camera self-calibration |
JP2001338280A (en) | | Three-dimensional space information input device |
McIlroy et al. | | Kinectrack: Agile 6-dof tracking using a projected dot pattern |
CN112233139A (en) | | System and method for detecting motion during 3D data reconstruction |
KR100933304B1 (en) | | Object information estimator using a single camera, method thereof, multimedia device and computer device including the estimator, and computer-readable recording medium storing a program for performing the method |
JP3221384B2 (en) | | 3D coordinate measuring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | AS | Assignment | Owner name: HONDA MOTOR CO., LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIM, JONGWOO;REEL/FRAME:020954/0200. Effective date: 20080509 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | FPAY | Fee payment | Year of fee payment: 4 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |
| | FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |