KR101853269B1 - Apparatus of stitching depth maps for stereo images - Google Patents
Apparatus of stitching depth maps for stereo images Download PDFInfo
- Publication number
- KR101853269B1 (application KR1020170047363A)
- Authority
- KR
- South Korea
- Prior art keywords
- depth
- depth map
- stitching
- maps
- image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/15—Processing image signals for colour aspects of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
Abstract
Description
Field of the Invention [0002] The present invention relates to a stereo image matching technique, and more particularly, to a depth map stitching apparatus for stereo images capable of performing stitching on depth maps associated with stereo images in real time.
The three-dimensional image can be acquired through a stereo camera. More specifically, a stereo camera produces left and right two-dimensional images obtained through its two binocular lenses. A three-dimensional image can then be obtained through image processing on the left and right two-dimensional images, and a depth map can be used in this process.
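As background for how such depth maps are obtained, depth is inversely proportional to the disparity between matched pixels in rectified left and right images. The sketch below illustrates only that relationship; the function name and the focal-length and baseline values are assumptions for the example, not parameters from this patent:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (meters).

    depth = focal_length * baseline / disparity; zero disparity
    (no match / infinitely far) is mapped to depth 0 here.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Example: a 2x2 disparity map from a stereo pair with an assumed
# focal length of 700 px and a 10 cm baseline (illustrative values).
d = np.array([[70.0, 35.0], [0.0, 14.0]])
print(disparity_to_depth(d, focal_px=700.0, baseline_m=0.1))
```

In practice each stereo camera pair would produce one such depth map, and those per-camera maps are the inputs to the stitching described below.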
Korean Patent Registration No. 10-1370785 discloses a method and an apparatus for generating a depth map of a stereoscopic image that enable detailed representation of image depth by considering not only vanishing points but also detailed lines within the image.
Korean Patent Laid-Open Publication No. 10-2016-0086802 relates to an apparatus and method for generating a depth map, and to a stereoscopic image converting apparatus and method using the same. More specifically, it provides a depth map generating apparatus including a feature information extracting unit for extracting characteristic information from an input image, a depth map initialization unit for generating an initial depth map for the input image based on the characteristic information, an FFT transform unit for performing an FFT on the input image to convert it into a frequency image, and a depth map determining unit for determining a final depth map based on a correlation value obtained using an average value of the depth map.
One embodiment of the present invention seeks to provide a depth map stitching apparatus for stereo images that can perform stitching on depth maps associated with stereo images in real time.
Another embodiment of the present invention seeks to provide a depth map stitching apparatus for stereo images that can match segments existing within an overlapping region between depth maps based on the depth of each object in the segments and the difference in RGB color distribution of each corresponding object in the stereo images.
An embodiment of the present invention is to provide a depth map stitching apparatus for stereo images capable of detecting overlapping types of matched segments and integrating matched segments so as to have a single depth value for each type.
Among the embodiments, a depth map stitching apparatus for stereo images includes a camera calibration unit for correcting physical characteristics of a plurality of stereo cameras, a depth map generating unit for receiving stereo images from the stereo cameras and generating depth maps, a depth map adjusting unit for adjusting each of the depth maps on a virtual space, and a depth map stitching unit for stitching the depth maps by determining an object based on a segment in each of the depth maps and calculating a depth of the determined object.
The depth map stitching unit may perform segment labeling on the segments to detect the shape of the object.
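Segment labeling of this kind can be approximated in a sketch by connected-component labeling over a quantized depth map: pixels with the same quantized depth that touch each other form one segment. The 4-connectivity flood fill below is a generic stand-in, not the patent's specific labeling algorithm; `label_segments` and the sample map are illustrative:

```python
from collections import deque

def label_segments(depth_q):
    """4-connected component labeling of a quantized depth map.

    Pixels with the same quantized depth that touch horizontally or
    vertically receive the same segment label (labels start at 1).
    """
    h, w = len(depth_q), len(depth_q[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 1
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx]:
                continue  # already part of a segment
            value = depth_q[sy][sx]
            queue = deque([(sy, sx)])
            labels[sy][sx] = next_label
            while queue:  # breadth-first flood fill of equal-depth pixels
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w \
                            and not labels[ny][nx] and depth_q[ny][nx] == value:
                        labels[ny][nx] = next_label
                        queue.append((ny, nx))
            next_label += 1
    return labels

# Two objects at quantized depths 2 and 5 on a depth-0 background.
depth_q = [
    [0, 2, 2, 0],
    [0, 2, 0, 5],
    [0, 0, 0, 5],
]
print(label_segments(depth_q))
# -> [[1, 2, 2, 3], [1, 2, 1, 4], [1, 1, 1, 4]]
```

Each resulting label then corresponds to one candidate object whose shape and depth can be examined further.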
The depth map stitching unit may adjust the global depth of the object by arranging the detected object in a single space to obtain an approximate value of the depth.
The depth map stitching unit may perform a local optimization on the detected object based on the stereo images to calculate a local depth of the detected object.
The depth map stitching unit may stitch the depth maps by matching and integrating segments in the overlapping area of the depth maps.
The depth map stitching unit may match the segments in the overlapping area based on the depth of each object in the segments and the difference in RGB color distribution of each corresponding object in the corresponding stereo images.
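One plausible way to realize such matching is to score candidate segment pairs in the overlap by combining their depth difference with a distance between their RGB color statistics, then pair each segment with its lowest-cost counterpart. The sketch below uses mean segment color and a weight `alpha` as simple stand-ins for the RGB color distribution comparison; all names and values are assumptions, not the patent's formula:

```python
import math

def match_cost(seg_a, seg_b, alpha=0.5):
    """Cost of matching two segments across the overlap area.

    Combines the absolute depth difference with the Euclidean distance
    between mean RGB colors (normalized to [0, 1] per channel).
    """
    depth_term = abs(seg_a["depth"] - seg_b["depth"])
    color_term = math.dist(seg_a["mean_rgb"], seg_b["mean_rgb"]) / 255.0
    return alpha * depth_term + (1.0 - alpha) * color_term

def match_segments(left_segs, right_segs, alpha=0.5):
    """Greedily pair each left segment with its lowest-cost right segment."""
    pairs = {}
    for i, a in enumerate(left_segs):
        j = min(range(len(right_segs)),
                key=lambda k: match_cost(a, right_segs[k], alpha))
        pairs[i] = j
    return pairs

left = [{"depth": 2.0, "mean_rgb": (200, 40, 40)},   # near, reddish
        {"depth": 5.0, "mean_rgb": (30, 30, 220)}]   # far, bluish
right = [{"depth": 5.1, "mean_rgb": (35, 28, 210)},
         {"depth": 1.9, "mean_rgb": (195, 45, 38)}]
print(match_segments(left, right))  # near/red pairs with 1, far/blue with 0
```

A real implementation would compare full color histograms rather than means, but the structure of the cost stays the same.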
The depth map stitching unit may detect the overlapping type of the matched segments and integrate the matched segments to have a single depth value.
The depth map stitching unit may integrate the depth maps using an average depth value of the background area measured in the corresponding depth maps when there is a background area, at the stitching boundary or in the overlap area, in which the outline of a specific object is not distinguished.
The disclosed technique may have the following effects. It is to be understood, however, that this does not mean that a particular embodiment must include all of the following effects or only the following effects, and the scope of the disclosed technology should not be construed as being limited thereby.
The depth map stitching apparatus for stereo images according to an embodiment of the present invention can provide a technique for realizing stitching on depth maps associated with stereo images in real time.
The depth map stitching apparatus for stereo images according to an embodiment of the present invention can provide a technique capable of matching segments existing within an overlapping area between depth maps and integrating them so as to have a single depth value according to the overlapping type of the matched segments.
FIG. 1 is a view for explaining an overall system for generating a virtual viewpoint image synthesized in real time, including a depth map stitching apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating the depth map stitching apparatus of FIG. 1.
FIG. 3 is a block diagram illustrating the depth map stitching unit of FIG. 2.
FIG. 4 is a view for explaining an embodiment of a process of generating a depth map specifier by the depth map adjuster shown in FIG. 2.
FIG. 5 is a view for explaining an embodiment in which the depth map stitching unit projects and integrates depth maps arranged on a three-dimensional virtual space onto one projection sphere.
FIG. 6 is a view showing an embodiment of a process in which the segment labeling module in FIG. 3 determines the depth of overlapping objects through a depth quantization process.
FIG. 7 is a view illustrating an embodiment of a process in which the segment matching module in FIG. 3 re-projects segments of each patch onto a three-dimensional virtual space in order to perform segment matching.
FIG. 8 is a view for explaining an embodiment of a process in which the segment matching module in FIG. 3 performs matching between overlapping segments in an overlapping region between depth maps.
FIG. 9 is a flowchart illustrating a depth map stitching process performed in the depth map stitching apparatus shown in FIG. 2.
The description of the present invention is merely an example for structural or functional explanation, and the scope of the present invention should not be construed as being limited by the embodiments described in the text. That is, the embodiments may be embodied in various forms, and the scope of the present invention should be understood to include equivalents capable of realizing the technical idea. Also, since it is not meant that a specific embodiment should include all of the stated effects or only such effects, the scope of the present invention should not be construed as being limited thereby.
Meanwhile, the meaning of the terms described in the present application should be understood as follows.
The terms "first", "second", and the like are used to distinguish one element from another, and the scope of rights should not be limited by these terms. For example, a first component may be referred to as a second component, and similarly, the second component may also be referred to as the first component.
When an element is referred to as being "connected" to another element, it may be directly connected to the other element, but there may be intervening elements in between. On the other hand, when an element is referred to as being "directly connected" to another element, it should be understood that there are no intervening elements. Other expressions that describe the relationship between components, such as "between" and "immediately between" or "neighboring to" and "directly neighboring to", should be interpreted in the same way.
Singular expressions include plural expressions unless the context clearly dictates otherwise. The terms "include" or "have" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
In each step, identification codes (e.g., a, b, c, etc.) are used for convenience of explanation; the identification codes do not describe the order of the steps, and unless a specific order is explicitly stated, the steps may occur in an order different from the stated order. That is, the steps may occur in the stated order, may be performed substantially concurrently, or may be performed in the reverse order.
The present invention can be embodied as computer-readable code on a computer-readable recording medium, and the computer-readable recording medium includes all kinds of recording devices for storing data that can be read by a computer system. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage, and the like.
All terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs, unless otherwise defined. Commonly used predefined terms should be interpreted to be consistent with their meanings in the context of the related art and cannot be interpreted as having an ideal or overly formal meaning unless explicitly defined in the present application.
FIG. 1 is a view for explaining an overall system for generating a virtual viewpoint image synthesized in real time, including a depth map stitching apparatus according to an embodiment of the present invention.
Referring to FIG. 1, a real-time synthesized virtual viewpoint image generation system 100 may include a camera manager 110, an image handler 130, a depth map stitching apparatus 150, and a scene image generating apparatus 170.
Here, stitching means that the depth maps of the overlapping regions are integrated into one by selecting the optimum depth value among the depth values included in each depth map for the overlapping region between the respective depth maps. Depth map stitching allows a natural connection between the depth maps from each stereo camera and creates a high-resolution depth map for the entire image area.
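In code terms, stitching in this sense amounts to choosing, for each pixel of the overlap, one depth value among the candidates that the individual depth maps supply, and concatenating the non-overlapping parts unchanged. The sketch below uses "nearest valid depth wins" as an illustrative stand-in for the optimum-value selection; it is not the patent's actual selection rule:

```python
def stitch_depth_rows(left, right, overlap):
    """Stitch two horizontally aligned one-row depth maps.

    `overlap` is the number of columns shared by the two maps. Inside
    the overlap, the smaller positive depth (the nearest surface) wins;
    0 means "no measurement". The selection rule is an illustrative
    stand-in for choosing the optimum depth value per pixel.
    """
    merged = list(left[:-overlap])               # left-only region
    for l_val, r_val in zip(left[-overlap:], right[:overlap]):
        candidates = [v for v in (l_val, r_val) if v > 0]
        merged.append(min(candidates) if candidates else 0)
    merged.extend(right[overlap:])               # right-only region
    return merged

left = [1.0, 1.2, 2.0, 0.0]    # last two columns overlap
right = [1.9, 3.0, 4.0, 4.2]   # first two columns overlap
print(stitch_depth_rows(left, right, overlap=2))
# -> [1.0, 1.2, 1.9, 3.0, 4.0, 4.2]
```

Note how the invalid (0.0) measurement in the left map is filled from the right map, which is one way the stitched map becomes more complete than either input.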
A stereo image means a stereoscopic image, and more specifically, an image that emphasizes a stereoscopic effect generated using camera images captured by both right and left lenses in a stereo camera.
The camera manager 110 can capture camera images from a plurality of stereo cameras. Here, the plurality of stereo cameras corresponds to stereo camera equipment for photographing a three-dimensional space, and each of the plurality of stereo cameras is implemented as a three-dimensional stereo camera that acquires stereoscopic images using a pair of cameras corresponding to both eyes, that is, a three-dimensional camera including two photographing lenses and capable of simultaneously capturing two images or videos of the same subject.
FIG. 2 is a block diagram illustrating the depth map stitching apparatus of FIG. 1.
Referring to FIG. 2, the depth map stitching apparatus 150 includes a camera calibration unit 210, a depth map generating unit 230, a depth map adjusting unit 250, a depth map stitching unit 270, and a control unit 290.
FIG. 4 is a diagram for explaining an embodiment of a process of generating a depth map specifier by the depth map adjuster 250 shown in FIG. 2.
In one embodiment, in order to address the fact that sequential processing of the re-projected three-dimensional points is not well suited to the GPU when performing the projection operation, the three-dimensional points can be projected in parallel on the GPU.
FIG. 5 is a view for explaining an embodiment in which the depth map stitching unit projects and integrates depth maps arranged on a three-dimensional virtual space onto one projection sphere.
More specifically, FIG. 5 shows a case in which overlap occurs between the views of two stereo cameras; the shaded area represents the overlapping area between the left and right camera pairs. If an object point is in the overlap region, the object appears in both stereo cameras, and its depth can be measured by each stereo camera.
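Projecting depth samples from several cameras onto one projection sphere, as in FIG. 5, can be sketched as mapping each three-dimensional point to a pair of spherical angles. The plain equirectangular-style mapping below is an assumption for illustration; the patent does not specify this particular projection:

```python
import math

def project_to_sphere(point):
    """Map a 3-D point (x, y, z) to (longitude, latitude) in radians
    on a unit projection sphere centered at the origin."""
    x, y, z = point
    lon = math.atan2(x, z)                 # angle around the vertical axis
    lat = math.atan2(y, math.hypot(x, z))  # elevation angle
    return lon, lat

# A point straight ahead and one 45 degrees to the right, same height.
print(project_to_sphere((0.0, 0.0, 1.0)))   # -> (0.0, 0.0)
print(project_to_sphere((1.0, 0.0, 1.0)))   # lon of roughly 0.785 (pi/4)
```

Points from different cameras that land on the same sphere direction are exactly the candidates among which one stitched depth value must be chosen.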
The overlapping areas occur differently depending on the camera settings in the three-dimensional space. In the case of FIG. 5, if all objects in the scene were theoretically at an infinite distance from the cameras, the overlapping area would occupy the image region bounded by the border lines indicated in the figure.
FIG. 3 is a block diagram illustrating the depth map stitching unit in FIG. 2.
Referring to FIG. 3, the depth map stitching unit 270 includes a segment labeling module 310, a segment optimization module 330, a segment matching module 350, a segment depth measurement module 370, and a control module 390.
FIG. 6 is a diagram illustrating an embodiment of a process in which the segment labeling module in FIG. 3 determines the depth of overlapping objects through a depth quantization process.
FIG. 7 is a diagram illustrating an embodiment of a process in which the segment matching module in FIG. 3 re-projects segments of each patch onto a three-dimensional virtual space in order to perform segment matching.
FIG. 8 is a view for explaining an embodiment of a process in which the segment matching module in FIG. 3 performs matching between overlapping segments in an overlapping region between depth maps.
In one embodiment, the overlapping type of the segments matched by the segment matching module 350 may be detected, and the matched segments may be integrated so as to have a single depth value.
FIG. 9 is a flowchart illustrating a depth map stitching process performed in the depth map stitching apparatus shown in FIG. 2.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the present invention as defined by the following claims.
100: Real-time synthesized virtual point image generation system
110: camera manager 130: image handler
150: depth map stitching apparatus 170: scene image generating apparatus
210: camera calibration unit 230: depth map generating unit
250: depth map adjusting unit 270: depth map stitching unit
290: control unit 310: segment labeling module
330: Segment optimization module 350: Segment matching module
370: Segment Depth Measurement Module 390: Control Module
Claims (8)
A depth map stitching apparatus for stereo images, comprising: a camera calibration unit for correcting physical characteristics of a plurality of stereo cameras;
A depth map generating unit for receiving stereo images from the stereo cameras and generating depth maps;
A depth map adjuster for adjusting each of the depth maps on a virtual space; And
And a depth map stitching unit for stitching the depth maps by determining an object based on a segment in each of the depth maps and calculating a depth of the determined object,
The depth map stitching unit
Stitches the depth maps by matching and integrating the segments in the overlapping area based on the depth of each object in the segments and the difference in RGB color distribution of each corresponding object in the stereo images.
And a segment labeling step of detecting a shape of an object by performing segment labeling on the segments.
And arranging the detected object in a single space to obtain an approximate value of depth to adjust a global depth of the object.
Wherein the local depth of the detected object is calculated by performing a local optimization to obtain a specific depth value for the detail components within an image area of the detected object based on the stereo images.
And detects the overlapping type of the matched segments to consolidate the matched segments so as to have a single depth value.
Wherein, when there is a background area at the stitching boundary or in the overlapping area in which the outline of a specific object is not distinguished, the depth maps are integrated with the average depth value of the background area measured in the depth maps.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170047363A KR101853269B1 (en) | 2017-04-12 | 2017-04-12 | Apparatus of stitching depth maps for stereo images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170047363A KR101853269B1 (en) | 2017-04-12 | 2017-04-12 | Apparatus of stitching depth maps for stereo images |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101853269B1 true KR101853269B1 (en) | 2018-06-14 |
Family
ID=62629273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020170047363A KR101853269B1 (en) | 2017-04-12 | 2017-04-12 | Apparatus of stitching depth maps for stereo images |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101853269B1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102310958B1 (en) * | 2020-08-20 | 2021-10-12 | (주)아고스비전 | Wide viewing angle stereo camera apparatus and depth image processing method using the same |
US11164326B2 (en) | 2018-12-18 | 2021-11-02 | Samsung Electronics Co., Ltd. | Method and apparatus for calculating depth map |
WO2022039404A1 (en) * | 2020-08-20 | 2022-02-24 | (주)아고스비전 | Stereo camera apparatus having wide field of view, and depth image processing method using same |
KR102430273B1 (en) * | 2021-02-22 | 2022-08-09 | (주)아고스비전 | Wide viewing angle stereo camera- based first person vision system and image processing method using the same |
KR102430274B1 * | 2021-02-22 | (주)아고스비전 | Wide viewing angle stereo camera-based people following system and method therefor
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090080556A (en) * | 2006-12-22 | 2009-07-24 | 퀄컴 인코포레이티드 | Complexity-adaptive 2d-to-3d video sequence conversion |
JP2013074473A (en) * | 2011-09-28 | 2013-04-22 | Panasonic Corp | Panorama imaging apparatus |
KR101370785B1 | 2012-11-06 | 2014-03-06 | 한국과학기술원 | Apparatus and method for generating depth map of stereoscopic image
WO2014055239A1 (en) * | 2012-10-01 | 2014-04-10 | Microsoft Corporation | Multi-camera depth imaging |
KR20160086802A | 2016-07-11 | 2016-07-20 | 에스케이플래닛 주식회사 | Apparatus and Method for generating Depth Map, stereo-scopic image conversion apparatus and method using the same
- 2017-04-12: KR application KR1020170047363A, patent KR101853269B1 (en), active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090080556A (en) * | 2006-12-22 | 2009-07-24 | 퀄컴 인코포레이티드 | Complexity-adaptive 2d-to-3d video sequence conversion |
JP2013074473A (en) * | 2011-09-28 | 2013-04-22 | Panasonic Corp | Panorama imaging apparatus |
WO2014055239A1 (en) * | 2012-10-01 | 2014-04-10 | Microsoft Corporation | Multi-camera depth imaging |
KR101370785B1 | 2012-11-06 | 2014-03-06 | 한국과학기술원 | Apparatus and method for generating depth map of stereoscopic image
KR20160086802A | 2016-07-11 | 2016-07-20 | 에스케이플래닛 주식회사 | Apparatus and Method for generating Depth Map, stereo-scopic image conversion apparatus and method using the same
Non-Patent Citations (1)
Title |
---|
Semi-Global Depth Estimation Algorithm for Mobile 3-D Video Applications, Tsinghua Science and Technology, Volume 17, Number 2, April 2012 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11164326B2 (en) | 2018-12-18 | 2021-11-02 | Samsung Electronics Co., Ltd. | Method and apparatus for calculating depth map |
KR102310958B1 (en) * | 2020-08-20 | 2021-10-12 | (주)아고스비전 | Wide viewing angle stereo camera apparatus and depth image processing method using the same |
WO2022039404A1 (en) * | 2020-08-20 | 2022-02-24 | (주)아고스비전 | Stereo camera apparatus having wide field of view, and depth image processing method using same |
KR102430273B1 (en) * | 2021-02-22 | 2022-08-09 | (주)아고스비전 | Wide viewing angle stereo camera- based first person vision system and image processing method using the same |
KR102430274B1 * | 2021-02-22 | (주)아고스비전 | Wide viewing angle stereo camera-based people following system and method therefor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101853269B1 (en) | Apparatus of stitching depth maps for stereo images | |
US10540806B2 (en) | Systems and methods for depth-assisted perspective distortion correction | |
US10609282B2 (en) | Wide-area image acquiring method and apparatus | |
CN106462944B (en) | High-resolution panorama VR generator and method | |
US8941750B2 (en) | Image processing device for generating reconstruction image, image generating method, and storage medium | |
US11568516B2 (en) | Depth-based image stitching for handling parallax | |
EP3668093B1 (en) | Method, system and apparatus for capture of image data for free viewpoint video | |
US11348267B2 (en) | Method and apparatus for generating a three-dimensional model | |
JP2017531976A (en) | System and method for dynamically calibrating an array camera | |
US20140340486A1 (en) | Image processing system, image processing method, and image processing program | |
US20150379720A1 (en) | Methods for converting two-dimensional images into three-dimensional images | |
KR101983586B1 (en) | Method of stitching depth maps for stereo images | |
US9665967B2 (en) | Disparity map generation including reliability estimation | |
KR101933037B1 (en) | Apparatus for reproducing 360 degrees video images for virtual reality | |
US10169891B2 (en) | Producing three-dimensional representation based on images of a person | |
WO2018032841A1 (en) | Method, device and system for drawing three-dimensional image | |
KR20170130150A (en) | Camera rig method for acquiring 3d data, camera rig system performing the same, and storage medium storing the same | |
JP2003179800A (en) | Device for generating multi-viewpoint image, image processor, method and computer program | |
CN107376360B (en) | Game live broadcast method and game live broadcast system | |
JP2013185905A (en) | Information processing apparatus, method, and program | |
Wang et al. | Robust color correction in stereo vision | |
TW202029056A (en) | Disparity estimation from a wide angle image | |
CN114331835A (en) | Panoramic image splicing method and device based on optimal mapping matrix | |
JP2017050857A (en) | Image processor, image processing method and program | |
GB2585197A (en) | Method and system for obtaining depth data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GRNT | Written decision to grant |