KR20130068617A - Apparatus and method for generating texture coordinates using visibility and edge information - Google Patents
Apparatus and method for generating texture coordinates using visibility and edge information
- Publication number
- KR20130068617A (Application: KR1020110135912A)
- Authority
- KR
- South Korea
- Prior art keywords
- visibility
- triangles
- score
- mesh
- texture
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Image Generation (AREA)
Abstract
Description
The present invention relates to an apparatus and method for generating texture coordinates using visibility and edge information, and more particularly, to an apparatus and method for generating the texture coordinates required by a three-dimensional object modeled with computer graphics technology.
Recently, owing to technological advances, 3D graphics have been used in various fields such as broadcasting, games, and movies. To this end, various methods of generating three-dimensional data have been developed, for example from multiple two-dimensional images, from images captured by a stereo camera, or from a depth sensor. The generated 3D data is frequently used in games, broadcasting, and movies, and reconstructed data is also used to modify and process 2D image data such as photographs.
The reconstructed three-dimensional data consists of vertices with three-dimensional coordinates and triangles connecting them. Point sets are often converted into triangle meshes for fast visualization and for use in other applications; such conversion is a traditional graphics problem, and marching cubes is a representative algorithm. To represent surface detail, each three-dimensional triangle must carry not only vertex coordinates but also texture coordinates. Assigning two-dimensional texture coordinates to each vertex of the three-dimensional triangles therefore requires solving a space-allocation problem.
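As a minimal illustration of the per-triangle part of this problem, the sketch below (not taken from the patent; a standard isometric flattening under an orthonormal in-plane basis) maps one 3D triangle into 2D plane coordinates, preserving its edge lengths:

```python
import math

def project_triangle(p0, p1, p2):
    """Flatten one 3D triangle into 2D while preserving edge lengths.

    Builds an orthonormal basis (u_hat, v_hat) in the triangle's plane.
    This is only the per-triangle step of texture-coordinate assignment;
    the packing and adjacency problems remain to be solved afterwards.
    """
    def sub(a, b):  return tuple(x - y for x, y in zip(a, b))
    def dot(a, b):  return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    def norm(a):    return math.sqrt(dot(a, a))

    u = sub(p1, p0)
    u_len = norm(u)
    u_hat = tuple(x / u_len for x in u)          # first in-plane axis
    n = cross(u, sub(p2, p0))                    # triangle normal
    n_hat = tuple(x / norm(n) for x in n)
    v_hat = cross(n_hat, u_hat)                  # second in-plane axis
    q = sub(p2, p0)
    return (0.0, 0.0), (u_len, 0.0), (dot(q, u_hat), dot(q, v_hat))

t0, t1, t2 = project_triangle((0, 0, 1), (1, 0, 1), (0, 1, 1))
```

For this axis-aligned example the triangle lands at (0, 0), (1, 0), (0, 1) in its own plane coordinates.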
Triangles projected from 3D into 2D texture space must satisfy two conditions: (1) the triangles must not overlap in 2D texture space, and (2) triangles that are adjacent in 3D space should remain adjacent in 2D space as far as possible.
The first condition, that triangles must not overlap in two-dimensional texture space, follows because the surface information of each triangle must be unique: two triangles cannot occupy the same region of texture space.
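One ingredient of an overlap check is a point-in-triangle test in texture space; the barycentric sign test below is a generic sketch, not the patent's method, and a complete triangle-triangle overlap test would also need edge-edge intersection checks:

```python
def tri_sign(p, q, r):
    # Sign of the signed area of (p, q, r): tells which side of line pq r is on.
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def point_in_triangle(pt, a, b, c):
    """True if 2D point pt lies inside (or on the border of) triangle abc.

    pt is inside iff it lies on the same side of all three directed edges,
    i.e. the three signs are not mixed.
    """
    d1 = tri_sign(a, b, pt)
    d2 = tri_sign(b, c, pt)
    d3 = tri_sign(c, a, pt)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)
```

If any vertex of one texture-space triangle falls inside another, the uniqueness condition is violated.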
The second condition, that triangles adjacent in three-dimensional space should remain adjacent in two-dimensional space as far as possible, arises because when 3D-adjacent triangles are separated in 2D space, holes or cracks appear in the process of obtaining texture information, degrading the quality of the final image produced when the three-dimensional object is visualized.
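The adjacency bookkeeping behind this condition can be sketched by mapping each undirected edge to the triangles that share it; this is a generic mesh helper, not taken from the patent:

```python
from collections import defaultdict

def shared_edges(triangles):
    """Map each undirected edge (vi, vj) to the list of triangles using it.

    An edge owned by two triangles is interior to the mesh; an edge with a
    single owner lies on the mesh boundary.  When a 2D layout separates the
    two owners of an interior edge, the seam can show up as a hole or crack.
    """
    owners = defaultdict(list)
    for t, (i, j, k) in enumerate(triangles):
        for e in ((i, j), (j, k), (k, i)):
            owners[tuple(sorted(e))].append(t)
    return dict(owners)

tris = [(0, 1, 2), (1, 3, 2)]      # two triangles sharing edge (1, 2)
owners = shared_edges(tris)
```

Here the interior edge (1, 2) is owned by both triangles, while the remaining edges each have a single owner.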
However, because some conventional techniques permit overlap when determining the texture coordinates of the three-dimensional triangles that make up the reconstructed data, triangles projected from 3D into 2D texture space may overlap, violating the first condition.
In addition, conventional techniques that handle triangles individually do not satisfy the condition that 3D-adjacent triangles remain adjacent in 2D space at all.
Other conventional techniques view the triangles as an interconnected mesh and allocate coordinates by dividing the triangles into groups around feature points where the mesh shape changes sharply; these partially satisfy the condition that 3D-adjacent triangles remain adjacent in 2D space.
However, because such techniques consider only the shape of the mesh when grouping triangles around feature points, without considering human perception, holes and gaps may still occur, degrading the quality of the final image.
The present invention has been proposed to solve the problems described above. An object of the present invention is to provide an apparatus and method that generate texture coordinates using visibility and edge (boundary) information. In particular, the invention generates texture coordinates by taking into account regions that are less visible and regions carrying edge information, so that fewer humanly perceptible holes and gaps are produced.
To achieve this object, a texture coordinate generating apparatus using visibility and edge information according to an embodiment of the present invention includes: a preprocessor for converting three-dimensional data expressed as triangles into a mesh; a visibility checker for calculating a final score for each edge of the triangles through visibility tests on the triangles constituting the converted mesh; and a texture coordinate generator for calculating texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
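The three units can be sketched as a pipeline. The stage bodies below are stand-ins (a placeholder edge score of 0.5 and dummy per-vertex UVs), since the patent text does not give their formulas:

```python
def preprocess(triangles):
    # Preprocessor (120) stand-in: index shared vertex positions so a bag
    # of independent triangles becomes one connected mesh.
    index, faces = {}, []
    for tri in triangles:
        faces.append(tuple(index.setdefault(v, len(index)) for v in tri))
    return list(index), faces

def check_visibility(faces):
    # Visibility checker (140) stand-in: the patent derives a final score
    # per edge; here every edge just receives a placeholder score of 0.5.
    return {tuple(sorted(e)): 0.5
            for i, j, k in faces
            for e in ((i, j), (j, k), (k, i))}

def generate_texture_coords(raw_triangles):
    # Apparatus (100) stand-in pipeline: preprocess -> score -> lay out.
    verts, faces = preprocess(raw_triangles)
    edge_scores = check_visibility(faces)
    # Texture coordinate generator (160) stand-in: one dummy UV per vertex.
    uvs = {v: (float(v), 0.0) for v in range(len(verts))}
    return faces, edge_scores, uvs

tris = [((0, 0, 0), (1, 0, 0), (0, 1, 0)),
        ((1, 0, 0), (1, 1, 0), (0, 1, 0))]
faces, edge_scores, uvs = generate_texture_coords(tris)
```

The two input triangles share two vertex positions, so the preprocessor yields a four-vertex mesh with a common interior edge.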
The visibility checker calculates a score for each triangle constituting the mesh and derives an edge score from the calculated triangle scores.
The visibility checker also calculates scores for all edges based on the boundary information of the triangles constituting the mesh.
The visibility checker then calculates the final score of each edge by combining the visibility-test score with the boundary-information score.
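The patent states that the two per-edge cues are combined into one final score but gives no formula; a weighted sum, with the weight `w_vis` as an assumption, is one plausible sketch:

```python
def final_edge_score(visibility_score, boundary_score, w_vis=0.5):
    """Blend the visibility-test score and the boundary-information score.

    The weighted-sum form and the default weight w_vis = 0.5 are
    assumptions; the patent only says the two scores are combined.
    """
    return w_vis * visibility_score + (1.0 - w_vis) * boundary_score

score = final_edge_score(1.0, 0.0)   # equal weighting of the two cues
```

Raising `w_vis` makes low-visibility regions dominate the decision; lowering it emphasizes edges that coincide with image boundaries.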
The texture coordinate generator calculates the texture coordinates of the triangles based on a set of flags generated by comparing each edge's final score with a set value.
To achieve this object, a texture coordinate generation method using visibility and edge information according to an embodiment of the present invention includes: converting, by a preprocessor, three-dimensional data expressed as triangles into a mesh; calculating, by a visibility checker, a final score for each edge of the triangles through visibility tests on the triangles constituting the converted mesh; and calculating, by a texture coordinate generator, texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
Calculating the final score may include calculating, by the visibility checker, a score for each triangle constituting the mesh, and deriving an edge score from the calculated triangle scores.
Calculating the final score also includes calculating, by the visibility checker, scores for all edges based on the boundary information of the triangles constituting the mesh.
Calculating the final score further includes combining, by the visibility checker, the visibility-test score and the boundary-information score into the final score of each edge.
In calculating the texture coordinates, the texture coordinate generator calculates the texture coordinates of the triangles based on a set of flags generated by comparing each edge's final score with a set value.
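The comparison step can be sketched as thresholding each final edge score into a flag; the comparison direction and the set value 0.5 below are assumptions, since the patent text does not give them:

```python
def flag_edges(edge_scores, set_value=0.5):
    """Compare each edge's final score with a set value and record a flag.

    Mirrors the edge-flag test in the flowchart description (flag 1 vs. 0);
    both the >= direction and the default set value are assumptions.
    """
    return {edge: int(score >= set_value)
            for edge, score in edge_scores.items()}

flags = flag_edges({(0, 1): 0.9, (1, 2): 0.1})
```

Downstream, the layout step can then treat flagged and unflagged edges differently when placing triangles in texture space.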
According to the present invention, the apparatus and method for generating texture coordinates using visibility and edge information calculate texture coordinates in consideration of both cues, and can therefore improve final image quality compared with mesh segmentation and texture coordinate generation algorithms that consider only mesh characteristics. That is, by generating texture coordinates with attention to places a human cannot see well (low visibility) and places where an original boundary exists in the image (edge information), the invention produces fewer perceptible holes and gaps in the final rendered image than algorithms that consider only mesh characteristics.
FIG. 1 is a block diagram illustrating an apparatus for generating texture coordinates using visibility and edge information according to an embodiment of the present invention.
FIG. 2 is a view for explaining the preprocessor of FIG. 1.
FIG. 3 is a view for explaining the visibility checker of FIG. 1.
FIG. 4 is a flowchart illustrating a texture coordinate generation method using visibility and edge information according to an embodiment of the present invention.
FIG. 5 is a flowchart for explaining the mesh generation step of FIG. 4.
FIG. 6 is a flowchart for explaining the final score calculation step of FIG. 4.
FIG. 7 is a flowchart for explaining the texture coordinate calculation step of FIG. 4.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS. Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that a person skilled in the art can easily carry out the technical idea of the present invention. In the drawings, the same reference numerals designate the same or similar components throughout. In the following description, detailed descriptions of known functions and configurations are omitted when they would obscure the subject matter of the present invention.
Hereinafter, an apparatus for generating texture coordinates using visibility and edge information according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 1 is a block diagram of the apparatus; FIG. 2 illustrates the preprocessor of FIG. 1; and FIG. 3 illustrates the visibility checker of FIG. 1.
As illustrated in FIG. 1, the texture coordinate generating apparatus 100 includes a preprocessor 120, a visibility checker 140, and a texture coordinate generator 160. The preprocessor 120 converts the three-dimensional data expressed as triangles into a mesh. The visibility checker 140 calculates a final score for each edge of the triangles through visibility tests on the triangles constituting the converted mesh. The texture coordinate generator 160 calculates texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
Hereinafter, a texture coordinate generation method using visibility and edge information according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 4 is a flowchart of the method; FIG. 5 details the mesh generation step of FIG. 4, FIG. 6 the final score calculation step of FIG. 4, and FIG. 7 the texture coordinate calculation step of FIG. 4.
As shown in FIG. 4, the preprocessor 120 first converts the three-dimensional data expressed as triangles into a mesh (FIG. 5). The visibility checker 140 then calculates the final score of each edge (FIG. 6): it scores each triangle constituting the mesh through visibility tests, derives edge scores from the triangle scores, computes scores for all edges from the boundary information, and combines the visibility-test score and the boundary-information score into a final score per edge. Finally, the texture coordinate generator 160 calculates the texture coordinates of the triangles based on the final edge scores (FIG. 7): it compares each edge's final score with a set value to set an edge flag, and branches on whether the edge flag is set to 1 (S330; YES) when laying out the triangles in texture space.
As described above, the apparatus and method for generating texture coordinates using visibility and edge information calculate texture coordinates in consideration of both cues, and can therefore improve final image quality compared with mesh segmentation and texture coordinate generation algorithms that consider only mesh characteristics. That is, by generating texture coordinates with attention to places a human cannot see well (low visibility) and places where an original boundary exists in the image (edge information), the invention produces fewer perceptible holes and gaps in the final rendered image than algorithms that consider only mesh characteristics.
While the present invention has been described with reference to what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments; many variations and modifications may be made without departing from the scope of the present invention.
100: texture coordinate generator 120: preprocessor
140: visibility inspection unit 160: texture coordinate generation unit
Claims (1)
A texture coordinate generating apparatus comprising:
a visibility checker configured to calculate a final score for each edge of the triangles through visibility tests on triangles constituting a converted mesh; and
a texture coordinate generator configured to calculate texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110135912A KR20130068617A (en) | 2011-12-15 | 2011-12-15 | Apparatus and method for generating texture coordinates using visibility and edge information |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20130068617A true KR20130068617A (en) | 2013-06-26 |
Family
ID=48864207
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020110135912A KR20130068617A (en) | 2011-12-15 | 2011-12-15 | Apparatus and method for generating texture coordinates using visibility and edge information |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20130068617A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10674983B2 (en) | 2013-09-25 | 2020-06-09 | Richard R. Black | Patient-specific analysis of positron emission tomography data |
- 2011-12-15: KR application KR1020110135912A filed (patent/KR20130068617A/en), status: not_active, Application Discontinuation
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |