KR20130068617A - Apparatus and method for generating texture coordinates using visibility and edge information - Google Patents

Apparatus and method for generating texture coordinates using visibility and edge information

Info

Publication number
KR20130068617A
Authority
KR
South Korea
Prior art keywords
visibility
triangles
score
mesh
texture
Prior art date
Application number
KR1020110135912A
Other languages
Korean (ko)
Inventor
김도형
구본기
이지형
최윤석
박정철
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to KR1020110135912A priority Critical patent/KR20130068617A/en
Publication of KR20130068617A publication Critical patent/KR20130068617A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/20 Finite element generation, e.g. wire-frame surface description, tessellation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection

Abstract

PURPOSE: An apparatus and method for generating texture coordinates using visibility and boundary information are provided. By generating texture coordinates with low-visibility regions and boundary information taken into account, fewer holes and gaps recognizable by a human are produced in the final image result. CONSTITUTION: A pre-processing unit (120) converts 3D data that has been converted into triangles into a mesh. A visibility test unit (140) calculates the final scores of the triangles' edges through a visibility test on the triangles. A texture coordinate generation unit (160) calculates the texture coordinates of the triangles based on the calculated final scores. [Reference numerals] (120) Pre-processing unit; (140) Visibility test unit; (160) Texture coordinate generation unit

Description

Apparatus and method for generating texture coordinates using visibility and boundary information {APPARATUS AND METHOD FOR GENERATING TEXTURE COORDINATES USING VISIBILITY AND EDGE INFORMATION}
The present invention relates to an apparatus and method for generating texture coordinates using visibility and boundary information, and more particularly, to an apparatus and method that generate the texture coordinates required by a 3D object modeled with computer graphics technology.
Recently, thanks to technological advances, 3D graphics are used in various fields such as broadcasting, games, and movies. To this end, various methods of generating three-dimensional data have been developed, for example from multiple two-dimensional images, images captured by a stereo camera, or a depth sensor. The generated 3D data is frequently used in games, broadcasting, and movies, and the reconstructed data is also used to modify and process 2D image data such as photographs.
The reconstructed three-dimensional data consists of vertices with three-dimensional coordinates or of triangles. Vertices are often converted into triangles for fast visualization and for use in other applications; such conversion is a traditional topic in computer graphics, with marching cubes as a representative algorithm. A three-dimensional triangle must carry not only vertex coordinates but also texture coordinates to represent the details of its surface, and assigning two-dimensional texture coordinates to each vertex of the three-dimensional triangles requires solving a space-allocation problem.
Triangles projected from 3D space into 2D texture space must satisfy two conditions: the triangles must not overlap in the 2D texture space, and triangles adjacent in 3D space should, as far as possible, remain adjacent in 2D space.
The first condition, that triangles must not overlap in two-dimensional texture space, follows from the fact that the surface information of each triangle must be unique, so two triangles cannot occupy the same region of texture space.
The second condition, that triangles adjacent in three-dimensional space should remain adjacent in two-dimensional space as far as possible, matters because when adjacent 3D triangles are not adjacent in 2D, holes or cracks appear in the process of obtaining texture information, degrading the quality of the final image produced when the three-dimensional object is visualized.
However, since conventional techniques allow overlap when determining the texture coordinates of the three-dimensional triangles constituting the restored 3D data, triangles projected from 3D into 2D texture space can overlap there, failing to satisfy the first condition.
In addition, some conventional techniques handle triangles individually, so triangles adjacent in three-dimensional space do not satisfy the adjacency condition in two-dimensional space at all.
Other conventional techniques view the triangles as an interconnected mesh and allocate coordinates by dividing the triangles into groups around feature points where the shape of the mesh changes significantly; these partially satisfy the condition that triangles adjacent in 3D space remain adjacent in 2D space.
However, because such techniques split the mesh only around feature points where its shape changes drastically, they satisfy the second condition while considering only the shape of the mesh and not human perception. As a result, holes and gaps may still occur, degrading the quality of the final image.
The present invention has been proposed to solve the problems described above. Its object is to provide an apparatus and method for generating texture coordinates using visibility and boundary information. In other words, the present invention aims to generate texture coordinates that take into account poorly visible regions and boundary information, so that fewer holes and gaps recognizable by a human appear in the result.
To achieve this object, a texture coordinate generating apparatus using visibility and boundary information according to an embodiment of the present invention includes: a pre-processing unit for converting three-dimensional data that has been converted into triangles into a mesh; a visibility test unit configured to calculate a final score for each edge of the triangles through visibility tests on the triangles constituting the converted mesh; and a texture coordinate generation unit configured to calculate the texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
The visibility test unit calculates a score for each of the triangles constituting the mesh, and calculates an edge score from the calculated triangle scores.
The visibility test unit also calculates scores for all edges based on the boundary information of the triangles constituting the mesh.
The visibility test unit calculates the final score of each edge from the visibility-test score and the boundary-information score.
The texture coordinate generation unit calculates the texture coordinates of the triangles based on a feature set generated by comparing each edge's final score with a set value.
To achieve the above object, a texture coordinate generation method using visibility and boundary information according to an embodiment of the present invention includes: converting, by the pre-processing unit, three-dimensional data that has been converted into triangles into a mesh; calculating, by the visibility test unit, a final score for each edge of the triangles through visibility tests on the triangles constituting the converted mesh; and calculating, by the texture coordinate generation unit, the texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
The calculating of the final score may include calculating, by the visibility test unit, a score for each of the triangles constituting the mesh, and calculating an edge score from the calculated triangle scores.
The calculating of the final score also includes calculating, by the visibility test unit, the scores of all edges based on the boundary information of the triangles constituting the mesh.
The calculating of the final score further includes calculating, by the visibility test unit, the final score of each edge from the visibility-test score and the boundary-information score.
In the calculating of the texture coordinates, the texture coordinate generation unit calculates the texture coordinates of the triangles based on a feature set generated by comparing each edge's final score with a set value.
According to the present invention, the apparatus and method for generating texture coordinates using visibility and boundary information calculate texture coordinates with both visibility and boundary information taken into account, improving final image quality compared with mesh segmentation and texture coordinate generation algorithms that consider only mesh characteristics. That is, by generating texture coordinates that account for places a human cannot see well (low visibility) and places where a boundary exists in the original image (boundary information), the invention produces fewer human-perceivable holes and gaps in the final image output.
FIG. 1 is a block diagram illustrating an apparatus for generating texture coordinates using visibility and boundary information according to an embodiment of the present invention.
FIG. 2 is a view for explaining the pre-processing unit of FIG. 1.
FIG. 3 is a view for explaining the visibility test unit of FIG. 1.
FIG. 4 is a flowchart illustrating a texture coordinate generation method using visibility and boundary information according to an embodiment of the present invention.
FIG. 5 is a flowchart for explaining the mesh generation step of FIG. 4.
FIG. 6 is a flowchart for explaining the final score calculation step of FIG. 4.
FIG. 7 is a flowchart for explaining the texture coordinate calculation step of FIG. 4.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings, so that a person skilled in the art can easily carry out the technical idea of the present invention. In the drawings, the same reference numerals designate the same or similar components throughout. Detailed descriptions of known functions and configurations are omitted where they would obscure the subject matter of the present invention.
Hereinafter, an apparatus for generating texture coordinates using visibility and boundary information according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 1 is a block diagram illustrating an apparatus for generating texture coordinates using visibility and boundary information according to an embodiment of the present invention. FIG. 2 is a view for explaining the pre-processing unit of FIG. 1, and FIG. 3 is a view for explaining the visibility test unit of FIG. 1.
As illustrated in FIG. 1, the texture coordinate generating apparatus 100 using visibility and boundary information includes a pre-processing unit 120, a visibility test unit 140, and a texture coordinate generation unit 160.
The pre-processing unit 120 converts the input 3D data into triangles; that is, it converts three-dimensional data consisting of vertices into triangles. When the input 3D data has already been converted into triangles, the pre-processing unit 120 may skip this conversion.
The pre-processing unit 120 then converts the triangulated 3D data into a mesh; that is, it removes the duplicated edges of the three-dimensional triangles so that the triangles of the mesh share vertices and edges. The pre-processing unit 120 generates a normalized mesh by removing holes and gaps from the converted mesh. The normalized mesh is formed as shown in FIG. 2.
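The pre-processing step above can be sketched in code. The following is an illustrative sketch, not the patented implementation: it welds a "triangle soup" (each triangle carrying its own three points) into a shared-vertex mesh by merging coincident vertices, after which interior edges are shared by exactly two triangles. The function name `weld_mesh` and the quantization tolerance are assumptions for illustration.

```python
from collections import defaultdict

def weld_mesh(triangles, tol=1e-6):
    """Merge coincident vertices of a triangle soup so that the
    triangles share vertices and edges, as the pre-processing unit
    requires before the visibility test."""
    verts, index, faces = [], {}, []
    for tri in triangles:
        face = []
        for p in tri:
            key = tuple(round(c / tol) for c in p)  # quantize to merge duplicates
            if key not in index:
                index[key] = len(verts)
                verts.append(p)
            face.append(index[key])
        faces.append(tuple(face))
    # count how many faces use each undirected edge;
    # a count of 2 shows the triangles now share that edge
    edge_count = defaultdict(int)
    for a, b, c in faces:
        for e in ((a, b), (b, c), (c, a)):
            edge_count[tuple(sorted(e))] += 1
    return verts, faces, edge_count

# two triangles that share one edge
soup = [[(0, 0, 0), (1, 0, 0), (0, 1, 0)],
        [(1, 0, 0), (1, 1, 0), (0, 1, 0)]]
verts, faces, edges = weld_mesh(soup)
```

After welding, the six input points collapse to four shared vertices, and the diagonal edge is referenced by both faces.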
The visibility test unit 140 performs a visibility test on the triangles constituting the mesh generated by the pre-processing unit 120. To this end, as shown in FIG. 3, the visibility test unit 140 sets up one or more virtual cameras, where a virtual camera means a camera used to render an image.
The visibility test unit 140 performs the visibility test on the triangles constituting the mesh and calculates a score for each triangle. The test is performed for every triangle against every virtual camera. The visibility test unit 140 calculates the score of each triangle using Equation 1 below, where the interior angle between the triangle's normal vector and the gaze vector of the virtual camera is a value between 0 and 180 degrees.
[Equation 1: rendered as an image in the original publication and not reproduced here]
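Because Equation 1 appears only as an image in the original publication, its exact form is not available here. A plausible sketch, consistent with the stated use of the angle between the triangle normal and the camera gaze vector, is to score a triangle by how directly it faces its best camera (1.0 when fully facing, 0.0 when edge-on or back-facing). The function `triangle_score` and this particular formula are assumptions, not the patent's equation.

```python
import math

def triangle_score(normal, gaze_vectors):
    """Per-triangle visibility score: take, over all virtual cameras,
    the best clamped negative cosine of the angle between the triangle's
    outward normal and the camera gaze vector (an assumed stand-in for
    the patent's Equation 1)."""
    def unit(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)
    n = unit(normal)
    best = 0.0
    for g in gaze_vectors:
        g = unit(g)
        cos_a = sum(a * b for a, b in zip(n, g))  # angle lies in [0, 180] deg
        best = max(best, max(0.0, -cos_a))        # 1.0 when fully facing the camera
    return best

# a triangle facing the camera (normal +z, camera looking along -z) vs. one edge-on
front = triangle_score((0, 0, 1), [(0, 0, -1)])
side = triangle_score((1, 0, 0), [(0, 0, -1)])
```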
The visibility test unit 140 then calculates the score of each edge from the triangle scores: the average score of the triangles sharing an edge becomes that edge's score, and the edge scores are then normalized so that each lies between 0 and 1.
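The edge-score averaging and normalization just described can be sketched as follows; the min-max normalization is one reasonable reading of "normalized to a score of 0 or more and 1 or less," not necessarily the patent's exact scheme.

```python
def edge_scores(faces, tri_scores):
    """Score each mesh edge as the mean score of the triangles sharing
    it, then min-max normalize all edge scores into [0, 1]."""
    sums, counts = {}, {}
    for face, s in zip(faces, tri_scores):
        a, b, c = face
        for e in ((a, b), (b, c), (c, a)):
            e = tuple(sorted(e))
            sums[e] = sums.get(e, 0.0) + s
            counts[e] = counts.get(e, 0) + 1
    raw = {e: sums[e] / counts[e] for e in sums}
    lo, hi = min(raw.values()), max(raw.values())
    span = (hi - lo) or 1.0  # avoid division by zero when all scores agree
    return {e: (v - lo) / span for e, v in raw.items()}

# two triangles sharing edge (1, 2); triangle scores 0.2 and 0.8
scores = edge_scores([(0, 1, 2), (1, 3, 2)], [0.2, 0.8])
```

The shared edge averages the two triangle scores and lands mid-range; edges belonging only to the low- or high-scoring triangle normalize to 0 and 1.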
The visibility test unit 140 also calculates scores for all edges based on the boundary information of the triangles. That is, the visibility test unit 140 renders the 3D scene from each virtual camera and performs edge detection on the rendered image. For the triangles facing the camera (as judged by the inner product of the camera's gaze vector and the triangle's normal vector), it calculates, for each triangle edge, the distance between the edge's vertices and the boundary lines detected in the image. The visibility test unit 140 then sets each edge's score to 1 minus its normalized distance value, computing this via Equation 2 below for every edge. The resulting boundary-information score of an edge lies in the range from 0 up to, but not including, 1; since the raw distance value grows for edges far from a boundary, and edges close to a boundary should be selected, the normalized value is subtracted from 1 as the final step.
[Equation 2: rendered as an image in the original publication and not reproduced here]
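Equation 2 is likewise only an image in the original publication. Assuming the per-edge image distances to the nearest detected boundary have already been measured, the "normalize, then subtract from 1" step reads as below; the function name and the min-max normalization are assumptions for illustration.

```python
def boundary_edge_scores(edge_distances):
    """Boundary-information score per edge: normalize the distances
    (in the rendered image) to the nearest detected boundary into
    [0, 1], then subtract from 1 so edges lying on a boundary score
    highest."""
    lo, hi = min(edge_distances.values()), max(edge_distances.values())
    span = (hi - lo) or 1.0
    return {e: 1.0 - (d - lo) / span for e, d in edge_distances.items()}

# edge "A" sits on a detected boundary, edge "C" is farthest from any boundary
b_scores = boundary_edge_scores({"A": 0.0, "B": 5.0, "C": 20.0})
```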
The visibility test unit 140 calculates the final score of each edge from the visibility-test score and the boundary-information score. That is, as shown in Equation 3 below, the visibility test unit 140 multiplies each of the two scores by a weight and sums the results to obtain the final score of each edge. The weights W0 and W1 are received in advance from the user.
[Equation 3: rendered as an image in the original publication and not reproduced here]
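The weighted combination of Equation 3 is straightforward to sketch. The weight values below are demo defaults only; the patent specifies that W0 and W1 come from the user, not what they should be.

```python
def final_edge_scores(vis, bnd, w0, w1):
    """Final per-edge score: a user-weighted sum of the visibility
    score and the boundary-information score (per Equation 3)."""
    return {e: w0 * vis[e] + w1 * bnd[e] for e in vis}

# hypothetical edges "A" and "B" with user-chosen weights 0.7 / 0.3
final = final_edge_scores({"A": 1.0, "B": 0.2},
                          {"A": 0.4, "B": 0.8},
                          w0=0.7, w1=0.3)
```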
The texture coordinate generation unit 160 calculates the texture coordinates of the triangles constituting the mesh based on the scores calculated by the visibility test unit 140. First, it sets a flag for each edge based on the edge's final score: for every edge, the flag is set to '1' if the final score exceeds a set value (a threshold input by the user) and to '0' if the final score is less than or equal to the set value.
The texture coordinate generation unit 160 registers the edges whose flag is set to '1' in a feature set. It then performs mesh segmentation based on the feature set and parameterizes the divided sub-meshes (charts, or an atlas) to convert them into two-dimensional coordinates.
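The flagging and feature-set registration steps can be sketched as follows; `feature_edges` is a hypothetical helper name, and the threshold value in the demo is arbitrary.

```python
def feature_edges(final_scores, threshold):
    """Flag each edge '1' when its final score exceeds the user-given
    threshold ('0' otherwise) and register the flagged edges in the
    feature set used to cut the mesh into charts."""
    flags = {e: 1 if s > threshold else 0 for e, s in final_scores.items()}
    feature_set = [e for e, f in flags.items() if f == 1]
    return flags, feature_set

flags, feats = feature_edges({"A": 0.82, "B": 0.38, "C": 0.5}, threshold=0.5)
```

Note that a score exactly equal to the threshold is flagged '0', matching the "less than or equal to the set value" rule in the text.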
The texture coordinate generation unit 160 arranges the converted charts in the texture space so that they do not overlap. This is the classic packing problem, which can be viewed as placing rectangles of different sizes inside a larger rectangle. In this way, the texture coordinate generation unit 160 calculates the texture coordinates of the triangles constituting the mesh.
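The patent does not prescribe a specific packing algorithm, only that the charts must not overlap in texture space. One common heuristic for this rectangle-packing formulation is first-fit "shelf" packing, sketched below over the charts' bounding rectangles; this is an illustrative choice, not the patented method.

```python
def shelf_pack(rects, atlas_width):
    """Place (width, height) rectangles left to right along shelves of
    a fixed-width atlas, opening a new shelf when a rectangle would
    overflow the current row. Returns placements and total height used."""
    x = y = shelf_h = 0
    placed = []
    for w, h in rects:
        if x + w > atlas_width:   # current shelf full: start a new one
            y += shelf_h
            x = shelf_h = 0
        placed.append((x, y, w, h))
        x += w
        shelf_h = max(shelf_h, h)  # shelf height = tallest rectangle on it
    return placed, y + shelf_h

# three chart bounding boxes packed into an atlas 8 units wide
positions, height = shelf_pack([(4, 2), (3, 3), (5, 1)], atlas_width=8)
```

Shelf packing wastes some space above short rectangles but guarantees no overlap, which is the only hard constraint stated in the text.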
Hereinafter, a texture coordinate generation method using visibility and boundary information according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 4 is a flowchart illustrating a texture coordinate generation method using visibility and boundary information according to an embodiment of the present invention. FIG. 5 is a flowchart for describing the mesh generation step of FIG. 4, FIG. 6 is a flowchart for explaining the final score calculation step of FIG. 4, and FIG. 7 is a flowchart for explaining the texture coordinate calculation step of FIG. 4.
The pre-processing unit 120 generates a mesh by pre-processing the input 3D data (S100). This is described in more detail with reference to FIG. 5.
First, the pre-processing unit 120 converts the input 3D data into triangles (S120). When the input 3D data has already been converted into triangles, this conversion is skipped.
The pre-processing unit 120 converts the triangulated 3D data into a mesh (S140), removing the duplicated edges of the three-dimensional triangles so that the triangles of the mesh share vertices and edges. The pre-processing unit 120 then generates a normalized mesh by removing holes and gaps from the converted mesh.
The visibility test unit 140 calculates a final score for each edge by performing the visibility test on the triangles constituting the mesh (S200). This is described in more detail with reference to FIG. 6.
First, the visibility test unit 140 sets up one or more virtual cameras for rendering an image in order to perform visibility tests on the triangles constituting the mesh, and calculates a score for each triangle by performing the visibility test (S220). The test is performed for every triangle against every virtual camera.
The visibility test unit 140 calculates the score of each edge from the triangle scores (S240): the average score of the triangles sharing an edge becomes that edge's score, and the edge scores are then normalized so that each lies between 0 and 1.
The visibility test unit 140 calculates scores for all edges based on the boundary information of the triangles (S260). The visibility test unit 140 renders the 3D scene from each virtual camera and performs edge detection on the rendered image. For the triangles facing the camera (as judged by the inner product of the camera's gaze vector and the triangle's normal vector), it calculates, for each triangle edge, the distance between the edge's vertices and the boundary lines detected in the image. The visibility test unit 140 then sets each edge's score to 1 minus its normalized distance value, computing this for every edge. The resulting boundary-information score of an edge lies in the range from 0 up to, but not including, 1; since the raw distance value grows for edges far from a boundary, and edges close to a boundary should be selected, the normalized value is subtracted from 1 as the final step.
The visibility test unit 140 calculates the final score of each edge from the visibility-test score and the boundary-information score (S280): it multiplies each of the two scores by its weight and sums the results to obtain the final score of each edge.
The texture coordinate generation unit 160 calculates the texture coordinates of the triangles constituting the mesh based on the calculated final score of each edge (S300). This is described in more detail with reference to FIG. 7.
First, the texture coordinate generation unit 160 sets a flag for every edge based on the final scores calculated by the visibility test unit 140 (S330): the flag is set to '1' when an edge's final score exceeds the set value and to '0' when it is less than or equal to the set value.
If an edge's flag is set to '1' (S330; YES), the texture coordinate generation unit 160 registers that edge in the feature set (S350).
The texture coordinate generation unit 160 performs mesh segmentation based on the feature set and parameterizes the divided sub-meshes to convert them into two-dimensional coordinates (S370).
The texture coordinate generation unit 160 arranges the converted charts in the texture space so that they do not overlap; this is the classic packing problem, which can be viewed as placing rectangles of different sizes inside a larger rectangle. In this way, the texture coordinate generation unit 160 calculates the texture coordinates of the triangles constituting the mesh (S390).
As described above, the apparatus and method for generating texture coordinates using visibility and boundary information calculate texture coordinates with both visibility and boundary information taken into account, improving final image quality compared with mesh segmentation and texture coordinate generation algorithms that consider only mesh characteristics. That is, by generating texture coordinates that account for places a human cannot see well (low visibility) and places where a boundary exists in the original image (boundary information), fewer human-perceivable holes and gaps are generated in the final image output.
While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, and that many variations and modifications may be made without departing from the scope of the present invention.
100: texture coordinate generating apparatus 120: pre-processing unit
140: visibility test unit 160: texture coordinate generation unit

Claims (1)

  1. A pre-processing unit converting 3D data that has been converted into triangles into a mesh;
    a visibility test unit configured to calculate a final score for each edge of the triangles through visibility tests on the triangles constituting the converted mesh; and
    a texture coordinate generation unit for calculating the texture coordinates of the triangles constituting the mesh based on the calculated final edge scores.
KR1020110135912A 2011-12-15 2011-12-15 Apparatus and method for generating texture coordinates using visibility and edge information KR20130068617A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110135912A KR20130068617A (en) 2011-12-15 2011-12-15 Apparatus and method for generating texture coordinates using visibility and edge information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110135912A KR20130068617A (en) 2011-12-15 2011-12-15 Apparatus and method for generating texture coordinates using visibility and edge information

Publications (1)

Publication Number Publication Date
KR20130068617A true KR20130068617A (en) 2013-06-26

Family

ID=48864207

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110135912A KR20130068617A (en) 2011-12-15 2011-12-15 Apparatus and method for generating texture coordinates using visibility and edge information

Country Status (1)

Country Link
KR (1) KR20130068617A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10674983B2 (en) 2013-09-25 2020-06-09 Richard R. Black Patient-specific analysis of positron emission tomography data


Similar Documents

Publication Publication Date Title
JP6143747B2 (en) Improved depth measurement quality
CN103927717B (en) Depth image restoration methods based on modified model bilateral filtering
JP4966431B2 (en) Image processing device
JP2007529070A (en) Depth map generation method and apparatus
KR101669820B1 (en) Apparatus and method for bidirectional inpainting in occlusion based on volume prediction
US20140211286A1 (en) Apparatus and method for generating digital hologram
US10347052B2 (en) Color-based geometric feature enhancement for 3D models
CN111066065A (en) System and method for hybrid depth regularization
JP5911292B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP2018036898A (en) Image processing device and control method of the same
KR20110055032A (en) Apparatus and method for generating three demension content in electronic device
KR101593316B1 (en) Method and apparatus for recontructing 3-dimension model using stereo camera
CN105530505B (en) 3-D view conversion method and device
CN107341804B (en) Method and device for determining plane in point cloud data, and method and equipment for image superposition
CN107025660B (en) Method and device for determining image parallax of binocular dynamic vision sensor
JP5909176B2 (en) Shadow information deriving device, shadow information deriving method and program
US20190098278A1 (en) Image processing apparatus, image processing method, and storage medium
KR20130068617A (en) Apparatus and method for generating texture coordinates using visibility and edge information
CN108876704A (en) The method, apparatus and computer storage medium of facial image deformation
US20130194254A1 (en) Image processing apparatus, image processing method and program
US20210241495A1 (en) Method and system for reconstructing colour and depth information of a scene
JP6762570B2 (en) Image processing equipment, image processing method, and image processing program
JP2013069026A (en) Device, method, and program for restoring three-dimensional shape of subject
JP5795556B2 (en) Shadow information deriving device, shadow information deriving method and program
KR101566459B1 (en) Concave surface modeling in image-based visual hull

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination