CN117197365A - Texture reconstruction method and system based on RGB-D image

Info

Publication number: CN117197365A
Application number: CN202311465347.9A
Authority: CN (China)
Prior art keywords: texture, key frame, patch, RGB, map
Priority date: 2023-11-07
Filing date: 2023-11-07
Publication date: 2023-12-08
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventor: 黄浩 (Huang Hao)
Current Assignee: Jiangxi Qiushi Higher Research Institute
Original Assignee: Jiangxi Qiushi Higher Research Institute
Application filed by Jiangxi Qiushi Higher Research Institute
Priority to CN202311465347.9A

Landscapes

  • Image Generation (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a texture reconstruction method and system based on RGB-D images. The method comprises: acquiring a three-dimensional model of an object and a picture sequence containing the texture information of the object; screening the picture sequence to obtain key frames; extracting a base level map and a detail level map for each key frame, selecting, with the help of the detail level maps, the key frame with the best viewing angle for each face as its candidate texture, and merging all faces projected onto the same key frame into a patch; constructing an energy function E_2 from the similarity E_m and the color consistency E_c, optimizing E_2, and refining the seams of the reconstructed texture map according to the result to obtain the optimal texture map. The texture maps generated by this technical scheme avoid texture tearing, blurring and ghosting artifacts and improve the stitching quality of the reconstructed texture.

Description

Texture reconstruction method and system based on RGB-D image
Technical Field
The application relates to the technical field of three-dimensional reconstruction, in particular to a texture reconstruction method and system based on RGB-D images.
Background
After a three-dimensional model of an object is obtained by 3D reconstruction, the color information of the model is not recorded; texture reconstruction aims to obtain a texture map for the model from pictures of the actual object it corresponds to. Current texture reconstruction mainly relies on mapping-based methods, which proceed as follows: first, the best picture for each face of the 3D model is selected as its texture candidate; second, adjacent faces that share the same view are merged into a whole, called a patch; next, the candidate texture corresponding to each patch is mapped onto the texture map; finally, the colors of the vertices of the texture map are adjusted by global and local optimization functions so as to eliminate the gaps between different patches.
In the prior art, mapping-based texture reconstruction can leave obvious gaps between patches because of differing viewing angles between patches, lighting conditions at capture time and other factors, which degrades the stitching quality of the texture. Referring to fig. 4, the texture obtained with a prior-art texture reconstruction method shows obvious tearing, visible in particular in the misaligned flower-stem texture in fig. 4.
Disclosure of Invention
In view of the above, the application aims to provide a texture reconstruction method and system for RGB-D images, to solve the technical problem that, in prior-art mapping-based texture reconstruction, factors such as differing viewing angles between patches and the lighting conditions at capture time produce obvious gaps between patches and degrade texture quality.
In one aspect, the present application provides a texture reconstruction method based on an RGB-D image, including:
acquiring a three-dimensional model of an object and a picture sequence containing the texture information of the object, wherein the three-dimensional model comprises model coordinates and face information, each face is a triangle constructed from three vertices, the picture sequence comprises depth photos of the object model taken from different angles, and each depth photo contains the RGB color information and depth information of every pixel of the object;
screening the picture sequence to obtain key frames, wherein each angle corresponds to one key frame;
extracting a base level map of the picture from each key frame, obtaining a detail level map of the picture from the base level map, selecting, with the help of the detail level map, the key frame with the best angle for each face as its candidate texture, and merging all faces projected onto the same key frame to generate a patch;
mapping the candidate texture corresponding to each patch onto a texture map for texture reconstruction, calculating the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment, constructing an energy function E_2 from the similarity E_m and the color consistency E_c, optimizing the energy function E_2, and refining the seams of the reconstructed texture map according to the function result to obtain the optimal texture map, wherein the optimal texture map is the texture map corresponding to the minimum of the energy function E_2;
the step of selecting, for each face, the key frame with the best angle as the candidate texture comprises:
obtaining the global projected area E_P of the three-dimensional model from the areas of the faces of the three-dimensional model projected onto their corresponding key frames;
acquiring the set of common edges of all the faces, sampling the common edges to obtain sampling points, and calculating the texture smoothness E_smooth from the RGB pixel values of the sampling points in one key frame and the RGB pixel values of the sampling points of the common edge in the other key frame;
calculating the detail texture perceptibility E_tex from the pixel values of the sampling points in the detail level map corresponding to the key frame and the pixel values of the sampling points of the common edge in the detail level map corresponding to the other key frame;
constructing an energy function E_1 from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex, optimizing the energy function E_1 and obtaining the minimum of the function result, and obtaining, from that minimum, the key frame with the best angle for each face.
According to the texture reconstruction method based on RGB-D images, first, key frames are screened, which improves computational efficiency; second, a new function is designed for patch generation to produce high-quality textures; further, a new optimization function is designed for seam stitching to minimize texture seams. The technical scheme of the application can therefore generate a high-quality texture map, avoid texture tearing, blurring and ghosting artifacts, and improve the stitching quality of the reconstructed texture. Specifically, an energy function E_1 is constructed from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex; the energy function E_1 is optimized and the minimum of the function result is obtained, from which the key frame with the best angle for each face is determined. The candidate texture corresponding to each patch is then mapped onto the texture map for texture reconstruction, the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment are calculated, an energy function E_2 is constructed from them, the energy function E_2 is optimized, and the seams of the reconstructed texture map are refined according to the function result to obtain the optimal texture map.
In addition, the texture reconstruction method based on the RGB-D image can also have the following additional technical characteristics:
further, global projected area E P The calculation formula of (2) is as follows:
wherein, # Faces represents the number of die sheets; ii (#) represents projection; n-shaped%T if i ) Indicating that the ith dough sheet is to be usedf i Projection to corresponding keyframeT i The method comprises the steps of carrying out a first treatment on the surface of the area (#) represents area;
texture smoothness E smooth The calculation formula of (2) is as follows:
wherein,βindicated at E P In an item, all projected to the same keyframeT i The patches of (1) constitute a collection of common edges of all patches;v x representing the corresponding sampling point of the public edge e;I Ti () Representing sampling pointsIn key frameT i An rgb pixel value above;I Tj () An rgb pixel value representing a sample point on the common edge on another keyframe;
detail texture awareness E tex The calculation formula of (2) is as follows:
wherein,I Di () Representing sampling points in key framesT i Corresponding level of detail map onD i Corresponding pixel values thereon;I Dj () Detail level diagram representing correspondence of sampling points on common edge on key frame of anotherD j Corresponding pixel values thereon;
energy function E 1 The calculation formula of (2) is as follows:
E 1 =-E P1 E smooth2 E tex
wherein lambda is 1 、λ 2 For adjusting factors, for controlling weight, the energy function E 1 Taking the minimum value, all are projected to the same key frameT i Is formed into a patch; the function solution belongs to the Markov random field problem and can be solved by using a graph segmentation algorithm.
Further, the similarity E_m sums the distance dist(m, t) over all corresponding patch pairs, normalized by L, where s denotes the texture map before adjustment, T denotes the adjusted texture map, m denotes a patch in s, t denotes the corresponding patch in T, dist(·) is a distance function, specifically the sum of squared errors in RGB space, and L denotes the average number of pixels per patch.
Further, the color consistency E_c compares, for each adjusted patch A and every neighbouring patch B in G_A, the value C_A(u_i) of each pixel u_i in the texture map C_A with a weighted average of C'_A(u_i), the value of pixel u_i in C_A at the previous update, and C_B(x(u_i)), the value obtained by remapping pixel u_i of C_A onto C_B; here i indexes the pixels of the texture map C_A, C_A denotes the texture map corresponding to the adjusted patch A, C_B denotes the texture map corresponding to the adjusted patch B, B being adjacent to A, G_A denotes the set of patches adjacent to patch A, x(u_i) denotes the remapping of pixel u_i of C_A onto C_B, and ρ is the weighted-average coefficient of the C'_A(u_i) and C_B(x(u_i)) terms.
Further, the energy function E_2 is calculated as: E_2 = E_m + λ3·E_c; where λ3 is an adjustment factor that weights the color consistency E_c, and the texture map corresponding to the minimum of the energy function E_2 is taken as the optimal texture map.
Further, the step of screening the picture sequence to obtain key frames comprises:
treating the first picture in the picture sequence as a key frame and examining the subsequent pictures in order; if the camera pose of a picture, compared with the previous key frame, has rotated and translated beyond set thresholds, the picture is considered a new key frame.
Further, the step of extracting a base level map of the picture from the key frame and obtaining a detail level map of the picture from the base level map comprises:
converting the RGB part of the depth photo into a gray map;
applying bilateral filtering to the gray map to obtain the base level map;
taking the logarithm of the base level map and of the gray map respectively, and subtracting the log base level map from the log gray map to obtain the detail level map.
Another aspect of the present application provides a texture reconstruction system based on RGB-D images, the system comprising:
an acquisition module, configured to acquire a three-dimensional model of an object and a picture sequence containing the texture information of the object, wherein the three-dimensional model comprises model coordinates and face information, each face is a triangle constructed from three vertices, the picture sequence comprises depth photos of the object model taken from different angles, and each depth photo contains the RGB color information and depth information of every pixel of the object;
a screening module, configured to screen the picture sequence to obtain key frames, each angle corresponding to one key frame;
a patch generation module, configured to extract a base level map of the picture from each key frame, obtain a detail level map of the picture from the base level map, select, with the help of the detail level map, the key frame with the best angle for each face as its candidate texture, and merge all faces projected onto the same key frame to generate a patch;
a gap optimization module, configured to map the candidate texture corresponding to each patch onto a texture map for texture reconstruction, calculate the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment, construct an energy function E_2 from the similarity E_m and the color consistency E_c, optimize the energy function E_2, and refine the seams of the reconstructed texture map according to the function result to obtain the optimal texture map, the optimal texture map being the texture map corresponding to the minimum of the energy function E_2;
wherein the step of selecting, for each face, the key frame with the best angle as the candidate texture comprises:
obtaining the global projected area E_P of the three-dimensional model from the areas of the faces of the three-dimensional model projected onto their corresponding key frames;
acquiring the set of common edges of all the faces, sampling the common edges to obtain sampling points, and calculating the texture smoothness E_smooth from the RGB pixel values of the sampling points in one key frame and the RGB pixel values of the sampling points of the common edge in the other key frame;
calculating the detail texture perceptibility E_tex from the pixel values of the sampling points in the detail level map corresponding to the key frame and the pixel values of the sampling points of the common edge in the detail level map corresponding to the other key frame;
constructing an energy function E_1 from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex, optimizing the energy function E_1 and obtaining the minimum of the function result, and obtaining, from that minimum, the key frame with the best angle for each face.
Another aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a texture reconstruction method based on RGB-D images as described above.
In another aspect, the present application also provides a data processing apparatus, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the above-mentioned RGB-D image-based texture reconstruction method when executing the program.
Drawings
FIG. 1 is a flowchart of a texture reconstruction method based on RGB-D image according to a first embodiment of the present application;
FIG. 2 is a texture map reconstructed by the texture reconstruction method of the present application;
FIG. 3 is a system block diagram of an RGB-D image based texture reconstruction system according to a second embodiment of the present application;
FIG. 4 is a texture map reconstructed using a texture reconstruction method of the prior art;
the application will be further described in the following detailed description in conjunction with the above-described figures.
Detailed Description
In order that the application may be readily understood, a more complete description of the application will be rendered by reference to the appended drawings. Several embodiments of the application are presented in the figures. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
To solve the technical problem that, in prior-art mapping-based texture reconstruction, factors such as differing viewing angles between patches and the lighting conditions at capture time easily produce obvious gaps between patches and degrade texture quality, the application provides a texture reconstruction method and system based on RGB-D images. First, key frames are screened, which improves computational efficiency; second, a new function is designed for patch generation to produce high-quality textures; further, a new optimization function is designed for seam stitching to minimize texture seams. The technical scheme of the application can therefore generate a high-quality texture map, avoid texture tearing, blurring and ghosting artifacts, and improve the stitching quality of the reconstructed texture.
In order to facilitate an understanding of the application, several embodiments of the application will be presented below. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Example 1
Referring to fig. 1, a texture reconstruction method based on RGB-D images according to a first embodiment of the present application is shown; the method includes steps S101 to S104:
S101, acquiring a three-dimensional model of an object and a picture sequence containing the texture information of the object, wherein the three-dimensional model comprises model coordinates and face information, each face is a triangle constructed from three vertices, the picture sequence comprises depth photos of the object model taken from different angles, and each depth photo contains the RGB color information and depth information of every pixel of the object.
The model contains the three-dimensional coordinates of its vertices in the world coordinate system as well as face information, where each face is a triangle determined by three vertices. The picture sequence is regarded as a set of depth pictures of the model taken from different angles; each picture carries the RGB color information and depth information of every pixel, and its camera pose is known, the camera pose comprising the translation and rotation relative to the origin of the world coordinate system.
S102, screening the picture sequence to obtain key frames, wherein each angle corresponds to one key frame.
Since the picture sequence is usually redundant and includes repeated or similar pictures, the pictures in the picture sequence need to be screened in order to remove such repetition, improve reconstruction efficiency and reduce cost. As a specific example, the first picture in the picture sequence is treated as a key frame, and the subsequent pictures are examined in order; if the camera pose of a picture, compared with the previous key frame, has rotated and translated beyond set thresholds, the picture is considered a new key frame.
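For illustration, the key-frame screening step can be sketched as follows. This is a hedged, minimal example rather than the implementation of the filing: camera poses are assumed to be 4x4 camera-to-world matrices, the rotation and translation thresholds are illustrative values, and whether the two thresholds are combined with "and" or "or" is a judgment call ("or" is used here).

```python
import numpy as np

def select_keyframes(poses, rot_thresh_deg=15.0, trans_thresh=0.10):
    """Keep the first frame as a key frame; keep any later frame whose pose
    differs from the previous key frame by more than either threshold."""
    keyframe_ids = [0]
    for i in range(1, len(poses)):
        rel = np.linalg.inv(poses[keyframe_ids[-1]]) @ poses[i]  # relative pose
        # rotation angle of the relative transform, from the trace of its rotation part
        cos_a = np.clip((np.trace(rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
        angle_deg = np.degrees(np.arccos(cos_a))
        translation = np.linalg.norm(rel[:3, 3])
        if angle_deg > rot_thresh_deg or translation > trans_thresh:
            keyframe_ids.append(i)
    return keyframe_ids
```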
S103, extracting a base level map of the picture from each key frame, obtaining a detail level map of the picture from the base level map, selecting, with the help of the detail level map, the key frame with the best angle for each face as its candidate texture, and merging all faces projected onto the same key frame to generate a patch.
In this embodiment, the base level map and detail level map of a picture are extracted as follows: convert the RGB part of the depth photo into a gray map; apply bilateral filtering to the gray map to obtain the base level map; take the logarithm of the base level map and of the gray map respectively, and subtract the log base level map from the log gray map to obtain the detail level map.
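As an illustration of the base/detail level-map extraction just described, the sketch below uses OpenCV's bilateral filter. The filter parameters and the small offset added before taking logarithms are assumptions made for the sketch, not values from the filing.

```python
import cv2
import numpy as np

def base_and_detail_maps(rgb_bgr):
    """rgb_bgr: HxWx3 uint8 color part of an RGB-D photo."""
    gray = cv2.cvtColor(rgb_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Edge-preserving smoothing gives the base level map.
    base = cv2.bilateralFilter(gray, d=9, sigmaColor=75, sigmaSpace=75)
    # Detail level map = log(gray) - log(base); a small offset keeps the
    # logarithm finite at zero-valued pixels.
    eps = 1.0
    detail = np.log(gray + eps) - np.log(base + eps)
    return base, detail
```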
As a specific example, selecting the key frame with the best angle for each face as its candidate texture specifically includes steps S1031 to S1034:
S1031, obtaining the global projected area E_P of the three-dimensional model from the areas of the faces of the three-dimensional model projected onto their corresponding key frames.
The global projected area E_P is calculated as: E_P = Σ_{i=1..#Faces} area(Π(T_i, f_i)); where #Faces denotes the number of faces of the model, Π(·) denotes projection, Π(T_i, f_i) denotes projecting the i-th face f_i onto its corresponding key frame T_i, and area(·) denotes area.
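A minimal sketch of E_P follows. Here `project(vertex, keyframe_id)`, which maps a 3D vertex to pixel coordinates in the assigned key frame, and `labels[i]`, the key frame currently assigned to face i, are hypothetical helpers assumed for the example.

```python
import numpy as np

def projected_triangle_area(p0, p1, p2):
    """Area of a 2D triangle given its projected vertices."""
    return 0.5 * abs((p1[0] - p0[0]) * (p2[1] - p0[1]) -
                     (p1[1] - p0[1]) * (p2[0] - p0[0]))

def global_projected_area(faces, vertices, labels, project):
    """E_P: sum over all faces of the area of the face projected onto its key frame."""
    total = 0.0
    for i, (a, b, c) in enumerate(faces):
        pts = [np.asarray(project(vertices[v], labels[i]), dtype=float) for v in (a, b, c)]
        total += projected_triangle_area(*pts)
    return total
```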
S1032, acquiring the set of common edges of all the faces, sampling the common edges to obtain sampling points, and calculating the texture smoothness E_smooth from the RGB pixel values of the sampling points in one key frame and the RGB pixel values of the sampling points of the common edge in the other key frame.
The texture smoothness E_smooth is obtained by accumulating, over every common edge e in β and every sampling point v_x on e, the difference between I_{T_i}(v_x), the RGB pixel value of the sampling point in key frame T_i, and I_{T_j}(v_x), the RGB pixel value of the sampling point of the common edge in the other key frame; here β denotes the set of common edges of all the faces, the faces projected onto the same key frame T_i in the E_P term constituting a patch.
S1033, calculating the detail texture perceptibility E_tex from the pixel values of the sampling points in the detail level map corresponding to the key frame and the pixel values of the sampling points of the common edge in the detail level map corresponding to the other key frame.
The detail texture perceptibility E_tex is obtained analogously by accumulating, over the same sampling points, the difference between I_{D_i}(v_x), the pixel value of the sampling point in the detail level map D_i corresponding to key frame T_i, and I_{D_j}(v_x), the pixel value of the sampling point of the common edge in the detail level map D_j corresponding to the other key frame.
S1034, constructing an energy function E_1 from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex, optimizing the energy function E_1 and obtaining the minimum of the function result, and obtaining, from that minimum, the key frame with the best angle for each face.
The energy function E_1 is calculated as: E_1 = -E_P + λ1·E_smooth + λ2·E_tex; where λ1 and λ2 are adjustment factors that control the weights. When E_1 takes its minimum, all faces projected onto the same key frame T_i form a patch. Solving this function is a Markov random field problem (MRF problem) and can be handled with a graph segmentation algorithm (graph cut algorithm).
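For illustration, the sketch below assembles E_1 and runs a simple iterated-conditional-modes (ICM) sweep over the per-face key-frame labels. ICM is used only as a lightweight stand-in for the graph cut solver named above, not as the filing's method; `unary(face, label)` and `pairwise(face, label, labels)` are hypothetical callbacks returning the -E_P contribution and the λ1·E_smooth + λ2·E_tex contribution of one face.

```python
def energy_e1(e_p, e_smooth, e_tex, lambda1=1.0, lambda2=1.0):
    """E_1 = -E_P + lambda1 * E_smooth + lambda2 * E_tex (weights are tuning factors)."""
    return -e_p + lambda1 * e_smooth + lambda2 * e_tex

def icm_labeling(num_faces, num_keyframes, unary, pairwise, n_sweeps=5):
    """Greedy label refinement: each face keeps the key frame that minimizes its
    local share of E_1 given its neighbours' current labels."""
    labels = [min(range(num_keyframes), key=lambda k: unary(f, k)) for f in range(num_faces)]
    for _ in range(n_sweeps):
        for f in range(num_faces):
            labels[f] = min(range(num_keyframes),
                            key=lambda k: unary(f, k) + pairwise(f, k, labels))
    return labels
```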
S104, mapping the candidate texture corresponding to each patch onto a texture map for texture reconstruction, calculating the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment, constructing an energy function E_2 from the similarity E_m and the color consistency E_c, optimizing the energy function E_2, and refining the seams of the reconstructed texture map according to the function result to obtain the optimal texture map, the optimal texture map being the texture map corresponding to the minimum of the energy function E_2.
As a specific example, the similarity E_m sums the distance dist(m, t) over all corresponding patch pairs, normalized by L, where s denotes the texture map before adjustment, T denotes the adjusted texture map, m denotes a patch in s, t denotes the corresponding patch in T, dist(·) is a distance function, specifically the sum of squared errors in RGB space, and L denotes the average number of pixels per patch.
Second, the color consistency E_c compares, for each adjusted patch A and every neighbouring patch B in G_A, the value C_A(u_i) of each pixel u_i in the texture map C_A with a weighted average of C'_A(u_i), the value of pixel u_i in C_A at the previous update, and C_B(x(u_i)), the value obtained by remapping pixel u_i of C_A onto C_B (which can be realized with a remap algorithm); here i indexes the pixels of the texture map C_A, C_A denotes the texture map corresponding to the adjusted patch A, C_B denotes the texture map corresponding to the adjusted patch B, B being adjacent to A, G_A denotes the set of patches adjacent to patch A, x(u_i) denotes the remapping of pixel u_i of C_A onto C_B, and ρ is the weighted-average coefficient of the C'_A(u_i) and C_B(x(u_i)) terms.
Furthermore, the energy function E_2 is calculated as: E_2 = E_m + λ3·E_c; where λ3 is an adjustment factor that weights the color consistency E_c. The optimization of this function is solved with the EM algorithm, and the texture map correspondingly adjusted when E_2 reaches its minimum is taken as the optimal texture map.
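Finally, a much-simplified, hedged stand-in for the E_2 minimization: the filing names an EM-style optimization, and the loop below alternates between recomputing the neighbour-consistency targets ("E-step") and updating every patch texture in closed form toward a λ3-weighted blend of its original colors and those targets ("M-step"). `neighbour_target(patch_id, textures)` is a hypothetical helper returning the remapped, averaged neighbour colors for a patch; the closed-form update is an assumption for the sketch, not the filing's derivation.

```python
import numpy as np

def optimize_e2(original, textures, neighbour_target, lambda3=1.0, n_iters=10):
    """original/textures: dicts patch_id -> HxWx3 float arrays; returns adjusted textures."""
    textures = {k: v.astype(np.float64).copy() for k, v in textures.items()}
    for _ in range(n_iters):
        targets = {k: neighbour_target(k, textures) for k in textures}  # "E-step"
        for k in textures:                                              # "M-step"
            # Per-pixel minimizer of (t - original)^2 + lambda3 * (t - target)^2.
            textures[k] = (original[k] + lambda3 * targets[k]) / (1.0 + lambda3)
    return textures
```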
As shown in fig. 2, a Microsoft Azure Kinect DK is used as the depth camera and the texture of an object is reconstructed with the texture reconstruction method based on RGB-D images of the application; the left image in fig. 2 is the object model, and the right image is the texture map at the corresponding position. The texture map obtained by the texture reconstruction has clear texture and no obvious gaps.
In summary, in the texture reconstruction method based on RGB-D images of the above embodiment, first, key frames are screened, which improves computational efficiency; second, a new function is designed for patch generation to produce high-quality textures; further, a new optimization function is designed for seam stitching to minimize texture seams. The technical scheme of the application can therefore generate a high-quality texture map, avoid texture tearing, blurring and ghosting artifacts, and improve the stitching quality of the reconstructed texture. Specifically, an energy function E_1 is constructed from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex; the energy function E_1 is optimized and the minimum of the function result is obtained, from which the key frame with the best angle for each face is determined. The candidate texture corresponding to each patch is then mapped onto the texture map for texture reconstruction, the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment are calculated, an energy function E_2 is constructed from them, the energy function E_2 is optimized, and the seams of the reconstructed texture map are refined according to the function result to obtain the optimal texture map.
Example 2
Referring to fig. 3, a texture reconstruction system based on RGB-D images according to a second embodiment of the present application includes:
an acquisition module, configured to acquire a three-dimensional model of an object and a picture sequence containing the texture information of the object, wherein the three-dimensional model comprises model coordinates and face information, each face is a triangle constructed from three vertices, the picture sequence comprises depth photos of the object model taken from different angles, and each depth photo contains the RGB color information and depth information of every pixel of the object;
a screening module, configured to screen the picture sequence to obtain key frames, each angle corresponding to one key frame;
a patch generation module, configured to extract a base level map of the picture from each key frame, obtain a detail level map of the picture from the base level map, select, with the help of the detail level map, the key frame with the best angle for each face as its candidate texture, and merge all faces projected onto the same key frame to generate a patch;
a gap optimization module, configured to map the candidate texture corresponding to each patch onto a texture map for texture reconstruction, calculate the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment, construct an energy function E_2 from the similarity E_m and the color consistency E_c, optimize the energy function E_2, and refine the seams of the reconstructed texture map according to the function result to obtain the optimal texture map, the optimal texture map being the texture map corresponding to the minimum of the energy function E_2.
In summary, in the texture reconstruction system based on RGB-D images of the above embodiment, first, key frames are screened, which improves computational efficiency; second, a new function is designed for patch generation to produce high-quality textures; further, a new optimization function is designed for seam stitching to minimize texture seams. The technical scheme of the application can therefore generate a high-quality texture map, avoid texture tearing, blurring and ghosting artifacts, and improve the stitching quality of the reconstructed texture. Specifically, an energy function E_1 is constructed from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex; the energy function E_1 is optimized and the minimum of the function result is obtained, from which the key frame with the best angle for each face is determined. The candidate texture corresponding to each patch is then mapped onto the texture map for texture reconstruction, the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment are calculated, an energy function E_2 is constructed from them, the energy function E_2 is optimized, and the seams of the reconstructed texture map are refined according to the function result to obtain the optimal texture map.
Furthermore, an embodiment of the present application proposes a computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, implements the steps of the method in the above-mentioned embodiment.
Furthermore, an embodiment of the present application also proposes a data processing apparatus including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method in the above embodiment when executing the program.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium may even be paper or other suitable medium upon which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one of, or a combination of, the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the application, the scope of which is defined by the claims and their equivalents.

Claims (10)

1. A texture reconstruction method based on an RGB-D image, comprising:
acquiring a three-dimensional model of an object and a picture sequence containing the texture information of the object, wherein the three-dimensional model comprises model coordinates and face information, each face is a triangle constructed from three vertices, the picture sequence comprises depth photos of the object model taken from different angles, and each depth photo contains the RGB color information and depth information of every pixel of the object;
screening the picture sequence to obtain key frames, wherein each angle corresponds to one key frame;
extracting a base level map of the picture from each key frame, obtaining a detail level map of the picture from the base level map, selecting, with the help of the detail level map, the key frame with the best angle for each face as its candidate texture, and merging all faces projected onto the same key frame to generate a patch;
mapping the candidate texture corresponding to each patch onto a texture map for texture reconstruction, calculating the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment, constructing an energy function E_2 from the similarity E_m and the color consistency E_c, optimizing the energy function E_2, and refining the seams of the reconstructed texture map according to the function result to obtain the optimal texture map, wherein the optimal texture map is the texture map corresponding to the minimum of the energy function E_2;
wherein the step of selecting, for each face, the key frame with the best angle as the candidate texture comprises:
obtaining the global projected area E_P of the three-dimensional model from the areas of the faces of the three-dimensional model projected onto their corresponding key frames;
acquiring the set of common edges of all the faces, sampling the common edges to obtain sampling points, and calculating the texture smoothness E_smooth from the RGB pixel values of the sampling points in one key frame and the RGB pixel values of the sampling points of the common edge in the other key frame;
calculating the detail texture perceptibility E_tex from the pixel values of the sampling points in the detail level map corresponding to the key frame and the pixel values of the sampling points of the common edge in the detail level map corresponding to the other key frame;
constructing an energy function E_1 from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex, optimizing the energy function E_1 and obtaining the minimum of the function result, and obtaining, from that minimum, the key frame with the best angle for each face.
2. The texture reconstruction method based on an RGB-D image according to claim 1, wherein the global projected area E_P is calculated as:
E_P = Σ_{i=1..#Faces} area(Π(T_i, f_i))
where #Faces denotes the number of faces of the model, Π(·) denotes projection, Π(T_i, f_i) denotes projecting the i-th face f_i onto its corresponding key frame T_i, and area(·) denotes area;
the texture smoothness E_smooth is obtained by accumulating, over every common edge e in β and every sampling point v_x on e, the difference between I_{T_i}(v_x), the RGB pixel value of the sampling point in key frame T_i, and I_{T_j}(v_x), the RGB pixel value of the sampling point of the common edge in the other key frame, where β denotes the set of common edges of all the faces, the faces projected onto the same key frame T_i in the E_P term constituting a patch;
the detail texture perceptibility E_tex is obtained analogously by accumulating, over the same sampling points, the difference between I_{D_i}(v_x), the pixel value of the sampling point in the detail level map D_i corresponding to key frame T_i, and I_{D_j}(v_x), the pixel value of the sampling point of the common edge in the detail level map D_j corresponding to the other key frame;
the energy function E_1 is calculated as:
E_1 = -E_P + λ1·E_smooth + λ2·E_tex
where λ1 and λ2 are adjustment factors that control the weights; when E_1 takes its minimum, all faces projected onto the same key frame T_i form a patch; solving this function is a Markov random field problem and can be handled with a graph cut algorithm.
3. The texture reconstruction method based on an RGB-D image according to claim 1, wherein the similarity E_m sums the distance dist(m, t) over all corresponding patch pairs, normalized by L, where s denotes the texture map before adjustment, T denotes the adjusted texture map, m denotes a patch in s, t denotes the corresponding patch in T, dist(·) is a distance function, specifically the sum of squared errors in RGB space, and L denotes the average number of pixels per patch.
4. The texture reconstruction method based on an RGB-D image according to claim 3, wherein the color consistency E_c compares, for each adjusted patch A and every neighbouring patch B in G_A, the value C_A(u_i) of each pixel u_i in the texture map C_A with a weighted average of C'_A(u_i), the value of pixel u_i in C_A at the previous update, and C_B(x(u_i)), the value obtained by remapping pixel u_i of C_A onto C_B, where i indexes the pixels of the texture map C_A, C_A denotes the texture map corresponding to the adjusted patch A, C_B denotes the texture map corresponding to the adjusted patch B, B being adjacent to A, G_A denotes the set of patches adjacent to patch A, x(u_i) denotes the remapping of pixel u_i of C_A onto C_B, and ρ is the weighted-average coefficient of the C'_A(u_i) and C_B(x(u_i)) terms.
5. The texture reconstruction method based on an RGB-D image according to claim 4, wherein the energy function E_2 is calculated as:
E_2 = E_m + λ3·E_c
where λ3 is an adjustment factor that weights the color consistency E_c, and the texture map corresponding to the minimum of the energy function E_2 is taken as the optimal texture map.
6. The RGB-D image-based texture reconstruction method of claim 1, wherein the step of screening the picture sequence to obtain key frames comprises:
treating the first picture in the picture sequence as a key frame and examining the subsequent pictures in order; and if the camera pose of a picture, compared with the previous key frame, has rotated and translated beyond set thresholds, considering the picture a new key frame.
7. The RGB-D image-based texture reconstruction method of claim 1, wherein the step of extracting a base level map of the picture from the key frame and obtaining a detail level map of the picture from the base level map comprises:
converting the RGB part of the depth photo into a gray map;
applying bilateral filtering to the gray map to obtain the base level map; and
taking the logarithm of the base level map and of the gray map respectively, and subtracting the log base level map from the log gray map to obtain the detail level map.
8. A texture reconstruction system based on RGB-D images, the system comprising:
an acquisition module, configured to acquire a three-dimensional model of an object and a picture sequence containing the texture information of the object, wherein the three-dimensional model comprises model coordinates and face information, each face is a triangle constructed from three vertices, the picture sequence comprises depth photos of the object model taken from different angles, and each depth photo contains the RGB color information and depth information of every pixel of the object;
a screening module, configured to screen the picture sequence to obtain key frames, each angle corresponding to one key frame;
a patch generation module, configured to extract a base level map of the picture from each key frame, obtain a detail level map of the picture from the base level map, select, with the help of the detail level map, the key frame with the best angle for each face as its candidate texture, and merge all faces projected onto the same key frame to generate a patch;
a gap optimization module, configured to map the candidate texture corresponding to each patch onto a texture map for texture reconstruction, calculate the similarity E_m and the color consistency E_c between the textures before and after the reconstruction adjustment, construct an energy function E_2 from the similarity E_m and the color consistency E_c, optimize the energy function E_2, and refine the seams of the reconstructed texture map according to the function result to obtain the optimal texture map, the optimal texture map being the texture map corresponding to the minimum of the energy function E_2;
wherein the step of selecting, for each face, the key frame with the best angle as the candidate texture comprises:
obtaining the global projected area E_P of the three-dimensional model from the areas of the faces of the three-dimensional model projected onto their corresponding key frames;
acquiring the set of common edges of all the faces, sampling the common edges to obtain sampling points, and calculating the texture smoothness E_smooth from the RGB pixel values of the sampling points in one key frame and the RGB pixel values of the sampling points of the common edge in the other key frame;
calculating the detail texture perceptibility E_tex from the pixel values of the sampling points in the detail level map corresponding to the key frame and the pixel values of the sampling points of the common edge in the detail level map corresponding to the other key frame;
constructing an energy function E_1 from the global projected area E_P, the texture smoothness E_smooth and the detail texture perceptibility E_tex, optimizing the energy function E_1 and obtaining the minimum of the function result, and obtaining, from that minimum, the key frame with the best angle for each face.
9. A computer readable storage medium having stored thereon a computer program, which when executed by a processor implements the RGB-D image-based texture reconstruction method according to any one of claims 1-7.
10. A data processing apparatus comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the RGB-D image-based texture reconstruction method of any one of claims 1-7 when the program is executed.
CN202311465347.9A (filed 2023-11-07) — Texture reconstruction method and system based on RGB-D image — Pending — CN117197365A (en)

Priority Applications (1)

  • CN202311465347.9A — priority/filing date 2023-11-07 — Texture reconstruction method and system based on RGB-D image

Publications (1)

  • CN117197365A — publication date 2023-12-08

Family

  • ID: 89005651
  • Family Applications (1): CN202311465347.9A — Texture reconstruction method and system based on RGB-D image (Pending)

Country Status (1)

  • CN: CN117197365A (en)



Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination