CN104318514A - Three-dimensional significance based image warping method - Google Patents


Info

Publication number
CN104318514A
CN104318514A (application CN201410553252.7A; granted as CN104318514B)
Authority
CN
China
Prior art keywords
target image
formula
described target
image
triangle
Prior art date
Legal status
Granted
Application number
CN201410553252.7A
Other languages
Chinese (zh)
Other versions
CN104318514B (en)
Inventor
汪萌
高欣健
陈雁翔
潘宜飞
Current Assignee
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN201410553252.7A priority Critical patent/CN104318514B/en
Publication of CN104318514A publication Critical patent/CN104318514A/en
Application granted granted Critical
Publication of CN104318514B publication Critical patent/CN104318514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G06T3/067
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/50 — Depth or shape recovery
    • G06T7/55 — Depth or shape recovery from multiple images
    • G06T7/593 — Depth or shape recovery from multiple images from stereo images
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10028 — Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image warping method based on three-dimensional saliency, characterized by the following steps: (1) obtain a depth map of the target image from depth data; (2) combine the depth map with a two-dimensional model to construct a three-dimensional saliency model; (3) adaptively update the weight between the depth data and the two-dimensional model according to the distribution of image gray levels; (4) compute the gradient of the image energy function with the three-dimensional saliency model; (5) extract two-dimensional image edges and depth-contour features and generate a triangular mesh from the feature points; (6) establish an objective function and add a constraint; (7) find the extremum of the objective function and construct the transformation. The method combines depth information with two-dimensional saliency and improves the robustness of warping.

Description

Image warping method based on three-dimensional saliency
Technical field
The invention belongs to the field of image processing and relates generally to an image warping method based on three-dimensional saliency.
Background technology
With the popularity of mobile devices such as smartphones and tablets, users habitually upload the photos they take to social networking sites to share with friends. Since the devices users own differ in model and screen, making a shared photo display well on different terminals is one of the hot topics of current computer vision research.
Researchers have proposed several methods for this problem, but because of insufficient warping and detection accuracy, distortion of objects in the image and loss of important information have troubled researchers all along. In 2009, the article "Saliency detection for content-aware image resizing", published at the IEEE International Conference on Image Processing, proposed using the saliency computed in the target image to constrain the warp, so that points of high saliency are preserved during warping while points of low saliency are sacrificed. However, when the scene is complex and the image contains much texture and many objects, this method assigns high saliency to many points: if all of them are retained, the image cannot be warped effectively, and if some are discarded, important contour information is lost. To date, no method has been able both to guarantee that objects in the image are neither deformed nor lost and to warp images of complex scenes.
Summary of the invention
The present invention aims to solve the problems that most current image warping methods deform objects in the target image, lose some important information, and cannot effectively warp images with complex scenes. It proposes an image warping method based on three-dimensional saliency that combines depth information with two-dimensional saliency and improves the robustness of warping.
The invention adopts the following technical scheme to solve the problem:
The image warping method based on three-dimensional saliency of the present invention is characterized by proceeding as follows:
Step 1: use formula (1) to compute the energy function E of every pixel of the target image I of size m × n:

E(x, y) = |∂I(x, y)/∂x| + |∂I(x, y)/∂y|    (1)

In formula (1), E(x, y) is the energy value of said target image I at pixel (x, y); I(x, y) is the gray value of said target image I at pixel (x, y); x ∈ (0, m); y ∈ (0, n);
Step 2: perform feature extraction on said target image I to obtain a two-dimensional feature matrix X;
Step 3: use formula (2) to obtain the two-dimensional saliency S_2D of said target image I:

S_2D = exp(−‖X_i − X_j‖² / (2σ²))    (2)

In formula (2), X_i and X_j are two different row vectors of said two-dimensional feature matrix; σ is a constant;
Step 4: use formula (3) to build the three-dimensional saliency model S_3D:

S_3D = (1 − α)·S_2D + α·E_depth    (3)

In formula (3), E_depth is the depth map of said target image I obtained with a 3D camera, and α is an adaptive parameter given by:

α = Σ_{x,y=0}^{m,n} n(x, y)·(I(x, y) − Ī(x, y))² / D_max    (4)

In formula (4), n(x, y) is the number of pixels whose gray value equals that of pixel (x, y), Ī is the mean gray value, and D_max is a constant;
Step 5: use formulas (1) and (3) to redefine said energy function E as E′:

E′(x, y) = E(x, y)·S_3D(x, y)    (5)

In formula (5), E′(x, y) is the new energy value of said target image I at pixel (x, y);
Step 6: use formula (6) to compute the image saliency S of said target image I:

S((x_b, n), (x_a, n−1)) = Σ_{a=1}^{b−1} |G_{a,n}^v − G_{a,n}^d| + Σ_{a=b+1}^{m} |G_{a,n}^v − G_{a−1,n}^d|    (6)

In formula (6), (x_b, n) is the b-th pixel of the n-th column of said target image I and (x_a, n−1) is the a-th pixel of the (n−1)-th column, with a ≠ b and a, b ∈ (0, m); S((x_b, n), (x_a, n−1)) is the energy difference between pixel x_b of column n and pixel x_a of column n−1; G_{a,n}^v is the gradient of E′ along the horizontal direction v of said target image I, and G_{a,n}^d is the gradient along the diagonal direction d, with G_{a,n}^d = |E′_{a,n} − E′_{a+1,n−1}|;
Step 7: from the depth map of said target image I, obtain the contours of the object surfaces in said target image I;
Step 8: use Delaunay triangulation to connect the points of said two-dimensional feature matrix X into a triangular mesh on said target image; said triangular mesh consists of a number of triangles t;
Step 9: use formula (7) to warp all triangles of said triangular mesh:

G_t = diag(s_t^α, s_t^β)    (7)

In formula (7), s_t^α and s_t^β are the deformations of said triangle t along the horizontal direction α and the vertical direction β, and G_t is the warp applied to any triangle t;
Step 10: use formula (8) to obtain the target equation E_s:

E_s = Σ_{t∈T} S_t·A_t·‖J_t(q) − G_t‖²    (8)

In formula (8), T is the set of said triangles t; A_t is the area of said triangle t; J_t(q) is the Jacobian matrix of said triangle t after warping; S_t is the saliency of any triangle t;
Step 11: use formula (9) to define the constraint E_f:

E_f = Σ_{t∈T} ‖(c_{t2}^q − c_{t1}^q) − r_t·R_t·(c_{t1}^q − c_{t0}^q)‖²    (9)

In formula (9), c_{t0}, c_{t1}, c_{t2} are the three vertices of said triangle t before warping and c_{t0}^q, c_{t1}^q, c_{t2}^q are the same three vertices after warping; r_t is the ratio of the lengths of the corresponding edges, and R_t is the rotation matrix relating them;
Step 12: use formula (10) to obtain the warping matrix F:

F = λ·E_s + (1 − λ)·E_f    (10)

In formula (10), λ is a weighting coefficient; according to the value of said warping matrix F corresponding to each triangle t of said target image I, each triangle is translated or rotated so as to realize the warp of said target image I.
Compared with the prior art, the beneficial effects of the invention are:
1. The invention combines the classical L1-norm energy-function computation on the target image with depth information and proposes a new three-dimensional saliency, which both keeps the advantage of the original two-dimensional saliency and, thanks to the added depth information, still warps the target image well when the scene is complex.
2. The parameter weighting two-dimensional saliency against depth information is adjusted adaptively according to the gray-level distribution of the image. When the gray-level distribution and the scene are simple, two-dimensional saliency is weighted higher than depth information, preserving the advantage of conventional two-dimensional saliency; when the gray-level distribution is broad and the scene is complex, depth information is weighted higher, so images of complex scenes are warped better.
3. The precision of two-dimensional saliency is strongly affected by ambient illumination, while the precision of depth information is almost unaffected by it; the method is therefore highly robust to ambient lighting.
4. Because depth information and the triangular mesh are added as constraints, the method considers both the saliency of each point and the integrity of the object edges. Suppose two points on the same edge have very different saliencies, one large and one small: the final warp still does not break that edge. This guarantees the robustness of the image warping.
Brief description of the drawings
Fig. 1 is the target image of the invention;
Fig. 2 shows the result of warping the target image with the invention.
Detailed description of the embodiments
In this embodiment, the image warping method based on three-dimensional saliency first uses depth data to obtain the depth map of the target image; then combines the depth map with a two-dimensional model to build the three-dimensional saliency model; next adaptively updates the weight between the depth data and the two-dimensional model according to the distribution of image gray levels; then computes the gradient of the image energy function with the three-dimensional saliency model; afterwards extracts two-dimensional image edges and depth-contour features and generates a triangular mesh from the feature points; and finally establishes the objective function, adds the constraint, finds the extremum and obtains the warping matrix. Specifically, it proceeds as follows:
Step 1: use formula (1) to compute the energy function E of every pixel of the target image I of size m × n. The energy function is obtained by computing the gradient of each pixel of the target image under the L1 norm; gradient information usually reflects the edges of the objects in the image and effectively keeps the image intact during retargeting. The target image is shown in Fig. 1:

E(x, y) = |∂I(x, y)/∂x| + |∂I(x, y)/∂y|    (1)

In formula (1), E(x, y) is the energy value of target image I at pixel (x, y); I(x, y) is the gray value of target image I at pixel (x, y); x ∈ (0, m); y ∈ (0, n);
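As a rough illustration, the L1-norm energy of formula (1) can be sketched in NumPy; the function name and the toy image below are ours, not the patent's:

```python
import numpy as np

def l1_gradient_energy(gray):
    """Per-pixel energy E(x, y) = |dI/dx| + |dI/dy| of formula (1)."""
    # np.gradient returns derivatives along (rows, columns).
    gy, gx = np.gradient(gray.astype(np.float64))
    return np.abs(gx) + np.abs(gy)

# Toy image with a single vertical step edge starting at column 2.
img = np.zeros((4, 4))
img[:, 2:] = 255.0
E = l1_gradient_energy(img)
# Energy concentrates on the edge columns; flat regions contribute nothing.
```

As expected, the pixels on the step edge carry all of the energy, which is why formula (1) tends to preserve object boundaries during retargeting.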
Step 2: perform feature extraction on target image I. The feature extraction method is not restricted; for example, Local Binary Patterns (LBP) or SIFT can be used to obtain the two-dimensional feature matrix X;
Step 3: use formula (2) to obtain the two-dimensional saliency S_2D of target image I:

S_2D = exp(−‖X_i − X_j‖² / (2σ²))    (2)

In formula (2), X_i and X_j are two different row vectors of the two-dimensional feature matrix; σ is a constant;
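Formula (2) is a Gaussian similarity between pairs of feature rows. A minimal sketch (function name and toy features are ours) evaluates it for every pair:

```python
import numpy as np

def gaussian_similarity(X, sigma=1.0):
    """exp(-||X_i - X_j||^2 / (2 sigma^2)) of formula (2) for every pair
    of rows of the two-dimensional feature matrix X."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

# Rows 0 and 1 are identical features; row 2 is far away.
X = np.array([[0.0, 0.0], [0.0, 0.0], [3.0, 4.0]])
K = gaussian_similarity(X)
```

Identical rows score 1 and distant rows score close to 0; the patent does not specify how the pairwise values are aggregated into one saliency map, so that step is omitted here.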
Step 4: use formula (3) to build the three-dimensional saliency model S_3D. On the basis of the original two-dimensional saliency, the model incorporates depth information and, depending on the image, adaptively adjusts the weight between the two:

S_3D = (1 − α)·S_2D + α·E_depth    (3)

In formula (3), E_depth is the depth map of target image I obtained with a 3D camera, and α is an adaptive parameter given by:

α = Σ_{x,y=0}^{m,n} n(x, y)·(I(x, y) − Ī(x, y))² / D_max    (4)

In formula (4), n(x, y) is the number of pixels whose gray value equals that of pixel (x, y), Ī is the mean gray value, and D_max is a constant. When the scene in the image is complex, the distribution of its gray values is concentrated; conversely, when the scene is simple, the distribution is sparse. For a simple scene, traditional two-dimensional saliency alone already computes the saliency of the image well, whereas for a complex scene depth information has to be added;
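A sketch of the adaptive weight of formula (4) and the fusion of formula (3). The patent does not fix the value of the constant D_max, so it is a free parameter here, and the clamp of α to [0, 1] is our addition:

```python
import numpy as np

def adaptive_alpha(gray, d_max):
    """Formula (4): histogram-weighted spread of the gray levels over D_max.
    n(x, y) is the count of pixels sharing the gray level of (x, y), so each
    gray level g with count c contributes c * c * (g - mean)^2 in total."""
    mean = gray.mean()
    levels, counts = np.unique(gray, return_counts=True)
    spread = float(np.sum(counts * counts * (levels - mean) ** 2))
    return min(spread / d_max, 1.0)  # clamp so the blend below stays convex

def saliency_3d(s2d, depth, alpha):
    """Formula (3): S_3D = (1 - alpha) * S_2D + alpha * E_depth."""
    return (1.0 - alpha) * s2d + alpha * depth

# A perfectly flat image has zero spread, so only 2-D saliency survives.
flat = np.full((4, 4), 10.0)
a = adaptive_alpha(flat, d_max=1e6)
s3d = saliency_3d(np.ones((4, 4)), np.zeros((4, 4)), a)
```

Any gray-level variation makes the spread, and hence the depth weight α, strictly positive, matching the patent's claim that complex scenes lean more on depth.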
Step 5: use formulas (1) and (3) to redefine the energy function E as E′:

E′(x, y) = E(x, y)·S_3D(x, y)    (5)

In formula (5), E′(x, y) is the new energy value of target image I at pixel (x, y);
Step 6: use formula (6) to compute the image saliency S of target image I:

S((x_b, n), (x_a, n−1)) = Σ_{a=1}^{b−1} |G_{a,n}^v − G_{a,n}^d| + Σ_{a=b+1}^{m} |G_{a,n}^v − G_{a−1,n}^d|    (6)

In formula (6), (x_b, n) is the b-th pixel of the n-th column of target image I and (x_a, n−1) is the a-th pixel of the (n−1)-th column, with a ≠ b and a, b ∈ (0, m); S((x_b, n), (x_a, n−1)) is the energy difference between pixel x_b of column n and pixel x_a of column n−1; G_{a,n}^v is the gradient of E′ along the horizontal direction v, and G_{a,n}^d is the gradient along the diagonal direction d, with G_{a,n}^d = |E′_{a,n} − E′_{a+1,n−1}|. By computing the differences of the energy function along the horizontal and diagonal directions at each pixel, the saliency of each point, and thus of the whole image, is obtained;
Step 7: from the depth information in the depth map of target image I, take the gradient of the depth at each point to obtain the contours of the object surfaces in target image I;
Step 8: use Delaunay triangulation to connect the points of the two-dimensional feature matrix X into a triangular mesh on the target image; the mesh consists of a number of triangles t. Delaunay triangulation exploits the properties of the discrete points: while linking the three mutually nearest points into triangles, it guarantees that the same triangular mesh is obtained no matter which point the computation starts from;
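Step 8 can be sketched with SciPy's Delaunay triangulation; the feature points below are stand-ins for the edge and depth-contour points of steps 5 and 7, and SciPy is an assumed dependency:

```python
import numpy as np
from scipy.spatial import Delaunay  # third-party dependency

# Stand-in feature points: four corners plus one interior point.
points = np.array([[0.0, 0.0], [1.0, 0.0],
                   [0.0, 1.0], [1.0, 1.0],
                   [0.5, 0.5]])
mesh = Delaunay(points)
triangles = mesh.simplices  # each row holds the vertex indices of one triangle t
```

For n points with h of them on the convex hull, any triangulation has 2n − h − 2 triangles; here 2·5 − 4 − 2 = 4.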
Step 9: use formula (7) to warp all triangles of the triangular mesh:

G_t = diag(s_t^α, s_t^β)    (7)

In formula (7), s_t^α and s_t^β are the deformations of triangle t along the horizontal direction α and the vertical direction β, and G_t is the warp applied to any triangle t;
Step 10: use formula (8) to obtain the target equation E_s:

E_s = Σ_{t∈T} S_t·A_t·‖J_t(q) − G_t‖²    (8)

In formula (8), T is the set of triangles t; A_t is the area of triangle t; J_t(q) is the Jacobian matrix of triangle t after warping; S_t is the saliency of any triangle t. Minimizing this target equation yields the optimal transformation matrix;
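Per triangle, formula (8) is a saliency- and area-weighted squared Frobenius distance between the deformation Jacobian and the prescribed scaling G_t. A sketch (all function and variable names are ours, not the patent's):

```python
import numpy as np

def shape_energy(saliencies, areas, jacobians, targets):
    """Formula (8): E_s = sum_t S_t * A_t * ||J_t(q) - G_t||^2 over triangles."""
    total = 0.0
    for s_t, a_t, j_t, g_t in zip(saliencies, areas, jacobians, targets):
        total += s_t * a_t * np.sum((j_t - g_t) ** 2)  # squared Frobenius norm
    return total

# A triangle whose Jacobian already matches its target G_t costs nothing.
G = np.diag([0.5, 1.0])
e_match = shape_energy([1.0], [2.0], [G.copy()], [G])
# An undeformed (identity) Jacobian against the same target is penalized.
e_mismatch = shape_energy([1.0], [2.0], [np.eye(2)], [G])
```

The weighting means a salient, large triangle that deviates from its target scaling dominates the objective, which is exactly why low-saliency regions absorb most of the distortion.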
Step 11: to prevent the object-surface contours obtained from the depth information from being distorted too much by the transformation, a constraint must be added to the above objective function; formula (9) defines the constraint E_f:

E_f = Σ_{t∈T} ‖(c_{t2}^q − c_{t1}^q) − r_t·R_t·(c_{t1}^q − c_{t0}^q)‖²    (9)

In formula (9), c_{t0}, c_{t1}, c_{t2} are the three vertices of triangle t before warping and c_{t0}^q, c_{t1}^q, c_{t2}^q are the same three vertices after warping; r_t is the ratio of the lengths of the corresponding edges, and R_t is the rotation matrix relating them. Minimizing this term makes the shape formed by the three original vertices as similar as possible to the shape of the three vertices after the transformation;
Step 12: use formula (10) to obtain the warping matrix F:

F = λ·E_s + (1 − λ)·E_f    (10)

In formula (10), λ is a weighting coefficient; according to the value of the warping matrix F corresponding to each triangle t of target image I, each triangle is translated or rotated to realize the warp, which both preserves the points of high saliency in the image and guarantees the validity of the result.

Claims (1)

1. An image warping method based on three-dimensional saliency, characterized in that it is carried out as follows:
Step 1: use formula (1) to compute the energy function E of every pixel of the target image I of size m × n:

E(x, y) = |∂I(x, y)/∂x| + |∂I(x, y)/∂y|    (1)

In formula (1), E(x, y) is the energy value of said target image I at pixel (x, y); I(x, y) is the gray value of said target image I at pixel (x, y); x ∈ (0, m); y ∈ (0, n);
Step 2: perform feature extraction on said target image I to obtain a two-dimensional feature matrix X;
Step 3: use formula (2) to obtain the two-dimensional saliency S_2D of said target image I:

S_2D = exp(−‖X_i − X_j‖² / (2σ²))    (2)

In formula (2), X_i and X_j are two different row vectors of said two-dimensional feature matrix; σ is a constant;
Step 4: use formula (3) to build the three-dimensional saliency model S_3D:

S_3D = (1 − α)·S_2D + α·E_depth    (3)

In formula (3), E_depth is the depth map of said target image I obtained with a 3D camera, and α is an adaptive parameter given by:

α = Σ_{x,y=0}^{m,n} n(x, y)·(I(x, y) − Ī(x, y))² / D_max    (4)

In formula (4), n(x, y) is the number of pixels whose gray value equals that of pixel (x, y), Ī is the mean gray value, and D_max is a constant;
Step 5: use formulas (1) and (3) to redefine said energy function E as E′:

E′(x, y) = E(x, y)·S_3D(x, y)    (5)

In formula (5), E′(x, y) is the new energy value of said target image I at pixel (x, y);
Step 6: use formula (6) to compute the image saliency S of said target image I:

S((x_b, n), (x_a, n−1)) = Σ_{a=1}^{b−1} |G_{a,n}^v − G_{a,n}^d| + Σ_{a=b+1}^{m} |G_{a,n}^v − G_{a−1,n}^d|    (6)

In formula (6), (x_b, n) is the b-th pixel of the n-th column of said target image I and (x_a, n−1) is the a-th pixel of the (n−1)-th column, with a ≠ b and a, b ∈ (0, m); S((x_b, n), (x_a, n−1)) is the energy difference between pixel x_b of column n and pixel x_a of column n−1; G_{a,n}^v is the gradient of E′ along the horizontal direction v of said target image I, and G_{a,n}^d is the gradient along the diagonal direction d, with G_{a,n}^d = |E′_{a,n} − E′_{a+1,n−1}|;
Step 7: from the depth map of said target image I, obtain the contours of the object surfaces in said target image I;
Step 8: use Delaunay triangulation to connect the points of said two-dimensional feature matrix X into a triangular mesh on said target image; said triangular mesh consists of a number of triangles t;
Step 9: use formula (7) to warp all triangles of said triangular mesh:

G_t = diag(s_t^α, s_t^β)    (7)

In formula (7), s_t^α and s_t^β are the deformations of said triangle t along the horizontal direction α and the vertical direction β, and G_t is the warp applied to any triangle t;
Step 10: use formula (8) to obtain the target equation E_s:

E_s = Σ_{t∈T} S_t·A_t·‖J_t(q) − G_t‖²    (8)

In formula (8), T is the set of said triangles t; A_t is the area of said triangle t; J_t(q) is the Jacobian matrix of said triangle t after warping; S_t is the saliency of any triangle t;
Step 11: use formula (9) to define the constraint E_f:

E_f = Σ_{t∈T} ‖(c_{t2}^q − c_{t1}^q) − r_t·R_t·(c_{t1}^q − c_{t0}^q)‖²    (9)

In formula (9), c_{t0}, c_{t1}, c_{t2} are the three vertices of said triangle t before warping and c_{t0}^q, c_{t1}^q, c_{t2}^q are the same three vertices after warping; r_t is the ratio of the lengths of the corresponding edges, and R_t is the rotation matrix relating them;
Step 12: use formula (10) to obtain the warping matrix F:

F = λ·E_s + (1 − λ)·E_f    (10)

In formula (10), λ is a weighting coefficient; according to the value of said warping matrix F corresponding to each triangle t of said target image I, each triangle is translated or rotated so as to realize the warp of said target image I.
CN201410553252.7A (filed 2014-10-17, priority 2014-10-17) — Three-dimensional significance based image warping method — Active, granted as CN104318514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410553252.7A CN104318514B (en) 2014-10-17 2014-10-17 Three-dimensional significance based image warping method

Publications (2)

Publication Number Publication Date
CN104318514A 2015-01-28
CN104318514B 2017-05-17

Family

ID=52373740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410553252.7A Active CN104318514B (en) 2014-10-17 2014-10-17 Three-dimensional significance based image warping method

Country Status (1)

Country Link
CN (1) CN104318514B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101510299A (en) * 2009-03-04 2009-08-19 上海大学 Image self-adapting method based on vision significance
EP2523165A2 (en) * 2011-05-13 2012-11-14 Omron Co., Ltd. Image processing method and image processing device
CN103050110A (en) * 2012-12-31 2013-04-17 华为技术有限公司 Method, device and system for image adjustment
WO2014116346A1 (en) * 2013-01-24 2014-07-31 Google Inc. Systems and methods for resizing an image

Also Published As

Publication number Publication date
CN104318514B (en) 2017-05-17

Similar Documents

Publication Publication Date Title
CN107578436B (en) Monocular image depth estimation method based on full convolution neural network FCN
CN105303616B (en) Embossment modeling method based on single photo
CN107578430B (en) Stereo matching method based on self-adaptive weight and local entropy
CN106709948A (en) Quick binocular stereo matching method based on superpixel segmentation
CN111079685A (en) 3D target detection method
CN106127818B (en) A kind of material appearance acquisition system and method based on single image
CN104299250A (en) Front face image synthesis method and system based on prior model
CN102663399B (en) Image local feature extracting method on basis of Hilbert curve and LBP (length between perpendiculars)
CN102074014A (en) Stereo matching method by utilizing graph theory-based image segmentation algorithm
CN104156957A (en) Stable and high-efficiency high-resolution stereo matching method
CN104820991A (en) Multi-soft-constraint stereo matching method based on cost matrix
CN104715504A (en) Robust large-scene dense three-dimensional reconstruction method
CN103927727A (en) Method for converting scalar image into vector image
CN103778598A (en) Method and device for disparity map improving
CN102609936A (en) Stereo image matching method based on belief propagation
CN111553296B (en) Two-value neural network stereo vision matching method based on FPGA
CN115861570A (en) Multi-view human body reconstruction method based on luminosity consistency matching and optimization algorithm
CN104301706B (en) A kind of synthetic method for strengthening bore hole stereoscopic display effect
To et al. Bas-relief generation from face photograph based on facial feature enhancement
Vázquez‐Delgado et al. Real‐time multi‐window stereo matching algorithm with fuzzy logic
CN104796624A (en) Method for editing and propagating light fields
CN107330930A (en) Depth of 3 D picture information extracting method
CN117132737A (en) Three-dimensional building model construction method, system and equipment
CN104200469B (en) Data fusion method for vision intelligent numerical-control system
CN104318514A (en) Three-dimensional significance based image warping method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant