CN1745386A - Image blending by guided interpolation - Google Patents

Image blending by guided interpolation

Info

Publication number
CN1745386A
CN1745386A, CN200480003010A
Authority
CN
China
Prior art keywords
interpolation
target image
destination domain
images
source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200480003010
Other languages
Chinese (zh)
Other versions
CN100386772C (en)
Inventor
P. Pérez
M. Gangnet
A. Blake
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Publication of CN1745386A publication Critical patent/CN1745386A/en
Application granted granted Critical
Publication of CN100386772C publication Critical patent/CN100386772C/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

A blended result image is computed using guided interpolation to alter image data within a destination domain. The destination domain may be altered based on a guided interpolator that is either dependent or not dependent upon a source image. When blending a source region into a destination region of an image, guided interpolation eliminates or minimizes apparent seams between the inserted region and the rest of the image. A variety of interpolation guides may be used to yield different effects in the blended result image. Such interpolation guides may include without limitation an identity guide, a smooth guide, a filtering guide, a transparent guide, a masking guide, and a weighted average combination of various guides.

Description

Image blending by guided interpolation
Related application
This application claims priority to U.S. Provisional Patent Application No. 60/450,078, entitled "Image Blending By Guided Interpolation", the disclosure and teachings of which are specifically incorporated herein by reference in their entirety.
Technical field
The present invention relates generally to image editing, and more particularly to image blending.
Background of invention
Editing digital images, and digital photographs in particular, is a common activity in both home and professional environments. One common task involves incorporating one image into another image, or into a different location within the same image. Such operations are used to retouch and revise digital images.
For example, in Fig. 1, a user wishes to insert the sun 100 from a source image 102 into a target image 104. In a simple copy-and-paste operation, the user selects or cuts along the perimeter of the sun 100 in the source image 102, copies the cut region, and pastes the cut region at the desired location in the target image 104. Alternatively, a simple drag-and-drop operation may be used in a similar fashion. In either approach, however, the boundary (or "seam") between the inserted cut region and the rest of the target image 104 can appear very discontinuous and can reveal any irregularities introduced by the cutting operation. Furthermore, when the cut region is inserted into a target image 104 whose sky is represented by a dusky background, any portion of the sun's background captured by the cutting operation from the source image 102 (i.e., a pure light blue in the color image) will appear as a harsh discontinuity. In other words, the inserted cut region does not blend into the target image 104.
An alternative approach, referred to as "cloning", involves defining an anchor point in the source image. A paintbrush is then used on the target image to control the regions of the target image into which the source image data is copied. The result is similar to that of the copy-and-paste and drag-and-drop alternatives.
In addition, other image editing operations may involve changing the appearance of a selected region of an image. For example, the texture of an image region may be flattened, the background or foreground may be decolorized, or an illumination change may be applied to a region of the image. However, such editing operations can also introduce seams between the edited region and the rest of the original image.
To improve on the results of these existing approaches, a local "feathering" operation may be performed to hide the seam between the inserted cut region and the target image background. Such feathering typically blurs or smears the image data at the seam to approximate a gradual transition between the inserted cut region and the target image background. However, feathering may produce unsatisfactory results, because the seam is merely replaced by blurred image data. Moreover, existing approaches fail to provide a variety of blending modes that would allow a user to best integrate the inserted image region into the target image.
Accordingly, in many common circumstances, existing cutting and cloning approaches fail to provide satisfactorily flexible and seamless image insertion and editing.
Summary of the invention
Various embodiments of the present invention solve the problems discussed above by providing a blended result image in which image data within a destination domain is altered using guided interpolation. The destination domain may be altered based on a guided interpolator that either depends on or is independent of a source image. A variety of interpolation guides may be used to produce different effects in the blended result image.
In implementations of the present invention, articles of manufacture are provided as computer program products. One embodiment of a computer program product provides a computer program storage medium readable by a computer system and encoding a computer program that computes a blended result image from a target image. Another embodiment of a computer program product may be provided in the form of a computer data signal embodied in a carrier wave by a computing system and encoding the computer program that computes a blended result image from a target image.
The computer program product encodes a computer program for executing, on a computer system, a computer process for computing a blended result image from a target image. A destination domain having a boundary is defined within the target image. A guided interpolator is provided, comprising an interpolation guide and a boundary condition associated with the boundary of the destination domain. The blended result image is computed based on the guided interpolator to satisfy the boundary condition at the boundary of the destination domain and to minimize the difference between the interpolation guide and the gradient of the blended result image across the destination domain.
In another implementation of the present invention, a method of computing a blended result image from a target image is provided. A destination domain having a boundary is defined within the target image. A guided interpolator is provided, comprising an interpolation guide and a boundary condition associated with the boundary of the destination domain. The blended result image is computed based on the guided interpolator to satisfy the boundary condition at the boundary of the destination domain and to minimize the difference between the interpolation guide and the gradient of the blended result image across the destination domain.
In yet another embodiment of the present invention, a system for computing a blended result image from a target image is provided. A domain definition module defines a destination domain having a boundary within the target image. An interpolation guide datastore provides a guided interpolator comprising an interpolation guide and a boundary condition associated with the boundary of the destination domain. A blended computation module computes the blended result image based on the guided interpolator to satisfy the boundary condition at the boundary of the destination domain and to minimize the difference between the interpolation guide and the gradient of the blended result image across the destination domain.
These and various other features, as well as other advantages that characterize the present invention, will be apparent from a reading of the following detailed description and a review of the associated drawings.
Brief description of the drawings
Fig. 1 depicts schematic representations of a source image, a target image, and a blended result image in an embodiment of the present invention.
Fig. 2 depicts a conceptual blending of two one-dimensional functions in an embodiment of the present invention.
Fig. 3 illustrates a blending of two two-dimensional image regions in an embodiment of the present invention.
Fig. 4 depicts a source image, a target image, and a result image in an embodiment of the present invention.
Fig. 5 depicts a blending system in an embodiment of the present invention.
Fig. 6 depicts operations for blending in an embodiment of the present invention.
Fig. 7 depicts blended results obtained from a transparent interpolation guide in an embodiment of the present invention.
Fig. 8 depicts blended results obtained from a masking interpolation guide in an embodiment of the present invention.
Fig. 9 illustrates blending into a target image beyond the source domain in an embodiment of the present invention.
Fig. 10 illustrates an exemplary system useful for implementing an embodiment of the present invention.
Detailed description of the preferred embodiments
Various embodiments of the present invention use guided interpolation to alter image data within a destination domain, thereby providing a blended result image. The destination domain may be altered based on a guided interpolator that either depends on or is independent of a source image. When a source region is blended into a destination domain of an image, guided interpolation can eliminate or minimize apparent seams between the inserted region and the rest of the image. Furthermore, a variety of interpolation guides may be used to produce different effects in the blended result image. Such interpolation guides may include, without limitation, an identity guide, a smooth guide, a filtering guide, a transparent guide, a masking guide, and a weighted average combination of various guides.
Fig. 1 depicts a source image, a target image, and a blended result image in an embodiment of the present invention. The source image 102 includes a source region 100 containing the sun. A user wishes to insert and blend the source region 100 into the target image 104, which initially does not contain the inserted sun at location 108. In a first operation, the user defines the source region 100, for example by dragging an outline around it. Alternative methods of defining the source region 100 may be used, including without limitation setting an anchor point for a cloning brush, selecting an element within the image, specifying chromatic characteristics of the source region, or specifying coordinates within the source image 102.
Generally, a digital image is represented by pixel values. For example, a black-and-white digital image may be represented by a vector of pixel brightness values, where each vector element corresponds to a single pixel. In contrast, the digital image data of a color image may be represented by multiple values per pixel. For example, in an RGB (red, green, blue) color space, each pixel is represented by a brightness value in each of three color channels. Other color spaces are also contemplated, including without limitation the LC1C2 and CIE-Lab spaces.
It should be appreciated that merely copying and pasting the source region 100 into the target image 104 would typically produce a boundary, or seam 110, of irregular blue sky color around the sun, contrasting with the darker night-sky background 112 of the target image 104. A seam typically appears as an abrupt change in color, brightness, texture, or other features relative to the target image. Blending the inserted source region into the target image 104 can therefore blur or eliminate the visibility of the seam 110 in the result image 106, producing a more realistic result. Accordingly, as shown in the result image 106, the source region 100 is inserted and blended into the target image 104 to produce a realistic effect.
Fig. 2 depicts a conceptual blending of two one-dimensional functions in an embodiment of the present invention. The discussion of Fig. 2 provides a conceptual demonstration of a blending operation involving one-dimensional "images". It should be understood, however, that a variety of blending operations are contemplated within the scope of the present invention, including without limitation the two-dimensional blending operations described herein with respect to the other figures.
A first brightness function A, representing a one-dimensional target "image", is shown in the target plot 200. A second brightness function B, representing a one-dimensional source "image" to be inserted into the target image of plot 200, is shown in the source plot 202. A blended result function, representing the one-dimensional blended result "image", is shown in the result plot 204.
The dashed lines 206 indicate the boundary between the source region and the destination region. In other embodiments of the present invention, the source region and the destination domain may have entirely different sizes and shapes (e.g., rectangular, circular, irregular), and either may be altered by the system to make the two more compatible for blending.
In an embodiment of the present invention, a user wishes to merge the source region into the destination domain using a blending operation, so that the image data at the boundary of the destination domain appears realistically blended. In the illustrated embodiment, this is shown conceptually by joining the boundary points of the source region of the second brightness function B to the first brightness function A. The blended result function thus exhibits a combination of the two brightness functions. To eliminate, minimize, or reduce the appearance of a seam, the constituent functions are blended subject to a boundary condition, such that the values of the constituent functions are equal as each function approaches the boundary points 208 and 210.
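For illustration only (not part of the original disclosure), the following minimal Python sketch carries out the one-dimensional blend of Fig. 2 numerically: the two end samples are pinned to the target function A (the boundary condition), while the interior samples reproduce the second differences of the source function B. The sample count and the particular functions A and B are arbitrary choices of this sketch.

```python
import numpy as np

# Illustrative 1-D analogue of Fig. 2, with assumed data.
N = 9
A = np.linspace(0.0, 1.0, N)                         # target brightness function A
B = 0.5 + 0.3 * np.sin(np.linspace(0.0, np.pi, N))   # source brightness function B

# Interior equation: f[i-1] - 2*f[i] + f[i+1] = B[i-1] - 2*B[i] + B[i+1],
# with f pinned to A at the two boundary samples.
M = np.zeros((N - 2, N - 2))
np.fill_diagonal(M, -2.0)
np.fill_diagonal(M[1:], 1.0)           # sub-diagonal
np.fill_diagonal(M[:, 1:], 1.0)        # super-diagonal
rhs = B[:-2] - 2.0 * B[1:-1] + B[2:]   # second differences of the source
rhs[0] -= A[0]                         # move the known boundary values
rhs[-1] -= A[-1]                       # to the right-hand side
f = A.copy()
f[1:-1] = np.linalg.solve(M, rhs)      # blended result function
```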
Fig. 3 illustrates a blending of two two-dimensional image regions in an embodiment of the present invention. To extend the one-dimensional conceptual description of Fig. 2, a two-dimensional blending example is provided in Fig. 3. The target image 300 shows a dark water surface. The corresponding brightness data associated with the target image 300 is shown in the two-dimensional brightness plot 302, where brightness may represent a monochromatic intensity (such as white in a black-and-white image, or a single color channel in a color image) or some other appearance signal (e.g., a component or an aggregate value of LC1C2). Note, for example, that as the target image 300 becomes lighter toward the top, the white brightness shown in the brightness plot 302 increases.
In the intermediate image 304, a source region 312 containing a reflection of the sun on a different water surface has been inserted into the image. In the illustrated embodiment, the destination region is defined as the region of the image overlapped by the source region 312. Before blending, the seam between the source region and the rest of the target image is quite apparent, due at least in part to the difference in the color of the water, as shown in the intermediate image 304. This is also visible in the unblended brightness plot 306, where the brightness data associated with the source region 312 appears as an abrupt discontinuity at 314.
In the blended result image 308, the source region is blended into the destination region to provide a realistic image of the sun's reflection on the water surface. The change in the illumination of the image data within the source region is clearly visible in the blended brightness plot 310, where the brightness data of the inserted source region has been blended to match the brightness data of the target image at the boundary of the region, thereby reducing the overall "brightness" of the source region.
Fig. 4 depicts a source image, a target image, and a blended result image in an embodiment of the present invention. The target image 400 (a color image) shows a light green sea at a beach. The source image 402 (a color image) shows two children in a swimming pool whose water appears light blue. For the purposes of this discussion, and to map image features to the algorithms described herein, the target image 400 is referred to as f*, and the destination region 404 within the target image 400 is referred to as Ω. The boundary 412 of the destination region 404 is referred to as ∂Ω. Similarly, the source image 402 is referred to as g*, and the image data of the source region 416 is referred to as g. In the blended result image 410, the source region 416 is blended into the destination region 404 to produce a composite result image comprising image regions 418 and 420 (the blended image data f inside Ω, and the remainder of f*, i.e., the portion of the target image outside the destination region). Note that the blended result image was generated by inserting a source region containing each child into the target image.
In one embodiment of the present invention, the blending operation is accomplished using the following functional minimization as the guided interpolator (or "minimizer"):

min_f ∫∫_Ω |∇f − ∇g|²,

subject to the boundary condition f|∂Ω = f*|∂Ω, where

∇· = [∂·/∂x, ∂·/∂y]

is the gradient operator. In addition, other interpolators may be used, such as functions that are maximized, or functions that are optimized toward a given target, threshold, or range.
The associated Euler-Lagrange equation, i.e., the condition satisfied by the guided interpolator, is recognized as a Poisson equation with Dirichlet boundary conditions:

Δf = Δg over Ω, with f|∂Ω = f*|∂Ω,

where Δ· = ∂²·/∂x² + ∂²·/∂y² is the Laplacian operator.
The guided interpolation approach may be expressed more generally as a guided interpolator given by the functional minimization

min_f ∫∫_Ω |∇f − v|²,

subject to the boundary condition f|∂Ω = f*|∂Ω, where v denotes the interpolation guide and f denotes the "interpolant". This functional minimization has as its unique solution the solution of the following Poisson equation:

Δf = div v over Ω, with f|∂Ω = f*|∂Ω,

where

div v = ∂u/∂x + ∂v/∂y

is the divergence of v = (u, v).
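For illustration only (not part of the original disclosure), one possible discrete realization of this guided interpolator is sketched below in Python: the destination domain Ω is given as a boolean mask, the Poisson equation Δf = div v is discretized on a 4-neighbour grid, and the resulting sparse linear system is solved with SciPy. The function name, the forward/backward finite-difference scheme, and the assumption that Ω does not touch the image border are choices of this sketch rather than requirements of the described method.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def blend(target, mask, v_y, v_x):
    """Solve the discrete Poisson equation  laplacian(f) = div v  on the pixels
    where mask is True (the destination domain Omega), with the Dirichlet
    condition f = target on the pixels just outside that region.
    Operates on a single colour channel; assumes Omega does not touch the
    image border."""
    target = np.asarray(target, dtype=float)
    h, w = target.shape
    ys, xs = np.nonzero(mask)
    n = len(ys)
    index = -np.ones((h, w), dtype=int)
    index[ys, xs] = np.arange(n)

    # Divergence of the guide field v = (v_y, v_x), backward differences.
    div_v = np.array(v_y, dtype=float)
    div_v[1:, :] -= v_y[:-1, :]
    div_v += v_x
    div_v[:, 1:] -= v_x[:, :-1]

    rows, cols, vals = [], [], []
    rhs = np.empty(n)
    for k in range(n):
        y, x = ys[k], xs[k]
        rows.append(k); cols.append(k); vals.append(4.0)
        rhs[k] = -div_v[y, x]
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if mask[ny, nx]:
                rows.append(k); cols.append(index[ny, nx]); vals.append(-1.0)
            else:
                rhs[k] += target[ny, nx]      # boundary pixel: pinned to the target
    A = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))
    result = target.copy()
    result[ys, xs] = spla.spsolve(A, rhs)
    return result
```

In this sketch, supplying the forward-difference gradient of a source image g as the guide field corresponds to the seamless-cloning case Δf = Δg described above; for colour images the solver would be applied to each colour channel independently.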
Various choices of v allow blending to be performed in a variety of ways, as shown in the following table of example interpolation guides, although other interpolation guides are also contemplated within the scope of the present invention.
Interpolation guide name        Symbol   Interpolation guide definition
Identity                        I        v = ∇f*
Smooth                          S        v = 0
Filtering                       F        v = F(g)
Transparent                     T        v = 1[|∇f*| > |∇g|]·∇f* + 1[|∇g| > |∇f*|]·∇g
Masking                         M        v = 1[|∇f| > τ]·∇f
Weighted average combination    W        v = α·I + (1 − α)·F  (for example)
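The guide fields in the table can be built from ordinary finite-difference gradients. The following Python sketch is illustrative only; the helper names and the forward-difference scheme are assumptions of this sketch, and each guide is returned as a pair of gradient components (v_y, v_x) suitable for the solver sketched above.

```python
import numpy as np

def forward_grad(img):
    """Forward-difference gradient (gy, gx) of one colour channel."""
    img = np.asarray(img, dtype=float)
    gy = np.zeros_like(img); gx = np.zeros_like(img)
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    return gy, gx

def identity_guide(target):
    return forward_grad(target)               # v = grad f*: no change inside Omega

def smooth_guide(target):
    z = np.zeros(np.shape(target), dtype=float)
    return z, z.copy()                        # v = 0: the membrane interpolant

def filtering_guide(source):
    # One simple choice for F(g): the gradient of the source itself (seamless cloning).
    return forward_grad(source)

def transparent_guide(target, source):
    ty, tx = forward_grad(target)
    sy, sx = forward_grad(source)
    keep_target = np.hypot(ty, tx) > np.hypot(sy, sx)   # larger gradient wins
    return np.where(keep_target, ty, sy), np.where(keep_target, tx, sx)

def masking_guide(img, tau):
    gy, gx = forward_grad(img)
    keep = np.hypot(gy, gx) > tau             # keep only salient (above-threshold) gradients
    return np.where(keep, gy, 0.0), np.where(keep, gx, 0.0)

def weighted_guide(guide_a, guide_b, alpha):
    (ay, ax), (by, bx) = guide_a, guide_b     # convex combination of two guide fields
    return alpha * ay + (1 - alpha) * by, alpha * ax + (1 - alpha) * bx
```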
For example, the identity interpolation guide I results in no change to the target image (hence, f = f*). Alternatively, the smooth interpolation guide S alters the target image. The Euler-Lagrange equation associated with the smooth interpolation guide S is:

Δf = 0 over Ω, with f|∂Ω = f*|∂Ω,

which is recognized as Laplace's equation with Dirichlet boundary conditions. The unique solution of Laplace's equation with Dirichlet boundary conditions is often referred to as the "membrane interpolant". Solving the discrete form of this problem in each color channel yields a very smooth, or blurred, image region inside the domain Ω. The smooth interpolation guide S minimizes the gradient inside the domain (the "empty guide"), resulting in a very smooth interpolation of the colors found on the boundary of the domain. The smooth interpolation guide S may be used by computer graphics region-filling algorithms, including algorithms for filling polygons or arbitrary shapes.
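As an illustrative aside (not part of the original disclosure), the membrane interpolant can also be obtained iteratively: with v = 0, each interior pixel relaxes toward the average of its four neighbours while the boundary stays pinned to the target image. A minimal Jacobi-style sketch in Python, with the iteration count and initial guess being arbitrary choices:

```python
import numpy as np

def membrane_fill(target, mask, iters=2000):
    """Fill the masked region of one colour channel by solving the discrete
    Laplace equation with Jacobi iterations (the membrane interpolant).
    Assumes the mask does not touch the image border."""
    f = np.asarray(target, dtype=float).copy()
    f[mask] = f[~mask].mean()            # arbitrary initial guess inside Omega
    for _ in range(iters):
        avg = 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                      np.roll(f, 1, 1) + np.roll(f, -1, 1))
        f[mask] = avg[mask]              # relax interior pixels only
    return f
```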
Example effects of the transparent and masking interpolation guides are illustrated and discussed with respect to Figs. 7 and 8, respectively.
Fig. 5 depicts a blending system in an embodiment of the present invention. An image editing system 500 receives a target 504, such as the image data of a target image. A target selection module 508 receives a selection of a destination region within the target 504. As discussed above, the destination region may be defined through a paste operation, a clone operation, an outlining operation, or other methods of selecting a region of the target image.
In at least one embodiment, the image editing system 500 also receives a source 502, such as a source image. A source selection module 506 receives a selection of a source region within the source 502. It should be understood, however, that alternative embodiments, such as those involving the smooth or masking interpolation guides, do not require a source or a source selection module, because those interpolation guides are independent of any source.
A domain definition module 510 receives the one or more selections and defines the destination domain (and possibly a source domain). In one embodiment, a domain definition may take the form of ranges of X and Y coordinates in an image, although other methods of defining a domain are contemplated within the scope of the present invention, including without limitation arrays of image locations and region definition algorithms.
An image data collector 511 collects the image data contained in the input images, represented herein by the functions f*, f*|∂Ω, and g for the target image, the destination domain boundary, and possibly a source domain, respectively. A blended computation module 512 receives the destination domain definition, the target image data, and possibly source region information. The blended computation module 512 then uses these inputs to compute the image data of the blended result image 516, based on an interpolation guide derived or drawn from information stored in an interpolation guide datastore 515. An output module 514 receives the image data of the blended result image 516 and renders the blended result image 516 on a display, or outputs the image to a storage device or to another application.
Fig. 6 depicts operations for blending in an embodiment of the present invention. A receiving operation 600 receives a selection of a source (e.g., a source region). Another receiving operation 602 receives a selection of a target (e.g., a destination region). A defining operation 604 defines the source domain based on the selection of the source. Another defining operation 606 defines the destination domain based on the selection of the target.
A collecting operation 608 collects the image data contained in the target image, the destination domain, and possibly a source domain, represented herein by the functions f*, f*|∂Ω, and g, respectively. A providing operation 610 provides the guided interpolator, for example by accessing stored data or by providing a hard-coded algorithm. A computing operation 612 computes the blended result image from the collected image data and the guided interpolator. An outputting operation 614 outputs the blended result image to a display or other output device.
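For illustration only (not part of the original disclosure), the following minimal Python driver strings these operations together per colour channel; it reuses the blend() and filtering_guide() sketches above and substitutes synthetic arrays for the collected image data and the domain selection, all of which are assumptions of this sketch.

```python
import numpy as np
# Reuses the blend() and filtering_guide() sketches defined earlier in this description.

rng = np.random.default_rng(0)
source = rng.random((64, 64, 3))             # stand-in for the source image g*
target = rng.random((64, 64, 3))             # stand-in for the target image f*
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True                    # stand-in destination domain Omega

result = target.copy()
for c in range(3):                           # blend each colour channel separately
    vy, vx = filtering_guide(source[..., c])     # here: seamless cloning, v = grad g
    result[..., c] = blend(target[..., c], mask, vy, vx)
```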
Fig. 7 depicts blended results obtained from a transparent interpolation guide in an embodiment of the present invention. Using the filtering interpolation guide F, a source image 700 is blended into a target image 702 to produce the blended result image 704. As is evident from the result image 704, however, the spaces between and inside the letters of the word "BLEND" are not rendered realistically. In contrast, a second interpolation guide, referred to as the transparent guide, was used to produce the blended result image 706. The transparent guide is given in the table above.
An alternative representation of the transparent guide is:

for all x ∈ Ω:  v(x) = ∇f*(x) if |∇f*(x)| > |∇g(x)|, and ∇g(x) otherwise.

The transparent interpolation guide T selects, at each location in the destination region, the larger image gradient (taken either from the source image data or from the target image data). In this manner, the blended result image contains, at each given location within the destination domain, the more salient features of the source region or of the destination region.
Fig. 8 depicts blended results obtained from a masking interpolation guide M in an embodiment of the present invention. In the masking interpolation guide, gradients are retained according to a predefined criterion, including without limitation being below a threshold, being above a threshold, lying within a given range, or satisfying a given target.
As indicated above, the masking interpolation guide M filters the target image data so that only salient features are produced within the destination region of the blended result image. For example, using the masking interpolation guide M shown in the preceding table, the illumination of image data in low-gradient (or, alternatively, high-gradient) regions of the source region 804 of the source image 800 is attenuated, producing the cartoon-like effect shown in the blended result image 802.
Alternatively, the masking interpolation guide M may be modified to provide a band-pass filter range, for example by retaining only those gradients whose magnitude lies between a lower threshold τ₁ and an upper threshold τ₂:

v = 1[τ₁ < |∇f| < τ₂]·∇f
Fig. 9 illustrates blending into a region of the target image beyond the source domain in an embodiment of the present invention. A target image 900 is shown having a destination region defined by an object boundary ∂Ω 902. A blending region T 906 is bounded on the inside by a blending boundary ∂T 904 and on the outside by the object boundary ∂Ω 902. A guided interpolator may be provided to produce a controlled amount of blending within the region T between the object boundary ∂Ω 902 and the blending boundary ∂T 904:

min_f ∫∫_Ω |∇f − v|² + α ∫∫_T |f − f*|²,

subject to the boundary condition f|∂Ω = f*|∂Ω. The magnitude of the blending effect within this region is controlled by the weighting parameter α.
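For completeness (an illustrative assumption of this discussion, not a statement from the original disclosure), one possible discretization of this extended functional on a 4-neighbour grid, writing N_p for the neighbours of pixel p and v_pq for the guide projected onto the edge (p, q), simply adds a weighted data term for pixels inside T:

\[
\bigl(|N_p| + \alpha\,\mathbf{1}_{p\in T}\bigr)\,f_p \;-\; \sum_{q\in N_p\cap\Omega} f_q
\;=\; \sum_{q\in N_p\cap\partial\Omega} f^{*}_q \;+\; \sum_{q\in N_p} v_{pq} \;+\; \alpha\,\mathbf{1}_{p\in T}\,f^{*}_p .
\]

Setting α = 0 recovers the plain guided interpolator of the preceding sections, while increasing α pulls the solution toward the target image f* inside the blending region T.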
The exemplary hardware and operating environment of Fig. 10 for implementing the invention includes a general-purpose computing device in the form of a computer 20, including a processing unit 21, a system memory 22, and a system bus 23 that operatively couples various system components, including the system memory, to the processing unit 21. There may be only one or more than one processing unit 21, such that the processor of the computer 20 comprises a single central processing unit (CPU) or a plurality of processing units, commonly referred to as a parallel processing environment. The computer 20 may be a conventional computer, a distributed computer, or any other type of computer; the invention is not so limited.
The system bus 23 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory may also be referred to simply as the memory, and includes read-only memory (ROM) 24 and random-access memory (RAM) 25. A basic input/output system (BIOS) 26, containing the basic routines that help to transfer information between elements within the personal computer 20, such as during start-up, is stored in ROM 24. The computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk (not shown), a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD-ROM or other optical media.
The hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical disk drive interface 34, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules, and other data for the computer 20. It should be appreciated by those skilled in the art that any type of computer-readable media capable of storing data accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random-access memories (RAMs), and read-only memories (ROMs), may also be used in this exemplary operating environment.
A number of program modules may be stored on the hard disk, magnetic disk 29, optical disk 31, ROM 24, or RAM 25, including an operating system 35, one or more application programs 36, other program modules 37, and program data 38. A user may enter commands and information into the personal computer 20 through input devices such as a keyboard 40 and a pointing device 42. Other input devices (not shown) may include a microphone, a joystick, a game pad, a satellite dish, a scanner, or the like. These and other input devices are often connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, a game port, or a universal serial bus (USB). A monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the monitor, computers typically include other peripheral output devices (not shown), such as speakers and printers.
The personal computer 20 may operate in a networked environment using logical connections to one or more remote computers, such as remote computer 49. These logical connections are achieved by a communication device coupled to, or forming a part of, the computer 20. The remote computer 49 may be another computer, a server, a router, a network PC, a client, a peer device, or another common network node, and typically includes many or all of the elements described above relative to the personal computer 20, although only a memory storage device 50 has been illustrated in Fig. 10. The logical connections depicted in Fig. 10 include a local area network (LAN) 51 and a wide area network (WAN) 52. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet, all of which are types of networks.
When used in a LAN networking environment, the computer 20 is connected to the local area network 51 through a network interface or adapter 53, which is one type of communication device. When used in a WAN networking environment, the computer 20 typically includes a modem 54, a type of communication device, or any other type of device for establishing communications over the wide area network 52. The modem 54, which may be internal or external, is connected to the system bus 23 via the serial port interface 46. In a networked environment, program modules depicted relative to the personal computer 20, or portions thereof, may be stored in the remote memory storage device. It is appreciated that the network connections shown are exemplary, and that other means of, and communication devices for, establishing a communications link between the computers may be used.
In one embodiment of the present invention, the selection modules, the domain definition module, the blended computation module, and the output module may be incorporated as part of the operating system 35, the application programs 36, or other program modules 37. The interpolation guide datastore and the image data may be stored as program data 38.
The embodiments of the invention described herein are implemented as logical steps in one or more computer systems. The logical operations of the present invention are implemented (1) as a sequence of processor-implemented steps executing in one or more computer systems and (2) as interconnected machine modules within one or more computer systems. The implementation is a matter of choice, dependent on the performance requirements of the computer system implementing the invention. Accordingly, the logical operations making up the embodiments of the invention described herein are referred to variously as operations, steps, objects, or modules.
The above specification, examples, and data provide a complete description of the structure and use of exemplary embodiments of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.

Claims (26)

1. A method of computing a blended result image from a target image, the method comprising:
defining a destination domain having a boundary within the target image;
providing a guided interpolator comprising an interpolation guide and a boundary condition associated with the boundary of the destination domain; and
computing the blended result image based on the guided interpolator to satisfy the boundary condition at the boundary of the destination domain and to minimize a difference between the interpolation guide and a gradient of the blended result image across the destination domain.
2. The method of claim 1, wherein the providing operation comprises:
providing a masking interpolation guide as the interpolation guide, the masking interpolation guide comprising a gradient of the target image within the destination domain, wherein the gradient of the target image satisfies a predefined criterion.
3. The method of claim 1, wherein at least a portion of a source image is blended into the target image to generate the blended result image, and the interpolation guide is a function of the source image.
4. The method of claim 1, wherein at least a portion of a source image is blended into the target image to generate the blended result image, the method further comprising:
defining a source domain having a boundary within the source image.
5. The method of claim 4, wherein the providing operation comprises:
providing a filtering interpolation guide as the interpolation guide, the filtering interpolation guide comprising a gradient of the source image within the source domain.
6. The method of claim 4, wherein the providing operation comprises:
providing a transparent interpolation guide as the interpolation guide, the transparent interpolation guide comprising the greater of a gradient of the source image within the source domain and a gradient of the target image within the destination domain.
7. The method of claim 1, wherein the boundary condition requires each value of image data in the blended result image at the boundary of the destination domain to equal the corresponding value of image data in the target image at that boundary.
8. The method of claim 1, wherein the guided interpolator is represented by the functional minimization
min_f ∫∫_Ω |∇f − v|²,
with boundary condition f|∂Ω = f*|∂Ω, where v denotes the interpolation guide, f denotes the blended result image within the destination domain Ω, and f* denotes the blended result image outside the destination domain.
9. A computer program product encoding a computer program for executing, on a computer system, a computer process for computing a blended result image from a target image, the computer process comprising:
defining a destination domain having a boundary within the target image;
providing a guided interpolator comprising an interpolation guide and a boundary condition associated with the boundary of the destination domain; and
computing the blended result image based on the guided interpolator to satisfy the boundary condition at the boundary of the destination domain and to minimize a difference between the interpolation guide and a gradient of the blended result image across the destination domain.
10. The computer program product of claim 9, wherein the providing operation comprises:
providing a masking interpolation guide as the interpolation guide, the masking interpolation guide comprising a gradient of the target image within the destination domain, wherein the gradient of the target image satisfies a predefined criterion.
11. The computer program product of claim 9, wherein at least a portion of a source image is blended into the target image to generate the blended result image, and the interpolation guide is a function of the source image.
12. The computer program product of claim 9, wherein at least a portion of a source image is blended into the target image to generate the blended result image, the computer process further comprising:
defining a source domain having a boundary within the source image.
13. The computer program product of claim 12, wherein the providing operation comprises:
providing a filtering interpolation guide as the interpolation guide, the filtering interpolation guide comprising a gradient of the source image within the source domain.
14. The computer program product of claim 12, wherein the providing operation comprises:
providing a transparent interpolation guide as the interpolation guide, the transparent interpolation guide comprising the greater of a gradient of the source image within the source domain and a gradient of the target image within the destination domain.
15. The computer program product of claim 9, wherein the boundary condition requires each value of image data in the blended result image at the boundary of the destination domain to equal the corresponding value of image data in the target image at that boundary.
16. The computer program product of claim 9, wherein the guided interpolator is represented by the functional minimization
min_f ∫∫_Ω |∇f − v|²,
with boundary condition f|∂Ω = f*|∂Ω, where v denotes the interpolation guide, f denotes the blended result image within the destination domain Ω, and f* denotes the blended result image outside the destination domain.
17. A system for computing a blended result image from a target image, the system comprising:
a domain definition module that defines a destination domain having a boundary within the target image;
an interpolation guide datastore that stores a guided interpolator comprising an interpolation guide and a boundary condition associated with the boundary of the destination domain; and
a blended computation module that computes the blended result image based on the guided interpolator to satisfy the boundary condition at the boundary of the destination domain and to minimize a difference between the interpolation guide and a gradient of the blended result image across the destination domain.
18. The system of claim 17, wherein the interpolation guide comprises a gradient of the target image within the destination domain.
19. The system of claim 17, wherein the interpolation guide equals zero.
20. The system of claim 17, wherein the interpolation guide comprises a gradient of the target image within the destination domain, and wherein the gradient of the target image satisfies a predefined criterion.
21. The system of claim 17, wherein at least a portion of a source image is blended into the target image to generate the blended result image, and the interpolation guide is a function of the source image.
22. The system of claim 17, wherein at least a portion of a source image is blended into the target image to generate the blended result image, and a source domain having a boundary is defined within the source image.
23. The system of claim 22, wherein the interpolation guide comprises a gradient of the source image within the source domain.
24. The system of claim 22, wherein the interpolation guide comprises the greater of a gradient of the source image within the source domain and a gradient of the target image within the destination domain.
25. The system of claim 17, wherein the boundary condition requires each value of image data in the blended result image at the boundary of the destination domain to equal the corresponding value of image data in the target image at that boundary.
26. The system of claim 17, wherein the guided interpolator is represented by the functional minimization
min_f ∫∫_Ω |∇f − v|²,
with boundary condition f|∂Ω = f*|∂Ω, where v denotes the interpolation guide, f denotes the blended result image within the destination domain Ω, and f* denotes the blended result image outside the destination domain.
CNB2004800030105A 2003-02-25 2004-01-07 Image blending by guided interpolation Expired - Fee Related CN100386772C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US45007803P 2003-02-25 2003-02-25
US60/450,078 2003-02-25
US10/417,937 2003-04-16

Publications (2)

Publication Number Publication Date
CN1745386A true CN1745386A (en) 2006-03-08
CN100386772C CN100386772C (en) 2008-05-07

Family

ID=36140016

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2004800030105A Expired - Fee Related CN100386772C (en) 2003-02-25 2004-01-07 Image blending by guided interpolation

Country Status (1)

Country Link
CN (1) CN100386772C (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101964110A (en) * 2009-07-23 2011-02-02 三星电子株式会社 Method and system for creating an image
CN101655969B (en) * 2008-08-19 2011-12-21 方正国际软件(北京)有限公司 Method for editing color gradient attributes
US8165396B2 (en) 2008-07-15 2012-04-24 Hon Hai Precision Industry Co., Ltd. Digital image editing system and method for combining a foreground image with a background image
CN106339997A (en) * 2015-07-09 2017-01-18 株式会社理光 Image fusion method, device and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4602286A (en) * 1982-01-15 1986-07-22 Quantel Limited Video processing for composite images
US5022085A (en) * 1990-05-29 1991-06-04 Eastman Kodak Company Neighborhood-based merging of image data
AU727503B2 (en) * 1996-07-31 2000-12-14 Canon Kabushiki Kaisha Image filtering method and apparatus
US5870103A (en) * 1996-09-25 1999-02-09 Eastman Kodak Company Method for creating realistic-looking composite images
JP4114191B2 (en) * 1997-06-24 2008-07-09 株式会社セガ Image processing apparatus and image processing method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8165396B2 (en) 2008-07-15 2012-04-24 Hon Hai Precision Industry Co., Ltd. Digital image editing system and method for combining a foreground image with a background image
CN101631189B (en) * 2008-07-15 2012-08-29 鸿富锦精密工业(深圳)有限公司 Image synthesis system and method
CN101655969B (en) * 2008-08-19 2011-12-21 方正国际软件(北京)有限公司 Method for editing color gradient attributes
CN101964110A (en) * 2009-07-23 2011-02-02 三星电子株式会社 Method and system for creating an image
US8830251B2 (en) 2009-07-23 2014-09-09 Samsung Electronics Co., Ltd. Method and system for creating an image
CN101964110B (en) * 2009-07-23 2016-03-16 三星电子株式会社 For creating the method and system of image
CN106339997A (en) * 2015-07-09 2017-01-18 株式会社理光 Image fusion method, device and system
CN106339997B (en) * 2015-07-09 2019-08-09 株式会社理光 Image interfusion method, equipment and system

Also Published As

Publication number Publication date
CN100386772C (en) 2008-05-07

Similar Documents

Publication Publication Date Title
JP4316571B2 (en) Image synthesis by guided interpolation
US9135732B2 (en) Object-level image editing
AU2003204466B2 (en) Method and system for enhancing portrait images
US6987520B2 (en) Image region filling by exemplar-based inpainting
US20090297022A1 (en) Color correcting method and apparatus
CN1147822C (en) Method and system for image processing
JP4541786B2 (en) Method and apparatus for generating blur
CN101606179B (en) Universal front end for masks, selections and paths
JPH1166337A (en) Image compositing method by computer illustration system
US20060239548A1 (en) Segmentation of digital images
US20040227766A1 (en) Multilevel texture processing method for mapping multiple images onto 3D models
WO2019207296A1 (en) Method and system for providing photorealistic changes for digital image
CA2519627A1 (en) Selective enhancement of digital images
EP1826724B1 (en) Object-level image editing using tiles of image data
US11544889B2 (en) System and method for generating an animation from a template
CN1745386A (en) Image blending by guided interpolation
JPH1166286A (en) Method for simplifying scene image compositing by computer illustration system
CN113870287A (en) Vehicle VR interior processing method, computing device and storage medium
JP2005100120A (en) Composite image preparing method and device and its program
KR102606373B1 (en) Method and apparatus for adjusting facial landmarks detected in images
Plante et al. Evaluating the location of hot spots in interactive scenes using the 3R toolbox
CN101263529A (en) 2D editing metaphor for 3D graphics
Peutz Image Manipulation with Phoenix
Geng et al. Artistic Painting by Reference Images
JPH06337925A (en) Method and apparatus for generation of control data string

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150521

C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20150521

Address after: Washington State

Patentee after: Microsoft Technology Licensing, LLC

Address before: Washington State

Patentee before: Microsoft Corp.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20080507

Termination date: 20180107