CN105574834B - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN105574834B
CN105574834B CN201510982411.XA
Authority
CN
China
Prior art keywords
image
edge
input image
diffusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510982411.XA
Other languages
Chinese (zh)
Other versions
CN105574834A (en)
Inventor
侯文迪
陈志军
王百超
Current Assignee
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date
Filing date
Publication date
Application filed by Xiaomi Inc
Priority to CN201510982411.XA
Publication of CN105574834A
Application granted
Publication of CN105574834B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The disclosure relates to an image processing method and device. The image processing method includes: obtaining an edge image of an input image; performing diffusion processing on the edge image to obtain an edge-diffusion image; performing feature-flow filtering on the edge-diffusion image to obtain a first image; performing stretch processing on the input image, or on a grayscale image of the input image, to obtain a second image; and performing fusion processing according to an input texture image, the first image, and the second image to obtain a target image. In the embodiments of the disclosure, diffusion processing is performed on the acquired edge image to obtain an edge-diffusion image, feature-flow filtering is performed on the edge-diffusion image to obtain a first image, stretch processing is performed on the input image or its grayscale image to obtain a second image, and fusion processing is then performed according to the input texture image, the first image, and the second image, so that the generated target image has a good non-photorealistic effect.

Description

Image processing method and device
Technical field
The present disclosure relates to the technical field of image processing, and in particular to an image processing method and device.
Background technology
In recent years, with the rapid development of image processing technology, non-photorealistic processing of videos, images, and three-dimensional models has become increasingly popular. Non-photorealistic processing refers to techniques that use a computer to generate graphics that do not have photographic realism but instead have a hand-painted style. The goal of non-photorealistic rendering is not the realism of the graphics, but rather the expression of their artistic qualities, the simulation of artistic works, or an effective complement to photorealistic graphics.
How to process an image with non-photorealistic processing technology so that the processed image has the intended special effect is currently a technical problem that urgently needs to be solved.
Summary of the invention
To overcome the problems in the related art, the present disclosure provides an image processing method and device.
According to a first aspect of the embodiments of the present disclosure, an image processing method is provided, including:
obtaining an edge image of an input image;
performing diffusion processing on the edge image to obtain an edge-diffusion image;
performing feature-flow filtering on the edge-diffusion image to obtain a first image;
performing stretch processing on the input image or a grayscale image of the input image to obtain a second image;
performing fusion processing according to an input texture image, the first image, and the second image to obtain a target image.
In one embodiment, obtaining the edge image of the input image includes:
if the input image is a color image, converting the input color image to a grayscale image, and extracting the edge image of the grayscale image;
if the input image is a grayscale image, directly extracting the edge image of the grayscale image.
In one embodiment, performing diffusion processing on the edge image to obtain the edge-diffusion image includes:
performing diffusion processing on the edge image to obtain an edge-response image;
performing diffusion processing on the edge-response image to obtain the edge-diffusion image.
In one embodiment, performing feature-flow filtering on the edge-diffusion image includes:
calculating an edge tangent direction of the input image, and performing linear convolution on the edge-diffusion image according to the edge tangent direction.
In one embodiment, performing stretch processing on the input image or the grayscale image of the input image to obtain the second image includes:
obtaining a correspondence between images and histogram curves;
querying the correspondence according to the input image or the grayscale image of the input image to obtain a target histogram curve;
obtaining a histogram distribution of the input image, and adjusting the histogram distribution of the input image according to the target histogram curve to generate the second image.
In one embodiment, performing fusion processing according to the input texture image, the first image, and the second image to obtain the target image includes:
performing texture rendering according to the texture image and the second image to obtain a third image;
performing edge-contour fusion processing according to the first image and the third image to obtain the target image.
In one embodiment, the method further includes:
converting an RGB (red-green-blue) color space to an LAB (luminance-chrominance) color space, and expressing the target image as the luminance of the LAB color space;
converting the LAB color space back to the RGB color space to obtain a color image of the target image.
According to a second aspect of the embodiments of the present disclosure, an image processing device is provided, including:
an acquisition module configured to obtain an edge image of an input image;
a diffusion processing module configured to perform diffusion processing on the edge image obtained by the acquisition module to obtain an edge-diffusion image;
a filtering module configured to perform feature-flow filtering on the edge-diffusion image obtained by the diffusion processing module to obtain a first image;
a stretch processing module configured to perform stretch processing on the input image or a grayscale image of the input image to obtain a second image;
a fusion processing module configured to perform fusion processing according to an input texture image, the first image obtained by the filtering module, and the second image obtained by the stretch processing module to obtain a target image.
In one embodiment, the acquisition module includes:
a conversion-and-extraction submodule configured to, if the input image is a color image, convert the input color image to a grayscale image and extract the edge image of the grayscale image;
an extraction submodule configured to, if the input image is a grayscale image, directly extract the edge image of the grayscale image.
In one embodiment, the diffusion processing module includes:
a first diffusion processing submodule configured to perform diffusion processing on the edge image to obtain an edge-response image;
a second diffusion processing submodule configured to perform diffusion processing on the edge-response image obtained by the first diffusion processing submodule to obtain the edge-diffusion image.
In one embodiment, the filtering module is configured to:
calculate an edge tangent direction of the input image, and perform linear convolution on the edge-diffusion image according to the edge tangent direction.
In one embodiment, the stretch processing module includes:
a first acquisition submodule configured to obtain a correspondence between images and histogram curves;
a second acquisition submodule configured to query the correspondence obtained by the first acquisition submodule according to the input image or the grayscale image of the input image to obtain a target histogram curve;
an adjustment submodule configured to obtain a histogram distribution of the input image and adjust the histogram distribution of the input image according to the target histogram curve obtained by the second acquisition submodule to generate the second image.
In one embodiment, the fusion processing module includes:
a texture rendering submodule configured to perform texture rendering according to the texture image and the second image to obtain a third image;
an edge-contour fusion submodule configured to perform edge-contour fusion processing according to the first image and the third image obtained by the texture rendering submodule to obtain the target image.
In one embodiment, the device further includes:
a conversion-and-representation module configured to convert an RGB (red-green-blue) color space to an LAB (luminance-chrominance) color space and express the target image as the luminance of the LAB color space;
a back-conversion module configured to convert the LAB color space back to the RGB color space to obtain a color image of the target image.
According to a third aspect of the embodiments of the present disclosure, an image processing device is provided, including:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
obtain an edge image of an input image;
perform diffusion processing on the edge image to obtain an edge-diffusion image;
perform feature-flow filtering on the edge-diffusion image to obtain a first image;
perform stretch processing on the input image or a grayscale image of the input image to obtain a second image;
perform fusion processing according to an input texture image, the first image, and the second image to obtain a target image.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects: diffusion processing is performed on the acquired edge image to obtain an edge-diffusion image; feature-flow filtering is performed on the edge-diffusion image to obtain a first image; stretch processing is performed on the input image or a grayscale image of the input image to obtain a second image; and fusion processing is then performed according to the input texture image, the first image, and the second image to obtain a target image, so that the generated target image has a good non-photorealistic effect.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and, together with the description, serve to explain the principles of the present invention.
Fig. 1 is a flow chart of an image processing method according to an exemplary embodiment.
Fig. 2 is a flow chart of obtaining an edge-diffusion image according to an exemplary embodiment.
Fig. 3 is a flow chart of obtaining a second image according to an exemplary embodiment.
Fig. 4 is a flow chart of obtaining a target image according to an exemplary embodiment.
Fig. 5 is a flow chart of another image processing method according to an exemplary embodiment.
Fig. 6 is a block diagram of an image processing device according to an exemplary embodiment.
Fig. 7 is a block diagram of another image processing device according to an exemplary embodiment.
Fig. 8 is a block diagram of another image processing device according to an exemplary embodiment.
Fig. 9 is a block diagram of another image processing device according to an exemplary embodiment.
Fig. 10 is a block diagram of another image processing device according to an exemplary embodiment.
Fig. 11 is a block diagram of another image processing device according to an exemplary embodiment.
Fig. 12 is a block diagram of a device suitable for image processing according to an exemplary embodiment.
Detailed description
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of devices and methods consistent with some aspects of the invention as detailed in the appended claims.
Fig. 1 is a flow chart of an image processing method according to an exemplary embodiment. As shown in Fig. 1, the image processing method can be applied to an image processing apparatus and includes the following steps S101-S105:
In step S101, an edge image of an input image is obtained.
In this embodiment, if the input image is a color image, the input color image can be converted to a grayscale image, and the edge image of the grayscale image is then extracted. If the input image is a grayscale image, the edge image of the grayscale image can be extracted directly.
Assuming the converted grayscale image, or the input grayscale image, is G, the edge image E of the grayscale image can be extracted by the following formulas:
E = √(Gx² + Gy²), with Gx = hx ⊗ G and Gy = hy ⊗ G,
where Gx denotes the gradient of the grayscale image in the horizontal direction, Gy denotes the gradient in the vertical direction, hx denotes the horizontal template, hy denotes the vertical template, and ⊗ denotes the convolution operation.
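The gradient-template edge extraction described above can be sketched as follows. This is an illustrative implementation, not part of the patent: the 3×3 Sobel templates used for hx and hy are an assumption, since the text does not fix the templates.

```python
import numpy as np

def edge_image(G):
    """Convolve grayscale image G with horizontal/vertical templates
    hx, hy and return the gradient magnitude E = sqrt(Gx^2 + Gy^2)."""
    hx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal template (assumed Sobel)
    hy = hx.T                                                         # vertical template
    pad = np.pad(G.astype(float), 1, mode="edge")
    Gx = np.zeros(G.shape, dtype=float)
    Gy = np.zeros(G.shape, dtype=float)
    for i in range(G.shape[0]):
        for j in range(G.shape[1]):
            win = pad[i:i + 3, j:j + 3]
            Gx[i, j] = np.sum(win * hx)   # horizontal gradient
            Gy[i, j] = np.sum(win * hy)   # vertical gradient
    return np.sqrt(Gx ** 2 + Gy ** 2)
```

A flat region yields zero response, while a step edge yields a strong response along the boundary, which is the behavior the diffusion step below builds on.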
In step S102, diffusion processing is performed on the edge image to obtain an edge-diffusion image.
As shown in Fig. 2, step S102 may include the following steps:
In step S1021, diffusion processing is performed on the edge image to obtain an edge-response image.
In this embodiment, edge diffusion in 8 directions can be performed on the edge image, so that each pixel obtains 8 responses; the maximum response over the directions is then taken as the edge-response image.
In step S1022, diffusion processing is performed on the edge-response image to obtain the edge-diffusion image.
After the edge-response image is obtained, a second diffusion pass can be performed on it, i.e., the edge-response image is diffused in 8 directions and the 8 directional response values are accumulated, yielding the final edge-diffusion image S as the sum of the 8 directional responses.
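The two diffusion passes of steps S1021 and S1022 can be sketched as follows. The exact per-direction diffusion operator is not given in the text, so plain propagation of each pixel's 8 neighbour values is assumed here: a maximum in the first pass (edge-response image) and a sum in the second (edge-diffusion image S).

```python
import numpy as np

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]  # the 8 diffusion directions

def shift(img, di, dj):
    """Fetch each pixel's neighbour in direction (di, dj), replicating
    border values."""
    out = np.pad(img, 1, mode="edge")
    return out[1 + di:1 + di + img.shape[0], 1 + dj:1 + dj + img.shape[1]]

def diffuse(E):
    """Pass 1: per-pixel maximum of the 8 directional responses.
    Pass 2: per-pixel sum of the 8 directional responses of pass 1."""
    responses = np.stack([shift(E, di, dj) for di, dj in OFFSETS])
    R = responses.max(axis=0)                         # edge-response image
    S = sum(shift(R, di, dj) for di, dj in OFFSETS)   # edge-diffusion image
    return R, S
```

A single bright edge pixel spreads to its 8-neighbourhood in the first pass and is accumulated further in the second, thickening the stroke before feature-flow filtering.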
In step S103, feature-flow filtering is performed on the edge-diffusion image to obtain a first image.
Here, performing feature-flow filtering on the edge-diffusion image may be: calculating the edge tangent direction of the input image, and performing linear convolution on the edge-diffusion image according to the edge tangent direction.
In this embodiment, the process of calculating the edge tangent direction of the input image can be:
calculating the edge tangent direction of the input image at each pixel position, i.e., obtaining the eigenvector of the structure tensor at the current pixel position. The structure tensor T can be obtained by the following formula:
T = G * [Ix·Ix, Ix·Iy; Ix·Iy, Iy·Iy]
where G is a Gaussian function, Ix is the gradient in the horizontal direction, and Iy is the gradient in the vertical direction. The eigenvector U of the structure tensor then denotes the smoothed edge gradient vector and satisfies T·U = λ2·U,
where λ2 denotes the eigenvalue.
In this embodiment, the process of performing linear convolution on the edge-diffusion image according to the edge tangent direction can be:
at any point of the edge-diffusion image, a direction line of a certain length is taken forward and backward along the tangent direction, and the Gaussian-weighted accumulated value along the line is taken as the expected value of the current pixel, thereby obtaining the first image; the realization process can be expressed as
S_lic(p) = ( Σ_{i = −b..f} w(i) · Texture(p + i·t(p)) ) / ( Σ_{i = −b..f} w(i) )
where w denotes the weight along the direction line, t(p) is the unit edge-tangent vector at pixel p, the length of the direction line is set to L = 2*3σ + 1 for the convolution, σ is the Gaussian weighting value, f denotes the number of pixels computed forward, b denotes the number of pixels computed backward, and Texture denotes the edge-diffusion image.
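The direction-line convolution described above can be sketched per pixel as follows. This is a minimal illustrative sketch: the flow field is assumed to be a per-pixel unit tangent vector, f = b = 3σ steps are assumed, and samples are taken at rounded, border-clamped positions.

```python
import math

def lic_at(texture, flow, i, j, sigma=1.0):
    """Gaussian-weighted line integral along the local tangent
    direction flow[i][j], sampling the edge-diffusion image `texture`
    (a list of lists) f steps forward and b steps backward."""
    h, w = len(texture), len(texture[0])
    half = int(3 * sigma)                  # f = b = 3*sigma (L = 2*3*sigma + 1)
    total, weight = 0.0, 0.0
    for step in range(-half, half + 1):
        di, dj = flow[i][j]                # unit tangent at the start pixel
        y = min(max(int(round(i + step * di)), 0), h - 1)
        x = min(max(int(round(j + step * dj)), 0), w - 1)
        wgt = math.exp(-step * step / (2 * sigma * sigma))  # Gaussian weight w
        total += wgt * texture[y][x]
        weight += wgt
    return total / weight                  # weighted mean as the pixel value
```

On a constant texture the filter is a no-op, while along a coherent edge it smears the edge-diffusion response into a continuous stroke.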
In this embodiment, by performing linear convolution according to the feature flow of the input image, edge smoothing is strengthened, which greatly improves the processing effect of the subsequent images.
In step S104, stretch processing is performed on the input image or the grayscale image of the input image to obtain a second image.
As shown in Fig. 3, step S104 may include the following steps:
In step S1041, a correspondence between images and histogram curves is obtained.
In this embodiment, a collection of similar photos can be gathered in advance to derive statistics on the correspondence between images and histogram curves.
In step S1042, the correspondence between images and histogram curves is queried according to the input image or the grayscale image of the input image to obtain a target histogram curve.
In this embodiment, if the input image is a grayscale image, the input image can be used to query the correspondence between images and histogram curves to obtain the target histogram curve. If the input image is a color image, its grayscale image is obtained and used to query the correspondence to obtain the target histogram curve.
In step S1043, the histogram distribution of the input image is obtained and adjusted according to the target histogram curve to generate the second image.
In this embodiment, after the histogram distribution of the input image is obtained, it can be adjusted according to the target histogram curve to generate the second image. Various histogram matching methods may be used in adjusting the histogram distribution of the input image according to the target histogram curve, such as the single mapping law (SML) and group mapping law (GML) matching methods.
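The single-mapping-law (SML) matching mentioned above can be sketched as follows: each source gray level is mapped to the target gray level whose cumulative distribution value is nearest. This is an illustrative sketch of SML only; the patent also allows GML and other matching methods.

```python
def match_histogram(src_hist, target_hist):
    """Return a gray-level lookup table mapping the source histogram
    onto the target histogram curve via nearest-CDF matching (SML).
    Both histograms are lists of counts over the same gray levels."""
    def cdf(h):
        total = float(sum(h))
        acc, out = 0.0, []
        for c in h:
            acc += c
            out.append(acc / total)
        return out
    cs, ct = cdf(src_hist), cdf(target_hist)
    mapping = []
    for v in cs:
        # single mapping law: nearest target CDF value wins
        best = min(range(len(ct)), key=lambda k: abs(ct[k] - v))
        mapping.append(best)
    return mapping
```

Applying the returned lookup table to every pixel of the input image reshapes its histogram toward the target curve, which is the stretch processing that produces the second image.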
In step S105, fusion processing is performed according to the input texture image, the first image, and the second image to obtain a target image.
As shown in Fig. 4, step S105 may include the following steps:
In step S1051, texture rendering is performed according to the texture image and the second image to obtain a third image.
In this embodiment, the parameter β* can be obtained by calculating the minimum value of the following expression:
β* = argmin_β ‖β(x) ln P(x) − ln T(x)‖² + λ‖∇β‖²
where λ takes the value 0.2 and P(x) is the input texture image; the expression is derived from P(x)^β(x) ≈ T(x), where T(x) is the second image. The third image is then:
P(x)^β*(x)
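Ignoring the smoothness term (weight λ = 0.2), the relation P(x)^β(x) ≈ T(x) has the per-pixel closed form β = ln T / ln P, which the following illustrative sketch uses; the regularized minimisation of the patent is deliberately simplified away, and values are clamped away from 0 and 1 so the logarithms are defined.

```python
import math

def texture_render(P, T, eps=1e-4):
    """Per-pixel texture rendering sketch: recover beta from
    P**beta ~= T and return the third image P**beta.
    P (texture image) and T (second image) are lists of lists in [0, 1]."""
    out = []
    for p_row, t_row in zip(P, T):
        row = []
        for p, t in zip(p_row, t_row):
            p = min(max(p, eps), 1 - eps)   # clamp so ln(p) != 0
            t = min(max(t, eps), 1 - eps)
            beta = math.log(t) / math.log(p)
            row.append(p ** beta)           # equals t up to clamping
        out.append(row)
    return out
```

Without the smoothness term the result reproduces the tone of the second image exactly; the λ-weighted gradient penalty in the full minimisation trades some of that fidelity for a spatially smoother β map.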
In step S1052, edge-contour fusion processing is performed according to the first image and the third image to obtain the target image.
In this embodiment, the target image S can be obtained by the following formula:
S = Slic * T
where T is the third image and Slic is the first image.
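The fusion S = Slic * T reduces to a per-pixel product; a minimal sketch, assuming both images are normalised to [0, 1]:

```python
def fuse(S_lic, T):
    """Edge-contour fusion: per-pixel product of the feature-flow
    filtered first image S_lic and the texture-rendered third image T
    (both lists of lists in [0, 1])."""
    return [[a * b for a, b in zip(r1, r2)] for r1, r2 in zip(S_lic, T)]
```

Because the product darkens a pixel whenever either input is dark, the stroke lines of the first image are stamped onto the tonal rendering of the third image.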
The image processing method provided by this embodiment can be used for, but is not limited to, processing portrait drawings, landscape drawings, and the like.
In the above image processing method embodiment, diffusion processing is performed on the acquired edge image to obtain an edge-diffusion image; feature-flow filtering is performed on the edge-diffusion image to obtain a first image; stretch processing is performed on the input image or its grayscale image to obtain a second image; and fusion processing is then performed according to the input texture image, the first image, and the second image to obtain a target image, so that the generated target image has a good non-photorealistic effect.
Fig. 5 is a flow chart of another image processing method according to an exemplary embodiment. As shown in Fig. 5, after step S105 the method may further include the following steps:
In step S106, an RGB (red-green-blue) color space is converted to an LAB (luminance-chrominance) color space, and the target image is expressed as the luminance of the LAB color space.
The target image obtained in the embodiment shown in Fig. 1 is a single-channel image; if a color image is desired, color conversion processing needs to be performed on the target image.
In this embodiment, the target image is first transformed from the RGB color space to the LAB color space, and the target image is expressed as the luminance of the LAB color space.
In step S107, the LAB color space is converted back to the RGB color space to obtain a color image of the target image.
After the target image has been expressed as the luminance of the LAB color space, the LAB color space can be converted back to the RGB color space to obtain the color image of the target image.
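The patent does not specify which RGB-to-LAB transform is used, so the following sketch assumes the standard sRGB (D65) to CIELAB formulas; in a full pipeline the single-channel target image would be written into the L channel between the forward and inverse conversions.

```python
def srgb_to_lab(r, g, b):
    """Forward sRGB (D65) -> CIELAB with the standard companding
    curves and matrices (an assumption; the patent leaves the
    transform unspecified)."""
    def lin(u):  # undo sRGB gamma
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    def f(t):  # CIELAB nonlinearity
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def lab_to_srgb(L, a, b2):
    """Inverse CIELAB -> sRGB; the target image supplies L here."""
    fy = (L + 16) / 116
    fx, fz = fy + a / 500, fy - b2 / 200
    def finv(t):
        return t ** 3 if t ** 3 > 0.008856 else (t - 16 / 116) / 7.787
    x, y, z = finv(fx) * 0.95047, finv(fy), finv(fz) * 1.08883
    r = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b = 0.0557 * x - 0.2040 * y + 1.0570 * z
    def delin(u):  # re-apply sRGB gamma, clamped to [0, 1]
        u = min(max(u, 0.0), 1.0)
        return 12.92 * u if u <= 0.0031308 else 1.055 * u ** (1 / 2.4) - 0.055
    return delin(r), delin(g), delin(b)
```

Round-tripping an in-gamut color through the two functions recovers it to within the precision of the 4-decimal matrices, which is sufficient for swapping in a new luminance channel.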
In the above image processing method embodiment, a color image of the target image is obtained by performing color conversion processing on the target image, so that the color image desired by the user can be obtained.
Corresponding to the foregoing image processing method embodiments, the present disclosure also provides image processing device embodiments.
Fig. 6 is a block diagram of an image processing device according to an exemplary embodiment. As shown in Fig. 6, the image processing device includes: an acquisition module 61, a diffusion processing module 62, a filtering module 63, a stretch processing module 64, and a fusion processing module 65.
The acquisition module 61 is configured to obtain an edge image of an input image.
In this embodiment, if the input image is a color image, the input color image can be converted to a grayscale image, and the edge image of the grayscale image is then extracted. If the input image is a grayscale image, the edge image of the grayscale image can be extracted directly.
Assuming the converted grayscale image, or the input grayscale image, is G, the edge image E of the grayscale image can be extracted by the following formulas:
E = √(Gx² + Gy²), with Gx = hx ⊗ G and Gy = hy ⊗ G,
where Gx denotes the gradient of the grayscale image in the horizontal direction, Gy denotes the gradient in the vertical direction, hx denotes the horizontal template, hy denotes the vertical template, and ⊗ denotes the convolution operation.
The diffusion processing module 62 is configured to perform diffusion processing on the edge image obtained by the acquisition module 61 to obtain an edge-diffusion image.
The filtering module 63 is configured to perform feature-flow filtering on the edge-diffusion image obtained by the diffusion processing module 62 to obtain a first image.
Here, performing feature-flow filtering on the edge-diffusion image may be: calculating the edge tangent direction of the input image, and performing linear convolution on the edge-diffusion image according to the edge tangent direction.
In this embodiment, the process of calculating the edge tangent direction of the input image can be:
calculating the edge tangent direction of the input image at each pixel position, i.e., obtaining the eigenvector of the structure tensor at the current pixel position. The structure tensor T can be obtained by the following formula:
T = G * [Ix·Ix, Ix·Iy; Ix·Iy, Iy·Iy]
where G is a Gaussian function, Ix is the gradient in the horizontal direction, and Iy is the gradient in the vertical direction. The eigenvector U of the structure tensor then denotes the smoothed edge gradient vector and satisfies T·U = λ2·U, where λ2 denotes the eigenvalue.
In this embodiment, the process of performing linear convolution on the edge-diffusion image according to the edge tangent direction can be:
at any point of the edge-diffusion image, a direction line of a certain length is taken forward and backward along the tangent direction, and the Gaussian-weighted accumulated value along the line is taken as the expected value of the current pixel, thereby obtaining the first image:
S_lic(p) = ( Σ_{i = −b..f} w(i) · Texture(p + i·t(p)) ) / ( Σ_{i = −b..f} w(i) )
where w denotes the weight along the direction line, t(p) is the unit edge-tangent vector at pixel p, the length of the direction line is set to L = 2*3σ + 1 for the convolution, σ is the Gaussian weighting value, f denotes the number of pixels computed forward, b denotes the number of pixels computed backward, and Texture denotes the edge-diffusion image.
In this embodiment, by performing linear convolution according to the feature flow of the input image, edge smoothing is strengthened, which greatly improves the processing effect of the subsequent images.
The stretch processing module 64 is configured to perform stretch processing on the input image or the grayscale image of the input image to obtain a second image.
The fusion processing module 65 is configured to perform fusion processing according to an input texture image, the first image obtained by the filtering module 63, and the second image obtained by the stretch processing module 64 to obtain a target image.
The device shown in Fig. 6 is used to realize the method flow shown in Fig. 1; the related content involved is the same as described above and is not repeated here.
In the above image processing device embodiment, diffusion processing is performed on the acquired edge image to obtain an edge-diffusion image; feature-flow filtering is performed on the edge-diffusion image to obtain a first image; stretch processing is performed on the input image or its grayscale image to obtain a second image; and fusion processing is then performed according to the input texture image, the first image, and the second image to obtain a target image, so that the generated target image has a good non-photorealistic effect.
Fig. 7 is a block diagram of another image processing device according to an exemplary embodiment. As shown in Fig. 7, on the basis of the embodiment shown in Fig. 6, the acquisition module 61 may include: a conversion-and-extraction submodule 611 and an extraction submodule 612.
The conversion-and-extraction submodule 611 is configured to, if the input image is a color image, convert the input color image to a grayscale image and extract the edge image of the grayscale image.
The extraction submodule 612 is configured to, if the input image is a grayscale image, directly extract the edge image of the grayscale image.
The device shown in Fig. 7 is used to realize the method flow shown in Fig. 1; the related content involved is the same as described above and is not repeated here.
In the above image processing device embodiment, extracting the edge image of the grayscale image provides the conditions for subsequently obtaining a better non-photorealistic effect.
Fig. 8 is a block diagram of another image processing device according to an exemplary embodiment. As shown in Fig. 8, on the basis of the embodiment shown in Fig. 6, the diffusion processing module 62 may include: a first diffusion processing submodule 621 and a second diffusion processing submodule 622.
The first diffusion processing submodule 621 is configured to perform diffusion processing on the edge image to obtain an edge-response image.
In this embodiment, edge diffusion in 8 directions can be performed on the edge image, so that each pixel obtains 8 responses; the maximum response over the directions is then taken as the edge-response image.
The second diffusion processing submodule 622 is configured to perform diffusion processing on the edge-response image obtained by the first diffusion processing submodule 621 to obtain the edge-diffusion image.
After the edge-response image is obtained, a second diffusion pass can be performed on it, i.e., the edge-response image is diffused in 8 directions and the 8 directional response values are accumulated, yielding the final edge-diffusion image S as the sum of the 8 directional responses.
The device shown in Fig. 8 is used to realize the method flow shown in Fig. 2; the related content involved is the same as described above and is not repeated here.
In the above image processing device embodiment, diffusing the image provides the conditions for subsequently obtaining a better non-photorealistic effect.
Fig. 9 is the block diagram of another image processing apparatus shown according to an exemplary embodiment, as shown in figure 9, upper On the basis of stating embodiment illustrated in fig. 6, stretch processing module 64 may include:First, which obtains submodule 641, second, obtains submodule 642 and adjustment submodule 643.
First acquisition submodule 641 is configured as obtaining the correspondence of image and histogram curve.
In this embodiment it is possible to collect the image collection of similar photo in advance, with statistical picture and histogram curve Correspondence.
Second acquisition submodule 642 is configured as being obtained according to the gray level image of input picture or input picture inquiry first The correspondence that submodule 641 obtains obtains goal histogram curve.
In this embodiment it is possible to remove the correspondence of query image and histogram curve using the first image, mesh is obtained Mark histogram curve, i.e., histogram curve corresponding with the first image.
Adjustment submodule 643 is configured as obtaining the histogram distribution of input picture, is obtained according to the second acquisition submodule 642 The histogram distribution of the goal histogram curve adjustment input picture obtained, to generate the second image.
It in this embodiment, can be according to goal histogram curve tune after the histogram distribution for obtaining input picture The histogram distribution of whole input picture, to generate the second image.Wherein, input picture is being adjusted according to goal histogram curve Histogram distribution during may be used a variety of histogram matchings, such as single mapping law (SML) and group mapping rule Then (GML) matching process.
For device as shown in Figure 9 for realizing above-mentioned method flow as shown in Figure 3, the related content being related to describes phase Together, it does not repeat herein.
The above image processing apparatus embodiment performs stretch processing on the image, providing the conditions for subsequently obtaining a good non-photorealistic effect.
Fig. 10 is a block diagram of another image processing apparatus according to an exemplary embodiment. As shown in Fig. 10, on the basis of the embodiment shown in Fig. 6, the fusion processing module 65 may include: a texture rendering submodule 651 and an edge contour fusion submodule 652.
The texture rendering submodule 651 is configured to perform texture rendering according to the texture image and the second image, to obtain a third image.
In this embodiment, the parameter β* may be obtained by minimizing the following energy:

β* = argmin_β ||β(x)·ln P(x) − ln T(x)||² + λ||∇β(x)||²

where λ is 0.2 and P(x) is the input texture image. The formula above is obtained by transforming P(x)^β(x) ≈ T(x), where T(x) is the second image. The third image T is then:

T = P(x)^β*(x)
The edge contour fusion submodule 652 is configured to perform edge contour fusion processing according to the first image and the third image obtained by the texture rendering submodule 651, to obtain the target image.
In this embodiment, the target image S may be obtained by the following formula:

S = S_lic * T

where T is the third image and S_lic is the first image.
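The texture rendering and multiplicative fusion described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the closed-form per-pixel β comes from taking logarithms of P(x)^β(x) ≈ T(x), and a simple box blur stands in for the λ-weighted smoothness term, which would otherwise be solved as a global optimization. All names are illustrative:

```python
import numpy as np

def texture_render(P, T, eps=1e-4):
    """Compute a per-pixel beta solving P(x)**beta(x) ~= T(x) in the log
    domain, then return the third image P**beta. A crude 3x3 box blur of
    beta stands in for the lambda * ||grad beta||^2 regularizer."""
    P = np.clip(P, eps, 1.0)
    T = np.clip(T, eps, 1.0)
    beta = np.log(T) / np.log(P)          # closed-form data term
    # cheap smoothing stand-in for the regularization term
    k = np.ones((3, 3)) / 9.0
    pad = np.pad(beta, 1, mode='edge')
    beta = sum(pad[i:i + beta.shape[0], j:j + beta.shape[1]] * k[i, j]
               for i in range(3) for j in range(3))
    return P ** beta                      # third image

def fuse(S_lic, T3):
    """Edge contour fusion S = S_lic * T (element-wise product)."""
    return S_lic * T3
```

Because P and T are clipped into (0, 1], the resulting β stays positive and the rendered image remains in (0, 1), so the final product with the line-drawing image S_lic also stays in a valid intensity range.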
The apparatus shown in Fig. 10 is used to implement the method flow shown in Fig. 4; the related content involved is described in the same way and is not repeated here.
The above image processing apparatus embodiment performs texture rendering and edge contour fusion on the image, so that the generated target image has a good non-photorealistic effect.
Fig. 11 is a block diagram of another image processing apparatus according to an exemplary embodiment. As shown in Fig. 11, on the basis of the embodiment shown in Fig. 6, the image processing apparatus may further include: a conversion representation module 66 and a back-conversion module 67.
The conversion representation module 66 is configured to convert the RGB color space into the luminance-chrominance LAB color space, and to express the target image as the luminance of the LAB color space.
The back-conversion module 67 is configured to convert the LAB color space back into the RGB color space, to obtain a color image of the target image.
The apparatus shown in Fig. 11 is used to implement the method flow shown in Fig. 5; the related content involved is described in the same way and is not repeated here.
The above image processing apparatus embodiment performs color conversion processing on the target image to obtain a color image of the target image, so that an image with the color desired by the user can be obtained.
With respect to the apparatus in the above embodiments, the specific manner in which each module and submodule performs operations has been described in detail in the embodiments of the method, and will not be elaborated here.
Fig. 12 is a block diagram of an apparatus suitable for image processing according to an exemplary embodiment. For example, the apparatus 1200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, an aircraft, or the like.
Referring to Fig. 12, the apparatus 1200 may include one or more of the following components: a processing component 1202, a memory 1204, a power component 1206, a multimedia component 1208, an audio component 1210, an input/output (I/O) interface 1212, a sensor component 1214, and a communication component 1216.
The processing component 1202 typically controls the overall operation of the apparatus 1200, such as operations associated with display, telephone calls, data communications, camera operations and recording operations. The processing component 1202 may include one or more processors 1220 to execute instructions so as to perform all or part of the steps of the methods described above. In addition, the processing component 1202 may include one or more modules which facilitate interaction between the processing component 1202 and other components. For example, the processing component 1202 may include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operation of the apparatus 1200. Examples of such data include instructions for any application or method operated on the apparatus 1200, contact data, phonebook data, messages, pictures, videos, and so on. The memory 1204 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disc.
The power component 1206 provides power to the various components of the apparatus 1200. The power component 1206 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the apparatus 1200.
The multimedia component 1208 includes a screen providing an output interface between the apparatus 1200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 1208 includes a front camera and/or a rear camera. When the apparatus 1200 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1210 is configured to output and/or input audio signals. For example, the audio component 1210 includes a microphone (MIC) configured to receive external audio signals when the apparatus 1200 is in an operation mode, such as a call mode, a recording mode or a speech recognition mode. The received audio signal may be further stored in the memory 1204 or transmitted via the communication component 1216. In some embodiments, the audio component 1210 further includes a speaker for outputting audio signals.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 1214 includes one or more sensors to provide status assessments of various aspects of the apparatus 1200. For example, the sensor component 1214 may detect the open/closed status of the apparatus 1200 and the relative positioning of components (e.g., the display and the keypad of the apparatus 1200); the sensor component 1214 may also detect a change in position of the apparatus 1200 or of a component of the apparatus 1200, the presence or absence of user contact with the apparatus 1200, the orientation or acceleration/deceleration of the apparatus 1200, and a change in temperature of the apparatus 1200. The sensor component 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1214 may also include an accelerometer, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 1216 is configured to facilitate wired or wireless communication between the apparatus 1200 and other devices. The apparatus 1200 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1216 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1216 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In exemplary embodiments, the apparatus 1200 may be implemented with one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
In exemplary embodiments, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 1204 including instructions, executable by the processor 1220 of the apparatus 1200 to perform the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disc, an optical data storage device, and the like.
Other embodiments of the disclosure will be readily apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within common knowledge or customary technical means in the art. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the disclosure is limited only by the appended claims.

Claims (14)

1. An image processing method, characterized in that the method comprises:
obtaining an edge image of an input image;
performing diffusion processing on the edge image to obtain an edge-diffusion image;
performing feature flow filtering on the edge-diffusion image to obtain a first image;
performing stretch processing on the input image or a gray-level image of the input image to obtain a second image;
performing fusion processing according to an input texture image, the first image and the second image to obtain a target image.
2. The image processing method according to claim 1, characterized in that obtaining the edge image of the input image comprises:
if the input image is a color image, converting the input color image into a gray-level image, and extracting the edge image of the gray-level image;
if the input image is a gray-level image, directly extracting the edge image of the gray-level image.
3. The image processing method according to claim 1, characterized in that performing diffusion processing on the edge image to obtain the edge-diffusion image comprises:
performing diffusion processing on the edge image to obtain an edge response image;
performing diffusion processing on the edge response image to obtain the edge-diffusion image.
4. The image processing method according to claim 1, characterized in that performing feature flow filtering on the edge-diffusion image comprises:
calculating an edge tangent direction of the input image, and performing linear convolution on the edge-diffusion image according to the edge tangent direction.
5. The image processing method according to claim 1, characterized in that performing stretch processing on the input image or the gray-level image of the input image to obtain the second image comprises:
obtaining a correspondence between images and histogram curves;
querying the correspondence according to the input image or the gray-level image of the input image to obtain a target histogram curve;
obtaining a histogram distribution of the input image, and adjusting the histogram distribution of the input image according to the target histogram curve to generate the second image.
6. The image processing method according to claim 1, characterized in that performing fusion processing according to the input texture image, the first image and the second image to obtain the target image comprises:
performing texture rendering according to the texture image and the second image to obtain a third image;
performing edge contour fusion processing according to the first image and the third image to obtain the target image.
7. The image processing method according to any one of claims 1-6, characterized in that the method further comprises:
converting an RGB color space into a luminance-chrominance LAB color space, and expressing the target image as the luminance of the LAB color space;
converting the LAB color space back into the RGB color space to obtain a color image of the target image.
8. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to obtain an edge image of an input image;
a diffusion processing module, configured to perform diffusion processing on the edge image obtained by the acquisition module, to obtain an edge-diffusion image;
a filtering module, configured to perform feature flow filtering on the edge-diffusion image obtained by the diffusion processing module, to obtain a first image;
a stretch processing module, configured to perform stretch processing on the input image or a gray-level image of the input image, to obtain a second image;
a fusion processing module, configured to perform fusion processing according to an input texture image, the first image obtained by the filtering module and the second image obtained by the stretch processing module, to obtain a target image.
9. The image processing apparatus according to claim 8, characterized in that the acquisition module comprises:
a conversion and extraction submodule, configured to, if the input image is a color image, convert the input color image into a gray-level image and extract the edge image of the gray-level image;
an extraction submodule, configured to, if the input image is a gray-level image, directly extract the edge image of the gray-level image.
10. The image processing apparatus according to claim 8, characterized in that the diffusion processing module comprises:
a first diffusion processing submodule, configured to perform diffusion processing on the edge image, to obtain an edge response image;
a second diffusion processing submodule, configured to perform diffusion processing on the edge response image obtained by the first diffusion processing submodule, to obtain the edge-diffusion image.
11. The image processing apparatus according to claim 8, characterized in that the filtering module is configured to:
calculate an edge tangent direction of the input image, and perform linear convolution on the edge-diffusion image according to the edge tangent direction.
12. The image processing apparatus according to claim 8, characterized in that the stretch processing module comprises:
a first acquisition submodule, configured to obtain a correspondence between images and histogram curves;
a second acquisition submodule, configured to query, according to the input image or the gray-level image of the input image, the correspondence obtained by the first acquisition submodule, to obtain a target histogram curve;
an adjustment submodule, configured to obtain a histogram distribution of the input image, and to adjust the histogram distribution of the input image according to the target histogram curve obtained by the second acquisition submodule, to generate the second image.
13. The image processing apparatus according to claim 8, characterized in that the fusion processing module comprises:
a texture rendering submodule, configured to perform texture rendering according to the texture image and the second image, to obtain a third image;
an edge contour fusion submodule, configured to perform edge contour fusion processing according to the first image and the third image obtained by the texture rendering submodule, to obtain the target image.
14. The image processing apparatus according to any one of claims 8-13, characterized in that the apparatus further comprises:
a conversion representation module, configured to convert an RGB color space into a luminance-chrominance LAB color space, and to express the target image as the luminance of the LAB color space;
a back-conversion module, configured to convert the LAB color space back into the RGB color space, to obtain a color image of the target image.
CN201510982411.XA 2015-12-23 2015-12-23 Image processing method and device Active CN105574834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510982411.XA CN105574834B (en) 2015-12-23 2015-12-23 Image processing method and device


Publications (2)

Publication Number Publication Date
CN105574834A CN105574834A (en) 2016-05-11
CN105574834B true CN105574834B (en) 2018-09-04

Family

ID=55884929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510982411.XA Active CN105574834B (en) 2015-12-23 2015-12-23 Image processing method and device

Country Status (1)

Country Link
CN (1) CN105574834B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108205804B (en) * 2016-12-16 2022-05-31 斑马智行网络(香港)有限公司 Image processing method and device and electronic equipment
CN107527359B (en) * 2017-08-07 2020-04-10 东软医疗系统股份有限公司 PET image reconstruction method and PET imaging equipment
CN109544490B (en) * 2018-10-17 2021-07-13 北京达佳互联信息技术有限公司 Image enhancement method, device and computer readable storage medium
CN109741248B (en) * 2018-12-29 2023-06-13 深圳美图创新科技有限公司 Watercolour style drawing method and device
CN113781296B (en) * 2021-09-22 2024-05-28 亿图软件(湖南)有限公司 Image watercolor processing method, device, computer equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2034436A1 (en) * 2007-09-06 2009-03-11 Thomson Licensing Method for non-photorealistic rendering
US7663622B2 (en) * 2005-12-08 2010-02-16 Electronics And Telecommunications Research Institute Unified framework based on extensible styles for 3D non-photorealistic rendering and method of configuring the same
WO2010134675A1 (en) * 2009-05-18 2010-11-25 연세대학교 산학협력단 Apparatus and method for generating non-photorealistic image using non-photorealistic rendering, and recording medium wherein program for implementing the method is recorded
CN103729866A (en) * 2013-12-13 2014-04-16 深圳市云宙多媒体技术有限公司 Water color style non-photorealistic rendering method and device
CN103778655A (en) * 2014-01-28 2014-05-07 西安理工大学 Color natural image calculation artifying method based on self-adaption ink painting spreading
CN103914862A (en) * 2014-03-10 2014-07-09 上海大学 Pencil sketch simulating method based on edge tangent stream
CN104346789A (en) * 2014-08-19 2015-02-11 浙江工业大学 Fast artistic style study method supporting diverse images


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Feature-flow-based rendering of abstract line drawings; Wang Shandong et al.; Chinese Journal of Computers (《计算机学报》); March 2014; Vol. 37, No. 3; pp. 611-620 *
Image enhancement algorithm based on histogram stretching and its implementation; Zhu Zhongqiu; Information Technology (《信息技术》); December 2009; No. 5; pp. 54-57 *
Automatic pencil drawing generation system combining contour and texture features; Xu Meiting et al.; Electronic Measurement Technology (《电子测量技术》); June 2015; Vol. 38, No. 6; pp. 48-53 *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant