CN103413337A - Color fog generation method based on human-machine interaction

Color fog generation method based on human-machine interaction

Info

Publication number
CN103413337A
Authority
CN
China
Prior art keywords
transmissivity
image
value
mist
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013101138508A
Other languages
Chinese (zh)
Other versions
CN103413337B (en)
Inventor
樊鑫
王祎
高仁杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201310113850.8A priority Critical patent/CN103413337B/en
Publication of CN103413337A publication Critical patent/CN103413337A/en
Application granted granted Critical
Publication of CN103413337B publication Critical patent/CN103413337B/en
Expired - Fee Related

Links

Images

Landscapes

  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)

Abstract

Disclosed is a color fog generation method based on human-machine interaction. An atmospheric scattering model combined with simple human-machine interaction is used to operate on a foggy image. The method comprises modules for dark channel image extraction, atmospheric light estimation, transmission conversion, optimization and compensation, and color fog generation. A dark channel image is obtained with a method based on the dark channel prior; the atmospheric light is estimated and a coarse transmission map is derived from local minima; a guided filter is used to optimize the transmission map and to apply sky compensation; the exponential relationship between transmissions is then used to compute transmissions for different fog densities; and the atmospheric light is assigned different colors to alter the generated fog, yielding a foggy image of a color fog scene. The method can generate color fog on a foggy image, greatly reducing the complexity of producing fog visual effects and increasing versatility.

Description

A color fog effect generation method based on human-machine interaction
Technical field
The present invention relates to the field of computer vision, and in particular to a color fog effect generation method based on human-machine interaction.
Background art
A visual effect is a combination of colors and their relationships and, like painting, is a form of visual art. In recent decades visual effects have developed many branches and have been widely applied in fields such as film and television, games, and animation. Fog, as an important component of natural scenes, is an indispensable part of the visual effects in film, television, and games. However, because fog simulation is itself complex, fog generation has always been constrained by the software's own algorithms and by the operator's skill. How to produce a good fog simulation of a scene under convenient interactive operation is currently an open problem in fog visual effects.
At present, the processing of fog visual effects is concentrated mainly in the field of 3D scenes. Fog generation based on the basic atmospheric scattering physical model has achieved certain results for such scenes, but these methods consume large amounts of memory and are not suitable for fog simulation in 2D scenes.
There has been little research on fog generation for 2D scenes, i.e. images. Current processing of foggy images concentrates mainly on dehazing, and the powerful commercial software packages available demand considerable software knowledge from the user, are complicated to operate, and lack versatility. A digital fog-effect filtering method based on the dark channel prior, built on a simple physical model and numerical computation, is an effective way to simulate fog effects, including the dehazing of foggy images. However, that method only considers fog simulation under natural conditions, can only generate white fog, and does not further extend the visual appearance of the fog.
Summary of the invention
The present invention provides a color fog effect generation method based on human-machine interaction that remedies the deficiencies of the prior art. The method requires only concise human-machine interaction, a simple physical model, and numerical computation to generate color fog on a foggy image, greatly reducing the complexity of obtaining fog visual effects and increasing versatility.
The technical solution adopted by the present invention:
A color fog effect generation method based on human-machine interaction adopts the atmospheric scattering model and uses simple human-machine interaction to operate on a foggy image.
The method comprises modules for dark channel image extraction, atmospheric light estimation, transmission conversion, optimization and compensation, and color fog generation. Its steps are as follows:
Step 1: dark channel image extraction
A sliding window is placed on the image; the minimum value over the RGB channels of all pixels within the window is computed, and the corresponding window region of a single-channel image is set to this minimum. The window is then moved in steps of 1 pixel until the whole image has been processed, yielding the single-channel dark channel image. A sketch of this step is given below.
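A minimal sketch of the dark channel extraction, assuming a NumPy RGB image normalized to [0, 1] and a 15x15 window as in the embodiment; the function name is illustrative, and a local minimum filter is equivalent to sliding the window one pixel at a time:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(image, window=15):
    """Dark channel of an RGB image with values in [0, 1].

    For every pixel, take the minimum over the three color channels,
    then take the minimum over a window x window neighborhood.
    """
    per_pixel_min = image.min(axis=2)                  # min over R, G, B
    return minimum_filter(per_pixel_min, size=window)  # local minimum over the window
```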
Step 2: atmospheric light estimation
The atmospheric light is obtained by an estimation method: the brightest 0.1% of points in the dark channel image (i.e. the points with the maximum gray value) are selected, and the pixels at the corresponding positions in the source image form a set o(x). The maximum pixel value over the three RGB channels in o(x) is taken and corrected with a coefficient according to the scattering law, using the following formula:
$I_{\max} = \max_{y \in o(x)} H(y)$   (1)
where H is the source image and I_max is the atmospheric light value.
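A sketch of the atmospheric light estimation of formula (1), with illustrative names; it takes the brightest 0.1% of dark-channel pixels and the per-channel maximum of the corresponding source pixels, and omits the coefficient correction mentioned above since its exact form is not specified here:

```python
import numpy as np

def estimate_atmospheric_light(image, dark, top_fraction=0.001):
    """Estimate atmospheric light from the brightest dark-channel pixels.

    image: H x W x 3 source image in [0, 1]; dark: H x W dark channel.
    Returns one RGB value: the per-channel maximum over the brightest
    `top_fraction` of dark-channel pixels (formula (1)).
    """
    num_brightest = max(1, int(dark.size * top_fraction))
    flat_idx = np.argsort(dark.ravel())[-num_brightest:]   # brightest 0.1%
    candidates = image.reshape(-1, 3)[flat_idx]             # set o(x)
    return candidates.max(axis=0)                            # per-channel max
```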
Step 3: transmission conversion
Using the RGB value of the atmospheric light obtained in step 2 (formula (1)), and combining the dark channel prior with the atmospheric scattering model, formulas (2) and (3), the transmission estimation formula (4) is derived, as follows:
$t(x) = e^{-\beta d(x)}$   (2)
$I(x) = J(x)t(x) + A(1 - t(x))$   (3)
where t is the transmission, e is the natural constant, β is the transmission coefficient, d is the distance, I is the source image, J is the fog-free image, and A is the atmospheric light.
The transmission estimation formula is derived:
$t(x) = 1 - \frac{I_{\mathrm{dark}}(x)}{A}$   (4)
where I_dark is the dark channel image, A is the atmospheric light, and t is the resulting coarse transmission map.
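A sketch of the coarse transmission computation of formula (4), under the assumption that the atmospheric light is reduced to a scalar (its largest channel); the function names are illustrative and reuse the helpers above:

```python
import numpy as np

def coarse_transmission(dark, atmospheric_light):
    """Coarse transmission map t(x) = 1 - I_dark(x) / A  (formula (4))."""
    a = float(np.max(atmospheric_light))  # reduce A to a scalar for the division
    return 1.0 - dark / a
```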
Step 4: optimization and compensation
Use the method for guiding filtering (guided filter) to be optimized this transmissivity figure.After optimization, it is carried out to the sky compensation, the aberration of the bright areas such as the inapplicable sky of correction dark primary passage priori principle, by the method for man-machine interactively, obtain sky compensation threshold alpha, this threshold value, as the cut off value of distinguishing the bright areas such as sky and actual scenery, is utilized the sky compensation formula:
$t = 2\alpha - t$   (5)
is then applied to the transmission map to perform the sky compensation, yielding the corrected transmission map. A sketch of this step is given below.
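A minimal sketch of the optimization and compensation step. The guided filter call assumes the opencv-contrib binding `cv2.ximgproc.guidedFilter` is available; the text does not state to which pixels formula (5) is applied, so this sketch assumes, as one plausible reading, that it is applied where the refined transmission falls below the interactive threshold α (e.g. α = 0.1 when there is no sky region, as in the embodiment). Names and the filter parameters are illustrative:

```python
import cv2
import numpy as np

def refine_and_compensate(source_rgb, coarse_t, alpha=0.1, radius=60, eps=1e-3):
    """Guided-filter refinement followed by sky compensation (formula (5))."""
    guide = source_rgb.mean(axis=2).astype(np.float32)        # simple gray guide image
    t = cv2.ximgproc.guidedFilter(guide, coarse_t.astype(np.float32), radius, eps)
    sky = t < alpha                       # assumed: low-transmission pixels are sky
    t[sky] = 2 * alpha - t[sky]           # t = 2*alpha - t  (formula (5))
    return np.clip(t, 0.0, 1.0)
```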
Step 5: color fog effect generation
The visibility under different fog scenes is set. When fog images of different densities reach their maximum visibility simultaneously, their transmissions are equal; from formula (2) it follows that:
$t_n(x) = t_h(x)^{D_{vis}^{h} / D_{vis}^{n}}$   (6)
where t_n(x) and t_h(x) are the transmissions of the simulated fog image and of the source image respectively, and D_vis^h and D_vis^n are the visibilities, set through human-machine interaction, of the source image and of the simulated fog image respectively.
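A sketch of the transmission conversion of formula (6); `vis_source` and `vis_target` stand for the interactively chosen D_vis^h and D_vis^n, and the function name is illustrative:

```python
import numpy as np

def convert_transmission(t_source, vis_source, vis_target):
    """t_n(x) = t_h(x) ** (D_vis_h / D_vis_n)  (formula (6))."""
    return np.power(t_source, vis_source / vis_target)
```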
The scenes under different fog effects are then linked by applying different interactive operations to the atmospheric light value A used in the computation. The interaction modes are as follows:
First: basic color fog generation
The atmospheric light value A is set to a manually specified color value, namely
A(R,G,B)=Set(R,G,B) (7)
where Set is the manually specified color value.
Second: horizontally/vertically partitioned color fog generation
An image of size m×n is divided into W horizontal/vertical regions; each region, of size m_W × n or m × n_W, is assigned a different color value.
$A_W(R,G,B) = Set_W(R,G,B)$   (8)
where A_W is the atmospheric light value of each region and Set_W is the manually set color value of each region.
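A sketch of building a per-pixel atmospheric light map for the horizontally partitioned mode of formula (8); the vertical case is symmetric, and the function and parameter names are illustrative:

```python
import numpy as np

def partitioned_atmosphere(height, width, region_colors):
    """Per-pixel atmospheric light for W horizontal bands (formula (8)).

    region_colors: list of W RGB triples, one Set_W per band.
    Returns an height x width x 3 map where every band holds its own color.
    """
    A = np.zeros((height, width, 3), dtype=np.float64)
    band = height // len(region_colors)
    for w, color in enumerate(region_colors):
        top = w * band
        bottom = height if w == len(region_colors) - 1 else (w + 1) * band
        A[top:bottom, :, :] = color
    return A
```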
Third: horizontally/vertically partitioned gradient color fog generation
The image is divided into W horizontal/vertical regions; each region, of size m_W × n or m × n_W, is assigned a different color value, and the A value is given a corresponding gradient.
$A_W(R,G,B) = \frac{m_W - \delta i}{m_W}\, Set_{W1}(R,G,B) + \frac{\delta i}{m_W}\, Set_{W2}(R,G,B)$   (9)
$A_W(R,G,B) = \frac{n_W - \eta j}{n_W}\, Set_{W1}(R,G,B) + \frac{\eta j}{n_W}\, Set_{W2}(R,G,B)$   (10)
The two formulas correspond to the horizontal and vertical gradient of A respectively, where A_W is the atmospheric light value of each region, Set_W1 and Set_W2 are the two manually set color values that take part in the gradient, i and j are the horizontal and vertical coordinates of the pixel, and δ and η are arbitrary control variables/function values. A sketch of the gradient mode follows.
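A sketch of formula (9) for a single horizontal band, blending the two set colors down the band's rows; taking δ = 1 (a plain linear ramp) is an illustrative assumption, not prescribed by the text, and the names are illustrative:

```python
import numpy as np

def gradient_band_atmosphere(band_height, width, color1, color2, delta=1.0):
    """A_W for one band by formula (9): blend of Set_W1 and Set_W2 along rows.

    Row i gets ((m_W - delta*i)/m_W) * Set_W1 + (delta*i/m_W) * Set_W2.
    """
    color1 = np.asarray(color1, dtype=np.float64)
    color2 = np.asarray(color2, dtype=np.float64)
    i = np.arange(band_height, dtype=np.float64)[:, None, None]  # row index i
    w2 = delta * i / band_height                                  # weight of Set_W2
    w1 = 1.0 - w2                                                 # (m_W - delta*i)/m_W
    band = w1 * color1 + w2 * color2                              # shape (band_height, 1, 3)
    return np.broadcast_to(band, (band_height, width, 3)).copy()
```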
From formula (3) one derives:
$N(x) = \big(H(x)\,t_n(x) - A\,(t_n(x) - t_h(x))\big) / t_h(x)$   (11)
where N(x) is the required simulated fog image, H(x) is the source image, t_n(x) and t_h(x) are the transmissions of the simulated fog image and of the source image respectively, and A is the atmospheric light. Substituting the obtained transmission map of the target fog scene, the source image, the transmission map of the source image, and the atmospheric light value produced by the interactive computation into formula (11) yields the generated color fog image. Finally, a suitable exposure adjustment is applied to the image to improve its visual effect. A sketch of the synthesis follows.
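A sketch of the synthesis of formula (11); `A` may be a single RGB triple (first mode) or a per-pixel map (second and third modes), and the function name and the clipping to [0, 1] are illustrative choices:

```python
import numpy as np

def synthesize_color_fog(source, t_source, t_target, A, eps=1e-6):
    """N(x) = (H(x)*t_n(x) - A*(t_n(x) - t_h(x))) / t_h(x)  (formula (11))."""
    t_h = t_source[..., None]                 # broadcast over the color channels
    t_n = t_target[..., None]
    fog = (source * t_n - A * (t_n - t_h)) / np.maximum(t_h, eps)
    return np.clip(fog, 0.0, 1.0)
```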
The present invention is based on the general physical model of atmospheric scattering. A dark channel image is computed with a method based on the dark channel prior; the atmospheric light is estimated and a coarse transmission map is derived from local minima; a guided filter is used to optimize the transmission map and to apply sky compensation; the exponential correspondence between transmissions is then used to compute different transmissions; and the atmospheric light is assigned different colors to alter the generated fog, thereby obtaining a foggy image of a color fog scene.
Description of the drawing
The accompanying drawing is the workflow diagram of the method of the present invention.
101: source image; 102: dark channel image; 103: coarse transmission image; 104: optimized transmission image; 105: transmission image after sky compensation; 106: transmission image of the simulated fog image; 107: color fog image.
Embodiment
To make the purpose, technical scheme, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawing and a specific example. The example is illustrative only and does not limit the present invention.
The present invention proposes a color fog effect generation method based on human-machine interaction. The concrete implementation steps of the method are as follows. First, the source image 101 is loaded. Following the dark channel prior of the atmospheric scattering model, a blank single-channel image is created; using a 15×15 window, the source image 101 is scanned, the minimum RGB value of the pixels within the window is taken and written into the corresponding region of the single-channel image, and the window is moved pixel by pixel with the same operation until the whole source image 101 has been processed. The resulting single-channel image is the dark channel image 102 of the source image.
Then the brightest 0.1% of points in the dark channel image 102 (the points with the maximum gray value) are selected, and the pixels at the corresponding positions in the source image form a set. The maximum pixel value over the three RGB channels in this set is taken and corrected with a coefficient according to the scattering law, and the atmospheric light is computed with formula (1). The coarse transmission map 103 is then computed with formula (4) and optimized with a guided filter to give the optimized transmission map 104.
Next, according to the image, formula (5) is used to apply sky compensation to the optimized transmission map, giving the compensated transmission map 105 (if there is no sky region, the threshold is set to 0.1). The maximum visibility of the source image and of the image to be simulated is defined, an interaction scheme is chosen manually, and the atmospheric light value is set. Substituting the two maximum visibilities together with the compensated transmission map into formula (6) gives the transmission map 106 of the simulated fog image. The atmospheric light computed with formula (7)/(8)/(9)/(10) (formula (9) is used here), the source image, and the transmission maps of the source image and of the simulated fog image are then substituted into formula (11), and a suitable exposure adjustment is applied to the result to obtain the final color fog image 107. The whole pipeline can be chained together as in the sketch below.
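An end-to-end usage sketch chaining the illustrative helpers defined above (they are sketches, not the patented implementation), assuming a normalized RGB source image and the first (basic color) interaction mode; the numeric visibilities and the fog color are arbitrary example values:

```python
import numpy as np

img = np.random.rand(480, 640, 3)                  # stand-in for the loaded source image 101
dark = dark_channel(img, window=15)                # dark channel image 102
A_est = estimate_atmospheric_light(img, dark)      # formula (1)
t_rough = coarse_transmission(dark, A_est)         # coarse transmission 103, formula (4)
t_opt = refine_and_compensate(img, t_rough, alpha=0.1)            # maps 104-105, formula (5)
t_sim = convert_transmission(t_opt, vis_source=1000.0, vis_target=200.0)  # map 106, formula (6)
fog_color = np.array([0.9, 0.7, 0.5])              # manually chosen Set(R, G, B), formula (7)
result = synthesize_color_fog(img, t_opt, t_sim, fog_color)       # color fog image 107, formula (11)
```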
Finally, different images can be loaded and different interaction schemes adopted to generate different color fog effects.
The accompanying figure set shows simulations of different fog effects. From left to right and top to bottom: the source image, the basic color fog result, the horizontally partitioned color fog result, the vertically partitioned color fog result, the horizontally partitioned gradient color fog result, and the vertically partitioned gradient color fog result.

Claims (1)

1. A color fog effect generation method based on human-machine interaction, characterized by the following steps:
Step 1, dark channel image extraction: a sliding window is placed on the image; the minimum value over the RGB channels of the pixels within the window is computed and the pixel values within the window are set to this minimum; the window is then moved in steps of 1 pixel until the whole image has been processed, yielding a single-channel dark channel image;
Step 2, the atmospheric light is obtained by an atmospheric light estimation method: the brightest 0.1% of points in the dark channel image are selected, and the pixels at the corresponding positions in the source image form a set o(x); the maximum pixel value over the three RGB channels in o(x) is taken and corrected with a coefficient according to the scattering law, using the following formula:
$I_{\max} = \max_{y \in o(x)} H(y)$   (1)
where H is the source image and I_max is the atmospheric light value;
Step 3, transmission conversion: using the RGB value of the atmospheric light from step 2, and combining the dark channel prior with the atmospheric scattering model, formulas (2) and (3), the transmission estimation formula (4) is derived, as follows:
$t(x) = e^{-\beta d(x)}$   (2)
$I(x) = J(x)t(x) + A(1 - t(x))$   (3)
where t is the transmission, e is the natural constant, β is the transmission coefficient, d is the distance, I is the source image, J is the fog-free image, and A is the atmospheric light;
The transmission estimation formula is derived:
$t(x) = 1 - \frac{I_{\mathrm{dark}}(x)}{A}$   (4)
where I_dark is the dark channel image, A is the atmospheric light, and t is the resulting coarse transmission map;
Step 4, the transmission map is optimized and compensated with a guided filter: to correct the color deviation in bright regions such as the sky, where the dark channel prior does not hold, a sky compensation threshold α is obtained through human-machine interaction; this threshold serves as the cutoff separating bright regions such as the sky from the actual scenery, and the sky compensation formula
$t = 2\alpha - t$   (5)
is applied to the transmission map to perform the sky compensation, yielding the corrected transmission map;
Step 5, color fog effect generation: the visibility under different fog scenes is set; when fog images of different densities reach their maximum visibility simultaneously, their transmissions are equal, and from formula (2) it follows that:
$t_n(x) = t_h(x)^{D_{vis}^{h} / D_{vis}^{n}}$   (6)
where t_n(x) and t_h(x) are the transmissions of the simulated fog image and of the source image respectively, and D_vis^h and D_vis^n are the visibilities, set through human-machine interaction, of the source image and of the simulated fog image respectively;
The scenes under different fog effects are then linked by applying different interactive operations to the atmospheric light value A used in the computation; the interaction modes are as follows:
First, basic color fog generation: the atmospheric light value A is set to a manually specified color value, namely
A(R,G,B)=Set(R,G,B) (7)
where Set is the manually specified color value.
Second, horizontally/vertically partitioned color fog generation: an image of size m×n is divided into W horizontal/vertical regions; each region, of size m_W × n or m × n_W, is assigned a different color value;
$A_W(R,G,B) = Set_W(R,G,B)$   (8)
where A_W is the atmospheric light value of each region and Set_W is the manually set color value of each region.
Third, horizontally/vertically partitioned gradient color fog generation: the image is divided into W horizontal/vertical regions; each region, of size m_W × n or m × n_W, is assigned a different color value, and the A value is given a corresponding gradient;
$A_W(R,G,B) = \frac{m_W - \delta i}{m_W}\, Set_{W1}(R,G,B) + \frac{\delta i}{m_W}\, Set_{W2}(R,G,B)$   (9)
$A_W(R,G,B) = \frac{n_W - \eta j}{n_W}\, Set_{W1}(R,G,B) + \frac{\eta j}{n_W}\, Set_{W2}(R,G,B)$   (10)
The two formulas correspond to the horizontal and vertical gradient of A respectively, where A_W is the atmospheric light value of each region, Set_W1 and Set_W2 are the two manually set color values that take part in the gradient, i and j are the horizontal and vertical coordinates of the pixel, and δ and η are arbitrary control variables/function values.
From formula (3) one derives:
$N(x) = \big(H(x)\,t_n(x) - A\,(t_n(x) - t_h(x))\big) / t_h(x)$   (11)
where N(x) is the required simulated fog image, H(x) is the source image, t_n(x) and t_h(x) are the transmissions of the simulated fog image and of the source image respectively, and A is the atmospheric light; the obtained transmission map of the target fog scene, the source image, the transmission map of the source image, and the atmospheric light value produced by the interactive computation are substituted into formula (11) to compute the generated color fog image.
CN201310113850.8A 2013-04-02 2013-04-02 Color fog generation method based on human-machine interaction Expired - Fee Related CN103413337B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310113850.8A CN103413337B (en) 2013-04-02 2013-04-02 Color fog generation method based on human-machine interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310113850.8A CN103413337B (en) 2013-04-02 2013-04-02 Color fog generation method based on human-machine interaction

Publications (2)

Publication Number Publication Date
CN103413337A true CN103413337A (en) 2013-11-27
CN103413337B CN103413337B (en) 2015-12-23

Family

ID=49606342

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310113850.8A Expired - Fee Related CN103413337B (en) 2013-04-02 2013-04-02 Color fog generation method based on human-machine interaction

Country Status (1)

Country Link
CN (1) CN103413337B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103745446A (en) * 2014-01-27 2014-04-23 广东威创视讯科技股份有限公司 Image guide filtering method and system
CN104408757A (en) * 2014-11-07 2015-03-11 吉林大学 Method and system for adding haze effect to driving scene video
CN106169176A (en) * 2016-06-27 2016-11-30 上海集成电路研发中心有限公司 A kind of image defogging method
CN112950483A (en) * 2019-12-11 2021-06-11 福建天晴数码有限公司 Deep fog effect processing method and system based on mobile game platform

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009038580A (en) * 2007-08-01 2009-02-19 Canon Inc Unit and method for processing image
CN102663694A (en) * 2012-03-30 2012-09-12 大连理工大学 Digital fog effect filter method based on dark primary color channel prior principle

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009038580A (en) * 2007-08-01 2009-02-19 Canon Inc Unit and method for processing image
CN102663694A (en) * 2012-03-30 2012-09-12 大连理工大学 Digital fog effect filter method based on dark primary color channel prior principle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张颖超 (ZHANG Yingchao) et al.: "基于OSG和粒子系统的雪效仿真" (Snow effect simulation based on OSG and a particle system), 《南京信息工程大学学报(自然科学版)》 (Journal of Nanjing University of Information Science and Technology, Natural Science Edition) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103745446A (en) * 2014-01-27 2014-04-23 广东威创视讯科技股份有限公司 Image guide filtering method and system
CN103745446B (en) * 2014-01-27 2017-08-29 广东威创视讯科技股份有限公司 Image guiding filtering method and system
CN104408757A (en) * 2014-11-07 2015-03-11 吉林大学 Method and system for adding haze effect to driving scene video
CN104408757B (en) * 2014-11-07 2017-11-14 吉林大学 The method and system of haze effect are added in a kind of video to Driving Scene
CN106169176A (en) * 2016-06-27 2016-11-30 上海集成电路研发中心有限公司 A kind of image defogging method
CN112950483A (en) * 2019-12-11 2021-06-11 福建天晴数码有限公司 Deep fog effect processing method and system based on mobile game platform
CN112950483B (en) * 2019-12-11 2023-07-21 福建天晴数码有限公司 Deep fog effect processing method and system based on mobile game platform

Also Published As

Publication number Publication date
CN103413337B (en) 2015-12-23

Similar Documents

Publication Publication Date Title
CN105574827B (en) A kind of method, apparatus of image defogging
CN102663694A (en) Digital fog effect filter method based on dark primary color channel prior principle
CN107690672B (en) Training data generation method and device and image semantic segmentation method thereof
CN102968772B (en) A kind of image defogging method capable based on dark channel information
CN102663766B (en) Non-photorealistic based art illustration effect drawing method
CN104598915B (en) A kind of gesture identification method and device
CN102881011B (en) Region-segmentation-based portrait illumination transfer method
CN108830252A (en) A kind of convolutional neural networks human motion recognition method of amalgamation of global space-time characteristic
CN103700114B (en) A kind of complex background modeling method based on variable Gaussian mixture number
CN102098528B (en) Method and device for converting planar image into stereoscopic image
CN102231791B (en) Video image defogging method based on image brightness stratification
CN104881879B (en) A kind of remote sensing images haze emulation mode based on dark channel prior
CN105447906A (en) Method for calculating lighting parameters and carrying out relighting rendering based on image and model
ATE500578T1 (en) NON-PHOTO REALISTIC REPRESENTATION OF AUGMENTED REALITY
CN102945079A (en) Intelligent recognition and control-based stereographic projection system and method
CN104954780A (en) DIBR (depth image-based rendering) virtual image restoration method applicable to high-definition 2D/3D (two-dimensional/three-dimensional) conversion
CN105678724A (en) Background replacing method and apparatus for images
CN106127715A (en) A kind of image defogging method and system
CN103413337A (en) Color fog generation method based on human-machine interaction
CN104063888B (en) A kind of wave spectrum artistic style method for drafting based on feeling of unreality
CN106027962A (en) Video monitoring coverage rate calculation method and device, and video monitoring layout method and system
CN106709901A (en) Simulation fog image generation method based on depth priori
CN103198513B (en) Film later stage synthesis antialiasing method
CN104732497A (en) Image defogging method, FPGA and defogging system including FPGA
CN104282013B (en) A kind of image processing method and device for foreground target detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151223

CF01 Termination of patent right due to non-payment of annual fee