CN108280831A - Method and system for acquiring an image sequence optical flow - Google Patents

Method and system for acquiring an image sequence optical flow

Info

Publication number
CN108280831A
CN108280831A (application CN201810105993.7A)
Authority
CN
China
Prior art keywords
optical flow
layer image
value
image
current layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810105993.7A
Other languages
Chinese (zh)
Other versions
CN108280831B (en)
Inventor
陈震
王雪冰
张聪炫
江少锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Hangkong University
Original Assignee
Nanchang Hangkong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Hangkong University filed Critical Nanchang Hangkong University
Priority to CN201810105993.7A
Publication of CN108280831A
Application granted
Publication of CN108280831B
Legal status: Active (granted)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10016: Video; Image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20024: Filtering details
    • G06T2207/20028: Bilateral filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method and a system for acquiring an image sequence optical flow. The method includes: obtaining a sample image; performing pyramid down-sampling layering on the sample image to obtain an image pyramid; calculating the optical flow increment, the initial optical flow value and the optical flow output value of the current layer image; performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering; judging whether the layer number of the current layer image equals the number of layers of the image pyramid, and if not, repeating the above steps; and if so, taking the optical flow value after guided filtering as the output optical flow value of the image sequence. The method of the invention applies alternating guided filtering as a nonlinear filter to the image sequence pyramid layered optical flow computation model, so as to solve the problem that image and motion edges are blurred in the optical flow results of the image sequence pyramid layered model, and realizes accurate segmentation of the target objects and scenes in the image sequence.

Description

Method and system for acquiring image sequence optical flow
Technical Field
The present invention relates to the field of image recognition, and in particular, to a method and a system for acquiring an optical flow of an image sequence.
Background
Since the beginning of the 21st century, with the continuous improvement of computer software and hardware, image sequence optical flow computation and its related techniques have gradually become hot topics in research fields such as computer vision and pattern recognition. The research results are widely applied in aerospace, military, industrial production, daily life, cultural relic protection and restoration, and medical image processing and analysis, for example in the vision systems of industrial robots, foreground and obstacle detection for autonomous vehicles, intelligent traffic detection and control, navigation and take-off and landing systems of unmanned aerial vehicles, satellite cloud image analysis, and the three-dimensional display, reconstruction, analysis and diagnosis of organs in medical images.
At present, the image sequence pyramid hierarchical estimation model based on median filtering is the approach most commonly adopted in image sequence optical flow computation. It can effectively suppress noise and outliers in the optical flow result and achieves high accuracy for difficult scenes such as large displacement and illumination change. However, because median filtering is in essence a smoothing filter, it often over-smooths the image and motion edges in the optical flow result of the image sequence, so that the contours of scenes or objects become blurred and the target objects and scenes in the image sequence cannot be accurately segmented.
Disclosure of Invention
The invention aims to provide a method and a system for acquiring an optical flow of an image sequence so as to realize accurate segmentation of a target object and a scene in the image sequence.
In order to achieve the purpose, the invention provides the following scheme:
a method for acquiring an optical flow of a sequence of images, said method comprising the steps of:
selecting any two continuous frames of images in the image sequence to obtain a sample image;
carrying out pyramid downsampling layering on the sample image to obtain an image pyramid;
randomly selecting a layer of image from the image pyramid as a current layer image, and calculating an optical flow increment and an optical flow initial value of the current layer image;
calculating an optical flow output value of the current layer image according to the optical flow increment and the optical flow initial value of the current layer image;
performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering;
calculating an output optical flow increment of the current layer image according to the optical flow value after alternating guided filtering and the initial optical flow value of the current layer image;
judging whether the number of layers of the current layer image is equal to the number of layers of the image pyramid, and obtaining a first judgment result;
if the first judgment result is no, taking the next layer image of the current layer image as the current layer image, taking the optical flow value after alternating guided filtering as the initial optical flow value of the current layer image, taking the output optical flow increment as the optical flow increment of the current layer image, and returning to the step of calculating the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image;
and if the first judgment result is yes, taking the optical flow value after guided filtering as the output optical flow value of the image sequence.
Optionally, the selecting a layer of image from the image pyramid as a current layer image, and calculating an optical flow increment and an optical flow initial value of the current layer image specifically includes:
selecting the k-th layer image as the current layer image;
calculating the optical flow increment and the initial optical flow value of the current layer image by using formula (1);
where u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow.
Optionally, the calculating an optical flow output value of the current layer image according to the optical flow increment and the optical flow initial value of the current layer image specifically includes:
taking the k-th layer image as the current layer image, and calculating the optical flow output value of the current layer image by using formula (2);
u^{k+1} and v^{k+1} respectively denote the components of the optical flow output value of the k-th layer image along the x and y axes; u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes.
Optionally, performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering specifically includes:
Step one, taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T of the current layer image as the input of the first joint bilateral filter of the n-th iteration, and taking the median-filtered value of each pixel (i,j)^T from the (n-1)-th iteration as the guide of the first joint bilateral filter of the n-th iteration; the median-filtered value of the 0-th iteration is initialized to 0; n denotes the current iteration number, n = 1, 2, ..., N, where N is a preset number of iterations;
Step two, obtaining the optical flow value of each pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration by using formula (3); in formula (3), the filtered value at the window center pixel (i,j)^T is a weighted average over the filter window, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation;
Step three, taking the optical flow value of each pixel (i,j)^T of the current layer image after the first joint bilateral filtering of the n-th iteration as the input of the second joint bilateral filter of the n-th iteration, and taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T as the guide of the second joint bilateral filter of the n-th iteration;
Step four, obtaining the optical flow value of each pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration by using formula (4); in formula (4), K_p3 and K_p4 denote weight normalization factors;
Step five, performing median filtering on the optical flow value after the second joint bilateral filtering of the n-th iteration by using formula (5) to obtain the median-filtered value of each pixel (i,j)^T after the n-th iteration, where f(·) denotes median filtering;
Step six, judging whether the current iteration number n is smaller than the preset number of iterations; if not, taking the median-filtered value of each pixel (i,j)^T after the n-th iteration as the optical flow value of the current layer image after alternating guided filtering; and if so, increasing n by 1 as the current iteration number and returning to step one.
An acquisition system of an optical flow of a sequence of images, the acquisition system comprising:
the sample image acquisition module is used for selecting any two continuous frames of images in the image sequence to obtain a sample image;
the pyramid layering module is used for carrying out pyramid downsampling layering on the sample image to obtain an image pyramid;
the optical flow increment and optical flow initial value calculation module is used for randomly selecting a layer of image from the image pyramid as a current layer image and calculating the optical flow increment and the optical flow initial value of the current layer image;
the optical flow output value calculating module is used for calculating the optical flow output value of the current layer image according to the optical flow increment of the current layer image and the optical flow initial value;
the alternating guided filtering module is used for performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering;
the output optical flow increment calculating module is used for calculating the output optical flow increment of the current layer image according to the optical flow value after alternating guided filtering and the initial optical flow value of the current layer image;
the first result judging module is used for judging whether the number of layers of the current layer image is smaller than the number of layers of the image pyramid to obtain a first judgment result; if the first judgment result is yes, taking the next layer image of the current layer image as the current layer image, taking the optical flow value after alternating guided filtering as the initial optical flow value of the current layer image, taking the output optical flow increment as the optical flow increment of the current layer image, and returning to the step of calculating the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image; and if the first judgment result is no, taking the optical flow value after guided filtering as the output optical flow value of the image sequence.
Optionally, the optical flow increment and optical flow initial value calculating module specifically includes:
the current layer image selection submodule is used for selecting the kth layer image as the current layer image;
the optical flow increment and initial value calculation submodule is used for calculating the optical flow increment and the initial value of the optical flow of the current layer image by using a formula (1);
where u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow.
Optionally, the optical flow output value calculating module specifically includes:
the optical flow output value calculating submodule is used for taking the k-th layer image as the current layer image and calculating the optical flow output value of the current layer image by using formula (2);
u^{k+1} and v^{k+1} respectively denote the components of the optical flow output value of the k-th layer image along the x and y axes; u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes.
Optionally, the alternating guided filtering module specifically includes:
a first input setting submodule, used for taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T of the current layer image as the input of the first joint bilateral filter of the n-th iteration, and taking the median-filtered value of each pixel (i,j)^T from the (n-1)-th iteration as the guide of the first joint bilateral filter of the n-th iteration; the median-filtered value of the 0-th iteration is initialized to 0; n denotes the current iteration number, n = 1, 2, ..., N, where N is a preset number of iterations;
a first joint bilateral filtering submodule, used for obtaining the optical flow value of each pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration by using formula (3); in formula (3), the filtered value at the window center pixel (i,j)^T is a weighted average over the filter window, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation;
a second input setting submodule, used for taking the optical flow value of each pixel (i,j)^T of the current layer image after the first joint bilateral filtering of the n-th iteration as the input of the second joint bilateral filter of the n-th iteration, and taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T as the guide of the second joint bilateral filter of the n-th iteration;
a second joint bilateral filtering submodule, used for obtaining the optical flow value of each pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration by using formula (4); in formula (4), K_p3 and K_p4 denote weight normalization factors;
a median filtering submodule, used for performing median filtering on the optical flow value after the second joint bilateral filtering of the n-th iteration by using formula (5) to obtain the median-filtered value of each pixel (i,j)^T after the n-th iteration, where f(·) denotes median filtering;
a second result judging submodule, used for judging whether the current iteration number n is smaller than the preset number of iterations; if not, taking the median-filtered value of each pixel (i,j)^T after the n-th iteration as the optical flow value of the current layer image after alternating guided filtering; and if so, increasing n by 1 as the current iteration number and returning to the first input setting submodule.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the invention discloses a method and a system for acquiring an image sequence optical flow, which are used for carrying out nonlinear filtering on an image sequence pyramid layered optical flow calculation model by utilizing alternative guide filtering so as to solve the problem that images and moving edges in an image sequence pyramid layered model optical flow calculation result are blurred and realize accurate segmentation of a target object and a scene in an image sequence.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive labor.
FIG. 1 is a flowchart of a method for acquiring an optical flow of an image sequence according to the present invention;
FIG. 2a is a previous frame image according to an embodiment of the present invention;
FIG. 2b is a next frame of image according to an embodiment of the present invention;
FIG. 3a is an image pyramid of one embodiment of the present invention;
FIG. 3b is a flow chart of alternative guided filtering according to an embodiment of the present invention;
FIG. 4 is a diagram of a filter window according to an embodiment of the present invention;
FIG. 5 is a diagram of the output result of a specific embodiment provided by the present invention;
fig. 6 is a structural diagram of an image sequence optical flow acquisition system provided in the present invention.
Detailed Description
The invention aims to provide a method and a system for acquiring an optical flow of an image sequence, which are used for accurately segmenting a target object and a scene in the image sequence.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
As shown in fig. 1, the present invention provides a method for acquiring an optical flow of an image sequence, the method comprising the following steps:
step 101, selecting any two continuous frames of images in an image sequence to obtain a sample image.
And 102, carrying out pyramid downsampling layering on the sample image to obtain an image pyramid.
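As an illustration of step 102, a minimal pyramid down-sampling sketch is given below. The sampling coefficient 0.5 and the four layers are the values used in the embodiment later in this description; the use of cv2.resize with area interpolation is an implementation assumption, not a requirement stated by the method.

```python
import cv2
import numpy as np

def build_pyramid(img, levels=4, scale=0.5):
    """Pyramid down-sampling layering: returns the layers coarsest-first,
    so index k runs from the coarsest layer to the original resolution."""
    pyr = [np.asarray(img, dtype=np.float32)]
    for _ in range(levels - 1):
        h, w = pyr[-1].shape[:2]
        new_size = (max(1, int(w * scale)), max(1, int(h * scale)))  # (width, height)
        pyr.append(cv2.resize(pyr[-1], new_size, interpolation=cv2.INTER_AREA))
    return pyr[::-1]
```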
Step 103, selecting a layer of image from the image pyramid as a current layer image, and calculating an optical flow increment and an optical flow initial value of the current layer image.
Step 103 is to arbitrarily select a layer of image from the image pyramid as a current layer image, and calculate an optical flow increment and an optical flow initial value of the current layer image, and specifically includes:
selecting the k-th layer image as the current layer image, where k ≥ 1, and calculating the optical flow increment and the initial optical flow value of the current layer image by using formula (1):
where u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow.
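Formula (1) itself is not reproduced in this text. For orientation, incremental variational optical flow models that use exactly the quantities defined above (a non-quadratic penalty Ψ, the derivatives I_x^k, I_y^k, I_t^k, an increment (du^k, dv^k) and a divergence term) are commonly written as Euler-Lagrange equations of the following form; the smoothness weight α and the penalty Ψ(ε²) = √(ε² + ξ²) are assumptions for illustration, not values taken from the patent:

$$
\begin{aligned}
\Psi'_k\big((I_t^k + I_x^k\,du^k + I_y^k\,dv^k)^2\big)\,I_x^k\,(I_t^k + I_x^k\,du^k + I_y^k\,dv^k)
 - \alpha\,\operatorname{div}\!\big(\Psi'_k(S)\,\nabla(u^k + du^k)\big) &= 0,\\
\Psi'_k\big((I_t^k + I_x^k\,du^k + I_y^k\,dv^k)^2\big)\,I_y^k\,(I_t^k + I_x^k\,du^k + I_y^k\,dv^k)
 - \alpha\,\operatorname{div}\!\big(\Psi'_k(S)\,\nabla(v^k + dv^k)\big) &= 0,
\end{aligned}
$$

where S = |∇(u^k + du^k)|² + |∇(v^k + dv^k)|² is the smoothness argument of the penalty derivative.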
And 104, calculating an optical flow output value of the current layer image according to the optical flow increment of the current layer image and the optical flow initial value. The step 104 of calculating an optical flow output value of the current layer image according to the optical flow increment and the optical flow initial value of the current layer image specifically includes:
the k-th layer image is used as a current layer image, and an optical flow output value of the current layer image is calculated by using formula (2).
u^{k+1} and v^{k+1} respectively denote the components of the optical flow output value of the k-th layer image along the x and y axes; u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes.
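Formula (2) is likewise not reproduced here; from the definitions of u^{k+1}, v^{k+1}, u^k, v^k, du^k and dv^k it corresponds to the incremental update below, stated as an inference from the text rather than a verbatim copy:

$$
u^{k+1} = u^k + du^k, \qquad v^{k+1} = v^k + dv^k.
$$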
Step 105, performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering (a code sketch of this procedure follows step six below). Step 105 specifically includes:
Step one, taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T of the current layer image as the input of the first joint bilateral filter of the n-th iteration, and taking the median-filtered value of each pixel (i,j)^T from the (n-1)-th iteration as the guide of the first joint bilateral filter of the n-th iteration; the median-filtered value of the 0-th iteration is initialized to 0; n denotes the current iteration number, n = 1, 2, ..., N, where N is a preset number of iterations;
Step two, obtaining the optical flow value of each pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration by using formula (3), with n ≥ 1; in formula (3), the filtered value at the window center pixel (i,j)^T is a weighted average over the filter window, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation (σ_s > 0), and σ_r denotes the range weight control standard deviation (σ_r > 0);
Step three, taking the optical flow value of each pixel (i,j)^T of the current layer image after the first joint bilateral filtering of the n-th iteration as the input of the second joint bilateral filter of the n-th iteration, and taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T as the guide of the second joint bilateral filter of the n-th iteration;
Step four, obtaining the optical flow value of each pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration by using formula (4); in formula (4), K_p3 and K_p4 denote weight normalization factors;
Step five, performing median filtering on the optical flow value after the second joint bilateral filtering of the n-th iteration by using formula (5) to obtain the median-filtered value of each pixel (i,j)^T after the n-th iteration, where f(·) denotes median filtering;
Step six, judging whether the current iteration number n is smaller than the preset number of iterations; if not, taking the median-filtered value of each pixel (i,j)^T after the n-th iteration as the optical flow value of the current layer image after alternating guided filtering; and if so, increasing n by 1 as the current iteration number and returning to step one.
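The following NumPy sketch shows one way to implement steps one to six for a single pyramid layer. Because formulas (3) to (5) are not reproduced in this text, the Gaussian spatial and range weights, the window radius and the 5x5 median window used here are assumptions rather than the patent's exact expressions; scipy.ndimage.median_filter stands in for the median operator f(·).

```python
import numpy as np
from scipy.ndimage import median_filter

def joint_bilateral(src, guide, radius=3, sigma_s=3.0, sigma_r=0.3):
    """Joint bilateral filter: smooth `src` with range weights computed on `guide`.
    src and guide are 2-D float arrays of equal shape (one flow component).
    A plain per-pixel loop, written for clarity rather than speed."""
    h, w = src.shape
    out = np.zeros_like(src)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))   # spatial weight, std sigma_s
    src_p = np.pad(src, radius, mode='edge')
    gui_p = np.pad(guide, radius, mode='edge')
    for i in range(h):
        for j in range(w):
            s_win = src_p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            g_win = gui_p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-((g_win - guide[i, j]) ** 2) / (2.0 * sigma_r ** 2))  # range weight, std sigma_r
            wgt = spatial * rng
            out[i, j] = (wgt * s_win).sum() / wgt.sum()   # division by the weight normalization factor K_p
    return out

def alternating_guided_filter(u, v, n_iter=3, radius=3, sigma_s=3.0, sigma_r=0.3):
    """Alternating guided filtering of one layer's flow output (u, v), steps one to six."""
    med_u, med_v = np.zeros_like(u), np.zeros_like(v)   # median-filtered value of the 0-th iteration is 0
    for _ in range(n_iter):
        # Steps one and two: first joint bilateral filter, guided by the previous median result.
        f1_u = joint_bilateral(u, med_u, radius, sigma_s, sigma_r)
        f1_v = joint_bilateral(v, med_v, radius, sigma_s, sigma_r)
        # Steps three and four: second joint bilateral filter, guided by the original flow output.
        f2_u = joint_bilateral(f1_u, u, radius, sigma_s, sigma_r)
        f2_v = joint_bilateral(f1_v, v, radius, sigma_s, sigma_r)
        # Step five: median filtering removes remaining outliers at motion edges.
        med_u = median_filter(f2_u, size=5)
        med_v = median_filter(f2_v, size=5)
    # Step six: after N iterations the median-filtered values are the filtered flow of this layer.
    return med_u, med_v
```

With the parameter values given in the embodiment below, this would be called as alternating_guided_filter(u_out, v_out, n_iter=3, sigma_s=3.0, sigma_r=0.3).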
Step 106, calculating the output optical flow increment of the current layer image according to the optical flow value after alternating guided filtering and the initial optical flow value of the current layer image.
Step 107, judging whether the number of layers of the current layer image is smaller than the number of layers of the image pyramid, and obtaining a first judgment result.
Step 108, if the first judgment result is yes, taking the next layer image of the current layer image as the current layer image, taking the optical flow value after alternating guided filtering as the initial optical flow value of the current layer image, taking the output optical flow increment as the optical flow increment of the current layer image, and returning to the step of calculating the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image.
Step 109, if the first judgment result is no, taking the optical flow value after guided filtering as the output optical flow value of the image sequence. A sketch of the overall coarse-to-fine control flow of steps 101 to 109 follows.
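Combining the sketches above (build_pyramid from step 102 and alternating_guided_filter from step 105), the coarse-to-fine control flow of steps 103 to 109 could be organized as follows. img1 and img2 are assumed to be single-channel grayscale frames; compute_increment is a crude Horn-Schunck style stand-in for the variational solve of formula (1), and the flow upsampling between layers is an implementation assumption, neither being prescribed verbatim by the patent.

```python
import numpy as np
import cv2

def compute_increment(I1, I2, u, v, alpha=15.0, iters=50):
    """Stand-in for formula (1): a few Horn-Schunck style Jacobi iterations on the
    linearized brightness constancy of I2 warped by the current flow (u, v)."""
    h, w = I1.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    warped = cv2.remap(I2, xx + u, yy + v, cv2.INTER_LINEAR)
    Ix = cv2.Sobel(warped, cv2.CV_32F, 1, 0, ksize=3) / 8.0   # I_x
    Iy = cv2.Sobel(warped, cv2.CV_32F, 0, 1, ksize=3) / 8.0   # I_y
    It = warped - I1                                          # I_t
    du = np.zeros_like(I1)
    dv = np.zeros_like(I1)
    avg = np.array([[0, 0.25, 0], [0.25, 0, 0.25], [0, 0.25, 0]], np.float32)
    for _ in range(iters):
        du_a = cv2.filter2D(du, -1, avg)
        dv_a = cv2.filter2D(dv, -1, avg)
        t = (It + Ix * du_a + Iy * dv_a) / (alpha + Ix ** 2 + Iy ** 2)
        du, dv = du_a - Ix * t, dv_a - Iy * t
    return du, dv

def pyramid_optical_flow(img1, img2, levels=4, scale=0.5, n_iter=3):
    """Steps 103 to 109: per-layer increment, update by formula (2), alternating guided
    filtering, and propagation of the filtered flow to the next (finer) pyramid layer."""
    pyr1 = build_pyramid(img1, levels, scale)
    pyr2 = build_pyramid(img2, levels, scale)
    u = np.zeros(pyr1[0].shape[:2], np.float32)   # initial flow at the coarsest layer
    v = np.zeros_like(u)
    for k in range(levels):
        du, dv = compute_increment(pyr1[k], pyr2[k], u, v)                 # formula (1), stand-in
        u_out, v_out = u + du, v + dv                                      # formula (2)
        u_f, v_f = alternating_guided_filter(u_out, v_out, n_iter=n_iter)  # step 105
        if k == levels - 1:                  # layer count reached: output the filtered flow
            return u_f, v_f
        h, w = pyr1[k + 1].shape[:2]         # the filtered flow becomes the next layer's initial value
        u = cv2.resize(u_f, (w, h)) / scale
        v = cv2.resize(v_f, (w, h)) / scale
```

Called on the two Venus frames of FIGS. 2a and 2b with the embodiment's parameters (4 layers, sampling coefficient 0.5, 3 filtering iterations), this follows the overall flow of FIG. 1.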
As a specific embodiment, referring to FIGS. 2 to 5, a Venus image sequence optical flow computation experiment is used to describe the image sequence pyramid layered optical flow computation method based on alternating guided filtering:
1) two continuous frames of the Venus image sequence are input, as shown in FIG. 2a and FIG. 2b, where FIG. 2a is the previous frame image and FIG. 2b is the next frame image;
2) as shown in FIG. 3a, pyramid down-sampling layering is performed on the input Venus image sequence; the sampling coefficient β satisfies 0 < β < 1 and is specifically 0.5, and the number of image pyramid layers is 4;
3) as shown in FIG. 3b, alternating guided filtering is performed on the optical flow output value (u^{k+1}, v^{k+1})^T of the input k-th layer image, with σ_s = 3, σ_r = 0.3, and the preset number of iterations N = 3;
4) the optical flow of the image sequence is calculated starting from the k-th layer (k = 1) image of the image pyramid, where the optical flow computation model is formula (1), in which u^k and v^k respectively denote the components of the initial value of the k-th layer optical flow computation along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer optical flow computation along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow;
4) according to the optical flow increment du^k, dv^k of the k-th layer image calculated by formula (1) and the initial value u^k, v^k of the k-th layer optical flow computation, the optical flow output value u^{k+1}, v^{k+1} of the k-th layer image is calculated by formula (2);
5) because the optical flow of the k-th layer image calculated by formula (2) is obtained by iterative computation, the optical flow result tends to be over-smoothed in edge regions; the optical flow result is therefore subjected to an alternating guided filtering operation to remove noise and fine-structure optical flow values while preserving the optical flow values at motion edges.
As shown in FIG. 4, a filter window w_k of size h × h centered on a pixel (i,j)^T is constructed in the k-th layer optical flow field. The optical flow (u_{i,j}, v_{i,j})^T at the window center (i,j)^T is obtained as a weighted average of the optical flow (u_{i',j'}, v_{i',j'})^T of the pixels (i',j')^T in its neighbourhood, and the filter window traverses the whole optical flow field so that every pixel is filtered. The invention divides the whole filtering procedure into three steps. First step: using a joint bilateral filtering framework, the optical flow output value (u^{k+1}, v^{k+1})^T of the k-th layer image is taken as the input of the first joint bilateral filter, and the optical flow value after the (n-1)-th pass of the alternating guided filtering of steps 6) and 7) is taken as the guide of the first joint bilateral filter, with the initial guide value set to 0; the whole optical flow field is traversed to eliminate the optical flow values of noise and fine structures, and the formula is formula (3):
6) in formula (3), the filtered value denotes the optical flow value of the window center pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation. Second step: using a joint bilateral filtering framework, the optical flow value obtained by formula (3) is taken as the input of the second joint bilateral filter, and the original optical flow output value (u^{k+1}, v^{k+1})^T is taken as the guide of the second joint bilateral filter; the whole optical flow field is traversed to recover the optical flow values at motion edges, and the formula is formula (4):
7) in formula (4), the filtered value denotes the optical flow value of the window center pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration, K_p3 and K_p4 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation. Third step: the optical flow values obtained by formula (4) over the whole optical flow field are median filtered to remove abnormal optical flow values at motion edges, and the formula is formula (5):
8) in formula (5), the result denotes the median-filtered optical flow value of pixel (i,j)^T in the n-th iteration, and f(·) denotes median filtering. The optical flow value processed by formula (5) is taken as the guide of the first joint bilateral filter of the (n+1)-th alternating guided filtering iteration; steps 5) to 7) are repeated iteratively, the loop stops when the current iteration number n reaches 3, and the optical flow value after alternating guided filtering is output; the calculation process is shown in FIG. 3b.
9) The optical flow after 3 passes of alternating guided filtering is taken as the initial value for calculating the optical flow of the (k+1)-th layer of the image sequence pyramid, steps 3) to 8) are repeated, the loop stops when the pyramid layer number k reaches 4, and the final optical flow result is output as shown in FIG. 5. An assumed explicit form of formulas (3) to (5) used above is sketched below.
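Formulas (3) to (5) are not reproduced in this text. A standard joint bilateral filter with Gaussian spatial and range kernels, consistent with the window w_k, the standard deviations σ_s and σ_r and the normalization factors K_p described above, would take the following form; the guide symbol g and the exact superscripts are notational assumptions:

$$
\hat{u}^{\,n}_{i,j} = \frac{1}{K_{p}} \sum_{(i',j')^T \in w_k}
\exp\!\left(-\frac{(i-i')^2 + (j-j')^2}{2\sigma_s^2}\right)
\exp\!\left(-\frac{\lVert g_{i,j} - g_{i',j'}\rVert^2}{2\sigma_r^2}\right) u^{k+1}_{i',j'},
\qquad
K_{p} = \sum_{(i',j')^T \in w_k}
\exp\!\left(-\frac{(i-i')^2 + (j-j')^2}{2\sigma_s^2}\right)
\exp\!\left(-\frac{\lVert g_{i,j} - g_{i',j'}\rVert^2}{2\sigma_r^2}\right),
$$

with the analogous expression for the v component. In formula (3) the guide g is the median-filtered flow of the previous iteration and the input is the layer's flow output (u^{k+1}, v^{k+1})^T; in formula (4) the guide is the original flow output and the input is the result of the first filter; formula (5) is then a plain median over the window, i.e. the median operator f(·) applied to the output of formula (4).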
As can be seen from the optical flow result in FIG. 5, the method of the present invention overcomes the problem that image and motion edges are over-smoothed in image sequence optical flow results, achieves higher computational accuracy and better applicability for image sequences with complex scenes and complex edges, and has broad application prospects in fields such as security monitoring, traffic detection, and target segmentation and tracking.
As shown in fig. 6, the present invention also provides an acquisition system for optical flow of an image sequence, the acquisition system including:
the sample image obtaining module 601 is configured to select any two consecutive frames of images in the image sequence to obtain a sample image;
a pyramid layering module 602, configured to perform pyramid downsampling layering on the sample image to obtain an image pyramid;
an optical flow increment and optical flow initial value calculating module 603, configured to arbitrarily select a layer of image from the image pyramid as a current layer image, and calculate an optical flow increment and an optical flow initial value of the current layer image.
The optical flow increment and optical flow initial value calculating module 603 specifically includes:
the current layer image selection submodule is used for selecting the kth layer image as the current layer image; and the optical flow increment and initial value calculation submodule is used for calculating the optical flow increment and the initial value of the optical flow of the current layer image by using the formula (1).
where u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow.
The optical flow output value calculating module 604 is configured to calculate the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image. The optical flow output value calculating module 604 specifically includes: the optical flow output value calculating submodule, which is used for taking the k-th layer image as the current layer image and calculating the optical flow output value of the current layer image by using formula (2).
u^{k+1} and v^{k+1} respectively denote the components of the optical flow output value of the k-th layer image along the x and y axes; u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes.
The alternating guided filtering module 605 is configured to perform alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering.
The alternating guided filtering module 605 specifically includes:
a first input setting submodule, used for taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T of the current layer image as the input of the first joint bilateral filter of the n-th iteration, and taking the median-filtered value of each pixel (i,j)^T from the (n-1)-th iteration as the guide of the first joint bilateral filter of the n-th iteration; the median-filtered value of the 0-th iteration is initialized to 0; n denotes the current iteration number, n = 1, 2, ..., N, where N is a preset number of iterations;
a first joint bilateral filtering submodule, used for obtaining the optical flow value of each pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration by using formula (3); in formula (3), the filtered value at the window center pixel (i,j)^T is a weighted average over the filter window, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation;
a second input setting submodule, used for taking the optical flow value of each pixel (i,j)^T of the current layer image after the first joint bilateral filtering of the n-th iteration as the input of the second joint bilateral filter of the n-th iteration, and taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T as the guide of the second joint bilateral filter of the n-th iteration;
a second joint bilateral filtering submodule, used for obtaining the optical flow value of each pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration by using formula (4); in formula (4), K_p3 and K_p4 denote weight normalization factors;
a median filtering submodule, used for performing median filtering on the optical flow value after the second joint bilateral filtering of the n-th iteration by using formula (5) to obtain the median-filtered value of each pixel (i,j)^T after the n-th iteration, where f(·) denotes median filtering;
a second result judging submodule, used for judging whether the current iteration number n is smaller than the preset number of iterations; if not, taking the median-filtered value of each pixel (i,j)^T after the n-th iteration as the optical flow value of the current layer image after alternating guided filtering; and if so, increasing n by 1 as the current iteration number and returning to the first input setting submodule.
An output optical flow increment calculating module 606, configured to calculate the output optical flow increment of the current layer image according to the optical flow value after alternating guided filtering and the initial optical flow value of the current layer image;
a first result judging module 607, configured to judge whether the number of layers of the current layer image is smaller than the number of layers of the image pyramid to obtain a first judgment result; if the first judgment result is yes, take the next layer image of the current layer image as the current layer image, take the optical flow value after alternating guided filtering as the initial optical flow value of the current layer image, take the output optical flow increment as the optical flow increment of the current layer image, and return to the step of calculating the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image; and if the first judgment result is no, take the optical flow value after guided filtering as the output optical flow value of the image sequence.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principle and the implementation manner of the present invention are explained by applying specific examples, the above description of the embodiments is only used to help understanding the method of the present invention and the core idea thereof, the described embodiments are only a part of the embodiments of the present invention, not all embodiments, and all other embodiments obtained by one of ordinary skill in the art based on the embodiments of the present invention are within the protection scope of the present invention without any creative efforts.

Claims (8)

1. A method for acquiring an optical flow of an image sequence, the method comprising the steps of:
selecting any two continuous frames of images in the image sequence to obtain a sample image;
carrying out pyramid downsampling layering on the sample image to obtain an image pyramid;
randomly selecting a layer of image from the image pyramid as a current layer image, and calculating an optical flow increment and an optical flow initial value of the current layer image;
calculating an optical flow output value of the current layer image according to the optical flow increment and the optical flow initial value of the current layer image;
performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering;
calculating an output optical flow increment of the current layer image according to the optical flow value after alternating guided filtering and the initial optical flow value of the current layer image;
judging whether the number of layers of the current layer image is smaller than the number of layers of the image pyramid to obtain a first judgment result;
if the first judgment result is yes, taking the next layer image of the current layer image as the current layer image, taking the optical flow value after alternating guided filtering as the initial optical flow value of the current layer image, taking the output optical flow increment as the optical flow increment of the current layer image, and returning to the step of calculating the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image;
and if the first judgment result is no, taking the optical flow value after guided filtering as the output optical flow value of the image sequence.
2. The method as claimed in claim 1, wherein the step of arbitrarily selecting a layer of image from the image pyramid as a current layer image, and calculating an optical flow increment and an optical flow initial value of the current layer image comprises:
selecting the k-th layer image as the current layer image;
calculating the optical flow increment and the initial optical flow value of the current layer image by using formula (1);
where u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow.
3. The method for acquiring an optical flow of an image sequence according to claim 1, wherein the calculating an optical flow output value of a current layer image according to an optical flow increment and an optical flow initial value of the current layer image specifically comprises:
taking the k-th layer image as a current layer image, and calculating an optical flow output value of the current layer image by using a formula (2);
u^{k+1} and v^{k+1} respectively denote the components of the optical flow output value of the k-th layer image along the x and y axes; u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes.
4. The method for acquiring an optical flow of an image sequence according to claim 1, wherein performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering specifically includes:
Step one, taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T of the current layer image as the input of the first joint bilateral filter of the n-th iteration, and taking the median-filtered value of each pixel (i,j)^T from the (n-1)-th iteration as the guide of the first joint bilateral filter of the n-th iteration; the median-filtered value of the 0-th iteration is initialized to 0; n denotes the current iteration number, n = 1, 2, ..., N, where N is a preset number of iterations;
Step two, obtaining the optical flow value of each pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration by using formula (3); in formula (3), the filtered value at the window center pixel (i,j)^T is a weighted average over the filter window, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation;
Step three, taking the optical flow value of each pixel (i,j)^T of the current layer image after the first joint bilateral filtering of the n-th iteration as the input of the second joint bilateral filter of the n-th iteration, and taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T as the guide of the second joint bilateral filter of the n-th iteration;
Step four, obtaining the optical flow value of each pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration by using formula (4); in formula (4), K_p3 and K_p4 denote weight normalization factors;
Step five, performing median filtering on the optical flow value after the second joint bilateral filtering of the n-th iteration by using formula (5) to obtain the median-filtered value of each pixel (i,j)^T after the n-th iteration, where f(·) denotes median filtering;
Step six, judging whether the current iteration number n is smaller than the preset number of iterations; if not, taking the median-filtered value of each pixel (i,j)^T after the n-th iteration as the optical flow value of the current layer image after alternating guided filtering; and if so, increasing n by 1 as the current iteration number and returning to step one.
5. An acquisition system of an optical flow of a sequence of images, characterized in that it comprises:
the sample image acquisition module is used for selecting any two continuous frames of images in the image sequence to obtain a sample image;
the pyramid layering module is used for carrying out pyramid downsampling layering on the sample image to obtain an image pyramid;
the optical flow increment and optical flow initial value calculation module is used for randomly selecting a layer of image from the image pyramid as a current layer image and calculating the optical flow increment and the optical flow initial value of the current layer image;
the optical flow output value calculating module is used for calculating the optical flow output value of the current layer image according to the optical flow increment of the current layer image and the optical flow initial value;
the alternating guided filtering module is used for performing alternating guided filtering on the optical flow output value of the current layer image to obtain the optical flow value after alternating guided filtering;
an output optical flow increment calculating module, configured to calculate the output optical flow increment of the current layer image according to the optical flow value after alternating guided filtering and the initial optical flow value of the current layer image;
the first result judging module is used for judging whether the number of layers of the current layer image is equal to the number of layers of the image pyramid to obtain a first judgment result; if the first judgment result is no, taking the next layer image of the current layer image as the current layer image, taking the optical flow value after alternating guided filtering as the initial optical flow value of the current layer image, taking the output optical flow increment as the optical flow increment of the current layer image, and returning to the step of calculating the optical flow output value of the current layer image according to the optical flow increment and the initial optical flow value of the current layer image; and if the first judgment result is yes, taking the optical flow value after guided filtering as the output optical flow value of the image sequence.
6. The system for acquiring optical flow of a sequence of images according to claim 5, wherein the optical flow increment and optical flow initial value calculating module specifically comprises:
the current layer image selection submodule is used for selecting the kth layer image as the current layer image;
the optical flow increment and initial value calculation submodule is used for calculating the optical flow increment and the initial value of the optical flow of the current layer image by using a formula (1);
where u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes; Ψ'_k denotes the derivative of the non-quadratic penalty function Ψ(ε²) evaluated on the k-th layer image, where ε denotes the argument of the function; I_x^k, I_y^k and I_t^k respectively denote the partial derivatives of the brightness I of a pixel of the k-th layer image along the x axis, the y axis and time t; div denotes the divergence of the optical flow.
7. The system for acquiring optical flow of image sequence according to claim 5, wherein the optical flow output value calculating module specifically comprises:
the optical flow output value calculating submodule is used for taking the k-th layer image as the current layer image and calculating the optical flow output value of the current layer image by using formula (2);
u^{k+1} and v^{k+1} respectively denote the components of the optical flow output value of the k-th layer image along the x and y axes; u^k and v^k respectively denote the components of the initial optical flow value of the k-th layer image along the x and y axes; du^k and dv^k respectively denote the components of the optical flow increment of the k-th layer image along the x and y axes.
8. The system for acquiring optical flow of a sequence of images according to claim 5, wherein the alternating guided filtering module specifically comprises:
a first input setting submodule, used for taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T of the current layer image as the input of the first joint bilateral filter of the n-th iteration, and taking the median-filtered value of each pixel (i,j)^T from the (n-1)-th iteration as the guide of the first joint bilateral filter of the n-th iteration; the median-filtered value of the 0-th iteration is initialized to 0; n denotes the current iteration number, n = 1, 2, ..., N, where N is a preset number of iterations;
a first joint bilateral filtering submodule, used for obtaining the optical flow value of each pixel (i,j)^T after the first joint bilateral filtering of the n-th iteration by using formula (3); in formula (3), the filtered value at the window center pixel (i,j)^T is a weighted average over the filter window, K_p1 and K_p2 denote weight normalization factors, σ_s denotes the spatial weight control standard deviation, and σ_r denotes the range weight control standard deviation;
a second input setting submodule, used for taking the optical flow value of each pixel (i,j)^T of the current layer image after the first joint bilateral filtering of the n-th iteration as the input of the second joint bilateral filter of the n-th iteration, and taking the optical flow output value (u^{k+1}_{i,j}, v^{k+1}_{i,j})^T of each pixel (i,j)^T as the guide of the second joint bilateral filter of the n-th iteration;
a second joint bilateral filtering submodule, used for obtaining the optical flow value of each pixel (i,j)^T after the second joint bilateral filtering of the n-th iteration by using formula (4); in formula (4), K_p3 and K_p4 denote weight normalization factors;
a median filtering submodule, used for performing median filtering on the optical flow value after the second joint bilateral filtering of the n-th iteration by using formula (5) to obtain the median-filtered value of each pixel (i,j)^T after the n-th iteration, where f(·) denotes median filtering;
a second result judging submodule, used for judging whether the current iteration number n is smaller than the preset number of iterations; if not, taking the median-filtered value of each pixel (i,j)^T after the n-th iteration as the optical flow value of the current layer image after alternating guided filtering; and if so, increasing n by 1 as the current iteration number and returning to the first input setting submodule.
CN201810105993.7A 2018-02-02 2018-02-02 Method and system for acquiring image sequence optical flow Active CN108280831B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810105993.7A CN108280831B (en) 2018-02-02 2018-02-02 Method and system for acquiring image sequence optical flow

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810105993.7A CN108280831B (en) 2018-02-02 2018-02-02 Method and system for acquiring image sequence optical flow

Publications (2)

Publication Number Publication Date
CN108280831A true CN108280831A (en) 2018-07-13
CN108280831B CN108280831B (en) 2020-04-21

Family

ID=62807376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810105993.7A Active CN108280831B (en) 2018-02-02 2018-02-02 Method and system for acquiring image sequence optical flow

Country Status (1)

Country Link
CN (1) CN108280831B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109345564A (en) * 2018-07-30 2019-02-15 深圳市艾为智能有限公司 A method for resolving the mismatch between the optical flow field and the motion field caused by self-similarity features
CN109801279A (en) * 2019-01-21 2019-05-24 京东方科技集团股份有限公司 Object detection method and device, electronic equipment, storage medium in image
CN111583295A (en) * 2020-04-28 2020-08-25 清华大学 Real-time dense optical flow computing method based on image block binarization Hash representation
CN112529931A (en) * 2020-12-23 2021-03-19 南京航空航天大学 Foreground segmentation method and system
CN112862715A (en) * 2021-02-08 2021-05-28 天津大学 Real-time and controllable scale space filtering method
CN112842257A (en) * 2019-11-12 2021-05-28 磅客策(上海)机器人有限公司 Blood vessel positioning method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3086552A1 (en) * 2015-04-20 2016-10-26 Thomson Licensing Method and apparatus for image colorization
CN106097392A (en) * 2016-06-13 2016-11-09 西安电子科技大学 High-precision optical flow estimation method based on two-stage edge sensitive filtering
CN106162133A (en) * 2016-06-30 2016-11-23 北京大学 Color interpolation method based on adaptive directed filtering
CN106934820A (en) * 2017-03-17 2017-07-07 南昌航空大学 Image sequence Pyramid technology optical flow computation method based on guiding filtering

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3086552A1 (en) * 2015-04-20 2016-10-26 Thomson Licensing Method and apparatus for image colorization
CN106097392A (en) * 2016-06-13 2016-11-09 西安电子科技大学 High-precision optical flow estimation method based on two-stage edge sensitive filtering
CN106162133A (en) * 2016-06-30 2016-11-23 北京大学 Color interpolation method based on adaptive directed filtering
CN106934820A (en) * 2017-03-17 2017-07-07 南昌航空大学 Image sequence Pyramid technology optical flow computation method based on guiding filtering

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ALEXANDER TOET: "Alternating guided image filtering", PEERJ COMPUTER SCIENCE *
ZHIGANG TU ET AL.: "Adaptive guided image filter for warping in variational optical flow computation", IEEE *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109345564A (en) * 2018-07-30 2019-02-15 深圳市艾为智能有限公司 A method for resolving the mismatch between the optical flow field and the motion field caused by self-similarity features
CN109801279A (en) * 2019-01-21 2019-05-24 京东方科技集团股份有限公司 Object detection method and device, electronic equipment, storage medium in image
CN112842257A (en) * 2019-11-12 2021-05-28 磅客策(上海)机器人有限公司 Blood vessel positioning method and device
CN111583295A (en) * 2020-04-28 2020-08-25 清华大学 Real-time dense optical flow computing method based on image block binarization Hash representation
CN111583295B (en) * 2020-04-28 2022-08-12 清华大学 Real-time dense optical flow computing method based on image block binarization Hash representation
CN112529931A (en) * 2020-12-23 2021-03-19 南京航空航天大学 Foreground segmentation method and system
CN112529931B (en) * 2020-12-23 2024-04-12 南京航空航天大学 Method and system for foreground segmentation
CN112862715A (en) * 2021-02-08 2021-05-28 天津大学 Real-time and controllable scale space filtering method

Also Published As

Publication number Publication date
CN108280831B (en) 2020-04-21

Similar Documents

Publication Publication Date Title
CN108280831B (en) Method and system for acquiring image sequence optical flow
DE69231812T2 (en) METHOD FOR DETERMINING PROBE MOVEMENT AND SCENE STRUCTURE AND IMAGE PROCESSING SYSTEM THEREFOR
Yang et al. Fusion of median and bilateral filtering for range image upsampling
CN111340844B (en) Multi-scale characteristic optical flow learning calculation method based on self-attention mechanism
Barron et al. Quantitative color optical flow
CN103458261B (en) Video scene variation detection method based on stereoscopic vision
CN106934820B (en) Image sequence Pyramid technology optical flow computation method based on guiding filtering
DE102016209625A1 (en) Method for evaluating image data of a vehicle camera
Ding et al. U 2 D 2 Net: Unsupervised unified image dehazing and denoising network for single hazy image enhancement
WO2001063236A2 (en) Method for automatically detecting casting defects in a test piece
CN110349186B (en) Large-displacement motion optical flow calculation method based on depth matching
CN108921170B (en) Effective image noise detection and denoising method and system
Yue et al. Semi-supervised monocular depth estimation based on semantic supervision
CN104182940B (en) Blurred image restoration method and system
CN117456330A (en) MSFAF-Net-based low-illumination target detection method
CN117058474B (en) Depth estimation method and system based on multi-sensor fusion
Hua et al. An Efficient Multiscale Spatial Rearrangement MLP Architecture for Image Restoration
CN108492308B (en) Method and system for determining variable light split flow based on mutual structure guided filtering
CN109614976B (en) Heterogeneous image fusion method based on Gabor characteristics
CN103236053B (en) A kind of MOF method of moving object detection under mobile platform
CN111191694A (en) Image stereo matching method
CN113536905B (en) Time-frequency domain combined panoramic segmentation convolutional neural network and application thereof
Bellamine et al. Optical flow estimation based on the structure–texture image decomposition
Bellamine et al. Motion estimation using the total variation-local-global optical flow and the structure-texture image decomposition
Ezumi et al. Single Image Raindrop Removal Using a Non-Local Operator and Feature Maps in the Frequency Domain

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant