CN114842386A - Event motion segmentation method for progressive iterative optimization of event camera - Google Patents
Event motion segmentation method for progressive iterative optimization of event camera
- Publication number: CN114842386A (application CN202210484727.6A)
- Authority: CN (China)
- Prior art keywords: event, motion, iteration, segmentation
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G06V20/41 — Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
- G06V10/26 — Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
- G06V10/30 — Noise filtering
- G06V20/49 — Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes
Abstract
The invention discloses an event motion segmentation method for progressive iterative optimization of an event camera, comprising the following steps: (1) preprocess the captured events to obtain the event packet processed in each pass and define the motion segmentation parameters; (2) when processing an event packet, initialize the corresponding motion segmentation parameters and optimization parameters; (3) perform motion estimation on the events in one iteration to obtain the motion parameters and compute the event correlation; (4) denoise using the event correlation obtained by the motion estimation in this iteration; (5) return to step (3) and iterate until the motion segmentation result converges; (6) return to step (2) and process all event packets in turn, fusing the results after all packets are processed to obtain the final motion segmentation result. The invention removes noise without losing motion information, thereby effectively improving motion segmentation performance.
Description
Technical Field
The invention belongs to the field of event camera motion segmentation, and particularly relates to an event camera-oriented progressive iterative optimization event motion segmentation method.
Background
An event camera (Dynamic Vision Sensor) is a new type of biologically inspired vision sensor that senses scene brightness asynchronously at each pixel and outputs a stream of positive and negative binary pulse signals (called events) corresponding to cues of relative motion between the camera and the scene. The temporal resolution of an event camera reaches the microsecond level, so it is very sensitive to scene brightness changes, can record fine-grained motion evolution, and provides rich motion cues for motion segmentation tasks.
Event camera motion segmentation aims to segment events into different clusters according to the motion each event belongs to. Current methods for motion segmentation of event-stream input fall roughly into two categories: the first converts events into frames and then applies traditional frame-based segmentation; the second performs motion segmentation directly in the event space, first aggregating events into clusters and then estimating the motion parameters of each cluster separately. Neither category accounts for the effect of background noise on motion segmentation. Background noise originates from dark current and junction leakage current during sensing and is distributed randomly and sparsely; it destroys the spatial and temporal correlation of real events and ultimately degrades motion segmentation accuracy. Moreover, since the event camera measures logarithmic light intensity, the background noise level also depends on scene brightness (the darker the scene, the more noise), making it a scene-dependent noise.
To suppress the influence of noise on event motion segmentation, a direct approach is to obtain a denoised event stream with a denoising algorithm and then perform motion segmentation on the remaining events. However, because events are sparse, traditional denoising algorithms based on spatial correlation cannot be applied directly, and existing denoising algorithms that exploit only local spatio-temporal correlation between events cannot capture the long-range temporal dependence of events, which is essential for motion segmentation. There is therefore a need for a motion segmentation method that eliminates the effect of noise on motion segmentation without destroying the temporal correlation between real events.
Disclosure of Invention
The invention aims to overcome the defects in the prior art, and provides an event motion segmentation method for progressive iterative optimization of an event camera, so that motion information is not lost while event denoising is carried out, and the motion segmentation performance can be effectively improved.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention relates to an event motion segmentation method for progressive iterative optimization of an event camera, which is characterized by comprising the following steps of:
Step 1: capture a motion scene with an event camera and obtain all events $\mathcal{E} = \{e_k\}_{k=1}^{N}$, where $e_k$ denotes the kth event and $e_k = b_k\,\delta(t - t_k,\; x - x_k,\; y - y_k)$; here $b_k \in \{-1, 1\}$ is the polarity of the kth event, $t_k$ its occurrence time, $x_k$ and $y_k$ the spatial coordinates at which it occurred, $N$ the total number of events, $(t, x, y)$ the spatio-temporal projection coordinates, and $\delta$ an indicator function marking the spatio-temporal coordinates at which an event falls.

Set the number of motion segmentation clusters to $L$ and the motion compensation function of the jth motion class to $W_j$. Set the number of events processed per pass to $N_e$ and cut all events $\mathcal{E}$ into a sequence of temporally overlapping event packets, where the ith packet is denoted $E_i$ and adjacent packets $E_i$ and $E_{i+1}$ share a fixed number of overlapping events.

Let the event probability matrix $P_i = \{p_{ikj}\}$ corresponding to packet $E_i$ be an $N_e \times L$ matrix whose element $p_{ikj}$ in row k, column j is the probability that the kth event $e_k$ of packet $E_i$ belongs to the jth motion class; each row of $P_i$ sums to 1.

Let the event confidence matrix $C_i = \{c_{ikj}\}$ corresponding to packet $E_i$ be an $N_e \times L$ non-negative matrix whose element $c_{ikj}$ in row k, column j is the confidence that the kth event $e_k$ of packet $E_i$ is a real event under the jth motion class; no element of $C_i$ exceeds 1.

Initialize $i = 1$.
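The packetization and matrix setup of Step 1 can be sketched as follows (a minimal illustration with hypothetical names; the patent leaves the packet count and overlap size to the implementer, so a half-overlap split is assumed here):

```python
import numpy as np

def split_into_packets(events, n_e, overlap):
    """Split an (N, 4) array of events (t, x, y, polarity), sorted by t,
    into packets of n_e events where consecutive packets share `overlap` events."""
    stride = n_e - overlap
    packets = []
    start = 0
    while start + n_e <= len(events):
        packets.append(events[start:start + n_e])
        start += stride
    return packets

# Toy stream: 10 events, packets of 4 with a 2-event overlap -> starts at 0, 2, 4, 6
events = np.arange(40, dtype=float).reshape(10, 4)
packets = split_into_packets(events, n_e=4, overlap=2)

L = 3  # number of motion clusters
# Per-packet probability matrix (rows sum to 1) and confidence matrix (entries in [0, 1])
P = np.full((4, L), 1.0 / L)
C = np.ones((4, L))
```

With this half-overlap choice every event except those at the stream boundaries appears in exactly two packets, which is what the fusion of Step 6 relies on.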
Step 2: define and initialize the current iteration count iter = 0.

Initialize, for the iter-th iteration, the motion parameter $\theta_j^{(iter)}$ of the jth motion class for packet $E_i$, the event probability matrix $P_i^{(iter)}$ of packet $E_i$, and the event confidence matrix $C_i^{(iter)}$ of packet $E_i$.

Define and initialize the step size $\mu^{(iter)}$ of the iter-th iteration.

Step 3: based on the motion parameters $\theta_j^{(iter)}$, event probability matrix $P_i^{(iter)}$, and event confidence matrix $C_i^{(iter)}$ of the iter-th iteration, perform motion estimation on all events of packet $E_i$ to obtain the motion parameters $\theta_j^{(iter+1)}$ of iteration iter + 1.

Step 4: denoise all events of packet $E_i$ according to the event correlation map, obtaining the event confidence matrix $C_i^{(iter+1)}$ of iteration iter + 1.

Step 5: after assigning iter + 1 to iter, return to Step 3 and continue until the motion estimation converges or the maximum number of iterations is reached, thereby obtaining the motion segmentation matrix of packet $E_i$, determined by the event probability matrix and event confidence matrix of the final iteration.

Step 6: after assigning i + 1 to i, return to Step 2 and continue until the motion segmentation results of all event packets are obtained; fuse the event motion segmentation results of overlapping packets to obtain the motion matrix PC of all events.
The event motion segmentation method for progressive iterative optimization of an event camera is further characterized in that the parameter initialization of packet $E_i$ in Step 2 proceeds as follows:

Step 2.1: set the current iteration count iter = 0.

Step 2.2: when iter = 0 and i = 1, set manually or randomly initialize the motion parameters $\theta_j^{(0)}$ of the jth motion class for packet $E_i$.

When all L motions are optical flow motions, fit the kth event $e_k$ by least squares to obtain its optical flow, and construct an optical flow space from the flows of the $N_e$ events of packet $E_i$.

Initialize the motion parameters of packet $E_i$ with a k-means clustering algorithm as the optical flows of the L cluster centers in this optical flow space.

Step 2.3: when i > 1, initialize the motion parameters of packet $E_i$ with the motion parameters finally estimated for packet $E_{i-1}$.
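The k-means initialization of Step 2.2 can be sketched as follows — a plain k-means over per-event flow vectors with a deterministic seeding chosen for illustration (the patent does not specify the seeding, and the least-squares per-event flow fit is assumed to have already produced the `flows` array):

```python
import numpy as np

def kmeans_init_flows(flows, L, iters=50):
    """Plain k-means over per-event optical flow vectors (N_e, 2);
    the L cluster centers serve as initial motion parameters."""
    # Deterministic init for illustration: evenly spaced samples as starting centers
    idx = np.linspace(0, len(flows) - 1, L).astype(int)
    centers = flows[idx].copy()
    for _ in range(iters):
        # Assign each flow to its nearest center
        d = np.linalg.norm(flows[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centers (keep the old center if a cluster empties)
        for j in range(L):
            if np.any(labels == j):
                centers[j] = flows[labels == j].mean(axis=0)
    return centers

# Two well-separated flow populations -> centers near (0, 0) and (5, 5)
flows = np.vstack([np.random.default_rng(1).normal(0, 0.1, (50, 2)),
                   np.random.default_rng(2).normal(5, 0.1, (50, 2))])
centers = kmeans_init_flows(flows, L=2)
```

In practice a library clustering routine would serve equally well; the point is only that the L cluster centers in flow space seed $\theta_j^{(0)}$.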
Step 3 comprises the following:

Step 3.1: project the kth event $e_k$ to a common instant according to the jth motion using formula (1).

In formula (1), $W_j$ denotes the projection function of the jth motion mapping an input event to an output event, $e'_k$ the event after motion projection, $t_{ref}$ the projection time, and $(x'_k, y'_k)$ the coordinates of $e_k$ after motion projection.

Step 3.2: obtain the weighted motion compensation map and the event correlation map after the iter-th iteration from formulas (2) and (3), respectively.

In formulas (2) and (3), $(x, y)$ are the spatial projection coordinates, and $t_0$ and $t_1$ are the start time and end time of the event packet, respectively.

Update the weighted motion compensation map and the event correlation map after the iter-th iteration using formulas (4), (5), and (6).

In formulas (4) and (5), $*$ denotes the convolution operation, $\leftarrow$ denotes assignment, and $\sigma = (\sigma_x, \sigma_y)$ are the bandwidths of the Gaussian kernel along the x and y directions of the spatial projection coordinates; $\mathrm{Ker}_\sigma(x, y)$ denotes the spatial smoothing kernel given by formula (6), where $\sigma_x$ is the bandwidth of the smoothing kernel in the x direction and $\sigma_y$ the bandwidth in the y direction.

In formula (7), $\theta^{(iter)}$ collects all motion parameters at the iter-th iteration, and Shar denotes the contrast index obtained from formula (8).

In formula (8), $\sum_{x,y}$ denotes summation over all pixel coordinates $(x, y)$, and the local contrast of the projection frame at pixel $(x, y)$ at the iter-th iteration is given by formula (9).

In formula (9), $\omega(x, y)$ is a neighborhood of $(x, y)$, the expectation term is the mean of pixel values in $\omega(x, y)$ at the iter-th iteration, and $|\omega|$ denotes the neighborhood size given by formula (10).

The gradient used to update the motion parameters is calculated by formula (11).

In formula (12), $G_x$ and $G_y$ are intermediate variables of the gradient calculation, and formulas (13) and (14) give the pixel values of $G_x$ and $G_y$ at pixel $(x, y)$.

The remaining gradient terms are computed by the same procedure of formulas (12) to (14) with the corresponding quantities substituted.

Step 3.4: using the motion parameters $\theta_j^{(iter+1)}$ of iteration iter + 1, compute the weighted motion compensation map and event correlation map of iteration iter + 1, and obtain from formula (15) the probability of the kth event $e_k$ under the jth motion at iteration iter + 1.
Step 4 comprises the following:

Step 4.1: obtain from formula (16) the absolute correlation of the kth event $e_k$ under the jth motion at iteration iter + 1, thereby forming the $N_e \times L$ absolute correlation matrix $EC^{(iter+1)}$.

Step 4.2: calculate the average correlation $\lambda$ of all events from formula (17) and use it as the normalization weight.

Step 4.3: obtain from formula (18) the event confidence $c_{ikj}^{(iter+1)}$ of the kth event $e_k$ under the jth motion at iteration iter + 1.
The fusion in Step 6 proceeds as follows:

Step 6.1: define the final motion segmentation result matrix of all events as $PC_{all} = \{pc_{kj}\}$, where $pc_{kj}$ denotes the confidence probability that the kth event among all events belongs to the jth motion.

Step 6.2: obtain from formulas (19) and (20) the sequence number $i_k$ of the event packet in which the kth event first occurs and the sequence number $k'$ of that event within the corresponding packet.

Step 6.3: if $i_k = 1$ and the corresponding condition holds, or the event appears in only one packet's result, set $pc_{kj}$ directly from that packet's segmentation result; otherwise obtain each element $pc_{kj}$ of the final motion segmentation result $PC_{all}$ from formula (21).

In formula (21), $k''$ denotes the sequence number of the kth event within the $(i_k+1)$-th event packet; the element in row $k'$, column $j$ of the motion segmentation matrix $PC_{i_k}$ of the $i_k$-th packet denotes the confidence probability that the kth event of that packet belongs to the jth motion, and the element in row $k''$, column $j$ of $PC_{i_k+1}$ is the corresponding entry of the $(i_k+1)$-th packet's motion segmentation matrix.
Compared with the prior art, the invention has the following beneficial effects:

1. The method performs motion estimation and event denoising iteratively, exploiting the fact that motion information promotes denoising and denoised events in turn promote motion estimation. It thus overcomes both the difficulty of existing denoising methods in capturing the long-range temporal dependence of events and the accuracy loss that noise causes in existing motion estimation methods, preserving the temporal correlation between events while denoising more effectively, and thereby improving motion segmentation performance.

2. For the motion estimation stage, the invention designs a loss function that is more robust to noise and a more accurate gradient calculation method, improving the accuracy of motion estimation and better capturing the temporal correlation between events to assist denoising.

3. The invention introduces event correlation information with long-range temporal dependence, i.e., motion information, into the denoising stage, overcoming the limitation of existing denoising methods that can exploit only local spatio-temporal correlation, so temporal correlation is preserved while denoising is improved.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
In this embodiment, as shown in FIG. 1, the method comprises a motion estimation stage and a denoising stage that are performed iteratively to obtain the event motion segmentation result. The event motion segmentation method for progressive iterative optimization of an event camera proceeds as follows:
Step 1: capture a motion scene with an event camera and obtain all events $\mathcal{E} = \{e_k\}_{k=1}^{N}$, where $e_k$ denotes the kth event and $e_k = b_k\,\delta(t - t_k,\; x - x_k,\; y - y_k)$; here $b_k \in \{-1, 1\}$ is the polarity of the kth event, $t_k$ its occurrence time, $x_k$ and $y_k$ the spatial coordinates at which it occurred, $N$ the total number of events, $(t, x, y)$ the spatio-temporal projection coordinates, and $\delta$ an indicator function marking the spatio-temporal coordinates at which an event falls.

Set the number of motion segmentation clusters to $L$ and the motion compensation function of the jth motion class to $W_j$. Set the number of events processed per pass to $N_e$ and cut all events $\mathcal{E}$ into a sequence of temporally overlapping event packets, where the ith packet is denoted $E_i$ and adjacent packets $E_i$ and $E_{i+1}$ share a fixed number of overlapping events. When setting the motion compensation function, the functional form and number best describing the scene's motion should be chosen according to the actual shooting conditions: for example, when an object moves parallel to the image plane, the corresponding motion compensation function can be set to a two-dimensional optical flow motion, and when only the camera itself rotates in a static scene, it can be set to a three-dimensional constant-speed rotational motion. All events are divided into different packets for segmentation because the assumed motion model and motion parameters usually remain constant only for a short time; to obtain a consistent segmentation result the number of events per packet should therefore not be too large, and in a concrete implementation the event count may be adjusted appropriately, or a time-based packetization scheme may be adopted instead.

Let the event probability matrix $P_i = \{p_{ikj}\}$ corresponding to packet $E_i$ be an $N_e \times L$ matrix whose element $p_{ikj}$ in row k, column j is the probability that the kth event $e_k$ of packet $E_i$ belongs to the jth motion class; each row of $P_i$ sums to 1.

Let the event confidence matrix $C_i = \{c_{ikj}\}$ corresponding to packet $E_i$ be an $N_e \times L$ non-negative matrix whose element $c_{ikj}$ in row k, column j is the confidence that the kth event $e_k$ of packet $E_i$ is a real event under the jth motion class; no element of $C_i$ exceeds 1.

Initialize $i = 1$.

Step 2: define and initialize the current iteration count iter = 0.

Initialize, for the iter-th iteration, the motion parameter $\theta_j^{(iter)}$ of the jth motion class for packet $E_i$, the event probability matrix $P_i^{(iter)}$ of packet $E_i$, and the event confidence matrix $C_i^{(iter)}$ of packet $E_i$.

Define and initialize the step size $\mu^{(iter)}$ of the iter-th iteration.
The parameter initialization of packet $E_i$ in Step 2 proceeds as follows:

Step 2.1: set the current iteration count iter = 0.

Step 2.2: when iter = 0 and i = 1, set manually or randomly initialize the motion parameters $\theta_j^{(0)}$ of the jth motion class for packet $E_i$.

When all L motions are optical flow motions, fit the kth event $e_k$ by least squares to obtain its optical flow, and construct an optical flow space from the flows of the $N_e$ events of packet $E_i$. Initialize the motion parameters of packet $E_i$ with a k-means clustering algorithm as the optical flows of the L cluster centers in this optical flow space.

Step 2.3: when i > 1, initialize the motion parameters of packet $E_i$ with the motion parameters finally estimated for packet $E_{i-1}$.
Step 3: based on the motion parameters $\theta_j^{(iter)}$, event probability matrix $P_i^{(iter)}$, and event confidence matrix $C_i^{(iter)}$ of the iter-th iteration, perform motion estimation on all events of packet $E_i$ to obtain the motion parameters $\theta_j^{(iter+1)}$ of iteration iter + 1.

Step 3.1: project the kth event $e_k$ to a common instant according to the jth motion using formula (1).

In formula (1), $W_j$ denotes the projection function of the jth motion mapping an input event to an output event, $e'_k$ the event after motion projection, $t_{ref}$ the projection time, and $(x'_k, y'_k)$ the coordinates of $e_k$ after motion projection.

Step 3.2: obtain the weighted motion compensation map and the event correlation map after the iter-th iteration from formulas (2) and (3), respectively.

In formulas (2) and (3), $(x, y)$ are the spatial projection coordinates, and $t_0$ and $t_1$ are the start time and end time of the event packet, respectively. When the motion parameters are exactly accurate, the event correlation map computed in this way corresponds to the gradient of the scene along the motion direction: pixels with larger gradient have more events distributed along their motion trajectories, so the value of the event correlation map at each pixel represents the motion correlation between events.
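The weighted motion compensation map of Step 3.2 can be sketched as follows — a minimal illustration assuming a linear optical-flow warp and accumulation of per-event weights into a pixel grid (the names and the exact weighting are assumptions, since the formula images are not reproduced in the text):

```python
import numpy as np

def weighted_compensation_map(xs, ys, ts, weights, flow, t_ref, shape):
    """Warp events to t_ref along a single candidate flow (vx, vy) and
    accumulate their weights into an image (one motion class j)."""
    vx, vy = flow
    xw = np.round(xs - vx * (ts - t_ref)).astype(int)
    yw = np.round(ys - vy * (ts - t_ref)).astype(int)
    img = np.zeros(shape)
    inside = (xw >= 0) & (xw < shape[1]) & (yw >= 0) & (yw < shape[0])
    np.add.at(img, (yw[inside], xw[inside]), weights[inside])  # unbuffered scatter-add
    return img

# Toy example: 5 events generated by a point moving with flow (1, 0) px/s
ts = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
xs = 10 + 1.0 * ts   # x follows the flow
ys = np.full(5, 8.0)
w = np.ones(5)
img = weighted_compensation_map(xs, ys, ts, w, flow=(1.0, 0.0), t_ref=0.0, shape=(16, 32))
# With the correct flow, all five events collapse onto pixel (row 8, col 10)
```

With a wrong candidate flow the same events smear across several pixels, which is exactly what the contrast index of formula (8) penalizes.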
Update the weighted motion compensation map and the event correlation map after the iter-th iteration using formulas (4), (5), and (6).

In formulas (4) and (5), $*$ denotes the convolution operation, $\leftarrow$ denotes assignment, and $\sigma = (\sigma_x, \sigma_y)$ are the bandwidths of the Gaussian kernel along the x and y directions of the spatial projection coordinates; $\mathrm{Ker}_\sigma(x, y)$ denotes the spatial smoothing kernel given by formula (6), where $\sigma_x$ is the bandwidth of the smoothing kernel in the x direction and $\sigma_y$ the bandwidth in the y direction.

In formula (7), $\theta^{(iter)}$ collects all motion parameters at the iter-th iteration, and Shar denotes the contrast index obtained from formula (8).

In formula (8), $\sum_{x,y}$ denotes summation over all pixel coordinates $(x, y)$. The final result of formula (8) is a weighted sum of local contrasts whose weights are event correlations, so the local contrast at weakly correlated positions contributes less to the whole; the resulting contrast is less susceptible to noise, because noise events are less correlated than real events.
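A minimal sketch of such a correlation-weighted contrast index, under the assumption (since formulas (8)–(9) appear only as images) that the local contrast is the variance of a small patch around each pixel:

```python
import numpy as np

def weighted_contrast(comp_map, corr_map, radius=1):
    """Sum of local contrasts of the compensation map, each weighted by the
    event correlation at that pixel: high-correlation regions dominate."""
    h, w = comp_map.shape
    total = 0.0
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            patch = comp_map[y - radius:y + radius + 1, x - radius:x + radius + 1]
            local = np.mean((patch - patch.mean()) ** 2)  # local contrast (variance)
            total += corr_map[y, x] * local
    return total

# A sharp (well-compensated) map scores higher than a blurred one of equal mass
sharp = np.zeros((9, 9)); sharp[4, 4] = 9.0
blurred = np.full((9, 9), 0.125)
corr = np.ones((9, 9))
assert weighted_contrast(sharp, corr) > weighted_contrast(blurred, corr)
```

Down-weighting low-correlation pixels is what makes the objective robust: isolated noise events produce low-correlation pixels whose local contrast barely moves the score.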
The local contrast of the projection frame at pixel $(x, y)$ at the iter-th iteration is given by formula (9).

In formula (9), $\omega(x, y)$ is a neighborhood of $(x, y)$, the expectation term is the mean of pixel values in $\omega(x, y)$ at the iter-th iteration, and $|\omega|$ denotes the neighborhood size given by formula (10).

The gradient used to update the motion parameters is calculated by formula (11).

In formula (12), $G_x$ and $G_y$ are intermediate variables of the gradient calculation, and formulas (13) and (14) give the pixel values of $G_x$ and $G_y$ at pixel $(x, y)$.

The remaining gradient terms are computed in the same way with the corresponding quantities substituted. Formula (11) is the chain rule of differentiation, but because pixel values in the event projection (framing) operation take discrete values, existing methods cannot directly compute the two required derivatives, and the numerical error is large. To eliminate the numerical error as far as possible, the gradient calculation of formulas (12) to (14) is obtained through theoretical derivation; only the image-gradient terms need numerical approximation, and since both can be regarded as gradients of an image, they can be estimated with existing image processing methods. Experiments show that the gradient calculated in this way has lower computational complexity than a purely numerical method, yields a smoother solution space, and makes the optimization easier to converge.
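The image-gradient terms that remain to be approximated numerically can be estimated with a standard image-processing operator such as central differences — a sketch of the idea, not the patent's exact $G_x$/$G_y$ formulas (which appear only as images):

```python
import numpy as np

def image_gradients(img):
    """Central-difference gradients of a compensation map, usable as the
    numerically approximated terms in an analytic contrast gradient."""
    gy, gx = np.gradient(img.astype(float))  # np.gradient returns d/drow, d/dcol
    return gx, gy

# A horizontal ramp has gx == 1 and gy == 0 everywhere
ramp = np.tile(np.arange(5.0), (4, 1))
gx, gy = image_gradients(ramp)
```

Because the compensation map is smoothed by the Gaussian kernel of formula (6) before differentiation, such finite-difference estimates are well behaved despite the discrete event counts.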
Step 3.4: using the motion parameters $\theta_j^{(iter+1)}$ of iteration iter + 1, compute the weighted motion compensation map and event correlation map of iteration iter + 1, and obtain from formula (15) the probability $p_{ikj}^{(iter+1)}$ of the kth event $e_k$ under the jth motion at iteration iter + 1.
Step 4: denoise all events of packet $E_i$ according to the event correlation map, obtaining the event confidence matrix $C_i^{(iter+1)}$ of iteration iter + 1.

Step 4.1: obtain from formula (16) the absolute correlation of the kth event $e_k$ under the jth motion at iteration iter + 1, thereby forming the $N_e \times L$ absolute correlation matrix $EC^{(iter+1)}$.

Step 4.2: calculate the average correlation $\lambda$ of all events from formula (17) and use it as the normalization weight.

Step 4.3: obtain from formula (18) the event confidence $c_{ikj}^{(iter+1)}$ of the kth event $e_k$ under the jth motion at iteration iter + 1.

In formula (18), tanh denotes the confidence mapping function, whose output lies in [0, 1). The tanh function is chosen because event confidence is positively correlated with event correlation, it maps zero correlation to zero confidence, and correlation and confidence are approximately linear near zero. The normalization weight $\lambda$ serves as a reference for the confidence: under the definition of formula (17), an event above the average confidence can essentially be regarded as a valid event. In a practical scenario $\lambda$ can be corrected according to the noise level of the scene: when the noise level is low, $\lambda$ can be reduced appropriately, and conversely increased.
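Step 4.3 can be sketched as follows, assuming (since formulas (16)–(18) are reproduced only as images) that the confidence is the tanh of the event's absolute correlation normalized by the average correlation $\lambda$:

```python
import numpy as np

def event_confidence(corr, lam=None):
    """Map absolute event correlations to confidences in [0, 1) with tanh.
    lam defaults to the mean correlation (the normalization weight) and can
    be lowered for clean scenes or raised for noisy ones."""
    corr = np.asarray(corr, dtype=float)
    if lam is None:
        lam = corr.mean()
    return np.tanh(corr / lam)

# Strongly correlated events approach confidence 1; zero correlation maps to 0
conf = event_confidence(np.array([0.0, 0.5, 1.0, 8.0]))
```

The mapping is monotone and near-linear around zero, matching the rationale the description gives for choosing tanh.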
Step 5: after assigning iter + 1 to iter, return to Step 3 and continue until the motion estimation converges or the maximum number of iterations is reached, thereby obtaining the motion segmentation matrix of packet $E_i$, determined by the event probability matrix and event confidence matrix of the final iteration.

Step 6: after assigning i + 1 to i, return to Step 2 and continue until the motion segmentation results of all event packets are obtained; fuse the event motion segmentation results of overlapping packets to obtain the motion matrix PC of all events.
Step 6.1: define the final motion segmentation result matrix of all events as $PC_{all} = \{pc_{kj}\}$, where $pc_{kj}$ denotes the confidence probability that the kth event among all events belongs to the jth motion.

Step 6.2: obtain from formulas (19) and (20) the sequence number $i_k$ of the event packet in which the kth event first occurs and the sequence number $k'$ of that event within the corresponding packet.

Step 6.3: if $i_k = 1$ and the corresponding condition holds, or the event appears in only one packet's result, set $pc_{kj}$ directly from that packet's segmentation result; otherwise obtain each element $pc_{kj}$ of the final motion segmentation result $PC_{all}$ from formula (21).

In formula (21), $k''$ denotes the sequence number of the kth event within the $(i_k+1)$-th event packet; the element in row $k'$, column $j$ of the motion segmentation matrix $PC_{i_k}$ of the $i_k$-th packet denotes the confidence probability that the kth event of that packet belongs to the jth motion, and the element in row $k''$, column $j$ of $PC_{i_k+1}$ is the corresponding entry of the $(i_k+1)$-th packet's motion segmentation matrix.

The motion segmentation result after such fusion can be regarded as the average of the two segmentation results, and when the two results are the same the final segmentation result remains consistent with them.
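Under the description's interpretation that formula (21) averages the two per-packet rows for an event shared by consecutive packets (the formula image itself is not reproduced), the fusion can be sketched as:

```python
import numpy as np

def fuse_overlapping(pc_a, pc_b, overlap):
    """Fuse segmentation matrices of two consecutive packets that share
    `overlap` events: shared rows are averaged, the rest concatenated."""
    head = pc_a[:-overlap]                             # events only in packet a
    shared = 0.5 * (pc_a[-overlap:] + pc_b[:overlap])  # events in both packets
    tail = pc_b[overlap:]                              # events only in packet b
    return np.vstack([head, shared, tail])

pc_a = np.array([[1.0, 0.0], [0.8, 0.2], [0.6, 0.4]])
pc_b = np.array([[0.6, 0.4], [0.4, 0.6], [0.0, 1.0]])
fused = fuse_overlapping(pc_a, pc_b, overlap=2)
# Middle rows are averages of the overlapping rows; identical rows stay unchanged
```

When both packets assign an overlapping event the same row, the average leaves it unchanged, which is the consistency property the description notes.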
Claims (5)
1. An event motion segmentation method for progressive iterative optimization of an event camera is characterized by comprising the following steps:
step 1, shooting a motion scene by using an event camera and obtaining all eventsWherein e is k Represents the kth event, and e k =b k δ(t-t k ,x-x k ,y-y k ) Wherein, b k Indicating the polarity of the k-th event, b k ∈{-1,1};t k Representing the occurrence time of the kth event; x is the number of k And y k Respectively representing the spatial coordinates of the k-th event occurrence; n represents the total number of events; (t, x, y) represent spatio-temporal projection coordinates; delta is an indicative function of the spatio-temporal coordinates representing the occurrence of an event falling in the spatio-temporal projection coordinates;
setting the number of clusters of motion segmentation as L and setting the motion compensation function W of the jth motion class j (ii) a Setting the number of events per division processing to N e And all events are combinedCut into overlapping in time sequenceAn event package, wherein the ith divided event package is marked asAnd the ith event package E i And the (i + 1) th event package E i+1 Therein is provided withThe events overlap;
Let the event probability matrix P_i = {p_ikj} corresponding to the i-th event package E_i be a matrix of dimension N_e × L, in which the element p_ikj in row k and column j denotes the probability that the k-th event e_k in the i-th event package E_i belongs to the j-th motion class, and each row of the event probability matrix P_i sums to 1;
Let the event confidence matrix C_i = {c_ikj} corresponding to the i-th event package E_i be a non-negative matrix of dimension N_e × L, in which the element c_ikj in row k and column j denotes the confidence that the k-th event e_k in the i-th event package E_i is a real event in the j-th motion class, and every element of the event confidence matrix C_i is at most 1;
initializing i to 1;
step 2, defining and initializing the current iteration number iter to be 0;
Initialize, for the iter-th iteration, the motion parameter θ_j^(iter) of the j-th motion class corresponding to the i-th event package E_i, the event probability matrix P_i^(iter) of the i-th event package E_i, and the event confidence matrix C_i^(iter) of the i-th event package E_i;
Define and initialize the step size μ^(iter) of the iter-th iteration;
Step 3: based on the motion parameters θ^(iter), the event probability matrix P_i^(iter), and the event confidence matrix C_i^(iter) of the iter-th iteration, perform motion estimation on all events in the i-th event package E_i to obtain the motion parameters θ^(iter+1) of the (iter+1)-th iteration;
Step 4: denoise all events in the i-th event package E_i according to the event correlation map IE^(iter+1) to obtain the event confidence matrix C_i^(iter+1) of the (iter+1)-th iteration;
Step 5: after assigning iter + 1 to iter, return to step 3 and execute in sequence until the motion estimation converges or the maximum number of iterations is reached, thereby obtaining the motion segmentation matrix PC_i of the i-th event package E_i, where P_i^(iter) is the event probability matrix of the final iteration and C_i^(iter) is the event confidence matrix of the final iteration;
Step 6: after assigning i + 1 to i, return to step 2 and execute in sequence until the motion segmentation results of all event packages are obtained, and fuse the event motion segmentation results of the overlapping event packages to obtain the motion matrix PC_all of all events.
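To make the alternation of steps 3-5 concrete, here is a toy sketch for L purely translational (optical-flow) motions; the soft re-assignment from warp residuals is a stand-in for the correlation-map-based update, the confidence matrix is left at 1, and all function and variable names are illustrative, not the patent's notation:

```python
import numpy as np

def segment_package(xy, t, L, iters=30, seed=0):
    """Toy alternation of steps 3-5 for L translational motions.
    xy: (N, 2) event coordinates, t: (N,) timestamps (t > 0), so that
    warping to t_ref = 0 removes motion theta[j] via xy - t * theta[j]."""
    rng = np.random.default_rng(seed)
    N = len(t)
    theta = rng.standard_normal((L, 2))   # step 2: initial motion parameters
    P = np.full((N, L), 1.0 / L)          # event probability matrix P_i
    C = np.ones((N, L))                   # event confidence matrix C_i (fixed here)
    vel = xy / t[:, None]                 # per-event flow estimate
    for _ in range(iters):
        # soft re-assignment from the warp residual (stand-in for eq. (15))
        resid = np.linalg.norm(xy[:, None, :] - t[:, None, None] * theta[None],
                               axis=2)
        P = np.exp(-resid ** 2)
        P /= P.sum(axis=1, keepdims=True)
        # step 3 analogue: re-estimate each motion from its weighted events
        w = P * C + 1e-9                  # guard against empty clusters
        for j in range(L):
            theta[j] = np.average(vel, axis=0, weights=w[:, j])
    return P, theta
```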
2. The event motion segmentation method for progressive iterative optimization of an event camera according to claim 1, wherein the parameters of the i-th event package E_i in step 2 are initialized according to the following steps:
Step 2.1: when the current iteration number iter is equal to 0;
Step 2.2: when iter = 0 and i = 1, manually set or randomly initialize the motion parameter θ_j^(0) of the j-th motion class for the i-th event package E_i;
When the L motions all comprise optical-flow motion, apply the least-squares method to the k-th event e_k to obtain the optical flow of the k-th event e_k; the optical flows of the N_e events in the i-th event package E_i thereby construct an optical-flow space;
Initialize the motion parameters of the i-th event package E_i with the k-means clustering algorithm as the optical flows of the L cluster centers in the optical-flow space;
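A self-contained sketch of this optical-flow k-means initialization (plain NumPy; the patent specifies k-means clustering but not a particular implementation, so the names and loop structure here are illustrative):

```python
import numpy as np

def init_motion_params(flows, L, iters=20, seed=0):
    """Cluster per-event optical-flow vectors with k-means; the L cluster
    centers serve as initial motion parameters for the event package."""
    rng = np.random.default_rng(seed)
    flows = np.asarray(flows, dtype=float)
    centers = flows[rng.choice(len(flows), L, replace=False)].copy()
    for _ in range(iters):
        # assign each flow vector to its nearest center
        d = np.linalg.norm(flows[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned flows
        for j in range(L):
            if np.any(labels == j):
                centers[j] = flows[labels == j].mean(axis=0)
    return centers
```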
3. The event motion segmentation method for progressive iterative optimization of event cameras according to claim 2, wherein the step 3 comprises:
Step 3.1: project the k-th event e_k to the same instant according to the j-th motion using equation (1):
e'_k = W_j(e_k) = b_k δ(t − t_ref, x − x'_k, y − y'_k)   (1)
In equation (1), W_j denotes the projection function of the j-th motion, which maps the input event to the output event; e'_k denotes the event after the motion projection; t_ref is the projection time; and (x'_k, y'_k) denotes the coordinates of e_k after the motion projection;
Step 3.2: obtain the weighted motion compensation map H^(iter) and the event correlation map IE^(iter) after the iter-th iteration using equation (2) and equation (3), respectively;
In equations (2) and (3), (x, y) are the spatial projection coordinates, and t_0 and t_1 are respectively the start time and the end time of the event package;
Update the weighted motion compensation map H^(iter) and the event correlation map IE^(iter) after the iter-th iteration using equations (4) and (5), with the smoothing kernel given by equation (6):
In equations (4) and (5), ⊛ denotes the convolution operation, ← denotes assignment, and σ = (σ_x, σ_y) are the bandwidths of the Gaussian kernel in the x and y directions of the spatial projection coordinates; Ker_σ(x, y) denotes the spatial smoothing kernel and satisfies:
Ker_σ(x, y) = (1 / (2π σ_x σ_y)) · exp(−x²/(2σ_x²) − y²/(2σ_y²))   (6)
In equation (6), σ_x denotes the bandwidth of the smoothing kernel in the x direction and σ_y denotes the bandwidth of the smoothing kernel in the y direction;
Step 3.3: update the motion parameters according to the objective of equation (7); in equation (7), θ^(iter) denotes all motion parameters at the iter-th iteration, and Shar denotes the contrast index, obtained from equation (8):
In equation (8), Σ_{x,y} denotes summation over all pixel coordinates (x, y), and the local contrast of the projection frame at pixel (x, y) at the iter-th iteration is obtained from equation (9):
In equation (9), ω(x, y) is a neighborhood of (x, y), and E_ω^(iter) is the expectation of the pixel values in the neighborhood ω(x, y) at the iter-th iteration; |ω| denotes the neighborhood size and is obtained from equation (10):
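As an illustration of a contrast index of this kind, the sketch below scores a motion-compensated frame by summing, over all pixels, the squared deviation of each pixel from the mean of its neighborhood ω(x, y); the exact statistic of equations (8)-(10) is not preserved in this text, so this concrete form is an assumption in that spirit:

```python
import numpy as np

def local_contrast(img, radius=1):
    """Squared deviation of each pixel from the mean of its neighborhood
    omega(x, y): an assumed concrete form of the local contrast of eq. (9)."""
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            win = img[max(0, y - radius):y + radius + 1,
                      max(0, x - radius):x + radius + 1]
            out[y, x] = (img[y, x] - win.mean()) ** 2
    return out

def sharpness(img, radius=1):
    """Contrast index Shar as the sum of local contrast over all pixels
    (equation (8)): sharper motion-compensated frames score higher."""
    return float(local_contrast(img, radius).sum())
```

A flat frame scores zero; a frame with a collapsed (well-compensated) event spike scores higher, which is what the gradient update in step 3.3 pushes toward.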
the gradient used to update the motion parameters is calculated using equation (11):
In equation (12), G_x and G_y are intermediate variables of the gradient calculation, and the pixel values of G_x and G_y at pixel (x, y) are obtained from equations (13) and (14):
Substitute the corresponding quantities of the iter-th iteration, then perform the calculation following the procedures of equations (12) to (14) to obtain the updated motion parameters θ^(iter+1) with the step size μ^(iter);
Step 3.4: using the motion parameters θ^(iter+1) of the (iter+1)-th iteration, compute the weighted motion compensation map H^(iter+1) and the event correlation map IE^(iter+1) of the (iter+1)-th iteration, and obtain the probability p_ikj^(iter+1) of the k-th event e_k in the j-th motion at the (iter+1)-th iteration using equation (15);
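Equation (15) itself is not preserved in this text; a plausible reading, sketched below, is that the probability of event k under motion j is proportional to a per-motion correlation score (e.g. the event correlation map sampled at the warped event location), normalized across motions so that each row of P sums to 1:

```python
import numpy as np

def update_probabilities(corr):
    """Turn per-motion correlation scores (one row per event, one column
    per motion) into the event probability matrix P by row normalization,
    so each row sums to 1 as required of P_i."""
    corr = np.maximum(np.asarray(corr, dtype=float), 1e-12)  # guard zero rows
    return corr / corr.sum(axis=1, keepdims=True)
```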
4. The event motion segmentation method for progressive iterative optimization of an event camera according to claim 3, wherein step 4 comprises:
Step 4.1: obtain the absolute correlation EC_kj^(iter+1) of the k-th event e_k in the j-th motion at the (iter+1)-th iteration using equation (16), thereby obtaining an absolute correlation matrix EC^(iter+1) of dimension N_e × L:
Step 4.2: calculate the average correlation λ of all events using equation (17) and use it as the normalization weight:
Step 4.3: obtain the event confidence c_ikj^(iter+1) of the k-th event e_k in the j-th motion at the (iter+1)-th iteration using equation (18).
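Steps 4.1-4.3 can be sketched under the stated constraints (non-negative confidences of at most 1, normalized by the average correlation λ of equation (17)); the precise form of equations (16) and (18) is not preserved here, so the clipping to [0, 1] is an assumption:

```python
import numpy as np

def update_confidences(ec):
    """Normalize the absolute correlation matrix EC by the average
    correlation lambda of all events (equation (17)) and clip to [0, 1],
    so the confidence matrix C is non-negative with elements at most 1."""
    ec = np.asarray(ec, dtype=float)
    lam = max(ec.mean(), 1e-12)  # average correlation as normalization weight
    return np.clip(ec / lam, 0.0, 1.0)
```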
5. The event motion segmentation method for progressive iterative optimization of an event camera according to claim 4, wherein the fusion in step 6 is performed as follows:
Step 6.1: define the final motion segmentation result matrix of all events as PC_all = {pc_kj}, where pc_kj denotes the confidence probability that the k-th event among all events belongs to the j-th motion;
Step 6.2: obtain, using equations (19) and (20), the sequence number i_k of the event package in which the k-th event first occurs and its event sequence number k' within that event package:
Step 6.3: if the k-th event appears in only one event package (in particular when i_k = 1, or when the event lies outside the overlap region), set pc_kj = PC^(i_k)(k', j); otherwise, obtain each element of the final motion segmentation result PC_all using equation (21):
pc_kj = ( PC^(i_k)(k', j) + PC^(i_k+1)(k'', j) ) / 2   (21)
In equation (21), k'' denotes the sequence number of the k-th event within the (i_k+1)-th event package; PC^(i_k)(k', j), the element in row k' and column j of the motion segmentation matrix PC^(i_k) of the i_k-th event package, denotes the confidence probability that the k-th event in the i_k-th event package belongs to the j-th motion; and PC^(i_k+1)(k'', j) is the element in row k'' and column j of the motion segmentation matrix PC^(i_k+1) of the (i_k+1)-th event package.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210484727.6A CN114842386B (en) | 2022-05-06 | 2022-05-06 | Event motion segmentation method for progressive iterative optimization of event camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114842386A true CN114842386A (en) | 2022-08-02 |
CN114842386B CN114842386B (en) | 2024-05-17 |
Family
ID=82567599
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210484727.6A Active CN114842386B (en) | 2022-05-06 | 2022-05-06 | Event motion segmentation method for progressive iterative optimization of event camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114842386B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111798485A (en) * | 2020-06-30 | 2020-10-20 | 武汉大学 | Event camera optical flow estimation method and system enhanced by IMU |
US20210321052A1 (en) * | 2020-04-13 | 2021-10-14 | Northwestern University | System and method for high-resolution, high-speed, and noise-robust imaging |
CN113837938A (en) * | 2021-07-28 | 2021-12-24 | 北京大学 | Super-resolution method for reconstructing potential image based on dynamic vision sensor |
Non-Patent Citations (2)
Title |
---|
JINZE CHEN et al.: "ProgressiveMotionSeg: Mutually Reinforced Framework for Event-Based Motion Segmentation", ARXIV, 22 March 2022 (2022-03-22) * |
XU Zhihong; WANG Pei: "Iterative denoising algorithm of low-order kernel regression based on L_2Boost", Journal of Shanghai Dianji University, no. 01, 25 February 2011 (2011-02-25) * |
Also Published As
Publication number | Publication date |
---|---|
CN114842386B (en) | 2024-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Liu et al. | EDFLOW: Event driven optical flow camera with keypoint detection and adaptive block matching | |
CN112184759A (en) | Moving target detection and tracking method and system based on video | |
CN110910421B (en) | Weak and small moving object detection method based on block characterization and variable neighborhood clustering | |
CN111798485B (en) | Event camera optical flow estimation method and system enhanced by IMU | |
CN107832716B (en) | Anomaly detection method based on active and passive Gaussian online learning | |
CN112183506A (en) | Human body posture generation method and system | |
CN113379789B (en) | Moving target tracking method in complex environment | |
CN111178261A (en) | Face detection acceleration method based on video coding technology | |
CN112541423A (en) | Synchronous positioning and map construction method and system | |
CN110827262A (en) | Weak and small target detection method based on continuous limited frame infrared image | |
CN117036404A (en) | Monocular thermal imaging simultaneous positioning and mapping method and system | |
CN117788693B (en) | Stair modeling method and device based on point cloud data, legged robot and medium | |
CN113362377B (en) | VO weighted optimization method based on monocular camera | |
US20110222759A1 (en) | Information processing apparatus, information processing method, and program | |
CN113436251A (en) | Pose estimation system and method based on improved YOLO6D algorithm | |
US9672412B2 (en) | Real-time head pose tracking with online face template reconstruction | |
CN111798484B (en) | Continuous dense optical flow estimation method and system based on event camera | |
CN117615255A (en) | Shooting tracking method, device, equipment and storage medium based on cradle head | |
CN111553954B (en) | Online luminosity calibration method based on direct method monocular SLAM | |
CN112131991A (en) | Data association method based on event camera | |
CN114842386B (en) | Event motion segmentation method for progressive iterative optimization of event camera | |
CN114419259B (en) | Visual positioning method and system based on physical model imaging simulation | |
CN112508168B (en) | Frame regression neural network construction method based on automatic correction of prediction frame | |
CN110136164A (en) | Method based on online transitting probability, low-rank sparse matrix decomposition removal dynamic background | |
CN114022949A (en) | Event camera motion compensation method and device based on motion model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||