CN114842386B - Event motion segmentation method for progressive iterative optimization of event camera - Google Patents

Event motion segmentation method for progressive iterative optimization of event camera

Info

Publication number
CN114842386B
Authority
CN
China
Prior art keywords
event
motion
ith
iteration
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210484727.6A
Other languages
Chinese (zh)
Other versions
CN114842386A (en)
Inventor
查正军
曹洋
王洋
陈进泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202210484727.6A
Publication of CN114842386A
Application granted
Publication of CN114842386B
Legal status: Active (current)
Anticipated expiration legal status


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/49 Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an event motion segmentation method with progressive iterative optimization for an event camera, comprising the following steps: 1. preprocess the captured events to obtain the event packets processed in each pass and define the motion segmentation parameters; 2. when processing an event packet, initialize the motion segmentation parameters and the optimization parameters corresponding to that packet; 3. in one iteration, perform motion estimation on the events to obtain the motion parameters and compute the event correlation; 4. denoise the events using the event correlation obtained from the motion estimation in this iteration; 5. return to step 3 and iterate until the motion segmentation result converges; 6. return to step 2 and process all event packets in sequence; after all event packets are processed, fuse their results to obtain the final motion segmentation. The invention removes noise without losing motion information, thereby effectively improving motion segmentation performance.

Description

Event motion segmentation method for progressive iterative optimization of event camera
Technical Field
The invention belongs to the field of event camera motion segmentation, and particularly relates to a progressive iterative optimization event motion segmentation method for an event camera.
Background
An event camera (Dynamic Vision Sensor) is a novel bio-inspired vision sensor: each pixel asynchronously senses changes in scene brightness and outputs a series of positive and negative binary pulse signals (also called events) corresponding to the relative motion between the camera and the scene. Because the temporal resolution of an event camera reaches the microsecond level, it is very sensitive to scene brightness changes, can record fine-grained motion evolution, and thus provides rich motion cues for motion segmentation tasks.
Event camera motion segmentation aims to segment events into different clusters according to the motion each event belongs to. Current methods for motion segmentation on event-stream input fall roughly into two categories: the first converts events into frames and then applies traditional frame-based motion segmentation; the second performs motion segmentation directly in the event space, grouping events into clusters during segmentation and then estimating the motion parameters of each cluster separately. However, neither approach accounts for the effect of background noise on motion segmentation. Background noise originates from dark current and junction leakage current during camera sensing; its distribution is random and sparse, which destroys the spatio-temporal correlation of real events and ultimately degrades motion segmentation accuracy. Moreover, since the event camera captures logarithmic light intensity, the background noise intensity is also related to the scene brightness level: the darker the scene, the more the noise. Such noise is therefore scene-dependent.
To suppress the influence of noise on event motion segmentation, a straightforward solution is to first obtain a denoised event stream with a denoising algorithm and then perform motion segmentation on the remaining events. However, because events are sparse, conventional denoising algorithms that rely on spatial correlation cannot be applied directly, and existing event denoising algorithms that use only local spatio-temporal correlation cannot capture the long-range temporal dependence of events, which is essential for motion segmentation. A motion segmentation method is therefore needed that eliminates the influence of noise on motion segmentation without destroying the temporal correlation between real events.
Disclosure of Invention
The invention aims to overcome the above deficiencies of the prior art by providing an event motion segmentation method with progressive iterative optimization for an event camera, so that events can be denoised without losing motion information and motion segmentation performance can be effectively improved.
In order to achieve the aim of the invention, the invention adopts the following technical scheme:
The invention discloses an event motion segmentation method for progressive iterative optimization of an event camera, which is characterized by comprising the following steps:
Step 1: shoot a moving scene with an event camera to obtain all events {e_k, k = 1, ..., N}, where e_k denotes the k-th event and e_k = b_k δ(t − t_k, x − x_k, y − y_k); b_k denotes the polarity of the k-th event, with b_k ∈ {−1, 1}; t_k denotes the occurrence time of the k-th event; x_k and y_k denote the spatial coordinates at which the k-th event occurs; N denotes the total number of events; (t, x, y) denotes the space-time projection coordinates; and δ is an indicator function stating that the space-time coordinates of the event fall on the space-time projection coordinates;
Set the number of motion segmentation clusters to L, and set a motion compensation function W_j for the j-th motion class; set the number of events processed in each segmentation pass to N_e, and split all events chronologically into overlapping event packets, where the i-th packet is denoted E_i and a fixed number of events are shared between the i-th event packet E_i and the (i+1)-th event packet E_{i+1};
Let the event probability matrix P_i = {p_ikj} corresponding to the i-th event packet E_i be a probability distribution matrix of dimension N_e × L, where the element p_ikj in the k-th row and j-th column denotes the probability that the k-th event e_k in the i-th event packet E_i belongs to the j-th motion class, and each row of P_i sums to 1;
Let the event confidence matrix C_i = {c_ikj} corresponding to the i-th event packet E_i be a non-negative matrix of dimension N_e × L, where the element c_ikj in the k-th row and j-th column denotes the confidence that the k-th event e_k in the i-th event packet E_i is a real event in the j-th motion class, and no element of C_i exceeds 1;
Initializing i=1;
Step 2: define and initialize the current iteration number iter = 0;
Initialize the motion parameters of the j-th motion class for the i-th event packet E_i at the iter-th iteration, the event probability matrix of E_i at the iter-th iteration, and the event confidence matrix of E_i at the iter-th iteration;
Define and initialize the step size μ^(iter) of the iter-th iteration;
Step 3: based on the motion parameters, the event probability matrix, and the event confidence matrix of the iter-th iteration, perform motion estimation on all events in the i-th event packet E_i to obtain the motion parameters of the (iter+1)-th iteration;
Based on these motion parameters, calculate the event correlation map of the (iter+1)-th iteration;
Step 4: according to the event correlation map, denoise all events in the i-th event packet E_i to obtain the event confidence matrix of the (iter+1)-th iteration;
Step 5: after assigning iter + 1 to iter, return to step 3 and continue in sequence until the motion estimation converges or the maximum number of iterations is reached, thereby obtaining the motion segmentation matrix PC_i of the i-th event packet E_i, computed from the event probability matrix of the final iteration and the event confidence matrix of the final iteration;
Step 6: after assigning i + 1 to i, return to step 2 and continue in sequence until the motion segmentation results of all event packets are obtained; fuse the motion segmentation results of event packets that share overlapping events, thereby obtaining the motion matrix PC of all events.
The event motion segmentation method with progressive iterative optimization for an event camera is further characterized in that the parameter initialization of the i-th event packet E_i in step 2 is performed as follows:
Step 2.1: when the current iteration number iter = 0:
initialize each element in the k-th row and j-th column of the event probability matrix of the i-th event packet E_i (e.g., uniformly to 1/L, so that each row sums to 1);
initialize each element in the k-th row and j-th column of the event confidence matrix C_i of the i-th event packet E_i (e.g., to 1, initially treating every event as real);
Step 2.2: when iter = 0 and i = 1, for the i-th event packet E_i, manually set or randomly initialize the motion parameters of the j-th motion class;
When all L motions are optical-flow motions, the optical flow of the k-th event e_k is obtained by least-squares fitting in the neighborhood of e_k, thereby constructing an optical-flow space from the optical flows of the N_e events in the i-th event packet E_i;
Initialize the motion parameters of the i-th event packet E_i, using a k-means clustering algorithm, to the optical flows of the L cluster centers in the optical-flow space;
Step 2.3: when i > 1, initialize the motion parameters of the i-th event packet E_i to the motion parameters finally estimated for the (i−1)-th event packet E_{i−1};
The step 3 comprises the following steps:
Step 3.1: use formula (1) to project the k-th event e_k to a common time instant according to the j-th motion:

e'_k = W_j(e_k) = b_k δ(t − t_ref, x − x'_k, y − y'_k)   (1)

In formula (1), W_j : e_k ↦ e'_k denotes the input-to-output mapping, where W_j is the projection function of the j-th motion; e'_k denotes the event after motion projection; t_ref is the projection time; and (x'_k, y'_k) denotes the coordinates of e_k after motion projection;
Step 3.2: use formula (2) and formula (3) to obtain, respectively, the weighted motion compensation map and the event correlation map after the iter-th iteration;
In formulas (2) and (3), (x, y) are the spatial projection coordinates, and t_0 and t_1 are the start time and end time of the event packet, respectively;
Then use formula (4) and formula (5), with the kernel of formula (6), to update the weighted motion compensation map and the event correlation map after the iter-th iteration:
In formulas (4) and (5), ∗ denotes the convolution operation, ← denotes assignment, and σ = (σ_x, σ_y) is the bandwidth of the Gaussian kernel in the x and y directions of the spatial projection coordinates; Ker_σ(x, y) denotes the spatial smoothing kernel, which satisfies:
In formula (6), σ_x denotes the bandwidth of the smoothing kernel in the x direction, and σ_y denotes the bandwidth of the smoothing kernel in the y direction;
Step 3.3: obtain the motion parameters of the (iter+1)-th iteration using formula (7):
In formula (7), the argument collects all motion parameters at the iter-th iteration; Sharpness denotes the contrast metric and is given by formula (8):
In formula (8), Σ_{x,y} denotes summation over all pixel coordinates (x, y), and the local contrast of the projected frame at pixel (x, y) at the iter-th iteration is given by formula (9):
In formula (9), ω(x, y) is a neighborhood of (x, y), and the mean term is the expectation of the pixel values within the neighborhood ω(x, y) at the iter-th iteration; |ω| denotes the neighborhood size and is given by formula (10):
Calculate the gradient used to update the motion parameters using formula (11):
In formula (11), the partial-derivative term is obtained from formula (12):
In formula (12), G_x and G_y are intermediate variables for the gradient calculation, and the pixel values of G_x and G_y at pixel (x, y) are obtained from formula (13) and formula (14), respectively:
The corresponding derivative of the event correlation map is then calculated following the same procedure as formulas (12)-(14), with the weighted motion compensation map and its intermediate quantities replaced by their event-correlation-map counterparts;
Step 3.4: using the motion parameters of the (iter+1)-th iteration, calculate the weighted motion compensation map and the event correlation map of the (iter+1)-th iteration, and use formula (15) to obtain the probability that the k-th event e_k belongs to the j-th motion at the (iter+1)-th iteration:
The step 4 comprises the following steps:
Step 4.1: use formula (16) to obtain the absolute correlation of the k-th event e_k in the j-th motion at the (iter+1)-th iteration, thereby obtaining the absolute correlation matrix EC^(iter+1) of dimension N_e × L:
Step 4.2: use formula (17) to calculate the average correlation λ of all events, which serves as the normalization weight:
Step 4.3: use formula (18) to obtain the event confidence of the k-th event e_k in the j-th motion at the (iter+1)-th iteration:
In formula (18), tanh denotes the confidence mapping function, and the resulting confidence lies in the range [0, 1).
The fusion in step 6 is performed as follows:
Step 6.1: define the final motion segmentation result matrix of all events as PC_all, whose element in the k-th row and j-th column represents the confidence probability that the k-th event among all events belongs to the j-th motion;
Step 6.2: use formula (19) and formula (20) to obtain, as i_k and k' respectively, the index of the event packet in which the k-th event e_k first occurs and the index of that event within the corresponding event packet:
Step 6.3: when the k-th event appears in only one event packet, let the corresponding element of PC_all be taken directly from the motion segmentation matrix of the i_k-th event packet; otherwise, obtain each element of the final motion segmentation result PC_all using formula (21):
In formula (21), k'' denotes the index of the k-th event within the (i_k+1)-th event packet; the element in the k'-th row and j-th column of the motion segmentation matrix of the i_k-th event packet represents the confidence probability that the corresponding event in the i_k-th event packet belongs to the j-th motion, and the element in the k''-th row and j-th column of the motion segmentation matrix of the (i_k+1)-th event packet is defined analogously.
Compared with the prior art, the invention has the beneficial effects that:
1. The method performs motion estimation and event denoising iteratively, exploiting the fact that motion information promotes denoising and that denoised events in turn promote motion estimation. It overcomes both the difficulty existing denoising methods have in capturing the long-range temporal dependence of events and the accuracy loss existing motion estimation methods suffer under noise; denoising is thus improved while the temporal correlation between events is preserved, effectively improving motion segmentation performance.
2. For the motion estimation stage, the invention designs a loss function that is more robust to noise and a more accurate gradient calculation method, which improves the accuracy of motion estimation and better captures the temporal correlation between events to assist denoising.
3. For the denoising stage, the method introduces event correlation information with long-range temporal dependence, namely motion information, overcoming the limitation that existing denoising methods can exploit only local spatio-temporal correlation; denoising is improved while temporal correlation is preserved.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
In this embodiment, as shown in FIG. 1, the method comprises a motion estimation step and a denoising step that are executed iteratively to obtain the event motion segmentation result. The event motion segmentation method with progressive iterative optimization for an event camera proceeds as follows:
Step 1: shoot a moving scene with an event camera to obtain all events {e_k, k = 1, ..., N}, where e_k denotes the k-th event and e_k = b_k δ(t − t_k, x − x_k, y − y_k); b_k denotes the polarity of the k-th event, with b_k ∈ {−1, 1}; t_k denotes the occurrence time of the k-th event; x_k and y_k denote the spatial coordinates at which the k-th event occurs; N denotes the total number of events; (t, x, y) denotes the space-time projection coordinates; and δ is an indicator function stating that the space-time coordinates of the event fall on the space-time projection coordinates;
Set the number of motion segmentation clusters to L, and set a motion compensation function W_j for the j-th motion class; set the number of events processed in each segmentation pass to N_e, and split all events chronologically into overlapping event packets, where the i-th packet is denoted E_i and a fixed number of events are shared between the i-th event packet E_i and the (i+1)-th event packet E_{i+1}. When setting the motion compensation function, the functional form and the number of motion classes that best describe the scene motion should be chosen according to the actual shooting conditions: for example, when an object in the scene moves parallel to the image plane, the corresponding motion compensation function can be set to two-dimensional optical-flow motion; when the camera only rotates in a static scene, it can be set to three-dimensional uniform rotation. All events are divided into different event packets for motion segmentation because the assumed motion model and motion parameters generally remain constant only over a short time; therefore, to obtain a consistent motion segmentation result, the number of events per packet should not be too large, and in practice it can be adjusted appropriately, or a time-based event packaging scheme can be adopted. A minimal sketch of this packaging is given after this paragraph.
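The following sketch illustrates the packaging described above; the structured dtype, the half-packet overlap in the example call, and the handling of the final short packet are illustrative assumptions, since the patent does not fix the overlap size:

```python
import numpy as np

def split_into_packets(events, n_e, overlap):
    """Split time-sorted events into packets of n_e events, with `overlap`
    events shared between consecutive packets (the overlap count is a free
    parameter here; the patent leaves it unspecified)."""
    stride = n_e - overlap
    packets = []
    for start in range(0, len(events) - overlap, stride):
        packet = events[start:start + n_e]
        if len(packet) < n_e:  # drop the final short packet
            break
        packets.append(packet)
    return packets

# Example: events as a structured array of (t, x, y, polarity)
events = np.zeros(10_000, dtype=[('t', 'f8'), ('x', 'i4'), ('y', 'i4'), ('b', 'i1')])
packets = split_into_packets(events, n_e=2048, overlap=1024)
print(len(packets), "packets, consecutive packets sharing 1024 events")
```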
Let the event probability matrix P_i = {p_ikj} corresponding to the i-th event packet E_i be a probability distribution matrix of dimension N_e × L, where the element p_ikj in the k-th row and j-th column denotes the probability that the k-th event e_k in the i-th event packet E_i belongs to the j-th motion class, and each row of P_i sums to 1;
Let the event confidence matrix C_i = {c_ikj} corresponding to the i-th event packet E_i be a non-negative matrix of dimension N_e × L, where the element c_ikj in the k-th row and j-th column denotes the confidence that the k-th event e_k in the i-th event packet E_i is a real event in the j-th motion class, and no element of C_i exceeds 1;
Initializing i=1;
Step 2: define and initialize the current iteration number iter = 0;
Initialize the motion parameters of the j-th motion class for the i-th event packet E_i at the iter-th iteration, the event probability matrix of E_i at the iter-th iteration, and the event confidence matrix of E_i at the iter-th iteration;
Define and initialize the step size μ^(iter) of the iter-th iteration;
In step 2, the initialization of the parameters of the i-th event packet E_i is performed as follows:
Step 2.1: when the current iteration number iter = 0:
initialize each element in the k-th row and j-th column of the event probability matrix of the i-th event packet E_i (e.g., uniformly to 1/L, so that each row sums to 1);
initialize each element in the k-th row and j-th column of the event confidence matrix C_i of the i-th event packet E_i (e.g., to 1, initially treating every event as real);
Step 2.2: when iter = 0 and i = 1, for the i-th event packet E_i, manually set or randomly initialize the motion parameters of the j-th motion class;
When all L motions are optical-flow motions, the optical flow of the k-th event e_k is obtained by least-squares fitting in the neighborhood of e_k, thereby constructing an optical-flow space from the optical flows of the N_e events in the i-th event packet E_i;
Initialize the motion parameters of the i-th event packet E_i, using a k-means clustering algorithm, to the optical flows of the L cluster centers in the optical-flow space;
Step 2.3: when i > 1, initialize the motion parameters of the i-th event packet E_i to the motion parameters finally estimated for the (i−1)-th event packet E_{i−1};
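As an illustration of steps 2.1 and 2.2, the sketch below initializes the probability matrix, the confidence matrix, and the motion parameters; the uniform 1/L prior, the unit confidences, and scikit-learn's KMeans are assumptions consistent with, but not mandated by, the text:

```python
import numpy as np
from sklearn.cluster import KMeans

def init_packet(flows, L):
    """Initialization per steps 2.1-2.2 (a sketch under the stated assumptions).

    flows: (N_e, 2) per-event optical flows from the least-squares
    neighborhood fits described above."""
    n_e = len(flows)
    P0 = np.full((n_e, L), 1.0 / L)   # uniform prior; each row sums to 1
    C0 = np.ones((n_e, L))            # every event initially trusted
    # k-means in the optical-flow space; cluster centers seed the L motions
    theta0 = KMeans(n_clusters=L, n_init=10).fit(flows).cluster_centers_
    return P0, C0, theta0
```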
Step 3: based on the motion parameters, the event probability matrix, and the event confidence matrix of the iter-th iteration, perform motion estimation on all events in the i-th event packet E_i to obtain the motion parameters of the (iter+1)-th iteration;
Based on these motion parameters, calculate the event correlation map of the (iter+1)-th iteration;
Step 3.1: use formula (1) to project the k-th event e_k to a common time instant according to the j-th motion:

e'_k = W_j(e_k) = b_k δ(t − t_ref, x − x'_k, y − y'_k)   (1)

In formula (1), W_j : e_k ↦ e'_k denotes the input-to-output mapping, where W_j is the projection function of the j-th motion; e'_k denotes the event after motion projection; t_ref is the projection time; and (x'_k, y'_k) denotes the coordinates of e_k after motion projection;
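For the two-dimensional optical-flow motions mentioned in step 2.2, the projection of formula (1) amounts to shifting each event along its flow to the reference time; a minimal sketch, with the linear warp as one assumed instance of the projection function W_j (the patent also allows other motion models, such as three-dimensional rotation):

```python
import numpy as np

def warp_events(t, x, y, t_ref, flow):
    """Project events to the common instant t_ref under a 2-D optical-flow
    motion model: each event slides along its flow (vx, vy)."""
    x_w = x - (t - t_ref) * flow[0]
    y_w = y - (t - t_ref) * flow[1]
    return x_w, y_w
```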
Step 3.2: use formula (2) and formula (3) to obtain, respectively, the weighted motion compensation map and the event correlation map after the iter-th iteration;
In formulas (2) and (3), (x, y) are the spatial projection coordinates, and t_0 and t_1 are the start time and end time of the event packet, respectively. When the motion parameters are perfectly accurate, the event correlation map computed in this way corresponds to the gradient of the scene along the motion direction; pixels with a larger gradient also have more events distributed along the motion trajectory, so the value of the event correlation map at each pixel represents the motion correlation between events.
Then use formula (4) and formula (5), with the kernel of formula (6), to update the weighted motion compensation map and the event correlation map after the iter-th iteration:
In formulas (4) and (5), ∗ denotes the convolution operation, ← denotes assignment, and σ = (σ_x, σ_y) is the bandwidth of the Gaussian kernel in the x and y directions of the spatial projection coordinates; Ker_σ(x, y) denotes the spatial smoothing kernel, which satisfies:
In formula (6), σ_x denotes the bandwidth of the smoothing kernel in the x direction, and σ_y denotes the bandwidth of the smoothing kernel in the y direction;
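Formulas (2)-(6) are reproduced only as images in the source, but the operations they describe — accumulating projected events weighted by p_ikj · c_ikj into an image and smoothing it with a Gaussian kernel of bandwidth (σ_x, σ_y) — can be sketched as follows; the nearest-pixel accumulation is an assumed discretization:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def weighted_compensation_map(x_w, y_w, weights, shape, sigma=(1.0, 1.0)):
    """Accumulate warped events into an image, each event weighted by
    p_ikj * c_ikj, then smooth with a Gaussian kernel as in formulas (4)-(6).
    shape is (height, width); sigma is (sigma_x, sigma_y)."""
    H = np.zeros(shape)
    xi = np.clip(np.round(x_w).astype(int), 0, shape[1] - 1)
    yi = np.clip(np.round(y_w).astype(int), 0, shape[0] - 1)
    np.add.at(H, (yi, xi), weights)                       # per-pixel sums
    return gaussian_filter(H, sigma=(sigma[1], sigma[0]))  # axis order (y, x)
```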
Step 3.3: obtain the motion parameters of the (iter+1)-th iteration using formula (7):
In formula (7), the argument collects all motion parameters at the iter-th iteration; Sharpness denotes the contrast metric and is given by formula (8):
In formula (8), Σ_{x,y} denotes summation over all pixel coordinates (x, y). The final value of formula (8) is a weighted sum of local contrasts whose weights are the event correlations, so local contrasts at weakly correlated locations contribute less to the whole; and because the correlation of noise events is lower than that of real events, the resulting contrast is less susceptible to noise.
The local contrast of the projected frame at pixel (x, y) at the iter-th iteration is given by formula (9):
In formula (9), ω(x, y) is a neighborhood of (x, y), and the mean term is the expectation of the pixel values within the neighborhood ω(x, y) at the iter-th iteration; |ω| denotes the neighborhood size and is given by formula (10):
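A hedged reconstruction of the contrast metric of formulas (8)-(10): local contrast is taken here as the local variance of the compensation map over the neighborhood ω, weighted per pixel by the event correlation map as the paragraph above describes; the exact formulas in the source are images and may differ in detail:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sharpness(H, M, win=5):
    """Correlation-weighted sum of local contrasts in the spirit of
    formulas (8)-(9). H: smoothed compensation map; M: event correlation
    map; win: side length of the neighborhood ω(x, y)."""
    mean = uniform_filter(H, size=win)                 # E[H] over ω(x, y)
    var = uniform_filter(H * H, size=win) - mean ** 2  # local variance
    return float(np.sum(M * var))
```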
Calculate the gradient used to update the motion parameters using formula (11):
In formula (11), the partial-derivative term is obtained from formula (12):
In formula (12), G_x and G_y are intermediate variables for the gradient calculation, and the pixel values of G_x and G_y at pixel (x, y) are obtained from formula (13) and formula (14), respectively.
The derivative of the event correlation map is calculated by the same method, with the weighted motion compensation map replaced by the event correlation map. Formula (11) follows from the chain rule, but because the pixel values produced by the event projection and framing operation are discrete, conventional numerical methods cannot directly compute the two required derivatives without relatively large numerical error. To eliminate this error as far as possible, the gradient calculations of formulas (12)-(14) are obtained by theoretical derivation; the only quantities that still require numerical approximation can both be regarded as image gradients and can therefore be estimated with existing image processing methods. Experiments show that, compared with purely numerical differentiation, the gradient calculated in this way has lower computational complexity, yields a smoother solution space, and makes the optimization easier to converge.
Step 3.4: using the motion parameters of the (iter+1)-th iteration, calculate the weighted motion compensation map and the event correlation map of the (iter+1)-th iteration, and use formula (15) to obtain the probability that the k-th event e_k belongs to the j-th motion at the (iter+1)-th iteration:
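Formula (15) is likewise not reproduced in the text; the sketch below assumes the common soft-assignment rule in which the probability of event k under motion j is proportional to the value of the j-th map at the event's projected coordinates, normalized over the L motions:

```python
import numpy as np

def update_probabilities(maps, xs_w, ys_w, eps=1e-9):
    """Soft reassignment in the spirit of formula (15).

    maps: list of L per-motion images; xs_w, ys_w: (L, N_e) warped event
    coordinates, one row per motion model."""
    L, n_e = xs_w.shape
    scores = np.empty((n_e, L))
    for j in range(L):
        xi = np.clip(np.round(xs_w[j]).astype(int), 0, maps[j].shape[1] - 1)
        yi = np.clip(np.round(ys_w[j]).astype(int), 0, maps[j].shape[0] - 1)
        scores[:, j] = maps[j][yi, xi]  # support of event k under motion j
    return scores / (scores.sum(axis=1, keepdims=True) + eps)
```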
Step 4: according to the event correlation map, denoise all events in the i-th event packet E_i to obtain the event confidence matrix of the (iter+1)-th iteration;
Step 4.1: use formula (16) to obtain the absolute correlation of the k-th event e_k in the j-th motion at the (iter+1)-th iteration, thereby obtaining the absolute correlation matrix EC^(iter+1) of dimension N_e × L:
Step 4.2: use formula (17) to calculate the average correlation λ of all events, which serves as the normalization weight:
Step 4.3: use formula (18) to obtain the event confidence of the k-th event e_k in the j-th motion at the (iter+1)-th iteration:
In formula (18), tanh denotes the confidence mapping function, and the resulting confidence lies in the range [0, 1). The tanh function is chosen because the event confidence is positively correlated with the event's correlation; it guarantees that a correlation of 0 is mapped to a confidence of 0 and that, near 0, the correlation has an approximately linear relationship with the confidence. The normalization weight λ serves as the confidence reference: events above the average confidence under the definition of formula (17) can essentially be regarded as valid events, but in practical applications λ can be adjusted according to the noise level of the scene, for example reduced appropriately when the noise level is low and increased otherwise.
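The denoising update of steps 4.1-4.3 can be sketched as follows, assuming (consistently with the description of tanh and λ above) that the confidence is the tanh of the absolute correlation normalized by the average correlation λ:

```python
import numpy as np

def update_confidence(EC):
    """Denoising step per formulas (16)-(18): normalize each event's
    absolute correlation by the average correlation of all events and map
    it through tanh, so zero correlation yields zero confidence and the
    result stays in [0, 1). EC: (N_e, L) absolute correlation matrix.
    The exact argument of tanh (ec / lambda) is an assumption."""
    lam = EC.mean()                       # formula (17): average correlation
    return np.tanh(EC / max(lam, 1e-9))   # formula (18), elementwise
```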
Step 5: after assigning iter + 1 to iter, return to step 3 and continue in sequence until the motion estimation converges or the maximum number of iterations is reached, thereby obtaining the motion segmentation matrix PC_i of the i-th event packet E_i, computed from the event probability matrix of the final iteration and the event confidence matrix of the final iteration;
Step 6: after assigning i + 1 to i, return to step 2 and continue in sequence until the motion segmentation results of all event packets are obtained; fuse the motion segmentation results of event packets that share overlapping events, thereby obtaining the motion matrix PC of all events.
Step 6.1: define the final motion segmentation result matrix of all events as PC_all, whose element in the k-th row and j-th column represents the confidence probability that the k-th event among all events belongs to the j-th motion;
Step 6.2: use formula (19) and formula (20) to obtain, as i_k and k' respectively, the index of the event packet in which the k-th event e_k first occurs and the index of that event within the corresponding event packet:
Step 6.3: when the k-th event appears in only one event packet, let the corresponding element of PC_all be taken directly from the motion segmentation matrix of the i_k-th event packet; otherwise, obtain each element of the final motion segmentation result PC_all using formula (21):
In formula (21), k'' denotes the index of the k-th event within the (i_k+1)-th event packet; the element in the k'-th row and j-th column of the motion segmentation matrix of the i_k-th event packet represents the confidence probability that the corresponding event in the i_k-th event packet belongs to the j-th motion, and the element in the k''-th row and j-th column of the motion segmentation matrix of the (i_k+1)-th event packet is defined analogously.
The motion segmentation result after such fusion can be regarded as the average of the two segmentation results, and when the two results are identical the final result remains unchanged.
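Since the fused result is stated above to be the average of the two packets' results on their shared events, the fusion of formula (21) can be sketched as follows; aligning the shared rows via the indices i_k, k', and k'' is assumed to be handled by the caller:

```python
import numpy as np

def fuse_overlap(pc_rows_a, pc_rows_b):
    """Fuse the segmentation results of two packets on their shared events
    by averaging the corresponding rows; when the two results agree, the
    fused result is unchanged, as noted above."""
    return 0.5 * (np.asarray(pc_rows_a) + np.asarray(pc_rows_b))
```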

Claims (5)

1. An event motion segmentation method for progressive iterative optimization of an event camera, characterized by comprising the following steps:
Step 1: shoot a moving scene with an event camera to obtain all events {e_k, k = 1, ..., N}, where e_k denotes the k-th event and e_k = b_k δ(t − t_k, x − x_k, y − y_k); b_k denotes the polarity of the k-th event, with b_k ∈ {−1, 1}; t_k denotes the occurrence time of the k-th event; x_k and y_k denote the spatial coordinates at which the k-th event occurs; N denotes the total number of events; (t, x, y) denotes the space-time projection coordinates; and δ is an indicator function stating that the space-time coordinates of the event fall on the space-time projection coordinates;
Set the number of motion segmentation clusters to L, and set a motion compensation function W_j for the j-th motion class; set the number of events processed in each segmentation pass to N_e, and split all events chronologically into overlapping event packets, where the i-th packet is denoted E_i and a fixed number of events are shared between the i-th event packet E_i and the (i+1)-th event packet E_{i+1};
Let the event probability matrix P_i = {p_ikj} corresponding to the i-th event packet E_i be a probability distribution matrix of dimension N_e × L, where the element p_ikj in the k-th row and j-th column denotes the probability that the k-th event e_k in the i-th event packet E_i belongs to the j-th motion class, and each row of P_i sums to 1;
Let the event confidence matrix C_i = {c_ikj} corresponding to the i-th event packet E_i be a non-negative matrix of dimension N_e × L, where the element c_ikj in the k-th row and j-th column denotes the confidence that the k-th event e_k in the i-th event packet E_i is a real event in the j-th motion class, and no element of C_i exceeds 1;
Initializing i=1;
Step 2: define and initialize the current iteration number iter = 0;
Initialize the motion parameters of the j-th motion class for the i-th event packet E_i at the iter-th iteration, the event probability matrix of E_i at the iter-th iteration, and the event confidence matrix of E_i at the iter-th iteration;
Define and initialize the step size μ^(iter) of the iter-th iteration;
Step 3: based on the motion parameters, the event probability matrix, and the event confidence matrix of the iter-th iteration, perform motion estimation on all events in the i-th event packet E_i to obtain the motion parameters of the (iter+1)-th iteration;
Based on these motion parameters, calculate the event correlation map of the (iter+1)-th iteration;
Step 4: according to the event correlation map, denoise all events in the i-th event packet E_i to obtain the event confidence matrix of the (iter+1)-th iteration;
Step 5: after assigning iter + 1 to iter, return to step 3 and continue in sequence until the motion estimation converges or the maximum number of iterations is reached, thereby obtaining the motion segmentation matrix PC_i of the i-th event packet E_i, computed from the event probability matrix of the final iteration and the event confidence matrix of the final iteration;
Step 6: after assigning i + 1 to i, return to step 2 and continue in sequence until the motion segmentation results of all event packets are obtained; fuse the motion segmentation results of event packets that share overlapping events, thereby obtaining the motion matrix PC of all events.
2. The event motion segmentation method for progressive iterative optimization of an event camera according to claim 1, wherein the parameter initialization of the i-th event packet E_i in step 2 is performed as follows:
Step 2.1: when the current iteration number iter = 0:
initialize each element in the k-th row and j-th column of the event probability matrix of the i-th event packet E_i (e.g., uniformly to 1/L, so that each row sums to 1);
initialize each element in the k-th row and j-th column of the event confidence matrix C_i of the i-th event packet E_i (e.g., to 1, initially treating every event as real);
Step 2.2: when iter = 0 and i = 1, for the i-th event packet E_i, manually set or randomly initialize the motion parameters of the j-th motion class;
When all L motions are optical-flow motions, the optical flow of the k-th event e_k is obtained by least-squares fitting in the neighborhood of e_k, thereby constructing an optical-flow space from the optical flows of the N_e events in the i-th event packet E_i;
Initialize the motion parameters of the i-th event packet E_i, using a k-means clustering algorithm, to the optical flows of the L cluster centers in the optical-flow space;
Step 2.3: when i > 1, initialize the motion parameters of the i-th event packet E_i to the motion parameters finally estimated for the (i−1)-th event packet E_{i−1};
3. The event motion segmentation method for progressive iterative optimization of an event camera according to claim 2, wherein said step 3 comprises:
Step 3.1: use formula (1) to project the k-th event e_k to a common time instant according to the j-th motion:

e'_k = W_j(e_k) = b_k δ(t − t_ref, x − x'_k, y − y'_k)   (1)

In formula (1), W_j : e_k ↦ e'_k denotes the input-to-output mapping, where W_j is the projection function of the j-th motion; e'_k denotes the event after motion projection; t_ref is the projection time; and (x'_k, y'_k) denotes the coordinates of e_k after motion projection;
Step 3.2: use formula (2) and formula (3) to obtain, respectively, the weighted motion compensation map and the event correlation map after the iter-th iteration;
In formulas (2) and (3), (x, y) are the spatial projection coordinates, and t_0 and t_1 are the start time and end time of the event packet, respectively;
Then use formula (4) and formula (5), with the kernel of formula (6), to update the weighted motion compensation map and the event correlation map after the iter-th iteration:
In formulas (4) and (5), ∗ denotes the convolution operation, ← denotes assignment, and σ = (σ_x, σ_y) is the bandwidth of the Gaussian kernel in the x and y directions of the spatial projection coordinates; Ker_σ(x, y) denotes the spatial smoothing kernel, which satisfies:
In formula (6), σ_x denotes the bandwidth of the smoothing kernel in the x direction, and σ_y denotes the bandwidth of the smoothing kernel in the y direction;
Step 3.3: obtain the motion parameters of the (iter+1)-th iteration using formula (7):
In formula (7), the argument collects all motion parameters at the iter-th iteration; Sharpness denotes the contrast metric and is given by formula (8):
In formula (8), Σ_{x,y} denotes summation over all pixel coordinates (x, y), and the local contrast of the projected frame at pixel (x, y) at the iter-th iteration is given by formula (9):
In formula (9), ω(x, y) is a neighborhood of (x, y), and the mean term is the expectation of the pixel values within the neighborhood ω(x, y) at the iter-th iteration; |ω| denotes the neighborhood size and is given by formula (10):
Calculate the gradient used to update the motion parameters using formula (11):
In formula (11), the partial-derivative term is obtained from formula (12):
In formula (12), G_x and G_y are intermediate variables for the gradient calculation, and the pixel values of G_x and G_y at pixel (x, y) are obtained from formula (13) and formula (14), respectively:
The corresponding derivative of the event correlation map is then calculated following the same procedure as formulas (12)-(14), with the weighted motion compensation map and its intermediate quantities replaced by their event-correlation-map counterparts;
Step 3.4: using the motion parameters of the (iter+1)-th iteration, calculate the weighted motion compensation map and the event correlation map of the (iter+1)-th iteration, and use formula (15) to obtain the probability that the k-th event e_k belongs to the j-th motion at the (iter+1)-th iteration:
4. A progressive iterative optimization event motion segmentation method for an event camera according to claim 3, wherein said step 4 comprises:
Step 4.1: use formula (16) to obtain the absolute correlation of the k-th event e_k in the j-th motion at the (iter+1)-th iteration, thereby obtaining the absolute correlation matrix EC^(iter+1) of dimension N_e × L:
Step 4.2: use formula (17) to calculate the average correlation λ of all events, which serves as the normalization weight:
Step 4.3: use formula (18) to obtain the event confidence of the k-th event e_k in the j-th motion at the (iter+1)-th iteration:
In formula (18), tanh denotes the confidence mapping function, and the resulting confidence lies in the range [0, 1).
5. The event motion segmentation method for progressive iterative optimization of an event camera according to claim 4, wherein the fusing in step 6 is performed as follows:
Step 6.1: define the final motion segmentation result matrix of all events as PC_all, whose element in the k-th row and j-th column represents the confidence probability that the k-th event among all events belongs to the j-th motion;
Step 6.2: use formula (19) and formula (20) to obtain, as i_k and k' respectively, the index of the event packet in which the k-th event e_k first occurs and the index of that event within the corresponding event packet:
Step 6.3: when the k-th event appears in only one event packet, let the corresponding element of PC_all be taken directly from the motion segmentation matrix of the i_k-th event packet; otherwise, obtain each element of the final motion segmentation result PC_all using formula (21):
In formula (21), k'' denotes the index of the k-th event within the (i_k+1)-th event packet; the element in the k'-th row and j-th column of the motion segmentation matrix of the i_k-th event packet represents the confidence probability that the corresponding event in the i_k-th event packet belongs to the j-th motion, and the element in the k''-th row and j-th column of the motion segmentation matrix of the (i_k+1)-th event packet is defined analogously.
CN202210484727.6A 2022-05-06 2022-05-06 Event motion segmentation method for progressive iterative optimization of event camera Active CN114842386B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210484727.6A CN114842386B (en) 2022-05-06 2022-05-06 Event motion segmentation method for progressive iterative optimization of event camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210484727.6A CN114842386B (en) 2022-05-06 2022-05-06 Event motion segmentation method for progressive iterative optimization of event camera

Publications (2)

Publication Number Publication Date
CN114842386A CN114842386A (en) 2022-08-02
CN114842386B true CN114842386B (en) 2024-05-17

Family

ID=82567599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210484727.6A Active CN114842386B (en) 2022-05-06 2022-05-06 Event motion segmentation method for progressive iterative optimization of event camera

Country Status (1)

Country Link
CN (1) CN114842386B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111798485A (en) * 2020-06-30 2020-10-20 武汉大学 Event camera optical flow estimation method and system enhanced by IMU
CN113837938A (en) * 2021-07-28 2021-12-24 北京大学 Super-resolution method for reconstructing potential image based on dynamic vision sensor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11303793B2 (en) * 2020-04-13 2022-04-12 Northwestern University System and method for high-resolution, high-speed, and noise-robust imaging

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111798485A (en) * 2020-06-30 2020-10-20 武汉大学 Event camera optical flow estimation method and system enhanced by IMU
CN113837938A (en) * 2021-07-28 2021-12-24 北京大学 Super-resolution method for reconstructing potential image based on dynamic vision sensor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ProgressiveMotionSeg: Mutually Reinforced Framework for Event-Based Motion Segmentation; Jinze Chen et al.; arXiv; 2022-03-22; full text *
Iterative denoising algorithm based on L2Boost low-order kernel regression; Xu Zhihong; Wang Pei; Journal of Shanghai Dianji University; 2011-02-25 (No. 01); full text *

Also Published As

Publication number Publication date
CN114842386A (en) 2022-08-02

Similar Documents

Publication Publication Date Title
US10769480B2 (en) Object detection method and system
US7925051B2 (en) Method for capturing images comprising a measurement of local motions
Almatrafi et al. Distance surface for event-based optical flow
CN113286194A (en) Video processing method and device, electronic equipment and readable storage medium
CN109685045B (en) Moving target video tracking method and system
CN110910421B (en) Weak and small moving object detection method based on block characterization and variable neighborhood clustering
CN112184759A (en) Moving target detection and tracking method and system based on video
US11367195B2 (en) Image segmentation method, image segmentation apparatus, image segmentation device
CN108764244B (en) Potential target area detection method based on convolutional neural network and conditional random field
CN107341815B (en) Violent motion detection method based on multi-view stereoscopic vision scene stream
CN109377499B (en) Pixel-level object segmentation method and device
CN111798485B (en) Event camera optical flow estimation method and system enhanced by IMU
CN107832716B (en) Anomaly detection method based on active and passive Gaussian online learning
CN113744315B (en) Semi-direct vision odometer based on binocular vision
CN109509213B (en) Harris corner detection method applied to asynchronous time domain vision sensor
CN110930411A (en) Human body segmentation method and system based on depth camera
EP3895061A1 (en) Method of tracking objects in a scene
CN115239882A (en) Crop three-dimensional reconstruction method based on low-light image enhancement
CN116030498A (en) Virtual garment running and showing oriented three-dimensional human body posture estimation method
Low et al. Robust e-NeRF: NeRF from sparse & noisy events under non-uniform motion
Gallego et al. Region based foreground segmentation combining color and depth sensors via logarithmic opinion pool decision
CN114842386B (en) Event motion segmentation method for progressive iterative optimization of event camera
Shao et al. Hyper RPCA: joint maximum correntropy criterion and Laplacian scale mixture modeling on-the-fly for moving object detection
CN112131991A (en) Data association method based on event camera
CN111950599A (en) Dense visual odometer method for fusing edge information in dynamic environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant