CN109726356A - Address event stream data denoising method for a dynamic vision sensor

Address event stream data denoising method for a dynamic vision sensor

Info

Publication number
CN109726356A
CN109726356A
Authority
CN
China
Prior art keywords
event
address
flow data
probability
graph model
Prior art date
Legal status
Granted
Application number
CN201910045615.9A
Other languages
Chinese (zh)
Other versions
CN109726356B (en)
Inventor
吴金建
马传威
石光明
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University
Priority to CN201910045615.9A
Publication of CN109726356A
Application granted
Publication of CN109726356B
Legal status: Active

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention proposes an address event stream data denoising method for a dynamic vision sensor, mainly addressing the shortcomings of the prior art in denoising the address event stream data output by a dynamic vision sensor. The method comprises the following steps: constructing a probabilistic undirected graph model; obtaining the energy function of the probabilistic undirected graph model; dividing the address event stream data into event regions; and denoising each event region. The invention exploits the correlation between events in the address event stream data output by the dynamic vision sensor to build a probabilistic undirected graph model that incorporates each event's row address, column address and activation time, and then divides the address event stream data into multiple event regions for denoising. This preserves the completeness of the information in the address event stream data output by the dynamic vision sensor while improving denoising efficiency. The method can be used to denoise the address event stream data output by dynamic vision sensors.

Description

Address event stream data denoising method for a dynamic vision sensor
Technical field
The invention belongs to the technical field of computer vision and relates to an address event stream data denoising method for a dynamic vision sensor, which can be used to denoise the address event stream data output by a dynamic vision sensor.
Background art
With breakthroughs and leaps in science and technology, computer vision research has deepened continuously, and image and video processing have achieved great success in recent years. Cameras of all kinds can be seen everywhere in daily life, and the sensor used in a camera is the key factor determining its image quality. Conventional image sensors such as CCD and CMOS sensors are now very widely used. In simple terms, a CCD image sensor generates and transfers signal charge through a photodiode array and charge transfer devices, thereby converting optical signals into digital signals; a CMOS image sensor integrates elements with specific functions on one silicon chip to realize the conversion from optical signals to digital signals. Both kinds of sensor record the photographed scene at a certain frame rate to form a series of frame images. However, the frame images obtained in this way contain a large amount of redundant information and lengthen the response time of the camera, so the camera has difficulty capturing fast-moving targets and motion blur easily occurs; in addition, traditional image sensors have difficulty handling low-light and strong-light scenes.
The dynamic vision sensor (Dynamic Vision Sensor, DVS) is a new kind of sensor that can solve the above problems. It realizes pixel-level parallel signal processing and uses an event-driven readout mode. The sensor chip consists of individual photosensitive pixels; in the inactive state each pixel unit independently monitors the change of illumination intensity over time, and when the change of illumination intensity reaches a preset threshold the pixel enters the activated state and sends a request to the external circuit. The address, brightness and timestamp information of the activated pixel unit are then read out in the form of an address event stream.
The address event stream data read out by a dynamic vision sensor takes the form of an AER event stream, i.e. (x, y, t, p). A single event contains the following information: the address x (row address) and y (column address) of the pixel that detected the brightness change; the time t at which the pixel unit was activated; and the polarity p of the brightness change of the pixel. The polarity of the brightness change is determined by the sign of the brightness increment: ON or OFF, where an ON event indicates that the brightness increased and an OFF event indicates that the brightness decreased. Since the polarity information has no influence on the denoising of address event stream data, the prior art usually ignores the polarity information when denoising address event stream data.
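For readers who want to experiment with such data, the following is a minimal sketch of how (x, y, t, p) address events could be held in memory; the structured dtype, the field names and the load_events helper are illustrative assumptions of this sketch, not part of the patent.

```python
import numpy as np

# Hypothetical layout for AER events: row address x, column address y,
# activation time t and polarity p (the patent's method ignores p).
AER_DTYPE = np.dtype([("x", np.uint16), ("y", np.uint16),
                      ("t", np.uint64), ("p", np.int8)])

def load_events(raw_tuples):
    """Pack (x, y, t, p) tuples into a structured array sorted by activation time."""
    events = np.asarray(raw_tuples, dtype=AER_DTYPE)
    return np.sort(events, order="t")

# Example: two events read from a DVS.
events = load_events([(120, 45, 1000, +1), (121, 45, 1005, -1)])
```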
The address event stream data read out by a dynamic vision sensor can reflect the motion trajectory, motion direction and motion speed of an object. The speed of a dynamic vision sensor is no longer limited by traditional concepts such as exposure time and frame rate. It can monitor high-speed motion that, with conventional methods, would require an expensive high-speed camera operating at tens of thousands of frames per second, while reducing the amount of output data by a factor of thousands, which greatly reduces the cost of subsequent signal processing. The technology can be applied to fields such as aerospace, autonomous driving, consumer electronics, industrial vision and security.
However, the address event stream data read out by a dynamic vision sensor usually contains a certain amount of noise, whose main sources include the following: the dynamic vision sensor hardware itself introduces noise such as thermal noise, shot noise, low-frequency noise and fixed-pattern noise; the photographed scene introduces noise; and hardware jitter during shooting generates noise. In addition, since address event stream data is entirely different from traditional frame image data, traditional image denoising methods are not suitable for denoising address event streams.
To solve the denoising problem of address event streams, the prior art often first converts the address event stream data into frame images and then denoises them with traditional image denoising methods. For example, the patent application with publication number CN107610069A, entitled "DVS visualization video denoising method based on a shared K-SVD dictionary", discloses a DVS visualization video denoising method based on a shared K-SVD dictionary. The method captures address event stream data with a dynamic vision sensor; converts the event stream into sharply contoured DVS images and groups the images; obtains an optimized dictionary for the first frame of each group with the K-SVD algorithm and denoises all remaining images in the group with the dictionary learned from that first frame; and finally sets the video frame rate and frame count and converts the denoised DVS images into a video. The defect of this method is that it converts the three-dimensional address event stream data output by the dynamic vision sensor into two-dimensional frame images and takes these frame images as the images to be denoised: losing the time attribute of the address event stream data reduces the completeness of the data information, and at the same time the denoising efficiency is reduced.
Summary of the invention
The purpose of the invention is to overcome the problems of the above prior art and to propose an address event stream data denoising method for a dynamic vision sensor. The method first exploits the correlation between events in the address event stream data output by the dynamic vision sensor to build a probabilistic undirected graph model that incorporates each event's row address, column address and activation time, and then divides the address event stream data into multiple event regions for denoising, aiming to improve the denoising efficiency of the address event stream data while guaranteeing the completeness of the address event stream information.
To achieve the above objective, the technical solution adopted by the invention comprises the following steps:
(1) Construct the probabilistic undirected graph model:
(1a) Assume that the address event stream data X output by the dynamic vision sensor contains M events and that the address event stream data obtained by denoising X is Y. Each event x_{i,j,t} in X is correlated with the events x_{i±Δi,j±Δj,t}, x_{i,j,t±Δt} and x_{i±Δi,j±Δj,t±Δt} in its spatio-temporal neighborhood, and x_{i,j,t} is correlated with the corresponding event y_{i,j,t} in Y, where i, j and t respectively denote the row address, column address and activation time of a single event, Δi, Δj and Δt are respectively the row address offset, column address offset and activation time offset, and M ≥ 2;
(1b) Take the joint probability distribution P(X, Y) formed by every event x_{i,j,t} in X and every event y_{i,j,t} in Y as the probabilistic undirected graph model; the probabilistic undirected graph model contains four kinds of cliques: {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}};
(2) Obtain the energy function E(X, Y) of the probabilistic undirected graph model:
According to the correlation between the variables inside each clique of the probabilistic undirected graph model, set the energy functions of the cliques {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}} to E_xy, E_space, E_time and E_st respectively, and combine the energy functions of the four cliques into the energy function E(X, Y) of the probabilistic undirected graph model;
(3) Divide the address event stream data X into event regions:
(3a) Initialize the denoised address event stream data Y as the address event stream data X;
(3b) Divide the M events contained in the address event stream data X, in output order, into N event regions R_1, R_2, ..., R_l, ..., R_N, where the first N-1 event regions each contain Δm events and the N-th event region contains m_N events, R_l denotes the l-th event region, l = 1, 2, ..., N, N = ⌈M/Δm⌉, Δm ≥ 2 and m_N = M - (N-1) × Δm;
(3c) Denote the events contained in the l-th event region R_l as x_{R_l}^1, x_{R_l}^2, ..., x_{R_l}^p, ..., x_{R_l}^m, where x_{R_l}^p denotes the p-th event in the l-th event region R_l, p = 1, 2, ..., m, and m denotes the total number of events in R_l;
(4) Denoise each event region R_l:
(4a) Let l = 1;
(4b) Read in event region R_l and calculate the energy E_{R_l} of event region R_l using the energy function E(X, Y) of the probabilistic undirected graph model;
(4c) Let p = 1;
(4d) Using E_{R_l} and the clique energies E_xy, E_space, E_time and E_st, calculate the energy E_{R_l\p} of the region of R_l excluding the event x_{R_l}^p;
(4e) Judge whether E_{R_l\p} is smaller than E_{R_l}; if so, mark the event x_{R_l}^p as a noise event and execute step (4f); otherwise, mark the event x_{R_l}^p as a valid event and execute step (4f);
(4f) Judge whether p is less than or equal to m; if so, let p = p + 1 and execute step (4d); otherwise, execute step (4g);
(4g) Judge whether l is less than or equal to N; if so, let l = l + 1 and execute step (4b); otherwise, delete all events marked as noise events to obtain the denoised address event stream data Y.
Compared with the prior art, the invention has the following advantages:
1. When obtaining the denoised address event stream data, the probabilistic undirected graph model constructed by the invention incorporates the row address, column address and activation time of each event, so the time attribute of the address event stream data is preserved. This avoids the loss of the events' time attribute caused when the prior art converts three-dimensional address event stream data into two-dimensional frame images, and thereby guarantees the completeness of the information in the address event stream data.
2. The invention uses the constructed probabilistic undirected graph model to denoise the address event stream data output by the dynamic vision sensor directly, which avoids the prior-art operation of converting three-dimensional address event stream data into two-dimensional frame images and thereby improves denoising efficiency.
Detailed description of the invention
Fig. 1 is the implementation flowchart of the invention;
Fig. 2 is a two-dimensional visualization of the address event stream data output by the dynamic vision sensor;
Fig. 3 is a two-dimensional visualization of the address event stream data output by the dynamic vision sensor after denoising by the invention;
Fig. 4 is a three-dimensional visualization of the address event stream data output by the dynamic vision sensor;
Fig. 5 is a three-dimensional visualization of the address event stream data output by the dynamic vision sensor after denoising by the invention.
Specific embodiment
The invention is described in further detail below in conjunction with the drawings and a specific embodiment:
Referring to Fig. 1, the invention is realized through the following steps:
Step 1) Construct the probabilistic undirected graph model:
(1a) Assume that the address event stream data X output by the dynamic vision sensor contains M events (M = 5,000,000 in this embodiment) and that the address event stream data obtained by denoising X is Y. Each event x_{i,j,t} in X is correlated with the events x_{i±Δi,j±Δj,t}, x_{i,j,t±Δt} and x_{i±Δi,j±Δj,t±Δt} in its spatio-temporal neighborhood, and x_{i,j,t} is correlated with the corresponding event y_{i,j,t} in Y. In this embodiment x_{i,j,t} ∈ {+1, -1} and y_{i,j,t} ∈ {+1, -1}, where i, j and t respectively denote the row address, column address and activation time of a single event, i = 1, 2, ..., 768, j = 1, 2, ..., 640, t ≥ 1, and Δi, Δj and Δt are respectively the row address offset, column address offset and activation time offset, with Δi = 1, Δj = 1 and Δt = 5000;
(1b) Take the joint probability distribution P(X, Y) formed by every event x_{i,j,t} in X and every event y_{i,j,t} in Y as the probabilistic undirected graph model; the probabilistic undirected graph model contains four kinds of cliques: {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}};
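As a concrete illustration of the spatio-temporal neighborhood behind the four clique types, the helper below counts, for one event inside a region, its spatial, temporal and spatio-temporal neighbors. It is a sketch that assumes the neighborhood means address offsets of at most Δi, Δj and a time offset of at most Δt, and it reuses the structured event array sketched earlier; the function name and interface are not from the patent.

```python
import numpy as np

def clique_neighbor_counts(region, p, d_i=1, d_j=1, d_t=5000):
    """Count the neighbours of event p within one event region for the three
    pairwise clique types of the model (region has fields x, y, t)."""
    e = region[p]
    dx = np.abs(region["x"].astype(np.int64) - np.int64(e["x"]))
    dy = np.abs(region["y"].astype(np.int64) - np.int64(e["y"]))
    dt = np.abs(region["t"].astype(np.int64) - np.int64(e["t"]))
    close_addr = (dx <= d_i) & (dy <= d_j)
    same_addr = (dx == 0) & (dy == 0)
    close_time = dt <= d_t
    same_time = dt == 0

    n_space = np.count_nonzero(close_addr & ~same_addr & same_time)   # {x_ijt, x_i±Δi,j±Δj,t}
    n_time = np.count_nonzero(same_addr & close_time & ~same_time)    # {x_ijt, x_i,j,t±Δt}
    n_st = np.count_nonzero(close_addr & ~same_addr & close_time & ~same_time)
    return n_space, n_time, n_st
```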
Step 2) Obtain the energy function E(X, Y) of the probabilistic undirected graph model:
(2a) Combining the four kinds of cliques of the probabilistic undirected graph model gives the maximal clique of the probabilistic undirected graph, {x_{i,j,t}, y_{i,j,t}, x_{i±Δi,j±Δj,t}, x_{i,j,t±Δt}, x_{i±Δi,j±Δj,t±Δt}}. According to the Hammersley-Clifford theorem, the joint probability distribution of the probabilistic undirected graph model is expressed as the product of functions of the random variables on its maximal cliques, namely:
P(X, Y) = (1/Z) ∏_C Ψ_C(X_C, Y_C),
where Z is the partition function, a normalization constant that ensures the probability distribution P(X, Y) is correctly normalized, C is a maximal clique of the probabilistic undirected graph, X_C and Y_C are the random variables corresponding to the nodes of clique C, Ψ_C(X_C, Y_C) is a strictly positive function defined on C, and the product is taken over all maximal cliques of the probabilistic undirected graph.
Since Ψ_C(X_C, Y_C) is restricted to be strictly positive, it is expressed in exponential form, namely:
Ψ_C(X_C, Y_C) = exp{-E(X, Y)},
where E(X, Y) is the energy function of the probabilistic undirected graph model;
(2b) According to the correlation between the variables inside each clique of the probabilistic undirected graph model, the energy functions of the cliques {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}} of the probabilistic undirected graph model are set to:
E_xy = -η x_{i,j,t} y_{i,j,t}
E_space = -α x_{i,j,t} x_{i±Δi,j±Δj,t}
E_time = -β x_{i,j,t} x_{i,j,t±Δt}
E_st = -λ x_{i,j,t} x_{i±Δi,j±Δj,t±Δt}
where η, α, β and λ are non-negative weight parameters (in this embodiment η = 2.1 × 10^-3, α = 1 × 10^-3, β = 5 × 10^-4 and λ = 4 × 10^-4), i, j and t respectively denote the row address, column address and activation time of a single event, and Δi, Δj and Δt are respectively the row address offset, column address offset and activation time offset;
(2c) Since the joint probability distribution of the probabilistic undirected graph model is defined as a product of potential functions, the energy of the model can be obtained by adding up the energies of all the maximal cliques of the probabilistic undirected graph model. From the clique energy functions set in step (2b), the complete energy function E(X, Y) of the probabilistic undirected graph model is obtained as:
E(X, Y) = Σ_{i,j,t} (E_xy + E_space + E_time + E_st),
where the sum is taken over all maximal cliques of the model, i.e. over all events x_{i,j,t} and all neighborhood offsets ±Δi, ±Δj and ±Δt.
The joint probability P(X, Y) of the probabilistic undirected graph model is finally obtained as:
P(X, Y) = (1/Z) exp{-E(X, Y)}.
To improve the denoising effect on the address event stream data, the joint probability P(X, Y) of the probabilistic undirected graph model must be increased, and to increase the joint probability P(X, Y) the energy E(X, Y) of the model must be reduced;
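A per-event clique-energy helper in the spirit of the formulas above might look as follows. The weights use the embodiment's example values, and treating every stored event of X as +1 (with Y initialised to X) is an assumption of this sketch rather than the patent's exact computation; the function name and neighbour-count interface come from the earlier sketch.

```python
# Example weights from the embodiment: eta, alpha, beta, lambda.
ETA, ALPHA, BETA, LAMBDA = 2.1e-3, 1e-3, 5e-4, 4e-4

def event_clique_energy(x_val, y_val, n_space, n_time, n_st):
    """Sum of E_xy, E_space, E_time and E_st over the cliques containing one event.
    x_val and y_val are +1 (event kept) or -1 (event removed); the neighbour counts
    come from clique_neighbor_counts() and each neighbour is treated as +1."""
    e_xy = -ETA * x_val * y_val
    e_space = -ALPHA * x_val * n_space
    e_time = -BETA * x_val * n_time
    e_st = -LAMBDA * x_val * n_st
    return e_xy + e_space + e_time + e_st
```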
Step 3) Divide the address event stream data X into event regions:
(3a) Initialize the denoised address event stream data Y as the address event stream data X;
(3b) Divide the M events contained in the address event stream data X, in output order, into N event regions R_1, R_2, ..., R_l, ..., R_N, where the first N-1 event regions each contain Δm events and the N-th event region contains m_N events, R_l denotes the l-th event region, l = 1, 2, ..., N, N = ⌈M/Δm⌉, Δm ≥ 2 and m_N = M - (N-1) × Δm; in this embodiment, Δm = 20000;
(3c) Denote the events contained in the l-th event region R_l as x_{R_l}^1, x_{R_l}^2, ..., x_{R_l}^p, ..., x_{R_l}^m, where x_{R_l}^p denotes the p-th event in the l-th event region R_l, p = 1, 2, ..., m, and m denotes the total number of events in R_l;
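Splitting the time-ordered event array into regions of Δm events (the last region keeping the remainder) could be done as in the sketch below; the function name and the use of NumPy slicing are assumptions of this illustration.

```python
def split_into_regions(events, delta_m=20000):
    """Split M time-ordered events into regions R_1..R_N of delta_m events each;
    the final region keeps the remaining M - (N-1)*delta_m events."""
    return [events[s:s + delta_m] for s in range(0, len(events), delta_m)]
```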
Step 4) Denoise each event region R_l:
(4a) Let l = 1;
(4b) Read in event region R_l and calculate the energy E_{R_l} of event region R_l using the energy function E(X, Y) of the probabilistic undirected graph model,
where the above calculation is carried out only over the events inside event region R_l, so as to obtain E_{R_l};
(4c) Let p = 1;
(4d) Using E_{R_l} and the clique energies E_xy, E_space, E_time and E_st, calculate the energy E_{R_l\p} of the region of R_l excluding the event x_{R_l}^p, which is realized as follows:
(4d1) Let x_{R_l}^p = +1 and record the clique energy E_old of the cliques containing x_{R_l}^p:
E_old = -η x_{R_l}^p y_{i,j,t} - α x_{R_l}^p x_{i±Δi,j±Δj,t} - β x_{R_l}^p x_{i,j,t±Δt} - λ x_{R_l}^p x_{i±Δi,j±Δj,t±Δt},
where y_{i,j,t} denotes the event in Y corresponding to x_{R_l}^p, and x_{i±Δi,j±Δj,t}, x_{i,j,t±Δt} and x_{i±Δi,j±Δj,t±Δt} denote the events in the spatio-temporal neighborhood of x_{R_l}^p in X;
(4d2) Let x_{R_l}^p = -1 and record the clique energy E_new of the same cliques, obtained from the same expression with x_{R_l}^p = -1;
(4d3) Calculate E_{R_l\p} = E_{R_l} - E_old + E_new, where E_{R_l} denotes the energy of event region R_l;
(4e) Judge whether E_{R_l\p} is smaller than E_{R_l}; if so, the energy of the probabilistic undirected graph model decreases and the joint probability increases, so mark the event x_{R_l}^p as a noise event and execute step (4f); otherwise, the energy of the probabilistic undirected graph model does not decrease and the joint probability does not increase, so mark the event x_{R_l}^p as a valid event and execute step (4f);
(4f) Judge whether p is less than or equal to m; if so, let p = p + 1 and execute step (4d); otherwise, execute step (4g);
(4g) Judge whether l is less than or equal to N; if so, let l = l + 1 and execute step (4b); otherwise, delete all events marked as noise events to obtain the denoised address event stream data Y.
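Putting steps (4a)-(4g) together, a minimal sketch of the region-by-region decision rule could look like the code below. It reuses the helpers sketched earlier, treats all stored events as +1 with Y initialised to X, and approximates E_{R_l\p} as E_{R_l} - E_old + E_new, so it illustrates the flow of the method rather than reproducing the patent's exact computation.

```python
import numpy as np

def denoise_region(region, d_i=1, d_j=1, d_t=5000):
    """Flag an event as noise when flipping its x value from +1 to -1 lowers the
    region energy (i.e. raises the joint probability). Returns a keep-mask."""
    counts = [clique_neighbor_counts(region, p, d_i, d_j, d_t)
              for p in range(len(region))]
    e_region = sum(event_clique_energy(+1, +1, *n) for n in counts)  # E_{R_l}
    keep = np.ones(len(region), dtype=bool)
    for p, n in enumerate(counts):
        e_old = event_clique_energy(+1, +1, *n)   # cliques of event p, x_p = +1
        e_new = event_clique_energy(-1, +1, *n)   # cliques of event p, x_p = -1
        if e_region - e_old + e_new < e_region:   # equivalent to e_new < e_old
            keep[p] = False                       # mark as a noise event
    return keep

def denoise(events, delta_m=20000):
    """Run the per-region test over the whole stream and drop flagged events."""
    regions = split_into_regions(events, delta_m)
    return np.concatenate([r[denoise_region(r)] for r in regions])
```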
The effect of the invention can be illustrated by the following simulation experiment:
1. Simulation conditions and content:
1.1. Simulation conditions:
The hardware test platform used in the simulation experiment is: an Intel(R) Core(TM) i7-6700 processor with a clock frequency of 3.40 GHz and 8 GB of memory; the software platform is: Windows 7 Ultimate 64-bit operating system and JetBrains PyCharm 2017.3.1.
1.2. Simulation content:
The simulation uses address event stream data collected in a real scene with a CeleX-IV dynamic vision sensor, denoted Double-Balls-Events. The address event stream data Double-Balls-Events is denoised with the method of the invention to obtain the denoised address event stream data Double-Balls-Events-Denoised. The data Double-Balls-Events is converted into 250 frame images and the 1st frame image is chosen as Fig. 2; the data Double-Balls-Events-Denoised is converted into 250 frame images and its 1st frame image is chosen as Fig. 3; the three-dimensional visualization of the data Double-Balls-Events is taken as Fig. 4 and the three-dimensional visualization of the data Double-Balls-Events-Denoised is taken as Fig. 5. In Fig. 4 and Fig. 5 the coordinate axes respectively represent the row address, column address and activation time of the events.
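For reference, the kind of frame accumulation used to produce 2-D visualizations such as Fig. 2 and Fig. 3 can be sketched as follows; the equal-width time binning, the 0-based addressing and the 768 × 640 resolution (matching the embodiment's address ranges) are assumptions of this illustration, not details given by the patent.

```python
import numpy as np

def events_to_frames(events, n_frames=250, height=768, width=640):
    """Accumulate an event stream into n_frames binary images by splitting the
    time axis into equal bins (addresses are assumed to be 0-based)."""
    t0, t1 = events["t"].min(), events["t"].max()
    edges = np.linspace(float(t0), float(t1) + 1.0, n_frames + 1)
    frames = np.zeros((n_frames, height, width), dtype=np.uint8)
    bins = np.searchsorted(edges, events["t"].astype(np.float64), side="right") - 1
    frames[bins, events["x"], events["y"]] = 1
    return frames
```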
2. Analysis of simulation results:
The comparison of Fig. 2 and Fig. 3 shows that the invention can remove the noise in the address event stream data output by the dynamic vision sensor; the comparison of Fig. 4 and Fig. 5 shows that the invention preserves the time attribute of the events and guarantees the completeness of the information in the address event stream data.
The above description is only a specific example of the invention and does not constitute any limitation of the invention. Obviously, after understanding the content and principle of the invention, a person skilled in the art may make various modifications and changes in form and detail without departing from the principle and structure of the invention, but such modifications and changes based on the inventive concept still fall within the scope of the claims of the invention.

Claims (3)

1. An address event stream data denoising method for a dynamic vision sensor, characterized by comprising the following steps:
(1) Construct the probabilistic undirected graph model:
(1a) Assume that the address event stream data X output by the dynamic vision sensor contains M events and that the address event stream data obtained by denoising X is Y. Each event x_{i,j,t} in X is correlated with the events x_{i±Δi,j±Δj,t}, x_{i,j,t±Δt} and x_{i±Δi,j±Δj,t±Δt} in its spatio-temporal neighborhood, and x_{i,j,t} is correlated with the corresponding event y_{i,j,t} in Y, where i, j and t respectively denote the row address, column address and activation time of a single event, Δi, Δj and Δt are respectively the row address offset, column address offset and activation time offset, and M ≥ 2;
(1b) Take the joint probability distribution P(X, Y) formed by every event x_{i,j,t} in X and every event y_{i,j,t} in Y as the probabilistic undirected graph model; the probabilistic undirected graph model contains four kinds of cliques: {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}};
(2) Obtain the energy function E(X, Y) of the probabilistic undirected graph model:
According to the correlation between the variables inside each clique of the probabilistic undirected graph model, set the energy functions of the cliques {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}} to E_xy, E_space, E_time and E_st respectively, and combine the energy functions of the four cliques into the energy function E(X, Y) of the probabilistic undirected graph model;
(3) Divide the address event stream data X into event regions:
(3a) Initialize the denoised address event stream data Y as the address event stream data X;
(3b) Divide the M events contained in the address event stream data X, in output order, into N event regions R_1, R_2, ..., R_l, ..., R_N, where the first N-1 event regions each contain Δm events and the N-th event region contains m_N events, R_l denotes the l-th event region, l = 1, 2, ..., N, N = ⌈M/Δm⌉, Δm ≥ 2 and m_N = M - (N-1) × Δm;
(3c) Denote the events contained in the l-th event region R_l as x_{R_l}^1, x_{R_l}^2, ..., x_{R_l}^p, ..., x_{R_l}^m, where x_{R_l}^p denotes the p-th event in the l-th event region R_l, p = 1, 2, ..., m, and m denotes the total number of events in R_l;
(4) Denoise each event region R_l:
(4a) Let l = 1;
(4b) Read in event region R_l and calculate the energy E_{R_l} of event region R_l using the energy function E(X, Y) of the probabilistic undirected graph model;
(4c) Let p = 1;
(4d) Using E_{R_l} and the clique energies E_xy, E_space, E_time and E_st, calculate the energy E_{R_l\p} of the region of R_l excluding the event x_{R_l}^p;
(4e) Judge whether E_{R_l\p} is smaller than E_{R_l}; if so, mark the event x_{R_l}^p as a noise event and execute step (4f); otherwise, mark the event x_{R_l}^p as a valid event and execute step (4f);
(4f) Judge whether p is less than or equal to m; if so, let p = p + 1 and execute step (4d); otherwise, execute step (4g);
(4g) Judge whether l is less than or equal to N; if so, let l = l + 1 and execute step (4b); otherwise, delete all events marked as noise events to obtain the denoised address event stream data Y.
2. The address event stream data denoising method for a dynamic vision sensor according to claim 1, characterized in that obtaining the energy function E(X, Y) of the probabilistic undirected graph model in step (2) comprises the following steps:
(2a) Set the energy functions of the cliques {x_{i,j,t}, y_{i,j,t}}, {x_{i,j,t}, x_{i±Δi,j±Δj,t}}, {x_{i,j,t}, x_{i,j,t±Δt}} and {x_{i,j,t}, x_{i±Δi,j±Δj,t±Δt}} of the probabilistic undirected graph model to:
E_xy = -η x_{i,j,t} y_{i,j,t}
E_space = -α x_{i,j,t} x_{i±Δi,j±Δj,t}
E_time = -β x_{i,j,t} x_{i,j,t±Δt}
E_st = -λ x_{i,j,t} x_{i±Δi,j±Δj,t±Δt}
where η, α, β and λ are non-negative weight parameters, i, j and t respectively denote the row address, column address and activation time of a single event, and Δi, Δj and Δt are respectively the row address offset, column address offset and activation time offset;
(2b) Using the clique energy functions set in step (2a), obtain the complete energy function E(X, Y) of the probabilistic undirected graph model as:
E(X, Y) = Σ_{i,j,t} (E_xy + E_space + E_time + E_st),
where the sum is taken over all maximal cliques of the model, i.e. over all events x_{i,j,t} and all neighborhood offsets ±Δi, ±Δj and ±Δt.
3. The address event stream data denoising method for a dynamic vision sensor according to claim 1, characterized in that calculating the energy E_{R_l\p} of the region of R_l excluding the event x_{R_l}^p in step (4d) is realized as follows:
(4d1) Let x_{R_l}^p = +1 and record the clique energy E_old of the cliques containing x_{R_l}^p:
E_old = -η x_{R_l}^p y_{i,j,t} - α x_{R_l}^p x_{i±Δi,j±Δj,t} - β x_{R_l}^p x_{i,j,t±Δt} - λ x_{R_l}^p x_{i±Δi,j±Δj,t±Δt},
where y_{i,j,t} denotes the event in Y corresponding to x_{R_l}^p, x_{i±Δi,j±Δj,t}, x_{i,j,t±Δt} and x_{i±Δi,j±Δj,t±Δt} denote the events in the spatio-temporal neighborhood of x_{R_l}^p in X, η, α, β and λ are non-negative weight parameters, i, j and t respectively denote the row address, column address and activation time of a single event, and Δi, Δj and Δt are respectively the row address offset, column address offset and activation time offset;
(4d2) Let x_{R_l}^p = -1 and record the clique energy E_new of the same cliques, obtained from the same expression with x_{R_l}^p = -1;
(4d3) Calculate E_{R_l\p} = E_{R_l} - E_old + E_new, where E_{R_l} denotes the energy of event region R_l.
CN201910045615.9A 2019-01-17 2019-01-17 Address event stream data denoising method of dynamic vision sensor Active CN109726356B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910045615.9A CN109726356B (en) 2019-01-17 2019-01-17 Address event stream data denoising method of dynamic vision sensor

Publications (2)

Publication Number Publication Date
CN109726356A true CN109726356A (en) 2019-05-07
CN109726356B CN109726356B (en) 2021-05-04

Family

ID=66299844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910045615.9A Active CN109726356B (en) 2019-01-17 2019-01-17 Address event stream data denoising method of dynamic vision sensor

Country Status (1)

Country Link
CN (1) CN109726356B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090060266A1 (en) * 2007-08-31 2009-03-05 University Of Georgia Research Foundation, Inc. Methods and Systems for Analyzing Ratiometric Data
CN107220942A (en) * 2016-03-22 2017-09-29 三星电子株式会社 Method and apparatus for the graphical representation and processing of dynamic visual sensor
CN108537102A (en) * 2018-01-25 2018-09-14 西安电子科技大学 High Resolution SAR image classification method based on sparse features and condition random field

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jinjian Wu et al.: "Probabilistic Undirected Graph Based Denoising Method for Dynamic Vision Sensor", IEEE Transactions on Multimedia *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111667442A (en) * 2020-05-21 2020-09-15 武汉大学 High-quality high-frame-rate image reconstruction method based on event camera
CN111770290A (en) * 2020-07-29 2020-10-13 中国科学院长春光学精密机械与物理研究所 Noise reduction method for dynamic vision sensor output event stream
CN112308087A (en) * 2020-11-03 2021-02-02 西安电子科技大学 Integrated imaging identification system and method based on dynamic vision sensor
CN112308087B (en) * 2020-11-03 2023-04-07 西安电子科技大学 Integrated imaging identification method based on dynamic vision sensor
CN112581491A (en) * 2020-12-17 2021-03-30 西安电子科技大学 Moving target positioning method based on address event connected domain
CN112581491B (en) * 2020-12-17 2023-03-21 西安电子科技大学 Moving target positioning method based on address event connected domain
CN114285962A (en) * 2021-12-14 2022-04-05 成都时识科技有限公司 Noise processing device, method, chip, event imaging device and electronic equipment
CN114285962B (en) * 2021-12-14 2023-04-07 成都时识科技有限公司 Noise processing device, method, chip, event imaging device and electronic equipment

Also Published As

Publication number Publication date
CN109726356B (en) 2021-05-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant