CN111145199B - Edge detection method based on long-and-short-time-range synaptic complementation neuron network - Google Patents

Edge detection method based on long-and-short-time-range synaptic complementation neuron network

Info

Publication number
CN111145199B
Authority
CN
China
Prior art keywords
time
long
short
synaptic
coding
Prior art date
Legal status
Active
Application number
CN202010049326.9A
Other languages
Chinese (zh)
Other versions
CN111145199A (en)
Inventor
范影乐
余翔
武薇
Current Assignee
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202010049326.9A
Publication of CN111145199A
Application granted
Publication of CN111145199B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20192 Edge enhancement; Edge preservation

Abstract

The invention relates to an edge detection method based on a long-and-short-time-range synaptic complementation neuron network. A neuron network with long-and-short-time-range synaptic complementary characteristics is constructed, comprising a color antagonistic weighted coding module, a discharge time coding module and a long-and-short-time-range synaptic complementary coding module. In the color antagonistic weighted coding module, the color antagonistic channels of the image to be detected are weighted and coded; in the discharge time coding module, discharge time coding of the weighted coding response is realized; in the long-and-short-time-range synaptic complementary coding module, long- and short-time-range synaptic plasticity coding is realized based on the spatio-temporal dependency and synchronous discharge characteristics of the discharge activity of the neuron population, the results of the long- and short-time-range synapses are complementarily fused, and the edge response is obtained by coding the time information stream; the final edge result is then obtained through normalization and gray mapping. The method takes the complementation of long- and short-time-range synaptic plasticity into account during edge detection, and achieves a good detection effect on images with complex backgrounds and many weak edges.

Description

Edge detection method based on long-and-short-time-range synaptic complementation neuron network
Technical Field
The invention belongs to the field of visual neural computation, and mainly relates to an edge detection method based on a long-and-short-time-range synaptic complementation neuron network.
Background
Image edge detection is the basis of techniques such as image segmentation and target region identification. Traditional edge detection methods often use gradient operators such as the Sobel operator to measure abrupt intensity changes at image edges, but edge localization is inaccurate under complex background conditions; detection methods based on the LoG (Laplacian of Gaussian) filter achieve edge localization but have weak noise suppression capability. In the biological visual system, the synaptic plasticity mechanism is important for realizing the visual perception function of the brain. Some studies simulate synaptic plasticity from the perspective of receptive field deformation, but cannot deeply investigate the dynamic regulation effect of synaptic plasticity on neuron discharge during coding of the image information stream. Other studies focus on a single kind of synaptic plasticity: methods based only on long-time-range synaptic plasticity consider synaptic connections on a long time scale only, which makes them insufficiently sensitive to weak signal changes during information transmission, unable to effectively capture weak edges in the image, and, by neglecting presynaptic correlation, prone to producing broken image edges. Methods based only on short-time-range synaptic plasticity neglect synaptic dynamic regulation over the long time range and are strongly disturbed by noise. In fact, the perception of external stimuli by the biological visual system is a process of long-and-short-time-range synaptic complementation and dynamic regulation.
Disclosure of Invention
The invention addresses the following problems: (1) current edge detection methods based on the synaptic plasticity mechanism generally take the deformation of a simulated receptive field model as the strategy and the image to be detected as the research target, ignoring that the discharge activity of huge numbers of neurons in neural circuits is the basis of human visual perception; (2) current methods that simulate synaptic plasticity by studying neuron discharge activity generally simulate a single synaptic plasticity process, neglecting that edge detection in the biological visual system is the result of long-and-short-time-range synaptic synergy and dynamic regulation. Neglecting short-time-range synaptic plasticity causes edge images to break easily, while neglecting long-time-range synaptic plasticity makes the detection result more affected by noise.
Therefore, the invention constructs a neuron network with long-and-short-time-range synaptic complementary characteristics from the perspective of the cooperative working of long- and short-time-range synapses, mainly comprising a color antagonistic weighted coding module, a discharge time coding module, a long-and-short-time-range synaptic complementary coding module and a normalization layer. The method obtains a primary edge perception of the image to be detected by simulating the color antagonistic weighting characteristics of biological vision, then performs neuron discharge time modeling on the primary edge perception, then performs long-and-short-time-range synaptic complementary coding on the modeling result, and finally performs normalization and gray mapping in the normalization layer to obtain the final edge result.
The method mainly comprises the following steps:
Step (1): construct a neuron network with long-and-short-time-range synaptic complementary characteristics, whose size is the same as that of the image to be detected map(i, j), with i = 1, 2, …, M; j = 1, 2, …, N, where M and N denote the length and width of the image to be detected respectively; the neuron network comprises a color antagonistic weighted coding module, a discharge time coding module, a long-and-short-time-range synaptic complementary coding module and a normalization layer module;
Step (2): considering that different color antagonistic channels influence the edge detection result to different degrees, construct a color antagonistic weighted coding module, define color antagonistic influence factors, and perform weighted coding on the responses of the different color antagonistic channels to obtain the weighted coding response S_result(i, j); the specific implementation process is as follows:
firstly, red, green, blue and yellow components R (i, j), G (i, j), B (i, j) and Y (i, j) of map (i, j) are obtained, wherein the relation between the yellow component and the red and green components is shown as a formula (1);
Y(i,j)=(R(i,j)+G(i,j))/2 (1)
Taking the R+/G− color antagonistic channel as an example, in order to simulate the receptive field of a single-center structure and extract local information of the image to be detected, the red component R(i, j) and the green component G(i, j) are processed with a two-dimensional Gaussian function of the same scale σ = 1.5 to obtain the smoothed components (shown as an image formula in the original);
Then, single-color antagonistic coding is performed as shown in formula (2) to obtain the single-color antagonistic coded response S_rg(i, j) of the R+/G− color antagonistic channel;
[Formula (2) appears as an image in the original document.]
In formula (2), λ1, λ2 ∈ (0, 1) represent the input weights of the cone cells and reflect the strength of the luminance information and the chrominance information of the image; the defaults are λ1 = 1 and λ2 ∈ (0.5, 0.8). Similarly, the single-color antagonistic coded responses S_gr(i, j), S_by(i, j), S_yb(i, j) of the other three color antagonistic channels G+/R−, B+/Y−, Y+/B− can be obtained;
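Formula (2) is available only as an image in the original document, so the exact single-opponent expression cannot be quoted here. The following sketch illustrates one plausible reading of this step (component extraction, same-scale Gaussian smoothing, and a rectified weighted difference of cone inputs); the opponent expression and the helper name single_opponent_responses are assumptions, not the patent's exact formula.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_opponent_responses(img_rgb, lambda1=1.0, lambda2=0.7, sigma=1.5):
    """Hypothetical sketch of the single-color antagonistic coding step.
    The opponent expression below (a rectified weighted difference of the
    Gaussian-smoothed cone inputs) is an assumption, since formula (2) is
    only available as an image in the patent."""
    R = img_rgb[..., 0].astype(float)
    G = img_rgb[..., 1].astype(float)
    B = img_rgb[..., 2].astype(float)
    Y = (R + G) / 2.0                                  # formula (1)

    # Single-center receptive field: same-scale Gaussian smoothing (sigma = 1.5)
    Rs, Gs = gaussian_filter(R, sigma), gaussian_filter(G, sigma)
    Bs, Ys = gaussian_filter(B, sigma), gaussian_filter(Y, sigma)

    # Assumed single-opponent form: |excitatory - inhibitory| with cone weights
    S_rg = np.abs(lambda1 * Rs - lambda2 * Gs)         # R+/G- channel
    S_gr = np.abs(lambda1 * Gs - lambda2 * Rs)         # G+/R- channel
    S_by = np.abs(lambda1 * Bs - lambda2 * Ys)         # B+/Y- channel
    S_yb = np.abs(lambda1 * Ys - lambda2 * Bs)         # Y+/B- channel
    return S_rg, S_gr, S_by, S_yb
```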
For different images to be detected, the antagonistic channels that dominate the image color information differ, so color antagonistic influence factors are defined; taking the R+/G− color antagonistic channel as an example, its influence factor μ_r is defined as shown in formula (3);
[Formula (3) appears as an image in the original document.]
In formula (3), the quantities shown in the image formula denote the mean values of the red, green, blue and yellow components respectively; similarly, the color antagonistic influence factors μ_g, μ_b, μ_y of the other three color antagonistic channels G+/R−, B+/Y−, Y+/B− can be obtained;
Using the color antagonistic influence factors μ_r, μ_g, μ_b, μ_y, the single-color antagonistic coded responses S_rg(i, j), S_gr(i, j), S_by(i, j), S_yb(i, j) of the image to be detected are weighted and coded to obtain the weighted coding response S_result(i, j) of the four color antagonistic channels of the image to be detected, as shown in formula (4);
S_result(i, j) = μ_r × S_rg(i, j) + μ_g × S_gr(i, j) + μ_b × S_by(i, j) + μ_y × S_yb(i, j)   (4)
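Formula (3) is likewise only an image in the original, so the exact definition of the influence factors is not quoted here. The sketch below assumes, for illustration, that each factor is the normalized mean of the corresponding color component; only the weighted sum itself is taken directly from formula (4). The helper name weighted_opponent_coding is hypothetical.

```python
import numpy as np

def weighted_opponent_coding(R, G, B, Y, S_rg, S_gr, S_by, S_yb):
    """Sketch of the color antagonistic weighted coding.
    The influence factors are assumed to be the normalized means of the
    corresponding components (formula (3) is an image in the patent);
    the weighted sum itself follows formula (4)."""
    means = np.array([R.mean(), G.mean(), B.mean(), Y.mean()])
    mu_r, mu_g, mu_b, mu_y = means / means.sum()       # assumed form of formula (3)

    # Formula (4): weighted sum of the four single-opponent responses
    return mu_r * S_rg + mu_g * S_gr + mu_b * S_by + mu_y * S_yb
```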
Step (3): a discharge time coding module is constructed, and the obtained weighted coding response S_result(i, j) of the four color antagonistic channels of the image to be detected is coded into time-related information, so as to simulate the influence of the input signal intensity on the first discharge time and discharge frequency of the neurons;
A synaptic action window D_n(x, y) with side length L is constructed, where L is an odd number taken from 3 to 7, x, y = 1, 2, …, L, and n = 1, 2, …, M×N; the peripheral modulation neurons (k, l) serve as pre-synaptic neurons and the central modulation neuron (m, n) serves as the post-synaptic neuron (their index ranges are given as image formulas in the original). By sliding the window D_n(x, y), the central modulation neuron is made to correspond to each element of S_result(i, j) one by one; meanwhile, to solve the boundary overflow problem, zero padding is performed at the edges of the S_result(i, j) matrix so that its size becomes the padded size given as an image formula in the original;
Considering that dynamic synaptic connections between neurons exhibit spike-timing dependence and spike-frequency dependence, the input signal intensity is one of the key factors affecting these characteristics; therefore, the Izhikevich neuron model is used to perform discharge time coding of S_result(i, j) within each synaptic action window D_n(x, y), obtaining the first discharge time t_n(x, y) of each neuron in D_n(x, y), which together form the time matrix T_n(x, y); the maximum and minimum element values of each T_n(x, y) are denoted t_max_n and t_min_n respectively. Then, the average value t_aver_n of the first discharge times of the neurons in D_n(x, y) is calculated, as shown in formula (5);
[Formula (5) appears as an image in the original document.]
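A minimal sketch of this discharge time coding step is given below, using the standard regular-spiking Izhikevich model (dv/dt = 0.04v² + 5v + 140 − u + I, du/dt = a(bv − u), with reset v ← c, u ← u + d at a spike). The mapping from the window's S_result values to the input current I (the gain parameter) and the simulation horizon are assumptions; the patent only states that stronger inputs should fire earlier and more often. The mean of the returned matrix corresponds to t_aver_n in formula (5).

```python
import numpy as np

def first_spike_times(window, T=100.0, dt=0.5, gain=20.0):
    """Encode an L x L window of S_result values into first discharge times
    using the Izhikevich model with regular-spiking parameters.
    The input-current mapping via `gain` is an assumption."""
    a, b, c, d = 0.02, 0.2, -65.0, 8.0
    I = gain * window / (window.max() + 1e-12)   # stronger response -> larger current
    v = np.full(window.shape, -65.0)
    u = b * v
    t_first = np.full(window.shape, T)           # neurons that never fire keep t = T

    for step in range(int(T / dt)):
        fired = v >= 30.0
        newly = fired & (t_first == T)
        t_first[newly] = step * dt               # record the first firing time only
        v = np.where(fired, c, v)                # membrane reset after a spike
        u = np.where(fired, u + d, u)
        v = v + dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u = u + dt * a * (b * v - u)
    return t_first                               # forms the time matrix T_n(x, y)
```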
Step (4): a long-and-short-time-range synaptic complementary coding module is constructed, and long-and-short-time-range synaptic complementary coding is performed on the T_n(x, y) obtained in step (3);
Traditional neural coding methods model a single neuron and code its discharge activity to obtain information such as the discharge timing and frequency; however, a single neuron alone is susceptible to noise interference over a short time range. Therefore, to ensure the accuracy and stability of information expression, the time neighborhood of short-time-range synaptic action is defined, and short-time-range synaptic plasticity coding of T_n(x, y) is performed based on the discharge frequency of the neuron population; the specific implementation process is as follows:
First, a neighborhood (t_aver_n − t_short_n, t_aver_n + t_short_n) of t_aver_n is defined as the time neighborhood of short-time-range synaptic action, where t_short_n is defined as shown in formula (6);
[Formula (6) appears as an image in the original document.]
Then, the cluster discharge frequency f_n of the neuron population in D_n(x, y) is counted, as shown in formula (7);
[Formula (7) appears as an image in the original document.]
where m_n denotes the number of neurons in D_n(x, y) whose first discharge time satisfies t_n(x, y) ∈ (t_aver_n − t_short_n, t_aver_n + t_short_n);
To preserve the nonlinear characteristic of synaptic connections between neurons while ensuring robustness and avoiding large fluctuations of synaptic action, the frequency factor is activated nonlinearly, so as to obtain the short-time-range synaptic efficacy coefficient Syp_short_n in D_n(x, y), as shown in formula (8);
[Formula (8) appears as an image in the original document.]
where λ and g are constants, with λ = 0.98 by default (the definition of g is given as an image formula in the original);
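Formulas (6) to (8) are images in the original, so the sketch below substitutes plausible forms: a neighborhood half-width taken as a fraction of the spread of first discharge times, a cluster frequency equal to the fraction of window neurons firing inside that neighborhood, and a sigmoid-style nonlinear activation scaled by λ. All three forms and the helper name short_term_coefficient are illustrative assumptions rather than the patent's exact formulas.

```python
import numpy as np

def short_term_coefficient(T_n, lam=0.98, g=1.0):
    """Sketch of the short-time-range synaptic efficacy coefficient Syp_short_n.
    The neighborhood half-width, the frequency definition and the sigmoid
    activation are assumptions; formulas (6)-(8) are images in the patent."""
    t_aver = T_n.mean()
    t_short = (T_n.max() - T_n.min()) / 4.0          # assumed form of formula (6)

    # Cluster discharge frequency: fraction of neurons firing inside the neighborhood
    in_window = np.abs(T_n - t_aver) < t_short
    f_n = in_window.sum() / T_n.size                 # assumed form of formula (7)

    # Nonlinear activation of the frequency factor (assumed sigmoid, formula (8))
    return lam / (1.0 + np.exp(-g * (f_n - 0.5)))
```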
Step (5): considering that long-time-range synaptic connections are influenced by the discharge timing and the spatial topology of each neuron during the cluster expression of the neuron population, long-time-range synaptic plasticity coding is performed on the T_n(x, y) obtained in step (3) based on neuron population spatio-temporal coding; the specific implementation process is as follows:
Long-time-range dynamic synaptic connection is considered to be the basis of the learning and memory abilities of the brain, and the discharge order of the pre- and post-synaptic neurons determines whether the long-time-range synaptic connection shows long-term potentiation or depression; therefore, discharge timing coding is performed on the neurons in D_n(x, y), followed by nonlinear activation, obtaining the discharge timing coding response t_in_n(k, l), as shown in formula (9);
[Formula (9) appears as an image in the original document.]
In formula (9), t_n(m, n) and t_n(k, l) denote the first discharge times of the central and peripheral modulation neurons in the synaptic action window D_n(x, y), and τ is the synaptic action time scale coefficient, which measures the sensitivity of long-time-range synaptic action to the time factor; the default is τ = 25 ms;
Next, spatial topology coding of the neurons is performed; the Euclidean distance d(k, l) between the peripheral modulation neuron at (k, l) and the central modulation neuron at (m, n) in D_n(x, y) is calculated as shown in formula (10), and its maximum value is denoted d_max;
d(k, l) = sqrt((k − m)² + (l − n)²)   (10)
Then, d(k, l) is processed with a Gaussian function to obtain s(k, l), as shown in formula (11); when d(k, l) = 1, s(k, l) takes its maximum value s_max, and when d(k, l) = d_max, s(k, l) takes its minimum value s_min;
[Formula (11) appears as an image in the original document.]
where a, b and c are constants: a is the peak coefficient of the Gaussian curve; b is the minimum Euclidean distance between a peripheral modulation neuron and the central modulation neuron in the synaptic action window, at which the synaptic action is strongest; c is the synaptic action spatial scale coefficient, which measures the sensitivity of long-time-range synaptic action to the spatial factor; the defaults are a = 1, b = 1, c = 2;
s(k, l) is then normalized to obtain the spatial topology coding response s_in(k, l), as shown in formula (12);
[Formula (12) appears as an image in the original document.]
Then, the discharge timing coding response t_in_n(k, l) is multiplied by the spatial topology coding response s_in(k, l) to obtain the long-time-range synaptic action matrix Syp_long_n(k, l), as shown in formula (13);
[Formula (13) appears as an image in the original document.]
The constraint conditions in formula (13) indicate that when a peripheral modulation neuron discharges before the central modulation neuron, the long-time-range synaptic action shows an enhancement trend, and otherwise a weakening trend;
Meanwhile, to ensure normal operation, the central element of the long-time-range synaptic action matrix is assigned Syp_long_n(m, n) = 0.1, forming the long-time-range synaptic action coefficient matrix Syp_long_n(x, y);
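Formulas (9) and (11) to (13) are images in the original, so the sketch below uses assumed forms that match the surrounding description: an STDP-style signed exponential of the firing-time difference with scale τ, a Gaussian of the Euclidean distance with parameters a, b, c, min-max normalization of the spatial term, and a fixed central value of 0.1. These forms and the helper name long_term_matrix are illustrative assumptions.

```python
import numpy as np

def long_term_matrix(T_n, tau=25.0, a=1.0, b=1.0, c=2.0):
    """Sketch of the long-time-range synaptic action matrix Syp_long_n.
    The STDP-style timing term and the Gaussian spatial term are assumptions
    consistent with the description; the exact formulas are images."""
    L = T_n.shape[0]
    m = n = L // 2                                       # central modulation neuron
    k, l = np.meshgrid(np.arange(L), np.arange(L), indexing="ij")

    # Timing coding (assumed form of formula (9)): pre-before-post -> potentiation
    dt = T_n[m, n] - T_n                                 # positive if the periphery fired first
    t_in = np.sign(dt) * np.exp(-np.abs(dt) / tau)

    # Spatial topology coding (formulas (10)-(12)): Gaussian of Euclidean distance
    d = np.sqrt((k - m) ** 2 + (l - n) ** 2)
    s = a * np.exp(-((d - b) ** 2) / (2.0 * c ** 2))
    s_in = (s - s.min()) / (s.max() - s.min() + 1e-12)   # assumed normalization

    syp_long = t_in * s_in                               # formula (13), up to assumed forms
    syp_long[m, n] = 0.1                                 # central element, per the patent
    return syp_long
```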
Step (6): the Syp_short_n obtained in step (4) and the Syp_long_n(x, y) obtained in step (5) are multiplied to obtain the action matrix Syn_n(x, y) with long-and-short-time-range synaptic plasticity complementation, i.e. the synaptic action window with long-and-short-time-range synaptic complementary characteristics, as shown in formula (14);
Syn_n(x, y) = Syp_short_n × Syp_long_n(x, y)   (14)
Syn_n(x, y) is convolved with the discharge time matrix T_n(x, y) to realize the long-and-short-time-range synaptic plasticity complementary coding in D_n(x, y), as shown in formula (15);
Res_n(x, y) = T_n(x, y) * Syn_n(x, y)   (15)
Res_n(x, y) is then written back, according to the position of D_n(x, y), to the corresponding position in S_result(i, j), obtaining the edge response result Res(i, j); the maximum and minimum element values of Res(i, j) are denoted Res_max and Res_min respectively;
Finally, performing gray scale normalization and gray scale mapping operation on Res (i, j) in the normalization layer to obtain an edge detection result based on the long-time-range and short-time-range synapse complementary neuron network, as shown in formula (16);
[Formula (16) appears as an image in the original document.]
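The following sketch ties steps (3) to (6) together for a whole image, using the hypothetical helpers sketched above (first_spike_times, short_term_coefficient, long_term_matrix). Taking the central element of each reconstructed window as the pixel's edge response, and the min-max gray mapping to [0, 255] used in place of formula (16), are assumptions; the patent gives formula (16) only as an image.

```python
import numpy as np
from scipy.signal import convolve2d

def edge_detection(S_result, L=5, T=100.0):
    """Sketch of steps (3)-(6) plus the normalization layer.
    Uses the hypothetical helpers sketched earlier; the central-element
    reconstruction and the min-max gray mapping are assumptions."""
    pad = L // 2
    padded = np.pad(S_result, pad, mode="constant")        # zero padding against boundary overflow
    Res = np.zeros_like(S_result, dtype=float)

    M, N = S_result.shape
    for i in range(M):
        for j in range(N):
            window = padded[i:i + L, j:j + L]
            T_n = first_spike_times(window, T=T)            # step (3): time matrix T_n(x, y)
            syn = short_term_coefficient(T_n) * long_term_matrix(T_n)   # formula (14)
            res_n = convolve2d(T_n, syn, mode="same")       # formula (15)
            Res[i, j] = res_n[pad, pad]                     # value at the central neuron (assumed)

    # Normalization layer: assumed min-max mapping to [0, 255] for formula (16)
    out = (Res - Res.min()) / (Res.max() - Res.min() + 1e-12) * 255.0
    return out.astype(np.uint8)
```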
the invention has the following beneficial effects:
1. For the image to be detected, a neuron network with long-and-short-time-range synaptic complementary characteristics is designed, long- and short-time-range synaptic plasticity is simulated, and edge detection of the image to be detected is realized by exploiting the complementation of long- and short-time-range synapses;
2. In the early processing of the image to be detected, a color antagonistic weighted coding module is constructed and color antagonistic influence factors are defined; the joint action of multiple color antagonistic channels in the color antagonism process is fully considered, and the color antagonistic channels of the image are weighted and coded to obtain a primary edge perception;
3. A discharge time coding module is constructed, a synaptic action window is defined, and the primary edge perception is coded into information related to the neuron discharge times;
4. A long-and-short-time-range synaptic complementary coding module is constructed and the time neighborhood of short-time-range synaptic action is defined. Simulating the short-time-range synaptic action fully considers the correlation of the incoming pulse sequence, which improves the sensitivity to weak signals during information transmission in the nervous system and ensures the certainty of that transmission, thereby ensuring the continuity of the edge detection result; simulating the long-time-range synaptic action, with its long time scale, is effective in eliminating false-edge noise in the image;
5. The plasticity of long- and short-time-range synapses and their complementary action are fully considered, and the long- and short-time-range synaptic results are fused, so that adaptive learning of the dynamic synaptic connections between neurons is realized, weak edges of the image are strengthened, false edges are suppressed, and the detection result is more complete and clear and closer to the visual perception of the human eye.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
With reference to FIG. 1, the specific implementation steps of the invention are as follows:
Step (1): construct a neuron network with long-and-short-time-range synaptic complementary characteristics, whose size is the same as that of the image to be detected map(i, j), with i = 1, 2, …, M; j = 1, 2, …, N, where M and N denote the length and width of the image to be detected respectively; the neuron network comprises a color antagonistic weighted coding module, a discharge time coding module, a long-and-short-time-range synaptic complementary coding module and a normalization layer module;
Step (2): considering that different color antagonistic channels influence the edge detection result to different degrees, construct a color antagonistic weighted coding module, define color antagonistic influence factors, and perform weighted coding on the responses of the different color antagonistic channels to obtain the weighted coding response S_result(i, j); the specific implementation process is as follows:
firstly, red, green, blue and yellow components R (i, j), G (i, j), B (i, j) and Y (i, j) of map (i, j) are obtained, wherein the relation between the yellow component and the red and green components is shown as a formula (1);
Y(i,j)=(R(i,j)+G(i,j))/2 (1)
Taking the R+/G− color antagonistic channel as an example, in order to simulate the receptive field of a single-center structure and extract local information of the image to be detected, the red component R(i, j) and the green component G(i, j) are processed with a two-dimensional Gaussian function of the same scale σ = 1.5 to obtain the smoothed components (shown as an image formula in the original);
Then, single-color antagonistic coding is performed as shown in formula (2) to obtain the single-color antagonistic coded response S_rg(i, j) of the R+/G− color antagonistic channel;
[Formula (2) appears as an image in the original document.]
In formula (2), λ1, λ2 ∈ (0, 1) represent the input weights of the cone cells and reflect the strength of the luminance information and the chrominance information of the image; the defaults are λ1 = 1 and λ2 ∈ (0.5, 0.8). Similarly, the single-color antagonistic coded responses S_gr(i, j), S_by(i, j), S_yb(i, j) of the other three color antagonistic channels G+/R−, B+/Y−, Y+/B− can be obtained;
For different images to be detected, the antagonistic channels that dominate the image color information differ, so color antagonistic influence factors are defined; taking the R+/G− color antagonistic channel as an example, its influence factor μ_r is defined as shown in formula (3);
[Formula (3) appears as an image in the original document.]
In formula (3), the quantities shown in the image formula denote the mean values of the red, green, blue and yellow components respectively; similarly, the color antagonistic influence factors μ_g, μ_b, μ_y of the other three color antagonistic channels G+/R−, B+/Y−, Y+/B− can be obtained;
Using the color antagonistic influence factors μ_r, μ_g, μ_b, μ_y, the single-color antagonistic coded responses S_rg(i, j), S_gr(i, j), S_by(i, j), S_yb(i, j) of the image to be detected are weighted and coded to obtain the weighted coding response S_result(i, j) of the four color antagonistic channels of the image to be detected, as shown in formula (4);
S_result(i, j) = μ_r × S_rg(i, j) + μ_g × S_gr(i, j) + μ_b × S_by(i, j) + μ_y × S_yb(i, j)   (4)
Step (3): a discharge time coding module is constructed, and the obtained weighted coding response S_result(i, j) of the four color antagonistic channels of the image to be detected is coded into time-related information, so as to simulate the influence of the input signal intensity on the first discharge time and discharge frequency of the neurons;
A synaptic action window D_n(x, y) with side length L is constructed, where L is an odd number taken from 3 to 7, x, y = 1, 2, …, L, and n = 1, 2, …, M×N; the peripheral modulation neurons (k, l) serve as pre-synaptic neurons and the central modulation neuron (m, n) serves as the post-synaptic neuron (their index ranges are given as image formulas in the original). By sliding the window D_n(x, y), the central modulation neuron is made to correspond to each element of S_result(i, j) one by one; meanwhile, to solve the boundary overflow problem, zero padding is performed at the edges of the S_result(i, j) matrix so that its size becomes the padded size given as an image formula in the original;
Considering that dynamic synaptic connections between neurons exhibit spike-timing dependence and spike-frequency dependence, the input signal intensity is one of the key factors affecting these characteristics; therefore, the Izhikevich neuron model is used to perform discharge time coding of S_result(i, j) within each synaptic action window D_n(x, y), obtaining the first discharge time t_n(x, y) of each neuron in D_n(x, y), which together form the time matrix T_n(x, y); the maximum and minimum element values of each T_n(x, y) are denoted t_max_n and t_min_n respectively. Then, the average value t_aver_n of the first discharge times of the neurons in D_n(x, y) is calculated, as shown in formula (5);
[Formula (5) appears as an image in the original document.]
Step (4): a long-and-short-time-range synaptic complementary coding module is constructed, and long-and-short-time-range synaptic complementary coding is performed on the T_n(x, y) obtained in step (3);
Traditional neural coding methods model a single neuron and code its discharge activity to obtain information such as the discharge timing and frequency; however, a single neuron alone is susceptible to noise interference over a short time range. Therefore, to ensure the accuracy and stability of information expression, the time neighborhood of short-time-range synaptic action is defined, and short-time-range synaptic plasticity coding of T_n(x, y) is performed based on the discharge frequency of the neuron population; the specific implementation process is as follows:
First, a neighborhood (t_aver_n − t_short_n, t_aver_n + t_short_n) of t_aver_n is defined as the time neighborhood of short-time-range synaptic action, where t_short_n is defined as shown in formula (6);
[Formula (6) appears as an image in the original document.]
Then, the cluster discharge frequency f_n of the neuron population in D_n(x, y) is counted, as shown in formula (7);
[Formula (7) appears as an image in the original document.]
where m_n denotes the number of neurons in D_n(x, y) whose first discharge time satisfies t_n(x, y) ∈ (t_aver_n − t_short_n, t_aver_n + t_short_n);
To preserve the nonlinear characteristic of synaptic connections between neurons while ensuring robustness and avoiding large fluctuations of synaptic action, the frequency factor is activated nonlinearly, so as to obtain the short-time-range synaptic efficacy coefficient Syp_short_n in D_n(x, y), as shown in formula (8);
[Formula (8) appears as an image in the original document.]
where λ and g are constants, with λ = 0.98 by default (the definition of g is given as an image formula in the original);
Step (5): considering that long-time-range synaptic connections are influenced by the discharge timing and the spatial topology of each neuron during the cluster expression of the neuron population, long-time-range synaptic plasticity coding is performed on the T_n(x, y) obtained in step (3) based on neuron population spatio-temporal coding; the specific implementation process is as follows:
Long-time-range dynamic synaptic connection is considered to be the basis of the learning and memory abilities of the brain, and the discharge order of the pre- and post-synaptic neurons determines whether the long-time-range synaptic connection shows long-term potentiation or depression; therefore, discharge timing coding is performed on the neurons in D_n(x, y), followed by nonlinear activation, obtaining the discharge timing coding response t_in_n(k, l), as shown in formula (9);
[Formula (9) appears as an image in the original document.]
In formula (9), t_n(m, n) and t_n(k, l) denote the first discharge times of the central and peripheral modulation neurons in the synaptic action window D_n(x, y), and τ is the synaptic action time scale coefficient, which measures the sensitivity of long-time-range synaptic action to the time factor; the default is τ = 25 ms;
Next, spatial topology coding of the neurons is performed; the Euclidean distance d(k, l) between the peripheral modulation neuron at (k, l) and the central modulation neuron at (m, n) in D_n(x, y) is calculated as shown in formula (10), and its maximum value is denoted d_max;
d(k, l) = sqrt((k − m)² + (l − n)²)   (10)
Then, d(k, l) is processed with a Gaussian function to obtain s(k, l), as shown in formula (11); when d(k, l) = 1, s(k, l) takes its maximum value s_max, and when d(k, l) = d_max, s(k, l) takes its minimum value s_min;
[Formula (11) appears as an image in the original document.]
where a, b and c are constants: a is the peak coefficient of the Gaussian curve; b is the minimum Euclidean distance between a peripheral modulation neuron and the central modulation neuron in the synaptic action window, at which the synaptic action is strongest; c is the synaptic action spatial scale coefficient, which measures the sensitivity of long-time-range synaptic action to the spatial factor; the defaults are a = 1, b = 1, c = 2;
s(k, l) is then normalized to obtain the spatial topology coding response s_in(k, l), as shown in formula (12);
[Formula (12) appears as an image in the original document.]
Then, the discharge timing coding response t_in_n(k, l) is multiplied by the spatial topology coding response s_in(k, l) to obtain the long-time-range synaptic action matrix Syp_long_n(k, l), as shown in formula (13);
[Formula (13) appears as an image in the original document.]
The constraint conditions in formula (13) indicate that when a peripheral modulation neuron discharges before the central modulation neuron, the long-time-range synaptic action shows an enhancement trend, and otherwise a weakening trend;
Meanwhile, to ensure normal operation, the central element of the long-time-range synaptic action matrix is assigned Syp_long_n(m, n) = 0.1, forming the long-time-range synaptic action coefficient matrix Syp_long_n(x, y);
Step (6): the Syp_short_n obtained in step (4) and the Syp_long_n(x, y) obtained in step (5) are multiplied to obtain the action matrix Syn_n(x, y) with long-and-short-time-range synaptic plasticity complementation, i.e. the synaptic action window with long-and-short-time-range synaptic complementary characteristics, as shown in formula (14);
Syn_n(x, y) = Syp_short_n × Syp_long_n(x, y)   (14)
Syn_n(x, y) is convolved with the discharge time matrix T_n(x, y) to realize the long-and-short-time-range synaptic plasticity complementary coding in D_n(x, y), as shown in formula (15);
Res_n(x, y) = T_n(x, y) * Syn_n(x, y)   (15)
Res_n(x, y) is then written back, according to the position of D_n(x, y), to the corresponding position in S_result(i, j), obtaining the edge response result Res(i, j); the maximum and minimum element values of Res(i, j) are denoted Res_max and Res_min respectively;
Finally, performing gray scale normalization and gray scale mapping operation on Res (i, j) in the normalization layer to obtain an edge detection result based on the long-time-range and short-time-range synapse complementary neuron network, as shown in formula (16);
[Formula (16) appears as an image in the original document.]

Claims (1)

1. An edge detection method based on a long-and-short-time-range synaptic complementation neuron network, characterized by comprising the following steps:
Step (1): constructing a neuron network with long-and-short-time-range synaptic complementary characteristics, whose size is the same as that of the image to be detected map(i, j), with i = 1, 2, …, M; j = 1, 2, …, N, where M and N denote the length and width of the image to be detected respectively;
Step (2): considering that different color antagonistic channels influence the edge detection result to different degrees, constructing a color antagonistic weighted coding module, defining color antagonistic influence factors, and performing weighted coding on the responses of the different color antagonistic channels to obtain the weighted coding response S_result(i, j); the specific implementation process is as follows:
firstly, red, green, blue and yellow components R (i, j), G (i, j), B (i, j) and Y (i, j) of map (i, j) are obtained, wherein the relation between the yellow component and the red and green components is shown as a formula (1);
Y(i,j)=(R(i,j)+G(i,j))/2 (1)
Taking the R+/G− color antagonistic channel as an example, in order to simulate the receptive field of a single-center structure and extract local information of the image to be detected, the red component R(i, j) and the green component G(i, j) are processed with a two-dimensional Gaussian function of the same scale σ = 1.5 to obtain the smoothed components (shown as an image formula in the original);
Then, single-color antagonistic coding is performed as shown in formula (2) to obtain the single-color antagonistic coded response S_rg(i, j) of the R+/G− color antagonistic channel;
[Formula (2) appears as an image in the original document.]
In formula (2), λ1, λ2 ∈ (0, 1) represent the input weights of the cone cells and reflect the strength of the luminance information and the chrominance information of the image; similarly, the single-color antagonistic coded responses S_gr(i, j), S_by(i, j), S_yb(i, j) of the other three color antagonistic channels G+/R−, B+/Y−, Y+/B− can be obtained;
For different images to be detected, the antagonistic channels that dominate the image color information differ, so color antagonistic influence factors are defined; taking the R+/G− color antagonistic channel as an example, its influence factor μ_r is defined as shown in formula (3);
[Formula (3) appears as an image in the original document.]
In formula (3), the quantities shown in the image formula denote the mean values of the red, green, blue and yellow components respectively; similarly, the color antagonistic influence factors μ_g, μ_b, μ_y of the other three color antagonistic channels G+/R−, B+/Y−, Y+/B− can be obtained;
Using the color antagonistic influence factors μ_r, μ_g, μ_b, μ_y, the single-color antagonistic coded responses S_rg(i, j), S_gr(i, j), S_by(i, j), S_yb(i, j) of the image to be detected are weighted and coded to obtain the weighted coding response S_result(i, j) of the four color antagonistic channels of the image to be detected, as shown in formula (4);
S_result(i, j) = μ_r × S_rg(i, j) + μ_g × S_gr(i, j) + μ_b × S_by(i, j) + μ_y × S_yb(i, j)   (4)
Step (3): a discharge time coding module is constructed, and the obtained weighted coding response S_result(i, j) of the four color antagonistic channels of the image to be detected is coded into time-related information, so as to simulate the influence of the input signal intensity on the first discharge time and discharge frequency of the neurons;
A synaptic action window D_n(x, y) with side length L is constructed, where L is an odd number, x, y = 1, 2, …, L, and n = 1, 2, …, M×N; the peripheral modulation neurons (k, l) serve as pre-synaptic neurons and the central modulation neuron (m, n) serves as the post-synaptic neuron (their index ranges are given as image formulas in the original); by sliding the window D_n(x, y), the central modulation neuron is made to correspond to each element of S_result(i, j) one by one; meanwhile, to solve the boundary overflow problem, zero padding is performed at the edges of the S_result(i, j) matrix so that its size becomes the padded size given as an image formula in the original;
Considering that dynamic synaptic connections between neurons exhibit spike-timing dependence and spike-frequency dependence, the input signal intensity is one of the key factors affecting these characteristics; therefore, the Izhikevich neuron model is used to perform discharge time coding of S_result(i, j) within each synaptic action window D_n(x, y), obtaining the first discharge time t_n(x, y) of each neuron in D_n(x, y), which together form the time matrix T_n(x, y); the maximum and minimum element values of each T_n(x, y) are denoted t_max_n and t_min_n respectively; then, the average value t_aver_n of the first discharge times of the neurons in D_n(x, y) is calculated, as shown in formula (5);
[Formula (5) appears as an image in the original document.]
Step (4): a long-and-short-time-range synaptic complementary coding module is constructed, and long-and-short-time-range synaptic complementary coding is performed on the T_n(x, y) obtained in step (3);
The time neighborhood of short-time-range synaptic action is defined, and short-time-range synaptic plasticity coding of T_n(x, y) is performed based on the discharge frequency of the neuron population; the specific implementation process is as follows:
First, a neighborhood (t_aver_n − t_short_n, t_aver_n + t_short_n) of t_aver_n is defined as the time neighborhood of short-time-range synaptic action, where t_short_n is defined as shown in formula (6);
[Formula (6) appears as an image in the original document.]
Then, the cluster discharge frequency f_n of the neuron population in D_n(x, y) is counted, as shown in formula (7);
[Formula (7) appears as an image in the original document.]
where m_n denotes the number of neurons in D_n(x, y) whose first discharge time satisfies t_n(x, y) ∈ (t_aver_n − t_short_n, t_aver_n + t_short_n);
To preserve the nonlinear characteristic of synaptic connections between neurons while ensuring robustness and avoiding large fluctuations of synaptic action, the frequency factor is activated nonlinearly, so as to obtain the short-time-range synaptic efficacy coefficient Syp_short_n in D_n(x, y), as shown in formula (8);
[Formula (8) appears as an image in the original document.]
where λ and g are constants;
Step (5): considering that long-time-range synaptic connections are influenced by the discharge timing and the spatial topology of each neuron during the cluster expression of the neuron population, long-time-range synaptic plasticity coding is performed on the T_n(x, y) obtained in step (3) based on neuron population spatio-temporal coding; the specific implementation process is as follows:
Long-time-range dynamic synaptic connection is considered to be the basis of the learning and memory abilities of the brain, and the discharge order of the pre- and post-synaptic neurons determines whether the long-time-range synaptic connection shows long-term potentiation or depression; therefore, discharge timing coding is performed on the neurons in D_n(x, y), followed by nonlinear activation, obtaining the discharge timing coding response t_in_n(k, l), as shown in formula (9);
[Formula (9) appears as an image in the original document.]
In formula (9), t_n(m, n) and t_n(k, l) denote the first discharge times of the central and peripheral modulation neurons in the synaptic action window D_n(x, y), and τ is the synaptic action time scale coefficient, which measures the sensitivity of long-time-range synaptic action to the time factor;
Next, spatial topology coding of the neurons is performed; the Euclidean distance d(k, l) between the peripheral modulation neuron at (k, l) and the central modulation neuron at (m, n) in D_n(x, y) is calculated as shown in formula (10), and its maximum value is denoted d_max;
d(k, l) = sqrt((k − m)² + (l − n)²)   (10)
Then, d(k, l) is processed with a Gaussian function to obtain s(k, l), as shown in formula (11); when d(k, l) = 1, s(k, l) takes its maximum value s_max, and when d(k, l) = d_max, s(k, l) takes its minimum value s_min;
[Formula (11) appears as an image in the original document.]
where a, b and c are constants: a is the peak coefficient of the Gaussian curve; b is the minimum Euclidean distance between a peripheral modulation neuron and the central modulation neuron in the synaptic action window, at which the synaptic action is strongest; c is the synaptic action spatial scale coefficient, which measures the sensitivity of long-time-range synaptic action to the spatial factor;
s(k, l) is then normalized to obtain the spatial topology coding response s_in(k, l), as shown in formula (12);
[Formula (12) appears as an image in the original document.]
Then, the discharge timing coding response t_in_n(k, l) is multiplied by the spatial topology coding response s_in(k, l) to obtain the long-time-range synaptic action matrix Syp_long_n(k, l), as shown in formula (13);
[Formula (13) appears as an image in the original document.]
The constraint conditions in formula (13) indicate that when a peripheral modulation neuron discharges before the central modulation neuron, the long-time-range synaptic action shows an enhancement trend, and otherwise a weakening trend;
Meanwhile, to ensure normal operation, the central element of the long-time-range synaptic action matrix is assigned Syp_long_n(m, n) = 0.1, forming the long-time-range synaptic action coefficient matrix Syp_long_n(x, y);
Step (6): the Syp_short_n obtained in step (4) and the Syp_long_n(x, y) obtained in step (5) are multiplied to obtain the action matrix Syn_n(x, y) with long-and-short-time-range synaptic plasticity complementation, i.e. the synaptic action window with long-and-short-time-range synaptic complementary characteristics, as shown in formula (14);
Syn_n(x, y) = Syp_short_n × Syp_long_n(x, y)   (14)
Syn_n(x, y) is convolved with the discharge time matrix T_n(x, y) to realize the long-and-short-time-range synaptic plasticity complementary coding in D_n(x, y), as shown in formula (15);
Res_n(x, y) = T_n(x, y) * Syn_n(x, y)   (15)
Res_n(x, y) is then written back, according to the position of D_n(x, y), to the corresponding position in S_result(i, j), obtaining the edge response result Res(i, j); the maximum and minimum element values of Res(i, j) are denoted Res_max and Res_min respectively;
Finally, performing gray scale normalization and gray scale mapping operation on Res (i, j) in the normalization layer to obtain an edge detection result based on the long-time-range and short-time-range synapse complementary neuron network, as shown in formula (16);
[Formula (16) appears as an image in the original document.]
CN202010049326.9A 2020-01-16 2020-01-16 Edge detection method based on long-and-short-time-range synaptic complementation neuron network Active CN111145199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010049326.9A CN111145199B (en) 2020-01-16 2020-01-16 Edge detection method based on long-and-short-time-range synaptic complementation neuron network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010049326.9A CN111145199B (en) 2020-01-16 2020-01-16 Edge detection method based on long-and-short-time-range synaptic complementation neuron network

Publications (2)

Publication Number Publication Date
CN111145199A CN111145199A (en) 2020-05-12
CN111145199B (en) 2023-02-03

Family

ID=70525525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010049326.9A Active CN111145199B (en) 2020-01-16 2020-01-16 Edge detection method based on long-and-short-time-range synaptic complementation neuron network

Country Status (1)

Country Link
CN (1) CN111145199B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112819142B (en) * 2021-02-04 2024-01-19 成都市深思创芯科技有限公司 Short-time synaptic plasticity work memory computing system and method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103679710A (en) * 2013-11-29 2014-03-26 杭州电子科技大学 Method for detecting weak edges of images on basis of discharge information of multilayer neuron groups

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11308387B2 (en) * 2017-05-09 2022-04-19 Samsung Electronics Co., Ltd. STDP with synaptic fatigue for learning of spike-time-coded patterns in the presence of parallel rate-coding

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103679710A (en) * 2013-11-29 2014-03-26 杭州电子科技大学 Method for detecting weak edges of images on basis of discharge information of multilayer neuron groups

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Spiking neural network with synaptic plasticity for recognition; Jing Li et al.; IEEE Xplore; 2018-12-16; full text *
A contour detection model based on color-antagonistic receptive fields; Wu Li et al.; Computer Science; 2016-07-15 (No. 07); full text *
Contour detection method based on neuronal color antagonism and dynamic coding; Hu Junhao et al.; Chinese Journal of Biomedical Engineering; 2017-10-20 (No. 05); full text *

Also Published As

Publication number Publication date
CN111145199A (en) 2020-05-12

Similar Documents

Publication Publication Date Title
CN110210563B (en) Image pulse data space-time information learning and identification method based on Spike cube SNN
CN110991299B (en) Confrontation sample generation method aiming at face recognition system in physical domain
CN111709902A (en) Infrared and visible light image fusion method based on self-attention mechanism
CN108510194A (en) Air control model training method, Risk Identification Method, device, equipment and medium
CN109002848B (en) Weak and small target detection method based on feature mapping neural network
CN109379153A (en) A kind of frequency spectrum sensing method
CN106570516A (en) Obstacle recognition method using convolution neural network
CN111145199B (en) Edge detection method based on long-and-short-time-range synaptic complementation neuron network
CN110569916A (en) Confrontation sample defense system and method for artificial intelligence classification
AU2020103251A4 (en) Method and system for identifying metallic minerals under microscope based on bp nueral network
Yang et al. An adaptive contourlet HMM–PCNN model of sparse representation for image denoising
CN103985115A (en) Image multi-strength edge detection method having visual photosensitive layer simulation function
Gomez-Villa et al. On the synthesis of visual illusions using deep generative models
Choe Perceptual groupings in a self-organizing map of spiking neurons
Tajima et al. Saliency-based color accessibility
CN111325730B (en) Underwater image index evaluation method based on random connection network
CN114548239A (en) Image identification and classification method based on artificial neural network of mammal-like retina structure
Gomez-Villa et al. Synthesizing visual illusions using generative adversarial networks
CN112819712A (en) Low-illumination color image enhancement method based on PNA-MSPCNN model
Kung et al. A Study on Image Quality Assessment using Neural Networks and Structure Similarty.
CN110889876A (en) Color image quantization method based on CA-SPCNN algorithm
Silva et al. Modeling disinhibition within a layered structure of the LGMD neuron
CN112865915B (en) Radio signal falsification method for counteracting deep learning
US20230281431A1 (en) Computer implemented method for processing structured data
Cui et al. An image quality metric based on a colour appearance model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant