CN107067408A - Image contour detection method simulating human-eye micro-movement - Google Patents
- Publication number
- CN107067408A (application number CN201710230521.XA)
- Authority
- CN
- China
- Prior art keywords
- pixel
- receptive field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
Abstract
The present invention provides an image contour detection method that simulates human-eye micro-movement, comprising the following steps: A. input the image to be detected, preset the overall suppression parameter, the suppression coefficient and a Gabor filter bank, and filter to obtain the classical-receptive-field stimulus response of each pixel; B. truncate the classical-receptive-field stimulus response; C. using a DoG template and temporary center regions, compute a normalized weighting function for each pixel; compute the rough non-classical-receptive-field stimulus-response values of each pixel in all directions, and take their standard deviation; D. compute the standard-deviation weight of each pixel, and from it the final non-classical-receptive-field stimulus-response value of each pixel; E. compute the contour-recognition value of each pixel, forming the contour-recognition image of the image to be detected. The method overcomes the defects of the prior art: it retains weak edges while suppressing strong textures, improving the success rate of contour recognition.
Description
Technical field
The present invention relates to the field of image processing, and in particular to an image contour detection method that simulates human-eye micro-movement.
Background technology
Early research on the physiology and neuroscience of visual information processing was based on the classical receptive field, a finite region around a center. Later studies, however, showed that light stimuli over a wider range can modulate the classical-receptive-field response; this outer region is called the non-classical receptive field. This modulatory function enables a neuron to integrate information over a larger range and pass it on to subsequent visual processing. Compared with the suppression found in ganglion cells and lateral geniculate cells, the non-classical receptive field of the primary visual cortex has more complex characteristics. The literature suggests that the non-classical receptive field can be divided into four patterns: (1) full-range suppression; (2) full-range facilitation; (3) suppression on both sides with facilitation at the two ends; (4) facilitation on both sides with suppression at the two ends. In addition, the non-classical receptive field has its own direction selectivity. When the classical and non-classical receptive fields receive stimuli of different direction, brightness, spatial frequency, spatial phase or movement velocity, a stronger response is produced; when they receive stimuli of the same type, a weaker response is produced.
However, most of these receptive-field properties were established in experiments on anesthetized animals, in which the movement of the human eye is ignored. Yet eye movement in fact contributes to the brain mechanisms of visual information processing. Specifically, human eye movement can be divided into stabilized gaze, moving gaze and fixational eye movement, the last of which includes tremor, drift and microsaccades. Eye movement affects the retinal image and the subsequent visual system, including the lateral geniculate body, the primary visual cortex and higher visual cortex, and therefore deserves consideration in contour detection.
Summary of the invention
The present invention aims to provide an image contour detection method that simulates human-eye micro-movement. The method overcomes the defect of the prior art, which does not consider the mechanism of eye micro-movement; it retains weak edges while suppressing strong textures, improving the success rate of contour recognition.
The technical scheme is as follows. The image contour detection method simulating human-eye micro-movement comprises the following steps:
A. Input the grayscale image to be detected; preset the overall suppression parameter and the suppression coefficient, and preset a Gabor filter bank with multiple direction parameters distributed uniformly around the circle. Apply Gabor filtering to each pixel of the image for each direction parameter to obtain the Gabor energy values of each pixel in all directions. For each pixel, take the maximum of its Gabor energy values over all directions as the classical-receptive-field stimulus response of that pixel.
B. For each pixel, truncate its classical-receptive-field stimulus response, obtaining the truncated classical-receptive-field stimulus response of each pixel.
C. Using a difference-of-Gaussians (DoG) template, build a group of temporary center regions, each with a different deviation angle relative to the center of the visual field. For each pixel, integrate and normalize the responses of its temporary center regions with the DoG template to obtain a group of normalized weighting functions.
For each pixel and each deviation angle, multiply the normalized weighting function by the truncated classical-receptive-field stimulus responses within the DoG template and sum, obtaining the rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle; then take the standard deviation of these rough values over the deviation angles.
D. For each pixel, compute the standard-deviation weight from the standard deviation of the rough non-classical-receptive-field stimulus-response values and the overall suppression parameter. Multiply the standard-deviation weight by the minimum of the rough values over the deviation angles to obtain the final non-classical-receptive-field stimulus-response value of the pixel.
E. For each pixel, combine its classical-receptive-field stimulus response and final non-classical-receptive-field stimulus-response value with the suppression coefficient to compute the combined stimulus response, which is the contour-recognition value of the pixel. Applying non-maximum suppression and binarization to the contour-recognition values of all pixels yields the contour-recognition image of the image to be detected.
Preferably, the classical-receptive-field stimulus response in step A is calculated as follows.
The two-dimensional Gabor function of the Gabor filter bank is expressed as follows:
where γ is a constant representing the aspect ratio of the elliptical receptive field, λ is the wavelength, σ is the standard deviation of the Gabor function and the bandwidth of the DoG template center region, 1/λ is the spatial frequency of the cosine function, σ/λ is the spatial-frequency bandwidth, φ is the phase-angle parameter, and θ is the angle parameter of the Gabor filter; I(x, y) is the image to be detected and * is the convolution operator.
The Gabor energy values are calculated as follows:
where θ_i is one of the Gabor filtering angles and N_θ is the number of Gabor filtering angles;
E(x, y; σ) is the maximum of the Gabor energy values of pixel (x, y) over all angles, i.e. the classical-receptive-field stimulus response of pixel (x, y).
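Step A (the quadrature-pair Gabor energy of eq. (3), orientations of eq. (4), and the maximum over orientations of eq. (5)) can be sketched in NumPy as follows. The kernel construction, function names and default parameter values are illustrative assumptions, not taken from the patent text:

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(lam, sigma, theta, phi, gamma=0.5):
    """2-D Gabor kernel; gamma is the elliptical receptive-field aspect ratio."""
    half = int(np.ceil(3 * sigma))
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam + phi))

def gabor_energy(img, lam=4.0, sigma=2.0, n_theta=12):
    """E(x, y; sigma): maximum over orientations of the Gabor energy."""
    energies = []
    for i in range(n_theta):
        theta = np.pi * i / n_theta                  # eq. (4), 0-based here
        even = fftconvolve(img, gabor_kernel(lam, sigma, theta, 0.0), mode='same')
        odd = fftconvolve(img, gabor_kernel(lam, sigma, theta, np.pi / 2), mode='same')
        energies.append(np.sqrt(even**2 + odd**2))   # eq. (3)
    return np.max(energies, axis=0)                  # eq. (5)
```

On a step-edge test image the energy peaks along the edge, which is the behavior step A relies on.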
Preferably, the truncated classical-receptive-field stimulus response in step B is calculated as follows.
Using an upper-limit ratio P_H ∈ (0,1) and a lower-limit ratio P_L ∈ (0,1), E(x, y; σ) is truncated:
sort the E(x, y; σ) values of all pixels from small to large and select the smallest P_H fraction of them; the maximum among these is denoted Q_H and serves as the upper quantile;
likewise select the smallest P_L fraction; the maximum among these is denoted Q_L and serves as the lower quantile.
The truncated classical-receptive-field stimulus response is:
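Step B's quantile truncation can be sketched as below. The truncation formula itself is an image lost from this text, so clipping E(x, y; σ) to [Q_L, Q_H] is an assumption; the function name and default ratios (0.1 and 0.8, the values of embodiment 1) are illustrative:

```python
import numpy as np

def truncate_response(E, p_low=0.1, p_high=0.8):
    """Quantile truncation of the classical-RF response (step B).

    Q_H / Q_L are the maxima of the smallest P_H / P_L fractions of the
    sorted responses.  Clipping to [Q_L, Q_H] is an assumed reading of
    the lost truncation formula.
    """
    flat = np.sort(E.ravel())                    # values from small to large
    q_high = flat[int(p_high * flat.size) - 1]   # max of smallest P_H fraction
    q_low = flat[int(p_low * flat.size) - 1]     # max of smallest P_L fraction
    return np.clip(E, q_low, q_high)
```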
Preferably, the expression of the DoG template in step C is:
where k is a parameter controlling the size of the DoG template.
The expression of the temporary-center-region response is as follows:
where d represents the distance from the center of the visual field to the temporary center region, and φ represents the deviation angle of the temporary center region.
The integration and normalization for each pixel proceed through the normalized weighting function, whose expression is as follows:
where w(x, y; d, φ) = w_m(x, y; d, φ)·DoG(x, y; σ, k), ||·||_1 is the L1 norm, and H(x) is the positive-part function.
The rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle is calculated as follows:
where Inh_e(x, y; σ, φ_i) is the rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle; −3kσ < x′ < 3kσ and −3kσ < y′ < 3kσ give the range of the DoG template; and φ_i represents the deviation angles.
The average and standard deviation of the rough non-classical-receptive-field stimulus-response values of each pixel over the deviation angles are calculated as follows:
where STD_inh(x, y) is the standard deviation, and Ave_inh(x, y) the average, of the rough non-classical-receptive-field stimulus-response values of each pixel over all directions.
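The DoG template expression is likewise an image lost from this text; a sketch under the assumption of the standard normalized difference of Gaussians with outer scale kσ, combined with the positive-part function H(·) and L1 normalization that the text does describe, could look as follows (function name and defaults are illustrative):

```python
import numpy as np

def dog_template(sigma=2.0, k=4.0):
    """DoG-based weighting template over the stated range |x'|, |y'| < 3*k*sigma.

    The standard normalized difference of Gaussians (outer scale k*sigma,
    inner scale sigma) is an assumption; H(x) keeps the positive part and
    the result is L1-normalized, as described in step C.
    """
    half = int(np.ceil(3 * k * sigma))
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x**2 + y**2
    outer = np.exp(-r2 / (2 * (k * sigma)**2)) / (2 * np.pi * (k * sigma)**2)
    inner = np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    dog = outer - inner
    pos = np.maximum(dog, 0.0)       # H(x): positive part, an annular surround
    return pos / np.abs(pos).sum()   # L1 normalization
```

The positive part zeroes the central lobe, so the template weights only the surround, which is what the non-classical-receptive-field integration needs.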
Preferably, the standard-deviation weight in step D is calculated as follows:
where w_std(x, y; σ) is the standard-deviation weight and f_os is the overall suppression parameter.
The final non-classical-receptive-field stimulus-response value is calculated as follows:
Inh(x, y; σ) = Inh_m(x, y; σ)·w_std(x, y; σ)   (15);
Inh_m(x, y; σ) = min{Inh_e(x, y; σ, φ_i) | i = 1, 2, …, N_φ}   (16);
where Inh_m(x, y; σ) is the minimum of Inh_e(x, y; σ, φ_i).
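Equations (15) and (16) can be sketched as below. The w_std expression is an image lost from this text; the form used here (suppression strengthening as the per-angle standard deviation shrinks) is an assumption consistent with the stated design, and the function name is illustrative:

```python
import numpy as np

def final_inhibition(inh_rough, f_os=1.0):
    """Combine per-angle rough inhibition values into Inh(x, y; sigma).

    inh_rough has shape (N_phi, H, W): one rough value per deviation angle.
    Eq. (16) takes the minimum over angles; eq. (15) multiplies it by the
    standard-deviation weight.  The weight formula below is an assumed
    stand-in for the lost w_std: it approaches 1 as STD -> 0 (homogeneous
    texture, strong suppression) and decreases as STD grows.
    """
    inh_min = inh_rough.min(axis=0)    # eq. (16): minimum over deviation angles
    std = inh_rough.std(axis=0)        # STD_inh(x, y) over the angles
    w_std = f_os / (f_os + std)        # assumed w_std, decreasing in std
    return inh_min * w_std             # eq. (15)
```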
Preferably, the combined stimulus response in step E is calculated as follows:
R(x, y) = H(E(x, y; σ) − α·Inh(x, y; σ))   (17);
where R(x, y) is the combined stimulus response of the pixel and α is the suppression coefficient.
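Equation (17) is a one-liner once E and Inh are available; H(·) is the positive-part function defined earlier (the function name here is illustrative):

```python
import numpy as np

def contour_response(E, inh, alpha=1.0):
    """Eq. (17): R(x, y) = H(E(x, y; sigma) - alpha * Inh(x, y; sigma))."""
    return np.maximum(E - alpha * inh, 0.0)   # H(.) clips negatives to zero
```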
In the method of the invention, the offsets produced by human-eye micro-movement are simulated by the temporary center regions: it is assumed that the offset of fixational eye movement gives rise to a temporary center region, and that inhibition occurs only in short- or long-range neuronal connections, i.e. the temporary center does not suppress the response of the temporary center region. Simulating the temporary center regions of eye micro-movement ensures both the fidelity of the simulation and the accuracy of contour detection.
Furthermore, the multi-channel filtering models the influence of eye micro-movement on surround suppression; the multi-channel selection simulates the non-directionality of eye micro-movement, improving the fidelity of the simulation and the accuracy of contour detection. Meanwhile, experiments show that homogeneous texture makes the standard deviation small; the algorithm is therefore designed so that when the standard deviation is small, the inhibition level is strengthened. Fusing the different filter results by means of the standard deviation suppresses and removes salient texture, reducing the probability of false detection.
Finally, because excessive Gabor energy values cause inaccurate inhibition responses, weaker edges may be suppressed by the responses of the stronger edges around them; the energy-truncation method therefore strengthens weaker edges, reducing the probability of missed detection and improving detection quality.
Brief description of the drawings
Fig. 1 is the flow chart of the contour detection method of the invention.
Fig. 2 compares the detection results of the method of embodiment 1 and the contour detection models of document 1.
Fig. 3 compares the detection indices of the method of embodiment 1 and the contour detection models of document 1.
Detailed description of the embodiments
The present invention is illustrated below with reference to the drawings and embodiments.
Embodiment 1
As shown in Fig. 1, the image contour detection method simulating human-eye micro-movement provided by this embodiment comprises the following steps:
A. Input the grayscale image to be detected; preset the overall suppression parameter and the suppression coefficient, and preset a Gabor filter bank with multiple direction parameters distributed uniformly around the circle. Apply Gabor filtering to each pixel of the image for each direction parameter to obtain the Gabor energy values of each pixel in all directions. For each pixel, take the maximum of its Gabor energy values over all directions as the classical-receptive-field stimulus response of that pixel.
The classical-receptive-field stimulus response in step A is calculated as follows.
The two-dimensional Gabor function of the Gabor filter bank is expressed as follows:
where γ is a constant representing the major-to-minor-axis ratio of the elliptical receptive field, λ is the wavelength, σ is the standard deviation of the Gabor function and the bandwidth of the DoG template center region, 1/λ is the spatial frequency of the cosine function, σ/λ is the spatial-frequency bandwidth, φ is the phase-angle parameter, and θ is the angle parameter of the Gabor filter; I(x, y) is the image to be detected and * is the convolution operator.
The Gabor energy values are calculated as follows:
where θ_i is one of the Gabor filtering angles and N_θ is the number of Gabor filtering angles;
E(x, y; σ) is the maximum of the Gabor energy values of pixel (x, y) over all angles, i.e. the classical-receptive-field stimulus response of pixel (x, y).
B. For each pixel, truncate its classical-receptive-field stimulus response, obtaining the truncated classical-receptive-field stimulus response of each pixel.
The truncated classical-receptive-field stimulus response in step B is calculated as follows.
Using an upper-limit ratio P_H ∈ (0,1) and a lower-limit ratio P_L ∈ (0,1), E(x, y; σ) is truncated:
sort the E(x, y; σ) values of all pixels from small to large and select the smallest P_H fraction; the maximum among these is denoted Q_H, the upper quantile. In this embodiment P_H = 0.8: the smallest 80% of the E(x, y; σ) values are selected and their maximum is set as Q_H;
likewise select the smallest P_L fraction; the maximum among these is denoted Q_L, the lower quantile. In this embodiment P_L = 0.1: the smallest 10% of the E(x, y; σ) values are selected and their maximum is set as Q_L.
The truncated classical-receptive-field stimulus response is:
C. Using a difference-of-Gaussians (DoG) template, build a group of temporary center regions, each with a different deviation angle relative to the center of the visual field. For each pixel, integrate and normalize the responses of its temporary center regions with the DoG template to obtain a group of normalized weighting functions.
For each pixel and each deviation angle, multiply the normalized weighting function by the truncated classical-receptive-field stimulus responses within the DoG template and sum, obtaining the rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle; then take the standard deviation of these rough values over the deviation angles.
The expression of the DoG template in step C is:
where k is a parameter controlling the size of the DoG template.
The expression of the temporary-center-region response is as follows:
where d represents the distance from the center of the visual field to the temporary center region, and φ represents the deviation angle of the temporary center region.
The integration and normalization for each pixel proceed through the normalized weighting function, whose expression is as follows:
where w(x, y; d, φ) = w_m(x, y; d, φ)·DoG(x, y; σ, k), ||·||_1 is the L1 norm, and H(x) is the positive-part function.
The rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle is calculated as follows:
where Inh_e(x, y; σ, φ_i) is the rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle; −3kσ < x′ < 3kσ and −3kσ < y′ < 3kσ give the range of the DoG template; and φ_i represents the deviation angles.
The average and standard deviation of the rough non-classical-receptive-field stimulus-response values of each pixel over the deviation angles are calculated as follows:
where STD_inh(x, y) is the standard deviation, and Ave_inh(x, y) the average, of the rough non-classical-receptive-field stimulus-response values of each pixel over all directions.
D. For each pixel, compute the standard-deviation weight from the standard deviation of the rough non-classical-receptive-field stimulus-response values at each deviation angle and the overall suppression parameter. Multiply the standard-deviation weight by the minimum of the rough values over the deviation angles to obtain the final non-classical-receptive-field stimulus-response value of the pixel.
The standard-deviation weight in step D is calculated as follows:
where w_std(x, y; σ) is the standard-deviation weight and f_os is the overall suppression parameter.
The final non-classical-receptive-field stimulus-response value is calculated as follows:
Inh(x, y; σ) = Inh_m(x, y; σ)·w_std(x, y; σ)   (15);
Inh_m(x, y; σ) = min{Inh_e(x, y; σ, φ_i) | i = 1, 2, …, N_φ}   (16);
where Inh_m(x, y; σ) is the minimum of Inh_e(x, y; σ, φ_i).
E. For each pixel, combine its classical-receptive-field stimulus response and final non-classical-receptive-field stimulus-response value with the suppression coefficient to compute the combined stimulus response, which is the contour-recognition value of the pixel. Applying non-maximum suppression and binarization to the contour-recognition values of all pixels yields the contour-recognition image of the image to be detected.
The combined stimulus response in step E is calculated as follows:
R(x, y) = H(E(x, y; σ) − α·Inh(x, y; σ))   (17);
where R(x, y) is the combined stimulus response of the pixel and α is the suppression coefficient.
The contour detection method of this embodiment is now compared for effectiveness with the isotropic and the anisotropic contour detection model provided in document 1, which is as follows:
Document 1: Grigorescu C, Petkov N, Westenberg M. Contour detection based on nonclassical receptive field inhibition [J]. IEEE Transactions on Image Processing, 2003, 12(7): 729-739.
To ensure a valid comparison, this embodiment uses the same non-maximum suppression method as document 1 for the subsequent contour integration, in which the two thresholds t_h and t_l are set by t_l = 0.5·t_h and computed from the threshold quantile p.
The performance index P uses the following standard given in document 1:
where n_TP, n_FP and n_FN are the numbers of correct, erroneous and omitted contour pixels detected, respectively, and n_GT is the number of ground-truth contour pixels; e_fp denotes the false-detection parameter and e_fn the omission parameter. The evaluation index P lies in [0, 1]; the closer to 1, the better the contour detection. In addition, the tolerance is defined so that a detection within a 5×5 neighborhood counts as correct.
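The P expression itself is an image lost from this text; the standard form from Grigorescu et al. (2003), P = n_TP / (n_TP + n_FP + n_FN), is assumed below, together with e_fp = n_FP / n_TP and e_fn = n_FN / n_GT (the function name is illustrative):

```python
def contour_score(n_tp, n_fp, n_fn, n_gt):
    """Evaluation indices of document 1 (assumed standard forms).

    P in [0, 1]: closer to 1 means better contour detection.
    e_fp: false-detection parameter; e_fn: omission parameter.
    """
    p = n_tp / (n_tp + n_fp + n_fn)   # assumed P from Grigorescu et al. 2003
    e_fp = n_fp / n_tp                # erroneous detections per correct one
    e_fn = n_fn / n_gt                # omissions relative to ground truth
    return p, e_fp, e_fn
```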
Three classic images (brush, elephant, rhinoceros) are chosen for the comparison. The isotropic model and the anisotropic model of document 1 and the method of embodiment 1 are applied to the three images; the parameter groups selected for the method of embodiment 1 are shown in table 1.
Table 1. Parameter groups of embodiment 1
The isotropic and anisotropic models of document 1 use the following 80 parameter groups: α = {1.0, 1.2}, σ = {1.4, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8}, p = {0.5, 0.4, 0.3, 0.2, 0.1}.
The best-performing parameter group of each of the isotropic model, the anisotropic model and the method of embodiment 1 is selected for comparison. The comparison of the extracted contours is shown in Fig. 2: in terms of contour extraction, the method of embodiment 1 is superior to both the isotropic and the anisotropic model of document 1. Fig. 3 is a schematic diagram of the temporary center region, in which the dashed-ellipse part is the temporary center region.
Table 2 lists part of the parameters corresponding to the result figures of embodiment 1; the remaining parameters follow table 1. Tables 3 and 4 list the parameters corresponding to the result figures of the isotropic and anisotropic models, respectively. Table 5 compares the recognition performance of the method of embodiment 1 with the other models, further proving that the method of embodiment 1 is superior to the isotropic and anisotropic models of document 1.
Table 2. Partial parameters corresponding to the result figures of embodiment 1
Table 3. Parameters corresponding to the isotropic-model result figures
Table 4. Parameters corresponding to the anisotropic-model result figures
Table 5. Comparison of experimental results
Claims (6)
1. An image contour detection method simulating human-eye micro-movement, characterized by comprising the following steps:
A. inputting the grayscale image to be detected; presetting the overall suppression parameter and the suppression coefficient, and presetting a Gabor filter bank with multiple direction parameters distributed uniformly around the circle; applying Gabor filtering to each pixel of the image for each direction parameter to obtain the Gabor energy values of each pixel in all directions; for each pixel, taking the maximum of its Gabor energy values over all directions as the classical-receptive-field stimulus response of that pixel;
B. for each pixel, truncating its classical-receptive-field stimulus response to obtain the truncated classical-receptive-field stimulus response of each pixel;
C. using a difference-of-Gaussians (DoG) template, building a group of temporary center regions, each with a different deviation angle relative to the center of the visual field; for each pixel, integrating and normalizing the responses of its temporary center regions with the DoG template to obtain a group of normalized weighting functions;
for each pixel and each deviation angle, multiplying the normalized weighting function by the truncated classical-receptive-field stimulus responses within the DoG template and summing, to obtain the rough non-classical-receptive-field stimulus-response value of each pixel at each deviation angle; taking the standard deviation of the rough values over the deviation angles;
D. for each pixel, computing the standard-deviation weight from the standard deviation of the rough non-classical-receptive-field stimulus-response values at each deviation angle and the overall suppression parameter; multiplying the standard-deviation weight by the minimum of the rough values over the deviation angles to obtain the final non-classical-receptive-field stimulus-response value of the pixel;
E. for each pixel, combining its classical-receptive-field stimulus response and final non-classical-receptive-field stimulus-response value with the suppression coefficient to compute the combined stimulus response, which is the contour-recognition value of the pixel; applying non-maximum suppression and binarization to the contour-recognition values of all pixels to obtain the contour-recognition image of the image to be detected.
2. The image contour detection method simulating human-eye micro-movement of claim 1, characterized in that:
the classical-receptive-field stimulus response in step A is calculated as follows:
the two-dimensional Gabor function of the Gabor filter bank is expressed as follows:
where γ is a constant representing the major-to-minor-axis ratio of the elliptical receptive field, λ is the wavelength, σ is the standard deviation of the Gabor function and the bandwidth of the DoG template center region, 1/λ is the spatial frequency of the cosine function, σ/λ is the spatial-frequency bandwidth, φ is the phase-angle parameter, and θ is the angle parameter of the Gabor filter; I(x, y) is the image to be detected and * is the convolution operator;
the Gabor energy values are calculated as follows:
<mrow>
<msub>
<mi>E</mi>
<mrow>
<mi>&lambda;</mi>
<mo>,</mo>
<mi>&sigma;</mi>
<mo>,</mo>
<msub>
<mi>&theta;</mi>
<mi>i</mi>
</msub>
</mrow>
</msub>
<mrow>
<mo>(</mo>
<mi>x</mi>
<mo>,</mo>
<mi>y</mi>
<mo>)</mo>
</mrow>
<mo>=</mo>
<msqrt>
<mrow>
<msub>
<msup>
<mi>e</mi>
<mn>2</mn>
</msup>
<mrow>
<mi>&lambda;</mi>
<mo>,</mo>
<mi>&sigma;</mi>
<mo>,</mo>
<msub>
<mi>&theta;</mi>
<mi>i</mi>
</msub>
<mo>,</mo>
<mn>0</mn>
</mrow>
</msub>
<mrow>
<mo>(</mo>
<mi>x</mi>
<mo>,</mo>
<mi>y</mi>
<mo>)</mo>
</mrow>
<mo>+</mo>
<msub>
<msup>
<mi>e</mi>
<mn>2</mn>
</msup>
<mrow>
<mi>&lambda;</mi>
<mo>,</mo>
<mi>&sigma;</mi>
<mo>,</mo>
<msub>
<mi>&theta;</mi>
<mi>i</mi>
</msub>
<mo>,</mo>
<mi>&pi;</mi>
<mo>/</mo>
<mn>2</mn>
</mrow>
</msub>
<mrow>
<mo>(</mo>
<mi>x</mi>
<mo>,</mo>
<mi>y</mi>
<mo>)</mo>
</mrow>
</mrow>
</msqrt>
<mo>-</mo>
<mo>-</mo>
<mo>-</mo>
<mrow>
<mo>(</mo>
<mn>3</mn>
<mo>)</mo>
</mrow>
<mo>;</mo>
</mrow>
θi=π(i−1)/Nθ, i=1,2,...,Nθ (4);
where θi is one of the Gabor filter orientations and Nθ is the number of Gabor filter orientations;
E(x,y;σ)=max{Eλ,σ,θi(x,y) | i=1,2,...,Nθ} (5);
E(x,y;σ) is the maximum of the Gabor energy values over all filter orientations at pixel (x,y), and serves as the classical receptive field stimulus response of the pixel.
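The orientation-pooled Gabor energy of Eqs. (3)–(5) can be sketched in NumPy. This is an illustrative reading of the claims, not the patented implementation: the helper names, the kernel half-width, and the aspect ratio γ=0.5 are assumptions.

```python
import numpy as np

def gabor_kernel(lam, sigma, theta, phi, gamma=0.5, half=5):
    """Sample a 2-D Gabor function on a (2*half+1)^2 grid (gamma assumed)."""
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam + phi))

def conv2_same(img, ker):
    """'same'-size 2-D convolution using only NumPy (flip kernel, zero pad)."""
    kh, kw = ker.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    kf = ker[::-1, ::-1]                             # flip -> true convolution
    for dy in range(kh):
        for dx in range(kw):
            out += kf[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def crf_response(img, lam=4.0, sigma=1.2, n_theta=4):
    """Gabor energy (Eq. 3) at orientations of Eq. (4), maximised per Eq. (5)."""
    energies = []
    for i in range(1, n_theta + 1):
        theta = np.pi * (i - 1) / n_theta            # Eq. (4)
        e0 = conv2_same(img, gabor_kernel(lam, sigma, theta, 0.0))
        e90 = conv2_same(img, gabor_kernel(lam, sigma, theta, np.pi / 2))
        energies.append(np.sqrt(e0**2 + e90**2))     # Eq. (3): quadrature energy
    return np.max(np.stack(energies), axis=0)        # Eq. (5): orientation max
```

On a step-edge image the energy peaks near the edge and vanishes in flat regions, which is the behaviour the claim relies on.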
3. The image contour detection method simulating human-eye micro-motion as claimed in claim 2, characterised in that:
the truncated classical receptive field stimulus response in step B is calculated as follows:
E(x,y;σ) is truncated using an upper-limit ratio PH∈(0,1) and a lower-limit ratio PL∈(0,1):
the values E(x,y;σ) of all pixels are taken in ascending order, the smallest PH fraction of them is selected, and the maximum of this subset is set as the upper-limit quantile QH;
the values E(x,y;σ) of all pixels are taken in ascending order, the smallest PL fraction of them is selected, and the maximum of this subset is set as the lower-limit quantile QL;
the truncated classical receptive field stimulus response is:
Ê(x,y;σ) = { QH, if E(x,y;σ)>QH; E(x,y;σ), if QL<E(x,y;σ)<QH; QL, if E(x,y;σ)<QL } (6).
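The quantile truncation of Eq. (6) can be sketched with NumPy; `np.quantile` supplies the QH and QL cut points, and `np.clip` applies the three-way case split in one call (function and parameter names are illustrative):

```python
import numpy as np

def truncate_response(E, p_l=0.1, p_h=0.9):
    """Clip E(x, y; sigma) to the [Q_L, Q_H] quantile range (Eq. 6)."""
    q_l = np.quantile(E, p_l)   # lower-limit quantile Q_L
    q_h = np.quantile(E, p_h)   # upper-limit quantile Q_H
    return np.clip(E, q_l, q_h)
```

Clipping rather than discarding keeps the response map dense, so the later weighted summation of the inhibition term stays defined at every pixel.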
4. The image contour detection method simulating human-eye micro-motion as claimed in claim 3, characterised in that:
the DoG template in step C is expressed as:
DoG(x,y;σ,k)=(1/(2π(kσ)²))·exp(−(x²+y²)/(2(kσ)²))−(1/(2πσ²))·exp(−(x²+y²)/(2σ²)) (7);
where k is the parameter controlling the size of the DoG template;
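Eq. (7) can be sampled directly on a grid; the default half-width of 3kσ follows the template range stated later in this claim (the function name and defaults are illustrative):

```python
import numpy as np

def dog_template(sigma=1.0, k=4.0, half=None):
    """Difference-of-Gaussians template of Eq. (7) on a square grid."""
    if half is None:
        half = int(np.ceil(3 * k * sigma))   # covers -3k*sigma .. 3k*sigma
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    r2 = x**2 + y**2
    outer = np.exp(-r2 / (2 * (k * sigma)**2)) / (2 * np.pi * (k * sigma)**2)
    inner = np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return outer - inner                     # negative centre, positive surround
```

Because each Gaussian integrates to roughly one, the template sums to approximately zero: it responds to surround structure, not to uniform brightness.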
The expression of the temporary centre-area response is as follows:
where d is the distance from the centre region to the temporary centre area, and φ is the deviation angle of the temporary centre area;
the integration and normalisation over each pixel proceed through the normalised weighting function ŵ, whose expression is as follows:
ŵ(x,y;d,φ)=H(w(x,y;d,φ))/‖H(w(x,y;d,φ))‖₁ (9);
where w(x,y;d,φ)=wm(x,y;d,φ)·DoG(x,y;σ,k), ‖·‖₁ is the L1 norm, and H(X) is the function that retains only positive values;
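The rectify-and-normalise step of Eq. (9) is a two-liner in NumPy; Eq. (10) below then amounts to correlating the truncated response Ê with this normalised mask. The function name is an illustrative assumption:

```python
import numpy as np

def normalized_weight(w):
    """Eq. (9): half-wave rectify w with H(X), then L1-normalise."""
    h = np.maximum(w, 0.0)      # H(X): keep positive values only
    s = np.abs(h).sum()         # L1 norm of the rectified mask
    return h / s if s > 0 else h
```

L1 normalisation makes the mask a weighting that sums to one, so the inhibition strength of Eq. (10) does not depend on the template size.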
The rough value of the non-classical receptive field stimulus response of each pixel at each deviation angle is calculated as follows:
Inhe(x,y;σ,φi)=Σx′Σy′Ê(x+x′,y+y′;σ)·ŵ(x′,y′;σ,φi) (10);
where Inhe(x,y;σ,φi) is the rough value of the non-classical receptive field stimulus response of the pixel at deviation angle φi;
−3kσ<x′<3kσ and −3kσ<y′<3kσ delimit the range of the DoG template;
φi denotes the several deviation angles;
The mean and the standard deviation of the rough non-classical receptive field stimulus response values over the deviation angles of each pixel are calculated as follows:
STDinh(x,y)=√(Σi{Inhe(x,y;σ,φi)−Aveinh(x,y)}²/Nφ) (11);
Aveinh(x,y)=ΣiInhe(x,y;σ,φi)/Nφ (12);
where STDinh(x,y) is the standard deviation, and Aveinh(x,y) the mean, of the rough non-classical receptive field stimulus response values of the pixel over all deviation angles.
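With the per-angle rough responses stacked along a leading axis, Eqs. (11)–(12) reduce to axis-wise NumPy reductions; note Eq. (11) divides by Nφ (population standard deviation). The array layout is an assumption for illustration:

```python
import numpy as np

def inhibition_stats(inh_e):
    """Mean (Eq. 12) and population std (Eq. 11) over the deviation-angle axis.

    inh_e: array of shape (N_phi, H, W), one slice per deviation angle phi_i.
    """
    ave = inh_e.mean(axis=0)                        # Eq. (12)
    std = np.sqrt(((inh_e - ave)**2).mean(axis=0))  # Eq. (11), divisor N_phi
    return ave, std
```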
5. The image contour detection method simulating human-eye micro-motion as claimed in claim 4, characterised in that:
the standard-deviation weight in step D is calculated as follows:
wstd(x,y;σ)=exp(−ŜTDinh(x,y)/σ)+fos (13);
ŜTDinh(x,y)=(STDinh(x,y)−min(STDinh(x,y)))/(max(STDinh(x,y))−min(STDinh(x,y))) (14);
where wstd(x,y;σ) is the standard-deviation weight and fos is the overall suppression parameter;
the final value of the non-classical receptive field stimulus response is calculated as:
Inh(x,y;σ)=Inhm(x,y;σ)·wstd(x,y;σ) (15);
Inhm(x,y;σ)=min{Inhe(x,y;σ,φi) | i=1,2,...,Nφ} (16);
where Inhm(x,y;σ) is the minimum of Inhe(x,y;σ,φi) over the deviation angles.
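Eqs. (13)–(16) combine a min–max normalisation, an exponential weighting, and a minimum over deviation angles, and can be sketched as follows (function names and default parameter values are assumptions, not fixed by the claims):

```python
import numpy as np

def std_weight(std_inh, sigma=2.4, f_os=0.1):
    """Eqs. (13)-(14): min-max normalise STD_inh, then exponential weighting."""
    rng = std_inh.max() - std_inh.min()
    s_hat = ((std_inh - std_inh.min()) / rng if rng > 0
             else np.zeros_like(std_inh))           # Eq. (14)
    return np.exp(-s_hat / sigma) + f_os            # Eq. (13)

def final_inhibition(inh_e, w_std):
    """Eqs. (15)-(16): minimum over deviation angles, scaled by the std weight.

    inh_e: array of shape (N_phi, H, W); w_std: array of shape (H, W).
    """
    return inh_e.min(axis=0) * w_std
```

Pixels whose surround response varies strongly with the deviation angle get a small weight, so suppression is relaxed exactly where the micro-motion simulation finds unstable (contour-like) structure.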
6. The image contour detection method simulating human-eye micro-motion as claimed in claim 5, characterised in that:
the resultant stimulus response in step E is calculated as follows:
R(x,y)=H(E(x,y;σ)−αInh(x,y;σ)) (17);
where R(x,y) is the resultant stimulus response of the pixel and α is the suppression coefficient.
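Eq. (17) is a half-wave rectified difference; in this sketch non-maximum suppression and binarisation (the last step of claim 1) would then be applied to the returned map. Function and parameter names are illustrative:

```python
import numpy as np

def contour_response(E, inh, alpha=1.0):
    """Eq. (17): CRF response minus weighted inhibition, rectified by H."""
    return np.maximum(E - alpha * inh, 0.0)
```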
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710230521.XA CN107067408B (en) | 2017-04-11 | 2017-04-11 | Image contour detection method for simulating human eye micromotion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107067408A true CN107067408A (en) | 2017-08-18 |
CN107067408B CN107067408B (en) | 2020-01-31 |
Family
ID=59601799
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710230521.XA Active CN107067408B (en) | 2017-04-11 | 2017-04-11 | Image contour detection method for simulating human eye micromotion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107067408B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4579191B2 (en) * | 2006-06-05 | 2010-11-10 | 本田技研工業株式会社 | Collision avoidance system, program and method for moving object |
US20150062000A1 (en) * | 2013-08-29 | 2015-03-05 | Seiko Epson Corporation | Head mounted display apparatus |
CN104484667A (en) * | 2014-12-30 | 2015-04-01 | 华中科技大学 | Contour extraction method based on brightness characteristic and contour integrity |
CN106033606A (en) * | 2015-07-24 | 2016-10-19 | 广西科技大学 | Target contour detection method of biomimetic smooth tracking eye movement information processing mechanism |
CN106033608A (en) * | 2015-07-24 | 2016-10-19 | 广西科技大学 | Target contour detection method of biomimetic smooth tracking eye movement information processing mechanism |
CN106127740A (en) * | 2016-06-16 | 2016-11-16 | 杭州电子科技大学 | A kind of profile testing method based on the association of visual pathway many orientation of sensory field |
CN106156779A (en) * | 2016-06-24 | 2016-11-23 | 清华大学深圳研究生院 | A kind of contour extraction of objects method in complex scene |
CN106338733A (en) * | 2016-09-09 | 2017-01-18 | 河海大学常州校区 | Forward-looking sonar object tracking method based on frog-eye visual characteristic |
Non-Patent Citations (4)
Title |
---|
CHUAN LIN 等: "Improved contour detection model with spatial summation properties based on nonclassical receptive field", 《JOURNAL OF ELECTRONIC IMAGING》 * |
COSMIN GRIGORESCU 等: "Improved Contour Detection by Non-classical Receptive Field Inhibition", 《BIOLOGICALLY MOTIVATED COMPUTER VISION》 * |
李康群 等: "基于视通路多感受野朝向性关联的轮廓检测方法", 《中国生物医学工程学报》 * |
林川 等: "考虑微动机制与感受野特性的轮廓检测模型", 《计算机工程与应用》 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111080663A (en) * | 2019-12-30 | 2020-04-28 | 广西科技大学 | Bionic contour detection method based on dynamic receptive field |
CN111080663B (en) * | 2019-12-30 | 2020-09-22 | 广西科技大学 | Bionic contour detection method based on dynamic receptive field |
CN111968139A (en) * | 2020-06-23 | 2020-11-20 | 广西科技大学 | Contour detection method based on primary vision cortex vision fixation micro-motion mechanism |
CN111968139B (en) * | 2020-06-23 | 2023-06-13 | 广西科技大学 | Contour detection method based on primary visual cortex vision fixation micro-motion mechanism |
Also Published As
Publication number | Publication date |
---|---|
CN107067408B (en) | 2020-01-31 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract | ||
Application publication date: 20170818 Assignee: HUALI FAMILY PRODUCTS CO.,LTD. Assignor: GUANGXI University OF SCIENCE AND TECHNOLOGY Contract record no.: X2023980054119 Denomination of invention: Image contour detection method for simulating human eye movement Granted publication date: 20200131 License type: Common License Record date: 20231226 |
OL01 | Intention to license declared |