CN103049902A - Image change detection method and device - Google Patents
Image change detection method and device
- Publication number
- CN103049902A CN103049902A CN2012104049164A CN201210404916A CN103049902A CN 103049902 A CN103049902 A CN 103049902A CN 2012104049164 A CN2012104049164 A CN 2012104049164A CN 201210404916 A CN201210404916 A CN 201210404916A CN 103049902 A CN103049902 A CN 103049902A
- Authority
- CN
- China
- Prior art keywords
- inhibiting factor
- image
- picture point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention discloses an image change detection method and device. The method comprises the following steps: two image points at the same position in two images are determined, and the edge tag (ET) information of the two image points, ET1 and ET2, is acquired; the cross-correlation coefficient of ET1 and ET2 and an inhibiting factor are calculated, so that the correlation coefficient with the inhibiting factor of ET1 and ET2 is obtained; the correlation coefficient with the inhibiting factor is compared with a preset change threshold; and whether the image points at the same position in the two images have changed is determined. With the image change detection method and device, when the ET method is used for change detection, the influence of illumination changes and of registration errors within a certain range can be effectively overcome.
Description
Technical field
The present invention relates to the technical field of image detection, and in particular to an image change detection method and device.
Background technology
For multi-temporal remote sensing images, factors such as illumination variation, sensor noise and registration noise produce insignificant changes in the images, and overcoming the influence of these meaningless changes is the difficult point of change detection. The literature of recent years discusses many change detection methods, most of which can be placed under the framework of Bayesian hypothesis testing: a constant-false-alarm-rate (CFAR) detector is applied to an image D(x, y) obtained by an algebraic operation on the two images. These methods usually assume that the elements of D(x, y) are independent and identically distributed, so they cannot capture the dependency structure inherent in images, nor can they overcome the influence of illumination change. Other methods take a correlation-based view and test the degree of linear dependence between the data in corresponding moving windows of the two images; for an unchanged region, even if the illumination of the two windows differs by a constant, the windows remain linearly dependent, so the influence of illumination change is overcome, but these methods cannot overcome the influence of registration noise.
Summary of the invention
The present invention proposes a method and device for detecting image changes based on edge tags, which can effectively overcome the influence of illumination changes and, to a certain extent, reduce the influence of registration errors.
The object of the invention is achieved through the following technical solutions:
A change detection method for an image comprises the following steps: determining two image points at the same position in two images, and obtaining the edge tag (ET) information of these two image points, ET1 and ET2; calculating the cross-correlation coefficient of ET1 and ET2 and an inhibiting factor, thereby obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2; and comparing the correlation coefficient with the inhibiting factor against a preset change threshold, to determine whether the image points at the same position in the two images have changed.
Wherein, assuming the image position is (x, y), the image point at this position in the first image is f1(x, y) and the image point at this position in the second image is f2(x, y), the detailed process of obtaining ET1 and ET2 is: describing the edge strength at the points f1(x, y) and f2(x, y) respectively by the gradient magnitude, giving p1(x, y) and p2(x, y); and taking the inner product of p1(x, y) and p2(x, y) respectively with Gabor functions of different directions, obtaining ET1 = (ET1_1, ET1_2, ..., ET1_n) and ET2 = (ET2_1, ET2_2, ..., ET2_n).
Wherein, the detailed process of obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2 is: calculating the cross-correlation coefficient ρ_xy(m) of ET1 and ET2, which is the cross-covariance function of ET1 and ET2 normalized by their auto-covariance functions; determining the inhibiting factor A, which is the normalized absolute value of the difference between the means of the two ETs at the point (x, y); and defining the correlation coefficient with the inhibiting factor as sρ_xy(m) = A(1 − |ρ_xy(m)|).
The inhibiting factor A should be chosen such that A tends to 0 when (x, y) lies where the edge strength difference is small, and A tends to 1 when (x, y) lies where the edge strength difference is large.
The process of determining whether the image points in the two images have changed is: when the correlation coefficient with the inhibiting factor is greater than the threshold, determining that the image at the same position in the two images has changed; otherwise, determining that it has not changed.
A change detection device for an image comprises: an ET information determining unit, configured to obtain the ET information of two image points at the same position in two images, ET1 and ET2; a correlation calculation unit, configured to calculate the cross-correlation coefficient of ET1 and ET2 and the inhibiting factor, thereby obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2; and a judging unit, configured to compare the correlation coefficient with the inhibiting factor against a preset change threshold and determine whether the image points at the same position in the two images have changed.
Wherein, assuming the image position is (x, y), the image point at this position in the first image is f1(x, y) and the image point at this position in the second image is f2(x, y), the ET information determining unit comprises: an edge strength calculation subunit, configured to describe the edge strength at the points f1(x, y) and f2(x, y) respectively by the gradient magnitude, giving p1(x, y) and p2(x, y); and an ET information acquisition subunit, configured to take the inner product of p1(x, y) and p2(x, y) respectively with Gabor functions of different directions, to obtain ET1 and ET2.
Wherein, the correlation calculation unit further comprises: a cross-correlation coefficient calculation subunit, configured to calculate the cross-correlation coefficient of ET1 and ET2; an inhibiting factor determination subunit, configured to determine the inhibiting factor A; and a correlation coefficient determination subunit, configured to determine the correlation coefficient with the inhibiting factor from the cross-correlation coefficient and the inhibiting factor.
Wherein, the inhibiting factor determination subunit chooses the inhibiting factor A such that A tends to 0 when (x, y) lies where the edge strength difference is small, and A tends to 1 when (x, y) lies where the edge strength difference is large.
Wherein, the judging unit determines whether the image points in the two images have changed as follows: when the correlation coefficient with the inhibiting factor is greater than the threshold, it determines that the image at the same position of the two images has changed; otherwise, it determines that it has not changed.
The technical solution of the present invention has the following technical effects:
When the edge tag method is used for change detection, the influence of illumination changes and of registration errors within a certain range can be effectively overcome. Compared with other detection methods, it has a better detection effect and is suitable for detecting changes of structural information in remote sensing images. The edge tag method is a description of image structure information; it can be used not only for change detection, but also has application prospects in edge detection and corner detection.
Description of drawings
Fig. 1 shows ET vector curves corresponding to edges of different directions according to the present invention;
Fig. 2 shows ET vector curves illustrating the robustness of ET against registration noise according to the present invention;
Fig. 3 shows ET vector curves illustrating the robustness of ET against illumination change according to the present invention;
Fig. 4 is a flowchart of the method of the present invention for detecting image changes using ET information;
Fig. 5 is a schematic diagram of the device of the present invention for detecting image changes using ET information.
Embodiment
Factors such as variations in illumination intensity, shadows and seasons produce differences between images; this class of difference mainly manifests as region interiors whose pixel values differ while the region boundaries in the two images do not change. Another class of difference is caused by changes of the position, shape or scale of objects in the scene; the notable feature of this class is that structural information such as the edges and texture of the corresponding regions in the images has changed, for example through the appearance, disappearance, movement or deformation of objects.
Starting from principles of biological vision, the present invention uses multi-directional Gabor functions to extract the structural information of an image, referred to as an edge tag (Edge Tag, abbreviated ET), and applies it to the detection of structural changes. The detection process is: first extract the ET features in the two images; then measure the similarity between the ETs with the correlation coefficient with the inhibiting factor; and finally determine the changes of the image according to the similarity measure.
It should be noted that the edge tag method is a description of image structure information and can be used not only for change detection but also in edge detection and corner detection. For convenience of description, the present invention is described taking image change detection as an example.
First, the edge tag information is described.
Neuropsychological studies show that the receptive field of neurons in the visual system has a ring-shaped model for light structure: when stimulated by light, the central area of the receptive field is in an excitatory state and the surrounding area is in an inhibitory state. In real life, when a person fixates on a region, the core of the region is clear while the margin is blurred, which shows that this model conforms to the rules of human visual psychology. The two-dimensional Gabor function can model the receptive-field characteristics of a neuron well. The even-symmetric Gabor function is a two-dimensional Gaussian envelope modulated by a cosine wave, where i denotes the i-th division point when the circle is divided into n equal parts, θ_i denotes the fluctuation direction of the wave, λ denotes the scale, μ_x and μ_y are the means of the Gaussian function and describe the center of the Gabor function, the variances of the Gaussian function describe the size of the main lobe in the x and y directions, and v denotes the fluctuation frequency. When the fluctuation direction θ_i takes the n division points on the circle in turn, Gabor functions of n different directions are obtained.
Studies of the mammalian visual system show that optic neurons have directional selectivity to light stimuli: a small area of the visual cortex contains many optic neurons, a single neuron responds strongly to light stimuli of a specific direction and only weakly to other directions, and this directional selectivity can be represented by Gabor functions of different directions. The response of the neuron of the i-th direction to the image is represented by the Gabor function of that direction, where n denotes the number of equal divisions of the circle. The response of a neuron to the image is expressed as an inner product; for an image, the two-dimensional discrete inner product is taken over a rectangle of height 2·l and width 2·m centered at (μ_x, μ_y).
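The two-dimensional discrete inner product described above can be sketched as follows; the window half-sizes l and m are taken here from the kernel shape, and the boundary handling (the window must lie inside the image) is an assumption.

```python
import numpy as np

def neuron_response(image, kernel, cx, cy):
    """Response of one direction-selective 'neuron': the 2-D discrete inner
    product of a Gabor kernel with the image window centred at (cx, cy)."""
    kh, kw = kernel.shape
    l, m = kh // 2, kw // 2
    patch = image[cy - l:cy + l + 1, cx - m:cx + m + 1]
    return float(np.sum(patch * kernel))
```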
Neurons with different direction selectivities combine to form the receptive field for light; the visual system uses the responses of the receptive field to light to describe the content of the visual field, and this receptive-field model is called a hypercolumn. Combining the responses of the single neurons to light, the mathematical model of a receptive field containing n neurons is:
T = (T1, T2, ..., Tn) (4)
T represents the feature with which the neurons of different directions jointly describe the image, and is referred to as the tag at the image point (x, y).
The concept of the tag can be used not only to extract features from a gray-scale image, but also from the edge intensity map. Edges in an image represent places of gray-level transition, contain abundant high-frequency components, and provide a recognition system with salient information distinct from the background. For an image f(x, y), the edge strength at the point (x, y) is described by the gradient magnitude and denoted p(x, y). The derivative operation makes the values of smooth regions in the edge strength map tend to 0, so the edge strength map only reflects changes of gray level, which eliminates the influence of overall differences of illumination intensity between the two images. Following the principle by which biological systems extract features, Tag information is extracted in p(x, y) by taking the inner product of the Gabor functions of different directions with the same edge-strength image region. The vector ET = (ET_1, ET_2, ..., ET_n) formed by the components ET_i is called the edge tag (Edge Tag, ET). It describes the edge structure information at the point (x, y) of the image, and ET has the following characteristics:
ET contains the direction and intensity information of the edge.
The direction defined here is the direction perpendicular to the fluctuation direction of the two-dimensional Gabor function, taken as the dominant direction. If the center of the Gabor function lies on an edge, then, because of the rotation of the Gabor function, the dominant direction corresponding to the maximum component of the ET vector necessarily coincides with the direction of the edge; that is, the position of the maximum component in the vector indicates the direction of the edge, and the size of the component indicates its relative intensity. Fig. 1 shows the ET vector curves corresponding to edges of different directions; as can be seen from the figure, the angle corresponding to the maximum value of the ET vector indicates the main edge direction.
ET is robust to registration noise. When calculating the ET vector, the fluctuation direction of the Gabor function takes only n discrete values spaced equally on the circle. When the dominant direction of the Gabor function varies within the neighborhood [−π/n, π/n] of the edge direction, the component ET_i produced at this time is still the maximum with respect to the other components, so the position of the maximum value in the vector does not change. ET is therefore robust to rotation errors caused by mis-registration within the range [−π/n, π/n], and its performance depends on the size of n. Fig. 2 shows that when the edge direction has a small offset between the two images, the angle corresponding to the maximum of the ET vector remains unchanged.
ET is little affected by differences of illumination intensity. A difference of illumination between two images manifests as a constant offset of gray level; the constant term is eliminated in the edge image obtained by the gradient derivative operation, so ET substantially overcomes the influence of illumination change. Fig. 3 shows the variation of ET under two different illumination conditions: even when the illumination of the two images changes greatly, the two ETs still have a strong similarity.
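A compact sketch of the ET extraction described above follows, combining the Sobel-based edge strength p(x, y) mentioned later in the description with the directional Gabor inner products (it reuses the gabor_bank and neuron_response sketches above). It is illustrative only, not the patent's reference implementation.

```python
import numpy as np
from scipy import ndimage

def edge_strength(image):
    """Edge intensity p(x, y): the gradient magnitude, computed here with the
    Sobel operator, as the description suggests for noise robustness."""
    img = image.astype(float)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return np.hypot(gx, gy)

def edge_tag(p, kernels, x, y):
    """Edge Tag at (x, y): inner products of the edge-strength window with the
    n directional Gabor kernels, giving ET = (ET_1, ..., ET_n)."""
    return np.array([neuron_response(p, k, x, y) for k in kernels])
```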
The following describes how the ET information is used to detect image changes.
ET is a description of the edge structure information of an image; by comparing the similarity between ET vectors, the change of structure in two images can be determined. Let the ETs at the point (x, y) of the two images be ET1(x, y) and ET2(x, y) respectively; their cross-correlation coefficient ρ_xy(m) is the cross-covariance function of the two vectors normalized by their auto-covariance functions. When no structural change occurs at the corresponding point of the images, the gradient magnitudes are linearly related and the two ET vectors are linearly dependent, so ρ_xy(0) = 1; when a structural change occurs, the gradient magnitudes differ greatly and there is no linear dependence between the vectors, so ρ_xy(0) = 0. Whether a structural change has occurred can therefore be determined from the correlation coefficient.
In the two edge-strength images, when the numerical difference between the ET vectors extracted at the corresponding point (x, y) is small, the point should be considered unchanged; but when coherent detection is used, these two ETs may nevertheless be uncorrelated, causing a false alarm. To avoid the false alarms in the change detection result caused by ET vectors with small differences, the non-correlation between vectors with small differences must be suppressed, and the correlation coefficient with the inhibiting factor is defined as:
sρ_xy(m) = A(1 − |ρ_xy(m)|) (8)
where A is the inhibiting factor, which should be chosen such that A → 0 when (x, y) lies where the edge strength difference is small, and A → 1 when (x, y) lies where the edge strength difference is large. A is here taken to be the normalized difference of the means. From this form of A it can be seen that only a point with a large difference of mean strength has an inhibiting factor close to 1; when the two ETs both represent a strong edge or both represent a weak edge, A tends to 0, which suppresses the non-correlation of ETs with small differences.
Whether a change has occurred at the point (x, y) can be judged by whether sρ_xy(m) approaches 1. In practical applications, because of the influence of the inhibiting factor, sρ_xy(m) in changed regions is usually less than 1, so the present invention takes the empirical value 0.5 as the threshold, and the final decision is: the point is declared changed when sρ_xy(m) exceeds the threshold, and unchanged otherwise.
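The correlation coefficient with the inhibiting factor and the 0.5-threshold decision can be sketched as follows (reusing cross_correlation from the sketch above). The normalisation of A by the larger of the two ET means is an assumption; the description only states that A is the normalised difference of the means.

```python
import numpy as np

def suppressed_correlation(et1, et2, eps=1e-12):
    """Correlation coefficient with the inhibiting factor:
    s_rho = A * (1 - |rho|), with A the normalised absolute difference of the
    two ET means (normalisation by the larger mean is assumed here)."""
    rho = cross_correlation(et1, et2)
    mu1, mu2 = float(np.mean(et1)), float(np.mean(et2))
    A = abs(mu1 - mu2) / (max(abs(mu1), abs(mu2)) + eps)  # assumed normalisation
    return A * (1.0 - abs(rho))

def has_changed(et1, et2, threshold=0.5):
    """Decision rule: declare the point changed when s_rho exceeds the
    empirical threshold 0.5."""
    return suppressed_correlation(et1, et2) > threshold
```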
To overcome the influence of noise, the Sobel operator is used here to compute the gradient.
Fig. 4 is a flowchart of the method provided by the present invention for detecting image changes based on ET information.
S401: determine two image points at the same position in two images, and obtain the ET information of these two image points: ET1, ET2.
Assume the image position is (x, y), the image point in the first image is f1(x, y) and the image point in the second image is f2(x, y). The detailed process of obtaining ET1 and ET2 is:
A. Describe the edge strength at the points f1(x, y) and f2(x, y) respectively by the gradient magnitude, giving p1(x, y) and p2(x, y).
B. Take the inner product of p1(x, y) and p2(x, y) respectively with the Gabor functions of different directions, obtaining ET1 = (ET1_1, ET1_2, ..., ET1_n) and ET2 = (ET2_1, ET2_2, ..., ET2_n).
S402: calculate the cross-correlation coefficient of ET1 and ET2 and the inhibiting factor, thereby obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2.
The detailed process of step S402 is:
A. Calculate the cross-correlation coefficient ρ_xy(m) of ET1 and ET2.
B. Determine the inhibiting factor A, which should be chosen such that A → 0 when (x, y) lies where the edge strength difference is small, and A → 1 when (x, y) lies where the edge strength difference is large.
C. Define the correlation coefficient with the inhibiting factor as sρ_xy(m) = A(1 − |ρ_xy(m)|).
S403: compare the correlation coefficient with the inhibiting factor against the preset change threshold, and determine whether the image points in the two images have changed.
For example, the decision process of step S403 is: the change threshold is set to the empirical value 0.5; when sρ_xy(m) is greater than 0.5, it is determined that the image at the same point of the two images has changed; otherwise, it is determined that it has not changed.
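Putting the sketches above together, an end-to-end illustration of steps S401-S403 for a single image position might look like this; all helper names come from the earlier sketches and are assumptions, not the patent's implementation.

```python
def detect_change_at(img1, img2, x, y, kernels, threshold=0.5):
    """End-to-end sketch of S401-S403 for one position (x, y)."""
    p1, p2 = edge_strength(img1), edge_strength(img2)   # S401: edge strengths
    et1 = edge_tag(p1, kernels, x, y)                    # S401: ET1
    et2 = edge_tag(p2, kernels, x, y)                    # S401: ET2
    return has_changed(et1, et2, threshold)              # S402-S403: decision
```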
Corresponding to the above method, the present invention also provides a device for detecting image changes based on ET information; in practice the device may be implemented in hardware, in software, or in a combination of hardware and software. Referring to Fig. 5, the device comprises an ET information determining unit 501, a correlation calculation unit 502 and a judging unit 503. Wherein,
the ET information determining unit 501 is configured to obtain the ET information of two image points at the same position in two images: ET1, ET2;
the correlation calculation unit 502 is connected to the ET information determining unit 501 and, from the ET information ET1, ET2 provided by the ET information determining unit 501, calculates the cross-correlation coefficient of ET1 and ET2 and the inhibiting factor, thereby obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2;
the judging unit 503 is connected to the correlation calculation unit 502 and is configured to compare the correlation coefficient with the inhibiting factor provided by the correlation calculation unit 502 against the preset change threshold, and to determine whether the image points at the same position in the two images have changed.
Wherein, assuming the image position is (x, y), the image point at this position in the first image is f1(x, y) and the image point at this position in the second image is f2(x, y), the ET information determining unit 501 comprises:
an edge strength calculation subunit 5011, configured to describe the edge strength at the points f1(x, y) and f2(x, y) respectively by the gradient magnitude, giving p1(x, y) and p2(x, y); and
an ET information calculation subunit 5012, configured to take the inner product of p1(x, y) and p2(x, y) respectively with the Gabor functions of different directions, to obtain ET1 and ET2.
The correlation calculation unit 502 further comprises:
a cross-correlation coefficient calculation subunit 5021, configured to calculate the cross-correlation coefficient of ET1 and ET2;
an inhibiting factor determination subunit 5022, configured to determine the inhibiting factor A; and
a correlation coefficient determination subunit 5023, configured to determine the correlation coefficient with the inhibiting factor from the cross-correlation coefficient and the inhibiting factor.
Wherein, the inhibiting factor determination subunit 5022 chooses the inhibiting factor A such that A tends to 0 when (x, y) lies where the edge strength difference is small, and A tends to 1 when (x, y) lies where the edge strength difference is large.
The judging unit 503 determines whether the image points in the two images have changed as follows: when the correlation coefficient with the inhibiting factor is greater than the threshold, it determines that the image at the same position of the two images has changed; otherwise, it determines that it has not changed.
For the detailed operating principles, formulas and processes of each unit of the device, reference may be made to the description in the above method embodiment; they are not repeated here.
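As a purely illustrative software mirror of the device of Fig. 5, the unit/subunit decomposition could be arranged as a class like the following; the class name, method names and threshold default are assumptions, and the helper functions are those sketched in the method embodiment above.

```python
class ChangeDetector:
    """Illustrative software mirror of the device of Fig. 5: an ET information
    unit (501), a correlation unit (502) and a judging unit (503)."""

    def __init__(self, kernels, threshold=0.5):
        self.kernels = kernels
        self.threshold = threshold

    def et_info(self, img1, img2, x, y):            # unit 501
        p1, p2 = edge_strength(img1), edge_strength(img2)
        return edge_tag(p1, self.kernels, x, y), edge_tag(p2, self.kernels, x, y)

    def correlation(self, et1, et2):                # unit 502
        return suppressed_correlation(et1, et2)

    def judge(self, s_rho):                         # unit 503
        return s_rho > self.threshold
```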
With the present invention, when the edge tag method is used for change detection, the influence of illumination changes and of registration errors within a certain range can be effectively overcome. Compared with other detection methods, it has a better detection effect and is suitable for detecting changes of structural information in remote sensing images. The edge tag method is a description of image structure information; it can be used not only for change detection, but also has application prospects in edge detection and corner detection.
The foregoing specific description is provided to explain and illustrate the present invention. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The above embodiments were chosen to best explain the principles of the invention and its practical application, so that those skilled in the art can best utilize the invention in different embodiments and with different modifications suited to the particular intended use. The scope of the invention is defined by the appended claims.
Claims (10)
1. A change detection method for an image, characterized by comprising the following steps:
determining two image points at the same position in two images, and obtaining the edge tag ET information of these two image points: ET1, ET2;
calculating the cross-correlation coefficient of ET1 and ET2 and an inhibiting factor, thereby obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2; and
comparing the correlation coefficient with the inhibiting factor against a preset change threshold, and determining whether the image points at the same position in the two images have changed.
2. The method according to claim 1, characterized in that, assuming the image position is (x, y), the image point at this position in the first image is f1(x, y) and the image point at this position in the second image is f2(x, y), the detailed process of obtaining ET1 and ET2 is:
A. describing the edge strength at the points f1(x, y) and f2(x, y) respectively by the gradient magnitude, giving p1(x, y) and p2(x, y);
B. taking the inner product of p1(x, y) and p2(x, y) respectively with Gabor functions of different directions, obtaining ET1 = (ET1_1, ET1_2, ..., ET1_n) and ET2 = (ET2_1, ET2_2, ..., ET2_n).
3. The method according to claim 1, characterized in that the detailed process of obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2 is:
A. calculating the cross-correlation coefficient ρ_xy(m) of ET1 and ET2, which is the cross-covariance function of ET1 and ET2 normalized by their auto-covariance functions;
B. determining the inhibiting factor A, which is the normalized absolute value of the difference between the means of the two ETs at the point (x, y);
C. defining the correlation coefficient with the inhibiting factor as sρ_xy(m) = A(1 − |ρ_xy(m)|).
4. The method according to claim 3, characterized in that the inhibiting factor A is chosen such that A tends to 0 when (x, y) lies where the edge strength difference is small, and A tends to 1 when (x, y) lies where the edge strength difference is large.
5. The method according to claim 1, characterized in that the process of determining whether the image points in the two images have changed is:
when the correlation coefficient with the inhibiting factor is greater than the threshold, determining that the image at the same position of the two images has changed; otherwise, determining that it has not changed.
6. A change detection device for an image, characterized by comprising:
an ET information determining unit, configured to obtain the ET information of two image points at the same position in two images: ET1, ET2;
a correlation calculation unit, configured to calculate the cross-correlation coefficient of ET1 and ET2 and the inhibiting factor, thereby obtaining the correlation coefficient with the inhibiting factor of ET1 and ET2; and
a judging unit, configured to compare the correlation coefficient with the inhibiting factor against a preset change threshold, and to determine whether the image points at the same position in the two images have changed.
7. The device according to claim 6, characterized in that, assuming the image position is (x, y), the image point at this position in the first image is f1(x, y) and the image point at this position in the second image is f2(x, y), the ET information determining unit comprises:
an edge strength calculation subunit, configured to describe the edge strength at the points f1(x, y) and f2(x, y) respectively by the gradient magnitude, giving p1(x, y) and p2(x, y); and
an ET information calculation subunit, configured to take the inner product of p1(x, y) and p2(x, y) respectively with Gabor functions of different directions, to obtain ET1 and ET2.
8. The device according to claim 6, characterized in that the correlation calculation unit further comprises:
a cross-correlation coefficient calculation subunit, configured to calculate the cross-correlation coefficient of ET1 and ET2;
an inhibiting factor determination subunit, configured to determine the inhibiting factor A; and
a correlation coefficient determination subunit, configured to determine the correlation coefficient with the inhibiting factor from the cross-correlation coefficient and the inhibiting factor.
9. The device according to claim 8, characterized in that the inhibiting factor determination subunit chooses the inhibiting factor A such that A tends to 0 when (x, y) lies where the edge strength difference is small, and A tends to 1 when (x, y) lies where the edge strength difference is large.
10. The device according to claim 6, characterized in that the judging unit determines whether the image points in the two images have changed as follows: when the correlation coefficient with the inhibiting factor is greater than the threshold, it determines that the image at the same position of the two images has changed; otherwise, it determines that it has not changed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2012104049164A CN103049902A (en) | 2012-10-23 | 2012-10-23 | Image change detection method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2012104049164A CN103049902A (en) | 2012-10-23 | 2012-10-23 | Image change detection method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103049902A true CN103049902A (en) | 2013-04-17 |
Family
ID=48062531
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2012104049164A Pending CN103049902A (en) | 2012-10-23 | 2012-10-23 | Image change detection method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103049902A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103218823A (en) * | 2013-05-08 | 2013-07-24 | 西安电子科技大学 | Remote sensing image change detection method based on nuclear transmission |
CN113298755A (en) * | 2021-04-13 | 2021-08-24 | 生态环境部卫星环境应用中心 | Method and device for rapidly detecting ecological environment change patch based on time sequence image |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030190075A1 (en) * | 2001-10-31 | 2003-10-09 | Infowrap, Inc. | Method for illumination independent change detection in a pair of registered gray images |
CN102169545A (en) * | 2011-04-25 | 2011-08-31 | 中国科学院自动化研究所 | Detection method for changes of high-resolution remote sensing images |
CN102169584A (en) * | 2011-05-28 | 2011-08-31 | 西安电子科技大学 | Remote sensing image change detection method based on watershed and treelet algorithms |
CN102629380A (en) * | 2012-03-03 | 2012-08-08 | 西安电子科技大学 | Remote sensing image change detection method based on multi-group filtering and dimension reduction |
CN102629378A (en) * | 2012-03-01 | 2012-08-08 | 西安电子科技大学 | Remote sensing image change detection method based on multi-feature fusion |
Non-Patent Citations (1)
Title |
---|
CHEN TAO et al., "Image Structure Information Change Detection Method Based on Biological Vision Principles", Journal of National University of Defense Technology *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103218823A (en) * | 2013-05-08 | 2013-07-24 | 西安电子科技大学 | Remote sensing image change detection method based on nuclear transmission |
CN103218823B (en) * | 2013-05-08 | 2016-04-13 | 西安电子科技大学 | Based on the method for detecting change of remote sensing image that core is propagated |
CN113298755A (en) * | 2021-04-13 | 2021-08-24 | 生态环境部卫星环境应用中心 | Method and device for rapidly detecting ecological environment change patch based on time sequence image |
CN113298755B (en) * | 2021-04-13 | 2021-11-26 | 生态环境部卫星环境应用中心 | Method and device for rapidly detecting ecological environment change patch based on time sequence image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20130417 |