CN105631849B - Change detection method and device for a target polygon - Google Patents

Change detection method and device for a target polygon

Info

Publication number
CN105631849B
Authority
CN
China
Prior art keywords
target
boundary
gradient
area
extracted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410638618.0A
Other languages
Chinese (zh)
Other versions
CN105631849A (en)
Inventor
刘明超
李翔翔
汪红强
王剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Space Star Technology Co Ltd
Original Assignee
Space Star Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Space Star Technology Co Ltd filed Critical Space Star Technology Co Ltd
Priority to CN201410638618.0A
Publication of CN105631849A
Application granted
Publication of CN105631849B
Active legal status
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a change detection method and device for a target polygon. The method includes: under the map coordinate system of the target, building a boundary buffer and a whole-target buffer of the target from the target's GIS data; extracting the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer; based on the extracted pixel sets, extracting from the gradient image of the target the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer; based on the extracted gradient sets, computing the boundary significance value of the target using a z-test; and, if the computed boundary significance value is below a set boundary significance threshold, judging that the target has changed. The method and device provided by the invention are simple and efficient and offer high detection precision, accuracy and a high degree of automation.

Description

Change detection method and device for a target polygon
Technical field
The present invention relates to the field of high-resolution remote sensing, and in particular to a change detection method and device for a target polygon.
Background art
With the construction of smart cities and the digital Earth, China has built up a large number of basic and thematic spatial data sets. These data play an important role in many fields, such as land-resource surveys, urban planning, disaster forecasting and damage assessment. Remote sensing satellite observation has a short revisit cycle and a wide observation range and can conveniently acquire continuous time series of data over large areas. Using remote sensing data to detect changes in, and update, spatial data sets therefore has important practical significance and broad application prospects.
Remote sensing change detection originated in the 1960s, and current research mainly concentrates on change detection with multi-temporal remote sensing images. The usual approach derives change information from two or more images, either directly from the pixel spectra or from extracted feature parameters such as vegetation indices, texture parameters, edge features and principal components (PCA), using methods such as differencing, ratioing and correlation analysis. At present, change detection with remote sensing imagery is mostly based only on the spectral features of ground objects, and detection performance degrades when the same ground object shows different spectra in different images.
GIS data carries rich semantic information; it is the symbolic representation of ground objects after interpretation. Using GIS data as prior knowledge and combining it with remote sensing imagery for target change detection avoids the difficulty of recognizing targets in the image and localizes the region of interest precisely on the target itself, which can greatly improve detection accuracy.
Zhang Xiaodong (2005) combined GIS data and remote sensing data and proposed an adaptive change-threshold determination method based on the polygon-area filling rate, together with a global iterative solution based on regional features. Wu Xiaoyan et al. (2010) combined GIS data with remote sensing imagery for road extraction and change detection, improving the degree of automation of road-network updates. Xu Wenxiang (2011) analysed the geometric change types of point, line and polygon elements in combination with remote sensing imagery and proposed vector-element change detection based on spatial feature codes. Huang Jun et al. (2012) combined high-resolution remote sensing imagery with GIS data and used features of polygon patches to detect changes in land-use types. A 2013 study accounted for the deformation of identical ground objects such as buildings and improved the correlation-index change detection method, raising detection precision.
The above methods for updating GIS data with remote sensing data currently have the following problems: (1) they all first extract the targets in the image by classification and only then perform change detection; target recognition algorithms are very complex and perform poorly, which seriously affects the efficiency and accuracy of change detection; (2) in high-resolution imagery, projection displacement caused by non-orthographic imaging is pronounced, and conventional image registration works on the whole image, so the deviation for individual targets remains large; (3) they consider only the strength of the target edge and ignore changes in the target's surroundings, so their generality is poor.
Summary of the invention
In view of this, to overcome at least one of the above disadvantages and to provide at least one of the following advantages, the present invention discloses a change detection method and device for a target polygon.
To solve the above technical problems, the present invention adopts the following technical solution. A change detection method for a target polygon comprises:
under the map coordinate system of the target, building a boundary buffer and a whole-target buffer of the target from the target's GIS data;
extracting the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer;
based on the extracted pixel sets, extracting from the gradient image of the target the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer;
based on the extracted gradient sets, computing the boundary significance value of the target using a z-test; and
if the computed boundary significance value is below a set boundary significance threshold, judging that the target has changed.
In the change detection method for a target polygon described above, before the boundary buffer and the whole-target buffer of the target are built from the target's GIS data, the method further comprises: converting the longitude/latitude coordinates of the target's GIS data into geographic coordinates according to a geodetic coordinate system, and then converting the geographic coordinates into map coordinates according to a projected coordinate system; and
reading the remote sensing image of the target and processing it with a Sobel operator, a Roberts operator or a Laplacian operator to obtain the gradient image.
In the change detection method for a target polygon described above, building the boundary buffer and the whole-target buffer of the target from the target's GIS data comprises:
establishing the boundary buffer around the boundary of the target in the GIS data, with a radius equal to the width of one pixel; and
determining a radius from the area of the target in the GIS data and building the whole-target buffer, the radius being one tenth of that area.
In the change detection method for a target polygon described above, extracting the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer comprises:
extracting the pixel set of the boundary buffer based on formula (2):
PC1 = {(x, y) | (x, y) ∈ buff1}  (2)
where buff1 is the boundary buffer; and
extracting the pixel set of the non-boundary part of the whole-target buffer based on formula (3):
PC2 = {(x, y) | (x, y) ∈ buff2, (x, y) ∉ buff1}  (3)
where buff2 is the whole-target buffer.
In the change detection method for a target polygon described above, extracting, based on the extracted pixel sets, the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target comprises:
extracting, based on the extracted pixel set PC1 and formula (4), the boundary-point gradient set of the boundary buffer from the gradient image of the target:
GC1 = {g(x, y) | (x, y) ∈ PC1}  (4)
and extracting, based on the extracted pixel set PC2 and formula (5), the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target:
GC2 = {g(x, y) | (x, y) ∈ PC2}  (5)
where g(x, y) is the image gradient at point (x, y).
In the change detection method for a target polygon described above, computing the boundary significance value of the target using a z-test based on the extracted gradient sets comprises:
computing the boundary significance value of the target based on formula (6):
Z = (μ1 − μ2) / √(S1²/n1 + S2²/n2)  (6)
where Z is the boundary significance value of the target, n1 is the number of elements of the boundary-point gradient set GC1 of the boundary buffer, n2 is the number of elements of the non-boundary-point gradient set GC2 of the whole-target buffer, μ1 and μ2 are the means of GC1 and GC2, and S1 and S2 are the standard deviations of GC1 and GC2, respectively;
translating the target based on formula (7):
(x', y') = (x + i, y + j)  (7)
where (x', y') is the coordinate of point (x, y) after translation and (i, j) is the translation vector; and
computing the boundary significance value after each translation: the target is translated pixel by pixel within a certain range, the boundary significance value of the translated target is computed for each translation to obtain the set of boundary significance values over that range, and the maximum value of that set is taken as the final boundary significance value to be compared with the boundary significance threshold, the range being determined by the possible offset of the target.
In the change detection method for a target polygon described above, the boundary significance threshold is set according to image quality and the clarity of the target.
To solve the above technical problems, the present invention also adopts the following technical solution. A change detection device for a target polygon comprises:
a buffer construction module, configured to build, under the map coordinate system of the target, a boundary buffer and a whole-target buffer of the target from the target's GIS data;
a pixel set extraction module, configured to extract the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer;
a gradient set extraction module, configured to extract, based on the extracted pixel sets, the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target;
a computation module, configured to compute the boundary significance value of the target using a z-test based on the extracted gradient sets; and
a judgment module, configured to judge that the target has changed if the computed boundary significance value is below a set threshold.
In the change detection device for a target polygon described above, the device further comprises: a coordinate conversion module, configured to convert the longitude/latitude coordinates of the target's GIS data into geographic coordinates according to a geodetic coordinate system and then convert the geographic coordinates into map coordinates according to a projected coordinate system; and
an image processing module, configured to read the remote sensing image of the target and process it with a Sobel operator, a Roberts operator or a Laplacian operator to obtain the gradient image.
In the change detection device for a target polygon described above, the buffer construction module is specifically configured to establish the boundary buffer around the boundary of the target in the GIS data, with a radius equal to the width of one pixel; and
to determine a radius from the area of the target in the GIS data and build the whole-target buffer, the radius being one tenth of that area.
In the change detection device for a target polygon described above, the pixel set extraction module is specifically configured to extract the pixel set of the boundary buffer based on formula (2):
PC1 = {(x, y) | (x, y) ∈ buff1}  (2)
where buff1 is the boundary buffer; and
to extract the pixel set of the non-boundary part of the whole-target buffer based on formula (3):
PC2 = {(x, y) | (x, y) ∈ buff2, (x, y) ∉ buff1}  (3)
where buff2 is the whole-target buffer.
In the change detection device for a target polygon described above, the gradient set extraction module is specifically configured to extract, based on the extracted pixel set PC1 and formula (4), the boundary-point gradient set of the boundary buffer from the gradient image of the target:
GC1 = {g(x, y) | (x, y) ∈ PC1}  (4)
and, based on the extracted pixel set PC2 and formula (5), the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target:
GC2 = {g(x, y) | (x, y) ∈ PC2}  (5)
where g(x, y) is the image gradient at point (x, y).
In the change detection device for a target polygon described above, the computation module comprises:
a boundary significance computation unit, configured to compute the boundary significance value of the target based on formula (6):
Z = (μ1 − μ2) / √(S1²/n1 + S2²/n2)  (6)
where Z is the boundary significance value of the target, n1 is the number of elements of the boundary-point gradient set GC1 of the boundary buffer, n2 is the number of elements of the non-boundary-point gradient set GC2 of the whole-target buffer, μ1 and μ2 are the means of GC1 and GC2, and S1 and S2 are the standard deviations of GC1 and GC2, respectively;
a translation unit, configured to translate the target based on formula (7):
(x', y') = (x + i, y + j)  (7)
where (x', y') is the coordinate of point (x, y) after translation and (i, j) is the translation vector; and
a computation unit, configured to compute the boundary significance value after each translation: the target is translated pixel by pixel within a certain range, the boundary significance value of the translated target is computed for each translation to obtain the set of boundary significance values over that range, and the maximum value of that set is taken as the final boundary significance value to be compared with the boundary significance threshold, the range being determined by the possible offset of the target, and the boundary significance threshold being set according to image quality and the clarity of the target.
By adopting the above technical solution, the present invention achieves the following beneficial effects. The method and device provided by the present invention work mainly on area features based on GIS data and compare the target against the background, so no absolute threshold needs to be set, no complex target recognition is required, and the imaging conditions of the image do not affect the algorithm, which greatly simplifies parameter setting and makes the method simple and efficient. Meanwhile, the method eliminates the deformation caused by non-orthographic projection when ground objects are imaged, which greatly improves detection precision and accuracy, and is highly practical and highly automated.
Description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below relate only to some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from the content of the embodiments and these drawings without creative effort.
Fig. 1 is a flowchart of the change detection method for a target polygon provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the GIS data of a target polygon provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the boundary buffer of a target polygon provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the whole-target buffer of a target polygon provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the non-boundary region of a target polygon provided by an embodiment of the present invention; and
Fig. 6 is a block diagram of the change detection device for a target polygon provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the technical problem solved by the present invention, the technical solution adopted and the technical effect achieved clearer, the technical solution of the embodiments of the present invention is described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The technical solution of the present invention is further described below with reference to the accompanying drawings and specific embodiments.
In the present invention, a high-resolution remote sensing image is an image whose resolution is high relative to the spatial size of the target: the target is made up of a certain number of pixels in the image, and both the target boundary and the target interior (the non-boundary part of the target) contain at least 30 pixels.
The boundary significance analysis of the present invention is the core algorithm for judging whether a target has changed. The inventors observe that the grey levels of a target and of the background differ in the image, while the variation within the target itself and within the background itself is relatively small; the image gradient at the target boundary is therefore large, whereas the gradients inside the target and in its surroundings are small, so the two groups of gradient values differ considerably. If the target has changed or no longer exists, there is no significant difference between the gradient values at the target boundary plotted in the GIS data and those of the target interior. The significance analysis algorithm of the present invention therefore follows the principle of a test of statistical significance and judges whether the gradients on the target boundary differ significantly from the other gradients within the target and its neighbourhood. If a significant difference exists, the target is considered to exist; if not, the target is considered to be absent or to have changed, and change detection of the target is thereby achieved. The present invention is described in detail below with specific embodiments.
Embodiment 1
Fig. 1 is a flowchart of the change detection method for a target polygon provided by an embodiment of the present invention. In this embodiment, GIS polygon data are used as the targets (the outlines of the targets are plotted in the GIS data, which are the historical data of the targets), and a high-resolution remote sensing image is used for change detection (the high-resolution remote sensing image is the updated data of the targets, i.e. remote sensing image data acquired after the targets may have changed). For example, the GIS data of houses in a certain area are used as the targets, with shapefile as the data format; the high-resolution remote sensing image was acquired after an earthquake at the location of the houses, with metre-level resolution and geotiff as the data format.
The change detection method for these house targets comprises the following steps.
Step S10: read the remote sensing image img0 of the targets and process it with an edge detection operator to obtain the gradient image img1.
In this embodiment, a Sobel operator, for example, may be used for edge detection of the remote sensing image. The Sobel operator consists of two convolution kernels (Gx, Gy), computed as follows:
Gx = [−1 0 +1; −2 0 +2; −1 0 +1] ∗ A,  Gy = [+1 +2 +1; 0 0 0; −1 −2 −1] ∗ A  (1)
where A is the source image and ∗ denotes two-dimensional convolution. The gradient magnitude of each pixel is G = √(Gx² + Gy²) and the gradient direction is θ = arctan(Gy/Gx).
In addition, a Roberts operator, a Laplacian operator, etc. may also be used for edge detection in this step. This step may be performed at any time before step S30 and is not limited to being executed before step S20.
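As a minimal illustration of step S10, the sketch below computes a Sobel gradient-magnitude image with NumPy and SciPy; the function name and the use of scipy.ndimage are assumptions of this sketch, not part of the patent.

```python
import numpy as np
from scipy import ndimage

def gradient_image(img0: np.ndarray) -> np.ndarray:
    """Gradient-magnitude image img1 of a single-band remote sensing image img0 (Sobel)."""
    img = img0.astype(np.float64)
    gx = ndimage.sobel(img, axis=1)   # horizontal derivative Gx
    gy = ndimage.sobel(img, axis=0)   # vertical derivative Gy
    # magnitude G = sqrt(Gx^2 + Gy^2); the direction would be np.arctan2(gy, gx) if needed
    return np.hypot(gx, gy)
```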
Step S20: convert the coordinates of the targets in the GIS data into map coordinates.
Since the GIS data and the remote sensing image come from different sources, they usually have different geographic coordinate systems and projected coordinate systems. The coordinate points in the GIS data therefore need to be mapped to pixels of the remote sensing image according to the correspondence between the coordinate systems, so that the vector data can be overlaid on the raster data.
For example, if the coordinates of the GIS data are longitude/latitude coordinates and the remote sensing image uses a UTM projection in the WGS84 coordinate system, this step converts the longitude/latitude coordinates into geographic coordinates according to the geodetic coordinate system and then converts the geographic coordinates into map coordinates according to the projected coordinate system, thereby unifying the coordinates of the GIS data and the remote sensing image.
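The sketch below shows one way to carry out this conversion with pyproj and a GDAL-style geotransform; the EPSG codes, helper name and UTM zone are illustrative assumptions (the zone depends on the image).

```python
from pyproj import Transformer

# WGS84 geographic coordinates -> WGS84 / UTM (zone 50N chosen here only as an example)
to_map = Transformer.from_crs("EPSG:4326", "EPSG:32650", always_xy=True)

def lonlat_to_pixel(lon: float, lat: float, geotransform: tuple) -> tuple:
    """Map a GIS vertex (lon, lat) to the (row, col) of the GeoTIFF raster."""
    x, y = to_map.transform(lon, lat)        # projected map coordinates in metres
    x0, dx, _, y0, _, dy = geotransform      # GDAL 6-element geotransform; dy < 0 for north-up
    col = int(round((x - x0) / dx))
    row = int(round((y - y0) / dy))
    return row, col
```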
Step S30: under the map coordinate system, compute the boundary significance value of each target. This specifically comprises the following steps.
Step S31: establish the boundary buffer from the boundary line of the target.
Since the boundary of a real-world target does not necessarily correspond to a single pixel in the remote sensing image — the boundary may be a line two to three pixels wide — a boundary buffer is built in this embodiment as the possible range of the target boundary. In this embodiment, the boundary buffer buff1 is established with a radius equal to the width of one pixel, e.g. 0.6 m. Figs. 2 and 3 show the target polygon and its boundary buffer, respectively, the shaded part indicating the boundary buffer.
Step S32: establish the whole-target buffer from the target itself.
The whole-target buffer is a neighbouring range of the target built from the target polygon; the significance test of the subsequent steps is performed within this range to detect the significance of the target. Specifically, if the area of the target is area, the radius of the whole-target buffer is chosen as r = K·area, where K preferably takes the value 0.1, and the whole-target buffer buff2 is established, as shown in Fig. 4, the shaded part indicating the whole-target buffer.
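A possible implementation of steps S31 and S32 with shapely is sketched below; the use of shapely, the function name and the default values are assumptions, while the radius r = K·area follows the text above.

```python
from shapely.geometry import Polygon

def build_buffers(target: Polygon, pixel_size: float = 0.6, k: float = 0.1):
    """buff1: boundary buffer of about one pixel width; buff2: whole-target buffer."""
    buff1 = target.boundary.buffer(pixel_size)   # possible range of the target boundary
    buff2 = target.buffer(k * target.area)       # radius taken as one tenth of the target area
    return buff1, buff2
```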
Step S33: extract the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer.
The pixel set PC1 of the boundary buffer is expressed by formula (2):
PC1 = {(x, y) | (x, y) ∈ buff1}  (2)
The range of this point set is shown in Fig. 3.
The pixel set of the non-boundary part is expressed by formula (3):
PC2 = {(x, y) | (x, y) ∈ buff2, (x, y) ∉ buff1}  (3)
The range of this point set is shown by the two shaded areas in Fig. 5.
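One way to obtain PC1 and PC2 in raster form is to burn the two buffers into boolean masks aligned with the gradient image, for example with rasterio; the helper below is a sketch under that assumption.

```python
import numpy as np
from rasterio import features

def pixel_sets(buff1, buff2, out_shape, transform):
    """Boolean masks for PC1 (boundary pixels) and PC2 (non-boundary pixels of buff2)."""
    pc1 = features.rasterize([(buff1, 1)], out_shape=out_shape,
                             transform=transform, fill=0).astype(bool)
    in_buff2 = features.rasterize([(buff2, 1)], out_shape=out_shape,
                                  transform=transform, fill=0).astype(bool)
    pc2 = in_buff2 & ~pc1        # formula (3): in buff2 but not in buff1
    return pc1, pc2
```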
Step S34: from the pixel sets PC1 and PC2 extracted in the previous step, extract the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer from the gradient image img1.
The boundary-point gradient set of the boundary buffer is expressed by formula (4):
GC1 = {g(x, y) | (x, y) ∈ PC1}  (4)
The non-boundary-point gradient set of the whole-target buffer is expressed by formula (5):
GC2 = {g(x, y) | (x, y) ∈ PC2}  (5)
where g(x, y) is the image gradient at point (x, y).
Step S35: compute the difference Z between the gradient sets GC1 and GC2, i.e. the boundary significance value of the target.
This embodiment uses a significance test, namely the z-test, to examine how significantly the data in sets GC1 and GC2 differ; the larger Z is, the larger the difference between the two sets. Specifically, if the boundary-point gradient set GC1 and the non-boundary-point gradient set GC2 have n1 and n2 elements, means μ1 and μ2, and standard deviations S1 and S2, respectively, then Z is computed based on formula (6):
Z = (μ1 − μ2) / √(S1²/n1 + S2²/n2)  (6)
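Steps S34 and S35 amount to indexing the gradient image with the two masks from the previous sketch and evaluating formula (6); a minimal sketch, with assumed function and variable names:

```python
import numpy as np

def boundary_significance(grad: np.ndarray, pc1: np.ndarray, pc2: np.ndarray) -> float:
    """Z value of formula (6) computed from the gradient sets GC1 and GC2."""
    gc1 = grad[pc1]                      # GC1: gradients of boundary pixels
    gc2 = grad[pc2]                      # GC2: gradients of non-boundary pixels
    n1, n2 = gc1.size, gc2.size
    mu1, mu2 = gc1.mean(), gc2.mean()
    s1, s2 = gc1.std(ddof=1), gc2.std(ddof=1)
    return float((mu1 - mu2) / np.sqrt(s1**2 / n1 + s2**2 / n2))
```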
Step S36: translate the target and recompute the boundary significance value of the translated target.
In this step, owing to factors such as measurement error, image error and the parallax caused by non-orthographic imaging, the outline of the target in the GIS data does not necessarily coincide with its outline in the remote sensing image; the two cannot be matched perfectly, so the target must be translated within a certain range to find its actual position in the image. The range of translation is determined by the possible offset of the target: in general, the larger the tilt angle of the sensor during imaging, the larger the possible offset; when the degree of offset is unknown, the translation range can be set comparable to the target size. Specifically, let the Z value without translation be Z0,0 and translate the target vector; the translated coordinates are expressed by formula (7):
(x', y') = (x + i, y + j)  (7)
where (i, j) is the translation vector.
The boundary significance value Zi,j after translation is recomputed from the translated coordinates; all possible Z values within the translated whole-target buffer buff3 are computed in the same way, forming the set
Zc = {Zi,j | i, j ∈ buff3}  (8)
Step S37: take the maximum value Zmax = MAX(Zc) of the set Zc as the final boundary significance value of the target, and take the translation vector (imax, jmax) at which Z reaches its maximum as the offset vector of the target.
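Steps S36 and S37 can be sketched as an exhaustive search over integer pixel offsets, reusing boundary_significance from the previous sketch; the search radius and the use of np.roll (acceptable while the buffers stay well inside the image) are assumptions of this sketch.

```python
import numpy as np

def max_boundary_significance(grad, pc1, pc2, max_shift: int = 5):
    """Return Zmax = MAX(Zc) over pixel translations (i, j) and the offset that attains it."""
    z_max, best = -np.inf, (0, 0)
    for i in range(-max_shift, max_shift + 1):
        for j in range(-max_shift, max_shift + 1):
            m1 = np.roll(pc1, shift=(i, j), axis=(0, 1))   # translated boundary pixel set
            m2 = np.roll(pc2, shift=(i, j), axis=(0, 1))   # translated non-boundary pixel set
            z = boundary_significance(grad, m1, m2)
            if z > z_max:
                z_max, best = z, (i, j)
    return z_max, best
```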
Step S40: compare the boundary significance value of the target with the set target significance threshold Zthreshold; if Z < Zthreshold, mark the target as a changed target and save it as vector data in shapefile format.
The above target-change significance detection steps are carried out for every house target in the base GIS data, so as to obtain the boundary significance value of each target and to judge whether each target has changed.
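Tying the sketches above together, a per-target loop for step S40 might look as follows; the threshold value, the `targets` iterable and the variable names `grad` and `transform` are assumptions, and in practice the threshold would be set from image quality and target clarity as described above.

```python
Z_THRESHOLD = 3.0        # assumed value; set from image quality and target clarity

changed_ids = []
for target_id, polygon in targets:                     # (id, shapely Polygon) pairs from the GIS layer
    buff1, buff2 = build_buffers(polygon, pixel_size=0.6)
    pc1, pc2 = pixel_sets(buff1, buff2, grad.shape, transform)
    z_max, offset = max_boundary_significance(grad, pc1, pc2)
    if z_max < Z_THRESHOLD:                            # weak boundary response: target changed or gone
        changed_ids.append(target_id)
```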
Embodiment 2
Fig. 6 is a block diagram of the change detection device for a target polygon provided by an embodiment of the present invention. In connection with the method of Embodiment 1, the device comprises an image processing module 10, a coordinate conversion module 20, a buffer construction module 30, a pixel set extraction module 40, a gradient set extraction module 50, a computation module 60 and a judgment module 70.
The image processing module 10 is configured to read the remote sensing image of the target and process it with a Sobel operator, a Roberts operator or a Laplacian operator to obtain the gradient image; the coordinate conversion module 20 is configured to convert the longitude/latitude coordinates of the target's GIS data into geographic coordinates according to a geodetic coordinate system and then convert the geographic coordinates into map coordinates according to a projected coordinate system; the buffer construction module 30 is configured to build, under the map coordinate system of the target, the boundary buffer and the whole-target buffer of the target from the target's GIS data; the pixel set extraction module 40 is configured to extract the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer; the gradient set extraction module 50 is configured to extract, based on the extracted pixel sets, the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target; the computation module 60 is configured to compute the boundary significance value of the target using a z-test based on the extracted gradient sets; and the judgment module 70 is configured to judge that the target has changed if the computed boundary significance value is below a set threshold.
The buffer construction module 30 is specifically configured to establish the boundary buffer around the boundary of the target in the GIS data with a radius equal to the width of one pixel, and to determine a radius from the area of the target in the GIS data and build the whole-target buffer. The pixel set extraction module 40 is specifically configured to extract the pixel set of the boundary buffer based on formula (2) and the pixel set of the non-boundary part of the whole-target buffer based on formula (3). The gradient set extraction module 50 is specifically configured to extract the boundary-point gradient set of the boundary buffer from the gradient image of the target based on the extracted pixel set PC1 and formula (4), and the non-boundary-point gradient set of the whole-target buffer based on the extracted pixel set PC2 and formula (5).
The computation module 60 specifically comprises a boundary significance computation unit, a translation unit and a computation unit (not shown).
The boundary significance computation unit is configured to compute the boundary significance value of the target based on formula (6); the translation unit is configured to translate the target based on formula (7); and the computation unit is configured to compute the boundary significance value of the translated target, compute all boundary significance values within the translated whole-target buffer, obtain the set of boundary significance values, and take the maximum value of that set as the final boundary significance value to be compared with the boundary significance threshold.
The method and device provided by the present invention work mainly on area features based on GIS data and compare the target against the background, so no absolute threshold needs to be set, no complex target recognition is required, and the imaging conditions of the image do not affect the algorithm, which greatly simplifies parameter setting and makes the method simple and efficient. Meanwhile, the method eliminates the deformation caused by non-orthographic projection when ground objects are imaged, which greatly improves detection precision and accuracy, and is highly practical and highly automated.
All or part of the technical solutions provided by the above embodiments can be implemented by software programming, with the software program stored in a readable storage medium such as a hard disk, an optical disc or a floppy disk in a computer.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the specific embodiments described here, and that various obvious changes, readjustments and substitutions can be made without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to the above embodiments and may include other equivalent embodiments without departing from the inventive concept, the scope of the present invention being determined by the scope of the appended claims.

Claims (11)

1. A change detection method for a target polygon, characterized by comprising:
under the map coordinate system of the target, building a boundary buffer and a whole-target buffer of the target from the target's GIS data;
extracting the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer;
based on the extracted pixel sets, extracting from the gradient image of the target the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer;
based on the extracted gradient sets, computing the boundary significance value of the target using a z-test; and
if the computed boundary significance value is below a set boundary significance threshold, judging that the target has changed;
wherein building the boundary buffer and the whole-target buffer of the target from the target's GIS data comprises:
establishing the boundary buffer around the boundary of the target in the GIS data, with a radius equal to the width of one pixel; and
determining a radius from the area of the target in the GIS data and building the whole-target buffer, the radius being one tenth of that area.
2. The change detection method for a target polygon according to claim 1, characterized in that, before the boundary buffer and the whole-target buffer of the target are built from the target's GIS data, the method further comprises: converting the longitude/latitude coordinates of the target's GIS data into geographic coordinates according to a geodetic coordinate system, and then converting the geographic coordinates into map coordinates according to a projected coordinate system; and
reading the remote sensing image of the target and processing it with a Sobel operator, a Roberts operator or a Laplacian operator to obtain the gradient image.
3. The change detection method for a target polygon according to claim 2, characterized in that extracting the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer comprises:
extracting the pixel set of the boundary buffer based on formula (2):
PC1 = {(x, y) | (x, y) ∈ buff1}  (2)
where buff1 is the boundary buffer; and
extracting the pixel set of the non-boundary part of the whole-target buffer based on formula (3):
PC2 = {(x, y) | (x, y) ∈ buff2, (x, y) ∉ buff1}  (3)
where buff2 is the whole-target buffer.
4. The change detection method for a target polygon according to claim 3, characterized in that extracting, based on the extracted pixel sets, the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target comprises:
extracting, based on the extracted pixel set PC1 and formula (4), the boundary-point gradient set of the boundary buffer from the gradient image of the target:
GC1 = {g(x, y) | (x, y) ∈ PC1}  (4)
and extracting, based on the extracted pixel set PC2 and formula (5), the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target:
GC2 = {g(x, y) | (x, y) ∈ PC2}  (5)
where g(x, y) is the image gradient at point (x, y).
5. The change detection method for a target polygon according to claim 4, characterized in that computing the boundary significance value of the target using a z-test based on the extracted gradient sets comprises:
computing the boundary significance value of the target based on formula (6):
Z = (μ1 − μ2) / √(S1²/n1 + S2²/n2)  (6)
where Z is the boundary significance value of the target, n1 is the number of elements of the boundary-point gradient set GC1 of the boundary buffer, n2 is the number of elements of the non-boundary-point gradient set GC2 of the whole-target buffer, μ1 and μ2 are the means of GC1 and GC2, and S1 and S2 are the standard deviations of GC1 and GC2, respectively;
translating the target based on formula (7):
(x', y') = (x + i, y + j)  (7)
where (x', y') is the coordinate of point (x, y) after translation and (i, j) is the translation vector; and
computing the boundary significance value after each translation: the target is translated pixel by pixel within a certain range, the boundary significance value of the translated target is computed for each translation to obtain the set of boundary significance values over that range, and the maximum value of that set is taken as the final boundary significance value to be compared with the boundary significance threshold, the range being determined by the possible offset of the target.
6. The change detection method for a target polygon according to claim 5, characterized in that the boundary significance threshold is set according to image quality and the clarity of the target.
7. A change detection device for a target polygon, characterized by comprising:
a buffer construction module, configured to build, under the map coordinate system of the target, a boundary buffer and a whole-target buffer of the target from the target's GIS data;
a pixel set extraction module, configured to extract the pixel set of the boundary buffer and the pixel set of the non-boundary part of the whole-target buffer;
a gradient set extraction module, configured to extract, based on the extracted pixel sets, the boundary-point gradient set of the boundary buffer and the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target;
a computation module, configured to compute the boundary significance value of the target using a z-test based on the extracted gradient sets; and
a judgment module, configured to judge that the target has changed if the computed boundary significance value is below a set threshold;
wherein the buffer construction module is specifically configured to establish the boundary buffer around the boundary of the target in the GIS data, with a radius equal to the width of one pixel; and
to determine a radius from the area of the target in the GIS data and build the whole-target buffer, the radius being one tenth of that area.
8. The change detection device for a target polygon according to claim 7, characterized by further comprising: a coordinate conversion module, configured to convert the longitude/latitude coordinates of the target's GIS data into geographic coordinates according to a geodetic coordinate system and then convert the geographic coordinates into map coordinates according to a projected coordinate system; and
an image processing module, configured to read the remote sensing image of the target and process it with a Sobel operator, a Roberts operator or a Laplacian operator to obtain the gradient image.
9. The change detection device for a target polygon according to claim 8, characterized in that the pixel set extraction module is specifically configured to extract the pixel set of the boundary buffer based on formula (2):
PC1 = {(x, y) | (x, y) ∈ buff1}  (2)
where buff1 is the boundary buffer; and
to extract the pixel set of the non-boundary part of the whole-target buffer based on formula (3):
PC2 = {(x, y) | (x, y) ∈ buff2, (x, y) ∉ buff1}  (3)
where buff2 is the whole-target buffer.
10. The change detection device for a target polygon according to claim 9, characterized in that the gradient set extraction module is specifically configured to extract, based on the extracted pixel set PC1 and formula (4), the boundary-point gradient set of the boundary buffer from the gradient image of the target:
GC1 = {g(x, y) | (x, y) ∈ PC1}  (4)
and, based on the extracted pixel set PC2 and formula (5), the non-boundary-point gradient set of the whole-target buffer from the gradient image of the target:
GC2 = {g(x, y) | (x, y) ∈ PC2}  (5)
where g(x, y) is the image gradient at point (x, y).
11. The change detection device for a target polygon according to claim 10, characterized in that the computation module comprises:
a boundary significance computation unit, configured to compute the boundary significance value of the target based on formula (6):
Z = (μ1 − μ2) / √(S1²/n1 + S2²/n2)  (6)
where Z is the boundary significance value of the target, n1 is the number of elements of the boundary-point gradient set GC1 of the boundary buffer, n2 is the number of elements of the non-boundary-point gradient set GC2 of the whole-target buffer, μ1 and μ2 are the means of GC1 and GC2, and S1 and S2 are the standard deviations of GC1 and GC2, respectively;
a translation unit, configured to translate the target based on formula (7):
(x', y') = (x + i, y + j)  (7)
where (x', y') is the coordinate of point (x, y) after translation and (i, j) is the translation vector; and
a computation unit, configured to compute the boundary significance value after each translation: the target is translated pixel by pixel within a certain range, the boundary significance value of the translated target is computed for each translation to obtain the set of boundary significance values over that range, and the maximum value of that set is taken as the final boundary significance value to be compared with the boundary significance threshold, the range being determined by the possible offset of the target, and the boundary significance threshold being set according to image quality and the clarity of the target.
CN201410638618.0A 2014-11-06 2014-11-06 Change detection method and device for a target polygon Active CN105631849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410638618.0A CN105631849B (en) Change detection method and device for a target polygon

Publications (2)

Publication Number Publication Date
CN105631849A CN105631849A (en) 2016-06-01
CN105631849B true CN105631849B (en) 2018-08-24

Family

ID=56046737

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410638618.0A Active CN105631849B (en) Change detection method and device for a target polygon

Country Status (1)

Country Link
CN (1) CN105631849B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169981B (en) * 2017-05-12 2020-07-07 西南交通大学 Method and device for detecting three-dimensional profile of ballast particles
CN107481296B (en) * 2017-08-02 2020-10-09 长威信息科技发展股份有限公司 Method and device for displaying building height based on two-dimensional map
CN110727755B (en) * 2019-10-14 2022-02-08 武汉汉达瑞科技有限公司 Terrain shape regularization method, electronic device and storage medium
CN115578607B (en) * 2022-12-08 2023-04-25 自然资源部第三航测遥感院 Method for rapidly extracting coverage range of effective pixels of remote sensing image

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101126812A (en) * 2007-09-27 2008-02-20 武汉大学 High resolution ratio remote-sensing image division and classification and variety detection integration method
CN101126813A (en) * 2007-09-29 2008-02-20 北京交通大学 High resolution ratio satellite remote-sensing image architecture profile extraction method
US8851594B2 (en) * 2010-10-17 2014-10-07 Hewlett-Packard Development Company, L.P. Fill reduction for printing
CN102938066A (en) * 2012-12-07 2013-02-20 南京大学 Method for reconstructing outer outline polygon of building based on multivariate data

Also Published As

Publication number Publication date
CN105631849A (en) 2016-06-01


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant