CN105182350B - Multibeam sonar target detection method using feature tracking - Google Patents
Multibeam sonar target detection method using feature tracking
- Publication number
- CN105182350B CN105182350B CN201510623181.8A CN201510623181A CN105182350B CN 105182350 B CN105182350 B CN 105182350B CN 201510623181 A CN201510623181 A CN 201510623181A CN 105182350 B CN105182350 B CN 105182350B
- Authority
- CN
- China
- Prior art keywords
- sonar
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/66—Sonar tracking systems
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Abstract
The present invention provides a multibeam sonar target detection method using feature tracking. (1) Collect data with the sonar system, image the data to form sonar images, and assemble consecutive sonar images into a sonar image sequence; (2) preprocess each frame; (3) extract Scale-Invariant Feature Transform (SIFT) features from each preprocessed frame; (4) perform feature matching between the first two frames of the sequence, mark successfully matched features as potential targets, and discard unmatched features; (5) track the potential-target features through the subsequent frames and, after traversing the whole image sequence, screen real targets out of the potential targets and obtain the feature trajectories of the real targets. The method does not need to judge from a single frame whether a target is present; by simultaneously tracking multiple target features through the sonar image sequence, it screens real targets out of the potential ones, which gives the method wider applicability.
Description
Technical field
The present invention relates to an underwater target detection method, specifically a multibeam sonar target detection method using feature tracking.
Background technology
Driven by the national marine strategy, underwater target detection technology has developed rapidly; its applications include underwater obstacle avoidance and navigation, search and salvage of sunken objects, pipeline maintenance, and bridge pier inspection. Researchers commonly form acoustic images of the underwater scene by sonar imaging and realize underwater target detection by image segmentation, but this faces problems such as the choice of segmentation threshold and the limited information in a single image. Track Before Detect (TBD) is an emerging theory for weak target detection: instead of first judging whether a target is present in a single frame, it tracks multiple targets simultaneously in an image sequence and then screens real targets out of the potential ones according to their trajectories. Feature tracking in underwater acoustic image sequences based on TBD is a novel way to realize underwater target detection.
Feature selection is the key to tracking in underwater acoustic image sequences; common features include statistical features, texture features, shape features, and mathematical-transform features. Applied to underwater target detection, however, these features still suffer from weak discriminability, high computational complexity, and poor robustness. The Scale Invariant Feature Transform (SIFT) proposed by Lowe, built on scale-space theory, has been applied successfully in optical image matching and recognition. SIFT features are invariant to rotation, scaling, and brightness change, and remain fairly stable under viewpoint change, affine transformation, and noise, so they can well characterize potential targets in sonar images of relatively low quality that are easily disturbed by the environment.
Summary of the invention
The object of the present invention is to provide a multibeam sonar target detection method using feature tracking that does not need to judge from a single frame whether a target is present, but instead tracks multiple target features simultaneously through a sonar image sequence and screens real targets out of the potential targets.
The object of the present invention is achieved as follows:
(1) Collect data with the sonar system, image the collected data to form sonar images, and assemble consecutive sonar images into a sonar image sequence;
(2) preprocess each frame of the sonar image sequence;
(3) extract the SIFT features of each preprocessed sonar frame;
(4) perform feature matching between the first two frames of the sonar image sequence, mark successfully matched features as potential targets, and discard unmatched features;
(5) track the potential-target features through the subsequent frames of the sequence; after traversing the whole image sequence, screen real targets out of the potential targets and obtain the feature trajectories of the real targets.
The present invention can also include:
1. The sonar system is a shallow-water high-resolution multibeam bathymetric system; its acoustic array is T-shaped, the transmitting array is a multi-element arc array, and the receiving array is a uniform linear array of 80 elements. The operating frequency is 300 kHz, the sampling frequency is 40 kHz, and the selected pulse width is 1 ms.
2. The preprocessing enhances contrast with a dynamic brightness distribution and removes noise by median filtering. The dynamic brightness distribution maps the original image intensity to the linear gray range [0, 255]: the minimum intensity of the original image is taken as L, the mean of the top 1% of the original image intensities is taken as the maximum H, and the mapping from input intensity to output gray level is

z_out = 0,                               z_in = L
z_out = 255 ((z_in − L)/(H − L))^γ,      L < z_in < H
z_out = 255,                             z_in ≥ H

where z_in is the image input intensity value, z_out is the image output gray-level value, and γ is a mapping parameter with value range [0.5, 2.0].
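As a minimal sketch of the mapping above (assuming "the top 1% of intensities" means the brightest 1% of pixels, averaged to give H), the dynamic brightness distribution could be implemented as:

```python
import numpy as np

def dynamic_brightness_map(img, gamma=1.5):
    """Map raw intensities to the gray range [0, 255].

    L is the minimum intensity of the image; H is the mean of the
    brightest 1% of pixels (an assumed reading of 'top 1%').
    """
    img = img.astype(np.float64)
    low = img.min()
    flat = np.sort(img.ravel())
    n_top = max(1, int(round(0.01 * flat.size)))
    high = flat[-n_top:].mean()
    out = np.zeros_like(img)                 # z_in = L  ->  0
    mid = (img > low) & (img < high)
    out[mid] = 255.0 * ((img[mid] - low) / (high - low)) ** gamma
    out[img >= high] = 255.0                 # z_in >= H  ->  255
    return out
```

A 3 × 3 median filter (e.g. `scipy.ndimage.median_filter`) would then be applied for noise removal, as the text describes.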
3. Each SIFT feature comprises coordinates, scale, and orientation information, together with a 128-dimensional feature description vector.
4. Feature matching uses the Euclidean distance between SIFT description vectors as the similarity measure and adopts the nearest-neighbor ratio method: find the nearest and second-nearest neighbors of the feature point to be matched; if the ratio of the nearest-neighbor distance to the second-nearest-neighbor distance is below a threshold, the nearest neighbor and the feature are taken as the best match pair; otherwise the match fails.
The invention provides an underwater target detection method that does not judge from a single frame whether a target is present; by simultaneously tracking multiple target features in the sonar image sequence, it screens real targets out of the potential targets.
By adopting the above technical scheme, the present invention has the following advantages:
(1) Median filtering and the dynamic brightness distribution markedly improve underwater acoustic image quality, creating favorable conditions for feature extraction and tracking.
(2) The SIFT features used carry information from multiple frames through inter-frame matching, are more robust, and are well suited to characterizing potential targets in underwater acoustic image sequences.
(3) The feature tracking method does not judge from a single frame whether a target is present; by tracking the features of multiple targets simultaneously in the image sequence and screening real targets out of the potential ones, it provides a new approach to underwater target detection.
Brief description of the drawings
Fig. 1 is the flow chart of the feature-tracking underwater target detection method designed by the present invention.
Fig. 2 is the original first frame of the collected image sequence.
Figs. 3a to 3d show the dynamic brightness distribution effect on the first frame of the image sequence under different mapping parameters: Fig. 3a, γ = 0.5; Fig. 3b, γ = 1; Fig. 3c, γ = 1.5; Fig. 3d, γ = 2.
Fig. 4 is the preprocessed first frame of the image sequence.
Fig. 5 shows the SIFT features extracted from the first frame of the image sequence.
Fig. 6 shows the SIFT feature matching between the first and second frames of the image sequence.
Fig. 7 is the algorithm flow of the feature tracking designed by the present invention.
Fig. 8 is the target detection experimental result of the feature tracking of the present invention.
Embodiment
The overall flow chart of the multibeam sonar underwater target detection method using feature tracking is shown in Fig. 1. The method of the present invention is described in further detail below with an implementation case.
1. Collect data with the sonar system, image the data to form sonar images, and assemble consecutive sonar images into a sonar image sequence.
The experimental equipment is a shallow-water high-resolution multibeam bathymetric system. Its acoustic array is T-shaped, the transmitting array is a multi-element arc array, and the receiving array is a uniform linear array of 80 elements. The operating frequency is 300 kHz, the sampling frequency is 40 kHz, and the selected pulse width is 1 ms. The water depth is 20–30 m. An oil drum serves as the dynamic small target: the survey vessel is moored at the bank with the sonar system fixed in place, the drum is hung over the ship's side with two cables and allowed to move slowly, vertically and horizontally, in the water. The collected data are imaged, and 28 consecutive frames are chosen to form the image sequence. The first frame read from the sequence is shown in Fig. 2.
2. Preprocess each frame of the sonar image sequence: enhance contrast with the dynamic brightness distribution and remove noise by median filtering.
In the dynamic brightness distribution, the mapping from input intensity to output gray level is:

z_out = 0,                               z_in = L
z_out = 255 ((z_in − L)/(H − L))^γ,      L < z_in < H
z_out = 255,                             z_in ≥ H

The mapping parameter γ is set to 0.5, 1, 1.5, and 2.0 for comparison; the results are shown in Fig. 3. The comparison shows that γ = 1.5 gives the best balance between contrast enhancement and detail retention, so γ = 1.5 is chosen. The median filter uses a 3 × 3 sliding window; the preprocessed image is shown in Fig. 4.
3. Extract the SIFT features that characterize potential targets in the underwater acoustic image.
Define the underwater acoustic image as I(x, y) and the Gaussian kernel function as G(x, y, σ):

G(x, y, σ) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²))

where σ is the scale-space factor, reflecting the degree to which the image is smoothed. Convolving I(x, y) with G(x, y, σ) gives the scale space of the image at different scales:

R(x, y, σ) = G(x, y, σ) * I(x, y)
The Difference of Gaussian (DoG) operator of the image is defined as:

D(x, y, σ) = R(x, y, kσ) − R(x, y, σ)

where k is the scale factor between adjacent scales. The DoG operators form the DoG pyramid of the image. Each point of the DoG pyramid is compared with its 8 neighbors at the same scale and the 18 neighbors at the corresponding positions of the two adjacent scales. When the point is a local extremum, it is taken as a candidate feature point. Define the candidate feature point set U0 of the image, reject the low-contrast points and edge points from it by the following conditions, and retain the key points S(x, y):

|D(u)| ≥ Tc, u ∈ U0

where Tc is the contrast threshold and |D(u)| is the absolute value of the contrast;

Tr(He)² / Det(He) < (Tγ + 1)² / Tγ

where Tγ is the principal-curvature ratio threshold, Tr(He) is the trace, and Det(He) is the determinant of the Hessian matrix He at u.
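The 26-neighbor extremum test described above can be sketched as follows (a hypothetical `dog` array holds three adjacent DoG layers; border handling is omitted):

```python
import numpy as np

def is_local_extremum(dog, s, y, x):
    """True if dog[s, y, x] is strictly greater (or strictly smaller) than
    all 26 neighbors: 8 in the same layer plus 9 in each adjacent layer."""
    cube = dog[s - 1:s + 2, y - 1:y + 2, x - 1:x + 2]
    center = dog[s, y, x]
    neighbors = np.delete(cube.ravel(), 13)  # drop the center itself
    return bool((center > neighbors).all() or (center < neighbors).all())
```

Points that pass this test would then be filtered by the contrast and edge conditions above before being accepted as key points.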
For each key point, the gradient magnitude m(x, y) and direction θ(x, y) are computed as:

m(x, y) = √((R(x+1, y) − R(x−1, y))² + (R(x, y+1) − R(x, y−1))²)
θ(x, y) = arctan((R(x, y+1) − R(x, y−1)) / (R(x+1, y) − R(x−1, y)))

In a neighborhood window centered on S(x, y), the gradient distribution of the neighborhood pixels is accumulated into a histogram; the histogram peak reflects the dominant gradient direction around S(x, y), which is then refined by parabolic interpolation to give the direction of S(x, y).
A 16 × 16 window is taken around each key point; in each 4 × 4 block an 8-direction gradient orientation histogram is computed, and the accumulated value of each gradient direction forms one seed point, so each key point is described by 16 seed points. The key point is the SIFT feature, containing coordinates, scale, and orientation information together with a 128-dimensional description vector. The feature extraction result on the first frame of the image sequence is shown in Fig. 5; 79 features are extracted. In the figure, the abscissa is the horizontal position relative to the array, the ordinate is the depth below the water surface, and the arrow direction and length represent the feature orientation and scale. It can be seen that the features concentrate on the dynamic small target at 17 m depth and on the seabed profile at 22–26 m depth.
4. Perform feature matching between the first two frames of the sonar image sequence; mark successfully matched features as potential targets and discard unmatched features.
Feature matching uses the Euclidean distance between SIFT description vectors as the similarity measure and adopts the nearest-neighbor ratio method: find the nearest and second-nearest neighbors of the feature point to be matched; if the ratio of the nearest-neighbor distance to the second-nearest-neighbor distance is below a threshold, the nearest neighbor and the feature are taken as the best match pair; otherwise the match fails. The first and second frames of the image sequence yield 42 matched features; the matching is shown in Fig. 6. It can be seen from the figure that the features characterizing the dynamic small target and the seabed profile are all matched, but some non-target features also match, so tracking through the complete image sequence is needed to further determine whether they represent real targets.
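The nearest-neighbor ratio test could be sketched as follows (the 0.8 ratio threshold is an assumed value; the patent only speaks of "a certain threshold"):

```python
import numpy as np

def ratio_match(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbor ratio matching on Euclidean distance between two
    descriptor sets. Returns a list of (index_in_a, index_in_b) pairs."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        nearest, second = order[0], order[1]
        if dists[nearest] < ratio * dists[second]:
            matches.append((i, nearest))      # best match pair
        # otherwise: match fails, feature is discarded
    return matches
```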
5. Track the potential-target features through the subsequent frames of the sonar image sequence; after traversing the image sequence, screen real targets out of the potential targets and obtain the feature trajectories of the real targets.
The algorithm flow of the feature tracking is shown in Fig. 7.
1) Read one frame and save its SIFT feature set as the template;
2) read the next frame and match each extracted SIFT feature against the template; a successfully matched feature is marked as a potential target and stored temporarily;
3) a feature that fails to match for 3 consecutive frames very likely characterizes a false alarm and is removed from the temporary store;
4) the features still retained when the whole tracking is completed are judged to represent real targets, and the feature trajectory of each target is obtained.
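The four steps above can be sketched in simplified form. Matching is reduced to a user-supplied predicate, and each template feature is matched against its most recent observation rather than a re-extracted SIFT descriptor; `max_misses=3` mirrors the 3-consecutive-frame rejection rule:

```python
def track_features(frames_features, match_fn, max_misses=3):
    """Track features through a sequence of per-frame feature lists.

    frames_features: list of lists, one feature list per frame;
    match_fn(f, t): True if feature f matches tracked feature t.
    Template features that go unmatched for max_misses consecutive
    frames are rejected as false alarms; survivors are real targets,
    returned with their trajectories.
    """
    template = {i: {"traj": [f], "misses": 0}
                for i, f in enumerate(frames_features[0])}
    for frame in frames_features[1:]:
        for tid in list(template):
            hit = next((f for f in frame
                        if match_fn(f, template[tid]["traj"][-1])), None)
            if hit is None:
                template[tid]["misses"] += 1
                if template[tid]["misses"] >= max_misses:
                    del template[tid]        # reject as false alarm
            else:
                template[tid]["misses"] = 0
                template[tid]["traj"].append(hit)
    return {tid: t["traj"] for tid, t in template.items()}
```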
The target detection result is shown in Fig. 8. Six groups of SIFT features are successfully tracked through the image sequence: the five groups with small offsets characterize the static seabed profile, while the one group with a larger offset characterizes the dynamic small target. The seabed-profile feature positions are (−13.0, 24.4), (−7.0, 24.1), (5.8, 23.8), (16.2, 23.2), and (20.8, 23.0); the dynamic small target moves upward starting from 17 m and begins horizontal motion at 11 m.
Claims (3)
1. A multibeam sonar target detection method using feature tracking, characterized in that:
(1) data are collected by a sonar system, the collected data are imaged to form sonar images, and consecutive sonar images are assembled into a sonar image sequence;
(2) each frame of the sonar image sequence is preprocessed;
(3) the SIFT features of each preprocessed sonar frame are extracted, each SIFT feature comprising coordinates, scale, and orientation information, and a 128-dimensional feature description vector;
(4) feature matching is performed between the first two frames of the sonar image sequence, successfully matched features are marked as potential targets, and unmatched features are discarded; specifically: feature matching uses the Euclidean distance between SIFT description vectors as the similarity measure and adopts the nearest-neighbor ratio method, finding the nearest and second-nearest neighbors of the feature point to be matched; if the ratio of the nearest-neighbor distance to the second-nearest-neighbor distance is below a threshold, the nearest neighbor and the feature are taken as the best match pair, otherwise the match fails;
(5) the potential-target features are tracked through the subsequent frames of the sequence; after traversing the image sequence, real targets are screened out of the potential targets, and the feature trajectories of the real targets are obtained.
2. The multibeam sonar target detection method using feature tracking according to claim 1, characterized in that: the sonar system is a shallow-water high-resolution multibeam bathymetric system; its acoustic array is T-shaped, the transmitting array is a multi-element arc array, and the receiving array is a uniform linear array of 80 elements; the operating frequency is 300 kHz, the sampling frequency is 40 kHz, and the selected pulse width is 1 ms.
3. The multibeam sonar target detection method using feature tracking according to claim 1 or 2, characterized in that: the preprocessing enhances contrast with a dynamic brightness distribution and removes noise by median filtering; the dynamic brightness distribution maps the original image intensity to the linear gray range [0, 255], takes the minimum intensity of the original image as L and the mean of the top 1% of the original image intensities as the maximum H, and maps input intensity to output gray level as follows:
z_out = 0,                               z_in = L
z_out = 255 ((z_in − L)/(H − L))^γ,      L < z_in < H
z_out = 255,                             z_in ≥ H
where z_in is the image input intensity value, z_out is the image output gray-level value, and γ is the mapping parameter with value range [0.5, 2.0].
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510623181.8A CN105182350B (en) | 2015-09-26 | 2015-09-26 | A kind of multibeam sonar object detection method of application signature tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105182350A CN105182350A (en) | 2015-12-23 |
CN105182350B true CN105182350B (en) | 2017-10-31 |
Family
ID=54904549
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510623181.8A Active CN105182350B (en) | 2015-09-26 | 2015-09-26 | A kind of multibeam sonar object detection method of application signature tracking |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105182350B (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105574529B (en) * | 2016-01-28 | 2019-02-19 | 中国船舶重工集团公司第七一〇研究所 | A kind of side-scan sonar object detection method |
CN108241150B (en) * | 2016-12-26 | 2023-05-30 | 中国科学院软件研究所 | Method for detecting and tracking moving object in three-dimensional sonar point cloud environment |
CN106707265B (en) * | 2016-12-26 | 2019-06-07 | 江苏中海达海洋信息技术有限公司 | A kind of conversion update method of sonar sector diagram |
CN108103964A (en) * | 2018-01-05 | 2018-06-01 | 辽宁科技大学 | For the equipment of bridge underwater structure observation |
CN109035224B (en) * | 2018-07-11 | 2021-11-09 | 哈尔滨工程大学 | Submarine pipeline detection and three-dimensional reconstruction method based on multi-beam point cloud |
CN110363123A (en) * | 2019-07-03 | 2019-10-22 | 中国电子科技集团公司第三研究所 | The detecting and tracking method and system of sub-aqua sport Small object |
CN110852959B (en) * | 2019-10-14 | 2023-05-16 | 江苏帝一集团有限公司 | Sonar image filtering method based on novel median filtering algorithm |
CN111596296B (en) * | 2019-12-19 | 2023-09-15 | 上海大学 | Underwater target tracking method based on forward-looking sonar image |
CN111427042B (en) * | 2020-03-24 | 2022-04-19 | 哈尔滨工程大学 | Underwater weak target echo pre-detection tracking method based on high-refresh-rate acoustic pulses |
CN111722231B (en) * | 2020-06-18 | 2021-03-16 | 中国科学院声学研究所 | Active sonar target motion information fusion method |
CN112735164B (en) * | 2020-12-25 | 2022-08-05 | 北京智能车联产业创新中心有限公司 | Test data construction method and test method |
CN113343964B (en) * | 2021-08-09 | 2022-06-10 | 湖南汇视威智能科技有限公司 | Balanced underwater acoustic image target detection method |
CN114578333B (en) * | 2022-04-28 | 2022-08-02 | 杭州应用声学研究所(中国船舶重工集团公司第七一五研究所) | Active sonar target dynamic and static identification method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6259396B1 (en) * | 1999-08-26 | 2001-07-10 | Raytheon Company | Target acquisition system and radon transform based method for target azimuth aspect estimation |
CN103903237B (en) * | 2014-03-21 | 2017-03-29 | 上海大学 | Sonar image sequence assembly method is swept before one kind |
CN104392428A (en) * | 2014-12-10 | 2015-03-04 | 黑龙江真美广播通讯器材有限公司 | Splicing system for side-scan sonar images |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105182350B (en) | A kind of multibeam sonar object detection method of application signature tracking | |
CN109615611B (en) | Inspection image-based insulator self-explosion defect detection method | |
CN104574347B (en) | Satellite in orbit image geometry positioning accuracy evaluation method based on multi- source Remote Sensing Data data | |
CN104392228A (en) | Unmanned aerial vehicle image target class detection method based on conditional random field model | |
CN106296638A (en) | Significance information acquisition device and significance information acquisition method | |
CN105160310A (en) | 3D (three-dimensional) convolutional neural network based human body behavior recognition method | |
CN106780485A (en) | SAR image change detection based on super-pixel segmentation and feature learning | |
CN109410207A (en) | A kind of unmanned plane line walking image transmission line faultlocating method based on NCC feature | |
CN104966285B (en) | A kind of detection method of salient region | |
CN104933434A (en) | Image matching method combining length between perpendiculars (LBP) feature extraction method and surf feature extraction method | |
CN109376641B (en) | Moving vehicle detection method based on unmanned aerial vehicle aerial video | |
Trouvé et al. | Single image local blur identification | |
CN107256547A (en) | A kind of face crack recognition methods detected based on conspicuousness | |
CN106530313A (en) | Sea-sky line real-time detection method based on region segmentation | |
CN113392856B (en) | Image forgery detection device and method | |
CN113111878B (en) | Infrared weak and small target detection method under complex background | |
CN103080979A (en) | System and method for synthesizing portrait sketch from photo | |
CN110378924A (en) | Level set image segmentation method based on local entropy | |
CN110222609A (en) | A kind of wall body slit intelligent identification Method based on image procossing | |
CN105809673A (en) | SURF (Speeded-Up Robust Features) algorithm and maximal similarity region merging based video foreground segmentation method | |
CN113822352A (en) | Infrared dim target detection method based on multi-feature fusion | |
CN110458019B (en) | Water surface target detection method for eliminating reflection interference under scarce cognitive sample condition | |
CN111027637A (en) | Character detection method and computer readable storage medium | |
CN102682435A (en) | Multi-focus image edge detection method based on space relative altitude information | |
CN110751680A (en) | Image processing method with fast alignment algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | C06 | Publication | |
| | PB01 | Publication | |
| | C10 | Entry into substantive examination | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |