CN112085749B - Multi-scale non-iterative superpixel segmentation method - Google Patents

Multi-scale non-iterative superpixel segmentation method

Info

Publication number
CN112085749B
CN112085749B (application CN202010948185.4A)
Authority
CN
China
Prior art keywords
pixel
distance
follows
image
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010948185.4A
Other languages
Chinese (zh)
Other versions
CN112085749A (en
Inventor
蓝如师
郑金云
罗笑南
王小琴
臧美美
赵文婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guilin University of Electronic Technology filed Critical Guilin University of Electronic Technology
Priority to CN202010948185.4A priority Critical patent/CN112085749B/en
Publication of CN112085749A publication Critical patent/CN112085749A/en
Application granted granted Critical
Publication of CN112085749B publication Critical patent/CN112085749B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a multi-scale non-iterative superpixel segmentation method. First, Gaussian convolution is used to compute the color gradient feature of each pixel in the horizontal and vertical directions in the Lab color space. Second, morphological contour features of the pixels are obtained by applying erosion and dilation operations, which enhances the algorithm's edge hit rate without losing the gray-gradient representation. Finally, superpixel segmentation is performed within the non-iterative clustering framework of the SNIC algorithm, using a custom weighted distance over the color, spatial, color-gradient, and morphological-contour features of the pixels. Experimental results show that, for the same number of superpixels, the proposed method effectively improves superpixel segmentation quality compared with other mainstream algorithms while maintaining low time complexity.

Description

Multi-scale non-iterative superpixel segmentation method
Technical Field
The invention relates to the technical field of image segmentation, in particular to a multi-scale non-iterative superpixel segmentation method.
Background
Superpixel segmentation groups adjacent image pixels that are similar in low-level characteristics such as gray scale, color, and texture into irregular pixel blocks with a certain visual significance. The goal is to represent the image as more meaningful and easier-to-analyze objects. Replacing a large number of pixels with a small number of superpixels to express the image information greatly reduces the number of objects to be processed and improves the efficiency of subsequent processing.
Among gradient-based superpixel segmentation algorithms, Achanta et al. proposed the simple linear iterative clustering method SLIC (Simple Linear Iterative Clustering). The method is an improvement of K-means; unlike the traditional K-means algorithm, each centroid only searches a local region whose size is far smaller than the total number of image pixels, so the time complexity of SLIC is lower than that of traditional K-means. Hu et al. proposed SCoW, a spatially constrained watershed algorithm based on the watershed transform. SCoW performs watershed segmentation from a series of uniform markers and introduces edge preprocessing to balance uniformity and compactness; it achieves high segmentation speed at the expense of accuracy. Achanta et al. later proposed SNIC (Simple Non-Iterative Clustering), a modified version of SLIC. SNIC introduces a pixel priority queue; it is simple to implement and, being non-iterative, runs faster. Compared with SLIC, it computes fewer distance pairs, saving time and space, and it enforces connectivity among superpixels. However, it does not fit image edges well. Ban et al. proposed GMMSP (Superpixel Segmentation Using Gaussian Mixture Models), a segmentation method based on local Gaussian mixture models. In this method, every Gaussian component uses the same weight, so every superpixel has a similar expected size, which favors generating superpixels of similar size. The method is accurate, but it shows some weakness in adhering to image boundaries.
To address these problems, the invention provides a multi-scale non-iterative superpixel segmentation algorithm. The algorithm first enhances the feature representation of the pixels: on one hand, the color gradient features of the pixels in the Lab color space are obtained through Gaussian convolution; on the other hand, morphological contour features are obtained through erosion and dilation operations, so that the pixel information of the color image is expressed more fully at different scales. Overall, the method not only improves segmentation precision but also shows good boundary adherence while keeping the segmentation error rate low.
Disclosure of Invention
The invention aims to overcome the defect that traditional clustering-based superpixel segmentation methods cannot fit image edges well, and provides a multi-scale non-iterative superpixel segmentation method with high segmentation precision and high speed.
The technical scheme for realizing the purpose of the invention is as follows:
a multi-scale non-iterative superpixel segmentation method comprises the following steps:
1) dividing the image into uniform grids, obtaining K initial cluster center points, and assigning labels 1 to K to these center points;
2) obtaining the feature representation of the 4-neighborhood pixels of each cluster center point;
3) calculating the feature distance between each cluster center point and its 4-neighborhood pixels, and assigning the cluster center's class label to the neighborhood pixels in order of increasing distance;
4) computing the feature mean of the pixels with the same label, and updating the seed point to the mean feature vector;
5) if all pixels in the image have been assigned class labels, executing step 6); otherwise, returning to step 2);
6) returning all the label information and outputting the image.
In step 1), assuming that the image size (total number of pixels) is Size and the number of superpixel blocks to be generated is K, the step size Step of the uniform grid is calculated as:
Step = √(Size / K)   (1)
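For illustration, the uniform-grid initialization of step 1) may be sketched in Python as follows; the function name, the rounding of Step, and the placement of each seed at the center of its grid cell are assumptions made for the sketch rather than details taken from the patent text.

```python
import numpy as np

def init_cluster_centers(height, width, K):
    """Uniform-grid seed initialization (illustrative sketch of step 1)."""
    size = height * width                  # total number of pixels ("Size")
    step = int(round(np.sqrt(size / K)))   # Step = sqrt(Size / K), equation (1)
    centers = []
    # One seed near the center of each Step x Step grid cell, labeled 1..K in order
    for y in range(step // 2, height, step):
        for x in range(step // 2, width, step):
            centers.append((x, y))
    return step, centers
```

Depending on the image dimensions, this placement yields approximately K seeds; the patent text does not specify how any difference from exactly K is handled.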
In step 2), the features of the pixels are represented as follows: any pixel point p_i is represented by a seven-dimensional feature vector p_i = [l, a, b, x, y, g, m], where [l, a, b] are the color features in the CIELAB space of the pixel, [x, y] are the spatial features, g is the color gradient feature, and m is the morphological contour feature; wherein:
2-1) the color gradient feature is calculated as follows: for a pixel point i with coordinates (x, y), the gradient feature is obtained as follows:
The Gaussian convolution kernel is:
H(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))   (2)
The Gaussian convolution template is the (2k + 1) × (2k + 1) matrix H obtained by sampling this kernel over the window (equation (3); the original equation image is not reproduced).
2-2) the color gradient features on l, a and b are obtained as follows:
g_il = f_l Δ H   (4)
g_ia = f_a Δ H   (5)
g_ib = f_b Δ H   (6)
g_i = g_il + g_ia + g_ib   (7)
where f_l, f_a and f_b denote the pixel-value matrices of the l, a and b channels centered at pixel point i with window template size (2k + 1) × (2k + 1), where k = 1; the operator Δ denotes the horizontal and vertical convolution operations;
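As an illustration of equations (4)-(7), the following sketch convolves each Lab channel with a pair of horizontal and vertical derivative templates and sums the per-channel responses into a single gradient feature. Since the Gaussian template H of equation (3) is only available as an image, a standard 3 × 3 smoothed-derivative pair is assumed here, and taking absolute responses is likewise an assumption of the sketch.

```python
import numpy as np
from scipy.ndimage import convolve

def color_gradient_feature(lab):
    """Per-pixel color gradient feature g_i summed over the l, a, b channels.

    `lab` is an H x W x 3 array in CIELAB space. The exact Gaussian template
    of the patent is not reproduced, so a smoothed-derivative pair is assumed.
    """
    # Assumed horizontal/vertical derivative templates (not the patent's exact H)
    hx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float) / 8.0
    hy = hx.T
    g = np.zeros(lab.shape[:2])
    for ch in range(3):                       # l, a, b channels
        gx = convolve(lab[..., ch], hx)       # horizontal response
        gy = convolve(lab[..., ch], hy)       # vertical response
        g += np.abs(gx) + np.abs(gy)          # g_i = g_il + g_ia + g_ib
    return g
```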
2-3) the morphological contour feature is calculated as follows: for a pixel point i with coordinates (x, y),
J_Dilation(x, y) = max_{(x′, y′) ∈ S} I(x + x′, y + y′)   (8)
J_Erosion(x, y) = min_{(x′, y′) ∈ S} I(x + x′, y + y′)   (9)
m_i = J_Dilation(x, y) − J_Erosion(x, y)   (10)
Equation (8) denotes the dilation operation on the image, i.e. the value at (x, y) is replaced by the maximum over the 3 × 3 neighborhood pixels (x + x′, y + y′); equation (9) denotes the erosion operation, i.e. the value at (x, y) is replaced by the minimum over the 3 × 3 neighborhood pixels (x + x′, y + y′); subtracting the erosion result from the dilation result gives the morphological contour feature of the image.
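Equations (8)-(10) correspond to the standard morphological gradient; a minimal sketch using SciPy grey-scale morphology over a 3 × 3 window is given below. Applying it to a single intensity channel is an assumption of the sketch.

```python
from scipy.ndimage import grey_dilation, grey_erosion

def morphological_contour_feature(gray):
    """Morphological contour feature m_i = dilation - erosion over a 3x3 window.

    `gray` is an H x W intensity image; this sketches equations (8)-(10).
    """
    dilated = grey_dilation(gray, size=(3, 3))   # max over 3x3 neighborhood, eq. (8)
    eroded = grey_erosion(gray, size=(3, 3))     # min over 3x3 neighborhood, eq. (9)
    return dilated - eroded                      # eq. (10)
```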
In step 3), the feature distance between the cluster center point and its 4-neighborhood pixels is calculated as follows:
for any two pixel points p_i = [l_i, a_i, b_i, x_i, y_i, g_i, m_i] and p_c = [l_c, a_c, b_c, x_c, y_c, g_c, m_c],
the color distance is defined as:
d_c = (l_i − l_c)² + (a_i − a_c)² + (b_i − b_c)²   (11)
the spatial distance is defined as:
d_s = (x_i − x_c)² + (y_i − y_c)²   (12)
the gradient distance is defined as:
d_g = (g_i − g_c)²   (13)
the morphological contour feature distance is defined as:
d_m = m_i − m_c   (14)
the feature distance is defined as:
D = √( d_c / m² + d_s / s² + γ · d_g + λ · d_m )   (15)
where s represents the maximum spatial distance within a class, m represents the maximum color distance, and γ and λ are hyperparameters.
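A sketch of the weighted feature distance follows. Because equation (15) appears only as an image in the original, the exact way the four terms are combined under the square root is a plausible reconstruction consistent with SNIC-style normalization by s and m, not a verbatim copy of the patent's formula.

```python
import numpy as np

def feature_distance(p_i, p_c, s, m, gamma, lam):
    """Weighted distance between 7-D features [l, a, b, x, y, g, m_feat].

    s and m are the within-class normalizers (maximum spatial and color
    distance); gamma and lam weight the gradient and contour terms. The
    combination under the square root is an assumed reconstruction of eq. (15).
    """
    li, ai, bi, xi, yi, gi, mi = p_i
    lc, ac, bc, xc, yc, gc, mc = p_c
    d_c = (li - lc) ** 2 + (ai - ac) ** 2 + (bi - bc) ** 2   # color, eq. (11)
    d_s = (xi - xc) ** 2 + (yi - yc) ** 2                    # spatial, eq. (12)
    d_g = (gi - gc) ** 2                                     # gradient, eq. (13)
    d_m = mi - mc                                            # contour, eq. (14), signed as written
    return np.sqrt(d_c / m ** 2 + d_s / s ** 2 + gamma * d_g + lam * d_m)
```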
The invention provides a multi-scale non-iterative superpixel segmentation method that introduces color gradient features and morphological contour features and realizes superpixel segmentation with a custom weighted distance within a K-means-based non-iterative clustering framework. Test results show that, compared with other mainstream methods, the proposed method effectively improves superpixel segmentation quality while maintaining low time complexity.
Drawings
FIG. 1 is a block flow diagram of a multi-scale non-iterative superpixel segmentation method of the present invention;
Fig. 2 is an original image from the data set.
Fig. 3 shows the result of superpixel segmentation of the image in Fig. 2 using the prior-art SNIC method.
Fig. 4 shows the result of superpixel segmentation of the image in Fig. 2 using the method of this embodiment.
Detailed Description
The invention is further illustrated but not limited by the following figures and examples.
A multi-scale non-iterative superpixel segmentation method, as shown in fig. 1, comprising the steps of:
initializing a clustering center point:
1) dividing the image into uniform grids, taking the center points of the grids as cluster centers, obtaining K initial cluster center points, and assigning labels 1 to K to these center points; the specific division is as follows: assuming that the image size (total number of pixels) is Size and the number of superpixel blocks to be generated is K, the step size Step of the uniform grid is calculated as:
Step = √(Size / K)   (1)
2) obtaining the multi-scale feature representation of each cluster center point; any pixel point p_i is represented by a seven-dimensional feature vector p_i = [l, a, b, x, y, g, m], where [l, a, b] are the color features in the CIELAB space of the pixel, [x, y] are the spatial features, g is the color gradient feature, and m is the morphological contour feature; wherein:
2-1) the color gradient feature is calculated as follows: for a pixel point i with coordinates (x, y), the gradient feature is obtained as follows:
The Gaussian convolution kernel is:
H(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))   (2)
The Gaussian convolution template is the (2k + 1) × (2k + 1) matrix H obtained by sampling this kernel over the window (equation (3); the original equation image is not reproduced).
2-2) the color gradient features on l, a and b are obtained as follows:
g_il = f_l Δ H   (4)
g_ia = f_a Δ H   (5)
g_ib = f_b Δ H   (6)
g_i = g_il + g_ia + g_ib   (7)
where f_l, f_a and f_b denote the pixel-value matrices of the l, a and b channels centered at pixel point i with window template size (2k + 1) × (2k + 1), where k = 1; the operator Δ denotes the horizontal and vertical convolution operations.
2-3) the morphological contour feature is calculated as follows: for a pixel point i with coordinates (x, y),
J_Dilation(x, y) = max_{(x′, y′) ∈ S} I(x + x′, y + y′)   (8)
J_Erosion(x, y) = min_{(x′, y′) ∈ S} I(x + x′, y + y′)   (9)
m_i = J_Dilation(x, y) − J_Erosion(x, y)   (10)
where S is the 3 × 3 neighborhood window.
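Putting the feature definitions of steps 2-1) to 2-3) together, the seven-dimensional feature image can be assembled as sketched below; color_gradient_feature and morphological_contour_feature refer to the earlier sketches, and using the L channel as the input to the morphological operations is an assumption rather than something stated in the patent.

```python
import numpy as np

def build_feature_image(lab):
    """Assemble the 7-D feature vector [l, a, b, x, y, g, m] for every pixel.

    `lab` is an H x W x 3 CIELAB image. Returns an H x W x 7 array.
    """
    h, w = lab.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]                       # spatial features [x, y]
    g = color_gradient_feature(lab)                   # color gradient feature g
    m = morphological_contour_feature(lab[..., 0])    # contour feature m (on the L channel, assumed)
    return np.dstack([lab, xs, ys, g, m]).astype(float)
```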
3) calculating the feature distance between each cluster center point and its 4-neighborhood pixels, and assigning the cluster center's class label to the neighborhood pixels in order of increasing distance; the feature distance is computed as follows:
Non-iterative clustering:
3-1) sequentially obtaining the multi-scale feature representation of the 4-neighborhood pixels of the cluster center point;
3-2) computing the feature distance between the cluster center point and each neighborhood pixel, as follows:
For any two pixel points p_i = [l_i, a_i, b_i, x_i, y_i, g_i, m_i] and p_c = [l_c, a_c, b_c, x_c, y_c, g_c, m_c], the color distance is defined as:
d_c = (l_i − l_c)² + (a_i − a_c)² + (b_i − b_c)²   (11)
the spatial distance is defined as:
d_s = (x_i − x_c)² + (y_i − y_c)²   (12)
the gradient distance is defined as:
d_g = (g_i − g_c)²   (13)
the morphological contour feature distance is defined as:
d_m = m_i − m_c   (14)
the feature distance is defined as:
D = √( d_c / m² + d_s / s² + γ · d_g + λ · d_m )   (15)
where s represents the maximum spatial distance within a class, m represents the maximum color distance, and γ and λ are hyperparameters; labels of the cluster center points are assigned to the neighborhood points in order of increasing feature distance from the cluster center.
4) computing the coordinate and feature mean of the pixels with the same class label, and updating the seed point's feature representation accordingly;
5) if all pixels in the image have been assigned class labels, executing step 6); otherwise, returning to step 2) and repeating until every pixel in the image has been assigned a label;
6) returning the label image: through the above steps the label of every pixel is obtained, pixels with the same label form a superpixel block, and the segmented image is finally output (an illustrative sketch of this assignment loop is given below).
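The non-iterative assignment of steps 3) to 6) can be sketched with a priority queue in the style of SNIC, as follows. The running-mean seed update, the 4-neighborhood expansion order, and the use of the label index for tie-breaking are assumptions of the sketch rather than details quoted from the patent.

```python
import heapq
import numpy as np

def non_iterative_clustering(features, centers, dist_fn):
    """SNIC-style non-iterative label assignment (illustrative sketch).

    `features` is an H x W x 7 array of per-pixel feature vectors,
    `centers` a list of (x, y) seed coordinates, and `dist_fn` a two-argument
    distance between a pixel feature and a seed feature.
    """
    h, w = features.shape[:2]
    labels = -np.ones((h, w), dtype=int)
    # Running sums and counts for updating each seed to the mean feature of its members
    sums = [np.zeros(features.shape[2]) for _ in centers]
    counts = [0] * len(centers)
    heap = [(0.0, k, x, y) for k, (x, y) in enumerate(centers)]
    heapq.heapify(heap)

    while heap:
        d, k, x, y = heapq.heappop(heap)
        if labels[y, x] != -1:
            continue                          # already labeled by a closer seed
        labels[y, x] = k
        sums[k] += features[y, x]
        counts[k] += 1
        seed = sums[k] / counts[k]            # updated mean feature vector
        # Push the 4-neighborhood of the newly labeled pixel
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and labels[ny, nx] == -1:
                nd = dist_fn(features[ny, nx], seed)
                heapq.heappush(heap, (nd, k, nx, ny))
    return labels
```

Here dist_fn would be, for example, the feature_distance sketch above with s, m, γ and λ fixed in advance.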
Example:
In practical applications, too few superpixel blocks cannot provide good boundary preservation, while too many superpixels easily lead to over-segmentation; therefore, in this embodiment the performance comparison of superpixel segmentation methods is carried out with K = 500.
Table 1. Evaluation of superpixel segmentation results on the BSDS500 data set (K = 500). (The table values appear as an image in the original document and are not reproduced here.)
Table 1 lists the values of each evaluation index for the proposed method and the PB, SLIC, SCoW, SNIC, and SQL algorithms when the number of superpixel blocks is K = 500. At comparable time complexity, the proposed method not only improves segmentation precision but also, in overall effect, shows good boundary adherence while keeping the segmentation error rate low.
Intuitively, Fig. 2 is segmented into superpixels with K = 500: the SNIC method produces the result shown in Fig. 3, while this embodiment, using the MNSS method described above, produces the result shown in Fig. 4. Comparing the two, the MNSS algorithm fits object edges better than the SNIC algorithm, for example along the aircraft fuselage and tail, the ox horns, and the wild geese against their backgrounds. In summary, the MNSS method, which introduces multi-scale features into the SNIC non-iterative clustering framework, is not only theoretically feasible but also gives considerable results in practice.

Claims (3)

1. A multi-scale non-iterative superpixel segmentation method is characterized by comprising the following steps:
1) dividing an image into uniform grids, obtaining K initial cluster center points, and assigning labels 1 to K to these center points;
2) obtaining the feature representation of the 4-neighborhood pixels of each cluster center point;
3) calculating the feature distance between each cluster center point and its 4-neighborhood pixels, and assigning the cluster center's class label to the neighborhood pixels in order of increasing distance;
4) computing the feature mean of the pixels with the same label, and updating the seed point to the mean feature vector;
5) if all pixels in the image have been assigned class labels, executing step 6); otherwise, returning to step 2);
6) returning all label information and outputting the image;
in step 2), the features of the pixels are represented as follows: any pixel point p_i is represented by a seven-dimensional feature vector p_i = [l, a, b, x, y, g, m], where [l, a, b] are the color features in the CIELAB space of the pixel, [x, y] are the spatial features, g is the color gradient feature, and m is the morphological contour feature; wherein:
2-1) the color gradient feature is calculated as follows: for a pixel point i with coordinates (x, y), the gradient feature is obtained as follows:
the Gaussian convolution kernel is:
H(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))   (2)
the Gaussian convolution template is the (2k + 1) × (2k + 1) matrix H obtained by sampling this kernel over the window (equation (3); the original equation image is not reproduced);
2-2) the color gradient features on l, a and b are obtained as follows:
g_il = f_l Δ H   (4)
g_ia = f_a Δ H   (5)
g_ib = f_b Δ H   (6)
g_i = g_il + g_ia + g_ib   (7)
where f_l, f_a and f_b denote the pixel-value matrices of the l, a and b channels centered at pixel point i with window template size (2k + 1) × (2k + 1), where k = 1; the operator Δ denotes the horizontal and vertical convolution operations;
2-3) the morphological contour feature is calculated as follows: for a pixel point i with coordinates (x, y),
J_Dilation(x, y) = max_{(x′, y′) ∈ S} I(x + x′, y + y′)   (8)
J_Erosion(x, y) = min_{(x′, y′) ∈ S} I(x + x′, y + y′)   (9)
m_i = J_Dilation(x, y) − J_Erosion(x, y)   (10)
equation (8) denotes the dilation operation on the image, i.e. the value at (x, y) is replaced by the maximum over the 3 × 3 neighborhood pixels (x + x′, y + y′); equation (9) denotes the erosion operation, i.e. the value at (x, y) is replaced by the minimum over the 3 × 3 neighborhood pixels (x + x′, y + y′); subtracting the erosion result from the dilation result gives the morphological contour feature of the image.
2. The method as claimed in claim 1, wherein in step 1), assuming that the image size (total number of pixels) is Size and the number of superpixel blocks to be generated is K, the step size Step of the uniform grid is calculated by:
Step = √(Size / K)   (1)
3. The multi-scale non-iterative superpixel segmentation method according to claim 1, wherein in step 3), the feature distance between the cluster center point and its 4-neighborhood pixels is calculated as follows:
for any two pixel points p_i = [l_i, a_i, b_i, x_i, y_i, g_i, m_i] and p_c = [l_c, a_c, b_c, x_c, y_c, g_c, m_c],
the color distance is defined as:
d_c = (l_i − l_c)² + (a_i − a_c)² + (b_i − b_c)²   (11)
the spatial distance is defined as:
d_s = (x_i − x_c)² + (y_i − y_c)²   (12)
the gradient distance is defined as:
d_g = (g_i − g_c)²   (13)
the morphological contour feature distance is defined as:
d_m = m_i − m_c   (14)
the feature distance is defined as:
D = √( d_c / m² + d_s / s² + γ · d_g + λ · d_m )   (15)
where s represents the maximum spatial distance within a class, m represents the maximum color distance, and γ and λ are hyperparameters.
CN202010948185.4A 2020-09-10 2020-09-10 Multi-scale non-iterative superpixel segmentation method Active CN112085749B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010948185.4A CN112085749B (en) 2020-09-10 2020-09-10 Multi-scale non-iterative superpixel segmentation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010948185.4A CN112085749B (en) 2020-09-10 2020-09-10 Multi-scale non-iterative superpixel segmentation method

Publications (2)

Publication Number Publication Date
CN112085749A CN112085749A (en) 2020-12-15
CN112085749B true CN112085749B (en) 2022-07-05

Family

ID=73736338

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010948185.4A Active CN112085749B (en) 2020-09-10 2020-09-10 Multi-scale non-iterative superpixel segmentation method

Country Status (1)

Country Link
CN (1) CN112085749B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109635809A (en) * 2018-11-02 2019-04-16 浙江工业大学 A kind of superpixel segmentation method towards vision degraded image
CN110796667A (en) * 2019-10-22 2020-02-14 辽宁工程技术大学 Color image segmentation method based on improved wavelet clustering
CN111583279A (en) * 2020-05-12 2020-08-25 重庆理工大学 Super-pixel image segmentation method based on PCBA

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2012258421A1 (en) * 2012-11-30 2014-06-19 Canon Kabushiki Kaisha Superpixel-based refinement of low-resolution foreground segmentation
US10650531B2 (en) * 2018-03-16 2020-05-12 Honda Motor Co., Ltd. Lidar noise removal using image pixel clusterings
CN110503656A (en) * 2019-08-28 2019-11-26 苏州大学 A kind of superpixel segmentation method and relevant device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109635809A (en) * 2018-11-02 2019-04-16 浙江工业大学 A kind of superpixel segmentation method towards vision degraded image
CN110796667A (en) * 2019-10-22 2020-02-14 辽宁工程技术大学 Color image segmentation method based on improved wavelet clustering
CN111583279A (en) * 2020-05-12 2020-08-25 重庆理工大学 Super-pixel image segmentation method based on PCBA

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Superpixels and polygons using simple non-iterative clustering; Radhakrishna Achanta; 2017 IEEE Conference on Computer Vision and Pattern Recognition; 2017-12-31; full text *
A new image superpixel segmentation method (一种新的图像超像素分割方法); 廖苗; Journal of Electronics & Information Technology (电子与信息学报); 2020-02-15; full text *
Research on superpixel-based image segmentation methods (基于超像素的图像分割方法研究); 袁旭; China Master's Theses Full-text Database, Information Science and Technology; 2020-03-15; pp. 26-31 *

Also Published As

Publication number Publication date
CN112085749A (en) 2020-12-15

Similar Documents

Publication Publication Date Title
CN107194872B (en) Remote sensed image super-resolution reconstruction method based on perception of content deep learning network
Luo et al. Unsupervised multiscale color image segmentation based on MDL principle
CN106097252B (en) High spectrum image superpixel segmentation method based on figure Graph model
CN109741358B (en) Superpixel segmentation method based on adaptive hypergraph learning
CN111340697B (en) Image super-resolution method based on clustered regression
CN110334762A (en) A kind of feature matching method combining ORB and SIFT based on quaternary tree
CN110634147A (en) Image matting method based on bilateral boot up-sampling
CN110853064B (en) Image collaborative segmentation method based on minimum fuzzy divergence
CN112270697B (en) Satellite sequence image moving target detection method combined with super-resolution reconstruction
CN113177592B (en) Image segmentation method and device, computer equipment and storage medium
CN106096651B (en) Polarization SAR terrain classification method based on super-pixel and metric learning
CN113705579A (en) Automatic image annotation method driven by visual saliency
Srinivas et al. Remote sensing image segmentation using OTSU algorithm
KR20120000732A (en) An automatic segmentation method for object-based analysis using high resolution satellite imagery
CN112766102A (en) Unsupervised hyperspectral video target tracking method based on space-spectrum feature fusion
CN113450376A (en) Cotton plant edge detection method based on FPGA
CN109191482B (en) Image merging and segmenting method based on regional adaptive spectral angle threshold
CN112884795A (en) Power transmission line inspection foreground and background segmentation method based on multi-feature significance fusion
CN105701810B (en) A kind of unmanned plane image electronic based on click type image segmentation is sketched method
CN112085749B (en) Multi-scale non-iterative superpixel segmentation method
CN114119924A (en) Three-dimensional model fuzzy texture feature saliency method based on condition generation countermeasure
CN111899278B (en) Unmanned aerial vehicle image rapid target tracking method based on mobile terminal
CN110910417B (en) Weak and small moving target detection method based on super-pixel adjacent frame feature comparison
CN108765384B (en) Significance detection method for joint manifold sequencing and improved convex hull
CN108537798B (en) Rapid super-pixel segmentation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant