CN103336974A - Flower and plant category recognition method based on local constraint sparse characterization - Google Patents

Flower and plant category recognition method based on local constraint sparse characterization

Info

Publication number
CN103336974A
CN103336974A
Authority
CN
China
Prior art keywords
flowers
feature
classification
flower
test data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013102506935A
Other languages
Chinese (zh)
Other versions
CN103336974B (en)
Inventor
郭礼华 (Guo Lihua)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201310250693.5A priority Critical patent/CN103336974B/en
Publication of CN103336974A publication Critical patent/CN103336974A/en
Application granted granted Critical
Publication of CN103336974B publication Critical patent/CN103336974B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a flower and plant category recognition method based on locality-constrained sparse representation, which comprises the following steps. First, a flower image database is collected and a flower popular-science knowledge base is built; then several kinds of image features are extracted from each flower picture. Next, a linear representation of the test data features by the training data features is built using sparse coding theory: when the linear representation is built, the error of representing the test image features by the training image features is minimized while a weight constraint enforcing a locally similar structure between the test data features and the training data features is added, and the problem is solved efficiently with a stochastic gradient descent method extended by kernel functions, which completes the flower category recognition learning process. Finally, the features extracted from the flower image to be recognized are substituted into the flower category decision formula to obtain the recognition result, and the corresponding explanatory text is retrieved from the flower popular-science knowledge base. The method has the advantage of high recognition performance.

Description

Flower category recognition method based on locality-constrained sparse representation
Technical field
The present invention relates to the fields of pattern recognition and artificial intelligence, and in particular to a flower category recognition method based on locality-constrained sparse representation.
Background technology
Flower category recognition means using a computer to extract features from flower image information and to classify and understand the images in a way that follows human understanding and reasoning, so that popular-science knowledge about the flowers can then be provided to the user. It belongs to the field of automatic object recognition by computer. At present there are few patents on flower category recognition, and only a small number of papers have been published in academia. The paper (Yuning Chai, Victor Lempitsky, Andrew Zisserman. BiCoS: A Bi-level Co-Segmentation Method for Image Classification. ICCV, 2011) uses image segmentation to divide an image into foreground and background, extracts the color distribution and super-pixel information of the image, and recognizes flowers with an efficient inference algorithm. The paper (Nilsback, M-E. and Zisserman, A. Automated flower classification over a large number of classes. Proceedings of the Indian Conference on Computer Vision, Graphics and Image Processing, 2008) extracts three kinds of features, the color histogram, SIFT features and HOG features, and then classifies flower categories with an SVM classifier. The paper (Nilsback, M-E. and Zisserman, A. A Visual Vocabulary for Flower Classification. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2006) uses BOW features to classify flower images.
Summary of the invention
In order to overcome the above-mentioned shortcomings and deficiencies of the prior art, the object of the present invention is to provide a flower category recognition method based on locality-constrained sparse representation with high recognition performance.
The object of the present invention is achieved through the following technical solution: a flower category recognition method based on locality-constrained sparse representation, comprising the following steps:
(1) Collect a flower image database and build a flower popular-science knowledge base: using the names of S commonly used flower categories defined in an existing plant and flower wiki, search with a web search engine for the text introduction and the pictures corresponding to each flower category; the pictures form the flower image database, and the explanatory text is put into the flower popular-science knowledge base;
(2) Perform feature extraction on all pictures in the flower image database obtained in step (1); m features are extracted from each picture;
(3) Flower category recognition learning process:
(3-1) Select p pictures from the flower image database and use their features as the test data set Y = {y_k}, where k = 1, ..., m indexes the m features extracted from each picture, and p < N, with N the number of pictures in the flower image database; the features of the remaining N − p pictures in the flower image database form the training data set X_j^k, where j = 1, 2, ..., S indexes the flower category;
(3-2) Express the test data feature y_k linearly with the training data features X_j^k as y_k ≈ Σ_{j=1}^{S} X_j^k w_j^k, where w_j^k are the weight coefficients with which the training data set represents the test data set, and their values are greater than 0;
(3-3) To minimize the error of the linear representation of the test data set by the training data set, while adding a weight constraint on the locally similar feature structure of the test data features and the training data features, establish the following optimization learning criterion:

$$\min_{w_j^k}\ \frac{1}{2}\sum_{k=1}^{m}\Big\|y_k-\sum_{j=1}^{S}X_j^k w_j^k\Big\|_2^2+\lambda\sum_{j=1}^{S}\big\|D_j^k\odot w_j^k\big\|^2 \qquad (1)$$

where ⊙ denotes the element-wise (vector) product; λ is the constraint-term weight, a constant that balances the linear-representation error against the weight coefficients; and D_j^k is the Euclidean distance between the test data feature y_k and the training data features X_j^k;
(3-4) Use the stochastic gradient descent (SGD) method to update w_j^k iteratively; the update equation is

$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-X_j^{k\top}y_k+X_j^{k\top}X_j^k\,w_j^{k,t}+\lambda\,\big\|y_k-X_j^k\big\|_2^2\cdot w_j^{k,t} \qquad (2)$$

where t is the iteration count of the SGD process and η is the learning rate of the stochastic gradient descent iterations;
(3-5) Use a non-linear function φ to map the features of the training and test data into a high-dimensional reproducing kernel Hilbert space (RKHS), i.e. φ(x_i)^T φ(x_j) = g(x_i, x_j), where g(x_i, x_j) is the χ² kernel function and x_i, x_j are data features. Equation (2) then becomes

$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-h_k+G_k\,w_j^{k,t}+\lambda\,(P_k-2h_k+G_k)\cdot w_j^{k,t} \qquad (3)$$

where h_k = φ(X_j^k)^T φ(y_k) is the dot-product kernel between the training data features X_j^k and the test data feature y_k, G_k = φ(X_j^k)^T φ(X_j^k) is the dot-product kernel of the training data features with themselves, and P_k = φ(y_k)^T φ(y_k) is the dot-product kernel of the test data feature with itself. After several iterations of equation (3), the optimal representation weight coefficients w_j^k are obtained;
(4) Recognize flowers: the user takes an image of the flower to be recognized, and features Z_k are extracted from the flower image to be recognized, where k = 1, ..., m indexes the m features extracted from each picture. According to Z_k, the flower category is recognized by the flower category decision formula, and the explanatory text corresponding to this flower category is retrieved from the flower popular-science knowledge base. The flower category decision formula is

$$j^{*}=\arg\min_{j}\ \sum_{k=1}^{m}\Big\|Z_k-X_j^k w_j^k\Big\|_2^2$$

where j* is the class whose training data gives the minimum linear-representation error for the test data; it is selected by taking the minimum, and j* is the recognized flower category.
In step (3-4), η decreases as the iteration count t increases, η = 1/(1 + 100t).
The expression of the χ² kernel function described in step (3-5) is exp(−χ²(x_i, x_j)/μ), where χ² is the symmetric Chi-squared distance and μ is the mean χ² distance over the current training data set.
Using the names of S commonly used flower categories defined in an existing plant and flower wiki and searching with a web search engine for the text introduction and pictures corresponding to each flower category, as described in step (1), specifically means: using the names of 313 commonly used herbaceous flower categories defined in an existing plant and flower wiki, searching with a web search engine for the text introduction and pictures corresponding to each flower category, and downloading 100 pictures for each flower category.
The features extracted from each picture comprise the color histogram, the SIFT feature (Scale-Invariant Feature Transform), the HOG feature (Histogram of Oriented Gradients), the BOW feature (Bag of Words), the SSIM feature (Structural Similarity) and the GB feature (Geometric Blur).
Compared with the prior art, the present invention has the following advantages and beneficial effects:
(1) In the flower category recognition learning process, the optimization learning criterion not only considers the principle of minimizing the current linear-representation error, but also models the locally similar feature structure of the images as a constraint term in the learning criterion. As a result, when an image is represented linearly, the system preferentially selects the most similar image features for the linear representation, which guarantees the sparsity of the linear coefficients and at the same time improves the recognition performance of the system.
(2) The framework of the present invention can seamlessly incorporate additional image features, which facilitates subsequent system upgrades; image features with more discriminative power can further improve the recognition performance of the system.
(3) The recognition method of the present invention can be applied well in a practical flower popular-science system, because the recognition modeling method of the present invention has high recognition performance, which guarantees the robustness and stability of the flower popular-science system.
Description of drawings
Fig. 1 is a flow chart of the flower category recognition method based on locality-constrained sparse representation of this embodiment.
Fig. 2 is the main flow chart of the stochastic gradient descent method of this embodiment.
Embodiment
The present invention is described in further detail below in conjunction with an embodiment, but the implementations of the present invention are not limited thereto.
Embodiment
As shown in Fig. 1, the flower category recognition method based on locality-constrained sparse representation of this embodiment comprises the following steps:
(1) Collect a flower image database and build a flower popular-science knowledge base: using the names of 313 commonly used herbaceous flower categories defined in an existing plant and flower wiki, search with web search engines such as Google and Baidu for the text introduction and pictures corresponding to each flower category, and download 100 pictures for each flower category. The pictures form the flower image database; the explanatory text is put into the flower popular-science knowledge base.
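A minimal sketch (not part of the patent text) of one possible way to organize the downloaded material into the image database and the popular-science knowledge base described in step (1); the directory layout, file names `flower_db`, `intro.txt` and `flower_knowledge.json` are assumptions for illustration only.

```python
# Sketch only: index downloaded pictures per category and collect the text
# introductions into a simple JSON knowledge base. All paths are hypothetical.
import json
from pathlib import Path

DATA_ROOT = Path("flower_db")                 # assumed layout: flower_db/<category>/*.jpg
KNOWLEDGE_FILE = Path("flower_knowledge.json")

def build_database(data_root: Path, knowledge_file: Path):
    """Return {category: [image paths]} and write {category: intro text} to JSON."""
    image_db = {d.name: sorted(str(p) for p in d.glob("*.jpg"))
                for d in data_root.iterdir() if d.is_dir()}
    # Each category directory is assumed to contain an intro.txt with the
    # popular-science description gathered from the search engine.
    knowledge = {d.name: (d / "intro.txt").read_text(encoding="utf-8")
                 for d in data_root.iterdir() if (d / "intro.txt").exists()}
    knowledge_file.write_text(json.dumps(knowledge, ensure_ascii=False, indent=2),
                              encoding="utf-8")
    return image_db, knowledge
```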
(2) Perform feature extraction on all pictures in the flower image database obtained in step (1); 6 features are extracted from each picture, comprising the color histogram, the SIFT feature (for implementation details see Lowe, D.G., Distinctive Image Features from Scale-Invariant Keypoints, International Journal of Computer Vision, 60, 2, pp. 91-110, 2004), the HOG feature (for implementation details see N. Dalal and B. Triggs, Histograms of oriented gradients for human detection, in Computer Vision and Pattern Recognition, pp. 887-893, IEEE, 2005), the BOW feature (for implementation details see S. Lazebnik, C. Schmid, and J. Ponce, Beyond bags of features: Spatial pyramid matching for recognizing natural scene categories, in Computer Vision and Pattern Recognition, pp. 2169-2178, IEEE, 2006), the SSIM feature (for implementation details see E. Shechtman and M. Irani, Matching local self-similarities across images and videos, in Proc. CVPR, 2007), and the GB feature (for implementation details see A.C. Berg, T.L. Berg, and J. Malik, Shape matching and object recognition using low distortion correspondences, in Proc. CVPR, 2005).
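A minimal sketch (not the cited reference implementations) of extracting two of the six per-image features named above, a color histogram and a HOG descriptor, using scikit-image and NumPy; the bin count, HOG parameters and the example file name are illustrative assumptions, and the remaining features (SIFT, BOW, SSIM, GB) would be computed from the cited references.

```python
import numpy as np
from skimage import color, io
from skimage.feature import hog

def color_histogram(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Per-channel color histogram, concatenated and L1-normalized (8-bit image assumed)."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    h = np.concatenate(hists).astype(np.float64)
    return h / max(h.sum(), 1.0)

def hog_feature(image: np.ndarray) -> np.ndarray:
    """HOG descriptor on the gray-scale image, Dalal-Triggs style parameters."""
    gray = color.rgb2gray(image)
    return hog(gray, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

img = io.imread("example_flower.jpg")   # hypothetical input picture
features = {"color_hist": color_histogram(img), "hog": hog_feature(img)}
```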
(3) Flower category recognition learning process:
(3-1) Select 30 pictures from the flower image database and use their features as the test data set Y = {y_k}, where k = 1, ..., 6 indexes the 6 kinds of image features: color histogram, SIFT feature, HOG feature, BOW feature, SSIM feature and GB feature; the features of the remaining 31270 pictures in the flower image database form the training data set X_j^k, where j = 1, ..., 313 indexes the flower category;
(3-2) Express the test data feature y_k linearly with the training data features X_j^k as y_k ≈ Σ_{j=1}^{313} X_j^k w_j^k, where w_j^k are the weight coefficients with which the training data set represents the test data set, and their values are greater than 0;
(3-3) To minimize the error of the linear representation of the test data set by the training data set, while adding a weight constraint on the locally similar feature structure of the test data features and the training data features, establish the following optimization learning criterion:

$$\min_{w_j^k}\ \frac{1}{2}\sum_{k=1}^{m}\Big\|y_k-\sum_{j=1}^{S}X_j^k w_j^k\Big\|_2^2+\lambda\sum_{j=1}^{S}\big\|D_j^k\odot w_j^k\big\|^2 \qquad (1)$$

where ⊙ denotes the element-wise (vector) product; λ is the constraint-term weight, a constant that balances the linear-representation error against the weight coefficients (set to 0.01 in this embodiment); and D_j^k is the locality descriptor of the test data feature y_k with respect to the training data features X_j^k similar to it, concretely defined as the Euclidean distance between the test data feature y_k and the training data features X_j^k. Under this optimization learning criterion, similar test features are guaranteed to select similar training features for the linear representation. To make the final linear-representation coefficients sparse, weight coefficients w_j^k below a certain threshold (0.005 in this embodiment) are generally set to 0.
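A minimal sketch (an illustration, not the patent's code) of evaluating the locality-constrained objective of equation (1) for one feature channel k, with the training features of all classes concatenated column-wise into a single matrix X; the function names, the default λ = 0.01 and the 0.005 threshold follow this embodiment, everything else is an assumption.

```python
import numpy as np

def locality_weights(y: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Euclidean distance from the test feature y to every training column of X."""
    return np.linalg.norm(X - y[:, None], axis=0)            # shape (n,)

def objective(y: np.ndarray, X: np.ndarray, w: np.ndarray, lam: float = 0.01) -> float:
    """0.5 * ||y - X w||_2^2 + lam * ||d * w||^2, with d * w the element-wise product."""
    d = locality_weights(y, X)
    recon = y - X @ w
    return 0.5 * float(recon @ recon) + lam * float(np.sum((d * w) ** 2))

def sparsify(w: np.ndarray, thresh: float = 0.005) -> np.ndarray:
    """Zero out weights below the threshold, as described in the embodiment."""
    w = w.copy()
    w[w < thresh] = 0.0
    return w
```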
(3-4) Use the stochastic gradient descent method (SGD) to update w_j^k iteratively; the update equation is

$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-X_j^{k\top}y_k+X_j^{k\top}X_j^k\,w_j^{k,t}+\lambda\,\big\|y_k-X_j^k\big\|_2^2\cdot w_j^{k,t} \qquad (2)$$

where t is the iteration count of the SGD process and η is the learning rate of the stochastic gradient descent iterations. To ensure that the learning criterion converges effectively, this embodiment sets the learning rate η to decrease as the iteration count t increases, η = 1/(1 + 100t).
The main flow of the stochastic gradient descent method is shown in Fig. 2. First the weight coefficients w_j^k are initialized; then a subset of the training data set X_j^k is selected at random, and the kernels between this training data and the test data, h_k and G_k, are computed. Then, for i = 1, ..., n, the weight-coefficient update of equation (3) is carried out n times. After these iterations, the value of the optimization objective in equation (1) is computed; if it is smaller than the value of the previous iteration round, the training data set is randomized again and the next iteration cycle is carried out; otherwise the iteration is considered to have found the optimal weight coefficients, the iteration ends, and the optimal linear-representation weight coefficients are output.
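A minimal sketch (an illustration under the stated update rule, not the patent's code) of the SGD iterations using the non-kernelized update of equation (2), the decaying learning rate η = 1/(1 + 100t) of this embodiment, and a Fig. 2-style outer loop over random column subsets; the subset size, iteration counts and the added non-negativity projection (the description requires positive weights) are assumptions. The kernelized variant of equation (3) is sketched after step (3-5).

```python
import numpy as np

def sgd_update(y, X, w, t, lam=0.01):
    """One update of equation (2): w <- w - eta * grad, then project to w >= 0."""
    eta = 1.0 / (1.0 + 100.0 * t)
    grad = (-X.T @ y + X.T @ X @ w
            + lam * np.linalg.norm(y[:, None] - X, axis=0) ** 2 * w)
    return np.maximum(w - eta * grad, 0.0)

def run_sgd(y, X, n_outer=20, n_inner=10, lam=0.01, rng=None):
    """Outer loop sketched after Fig. 2: random column subsets, n inner iterations,
    stop when the reconstruction objective no longer decreases."""
    if rng is None:
        rng = np.random.default_rng(0)
    w = np.full(X.shape[1], 1.0 / X.shape[1])   # simple uniform initialization
    best, t = np.inf, 0
    for _ in range(n_outer):
        cols = rng.choice(X.shape[1], size=max(1, X.shape[1] // 2), replace=False)
        for _ in range(n_inner):
            t += 1
            w[cols] = sgd_update(y, X[:, cols], w[cols], t, lam)
        obj = 0.5 * np.sum((y - X @ w) ** 2)
        if obj >= best:                          # objective stopped improving
            break
        best = obj
    return w
```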
(3-5) Use a non-linear function φ to map the features of the training and test data into a high-dimensional reproducing kernel Hilbert space (RKHS), i.e. φ(x_i)^T φ(x_j) = g(x_i, x_j), where g(x_i, x_j) is the χ² kernel function and x_i, x_j are data features. The expression of the χ² kernel function is exp(−χ²(x_i, x_j)/μ), where χ² is the symmetric Chi-squared distance and μ is the mean χ² distance over the current training data set.
Equation (2) is then transformed into

$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-h_k+G_k\,w_j^{k,t}+\lambda\,(P_k-2h_k+G_k)\cdot w_j^{k,t} \qquad (3)$$

where h_k = φ(X_j^k)^T φ(y_k) is the dot-product kernel between the training data features X_j^k and the test data feature y_k, G_k = φ(X_j^k)^T φ(X_j^k) is the dot-product kernel of the training data features with themselves, and P_k = φ(y_k)^T φ(y_k) is the dot-product kernel of the test data feature with itself. After several iterations of equation (3), the optimal representation weight coefficients w_j^k are obtained.
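A minimal sketch (an illustration, not the patent's code) of the χ² kernel exp(−χ²/μ) and of one kernelized update following equation (3); the last constraint term is interpreted per training column (the diagonal of G), the non-negativity projection is an added assumption, and all function names are hypothetical.

```python
import numpy as np

def chi2_distance(a, b, eps=1e-10):
    """Symmetric Chi-squared distance between two non-negative feature histograms."""
    return 0.5 * np.sum((a - b) ** 2 / (a + b + eps))

def chi2_kernel(A, B, mu):
    """Kernel matrix g(a, b) = exp(-chi2(a, b) / mu) between the columns of A and B."""
    D = np.array([[chi2_distance(a, b) for b in B.T] for a in A.T])
    return np.exp(-D / mu)

def kernel_sgd_update(w, h, G, P, t, lam=0.01):
    """Equation (3): grad = -h + G w + lam * (P - 2h + diag(G)) * w."""
    eta = 1.0 / (1.0 + 100.0 * t)
    grad = -h + G @ w + lam * (P - 2.0 * h + np.diag(G)) * w
    return np.maximum(w - eta * grad, 0.0)

# Usage sketch, assuming mu is the mean chi-squared distance over the training set:
#   h = chi2_kernel(X, y[:, None], mu).ravel()   # kernel of training columns vs. y
#   G = chi2_kernel(X, X, mu)                    # kernel of training columns vs. themselves
#   P = 1.0                                      # exp(-chi2(y, y)/mu) = exp(0)
```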
(4) Recognize flowers: the user takes an image of the flower to be recognized, and features Z_k are extracted from the flower image to be recognized, where k = 1, ..., 6 indexes the color histogram, SIFT feature, HOG feature, BOW feature, SSIM feature and GB feature. According to Z_k, the flower category is recognized by the flower category decision formula, and the explanatory text corresponding to this flower category is retrieved from the flower popular-science knowledge base. The flower category decision formula is

$$j^{*}=\arg\min_{j}\ \sum_{k=1}^{m}\Big\|Z_k-X_j^k w_j^k\Big\|_2^2$$

where j* is the class whose training data gives the minimum linear-representation error for the test data; it is selected by taking the minimum, and j* is the recognized flower category.
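A minimal sketch (an illustration, not the patent's code) of the class decision in step (4): the class whose training features give the smallest summed representation error over the feature channels wins, and its popular-science text is looked up. Here X[j][k] is the training matrix of class j for channel k, W[j][k] the learned weights, Z[k] the feature of the picture to be recognized, and `knowledge` maps class names to text; all of these names are assumptions for illustration.

```python
import numpy as np

def classify(Z, X, W, class_names, knowledge):
    """Return the recognized class name and its popular-science description."""
    errors = []
    for j in range(len(class_names)):
        err = sum(np.sum((Z[k] - X[j][k] @ W[j][k]) ** 2) for k in range(len(Z)))
        errors.append(err)
    j_star = int(np.argmin(errors))             # class with minimum representation error
    name = class_names[j_star]
    return name, knowledge.get(name, "")
```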
The above-described embodiment is a preferred implementation of the present invention, but the implementations of the present invention are not limited by the above embodiment. Any other change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be an equivalent replacement and shall be included within the protection scope of the present invention.

Claims (5)

1. A flower category recognition method based on locality-constrained sparse representation, characterized in that it comprises the following steps:
(1) collecting a flower image database and building a flower popular-science knowledge base: using the names of S commonly used flower categories defined in an existing plant and flower wiki, searching with a web search engine for the text introduction and the pictures corresponding to each flower category; the pictures forming the flower image database, and the explanatory text being put into the flower popular-science knowledge base;
(2) performing feature extraction on all pictures in the flower image database obtained in step (1), m features being extracted from each picture;
(3) flower category recognition learning process:
(3-1) selecting p pictures from the flower image database and using their features as the test data set Y = {y_k}, where k = 1, ..., m indexes the m features extracted from each picture, and p < N, with N the number of pictures in the flower image database; the features of the remaining N − p pictures in the flower image database forming the training data set X_j^k, where j = 1, 2, ..., S indexes the flower category;
(3-2) expressing the test data feature y_k linearly with the training data features X_j^k as y_k ≈ Σ_{j=1}^{S} X_j^k w_j^k, where w_j^k are the weight coefficients with which the training data set represents the test data set, and their values are greater than 0;
(3-3) to minimize the error of the linear representation of the test data set by the training data set, while adding a weight constraint on the locally similar feature structure of the test data features and the training data features, establishing the following optimization learning criterion:

$$\min_{w_j^k}\ \frac{1}{2}\sum_{k=1}^{m}\Big\|y_k-\sum_{j=1}^{S}X_j^k w_j^k\Big\|_2^2+\lambda\sum_{j=1}^{S}\big\|D_j^k\odot w_j^k\big\|^2 \qquad (1)$$

where ⊙ denotes the element-wise (vector) product; λ is the constraint-term weight, a constant that balances the linear-representation error against the weight coefficients; and D_j^k is the Euclidean distance between the test data feature y_k and the training data features X_j^k;
(3-4) using the stochastic gradient descent method to update w_j^k iteratively, the update equation being

$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-X_j^{k\top}y_k+X_j^{k\top}X_j^k\,w_j^{k,t}+\lambda\,\big\|y_k-X_j^k\big\|_2^2\cdot w_j^{k,t} \qquad (2)$$

where t is the iteration count of the SGD process and η is the learning rate of the stochastic gradient descent iterations;
(3-5) using a non-linear function φ to map the features of the training and test data into a high-dimensional reproducing kernel Hilbert space, i.e. φ(x_i)^T φ(x_j) = g(x_i, x_j), where g(x_i, x_j) is the χ² kernel function and x_i, x_j are data features; equation (2) then becoming

$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-h_k+G_k\,w_j^{k,t}+\lambda\,(P_k-2h_k+G_k)\cdot w_j^{k,t} \qquad (3)$$

where h_k = φ(X_j^k)^T φ(y_k) is the dot-product kernel between the training data features X_j^k and the test data feature y_k, G_k = φ(X_j^k)^T φ(X_j^k) is the dot-product kernel of the training data features with themselves, and P_k = φ(y_k)^T φ(y_k) is the dot-product kernel of the test data feature with itself; after several iterations of equation (3), the optimal representation weight coefficients w_j^k are obtained;
(4) recognizing flowers: the user takes an image of the flower to be recognized, and features Z_k are extracted from the flower image to be recognized, where k = 1, ..., m indexes the m features extracted from each picture; according to Z_k, the flower category is recognized by the flower category decision formula, and the explanatory text corresponding to this flower category is retrieved from the flower popular-science knowledge base; the flower category decision formula being

$$j^{*}=\arg\min_{j}\ \sum_{k=1}^{m}\Big\|Z_k-X_j^k w_j^k\Big\|_2^2$$

where j* is the class whose training data gives the minimum linear-representation error for the test data; it is selected by taking the minimum, and j* is the recognized flower category.
2. The flower category recognition method based on locality-constrained sparse representation according to claim 1, characterized in that in step (3-4), η decreases as the iteration count t increases, η = 1/(1 + 100t).
3. The flower category recognition method based on locality-constrained sparse representation according to claim 1, characterized in that the expression of the χ² kernel function described in step (3-5) is exp(−χ²(x_i, x_j)/μ), where χ² is the symmetric Chi-squared distance and μ is the mean χ² distance over the current training data set.
4. The flower category recognition method based on locality-constrained sparse representation according to claim 1, characterized in that using the names of S commonly used flower categories defined in an existing plant and flower wiki and searching with a web search engine for the text introduction and pictures corresponding to each flower category as described in step (1) specifically comprises: using the names of 313 commonly used herbaceous flower categories defined in an existing plant and flower wiki, searching with a web search engine for the text introduction and pictures corresponding to each flower category, and downloading 100 pictures for each flower category.
5. The flower category recognition method based on locality-constrained sparse representation according to claim 1, characterized in that the features extracted from each picture comprise the color histogram, the SIFT feature, the HOG feature, the BOW feature, the SSIM feature and the GB feature.
CN201310250693.5A 2013-06-21 2013-06-21 Flower category recognition method based on locality-constrained sparse representation Active CN103336974B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310250693.5A CN103336974B (en) 2013-06-21 2013-06-21 Flower category recognition method based on locality-constrained sparse representation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310250693.5A CN103336974B (en) 2013-06-21 2013-06-21 Flower category recognition method based on locality-constrained sparse representation

Publications (2)

Publication Number Publication Date
CN103336974A true CN103336974A (en) 2013-10-02
CN103336974B CN103336974B (en) 2016-12-28

Family

ID=49245131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310250693.5A Active CN103336974B (en) 2013-06-21 2013-06-21 Flower category recognition method based on locality-constrained sparse representation

Country Status (1)

Country Link
CN (1) CN103336974B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3918143B2 (en) * 2000-12-28 2007-05-23 独立行政法人科学技術振興機構 Plant recognition system
CN101826161A (en) * 2010-04-09 2010-09-08 中国科学院自动化研究所 Method for identifying target based on local neighbor sparse representation
CN102902961A (en) * 2012-09-21 2013-01-30 武汉大学 Face super-resolution processing method based on K neighbor sparse coding average value constraint

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JINJUN WANG, JIANCHAO YANG, ET AL.: "Locality-constrained Linear Coding for Image Classification", IEEE Conference on Computer Vision and Pattern Recognition *
TAKESHI SAITOH, KIMIYA AOKI, ET AL.: "Automatic Recognition of Blooming Flowers", Proceedings of the 17th International Conference on Pattern Recognition, IEEE *
裴勇 (PEI Yong): "Research on Flower Species Recognition Technology Based on Digital Images", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106941586A (en) * 2016-01-05 2017-07-11 腾讯科技(深圳)有限公司 The method and apparatus for shooting photo
CN105844535A (en) * 2016-04-19 2016-08-10 柳州名品科技有限公司 Agricultural vegetable greenhouse intelligent management platform having self-learning function
CN107153844A (en) * 2017-05-12 2017-09-12 上海斐讯数据通信技术有限公司 Auxiliary system and method for improving a flower recognition system
CN110297930A (en) * 2019-06-14 2019-10-01 韶关市启之信息技术有限公司 Flower language display method and device

Also Published As

Publication number Publication date
CN103336974B (en) 2016-12-28

Similar Documents

Publication Publication Date Title
Zhao et al. A survey on deep learning-based fine-grained object classification and semantic segmentation
CN101551809B (en) Search method of SAR images classified based on Gauss hybrid model
Bui et al. Using grayscale images for object recognition with convolutional-recursive neural network
Lin et al. Multiple instance feature for robust part-based object detection
Shotton et al. Semantic texton forests for image categorization and segmentation
Kobayashi BFO meets HOG: feature extraction based on histograms of oriented pdf gradients for image classification
Ott et al. Shared parts for deformable part-based models
Negrel et al. Evaluation of second-order visual features for land-use classification
Larios et al. Haar random forest features and SVM spatial matching kernel for stonefly species identification
Prasad et al. Classifying computer generated charts
CN105981008A (en) Learning deep face representation
CN102622607A (en) Remote sensing image classification method based on multi-feature fusion
CN102663413A (en) Multi-gesture and cross-age oriented face image authentication method
Zhao et al. Semantic parts based top-down pyramid for action recognition
Kontschieder et al. Context-sensitive decision forests for object detection
CN105184298A (en) Image classification method through fast and locality-constrained low-rank coding process
CN107092931B (en) Method for identifying dairy cow individuals
CN105718866A (en) Visual target detection and identification method
CN104636732A (en) Sequence deeply convinced network-based pedestrian identifying method
Chen et al. Ibm research australia at lifeclef2014: Plant identification task.
CN103336974B (en) A kind of flowers classification discrimination method based on local restriction sparse representation
Yanulevskaya et al. Learning to group objects
Chen et al. Page segmentation for historical handwritten document images using conditional random fields
CN104318271A (en) Image classification method based on adaptability coding and geometrical smooth convergence
Li et al. Codemaps-segment, classify and search objects locally

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant