CN103544501A - Indoor and outdoor scene classification method based on Fourier transformation - Google Patents

Indoor and outdoor scene classification method based on Fourier transformation Download PDF

Info

Publication number
CN103544501A
CN103544501A CN201310516017.8A CN201310516017A CN103544501A
Authority
CN
China
Prior art keywords
image
indoor
ecoh
outdoor scene
fourier transform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310516017.8A
Other languages
Chinese (zh)
Other versions
CN103544501B (en)
Inventor
赵志杰
王海涛
张立志
孙华东
金雪松
吴迁
陈婷婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Commerce
Original Assignee
Harbin University of Commerce
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Commerce filed Critical Harbin University of Commerce
Priority to CN201310516017.8A priority Critical patent/CN103544501B/en
Publication of CN103544501A publication Critical patent/CN103544501A/en
Application granted granted Critical
Publication of CN103544501B publication Critical patent/CN103544501B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention provides an indoor and outdoor scene classification method based on Fourier transformation, and relates to a method for classifying indoor and outdoor scene images. The method aims to solve the problem that existing indoor and outdoor scene classification methods do not take frequency-domain characteristics into consideration and therefore achieve poor classification accuracy. The indoor and outdoor scene classification method based on Fourier transformation includes the following steps: segmenting the scene image, obtaining an edge orientation histogram (EOH), obtaining a color orientation histogram (COH), calculating the characteristic vector of the image, carrying out Fourier transformation of the image, carrying out secondary segmentation of the image, and classifying indoor and outdoor scenes. According to the method, classification accuracy is improved from 91% to 92%, and the method can be used for indoor and outdoor picture scene classification.

Description

Indoor and outdoor scene classification method based on Fourier transform
Technical field
The present invention relates to a method for classifying indoor and outdoor scene images.
Background technology
In image processing, indoor/outdoor scene classification is particularly important in many fields, such as content-based image retrieval, digital libraries, autonomous robots and digital photography. Because some objects that appear in indoor pictures can also be found in outdoor pictures, and the structure of buildings in outdoor pictures is similar to indoor structures, the indoor/outdoor scene classification problem is very difficult. Some methods classify automatically by using different classifiers, features, forms of training, test data and semantic knowledge.
Color and texture features are the features most often used for the indoor/outdoor scene classification problem. To further improve classification accuracy, some post-processing techniques have received attention; a rejection threshold, as one such post-processing technique, can be used to effectively improve classification accuracy. Besides choosing suitable features, selecting a good classifier is equally important. The Edge and Color Orientation Histogram (ECOH) was proposed by Wonjun Kim et al. in 2010; by computing histograms of the edge and color orientations of the pixels in an image, the method effectively improves classification accuracy.
Outdoor scenes mostly contain flat-textured regions such as sky, whereas indoor spaces contain complex objects with more edge and line information, so their texture variation is inevitably larger. Reflected in the frequency domain, the spectrum of an indoor scene varies more strongly, while that of an outdoor scene is relatively flat. However, existing methods only consider spatial-domain features such as color and texture edges and do not consider frequency-domain features, which play an equally vital role in scene classification, so their classification accuracy remains low.
Summary of the invention
The object of the invention is to solve the problem that existing indoor/outdoor scene classification methods have low classification accuracy because they do not consider frequency-domain features, and to provide an indoor and outdoor scene classification method based on the Fourier transform.
The indoor and outdoor scene classification method based on the Fourier transform is carried out according to the following steps:
One, segmentation of the scene image: the image is divided into five large blocks BLK1, BLK2, BLK3, BLK4 and BLK5;
Two, computing the edge orientation histogram EOH: the edge orientation histogram EOH is computed for each large block obtained in step one;
Three, computing the color orientation histogram COH: the color orientation histogram COH is computed for each large block obtained in step one;
Four, computing the characteristic vector of the image: the edge orientation histogram EOH obtained in step two and the color orientation histogram COH obtained in step three are merged, each large block is assigned a weight, and the characteristic vector of the image is obtained, i.e. the ECOH method;
Five, Fourier transform of the image: a Fourier transform is applied to each large block separately, and the transformed block is shifted so that the low-frequency part is located at the center;
Six, secondary segmentation of the image: each large block is further divided into 32 small blocks, and the sum of the amplitudes in each small block is computed as one dimension of the characteristic vector; the division manner is: BLK1 is divided into 16 blocks per row with 2 rows in total, and BLK2, BLK3, BLK4 and BLK5 are each divided into 2 blocks per row with 16 rows in total;
Seven, indoor and outdoor scene classification: Fourier spectrum features are added on the basis of the ECOH method; the spectrum features and the ECOH features are combined to compute the final vector of the image and form the picture scene classification feature space, and the indoor/outdoor scene classification is finally carried out in an SVM classifier (a sketch of this concatenation follows the formula below); the form of the feature space is as follows:
$F = \left( F_1^{ECOH}, F_1^{O}, F_2^{ECOH}, F_2^{O}, \dots, F_5^{ECOH}, F_5^{O} \right),$
where $F_i^{O} = \left( F_{i1}^{O}, F_{i2}^{O}, \dots, F_{i32}^{O} \right), \quad 1 \le i \le 5.$
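As an illustration of this feature-space form, the minimal sketch below interleaves per-block ECOH sub-vectors with per-block spectral sub-vectors; the function and argument names are illustrative and not part of the patent, and the per-block sizes (16 ECOH values and 32 spectral values) follow the dimensions given in the detailed example further below.

```python
import numpy as np

def assemble_feature_space(ecoh_blocks, spectral_blocks):
    """Build F = (F1_ECOH, F1_O, ..., F5_ECOH, F5_O) from per-block sub-vectors.

    ecoh_blocks:     list of 5 ECOH sub-vectors, one per large block
    spectral_blocks: list of 5 spectral sub-vectors F_i^O (32 amplitude sums each)
    """
    parts = []
    for f_ecoh, f_o in zip(ecoh_blocks, spectral_blocks):
        parts.append(np.asarray(f_ecoh, dtype=float).ravel())
        parts.append(np.asarray(f_o, dtype=float).ravel())
    return np.concatenate(parts)  # 5 * (16 + 32) = 240 dimensions with these sizes
```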
The present invention has the following beneficial effects:
The present invention proposes a method that uses the spectrum information obtained after a Fourier transform for indoor/outdoor scene classification and combines this spectrum information with the ECOH method. The image is first divided into five large blocks, and the color, texture and spectrum information of each segmented part is extracted as a feature; each segmented part of the picture has an independent characteristic vector, so the whole picture can be represented by this set of features. For the feature extraction stage, we have verified that frequency information can be used to classify indoor and outdoor pictures; finally, a support vector machine is used for classification. Compared with other classification methods, the present invention raises the classification accuracy from 91% to 92%.
Brief description of the drawings
Fig. 1 shows the initial segmentation of the picture;
Fig. 2 shows the secondary segmentation of BLK1;
Fig. 3 shows the secondary segmentation of BLK2, BLK3, BLK4 and BLK5;
Fig. 4 shows the search for the best C and gamma for the ECOH method;
Fig. 5 shows the search for the best C and gamma for the ECOH+ECFH method.
Detailed description of the embodiments
The technical solution of the present invention is not limited to the embodiments listed below, but also includes any combination of these embodiments.
Embodiment one: the indoor and outdoor scene classification method based on the Fourier transform of this embodiment is carried out according to the following steps:
One, segmentation of the scene image: the image is divided into five large blocks BLK1, BLK2, BLK3, BLK4 and BLK5;
Two, computing the edge orientation histogram EOH: the edge orientation histogram EOH is computed for each large block obtained in step one;
Three, computing the color orientation histogram COH: the color orientation histogram COH is computed for each large block obtained in step one;
Four, computing the characteristic vector of the image: the edge orientation histogram EOH obtained in step two and the color orientation histogram COH obtained in step three are merged, each large block is assigned a weight, and the characteristic vector of the image is obtained, i.e. the ECOH method;
Five, Fourier transform of the image: a Fourier transform is applied to each large block separately, and the transformed block is shifted so that the low-frequency part is located at the center;
Six, secondary segmentation of the image: each large block is further divided into 32 small blocks, and the sum of the amplitudes in each small block is computed as one dimension of the characteristic vector; the division manner is: BLK1 is divided into 16 blocks per row with 2 rows in total, and BLK2, BLK3, BLK4 and BLK5 are each divided into 2 blocks per row with 16 rows in total;
Seven, indoor and outdoor scene classification: Fourier spectrum features are added on the basis of the ECOH method; the spectrum features and the ECOH features are combined to compute the final vector of the image and form the picture scene classification feature space, and the indoor/outdoor scene classification is finally carried out in an SVM classifier; the form of the feature space is as follows:
$F = \left( F_1^{ECOH}, F_1^{O}, F_2^{ECOH}, F_2^{O}, \dots, F_5^{ECOH}, F_5^{O} \right),$
where $F_i^{O} = \left( F_{i1}^{O}, F_{i2}^{O}, \dots, F_{i32}^{O} \right), \quad 1 \le i \le 5.$
Embodiment two: this embodiment differs from embodiment one in that the image division manner described in step one is: BLK1 is the topmost block of the image, its width is the width of the entire image and its height is 1/8 of the image height; BLK2, BLK3, BLK4 and BLK5 are evenly distributed below BLK1, the width of each of BLK2, BLK3, BLK4 and BLK5 is 1/4 of the image width, and the height of each is 7/8 of the image height. The rest is identical to embodiment one.
Embodiment three: this embodiment differs from embodiment one or two in that the concrete steps of computing the edge orientation histogram EOH described in step two are: according to the formulas $A(x, y) = \sqrt{P_x(x, y)^2 + P_y(x, y)^2}$ and $\theta(x, y) = \tan^{-1}\!\left( \frac{P_y(x, y)}{P_x(x, y)} \right)$, the texture amplitude A(x, y) and argument θ(x, y) of every edge pixel in each block are obtained, where $P_x(x, y)$ and $P_y(x, y)$ are the horizontal and vertical direction vectors of pixel (x, y), respectively; the direction vector of each pixel is quantized into the range 0°–180°, i.e. the argument of the pixel is taken modulo 180; according to the argument of each texture edge pixel, the 180 degrees are divided into eight equal parts, the edge pixels are assigned to 8 regions, and the sum of the amplitudes of the pixels assigned to each region is calculated by the following formula:
$E_{i,m} = \sum_{\substack{(x,y)\in BLK_i \\ \theta(x,y)\in m}} A(x, y), \quad 1 \le i \le 5, \; 1 \le m \le 8,$
where $E_{i,m}$ is the sum of the amplitudes belonging to the m-th region of the i-th block. The rest is identical to embodiment one or two.
Embodiment four: this embodiment differs from one of embodiments one to three in that the concrete steps of computing the color orientation histogram COH described in step three are: the image is converted from the RGB color space to the HSV color space; the value range of the H component is 0°–360°; the 360 degrees are divided into 8 equal parts, the pixels are assigned to 8 regions according to their H component values, and the sum of the S components of the pixels in each region is taken as one dimension, using the following formula:
$C_{i,m} = \sum_{\substack{(x,y)\in BLK_i \\ h(x,y)\in m}} s(x, y), \quad 1 \le i \le 5, \; 1 \le m \le 8,$
where s(x, y) and h(x, y) denote the saturation and hue at pixel (x, y), respectively. The rest is identical to one of embodiments one to three.
Embodiment five: this embodiment differs from one of embodiments one to four in that the feature space form of the image characteristic vector described in step four is as follows:
$F = \left( \omega_1 F_1^{ECOH}, \omega_2 F_2^{ECOH}, \dots, \omega_5 F_5^{ECOH} \right),$ where $F_i^{ECOH} = \left( F_{i1}^{E}, F_{i1}^{C}, F_{i2}^{E}, F_{i2}^{C}, \dots, F_{i8}^{E}, F_{i8}^{C} \right), \quad 1 \le i \le 5.$ The rest is identical to one of embodiments one to four.
The beneficial effects of the present invention are verified by the following example:
Example one: the image data of this example comes from the Corel image database of the Department of Computer Science and Engineering of the University of Washington. There are 2638 training sample pictures, of which 1494 are indoor pictures and 1144 are outdoor pictures, and 2139 test sample pictures, of which 1583 are indoor pictures and 556 are outdoor pictures.
The indoor and outdoor scene classification method based on the Fourier transform of this example is carried out according to the following steps:
One, the picture is divided into five large blocks; the concrete division manner is shown in Fig. 1: BLK1 is the topmost block, its width is the width of the whole picture and its height is 1/8 of the picture height, and each of the remaining four blocks has a width of 1/4 of the picture width and a height of 7/8 of the picture height.
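A minimal sketch of this initial segmentation using NumPy slicing; the function name is illustrative, and the sketch assumes the picture height and width are divisible by 8 and 4, respectively.

```python
import numpy as np

def split_into_blocks(img):
    """Split an image (H x W x C array) into [BLK1, BLK2, BLK3, BLK4, BLK5]:
    BLK1 is the top strip (full width, 1/8 of the height), and BLK2..BLK5 are
    four equal vertical strips below it (1/4 of the width, 7/8 of the height)."""
    h, w = img.shape[:2]
    top = h // 8
    blk1 = img[:top, :]
    strips = [img[top:, i * (w // 4):(i + 1) * (w // 4)] for i in range(4)]
    return [blk1] + strips
```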
Two, according to the formulas $A(x, y) = \sqrt{P_x(x, y)^2 + P_y(x, y)^2}$ and $\theta(x, y) = \tan^{-1}\!\left( \frac{P_y(x, y)}{P_x(x, y)} \right)$, the texture amplitude A(x, y) and argument θ(x, y) of every edge pixel in each block are obtained, where $P_x(x, y)$ and $P_y(x, y)$ are the horizontal and vertical direction vectors of pixel (x, y), respectively. The direction vector of each pixel is quantized into the range 0 to 180, i.e. the argument of the pixel is taken modulo 180; according to the argument of each texture edge pixel, the 180 degrees are divided into eight equal parts, the edge pixels are assigned to 8 regions, and the sum of the amplitudes of the pixels assigned to each region is calculated, thereby obtaining the EOH. The formula is as follows:
$E_{i,m} = \sum_{\substack{(x,y)\in BLK_i \\ \theta(x,y)\in m}} A(x, y), \quad 1 \le i \le 5, \; 1 \le m \le 8,$
where $E_{i,m}$ is the sum of the amplitudes belonging to the m-th region of the i-th block.
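A minimal sketch of this EOH computation for a single block. Using Sobel responses for the horizontal and vertical components P_x and P_y, and letting every pixel contribute (zero-magnitude pixels add nothing), are assumptions for illustration; the edge operator is not fixed here.

```python
import cv2
import numpy as np

def edge_orientation_histogram(block, n_bins=8):
    """8-bin EOH for one block: quantize the gradient argument (mod 180 deg)
    into 8 equal regions and sum the gradient magnitudes A(x, y) per region."""
    gray = cv2.cvtColor(block, cv2.COLOR_BGR2GRAY).astype(np.float64)
    px = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)   # horizontal component P_x
    py = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)   # vertical component P_y
    amp = np.sqrt(px ** 2 + py ** 2)                  # A(x, y)
    theta = np.degrees(np.arctan2(py, px)) % 180.0    # argument in [0, 180)
    bins = np.minimum((theta / (180.0 / n_bins)).astype(int), n_bins - 1)
    eoh = np.zeros(n_bins)
    for m in range(n_bins):
        eoh[m] = amp[bins == m].sum()                 # E_{i,m}
    return eoh
```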
Three, the image is converted from the RGB color space to the HSV color space. Because the value range of the H component is 0 to 360 and the S component represents the saturation of the color, the procedure is similar to step two (the EOH procedure): the 360 degrees are divided into 8 equal parts, the pixels are assigned to 8 bins according to their H component values, and the sum of the S components of the pixels in each bin is taken as one dimension. The formula is as follows:
$C_{i,m} = \sum_{\substack{(x,y)\in BLK_i \\ h(x,y)\in m}} s(x, y), \quad 1 \le i \le 5, \; 1 \le m \le 8,$
where s(x, y) and h(x, y) denote the saturation and hue at pixel (x, y), respectively.
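A minimal sketch of this COH computation for a single block, assuming an 8-bit BGR input and OpenCV's HSV convention (H stored as 0–179, S as 0–255); rescaling H to degrees and S to [0, 1] is an illustrative choice.

```python
import cv2
import numpy as np

def color_orientation_histogram(block, n_bins=8):
    """8-bin COH for one block: quantize the hue (0-360 deg) into 8 equal
    regions and sum the saturation values of the pixels falling in each region."""
    hsv = cv2.cvtColor(block, cv2.COLOR_BGR2HSV).astype(np.float64)
    hue = hsv[:, :, 0] * 2.0           # OpenCV stores H as 0..179 -> 0..358 degrees
    sat = hsv[:, :, 1] / 255.0         # saturation rescaled to [0, 1]
    bins = np.minimum((hue / (360.0 / n_bins)).astype(int), n_bins - 1)
    coh = np.zeros(n_bins)
    for m in range(n_bins):
        coh[m] = sat[bins == m].sum()  # C_{i,m}
    return coh
```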
Four, the EOH and the COH are merged, and different weights are assigned to the different large blocks to obtain the final characteristic vector of each picture, whose form is as follows:
$F = \left( \omega_1 F_1^{ECOH}, \omega_2 F_2^{ECOH}, \dots, \omega_5 F_5^{ECOH} \right),$
where
$F_i^{ECOH} = \left( F_{i1}^{E}, F_{i1}^{C}, F_{i2}^{E}, F_{i2}^{C}, \dots, F_{i8}^{E}, F_{i8}^{C} \right), \quad 1 \le i \le 5.$
An 80-dimensional vector is obtained (8 bins × 5 blocks for the EOH plus 8 bins × 5 blocks for the COH).
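A minimal sketch of this merge using the two helpers sketched above; the weights ω_1 to ω_5 are not specified in this document, so the uniform default below is only a placeholder assumption.

```python
import numpy as np

def ecoh_feature(blocks, weights=None):
    """80-dimensional ECOH vector: for each of the 5 blocks, interleave its
    8 EOH bins with its 8 COH bins and scale by the block weight."""
    if weights is None:
        weights = [1.0] * len(blocks)              # placeholder block weights
    parts = []
    for blk, w in zip(blocks, weights):
        eoh = edge_orientation_histogram(blk)      # 8 edge-orientation sums
        coh = color_orientation_histogram(blk)     # 8 color-orientation sums
        interleaved = np.empty(16)
        interleaved[0::2] = eoh                    # (F_E, F_C) pairs per bin
        interleaved[1::2] = coh
        parts.append(w * interleaved)
    return np.concatenate(parts)                   # 5 * 16 = 80 dimensions
```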
Five, a Fourier transform is applied separately to each large block (i = 1–5), and the transformed block is shifted so that the low-frequency part is located at the center. The discrete Fourier transform (FT) of an image block can be expressed by the following formula:
$I(u, v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} i(x, y)\, e^{-j 2\pi \left( \frac{ux}{M} + \frac{vy}{N} \right)},$
where the spectral amplitude $A(f) = |I(f)|$ represents the magnitude of the Fourier transform.
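A minimal sketch of this step with NumPy's FFT, assuming the transform is taken on the grayscale intensity of each block (the channel used is not stated here, so grayscale is an assumption).

```python
import cv2
import numpy as np

def block_spectrum(block):
    """2-D DFT of one block with the low-frequency part shifted to the center;
    returns the spectral amplitude A = |I(u, v)|."""
    gray = cv2.cvtColor(block, cv2.COLOR_BGR2GRAY).astype(np.float64)
    spectrum = np.fft.fftshift(np.fft.fft2(gray))   # low frequencies at the center
    return np.abs(spectrum)
```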
Six, each large block is further divided into 32 small blocks; the division manner is: BLK1 is divided into 16 blocks per row with 2 rows in total, as shown in Fig. 2, and BLK2, BLK3, BLK4 and BLK5 are each divided into 2 blocks per row with 16 rows in total, as shown in Fig. 3; the sum of the amplitudes in each small block is computed as one dimension of the characteristic vector;
Seven, the amplitude sum over all pixels of every small block is computed as one dimension of the characteristic vector; the spectrum feature of the image is thereby extracted and added to the original ECOH method, so a 240-dimensional characteristic vector (ECOH (80) + 32 × 5 blocks) can be extracted from every picture.
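A minimal sketch of steps six and seven, building on the illustrative helpers sketched above (`split_into_blocks`, `ecoh_feature`, `block_spectrum`); it assumes the block dimensions divide evenly into the 2 × 16 and 16 × 2 tilings, and the interleaved ordering follows the feature-space form F given above.

```python
import numpy as np

def spectral_block_feature(amp, is_blk1):
    """32 amplitude sums for one large block: 2 x 16 tiles for BLK1,
    16 x 2 tiles for BLK2..BLK5."""
    rows, cols = (2, 16) if is_blk1 else (16, 2)
    rh, cw = amp.shape[0] // rows, amp.shape[1] // cols
    return np.array([amp[r * rh:(r + 1) * rh, c * cw:(c + 1) * cw].sum()
                     for r in range(rows) for c in range(cols)])

def full_feature(img, weights=None):
    """240-dimensional final vector F = (F1_ECOH, F1_O, ..., F5_ECOH, F5_O)."""
    blocks = split_into_blocks(img)
    if weights is None:
        weights = [1.0] * len(blocks)                # placeholder block weights
    parts = []
    for i, (blk, w) in enumerate(zip(blocks, weights)):
        ecoh = ecoh_feature([blk], [w])              # 16 ECOH dimensions for this block
        spec = spectral_block_feature(block_spectrum(blk), is_blk1=(i == 0))
        parts.extend([ecoh, spec])                   # interleave ECOH and spectral parts
    return np.concatenate(parts)                     # 5 * (16 + 32) = 240 dimensions
```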
MATLAB software is used, and classification is performed with the libsvm toolbox. In order to obtain more accurate results, SVM parameter optimization is carried out before the model is built. The SVM model has two very important parameters, C and gamma. C is the penalty coefficient, i.e. the tolerance for errors: the higher C is, the less errors are tolerated, and if C is too large or too small the generalization ability deteriorates. Gamma is a parameter carried by the RBF function once it is selected as the kernel; it implicitly determines the distribution of the data after they are mapped into the new feature space, and the resulting number of support vectors affects the training and prediction speed. Reasonably choosing the values of C and gamma is therefore extremely important for the classification effect. The training feature space is first divided into 5 parts; by cross-validation, 4 of the 5 parts are used as training samples and the remaining part as test samples, so as to find the optimal C and gamma values, as shown in Fig. 4 and Fig. 5. Different prediction accuracies are obtained for different C and gamma values, and according to the prediction accuracy C is finally set to 4 and gamma to 1; the prediction accuracies obtained with different C and gamma values are listed in the following table:
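The example above uses MATLAB with libsvm; the following is a minimal, roughly equivalent sketch in Python of a 5-fold cross-validated grid search over C and gamma with an RBF-kernel SVM from scikit-learn. The candidate grid and the randomly generated placeholder data are illustrative only; in practice the 240-dimensional vectors and indoor/outdoor labels of the training pictures would be used.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Placeholder data standing in for the real 240-dimensional training features
# and their indoor (0) / outdoor (1) labels.
X_train = np.random.rand(200, 240)
y_train = np.random.randint(0, 2, size=200)

param_grid = {"C": [0.25, 1, 4, 16, 64], "gamma": [0.25, 1, 4, 16]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)  # 5-fold cross-validation
search.fit(X_train, y_train)
print("best C and gamma:", search.best_params_)  # the example above settles on C=4, gamma=1
model = search.best_estimator_                   # classifier used on the test pictures
```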
Table 2: Parameter search for the proposed method
The parameter search is carried out in Python. The ECOH method and the proposed method are each verified experimentally, and the results obtained are as follows:
Table 3: Accuracy comparison of the methods
The results show that carrying out indoor/outdoor scene classification with information from the frequency-domain space is feasible and effective. This example combines the spectrum information with the ECOH method and uses a support vector machine for classification. In the experiments, the accuracy is improved by 1% compared with the ECOH method, and with this higher accuracy the proposed method can be applied to indoor/outdoor picture scene classification. Future work will consider the spectrum features of other transforms (such as the wavelet transform) to further improve the classification accuracy.

Claims (5)

1. An indoor and outdoor scene classification method based on the Fourier transform, characterized in that the method is carried out according to the following steps:
One, segmentation of the scene image: the image is divided into five large blocks BLK1, BLK2, BLK3, BLK4 and BLK5;
Two, computing the edge orientation histogram EOH: the edge orientation histogram EOH is computed for each large block obtained in step one;
Three, computing the color orientation histogram COH: the color orientation histogram COH is computed for each large block obtained in step one;
Four, computing the characteristic vector of the image: the edge orientation histogram EOH obtained in step two and the color orientation histogram COH obtained in step three are merged, each large block is assigned a weight, and the characteristic vector of the image is obtained, i.e. the ECOH method;
Five, Fourier transform of the image: a Fourier transform is applied to each large block separately, and the transformed block is shifted so that the low-frequency part is located at the center;
Six, secondary segmentation of the image: each large block is further divided into 32 small blocks, and the sum of the amplitudes in each small block is computed as one dimension of the characteristic vector; the division manner is: BLK1 is divided into 16 blocks per row with 2 rows in total, and BLK2, BLK3, BLK4 and BLK5 are each divided into 2 blocks per row with 16 rows in total;
Seven, indoor and outdoor scene classification: Fourier spectrum features are added on the basis of the ECOH method; the spectrum features and the ECOH features are combined to compute the final vector of the image and form the picture scene classification feature space, and the indoor/outdoor scene classification is finally carried out in an SVM classifier; the form of the feature space is as follows:
$F = \left( F_1^{ECOH}, F_1^{O}, F_2^{ECOH}, F_2^{O}, \dots, F_5^{ECOH}, F_5^{O} \right),$
where $F_i^{O} = \left( F_{i1}^{O}, F_{i2}^{O}, \dots, F_{i32}^{O} \right), \quad 1 \le i \le 5.$
2. The indoor and outdoor scene classification method based on the Fourier transform according to claim 1, characterized in that the image division manner described in step one is: BLK1 is the topmost block of the image, its width is the width of the entire image and its height is 1/8 of the image height; BLK2, BLK3, BLK4 and BLK5 are evenly distributed below BLK1, the width of each of BLK2, BLK3, BLK4 and BLK5 is 1/4 of the image width, and the height of each is 7/8 of the image height.
3. The indoor and outdoor scene classification method based on the Fourier transform according to claim 2, characterized in that the concrete steps of computing the edge orientation histogram EOH described in step two are: according to the formulas
$A(x, y) = \sqrt{P_x(x, y)^2 + P_y(x, y)^2}$
and
$\theta(x, y) = \tan^{-1}\!\left( \frac{P_y(x, y)}{P_x(x, y)} \right),$
the texture amplitude A(x, y) and argument θ(x, y) of every edge pixel in each block are obtained, where $P_x(x, y)$ and $P_y(x, y)$ are the horizontal and vertical direction vectors of pixel (x, y), respectively; the direction vector of each pixel is quantized into the range 0°–180°; according to the argument of each texture edge pixel, the 180 degrees are divided into eight equal parts, the edge pixels are assigned to 8 regions, and the sum of the amplitudes of the pixels assigned to each region is calculated by the following formula:
$E_{i,m} = \sum_{\substack{(x,y)\in BLK_i \\ \theta(x,y)\in m}} A(x, y), \quad 1 \le i \le 5, \; 1 \le m \le 8,$
where $E_{i,m}$ is the sum of the amplitudes belonging to the m-th region of the i-th block.
4. The indoor and outdoor scene classification method based on the Fourier transform according to claim 3, characterized in that the concrete steps of computing the color orientation histogram COH described in step three are: the image is converted from the RGB color space to the HSV color space; the value range of the H component is 0°–360°; the 360 degrees are divided into 8 equal parts, the pixels are assigned to 8 regions according to their H component values, and the sum of the S components of the pixels in each region is taken as one dimension, using the following formula:
$C_{i,m} = \sum_{\substack{(x,y)\in BLK_i \\ h(x,y)\in m}} s(x, y), \quad 1 \le i \le 5, \; 1 \le m \le 8,$
where s(x, y) and h(x, y) denote the saturation and hue at pixel (x, y), respectively.
5. The indoor and outdoor scene classification method based on the Fourier transform according to claim 4, characterized in that the feature space form of the image characteristic vector described in step four is as follows:
$F = \left( \omega_1 F_1^{ECOH}, \omega_2 F_2^{ECOH}, \dots, \omega_5 F_5^{ECOH} \right),$
where $F_i^{ECOH} = \left( F_{i1}^{E}, F_{i1}^{C}, F_{i2}^{E}, F_{i2}^{C}, \dots, F_{i8}^{E}, F_{i8}^{C} \right), \quad 1 \le i \le 5.$
CN201310516017.8A 2013-10-28 2013-10-28 Indoor and outdoor scene classification method based on Fourier transformation Expired - Fee Related CN103544501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310516017.8A CN103544501B (en) 2013-10-28 2013-10-28 Indoor and outdoor scene classification method based on Fourier transformation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310516017.8A CN103544501B (en) 2013-10-28 2013-10-28 Indoor and outdoor scene classification method based on Fourier transformation

Publications (2)

Publication Number Publication Date
CN103544501A true CN103544501A (en) 2014-01-29
CN103544501B CN103544501B (en) 2016-08-17

Family

ID=49967936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310516017.8A Expired - Fee Related CN103544501B (en) 2013-10-28 2013-10-28 Indoor and outdoor scene classification method based on Fourier transformation

Country Status (1)

Country Link
CN (1) CN103544501B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991434A (en) * 2017-03-07 2017-07-28 中国矿业大学 A grayscale image classification method and system based on wavelet twin support vector machines
CN110174559A (en) * 2018-09-20 2019-08-27 永康市巴九灵科技有限公司 Real-time remaining capacity measuring mechanism
CN115761458A (en) * 2022-11-24 2023-03-07 北京的卢铭视科技有限公司 Indoor and outdoor environment judging method, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090257682A1 (en) * 2008-04-10 2009-10-15 Fuji Xerox Co., Ltd. System and method for automatic digital image orientation detection
CN101964055A (en) * 2010-10-21 2011-02-02 重庆大学 Visual perception mechanism simulation natural scene type identification method
CN102663357A (en) * 2012-03-28 2012-09-12 北京工业大学 Color characteristic-based detection algorithm for stall at parking lot

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090257682A1 (en) * 2008-04-10 2009-10-15 Fuji Xerox Co., Ltd. System and method for automatic digital image orientation detection
CN101964055A (en) * 2010-10-21 2011-02-02 重庆大学 Visual perception mechanism simulation natural scene type identification method
CN102663357A (en) * 2012-03-28 2012-09-12 北京工业大学 Color characteristic-based detection algorithm for stall at parking lot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LALIT GUPTA ET AL.: "Indoor versus Outdoor Scene Classification Using Probabilistic Neural Network", 《EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING》 *
WONJUN KIM ET AL.: "A Novel Method for Efficient Indoor–Outdoor Image Classification", 《JOURNAL OF SIGNAL PROCESS SYSTEMS》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991434A (en) * 2017-03-07 2017-07-28 中国矿业大学 A grayscale image classification method and system based on wavelet twin support vector machines
CN110174559A (en) * 2018-09-20 2019-08-27 永康市巴九灵科技有限公司 Real-time remaining capacity measuring mechanism
CN115761458A (en) * 2022-11-24 2023-03-07 北京的卢铭视科技有限公司 Indoor and outdoor environment judging method, electronic equipment and storage medium
CN115761458B (en) * 2022-11-24 2023-09-01 北京的卢铭视科技有限公司 Indoor and outdoor environment judging method, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN103544501B (en) 2016-08-17

Similar Documents

Publication Publication Date Title
CN105574534A (en) Significant object detection method based on sparse subspace clustering and low-order expression
CN101526995B (en) Synthetic aperture radar target identification method based on diagonal subclass judgment analysis
CN102298773B (en) Shape-adaptive non-local mean denoising method
CN102722891A (en) Method for detecting image significance
CN105005565B (en) Live soles spoor decorative pattern image search method
CN103310236A (en) Mosaic image detection method and system based on local two-dimensional characteristics
CN104504734A (en) Image color transferring method based on semantics
Meshgi et al. Expanding histogram of colors with gridding to improve tracking accuracy
CN107886539B (en) High-precision gear visual detection method in industrial scene
CN104281849A (en) Fabric image color feature extraction method
CN110569860A (en) Image interesting binary classification prediction method combining discriminant analysis and multi-kernel learning
Nguyen et al. Satellite image classification using convolutional learning
CN102722734B (en) Image target identification method based on curvelet domain bilateral two-dimension principal component analysis
CN110991547A (en) Image significance detection method based on multi-feature optimal fusion
CN112580647A (en) Stacked object oriented identification method and system
CN104751171A (en) Method of classifying Naive Bayes scanned certificate images based on feature weighting
CN103544501A (en) Indoor and outdoor scene classification method based on Fourier transformation
CN106777159A (en) A kind of video clip retrieval and localization method based on content
Jang et al. Object classification using CNN for video traffic detection system
CN104217430A (en) Image significance detection method based on L1 regularization
CN107609565B (en) Indoor visual positioning method based on image global feature principal component linear regression
Varish et al. A novel similarity measure for content based image retrieval in discrete cosine transform domain
CN103049570B (en) Based on the image/video search ordering method of relevant Preserving map and a sorter
CN106408029A (en) Image texture classification method based on structural difference histogram
CN106056575A (en) Image matching method based on object similarity recommended algorithm

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160817

Termination date: 20181028

CF01 Termination of patent right due to non-payment of annual fee