CN101944358B - Ant colony algorithm-based codebook classification method and codebook classification device thereof - Google Patents

Info

Publication number: CN101944358B (application CN201010267156.8A; earlier publication CN101944358A)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 李凤莲, 张雪英, 马朝阳, 王峰
Original and current assignee: Taiyuan University of Technology (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application filed by Taiyuan University of Technology; priority to CN201010267156.8A

Landscapes

  • Compression, Expansion, Code Conversion, And Decoders (AREA)

Abstract

The invention discloses an ant colony algorithm-based codebook classification method and a corresponding codebook classification device. In the classification stage, the method divides a designed codebook into several sub-codebooks, each represented by a sub-codebook feature value. The classification uses an ant colony clustering algorithm into which a pick-up probability function and a put-down probability function are introduced; by matching the range of the random probability to the range of the probability-function values, the convergence of the algorithm is accelerated. In the rearrangement stage, the sub-codebooks are concatenated in the order of their feature values to form the classified codebook. The device consists of a sub-codebook feature value unit, a sub-codebook codeword count unit, and a classified codebook unit. During quantization, the codebook-classified vector quantizer uses the feature values and codeword counts of the sub-codebooks to restrict the codeword search to a single sub-codebook of the classified codebook, which narrows the codeword search range and reduces the time complexity of the vector quantizer.

Description

Ant colony clustering algorithm-based codebook classification method and codebook classification device thereof
Technical field
The present invention relates to speech signal processing and swarm intelligence algorithm technology, and specifically to a codebook classification method, and a corresponding codebook classification device, based on an optimized ant colony clustering algorithm.
Background technology
Vector quantization is very widely applied in practice, in fields such as digital image and speech compression coding, speech recognition, emotion recognition, document retrieval, and database retrieval.
A vector quantizer consists mainly of an encoder and a decoder, which hold identical or different codebooks. During quantization, the input vector is conventionally compared, under a chosen distortion measure, against every codeword in the encoder's codebook to find the codeword with minimum distortion; this is the method of exhaustive-search vector quantization. The main advantage of the exhaustive-search vector quantizer is that it is guaranteed to find the best-matching codeword under the given distortion measure, but its computational complexity during quantization is the highest. Many strategies have therefore been proposed to reduce quantization complexity, some addressing the structure of the quantizer and others the codeword search algorithm. Constrained vector quantizers add constraints on top of the exhaustive-search vector quantizer with the explicit aim of reducing quantization complexity, and corresponding coding algorithms and codebook design techniques follow from them. One way to reduce quantization complexity is to constrain the codewords in the quantizer's codebook so that they are no longer arbitrarily distributed but arranged in a structured manner, making nearest-neighbour search easier.
As a kind of constrained vector quantizer, the classified vector quantizer works by first classifying the input vector according to the characteristics of the parameter being quantized, and then searching for the nearest codeword only in the codebook of the corresponding class. The per-class codebooks may differ in size, and together they form the quantizer's total codebook. Because each sub-codebook is comparatively small, the time complexity is reduced. Classified vector quantization produces two indices: a codebook index, which determines in which codebook the codeword search for the input vector is carried out, and a codeword index, which identifies the nearest-neighbour codeword found for the input vector within that codebook. To design the codebook of a classified vector quantizer, the set of training vectors is normally first divided into several subsets by a classifier, and a codebook design algorithm then produces a codebook for each subset; these codebooks together form the final total codebook. The key factor affecting the performance of a classified vector quantizer is how to choose the size of each class codebook, for a fixed total codebook size, so that the overall performance of the quantizer is best. Two methods are commonly used: bit allocation algorithms, or, with the total codebook size fixed, making each class codebook's size proportional to the size of its training subset. Since the classified vector quantizer needs both a codebook index and a codeword index, its quantization bit count depends on the number of classes as well as on the size of each codebook.
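The two-stage search described above can be sketched as follows. This is a minimal illustration, not the patent's concrete design: using the vector mean as the classification feature, and all names, are assumptions made here.

```python
def classified_vq_encode(x, sub_codebooks):
    """Classified VQ sketch: pick a sub-codebook first, then do an
    exhaustive nearest-neighbour search only inside that sub-codebook.
    `sub_codebooks` is a list of (feature_value, codewords) pairs."""
    # Stage 1: codebook index -- choose the sub-codebook whose stored
    # feature value is closest to the input vector's feature (here the
    # mean of its components; the feature choice is an assumption).
    feat = sum(x) / len(x)
    cb_idx = min(range(len(sub_codebooks)),
                 key=lambda i: abs(sub_codebooks[i][0] - feat))
    # Stage 2: codeword index -- exhaustive nearest-neighbour search
    # restricted to the chosen sub-codebook.
    codewords = sub_codebooks[cb_idx][1]
    cw_idx = min(range(len(codewords)),
                 key=lambda j: sum((a - b) ** 2
                                   for a, b in zip(x, codewords[j])))
    return cb_idx, cw_idx
```

The decoder only needs the same ordered codebook: the pair of indices identifies the reconstruction vector directly.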
The ant colony algorithm is a probabilistic search method for finding optimal paths in graphs, proposed in the early 1990s by the Italian scholar Marco Dorigo and others. Inspired by the path-finding behaviour of ants foraging for food, it is a heuristic, biologically inspired optimization algorithm mainly used to solve complex combinatorial optimization problems. To date, ant colony algorithms have successfully solved many practical problems, such as the travelling salesman problem, the quadratic assignment problem, job-shop scheduling, and other discrete optimization problems. The use of ant colony algorithms for cluster analysis is inspired by the way ants pile up the corpses of their dead and sort their larvae. Because the collective motion of real ant colonies closely resembles practical clustering problems, a large number of ant colony clustering algorithms have emerged in recent years.
The earliest clustering algorithm based on the ant-heap formation principle was proposed by Deneubourg et al.: according to the similarity between a data object and its surroundings, ants move randomly and pick up or put down data objects, thereby clustering the data; this basic model has been successfully applied in fields such as robotics. Lumer et al. were the first to improve this algorithm, proposing the LF algorithm, which achieved good results in cluster analysis with ant colony algorithms.
The basic mechanism of ant object-carrying is as follows: when a randomly moving unloaded ant encounters an object, the lower the similarity between that object and the objects around its position, the higher the probability that the ant "picks up" the object; conversely, the higher the similarity between the object carried by a randomly moving loaded ant and the objects at its current position, the higher the probability that the ant "puts down" the object. This mechanism ensures that large heaps of objects are not broken up, while small heaps are gathered together.
On this basis, researchers have proposed the basic idea of ant colony clustering: the data to be clustered are first scattered randomly on a two-dimensional plane, and a number of virtual ants are then generated on that plane to cluster them. Each data object is first projected randomly onto the plane; each ant then selects a data object at random and, according to the probability derived from that object's similarity within its local region, decides whether to "pick up", "move", or "put down" the object. After a finite number of iterations, the data objects on the plane gather according to their similarity, finally yielding the clustering result and the number of clusters.
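A single ant decision in this loop can be sketched as below, using the Deneubourg-style basic models quoted later in the text (formulas (2-1) and (3-4)) as the probability functions; the k values and the wiring are illustrative assumptions, not the patent's choices.

```python
def pick_prob(f, k1=0.1):
    # Basic-model pick-up probability, (k1 / (k1 + f))^2: high when the
    # local similarity f is low, so lone objects tend to get picked up.
    return (k1 / (k1 + f)) ** 2

def drop_prob(f, k2=0.1):
    # Basic-model put-down probability, (f / (k2 + f))^2: high when the
    # local similarity f is high, so objects land next to similar ones.
    return (f / (k2 + f)) ** 2

def ant_step(loaded, f, p_random):
    # One decision of the ant-heap mechanism: compare the relevant
    # probability against the system's random draw p_random.
    if not loaded:
        return "pick" if pick_prob(f) > p_random else "move"
    return "drop" if drop_prob(f) > p_random else "carry"
```

Iterating this decision while ants wander the plane is what gradually gathers similar objects into heaps.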
The main factors affecting the convergence speed of the above ant colony clustering algorithm are the speed at which loaded ants put down objects and the speed at which unloaded ants pick them up, and the key to both speeds is the relationship between the put-down and pick-up probability functions in the algorithm and the random probability generated by the system. In existing ant colony clustering algorithms, the put-down and pick-up probability functions do not change markedly as the similarity value changes, so their values remain smaller than the system's random probability for long stretches. As a result, a loaded ant that should put down its object cannot do so immediately, and an unloaded ant that should pick up an object cannot do so immediately; the objects on the two-dimensional plane therefore fail to form clusters by similarity quickly, which directly degrades the convergence speed and the clustering result.
Summary of the invention
The object of this invention is to provide a codebook classification method, and a corresponding codebook classification device, based on an ant colony clustering algorithm, in order to solve the problems that the codewords in an exhaustive-search vector quantizer codebook are arranged in no particular order, the codeword search range is large, and the time complexity is high.
To solve the above problems, the present invention adopts the following technical scheme:
A codebook classification method based on an ant colony clustering algorithm, operating on immittance spectral frequency parameters, comprising:
A codebook classification process:
a codebook produced by a codebook design algorithm is divided into sub-codebooks using the ant colony clustering algorithm-based codebook classification method, each sub-codebook being represented by a sub-codebook feature value;
A codebook rearrangement process:
the sub-codebooks produced by the classification are combined, in the same order as their feature values, to form the classified codebook.
The ant colony clustering algorithm in the above technical scheme adopts the following similarity function:
(Formula (1): the similarity function f(o_i) is rendered as an image in the original document.)
In formula (1): d(o_i, o_j) is the Euclidean distance between data o_i and o_j; d_MAX(o_i, o_j) is the maximum Euclidean distance between o_i and any data point within the cluster radius r around o_i, where o_j in d_MAX(o_i, o_j) denotes the data point within radius r of o_i at that maximum distance; α is a parameter regulating the similarity between data objects, with α = 4 in formula (1).
A pick-up probability function and a put-down probability function are introduced in the ant colony clustering algorithm. The pick-up probability function is expressed as follows:
p_p = k_1 (b − f)^2 for 0 ≤ f < b; p_p = 0 for f ≥ b    (2-4)
In formula (2-4), the similarity f is determined by formula (1); b = 0.3 and k_1 = 11.11.
The put-down probability function is expressed as follows:
p_d = k_2 f^2 for 0 ≤ f ≤ 0.3; p_d = 1 for f > 0.3    (3-6)
In formula (3-6), the similarity f is determined by formula (1); k_2 = 11.11.
Isolated points and atypical cluster regions in the ant colony clustering algorithm are handled as follows:
isolated points are reassigned to classes by the nearest-neighbour criterion;
atypical cluster regions are merged with other cluster regions by the nearest-neighbour criterion.
The random probability range of the ant colony clustering algorithm is determined from statistics of the values computed by formulas (2-4) and (3-6).
A codebook classification device for the above ant colony clustering algorithm-based codebook classification method comprises a sub-codebook feature value unit, a sub-codebook codeword count unit, and a classified codebook unit.
The sub-codebook feature value unit stores the sub-codebook feature values obtained by the codebook classification method and is used, when an input vector to be quantized is classified, to determine the position of its sub-codebook; it is located in the encoder unit of the codebook-classified vector quantization device. The sub-codebook codeword count unit stores the number of codewords contained in each sub-codebook obtained by the codebook classification method and is used, when an input vector to be quantized is classified, to determine the position and extent of its sub-codebook; it is also located in the encoder unit of the codebook-classified vector quantization device. The classified codebook unit stores the codebook obtained by arranging the sub-codebooks in the same order as the contents of the sub-codebook feature value unit; it is located in both the encoder unit and the decoder unit of the codebook-classified vector quantization device.
The codebook-classified vector quantizer in the above technical scheme is a vector quantizer that incorporates the codebook classification device; it consists of an encoder unit and a decoder unit. The encoder unit comprises the codebook classification device and a codebook classification quantization module; the quantization module determines, within the codebook classification device, the quantization vector corresponding to the input vector to be quantized, and writes the quantization index of that vector to the bitstream. The decoder unit comprises the classified codebook unit and a decoding module; the decoding module receives the quantization index delivered to the decoder via the bitstream and looks up, in the classified codebook unit, the reconstruction vector corresponding to that index value.
In the codebook classification method and codebook classification device of the present invention, based on an ant colony clustering algorithm, new pick-up and put-down probability functions are proposed compared with existing ant colony clustering algorithms, and the range of the random probability is matched to the range of the pick-up and put-down probability function values, thereby accelerating the convergence of the algorithm and improving clustering performance.
Compared with a codebook designed with the existing LBG algorithm, a codebook that has passed through the codebook classification method has the same size and the same codewords, but the ordering of the codewords changes markedly: codewords sharing the same sub-codebook feature value are grouped into one sub-codebook, and the sub-codebooks are combined in the order of their feature values to form the classified codebook, which is stored in the codebook classification device. During quantization, the complexity of the codebook-classified vector quantizer based on this method and device is the added cost of searching the codebook classification information plus the time complexity of searching for the quantization vector within one sub-codebook. Since the sub-codebook search range plus the classification-information search range is smaller than the search range of the whole classified codebook, the time complexity of this codebook-classified vector quantization is significantly reduced.
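The grouping-and-reordering step can be sketched as follows. The cluster labels stand in for the ant-clustering output, and using the mean of a sub-codebook's codeword means as its feature value is an illustrative assumption, not the patent's definition.

```python
def rearrange_codebook(codewords, labels):
    """Codebook-rearrangement sketch: codewords sharing a cluster label
    form one sub-codebook, each sub-codebook gets a feature value, and
    sub-codebooks are concatenated in feature-value order to form the
    classified codebook."""
    groups = {}
    for cw, lab in zip(codewords, labels):
        groups.setdefault(lab, []).append(cw)
    subs = []
    for cws in groups.values():
        # Feature value: mean of the codeword means (an assumption here).
        feat = sum(sum(c) / len(c) for c in cws) / len(cws)
        subs.append((feat, cws))
    subs.sort(key=lambda s: s[0])           # order sub-codebooks by feature value
    classified = [cw for _, cws in subs for cw in cws]
    feats = [feat for feat, _ in subs]      # contents of the feature value unit
    counts = [len(cws) for _, cws in subs]  # contents of the codeword count unit
    return classified, feats, counts
```

The returned feature values and per-sub-codebook codeword counts are exactly the side information the encoder units described above would store.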
For example, consider the first-stage quantization of the ISF parameters in the AMR-WB speech coding algorithm using the codebook-classified vector quantizer of the present invention, and take the quantization of a 7-dimensional sub-vector as an example. The codebook size for the 7-dimensional sub-vector is 256. If the optimized ant colony clustering algorithm automatically clusters the codebook into 14 classes, the additional storage required at the encoder is 28 floating-point storage units, but the codebook search range needed to obtain the quantization vector in the classified codebook shrinks considerably, so the quantization complexity is significantly reduced. Taking the codeword counts of the sub-codebooks formed by one of the clustering results as an example, the numbers of codewords in the sub-codebooks are {39, 35, 13, 16, 15, 11, 7, 8, 52, 17, 15, 13, 7, 8}. Evidently the sub-codebooks contain unequal numbers of codewords. The number of codewords in each sub-codebook determines the time complexity of vector quantization: the fewer the codewords, the smaller the codeword search range and the lower the quantization time complexity. During codebook-classified vector quantization, the input 7-dimensional sub-vector is first compared with each sub-codebook feature value by Euclidean distance to determine the sub-codebook and its position, and the quantized value is then determined within that sub-codebook by exhaustive search. If the time complexity of a quantizer is measured by the number of additions (subtractions), multiplications, and comparison operations, with N the codebook size and k the quantization vector dimension, an exhaustive-search vector quantizer needs a time complexity of 3Nk − 1; this operation count is well known to those skilled in the art. The time complexities of exhaustive-search vector quantization and codebook-classified vector quantization compare as follows:
Time complexity of the exhaustive-search vector quantizer:
3Nk − 1 = 3 × 256 × 7 − 1 = 5375 (operations per input vector)
Maximum time complexity of the codebook-classified vector quantizer (largest sub-codebook, 52 codewords):
3 × 14 × 7 − 1 + 3 × 52 × 7 − 1 = 3 × 66 × 7 − 2 = 1384 (operations per input vector)
Minimum time complexity of the codebook-classified vector quantizer (smallest sub-codebook, 7 codewords):
3 × 14 × 7 − 1 + 3 × 7 × 7 − 1 = 3 × 21 × 7 − 2 = 439 (operations per input vector)
In the worst case, the time complexity of the codebook-classified vector quantizer is reduced to the following percentage of the exhaustive search:
1384/5375 × 100% = 25.75%
In the best case, it is reduced to:
439/5375 × 100% = 8.17%.
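The operation counts above can be reproduced with a short calculation. The 3·size·k − 1 cost model is taken directly from the text; the function names are ours.

```python
def exhaustive_ops(N, k):
    # Adds/subtracts, multiplies and comparisons per input vector for
    # exhaustive-search VQ, as counted in the text: 3Nk - 1.
    return 3 * N * k - 1

def classified_ops(num_classes, sub_size, k):
    # Classified VQ: compare the input against each of the num_classes
    # sub-codebook feature values, then exhaustively search the chosen
    # sub-codebook; each stage is costed as 3*size*k - 1.
    return (3 * num_classes * k - 1) + (3 * sub_size * k - 1)

N, k, classes = 256, 7, 14
full = exhaustive_ops(N, k)              # 5375 operations per input vector
worst = classified_ops(classes, 52, k)   # largest sub-codebook: 1384
best = classified_ops(classes, 7, k)     # smallest sub-codebook: 439
```

Dividing by the exhaustive count recovers the 25.75% and 8.17% figures quoted above.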
Compared with existing classified vector quantizers, the notable difference is that the classification device designed by the present invention only needs to be stored at the encoder; nothing needs to be stored at the decoder. The sub-codebook feature value information and sub-codebook codeword count information belonging to the codebook classification device need not be transmitted to the decoder either, and therefore consume no quantization bits. With the same quantization bit budget, part of the bits can thus be saved and used for other parts of the coding algorithm. In a speech coding algorithm, if the number of bits used elsewhere stays the same, the quantizer's bit allocation can be reduced, lowering the coding rate of the whole algorithm.
In summary, in the codebook classification method and codebook classification device of the present invention, based on an ant colony clustering algorithm, new pick-up and put-down probability functions are proposed compared with the prior art, and the range of the random probability is matched to the range of their values, thereby accelerating convergence and improving clustering performance. The codebook produced by the classification consists of sub-codebooks; the sub-codebooks are ordered within the classified codebook in the same order as their feature values, and each sub-codebook consists of codewords sharing the same sub-codebook feature value. During quantization, the feature values and codeword counts of the sub-codebooks restrict the codeword search of the vector quantizer to a single sub-codebook of the classified codebook, which narrows the codeword search range and reduces the time complexity of quantization. Since the classification information need not be transmitted over the channel, no quantization bits are added, and the quantization performance of the vector quantizer achieves transparent quantification.
Brief description of the drawings
Fig. 1 is a schematic block diagram of the ant colony clustering algorithm-based codebook classification method provided by an embodiment of the present invention;
Fig. 2 compares the variation curves of the pick-up probability functions provided by the embodiment;
Fig. 3 compares the variation curves of the put-down probability functions provided by the embodiment;
Fig. 4 is a schematic block diagram of the ant colony clustering algorithm provided by the embodiment;
Fig. 5 is a structural diagram of the codebook classification device provided by the embodiment;
Fig. 6 is a structural diagram of the codebook-classified vector quantization device provided by the embodiment;
Fig. 7 is a structural diagram of the multistage split vector quantization device incorporating the codebook classification device, provided by the embodiment.
Embodiment
The codebook classification method and codebook classification device of the present invention, based on an ant colony clustering algorithm, are described in further detail below.
Embodiment 1
The specific implementation of the ant colony clustering algorithm-based codebook classification method of the present invention comprises two processes: codebook classification and codebook rearrangement.
Fig. 1 shows the schematic block diagram of the codebook classification method based on the optimized ant colony clustering algorithm provided by an embodiment of the present invention. When the codebook is classified with the optimized ant colony clustering algorithm, the data to be clustered that are input to the algorithm are a codebook designed in advance; the codebook may be designed with the LBG codebook design algorithm, which is well known to those skilled in the art.
The codebook classification process based on the optimized ant colony clustering algorithm comprises the following steps.
Step 1. Initialize the parameters, including the maximum number of clustering cycles N_MAX, the threshold N_1MAX on the number of moves an ant may make while carrying the same object, the number of ants, the two-dimensional plane, and the initial value of the cluster radius.
In the present invention, N_MAX = 20000, N_1MAX = 200, the number of ants is 1, the two-dimensional plane is limited to the region (0–100, 0–100), and the cluster radius r is taken as 3.
Step 2. Map each data object to be clustered randomly onto the two-dimensional plane, giving each object a coordinate in the initial two-dimensional region, and generate a random probability p_r distributed between 0 and 60%.
Step 3. Place the ant at a random position within the bounded region of the two-dimensional plane, and set the ant's initial state to unloaded.
Step 4. Compute the similarity parameter f(o_i) according to formula (1):
(Formula (1): the similarity function f(o_i) is rendered as an image in the original document.)
In formula (1), d(o_i, o_j) is the Euclidean distance between two data o_i and o_j mapped onto the two-dimensional plane; d_MAX(o_i, o_j) is the maximum Euclidean distance between o_i and any data point within the cluster radius r around o_i, with o_j denoting the data point achieving that maximum. α is a parameter regulating the similarity between data objects; its value determines the number of clusters and the convergence speed. When α is larger, the similarity between objects is higher, possibly causing rather dissimilar objects to be grouped into one class; the number of clusters is then smaller and convergence is faster. Conversely, when α is smaller, the similarity between objects is lower; in the extreme case one large class may be split into many groups, the number of clusters increases, and convergence slows down. Based on extensive experiments, the present invention takes α = 4.
Step 5. If the ant's current state is unloaded, compute the pick-up probability with the pick-up probability function and check whether it exceeds the random probability. If it does, the ant picks up the object at its position and starts moving it; otherwise the ant moves randomly to another position and the algorithm jumps back to step 4 to compute the similarity parameter.
Existing pick-up probability functions, giving the probability that a randomly moving unloaded ant picks up an object, take the following forms:
(1) LF/Deneubourg basic model
p_p = (k_1 / (k_1 + f))^2    (2-1)
(2) Sigmoid function
p_p = 1 / (1 + e^{k_1 f})    (2-2)
(3) piecewise function
p_p = 1 − k_1 f    (2-3)
In the three formulas above, k_1 is a threshold constant whose value is chosen as needed; in practice the value of k_1 generally differs between the three formulas.
The present invention defines the pick-up probability function as formula (2-4) below, where the parameter f is the similarity determined by formula (1), and b and k_1 are threshold constants whose values affect the convergence speed. The function should satisfy the requirements that the smaller the similarity, the larger the pick-up probability, and that the pick-up probability becomes smaller and smaller as the similarity gradually increases. Extensive experiments show that the similarity coefficients computed by formula (1) generally fall in the range 0 ≤ f ≤ 0.3; the present invention therefore takes b = 0.3 and k_1 = 11.11.
p_p = k_1 (b − f)^2 for 0 ≤ f < b; p_p = 0 for f ≥ b    (2-4)
Fig. 2 plots how the four pick-up probability functions, formulas (2-1) to (2-4), vary with the similarity f. Compared with the three existing pick-up probability curves, when the similarity lies in the range 0 ≤ f ≤ 0.3, the pick-up probability proposed by the present invention decreases gradually from 100% as f increases, tends to 0 as f approaches 0.3, and is 0 for all f above 0.3. Experiments show that f computed according to formula (1) essentially never exceeds 0.3, so the variation of the pick-up probability function is matched to the variation range of the formula (1) similarity parameter, which helps the ant pick up, at a faster rate, objects that differ strongly from their surroundings.
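The four pick-up probability functions can be sketched as below. Formula (2-4) appears only as an image in the original, so the quadratic form here is a reconstruction from its stated properties (100% at f = 0, falling to 0 at f = b = 0.3, with k_1 = 11.11 ≈ 1/b²); the k values used for (2-1)–(2-3) are illustrative.

```python
import math

def pick_deneubourg(f, k1=0.1):
    # Formula (2-1): the LF/Deneubourg basic model.
    return (k1 / (k1 + f)) ** 2

def pick_sigmoid(f, k1=10.0):
    # Formula (2-2): sigmoid form 1 / (1 + e^(k1*f)).
    return 1.0 / (1.0 + math.exp(k1 * f))

def pick_piecewise(f, k1=1.0):
    # Formula (2-3): linear form 1 - k1*f, clipped into [0, 1].
    return max(0.0, min(1.0, 1.0 - k1 * f))

def pick_quadratic(f, b=0.3, k1=11.11):
    # Formula (2-4) as reconstructed: quadratic decay from ~100% at f = 0
    # to 0 at f = b, and 0 for all f beyond b.
    return k1 * (b - f) ** 2 if f < b else 0.0
```

Evaluating these over 0 ≤ f ≤ 0.3 reproduces the qualitative comparison of Fig. 2: only the quadratic curve spans the full 100%-to-0 range over exactly the interval where formula (1) similarities actually fall.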
Step 6. If the ant's current state is loaded, compute the put-down probability of the object with the put-down probability function below and check whether it exceeds the random probability. If it does, the ant puts down the object, its state is set to unloaded, and the algorithm goes to step 7. Otherwise, the ant continues carrying the object to a new data position and the count of moves made with the same object is incremented by 1. If this count exceeds the threshold N_1MAX, the count is reset to 0, the ant puts down the object, its state is set to unloaded, and the algorithm goes to step 7; otherwise it goes to step 4.
By maintaining this per-object move count and checking in this step whether it exceeds the threshold N_1MAX, the present invention prevents the ant from carrying the same object indefinitely without finding a place to put it down, which would send the program into an infinite loop: once the number of moves with the same object exceeds N_1MAX, the object must be put down regardless of whether the put-down probability condition is met.
Existing put-down probability functions, giving the probability that a randomly moving loaded ant puts down its object, are defined in the following forms:
(1) piecewise function
p_d = k_2 f for 0 < f < 1/k_2; p_d = 1 for f ≥ 1/k_2; p_d = 0 for f ≤ 0    (3-1)
(2) LF basic model
p_d = 2f for f < k_2; p_d = 1 for f ≥ k_2    (3-2)
(3) Sigmoid function
p_d = 1 / (1 + e^{−k_2 f})    (3-3)
(4) Deneubourg basic model
p_d = (f / (k_2 + f))^2    (3-4)
(5) LF improved model
p_d = f for f < 1.0; p_d = 1 for f ≥ 1    (3-5)
Here the parameter f is the similarity determined by formula (1), and k_2 is a threshold (thresholding) constant whose value is chosen as needed; in practice the value of k_2 generally differs between the five formulas. It can be seen that formulas (3-1), (3-2), and (3-5) all vary linearly, differing only in slope: the LF basic model has slope 2, the LF improved model has slope 1, and the piecewise function has slope k_2 over its linear region f < 1/k_2. In the further comparison below, only the LF improved model is considered.
The variation rule of the put-down probability function should be: the larger the similarity, the larger the put-down probability; the smaller the similarity, the smaller the put-down probability. To this end, the present invention uses a quadratically increasing curve and defines the put-down probability function as formula (3-6):
p_d = k_2 f^2 for 0 ≤ f ≤ 0.3; p_d = 1 for f > 0.3    (3-6)
The quadratic variation of formula (3-6) meets the probability requirements when the similarity is small, and as the similarity grows, the coefficient k_2 directly governs how the probability changes: if k_2 is too small, the quadratic curve rises too slowly with f; if k_2 is too large, it rises too fast and quickly approaches a 100% probability. Since the similarity from formula (1) essentially varies within the interval 0 ≤ f ≤ 0.3, p_d should reach 100% for f > 0.3, which gives k_2 = 11.11.
Fig. 3 shows the curves of the existing put-down probability functions of formulas (3-3) to (3-5) and of the newly proposed formula (3-6) as functions of similarity. It can be seen that for 0 ≤ f ≤ 0.3, the put-down probability of formula (3-6) increases fastest with f, and for f > 0.3 it becomes 100%, while the other typical curves still change slowly there. Experimental results show that the similarity f computed by formula (1) essentially never exceeds 0.3; thus, as f approaches 0.3, a put-down probability that is too small is unfavorable for the ant to put the object down quickly. The variation of formula (3-6) therefore raises the speed at which ants put down the objects they carry, which improves the running efficiency of the algorithm and prevents an ant from carrying an object for a long time without laying it down.
Step 7. Move the ant to its new data position, generate a new random probability p_r between 0 and 60%, and increment the loop counter by 1. If the loop counter exceeds the maximum number of iterations N_MAX, end the clustering loop and go to step 8; otherwise go to step 4.
The random probability p_r between 0 and 60% is determined from the random probability range of the ant colony clustering algorithm, which is in turn determined from statistics of the values computed by formulas (2-4) and (3-6).
The statistical method is to tally the results computed by formulas (2-4) and (3-6); the statistics show that the values of p_d and p_p are distributed within the range 0–0.6 and never exceed 0.6. If p_r were still drawn from the conventional range 0–100%, the put-down and pick-up probability function values would for long stretches be smaller than the random probability generated by the system; a loaded ant that should put down its object could not do so immediately, and an unloaded ant that should pick up an object could not do so immediately. The objects would then fail to form clusters in the two-dimensional plane according to their similarity at a faster speed, degrading the convergence speed of the algorithm and the clustering result.
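A minimal sketch of the narrowed random-probability range described in step 7; the decision rule p_r < p_d and all constants other than the 0–0.6 range are illustrative assumptions:

```python
import random

def should_put_down(p_d, rng, p_r_max=0.6):
    # The ant puts its object down when the random draw falls below p_d.
    # Narrowing the draw to [0, p_r_max] instead of [0, 1] makes a large
    # p_d trigger the action almost immediately.
    p_r = rng.uniform(0.0, p_r_max)
    return p_r < p_d

rng = random.Random(0)
# With p_d at its observed maximum (0.6), the narrowed range makes the
# put-down near-certain, while the full [0, 1] range would fail roughly
# 40% of the time.
hits = sum(should_put_down(0.6, rng) for _ in range(10_000))
print(hits)
```

This reflects the text's argument: with p_d and p_p never exceeding 0.6, drawing p_r from 0–100% would leave ants holding (or ignoring) objects far longer than necessary.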
Step 8. Partition the clustering result into subspaces by cluster radius. During partitioning, if an input vector belongs to several subspaces simultaneously, the Euclidean distance criterion is additionally applied to decide its final class. Unclassified isolated points appearing during clustering are handled by the nearest-neighbor criterion: the input vector closest to the isolated point is found by the Euclidean distance criterion, and the isolated point and that nearest input vector are placed in the same class; if the nearest input vector is itself an isolated point, a new class containing the two isolated points is created.
An isolated point (also called an outlier) is an input vector that belongs to no subspace after clustering ends.
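The nearest-neighbor handling of isolated points in step 8 can be sketched as follows; the data, the label encoding (None marks an isolated point) and the function name are illustrative:

```python
import math

def dist(a, b):
    # Euclidean distance between two points of equal dimension
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def assign_isolated(points, labels):
    # labels[i] is a class id, or None for an isolated point.
    labels = list(labels)
    for i, p in enumerate(points):
        if labels[i] is not None:
            continue
        # Nearest other input vector by Euclidean distance
        j = min((k for k in range(len(points)) if k != i),
                key=lambda k: dist(p, points[k]))
        if labels[j] is None:
            # Nearest neighbor is also isolated: start a new class for both
            new_id = max((l for l in labels if l is not None), default=-1) + 1
            labels[i] = labels[j] = new_id
        else:
            labels[i] = labels[j]
    return labels

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
lab = [0, 0, None, None]
print(assign_isolated(pts, lab))  # the two far points form a new class
```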
When the optimized ant colony clustering algorithm of Fig. 4 finishes running, the codebook classification process ends, and the codebook rearrangement process is then carried out. The codebook rearrangement process comprises the following steps:
Step (1). Compute the number of input vectors in each non-empty subspace and solve for the centroid of each subspace. A non-empty subspace is one containing at least one input vector.
Step (2). Merge atypical class regions with the other class regions using the nearest-neighbor criterion: a subspace whose number of input vectors is below a particular value is processed with the nearest-neighbor criterion. This particular value is determined from the number of input vectors; in the present invention, with 256 input vectors, the input vector count defining an atypical class region is 5.

An atypical class region is a subspace whose number of input vectors is below the particular value.

Processing with the nearest-neighbor criterion means finding, by Euclidean distance, the subspace whose centroid is closest and merging the input vectors of the two subspaces into one class. This step prevents subspaces with too few input vectors from forming atypical centroids; therefore, a subspace whose number of input vectors is below the particular value is merged with another subspace.

The other class regions are the non-empty subspaces outside the atypical class regions.
Step (3). Recompute the number of input vectors in each non-empty subspace and solve for the centroid of each subspace.

Step (4). Store the input vectors together by the subspace they belong to, forming one sub-codebook per subspace; the feature value of a sub-codebook is represented by the centroid of the corresponding subspace.

Step (5). Arrange the sub-codebooks in the order of their feature values to form the classified codebook.
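Steps (1) to (5) above can be sketched as follows for scalar "vectors"; the merge threshold and data are illustrative (real codewords would be 9- or 7-dimensional, and at least one subspace must meet the threshold):

```python
def centroid(vs):
    return sum(vs) / len(vs)

def rearrange(subspaces, min_size=2):
    # subspaces: list of lists of (here scalar) input vectors
    subspaces = [list(s) for s in subspaces if s]        # step (1): keep non-empty
    small = [s for s in subspaces if len(s) < min_size]  # atypical class regions
    big = [s for s in subspaces if len(s) >= min_size]
    for s in small:                                      # step (2): nearest-centroid merge
        tgt = min(big, key=lambda b: abs(centroid(b) - centroid(s)))
        tgt.extend(s)
    # steps (3)-(5): recompute centroids (feature values) and order
    # the sub-codebooks by them; sorting inside a sub-codebook is only
    # for deterministic output here
    big.sort(key=centroid)
    return [(centroid(s), sorted(s)) for s in big]

book = rearrange([[9.0, 8.5, 9.5], [1.0, 1.5], [8.0], []])
print(book)
```

The returned list pairs each feature value (centroid) with its sub-codebook, already in feature-value order as step (5) requires.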
Steps 1 to 8 above, together with steps (1) to (5), jointly constitute the complete procedure of the ant-colony-clustering-based codebook classification method provided by the embodiment of the present invention; steps 1 to 8 alone may serve as an embodiment of the codebook classification process, and steps (1) to (5) alone may serve as an embodiment of the codebook rearrangement process.
Embodiment 2
The codebook classification device of the ant-colony-clustering-based codebook classification method provided by the embodiment of the present invention is now described in detail as follows:

Fig. 5 shows a structural diagram of this codebook classification device. The device comprises a sub-codebook feature value unit, a sub-codebook codeword number unit and a classified codebook unit.
The sub-codebook feature values stored in the sub-codebook feature value unit are used, during codebook classification of an input vector to be quantized, to determine the position of the sub-codebook. The codeword count of each sub-codebook, stored in the sub-codebook codeword number unit, together with the determined sub-codebook feature value position, determines the entry address in the classified codebook unit of the sub-codebook corresponding to the input vector to be quantized; this entry address, together with the codeword count of the determined sub-codebook, determines the exit address of that sub-codebook in the classified codebook unit.

The sub-codebook codeword number unit stores the codeword count of each sub-codebook obtained by the codebook classification method; it is located in the encoder unit of the codebook-classified vector quantization device and is used, during codebook classification of the input vector to be quantized, to determine the position and range of the sub-codebook.

The classified codebook unit stores the codebook obtained by arranging the sub-codebooks in the same order as the contents of the sub-codebook feature value unit; it is located in both the encoder unit and the decoder unit of the codebook-classified vector quantization device and provides the classified codebook to the codebook classification quantization module of the codebook-classified vector quantizer.
All calculations involved in the components of the device of Fig. 5 may use the formulas and computation methods given in the ant-colony-clustering-based codebook classification method of the embodiment of the present invention; the device may serve as one embodiment of the codebook classification device of that method.
Embodiment 3
Fig. 6 shows a structural diagram of the ant-colony-clustering-based codebook-classified vector quantization device provided by the embodiment of the present invention. The device comprises an encoder unit and a decoder unit: the encoder unit comprises a codebook classification device and a codebook classification quantization module; the decoder unit comprises a classified rearranged codebook unit and a decoding module.
The encoder unit performs vector quantization on each input vector to be quantized in turn; during quantization, the functions of the components are as follows:

Codebook classification device: determines the corresponding sub-codebook range during vector quantization. Once the sub-codebook range is determined, the sub-codebook search range corresponding to the input vector to be quantized is determined; the search range consists of the codewords between the entry address and the exit address of the sub-codebook corresponding to the input vector.

Codebook classification quantization module: the sub-codebook range corresponding to the input vector to be quantized is first determined in the codebook classification device; then, within that sub-codebook, the codeword with minimum distance to the input vector is determined. This codeword is the quantization vector of the input vector in the encoder unit, and its position in the classified codebook unit is the quantization index of the input vector; the quantization index is then written into the bitstream produced by the encoder.
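A hedged sketch of how the three units bound the codeword search at encode time: the feature value nearest the input selects the sub-codebook, the cumulative codeword counts give its entry address, and entry plus its own count gives the exit address. All values below are illustrative scalars, not actual ISF codewords:

```python
def classify_and_quantize(x, feature_values, counts, classified_book):
    # Select the sub-codebook whose feature value is nearest the input
    k = min(range(len(feature_values)),
            key=lambda i: abs(x - feature_values[i]))
    entry = sum(counts[:k])        # entry address in the classified codebook
    exit_ = entry + counts[k]      # exit address = entry + codeword count
    # Exhaustive search restricted to the sub-codebook slice only
    idx = min(range(entry, exit_),
              key=lambda i: abs(x - classified_book[i]))
    return idx, classified_book[idx]   # quantization index and quantized value

book = [0.9, 1.0, 1.2, 4.8, 5.0, 5.1, 5.3, 9.0, 9.2]  # ordered by feature value
feats = [1.0, 5.0, 9.1]                                # sub-codebook centroids
counts = [3, 4, 2]                                     # codewords per sub-codebook
print(classify_and_quantize(5.06, feats, counts, book))
```

Only counts[k] codewords are compared instead of all nine, which is exactly how the device reduces the search range and time complexity.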
The decoder unit obtains the reconstructed vector of the input vector to be quantized from the bitstream sent over the channel to the decoder end. The components of the decoder unit are as follows:

Classified rearranged codebook unit: stores the classified codebook and provides lookup for the decoder during decoding.

Decoding module: parses the quantization index from the bitstream sent over the channel to the decoder end, and looks up the reconstructed vector of the input vector to be quantized in the classified codebook unit according to the quantization index.
All calculations involved in the components of the device of Fig. 6 may use the formulas and computation methods given in the ant-colony-clustering-based codebook classification method of the embodiment of the present invention; the device may serve as one embodiment of the ant-colony-clustering-based codebook-classified vector quantization device provided by the embodiment of the present invention.
Embodiment 1
The quantization parameters adopted in this embodiment are the 16-dimensional immittance spectral frequencies (ISF) used by the Adaptive Multi-Rate Wideband (AMR-WB) speech coder, and the quantization method is a two-stage split vector quantizer. In this example, the first stage of the two-stage split vector quantizer adopts the ant-colony-clustering-based codebook-classified vector quantizer, while the second-stage split vector quantization still adopts an exhaustive-search vector quantizer; the whole is called a two-stage split vector quantizer containing a codebook classification device. Before quantization, this quantizer must carry out the first and second steps described below; its quantization process comprises the third and fourth steps described below. These are detailed further as follows:
First step: the codebook design process.

Codebook design adopts the LBG algorithm, which is well known to those skilled in the art. For the 16-dimensional ISF quantization parameters of the AMR-WB wideband speech coding algorithm, in coding modes 1 to 8, the 9-dimensional and 7-dimensional sub-vector codebooks used in first-stage split vector quantization both have size 256; in second-stage split vector quantization, the 9-dimensional sub-vector is further split into three 3-dimensional sub-vectors and the 7-dimensional sub-vector into a 3-dimensional and a 4-dimensional sub-vector, and the five sub-vector codebook sizes are 64, 128, 128, 32 and 32 respectively.
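The LBG design that the text treats as well known can be sketched minimally as follows; scalar training data, a codebook size assumed to be a power of two, and the split perturbation are all illustrative choices:

```python
def lbg(train, size, eps=1e-3, tol=1e-9):
    # Start from the global centroid, then repeatedly split every codeword
    # and run Lloyd iterations (nearest-codeword assignment + centroid
    # update) until the distortion stops improving.
    book = [sum(train) / len(train)]
    while len(book) < size:                       # size assumed a power of two
        book = [c * (1 + eps) for c in book] + [c * (1 - eps) for c in book]
        prev = float("inf")
        while True:
            cells = [[] for _ in book]
            for x in train:
                k = min(range(len(book)), key=lambda i: abs(x - book[i]))
                cells[k].append(x)
            # Empty cells keep their old codeword
            book = [sum(c) / len(c) if c else b for c, b in zip(cells, book)]
            dist = sum(min(abs(x - b) for b in book) ** 2 for x in train)
            if prev - dist < tol:
                break
            prev = dist
    return sorted(book)

data = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
print(lbg(data, 2))
```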
Second step: the codebook classification and rearrangement process.

As can be seen, the 9-dimensional and 7-dimensional sub-vector codebooks used in first-stage split vector quantization are both larger than the five sub-vector codebooks of the second stage. Therefore, the codebook-classified vector quantizer is applied only to the first stage, and the five second-stage sub-vector codebooks still use the exhaustive-search algorithm; only the 9-dimensional and 7-dimensional sub-vector codebooks need undergo the codebook classification and rearrangement process, which specifically comprises the following steps:
(1) Codebook classification with the ant colony clustering algorithm. The 9-dimensional sub-vector codebook designed in the first step serves as the input vector set of the ant colony clustering algorithm. When the algorithm finishes, the codewords of the 9-dimensional sub-vector codebook having the same sub-codebook centroid are arranged together to form the 9-dimensional sub-codebooks, and the centroid of each 9-dimensional sub-codebook is the feature value of that sub-codebook. Similarly, the 7-dimensional sub-vector codebook is used in turn as the input vector set of the ant colony clustering algorithm; when the algorithm finishes, the codewords of the 7-dimensional sub-vector codebook having the same sub-codebook centroid are arranged together to form the 7-dimensional sub-codebooks, and the centroid of each 7-dimensional sub-codebook is its feature value.

(2) Codebook rearrangement process. The sub-codebooks formed from the 9-dimensional sub-vectors are arranged in the same order as the corresponding 9-dimensional sub-codebook feature values to form the final 9-dimensional classified codebook, and the codeword counts of the 9-dimensional sub-codebooks are arranged in that same order. Similarly, the sub-codebooks formed from the 7-dimensional sub-vectors are arranged in the same order as the corresponding 7-dimensional sub-codebook feature values to form the final 7-dimensional classified codebook, and the codeword counts of the 7-dimensional sub-codebooks are arranged in that same order.
(3) Storage of the classification information and the rearranged codebooks.

The 9-dimensional classified codebook is stored in the 9-dimensional classified codebook unit, the codeword count of each 9-dimensional sub-codebook in the 9-dimensional sub-codebook codeword number unit, and the feature value of each 9-dimensional sub-codebook in the 9-dimensional sub-codebook feature value unit; in a concrete implementation, the contents of these three units must correspond one to one.

Similarly, the 7-dimensional classified codebook, the codeword counts of the 7-dimensional sub-codebooks and the 7-dimensional sub-codebook feature values are stored respectively in the 7-dimensional classified codebook unit, the 7-dimensional sub-codebook codeword number unit and the 7-dimensional sub-codebook feature value unit, and the contents of these three units must likewise correspond one to one.

All of the above storage units are located in the corresponding parts of the two-stage split vector quantization device containing the codebook classification device within the AMR-WB wideband speech coding algorithm.
Third step: the encoding process of the input vector to be quantized.

During encoding, the mean vector (obtained by prior training) is first subtracted from the 16-dimensional input vector to be quantized, yielding the input residual vector; the 16-dimensional residual vector is then split into a 9-dimensional sub-vector and a 7-dimensional sub-vector. For the 9-dimensional sub-vector, codebook classification device 1 provides the codebook classification information to codebook classification quantization module 1, which uses the Euclidean distance measure to determine, within the sub-codebook of classified codebook unit 1 indicated by that information, the quantization vector of the 9-dimensional sub-vector and its quantization index I_11 in classified codebook 1.

For the 7-dimensional sub-vector, codebook classification device 2 provides the codebook classification information to codebook classification quantization module 2, which uses the Euclidean distance measure to determine, within the sub-codebook of classified codebook unit 2 indicated by that information, the quantization vector of the 7-dimensional sub-vector and its quantization index I_12 in classified codebook 2.
In second-stage quantization, the quantization vector of the 9-dimensional sub-vector is first subtracted from the 9-dimensional sub-vector to obtain its residual vector; split vector quantization coding module 1 further splits this into three 3-dimensional sub-vectors, and each of the three uses the Euclidean distance measure with the exhaustive-search algorithm in secondary codebook 1, secondary codebook 2 and secondary codebook 3 respectively to determine its quantization vector and the quantization indices I_21, I_22 and I_23.

Similarly, for the 7-dimensional sub-vector, its quantization vector is first subtracted from it to obtain its residual vector; split vector quantization coding module 2 then splits this into one 3-dimensional and one 4-dimensional sub-vector, which use the Euclidean distance measure with the exhaustive-search algorithm in secondary codebook 4 and secondary codebook 5 respectively to determine the quantization vectors and quantization indices I_24 and I_25 of the two sub-vectors.

When encoding ends, the quantization indices I_11, I_12, I_21, I_22, I_23, I_24 and I_25 are written into the bitstream.
Fourth step: the decoding process of the input vector to be quantized.

During decoding, the split vector quantization decoding module parses all the quantization indices from the bitstream sent over the channel, looks up the quantization vectors of the corresponding sub-vectors in classified codebook 1, classified codebook 2 and secondary codebooks 1 to 5, and then adds, component by component, the first-stage quantization vector, the second-stage quantization vector and the mean vector of the 16-dimensional ISF parameters to obtain the reconstructed vector of the input vector to be quantized.
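The third and fourth steps can be sketched as a toy round trip with 4-dimensional vectors split 2+2; the codebooks, dimensions and data are illustrative, not the AMR-WB ISF books:

```python
def nearest(book, v):
    # Exhaustive search: index and codeword with minimum squared distance
    def d2(c):
        return sum((a - b) ** 2 for a, b in zip(c, v))
    idx = min(range(len(book)), key=lambda i: d2(book[i]))
    return idx, book[idx]

mean = [2.0, 2.0, 2.0, 2.0]                    # trained mean vector (toy)
stage1 = [[0.0, 0.0], [1.0, 1.0], [3.0, 3.0]]  # shared first-stage book (toy)
stage2 = [[-0.2, -0.2], [0.0, 0.0], [0.2, 0.2]]

def encode(x):
    r = [a - m for a, m in zip(x, mean)]       # subtract mean
    subs = [r[:2], r[2:]]                      # split 4 -> 2 + 2
    indices = []
    for s in subs:
        i1, q1 = nearest(stage1, s)            # first-stage quantization
        res = [a - b for a, b in zip(s, q1)]   # stage-1 residual
        i2, _ = nearest(stage2, res)           # second-stage quantization
        indices += [i1, i2]
    return indices

def decode(indices):
    out = []
    for n in range(2):
        i1, i2 = indices[2 * n], indices[2 * n + 1]
        out += [a + b for a, b in zip(stage1[i1], stage2[i2])]
    return [a + m for a, m in zip(out, mean)]  # add mean back

x = [3.15, 3.15, 2.0, 2.0]
print(decode(encode(x)))
```

The reconstruction is the component-wise sum of mean, stage-1 and stage-2 vectors, mirroring the fourth step.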
Fig. 7 shows a structural diagram of the two-stage split vector quantization device containing the codebook classification device. The device comprises an encoder unit and a decoder unit. The encoder unit comprises a mean vector unit, three adders, codebook classification devices 1 and 2, codebook classification quantization modules 1 and 2, split vector quantization coding modules 1 and 2, and a secondary codebook unit; the decoder unit comprises a mean vector unit, a secondary codebook unit, a classified codebook unit and a split vector quantization decoding module. All calculations involved in the components of this device may use the formulas and computation methods given in the codebook classification method and quantization process of the embodiment of the present invention; the device serves as one embodiment of the two-stage split vector quantization device containing a codebook classification device.
The encoder unit performs vector quantization on each input vector to be quantized in turn; the components of the encoder unit are described as follows:

Mean vector unit: stores the mean of each component of the 16-dimensional immittance spectral frequencies (ISF); the means are obtained by prior training, which is well known to those skilled in the art.

Adder 1: subtracts the per-component means in the mean vector unit from the components of the input vector to be quantized; the residual sub-vector 1, formed from the resulting 1st to 9th components, is supplied to codebook classification device 1, and the residual sub-vector 2, formed from the resulting 10th to 16th components, is supplied to codebook classification device 2.

Adder 2: subtracts from residual sub-vector 1 its quantized value obtained by codebook classification quantization module 1, yielding the quantization residual of residual sub-vector 1, which is supplied to split vector quantization coding module 1.

Adder 3: subtracts from residual sub-vector 2 its quantized value obtained by codebook classification quantization module 2, yielding the quantization residual of residual sub-vector 2, which is supplied to split vector quantization coding module 2.
Codebook classification device 1: comprises sub-codebook feature value unit 1, sub-codebook codeword number unit 1 and classified codebook unit 1. Their functions are: sub-codebook feature value unit 1 provides the sub-codebook feature values of the 9-dimensional sub-vectors; sub-codebook codeword number unit 1 provides the codeword count of each 9-dimensional sub-codebook; classified codebook unit 1 provides the classified codebook composed of the 9-dimensional sub-codebooks. The contents of these three units jointly determine the sub-codebook search range in classified codebook 1 when residual sub-vector 1 is quantized.

Codebook classification device 2: comprises sub-codebook feature value unit 2, sub-codebook codeword number unit 2 and classified codebook unit 2. Their functions are: sub-codebook feature value unit 2 provides the sub-codebook feature values of the 7-dimensional sub-vectors; sub-codebook codeword number unit 2 provides the codeword count of each 7-dimensional sub-codebook; classified codebook unit 2 provides the classified codebook composed of the 7-dimensional sub-codebooks. The contents of these three units jointly determine the sub-codebook search range in classified codebook 2 when residual sub-vector 2 is quantized.
Codebook classification quantization module 1: according to the sub-codebook search range of residual sub-vector 1 provided by codebook classification device 1, determines within classified codebook 1 the quantization vector and quantization index I_11 of residual sub-vector 1, writes the quantization index into the bitstream, and supplies the quantization vector to adder 2.

Codebook classification quantization module 2: according to the sub-codebook search range of residual sub-vector 2 provided by codebook classification device 2, determines within classified codebook 2 the quantization vector and quantization index I_12 of residual sub-vector 2, writes the quantization index into the bitstream, and supplies the quantization vector to adder 3.

Split vector quantization coding module 1: according to the quantization residual of residual sub-vector 1 provided by adder 2, applies the split vector quantization method with exhaustive-search quantization in three secondary codebooks of the secondary codebook unit to determine three quantization vectors and the quantization indices I_21, I_22 and I_23, and writes the quantization indices into the bitstream.

Split vector quantization coding module 2: according to the quantization residual of residual sub-vector 2 provided by adder 3, applies the split vector quantization method with exhaustive-search quantization in two secondary codebooks of the secondary codebook unit to determine two quantization vectors and the quantization indices I_24 and I_25, and writes the quantization indices into the bitstream.
The split vector quantization method used in split vector quantization coding modules 1 and 2 is well known to those skilled in the art.

Secondary codebook unit: stores five secondary codebooks, secondary codebook 1 to secondary codebook 5, providing the codebooks used by split vector quantization coding modules 1 and 2 during quantization. The five secondary codebooks are trained with the LBG codebook design algorithm, which is well known to those skilled in the art.
The decoder unit decodes the bitstream sent over the channel to the decoder end to obtain the reconstructed vector of the input vector to be quantized; the components of the decoder unit are described as follows:

Mean vector unit: stores the mean of each component of the 16-dimensional ISF, identical in content to the mean vector unit in the encoder unit, and supplies it to the split vector quantization decoding module.

Secondary codebook unit: stores the five secondary codebooks, secondary codebook 1 to secondary codebook 5, identical in content to the five secondary codebooks in the encoder unit, providing lookup for the split vector quantization decoding module.

Classified codebook unit: stores classified codebook 1 and classified codebook 2, providing lookup for the split vector quantization decoding module.

Split vector quantization decoding module: parses the quantization index values from the bitstream sent over the channel to the decoder end; looks up in the codebooks of the secondary codebook unit the quantized values of the quantization residuals of residual sub-vectors 1 and 2, and in the codebooks of the classified codebook unit the quantized values of residual sub-vectors 1 and 2; adds the quantized value of the quantization residual of residual sub-vector 1 to the quantized value of residual sub-vector 1 and then to the per-component means of the first 9 dimensions in the mean vector unit; adds the quantized value of the quantization residual of residual sub-vector 2 to the quantized value of residual sub-vector 2 and then to the per-component means of the last 7 dimensions; and combines the first 9-dimensional and last 7-dimensional sums to obtain the 16-dimensional reconstructed vector of the input vector to be quantized.
The effect obtained by the codebook classification method and codebook classification device provided by the embodiment of the present invention is illustrated by the following experimental results, in which the two-stage split vector quantization device containing the codebook classification device is used in the AMR-WB algorithm:
At present, the spectral distortion parameter is an objective standard for evaluating quantizer performance. So that no additional perceptible distortion is introduced into the coded speech, the quantized spectral distortion of a vector quantizer in a speech coding algorithm is generally required to meet the quality criteria of transparent quantization, namely: the average spectral distortion is about 1 dB; the percentage of frames with spectral distortion greater than 4 dB tends to 0; the percentage of frames with spectral distortion in the 2–4 dB range is about 2%; and no speech frame has spectral distortion exceeding 4 dB. Table 1 gives the spectral distortion values of the two-stage split vector quantizer containing the codebook classification device, where the codebook classification method adopts the optimized ant colony clustering algorithm of the present invention and the numbers of codebook classes of the 9-dimensional and 7-dimensional sub-vector codebooks are each 4, 6, 12 and 14, i.e. four cases. It can be seen that with the codebook classification device of the present invention, the two-stage split vector quantizer achieves the effect of transparent quantization.
Table 2 gives the w-PESQ values of speech reconstructed by the AMR-WB algorithm. In the table, S-MSVQ denotes the multi-stage (here two-stage) split vector quantization method; its column gives the w-PESQ value of the reconstructed speech when the AMR-WB algorithm quantizes the ISF parameters with S-MSVQ. The codebook class number columns give the w-PESQ values of reconstructed speech when the AMR-WB algorithm quantizes the ISF parameters with the multi-stage split vector quantizer containing the codebook classification device: in "4/4", the first 4 means the 9-dimensional sub-vector codebook is divided into 4 sub-codebooks and the second 4 means the 7-dimensional sub-vector codebook is also divided into 4 sub-codebooks, and "6/6", "12/12" and "14/14" have analogous meanings. The "difference from S-MSVQ" column gives the difference between the w-PESQ value of speech reconstructed with the multi-stage split vector quantizer containing the codebook classification device and that reconstructed with S-MSVQ.
It can be seen that, compared with the S-MSVQ quantization method, when the number of codebook classes is 4 or 6, the average reconstructed-speech w-PESQ value over the 9 coding modes of the two-stage split vector quantizer containing the codebook classification device improves slightly; when the number of classes is 12 or 14, the average w-PESQ value over the 9 modes declines slightly. Both the decline and the improvement are small, and subjectively the difference from the decoded speech quality of the original algorithm is essentially imperceptible. The largest w-PESQ improvement, 0.037, occurs for coding mode 2 with 6 classes; the largest decline, -0.134, occurs for coding mode 0 with 14 classes. It follows that the number of classes of the codebook classification device must not be too large, otherwise the reconstructed speech quality declines markedly.
The specific embodiments described above further explain the objects, technical solutions and beneficial effects of the present invention and help in understanding the method of the present invention and its ideas; however, the foregoing is merely specific embodiments of the present invention and is not intended to limit its scope of protection. For one of ordinary skill in the art, any modification, equivalent replacement, improvement and the like made according to the idea of the present invention shall be included within the scope of protection of the present invention.
Table 1 (reproduced in the source only as the image Figure BSA00000249041300161)
Table 2 (reproduced in the source only as the image Figure BSA00000249041300171)

Claims (1)

1. A codebook classification device for the codebook classification method based on the ant colony clustering algorithm, the device comprising a sub-codebook characteristic value unit, a sub-codebook codeword count unit and a classified codebook unit;
The sub-codebook characteristic value unit stores the sub-codebook characteristic values obtained by the codebook classification method, and is used to determine the location of the sub-codebook for an input vector to be quantized when codebook classification is performed on the 16-dimensional immittance spectral frequency (ISF) vectors used by the adaptive multi-rate wideband (AMR-WB) speech coder; the sub-codebook characteristic value unit is located in the encoder unit of the codebook classified vector quantization apparatus;
The sub-codebook codeword count unit stores the number of codewords contained in each sub-codebook obtained by the codebook classification method, and is used to determine the location and range of the sub-codebook for an input vector to be quantized when codebook classification is performed on the 16-dimensional immittance spectral frequency (ISF) vectors used by the adaptive multi-rate wideband (AMR-WB) speech coder; the sub-codebook codeword count unit is located in the encoder unit of the codebook classified vector quantization apparatus;
The classified codebook unit stores the obtained codebook with its sub-codebooks arranged in the same order as the contents of the sub-codebook characteristic value unit, and is located in both the encoder unit and the decoder unit of the codebook classified vector quantization apparatus;
Wherein the codebook classified vector quantizer is a vector quantizer that includes a codebook classifier; the codebook classified vector quantization apparatus consists of an encoder unit and a decoder unit; the encoder unit comprises the codebook classifier and a codebook classification quantization module, the codebook classification quantization module being used to determine, through the codebook classifier, the quantization vector corresponding to the input vector to be quantized and to write the quantization index of the quantization vector into the bitstream; the decoder unit comprises the classified codebook unit and a decoder module, the decoder module being used to receive the quantization index delivered to the decoder unit by the bitstream and to search the classified codebook unit for the reconstructed vector, corresponding to that quantization index value, of the input vector to be quantized;
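As an illustration of the search-range restriction described above, the following sketch quantizes an input vector by first matching it against the sub-codebook characteristic values and then searching only the selected sub-codebook. The function and variable names (`classified_vq_search`, `sub_codebooks`, `eigenvalues`) are my own labels, not from the patent; this is a minimal sketch, not the patented implementation.

```python
import numpy as np

def classified_vq_search(x, sub_codebooks, eigenvalues):
    """Quantize x by restricting the codeword search to one sub-codebook.

    eigenvalues[k] is the characteristic value (subspace centroid) of
    sub_codebooks[k]; the sub-codebook sizes play the role of the
    codeword-count unit when computing the global quantization index.
    """
    # Step 1: pick the sub-codebook whose characteristic value is nearest to x.
    k = int(np.argmin([np.linalg.norm(x - e) for e in eigenvalues]))
    # Step 2: exhaustive search restricted to the chosen sub-codebook.
    sub = sub_codebooks[k]
    j = int(np.argmin(np.linalg.norm(sub - x, axis=1)))
    # Global index = number of codewords in preceding sub-codebooks + local index.
    offset = sum(len(sub_codebooks[i]) for i in range(k))
    return offset + j, sub[j]
```

Because only one sub-codebook is searched instead of the whole classified codebook, the number of distance computations drops roughly by the number of classes, which is the time-complexity reduction the abstract claims.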
The codebook classification method of the codebook classified vector quantizer comprises a codebook classification process and a codebook rearrangement process;
The codebook classification process proceeds as follows:
(1) Initialize the parameters: set the maximum number of clustering cycles N_MAX, the threshold N_1MAX on the number of times an ant may move carrying the same object, the number of ants, the two-dimensional plane, and the initial value of the cluster radius;
(2) Randomly map the data objects to be clustered onto the two-dimensional plane, giving each data object a random coordinate in the initial two-dimensional plane region, and generate a random probability p_r uniformly distributed between 0 and 60%; the data objects to be clustered are the pre-designed codebook;
(3) Place the ants at random on the two-dimensional plane within the delimited region, and set the initial state of each ant to unloaded;
(4) Compute the similarity parameter f(o_i) according to the following formula (1);
Figure FSB0000117720740000021
In the above formula (1), d(o_i, o_j) is the Euclidean distance between two data objects o_i and o_j mapped onto the two-dimensional plane; d_MAX(o_i, o_j) is the maximum Euclidean distance between o_i and any data object within the cluster radius r around o_i; o_j is the data object within the cluster radius r around o_i at the maximum Euclidean distance from o_i; α is a parameter regulating the similarity between data objects;
(5) If the current state of the ant is unloaded, compute the pick-up probability using the pick-up probability function below and judge whether the pick-up probability is greater than the random probability; if it is, the ant picks up the object at its position and starts moving the object; otherwise the ant moves to another position at random, and the procedure jumps back to step (4) to compute the similarity parameter;
The pick-up probability function is defined as the following formula (2-4):
Figure FSB0000117720740000022
where the parameter f is the similarity determined by formula (1), and b and k_1 are threshold constants, with b = 0.3 and k_1 = 11.11;
(6) If the current state of the ant is loaded, compute the put-down probability using the put-down probability function below and judge whether the put-down probability is greater than the random probability; if it is, the ant puts down the object and its state is reset to unloaded; otherwise the ant continues carrying the object to a new data position and the count of moves with the same object is incremented by 1; if the count of moves with the same object exceeds the threshold N_1MAX, the count is reset to 0, the ant puts down the object and its state is reset to unloaded; otherwise go to step (4);
The put-down probability function is defined as formula (3-6):
Figure FSB0000117720740000031
In formula (3-6), the coefficient k_2 = 11.11;
(7) Assign the ant to the new data position, generate again a random probability p_r between 0 and 60%, and increment the cycle count by 1; if the cycle count is greater than the maximum number of cycles N_MAX, end the clustering loop; otherwise go to step (4);
(8) Partition the clustering result into subspaces by the cluster radius; unclassified isolated points that appear during clustering are handled with the nearest-neighbor criterion: the isolated point and the nearest input vector are placed in one class, and if the nearest input vector is itself an isolated point, a new class containing these two isolated points is created;
An isolated point (also called an outlier) is an input vector that, when clustering ends, does not belong to any subspace;
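The clustering loop of steps (1) to (8) can be sketched as follows. Since formulas (1), (2-4) and (3-6) appear in the source only as images, the similarity function and the sigmoid pick-up/put-down probabilities below are plausible Lumer-Faieta-style stand-ins that merely reuse the stated constants b = 0.3 and k_1 = k_2 = 11.11; they are assumptions, not the patent's exact formulas, and all names are illustrative.

```python
import math
import random

def dist(a, b):
    # Euclidean distance between two points/vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similarity(i, data, pos, r, alpha):
    # Average similarity of object i to neighbors within cluster radius r
    # on the plane (stand-in for formula (1), which is image-only in the source).
    neigh = [j for j in range(len(data))
             if j != i and dist(pos[i], pos[j]) <= r]
    if not neigh:
        return 0.0
    s = sum(1.0 - dist(data[i], data[j]) / alpha for j in neigh)
    return max(0.0, s / len(neigh))

def p_pick(f, b=0.3, k1=11.11):
    # Assumed sigmoid pick-up probability: low similarity -> likely pick-up.
    return 1.0 / (1.0 + math.exp(k1 * (f - b)))

def p_put(f, k2=11.11):
    # Assumed sigmoid put-down probability: high similarity -> likely drop.
    return 1.0 / (1.0 + math.exp(-k2 * (f - 0.5)))

def ant_cluster(data, n_ants=3, n_max=1000, n1_max=50, r=2.0, alpha=1.0, plane=10.0):
    # Step (2): random plane coordinates for every data object (codeword).
    pos = [[random.uniform(0, plane), random.uniform(0, plane)] for _ in data]
    # Step (3): unloaded ants placed at random objects.
    ants = [{"load": None, "moves": 0, "at": random.randrange(len(data))}
            for _ in range(n_ants)]
    for _ in range(n_max):                      # step (7): cycle counter
        p_r = random.uniform(0.0, 0.6)          # random probability in [0, 60%]
        for ant in ants:
            i = ant["at"] if ant["load"] is None else ant["load"]
            f = similarity(i, data, pos, r, alpha)   # step (4)
            if ant["load"] is None:             # step (5): try to pick up
                if p_pick(f) > p_r:
                    ant["load"] = i
                else:
                    ant["at"] = random.randrange(len(data))
            else:                               # step (6): try to put down
                if p_put(f) > p_r:
                    ant["load"], ant["moves"] = None, 0
                else:
                    # Carry the object to a new random plane position.
                    pos[i] = [random.uniform(0, plane), random.uniform(0, plane)]
                    ant["moves"] += 1
                    if ant["moves"] > n1_max:   # forced drop after N_1MAX moves
                        ant["load"], ant["moves"] = None, 0
    return pos
```

Comparing each probability against a random threshold capped at 60% (rather than a uniform draw over [0, 1]) is the convergence-speed device the abstract refers to: it makes both pick-up and put-down easier to trigger, so objects settle into clusters in fewer cycles.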
The codebook rearrangement process proceeds as follows:
(1) Count the input vectors in each non-empty subspace, and compute the centroid of every subspace;
A non-empty subspace is a subspace containing at least one input vector;
(2) Merge the atypical class regions with other class regions using the nearest-neighbor criterion;
An atypical class region is a subspace whose number of input vectors is smaller than a certain threshold value;
The other class regions are the non-empty subspaces outside the atypical class regions;
(3) Recount the input vectors in each non-empty subspace, and recompute the centroid of every subspace;
(4) Store the input vectors together according to the subspace they belong to, each subspace forming one sub-codebook; the characteristic value of a sub-codebook is represented by the centroid of the corresponding subspace;
(5) Arrange the sub-codebooks in the same order as their characteristic values to form the classified codebook.
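The rearrangement steps (1) to (5) can be sketched as follows, assuming cluster labels are already available from the classification process. The merge threshold `min_size` and all function names are illustrative choices, not values from the patent.

```python
import numpy as np

def rearrange(codebook, labels, min_size=2):
    """Form sub-codebooks from cluster labels, merge atypical (small)
    classes into the nearest other class (measured between centroids),
    and order the sub-codebooks by their characteristic values."""
    labels = list(labels)

    def groups():
        g = {}
        for v, l in zip(codebook, labels):
            g.setdefault(l, []).append(v)
        return g

    # Steps (1)-(3): repeatedly fold atypical classes into their
    # nearest-centroid neighbor, then recompute counts and centroids.
    while True:
        g = groups()
        cent = {l: np.mean(vs, axis=0) for l, vs in g.items()}
        small = [l for l, vs in g.items() if len(vs) < min_size]
        if not small or len(g) == 1:
            break
        l = small[0]
        tgt = min((m for m in g if m != l),
                  key=lambda m: np.linalg.norm(cent[l] - cent[m]))
        labels = [tgt if x == l else x for x in labels]

    # Steps (4)-(5): one sub-codebook per subspace, characteristic value =
    # subspace centroid, sub-codebooks concatenated in characteristic-value order.
    order = sorted(g, key=lambda l: tuple(cent[l]))
    sub_codebooks = [np.array(g[l]) for l in order]
    eigenvalues = [cent[l] for l in order]
    return sub_codebooks, eigenvalues
```

The returned `sub_codebooks` and `eigenvalues` are exactly the contents of the classified codebook unit and the sub-codebook characteristic value unit described in the claim, so they plug directly into a classified-search encoder.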
CN201010267156.8A 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof Expired - Fee Related CN101944358B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010267156.8A CN101944358B (en) 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010267156.8A CN101944358B (en) 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof

Publications (2)

Publication Number Publication Date
CN101944358A CN101944358A (en) 2011-01-12
CN101944358B true CN101944358B (en) 2014-04-09

Family

ID=43436318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010267156.8A Expired - Fee Related CN101944358B (en) 2010-08-27 2010-08-27 Ant colony algorithm-based codebook classification method and codebook classification device thereof

Country Status (1)

Country Link
CN (1) CN101944358B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102222098A (en) * 2011-06-20 2011-10-19 北京邮电大学 Method and system for pre-fetching webpage
CN103297766B (en) 2012-02-23 2016-12-14 中兴通讯股份有限公司 The compression method of vertex data and device in a kind of 3 d image data
CN104050963B (en) * 2014-06-23 2017-02-15 东南大学 Continuous speech emotion prediction method based on emotion data field
CN104459686B (en) * 2014-12-30 2017-12-05 南京信息工程大学 A kind of object detecting and tracking method based on Hough transform Yu ant colony similarity
CN106156841A (en) * 2016-06-24 2016-11-23 武汉理工大学 A kind of k means data processing method based on minimax pheromone
CN112435674A (en) * 2020-12-09 2021-03-02 北京百瑞互联技术有限公司 Method, apparatus, medium for optimizing LC3 arithmetic coding search table of spectrum data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070179944A1 (en) * 2005-11-23 2007-08-02 Henry Van Dyke Parunak Hierarchical ant clustering and foraging
CN101266621A (en) * 2008-04-24 2008-09-17 北京学门科技有限公司 High dimension sparse data clustering system and method
CN101414365A (en) * 2008-11-20 2009-04-22 山东大学威海分校 Vector code quantizer based on particle group


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Classification With Ant Colony Optimization; Martens, D.; De Backer, M.; et al.; IEEE Transactions on Evolutionary Computation; 2007-10-31; 651-665 *
Artificial ant colony clustering algorithm for codebook design; Hu Hongmei; Dong Enqing; Communications Technology; 2007-07-31; full text *
Codebook design based on ant colony clustering; Hu Hongmei; Dong Enqing; Journal of Soochow University (Engineering Science Edition); 2007-04-30; full text *
Whispered speech tone recognition with an ant colony clustering neural network; Chen Xueqin; Zhao Heming; Yu Yibiao; Journal of Applied Sciences; 2008-10-31; full text *

Also Published As

Publication number Publication date
CN101944358A (en) 2011-01-12


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140409

Termination date: 20210827