CN110213660A - Program distribution method, system, computer device and storage medium - Google Patents
- Publication number
- CN110213660A (application number CN201910448059.XA)
- Authority
- CN
- China
- Prior art keywords
- program
- vector
- anchor (host)
- model
- training
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4662—Learning process for intelligent management characterized by learning algorithms
- H04N21/4666—Learning process for intelligent management characterized by learning algorithms using neural networks, e.g. processing the feedback provided by the user
- H04N21/4668—Learning process for intelligent management for recommending content, e.g. movies
Landscapes
- Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The present invention relates to a program distribution method, system, computer device and storage medium. The method includes: obtaining a new program to be distributed and the program vectors of a user's preferred programs; inputting the new program into a pre-established program vector model to compute the new program's vector, the program vector model having been obtained by training a neural network algorithm on historical programs; computing the cosine similarity between the new program and a preferred program according to the new program's vector and the preferred program's vector; and, when the cosine similarity is greater than a preset threshold, distributing the new program to the user associated with that preferred program. By finding the programs most similar to a new program and recommending or distributing according to the most similar programs, the method completes the distribution of new programs quickly and with high accuracy.
Description
Technical field
The present invention relates to the field of network technology, and in particular to a program distribution method, system, computer device and storage medium.
Background technique
With the continuous development of Internet technology, audio and video platforms (such as Lizhi and YY Live) have emerged one after another and developed rapidly. As these platforms expand, a large number of programs are uploaded every minute for platform users to listen to, and distributing them is an extremely complex task. Lizhi, for example, is a UGC platform to which hundreds of thousands of programs are uploaded daily; quickly recommending these programs to users is a significant problem.
Current approaches generally use collaborative filtering or tag-based recommendation. Collaborative filtering, however, relies on large amounts of user data, which a newly uploaded program cannot yet have accumulated — a major bottleneck (the cold-start problem). Tag-based recommendation of new programs, in turn, suffers from uneven tag quality and rarely achieves good results.
Summary of the invention
Based on this, in view of the problems that current program distribution methods either must rely on large amounts of user data or recommend programs of uneven quality, it is necessary to provide a program distribution method, system, computer device and storage medium.
A program distribution method, the method comprising the following steps:
obtaining a new program to be distributed and the program vectors of a user's preferred programs;
inputting the new program into a pre-established program vector model and computing the new program's vector, wherein the program vector model is obtained by training a neural network algorithm on historical programs;
computing the cosine similarity between the new program and a preferred program according to the new program's vector and the preferred program's vector;
when the cosine similarity is greater than a preset threshold, distributing the new program to the user associated with the preferred program.
In one embodiment, establishing the program vector model in advance comprises:
obtaining the historical programs from a program database;
selecting training samples according to the click-through rates of the historical programs;
extracting anchor vectors, label vectors and quality vectors from the training samples, and inputting the anchor vectors, label vectors and quality vectors into the neural network model for training to obtain the program vector model.
In one embodiment, inputting the anchor vector, label vector and quality vector into the neural network model for training to obtain the program vector model comprises:
inputting the anchor vectors, label vectors and quality vectors of any two programs in the training samples into the hidden layers of the neural network model to generate two program vectors;
computing the Hadamard product of the two program vectors;
subtracting the two program vectors to obtain their difference;
computing the click-through rates of the two programs according to their quality vectors;
inputting the Hadamard product, the difference and the click-through rates into the sigmoid layer of the neural network model to compute the similarity of the two programs;
computing in turn the similarity of every pair of programs in the training samples;
computing a loss value according to the similarities and a loss function;
when the loss value is less than a preset value, obtaining the program vector model.
In one embodiment, before inputting the anchor vector, label vector and quality vector into the neural network model for training to obtain the program vector model, the method comprises:
training an anchor vector model with the word2vec algorithm according to users' preferences for the anchors of historical programs;
training a label vector model with the word2vec algorithm according to the label system of the historical programs;
training a quality vector model with a one-hot bucketing algorithm according to the quality system of the historical programs.
In one embodiment, training the anchor vector model with the word2vec algorithm comprises:
constructing a user-anchor rating matrix;
analyzing the user-anchor rating matrix with a roulette-wheel algorithm and, according to the results of the analysis, selecting a preset number of anchors for each user to form an anchor matrix;
training the anchor matrix with the word2vec algorithm to obtain the anchor vector model.
In one embodiment, the quality system includes play count, click-through rate, completion rate and publish time; training the quality vector model with the one-hot bucketing algorithm according to the quality system of the historical programs comprises:
bucketing the play count, click-through rate, completion rate and publish time of the historical programs with a one-hot bucketing algorithm to obtain the quality vector model.
In one embodiment, the label system includes keyword labels; training the label vector model with the word2vec algorithm according to the co-occurrence relationships of the historical programs comprises:
extracting all keywords of the historical programs and the labels corresponding to the keywords;
training on the labels corresponding to the keywords with the word2vec algorithm to obtain a keyword label model.
A program distribution system, comprising:
an information obtaining module for obtaining a new program to be distributed and the program vectors of a user's preferred programs;
a vector computing module for inputting the new program into a pre-established program vector model and computing the new program's vector, wherein the program vector model is obtained by training a neural network algorithm on historical programs;
a cosine similarity computing module for computing the cosine similarity between the new program and a preferred program according to the new program's vector and the preferred program's vector;
a program distribution module for distributing the new program to the user associated with the preferred program when the cosine similarity is greater than a preset threshold.
A computer device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the computer program, performs the following steps:
obtaining a new program to be distributed and the program vectors of a user's preferred programs;
inputting the new program into a pre-established program vector model and computing the new program's vector, wherein the program vector model is obtained by training a neural network algorithm on historical programs;
computing the cosine similarity between the new program and a preferred program according to the new program's vector and the preferred program's vector;
when the cosine similarity is greater than a preset threshold, distributing the new program to the user associated with the preferred program.
A computer-readable storage medium storing a computer program which, when executed by a processor, performs the following steps:
obtaining a new program to be distributed and the program vectors of a user's preferred programs;
inputting the new program into a pre-established program vector model and computing the new program's vector, wherein the program vector model is obtained by training a neural network algorithm on historical programs;
computing the cosine similarity between the new program and a preferred program according to the new program's vector and the preferred program's vector;
when the cosine similarity is greater than a preset threshold, distributing the new program to the user associated with the preferred program.
In the above program distribution method, system, computer device and storage medium, the program vectors of the program to be distributed and of the user's preferred programs are first obtained; the new program to be distributed is then input into the pre-established program vector model to compute the new program's vector; the cosine similarity between the new program and a preferred program is computed from the new program's vector and the preferred program's vector and compared with a preset threshold; when the cosine similarity is greater than the threshold, the new program is deemed similar to the preferred program and is therefore distributed to the user corresponding to that preferred program. By computing the new program's vector and then determining its similarity to the users' preferred programs from the two program vectors, the method finds the programs most similar to the new program and recommends or distributes accordingly, so the distribution of new programs is completed quickly and with high accuracy.
Detailed description of the invention
Fig. 1 is a schematic flow diagram of the program distribution method of the present invention in one embodiment;
Fig. 2 is a schematic flow diagram of establishing the program vector model in one embodiment;
Fig. 3 is a schematic flow diagram of pre-training the anchor, label and quality vector models in one embodiment;
Fig. 4 is a schematic diagram of the label and quality systems in one embodiment;
Fig. 5 is a schematic flow diagram of the program distribution method of the present invention in one embodiment;
Fig. 6 is an internal structure diagram of a computer device in one embodiment.
Specific embodiment
The contents of the present invention are described in further detail below with reference to preferred embodiments and the accompanying drawings. Obviously, the embodiments described below serve only to explain the invention and do not limit it. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention. It should be noted that, for ease of description, the drawings show only some, not all, of the contents related to the present invention.
It should be noted that the terms "first", "second" and "third" in the embodiments of the present invention merely distinguish similar objects and do not imply a particular ordering of those objects; where permitted, "first", "second" and "third" may exchange their specific order or precedence. It should be understood that objects so distinguished are interchangeable where appropriate, so that the embodiments of the present invention described herein can be implemented in sequences other than those illustrated or described herein.
The terms "comprise" and "have" in the embodiments of the present invention, and any variants thereof, are intended to cover non-exclusive inclusion: a process, method, product or device that contains a series of steps or (module) units is not limited to the steps or units listed, but optionally also includes steps or units not listed, or other steps or units inherent to such a process, method, product or device.
Reference herein to "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of this phrase at various places in the description do not necessarily all refer to the same embodiment, nor to independent or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
The program distribution method provided by the present application is applied in a terminal, which may be a personal computer, a notebook computer, or the like. A corresponding application program can run on the terminal; inputting the new program to be distributed into that application program quickly completes the distribution of the new program.
In one embodiment, as shown in Fig. 1, a program distribution method is provided. Taking its application in a terminal as an example, the method comprises the following steps:
Step S102: obtain a new program to be distributed and the program vectors of the user's preferred programs.
Here, a new program to be distributed is a program the platform has just received (or that has just been uploaded to the platform) and has not yet recommended or distributed to users. A vector (also called a Euclidean vector or geometric vector) in mathematics is a quantity with both magnitude and direction; a program vector represents a program as such a quantity and can be used to express the program's features. A program vector generally includes an anchor vector, a label vector and a quality vector: the anchor vector is computed from the co-occurrence behavior of anchors; the label vector is computed from the program's labels; and the quality vector is computed from the program's play count, click-through rate, completion rate and publish time. A user's preferred programs are typically the historical programs the user obviously prefers or likes; their program vectors are precomputed and can be stored in the terminal or in a database.
Step S104: input the new program into the pre-established program vector model and compute the new program's vector; the program vector model is obtained by training a neural network algorithm on historical programs.
Here, the pre-established program vector model is obtained by training a neural network algorithm on historical programs. Historical programs are the programs the platform has received and distributed over a past period, and may include all programs whose distribution has been completed.
Step S106: compute the cosine similarity between the new program and a preferred program according to the new program's vector and the preferred program's vector.
Step S108: when the cosine similarity is greater than a preset threshold, distribute the new program to the user associated with the preferred program.
Specifically, cosine similarity evaluates the similarity of two vectors by computing the cosine of the angle between them: the smaller the angle, the closer the cosine is to 1, the more the directions coincide, and the more similar the vectors are. In this embodiment, the cosine similarity between the new program and each preferred program is computed from the new program's vector and the preferred program's vector so as to judge their similarity, find the preferred programs similar to the new program, and push the new program to the corresponding users.
In an optional embodiment, the cosine similarity is computed by the following formula:

similarity = cos(θ) = (Σ_i A_i B_i) / (√(Σ_i A_i²) · √(Σ_i B_i²))

where similarity denotes the cosine similarity, A_i denotes the i-th component of the new program's vector, and B_i denotes the i-th component of the preferred program's vector.
In addition, the preset threshold is a value greater than 0 and less than 1, determined according to practical needs; the larger the threshold, the more similar the selected new programs are to the user's preferred programs.
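The similarity computation and threshold test above can be sketched as follows. The threshold value 0.8 is an assumption for illustration; the patent only requires a number between 0 and 1 chosen from practice.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between program vectors a and b."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def should_distribute(new_vec, preferred_vec, threshold=0.8):
    """Distribute the new program when its similarity to the user's
    preferred program exceeds the preset threshold (0.8 assumed)."""
    return cosine_similarity(new_vec, preferred_vec) > threshold

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))   # 1.0 (identical direction)
print(should_distribute([1.0, 0.2], [0.9, 0.3]))   # True
```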
The above program distribution method first obtains the program vectors of the program to be distributed and of the user's preferred programs, then inputs the new program to be distributed into the pre-established program vector model to compute the new program's vector, computes the cosine similarity between the new program and a preferred program from the two program vectors, and compares the cosine similarity with a preset threshold; when the cosine similarity is greater than the threshold, the new program is shown to be similar to the preferred program and is distributed to the corresponding user. By computing the new program's vector and then determining its similarity to the user's preferred programs, the method finds the programs most similar to the new program and recommends or distributes according to the most similar programs, so the distribution of new programs is completed quickly and with high accuracy.
In one embodiment, as shown in Fig. 2, establishing the program vector model in advance comprises:
Step S202: obtain the historical programs from a program database;
Step S204: select training samples according to the click-through rates of the historical programs;
Step S206: extract anchor vectors, label vectors and quality vectors from the training samples, and input the anchor vectors, label vectors and quality vectors into the neural network model for training to obtain the program vector model.
Specifically, training samples are selected by click-through rate: programs with a click-through rate greater than 0 are positive samples, and programs with a click-through rate of 0 are negative samples; after selection, the samples are resampled so that the numbers of positive and negative samples are roughly equal. Click-through rate = number of clicks / number of exposures; for example, 1 click on 1 exposure gives a rate of 1/1. For data accuracy, Laplace smoothing is applied by adding a constant to the denominator, e.g. 5 (adjustable to actual conditions), on the view that a program needs about 5 exposures to be statistically meaningful, so 1/(1+5) ≈ 0.1667. Five million (500W) positive samples are selected; analysis shows that the samples with more than 5 exposures and a click-through rate of 0 also number five million, and these serve as the negative samples, so that positives and negatives together constitute the training samples. Accurate sample selection makes the trained program vector model more accurate and reduces errors.
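The smoothed click-through rate and the positive/negative split described above can be sketched as follows. The smoothing constant 5 and the "more than 5 exposures" rule come from the text; the dictionary record shape is an assumption.

```python
def smoothed_ctr(clicks, exposures, k=5):
    """Laplace-smoothed click-through rate: clicks / (exposures + k).

    Adding k = 5 to the denominator means a program needs roughly 5
    exposures before its CTR is statistically meaningful: 1 click on
    1 exposure gives 1 / (1 + 5) ~= 0.1667 rather than 1.0.
    """
    return clicks / (exposures + k)

def split_samples(programs):
    """Programs with CTR > 0 are positives; programs with more than 5
    exposures but no clicks are negatives, as described in the text."""
    positives = [p for p in programs
                 if smoothed_ctr(p["clicks"], p["exposures"]) > 0]
    negatives = [p for p in programs
                 if p["exposures"] > 5 and p["clicks"] == 0]
    return positives, negatives

print(round(smoothed_ctr(1, 1), 4))  # 0.1667
```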
A program vector generally includes an anchor vector, a label vector and a quality vector; therefore, when training the model on the training samples, the anchor vectors, label vectors and quality vectors are first extracted from the historical programs and then input into the neural network model for training to obtain the program vector model.
In one embodiment, inputting the anchor vector, label vector and quality vector into the neural network model for training to obtain the program vector model comprises:
inputting the anchor vectors, label vectors and quality vectors of any two programs in the training samples into the hidden layers of the neural network model to generate two program vectors;
computing the Hadamard product of the two program vectors;
subtracting the two program vectors to obtain their difference;
computing the click-through rates of the two programs according to their quality vectors;
inputting the Hadamard product, the difference and the click-through rates into the sigmoid layer of the neural network model to compute the similarity of the two programs;
computing in turn the similarity of every pair of programs in the training samples;
computing a loss value according to the similarities and a loss function;
when the loss value is less than a preset value, obtaining the program vector model.
Specifically, during model training: (1) the input is the anchor vectors, label vectors and quality vectors of any two programs in the training samples, which are fed into a shared neural network with N hidden layers. In an optional embodiment, weighing complexity against effectiveness, the network has three layers, with a ReLU on the middle hidden layer to prevent over-fitting; the network then produces two program vectors. (2) The Hadamard product (element-wise multiplication) of the two program vectors is computed, the two vectors are subtracted, and the results are merged. Multiplying and subtracting both measure how large the difference between the two program vectors is; combining the two kinds of difference depicts the programs' dissimilarity more fully. (3) The click-through rates of the two programs are computed. (4) The results of (2) and (3) are concatenated and passed through a sigmoid layer to predict the similarity between the two programs, usually a number between 0 and 1. In this embodiment, TensorFlow is used as the main computational framework, and cross-entropy is used as the loss function; the loss over the whole training process is computed from the similarities and the loss function, and when the loss value falls below a preset value the model training is complete.
The preset value is not a unique number; it can be any value within a certain range and is usually determined by the actual demands of model training.
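The forward pass of steps (1)–(4) can be sketched in plain Python as follows. This is a simplified stand-in, not the patent's TensorFlow implementation: one shared layer replaces the three hidden layers, the dimensions and random weights are assumptions, and training (cross-entropy loss, back-propagation) is omitted.

```python
import math, random

random.seed(0)

DIM_IN, DIM_OUT = 6, 4
# Shared weights: both programs pass through the same network.
W = [[random.uniform(-0.5, 0.5) for _ in range(DIM_IN)]
     for _ in range(DIM_OUT)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(2 * DIM_OUT + 1)]

def relu(x):
    return max(0.0, x)

def embed(features):
    """Shared hidden layer with ReLU (the patent adds ReLU on a middle
    layer to curb over-fitting); the output is the program vector."""
    return [relu(sum(w * f for w, f in zip(row, features))) for row in W]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_similarity(feat_a, feat_b, ctr):
    va, vb = embed(feat_a), embed(feat_b)
    hadamard = [x * y for x, y in zip(va, vb)]  # element-wise product
    diff = [x - y for x, y in zip(va, vb)]      # element-wise difference
    combined = hadamard + diff + [ctr]          # merge with click-rate
    return sigmoid(sum(w * x for w, x in zip(w_out, combined)))

s = predict_similarity([1, 0, 1, 0, 1, 0], [1, 0, 1, 1, 0, 0], ctr=0.2)
print(0.0 < s < 1.0)  # True: the sigmoid keeps similarity in (0, 1)
```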
In one embodiment, as shown in Fig. 3, before inputting the anchor vector, label vector and quality vector into the neural network model for training to obtain the program vector model, the method comprises:
Step S302: train an anchor vector model with the word2vec algorithm according to users' preferences for the anchors of historical programs;
Step S304: train a label vector model with the word2vec algorithm according to the label system of the historical programs;
Step S306: train a quality vector model with a one-hot bucketing algorithm according to the quality system of the historical programs.
In this embodiment, the anchor, label and quality vectors are pre-trained, so that when the program vector model is obtained there is no need for random initialization, which greatly accelerates network training.
Specifically, the anchor vector model is usually trained from the relationship between anchors and the historical programs they have broadcast, i.e. from users' preferences for the anchors of historical programs (generally via a scoring system). The label vector model is trained with the word2vec method according to the label system of the historical programs. The quality vector model is trained with a one-hot bucketing algorithm according to the quality system of the historical programs. The label system is the set of all of a program's labels; the quality system is the combination of all of a program's quality parameters.
In addition, an anchor can publish different programs, and different programs have a label system and a quality system (as shown in Fig. 4). The label system includes first-level labels and scene labels: a first-level label is usually a program category (classification) label, while scene labels cover applicable gender, applicable time/scene, applicable mood and applicable education stage. First-level labels contain second-level labels, which further refine them; second-level labels include entity labels and keyword labels. The quality system mainly comprises the indices used to assess program quality: play count, click-through rate, completion rate, publish time, and so on.
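The one-hot bucketing of a continuous quality signal such as play count can be sketched as follows. The bucket boundaries are assumptions for illustration; the patent does not specify them.

```python
def bucket_index(value, boundaries):
    """Index of the bucket `value` falls into, given ascending
    right-open bucket boundaries."""
    for i, b in enumerate(boundaries):
        if value < b:
            return i
    return len(boundaries)

def one_hot_bucket(value, boundaries):
    """One-hot encode a continuous quality signal by bucketing it."""
    vec = [0] * (len(boundaries) + 1)
    vec[bucket_index(value, boundaries)] = 1
    return vec

# e.g. play-count buckets: <100, 100-1k, 1k-10k, >=10k (assumed splits)
print(one_hot_bucket(2500, [100, 1000, 10000]))  # [0, 0, 1, 0]
```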
For ease of understanding, a detailed embodiment is provided. As shown in Table 1, a second-level label is a subclass under a first-level category. The second-level labels should be of roughly uniform granularity, should form a system that is as orthogonal as possible, and should guarantee that any program can be assigned to some second-level class. Table 1 below shows a second-level label system of Lizhi programs.
Table 1: level-one and second-level label system
Entity: person names, place names, object names, song titles, etc.
Keyword: a word whose meaning is complete and clear; taken on its own it does not describe a category of programs, but describes the theme of a program or the influence the program's ideas have on people.
Scene labels: applicable gender, applicable time/scene, applicable mood, applicable educational stage, etc.
In one of the embodiments, the step of training the main broadcaster vectors with the word2vec algorithm to obtain the main broadcaster vector model comprises:
constructing a user-main broadcaster rating matrix;
analyzing the user-main broadcaster rating matrix with the roulette algorithm, and choosing a preset number of main broadcasters for each user according to the analysis result to form a main broadcaster matrix;
training the main broadcaster matrix with the word2vec algorithm to obtain the main broadcaster vector model.
The detailed process of training the main broadcaster vector model is as follows: (1) first, a user-main broadcaster rating matrix is constructed, mainly by portraying each user's fondness for a main broadcaster from behaviors such as playing, liking and following, and normalizing this preference value Score_i to 0~1, for example in the format below:
User1, NJ1, Score1
User1, NJ2, Score2
User2, NJ3, Score3
(2) Then, the list of main broadcasters each user likes is counted by user id and converted into the following format:
User1 NJ1: Score11, NJ2: Score12, NJ3: Score13………
User2 NJ2: Score21, NJ3: Score22, NJ4: Score24……
(3) Next, the roulette algorithm. Suppose the main broadcasters are NJ1, NJ2, ……, NJn, and the user's scores for them are Score1, Score2, ……, Scoren; then the probability that NJi is drawn is
Pi = Scorei / (Score1 + Score2 + …… + Scoren).
According to this random roulette algorithm, the larger a user's preference weight for a main broadcaster, the higher the probability that the main broadcaster is drawn. Considering accuracy and computational performance, 10 main broadcasters are drawn for each user from the user-main broadcaster rating matrix, giving the format below:
User1 NJ1, NJ2, NJ1, NJ3…
User2 NJ2, NJ3, NJ2, NJ4……
(4) The UserId at the front is removed, and a space is used as the separator, converting the data into the following format:
NJ1 NJ2 NJ1 NJ3……
NJ2 NJ3 NJ2 NJ4……
(5) The samples constructed above are input into word2vec to obtain the main broadcaster vector model. When a program is input into the main broadcaster vector model, the main broadcaster vector of the program can be obtained.
Word2vec is a group of related models used to generate word vectors. These models are shallow, two-layer neural networks trained to reconstruct the linguistic contexts of words: the network takes a vocabulary and guesses the words in adjacent positions, and under word2vec's bag-of-words assumption the order of words is unimportant. After training is completed, the word2vec model can be used to map each word to a vector that represents the relationships between words; this vector is the hidden layer of the neural network. That is, given a corpus, word2vec can use the optimized training model to quickly and effectively express a word in vector form.
In one of the embodiments, the quality system includes play count, click-through rate, completion rate and release time, and the step of training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain the quality vector model comprises:
choosing the play count, the click-through rate, the completion rate and the release time of the history programs and bucketing them with the one-hot bucketing algorithm to obtain the quality vector model.
Specifically, one-hot encoding is a process of converting categorical variables into a form that machine learning algorithms can easily use. In this embodiment, the play count, click-through rate, completion rate and release time of the history programs are transformed into vectors composed of 0s and 1s.
Taking the play count as an example, the play counts of the programs are sorted from small to large and the number of programs corresponding to each play count is computed. The data are split at the 20%, 40%, 60% and 80% quantiles into 5 equal parts. Suppose these four quantiles are 200, 4000, 60000 and 100000; then programs with fewer than 200 plays account for 20% of programs, those with fewer than 4000 plays account for 40%, those with fewer than 60000 plays account for 60%, and those with fewer than 100000 plays account for 80%. If some program has a play count of 321, its value is greater than the 20% quantile but less than the 40% quantile, so the vector corresponding to the program's play count is [0, 1, 0, 0, 0]. A quantile is a numerical point that divides the ordered values of a random variable into several equal parts; the specific quantile values can be selected according to actual demand.
The click-through rate, completion rate and release time are processed in the same way.
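The quantile bucketing can be illustrated with a short Python sketch; the cut points [200, 4000, 60000, 100000] come from the example above, and `one_hot_bucket` is a hypothetical helper name:

```python
import bisect

# Quantile cut points for play counts (the 20/40/60/80% quantiles),
# taken from the example in the text.
PLAY_COUNT_QUANTILES = [200, 4000, 60000, 100000]

def one_hot_bucket(value, cut_points):
    """Map a raw metric to a one-hot vector over len(cut_points)+1 buckets;
    bisect_left returns the index of the first bucket whose upper cut
    point is not below the value."""
    vec = [0] * (len(cut_points) + 1)
    vec[bisect.bisect_left(cut_points, value)] = 1
    return vec

# A program with 321 plays lies between the 20% and 40% cut points:
assert one_hot_bucket(321, PLAY_COUNT_QUANTILES) == [0, 1, 0, 0, 0]
```

The same helper would apply to click-through rate, completion rate and release time with their own cut points.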
In one of the embodiments, the label system includes keyword labels, and the step of training with the word2vec algorithm according to the co-occurrence relationships of the history programs to obtain the label vector model comprises:
extracting all keywords of the history programs and the labels corresponding to the keywords;
training the labels corresponding to the keywords with the word2vec algorithm to obtain the keyword label model.
Specifically, in the keyword training process, (1) the keywords of all programs of each main broadcaster are gathered together without deduplication and shuffled at random, so that each main broadcaster corresponds to a sequence of labels. For example, the main broadcaster-program correspondence is:
NJ1, Audio1
NJ1, Audio2
NJ1, Audio3
NJ1, Audio4
NJ2, Audio5
………
The labels corresponding to each program are:
Audio1: Tag1, Tag2
Audio2: Tag2
Audio3: Tag3
Audio4: Tag5
Audio5: Tag6
………
Then the labels corresponding to each main broadcaster are:
NJ1: Tag1, Tag2, Tag2, Tag3
NJ2: Tag5, Tag6
………
(2) The NJ at the front is removed, and the label sequences above are shuffled at random:
Tag1, Tag2, Tag2, Tag3
Tag5, Tag6
………
(3) The samples constructed above are input into word2vec to obtain the keyword vector model. When a program is input into the keyword vector model, the keyword vector of the program can be obtained.
In one of the embodiments, the label system includes level-one and second-level labels; the step of training with the word2vec algorithm according to the co-occurrence relationships of the history programs to obtain the label vector model comprises: training the level-one or second-level label model. The specific process is similar to training the keyword model and is not repeated here.
In one of the embodiments, the label system includes entity labels; the step of training with the word2vec algorithm according to the co-occurrence relationships of the history programs to obtain the label vector model comprises: training the entity model. The entity model training process is as follows: (1) a user-main broadcaster rating matrix is constructed, and the relationships between the main broadcasters and the entities are associated through the main broadcasters to construct a user-entity rating matrix.
User-main broadcaster rating matrix format:
User1, NJ1, Score1
User1, NJ2, Score2
User2, NJ3, Score3
………
Main broadcaster-program correspondence:
NJ1, Audio1
NJ1, Audio2
NJ1, Audio3
NJ1, Audio4
NJ2, Audio5
………
Program-entity correspondence:
Audio1: Entity1, Entity2
Audio2: Entity3
Audio3: Entity4, Entity2
Audio5: Entity5
………
Then the entities corresponding to each main broadcaster are:
NJ1: Entity1, Entity2, Entity3, Entity4
NJ2: Entity5
………
The user-entity rating matrix is finally constructed:
User1 Entity1: Score11, Entity2: Score12, Entity3: Score13………
User2 Entity2: Score21, Entity3: Score22, Entity4: Score24……….
(2) According to the random roulette algorithm, based on each user's preference weight for an entity (the larger the preference, the higher the probability of being drawn), the data are constructed into the following format:
User1 Entity1, Entity2, Entity1, Entity3…
User2 Entity2, Entity3, Entity2, Entity4……
(3) The UserId at the front is removed, and a space is used as the separator, converting the data into the following format:
Entity1, Entity2, Entity1, Entity3…
Entity2, Entity3, Entity2, Entity4……
(4) The samples constructed above are input into word2vec to obtain the entity vector model. When a program is input into the entity vector model, the entity word vector of the program can be obtained.
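Step (1), joining the three relations into a user-entity rating matrix, might look as follows. The max-aggregation used when an entity is reachable through several main broadcasters is an assumption the text does not specify, and all names are illustrative:

```python
# Hypothetical inputs mirroring the three formats above.
user_nj_scores = {"User1": {"NJ1": 0.9, "NJ2": 0.4}}
nj_programs = {"NJ1": ["Audio1", "Audio2"], "NJ2": ["Audio5"]}
program_entities = {
    "Audio1": ["Entity1", "Entity2"],
    "Audio2": ["Entity3"],
    "Audio5": ["Entity5"],
}

def user_entity_scores(user_nj_scores, nj_programs, program_entities):
    """Carry each user's main-broadcaster score over to every entity
    appearing in that broadcaster's programs. When an entity is reachable
    through several broadcasters, the strongest score is kept (an
    assumption; the text does not specify the aggregation)."""
    result = {}
    for user, nj_scores in user_nj_scores.items():
        entity_scores = {}
        for nj, score in nj_scores.items():
            for program in nj_programs.get(nj, []):
                for entity in program_entities.get(program, []):
                    entity_scores[entity] = max(
                        entity_scores.get(entity, 0.0), score)
        result[user] = entity_scores
    return result
```

The resulting matrix then feeds the same roulette sampling and word2vec training used for the main broadcaster vectors.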
According to the distribution method of the program described above, the present invention also provides a dissemination system of a program.
Fig. 5 is a structural schematic diagram of the dissemination system of a program of the invention in one embodiment. As shown in Fig. 5, the dissemination system of the program in this embodiment comprises:
a data obtaining module 10, for obtaining a new program to be distributed and the program vectors of the user's preferred programs;
a vector calculation module 20, for inputting the new program into a pre-established program vector model to calculate the program vector of the new program, wherein the program vector model is obtained by learning and training on history programs based on a neural network algorithm;
a cosine similarity computing module 30, for calculating the cosine similarity between the new program and a user preferred program according to the program vector of the new program and the program vector of the user preferred program;
a program distribution module 40, for distributing the new program to be distributed to the user of the preferred program when the cosine similarity is greater than a preset threshold.
In one of the embodiments, the system comprises:
a history program obtaining module, for obtaining history programs from a program database;
a training sample selection module, for selecting training samples according to the click-through rate of the history programs;
a program vector model obtaining module, for extracting main broadcaster vectors, label vectors and quality vectors from the training samples, inputting the main broadcaster vectors, the label vectors and the quality vectors into a neural network model for learning and training, and obtaining the program vector model.
In one of the embodiments, the program vector model obtaining module comprises:
a vector generation module, for inputting the main broadcaster vectors, the label vectors and the quality vectors of any two programs in the training samples into the hidden layer of the neural network model to generate two program vectors;
a Hadamard product computing module, for calculating the Hadamard product of the two program vectors;
a difference calculating module, for subtracting the two program vectors to obtain their difference;
a click-through rate computing module, for calculating the click-through rates of the two programs according to the quality vectors;
a similarity calculation module, for inputting the Hadamard product, the difference and the click-through rates into the sigmoid layer of the neural network model to calculate the similarity of the two programs, and successively calculating the similarity of every two programs in the training samples;
a loss value computing module, for calculating a loss value according to the similarities and a loss function;
a program vector model obtaining module, for obtaining the program vector model when the loss value is less than a preset value.
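The pairwise comparison these modules perform can be sketched in plain Python. The learned weights of the sigmoid layer are replaced here by hypothetical fixed values, and the click-through-rate input is only noted in a comment, so this illustrates the feature construction rather than the trained network:

```python
import math

def pairwise_features(v1, v2):
    """Combine two program vectors as the modules above describe:
    their Hadamard (element-wise) product and their difference."""
    hadamard = [a * b for a, b in zip(v1, v2)]
    diff = [a - b for a, b in zip(v1, v2)]
    return hadamard + diff

def similarity(v1, v2, weights, bias=0.0):
    """Sigmoid over a weighted sum of the combined features, standing in
    for the network's sigmoid layer; in the patent the weights are learned
    and the click-through rates of the two programs are a further input."""
    z = sum(w * f for w, f in zip(weights, pairwise_features(v1, v2))) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Two toy 2-dimensional program vectors with hypothetical fixed weights:
s = similarity([1.0, 2.0], [3.0, 4.0], [0.1, 0.1, 0.1, 0.1])
assert 0.0 < s < 1.0
```

During training, the loss computed from these similarities is back-propagated until it falls below the preset value, at which point the hidden layer yields the program vectors.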
In one of the embodiments, the system further comprises:
a main broadcaster vector model obtaining module, for training with the word2vec algorithm according to the users' preference for the main broadcasters in the history programs to obtain the main broadcaster vector model;
a label vector model obtaining module, for training with the word2vec algorithm according to the label system of the history programs to obtain the label vector model;
a quality vector model obtaining module, for training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain the quality vector model.
In one of the embodiments, the main broadcaster vector model obtaining module comprises:
a rating matrix constructing module, for constructing a user-main broadcaster rating matrix;
a main broadcaster matrix module, for analyzing the user-main broadcaster rating matrix with the roulette algorithm, and choosing a preset number of main broadcasters for each user according to the analysis result to form a main broadcaster matrix;
a main broadcaster vector model obtaining module, for training the main broadcaster matrix with the word2vec algorithm to obtain the main broadcaster vector model.
In one of the embodiments, the quality system includes play count, click-through rate, completion rate and release time, and the system comprises:
a quality vector model obtaining module, for choosing the play count, the click-through rate, the completion rate and the release time of the history programs and bucketing them with the one-hot bucketing algorithm to obtain the quality vector model.
In one of the embodiments, the label system includes keyword labels, and the system comprises:
a keyword label extraction module, for extracting all keywords of the history programs and the labels corresponding to the keywords;
a keyword label model obtaining module, for training the labels corresponding to the keywords with the word2vec algorithm to obtain the keyword label model.
For the specific limitations of the dissemination system of the program, reference may be made to the limitations of the distribution method of the program above, which are not repeated here. Each module in the dissemination system of the program described above can be realized fully or partially through software, hardware and combinations thereof. Each of the above modules can be embedded in the form of hardware in, or be independent of, a processor in a computer device, or be stored in the form of software in a memory in the computer device, so that the processor can call and execute the operations corresponding to each module.
In one embodiment, a computer device is provided; the computer device can be a server, and its internal structure can be as shown in Fig. 6. The computer device includes a processor, a memory, a network interface and a database connected through a system bus. The processor of the computer device provides calculating and control capability. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is for storing program data. The network interface of the computer device is used to communicate with external terminals through a network connection. When the computer program is executed by the processor, a distribution method of a program is realized.
It will be understood by those skilled in the art that the structure shown in Fig. 6 is only a block diagram of part of the structure relevant to the scheme of the application, and does not constitute a restriction on the computer device to which the scheme of the application is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different component layout.
In one embodiment, a computer device is provided, including a memory, a processor and a computer program stored on the memory and runnable on the processor; when executing the computer program, the processor performs the following steps:
obtaining a new program to be distributed and the program vectors of the user's preferred programs;
inputting the new program into a pre-established program vector model to calculate the program vector of the new program, wherein the program vector model is obtained by learning and training on history programs based on a neural network algorithm;
calculating the cosine similarity between the new program and a user preferred program according to the program vector of the new program and the program vector of the user preferred program;
when the cosine similarity is greater than a preset threshold, distributing the new program to be distributed to the user of the preferred program.
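The distribution decision in the steps above reduces to a cosine-similarity test against a preset threshold; a minimal sketch (the threshold of 0.8 is illustrative, not from the text):

```python
import math

def cosine_similarity(a, b):
    """Standard cosine similarity between two program vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def should_distribute(new_vec, preferred_vec, threshold=0.8):
    """Distribute the new program to the user when its vector is close
    enough to a program the user already prefers; the patent only says
    the threshold is preset, so 0.8 is an illustrative value."""
    return cosine_similarity(new_vec, preferred_vec) > threshold
```

Because cosine similarity depends only on vector direction, two programs with similar label, host and quality characteristics score near 1 regardless of vector magnitude.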
In one embodiment, when the processor executes the computer program, the program vector model is further obtained through the following steps:
obtaining the history programs from a program database;
selecting training samples according to the click-through rate of the history programs;
extracting main broadcaster vectors, label vectors and quality vectors from the training samples, inputting the main broadcaster vectors, the label vectors and the quality vectors into a neural network model for learning and training, and obtaining the program vector model.
In one embodiment, when the processor executes the computer program, the step of inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training to obtain the program vector model further comprises:
inputting the main broadcaster vectors, the label vectors and the quality vectors of any two programs in the training samples into the hidden layer of the neural network model to generate two program vectors;
calculating the Hadamard product of the two program vectors;
subtracting the two program vectors to obtain their difference;
calculating the click-through rates of the two programs according to the quality vectors;
inputting the Hadamard product, the difference and the click-through rates into the sigmoid layer of the neural network model to calculate the similarity of the two programs;
successively calculating the similarity of every two programs in the training samples;
calculating a loss value according to the similarities and a loss function;
obtaining the program vector model when the loss value is less than a preset value.
In one embodiment, when the processor executes the computer program, before the step of inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training to obtain the program vector model, the following steps are also performed:
training with the word2vec algorithm according to the users' preference for the main broadcasters in the history programs to obtain the main broadcaster vector model;
training with the word2vec algorithm according to the label system of the history programs to obtain the label vector model;
training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain the quality vector model.
In one embodiment, when the processor executes the computer program, the step of training the main broadcaster vectors with the word2vec algorithm to obtain the main broadcaster vector model further comprises:
constructing a user-main broadcaster rating matrix;
analyzing the user-main broadcaster rating matrix with the roulette algorithm, and choosing a preset number of main broadcasters for each user according to the analysis result to form a main broadcaster matrix;
training the main broadcaster matrix with the word2vec algorithm to obtain the main broadcaster vector model.
In one embodiment, when the processor executes the computer program, the quality system includes play count, click-through rate, completion rate and release time, and the step of training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain the quality vector model further comprises:
choosing the play count, the click-through rate, the completion rate and the release time of the history programs and bucketing them with the one-hot bucketing algorithm to obtain the quality vector model.
In one embodiment, when the processor executes the computer program, the label system includes keyword labels, and the step of training with the word2vec algorithm according to the co-occurrence relationships of the history programs to obtain the label vector model further comprises:
extracting all keywords of the history programs and the labels corresponding to the keywords;
training the labels corresponding to the keywords with the word2vec algorithm to obtain the keyword label model.
In one embodiment, a computer readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the following steps are performed:
obtaining a new program to be distributed and the program vectors of the user's preferred programs;
inputting the new program into a pre-established program vector model to calculate the program vector of the new program, wherein the program vector model is obtained by learning and training on history programs based on a neural network algorithm;
calculating the cosine similarity between the new program and a user preferred program according to the program vector of the new program and the program vector of the user preferred program;
when the cosine similarity is greater than a preset threshold, distributing the new program to be distributed to the user of the preferred program.
In one embodiment, when the computer program is executed by the processor, the manner of pre-establishing the program vector model further comprises:
obtaining the history programs from a program database;
selecting training samples according to the click-through rate of the history programs;
extracting main broadcaster vectors, label vectors and quality vectors from the training samples, inputting them into a neural network model for learning and training, and obtaining the program vector model.
In one embodiment, when the computer program is executed by the processor, the step of inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training to obtain the program vector model further comprises:
inputting the main broadcaster vectors, the label vectors and the quality vectors of any two programs in the training samples into the hidden layer of the neural network model to generate two program vectors;
calculating the Hadamard product of the two program vectors;
subtracting the two program vectors to obtain their difference;
calculating the click-through rates of the two programs according to the quality vectors;
inputting the Hadamard product, the difference and the click-through rates into the sigmoid layer of the neural network model to calculate the similarity of the two programs;
successively calculating the similarity of every two programs in the training samples;
calculating a loss value according to the similarities and a loss function;
obtaining the program vector model when the loss value is less than a preset value.
In one embodiment, when the computer program is executed by the processor, before the step of inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training to obtain the program vector model, the following steps are also performed:
training with the word2vec algorithm according to the users' preference for the main broadcasters in the history programs to obtain the main broadcaster vector model;
training with the word2vec algorithm according to the label system of the history programs to obtain the label vector model;
training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain the quality vector model.
In one embodiment, when the computer program is executed by the processor, the step of training the main broadcaster vectors with the word2vec algorithm to obtain the main broadcaster vector model further comprises:
constructing a user-main broadcaster rating matrix;
analyzing the user-main broadcaster rating matrix with the roulette algorithm, and choosing a preset number of main broadcasters for each user according to the analysis result to form a main broadcaster matrix;
training the main broadcaster matrix with the word2vec algorithm to obtain the main broadcaster vector model.
In one embodiment, when the computer program is executed by the processor, the quality system includes play count, click-through rate, completion rate and release time, and the step of training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain the quality vector model further comprises:
choosing the play count, the click-through rate, the completion rate and the release time of the history programs and bucketing them with the one-hot bucketing algorithm to obtain the quality vector model.
In one embodiment, when the computer program is executed by the processor, the label system includes keyword labels, and the step of training with the word2vec algorithm according to the co-occurrence relationships of the history programs to obtain the label vector model further comprises:
extracting all keywords of the history programs and the labels corresponding to the keywords;
training the labels corresponding to the keywords with the word2vec algorithm to obtain the keyword label model.
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments can be completed by instructing relevant hardware through a computer program; the computer program can be stored in a non-volatile computer readable storage medium, and when executed may include the processes of the embodiments of the above methods. Any reference to memory, storage, database or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronization link (Synchlink) DRAM (SLDRAM), memory bus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM) and memory bus dynamic RAM (RDRAM), etc.
The technical features of the embodiments described above can be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments have been described; however, as long as a combination of these technical features involves no contradiction, it should be considered to be within the scope described in this specification.
The embodiments described above express only several implementations of the present invention, and their description is relatively specific and detailed, but they shall not therefore be construed as limiting the scope of the patent. It should be pointed out that, for those of ordinary skill in the art, various modifications and improvements can be made without departing from the concept of the invention, and these all belong to the protection scope of the invention. Therefore, the scope of protection of this patent shall be subject to the appended claims.
Claims (10)
1. A distribution method of a program, characterized in that the method comprises:
obtaining a new program to be distributed and the program vectors of the user's preferred programs;
inputting the new program into a pre-established program vector model to calculate the program vector of the new program, wherein the program vector model is obtained by learning and training on history programs based on a neural network algorithm;
calculating the cosine similarity between the new program and a user preferred program according to the program vector of the new program and the program vector of the user preferred program;
when the cosine similarity is greater than a preset threshold, distributing the new program to be distributed to the user of the preferred program.
2. The distribution method of a program according to claim 1, characterized in that the manner of pre-establishing the program vector model comprises:
obtaining the history programs from a program database;
selecting training samples according to the click-through rate of the history programs;
extracting main broadcaster vectors, label vectors and quality vectors from the training samples, inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training, and obtaining the program vector model.
3. The distribution method of a program according to claim 2, characterized in that the step of inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training to obtain the program vector model comprises:
inputting the main broadcaster vectors, the label vectors and the quality vectors of any two programs in the training samples into the hidden layer of the neural network model to generate two program vectors;
calculating the Hadamard product of the two program vectors;
subtracting the two program vectors to obtain their difference;
calculating the click-through rates of the two programs according to the quality vectors;
inputting the Hadamard product, the difference and the click-through rates into the sigmoid layer of the neural network model to calculate the similarity of the two programs;
successively calculating the similarity of every two programs in the training samples;
calculating a loss value according to the similarities and a loss function;
obtaining the program vector model when the loss value is less than a preset value.
4. The distribution method of a program according to claim 2, characterized in that before the step of inputting the main broadcaster vectors, the label vectors and the quality vectors into the neural network model for learning and training to obtain the program vector model, the method comprises:
training with the word2vec algorithm according to the users' preference for the main broadcasters in the history programs to obtain a main broadcaster vector model;
training with the word2vec algorithm according to the label system of the history programs to obtain a label vector model;
training with the one-hot bucketing algorithm according to the quality system of the history programs to obtain a quality vector model.
5. The distribution method of a program according to claim 4, characterized in that the step of training the main broadcaster vectors with the word2vec algorithm to obtain the main broadcaster vector model comprises:
constructing a user-main broadcaster rating matrix;
analyzing the user-main broadcaster rating matrix with the roulette algorithm, and choosing a preset number of main broadcasters for each user according to the analysis result to form a main broadcaster matrix;
training the main broadcaster matrix with the word2vec algorithm to obtain the main broadcaster vector model.
6. The program distribution method according to claim 4, wherein the quality system includes a play count, a click-through rate, a completion rate and a release time, and the step of training the quality vector model using the one-hot bucketing algorithm according to the quality system of the history programs comprises:
Bucketing the play count, the click-through rate, the completion rate and the release time of the history programs using the one-hot bucketing algorithm to obtain the quality vector model.
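The one-hot bucketing of the four quality statistics can be sketched as follows. The bucket boundaries are illustrative guesses, since the claim does not specify concrete thresholds.

```python
def one_hot_bucket(value, boundaries):
    # Map a raw quality statistic into one of len(boundaries) + 1
    # buckets and encode the bucket index as a one-hot vector.
    idx = sum(1 for b in boundaries if value >= b)
    vec = [0] * (len(boundaries) + 1)
    vec[idx] = 1
    return vec

def quality_vector(play_count, ctr, completion_rate, age_days):
    # Concatenate the one-hot encodings of all four statistics.
    # All boundary values below are assumptions for illustration.
    return (one_hot_bucket(play_count, [100, 1000, 10000])
            + one_hot_bucket(ctr, [0.01, 0.05, 0.1])
            + one_hot_bucket(completion_rate, [0.25, 0.5, 0.75])
            + one_hot_bucket(age_days, [7, 30, 90]))
```

Bucketing turns unbounded counts and skewed rates into fixed-length sparse vectors that a neural network can consume directly.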
7. The program distribution method according to claim 4, wherein the label system includes keyword labels, and the step of training the label vector model using the word2vec algorithm according to the co-occurrence relation of the history programs comprises:
Extracting all keywords of the history programs and the labels corresponding to the keywords;
Training the labels corresponding to the keywords using the word2vec algorithm to obtain a keyword label model.
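The keyword and label extraction of this claim can be sketched as follows. The per-program dict layout is an illustrative assumption, and the word2vec training over the resulting label "sentences" is omitted.

```python
def label_sentences(history_programs):
    # Each history program contributes one "sentence": the labels that
    # correspond to its keywords.  word2vec would then learn label
    # vectors from label co-occurrence within a program (not shown).
    keyword_to_label = {}
    sentences = []
    for program in history_programs:
        sentence = []
        for keyword, label in program["keywords"].items():
            keyword_to_label[keyword] = label
            sentence.append(label)
        sentences.append(sentence)
    return keyword_to_label, sentences
```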
8. A program distribution system, comprising:
A data obtaining module, configured to obtain a new program to be distributed and the program vector of a user preference program;
A vector calculation module, configured to input the new program into a pre-established program vector model to calculate the program vector of the new program, wherein the program vector model is obtained by learning and training on history programs based on a neural network algorithm;
A cosine similarity calculation module, configured to calculate the cosine similarity between the new program and the user preference program according to the program vector of the new program and the program vector of the user preference program;
A program distribution module, configured to distribute the new program to be distributed to the user having the user preference program when the cosine similarity is greater than a preset threshold.
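The cosine-similarity test applied by the distribution module can be sketched as follows. The threshold value 0.8 is illustrative only; the claim merely requires the similarity to exceed a preset threshold.

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between the new program's vector and the
    # user preference program's vector.
    dot = sum(a * b for a, b in zip(u, v))
    norm = (math.sqrt(sum(a * a for a in u))
            * math.sqrt(sum(b * b for b in v)))
    return dot / norm if norm else 0.0

def should_distribute(new_vec, pref_vec, threshold=0.8):
    # Distribute the new program only when the similarity exceeds the
    # preset threshold (0.8 here is an assumed value).
    return cosine_similarity(new_vec, pref_vec) > threshold
```

Because cosine similarity depends only on direction, two programs with proportional vectors score 1.0 regardless of vector magnitude.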
9. A computer device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910448059.XA CN110213660B (en) | 2019-05-27 | 2019-05-27 | Program distribution method, system, computer device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910448059.XA CN110213660B (en) | 2019-05-27 | 2019-05-27 | Program distribution method, system, computer device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110213660A true CN110213660A (en) | 2019-09-06 |
CN110213660B CN110213660B (en) | 2021-08-20 |
Family
ID=67788888
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910448059.XA Active CN110213660B (en) | 2019-05-27 | 2019-05-27 | Program distribution method, system, computer device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110213660B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110851647A (en) * | 2019-09-29 | 2020-02-28 | 广州荔支网络技术有限公司 | Intelligent distribution method, device and equipment for audio content flow and readable storage medium |
CN110909202A (en) * | 2019-10-28 | 2020-03-24 | 广州荔支网络技术有限公司 | Audio value evaluation method and device and readable storage medium |
CN113742564A (en) * | 2020-05-29 | 2021-12-03 | 北京沃东天骏信息技术有限公司 | Target resource pushing method and device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110030012A1 (en) * | 2004-07-30 | 2011-02-03 | Diaz Perez Milton | Method of common addressing of tv program content on internet and tv services platform of a digital tv services provider |
US20130145276A1 (en) * | 2011-12-01 | 2013-06-06 | Nokia Corporation | Methods and apparatus for enabling context-aware and personalized web content browsing experience |
CN103686237A (en) * | 2013-11-19 | 2014-03-26 | 乐视致新电子科技(天津)有限公司 | Method and system for recommending video resource |
CN105898420A (en) * | 2015-01-09 | 2016-08-24 | 阿里巴巴集团控股有限公司 | Video recommendation method and device, and electronic equipment |
CN108009528A (en) * | 2017-12-26 | 2018-05-08 | 广州广电运通金融电子股份有限公司 | Face authentication method, device, computer equipment and storage medium based on Triplet Loss |
CN109558512A (en) * | 2019-01-24 | 2019-04-02 | 广州荔支网络技术有限公司 | A kind of personalized recommendation method based on audio, device and mobile terminal |
Also Published As
Publication number | Publication date |
---|---|
CN110213660B (en) | 2021-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110704674B (en) | Video playing integrity prediction method and device | |
US20210271975A1 (en) | User tag generation method and apparatus, storage medium, and computer device | |
CN104679743B (en) | Method and device for determining a user preference model | |
CN103246672B (en) | Method and device for personalized recommendation to users | |
CN104050247B (en) | Method for fast retrieval of massive videos | |
CN103744928B (en) | Network video classification method based on history access records | |
CN104598518B (en) | Content pushing method and device | |
CN110213660A (en) | Program distribution method, system, computer device and storage medium | |
CN107330719A (en) | Insurance product recommendation method and system | |
CN107944986A (en) | O2O commodity recommendation method, system and device | |
CN112559900B (en) | Product recommendation method and device, computer equipment and storage medium | |
CN106326413A (en) | Personalized video recommending system and method | |
CN104239552B (en) | Method and system for generating and providing associated keywords | |
CN110737859A (en) | UP host matching method and device | |
CN114282054A (en) | Video recommendation method and device, computer equipment and storage medium | |
Galvão et al. | Forecasting movie box office profitability | |
CN112749330B (en) | Information pushing method, device, computer equipment and storage medium | |
CN109816438A (en) | Information-pushing method and device | |
CN108846097A (en) | User interest tag representation method, item recommendation method, device and equipment | |
WO2020135642A1 (en) | Model training method and apparatus employing generative adversarial network | |
CN111125429A (en) | Video pushing method and device and computer readable storage medium | |
CN108897750A (en) | Merge the personalized location recommendation method and equipment of polynary contextual information | |
CN107920260A (en) | Digital cable customers behavior prediction method and device | |
CN111858969A (en) | Multimedia data recommendation method and device, computer equipment and storage medium | |
CN114339362A (en) | Video bullet screen matching method and device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||