CN110266745A - Information flow recommendation method, apparatus, device and storage medium based on deep network - Google Patents
- Publication number
- CN110266745A CN110266745A CN201910175634.3A CN201910175634A CN110266745A CN 110266745 A CN110266745 A CN 110266745A CN 201910175634 A CN201910175634 A CN 201910175634A CN 110266745 A CN110266745 A CN 110266745A
- Authority
- CN
- China
- Prior art keywords
- document
- evaluation index
- information flow
- score
- deep network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/55—Push-based network services
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
This application discloses an information flow recommendation method, apparatus, device and storage medium based on a deep network, belonging to the field of data processing. The method includes: receiving an information recommendation request of a target account; invoking a target deep network model to calculate sub-quality scores of candidate documents on n evaluation indexes, n being an integer greater than 1; calculating a weighted quality score of each document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes; and generating a recommended information flow according to the weighted quality scores. The application evaluates documents on multiple evaluation indexes, so that the recommended information flow performs well on all of them, avoiding the problem that an information flow generated by a neural network model with a single evaluation index obtains good results only on that index and may perform poorly on the other indexes.
Description
Technical field
This application relates to the field of data processing, and in particular to an information flow recommendation method, apparatus, device and storage medium based on a deep network.
Background art
An information flow is stream data formed by arranging multiple pieces of information in sequence. An information flow recommender system includes a server and a terminal; the server pushes one or more information flows to the terminal.
In the related art, a deep network model is provided in the information flow recommender system. When the server needs to recommend an information flow to the terminal, the server uses the deep network model to predict the user's click-through rate for each document in a candidate document set, selects n documents in descending order of click-through rate, and pushes the information flow formed by the n documents to the terminal, which displays the information flow.
The above recommender system is optimized on a single evaluation index (click-through rate). Although this can increase the number of items a user reaches, it also tends to recommend articles with attractive titles but low-quality content, causing a decline in evaluation indexes such as sharing, appreciation and comment interaction.
Summary of the invention
The embodiments of the present application provide an information flow recommendation method, apparatus, device and storage medium based on a deep network, which can be used to solve the problem that a recommender system optimized on a single evaluation index easily causes other evaluation indexes to decline. The technical solution is as follows:
According to one aspect of the disclosure, an information flow recommendation method based on a deep network is provided. The method includes:
receiving an information recommendation request of a target account;
generating a candidate document set of the target account, the candidate document set including at least two documents;
invoking a target deep network model to calculate sub-quality scores of each document on n evaluation indexes, n being an integer greater than 1;
calculating a weighted quality score of the document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes;
selecting target documents from the candidate document set according to the weighted quality scores, and generating the information flow according to the target documents;
recommending the information flow to the terminal corresponding to the target account.
According to another aspect of the disclosure, an information flow recommendation apparatus based on a deep network is provided. The apparatus includes:
a receiving module, configured to receive an information recommendation request of a target account;
a generation module, configured to generate a candidate document set of the target account, the candidate document set including at least two documents;
an invoking module, configured to invoke a target deep network model to calculate sub-quality scores of each document on n evaluation indexes, n being an integer greater than 1;
a computing module, configured to calculate a weighted quality score of the document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes;
a generation module, configured to select target documents from the candidate document set according to the weighted quality scores, and generate the information flow according to the target documents;
a recommending module, configured to recommend the information flow to the terminal corresponding to the target account.
According to another aspect of the disclosure, a server is provided. The server includes a processor and a memory; the memory stores at least one instruction, at least one program, a code set or an instruction set, which is loaded and executed by the processor to implement the information flow recommendation method based on a deep network as described above.
According to another aspect of the disclosure, a computer-readable storage medium is provided. The computer-readable storage medium stores at least one instruction, at least one program, a code set or an instruction set, which is loaded and executed by a processor to implement the information flow recommendation method based on a deep network as described above.
The beneficial effects brought by the technical solutions provided in the embodiments of the present application include at least the following. The sub-quality scores of a document on n evaluation indexes are calculated by the same target deep network model; the weighted quality score of the document is then obtained from the n sub-quality scores and the corresponding weights; and the information flow is generated after sorting the documents by weighted quality score. Documents are thereby evaluated on multiple evaluation indexes, so that the recommended information flow performs well on all of them, avoiding the problem that an information flow generated by a neural network model with a single evaluation index obtains good results only on that index and may perform poorly on the other indexes.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a block diagram of a computer system provided by an exemplary embodiment of the application;
Fig. 2 is a flowchart of an information flow recommendation method based on a deep network provided by an exemplary embodiment of the application;
Fig. 3 is a schematic diagram of the online recommendation process of an information flow recommendation method based on a deep network provided by an exemplary embodiment of the application;
Fig. 4 is a structural diagram of a target deep network model provided by an exemplary embodiment of the application;
Fig. 5 is a flowchart of a training method of a target deep network model provided by an exemplary embodiment of the application;
Fig. 6 is an application schematic diagram of an information flow recommendation method based on a deep network provided by an exemplary embodiment of the application;
Fig. 7 is an interface schematic diagram of an information flow recommendation method based on a deep network provided by an exemplary embodiment of the application;
Fig. 8 is a structural block diagram of an information flow recommendation apparatus based on a deep network provided by an exemplary embodiment of the application;
Fig. 9 is a structural block diagram of an information flow recommendation apparatus based on a deep network provided by an exemplary embodiment of the application;
Fig. 10 is a structural block diagram of a server provided by an exemplary embodiment of the application.
Detailed description of embodiments
To make the purposes, technical solutions and advantages of the application clearer, the embodiments of the application are described in further detail below with reference to the drawings.
At present, most recommender systems deployed in industry are still optimized on a single evaluation index: news recommender systems target click-through rate, short-video recommender systems target play rate, e-commerce recommender systems target conversion rate, and so on. The effectiveness indicator these recommender systems prioritize is called the main index. For recommender systems whose human-computer interaction ecology is very simple, focusing only on optimizing the main index is feasible. But in recommender systems with a more complex interaction ecology, the ways users interact with documents (items) are often very rich. Taking articles as an example, a user can click an article he likes, appreciate it, comment on it, share it with friends or post it to a social display platform (such as Moments), or bookmark it; for an article he dislikes, the user can also give negative feedback. If such a recommender system uses the traditional approach and ranks recommendation results only by the click-through-rate index, the reach count improves, but clickbait and low-quality articles are easily recommended, causing indexes such as sharing, appreciation and comment interaction to decline. Such a recommender system affects the acquisition and retention of new users and, in the long run, is not conducive to establishing a healthy social product ecology.
Facing this situation, a traditional industry practice is to adjust the recommendation list produced by single-index ranking through manually crafted rules, so as to mitigate the drawbacks of single-index optimization. However, manual rules are difficult to adapt to a big-data environment and often fall short of the expected effect. In view of this, the embodiments of the present application provide an information flow recommendation scheme based on a deep network.
Fig. 1 shows a structural block diagram of a computer system 100 provided by an exemplary embodiment of the application. The computer system 100 may be an instant messaging system, a news push system, a shopping system, an online video system, a social application that aggregates people by topic, channel or circle, or another application system with social attributes; the embodiments of the application do not limit this. The computer system 100 includes a first terminal 120, a server 140 and a second terminal 160.
The first terminal 120 is connected to the server 140 through a wireless or wired network. The first terminal 120 may be at least one of a smartphone, a game console, a desktop computer, a tablet computer, an e-book reader, an MP3 player, an MP4 player and a laptop portable computer. The first terminal 120 has installed and runs an application supporting social attributes and information recommendation. The application may be any of an instant messaging system, a news push system, a shopping system, an online video system, a social application that aggregates people by topic, channel or circle, or another application system with social attributes. The first terminal 120 is the terminal used by a first user, and a first account is logged in to the application running on the first terminal 120.
The first terminal 120 is connected to the server 140 through a wireless or wired network.
The server 140 includes at least one of a single server, multiple servers, a cloud computing platform and a virtualization center. The server 140 provides background services for applications supporting information recommendation. Optionally, the server 140 undertakes the main computing work while the first terminal 120 and the second terminal 160 undertake secondary computing work; or the server 140 undertakes secondary computing work while the first terminal 120 and the second terminal 160 undertake the main computing work; or the server 140, the first terminal 120 and the second terminal 160 perform cooperative computing using a distributed computing architecture.
Optionally, the server 140 includes an access server 142 and an information recommendation server 144. The access server 142 provides access and information recommendation services for the first terminal 120 and the second terminal 160, and sends recommended information (at least one of articles, pictures, audio and video) from the information recommendation server 144 to a terminal (the first terminal 120 or the second terminal 160). There may be one or more information recommendation servers 144. When there are multiple information recommendation servers 144, at least two of them may provide different services, and/or at least two of them may provide the same service, for example in a load-balancing manner; the embodiments of the application do not limit this.
The second terminal 160 has installed and runs an application supporting social attributes and information recommendation. The application may be any of an instant messaging system, a news push system, a shopping system, an online video system, a social application that aggregates people by topic, channel or circle, or another application system with social attributes. The second terminal 160 is the terminal used by a second user. A second account is logged in to the application on the second terminal 160.
Optionally, the first account and the second account are in a virtual social network that includes a social relationship chain between the first account and the second account. The virtual social network may be provided by a single social platform, or jointly by multiple social platforms having an association relationship (such as an authorized-login relationship); the embodiments of the application do not limit the concrete form of the virtual social network. Optionally, the first account and the second account may belong to the same team or the same organization, have a friend relationship, or have temporary communication permission; alternatively, the first account and the second account may be strangers. In short, the virtual social network provides a one-way or two-way message transmission path between the first account and the second account.
Optionally, the applications installed on the first terminal 120 and the second terminal 160 are identical, or the applications installed on the two terminals are the same type of application on different operating system platforms, or are different applications that support information interworking. The different operating systems include Apple operating systems, Android, Linux, Windows and the like.
The first terminal 120 may refer to one of multiple terminals, and the second terminal 160 may refer to one of multiple terminals; this embodiment is illustrated using only the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include at least one of a smartphone, a game console, a desktop computer, a tablet computer, an e-book reader, an MP3 player, an MP4 player and a laptop portable computer. The following embodiments are illustrated with the first terminal 120 and/or the second terminal 160 being smartphones and a friend relationship chain existing between the first account and the second account.
Those skilled in the art will appreciate that the number of terminals may be larger or smaller. For example, there may be only one terminal, or dozens, hundreds or more terminals, in which case the computer system further includes other terminals 180, on one or more of which a second account having a friend relationship with the first account is logged in. The embodiments of the application do not limit the number or device types of the terminals.
In an illustrative example, the first terminal 120 runs an application supporting social attributes and information recommendation. When the first user starts the application and opens the information flow display interface, the application sends an information recommendation request to the server 140. The server 140 may screen out, as a candidate document set, articles that a second account having a friend relationship with the first account may be interested in (for example, articles predicted to be of interest according to the user profile, or that the second account liked, commented on or shared), and then use the target deep network model provided in the following embodiments to comprehensively evaluate each document in the candidate document set on n evaluation indexes, obtaining a weighted quality score for each document. The server extracts K documents from the candidate document set in descending order of weighted quality score to generate an information flow, and pushes the information flow to the application on the first terminal 120, which displays it. In other examples, the server 140 may also generate the candidate document set in other ways, without limitation.
Since the above target deep network model considers the quality of a document comprehensively from n evaluation indexes, it can avoid the problem that an information flow generated by a neural network model with a single evaluation index obtains good results only on that index and may perform poorly on the other indexes.
Fig. 2 is a flowchart of an information flow recommendation method based on a deep network provided by an illustrative embodiment of the application. The method may be executed by the computer system shown in Fig. 1 and includes the following steps.
Step 201: the terminal sends an information recommendation request of a target account to the server.
An application is installed on the terminal, and the target account is logged in to the application. The application has the function of obtaining and displaying information flows. The application sends the information recommendation request to the server through the terminal; the request asks the server to recommend an information flow to the target account. For example, the application sends the information recommendation request to the server when the information flow display interface in the application is triggered.
Step 202: the server receives the information recommendation request of the target account.
Optionally, the server extracts the target account from the information recommendation request.
Step 203: the server generates a candidate document set of the target account, the candidate document set including at least two documents.
A document is information that can be displayed in the application; a document includes at least one of a picture, audio, video, text and an AR element. When recommending an information flow to the target account, the server generates the candidate document set of the target account, which includes multiple documents the target account may be interested in.
Optionally, the server generates the candidate document set according to the user profile of the target account, where the user profile includes information such as the gender, age, region, education and occupation of the target account; or the server generates the candidate document set according to the historical browsing records of the target account; or according to the historical consumption records of the target account. The embodiments of the application do not limit how the server generates the candidate document set.
Optionally, when the target account has one or more friend accounts, the server may also generate the candidate document set in at least one of the following ways: adding documents clicked by a friend account to the candidate document set; adding documents on which a friend account stayed for a long time to the candidate document set; using documents liked by a friend account as candidate documents; adding documents shared by a friend account to the candidate document set.
Optionally, the server screens out, from a candidate document pool, m documents the target account may be interested in to form the candidate document set, where m is an integer greater than 1.
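As a minimal illustrative sketch of the friend-based candidate generation above (the function name, event format and dwell-time threshold are assumptions for illustration, not specified by the patent):

```python
# Documents a friend account clicked, liked or shared -- or viewed with a long
# dwell time -- are added to the candidate set, deduplicated in arrival order.

def build_candidate_set(friend_events, dwell_threshold=30.0, limit=None):
    """friend_events: list of (doc_id, action, dwell_seconds) tuples."""
    candidates = []
    seen = set()
    for doc_id, action, dwell in friend_events:
        keep = (
            action in ("click", "like", "share")
            or (action == "view" and dwell >= dwell_threshold)
        )
        if keep and doc_id not in seen:
            seen.add(doc_id)
            candidates.append(doc_id)
    return candidates[:limit] if limit else candidates

events = [
    ("d1", "click", 5.0),
    ("d2", "view", 45.0),   # long dwell time, kept
    ("d3", "view", 3.0),    # too short, filtered out
    ("d4", "share", 0.0),
    ("d1", "like", 0.0),    # duplicate of d1, skipped
]
print(build_candidate_set(events))  # ['d1', 'd2', 'd4']
```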
Step 204: the server invokes the target deep network model to calculate the sub-quality scores of each document on n evaluation indexes.
The target deep network model is a model for calculating quality scores of a document on n evaluation indexes, where n is an integer greater than 1. The n evaluation indexes include but are not limited to at least two of click-through rate, play rate, conversion rate, appreciation rate, comment rate, like rate and share rate.
The target deep network model is a single deep network model. With reference to Fig. 3, by invoking the same target deep network model, the server can calculate the sub-quality scores of a document on all n evaluation indexes. For example, for each document in the candidate document set, it calculates sub-quality score 1 on evaluation index 1, sub-quality score 2 on evaluation index 2, ..., and sub-quality score n on evaluation index n.
Step 205: the server calculates the weighted quality score of the document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes.
Each of the n evaluation indexes also has a corresponding weight. In an illustrative example, the weights of the n evaluation indexes sum to 1. For document i, the server multiplies each sub-quality score of document i on the n evaluation indexes by the corresponding weight, and then adds the n products to obtain the weighted quality score of the document.
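The weighted-sum computation in step 205 can be sketched as follows (the example scores, weights and n = 3 are purely illustrative):

```python
# Weighted quality score: multiply each sub-quality score by its index weight
# and sum the n products. The weights sum to 1, as in the example above.

def weighted_quality_score(sub_scores, weights):
    assert len(sub_scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9  # weights add up to 1
    return sum(s * w for s, w in zip(sub_scores, weights))

# Hypothetical document scored on 3 indexes (e.g. click, share, comment rate).
scores = [0.8, 0.2, 0.5]
weights = [0.5, 0.3, 0.2]
print(round(weighted_quality_score(scores, weights), 6))  # 0.56
```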
Step 206: the server selects target documents from the candidate document set according to the weighted quality scores.
Optionally, the server sorts the documents in the candidate document set by weighted quality score, and determines the top k documents as the target documents, where k is an integer not greater than m. The top k documents may be all documents in the candidate document set, or a subset of them. When the top k documents are a subset, k may be a fixed number, such as 100, or the documents may be extracted by ratio, for example determining the top 80% of the sorted candidate document set as the top k documents.
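The top-k selection in step 206, with both the fixed-count and fixed-ratio variants, can be sketched as (the scores and ratio below are illustrative):

```python
# Rank candidate documents by weighted quality score (descending) and keep the
# top k, either as a fixed count or as a fixed ratio of the candidate set.

def select_target_documents(doc_scores, k=None, ratio=None):
    """doc_scores: dict mapping doc_id -> weighted quality score."""
    ranked = sorted(doc_scores, key=doc_scores.get, reverse=True)
    if k is None and ratio is not None:
        k = max(1, int(len(ranked) * ratio))
    return ranked[:k]

doc_scores = {"a": 0.31, "b": 0.74, "c": 0.55, "d": 0.12, "e": 0.63}
print(select_target_documents(doc_scores, k=3))        # ['b', 'e', 'c']
print(select_target_documents(doc_scores, ratio=0.8))  # top 80% of 5 docs
```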
Step 207: the server outputs the information flow according to the target documents.
Optionally, the information flow includes the k target documents arranged in order. With reference to Fig. 3, document index 1 corresponds to the document with the highest weighted quality score, document index 2 corresponds to the document with the second-highest weighted quality score, and so on; the server generates the information flow from the top k target documents in order. Optionally, the server sends the entire information flow to the terminal, or the server divides the information flow into multiple parts and sends it to the terminal in batches, each batch carrying one part of the information flow.
Step 208: the terminal displays the information flow.
Optionally, the application displays the information flow after receiving it.
In conclusion method provided in this embodiment, evaluates document at n by the same target depth network model
Protonatomic mass score in index is calculated, and obtains the weighting matter of document further according to n sub- mass fractions and corresponding weight
Score is measured, information flow is generated after being ranked up according to weighted quality score to document, thus to document in multiple evaluation indexes
It is evaluated, so that the information flow recommended has preferable performance in multiple evaluation indexes, avoids the mind of single evaluation index
Preferable effect can be only obtained in single evaluation index through network model information flow generated, and may be in other evaluations
The problem of poor effect in index.
Optionally, weight corresponding to each evaluation index is adjustable, can have needle by adjusting different weights
Preferable effect is obtained in some or certain several objective appraisal indexs to property.
The embodiments of the present application propose a target deep network model that can score a document on n evaluation indexes. Fig. 4 shows the model structure of the deep network model. The target deep network model includes a shared network part 220 and n separate network parts 240.
The shared network part 220 extracts a common feature vector from the input data, which includes the document features of a document. The shared network part 220 includes, connected in sequence, a shared feature layer 221, a shared mapping layer 222 and a shared fully connected layer 223. The shared feature layer 221 extracts sub-feature vectors of at least two dimensions corresponding to the input data; the shared mapping layer 222 performs dimensionality-reduction mapping on the sub-feature vectors of the at least two dimensions to obtain the mapped sub-feature vectors; the shared fully connected layer 223 performs a full connection operation on the mapped sub-feature vectors and outputs the result as the common feature vector.
The n separate network parts 240 are arranged side by side. The i-th separate network part calculates the sub-quality score of the document on the i-th evaluation index from the common feature vector, where i is an integer not greater than n. Optionally, the network structures of at least two of the parallel separate network parts are different. Optionally, the network structure includes at least one of the number of neural network layers, the type of the neural network layers, the weight coefficients in the neural network layers and the arrangement order of the neural network layers.
Optionally, since all n separate network parts 240 take the common feature vector output by the shared network part 220 as input, the node counts of the first neural network layers of the n separate network parts 240 are identical.
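The shared-part-plus-n-towers structure described above can be sketched in NumPy as follows. This is an illustrative toy, not the patent's implementation: the layer widths, ReLU hidden activations and sigmoid output are assumptions.

```python
import numpy as np

# A shared part produces a common feature vector; each of n separate towers
# maps that same vector to one sub-quality score in (0, 1).

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dense(in_dim, out_dim):
    return rng.normal(0, 0.1, (out_dim, in_dim)), np.zeros(out_dim)

class MultiIndexModel:
    def __init__(self, input_dim, shared_dim=16, n_indexes=2):
        self.shared = dense(input_dim, shared_dim)  # shared fully connected layer
        # Every tower consumes the same common feature vector, so every tower's
        # first layer has the same input width; hidden layers may differ.
        self.towers = [
            [dense(shared_dim, 8), dense(8, 1)] for _ in range(n_indexes)
        ]

    def forward(self, x):
        W, b = self.shared
        common = relu(W @ x + b)  # common feature vector
        scores = []
        for layers in self.towers:
            h = common
            for W, b in layers[:-1]:
                h = relu(W @ h + b)
            W, b = layers[-1]
            scores.append(float(sigmoid(W @ h + b)[0]))  # one sub-quality score
        return scores

model = MultiIndexModel(input_dim=10, n_indexes=3)
sub_scores = model.forward(rng.normal(size=10))
print(len(sub_scores))  # 3 sub-quality scores, one per evaluation index
```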
Fig. 4 takes 2 evaluation indexes as an illustrative example. The server obtains user features and context features according to the target account, and obtains document features according to the document, where the context features include but are not limited to at least one of time, location, network environment, terminal type and operating system type. The server feeds the user features, context features and document features as input data into the target deep network model, and the target deep network model calculates the sub-quality scores of the document on the n evaluation indexes. The shared network part represents what is common across the multiple evaluation indexes, while the i-th separate network part represents what is specific to the i-th evaluation index. Each separate network part 240 shares the common feature vector output by the shared network part 220, then uses its own independent network structure to extract a personalized feature vector related to its evaluation index, and finally outputs the sub-quality score corresponding to that evaluation index.
In Fig. 4, the main purpose of the shared mapping (embedding) layer 222 is to reduce the dimensionality of sparse features; the shared mapping layer 222 is essentially a fully connected layer.
In above formula, i indicates single feature group serial number, XfiIndicate the corresponding subcharacter vector of ith feature group, WeiI-th
A feature group corresponding weight in mapping process.Wherein, the result z of output is the subcharacter vector after the mapping in upper figure
(embedding output).After subcharacter vector after mapping carries out full connection layer operation, export as sharing feature vector.No
The shared mapping layer 222 is shared between same separate network part 240 and shares the shared spy obtained after full articulamentum 223 is handled
Levy vector.
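A minimal sketch of this shared mapping step follows. The feature-group sizes, embedding dimension, and random weights are illustrative stand-ins for the learned embedding parameters, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three sparse feature groups (e.g. user, context, document), each a
# high-dimensional vector X_fi.
feature_groups = [rng.random(100), rng.random(50), rng.random(80)]

# One mapping matrix W_ei per feature group, projecting into a common
# low dimension (here 8); the mapping layer is just a linear map.
embed_dim = 8
W = [rng.normal(scale=0.01, size=(embed_dim, x.shape[0])) for x in feature_groups]

# z = sum_i W_ei · X_fi : the mapped (embedding-output) sub-feature vector.
z = sum(Wi @ xi for Wi, xi in zip(W, feature_groups))

# The shared fully connected layer 223 then turns z into the shared
# feature vector consumed by every separate network part.
W_fc = rng.normal(scale=0.1, size=(16, embed_dim))
shared_feature = np.maximum(W_fc @ z, 0.0)  # ReLU as an example activation
print(shared_feature.shape)  # (16,)
```

Because every tower reads the same `shared_feature`, the embedding parameters are updated by the gradients of all evaluation indexes, which is what lets data-dense indexes help the sparse ones.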
Each separate network part 240 has its own independent neural network structure, which allows it to fit the data of each evaluation index better according to the distribution of that index's own labels. The number of layers and the number of nodes per layer need to be designed in advance; apart from the first layer, whose node count must be identical across the separate network parts 240, the intermediate layers may be designed differently. Adjacent neural network layers within a separate network part are fully connected. The calculation between two layers is as follows:

Xik = g(Wik · Xi(k-1) + bik)

where i denotes the i-th separate network part and k denotes the k-th layer of that part. Xik is the node value vector of the k-th neural network layer of the i-th separate network part. g(·) is an activation function, the nonlinear transformation applied after the linear operation. Wik and bik are the weight and bias parameters from layer k-1 to layer k, and are parameters to be learned during training.
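The per-layer computation of one separate network part can be sketched as follows; the layer widths and random weights are illustrative (in the patent these parameters are learned by error back-propagation):

```python
import numpy as np

def dense_layer(x_prev, W, b, g=np.tanh):
    """X_ik = g(W_ik · X_i(k-1) + b_ik): one fully connected layer."""
    return g(W @ x_prev + b)

rng = np.random.default_rng(1)
x = rng.random(16)                      # shared feature vector fed to tower i
W1, b1 = rng.normal(size=(32, 16)), np.zeros(32)
W2, b2 = rng.normal(size=(1, 32)), np.zeros(1)

h1 = dense_layer(x, W1, b1)             # hidden layer k = 1
score = dense_layer(h1, W2, b2, g=lambda t: 1 / (1 + np.exp(-t)))
s = float(score[0])                      # sub-quality score in (0, 1)
print(s)
```

The final sigmoid makes the tower output usable directly as a probability-like sub-quality score; other towers may use different depths and widths after the shared first layer.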
For two evaluation indexes with a progressive relationship, such as a click target and a share target: by the design of the recommendation product, a user generally needs to click first and may then perform a behavior such as sharing. The share rate can therefore be expressed as the click rate multiplied by the conditional probability of sharing after a click, satisfying the following formula:

p(y=1, z=1 | x) = p(y=1 | x) × p(z=1 | y=1, x)

where x denotes the features, y denotes a click, and z denotes a share. Accordingly, for a first separate network part and a second separate network part with such a dependence among the n separate network parts, the output of the second separate network part is further connected to a product node. The product node multiplies the first sub-quality score output by the first separate network part with the sub-quality score output by the second separate network part, and the product is determined as the final sub-quality score of the second separate network part. That is, the output nodes of the separate network parts follow the progressive relationship between the business targets: the output of evaluation index 2 is modeled as the product with the output of evaluation index 1, which makes the optimization of the target depth network model more stable. If there is no progressive relationship between the outputs of two evaluation indexes, this dependence can be omitted.
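The product-node coupling can be sketched as below, with the two tower outputs stubbed in as fixed probabilities; this illustrates only the p(click) × p(share | click) factorization, not the patent's exact network:

```python
def final_subscores(p_click, p_share_given_click):
    """Tower 1 predicts p(y=1|x); tower 2 predicts p(z=1|y=1,x).
    The product node outputs p(y=1, z=1|x) as tower 2's final score."""
    click_score = p_click
    share_score = p_click * p_share_given_click  # product node
    return click_score, share_score

click, share = final_subscores(0.20, 0.05)
print(click, share)  # share is always <= click
```

A useful property of this construction is that the final share score can never exceed the click score, matching the fact that a share presupposes a click.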
It should be noted that Fig. 4 is illustrated with input data comprising three dimensions in total: user features, contextual features, and document features. In one possible embodiment, the input data comprises two dimensions in total: user features and document features. In another possible embodiment, the input data may further include data of dimensions other than the user features, contextual features, and document features; this embodiment does not limit the number of data dimensions included in the input data. However, to characterize the relationship between a user and a document, the input data generally includes the user features and the document features, and data of other dimensions can be selectively added to enrich the feature dimensions.
In conclusion target depth network model provided in this embodiment, is combined with multiple simple neural network models
Compare in the method for solving the problems, such as multi objective, using a complicated target depth network model can simplify feature extraction and
Sample manufacturing process, while simplifying pre- flow gauge on line, additionally it is possible to mitigate newly-increased evaluation index bring performance pressures.
Target depth network model in the present embodiment, for each optimizing index using shared mapping layer
Embedding parameter, can be excessively sparse to avoid the data of individual optimizing index, while being able to use pumping common in feature
The prediction of each optimizing index is assisted as speciality.
Target depth network model in the present embodiment has independent output for each evaluation index, can be on line
Carry out flexible combination.It can to tilt when sequence to specified evaluation index by adjusting weight, the business met on line needs
It asks.
The above target depth network model can be obtained by an offline training process. Fig. 5 shows a flowchart of a training method for the target depth network model proposed by one illustrative embodiment of the application. The method comprises:
Step 501: generate training samples according to the user characteristic data and candidate document data of sample user accounts.

A sample user account is an account used by a sample user. Optionally, the server extracts the user characteristic data and candidate document data of the sample user accounts from the logs of the online system. The user characteristic data and candidate document data of each sample user account form one training sample.

The user characteristic data includes but is not limited to: user portrait data and user history behavior.
Step 502: calibrate positive and negative labels on the training samples separately for each evaluation index to obtain a training sample set.

Since the target depth network model uses separate network parts corresponding to at least two evaluation indexes, and these at least two separate network parts should be trained independently, positive and negative labels are calibrated separately for each evaluation index. That is, for evaluation index 1, a positive label 1 or a negative label 1 is calibrated on the training sample; for evaluation index 2, a positive label 2 or a negative label 2 is calibrated; for evaluation index 3, a positive label 3 or a negative label 3 is calibrated; and so on, which will not be repeated.

After positive and negative labels are calibrated for each training sample, the calibrated training samples are added to the training sample set.
Step 503: input each training sample in the training sample set to the target depth network model, and train the target depth network model using the error back-propagation algorithm to obtain the trained target depth network model.

The training process of the target depth network model requires a loss function. The loss function measures the degree of inconsistency between the predicted value f(x) of the target depth network model and the true value Y. The smaller the loss function, the better the prediction effect of the target depth network model.
Schematically, the loss function corresponding to the target depth network model is an overall loss function constructed by synthesizing the n evaluation indexes.

An overall loss function is designed for the training process, obtained from the contribution of each evaluation index. In one implementation, each evaluation index contributes equally to the overall loss function, with no bias toward any particular evaluation index. In another implementation, when one or several evaluation indexes are more important, the contributions of the evaluation indexes to the loss function are unequal, with emphasis placed on the important evaluation indexes.

In a schematic example where the contribution of each evaluation index is equal, the overall loss function is Loss = L1 + L2 + L3 + L4 + ... + Ln, where Li is the loss function corresponding to the i-th evaluation index and i is a positive integer.
Taking a click-rate evaluation index as an example, the logarithm of the maximum likelihood function can be used as the loss function, denoted L_ctr.

Taking the evaluation indexes of the share dimension, the collect dimension, and the like dimension as examples: since the samples are all Bernoulli events taking values 0 or 1, their loss functions all satisfy the expression form of the binomial distribution. The loss function of the share dimension can be denoted L_share, that of the collect dimension L_collect, and that of the like dimension L_like.

Taking the evaluation index of stay time as an example: since the sample data is a non-Bernoulli event, that is, the sample label is a real number rather than the 0 or 1 value of the binomial distribution, the loss function can be expressed using the root mean square error (RMSE). Optionally, different value types have corresponding loss functions; for an evaluation index of the negative-feedback dimension, hinge loss can also be used as the loss function, denoted L_nega.
In a schematic example, assuming the evaluation indexes include the six dimensions of click rate, share, collect, like, stay time, and negative feedback, and denoting the stay-time loss L_stay, the overall loss function is expressed as Loss = L_ctr + L_share + L_collect + L_like + L_stay + L_nega.
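A sketch of such an equal-contribution overall loss, using log loss for the Bernoulli-style indexes and RMSE for stay time. The sample labels, predictions, and the selection of only a few of the six dimensions are made up for illustration:

```python
import numpy as np

def log_loss(y, p):
    """Binomial-distribution loss for 0/1 (Bernoulli) labels."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

def rmse(y, pred):
    """Loss for real-valued labels such as stay time (seconds)."""
    return float(np.sqrt(np.mean((y - pred) ** 2)))

# Toy labels / predictions for 4 samples on a few dimensions.
y_ctr  = np.array([1, 0, 0, 1]); p_ctr  = np.array([0.8, 0.2, 0.3, 0.6])
y_like = np.array([0, 0, 1, 0]); p_like = np.array([0.1, 0.2, 0.7, 0.1])
y_stay = np.array([30.0, 5.0, 8.0, 60.0]); p_stay = np.array([25.0, 6.0, 10.0, 50.0])

# Equal-contribution overall loss: a plain sum of the per-index losses.
loss = log_loss(y_ctr, p_ctr) + log_loss(y_like, p_like) + rmse(y_stay, p_stay)
print(loss)
```

In the weighted variant mentioned next in the text, each term would simply be multiplied by a per-index coefficient before summing.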
In another schematic example, the loss function corresponding to each evaluation index can also be given a weight.
In conclusion training method provided in this embodiment, by for the corresponding independence of each evaluation index stand-alone training
Network portion reduces the degree of coupling between each separate network part, so as to effectively avoid being inclined to number in training process
According to dense evaluation index, so that the evaluation index of Sparse also can be learnt and be optimized, it is more in line with actual answer
Use environment.
The above information flow recommendation method is illustrated below with a specific example. Suppose an application program runs in a terminal, and a user interface for displaying an information flow is provided in the application program. The server is the background server of the application program. When the server is a single server, the information flow recommendation function resides in that server; when the server is a server cluster, the server cluster contains a server for information recommendation. As shown in Fig. 6, Fig. 6 is an application schematic diagram of the information flow recommendation method provided by one schematic embodiment of the application. The information flow recommendation method includes the following four processes:
User use process 61:

The user sends an information recommendation request to the server through the application program, and the server feeds an information flow back to the application program. In the early stage, the information flow fed back by the server can be generated in a generation mode based on the user portrait; this embodiment does not limit this.

The application program displays the information flow on the user interface, and the user can perform at least one of clicking, sharing, appreciating, and commenting operations on the documents in the information flow on the user interface.
Log collection process 62:

The server records the operation behavior of the user in the application program to form logs. Optionally, each log records a user account, an operation behavior, and an operation time. The operation behavior is any one of clicking, sharing, appreciating, and commenting. Optionally, the server also records the user portrait of each user account.
Offline training process 63:

A technician can construct the target depth network model according to the network structure of Fig. 2 above, and then extract user features and document features from the log files.

Optionally, the technician extracts user features from the user history behavior and the user portrait, and extracts document features from the document information. The user features and document features are input to the target depth network model for training, yielding the trained target depth network model. The training process can refer to the process described in Fig. 5 above.

It should be noted that the above training process can be completed offline. The offline training process can also be executed multiple times in different time periods.
Online recommendation process 64:

After the target depth network model is trained, it can be put online for use. When the server receives the information recommendation request of the target account sent by the terminal (the target account being the requesting user), the server obtains the user features of the target account and generates a candidate document set for the target account. For each document in the candidate document set, the server also obtains the document features of that document. The server inputs the user features of the target account and the document features of each document into the target depth network model to obtain the sub-quality scores of each document on the n evaluation indexes, and then calculates the weighted quality score of each document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes.

The server sorts the documents in the candidate document set in descending order of weighted quality score, and generates the information flow recommended to the target account from the first k target documents after sorting. The server sends the information flow to the terminal, and the terminal displays the information flow on the user interface.
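The scoring-and-ranking step of this online process can be sketched as follows; the sub-quality scores, weights, and document names are made-up illustration data:

```python
# Each document's sub-quality scores on n = 3 evaluation indexes
# (e.g. click, share, stay time), plus the per-index weights.
sub_scores = {
    "doc_a": [0.9, 0.1, 0.3],
    "doc_b": [0.5, 0.6, 0.4],
    "doc_c": [0.2, 0.2, 0.9],
}
weights = [0.6, 0.3, 0.1]

def weighted_quality(scores, w):
    """Weighted quality score: sum of sub-score × index weight."""
    return sum(s * wi for s, wi in zip(scores, w))

# Sort descending by weighted quality score, then take the first k
# target documents to form the recommended information flow.
ranked = sorted(sub_scores,
                key=lambda d: weighted_quality(sub_scores[d], weights),
                reverse=True)
k = 2
feed = ranked[:k]
print(feed)  # ['doc_a', 'doc_b']
```

Adjusting `weights` online is what lets the ranking be tilted toward a specified evaluation index without retraining the model.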
As shown in Fig. 7, the terminal displays the information flow in an application program 70 using a list control. The list control includes a plurality of list items 71 to 75, each of which displays one document. The arrangement order of the list items is the same as the arrangement order of the items in the information flow: the higher an item's weighted quality score, the further forward its corresponding list item in the list control. For example, article 1 has the highest weighted quality score, so its corresponding list item 71 is displayed at the uppermost position of the user interface; the weighted quality score of article 2 is smaller than that of article 1, so its corresponding list item 72 is displayed below list item 71. The user can slide the list control on the user interface with swipe-up and swipe-down gestures to view list items earlier or later in the order.

Optionally, each list item displays the article title, the author, a thumbnail, a like button 76, and a comment button 77.

Optionally, after a list item is clicked, a display interface of the article corresponding to that list item is shown, and a share button may also be displayed on that display interface.
It should be noted that the weights of the above n evaluation indexes are adjustable. The server receives a weight modification operation and modifies the weights corresponding to the n evaluation indexes according to the weight modification operation.

In one possible embodiment, to set the weights among the multiple evaluation indexes, candidate weight combinations can first be computed by an offline evaluation mode, and then the effect of each weight combination is verified by online experiments. In a single-target prediction scenario, given the sorted order of a candidate document set and the corresponding target labels, the AUC (Area Under the Curve, a model evaluation index) is generally used as the offline evaluation index: the larger the AUC, the more accurate the prediction and the better the ranking effect. The application uses a grid search method to search, on the offline model prediction results, for the optimal combination of weights for the weighted quality score among the multiple evaluation indexes. The specific process is as follows: for each weight setting, calculate the weighted quality score y of each document over the multiple evaluation indexes, sort the documents in the candidate document set according to the value of y, and calculate the AUC on each evaluation index for the sorting result. Different weight combinations yield different y values and correspondingly different AUCs on the multiple evaluation indexes. The weight combination is then selected according to the AUC of the dominant product target, or the weight combination whose AUCs, weighted-averaged by the importance of the product targets, are highest is taken as the combination to test online.
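A sketch of this offline grid search, with a tiny hand-written AUC, two evaluation indexes, and made-up prediction data; in practice the AUC would be computed per evaluation index against that index's own labels, and here only the dominant target's labels are used:

```python
import itertools
import numpy as np

def auc(labels, scores):
    """Probability that a positive is scored above a negative (ties count 0.5)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    pairs = [(p > n) + 0.5 * (p == n) for p in pos for n in neg]
    return float(np.mean(pairs))

# Offline prediction results: per-document sub-quality scores on 2 indexes,
# plus 0/1 labels for the dominant product target (e.g. clicks).
sub = np.array([[0.9, 0.2], [0.4, 0.8], [0.7, 0.5], [0.1, 0.3]])
labels = np.array([1, 0, 1, 0])

# Grid search over weight combinations on an 11-point grid per index.
best = None
for w1, w2 in itertools.product(np.linspace(0, 1, 11), repeat=2):
    y = w1 * sub[:, 0] + w2 * sub[:, 1]      # weighted quality score
    score = auc(labels, y)
    if best is None or score > best[0]:
        best = (score, (w1, w2))
print(best)  # best offline AUC and the weight pair that achieved it
```

The winning combination would then be taken online as a candidate for A/B testing, as the text describes.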
The following are apparatus embodiments of the application, which can be used to execute the method embodiments of the application. For details not disclosed in the apparatus embodiments, please refer to the method embodiments of the application.
Referring to Fig. 8, it shows a structural block diagram of the information flow recommendation apparatus based on a depth network provided by one illustrative embodiment of the application. The apparatus can be implemented as all or part of a server through software, hardware, or a combination of both. The apparatus includes:

a receiving module 810, configured to receive an information recommendation request of a target account;

a generation module 820, configured to generate a candidate document set of the target account, the candidate document set including at least two documents;

a calling module 830, configured to call a target depth network model to calculate sub-quality scores of each document in the at least two documents on n evaluation indexes, n being an integer greater than 1;

a computing module 840, configured to calculate a weighted quality score of the document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes;

a selecting module 850, configured to select target documents from the candidate document set according to the weighted quality score, and generate the information flow according to the target documents; and

a recommending module 860, configured to recommend the information flow to the terminal corresponding to the target account.
In an alternative embodiment, the target depth network model includes a shared network part and n separate network parts, the n separate network parts being arranged in parallel with one another;

the shared network part is configured to extract a shared feature vector of input data, the input data including the document features of the document;

the i-th separate network part of the n separate network parts is configured to calculate the sub-quality score of the document on the i-th evaluation index according to the shared feature vector, i being an integer not greater than n.
In an alternative embodiment, the shared network part includes a shared feature layer, a shared mapping layer, and a shared fully connected layer that are connected in sequence;

the shared feature layer is configured to extract sub-feature vectors of at least two dimensions corresponding to the input data;

the shared mapping layer is configured to perform dimensionality-reduction mapping on the sub-feature vectors of the at least two dimensions to obtain mapped sub-feature vectors;

the shared fully connected layer is configured to perform a full connection operation on the mapped sub-feature vectors and output the shared feature vector.
In an alternative embodiment, the network structures of at least two separate network parts are different, the network structure including at least one of: the number of neural network layers, the type of the neural network layers, the weight coefficients of the neural network layers, and the arrangement order of the neural network layers.
In an alternative embodiment, the first neural network layers of the n separate network parts have the same number of nodes.
In an alternative embodiment, for a first separate network part and a second separate network part with a dependence among the n separate network parts, the output of the second separate network part is further connected to a product node;

the product node is configured to multiply the first sub-quality score output by the first separate network part with the sub-quality score output by the second separate network part, and determine the product as the final sub-quality score of the second separate network part.
In an alternative embodiment, the apparatus further includes an obtaining module 870.

The obtaining module 870 is configured to obtain user features according to the target account, and obtain document features according to the document;

the calling module 830 is configured to input the user features and the document features into the target depth network model as the input data, and calculate the sub-quality scores of the document on the n evaluation indexes through the target depth network model.
In an alternative embodiment, the obtaining module 870 is configured to obtain user features and contextual features according to the target account, and obtain document features according to the document, where the contextual features include at least one of: time, place, network environment, terminal type, and operating system type;

the calling module 830 is configured to input the user features, the contextual features, and the document features into the target depth network model as the input data, and calculate the sub-quality scores of the document on the n evaluation indexes through the target depth network model.
In an alternative embodiment, the apparatus further includes a modification module 880;

the receiving module 810 is configured to receive a weight modification operation;

the modification module 880 is configured to modify the weights corresponding to the n evaluation indexes according to the weight modification operation.
In an alternative embodiment, the selecting module 850 is configured to sort the documents in the candidate document set according to the weighted quality scores, and determine the first k documents after sorting as the target documents, k being an integer not greater than m.
In an alternative embodiment, the loss function corresponding to the target depth network model is an overall loss function constructed by synthesizing the n evaluation indexes.
It should be understood that when the information flow recommendation apparatus based on a depth network provided by the above embodiment recommends an information flow to a terminal, the division of the above functional modules is only an example; in practical applications, the above functions can be allocated to different functional modules as needed, that is, the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above. In addition, the information flow recommendation apparatus provided by the above embodiment belongs to the same concept as the method embodiments of the information flow recommendation method; for the specific implementation process, refer to the method embodiments, which will not be repeated here.
Figure 10 shows a structural schematic diagram of the server provided by one embodiment of the application. The server is configured to implement the information flow recommendation method based on a depth network provided in the above embodiments. Specifically:

The server 1000 includes a central processing unit (CPU) 1001, a system memory 1004 including a random access memory (RAM) 1002 and a read-only memory (ROM) 1003, and a system bus 1005 connecting the system memory 1004 and the central processing unit 1001. The server 1000 further includes a basic input/output system (I/O system) 1006 that helps transfer information between devices in the computer, and a mass storage device 1007 for storing an operating system 1013, application programs 1014, and other program modules 1015.

The basic input/output system 1006 includes a display 1008 for displaying information and an input device 1009, such as a mouse or keyboard, for the user to input information. The display 1008 and the input device 1009 are both connected to the central processing unit 1001 through an input/output controller 1010 connected to the system bus 1005. The basic input/output system 1006 may also include the input/output controller 1010 for receiving and processing input from a plurality of other devices such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1010 also provides output to a display screen, a printer, or other types of output devices.

The mass storage device 1007 is connected to the central processing unit 1001 through a mass storage controller (not shown) connected to the system bus 1005. The mass storage device 1007 and its associated computer-readable media provide non-volatile storage for the server 1000. That is, the mass storage device 1007 may include a computer-readable medium (not shown) such as a hard disk or a CD-ROM drive.
Without loss of generality, the computer-readable media may include computer storage media and communication media. Computer storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state storage technologies, CD-ROM, DVD or other optical storage, cassettes, magnetic tape, disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media are not limited to the above. The above system memory 1004 and mass storage device 1007 may be collectively referred to as memory.
According to the various embodiments of the application, the server 1000 may also be operated by a remote computer connected through a network such as the Internet. That is, the server 1000 can be connected to a network 1012 through a network interface unit 1011 connected to the system bus 1005; in other words, the network interface unit 1011 can also be used to connect to other types of networks or remote computer systems (not shown).
The memory stores one or more programs, which are configured to be executed by one or more processors. The one or more programs include instructions for implementing the information flow recommendation method based on a depth network provided by each of the above embodiments.

The embodiment of the application also provides a computer-readable storage medium storing one or more programs, which are stored in the memory and configured to be executed by one or more processors. The one or more programs include instructions for implementing the information flow recommendation method based on a depth network provided by each of the above embodiments.

The embodiment of the application also provides a computer program product storing one or more programs, which are stored in the memory and configured to be executed by one or more processors. The one or more programs include instructions for implementing the information flow recommendation method based on a depth network provided by each of the above embodiments.
It should be understood that "multiple" herein refers to two or more. "And/or" describes the association relationship of the associated objects and indicates that three kinds of relationships may exist; for example, "A and/or B" can indicate three situations: A alone, both A and B, and B alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
The serial numbers of the above embodiments of the application are for description only and do not represent the advantages or disadvantages of the embodiments.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments can be completed by hardware, or by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing are merely preferred embodiments of the application and are not intended to limit the application; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the application shall be included within the protection scope of the application.
Claims (15)
1. An information flow recommendation method based on a depth network, characterized in that the method comprises:
receiving an information recommendation request of a target account;
generating a candidate document set of the target account, the candidate document set comprising at least two documents;
calling a target depth network model to calculate sub-quality scores of each document in the at least two documents on n evaluation indexes, n being an integer greater than 1;
calculating a weighted quality score of the document according to the sub-quality scores on the n evaluation indexes and weights corresponding to the n evaluation indexes;
selecting target documents from the candidate document set according to the weighted quality score, and generating the information flow according to the target documents; and
recommending the information flow to a terminal corresponding to the target account.
2. The method according to claim 1, characterized in that the target depth network model comprises a shared network part and n separate network parts, the n separate network parts being arranged in parallel with one another;
the shared network part is configured to extract a shared feature vector of input data, the input data comprising the document features of the document;
the i-th separate network part of the n separate network parts is configured to calculate the sub-quality score of the document on the i-th evaluation index according to the shared feature vector, i being an integer not greater than n.
3. The method according to claim 2, characterized in that the shared network part comprises a shared feature layer, a shared mapping layer, and a shared fully connected layer that are connected in sequence;
the shared feature layer is configured to extract sub-feature vectors of at least two dimensions corresponding to the input data;
the shared mapping layer is configured to perform dimensionality-reduction mapping on the sub-feature vectors of the at least two dimensions to obtain mapped sub-feature vectors;
the shared fully connected layer is configured to perform a full connection operation on the mapped sub-feature vectors and output the shared feature vector.
4. The method according to claim 2, characterized in that the network structures of at least two separate network parts are different, the network structure comprising at least one of: the number of neural network layers, the type of the neural network layers, the weight coefficients of the neural network layers, and the arrangement order of the neural network layers.
5. The method according to claim 2, characterized in that the first neural network layers of the n separate network parts have the same number of nodes.
6. The method according to claim 2, wherein, for a first independent network portion and a second independent network portion among the n independent network portions that have a dependency relationship, the output of the second independent network portion is further connected to a product node;
the product node is configured to multiply the first sub-quality score output by the first independent network portion with the sub-quality score output by the second independent network portion, and to determine the product as the final sub-quality score of the second independent network portion.
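The product node of claim 6 handles indexes with a dependency relationship. As an illustration (our example, not the patent's), a read-completion score is only meaningful conditioned on a click, so the dependent tower's final score is the product of the two raw outputs:

```python
def final_sub_quality_score(first_score: float, second_score: float) -> float:
    """Product node: the final sub-quality score of the dependent (second)
    independent network portion is the product of both towers' raw outputs."""
    return first_score * second_score

# E.g. P(click) = 0.4 and P(finish reading | click) = 0.5
# combine into an unconditional finish-reading score of 0.2.
score = final_sub_quality_score(0.4, 0.5)
```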
7. The method according to any one of claims 1 to 5, wherein, before invoking the target deep network model to calculate the sub-quality scores of the document on the n evaluation indexes, the method further comprises:
obtaining user features according to the target account, and obtaining document features according to the document;
and invoking the target deep network model to calculate the sub-quality scores of the document on the n evaluation indexes comprises:
inputting the user features and the document features into the target deep network model as the input data;
calculating, by the target deep network model, the sub-quality scores of the document on the n evaluation indexes.
8. The method according to any one of claims 1 to 5, wherein, before invoking the target deep network model to calculate the sub-quality scores of the document on the n evaluation indexes, the method further comprises:
obtaining user features and contextual features according to the target account, and obtaining document features according to the document; wherein the contextual features comprise at least one of: time, location, network environment, terminal type, and operating system type;
and invoking the target deep network model to calculate the sub-quality scores of the document on the n evaluation indexes comprises:
inputting the user features, the contextual features, and the document features into the target deep network model as the input data;
calculating, by the target deep network model, the sub-quality scores of the document on the n evaluation indexes.
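Claims 7 and 8 assemble the model input from up to three feature groups. A minimal sketch (the encoding of time, location, network environment, terminal type, and operating system type into vectors is assumed to happen upstream):

```python
import numpy as np

def build_input_data(user_features, contextual_features, document_features):
    """Concatenate the three feature groups into one model input vector.
    Each argument is assumed to already be a numeric feature vector."""
    return np.concatenate([user_features, contextual_features, document_features])

# Toy vectors standing in for encoded user, contextual, and document features.
x = build_input_data(np.ones(4), np.zeros(3), np.full(5, 2.0))
```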
9. The method according to any one of claims 1 to 5, further comprising:
receiving a weight modification operation;
modifying the weights corresponding to the n evaluation indexes according to the weight modification operation.
10. The method according to claim 1, wherein selecting the target document from the candidate document set according to the weighted quality score comprises:
ranking the documents in the candidate document set according to their weighted quality scores;
determining the top k documents in the ranking as the target documents, k being an integer not greater than m.
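The weighting and ranking of claims 1 and 10 can be sketched in a few lines (document IDs, scores, and weights below are made-up toy values):

```python
def weighted_quality_score(sub_scores, weights):
    # Weighted sum of the sub-quality scores over the n evaluation indexes.
    return sum(s * w for s, w in zip(sub_scores, weights))

def select_target_documents(candidates, weights, k):
    """candidates: list of (document_id, [sub-quality scores per index]).
    Rank by weighted quality score and keep the top k documents."""
    ranked = sorted(candidates,
                    key=lambda c: weighted_quality_score(c[1], weights),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

candidates = [("a", [0.9, 0.1]), ("b", [0.5, 0.8]), ("c", [0.2, 0.3])]
top = select_target_documents(candidates, weights=[0.5, 0.5], k=2)
```

Raising one index's weight (claim 9's weight modification operation) shifts the ranking toward documents that score well on that index without retraining the model.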
11. The method according to any one of claims 1 to 5, wherein the loss function corresponding to the target deep network model is an overall loss function constructed jointly from the n evaluation indexes.
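One common way to realize claim 11's overall loss is a weighted sum of per-index losses, so all towers and the shared portion train jointly. The per-index binary cross-entropy and the equal weighting below are illustrative assumptions, not specified by the patent:

```python
import numpy as np

def binary_cross_entropy(pred, label):
    # Clip to avoid log(0) for saturated predictions.
    pred = np.clip(pred, 1e-7, 1.0 - 1e-7)
    return -(label * np.log(pred) + (1.0 - label) * np.log(1.0 - pred))

def overall_loss(preds, labels, loss_weights=None):
    """Overall loss constructed jointly from the n evaluation indexes:
    a weighted sum of each index's own loss."""
    if loss_weights is None:
        loss_weights = [1.0] * len(preds)
    return sum(w * binary_cross_entropy(p, y)
               for w, p, y in zip(loss_weights, preds, labels))

# Two indexes: predicted 0.8 against label 1, and 0.3 against label 0.
loss = overall_loss(preds=[0.8, 0.3], labels=[1.0, 0.0])
```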
12. An information flow recommendation apparatus based on a deep network, the apparatus comprising:
a receiving module, configured to receive an information recommendation request of a target account;
a generation module, configured to generate a candidate document set of the target account, the candidate document set comprising at least two documents;
an invocation module, configured to invoke a target deep network model to calculate, for each of the at least two documents, sub-quality scores on n evaluation indexes, n being an integer greater than 1;
a computation module, configured to calculate a weighted quality score of the document according to the sub-quality scores on the n evaluation indexes and the weights corresponding to the n evaluation indexes;
a selection module, configured to select a target document from the candidate document set according to the weighted quality score, and to generate the information flow according to the target document;
a recommendation module, configured to recommend the information flow to a terminal corresponding to the target account.
13. The apparatus according to claim 12, wherein the target deep network model comprises a shared network portion and n independent network portions, the n independent network portions being arranged in parallel;
the shared network portion is configured to extract a common feature vector from input data, the input data comprising document features of the document;
the i-th independent network portion among the n independent network portions is configured to calculate, from the common feature vector, a sub-quality score of the document on the i-th evaluation index, i being an integer not greater than n.
14. A server, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the deep-network-based information flow recommendation method according to any one of claims 1 to 9.
15. A computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the deep-network-based information flow recommendation method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910175634.3A CN110266745B (en) | 2019-03-08 | 2019-03-08 | Information flow recommendation method, device, equipment and storage medium based on deep network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110266745A true CN110266745A (en) | 2019-09-20 |
CN110266745B CN110266745B (en) | 2022-02-25 |
Family
ID=67911723
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910175634.3A Active CN110266745B (en) | 2019-03-08 | 2019-03-08 | Information flow recommendation method, device, equipment and storage medium based on deep network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110266745B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105337843A (en) * | 2015-09-23 | 2016-02-17 | 腾讯科技(深圳)有限公司 | Interaction system and method, client, and background server |
CN107045693A (en) * | 2017-05-05 | 2017-08-15 | 北京媒立方传媒科技有限公司 | Media characteristic determination, Media Recommendation Method and device |
CN108280114A (en) * | 2017-07-28 | 2018-07-13 | 淮阴工学院 | User literature reading interest analysis method based on deep learning |
US10173773B1 (en) * | 2016-02-23 | 2019-01-08 | State Farm Mutual Automobile Insurance Company | Systems and methods for operating drones in response to an incident |
CN109241425A (en) * | 2018-08-31 | 2019-01-18 | 腾讯科技(深圳)有限公司 | Resource recommendation method, device, equipment and storage medium |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112784151A (en) * | 2019-11-08 | 2021-05-11 | 北京搜狗科技发展有限公司 | Method and related device for determining recommendation information |
CN112784151B (en) * | 2019-11-08 | 2024-02-06 | 北京搜狗科技发展有限公司 | Method and related device for determining recommended information |
CN110837598A (en) * | 2019-11-11 | 2020-02-25 | 腾讯科技(深圳)有限公司 | Information recommendation method, device, equipment and storage medium |
CN112925924A (en) * | 2019-12-05 | 2021-06-08 | 北京达佳互联信息技术有限公司 | Multimedia file recommendation method and device, electronic equipment and storage medium |
CN112925963B (en) * | 2019-12-06 | 2022-11-22 | 杭州海康威视数字技术股份有限公司 | Data recommendation method and device |
CN112925963A (en) * | 2019-12-06 | 2021-06-08 | 杭州海康威视数字技术股份有限公司 | Data recommendation method and device |
CN111193795A (en) * | 2019-12-30 | 2020-05-22 | 腾讯科技(深圳)有限公司 | Information pushing method and device, electronic equipment and computer readable storage medium |
CN111310034B (en) * | 2020-01-23 | 2023-04-07 | 深圳市雅阅科技有限公司 | Resource recommendation method and related equipment |
CN111310034A (en) * | 2020-01-23 | 2020-06-19 | 腾讯科技(深圳)有限公司 | Resource recommendation method and related equipment |
CN111291266A (en) * | 2020-02-13 | 2020-06-16 | 腾讯科技(北京)有限公司 | Artificial intelligence based recommendation method and device, electronic equipment and storage medium |
WO2021159776A1 (en) * | 2020-02-13 | 2021-08-19 | 腾讯科技(深圳)有限公司 | Artificial intelligence-based recommendation method and apparatus, electronic device, and storage medium |
CN111459783B (en) * | 2020-04-03 | 2023-04-18 | 北京字节跳动网络技术有限公司 | Application program optimization method and device, electronic equipment and storage medium |
CN111459783A (en) * | 2020-04-03 | 2020-07-28 | 北京字节跳动网络技术有限公司 | Application program optimization method and device, electronic equipment and storage medium |
CN113542893A (en) * | 2020-04-14 | 2021-10-22 | 北京达佳互联信息技术有限公司 | Method and device for acquiring evaluation information of works, and video screening method and device |
CN113542893B (en) * | 2020-04-14 | 2023-08-18 | 北京达佳互联信息技术有限公司 | Method and device for acquiring work evaluation information, and video screening method and device |
CN112200639A (en) * | 2020-10-30 | 2021-01-08 | 杭州时趣信息技术有限公司 | Information flow model construction method, device and medium |
CN112579729A (en) * | 2020-12-25 | 2021-03-30 | 百度(中国)有限公司 | Training method and device for document quality evaluation model, electronic equipment and medium |
CN112579729B (en) * | 2020-12-25 | 2024-05-21 | 百度(中国)有限公司 | Training method and device for document quality evaluation model, electronic equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110266745A (en) | Information flow recommendation method, device, equipment and storage medium based on deep network | |
US10367862B2 (en) | Large-scale page recommendations on online social networks | |
US20190340538A1 (en) | Identifying entities using a deep-learning model | |
CN108763314A (en) | Interest recommendation method, apparatus, server and storage medium |
CN108280104A (en) | The characteristics information extraction method and device of target object | |
CN109087178A (en) | Commodity recommendation method and device |
US10726087B2 (en) | Machine learning system and method to identify and connect like-minded users | |
WO2020238502A1 (en) | Article recommendation method and apparatus, electronic device and storage medium | |
WO2023065859A1 (en) | Item recommendation method and apparatus, and storage medium | |
CN104081429A (en) | Video recommendation based on video co-occurrence statistics | |
US9542458B2 (en) | Systems and methods for processing and displaying user-generated content | |
WO2017087833A1 (en) | Measuring influence propagation within networks | |
CN112380453B (en) | Article recommendation method and device, storage medium and equipment | |
US9552556B2 (en) | Site flow optimization | |
US20150074544A1 (en) | Information processing apparatus, information processing method, and program | |
Mozafari et al. | An enriched social behavioural information diffusion model in social networks | |
CN112765482A (en) | Product delivery method, device, equipment and computer readable medium | |
CN105096161B (en) | Information display method and apparatus |
CN111652673B (en) | Intelligent recommendation method, device, server and storage medium | |
CN107491509B (en) | User attribute information mining method, device and medium |
CN115034836A (en) | Model training method and related device | |
CN108876435A (en) | Artificial intelligence platform implementation method, device, computer equipment and storage medium | |
CN114625894A (en) | Appreciation evaluation method, model training method, appreciation evaluation apparatus, model training medium, and computing apparatus | |
KR20150023433A (en) | Method and apparatus for obfuscating user demographics | |
Yuan et al. | Research on group POIs recommendation fusion of users' gregariousness and activity in LBSN |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| TR01 | Transfer of patent right | Effective date of registration: 20221129. Patentee after: Shenzhen Yayue Technology Co., Ltd., 1402, Floor 14, Block A, Haina Baichuan Headquarters Building, No. 6, Baoxing Road, Haibin Community, Xin'an Street, Bao'an District, Shenzhen, Guangdong 518133. Patentee before: TENCENT TECHNOLOGY (SHENZHEN) Co., Ltd., Tencent Building, No. 1 High-tech Zone, Nanshan District, Shenzhen, Guangdong 518057, 35 floors. |