CN106227792B - Method and apparatus for pushing information - Google Patents
- Publication number
- CN106227792B CN106227792B CN201610576501.3A CN201610576501A CN106227792B CN 106227792 B CN106227792 B CN 106227792B CN 201610576501 A CN201610576501 A CN 201610576501A CN 106227792 B CN106227792 B CN 106227792B
- Authority
- CN
- China
- Prior art keywords
- pushed information
- history
- data
- cluster
- history pushed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Biomedical Technology (AREA)
- Computational Linguistics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Electrically Operated Instructional Devices (AREA)
- Machine Translation (AREA)
Abstract
This application discloses a method and apparatus for pushing information. A specific embodiment of the method includes: obtaining history push information; inputting the history push information into a deep learning model to obtain the hidden-layer output data of the deep learning model; encoding the hidden-layer output data to obtain coded data; clustering the history push information having identical coded data to obtain history push information clusters; and, in response to the quantity of history push information in a history push information cluster being less than a predetermined threshold, pushing information to a user based on that cluster. This embodiment balances the efficiency of clustering with the precision of the clustering result, thereby improving the efficiency and targeting of information pushing.
Description
Technical field
This application relates to the field of computer technology, specifically to the field of Internet technology, and more particularly to a method and apparatus for pushing information.
Background art
Currently, the back-end server of a search page typically clusters similar questions among historical questions, then extracts the newest and highest-quality questions and answers according to the clustering result, and displays them preferentially when a user searches for the answer to a question — that is, it pushes the newest and highest-quality questions and answers to the user.
When clustering similar questions among historical questions, the text of a question can be converted into word vectors or feature vectors, and the texts then clustered according to the similarity of those vectors; alternatively, natural language processing or machine learning techniques can be used to obtain high-level semantic features of the text, which are then used to cluster the texts by similarity.
However, among current methods of pushing information to users according to clustering results, the methods that cluster by word vectors or feature vectors suffer from semantic loss and a low recall rate of the clustering result, while the methods that cluster by high-level semantic features compute similarity inefficiently, because such features are usually high-dimensional arrays or floating-point arrays. Current methods of pushing information to users therefore suffer from either semantic loss or low computational efficiency.
Summary of the invention
The purpose of this application is to propose an improved method and apparatus for pushing information, so as to solve the technical problems mentioned in the Background section above.
In a first aspect, this application provides a method for pushing information, the method comprising: obtaining history push information; inputting the history push information into a deep learning model to obtain the hidden-layer output data of the deep learning model; encoding the hidden-layer output data to obtain coded data; clustering the history push information having identical coded data to obtain history push information clusters; and, in response to the quantity of history push information in a history push information cluster being less than a predetermined threshold, pushing information to a user based on that cluster.
In some embodiments, encoding the hidden-layer output data to obtain coded data includes: primary-encoding the hidden-layer output data to obtain primary coded data; and secondary-encoding the primary coded data to obtain secondary coded data. Correspondingly, clustering the history push information having identical coded data to obtain history push information clusters includes: clustering the history push information having identical secondary coded data to obtain history push information clusters.
In some embodiments, the method further includes: in response to the quantity of history push information in the history push information cluster being greater than or equal to the predetermined threshold, adjusting the coding parameter of the secondary encoding according to a preset rule to improve the compression ratio, and executing again the step of secondary-encoding the primary coded data to obtain secondary coded data.
In some embodiments, the primary encoding includes one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s to obtain output data in Boolean vector format; convolving the input data and pooling the convolved data to obtain pooled-convolution output data; and sampling or truncating the input data to obtain sampled or truncated output data. Additionally or alternatively, the secondary encoding includes one or more of the same set of operations: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s to obtain output data in Boolean vector format; convolving the input data and pooling the convolved data to obtain pooled-convolution output data; and sampling or truncating the input data to obtain sampled or truncated output data.
In some embodiments, adjusting the coding parameter of the secondary encoding according to a preset rule to improve the compression ratio includes: increasing the number of sampled points of the sampling according to the preset rule.
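The parameter-adjustment loop described by these embodiments can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `encode` function, the doubling rule, and all names are assumptions; it shows how re-encoding with more sampled points refines the code until every cluster falls below the threshold.

```python
import random

def encode(vector, sample_points):
    """Illustrative secondary coding: sample evenly spaced positions
    and binarize them into a hashable Boolean code."""
    step = max(1, len(vector) // sample_points)
    return tuple(1 if vector[i] > 0.0 else 0
                 for i in range(0, len(vector), step))

def adaptive_cluster(vectors, threshold, sample_points=4):
    """Re-encode with more sampled points (an assumed preset rule:
    double them) until no cluster reaches the predetermined threshold."""
    while True:
        clusters = {}
        for v in vectors:
            clusters.setdefault(encode(v, sample_points), []).append(v)
        if max(len(c) for c in clusters.values()) < threshold:
            return clusters, sample_points
        if sample_points >= len(vectors[0]):  # cannot refine further
            return clusters, sample_points
        sample_points *= 2

random.seed(0)
vectors = [[random.gauss(0, 1) for _ in range(32)] for _ in range(200)]
clusters, points = adaptive_cluster(vectors, threshold=20)
```

With 4 sampled points only 16 distinct codes exist for 200 vectors, so clusters exceed the threshold and the code is refined; with 8 points the 256 possible codes spread the vectors into small clusters.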
In some embodiments, pushing information to a user based on the history push information cluster includes one or more of the following: pushing to the user the history push information in the cluster whose age relative to the current time is less than a predetermined duration; pushing to the user the history push information in the cluster whose score is higher than a predetermined score; deleting the history push information dated before a preset time; and deleting the history push information whose score is lower than a preset threshold.
In some embodiments, pushing information to a user based on the history push information cluster includes one or more of the following: pushing to the user the history push information in the cluster whose age relative to the current time is less than a predetermined duration and whose score is higher than a predetermined score; and deleting the history push information that is dated before a preset time and whose score is lower than a preset threshold.
In a second aspect, this application provides an apparatus for pushing information, the apparatus comprising: an information obtaining unit for obtaining history push information; a data obtaining unit for inputting the history push information into a deep learning model to obtain the hidden-layer output data of the deep learning model; a data encoding unit for encoding the hidden-layer output data to obtain coded data; an information clustering unit for clustering the history push information having identical coded data to obtain history push information clusters; and an information pushing unit for pushing information to a user based on a history push information cluster in response to the quantity of history push information in that cluster being less than a predetermined threshold.
In some embodiments, the data encoding unit includes: a primary encoding subunit for primary-encoding the hidden-layer output data to obtain primary coded data; and a secondary encoding subunit for secondary-encoding the primary coded data to obtain secondary coded data. Correspondingly, the information clustering unit includes: an information clustering subunit for clustering the history push information having identical secondary coded data to obtain history push information clusters.
In some embodiments, the apparatus further includes: a parameter adjusting unit for, in response to the quantity of history push information in the history push information cluster being greater than or equal to the predetermined threshold, adjusting the coding parameter used by the secondary encoding subunit according to a preset rule to improve the compression ratio, and calling the secondary encoding subunit to secondary-encode the primary coded data to obtain secondary coded data.
In some embodiments, the primary encoding in the primary encoding subunit includes one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s to obtain output data in Boolean vector format; convolving the input data and pooling the convolved data to obtain pooled-convolution output data; and sampling or truncating the input data to obtain sampled or truncated output data. Additionally or alternatively, the secondary encoding in the secondary encoding subunit includes one or more of the same set of operations.
In some embodiments, the parameter adjusting unit is further configured to increase the number of sampled points of the sampling according to the preset rule to improve the compression ratio.
In some embodiments, the information pushing unit is further configured for one or more of the following: pushing to the user the history push information in the history push information cluster whose age relative to the current time is less than a predetermined duration; pushing to the user the history push information in the cluster whose score is higher than a predetermined score; deleting the history push information dated before a preset time; and deleting the history push information whose score is lower than a preset threshold.
In some embodiments, the information pushing unit is further configured for one or more of the following: pushing to the user the history push information in the history push information cluster whose age relative to the current time is less than a predetermined duration and whose score is higher than a predetermined score; and deleting the history push information that is dated before a preset time and whose score is lower than a preset threshold.
The method and apparatus for pushing information provided by this application first obtain history push information, then input it into a deep learning model to obtain the model's hidden-layer output data, then encode the hidden-layer output data to obtain coded data, then cluster the history push information having identical coded data to obtain history push information clusters, and finally, in response to the quantity of history push information in a cluster being less than a predetermined threshold, push information to a user based on that cluster. This balances the efficiency of clustering with the precision of the clustering result, thereby improving the efficiency and targeting of information pushing.
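The flow just summarized can be sketched end to end. Everything below is a stand-in under stated assumptions: `hidden_layer` fakes a deep model's hidden-layer output with a deterministic pseudo-random vector, the code is a simple sign binarization, and all names and the threshold are illustrative, not from the patent.

```python
import hashlib
import random
from collections import defaultdict

def hidden_layer(text, dim=16):
    """Stand-in for a deep learning model: map an item of history push
    information to a dense float vector (its hidden-layer output)."""
    seed = int(hashlib.md5(text.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return [rng.gauss(0, 1) for _ in range(dim)]

def code(vector):
    """Encoding step: floating-point vector -> hashable Boolean code."""
    return tuple(1 if x > 0.0 else 0 for x in vector)

history = [f"question-{i}" for i in range(50)]  # hypothetical items

# Cluster history push information that shares an identical code.
clusters = defaultdict(list)
for item in history:
    clusters[code(hidden_layer(item))].append(item)

# Push only from clusters smaller than the predetermined threshold.
THRESHOLD = 5
pushed = [item for c in clusters.values() if len(c) < THRESHOLD for item in c]
```

Because identical codes are compared by hashing rather than by pairwise floating-point similarity, the clustering cost grows linearly with the number of items — the efficiency the summary claims.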
Description of the drawings
Other features, objects, and advantages of the application will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which the application may be applied;
Fig. 2 is a schematic flow chart of one embodiment of the method for pushing information according to the application;
Fig. 3 is a schematic flow chart of another embodiment of the method for pushing information according to the application;
Fig. 4 is a schematic flow chart of a third embodiment of the method for pushing information according to the application;
Fig. 5 is an exemplary structural diagram of one embodiment of the apparatus for pushing information according to the application;
Fig. 6 is a schematic structural diagram of a computer system suitable for implementing a terminal device or server of the embodiments of the application.
Specific embodiments
The application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are used only to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts relevant to the related invention are shown in the drawings.
It should be noted that, in the absence of conflict, the embodiments of the application and the features in the embodiments may be combined with one another. The application is described in detail below with reference to the drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 in which embodiments of the method for pushing information or the apparatus for pushing information of the application may be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and servers 105, 106. The network 104 serves as the medium providing communication links between the terminal devices 101, 102, 103 and the servers 105, 106, and may include various connection types, such as wired links, wireless communication links, or fiber-optic cables.
The terminal devices 101, 102, 103 may be various electronic devices that have a display screen and support a search function, including but not limited to smartphones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop portable computers, desktop computers, and so on.
The servers 105, 106 may be servers providing various services, for example back-end servers providing support for the terminal devices 101, 102, 103. A back-end server may analyze and otherwise process received data such as requests, and feed the processing result (e.g., push targets of interest to the user) back to the terminal devices.
It should be noted that the method for pushing information provided by the embodiments of the application is generally executed by the servers 105, 106, and correspondingly, the apparatus for pushing information is generally disposed in the servers 105, 106.
It should be understood that the numbers of terminal devices, networks, and servers in Fig. 1 are merely illustrative; there may be any number of terminal devices, networks, and servers, according to implementation needs.
With continued reference to Fig. 2, a flow 200 of one embodiment of the method for pushing information according to the application is shown. The method for pushing information comprises the following steps:
In this embodiment, the electronic device on which the method for pushing information runs (e.g., the server shown in Fig. 1) may obtain history push information before receiving a question search request sent by a user terminal, or upon receiving such a request. The history push information here may be information pushed to users within a period of time before it is obtained, for example the questions and their answers pushed to users within the preceding period.
In this embodiment, the deep learning model is a multilayer perceptron containing multiple hidden layers, which combines low-level features to form more abstract high-level attribute categories or features, so as to discover distributed feature representations of the data. The deep learning model here may be a deep learning model of the prior art or of a technology developed in the future; the application does not limit this. After the history push information obtained in step 201 is input into the deep learning model, the hidden-layer output data of the deep learning model can be obtained.
In some optional implementations of this embodiment, the deep learning model may include any one of a recurrent neural network model, a convolutional neural network model, and a recursive neural network model.
A recurrent neural network (RNN) model can memorize preceding information and apply it to the computation of the current output; the nodes between hidden layers are connected, and the input of a hidden layer includes not only the output of the input layer but also the output of the hidden layer at the previous moment. A convolutional neural network (CNN) model is a feedforward neural network whose artificial neurons respond to surrounding cells within part of the coverage area; it may include convolutional layers and pooling layers. A recursive neural network model differs from the training algorithm of the recurrent neural network model, but the two are variants of the same class of algorithm; it mainly constructs a more complex deep network by recursion over a similar neural network structure. It should be noted that the above recurrent, convolutional, and recursive neural network models are well-known techniques that are currently widely studied and applied, and are not described again here.
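As a concrete illustration of "hidden-layer output data", the forward pass of a toy multilayer perceptron can be stopped at a hidden layer and the activations taken as the item's representation. This is a minimal sketch with made-up random weights and invented names, not the patent's model:

```python
import math
import random

def mlp_hidden(x, w1, b1):
    """Forward pass up to the hidden layer only: the tanh activations,
    not the final prediction, serve as the representation to encode."""
    return [math.tanh(sum(xi * wij for xi, wij in zip(x, row)) + b)
            for row, b in zip(w1, b1)]

random.seed(0)
dim_in, dim_hidden = 8, 4
w1 = [[random.gauss(0, 1) for _ in range(dim_in)] for _ in range(dim_hidden)]
b1 = [0.0] * dim_hidden

x = [random.random() for _ in range(dim_in)]  # a featurized history item
hidden = mlp_hidden(x, w1, b1)                # 4 hidden activations
```

In practice the model would be an RNN or CNN trained on the push corpus; only the idea of reading out an intermediate layer is shown here.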
In this embodiment, encoding refers to encoding the hidden-layer output data to obtain coded data, that is, a process of converting from one form or format to another. It may include one or more of the following: dimensionality reduction, which performs data reduction on the input data, i.e., reduces its dimension, to obtain the output data; mapping input data in floating-point vector format to 1s and 0s to obtain output data in Boolean vector format; convolving the input data and pooling the convolved data to obtain pooled-convolution output data; and sampling or truncating the input data to obtain sampled or truncated output data. The dimensionality reduction here may include at least any one of linear dimensionality reduction, kernel-function-based nonlinear dimensionality reduction, and eigenvalue-based nonlinear dimensionality reduction.
The encoding here may be an encoding mode of the prior art or of a technology developed in the future; the application does not limit this.
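Each of the listed operations can be sketched in a few lines. The helpers below are illustrative assumptions (the kernel, pool size, and names are invented), not the patent's concrete codec:

```python
def binarize(v):
    """Map a floating-point vector to a Boolean (1/0) vector."""
    return [1 if x > 0.0 else 0 for x in v]

def conv_pool(v, kernel=(0.5, 0.5), pool=2):
    """1-D convolution followed by max pooling over the convolved data."""
    conv = [sum(k * v[i + j] for j, k in enumerate(kernel))
            for i in range(len(v) - len(kernel) + 1)]
    return [max(conv[i:i + pool]) for i in range(0, len(conv), pool)]

def sample(v, points):
    """Sampling/truncation: keep `points` evenly spaced entries."""
    step = max(1, len(v) // points)
    return v[::step][:points]

v = [0.3, -1.2, 0.8, 0.1, -0.4, 2.0, -0.7, 0.5]
print(binarize(v))             # [1, 0, 1, 1, 0, 1, 0, 1]
print(sample(binarize(v), 4))  # [1, 1, 0, 0]
```

Each operation shrinks or discretizes the hidden-layer vector, so that equality comparison of the resulting codes is cheap.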
In this embodiment, based on the coded data obtained in step 203, the history push information having identical coded data can be clustered to obtain multiple history push information clusters, that is, sets of history push information sharing the same coded data.
In this embodiment, based on the history push information clusters obtained in step 204, if the quantity of history push information in a cluster is less than the predetermined threshold, this indicates that the compression ratio is appropriate, and information can be pushed to the user based on the similarity of the history push information in that cluster.
In some optional implementations of this embodiment, pushing information to the user based on the history push information cluster may include one or more of the following: pushing to the user the history push information in the cluster whose age relative to the current time is less than a predetermined duration; pushing to the user the history push information in the cluster whose score is higher than a predetermined score; deleting the history push information dated before a preset time; and deleting the history push information whose score is lower than a preset threshold.
Here, pushing to the user the history push information in the cluster whose age relative to the current time is less than a predetermined duration ensures the timeliness of the pushed information. Pushing to the user the history push information in the cluster whose score is higher than a predetermined score ensures that the pushed information is high-quality information. Deleting the history push information dated before a preset time, and deleting the history push information whose score is lower than a preset threshold, removes the relatively old or poorly scored items from the history push information library, which improves clustering efficiency and thereby the efficiency of information pushing.
It should be appreciated that the predetermined score here may be a preset score for screening history push information, and may be determined according to various parameters of a question and its answers, for example one or more of user ratings, number of likes, expert review, and answerer identity verification.
In some optional implementations of this embodiment, pushing information to the user based on the history push information cluster includes one or more of the following: pushing to the user the history push information in the cluster whose age relative to the current time is less than a predetermined duration and whose score is higher than a predetermined score; and deleting the history push information that is dated before a preset time and whose score is lower than a preset threshold.
Here, pushing to the user the history push information that is both recent and highly scored improves the timeliness and quality of the pushed information, while deleting the history push information that is both old and poorly scored improves the efficiency of pushing information to the user.
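The combined recency-and-score rules reduce to simple filters over a cluster. The record layout, field values, and thresholds below are illustrative assumptions, not data from the patent:

```python
from datetime import datetime, timedelta

now = datetime(2016, 7, 20, 12, 0)
cluster = [  # (item, pushed_at, score) -- hypothetical records
    ("q1", now - timedelta(days=1),  4.5),
    ("q2", now - timedelta(days=40), 4.8),
    ("q3", now - timedelta(days=2),  2.1),
    ("q4", now - timedelta(days=50), 1.0),
]

MAX_AGE = timedelta(days=30)   # the "predetermined duration"
MIN_SCORE = 4.0                # the "predetermined score"

# Push items that are both recent and highly scored.
to_push = [t for t, ts, s in cluster
           if now - ts < MAX_AGE and s > MIN_SCORE]
# Delete items that are both stale and poorly scored.
to_delete = [t for t, ts, s in cluster
             if now - ts >= MAX_AGE and s < MIN_SCORE]
print(to_push, to_delete)  # ['q1'] ['q4']
```

Note that q2 (old but well scored) and q3 (recent but poorly scored) are neither pushed nor deleted under the combined rule, matching the conjunctive conditions described above.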
The method provided by the above embodiment of the application achieves efficient and well-targeted information pushing by balancing clustering precision and clustering efficiency.
With continued reference to Fig. 3, a flow 300 of another embodiment of the method for pushing information according to the application is shown. The method for pushing information comprises the following steps:
In this embodiment, the electronic device on which the method for pushing information runs (e.g., the server shown in Fig. 1) may obtain history push information before receiving a question search request sent by a user terminal, or upon receiving such a request. The history push information here may be information pushed to users within a period of time before it is obtained, for example the questions and their answers pushed to users within the preceding period.
In this embodiment, the deep learning model is a multilayer perceptron containing multiple hidden layers, which combines low-level features to form more abstract high-level attribute categories or features, so as to discover distributed feature representations of the data. The deep learning model here may be a deep learning model of the prior art or of a technology developed in the future; the application does not limit this. After the history push information obtained in step 301 is input into the deep learning model, the hidden-layer output data of the deep learning model can be obtained.
In some optional implementations of this embodiment, the deep learning model may include any one of a recurrent neural network model, a convolutional neural network model, and a recursive neural network model.
In this embodiment, primary encoding refers to encoding the hidden-layer output data to obtain primary coded data; encoding here refers to a process of converting from one form or format to another. The primary encoding usually has a relatively high compression ratio, and may be an encoding mode of the prior art or of a technology developed in the future; the application does not limit this.
In some optional implementations of this embodiment, the primary encoding may include one or more of the following: dimensionality reduction, which performs data reduction on the input data, i.e., reduces its dimension, to obtain the output data; mapping input data in floating-point vector format to 1s and 0s to obtain output data in Boolean vector format; convolving the input data and pooling the convolved data to obtain pooled-convolution output data; and sampling or truncating the input data to obtain sampled or truncated output data. For example, in a specific encoding scenario, the input data in floating-point vector format may first be mapped to 1s and 0s to obtain output data in Boolean vector format; that output data may then be used as the input data of a convolution, the convolved data may be pooled to obtain pooled-convolution output data, and the pooled-convolution output data may then be sampled or truncated to obtain the final output data.
In this embodiment, secondary encoding refers to encoding the primary coded data obtained in step 303 to obtain secondary coded data. The encoding here may be an encoding mode of the prior art or of a technology developed in the future; the application does not limit this.
Here, the secondary encoding may use the same encoding mode as the primary encoding — for example, both may use convolution-and-pooling encoding — or a different one, for example convolution-and-pooling for the primary encoding and sampling for the secondary encoding.
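A two-stage codec in which the two stages use different modes (convolution-and-pooling, then binarize-and-sample) can be sketched as follows; the kernel, pool size, point count, and input values are illustrative assumptions:

```python
def conv_pool(v, kernel=(1.0, -1.0), pool=2):
    """Primary encoding mode: 1-D convolution then max pooling."""
    conv = [sum(k * v[i + j] for j, k in enumerate(kernel))
            for i in range(len(v) - len(kernel) + 1)]
    return [max(conv[i:i + pool]) for i in range(0, len(conv), pool)]

def binarize_sample(v, points):
    """Secondary encoding mode: binarize, then keep evenly spaced points."""
    bits = [1 if x > 0.0 else 0 for x in v]
    step = max(1, len(bits) // points)
    return tuple(bits[::step][:points])

hidden = [0.9, -0.3, 0.4, 0.4, -1.1, 0.7, 0.2, -0.5]  # hidden-layer output
primary = conv_pool(hidden)              # first stage: 8 floats -> 4 floats
secondary = binarize_sample(primary, 2)  # second stage: short hashable code
```

The second stage turns the floating-point intermediate into a tiny Boolean tuple, which is what makes identical-code clustering a constant-time hash lookup.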
In some optional implementations of this embodiment, the secondary encoding may include one or more of the following: dimensionality reduction, which performs data reduction on the input data, i.e., reduces its dimension, to obtain the output data; mapping input data in floating-point vector format to 1s and 0s to obtain output data in Boolean vector format; convolving the input data and pooling the convolved data to obtain pooled-convolution output data; and sampling or truncating the input data to obtain sampled or truncated output data.
In this embodiment, based on the secondary coded data obtained in step 304, the history push information having identical secondary coded data can be clustered to obtain multiple history push information clusters, that is, sets of history push information sharing the same secondary coded data.
In this embodiment, based on the history push information clusters obtained in step 305, if the quantity of history push information in a cluster is less than the predetermined threshold, this indicates that the compression ratio is appropriate, and information can be pushed to the user based on the similarity of the history push information in that cluster.
It should be appreciated that steps 301, 302, and 306 in the method described with reference to Fig. 3 correspond respectively to steps 201, 202, and 205 in the method described with reference to Fig. 2. Accordingly, the operations and features described above for steps 201, 202, and 205 apply equally to steps 301, 302, and 306, and are not described again here.
The method provided by the above embodiment of the application achieves efficient and well-targeted information pushing by balancing clustering precision and clustering efficiency.
Referring to Fig. 4, a schematic flow chart of a third embodiment of the method for pushing information according to the application is shown.
In Fig. 4, the method 400 for pushing information comprises the following steps:
In step 401, history push information is obtained; step 402 is then executed.
In this embodiment, the electronic device on which the method for pushing information runs (e.g., the server shown in Fig. 1) may obtain history push information before receiving a question search request sent by a user terminal, or upon receiving such a request. The history push information here may be information pushed to users within a period of time before it is obtained, for example the questions and their answers pushed to users within the preceding period.
In step 402, the history pushed information is input into a deep learning model to obtain the hidden-layer output data of the model, and step 403 is executed afterwards.
In the present embodiment, a deep learning model is a multilayer perceptron containing multiple hidden layers. By combining low-level features, it forms more abstract high-level representations of attribute categories or features, thereby discovering distributed feature representations of the data. The deep learning model here may be an existing deep learning model or one developed in the future; the application does not limit this. After the history pushed information obtained in step 401 is input into the deep learning model, the hidden-layer output data of the model can be obtained.
In some optional implementations of the present embodiment, the deep learning model may include any one of a recurrent neural network model, a convolutional neural network model, and a recursive neural network model.
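As an illustration only, the hidden-layer output of step 402 can be sketched with a toy multilayer perceptron; the weights, dimensions, and activation below are hypothetical stand-ins, not part of the disclosed method, and any of the network models named above could supply this output instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny multilayer perceptron
# (input dim 8 -> hidden dim 4 -> output dim 2).
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((4, 2))

def hidden_layer_output(x):
    # The scheme clusters on the hidden-layer activations,
    # not on the network's final prediction.
    return np.tanh(x @ W1)

x = rng.standard_normal(8)   # a vectorized history pushed item
h = hidden_layer_output(x)
print(h.shape)               # (4,)
```

The point of the sketch is only that the clustering pipeline consumes `h`, the hidden-layer activations, rather than the model's output layer.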
In step 403, the hidden-layer output data is preliminarily encoded to obtain preliminary coded data, and step 404 is executed afterwards.
In the present embodiment, dimensionality reduction refers to data reduction, that is, lowering the dimension of the hidden-layer output data obtained in step 402. Preliminary encoding refers to encoding the hidden-layer output data to obtain preliminary coded data, where encoding means converting data from one form or format into another. The preliminary encoding here usually has a relatively high compression ratio; it may be an existing coding mode or one developed in the future, and the application does not limit this.
In some optional implementations of the present embodiment, the preliminary encoding may include one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data.
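One of the listed options, mapping a floating-point vector to a Boolean vector, can be sketched as follows; the fixed threshold of 0.0 is an assumption for illustration, not a value specified by the application.

```python
import numpy as np

def preliminary_encode(h, threshold=0.0):
    # Map a floating-point vector to 1s and 0s (Boolean vector format).
    # The threshold is a hypothetical choice of coding parameter.
    return (h > threshold).astype(np.uint8)

h = np.array([0.7, -0.3, 0.1, -0.9])
code = preliminary_encode(h)
print(code.tolist())  # [1, 0, 1, 0]
```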
In step 404, the preliminary coded data is secondarily encoded to obtain secondary coded data, and step 405 is executed afterwards.
In the present embodiment, secondary encoding refers to encoding the preliminary coded data obtained in step 403 to obtain secondary coded data. The coding mode here may be an existing one or one developed in the future; the application does not limit this.
In some optional implementations of the present embodiment, the secondary encoding may include one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data.
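As a sketch of the sampling option for the secondary encoding, the stride below is a hypothetical coding parameter; it is exactly this kind of parameter that step 408 later adjusts to change the compression ratio.

```python
def secondary_encode(code, step=2):
    # Sample every `step`-th element of the preliminary code;
    # a larger stride keeps fewer elements (higher compression).
    return tuple(code[::step])

result = secondary_encode((1, 0, 1, 0, 1, 1))
print(result)  # (1, 1, 1)
```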
In step 405, history pushed information items having identical secondary coded data are clustered to obtain history pushed information clusters, and step 406 is executed afterwards.
In the present embodiment, based on the secondary coded data obtained in step 404, history pushed information with identical secondary coded data can be clustered into multiple history pushed information clusters, that is, sets of history pushed information sharing the same secondary coded data.
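The grouping in step 405 amounts to bucketing items by their code. A minimal sketch follows, in which the length-parity encoder is purely a hypothetical stand-in for the step 402 to 404 pipeline:

```python
from collections import defaultdict

def cluster_by_code(items, encode):
    # Items whose codes are identical fall into the same cluster.
    clusters = defaultdict(list)
    for item in items:
        clusters[encode(item)].append(item)
    return dict(clusters)

items = ["how to reset password", "reset my password", "best pizza nearby"]
clusters = cluster_by_code(items, lambda s: len(s) % 2)
```

With a real encoder, items whose hidden-layer representations compress to the same code land in one cluster, so only within-cluster pairwise similarity ever needs computing.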
In step 406, it is judged whether the number of history pushed information items in a history pushed information cluster is less than a predetermined threshold; if so, step 407 is executed, and if not, step 408 is executed.
In the present embodiment, the predetermined threshold is the predetermined number of history pushed information items in a cluster for which pairwise similarity calculation can still be performed efficiently and with high precision. By judging whether the number of items in a history pushed information cluster is below this threshold, it can be determined whether efficient, high-precision pairwise similarity calculation is feasible for that cluster.
In step 407, in response to the number of history pushed information items in the history pushed information cluster being less than the predetermined threshold, information is pushed to the user based on the history pushed information cluster.
In the present embodiment, when the number of items in the cluster obtained in step 406 is less than the predetermined threshold, the compression ratio is appropriate and pairwise similarity calculation can be performed efficiently and with high precision, so information is pushed to the user based on the similarity of the history pushed information within the cluster.
In step 408, in response to the number of history pushed information items in the history pushed information cluster being greater than or equal to the predetermined threshold, the coding parameter of the secondary encoding is adjusted according to a preset rule to increase the compression ratio, and step 404 is executed again afterwards.
In the present embodiment, when the number of items in the cluster obtained in step 406 is greater than or equal to the predetermined threshold, the compression ratio is too low, and pairwise similarity calculation would be both inefficient and imprecise. The compression ratio may therefore be increased and the secondary encoding re-run, in the hope of obtaining history pushed information clusters with a more appropriate number of items, so that pairwise similarity calculation can be carried out efficiently and with high precision.
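Steps 404 to 408 together form a feedback loop. The sketch below assumes a concrete secondary encoding (keep the first k elements of each preliminary code) and the preset rule of the later claims (increase the number of sampled points); both are illustrative choices, not the only ones the application covers.

```python
from collections import defaultdict

def adaptive_cluster(codes, threshold, max_rounds=8):
    k = 1  # coding parameter: number of sampled points kept
    clusters = {}
    for _ in range(max_rounds):
        clusters = defaultdict(list)
        for code in codes:
            clusters[tuple(code[:k])].append(code)  # secondary encoding
        if all(len(c) < threshold for c in clusters.values()):
            break  # every cluster is small enough for pairwise similarity
        k += 1     # preset rule: sample more points, redo the encoding
    return dict(clusters)

codes = [(1, 0, 1), (1, 1, 0), (0, 1, 1), (0, 1, 0)]
result = adaptive_cluster(codes, threshold=2)
print(len(result))  # 4
```

Here each round that leaves an oversized cluster re-encodes with one more sampled point, splitting the cluster until all clusters fall below the threshold or the round limit is reached.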
It should be appreciated that steps 401 to 405 of the method described with reference to Fig. 4 correspond to steps 301 to 305 of the method described with reference to Fig. 3. As a result, the operations and features described above with respect to steps 301 to 305 are equally applicable to steps 401 to 405, and are not described again here.
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 3, the flow 400 of the method for pushing information in the present embodiment highlights judging whether the number of history pushed information items in a cluster is less than the predetermined threshold and, if not, adjusting the coding parameter of the secondary encoding according to a preset rule to increase the compression ratio and jumping back to the secondary encoding step. The scheme described in the present embodiment can thus dynamically correct the number of history pushed information items in each cluster, achieving more accurate and more efficient information push.
With further reference to Fig. 5, as an implementation of the above methods, the application provides an embodiment of a device for pushing information. This device embodiment corresponds to the method embodiment shown in FIG. 2, and the device can be applied to various electronic devices.
As shown in Fig. 5, the device 500 for pushing information described in the present embodiment includes an information acquisition unit 510, a data acquisition unit 520, a data encoding unit 530, an information clustering unit 540, and an information push unit 550.
The information acquisition unit 510 is used to obtain history pushed information. The data acquisition unit 520 is used to input the history pushed information into a deep learning model and obtain the hidden-layer output data of the model. The data encoding unit 530 is used to encode the hidden-layer output data and obtain coded data. The information clustering unit 540 is used to cluster history pushed information having identical coded data and obtain history pushed information clusters. The information push unit 550 is used to push information to the user based on a history pushed information cluster in response to the number of history pushed information items in that cluster being less than a predetermined threshold.
The deep learning model here may include any one of the following: a recurrent neural network model, a convolutional neural network model, and a recursive neural network model.
In some optional implementations of the present embodiment, the data encoding unit 530 includes: a preliminary encoding subunit 531, for preliminarily encoding the hidden-layer output data to obtain preliminary coded data; and a secondary encoding subunit 532, for secondarily encoding the preliminary coded data to obtain secondary coded data; and the information clustering unit 540 includes: an information clustering subunit 541, for clustering history pushed information having identical secondary coded data to obtain history pushed information clusters.
In some optional implementations of the present embodiment, the device further includes a parameter adjustment unit 560 which, in response to the number of history pushed information items in a history pushed information cluster being greater than or equal to the predetermined threshold, adjusts the coding parameter used by the secondary encoding subunit according to a preset rule to increase the compression ratio, and calls the secondary encoding subunit to secondarily encode the preliminary coded data again and obtain secondary coded data. This optional implementation corresponds to the method embodiment shown in Fig. 4.
In some optional implementations of the present embodiment, the preliminary encoding in the preliminary encoding subunit includes one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data; and/or the secondary encoding in the secondary encoding subunit includes one or more of the same options. The dimensionality reduction here may include at least any one of the following: linear dimensionality reduction, nonlinear dimensionality reduction based on kernel functions, and nonlinear dimensionality reduction based on eigenvalues.
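Of the dimensionality-reduction options just listed, linear dimensionality reduction can be sketched with a plain PCA projection; the use of an SVD and the choice of two retained components are illustrative only.

```python
import numpy as np

def linear_reduce(X, k):
    # Project centered data onto its top-k principal directions
    # (linear dimensionality reduction via PCA).
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

X = np.random.default_rng(1).standard_normal((10, 5))
Y = linear_reduce(X, 2)
print(Y.shape)  # (10, 2)
```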
In some optional implementations of the present embodiment, the parameter adjustment unit is further used to increase the number of sampling points used in the sampling according to a preset rule, so as to increase the compression ratio.
In some optional implementations of the present embodiment, the information push unit is further used for one or more of the following: pushing to the user the history pushed information in the cluster whose age relative to the current time is less than a predetermined duration; pushing to the user the history pushed information in the cluster whose score is higher than a predetermined score; deleting history pushed information from before a preset time; and deleting history pushed information whose score is lower than a preset threshold.
In some optional implementations of the present embodiment, the information push unit is further used for one or more of the following: pushing to the user the history pushed information in the cluster whose age relative to the current time is less than a predetermined duration and whose score is higher than a predetermined score; and deleting history pushed information from before a preset time whose score is lower than a preset threshold.
It should be appreciated that units 510 to 550 of the device described with reference to Fig. 5 correspond to steps 201 to 205 of the method described in Fig. 2, that subunits 531 and 532 correspond to steps 303 and 304 described in Fig. 3, and that units 510 to 560 correspond to steps 401 to 408 of the method described in Fig. 4. As a result, the operations and features described above with respect to steps 201 to 205 are equally applicable to units 510 to 550, those described with respect to steps 303 and 304 are equally applicable to subunits 531 and 532, and those described with respect to steps 401 to 408 are equally applicable to units 510 to 560; they are not described again here.
As can be seen from Fig. 5, the device 500 for pushing information in the present embodiment preliminarily encodes and then secondarily encodes the hidden-layer output data that a deep learning model produces for the history pushed information, and pushes information based on a history pushed information cluster in response to the number of history pushed information items in that cluster being less than a predetermined threshold. The present embodiment can thereby balance clustering precision against clustering efficiency, achieving efficient and well-targeted information push.
Further, in some implementations, the device 500 for pushing information responds to the number of history pushed information items in a cluster being greater than or equal to the predetermined threshold by adjusting the coding parameter of the secondary encoding according to a preset rule to increase the compression ratio and re-executing the secondary encoding. The scheme described in the present embodiment can thus dynamically correct the number of history pushed information items in each cluster, achieving more accurate and more efficient information push.
In the above embodiments of the application, the terms preliminary encoding and secondary encoding merely indicate that the two encodings have different source and output data; the two encodings may use the same coding mode or different coding modes. Those skilled in the art should appreciate that the words preliminary and secondary do not constitute a particular limitation on the encodings.
Similarly, in the above embodiments of the application, preliminary coded data and secondary coded data merely denote the output data of the preliminary encoding and the output data of the secondary encoding. Those skilled in the art should appreciate that the words preliminary and secondary do not constitute a particular limitation on the coded data.
Referring now to Fig. 6, it shows a schematic structural diagram of a computer system 600 suitable for implementing the terminal device or server of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The RAM 603 also stores the various programs and data required for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to one another through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a cathode ray tube (CRT) or liquid crystal display (LCD), a loudspeaker, and the like; a storage section 608 including a hard disk and the like; and a communications section 609 including a network interface card such as a LAN card or a modem. The communications section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, in accordance with embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the disclosure includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for executing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communications section 609 and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above-mentioned functions defined in the methods of the present application are performed.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of the systems, methods, and computer program products according to the various embodiments of the application. In this regard, each box in a flowchart or block diagram may represent a unit, a program segment, or a part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that indicated in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each box in the block diagrams and/or flowcharts, and combinations of boxes in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor; for example, a processor may be described as including an information acquisition unit, a data acquisition unit, a data encoding unit, an information clustering unit, and an information push unit. The names of these units do not, under certain conditions, constitute a limitation on the units themselves; for example, the information acquisition unit may also be described as a unit for obtaining history pushed information.
As another aspect, the present application also provides a non-volatile computer storage medium. The non-volatile computer storage medium may be the non-volatile computer storage medium included in the device of the above embodiments, or it may exist separately without being assembled into a terminal. The non-volatile computer storage medium stores one or more programs which, when executed by a device, cause the device to: obtain history pushed information; input the history pushed information into a deep learning model and obtain the hidden-layer output data of the deep learning model; encode the hidden-layer output data and obtain coded data; cluster history pushed information having identical coded data and obtain history pushed information clusters; and, in response to the number of history pushed information items in a history pushed information cluster being less than a predetermined threshold, push information to a user based on the history pushed information cluster.
The above description is only a preferred embodiment of the application and an explanation of the technical principles applied. Those skilled in the art should appreciate that the scope of the invention involved in the application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions formed by replacing the above features with technical features of similar function disclosed (but not limited to) in the application.
Claims (14)
1. A method for pushing information, characterized in that the method comprises:
obtaining history pushed information;
inputting the history pushed information into a deep learning model to obtain hidden-layer output data of the deep learning model;
encoding the hidden-layer output data to obtain coded data;
clustering history pushed information having identical coded data to obtain a history pushed information cluster;
in response to the number of history pushed information items in the history pushed information cluster being less than a predetermined threshold, pushing information to a user based on the history pushed information cluster.
2. The method according to claim 1, characterized in that encoding the hidden-layer output data to obtain coded data comprises: preliminarily encoding the hidden-layer output data to obtain preliminary coded data; and secondarily encoding the preliminary coded data to obtain secondary coded data; and
clustering the history pushed information having identical coded data to obtain a history pushed information cluster comprises: clustering history pushed information having identical secondary coded data to obtain a history pushed information cluster.
3. The method according to claim 2, characterized in that the method further comprises:
in response to the number of history pushed information items in the history pushed information cluster being greater than or equal to the predetermined threshold, adjusting the coding parameter of the secondary encoding according to a preset rule to increase the compression ratio, and executing the step of secondarily encoding the preliminary coded data to obtain secondary coded data.
4. The method according to claim 3, characterized in that the preliminary encoding comprises one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data; and/or
the secondary encoding comprises one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data.
5. The method according to claim 4, characterized in that adjusting the coding parameter of the secondary encoding according to a preset rule to increase the compression ratio comprises:
increasing the number of sampling points of the sampling according to the preset rule to increase the compression ratio.
6. The method according to any one of claims 1-5, characterized in that pushing information to the user based on the history pushed information cluster comprises one or more of the following:
pushing to the user the history pushed information in the history pushed information cluster whose age relative to the current time is less than a predetermined duration;
pushing to the user the history pushed information in the history pushed information cluster whose score is higher than a predetermined score;
deleting history pushed information from before a preset time; and
deleting history pushed information whose score is lower than a preset threshold.
7. The method according to any one of claims 1-5, characterized in that pushing information to the user based on the history pushed information cluster comprises one or more of the following:
pushing to the user the history pushed information in the history pushed information cluster whose age relative to the current time is less than a predetermined duration and whose score is higher than a predetermined score;
deleting history pushed information from before a preset time whose score is lower than a preset threshold.
8. A device for pushing information, characterized in that the device comprises:
an information acquisition unit, for obtaining history pushed information;
a data acquisition unit, for inputting the history pushed information into a deep learning model to obtain hidden-layer output data of the deep learning model;
a data encoding unit, for encoding the hidden-layer output data to obtain coded data;
an information clustering unit, for clustering history pushed information having identical coded data to obtain a history pushed information cluster;
an information push unit, for pushing information to a user based on the history pushed information cluster in response to the number of history pushed information items in the history pushed information cluster being less than a predetermined threshold.
9. The device according to claim 8, characterized in that the data encoding unit comprises: a preliminary encoding subunit, for preliminarily encoding the hidden-layer output data to obtain preliminary coded data; and a secondary encoding subunit, for secondarily encoding the preliminary coded data to obtain secondary coded data; and
the information clustering unit comprises: an information clustering subunit, for clustering history pushed information having identical secondary coded data to obtain a history pushed information cluster.
10. The device according to claim 9, characterized in that the device further comprises:
a parameter adjustment unit, for adjusting, in response to the number of history pushed information items in the history pushed information cluster being greater than or equal to the predetermined threshold, the coding parameter used by the secondary encoding subunit according to a preset rule to increase the compression ratio, and calling the secondary encoding subunit to secondarily encode the preliminary coded data to obtain secondary coded data.
11. The device according to claim 10, characterized in that the preliminary encoding in the preliminary encoding subunit comprises one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data; and/or
the secondary encoding in the secondary encoding subunit comprises one or more of the following: dimensionality reduction; mapping input data in floating-point vector format to 1s and 0s, obtaining output data in Boolean vector format; convolving the input data and pooling the convolution result, obtaining pooled convolution output data; and sampling or truncating the input data, obtaining sampled or truncated output data.
12. The device according to claim 11, characterized in that the parameter adjustment unit is further used for:
increasing the number of sampling points of the sampling according to the preset rule to increase the compression ratio.
13. The device according to any one of claims 8-12, characterized in that the information push unit is further used for one or more of the following:
pushing to the user the history pushed information in the history pushed information cluster whose age relative to the current time is less than a predetermined duration;
pushing to the user the history pushed information in the history pushed information cluster whose score is higher than a predetermined score;
deleting history pushed information from before a preset time; and
deleting history pushed information whose score is lower than a preset threshold.
14. The device according to any one of claims 8-12, characterized in that the information push unit is further used for one or more of the following:
pushing to the user the history pushed information in the history pushed information cluster whose age relative to the current time is less than a predetermined duration and whose score is higher than a predetermined score;
deleting history pushed information from before a preset time whose score is lower than a preset threshold.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610576501.3A CN106227792B (en) | 2016-07-20 | 2016-07-20 | Method and apparatus for pushed information |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106227792A CN106227792A (en) | 2016-12-14 |
CN106227792B true CN106227792B (en) | 2019-10-15 |
Family
ID=57531212
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610576501.3A Active CN106227792B (en) | 2016-07-20 | 2016-07-20 | Method and apparatus for pushed information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106227792B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108733694B (en) * | 2017-04-18 | 2022-02-25 | 北京国双科技有限公司 | Retrieval recommendation method and device |
CN107943943B (en) * | 2017-11-23 | 2020-11-03 | 北京小度信息科技有限公司 | User similarity determination method and device, electronic equipment and storage medium |
CN108322363B (en) * | 2018-02-12 | 2020-11-13 | 腾讯科技(深圳)有限公司 | Pushed data abnormity monitoring method and device, computer equipment and storage medium |
CN110309293A (en) * | 2018-02-13 | 2019-10-08 | 北京京东尚科信息技术有限公司 | Text recommended method and device |
CN109242109B (en) * | 2018-07-06 | 2022-05-10 | 网宿科技股份有限公司 | Management method of depth model and server |
CN111651755B (en) * | 2020-05-08 | 2023-04-18 | 中国联合网络通信集团有限公司 | Intrusion detection method and device |
CN112989207B (en) * | 2021-04-27 | 2021-08-27 | 武汉卓尔数字传媒科技有限公司 | Information recommendation method and device, electronic equipment and storage medium |
CN113254803B (en) * | 2021-06-24 | 2021-10-22 | 暨南大学 | Social recommendation method based on multi-feature heterogeneous graph neural network |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103353872A (en) * | 2013-06-03 | 2013-10-16 | 大连理工大学 | Teaching resource personalization recommendation method based on neural network |
CN103995823A (en) * | 2014-03-25 | 2014-08-20 | 南京邮电大学 | Information recommending method based on social network |
CN105205046A (en) * | 2015-09-25 | 2015-12-30 | 镇江明泰信息科技有限公司 | System and method for on-line user recommendation based on semantic analysis |
CN105279288A (en) * | 2015-12-04 | 2016-01-27 | 深圳大学 | Online content recommending method based on deep neural network |
CN105357232A (en) * | 2014-08-09 | 2016-02-24 | 星际空间(天津)科技发展有限公司 | Location awareness-based information pushing system and method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10417554B2 (en) * | 2014-05-22 | 2019-09-17 | Lee J. Scheffler | Methods and systems for neural and cognitive processing |
Also Published As
Publication number | Publication date |
---|---|
CN106227792A (en) | 2016-12-14 |
Similar Documents
Publication | Title |
---|---|
CN106227792B (en) | Method and apparatus for pushed information |
CN112685565B (en) | Text classification method based on multi-mode information fusion and related equipment thereof |
US11810576B2 (en) | Personalization of experiences with digital assistants in communal settings through voice and query processing |
CN105654950B (en) | Adaptive voice feedback method and device |
US7853582B2 (en) | Method and system for providing information services related to multimodal inputs |
CN111428010B (en) | Man-machine intelligent question-answering method and device |
CN106383875B (en) | Man-machine interaction method and device based on artificial intelligence |
CN108694225A (en) | Image search method, feature vector generation method, device, and electronic equipment |
CN107679039A (en) | Method and apparatus for determining sentence intent |
US20180329985A1 (en) | Method and Apparatus for Compressing Topic Model |
US20230080230A1 (en) | Method for generating federated learning model |
JP2022088304A (en) | Method for processing video, device, electronic device, medium, and computer program |
CN106354856B (en) | Artificial intelligence-based deep neural network enhanced search method and device |
CN111444357A (en) | Content information determination method and device, computer equipment and storage medium |
US20110093264A1 (en) | Providing Information Services Related to Multimodal Inputs |
JP7488871B2 (en) | Dialogue recommendation method, device, electronic device, storage medium, and computer program |
CN109582825B (en) | Method and apparatus for generating information |
CN113806588B (en) | Method and device for searching video |
CN110554782A (en) | Expression input image synthesis method and system |
CN112434533B (en) | Entity disambiguation method, entity disambiguation device, electronic device, and computer-readable storage medium |
CN112650842A (en) | Human-computer interaction based customer service robot intention recognition method and related equipment |
CN110727871A (en) | Multi-mode data acquisition and comprehensive analysis platform based on convolution decomposition depth model |
CN113641797A (en) | Data processing method, device, equipment, storage medium and computer program product |
CN109325178 (en) | Method and apparatus for handling information |
CN115062709A (en) | Model optimization method, device, equipment, storage medium and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | | |
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
GR01 | Patent grant | | |