CN114626889A - Method, apparatus, storage medium, device, and program product for estimating the number of objects

Method, apparatus, storage medium, device, and program product for estimating the number of objects

Info

Publication number
CN114626889A
Authority
CN
China
Prior art keywords
stage
prediction model
objects
predicted
label
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210272449.8A
Other languages
Chinese (zh)
Inventor
钟子宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202210272449.8A priority Critical patent/CN114626889A/en
Publication of CN114626889A publication Critical patent/CN114626889A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/18Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis


Abstract

The application discloses a method, an apparatus, a storage medium, a device and a program product for estimating the number of objects, applicable to scenarios such as maps, the traffic field, the Internet of Vehicles and information services. The method comprises the following steps: modeling on the object feature data of the (t-n)th to (t-1)th periods and the object tags of the (t-n+1)th to t th periods to obtain a prediction model; determining the object tags of the (t+1)th period according to the object feature data of the t th period and the prediction model; and counting the object tags of the (t+1)th period to obtain the predicted number of objects for the (t+1)th period. According to the embodiments of the application, because the prediction model is obtained by modeling on the object feature data of the (t-n)th to (t-1)th periods and the object tags of the (t-n+1)th to t th periods, the prediction model captures the relationship between object feature data and object tags, so the object tags of the (t+1)th period can be determined accurately from the object feature data of the t th period, and an accurate predicted number of objects for the (t+1)th period can be obtained by counting those tags.

Description

Method, apparatus, storage medium, device, and program product for estimating the number of objects
Technical Field
The present application relates to the field of internet technologies, and in particular, to an object quantity estimation method, an object quantity estimation device, a computer-readable storage medium, a computer device, and a computer program product.
Background
A product decision maker can understand how a product is operating from its number of users; in particular, estimating the future number of users allows the decision maker to prepare in advance and avoid delayed decisions that harm the product. In the related art, the future number of users of a product is estimated from the change in the product's overall user count, and the estimate obtained in this way has a large error.
Disclosure of Invention
The embodiments of the application provide an object quantity estimation method, an object quantity estimation apparatus, a computer-readable storage medium, a computer device and a computer program product, which can determine object tags more accurately from object feature data and thereby obtain a more accurate predicted number of objects, making it convenient for a product decision maker to understand the operation of a target product.
In one aspect, a method for estimating the number of objects is provided. The method includes: acquiring object feature data of a target product for the (t-n)th to t th periods and object tags for the (t-n+1)th to t th periods, where t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n; modeling on the object feature data of the (t-n)th to (t-1)th periods and the object tags of the (t-n+1)th to t th periods to obtain a prediction model; determining the object tags of the (t+1)th period according to the object feature data of the t th period and the prediction model; and counting the object tags of the (t+1)th period to obtain the predicted number of objects for the (t+1)th period.
In another aspect, an object quantity estimation apparatus is provided. The apparatus includes an acquisition module, a first processing module, a second processing module and a statistics module. The acquisition module is used to acquire object feature data of a target product for the (t-n)th to t th periods and object tags for the (t-n+1)th to t th periods, where t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n. The first processing module is used to model on the object feature data of the (t-n)th to (t-1)th periods and the object tags of the (t-n+1)th to t th periods to obtain a prediction model. The second processing module is used to determine the object tags of the (t+1)th period according to the object feature data of the t th period and the prediction model. The statistics module is used to count the object tags of the (t+1)th period to obtain the predicted number of objects for the (t+1)th period.
In another aspect, a computer-readable storage medium is provided, where a computer program is stored, where the computer program is suitable for being loaded by a processor to execute the steps in the method for estimating the number of objects according to any embodiment.
In another aspect, a computer device is provided, the computer device includes a processor and a memory, the memory stores a computer program, and the processor is configured to execute the steps in the object quantity estimation method according to any one of the above embodiments by calling the computer program stored in the memory.
In another aspect, a computer program product is provided, which includes computer instructions, and the computer instructions, when executed by a processor, implement the steps in the object quantity estimation method according to any one of the above embodiments.
According to the embodiments of the application, a prediction model is obtained by modeling on the object feature data of the (t-n)th to (t-1)th periods and the object tags of the (t-n+1)th to t th periods, so that the prediction model captures the relationship between object feature data and object tags; the object tags of the (t+1)th period can then be determined from the object feature data of the t th period, and the predicted number of objects for the (t+1)th period obtained by counting the object tags of the (t+1)th period.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic structural diagram of an object quantity estimation system according to an embodiment of the present disclosure.
Fig. 2 is a first flowchart of a method for estimating a number of objects according to an embodiment of the present disclosure.
Fig. 3 is a second flow chart of the method for estimating the number of objects according to the embodiment of the present application.
Fig. 4 is a third flow diagram of the object quantity estimation method according to the embodiment of the present application.
Fig. 5 is a fourth flowchart illustrating an object quantity estimation method according to an embodiment of the present application.
Fig. 6 is a fifth flowchart illustrating an object quantity estimation method according to an embodiment of the present application.
Fig. 7 is a sixth flowchart of the method for estimating the number of objects according to the embodiment of the present application.
Fig. 8 is a seventh flowchart illustrating an object quantity estimation method according to an embodiment of the present application.
Fig. 9 is an eighth flowchart of the method for estimating the number of objects according to the embodiment of the present application.
Fig. 10 is a ninth flowchart illustrating an object quantity estimation method according to an embodiment of the present application.
Fig. 11 is a tenth flowchart illustrating an object quantity estimation method according to an embodiment of the present application.
Fig. 12 is a schematic structural diagram of an object quantity estimation apparatus according to an embodiment of the present application.
Fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an object quantity estimation method, an object quantity estimation device, a computer readable storage medium, a computer device and a computer program product. Specifically, the method for estimating the number of objects in the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal can be a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart television, a smart sound box, a wearable smart device, a smart voice interaction device, a smart home appliance, a smart vehicle-mounted terminal, an aircraft and other devices, and can further comprise a client, wherein the client can be a video client, a browser client or an instant messaging client and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like.
The embodiment of the application can be applied to various internet scenes such as maps, traffic fields, internet of vehicles, information services, household appliances and intelligent terminals.
First, some terms or expressions appearing in the course of describing the embodiments of the present application are explained as follows:
an Intelligent Transportation System (ITS), also called Intelligent Transportation System (Intelligent Transportation System), is a comprehensive Transportation System which effectively and comprehensively applies advanced scientific technologies (information technology, computer technology, data communication technology, sensor technology, electronic control technology, automatic control theory, operational research, artificial intelligence and the like) to Transportation, service control and vehicle manufacturing, strengthens the relation among vehicles, roads and users, and thus forms a safety-guaranteeing, efficiency-improving, environment-improving and energy-saving comprehensive Transportation System.
AR algorithm (autoregressive algorithm): a statistical method for processing a time series that uses the previous periods of the same variable x, i.e., x_1 to x_{k-1}, to predict the current period x_k, assuming a linear relationship between them.
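Purely as an illustration (not part of the patent), the linear-relationship assumption above can be sketched by fitting an order-k AR model with ordinary least squares; the function names here are invented for this example:

```python
import numpy as np

def fit_ar(series, k):
    # Build the design matrix: each row is [1, x_{t-k}, ..., x_{t-1}],
    # each target is x_t; fit by ordinary least squares.
    X = np.array([series[i:i + k] for i in range(len(series) - k)])
    y = np.array(series[k:])
    A = np.hstack([np.ones((len(X), 1)), X])  # column of ones = intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_next(series, coef):
    # Predict the next period from the last k observed values.
    k = len(coef) - 1
    return coef[0] + np.dot(coef[1:], series[-k:])

# A linear trend satisfies x_t = 2*x_{t-1} - x_{t-2}, so an AR(2)
# model recovers it exactly and predicts the next value, 7.0.
trend = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
coef = fit_ar(trend, 2)
print(round(float(predict_next(trend, coef)), 6))  # → 7.0
```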
RNN algorithm: the recurrent neural network, a neural network that takes sequence data as input, recurses along the evolution direction of the sequence, and connects all its nodes (recurrent units) in a chain; it is designed to process time-series data.
Number of daily active user terminals (Daily Active User, DAU): generally, the number of user terminals that logged in to or used a product within one day (the statistical day), counted without duplication, and often used to reflect the operation of a website, internet application or online game. The definition of activity may differ between products. For a client product with an account, such as an Internet-of-Vehicles client, QQ, an instant-messaging client or an online game, account login is usually used as the activity criterion; for certain tool applications, such as photo-beautification tools, the tool must at least be operated, e.g. a picture taken or an editing function used once, to count as active.
New object: an object that is active in period t but was not active in any period before period t.
Retained object: an object that is active in both period t and period t-1.
Churned (attrition) object: an object that is active in period t-1 but not active in period t.
Reflow object: an object that is active in period t-2, inactive in period t-1, but active again in period t.
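The four object types above can be sketched as a small classification helper (an illustrative sketch, not from the patent; the function name and flag parameters are assumptions):

```python
def classify_object(active_t, active_prev, active_prev2, active_earlier):
    # active_prev / active_prev2: active in period t-1 / t-2;
    # active_earlier: active in any period before period t.
    if active_t and not active_earlier:
        return "new"       # first-ever activity in period t
    if active_t and active_prev:
        return "retained"  # active in both t and t-1
    if not active_t and active_prev:
        return "churned"   # active in t-1 but not in t
    if active_t and not active_prev and active_prev2:
        return "reflow"    # active in t-2, inactive in t-1, active in t
    return "other"

print(classify_object(True, False, True, True))  # → reflow
```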
PCA (Principal Component Analysis): a common data analysis method, usually used to reduce the dimensionality of high-dimensional data; it can extract the main feature components of the data.
Normalization: mapping the original data into a given range (default 0-1) by a transformation.
Standardization: transforming the data so that it conforms to the standard normal distribution, i.e., mean 0 and standard deviation 1.
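A minimal sketch of the two transformations just defined, assuming NumPy (the function names are invented for illustration):

```python
import numpy as np

def min_max_normalize(x, lo=0.0, hi=1.0):
    # Map the data into [lo, hi] (default 0-1).
    x = np.asarray(x, dtype=float)
    scaled = (x - x.min()) / (x.max() - x.min())
    return lo + scaled * (hi - lo)

def standardize(x):
    # Transform the data to mean 0 and standard deviation 1.
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

data = [2.0, 4.0, 6.0, 8.0]
norm = min_max_normalize(data)
std = standardize(data)
print(norm.min(), norm.max())  # → 0.0 1.0
```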
The embodiments of the application provide an object quantity estimation method in which a prediction model is obtained by modeling on historical object feature data and historical object tags, so that the prediction model captures the relationship between object feature data and object tags; future object tags can then be estimated from the current object feature data, and the estimated number of objects obtained from the statistics of those future tags. By obtaining feature data for each object in the target product, the operation strategy can be subdivided down to individual objects, decisions about the target product and object preferences can be grasped better, and the influence of object preferences on the operation of the target product can be predicted in advance.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an object quantity estimation system according to an embodiment of the present disclosure. The object quantity estimation system comprises a terminal 10, a server 20 and the like; the terminal 10 and the server 20 are connected via a network, such as a wired or wireless network connection.
The terminal 10, among other things, may be used to display a graphical user interface. The terminal is used for interacting with a user through a graphical user interface, for example, downloading and installing a corresponding client through the terminal and running the client, for example, calling a corresponding applet and running the applet, for example, displaying a corresponding graphical user interface through logging in a website, and the like. In the embodiment of the present application, the terminal 10 may be a terminal device used by a product decider and a product operator to receive alarm information. The server 20 determines the number of the predicted daily active objects in the t +1 th period, then judges whether the number of the predicted daily active objects in the t +1 th period is abnormal, and transmits alarm information to the terminal 10 when the number of the predicted daily active objects in the t +1 th period is abnormal.
In this embodiment, in determining the number of predicted daily active objects in the t +1 th period, the server 20 may specifically be configured to: acquiring object characteristic data of a target product from the t-n stage to the t stage and object labels from the t-n +1 stage to the t stage, wherein t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n; modeling according to the object characteristic data from the t-n stage to the t-1 stage and the object labels from the t-n +1 stage to the t stage to obtain a prediction model; determining a t +1 stage object label according to the t stage object characteristic data and the prediction model; obtaining the number of predicted objects in the t +1 th stage according to the statistics of the object tags in the t +1 th stage; and determining the number of the predicted daily active objects in the t +1 th period according to the number of the predicted objects in the t +1 th period, then judging whether the number of the predicted daily active objects in the t +1 th period is abnormal or not, and sending alarm information to the terminal 10 when the number of the predicted daily active objects in the t +1 th period is abnormal.
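The server-side flow just described, predict the daily active count, judge whether it is abnormal, and send alarm information to the terminal, can be sketched as follows (purely illustrative; the relative-deviation rule, threshold value and function names are assumptions, not given by the patent):

```python
def check_and_alert(predicted_dau, expected_dau, tolerance=0.2):
    # Flag the prediction as abnormal when it deviates from the
    # expected value by more than `tolerance` (relative deviation).
    deviation = abs(predicted_dau - expected_dau) / expected_dau
    abnormal = deviation > tolerance
    alert = (f"DAU alert: predicted {predicted_dau}, expected {expected_dau}"
             if abnormal else None)
    return abnormal, alert

abnormal, alert = check_and_alert(5000, 10000)
print(abnormal)  # → True (50% below expectation)
```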
The following are detailed below. It should be noted that the description sequence of the following embodiments is not intended to limit the priority sequence of the embodiments.
The embodiments of the application provide an object quantity estimation method, which can be executed by a terminal or a server, or by the terminal and the server together; the embodiments of the present application are described by taking the case where the method is executed by a server as an example.
Referring to fig. 2 to fig. 11, fig. 2 to fig. 11 are schematic flow charts of the object quantity estimation method according to the embodiment of the present disclosure. The method comprises the following steps:
Step 201, acquiring object feature data of a target product for the (t-n)th to t th periods and object tags for the (t-n+1)th to t th periods, where t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n.
Specifically, the target product may be an internet-related product, including but not limited to an Internet-of-Vehicles system (e.g., an intelligent transportation system), an application (a music APP, shopping APP, video APP, etc.) or a web page; the object may refer to a user, a client, an account, etc.; and the object feature data may refer to behavior feature data of the user in the target product, where the behavior may be, for example, the user's click behavior, purchase behavior, payment behavior or blocking behavior. The object feature data may be obtained by processing the log data of the (t-n)th to t th periods. In one implementation of the present application, the target product is an Internet-of-Vehicles system and the object is an Internet-of-Vehicles client; the object feature data may be, for example, the number of times and the times at which the Internet-of-Vehicles client logs in.
The object tags may be tags corresponding to object types, and include a new object tag, a reflow object tag, a retained object tag, an attrition object tag, and the like.
A period is a unit of time; specifically, one day may be used as one period, one month may be used as one period, and so on. The t th period may refer to the current day or the current month; n may be a positive integer greater than or equal to 1, and t is greater than n.
Step 202, modeling on the object feature data of the (t-n)th to (t-1)th periods and the object tags of the (t-n+1)th to t th periods to obtain a prediction model.
Specifically, the object feature data sequence of the (t-n)th to t th periods is, for example, {X_t, X_{t-1}, ..., X_{t-n}}, and the object tag sequence of the (t-n+1)th to t th periods is, for example, {Y_t, Y_{t-1}, ..., Y_{t-n+1}}. By matching on the object ID (user name, client name, account name, etc.), an object sample can be obtained for each period, e.g., {S_t, S_{t-1}, ..., S_{t-n+1}}. An object sample comprises object feature data and an object tag; that is, S_t may pair the X_{t-1} and Y_t of the same object ID, S_{t-1} may pair the X_{t-2} and Y_{t-1} of the same object ID, and S_{t-n+1} may pair the X_{t-n} and Y_{t-n+1} of the same object ID. After the samples of each period are subjected to data processing (PCA, normalization, etc.), the object samples may be randomly split into training samples and test samples according to a preset ratio, the ratio of training samples being c and the ratio of test samples being 1-c; in one example c may be 80%, and of course c may also be 70%, 81%, 85%, 89%, 95%, etc. In one embodiment, the training samples are, for example, {S_t^(h)}, with object feature data sequence {X_t^(h)} and object tag sequence {Y_t^(h)}, where h may be a positive integer greater than or equal to 1 indexing the training samples; the test samples are, for example, {S_t^(j)}, with object feature data sequence {X_t^(j)} and object tag sequence {Y_t^(j)}, where j may be a positive integer greater than or equal to 1 indexing the test samples.
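The sample construction and random split just described can be sketched as follows (an illustrative sketch, not the patent's implementation; the data layout and function names are assumptions, though the pairing of period p-1 features with period p tags follows the description above):

```python
import random

def build_samples(features_by_period, labels_by_period):
    # Pair the feature data of period p-1 with the tag of period p,
    # matched on the object ID (user / client / account name).
    samples = []
    for p, labels in labels_by_period.items():
        prev_feats = features_by_period.get(p - 1, {})
        for obj_id, tag in labels.items():
            if obj_id in prev_feats:
                samples.append((prev_feats[obj_id], tag))
    return samples

def split_samples(samples, c=0.8, seed=0):
    # Randomly split into training (ratio c) and test (ratio 1-c) sets.
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * c)
    return shuffled[:cut], shuffled[cut:]

feats = {1: {"a": [0.1], "b": [0.2]}, 2: {"a": [0.3], "b": [0.4]}}
tags = {2: {"a": 1, "b": 0}, 3: {"a": 0, "b": 1}}
samples = build_samples(feats, tags)
train, test = split_samples(samples, c=0.75)
print(len(samples), len(train), len(test))  # → 4 3 1
```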
The prediction model may be an RNN algorithm model, and may be constructed as follows:

H_t = tanh(W_H X_t + U_H H_{t-1} + B_H)
Y_t = sigmoid(W_Y H_t + B_Y)

wherein X_t represents the object feature vector of the t th period, H_t represents the hidden layer vector, and Y_t represents the object tag, Y_t ∈ {1, 0}, where 1 indicates that the object is of the type corresponding to the object tag and 0 indicates that it is not. W_Y, W_H, U_H represent model parameter matrices and B_Y, B_H represent model parameter vectors; tanh represents the activation function of the hidden layer and sigmoid represents the activation function of the output layer, where the hidden layer and the output layer may also adopt other activation functions, such as the ReLU function, which is not limited herein.
The training samples {S_t^(h)} and the test samples {S_t^(j)} are substituted into the RNN algorithm model to obtain the model parameter matrices W_Y, W_H, U_H and the model parameter vectors B_Y, B_H of the RNN algorithm model, as well as the hidden layer of each period, {H_t, H_{t-1}, ..., H_{t-n}}.
In some embodiments, the prediction model may also be a regression algorithm model, and the like, and is not limited in this respect.
In this way, a prediction model can be obtained by modeling according to the object feature data from the t-n stage to the t-1 stage and the object tags from the t-n +1 stage to the t stage, so that the prediction model can obtain the relationship between the object feature data and the object tags.
Step 203, determining the object tags of the (t+1)th period according to the object feature data of the t th period and the prediction model.
Specifically, the object feature data X_t of the t th period, the model parameter matrices W_Y, W_H, U_H, the model parameter vectors B_Y, B_H, and the hidden layer H_t are input into the prediction model to obtain the prediction probability for the (t+1)th period. Classification is performed according to the prediction probability of the (t+1)th period and a probability threshold, the probability threshold being, for example, 0.5; the object tag is then marked as:

Y_{t+1} = 1 if P{Y_{t+1}} >= 0.5, and Y_{t+1} = 0 otherwise.

That is, when the prediction probability P{Y_{t+1}} is greater than or equal to the probability threshold (for example, 0.5), the object tag of the (t+1)th period is marked as 1, indicating that the object is of the type corresponding to the object tag; when P{Y_{t+1}} is smaller than the probability threshold, the object tag of the (t+1)th period is marked as 0, indicating that the object is not of that type.
Therefore, the t +1 stage object label can be accurately predicted according to the t stage object characteristic data and the prediction model.
Step 204, counting the object tags of the (t+1)th period to obtain the predicted number of objects for the (t+1)th period.
Specifically, the object IDs whose object tag is 1 are collected and deduplication statistics are performed (the same object might otherwise be counted repeatedly, e.g. because of multiple activity records), obtaining the predicted number of objects for the (t+1)th period. In this way, the predicted number of objects for the (t+1)th period can be counted accurately.
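The threshold-then-count procedure of steps 203-204 can be sketched as (illustrative helper; the names are assumptions):

```python
def count_predicted_objects(predictions, threshold=0.5):
    # predictions: iterable of (object_id, probability) pairs; the same
    # object ID may appear several times (e.g. multiple activity records).
    # Count distinct IDs whose probability clears the threshold.
    positive_ids = {obj_id for obj_id, p in predictions if p >= threshold}
    return len(positive_ids)

preds = [("a", 0.9), ("a", 0.7), ("b", 0.4), ("c", 0.51)]
print(count_predicted_objects(preds))  # → 2 ("a" and "c")
```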
Optionally, the object tags include a reflow object tag. A reflow object is an object of the target product that is active in period t-2, inactive in period t-1, and active in period t. The prediction model includes a reflow object prediction model, which has a reflow object prediction model parameter matrix and a reflow object prediction model parameter vector. As shown in fig. 3, step 202 may be implemented through step 301, specifically:
Step 301, determining the reflow object prediction model parameter matrix and parameter vector according to the object feature data of the (t-n)th to (t-1)th periods and the reflow object tags of the (t-n+1)th to t th periods, to obtain the reflow object prediction model.
Specifically, the object feature data sequence of the (t-n)th to t th periods is, for example, {X_t, X_{t-1}, ..., X_{t-n}}, and the reflow object tag sequence of the (t-n+1)th to t th periods is, for example, {Y_t^R, Y_{t-1}^R, ..., Y_{t-n+1}^R}. By matching on the object ID (user name, client name, account name, etc.), a reflow object sample can be obtained for each period, e.g., {S_t^R, S_{t-1}^R, ..., S_{t-n+1}^R}. A reflow object sample comprises object feature data and a reflow object tag; that is, S_t^R may pair the X_{t-1} and Y_t^R of the same object ID, S_{t-1}^R may pair the X_{t-2} and Y_{t-1}^R of the same object ID, and S_{t-n+1}^R may pair the X_{t-n} and Y_{t-n+1}^R of the same object ID. After the samples of each period are subjected to data processing (PCA, normalization, etc.), the reflow object samples may be randomly split into reflow training samples and reflow test samples according to a preset ratio, the ratio of reflow training samples being c and the ratio of reflow test samples being 1-c; in one example c may be 80%, and of course c may also be 70%, 81%, 85%, 89%, 95%, etc. In one embodiment, the reflow training samples are, for example, {S_t^R(h)}, with object feature data sequence {X_t^(h)} and reflow object tag sequence {Y_t^R(h)}, where h may be a positive integer greater than or equal to 1; the reflow test samples are, for example, {S_t^R(j)}, with object feature data sequence {X_t^(j)} and reflow object tag sequence {Y_t^R(j)}, where j may be a positive integer greater than or equal to 1.
The reflow object prediction model may be an RNN algorithm model, and may be constructed as follows:

H_t^R = tanh(W_H^R X_t + U_H^R H_{t-1}^R + B_H^R)
Y_t^R = sigmoid(W_Y^R H_t^R + B_Y^R)

wherein X_t represents the object feature vector of the t th period, H_t^R represents the reflow hidden layer vector, and Y_t^R represents the reflow object tag, Y_t^R ∈ {1, 0}, where 1 indicates that the object is a reflow object and 0 indicates that it is not. W_Y^R, W_H^R, U_H^R represent reflow object prediction model parameter matrices and B_Y^R, B_H^R represent reflow object prediction model parameter vectors; tanh represents the activation function of the hidden layer and sigmoid represents the activation function of the output layer, where the hidden layer and the output layer may also adopt other activation functions, such as the ReLU function, which is not limited herein.
The reflow training samples {S_t^R(h)} and the reflow test samples {S_t^R(j)} are substituted into the reflow object prediction model to obtain the reflow object prediction model parameter matrices W_Y^R, W_H^R, U_H^R and the reflow object prediction model parameter vectors B_Y^R, B_H^R, as well as the reflow hidden layer of each period, {H_t^R, H_{t-1}^R, ..., H_{t-n}^R}.
In some embodiments, the reflow object prediction model may also use a regression algorithm model, and the like, and is not limited in this respect.
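For illustration only, the recurrent step described above (a tanh hidden layer over $[X_t; h_{t-1}]$ followed by a sigmoid output layer) may be sketched as follows; the weights, dimensions and sample values are assumptions of the example and are not taken from the embodiment:

```python
import math

def rnn_step(W, u, x, h_prev):
    """One recurrent step: hidden state update (tanh) and reflow probability (sigmoid)."""
    z = x + h_prev                                   # concatenation [X_t; h_{t-1}]
    h = [math.tanh(sum(w * v for w, v in zip(row, z))) for row in W]
    p = 1.0 / (1.0 + math.exp(-sum(a * b for a, b in zip(u, h))))
    return h, p

# Assumed toy parameters: 2 features, 2 hidden units.
W = [[0.5, -0.2, 0.1, 0.3],
     [0.1, 0.4, -0.3, 0.2]]   # parameter matrix applied to [X_t; h_{t-1}]
u = [0.7, -0.5]               # parameter vector of the output layer

h = [0.0, 0.0]
for x in [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]:  # three periods of feature data
    h, p = rnn_step(W, u, x, h)

print(0.0 < p < 1.0)  # the sigmoid output is always a valid probability
```

In practice the parameter matrix and parameter vector would be fitted on the training samples; the sketch only shows the forward computation.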
Therefore, the reflow object prediction model can be obtained through modeling according to the object feature data from the t-n th stage to the t-1 th stage and the reflow object labels from the t-n+1 th stage to the t-th stage, so that the reflow object prediction model can learn the relation between the object feature data and the reflow object labels.
Optionally, step 203 may be implemented by step 302 and step 303, specifically:
And step 302, determining the reflow probability of the t+1 th stage according to the characteristic data of the object at the t-th stage and the prediction model.
Specifically, the t-th stage object feature data $X_t$, the reflow object prediction model parameter matrix $W^{rv}$, the reflow object prediction model parameter vector $u^{rv}$, and the reflow hidden layer $h_{t-1}^{rv}$ are input into the reflow object prediction model to obtain the reflow probability $\hat p_{t+1}^{rv}$ of the t+1 th stage.
And step 303, classifying according to the reflow probability of the t+1 th stage and a reflow probability threshold value to determine the reflow object label of the t+1 th stage.
Specifically, the classification is performed according to the reflow probability of the t+1 th stage and a reflow probability threshold, where the reflow probability threshold is, for example, 0.5; the reflow object label is then:

$\hat y_{t+1}^{rv} = \begin{cases} 1, & \hat p_{t+1}^{rv} \ge 0.5 \\ 0, & \hat p_{t+1}^{rv} < 0.5 \end{cases}$
therefore, the reflow object label at the t +1 th stage can be accurately predicted according to the characteristic data of the object at the t th stage and the reflow object prediction model.
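The thresholding of step 303 reduces to a simple comparison; a minimal sketch (threshold and probabilities assumed for illustration):

```python
def reflow_label(prob, threshold=0.5):
    # 1 (first label): reflow object; 0 (second label): non-reflow object
    return 1 if prob >= threshold else 0

print([reflow_label(p) for p in [0.73, 0.50, 0.12]])  # -> [1, 1, 0]
```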
Optionally, the predicting the number of objects includes predicting the number of reflow objects, and step 204 may be implemented through step 304 to step 306, specifically:
Step 304, when the reflow object label in the t+1 th stage is marked as the first label, determining that the object is a reflow object.
When the reflow probability $\hat p_{t+1}^{rv}$ is greater than or equal to the reflow probability threshold (for example, 0.5), the reflow object label in the t+1 th stage is marked as the first label (the first label may be 1), and 1 indicates that the object is a reflow object.
Step 305, when the reflow object label in the t+1 th stage is marked as the second label, determining that the object is a non-reflow object.
When the predicted reflow probability $\hat p_{t+1}^{rv}$ is smaller than the reflow probability threshold, the reflow object label in the t+1 th stage is marked as the second label (the second label may be 0), and 0 indicates that the object is not a reflow object.
And step 306, counting the number of the reflow objects in the t +1 th stage to obtain the predicted reflow object number in the t +1 th stage.
Specifically, the object IDs with a reflow object label of 1 are input, and deduplication statistics are performed (the same object may otherwise be counted repeatedly due to multiple activities and the like), so as to obtain the predicted reflow object number of the t+1 th stage. Therefore, the predicted reflow object number of the t+1 th stage can be accurately counted.
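The deduplication statistics of step 306 may be sketched with a set over object IDs; the IDs and labels below are assumed for illustration:

```python
# (object ID, predicted reflow label) pairs; "u001" appears twice,
# e.g. because the same object took part in multiple activities.
predicted = [("u001", 1), ("u002", 0), ("u001", 1), ("u003", 1)]
reflow_ids = {oid for oid, label in predicted if label == 1}
print(len(reflow_ids))  # -> 2 distinct predicted reflow objects
```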
Optionally, the object tags include attrition object tags, the attrition objects are objects in the target product, which are active in the t-1 period and inactive in the t period, the prediction model includes an attrition object prediction model, and the attrition object prediction model includes an attrition object prediction model parameter matrix and an attrition object prediction model parameter vector, as shown in fig. 4, step 202 may be implemented through step 401, specifically:
step 401, determining an attrition object prediction model parameter matrix and an attrition object prediction model parameter vector according to the object characteristic data from the t-n stage to the t-1 stage and the attrition object labels from the t-n +1 stage to the t stage to obtain an attrition object prediction model.
Specifically, the object feature data sequence from the t-n th stage to the t-th stage is, for example, $\{X_t, X_{t-1}, \ldots, X_{t-n}\}$, and the attrition object label sequence from the t-n+1 th stage to the t-th stage is, for example, $\{y_t^{lv}, y_{t-1}^{lv}, \ldots, y_{t-n+1}^{lv}\}$. By matching the object IDs (user name, client name, account name, and the like), an attrition object sample of each period can be obtained. The attrition object sample comprises object feature data and an attrition object label, i.e. the pairs $(X_{t-1}, y_t^{lv})$, $(X_{t-2}, y_{t-1}^{lv})$, ..., $(X_{t-n}, y_{t-n+1}^{lv})$, each pair belonging to the same object ID.
The samples of each period are subjected to data processing (PCA, normalization, and the like), and the attrition object samples can then be randomly split into attrition training samples and attrition test samples according to a preset ratio, wherein the proportion of the attrition training samples is, for example, c, and the proportion of the attrition test samples is 1-c. In one example, c may be 80%; of course, c may also be 70%, 81%, 85%, 89%, 95%, and the like. In one embodiment, the attrition training samples are, for example, $\{(X^{(h)}, y^{lv,(h)})\}$, where h may be a positive integer greater than or equal to 1, the sequence of object feature data may be $\{X^{(1)}, \ldots, X^{(h)}\}$, and the sequence of attrition object labels may be $\{y^{lv,(1)}, \ldots, y^{lv,(h)}\}$. The attrition test samples are, for example, $\{(X^{(j)}, y^{lv,(j)})\}$, where j may be a positive integer greater than or equal to 1, the sequence of object feature data may be $\{X^{(1)}, \ldots, X^{(j)}\}$, and the sequence of attrition object labels may be $\{y^{lv,(1)}, \ldots, y^{lv,(j)}\}$.
The attrition object prediction model can be an RNN algorithm model, and can be constructed as follows:

$h_t^{lv} = \tanh(W^{lv}[X_t; h_{t-1}^{lv}])$

$\hat p_{t+1}^{lv} = \mathrm{sigmoid}((u^{lv})^{\top} h_t^{lv})$

wherein $X_t$ represents the object feature vector of the t-th stage, $h_t^{lv}$ represents the attrition hidden layer vector, and $\hat p_{t+1}^{lv}$ represents the attrition probability, from which the attrition object label $\hat y_{t+1}^{lv} \in \{0, 1\}$ is obtained, wherein 1 indicates that the object is an attrition object and 0 indicates that the object is not an attrition object. $W^{lv}$ represents the attrition object prediction model parameter matrix, and $u^{lv}$ represents the attrition object prediction model parameter vector; tanh represents the activation function of the hidden layer, and sigmoid represents the activation function of the output layer, wherein the activation function of the hidden layer and the activation function of the output layer may also adopt other activation functions, such as a ReLU function, and the like, and are not limited in detail herein.
The attrition training samples $\{(X^{(h)}, y^{lv,(h)})\}$ and the attrition test samples $\{(X^{(j)}, y^{lv,(j)})\}$ are substituted into the attrition object prediction model to obtain the attrition object prediction model parameter matrix $W^{lv}$, the attrition object prediction model parameter vector $u^{lv}$, and the attrition hidden layer $h^{lv}$ of each period.
In some embodiments, the attrition object prediction model may also be a regression algorithm model, and the like, and is not limited herein.
Therefore, an attrition object prediction model can be obtained by modeling according to the object characteristic data from the t-n stage to the t-1 stage and the attrition object labels from the t-n +1 stage to the t stage, so that the attrition object prediction model can obtain the relation between the object characteristic data and the attrition object labels.
Optionally, step 203 may be implemented by step 402 and step 403, specifically:
And step 402, determining the attrition probability of the t+1 th stage according to the characteristic data of the object in the t-th stage and the prediction model.
Specifically, the t-th stage object feature data $X_t$, the attrition object prediction model parameter matrix $W^{lv}$, the attrition object prediction model parameter vector $u^{lv}$, and the attrition hidden layer $h_{t-1}^{lv}$ are input into the attrition object prediction model to obtain the attrition probability $\hat p_{t+1}^{lv}$ of the t+1 th stage.
And step 403, classifying according to the attrition probability of the t +1 th stage and an attrition probability threshold value to determine an attrition object label of the t +1 th stage.
Specifically, classification is performed according to the attrition probability of the t+1 th stage and an attrition probability threshold, where the attrition probability threshold is, for example, 0.5; the attrition object label is then:

$\hat y_{t+1}^{lv} = \begin{cases} 1, & \hat p_{t+1}^{lv} \ge 0.5 \\ 0, & \hat p_{t+1}^{lv} < 0.5 \end{cases}$
therefore, the loss object label of the t +1 th stage can be accurately predicted according to the t stage object characteristic data and the loss object prediction model.
Optionally, predicting the number of objects includes predicting the number of attrition objects, and step 204 may be implemented through steps 404 to 406, specifically:
in step 404, when the attrition object tag in the t +1 th stage is marked as the first tag, the object is determined to be an attrition object.
When the attrition probability $\hat p_{t+1}^{lv}$ is greater than or equal to the attrition probability threshold (for example, 0.5), the attrition object label in the t+1 th stage is marked as the first label (the first label may be 1), and 1 indicates that the object is an attrition object.
Step 405, determining the object as a non-attrition object when the attrition object tag of the t +1 th stage is marked as a second tag.
When the predicted attrition probability $\hat p_{t+1}^{lv}$ is smaller than the attrition probability threshold, the attrition object label in the t+1 th stage is marked as the second label (the second label may be 0), and 0 indicates that the object is not an attrition object.
Step 406, counting the number of the attrition subjects in the t +1 th stage to obtain the predicted attrition subject number in the t +1 th stage.
Specifically, the object IDs with an attrition object label of 1 are input, and deduplication statistics are performed (the same object may otherwise be counted repeatedly due to multiple activities and the like), so as to obtain the predicted attrition object number of the t+1 th stage. Therefore, the predicted attrition object number of the t+1 th stage can be accurately counted.
Optionally, the object label includes a retained object label, the retained object is an object that is active in the target product in both the t-1 th period and the t-th period, the prediction model includes a retained object prediction model, and the retained object prediction model includes a retained object prediction model parameter matrix and a retained object prediction model parameter vector, as shown in fig. 5, step 202 may be implemented through step 501, specifically:
Step 501, determining a retained object prediction model parameter matrix and a retained object prediction model parameter vector according to the object feature data from the t-n th stage to the t-1 th stage and the retained object labels from the t-n+1 th stage to the t-th stage to obtain a retained object prediction model.
Specifically, the object feature data sequence from the t-n th stage to the t-th stage is, for example, $\{X_t, X_{t-1}, \ldots, X_{t-n}\}$, and the retained object label sequence from the t-n+1 th stage to the t-th stage is, for example, $\{y_t^{sv}, y_{t-1}^{sv}, \ldots, y_{t-n+1}^{sv}\}$. By matching the object IDs (user name, client name, account name, and the like), a retained object sample of each period can be obtained. The retained object sample comprises object feature data and a retained object label, i.e. the pairs $(X_{t-1}, y_t^{sv})$, $(X_{t-2}, y_{t-1}^{sv})$, ..., $(X_{t-n}, y_{t-n+1}^{sv})$, each pair belonging to the same object ID.
The samples of each period are subjected to data processing (PCA, normalization, and the like), and the retained object samples can then be randomly split into retained training samples and retained test samples according to a preset ratio, wherein the proportion of the retained training samples is, for example, c, and the proportion of the retained test samples is 1-c. In one example, c may be 80%; of course, c may also be 70%, 81%, 85%, 89%, 95%, and the like. In one embodiment, the retained training samples are, for example, $\{(X^{(h)}, y^{sv,(h)})\}$, where h may be a positive integer greater than or equal to 1, the sequence of object feature data may be $\{X^{(1)}, \ldots, X^{(h)}\}$, and the sequence of retained object labels may be $\{y^{sv,(1)}, \ldots, y^{sv,(h)}\}$. The retained test samples are, for example, $\{(X^{(j)}, y^{sv,(j)})\}$, where j may be a positive integer greater than or equal to 1, the sequence of object feature data may be $\{X^{(1)}, \ldots, X^{(j)}\}$, and the sequence of retained object labels may be $\{y^{sv,(1)}, \ldots, y^{sv,(j)}\}$.
The retained object prediction model may be an RNN algorithm model, and may be constructed as follows:

$h_t^{sv} = \tanh(W^{sv}[X_t; h_{t-1}^{sv}])$

$\hat p_{t+1}^{sv} = \mathrm{sigmoid}((u^{sv})^{\top} h_t^{sv})$

wherein $X_t$ represents the object feature vector of the t-th stage, $h_t^{sv}$ represents the retention hidden layer vector, and $\hat p_{t+1}^{sv}$ represents the retention probability, from which the retained object label $\hat y_{t+1}^{sv} \in \{0, 1\}$ is obtained, where 1 indicates that the object is a retained object and 0 indicates that the object is not a retained object. $W^{sv}$ represents the retained object prediction model parameter matrix, and $u^{sv}$ represents the retained object prediction model parameter vector; tanh represents the activation function of the hidden layer, and sigmoid represents the activation function of the output layer, wherein the activation function of the hidden layer and the activation function of the output layer may also adopt other activation functions, such as a ReLU function, and the like, and are not limited in detail herein.
The retained training samples $\{(X^{(h)}, y^{sv,(h)})\}$ and the retained test samples $\{(X^{(j)}, y^{sv,(j)})\}$ are substituted into the retained object prediction model to obtain the retained object prediction model parameter matrix $W^{sv}$, the retained object prediction model parameter vector $u^{sv}$, and the retention hidden layer $h^{sv}$ of each period.
In some embodiments, the retained object prediction model may also adopt a regression algorithm model, and the like, and is not specifically limited herein.
Therefore, the retention object prediction model can be obtained according to the object characteristic data from the t-n stage to the t-1 stage and the retention object label from the t-n +1 stage to the t stage, so that the retention object prediction model can obtain the relation between the object characteristic data and the retention object label.
Optionally, step 203 may be implemented by step 502 and step 503, specifically:
And step 502, determining the retention probability of the t+1 th stage according to the characteristic data of the object at the t-th stage and the prediction model.
Specifically, the t-th stage object feature data $X_t$, the retained object prediction model parameter matrix $W^{sv}$, the retained object prediction model parameter vector $u^{sv}$, and the retention hidden layer $h_{t-1}^{sv}$ are input into the retained object prediction model to obtain the retention probability $\hat p_{t+1}^{sv}$ of the t+1 th stage.
And step 503, classifying according to the retention probability of the t +1 th stage and a retention probability threshold value to determine a retention object label of the t +1 th stage.
Specifically, the classification is performed according to the retention probability of the t+1 th stage and a retention probability threshold, where the retention probability threshold is, for example, 0.5; the retained object label is then:

$\hat y_{t+1}^{sv} = \begin{cases} 1, & \hat p_{t+1}^{sv} \ge 0.5 \\ 0, & \hat p_{t+1}^{sv} < 0.5 \end{cases}$
therefore, the t + 1-stage retention object label can be accurately predicted according to the t-stage object characteristic data and the retention object prediction model.
Optionally, the predicting the number of objects includes predicting a number of retained objects, and step 204 may be implemented by steps 504 to 506, specifically:
Step 504, when the retained object label in the t+1 th stage is marked as the first label, determining that the object is a retained object.
When the retention probability $\hat p_{t+1}^{sv}$ is greater than or equal to the retention probability threshold (for example, 0.5), the retained object label in the t+1 th stage is marked as the first label (the first label may be 1), and 1 indicates that the object is a retained object.
Step 505, when the retained object label in the t+1 th stage is marked as the second label, determining that the object is a non-retained object.
When the predicted retention probability $\hat p_{t+1}^{sv}$ is smaller than the retention probability threshold, the retained object label in the t+1 th stage is marked as the second label (the second label may be 0), and 0 indicates that the object is not a retained object.
Step 506, counting the number of the retention objects at the t +1 th stage to obtain the predicted retention object number at the t +1 th stage.
Specifically, the object IDs with a retained object label of 1 are input, and deduplication statistics are performed (the same object may otherwise be counted repeatedly due to multiple activities and the like), so as to obtain the predicted retained object number of the t+1 th stage. Therefore, the predicted retained object number of the t+1 th stage can be accurately counted.
Optionally, the predicting the number of objects includes predicting the number of reflow objects, predicting the number of attrition objects, and predicting the number of retention objects, as shown in fig. 6, the method for predicting the number of objects further includes:
Step 601, determining the predicted daily active object number of the t+1 th stage according to the predicted newly added object number of the t+1 th stage, the predicted reflow object number of the t+1 th stage, the predicted attrition object number of the t+1 th stage, and the predicted retained object number of the t+1 th stage.
Specifically, the predicted newly added object number of the t+1 th stage is, for example, $\widehat{nv}_{t+1}$, the predicted reflow object number of the t+1 th stage is, for example, $\widehat{rv}_{t+1}$, the predicted attrition object number of the t+1 th stage is, for example, $\widehat{lv}_{t+1}$, and the predicted retained object number of the t+1 th stage is, for example, $\widehat{sv}_{t+1}$.
Taking one day as a period, the number of daily active objects can be obtained, and the number of daily active objects may specifically be the DAU, so that the predicted daily active object number of the t+1 th stage may be, for example, $\widehat{dau}_{t+1} = \widehat{nv}_{t+1} + \widehat{rv}_{t+1} + \widehat{sv}_{t+1}$, the attrition number entering the prediction through the retained number, since $\widehat{sv}_{t+1} \approx dau_t - \widehat{lv}_{t+1}$.
Therefore, the four factors influencing the DAU (newly added, retained, attrition, and reflow) can be split, and a more accurate predicted daily active object number can be obtained according to the predicted newly added object number of the t+1 th stage, the predicted reflow object number of the t+1 th stage, the predicted attrition object number of the t+1 th stage, and the predicted retained object number of the t+1 th stage.
In other embodiments, a month may also be taken as a period, and the monthly active object number may be obtained; the monthly active object number may specifically be the MAU, and the predicted monthly active object number of the t+1 th stage, $\widehat{mau}_{t+1}$, can be obtained in the same manner.
Therefore, the more accurate monthly active object quantity prediction can be obtained.
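Assuming, for illustration, that the predicted active object number combines the four factors as new + reflow + retained, with the retained number taken as the previous period's DAU minus attrition (this split is an assumption of the sketch, not a statement of the claimed formula), step 601 may be sketched as:

```python
def predict_dau(prev_dau, new, reflow, attrition):
    # Assumed relation: retained = previous DAU - attrition objects.
    retained = prev_dau - attrition
    return new + reflow + retained

print(predict_dau(1000, 120, 50, 80))  # -> 1090
```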
Optionally, as shown in fig. 7, the method for predicting the number of objects includes:
step 701, acquiring the number of actually added objects from the t-m stage to the t stage, wherein m is a positive integer greater than or equal to 1, and t is greater than m.
Specifically, the actual newly added object numbers from the t-m th stage to the t-th stage are, for example, $\{nv_t, nv_{t-1}, \ldots, nv_{t-m}\}$.
And step 702, modeling according to the actual number of the newly added objects from the t-m stage to the t-1 stage to obtain a newly added object prediction model.
Specifically, the newly added object prediction model may be an AR algorithm model, and the following model may be constructed:
$nv_t = w_0 nv_{t-1} + w_1 nv_{t-2} + \cdots + w_{p-1} nv_{t-p} + \lambda_{t-1}$

wherein p is a positive integer greater than or equal to 1, t is greater than p, m is greater than p, $\lambda_{t-1} \sim N(0, 1)$, N(0, 1) represents the standard normal distribution, and $\{w_0, w_1, \ldots, w_{p-1}\}$ are the model weights. In certain embodiments, m = 2p = 2n.
The actual newly added object numbers $\{nv_t, nv_{t-1}, \ldots, nv_{t-m}\}$ are substituted into the above AR algorithm model, so that the following equation system can be formed:

$nv_k = w_0 nv_{k-1} + w_1 nv_{k-2} + \cdots + w_{p-1} nv_{k-p} + \lambda_{k-1}, \quad k = t-m+p, \ldots, t$

The AR algorithm model is trained and tested by a linear regression method to obtain the model weights $\{\hat w_0, \hat w_1, \ldots, \hat w_{p-1}\}$.
Step 703, determining the predicted number of newly added objects at the t +1 th stage according to the actual number of newly added objects at the t th stage and the newly added object prediction model.
Specifically, the actual newly added object numbers from the t-p+1 th stage to the t-th stage, for example $\{nv_t, nv_{t-1}, \ldots, nv_{t-p+1}\}$, and the model weights $\{\hat w_0, \hat w_1, \ldots, \hat w_{p-1}\}$ are input into the newly added object prediction model to obtain the predicted newly added object number of the t+1 th stage:

$\widehat{nv}_{t+1} = \hat w_0 nv_t + \hat w_1 nv_{t-1} + \cdots + \hat w_{p-1} nv_{t-p+1}$
Therefore, the predicted newly-added object number in the t +1 th stage can be more accurately determined.
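For illustration, the AR(p) weights of steps 702-703 may be fitted by ordinary least squares; the counts, the order p = 2 and the absence of an intercept term are assumptions of this sketch:

```python
import numpy as np

# Assumed actual newly added object counts for the most recent stages.
nv = np.array([100, 110, 118, 130, 141, 150, 162, 171, 183, 195], dtype=float)
p = 2
# Design matrix: each row holds the p preceding counts of one target nv_k.
X = np.array([nv[k - p:k][::-1] for k in range(p, len(nv))])
y = nv[p:]
w, *_ = np.linalg.lstsq(X, y, rcond=None)   # model weights {w_0, ..., w_{p-1}}
nv_next = float(w @ nv[-p:][::-1])          # predicted nv_{t+1}
print(len(w), nv_next > 0)
```

On a steadily growing series the fitted weights extrapolate the trend, so the predicted count lands somewhat above the last observed value.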
Optionally, the newly added object prediction model includes a model weight, as shown in fig. 8, step 702 may be implemented by step 801, specifically:
step 801, determining model weight according to the actual number of newly added objects from the t-m stage to the t-1 stage to obtain a newly added object prediction model. Step 801 may refer to the description of step 702.
It can be understood that a newly added object is a new object, and a new object has no object feature data; the reflow objects, the attrition objects, and the retained objects belong to old objects, and old objects have object feature data. Therefore, the reflow object number, the attrition object number, and the retained object number can be predicted from the object feature data, while the newly added object number is predicted from the historical newly added object numbers.
Optionally, as shown in fig. 9, the method for predicting the number of objects further includes:
step 901, constructing a standard deviation index according to the predicted daily active object number from the t-n stage to the t +1 stage and the actual daily active object number from the t-n stage to the t +1 stage.
Specifically, the predicted daily active object number of the t+1 th stage is, for example, $\widehat{dau}_{t+1}$, and the actual daily active object number of the t+1 th stage is, for example, $dau_{t+1}$; a standard deviation index can then be constructed:

$MSE_{t+1} = (\widehat{dau}_{t+1} - dau_{t+1})^2$

The standard deviation index of each period is constructed in the same way to obtain the standard deviation index sequence $[MSE_{t-n}, \ldots, MSE_t, MSE_{t+1}]$.
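The per-period squared-error index of step 901 may be sketched as follows (the DAU values are assumed for illustration):

```python
predicted = [980.0, 1010.0, 1090.0]   # predicted DAU per period
actual = [1000.0, 1000.0, 1100.0]     # actual DAU per period
mse_seq = [(p - a) ** 2 for p, a in zip(predicted, actual)]
print(mse_seq)  # -> [400.0, 100.0, 100.0]
```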
And step 902, when the standard deviation indexes accord with the preset distribution, determining that the prediction model and the newly added object prediction model for obtaining the predicted newly added object number accord with the model evaluation standard.
Specifically, a standard deviation trend graph of each period is generated according to the standard deviation index sequence. When the standard deviation trend graph meets the standard normal distribution, the standard deviation indexes accord with the preset distribution, the loss of the predicted daily active object number accords with the standard normal model, and the prediction models (the reflow object prediction model, the attrition object prediction model, and the retained object prediction model) and the newly added object prediction model accord with the model evaluation standard, so that the prediction models and the newly added object prediction model can accurately predict the object numbers.
And 903, when the standard deviation indexes do not accord with the preset distribution, retraining the prediction model and the newly added object prediction model.
Specifically, a standard deviation trend graph of each period is generated according to the standard deviation index sequence. When the standard deviation trend graph does not meet the standard normal distribution, the standard deviation indexes do not accord with the preset distribution, the loss of the predicted daily active object number does not accord with the standard normal model, and the prediction models (the reflow object prediction model, the attrition object prediction model, and the retained object prediction model) and the newly added object prediction model do not accord with the model evaluation standard. At this time, the prediction models and the newly added object prediction model need to be trained again, and the prediction is performed again after retraining, until the standard deviation indexes accord with the preset distribution, so that the prediction models and the newly added object prediction model can accurately estimate the object numbers.
Optionally, as shown in fig. 10, the method for estimating the number of objects further includes:
And step 1001, calculating an average value and a standard deviation according to the actual daily active object numbers from the t-n th stage to the t-th stage.
Specifically, the actual daily active object numbers from the t-n th stage to the t-th stage are, for example, $[dau_{t-n}, \ldots, dau_t]$. The average value of the actual daily active object numbers from the t-n th stage to the t-th stage is calculated:

$\overline{dau} = \frac{1}{n+1} \sum_{i=t-n}^{t} dau_i$

and the standard deviation of the actual daily active object numbers from the t-n th stage to the t-th stage:

$s = \sqrt{\frac{1}{n} \sum_{i=t-n}^{t} (dau_i - \overline{dau})^2}$
at step 1002, a confidence interval is constructed based on the mean and standard deviation.
In particular, the confidence interval may be $[\overline{dau} - z_{\alpha/2}\, s, \ \overline{dau} + z_{\alpha/2}\, s]$, wherein $\overline{dau} - z_{\alpha/2}\, s$ may be the lower limit value of the confidence interval, $\overline{dau} + z_{\alpha/2}\, s$ may be the upper limit value of the confidence interval, and $z_{\alpha/2}$ may be the quantile corresponding to a preset confidence level. In one embodiment, the preset confidence level may be 95% (so that $z_{\alpha/2}$ is approximately 1.96); of course, in other embodiments, the preset confidence level may also be 90%, 98%, and the like, and is not particularly limited.
And 1003, sending out alarm information when the number of the predicted daily active objects in the t +1 th period is outside the confidence interval.
Specifically, the predicted daily active object number $\widehat{dau}_{t+1}$ of the t+1 th stage and the confidence interval $[\overline{dau} - z_{\alpha/2}\, s, \ \overline{dau} + z_{\alpha/2}\, s]$ are input, and whether $\widehat{dau}_{t+1}$ is outside the confidence interval is judged; if $\widehat{dau}_{t+1}$ is outside the confidence interval, alarm information is sent out. The manner of sending the alarm information may be: sending the alarm information to product decision makers and product operators through official accounts, mails, short messages, enterprise WeChat, WeChat, and the like. After the alarm information is received, the alarm information can be displayed through a data visualization tool for the product decision makers and the product operators to check.
Optionally, the confidence interval includes a lower limit value and an upper limit value, as shown in fig. 11, step 1002 may be implemented by step 1101, specifically:
and 1101, calculating according to the average value, the preset confidence degree and the standard deviation, and obtaining a lower limit value and an upper limit value to construct a confidence interval. Step 1101 can be referred to the description of step 1002.
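Steps 1001 to 1003 may be sketched as follows; the DAU values and the 95% quantile $z_{\alpha/2} \approx 1.96$ are assumptions of the example:

```python
import math

dau = [1000.0, 1040.0, 980.0, 1020.0, 960.0]   # actual DAU, stages t-n .. t
mean = sum(dau) / len(dau)
std = math.sqrt(sum((d - mean) ** 2 for d in dau) / (len(dau) - 1))  # sample std
z = 1.96                                        # quantile for 95% confidence
lower, upper = mean - z * std, mean + z * std   # confidence interval bounds

def needs_alarm(predicted_dau):
    """Alarm when the predicted DAU falls outside the confidence interval."""
    return predicted_dau < lower or predicted_dau > upper

print(needs_alarm(1010.0), needs_alarm(1500.0))  # -> False True
```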
In order to better explain the object quantity estimation method provided by the embodiment of the present application, the flow of the object quantity estimation method provided by the embodiment of the present application can be summarized as the following steps:
step 201, acquiring object characteristic data of a target product from the t-n stage to the t stage and object labels from the t-n +1 stage to the t stage, wherein t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n.
Step 301, determining a parameter matrix and a parameter vector of a reflow object prediction model according to the object feature data from the t-n th stage to the t-1 th stage and the reflow object labels from the t-n+1 th stage to the t-th stage to obtain a reflow object prediction model. And step 302, determining the reflow probability of the t+1 th stage according to the characteristic data of the object at the t-th stage and the prediction model. And step 303, classifying according to the reflow probability and the reflow probability threshold of the t+1 th stage to determine the reflow object label of the t+1 th stage. Step 304, when the reflow object label in the t+1 th stage is marked as the first label, determining that the object is a reflow object. Step 305, when the reflow object label in the t+1 th stage is marked as the second label, determining that the object is a non-reflow object. And step 306, counting the number of the reflow objects in the t+1 th stage to obtain the predicted reflow object number of the t+1 th stage.
Step 401, determining an attrition object prediction model parameter matrix and an attrition object prediction model parameter vector according to the object characteristic data from the t-n stage to the t-1 stage and the attrition object labels from the t-n +1 stage to the t stage to obtain an attrition object prediction model. And step 402, determining the loss probability of the t +1 th stage according to the characteristic data of the object in the t th stage and the prediction model. And step 403, classifying according to the attrition probability of the t +1 th stage and an attrition probability threshold value to determine an attrition object label of the t +1 th stage. In step 404, when the attrition object tag in the t +1 th stage is marked as the first tag, the object is determined to be an attrition object. Step 405, determining the object as a non-attrition object when the attrition object tag of the t +1 th stage is marked as a second tag. Step 406, counting the number of the attrition subjects in the t +1 th stage to obtain the predicted attrition subject number in the t +1 th stage.
Step 501, determining a retention object prediction model parameter matrix and a retention object prediction model parameter vector according to the object feature data from the t-n th stage to the t-1 th stage and the retention object labels from the t-n+1 th stage to the t th stage, so as to obtain a retention object prediction model. Step 502, determining the retention probability of the t+1 th stage according to the object feature data of the t th stage and the prediction model. Step 503, classifying according to the retention probability of the t+1 th stage and a retention probability threshold to determine the retention object label of the t+1 th stage. Step 504, when the retention object label of the t+1 th stage is a first label, determining that the object is a retention object. Step 505, when the retention object label of the t+1 th stage is a second label, determining that the object is a non-retention object. Step 506, counting the number of retention objects in the t+1 th stage to obtain the predicted number of retention objects in the t+1 th stage.
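Steps 301-306, 401-406, and 501-506 above share one pattern: fit a model described by a parameter matrix and a parameter vector, score each object's stage-t feature data into a stage-t+1 probability, threshold that probability into a first or second label, and count the first-label objects. A minimal sketch of that pattern, assuming a logistic-regression-style model (the patent does not fix the model family) and illustrative parameter values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_count(features, weights, bias, threshold):
    """Score stage-t features into stage-t+1 probabilities, threshold
    them into first/second labels, and count the first-label objects.

    features : (n_objects, n_features) stage-t object feature matrix
    weights  : (n_features,) fitted parameter vector (assumed form)
    bias     : scalar intercept
    threshold: probability cut-off (e.g. the reflow probability threshold)
    """
    probs = sigmoid(features @ weights + bias)   # stage-t+1 probabilities
    labels = probs >= threshold                  # True -> first label
    return labels, int(labels.sum())             # predicted object count

# toy example: 4 objects, 2 features (all values illustrative)
X = np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5], [0.7, 0.3]])
w = np.array([2.0, -1.0])
labels, count = predict_count(X, w, bias=0.0, threshold=0.5)
```

The same function covers the reflow, attrition, and retention cases; only the training labels and the threshold change.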
Step 701, acquiring the actual number of newly added objects from the t-m th stage to the t th stage, where m is a positive integer greater than or equal to 1 and t is greater than m. Step 702, modeling according to the actual number of newly added objects from the t-m th stage to the t-1 th stage to obtain a newly added object prediction model. Step 703, determining the predicted number of newly added objects in the t+1 th stage according to the actual number of newly added objects in the t th stage and the newly added object prediction model.
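Steps 701-703 (and the model weights of claim 13) can be sketched as a weighted combination of the recent actual counts. The exponential-decay weighting below is an assumption for illustration; the patent only says the model weights are determined from the historical actual counts:

```python
def fit_weights(history, decay=0.5):
    """Derive model weights from the actual new-object counts of stages
    t-m .. t-1 (assumed scheme: exponential decay, newest stage heaviest)."""
    raw = [decay ** i for i in range(len(history) - 1, -1, -1)]
    total = sum(raw)
    return [r / total for r in raw]

def predict_new(window, weights):
    """Predicted stage-t+1 new-object count: weighted sum over the most
    recent window of actual counts, ending at stage t."""
    return sum(w * x for w, x in zip(weights, window))

actual = [100, 120, 110, 130]          # stages t-3 .. t, actual new objects
w = fit_weights(actual[:-1])           # weights fitted on stages t-3 .. t-1
forecast = predict_new(actual[1:], w)  # applied to the window ending at stage t
```

With these toy numbers the forecast is a decay-weighted blend of 120, 110, and 130, pulled toward the most recent stage.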
Step 601, determining the predicted number of daily active objects in the t+1 th stage according to the predicted number of newly added objects in the t+1 th stage, the predicted number of reflow objects in the t+1 th stage, the predicted number of attrition objects in the t+1 th stage, and the predicted number of retention objects in the t+1 th stage.
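Step 601 combines the four stage-t+1 forecasts into one predicted daily active count, but the patent does not state the combining formula. One natural reading, sketched below as an assumption, is that the daily active objects are exactly the newly added, reflow, and retention objects, with the attrition forecast serving as a consistency check against the previous stage's active count:

```python
def predict_daily_active(new, reflow, retained):
    """Predicted stage-t+1 daily active objects, assuming active objects
    are exactly the newly added, reflow, and retention objects."""
    return new + reflow + retained

def attrition_consistent(prev_active, retained, attrition, tol=0):
    """Cross-check: the previous stage's active objects that were not
    retained should roughly equal the attrition forecast."""
    return abs((prev_active - retained) - attrition) <= tol

# illustrative stage-t+1 forecasts
dau = predict_daily_active(new=120, reflow=30, retained=800)
ok = attrition_consistent(prev_active=1000, retained=800, attrition=200)
```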
Step 901, constructing a standard deviation index according to the predicted number of daily active objects from the t-n th stage to the t+1 th stage and the actual number of daily active objects from the t-n th stage to the t+1 th stage. Step 902, when the standard deviation index conforms to a preset distribution, determining that the prediction model and the newly added object prediction model used to obtain the predicted number of newly added objects meet the model evaluation criteria. Step 903, when the standard deviation index does not conform to the preset distribution, retraining the prediction model and the newly added object prediction model.
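Steps 901-903 can be read as standardizing the per-stage prediction errors and testing them against the expected distribution. The sketch below assumes the "preset distribution" is the empirical three-sigma rule on the standardized errors; the patent leaves the exact distribution and test unspecified:

```python
import statistics

def std_indicator(predicted, actual):
    """Standardized per-stage errors between predicted and actual daily
    active counts over stages t-n .. t+1."""
    errors = [p - a for p, a in zip(predicted, actual)]
    mu = statistics.mean(errors)
    sigma = statistics.stdev(errors)          # sample standard deviation
    return [(e - mu) / sigma for e in errors]

def meets_preset(indicator, limit=3.0):
    """Model passes evaluation when every standardized error lies within
    +/- limit standard deviations; otherwise the models are retrained."""
    return all(abs(z) <= limit for z in indicator)

pred = [980, 1010, 1005, 990, 1020]   # illustrative predicted daily actives
act  = [1000, 1000, 1000, 1000, 1000] # illustrative actual daily actives
z = std_indicator(pred, act)
passed = meets_preset(z)
```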
Step 1001, calculating an average value and a standard deviation according to the actual number of daily active objects from the t-n th stage to the t th stage. Step 1002, constructing a confidence interval according to the average value and the standard deviation. Step 1003, sending out alarm information when the predicted number of daily active objects in the t+1 th stage is outside the confidence interval.
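Steps 1001-1003 build a confidence interval from the historical actual counts and alarm on an out-of-range forecast. A sketch, assuming the symmetric mean +/- z * standard-deviation interval of claim 16 with an illustrative z of 1.96 (the patent's "preset confidence degree" is not given):

```python
import statistics

def confidence_interval(history, z=1.96):
    """Lower and upper limits from the actual daily active counts of
    stages t-n .. t (assumed form: mean +/- z * standard deviation)."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - z * sigma, mu + z * sigma

def should_alarm(forecast, history, z=1.96):
    """True (send alarm information) when the stage-t+1 forecast falls
    outside the confidence interval."""
    lo, hi = confidence_interval(history, z)
    return not (lo <= forecast <= hi)

history = [1000, 1020, 980, 1010, 990]  # illustrative actual daily actives
alarm = should_alarm(1200, history)     # far above the interval -> alarm
ok = not should_alarm(1005, history)    # inside the interval -> no alarm
```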
All the above technical solutions can be combined arbitrarily to form the optional embodiments of the present application, and are not described herein again.
According to the method and the device of the present application, the prediction model is obtained by modeling according to the object feature data from the t-n th stage to the t-1 th stage and the object labels from the t-n+1 th stage to the t th stage, so that the prediction model captures the relation between the object feature data and the object labels; the object label of the t+1 th stage can then be determined according to the object feature data of the t th stage, and the predicted number of objects in the t+1 th stage is obtained by counting the object labels of the t+1 th stage.
In order to better implement the object quantity estimation method in the embodiment of the present application, an object quantity estimation device is further provided in the embodiment of the present application. Referring to fig. 12, fig. 12 is a schematic structural diagram of an object quantity estimation apparatus according to an embodiment of the present disclosure. The object quantity estimation apparatus 1200 may include:
the obtaining module 1210, the obtaining module 1210 is configured to obtain object feature data of a target product from a t-n stage to a t stage and object tags from the t-n +1 stage to the t stage, t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n;
the first processing module 1220, the first processing module 1220 is configured to obtain a prediction model according to the object feature data from the t-n th stage to the t-1 th stage and the object tag from the t-n +1 th stage to the t-th stage;
the second processing module 1230, the second processing module 1230 is configured to determine a t +1 th stage object tag according to the t-th stage object feature data and the prediction model;
the statistical module 1240 is used for obtaining the number of the predicted objects in the t +1 th stage according to the statistics of the object tags in the t +1 th stage.
Optionally, the first processing module 1220 may be configured to: and determining a parameter matrix and a parameter vector of a reflow object prediction model according to the object characteristic data from the t-n stage to the t-1 stage and the reflow object labels from the t-n +1 stage to the t stage to obtain the reflow object prediction model.
Optionally, the second processing module 1230 may be configured to: determine the reflow probability of the t+1 th stage according to the object feature data of the t th stage and the prediction model; and classify according to the reflow probability of the t+1 th stage and a reflow probability threshold to determine the reflow object label of the t+1 th stage.
Optionally, the statistics module 1240 may be configured to: when the reflow object label of the t+1 th stage is a first label, determine that the object is a reflow object; when the reflow object label of the t+1 th stage is a second label, determine that the object is a non-reflow object; and count the number of reflow objects in the t+1 th stage to obtain the predicted number of reflow objects in the t+1 th stage.
Optionally, the first processing module 1220 may be configured to: and determining an attrition object prediction model parameter matrix and an attrition object prediction model parameter vector according to the object characteristic data from the t-n stage to the t-1 stage and the attrition object labels from the t-n +1 stage to the t stage so as to obtain an attrition object prediction model.
Optionally, the second processing module 1230 is configured to: determine the attrition probability of the t+1 th stage according to the object feature data of the t th stage and the prediction model; and classify according to the attrition probability of the t+1 th stage and an attrition probability threshold to determine the attrition object label of the t+1 th stage.
Optionally, the statistics module 1240 may be configured to: when the attrition object label of the t+1 th stage is a first label, determine that the object is an attrition object; when the attrition object label of the t+1 th stage is a second label, determine that the object is a non-attrition object; and count the number of attrition objects in the t+1 th stage to obtain the predicted number of attrition objects in the t+1 th stage.
Optionally, the first processing module 1220 may be configured to: and determining a retention object prediction model parameter matrix and a retention object prediction model parameter vector according to the object characteristic data from the t-n stage to the t-1 stage and the retention object label from the t-n +1 stage to the t stage to obtain a retention object prediction model.
Optionally, the second processing module 1230 may be configured to: determining the retention probability of the t +1 th stage according to the characteristic data of the object at the t th stage and the prediction model; and classifying according to the retention probability of the t +1 th stage and a retention probability threshold value to determine the retention object label of the t +1 th stage.
Optionally, the statistics module 1240 may be configured to: when the retention object label of the t+1 th stage is a first label, determine that the object is a retention object; when the retention object label of the t+1 th stage is a second label, determine that the object is a non-retention object; and count the number of retention objects in the t+1 th stage to obtain the predicted number of retention objects in the t+1 th stage.
Optionally, the object quantity predicting apparatus 1200 further includes a third processing module 1250, and the third processing module 1250 is configured to: determine the predicted number of daily active objects in the t+1 th stage according to the predicted number of newly added objects in the t+1 th stage, the predicted number of reflow objects in the t+1 th stage, the predicted number of attrition objects in the t+1 th stage, and the predicted number of retention objects in the t+1 th stage.
Optionally, the object quantity estimation apparatus 1200 further includes an acquisition module 1261, a fourth processing module 1262, and a fifth processing module 1263. The acquisition module 1261 may be configured to obtain the number of actually newly added objects from the t-m stage to the t stage, where m is a positive integer greater than or equal to 1, and t is greater than m. The fourth processing module 1262 may be configured to obtain a newly added object prediction model by modeling according to the actual number of newly added objects from the t-m stage to the t-1 stage. The fifth processing module 1263 may be configured to determine a predicted number of newly added objects at stage t +1 based on the actual number of newly added objects at stage t and the newly added object prediction model.
Optionally, the fourth processing module 1262 may be configured to: and determining the weight of the model according to the actual number of the newly added objects from the t-m stage to the t-1 stage to obtain a newly added object prediction model.
Optionally, the object quantity estimation apparatus 1200 further includes a sixth processing module 1271 and a seventh processing module 1272. The sixth processing module 1271 may be configured to construct a standard deviation index according to the predicted number of daily active objects from the t-n th stage to the t+1 th stage and the actual number of daily active objects from the t-n th stage to the t+1 th stage. The seventh processing module 1272 may be configured to determine, when the standard deviation index conforms to the preset distribution, that the prediction model and the newly added object prediction model used to obtain the predicted number of newly added objects meet the model evaluation criteria. Alternatively, the seventh processing module 1272 may be configured to retrain the prediction model and the newly added object prediction model when the standard deviation index does not conform to the preset distribution.
Optionally, the object quantity estimation apparatus 1200 further includes an eighth processing module 1281, a ninth processing module 1282, and a tenth processing module 1283. The eighth processing module 1281 may be configured to calculate a mean and a standard deviation according to the number of active objects on the actual day from the t-n th period to the t-th period. A ninth processing module 1282 may be used to construct confidence intervals based on the mean and standard deviation. The tenth processing module 1283 may be configured to issue an alert message when the predicted number of daily active objects in the t +1 th period is outside the confidence interval.
All or part of the modules and units in the object quantity estimation device can be realized by software, hardware, or a combination thereof. The modules and units may be embedded in hardware form in, or independent of, a processor in the computer device, or may be stored in a memory in the computer device in software form, so that the processor can call and execute the operations corresponding to the modules and units.
The object quantity estimation device 1200 may be integrated in a terminal or a server having a storage and a processor installed therein and having a computing capability, or the object quantity estimation device 1200 may be the terminal or the server.
Optionally, the present application further provides a computer device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps in the foregoing method embodiments when executing the computer program.
Fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present application, where the computer device may be the terminal or the server shown in fig. 1. As shown in fig. 13, the computer device 1300 may include: a communication interface 1301, a memory 1302, a processor 1303 and a communication bus 1304. The communication interface 1301, the memory 1302, and the processor 1303 implement communication with each other through a communication bus 1304. The communication interface 1301 is used for data communication with an external device. The memory 1302 may be used for storing software programs and modules, and the processor 1303 may execute the software programs and modules stored in the memory 1302, for example, the software programs of the corresponding operations in the foregoing method embodiments.
Alternatively, the processor 1303 may call the software programs and modules stored in the memory 1302 to perform the following operations: acquiring object feature data of a target product from the t-n th stage to the t th stage and object labels from the t-n+1 th stage to the t th stage, where t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n; modeling according to the object feature data from the t-n th stage to the t-1 th stage and the object labels from the t-n+1 th stage to the t th stage to obtain a prediction model; determining the object label of the t+1 th stage according to the object feature data of the t th stage and the prediction model; and obtaining the predicted number of objects in the t+1 th stage by counting the object labels of the t+1 th stage.
The present application also provides a computer-readable storage medium for storing a computer program. The computer-readable storage medium can be applied to a computer device, and the computer program enables the computer device to execute the corresponding process in the object quantity estimation method in the embodiment of the present application, which is not described herein again for brevity.
The present application also provides a computer program product comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instruction from the computer-readable storage medium, and executes the computer instruction, so that the computer device executes the corresponding process in the object quantity estimation method in the embodiment of the present application, which is not described herein again for brevity.
The present application also provides a computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instruction from the computer-readable storage medium, and executes the computer instruction, so that the computer device executes the corresponding process in the object quantity estimation method in the embodiment of the present application, which is not described herein again for brevity.
It should be understood that the processor of the embodiments of the present application may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The Processor may be a general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the memory in the embodiments of the subject application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The non-volatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory. Volatile Memory can be Random Access Memory (RAM), which acts as external cache Memory. By way of example, but not limitation, many forms of RAM are available, such as Static random access memory (Static RAM, SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic random access memory (Synchronous DRAM, SDRAM), Double Data Rate Synchronous Dynamic random access memory (DDR SDRAM), Enhanced Synchronous SDRAM (ESDRAM), Synchronous link SDRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer or a server) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A method for predicting the number of objects is characterized by comprising the following steps:
acquiring object characteristic data of a target product from the t-n stage to the t stage and object labels from the t-n +1 stage to the t stage, wherein t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n;
modeling according to the object characteristic data from the t-n stage to the t-1 stage and the object labels from the t-n +1 stage to the t stage to obtain a prediction model;
determining a t +1 stage object label according to the t stage object characteristic data and the prediction model;
and obtaining the number of predicted objects in the t +1 th stage according to the t +1 th stage object label statistics.
2. The method for predicting the number of objects according to claim 1, wherein the object labels comprise reflow object labels, the reflow objects are objects in the target product that are active in the t-2 th stage, inactive in the t-1 th stage, and active in the t th stage, the prediction model comprises a reflow object prediction model, the reflow object prediction model comprises a reflow object prediction model parameter matrix and a reflow object prediction model parameter vector, and the modeling according to the object feature data from the t-n th stage to the t-1 th stage and the object labels from the t-n+1 th stage to the t th stage to obtain the prediction model comprises:
and determining the parameter matrix of the reflow object prediction model and the parameter vector of the reflow object prediction model according to the object characteristic data from the t-n stage to the t-1 stage and the reflow object label from the t-n +1 stage to the t stage so as to obtain the reflow object prediction model.
3. The method for predicting the number of objects according to claim 2, wherein the determining the t +1 th object label according to the t-th object feature data and the prediction model comprises:
determining the reflow probability of the t+1 th stage according to the object feature data of the t th stage and the prediction model;
and classifying according to the reflow probability of the t+1 th stage and a reflow probability threshold to determine the reflow object label of the t+1 th stage.
4. The method of claim 3, wherein predicting the number of objects comprises predicting the number of reflow objects, and obtaining the number of predicted objects in the t +1 th stage according to the t +1 th stage object tag statistics comprises:
when the reflow object label of the t+1 th stage is a first label, determining that the object is a reflow object;
when the reflow object label of the t+1 th stage is a second label, determining that the object is a non-reflow object;
and counting the number of reflow objects in the t +1 th stage to obtain the predicted reflow object number in the t +1 th stage.
5. The method of claim 1, wherein the object labels comprise attrition object labels, the attrition objects are objects in the target product that are active in the t-1 th stage and inactive in the t th stage, the prediction model comprises an attrition object prediction model, the attrition object prediction model comprises an attrition object prediction model parameter matrix and an attrition object prediction model parameter vector, and the modeling according to the object feature data from the t-n th stage to the t-1 th stage and the object labels from the t-n+1 th stage to the t th stage to obtain the prediction model comprises:
and determining the parameter matrix of the attrition object prediction model and the parameter vector of the attrition object prediction model according to the object characteristic data from the t-n stage to the t-1 stage and the attrition object labels from the t-n +1 stage to the t stage so as to obtain the attrition object prediction model.
6. The method for predicting the number of objects according to claim 5, wherein the determining the t +1 th object label according to the t-th object feature data and the prediction model comprises:
determining the attrition probability of the t+1 th stage according to the object feature data of the t th stage and the prediction model;
and classifying according to the attrition probability of the t +1 th stage and an attrition probability threshold value to determine the attrition object label of the t +1 th stage.
7. The method of claim 6, wherein predicting the number of subjects comprises predicting the number of attrition subjects, and wherein obtaining the number of t +1 st stage predicted subjects according to the t +1 st stage subject label statistics comprises:
when the attrition object label of the t+1 th stage is a first label, determining that the object is an attrition object;
when the attrition object label of the t+1 th stage is a second label, determining that the object is a non-attrition object;
and counting the number of attrition objects in the t+1 th stage to obtain the predicted number of attrition objects in the t+1 th stage.
8. The method according to claim 1, wherein the object labels comprise retention object labels, the retention objects are objects in the target product that are active in both the t-1 th stage and the t th stage, the prediction model comprises a retention object prediction model, the retention object prediction model comprises a retention object prediction model parameter matrix and a retention object prediction model parameter vector, and the modeling according to the object feature data from the t-n th stage to the t-1 th stage and the object labels from the t-n+1 th stage to the t th stage to obtain the prediction model comprises:
and determining the parameter matrix of the preserved object prediction model and the parameter vector of the preserved object prediction model according to the object characteristic data from the t-n stage to the t-1 stage and the preserved object label from the t-n +1 stage to the t stage so as to obtain the preserved object prediction model.
9. The method for predicting the number of objects according to claim 8, wherein the determining the t +1 th object label according to the t-th object feature data and the prediction model comprises:
determining the retention probability of the t +1 th stage according to the characteristic data of the object at the t th stage and the prediction model;
and classifying according to the retention probability of the t +1 th stage and a retention probability threshold value to determine a retention object label of the t +1 th stage.
10. The method for predicting the number of objects according to claim 9, wherein predicting the number of objects comprises predicting a number of retention objects, and obtaining the number of predicted objects in the t+1 th stage according to the t+1 th stage object label statistics comprises:
when the retention object label of the t+1 th stage is a first label, determining that the object is a retention object;
when the retention object label of the t+1 th stage is a second label, determining that the object is a non-retention object;
and counting the number of the retention objects at the t +1 th stage to obtain the predicted retention object number at the t +1 th stage.
11. The method of claim 1, wherein the number of predicted objects comprises a predicted number of reflow objects, a predicted number of attrition objects, and a predicted number of retention objects, and wherein the method further comprises:
and determining the predicted daily active object number of the t +1 th stage according to the predicted newly-increased object number of the t +1 th stage, the predicted backflow object number of the t +1 th stage, the predicted attrition object number of the t +1 th stage and the predicted retention object number of the t +1 th stage.
12. The method of claim 11, wherein the method of predicting the number of objects comprises:
acquiring the number of actual newly added objects from the t-m stage to the t stage, wherein m is a positive integer greater than or equal to 1, and t is greater than m;
modeling according to the actual number of newly added objects from the t-m stage to the t-1 stage to obtain a newly added object prediction model;
and determining the predicted new added object number of the t +1 th stage according to the actual new added object number of the t th stage and the new added object prediction model.
13. The method for predicting the number of objects according to claim 12, wherein the model for predicting the newly added objects comprises model weights, and the model for obtaining the model for predicting the newly added objects according to the actual number of the newly added objects from the t-m stage to the t-1 stage comprises:
and determining the model weight according to the actual number of the newly added objects from the t-m stage to the t-1 stage to obtain the newly added object prediction model.
14. The method of claim 11, wherein the method further comprises:
constructing a standard deviation index according to the predicted number of daily active objects and the actual number of daily active objects for the (t-n)-th to (t+1)-th stages;
when the standard deviation index conforms to a preset distribution, determining that the prediction model and the newly added object prediction model used to obtain the predicted number of newly added objects meet the model evaluation standard; or
when the standard deviation index does not conform to the preset distribution, retraining the prediction model and the newly added object prediction model.
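Claim 14's "standard deviation index" and "preset distribution" are not defined. One common construction, shown here purely as an illustrative sketch: standardize the per-stage prediction errors and require every standardized error to stay within a fixed bound (here ±3, the usual three-sigma rule), treating that as "conforming to the preset distribution".

```python
import statistics

def standard_deviation_index(predicted, actual):
    # Hypothetical index: standardize each stage's prediction error by
    # the mean and standard deviation of all the errors.
    errors = [p - a for p, a in zip(predicted, actual)]
    mu = statistics.mean(errors)
    sd = statistics.stdev(errors)
    return [(e - mu) / sd for e in errors]

def meets_evaluation_standard(z_scores, bound=3.0):
    # Stand-in for "conforms to the preset distribution": all
    # standardized errors lie within +/- bound; otherwise the models
    # would be retrained.
    return all(abs(z) <= bound for z in z_scores)
```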
15. The method of claim 11, wherein the method further comprises:
calculating a mean and a standard deviation according to the actual number of daily active objects for the (t-n)-th to t-th stages;
constructing a confidence interval according to the mean and the standard deviation; and
issuing alarm information when the predicted number of daily active objects for the (t+1)-th stage falls outside the confidence interval.
16. The method of claim 15, wherein the confidence interval comprises a lower limit and an upper limit, and constructing the confidence interval according to the mean and the standard deviation comprises:
calculating the lower limit and the upper limit according to the mean, a preset confidence level, and the standard deviation to construct the confidence interval.
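Claims 15–16 can be sketched as follows. The z-value of 1.96 (roughly a 95% confidence level under a normal assumption) stands in for the "preset confidence level"; the claims do not fix either the level or the distributional assumption.

```python
import statistics

def confidence_interval(daily_actives, z=1.96):
    # Lower and upper limits from the mean, a preset confidence level
    # (encoded as the z-value), and the standard deviation of the
    # actual daily active counts for stages t-n .. t.
    mean = statistics.mean(daily_actives)
    sd = statistics.stdev(daily_actives)
    return mean - z * sd, mean + z * sd

def should_alert(predicted_next, interval):
    # Alarm condition of claim 15: the stage-(t+1) prediction falls
    # outside the confidence interval.
    lower, upper = interval
    return predicted_next < lower or predicted_next > upper
```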
17. An apparatus for predicting a number of objects, the apparatus comprising:
an acquisition module, configured to acquire object feature data of a target product for the (t-n)-th to t-th stages and object labels for the (t-n+1)-th to t-th stages, wherein t is a positive integer greater than 1, n is a positive integer greater than or equal to 1, and t is greater than n;
a first processing module, configured to build a prediction model according to the object feature data for the (t-n)-th to (t-1)-th stages and the object labels for the (t-n+1)-th to t-th stages;
a second processing module, configured to determine object labels for the (t+1)-th stage according to the object feature data for the t-th stage and the prediction model; and
a statistics module, configured to obtain a predicted number of objects for the (t+1)-th stage by counting the object labels for the (t+1)-th stage.
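The four modules of claim 17 can be mirrored as four methods. This is a hypothetical sketch only: the "prediction model" here is a trivial threshold rule on a single per-object score, standing in for whatever model the patent actually trains, and all names are invented for illustration.

```python
class ObjectCountPredictor:
    # Sketch of claim 17's apparatus: acquisition, first processing
    # (model building), second processing (labelling stage t+1), and
    # statistics (counting labels).

    def acquire(self, features, labels):
        # Acquisition module: features for stages t-n .. t (one list of
        # per-object scores per stage), labels for stages t-n+1 .. t.
        self.features = features
        self.labels = labels

    def build_model(self):
        # First processing module: pair stage-k features with stage-
        # (k+1) labels and learn a separating threshold (stand-in model).
        pos = [f for sf, sy in zip(self.features[:-1], self.labels)
               for f, y in zip(sf, sy) if y == 1]
        neg = [f for sf, sy in zip(self.features[:-1], self.labels)
               for f, y in zip(sf, sy) if y == 0]
        self.threshold = (min(pos) + max(neg)) / 2

    def predict_labels(self):
        # Second processing module: label stage t+1 from stage-t features.
        return [1 if f >= self.threshold else 0 for f in self.features[-1]]

    def count(self):
        # Statistics module: predicted object count = number of 1-labels.
        return sum(self.predict_labels())
```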
18. A computer-readable storage medium storing a computer program adapted to be loaded by a processor to perform the steps of the method for estimating the number of objects according to any one of claims 1 to 16.
19. A computer device, comprising a processor and a memory, wherein the memory stores a computer program, and the processor is configured to perform the steps of the method for estimating the number of objects according to any one of claims 1 to 16 by calling the computer program stored in the memory.
20. A computer program product comprising computer instructions, wherein the computer instructions, when executed by a processor, implement the steps of the method for estimating the number of objects according to any one of claims 1 to 16.
CN202210272449.8A 2022-03-18 2022-03-18 Method, device, storage medium, device and program product for estimating number of objects Pending CN114626889A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210272449.8A CN114626889A (en) 2022-03-18 2022-03-18 Method, device, storage medium, device and program product for estimating number of objects

Publications (1)

Publication Number Publication Date
CN114626889A true CN114626889A (en) 2022-06-14

Family

ID=81901135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210272449.8A Pending CN114626889A (en) 2022-03-18 2022-03-18 Method, device, storage medium, device and program product for estimating number of objects

Country Status (1)

Country Link
CN (1) CN114626889A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (country: HK; legal event code: DE; document number: 40070939)