CN117689235A - Data processing method, device, computer equipment and readable storage medium - Google Patents

Info

Publication number
CN117689235A
CN117689235A
Authority
CN
China
Prior art keywords
time period
unit time
business
target
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211034638.8A
Other languages
Chinese (zh)
Inventor
朱少杰
贾骐玮
李玮
符志航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co Ltd
Priority to CN202211034638.8A
Publication of CN117689235A
Legal status: Pending

Landscapes

  • Information Transfer Between Computers (AREA)

Abstract

The embodiment of the application provides a data processing method, apparatus, computer device and readable storage medium. The method can be applied to scenarios such as cloud technology, artificial intelligence, intelligent transportation, assisted driving, multimedia and recommendation, and includes: acquiring, within each of T unit time periods, a conversion rate parameter group and an attenuation parameter; determining, according to the conversion rate parameter groups and the attenuation parameters, conversion rate index parameters of the participating object for the business object in each of N business scenarios; acquiring, within a target unit time period, the interaction durations of the participating object in the N business scenarios; determining, according to the interaction durations, importance index parameters of the participating object for the business object in the N business scenarios; and fusing the conversion rate index parameters and the importance index parameters to generate the interest degree parameter of the participating object for the business object within the target time period. The method and apparatus can improve both the accuracy of the generated interest degree parameter and its relevance across scenarios.

Description

Data processing method, device, computer equipment and readable storage medium
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a data processing method, a data processing device, a computer device, and a readable storage medium.
Background
Currently, when determining the degree of interest of a participating object in a business object, the interest is usually graded manually: different participating objects are assigned to different grades, participating objects in different grades correspond to different interest levels, and participating objects in the same grade share the same interest level, so the interest degree parameter of a participating object for a business object cannot be determined accurately. In addition, in the related art, different interest degrees are generated in different business scenarios, and these interest degrees are not transferable between scenarios, which leads to the problem of data islands (i.e., a lack of relevance between data and incompatibility between databases).
Disclosure of Invention
The embodiment of the application provides a data processing method, a data processing device, computer equipment and a readable storage medium, which can improve the accuracy of generating interest degree parameters and the relevance of the interest degree parameters.
In one aspect, an embodiment of the present application provides a data processing method, including:
acquiring, within T unit time periods, a conversion rate parameter group of a participating object in each unit time period and an attenuation parameter corresponding to each unit time period; T is a positive integer greater than 1; the conversion rate parameter group includes conversion rate parameters of the participating object for the business object in each of N business scenarios; N is a positive integer greater than 1; the T unit time periods belong to a target time period; the target time period includes a target unit time period;
determining, according to the conversion rate parameter group in each unit time period and the attenuation parameter corresponding to each unit time period, conversion rate index parameters of the participating object for the business object in each of the N business scenarios within the target time period;
acquiring, within the target unit time period, interaction durations of the participating object for the business object in each of the N business scenarios;
determining, according to the interaction durations in the N business scenarios within the target unit time period, importance index parameters of the participating object for the business object in each of the N business scenarios within the target unit time period;
and fusing the conversion rate index parameters and the importance index parameters to generate an interest degree parameter of the participating object for the business object within the target time period.
An aspect of an embodiment of the present application provides a data processing apparatus, including:
a parameter acquisition module, configured to acquire, within T unit time periods, a conversion rate parameter group of a participating object in each unit time period and an attenuation parameter corresponding to each unit time period; T is a positive integer greater than 1; the conversion rate parameter group includes conversion rate parameters of the participating object for the business object in each of N business scenarios; N is a positive integer greater than 1; the T unit time periods belong to a target time period; the target time period includes a target unit time period;
a first parameter determining module, configured to determine, according to the conversion rate parameter group in each unit time period and the attenuation parameter corresponding to each unit time period, conversion rate index parameters of the participating object for the business object in each of the N business scenarios within the target time period;
a duration acquisition module, configured to acquire, within the target unit time period, interaction durations of the participating object for the business object in each of the N business scenarios;
a second parameter determining module, configured to determine, according to the interaction durations in the N business scenarios within the target unit time period, importance index parameters of the participating object for the business object in each of the N business scenarios within the target unit time period;
and a parameter fusion module, configured to fuse the conversion rate index parameters and the importance index parameters to generate an interest degree parameter of the participating object for the business object within the target time period.
Wherein the N business scenarios include a business scenario D_i, where i is a positive integer less than or equal to N;
the parameter acquisition module includes:
a parameter acquisition unit, configured to acquire, within each of the T unit time periods, the conversion rate parameter of the participating object for the business object in the business scenario D_i;
a parameter dividing unit, configured to divide the conversion rate parameters acquired within the same unit time period into the same conversion rate parameter group, to obtain the conversion rate parameter group in each unit time period;
an interval acquisition unit, configured to acquire a time period interval between each unit time period and an auxiliary unit time period, and determine, according to the time period interval, the attenuation parameter corresponding to each unit time period; the auxiliary unit time period is a unit time period later than the target time period, and the auxiliary unit time period is adjacent to the target time period.
Wherein the T unit time periods include a unit time period P_e, where e is a positive integer less than or equal to T;
the parameter acquisition unit is specifically configured to acquire, in the business scenario D_i, target multimedia data associated with the business object, and acquire, within the unit time period P_e, the media exposure count of the participating object for the target multimedia data and the total media interaction count of the participating object for the target multimedia data;
the parameter acquisition unit is specifically configured to take the ratio of the total media interaction count to the media exposure count as the conversion rate parameter of the participating object for the business object in the business scenario D_i within the unit time period P_e.
Wherein the T unit time periods include a unit time period P_e, where e is a positive integer less than or equal to T;
the parameter acquisition unit is specifically configured to acquire, in the business scenario D_i, target multimedia data associated with the business object, acquire, within the unit time period P_e, the total media interaction count of the participating object for the target multimedia data, and screen the total media interaction count to obtain the effective media interaction count of the participating object for the target multimedia data;
the parameter acquisition unit is specifically configured to take the ratio of the effective media interaction count to the total media interaction count as the conversion rate parameter of the participating object for the business object in the business scenario D_i within the unit time period P_e.
Wherein the T unit time periods include a unit time period P_e, where e is a positive integer less than or equal to T;
the parameter acquisition unit is specifically configured to acquire, in the business scenario D_i, target multimedia data associated with the business object, acquire, within the unit time period P_e, the effective media interaction count of the participating object for the target multimedia data, and screen the effective media interaction count to obtain the complete media interaction count of the participating object for the target multimedia data;
the parameter acquisition unit is specifically configured to take the ratio of the complete media interaction count to the effective media interaction count as the conversion rate parameter of the participating object for the business object in the business scenario D_i within the unit time period P_e.
Wherein the N business scenarios include a business scenario D_i, where i is a positive integer less than or equal to N; the T unit time periods include a unit time period P_e, where e is a positive integer less than or equal to T; the conversion rate parameter group in the unit time period P_e includes the conversion rate parameter in the business scenario D_i within the unit time period P_e;
the first parameter determining module is specifically configured to determine, according to the conversion rate parameter in the business scenario D_i within the unit time period P_e and the attenuation parameter corresponding to the unit time period P_e, an attenuated conversion rate parameter in the business scenario D_i within the unit time period P_e;
the first parameter determining module is specifically configured to fuse the attenuated conversion rate parameters in the business scenario D_i within each of the T unit time periods, to obtain the conversion rate index parameter of the participating object for the business object in the business scenario D_i within the target time period.
Wherein the N business scenarios include a business scenario D_i, where i is a positive integer less than or equal to N;
the duration acquisition module is specifically configured to acquire, in the business scenario D_i, target multimedia data associated with the business object, and acquire, within the target unit time period, an auxiliary duration of the participating object for the target multimedia data; the time unit of the auxiliary duration is a first time unit;
the duration acquisition module is specifically configured to perform time unit conversion on the auxiliary duration having the first time unit, to obtain the interaction duration of the participating object for the business object in the business scenario D_i within the target unit time period; the time unit of the interaction duration is a second time unit; the first time unit and the second time unit are different.
Wherein the second parameter determining module includes:
a deviation determining unit, configured to determine, according to the auxiliary duration, a deviation parameter of the participating object for the business object in the business scenario D_i within the target unit time period;
a parameter determining unit, configured to determine, according to the interaction duration and the deviation parameter in the business scenario D_i within the target unit time period, the importance index parameter of the participating object for the business object in the business scenario D_i within the target unit time period.
The deviation determining unit is specifically configured to compare the auxiliary duration with a plurality of duration ranges in a parameter mapping table, and determine, among the plurality of duration ranges, the duration range containing the auxiliary duration as a target duration range;
the deviation determining unit is specifically configured to determine the duration parameter corresponding to the target duration range in the parameter mapping table as the deviation parameter of the participating object for the business object in the business scenario D_i within the target unit time period.
Wherein the N business scenarios include a business scenario D_i, where i is a positive integer less than or equal to N; the number of conversion rate index parameters for the business scenario D_i is G, where G is a positive integer greater than 1;
the parameter fusion module is specifically configured to fuse the G conversion rate index parameters in the business scenario D_i and the importance index parameter in the business scenario D_i, to obtain a fusion index parameter corresponding to the business scenario D_i;
the parameter fusion module is specifically configured to fuse the fusion index parameters corresponding to each of the N business scenarios, to obtain the interest degree parameter of the participating object for the business object within the target time period.
Optionally, the N business scenarios include a business scenario D_i, where i is a positive integer less than or equal to N; the business object is an entity in a knowledge graph, and the media data to be linked in the business scenario D_i is multimedia data having a link relationship with the entity;
a first type determining module, configured to determine a resource type of the media data to be linked, and if the resource type is a graphics-text type, take the media data to be linked as target multimedia data associated with the business object in the business scenario D_i based on auxiliary media data of the media data to be linked;
a second type determining module, configured to, if the resource type is a video type, determine a video length type of the media data to be linked according to the duration of the media data to be linked, and take the media data to be linked as target multimedia data associated with the business object in the business scenario D_i based on the video length type.
Wherein the second type determining module includes:
a first type determining unit, configured to, if the video length type is a long video type, take the media data to be linked as target multimedia data associated with the business object in the business scenario D_i based on the media data identifier of the media data to be linked and the object identifier of the business object;
a second type determining unit, configured to, if the video length type is a short video type, take the media data to be linked as target multimedia data associated with the business object in the business scenario D_i based on auxiliary media data of the media data to be linked.
Wherein the auxiliary media data includes auxiliary text data;
the second type determining unit is specifically configured to map the media data to be linked according to the auxiliary text data of the media data to be linked;
the second type determining unit is specifically configured to, if the media data to be linked is successfully mapped to auxiliary long video data, take the media data to be linked as target multimedia data associated with the business object in the business scenario D_i based on the video data identifier of the auxiliary long video data and the object identifier of the business object;
the second type determining unit is specifically configured to, if the mapping of the media data to be linked fails, determine an auxiliary object corresponding to the media data to be linked according to the auxiliary media data of the media data to be linked, and take the media data to be linked as target multimedia data associated with the business object in the business scenario D_i based on the similarity between the auxiliary object and the business object.
In one aspect, a computer device is provided, including: a processor and a memory;
the processor is connected to the memory, wherein the memory is configured to store a computer program, and when the computer program is executed by the processor, the computer device is caused to execute the method provided in the embodiment of the application.
In one aspect, the present application provides a computer readable storage medium storing a computer program adapted to be loaded and executed by a processor, so that a computer device having the processor performs the method provided in the embodiments of the present application.
In one aspect, the present application provides a computer program product comprising a computer program stored on a computer readable storage medium. The processor of the computer device reads the computer program from the computer-readable storage medium, and the processor executes the computer program, so that the computer device executes the method provided in the embodiment of the present application.
In the embodiment of the present application, the computer device may acquire, within T unit time periods, the conversion rate parameter group of the participating object in each unit time period and the attenuation parameter corresponding to each unit time period, and further determine, according to the conversion rate parameter group in each unit time period and the attenuation parameter corresponding to each unit time period, the conversion rate index parameters of the participating object for the business object in each of the N business scenarios within the target time period. Here, T may be a positive integer greater than 1, and N may be a positive integer greater than 1; the conversion rate parameter group includes conversion rate parameters of the participating object for the business object in each of the N business scenarios; the T unit time periods belong to the target time period, which includes the target unit time period. Further, the computer device may acquire, within the target unit time period, the interaction durations of the participating object for the business object in the N business scenarios, and further determine, according to these interaction durations, the importance index parameters of the participating object for the business object in each of the N business scenarios within the target unit time period. Further, the computer device may fuse the conversion rate index parameters and the importance index parameters to generate the interest degree parameter of the participating object for the business object within the target time period. In this way, the embodiment of the application forms a calculation scheme that automatically produces, in batch, a comprehensive interest score that continuously measures the interest of the participating object in the business object; this comprehensive interest score is the interest degree parameter, and since the interest degree parameter is obtained by fusing the conversion rate index parameters in the N business scenarios and the importance index parameters in the N business scenarios, the relevance of the interest degree parameter can be improved. In addition, the conversion rate index parameter reflects the interest within the T unit time periods of the target time period, and the importance index parameter reflects the interest within the target unit time period among the T unit time periods; fusing the conversion rate index parameter and the importance index parameter generates an interest degree parameter represented in a continuous rather than coarsely graded manner, so that the accuracy of generating the interest degree parameter can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the related art, the drawings that are required to be used in the embodiments or the related technical descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to the drawings without inventive effort for a person having ordinary skill in the art.
Fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application;
fig. 2 is a schematic view of a scenario for data interaction according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a data processing method according to an embodiment of the present application;
FIG. 4 is a schematic view of a scene of an importance curve according to an embodiment of the present disclosure;
FIG. 5a is a schematic flow chart of feature fusion according to an embodiment of the present application;
fig. 5b is a schematic view of a scene for feature fusion according to an embodiment of the present application;
fig. 6 is a schematic flow chart of scene fusion according to an embodiment of the present application;
FIG. 7a is a schematic diagram of a scenario in which a business object is displayed according to an embodiment of the present application;
FIG. 7b is a schematic diagram of a scenario in which a business object is displayed according to an embodiment of the present application;
FIG. 7c is a schematic diagram of a scenario in which a business object is displayed according to an embodiment of the present application;
FIG. 7d is a schematic diagram of a scenario in which a business object is displayed according to an embodiment of the present application;
FIG. 8 is a flow chart of a data processing method according to an embodiment of the present disclosure;
FIG. 9 is a schematic flow chart of a data processing method according to an embodiment of the present application;
fig. 10 is a schematic view of a knowledge graph according to an embodiment of the present application;
FIG. 11 is a flowchart of linking target media data according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a data processing apparatus according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
It should be appreciated that artificial intelligence (AI) is the theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge and use knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive technology of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a way similar to human intelligence. Artificial intelligence is the study of the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning and decision-making.
Specifically, referring to fig. 1, fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application. As shown in fig. 1, the network architecture may include a server 2000 and a cluster of terminal devices. Wherein the cluster of terminal devices may in particular comprise one or more terminal devices, the number of terminal devices in the cluster of terminal devices will not be limited here. As shown in fig. 1, the plurality of terminal devices may specifically include a terminal device 3000a, a terminal device 3000b, terminal devices 3000c, …, a terminal device 3000n; the terminal devices 3000a, 3000b, 3000c, …, 3000n may be directly or indirectly connected to the server 2000 through a wired or wireless communication manner, respectively, so that each terminal device may interact with the server 2000 through the network connection.
Wherein each terminal device in the terminal device cluster may include: smart phones, tablet computers, notebook computers, desktop computers, intelligent voice interaction devices, intelligent home appliances (e.g., smart televisions), wearable devices, vehicle terminals, aircraft and other intelligent terminals with data processing functions. For ease of understanding, the embodiment of the present application may select one terminal device from the plurality of terminal devices shown in fig. 1 as the target terminal device. For example, the embodiment of the present application may take the terminal device 3000a shown in fig. 1 as the target terminal device.
The server 2000 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, basic cloud computing services such as big data and artificial intelligence platforms.
For ease of understanding, the embodiments of the present application may take an IP (Intellectual Property) as the business object. IP is an Internet buzzword; in the Internet world, IP may be understood as a collective term for all entitled and creative works, that is, IP may represent the copyright of an intellectual creation; for example, an IP may be an invention, a literary work, a film, an animation, a game, an artistic work, and the like. Further, content that can be distributed as an IP is a "meme" or "topic" that can generate an effect, so an IP may be regarded as a product that can generate an effect; for example, an IP may be a character or a new concept in a film.
It can be understood that the business object in the embodiment of the present application may be used to construct a knowledge graph, where the knowledge graph is a knowledge base including entities and relationships, and is generally represented by triplet information of the form (head entity, relationship, tail entity) (or (entity, attribute, attribute value)); the knowledge graph has the advantages of containing the semantic relationships of related entities, diverse relationship compositions, good interpretability and the like, so it has been widely applied to fields such as search, recommendation and question answering in recent years. The head entity and the tail entity may be collectively referred to as entities, and the business object in the embodiment of the present application may serve as an entity in the knowledge graph. Alternatively, the head entity may be referred to as an entity, the relationship may be referred to as an attribute, and the tail entity may be referred to as an attribute value.
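As an illustration of the triple structure described above, the following is a minimal Python sketch; the entity and relation names are purely hypothetical:

```python
from typing import NamedTuple

class Triple(NamedTuple):
    head: str      # head entity, e.g. a business object (IP)
    relation: str  # relationship, which may also be viewed as an attribute
    tail: str      # tail entity, which may also be viewed as the attribute value

# A tiny hypothetical knowledge graph built from triples
knowledge_graph = [
    Triple("IP_A", "type", "film"),
    Triple("IP_A", "adapted_from", "IP_B"),
    Triple("IP_B", "type", "novel"),
]
```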
For ease of understanding, the user corresponding to the target terminal device may be referred to as a participation object in the embodiments of the present application. It should be appreciated that the above-described network framework may be adapted to determine a degree of interest (i.e., an interest degree parameter, also referred to as a user interest degree) of a participant object with respect to a business object, where the user interest degree is used to characterize a degree of interest of a user in a certain entity (i.e., IP) in a knowledge graph. The user interest degree can be used for personalized marketing and analysis of user groups with different interest degrees in the business object, so that the conversion capability of the business object is improved, and the user experience is improved; the user interest degree can be applied to business such as e-commerce operation, advertisement delivery, model optimization and the like. The interestingness parameter in the model optimization service can be used as a label or a feature in the model training process.
It should be appreciated that the computer device (e.g., the server 2000) may obtain the conversion rate index parameters of the participating object for the business object in the N business scenarios within the target time period, and the importance index parameters of the participating object for the business object in the N business scenarios within the target unit time period, and then fuse the conversion rate index parameters in the N business scenarios and the importance index parameters in the N business scenarios to generate the interest degree parameter of the participating object for the business object within the target time period. The target unit time period may be the last unit time period of the T unit time periods in the target time period, or may be the last several unit time periods (for example, the last two unit time periods) of the T unit time periods, which is not limited in the present application. For ease of understanding, the conversion rate index parameter and the importance index parameter may be collectively referred to as interest degree index parameters in the embodiments of the present application.
It can be understood that the interestingness parameter can be directly applied to advertisement delivery, video personalized recommendation, recall strategy (namely, the interestingness parameter is used as recall feature in the recommendation field), product operation and the like; the interestingness parameter can be used for participating in object recommendation in the warm-up period of a new business object by means of cross-domain capability or solving the cold start problem; the interestingness parameter can be used for the operation of the service object oriented to the high-quality user in each service domain, and the oriented delivery of each resource bit.
For ease of understanding, further, please refer to fig. 2, fig. 2 is a schematic diagram of a scenario for data interaction according to an embodiment of the present application. The server 20a shown in fig. 2 may be the server 2000 in the embodiment corresponding to fig. 1, and the terminal device 20b shown in fig. 2 may be the target terminal device in the embodiment corresponding to fig. 1. Wherein the terminal device 20b may be configured to display target multimedia data associated with a service object, the terminal device 20b may be configured to display recommended media data, and the user corresponding to the terminal device 20b may be a participant object 20c, wherein the target multimedia data associated with the service object may include target graphics class data and target video class data.
As shown in fig. 2, the participating object 20c may perform an interactive operation on the target multimedia data associated with the business object through the terminal device 20b, so that the terminal device 20b may transmit the target interaction information associated with the business object to the server 20a. The interactive operations performed by the participating object 20c on the target multimedia data associated with the business object may include, but are not limited to: a clicking operation, a praise operation, a sharing operation, a collection operation, a comment operation, and the like.
As shown in fig. 2, after receiving the target interaction information, the server 20a may store the target interaction information in the multimedia database 21a; the multimedia database 21a may be provided separately, or may be integrated on the server 20a, or may be integrated on another device or in the cloud, which is not limited here. The multimedia database 21a may include a plurality of databases, and the plurality of databases may specifically include: database 22a, databases 22b, …, database 22c; the databases 22a, 22b, …, 22c may be used to store data associated with business objects, e.g., the database 22a may be used to store graphics-text class data, the database 22b may be used to store video class data, and the database 22c may be used to store interaction information for the graphics-text class data and the video class data, which may be collectively referred to herein as multimedia data. Thus, the database 22a may be used to store the target graphics class data, the database 22b may be used to store the target video class data, and the server 20a may store the target interaction information into the database 22c.
The coordinate axis 21b shown in fig. 2 may be a target time period, which may include T unit time periods, where T may be a positive integer greater than 1. Wherein, the coordinate axis 21b may include 7 segments, that is, the target time period may include 7 unit time segments (i.e., herein, T is equal to 7 for example), and the 7 unit time segments may specifically include: the unit time period 25a, the unit time period 25b, the unit time period 25c, the unit time period 25d, the unit time period 25e, the unit time period 25f, and the unit time period 25g, the embodiment of the present application may take the unit time period 25g as the target unit time period.
Further, as shown in fig. 2, the server 20a may obtain interaction information associated with the business object in T unit time periods from the database 22c, respectively. Since multimedia data belong to different service scenarios, interaction information associated with a service object (i.e., interaction information for multimedia data) within T unit time periods correspondingly belong to different service scenarios. The number of the service scenes may be N, where N may be a positive integer greater than 1, and the N service scenes may include, but are not limited to: literature fields (i.e., literature scenes), news fields (i.e., news scenes), video fields (i.e., video scenes), browser fields (i.e., browser scenes), and the like. Similarly, server 20a may obtain interaction information associated with business objects for a target unit time period from database 22 c.
Further, as shown in fig. 2, the server 20a may determine, according to the interaction information associated with the business object in each unit time period, the conversion rate parameter group of the participation object 20c in each unit time period and the attenuation parameter corresponding to each unit time period; similarly, the server 20a may determine, according to the interaction information associated with the business object in the target unit time period, the interaction durations of the participation object 20c for the business object in each of the N business scenarios. Within the T unit time periods, one unit time period may correspond to one conversion rate parameter group and one attenuation parameter; one conversion rate parameter group may include the conversion rate parameters of the participation object 20c for the business object in each of the N business scenarios, that is, one unit time period may correspond to N groups of conversion rate parameters, and the T unit time periods may correspond to N × T groups of conversion rate parameters. Within the target unit time period, one business scenario may correspond to one interaction duration, and the N business scenarios may correspond to N interaction durations. One group of conversion rate parameters may include G conversion rate parameters, where G may be a positive integer greater than 1.
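To make these shapes concrete, the following Python sketch (hypothetical names and values) lays out one participating object's data for T unit time periods, N business scenarios and G conversion rate parameters per group:

```python
T, N, G = 7, 4, 3  # unit time periods, business scenarios, conversion rate parameters per group

# conversion_params[e][i] is the list of G conversion rate parameters of the participating
# object in business scenario D_i within unit time period P_e (N * T groups in total).
conversion_params = [[[0.0] * G for _ in range(N)] for _ in range(T)]

# One attenuation parameter per unit time period, derived from its interval to the
# auxiliary unit time period.
decay_params = [0.0] * T

# One interaction duration (in hours) per business scenario within the target unit time period.
interaction_durations = [0] * N
```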
The conversion parameter set in the unit time period 25a may be the conversion parameter set 23a, and the attenuation parameter corresponding to the unit time period 25a may be the attenuation parameter 24a; the conversion parameter sets and the attenuation parameters corresponding to the unit time periods 25b, … and 25g, respectively, may be referred to the description of the unit time period 25a, and will not be described herein.
Further, as shown in fig. 2, the server 20a may determine, according to the conversion rate parameter group in each unit time period and the attenuation parameter corresponding to each unit time period, the conversion rate index parameters of the participation object 20c for the business object in each of the N business scenarios within the target time period. Similarly, the server may determine, according to the interaction durations in the N business scenarios within the target unit time period, the importance index parameters of the participation object 20c for the business object in each of the N business scenarios within the target unit time period. One business scenario may correspond to G conversion rate index parameters, the N business scenarios may correspond to N × G conversion rate index parameters, one business scenario may correspond to one importance index parameter, and the N business scenarios may correspond to N importance index parameters.
Further, as shown in fig. 2, the server 20a may fuse the conversion rate index parameters and the importance index parameters to generate the interest degree parameter of the participating object 20c for the business object within the target time period. Further, the server 20a may obtain, according to the interest degree parameter of the participating object 20c for the business object within the target time period, the recommended media data to be recommended to the participating object 20c, and then return the recommended media data to the terminal device 20b, so that the terminal device 20b may display the recommended media data. Here, the recommended media data may be media data that the participating object 20c is interested in; for example, when the participating object 20c has a high interest degree parameter for the business object within the target time period (i.e., when the participating object 20c is interested in the business object), the server 20a may recommend a picture, commodity, pendant, etc. associated with the business object to the terminal device 20b, in which case the recommended media data may be such a picture, commodity, pendant, etc.
Therefore, according to the embodiment of the application, the interest degree parameter of the participating object for the business object can be determined; the interest degree parameter is obtained by fusing the conversion rate index parameters in the N business scenarios and the importance index parameters in the N business scenarios, and fusing the interest degree index parameters (i.e., the conversion rate index parameters and the importance index parameters) across the N business scenarios can improve the relevance of the interest degree parameter. In addition, by fusing the conversion rate index parameters and the importance index parameters, the embodiment of the application can generate an interest degree parameter that continuously measures the interest of the participating object in the business object, so that the accuracy of generating the interest degree parameter can be improved.
Further, referring to fig. 3, fig. 3 is a flow chart of a data processing method according to an embodiment of the present application. The method may be performed by a server, or may be performed by a terminal device, or may be performed by a server and a terminal device together, where the server may be the server 20a in the embodiment corresponding to fig. 2, and the terminal device may be the terminal device 20b in the embodiment corresponding to fig. 2. For ease of understanding, embodiments of the present application will be described in terms of this method being performed by a server. The data processing method may include the following steps S101 to S105:
step S101, acquiring a conversion rate parameter set of a participation object in each unit time period and attenuation parameters corresponding to each unit time period in T unit time periods;
Here, T may be a positive integer greater than 1; the conversion rate parameter group includes conversion rate parameters of the participating object for the business object in each of N business scenarios, where N may be a positive integer greater than 1; the T unit time periods belong to a target time period, and the target time period includes a target unit time period. For each unit time period, each of the N business scenarios may include G conversion rate parameters, where G may be a positive integer greater than 1. The N business scenarios include a business scenario D_i, where i may be a positive integer less than or equal to N; the business scenario D_i may include G conversion rate parameters per unit time period (i.e., the number of conversion rate index parameters for the business scenario D_i is G), so the business scenario D_i may include T × G conversion rate parameters over the T unit time periods. In other words, the embodiment of the application may provide G conversion rate indexes, and the parameter values of the business scenario D_i for the G conversion rate indexes are the G conversion rate parameters.
It should be understood that the time length of the target time period is greater than the time length of the unit time period, and the embodiment of the present application does not limit the time length of the target time period and the time length of the unit time period. For example, when the target time period is equal to 30 days, the unit time period may be 1 day, where T is equal to 30; for another example, when the target time period is equal to 28 days, the unit time period may be 7 days, where T is equal to 4; for another example, where the target time period is equal to 365 days (i.e., 8760 hours), the unit time period may be 876 hours, where T is equal to 10. For another example, when the target time period is equal to 1 year, the unit time period may be 1 month, where T is equal to 12.
It should be appreciated that the server may acquire, within each of the T unit time periods, the conversion rate parameter of the participating object for the business object in the business scenario D_i. Further, the server may divide the conversion rate parameters acquired within the same unit time period into the same conversion rate parameter group, to obtain the conversion rate parameter group in each unit time period; one conversion rate parameter group corresponds to one unit time period, and T conversion rate parameter groups correspond to the T unit time periods. Further, the server may acquire the time period interval between each unit time period and an auxiliary unit time period, and determine, according to the time period interval, the attenuation parameter corresponding to each unit time period. The auxiliary unit time period is a unit time period later than the target time period and adjacent to the target time period; in other words, the auxiliary unit time period is the first unit time period after the target time period. For example, if the target time period is equal to 30 days and the unit time period is 1 day, the auxiliary unit time period may be today (e.g., August 15), and the target time period may be the 30 days counted forward from yesterday (e.g., July 16 to August 14).
It should be understood that, besides basic interaction characteristics of conventional participation objects on service objects, the embodiment of the application can comb and summarize the basic interaction characteristics of each service scene, and design an interest index for more effectively and comprehensively describing the interest of the participation objects on the service objects. Wherein the basic interaction characteristics are determined by interaction information, which may include, but is not limited to: exposure, click, praise, share, collection, comment, play (for video class data), browse (for graphics class data), etc., where play and browse may be collectively referred to as view. The specific process of the server obtaining the conversion rate parameter (i.e. the parameter value of the interest index) of the participation object for the business object may be referred to as the description of step S1011 in the embodiment corresponding to fig. 8 below.
It can be understood that, in the present application, related data such as exposure, click-through, praise, share, collection, comment, play, browse, etc. in the interactive information are related, and when the embodiments of the present application are applied to specific products or technologies, user permission or consent needs to be obtained, and collection, use and processing of related data need to comply with relevant national laws and regulations and national standards of the country where the related data is located. For example, the terminal device may display a prompt message "whether to record the current collection information and send the recorded information to the server", and when the user authorization corresponding to the terminal device passes, the terminal device may upload the collection information to the server, so that the server counts the collection amount.
Wherein the N business scenarios include a business scenario D_i, where i may be a positive integer less than or equal to N; the T unit time periods include a unit time period P_e, where e may be a positive integer less than or equal to T; the conversion rate parameter group in the unit time period P_e includes the conversion rate parameter in the business scenario D_i within the unit time period P_e.
Step S102, according to the conversion rate parameter group in each unit time period and the attenuation parameters corresponding to each unit time period, determining conversion rate index parameters of the participation objects in N business scenes aiming at the business objects in the target time period;
Specifically, the server may determine, according to the conversion rate parameter in the business scenario D_i within the unit time period P_e and the attenuation parameter corresponding to the unit time period P_e, the attenuated conversion rate parameter in the business scenario D_i within the unit time period P_e. The server may take the product of the conversion rate parameter in the business scenario D_i within the unit time period P_e and the attenuation parameter corresponding to the unit time period P_e as the attenuated conversion rate parameter in the business scenario D_i within the unit time period P_e. Further, the server may fuse the attenuated conversion rate parameters in the business scenario D_i within each of the T unit time periods, to obtain the conversion rate index parameter of the participating object for the business object in the business scenario D_i within the target time period.
If the weights corresponding to the unit time periods are the same, the server may directly sum the attenuated conversion rate parameters in the business scenario D_i within each of the T unit time periods, to obtain the conversion rate index parameter of the participating object for the business object in the business scenario D_i within the target time period; optionally, if the weights corresponding to the unit time periods are different, the server may perform a weighted summation of the attenuated conversion rate parameters in the business scenario D_i within each of the T unit time periods according to the weight parameters corresponding to the unit time periods, to obtain the conversion rate index parameter of the participating object for the business object in the business scenario D_i within the target time period.
The process by which the server determines the conversion rate index parameters of the participating object for the business object in the N business scenarios within the target time period is shown in formula (1):

Score(D_i, F_k) = Σ_{e = L_e}^{U_e} rate_k(e) · decay(today − e)    (1)

wherein D_i represents the i-th information domain (i.e., business scenario), F_k represents the k-th feature, Score(D_i, F_k) represents the k-th feature score of the i-th information domain (i.e., the conversion rate index parameter), rate_k(e) represents the conversion rate of the k-th feature in the unit time period e (i.e., the conversion rate parameter), today represents the auxiliary unit time period, today − e represents the time period interval between the unit time period e and the auxiliary unit time period, decay(today − e) represents the attenuation parameter, U_e represents the upper limit of the period (i.e., the last unit time period P_T in the target time period), and L_e represents the lower limit of the period (i.e., the first unit time period P_1 in the target time period). Here U_e = today − 1; if the target time period is equal to 30 days, then U_e − L_e + 1 = 30.
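The following is a minimal Python sketch of formula (1), with hypothetical variable names, assuming the attenuation parameter has already been derived for each unit time period from its interval to the auxiliary unit time period:

```python
def conversion_index_parameter(rates: list[float], decays: list[float],
                               weights: list[float] | None = None) -> float:
    """Fuse the per-period attenuated conversion rate parameters of one feature F_k
    in one business scenario D_i over the T unit time periods (formula (1))."""
    if weights is None:
        # Equal weights per unit time period: a plain sum of attenuated parameters.
        return sum(r * d for r, d in zip(rates, decays))
    # Otherwise a weighted sum, one weight per unit time period.
    return sum(w * r * d for w, r, d in zip(weights, rates, decays))

# Example: T = 7 unit time periods with a simple hypothetical decay over the period interval.
rates = [0.10, 0.12, 0.08, 0.15, 0.20, 0.18, 0.25]
decays = [1.0 / (7 - e) for e in range(7)]  # hypothetical attenuation, larger for recent periods
print(conversion_index_parameter(rates, decays))
```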
Step S103, acquiring interaction time lengths of the participation objects in N business scenes aiming at the business objects respectively in a target unit time period;
Specifically, the server may acquire, in the business scenario D_i, target multimedia data associated with the business object, and acquire, within the target unit time period, an auxiliary duration of the participating object for the target multimedia data. The time unit of the auxiliary duration is a first time unit. Further, the server may perform time unit conversion on the auxiliary duration having the first time unit, to obtain the interaction duration of the participating object for the business object in the business scenario D_i within the target unit time period. The time unit of the interaction duration is a second time unit, and the first time unit and the second time unit are different.
The process by which the server performs time unit conversion on the auxiliary duration is shown in formula (2):

T_yesterday = ⌊ t_j / 60 ⌋    (2)

wherein t_j represents the viewing duration of the participating object for the j-th entity IP (i.e., the auxiliary duration), and T_yesterday represents the consumption duration (i.e., the interaction duration). The unit of the auxiliary duration is minutes (i.e., the first time unit), the unit of the interaction duration is hours (i.e., the second time unit), and the time unit conversion may represent rounding down the value computed from the auxiliary duration. It can be understood that if the target unit time period is 1 day, the value of the auxiliary duration lies in [0, 1440] and the value of the interaction duration lies in [0, 23]; for ease of understanding, this embodiment takes a unit time period of 1 day as an example.
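A small sketch of this time unit conversion in Python, assuming the auxiliary duration is given in minutes and clamping the result to the stated range of [0, 23] hours:

```python
def interaction_duration_hours(auxiliary_duration_minutes: int) -> int:
    # Round down from minutes to whole hours, clamped to [0, 23] as stated above.
    return min(auxiliary_duration_minutes // 60, 23)

print(interaction_duration_hours(251))   # 4 hours
print(interaction_duration_hours(1440))  # capped at 23
```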
Step S104, determining, according to the interaction durations in the N business scenarios within the target unit time period, importance index parameters of the participating object for the business object in each of the N business scenarios within the target unit time period;
Specifically, the server may determine, according to the auxiliary duration, the deviation parameter of the participating object for the business object in the business scenario D_i within the target unit time period. Further, the server may determine, according to the interaction duration and the deviation parameter in the business scenario D_i within the target unit time period, the importance index parameter of the participating object for the business object in the business scenario D_i within the target unit time period.
It may be appreciated that the server may compare the auxiliary duration with a plurality of duration ranges in the parameter mapping table, and determine, among the plurality of duration ranges, the duration range containing the auxiliary duration as the target duration range. Further, the server may determine the duration parameter corresponding to the target duration range in the parameter mapping table as the deviation parameter of the participating object for the business object in the business scenario D_i within the target unit time period.
The process by which the server determines the deviation parameter of the participating object for the business object in the business scenario D_i within the target unit time period is shown in formula (3):

β = 0.0, if 0 < t_j < 120 min;  β = 0.1, if 120 min ≤ t_j < 240 min;  β = 0.2, if 240 min ≤ t_j ≤ 1440 min    (3)

wherein formula (3) may be called the deviation stage function of the consumption duration importance, β represents the deviation variable of the consumption duration importance (i.e., the deviation parameter), and t_j represents the viewing duration of the participating object for the j-th entity IP (i.e., the auxiliary duration). It will be appreciated that the parameter mapping table may include a plurality of duration ranges and the duration parameters corresponding to the duration ranges, and the embodiment of the application does not limit their specific values. The descriptive value of the consumption behavior of the participating object within the target unit time period (e.g., the last day) is far greater than that of its consumption behavior within any single historical unit time period (any one day); through extensive data analysis it was found that, for the cumulative consumption duration within the last 24 hours, 2 hours (i.e., 120 minutes) and 4 hours (i.e., 240 minutes) differ considerably in significance, so the embodiment of the application makes a fine-grained design for the cumulative consumption importance (i.e., the importance index parameter) within the last day, in order to accurately and continuously learn the interest information of the participating object.
For ease of understanding, the embodiment of the present application is illustrated by taking an example in which the parameter mapping table includes 3 duration ranges. As shown in formula (3), the 3 duration ranges may include the duration range (0, 120 min) (i.e., 0-120 minutes), the duration range [120 min, 240 min) (i.e., 120-240 minutes), and the duration range [240 min, 1440 min] (i.e., 240-1440 minutes); the duration parameter corresponding to the duration range (0, 120 min) may be 0.0, the duration parameter corresponding to the duration range [120 min, 240 min) may be 0.1, and the duration parameter corresponding to the duration range [240 min, 1440 min] may be 0.2. For example, if the value of the auxiliary duration is 251 minutes, the server may determine the duration range [240 min, 1440 min] as the target duration range, and determine the duration parameter corresponding to the target duration range as the deviation parameter (i.e., 0.2).
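A sketch of this parameter mapping table lookup in Python, using the example ranges and duration parameters given above (the concrete values are illustrative only, since the embodiment does not limit them):

```python
# (lower bound in minutes, upper bound in minutes, duration parameter)
PARAMETER_MAPPING_TABLE = [
    (0, 120, 0.0),
    (120, 240, 0.1),
    (240, 1440, 0.2),
]

def deviation_parameter(auxiliary_duration_minutes: float) -> float:
    # Find the target duration range containing the auxiliary duration and
    # return its duration parameter as the deviation parameter beta.
    for lower, upper, beta in PARAMETER_MAPPING_TABLE:
        if lower <= auxiliary_duration_minutes < upper:
            return beta
    # Durations at or beyond the last lower bound fall into the last range.
    return PARAMETER_MAPPING_TABLE[-1][2]

print(deviation_parameter(251))  # 0.2
```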
The process by which the server determines the importance index parameter of the participating object for the business object in the business scenario D_i within the target unit time period is shown in formula (4):
wherein D_i represents the i-th information domain (i.e., business scenario), F_k represents the k-th feature, Score(D_i, F_k) represents the k-th feature score of the i-th information domain (i.e., the importance index parameter), T_yesterday represents the consumption duration (i.e., the interaction duration), and β represents the deviation variable of the consumption duration importance (i.e., the deviation parameter).
For ease of understanding, please refer to fig. 4, fig. 4 is a schematic view of a scene of an importance curve according to an embodiment of the present application. As shown in fig. 4, the importance index parameter is distributed in a target unit time period, that is, the cumulative consumption importance curve for the target unit time period (the last day), and the distribution corresponds to the above formula (4). The horizontal axis of the graph shown in fig. 4 is the interaction time length, the vertical axis is the importance index parameter, the value of the interaction time length is any one integer of [0,23], the importance index parameter increases along with the increase of the interaction time length, and when the interaction time length increases to a certain degree, the increasing speed of the importance index parameter gradually slows down.
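The exact functional form of formula (4) is not reproduced in the text above; purely as an illustration of the qualitative behaviour it describes (the importance index parameter grows with the interaction duration, the growth slows as the duration increases, and the deviation parameter β shifts the result), the following Python sketch uses an assumed logarithmic saturating curve, which is not the patented formula:

```python
import math

def importance_index_parameter(interaction_duration_hours: int, beta: float) -> float:
    # Illustrative saturating curve only: monotonically increasing over the
    # interaction duration (0..23 hours) with diminishing growth, offset by beta.
    # The actual form of formula (4) is an assumption here, not given by the source.
    return math.log1p(interaction_duration_hours) / math.log1p(23) + beta

print(importance_index_parameter(4, 0.2))
```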
Step S105, fusing the conversion rate index parameter and the importance index parameter to generate the interest degree parameter of the participatory object aiming at the business object in the target time period.
Specifically, the server may fuse the conversion rate index parameters in business scenario D_i (i.e., the G conversion rate index parameters) with the importance index parameter in business scenario D_i, to obtain the fusion index parameter corresponding to business scenario D_i. Further, the server may fuse the fusion index parameters respectively corresponding to the N business scenarios, to obtain the interestingness parameter of the participating object for the business object in the target time period.

Optionally, the server may instead fuse the conversion rate index parameters across the N business scenarios to obtain fused conversion rate index parameters. Specifically, the server may fuse the conversion rate index parameters corresponding to the same interestingness index in the N business scenarios, to obtain the fused conversion rate index parameter corresponding to each interestingness index (i.e., G fused conversion rate index parameters for the G interestingness indexes). Further, the server may fuse the importance index parameters in the N business scenarios to obtain a fused importance index parameter. Further, the server may fuse the fused conversion rate index parameters with the fused importance index parameter, to obtain the interestingness parameter of the participating object for the business object in the target time period.

The manner in which the server fuses the G conversion rate index parameters in business scenario D_i with the importance index parameter in business scenario D_i may be weighted summation or another manner, which is not limited in this application. Specifically, the server may perform weighted summation on the G conversion rate index parameters and the importance index parameter in business scenario D_i according to the index weights respectively corresponding to these index parameters. Similarly, the manner in which the server fuses the fusion index parameters respectively corresponding to the N business scenarios may be weighted summation or another manner, which is not limited in this application. The server may perform weighted summation on the fusion index parameters corresponding to the N business scenarios according to the scene weights respectively corresponding to the N business scenarios.
Wherein, the process by which the server performs weighted summation on the G conversion rate index parameters in business scenario D_i and the importance index parameter in business scenario D_i is shown in formula (5):

score_j^i = Σ_{k=1}^{M} w_k^i · f_k^i    (5)

wherein f_k^i represents the k-th feature score of the i-th information domain (i.e., a conversion rate index parameter or the importance index parameter), w_k^i represents the weight of the k-th feature in the i-th information domain (i.e., the index weight), and score_j^i represents the interestingness score of the participating object of the i-th information domain for the j-th IP (i.e., the fusion index parameter), where M equals (G+1). set(F_k) = {F_1, ..., F_M} may represent the set formed by F_k. The sum of the M index weights in the i-th information domain equals 1, that is, Σ_{k=1}^{M} w_k^i = 1, where k may take values from 1 to M; alternatively, k may take values from 0 to (M-1).
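To make the weighted summation of formula (5) concrete, the following Python sketch fuses the M = G + 1 interestingness index parameters of one business scenario using index weights that sum to 1; the function and variable names are illustrative and not taken from the patent.

```python
def fuse_within_scenario(index_parameters: list[float], index_weights: list[float]) -> float:
    """Weighted summation of the G conversion rate index parameters and the importance
    index parameter of one business scenario (the fusion index parameter of formula (5))."""
    if len(index_parameters) != len(index_weights):
        raise ValueError("each index parameter needs exactly one index weight")
    if abs(sum(index_weights) - 1.0) > 1e-9:
        raise ValueError("the index weights must sum to 1")
    return sum(weight * value for weight, value in zip(index_weights, index_parameters))

# Example with M = 3: one importance index parameter and two conversion rate index parameters.
fusion_index_parameter = fuse_within_scenario([0.8, 0.35, 0.6], [0.4, 0.3, 0.3])
```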
For ease of understanding, the specific process by which the server generates the fusion index parameters respectively corresponding to the N business scenarios may refer to fig. 5a, which is a schematic flow diagram of feature fusion provided by an embodiment of the present application. Fig. 5a shows the main process of calculating the intra-domain interestingness of the participating object for the business object. The server may execute step S51 to extract the basic features in the data dimension, that is, to calculate the basic features of the T unit time periods in the data dimension; the basic features may include interaction information such as praise, comment, exposure, click, browse, play, viewing count, and viewing duration. The viewing count may represent the number of occurrences within the statistical period, the viewing duration may represent a playing duration or a browsing duration, and the viewing count may represent a playing count or a browsing count.

Further, as shown in fig. 5a, the server may execute step S52 to aggregate the basic features in the object dimension, that is, to aggregate the basic features of the data dimension according to the object identifiers of the business objects, converging the basic features of the data dimension into basic features of the object dimension (that is, associating the basic features of the data dimension with different business objects). The server may then execute step S53 to calculate the feature conversion of the object dimension, that is, to perform conversion calculation on the basic features of the object dimension, so as to obtain the final score of each feature (that is, the conversion rate index parameters and the importance index parameter).

Further, as shown in fig. 5a, the server may execute step S54 to calculate the intra-domain interestingness score of the object, that is, to fuse the conversion rate index parameters and the importance index parameter. In other words, the server may perform weighted summation on the M interestingness index parameters (that is, the G conversion rate index parameters and 1 importance index parameter) according to the index weights, and calculate the final intra-domain interestingness score of the business object (that is, the fusion index parameter).
It may be understood that M interestingness indexes (i.e., interestingness features, abbreviated as indexes or features) may be provided in the embodiment of the present application, and the parameter values of the M interestingness indexes are the M interestingness index parameters. The interestingness indexes used in different business scenarios may be the same or different, and the number of interestingness indexes in different business scenarios may also be the same or different.
For ease of understanding, the specific process by which the server generates the fusion index parameters respectively corresponding to the N business scenarios may also refer to fig. 5b, which is a schematic diagram of a feature fusion scenario provided by an embodiment of the present application. As shown in fig. 5b, the N business scenarios may specifically include business scenario D_1, business scenario D_2, ..., and business scenario D_N; the interestingness index parameters in business scenario D_1 may be interestingness index parameters 50a, the interestingness index parameters in business scenario D_2 may be interestingness index parameters 50b, ..., and the interestingness index parameters in business scenario D_N may be interestingness index parameters 50c.

The M interestingness index parameters included in business scenario D_1 (i.e., interestingness index parameters 50a) may be importance index parameter 52a, conversion rate index parameter 52b, ..., and conversion rate index parameter 52c; the M interestingness index parameters included in business scenario D_2 (i.e., interestingness index parameters 50b) may be importance index parameter 53a, conversion rate index parameter 53b, ..., and conversion rate index parameter 53c; ...; and the M interestingness index parameters included in business scenario D_N (i.e., interestingness index parameters 50c) may be importance index parameter 54a, conversion rate index parameter 54b, ..., and conversion rate index parameter 54c.

It may be appreciated that the M interestingness indexes may specifically include interestingness index F_1, interestingness index F_2, ..., and interestingness index F_M, where interestingness index F_1 may be the importance index, and interestingness index F_2, ..., interestingness index F_M may be conversion rate indexes. For example, interestingness index parameter 52a (i.e., importance index parameter 52a), interestingness index parameter 53a (i.e., importance index parameter 53a), ..., and interestingness index parameter 54a (i.e., importance index parameter 54a) may all be parameter values of interestingness index F_1 and may be collectively referred to as interestingness index parameters 51a; interestingness index parameter 52b (i.e., conversion rate index parameter 52b), interestingness index parameter 53b (i.e., conversion rate index parameter 53b), ..., and interestingness index parameter 54b (i.e., conversion rate index parameter 54b) may all be parameter values of interestingness index F_2 and may be collectively referred to as interestingness index parameters 51b; ...; and interestingness index parameter 52c (i.e., conversion rate index parameter 52c), interestingness index parameter 53c (i.e., conversion rate index parameter 53c), ..., and interestingness index parameter 54c (i.e., conversion rate index parameter 54c) may all be parameter values of interestingness index F_M and may be collectively referred to as interestingness index parameters 51c.

As shown in fig. 5b, the server may generate the fusion index parameter corresponding to business scenario D_1 (i.e., the interestingness of business scenario D_1) according to importance index parameter 52a, conversion rate index parameter 52b, ..., and conversion rate index parameter 52c; the server may generate the fusion index parameter corresponding to business scenario D_2 (i.e., the interestingness of business scenario D_2) according to importance index parameter 53a, conversion rate index parameter 53b, ..., and conversion rate index parameter 53c; ...; and the server may generate the fusion index parameter corresponding to business scenario D_N (i.e., the interestingness of business scenario D_N) according to importance index parameter 54a, conversion rate index parameter 54b, ..., and conversion rate index parameter 54c.
The process by which the server performs weighted summation on the fusion index parameters respectively corresponding to the N business scenarios is shown in formula (6):

S_j = Σ_{i=1}^{N} w_i · score_j^i    (6)

wherein score_j^i represents the interestingness score of the participating object of the i-th information domain for the j-th business object (i.e., the fusion index parameter), w_i represents the weight of the i-th information domain (i.e., the scene weight), and S_j represents the total interestingness score of the participating object for the j-th business object after the N information domains are fused (i.e., the interestingness parameter). set(D_i) = {D_1, ..., D_N} may represent the set formed by D_i. The sum of the scene weights corresponding to the N business scenarios equals 1, that is, Σ_{i=1}^{N} w_i = 1, where i may take values from 1 to N; alternatively, i may take values from 0 to (N-1).
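Similarly, the cross-domain weighted summation of formula (6) can be sketched as follows; under the stated constraint that the scene weights sum to 1, the resulting interestingness parameter stays in [0, 1] whenever every fusion index parameter lies in [0, 1]. The names are illustrative.

```python
def fuse_across_scenarios(fusion_index_parameters: list[float],
                          scene_weights: list[float]) -> float:
    """Weighted summation of the fusion index parameters of the N business scenarios."""
    if abs(sum(scene_weights) - 1.0) > 1e-9:
        raise ValueError("the scene weights must sum to 1")
    return sum(weight * score for weight, score in zip(scene_weights, fusion_index_parameters))

# Example with N = 3 information domains (e.g., literature, news, browser).
interestingness_parameter = fuse_across_scenarios([0.62, 0.40, 0.75], [0.5, 0.2, 0.3])
```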
For ease of understanding, the specific process by which the server generates the interestingness parameter of the participating object for the business object in the target time period may refer to fig. 6, which is a schematic flow diagram of scenario fusion provided by an embodiment of the present application. Fig. 6 shows the main flow of cross-domain interestingness calculation for a participating object with respect to a business object: the interestingness already calculated within each domain (i.e., within each business scenario, namely the fusion index parameters) is fused at the granularity of the participating object and the business object, and the final interestingness result fusing multi-domain information (i.e., the interestingness parameter) is obtained through a cross-domain fusion algorithm; the value range of the interestingness parameter is [0, 1]. The cross-domain fusion algorithm may perform weighted summation on the fusion index parameters, or may use other operations, which is not limited in this application.

As shown in fig. 6, the server may use the cross-domain fusion algorithm to fuse the interestingness of business scenario D_1, the interestingness of business scenario D_2, ..., and the interestingness of business scenario D_N, to obtain the interestingness parameter of the participating object for the business object in the target time period (namely, the cross-domain interestingness result of the participating object for the business object). For example, business scenario D_1 may be the literature domain, business scenario D_2 may be the news domain, ..., and business scenario D_N may be the browser domain; the interestingness parameter may then represent a parameter value that simultaneously carries the interestingness of the literature domain, the interestingness of the news domain, ..., and the interestingness of the browser domain.
For ease of understanding, please refer to fig. 7a, 7b, 7c and 7d, wherein fig. 7a, 7b, 7c and 7d are schematic diagrams illustrating a scenario of displaying a business object according to an embodiment of the present application. The terminal interface 70a as shown in fig. 7a may be an interface in a terminal device, in which terminal interface 70a may display a plurality of application clients, which may include a client 70c, and the participation object 70d may perform a trigger operation with respect to the client 70c, so that the terminal device may switch the terminal interface 70a to the terminal interface 70b in response to the trigger operation performed by the participation object 70d with respect to the client 70 c.
As shown in fig. 7a, the terminal device may perform full-screen resource bit placement on the service object in the terminal interface 70b, that is, full-screen display of resources associated with the service object (i.e., recommended media data) in the terminal interface 70b, where the service object may be ABCD.
The terminal interface 71a shown in fig. 7b may be the terminal interface 70a in the implementation corresponding to fig. 7a, where the terminal interface 71a may display the client 71c, and the participation object 71d may perform a triggering operation on the client 71c, so that the terminal device may switch the terminal interface 71a to the terminal interface 71b in response to the triggering operation performed by the participation object 71d on the client 71 c.
As shown in fig. 7b, the terminal device may perform bubble resource allocation on the service object in the terminal interface 71b, that is, display resources associated with the service object (i.e., recommended media data) in the form of bubbles in the terminal interface 71b, where the service object may be ABCD.
The terminal interface 72a shown in fig. 7c may be the terminal interface 70a in the implementation corresponding to fig. 7a, where the terminal interface 72a may display the client 72c, and the participation object 72d may perform a triggering operation on the client 72c, so that the terminal device may switch the terminal interface 72a to the terminal interface 72b in response to the triggering operation performed by the participation object 72d on the client 72 c.
As shown in fig. 7c, the terminal device may perform pendant resource placement on the service object in the terminal interface 72b, that is, display, in a pendant form, resources associated with the service object (i.e., recommended media data) in the terminal interface 72b, where the service object may be ABCD. It should be understood that the bubble shown in fig. 7b and the pendant shown in fig. 7c may be displays of the same resource at different locations.
The terminal interface 73a shown in fig. 7d may be the terminal interface 70b in the embodiment corresponding to fig. 7a, the terminal interface 71b in the embodiment corresponding to fig. 7b, or the terminal interface 72b in the embodiment corresponding to fig. 7c, where a plurality of multimedia data may be displayed in the terminal interface 73a, the plurality of multimedia data may include the multimedia data 73c, and the participation object 73d may perform a triggering operation with respect to the multimedia data 73c, so that the terminal device may switch the terminal interface 73a to the terminal interface 73b in response to the triggering operation performed by the participation object 73d with respect to the multimedia data 73 c. The terminal interface 73b may be a play detail page of the video application.
As shown in fig. 7d, the terminal device may perform a half-screen resource bit placement on the service object in the terminal interface 73b, that is, half-screen display, in the terminal interface 73b, of a resource associated with the service object (that is, recommended media data), where the service object may be ABCD. Wherein, the full screen shown in fig. 7a and the half screen shown in fig. 7d can be different display modes of the terminal device, the size occupied by the recommended media data in the half screen is smaller than the size of the whole terminal screen, and the size occupied by the recommended media data in the full screen is equal to the size of the whole terminal screen.
It can be understood that in the embodiments corresponding to fig. 7a, fig. 7b, fig. 7c, and fig. 7d, the server may determine the interest level parameter of the participation object corresponding to the terminal device for the service object, and further divide the participation object according to the interest level parameter and the participation threshold of the participation object for the service object, to obtain the level information corresponding to the participation object. It can be understood that when the level information of the participation object for the service object is higher, the server can acquire the resource associated with the service object, and further push the resource associated with the service object to the terminal device corresponding to the participation object. It should be appreciated that the participating objects corresponding to different terminal devices have different level information for different business objects, and the server may push resources associated with different business objects to different terminal devices.
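As a hedged illustration of the grading and pushing logic just described (the concrete thresholds, level names, and function names below are assumptions, not values from the patent):

```python
def level_info(interestingness: float, participation_threshold: float = 0.5) -> str:
    """Hypothetical division of a participating object into level information."""
    if interestingness < participation_threshold:
        return "low"
    return "high" if interestingness >= 0.8 else "medium"

def should_push_resource(interestingness: float) -> bool:
    # Resources associated with the business object are pushed only to higher levels.
    return level_info(interestingness) in ("medium", "high")
```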
Therefore, the embodiment of the present application can realize multi-domain information fusion and alleviate the cold-start problem of new business objects within a business domain; it automatically calculates the user interestingness for a large number of business objects, reducing operational labor cost while ensuring the stability of high-quality user circle-selection results; and because the user interestingness of a large number of business objects is calculated automatically, the results are highly reusable.
Therefore, the embodiment of the present application can form a calculation scheme that automatically produces, in batch, a comprehensive interest score continuously measuring the participating object's interest in the business object. The comprehensive interest score of the participating object for the business object refers to the interestingness parameter, which is obtained by fusing the conversion rate index parameters in the N business scenarios and the importance index parameters in the N business scenarios, so that the relevance of the interestingness parameter can be improved. In addition, the conversion rate index parameters reflect the interestingness over the T unit time periods of the target time period, the importance index parameters reflect the interestingness within the target unit time period among the T unit time periods, and fusing the conversion rate index parameters with the importance index parameters generates an interestingness parameter represented as a continuous value, so that the accuracy of generating the interestingness parameter can be improved.
Further, referring to fig. 8, fig. 8 is a flow chart of a data processing method according to an embodiment of the present application. The data processing method may include the following steps S1011-S1013, where steps S1011-S1013 are one embodiment of step S101 in the embodiment corresponding to fig. 3. The N business scenarios include business scenario D_i, where i may be a positive integer less than or equal to N; the T unit time periods include unit time period P_e, where e may be a positive integer less than or equal to T.
Step S1011, respectively acquiring, in the T unit time periods, the conversion rate parameters of the participating object for the business object in business scenario D_i;

It is understood that the conversion rate parameters in the embodiments of the present application may include, but are not limited to, a first conversion rate parameter, a second conversion rate parameter, and a third conversion rate parameter. In other words, the G conversion rate parameters of business scenario D_i may include the first conversion rate parameter, the second conversion rate parameter, and the third conversion rate parameter of the participating object for the business object in business scenario D_i. Optionally, if G is equal to 1, the G conversion rate parameters of business scenario D_i may be any one of the first conversion rate parameter, the second conversion rate parameter, and the third conversion rate parameter of the participating object for the business object in business scenario D_i.

It should be understood that the server may acquire, in business scenario D_i, the target multimedia data associated with the business object, and acquire, within unit time period P_e, the media exposure times (i.e., the exposure amount) of the participating object for the target multimedia data and the total media interaction times of the participating object for the target multimedia data. The total media interaction times represents the click amount plus the number of times the target multimedia data is automatically played; for one exposure of the target multimedia data, the total media interaction times may be one or 0. Further, the server may take the ratio of the total media interaction times to the media exposure times as the conversion rate parameter (i.e., the first conversion rate parameter) of the participating object for the business object in business scenario D_i within unit time period P_e. The first conversion rate parameter characterizes the time-decay-based conversion rate from exposure to consumption. For all actively consumed product forms, whether consumption occurs and how many times it occurs differ from object to object, and interest shows a decaying trend over time; therefore, the embodiment of the present application designs the exposure-to-consumption conversion rate feature based on time decay, so as to continuously learn the object's interest in this dimension.

It should be understood that the server may acquire, in business scenario D_i, the target multimedia data associated with the business object, acquire, within unit time period P_e, the total media interaction times of the participating object for the target multimedia data, and screen the total media interaction times to obtain the effective media interaction times of the participating object for the target multimedia data. Further, the server may take the ratio of the effective media interaction times to the total media interaction times as the conversion rate parameter (i.e., the second conversion rate parameter) of the participating object for the business object in business scenario D_i within unit time period P_e. The second conversion rate parameter characterizes the time-decay-based conversion rate from consumption to effective consumption. A large amount of data analysis shows that calculating user interestingness directly from quantities such as exposures, clicks, plays, reads, and browses leaves a large amount of false-interest noise in the interestingness results, caused by product strategies such as passive exposure, accidental clicks, or automatic playing. Adopting the conversion rate feature from consumption to effective consumption can filter out the false-interest information brought by such noise behaviors well, while also taking the decay of interest over time into account.

It can be understood that if the target multimedia data is target image-text data, the server may screen each interaction among the total media interaction times according to the effective consumption standard corresponding to image-text data; optionally, if the target multimedia data is target video data, the server may screen each interaction among the total media interaction times according to the effective consumption standard corresponding to video multimedia data. The effective consumption standards corresponding to video multimedia data may include the effective consumption standard corresponding to long-video multimedia data and the effective consumption standard corresponding to short-video multimedia data. In addition, when the server screens the total media interaction times, the browsing amount corresponding to the target image-text data and the playing amount corresponding to the target video data are required.
For ease of understanding, please refer to table 1, which is an effective consumption standard list provided by an embodiment of the present application; the effective consumption standard list may store the effective consumption standards corresponding to multimedia data of different resource types. As shown in table 1:

TABLE 1

Resource type | Effective consumption standard
Image-text | The detail page is scrolled, or the browsing completion degree is greater than or equal to 14%
Long video | The playing completion degree is greater than or equal to 14%
Short video | The playing duration is greater than or equal to 5 seconds
The effective consumption standard corresponding to multimedia data of the image-text resource type is that the detail page is scrolled or the browsing completion degree is greater than or equal to 14%; that is, for one consumption of image-text multimedia data, the consumption can be determined to be an effective consumption when the detail page of the multimedia data is scrolled or its browsing completion degree is greater than or equal to 14%. The effective consumption standard corresponding to multimedia data of the long-video resource type is that the playing completion degree is greater than or equal to 14%; that is, for one consumption of long-video multimedia data, the consumption can be determined to be an effective consumption when the playing completion degree of the multimedia data is greater than or equal to 14%. The effective consumption standard corresponding to multimedia data of the short-video resource type is that the playing duration is greater than or equal to 5 seconds; that is, for one consumption of short-video multimedia data, the consumption can be determined to be an effective consumption when the playing duration of the multimedia data is greater than or equal to 5 seconds.

It should be understood that the server may acquire, in business scenario D_i, the target multimedia data associated with the business object, acquire, within unit time period P_e, the effective media interaction times of the participating object for the target multimedia data, and screen the effective media interaction times to obtain the complete media interaction times of the participating object for the target multimedia data. Further, the server may take the ratio of the complete media interaction times to the effective media interaction times as the conversion rate parameter (i.e., the third conversion rate parameter) of the participating object for the business object in business scenario D_i within unit time period P_e. The third conversion rate parameter characterizes the time-decay-based conversion rate from effective consumption to complete consumption. Differences in consumption depth indicate different degrees of interest of participating objects in the content, and a large amount of data analysis shows that the change of a participating object from being interested to being very interested in a business object cannot be captured reliably by the frequency or proportion of complete-consumption behaviors alone, because such a result is very sensitive to the threshold point. To solve these problems, the embodiment of the present application makes full use of the complete-consumption information while taking the decay of interest over time into account.

It can be understood that if the target multimedia data is target image-text data, the server may screen each interaction among the effective media interaction times according to the complete consumption standard corresponding to image-text data; optionally, if the target multimedia data is target video data, the server may screen each interaction among the effective media interaction times according to the complete consumption standard corresponding to video multimedia data. The complete consumption standards corresponding to video multimedia data may include the complete consumption standard corresponding to long-video multimedia data and the complete consumption standard corresponding to short-video multimedia data. In addition, when the server screens the effective media interaction times, the browsing amount corresponding to the target image-text data and the playing amount corresponding to the target video data are required.
For ease of understanding, please refer to table 2, which is a complete consumption standard list provided by an embodiment of the present application; the complete consumption standard list may store the complete consumption standards corresponding to multimedia data of different resource types. As shown in table 2:

TABLE 2

Resource type | Complete consumption standard
Image-text | The browsing completion degree is greater than or equal to 60%
Long video | The playing completion degree is greater than or equal to 60%
Short video | The playing duration is greater than 30 seconds; or the playing duration is less than or equal to 30 seconds and the playing completion degree is greater than or equal to 80%
The complete consumption standard corresponding to multimedia data of the image-text resource type is that the browsing completion degree is greater than or equal to 60%; that is, for one consumption of image-text multimedia data, the consumption can be determined to be a complete consumption when the browsing completion degree of the multimedia data is greater than or equal to 60%. The complete consumption standard corresponding to multimedia data of the long-video resource type is that the playing completion degree is greater than or equal to 60%; that is, for one consumption of long-video multimedia data, the consumption can be determined to be a complete consumption when the playing completion degree of the multimedia data is greater than or equal to 60%. The complete consumption standard corresponding to multimedia data of the short-video resource type is that the playing duration is greater than 30 seconds, or that the playing duration is less than or equal to 30 seconds and the playing completion degree is greater than or equal to 80%; that is, for one consumption of short-video multimedia data, the consumption can be determined to be a complete consumption when the playing duration of the multimedia data is greater than 30 seconds, or when the playing duration is less than or equal to 30 seconds and the playing completion degree is greater than or equal to 80%.
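The two standards can be expressed as simple predicate functions; the sketch below encodes the thresholds of Table 1 and Table 2, while the argument names and the resource-type strings are illustrative assumptions.

```python
def is_effective_consumption(resource_type: str, *, detail_page_scrolled: bool = False,
                             completion: float = 0.0, play_seconds: float = 0.0) -> bool:
    """Effective consumption standards of Table 1 (completion given as a fraction, e.g. 0.14)."""
    if resource_type == "image_text":
        return detail_page_scrolled or completion >= 0.14
    if resource_type == "long_video":
        return completion >= 0.14
    if resource_type == "short_video":
        return play_seconds >= 5
    return False

def is_complete_consumption(resource_type: str, *, completion: float = 0.0,
                            play_seconds: float = 0.0) -> bool:
    """Complete consumption standards of Table 2."""
    if resource_type in ("image_text", "long_video"):
        return completion >= 0.60
    if resource_type == "short_video":
        return play_seconds > 30 or (play_seconds <= 30 and completion >= 0.80)
    return False
```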
Optionally, the G conversion rate parameters of business scenario D_i may also include the praise rate (i.e., the ratio of the praise amount to the click amount), the comment rate (i.e., the ratio of the comment amount to the click amount), and the like, of the participating object for the business object in business scenario D_i, which will not be listed one by one here.
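Putting the three ratios together, a minimal sketch of the conversion rate parameters for one business scenario within one unit time period could look as follows; the handling of zero denominators is an assumption.

```python
def conversion_rate_parameters(media_exposure_times: int, total_media_interaction_times: int,
                               effective_media_interaction_times: int,
                               complete_media_interaction_times: int) -> tuple[float, float, float]:
    """First, second, and third conversion rate parameters as the ratios described above."""
    first = total_media_interaction_times / media_exposure_times if media_exposure_times else 0.0
    second = (effective_media_interaction_times / total_media_interaction_times
              if total_media_interaction_times else 0.0)
    third = (complete_media_interaction_times / effective_media_interaction_times
             if effective_media_interaction_times else 0.0)
    return first, second, third
```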
Step S1012, dividing the conversion rate parameters acquired in the same unit time period into the same conversion rate parameter group, to obtain the conversion rate parameter group in each unit time period;

It will be appreciated that the server may divide the conversion rate parameters of the participating object for the business object in the N business scenarios within unit time period P_e into the same conversion rate parameter group, thereby obtaining the conversion rate parameter group corresponding to unit time period P_e. In other words, the server may divide the first conversion rate parameter, the second conversion rate parameter, and the third conversion rate parameter of the participating object for the business object in business scenario D_i within unit time period P_e into the conversion rate parameter group corresponding to unit time period P_e; ...; and the server may divide the first conversion rate parameter, the second conversion rate parameter, and the third conversion rate parameter of the participating object for the business object in business scenario D_N within unit time period P_e into the conversion rate parameter group corresponding to unit time period P_e.

Step S1013, acquiring the time interval between each unit time period and an auxiliary unit time period, and determining the attenuation parameter corresponding to each unit time period according to the time interval.

The auxiliary unit time period is a unit time period later than the target time period, and the auxiliary unit time period is adjacent to the target time period.
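The concrete attenuation function is not spelled out in this passage; as a hedged illustration consistent with the idea that interest decays over time, the sketch below derives an attenuation parameter from the time interval to the auxiliary unit time period using a simple exponential decay with an assumed decay rate.

```python
def attenuation_parameter(interval_days: int, decay_rate: float = 0.9) -> float:
    """Hypothetical time-decay weight: the larger the interval between a unit time period
    and the auxiliary unit time period, the smaller the attenuation parameter."""
    return decay_rate ** interval_days

# With the auxiliary unit time period being the day right after the target time period,
# the most recent of T = 7 unit time periods keeps the largest weight.
weights = [attenuation_parameter(interval) for interval in range(1, 8)]
```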
Therefore, compared with the prior art, in which user interests are discretized through a classification model (that is, participating objects are divided into discrete user groups according to different degrees of interest) and the relevance among features is ignored, the embodiment of the present application can describe the interest of the participating object in the business object with continuous values built up from the bottom, that is, the interest of the participating object in the business object is continuous. This preserves the relevance among features in the result, thoroughly avoids the threshold-sensitivity problem, and at the same time makes the interest distinction of each user more quantitative and sensitive. In addition, in the prior art the features selected by the classification model differ greatly across business scenarios, and the classification results adapt poorly across businesses; the embodiment of the present application can introduce multiple types of features in different scenarios, improving the adaptability of user interest across different business scenarios.
Therefore, the embodiment of the present application can continuously measure the activity parameters of the participating object for the business object, break the information barriers of each domain by means of the capability of the knowledge graph, and learn and calculate the interaction behaviors of the participating object for the business object in the N business scenarios, so as to obtain the activity parameters of the N business scenarios; the activity parameters of the N business scenarios can be used to generate the activity index parameters of the N business scenarios, and the cross-domain-fused interestingness parameter of the participating object for the business object is finally obtained, so that the accuracy of generating the interestingness parameter can be improved. In addition, the embodiment of the present application can automatically perform the calculation for a large number of business objects, reduce labor cost, ensure that the calculation process of the interestingness parameter has wider applicability and higher performance, and at the same time characterize interestingness more comprehensively.
Further, referring to fig. 9, fig. 9 is a flow chart of a data processing method according to an embodiment of the present application. The method may be performed by a server, or may be performed by a terminal device, or may be performed by a server and a terminal device together, where the server may be the server 20a in the embodiment corresponding to fig. 2, and the terminal device may be the terminal device 20b in the embodiment corresponding to fig. 2. For ease of understanding, embodiments of the present application will be described in terms of this method being performed by a server. The data processing method may include the following steps S201 to S203:
Step S201, determining the resource type of the media data to be linked;

The N business scenarios include business scenario D_i, where N may be a positive integer greater than 1 and i may be a positive integer less than or equal to N. The business object is an entity in the knowledge graph, and the media data to be linked in business scenario D_i is multimedia data having a link relationship with the entity in the knowledge graph; in other words, the media data to be linked may be multimedia data that is to be linked to an entity in the knowledge graph.

Step S202, if the resource type is the graphic type, taking the media data to be linked as the target multimedia data associated with the business object in business scenario D_i based on the auxiliary media data of the media data to be linked;
at this time, if the resource type of the media data to be linked is the graphic type, the media data to be linked is the graphic type data, and the auxiliary media data of the media data to be linked may include, but is not limited to, a title, a brief introduction, a full text, a picture, and the like of the graphic type data. The server may link the teletext class data to the business object through an entity linking algorithm.
The server may determine the auxiliary object corresponding to the media data to be linked according to the auxiliary media data of the media data to be linked, and, based on the similarity between the auxiliary object and the business object, take the media data to be linked as the target multimedia data associated with the business object in business scenario D_i.

Step S203, if the resource type is the video type, determining the video length type of the media data to be linked according to the duration of the media data to be linked, and taking the media data to be linked as the target multimedia data associated with the business object in business scenario D_i based on the video length type.

Specifically, if the video length type is the long-video type, the server may take the media data to be linked as the target multimedia data associated with the business object in business scenario D_i based on the media data identifier of the media data to be linked and the object identifier of the business object. Optionally, if the video length type is the short-video type, the server may take the media data to be linked as the target multimedia data associated with the business object in business scenario D_i based on the auxiliary media data of the media data to be linked.
The knowledge graph stores the object identifier of the business object. If the media data identifier is the same as the object identifier, the server may directly link the media data to be linked to the business object; optionally, if the media data identifier differs from the object identifier, the server does not need to link the media data to be linked to the business object, in which case the media data to be linked cannot be linked to the entity in the knowledge graph, or may be linked to an entity in the knowledge graph other than the business object.
Optionally, the server may further store an identifier mapping table, where a mapping relationship between the long video identifier and the entity identifier may be stored in the identifier mapping table, and at this time, the server may use the media data identifier as the long video identifier, use the object identifier as the entity identifier, and find whether there is a mapping relationship between the media data identifier and the object identifier in the identifier mapping table. If the mapping relation exists between the media data identifier and the object identifier (i.e. the mapping relation exists between the media data identifier and the object identifier in the identifier mapping table), the server can directly link the media data to be linked to the service object; optionally, if there is no mapping relationship between the media data identifier and the object identifier (i.e. there is no mapping relationship between the media data identifier and the object identifier in the identifier mapping table), the server does not need to link the media data to be linked to the service object, where the media data to be linked cannot be linked to the entity in the knowledge graph.
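A minimal sketch of the identifier-based linking described above might look as follows; the mapping-table contents and all names are made up for illustration.

```python
# Hypothetical identifier mapping table: long-video identifier -> entity identifier.
ID_MAPPING_TABLE = {"doc_123": "video_001", "doc_456": "video_002"}

def can_link_by_identifier(media_data_id: str, object_id: str) -> bool:
    """Link the media data to the business object when the identifiers are equal or when
    the identifier mapping table maps the media data identifier to the object identifier."""
    if media_data_id == object_id:
        return True
    return ID_MAPPING_TABLE.get(media_data_id) == object_id
```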
Wherein the auxiliary media data comprises auxiliary text data. At this time, if the resource type of the media data to be linked is a video type, the media data to be linked is video type data, and the auxiliary media data of the media data to be linked may include, but is not limited to, a title, a brief introduction, a caption, a line, and the like of the video type data. The server may link the video class data to the business object through an entity linking algorithm. Wherein the auxiliary text data may be a title of the video class data.
It should be appreciated that the specific process by which the server, based on the auxiliary media data of the media data to be linked, takes the media data to be linked as the target multimedia data associated with the business object in business scenario D_i may be described as follows. The server may map the media data to be linked according to the auxiliary text data of the media data to be linked. If the auxiliary text data contains a title that points to auxiliary long-video data, the server can successfully map the media data to be linked to the auxiliary long-video data; optionally, if the auxiliary text data does not contain a title pointing to auxiliary long-video data, the mapping of the media data to be linked fails. Further, if the media data to be linked is successfully mapped to the auxiliary long-video data, the server may take the media data to be linked as the target multimedia data associated with the business object in business scenario D_i based on the video data identifier of the auxiliary long-video data and the object identifier of the business object. Optionally, if the mapping of the media data to be linked fails, the server may determine the auxiliary object corresponding to the media data to be linked according to the auxiliary media data of the media data to be linked, and, based on the similarity between the auxiliary object and the business object, take the media data to be linked as the target multimedia data associated with the business object in business scenario D_i.

For the specific process by which the server takes the media data to be linked as the target multimedia data based on the video data identifier of the auxiliary long-video data and the object identifier of the business object, reference may be made to the description above of taking the media data to be linked as the target multimedia data based on the media data identifier of the media data to be linked and the object identifier of the business object, which will not be repeated here.
It will be appreciated that the server may extract the set of entity references from the auxiliary media data of the media data to be linked. Wherein the entity-mention set may comprise a plurality of entity-mention (i.e. auxiliary objects), the entity-mention may represent all entities mentioned in the auxiliary media data, and the entity-mention set may comprise an entity-mention X. Further, the server may search the knowledge graph for each entity reference to a respective set of candidate entities. The method of generating the candidate entity set corresponding to each entity mention by the server can be a dictionary matching method, a surface form expansion method, a statistical model method and the like. Further, the server may determine a similarity between each entity mention (for example, entity mention X) and a plurality of candidate entities in the candidate entity set corresponding to entity mention X, and further rank the similarity, where the candidate entity with the highest similarity is used as the entity linking result of entity mention X, i.e., the candidate entity with the highest similarity is used as the candidate entity matched with entity mention X. Wherein, the candidate entity ranking method can comprise a method based on supervised learning and a method based on unsupervised learning; not every entity mention can find a corresponding entity in the knowledge graph, for which entity mention an entity linking system typically links it to a particular "empty entity". Further, the server may determine a target entity (i.e. a business object) from the candidate entities that match each entity reference, i.e. the server may link the media data to be linked to the business object.
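The entity linking step can be sketched as below; candidate generation (dictionary matching, surface form expansion, statistical models) and the similarity measure are abstracted behind parameters, and the names are illustrative.

```python
from typing import Callable, Optional

def entity_link(mentions: list[str],
                candidate_sets: dict[str, list[str]],
                similarity: Callable[[str, str], float]) -> dict[str, Optional[str]]:
    """For each entity mention, rank its candidate entities by similarity and keep the
    highest-scoring one; mentions without candidates are linked to the 'empty entity' (None)."""
    links: dict[str, Optional[str]] = {}
    for mention in mentions:
        candidates = candidate_sets.get(mention, [])
        if not candidates:
            links[mention] = None  # the special "empty entity"
        else:
            links[mention] = max(candidates, key=lambda entity: similarity(mention, entity))
    return links
```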
For ease of understanding, please refer to fig. 10, fig. 10 is a schematic view of a knowledge graph provided in an embodiment of the present application. As shown in fig. 10, the knowledge graph sub-graph may be a knowledge graph sub-graph of the knowledge graph provided in the embodiment of the present application, where the knowledge graph sub-graph is a sub-graph taken from the knowledge graph, and the knowledge graph sub-graph may include various information: entity IP, persona entity, resource entity, and diverse relationships. Here, the description is given taking the entity IP as the video IP as an example, and the character entity may also be the entity IP (i.e. the artist IP) in the embodiment of the present application, and the description is given taking the character entity not as the entity IP (i.e. the non-artist IP) as an example.
As shown in fig. 10, the entity IPs may specifically include [TV series] and [variety show], etc.; the persona entities may specifically include [name X_1] and [name X_2], etc., and the type of a persona entity may be actor, screenwriter, director, etc.; the resource entities may specifically include [video_1] (i.e., variety version V_1), [video_2] (i.e., variety version V_2), [video_3] (i.e., TV-series version V_3), [video_4] (i.e., TV-series version V_4), and other playing resources; and the diverse relationships may include video.writer (i.e., screenwriter), video.star (i.e., actor), video.data (i.e., data), etc.

The variety show may have two playing resources, namely variety version V_1 and variety version V_2, and the actor of the variety show may be [name X_1]; for example, variety version V_1 may be a common version and variety version V_2 may be a preferred version. The TV series may have two playing resources, namely TV-series version V_3 and TV-series version V_4; the actor of the TV series may be [name X_1] and the screenwriter of the TV series may be [name X_2]; for example, TV-series version V_3 may be the TV (Television) version and TV-series version V_4 may be the DVD (Digital Video Disc) version.
It can be understood that, in recent years, the knowledge graph has been widely used in application fields such as underlying asset construction, upper-layer recommendation, search, and question answering, owing to its advantages of rich entities, diverse link relationships, and good interpretability. The embodiment of the present application introduces the knowledge graph into the calculation scheme of the interestingness of participating objects for business objects: (1) by means of the association relationship between an entity IP and its resource entities, users who consume the same entity IP are converged to the entity IP granularity, so that the calculation of user interest is more complete; (2) since the knowledge graph can be constructed automatically, the entity IPs are also automatically updated in full, ensuring that the interestingness of users for the full set of IPs is constructed automatically; (3) as a knowledge base containing IPs of multiple domains, the knowledge graph has the advantage of breaking the data barriers between business domains at the content level (such data barriers would otherwise cause the data island phenomenon).
For ease of understanding, please refer to fig. 11, which is a schematic flow diagram of linking target media data provided by an embodiment of the present application. Fig. 11 shows the main flow of linking business objects based on the knowledge graph. The server may determine whether any content resource (i.e., media data to be linked) within an information domain (e.g., business scenario D_i) is of the video type (i.e., video class) or the teletext type (i.e., teletext class). It will be appreciated that, as shown in fig. 11, for non-video resources (i.e., not video-class data), namely teletext resources (teletext-class data), the server may use the entity linking algorithm to perform IP linking according to the auxiliary media data, so as to find the object identifier of the correct business object.
Optionally, as shown in fig. 11, for a video-type resource the server continues to determine whether it is a short video (i.e., of the short-video type). For a non-short video (i.e., of the long-video type), the resource doc_id (i.e., the long-video identifier) and the video_id (i.e., the object identifier) in the knowledge graph have a matching relationship, so the object identifier of the correct business object can be found directly. Optionally, for a short video, the short-to-long identifier mapping (i.e., converting the short-video identifier to the long-video identifier) is performed first; if the identifier of the correct business object can then be matched successfully in the knowledge graph, it is used, otherwise the entity linking algorithm is adopted to perform IP linking so as to find the object identifier of the correct business object. In addition, for all resources for which the correct business object is still not found through entity linking, no information is introduced.
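The routing of fig. 11 can be summarized by the following sketch; the identifier matching, short-to-long identifier mapping, and entity linking steps are passed in as callables, and the field names are assumptions.

```python
from typing import Callable, Optional

def link_resource(resource: dict,
                  match_object_id: Callable[[str], Optional[str]],
                  short_to_long_id: Callable[[str], Optional[str]],
                  entity_link_by_auxiliary_media: Callable[[dict], Optional[str]]) -> Optional[str]:
    """Route a content resource to the object identifier of the correct business object."""
    if resource["type"] == "image_text":
        return entity_link_by_auxiliary_media(resource)        # teletext: entity linking
    if resource["type"] == "long_video":
        return match_object_id(resource["doc_id"])             # long video: identifier match
    # Short video: try the short-to-long identifier mapping first, then fall back to entity linking.
    long_video_id = short_to_long_id(resource["doc_id"])
    matched = match_object_id(long_video_id) if long_video_id else None
    return matched or entity_link_by_auxiliary_media(resource)
```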
Therefore, compared with the prior art, in which the data island problem of each business domain makes it impossible to fuse and complement the consumption behaviors of different participating objects for the same business object across different domains, the embodiment of the present application provides a cross-domain interestingness calculation scheme based on the knowledge graph. By means of the capability of the knowledge graph, the consumption behaviors of business objects and participating objects in different business domains are aligned, fused, and quantified, which solves the data island problem among business domains and forms a calculation scheme that automatically produces, in batch, a comprehensive interest score (i.e., the interestingness parameter) continuously measuring the participating object's interest in the business object. The main flow of the cross-domain interestingness calculation scheme includes three parts: business object linking based on the knowledge graph, intra-domain interestingness calculation, and cross-domain interestingness fusion. The interestingness parameter generated through these three parts can improve both the accuracy of generating the interestingness parameter and the relevance of the interestingness parameter.
Further, referring to fig. 12, fig. 12 is a schematic structural diagram of a data processing apparatus provided by an embodiment of the present application. The data processing apparatus 1 may include: a parameter acquisition module 11, a first parameter determination module 12, a duration acquisition module 13, a second parameter determination module 14, and a parameter fusion module 15; further, the data processing apparatus 1 may also include: a first type determination module 16 and a second type determination module 17.
A parameter acquisition module 11, configured to acquire, in T unit time periods, the conversion rate parameter group of the participating object in each unit time period and the attenuation parameters respectively corresponding to each unit time period, where T is a positive integer greater than 1; the conversion rate parameter group includes the conversion rate parameters of the participating object for the business object in N business scenarios respectively, where N is a positive integer greater than 1; the T unit time periods belong to a target time period; and the target time period includes a target unit time period;

The N business scenarios include business scenario D_i, where i is a positive integer less than or equal to N;
the parameter acquisition module 11 includes: a parameter acquisition unit 111, a parameter division unit 112, an interval acquisition unit 113;
a parameter acquisition unit 111, configured to respectively acquire, in the T unit time periods, the conversion rate parameters of the participating object for the business object in business scenario D_i;

The T unit time periods include unit time period P_e, where e is a positive integer less than or equal to T;

the parameter acquisition unit 111 is specifically configured to acquire, in business scenario D_i, the target multimedia data associated with the business object, and to acquire, within unit time period P_e, the media exposure times of the participating object for the target multimedia data and the total media interaction times of the participating object for the target multimedia data;

the parameter acquisition unit 111 is specifically configured to take the ratio of the total media interaction times to the media exposure times as the conversion rate parameter of the participating object for the business object in business scenario D_i within unit time period P_e.

The T unit time periods include unit time period P_e, where e is a positive integer less than or equal to T;

the parameter acquisition unit 111 is specifically configured to acquire, in business scenario D_i, the target multimedia data associated with the business object, to acquire, within unit time period P_e, the total media interaction times of the participating object for the target multimedia data, and to screen the total media interaction times to obtain the effective media interaction times of the participating object for the target multimedia data;

the parameter acquisition unit 111 is specifically configured to take the ratio of the effective media interaction times to the total media interaction times as the conversion rate parameter of the participating object for the business object in business scenario D_i within unit time period P_e.

The T unit time periods include unit time period P_e, where e is a positive integer less than or equal to T;

the parameter acquisition unit 111 is specifically configured to acquire, in business scenario D_i, the target multimedia data associated with the business object, to acquire, within unit time period P_e, the effective media interaction times of the participating object for the target multimedia data, and to screen the effective media interaction times to obtain the complete media interaction times of the participating object for the target multimedia data;

the parameter acquisition unit 111 is specifically configured to take the ratio of the complete media interaction times to the effective media interaction times as the conversion rate parameter of the participating object for the business object in business scenario D_i within unit time period P_e.
A parameter dividing unit 112, configured to divide the conversion parameters acquired in the same unit time period into the same conversion parameter set, to obtain a conversion parameter set in each unit time period;
an interval acquisition unit 113, configured to acquire a time interval between each unit time interval and an auxiliary unit time interval, and determine attenuation parameters corresponding to each unit time interval according to the time interval; the auxiliary unit time period is a unit time period later than the target time period, and the auxiliary unit time period is adjacent to the target time period.
For the specific implementations of the parameter acquisition unit 111, the parameter division unit 112, and the interval acquisition unit 113, reference may be made to the description of step S101 in the embodiment corresponding to fig. 3 and of steps S1011-S1013 in the embodiment corresponding to fig. 8, which will not be repeated here.
The first parameter determining module 12 is configured to determine, according to the conversion rate parameter set in each unit time period and the attenuation parameter corresponding to each unit time period, the conversion rate index parameters of the participation object for the business object in each of the N business scenes within the target time period;
wherein the N business scenes include a business scene D_i, i being a positive integer less than or equal to N; the T unit time periods include a unit time period P_e, e being a positive integer less than or equal to T; the conversion rate parameter set within the unit time period P_e includes the conversion rate parameter in business scene D_i within the unit time period P_e;
the first parameter determining module 12 is specifically configured to determine, according to the conversion rate parameter in business scene D_i within the unit time period P_e and the attenuation parameter corresponding to the unit time period P_e, the attenuation conversion rate parameter in business scene D_i within the unit time period P_e;
the first parameter determining module 12 is specifically configured to fuse the attenuation conversion rate parameters in business scene D_i over the T unit time periods, to obtain the conversion rate index parameter of the participation object for the business object in business scene D_i within the target time period.
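For orientation only, a minimal sketch of this attenuate-then-fuse step is given below. It assumes the attenuation conversion rate parameter is the product of conversion rate parameter and attenuation parameter, and that the fusion is a sum over the T unit time periods; both operators are assumptions, since the exact formulas are left to the method embodiments.

```python
# Hedged sketch: conversion rate index parameter for one business scene D_i.
# rates[e] is the conversion rate parameter in D_i within unit time period P_e,
# attenuations[e] is the attenuation parameter of P_e; the multiplicative
# attenuation and additive fusion below are illustrative assumptions only.
def conversion_rate_index(rates: list[float], attenuations: list[float]) -> float:
    assert len(rates) == len(attenuations)
    decayed = [r * a for r, a in zip(rates, attenuations)]  # attenuation conversion rate parameters
    return sum(decayed)                                      # fused over the T unit time periods
```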
A duration acquisition module 13, configured to acquire, within the target unit time period, the interaction durations of the participation object for the business object in each of the N business scenes;
wherein the N business scenes include a business scene D_i, i being a positive integer less than or equal to N;
the duration acquisition module 13 is specifically configured to acquire, in business scene D_i, target multimedia data associated with the business object, and to acquire, within the target unit time period, the auxiliary duration of the participation object for the target multimedia data; the time unit of the auxiliary duration is a first time unit;
the duration acquisition module 13 is specifically configured to perform time unit conversion on the auxiliary duration having the first time unit, to obtain the interaction duration of the participation object for the business object in business scene D_i within the target unit time period; the time unit of the interaction duration is a second time unit; the first time unit and the second time unit are different.
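As a hedged illustration of the time unit conversion, the concrete units are not fixed by this description; seconds as the first time unit and hours as the second time unit are assumed examples only.

```python
# Hedged sketch of the time unit conversion step (assumed: seconds -> hours).
def to_interaction_duration(auxiliary_duration_seconds: float) -> float:
    SECONDS_PER_HOUR = 3600.0
    return auxiliary_duration_seconds / SECONDS_PER_HOUR
```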
The second parameter determining module 14 is configured to determine, according to the interaction durations in the N business scenes within the target unit time period, the importance index parameters of the participation object for the business object in each of the N business scenes within the target unit time period;
wherein the second parameter determining module 14 includes: a deviation determining unit 141 and a parameter determining unit 142;
the deviation determining unit 141 is configured to determine, according to the auxiliary duration, the deviation parameter of the participation object for the business object in business scene D_i within the target unit time period;
the deviation determining unit 141 is specifically configured to compare the auxiliary duration with a plurality of duration ranges in a parameter mapping table, and to determine, among the plurality of duration ranges, the duration range that includes the auxiliary duration as the target duration range;
the deviation determining unit 141 is specifically configured to determine the duration parameter corresponding to the target duration range in the parameter mapping table as the deviation parameter of the participation object for the business object in business scene D_i within the target unit time period.
A parameter determining unit 142, configured to determine, according to the interaction duration and the deviation parameter in business scene D_i within the target unit time period, the importance index parameter of the participation object for the business object in business scene D_i within the target unit time period.
For specific implementation manners of the deviation determining unit 141 and the parameter determining unit 142, reference may be made to the description of step S104 in the embodiment corresponding to fig. 3, and the description will not be repeated here.
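To make the table lookup concrete: the duration ranges, the duration parameters and the way interaction duration and deviation parameter are combined below are all hypothetical, since this description only fixes the lookup-then-combine structure.

```python
# Hedged sketch of the deviation determining unit 141 and the parameter
# determining unit 142. The parameter mapping table entries and the
# multiplicative combination are illustrative assumptions only.
PARAMETER_MAPPING_TABLE = [
    ((0.0, 600.0), 0.2),              # (auxiliary duration range in seconds, duration parameter)
    ((600.0, 3600.0), 0.6),
    ((3600.0, float("inf")), 1.0),
]

def deviation_parameter(auxiliary_duration: float) -> float:
    # Find the target duration range containing the auxiliary duration and
    # return the duration parameter the parameter mapping table assigns to it.
    for (low, high), duration_param in PARAMETER_MAPPING_TABLE:
        if low <= auxiliary_duration < high:
            return duration_param
    return 0.0

def importance_index(interaction_duration: float, deviation: float) -> float:
    # Assumed combination: weight the interaction duration by the deviation parameter.
    return interaction_duration * deviation
```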
The parameter fusion module 15 is configured to fuse the conversion rate index parameter and the importance index parameter, and generate an interest degree parameter of the participating object in the target time period for the service object.
Wherein the N business scenes include a business scene D_i, i being a positive integer less than or equal to N; the number of conversion rate index parameters for business scene D_i is G, G being a positive integer greater than 1;
the parameter fusion module 15 is specifically configured to fuse the G conversion rate index parameters for business scene D_i and the importance index parameter in business scene D_i, to obtain the fusion index parameter corresponding to business scene D_i;
the parameter fusion module 15 is specifically configured to fuse the fusion index parameters respectively corresponding to the N business scenes, to obtain the interest degree parameter of the participation object for the business object within the target time period.
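A minimal sketch of this two-stage fusion follows; both fusion operators (a per-scene sum of the G conversion rate index parameters plus the importance index parameter, and a sum over the N business scenes) are illustrative assumptions rather than the prescribed formula.

```python
# Hedged sketch of the parameter fusion module 15.
def interest_degree(per_scene_conversion_indices: list[list[float]],
                    per_scene_importance: list[float]) -> float:
    # One fusion index parameter per business scene D_i (assumed: sum + importance).
    fusion_index_params = [
        sum(g_indices) + importance
        for g_indices, importance in zip(per_scene_conversion_indices, per_scene_importance)
    ]
    # Interest degree parameter for the target time period (assumed: sum over scenes).
    return sum(fusion_index_params)
```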
Optionally, the N business scenes include a business scene D_i, i being a positive integer less than or equal to N; the business object is an entity in a knowledge graph, and the media data to be linked in business scene D_i is multimedia data having a link relation with the entity in the knowledge graph;
a first type determining module 16, configured to determine the resource type of the media data to be linked, and, if the resource type is an image-text type, to take the media data to be linked as the target multimedia data associated with the business object in business scene D_i based on the auxiliary media data of the media data to be linked;
a second type determining module 17, configured to, if the resource type is a video type, determine the video length type of the media data to be linked according to the duration of the media data to be linked, and to take the media data to be linked as the target multimedia data associated with the business object in business scene D_i based on the video length type.
Wherein the second type determining module 17 includes: a first type determining unit 171 and a second type determining unit 172;
a first type determining unit 171, configured to, if the video length type is a long video type, take the media data to be linked as the target multimedia data associated with the business object in business scene D_i based on the media data identifier of the media data to be linked and the object identifier of the business object;
a second type determining unit 172, configured to, if the video length type is a short video type, take the media data to be linked as the target multimedia data associated with the business object in business scene D_i based on the auxiliary media data of the media data to be linked.
Wherein the auxiliary media data includes auxiliary text data;
the second type determining unit 172 is specifically configured to map the media data to be linked according to the auxiliary text data of the media data to be linked;
the second type determining unit 172 is specifically configured to, if the media data to be linked is successfully mapped to auxiliary long video data, take the media data to be linked as the target multimedia data associated with the business object in business scene D_i based on the video data identifier of the auxiliary long video data and the object identifier of the business object;
the second type determining unit 172 is specifically configured to, if the mapping of the media data to be linked fails, determine an auxiliary object corresponding to the media data to be linked according to the auxiliary media data of the media data to be linked, and to take the media data to be linked as the target multimedia data associated with the business object in business scene D_i based on the similarity between the auxiliary object and the business object.
For specific implementation manners of the first type determining unit 171 and the second type determining unit 172, reference may be made to the description of step S203 in the embodiment corresponding to fig. 9, and the description thereof will not be repeated here.
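Purely for orientation, the branching just described (resource type, video length type, mapping success) can be sketched as follows; the threshold, field names and return labels are hypothetical placeholders, as this description only fixes the branch structure.

```python
# Hedged sketch of the basis on which an item of media data to be linked becomes
# target multimedia data in business scene D_i. LONG_VIDEO_SECONDS and the field
# names are hypothetical.
LONG_VIDEO_SECONDS = 300  # assumed boundary between short and long video types

def linking_basis(item: dict) -> str:
    if item["resource_type"] == "image_text":
        return "auxiliary_media_data"                    # image-text type
    if item["resource_type"] == "video":
        if item["duration_seconds"] >= LONG_VIDEO_SECONDS:
            return "media_id_and_object_id"              # long video type
        if item.get("mapped_long_video_id") is not None:
            return "mapped_long_video_id_and_object_id"  # short video, mapping succeeded
        return "auxiliary_object_similarity"             # short video, mapping failed
    return "not_linked"
```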
The specific implementation manners of the parameter obtaining module 11, the first parameter determining module 12, the duration obtaining module 13, the second parameter determining module 14 and the parameter fusion module 15 may be referred to the description of the steps S101-S105 in the embodiment corresponding to fig. 3 and the steps S1011-S1013 in the embodiment corresponding to fig. 8, and will not be repeated here. For specific implementation manners of the first type determining module 16 and the second type determining module 17, reference may be made to the descriptions of step S201 to step S203 in the embodiment corresponding to fig. 9, and the detailed descriptions will be omitted here. In addition, the description of the beneficial effects of the same method is omitted.
Further, referring to fig. 13, fig. 13 is a schematic structural diagram of a computer device provided in an embodiment of the present application, where the computer device may be a terminal device or a server. As shown in fig. 13, the computer device 1000 may include: a processor 1001, a network interface 1004 and a memory 1005; in addition, the computer device 1000 may further include: a user interface 1003 and at least one communication bus 1002. The communication bus 1002 is used to enable connected communication between these components. In some embodiments, the user interface 1003 may include a Display and a Keyboard, and optionally may further include a standard wired interface and a wireless interface. Optionally, the network interface 1004 may include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory, or may be a non-volatile memory, such as at least one disk memory. Optionally, the memory 1005 may also be at least one storage device located remotely from the aforementioned processor 1001. As shown in fig. 13, the memory 1005, which is a computer-readable storage medium, may include an operating system, a network communication module, a user interface module and a device control application program.
In the computer device 1000 shown in fig. 13, the network interface 1004 may provide a network communication function; the user interface 1003 is mainly used for providing an input interface for a user; and the processor 1001 may be used to invoke the device control application stored in the memory 1005 to implement:
acquiring a conversion rate parameter group of a participated object in each unit time period and attenuation parameters corresponding to each unit time period respectively in T unit time periods; t is a positive integer greater than 1; the conversion rate parameter group comprises conversion rate parameters of the participation objects in N business scenes aiming at the business objects respectively; n is a positive integer greater than 1; t unit time periods belong to a target time period; the target time period includes a target unit time period;
according to the conversion rate parameter group in each unit time period and the attenuation parameter corresponding to each unit time period, determining the conversion rate index parameters of the participation objects in N business scenes aiming at the business objects in the target time period;
acquiring interaction time lengths of the participation objects in N business scenes aiming at the business objects respectively in a target unit time period;
according to the interaction durations in the N business scenes within the target unit time period, determining importance index parameters of the participation object for the business object in each of the N business scenes within the target unit time period;
And fusing the conversion rate index parameter and the importance index parameter to generate the interest degree parameter of the participated object aiming at the business object in the target time period.
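Taken together, and reusing the hedged sketches above (attenuation_parameter, conversion_rate_index, to_interaction_duration, deviation_parameter, importance_index, interest_degree), one purely illustrative composition of these five steps is given below; all numbers are made-up inputs, and only one conversion rate index per scene is shown for brevity, whereas the optional embodiment uses G > 1 per scene.

```python
# Illustrative composition of the five steps above, reusing the earlier sketches.
T = 4
rates_per_scene = [[0.30, 0.25, 0.40, 0.35], [0.10, 0.12, 0.08, 0.15]]  # D_1, D_2 over P_1..P_T
attens = [attenuation_parameter(k) for k in range(1, T + 1)]

conv_indices = [[conversion_rate_index(rates, attens)] for rates in rates_per_scene]
aux_durations = [5400.0, 900.0]  # auxiliary durations (assumed first time unit: seconds)
importances = [importance_index(to_interaction_duration(s), deviation_parameter(s))
               for s in aux_durations]

interest = interest_degree(conv_indices, importances)  # interest degree parameter
```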
It should be understood that the computer device 1000 described in the embodiments of the present application may perform the description of the data processing method in the embodiments corresponding to fig. 3, 8 or 9, and may also perform the description of the data processing apparatus 1 in the embodiments corresponding to fig. 12, which are not described herein. In addition, the description of the beneficial effects of the same method is omitted.
Furthermore, it should be noted here that: the embodiments of the present application further provide a computer readable storage medium, in which the aforementioned computer program executed by the data processing apparatus 1 is stored, and when the processor executes the computer program, the description of the data processing method in the embodiment corresponding to fig. 3, 8 or 9 can be executed, and therefore, a detailed description will not be given here. In addition, the description of the beneficial effects of the same method is omitted. For technical details not disclosed in the embodiments of the computer-readable storage medium according to the present application, please refer to the description of the method embodiments of the present application.
In addition, it should be noted that: embodiments of the present application also provide a computer program product, which may include a computer program, which may be stored in a computer readable storage medium. The processor of the computer device reads the computer program from the computer readable storage medium, and the processor may execute the computer program, so that the computer device performs the description of the data processing method in the embodiment corresponding to fig. 3, fig. 8, or fig. 9, and thus, a detailed description thereof will not be provided herein. In addition, the description of the beneficial effects of the same method is omitted. For technical details not disclosed in the embodiments of the computer program product according to the present application, reference is made to the description of the embodiments of the method according to the present application.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by way of a computer program stored in a computer-readable storage medium, which when executed may comprise the steps of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), or the like.
The foregoing disclosure is only illustrative of the preferred embodiments of the present application and is not intended to limit the scope of the claims herein, as the equivalent of the claims herein shall be construed to fall within the scope of the claims herein.

Claims (17)

1. A method of data processing, comprising:
acquiring a conversion rate parameter set of a participated object in each unit time period and attenuation parameters corresponding to each unit time period respectively in T unit time periods; the T is a positive integer greater than 1; the conversion rate parameter group comprises conversion rate parameters of the participation objects for the service objects in N service scenes respectively; the N is a positive integer greater than 1; t said unit time periods belong to a target time period; the target time period includes a target unit time period;
according to the conversion rate parameter set in each unit time period and the attenuation parameters corresponding to each unit time period, determining conversion rate index parameters of the participation objects in N business scenes aiming at the business objects in the target time period;
acquiring interaction time lengths of the participation objects in N business scenes aiming at the business objects respectively in the target unit time period;
Determining importance index parameters of the participation objects in N business scenes aiming at the business objects in the target unit time period according to the interaction time periods in the N business scenes in the target unit time period;
and fusing the conversion rate index parameter and the importance index parameter to generate an interest degree parameter of the participated object aiming at the business object in the target time period.
2. The method of claim 1, wherein the N business scenes include a business scene D_i, the i being a positive integer less than or equal to the N;
the obtaining, in T unit time periods, a conversion rate parameter set of the participating object in each unit time period and attenuation parameters corresponding to each unit time period respectively includes:
respectively acquiring, in the T unit time periods, the conversion rate parameters of the participation object for the business object in the business scene D_i;
dividing the conversion parameters acquired in the same unit time period into the same conversion parameter group to acquire the conversion parameter group in each unit time period;
acquiring a time period interval between each unit time period and an auxiliary unit time period, and determining attenuation parameters corresponding to each unit time period according to the time period interval; the auxiliary unit time period is a unit time period later than the target time period, and the auxiliary unit time period is adjacent to the target time period.
3. The method of claim 2, wherein the T unit time periods include a unit time period P_e, the e being a positive integer less than or equal to the T;
the respectively acquiring, in the T unit time periods, the conversion rate parameters of the participation object for the business object in the business scene D_i includes:
acquiring, in the business scene D_i, target multimedia data associated with the business object, and acquiring, within the unit time period P_e, the media exposure times of the participation object for the target multimedia data and the media total interaction times of the participation object for the target multimedia data;
taking the ratio of the media total interaction times to the media exposure times as the conversion rate parameter for the business object in the business scene D_i within the unit time period P_e.
4. The method of claim 2, wherein the T unit time periods include a unit time period P_e, the e being a positive integer less than or equal to the T;
the respectively acquiring, in the T unit time periods, the conversion rate parameters of the participation object for the business object in the business scene D_i includes:
acquiring, in the business scene D_i, target multimedia data associated with the business object, acquiring, within the unit time period P_e, the media total interaction times of the participation object for the target multimedia data, and screening the media total interaction times to obtain the media effective interaction times of the participation object for the target multimedia data;
taking the ratio of the media effective interaction times to the media total interaction times as the conversion rate parameter for the business object in the business scene D_i within the unit time period P_e.
5. The method of claim 2, wherein the T unit time periods include a unit time period P_e, the e being a positive integer less than or equal to the T;
the respectively acquiring, in the T unit time periods, the conversion rate parameters of the participation object for the business object in the business scene D_i includes:
acquiring, in the business scene D_i, target multimedia data associated with the business object, acquiring, within the unit time period P_e, the media effective interaction times of the participation object for the target multimedia data, and screening the media effective interaction times to obtain the media complete interaction times of the participation object for the target multimedia data;
taking the ratio of the media complete interaction times to the media effective interaction times as the conversion rate parameter for the business object in the business scene D_i within the unit time period P_e.
6. The method of claim 1, wherein the N business scenes include a business scene D_i, the i being a positive integer less than or equal to the N; the T unit time periods include a unit time period P_e, the e being a positive integer less than or equal to the T; the conversion rate parameter set within the unit time period P_e includes the conversion rate parameter in the business scene D_i within the unit time period P_e;
the determining, according to the conversion rate parameter set in each unit time period and the attenuation parameters corresponding to each unit time period, conversion rate index parameters of the participation object for the business object in each of the N business scenes within the target time period includes:
determining, according to the conversion rate parameter in the business scene D_i within the unit time period P_e and the attenuation parameter corresponding to the unit time period P_e, the attenuation conversion rate parameter in the business scene D_i within the unit time period P_e;
fusing the attenuation conversion rate parameters in the business scene D_i over the T unit time periods, to obtain the conversion rate index parameter of the participation object for the business object in the business scene D_i within the target time period.
7. The method of claim 1, wherein the N business scenes include a business scene D_i, the i being a positive integer less than or equal to the N;
the acquiring, within the target unit time period, the interaction durations of the participation object for the business object in the N business scenes respectively includes:
acquiring, in the business scene D_i, target multimedia data associated with the business object, and acquiring, within the target unit time period, the auxiliary duration of the participation object for the target multimedia data; the time unit of the auxiliary duration being a first time unit;
performing time unit conversion on the auxiliary duration having the first time unit, to obtain the interaction duration of the participation object for the business object in the business scene D_i within the target unit time period; the time unit of the interaction duration being a second time unit; the first time unit and the second time unit being different.
8. The method according to claim 7, wherein the determining, according to the interaction durations in the N business scenes within the target unit time period, importance index parameters of the participation object for the business object in each of the N business scenes within the target unit time period includes:
determining, according to the auxiliary duration, the deviation parameter of the participation object for the business object in the business scene D_i within the target unit time period;
determining, according to the interaction duration and the deviation parameter in the business scene D_i within the target unit time period, the importance index parameter of the participation object for the business object in the business scene D_i within the target unit time period.
9. The method according to claim 8, wherein the determining, according to the auxiliary duration, the deviation parameter of the participation object for the business object in the business scene D_i within the target unit time period includes:
comparing the auxiliary duration with a plurality of duration ranges in a parameter mapping table, and determining, among the plurality of duration ranges, the duration range that includes the auxiliary duration as a target duration range;
determining the duration parameter corresponding to the target duration range in the parameter mapping table as the deviation parameter of the participation object for the business object in the business scene D_i within the target unit time period.
10. The method of claim 1, wherein the N business scenes include a business scene D_i, the i being a positive integer less than or equal to the N; the number of conversion rate index parameters for the business scene D_i is G, the G being a positive integer greater than 1;
the fusing the conversion rate index parameter and the importance index parameter to generate an interest degree parameter of the participation object for the business object within the target time period includes:
fusing the G conversion rate index parameters for the business scene D_i and the importance index parameter in the business scene D_i, to obtain the fusion index parameter corresponding to the business scene D_i;
fusing the fusion index parameters respectively corresponding to the N business scenes, to obtain the interest degree parameter of the participation object for the business object within the target time period.
11. The method of claim 1, wherein the N business scenes include a business scene D_i, the i being a positive integer less than or equal to the N; the business object is an entity in a knowledge graph, and the media data to be linked in the business scene D_i is multimedia data having a link relation with the entity in the knowledge graph;
the method further includes:
determining the resource type of the media data to be linked, and, if the resource type is an image-text type, taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the auxiliary media data of the media data to be linked;
if the resource type is a video type, determining the video length type of the media data to be linked according to the duration of the media data to be linked, and taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the video length type.
12. The method according to claim 11, wherein the taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the video length type includes:
if the video length type is a long video type, taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the media data identifier of the media data to be linked and the object identifier of the business object;
if the video length type is a short video type, taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the auxiliary media data of the media data to be linked.
13. The method of claim 12, wherein the auxiliary media data comprises auxiliary text data;
the taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the auxiliary media data of the media data to be linked includes:
mapping the media data to be linked according to the auxiliary text data of the media data to be linked;
if the media data to be linked is successfully mapped to auxiliary long video data, taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the video data identifier of the auxiliary long video data and the object identifier of the business object;
if the mapping of the media data to be linked fails, determining an auxiliary object corresponding to the media data to be linked according to the auxiliary media data of the media data to be linked, and taking the media data to be linked as the target multimedia data associated with the business object in the business scene D_i based on the similarity between the auxiliary object and the business object.
14. A data processing apparatus, comprising:
the parameter acquisition module is used for acquiring a conversion rate parameter set of the participated object in each unit time period and attenuation parameters corresponding to each unit time period respectively in T unit time periods; the T is a positive integer greater than 1; the conversion rate parameter group comprises conversion rate parameters of the participation objects for the service objects in N service scenes respectively; the N is a positive integer greater than 1; t said unit time periods belong to a target time period; the target time period includes a target unit time period;
the first parameter determining module is used for determining conversion rate index parameters of the participation objects in N business scenes for the business objects in the target time period according to the conversion rate parameter groups in each unit time period and attenuation parameters corresponding to each unit time period respectively;
The duration acquisition module is used for acquiring interaction durations of the participation objects in N business scenes aiming at the business objects respectively in the target unit time period;
the second parameter determining module is configured to determine, according to the interaction durations in the N business scenes within the target unit time period, importance index parameters of the participation object for the business object in each of the N business scenes within the target unit time period;
and the parameter fusion module is used for fusing the conversion rate index parameter and the importance index parameter to generate the interest degree parameter of the participated object aiming at the business object in the target time period.
15. A computer device, comprising: a processor and a memory;
the processor is connected to the memory, wherein the memory is configured to store a computer program, and the processor is configured to invoke the computer program to cause the computer device to perform the method of any of claims 1-13.
16. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program adapted to be loaded and executed by a processor to cause a computer device having the processor to perform the method of any of claims 1-13.
17. A computer program product, characterized in that the computer program product comprises a computer program stored in a computer readable storage medium and adapted to be read and executed by a processor to cause a computer device with the processor to perform the method of any of claims 1-13.
