CN112433655A - Information flow interaction processing method based on cloud computing and cloud computing verification interaction center - Google Patents


Info

Publication number
CN112433655A
Authority
CN
China
Prior art keywords
interactive, information, target, interaction, control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011417255.XA
Other languages
Chinese (zh)
Other versions
CN112433655B (en)
Inventor
崔秀芬 (Cui Xiufen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Maiyi Information Technology Co., Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202011417255.XA
Publication of CN112433655A
Application granted
Publication of CN112433655B
Status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The embodiment of the application provides an information flow interaction processing method based on cloud computing and a cloud computing verification interaction center. After loading drawing control information between a target loading interactive event and a target control drawing object is generated according to drawing parameters of the target loading interactive event and at least one control drawing object, the loading drawing control information between the target loading interactive event and the target control drawing object under each drawing attribute type is recorded in each target simulation drawing control. A large number of reference bases built on loading interactive events can therefore be utilized, so that more target control drawing objects are obtained, which improves the accuracy of information flow matching of subsequent simulation drawing resources. Errors that would otherwise occur when a to-be-simulated drawing loading element is simulated and drawn with the loading interactive event as an independent interactive processing unit can also be avoided, which further improves the accuracy of information flow matching of the simulation drawing resources.

Description

Information flow interaction processing method based on cloud computing and cloud computing verification interaction center
Technical Field
The application relates to the technical field of information flow interaction based on cloud computing, in particular to an information flow interaction processing method based on cloud computing and a cloud computing verification interaction center.
Background
Cloud computing is a form of distributed computing in which a large data processing task is decomposed, over the network "cloud", into a large number of small programs that are processed and analyzed by a system composed of multiple servers, with the results returned to the user. Early cloud computing relied on simple distributed computing that solved task distribution and merged the computing results, which is why cloud computing is also referred to as grid computing. With this technology, tens of thousands of data items can be processed within a short time (a few seconds), providing powerful network services.
With the continuous development of high-speed internet technology and audio-video technology, multi-platform, multi-object internet interaction is becoming increasingly popular, and information stream interaction processing through online cloud computing is increasingly common. An internet interactive video is a new type of video in which interactive experience is integrated into a linearly played video through various technical means; the played multi-object interaction information stream is expanded and displayed on the cloud platform, and personalized interactive function options are configured for the user, so that the personalized viewing requirements of different audiences can be met.
In the related art, an interactive event verification stream can be generated for the multi-object interaction information stream so as to summarize the behavior portrait the user forms during interaction, which facilitates subsequent function improvement. In the conventional scheme, however, either the loading interaction event or the loading interaction event fragment is used as the minimum unit. The inventor of the present application finds that, when information stream interaction processing is performed with the loading interaction event fragment as the independent interaction processing unit, errors during simulation drawing of the loading element to be simulated and drawn can be avoided, and the problem of mistakenly loaded interaction event fragments can be better solved. By contrast, when the loading interaction event is taken as the independent interaction processing unit, errors easily occur when the loading element to be simulated and drawn is simulated and drawn, so that the accuracy of the information flow matching used to push simulation drawing resources is low.
Disclosure of Invention
In order to overcome at least the above-mentioned deficiencies in the prior art, an object of the present application is to provide an information flow interaction processing method based on cloud computing and a cloud computing verification interaction center. After loading rendering control information between a target loading interaction event and a target control rendering object is generated according to rendering parameters of the target loading interaction event and at least one control rendering object under a target rendering attribute category, the loading rendering control information between the target loading interaction event and the target control rendering object under each rendering attribute category is entered into each target simulation rendering control. A large number of reference bases built on loading interaction events can thus be used to obtain more target control rendering objects, which helps improve the accuracy of information flow matching of subsequent simulation rendering resources, and errors that occur during simulation rendering of the loading element to be simulated and rendered when the loading interaction event is used as an independent interaction processing unit can be avoided, so the accuracy of information flow matching of the simulation rendering resources is improved.
In a first aspect, the present application provides an information flow interaction processing method based on cloud computing, which is applied to a cloud computing verification interaction center, where the cloud computing verification interaction center is in communication connection with a plurality of information flow node terminals, and the method includes:
acquiring interactive event updating information of an interactive window track in the multi-object interactive information stream of the information stream node terminal;
acquiring a drawing and loading element to be simulated which is matched with a plurality of interaction event fragments to be loaded and a target simulation drawing control corresponding to the drawing and loading element to be simulated based on the interaction event update information, wherein the target simulation drawing control is a simulation drawing control which is served by an interaction component to which the event loading information of the drawing and loading element to be simulated belongs, and the target simulation drawing control comprises at least one control drawing object;
screening and matching a plurality of interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relation with at least one control drawing object, and generating loading drawing control information between the target loading interaction event and the target control drawing object according to drawing parameters of the target loading interaction event and the at least one control drawing object under a target drawing attribute category;
and inputting loading drawing control information between the target loading interactive event and a target control drawing object under each drawing attribute category in each target simulation drawing control, selecting a target simulation drawing resource matched with the drawing loading element to be simulated from a preset target simulation drawing resource set according to an input result, and pushing an interactive event verification stream of the target simulation drawing resource to the information flow node terminal, so that the interactive event verification stream is used for information mining of a user of the information flow node terminal after the information flow node terminal verifies and confirms the interactive event verification stream.
In a possible implementation manner of the first aspect, the generating, according to the drawing parameters of the target loading interaction event and the at least one control drawing object in the target drawing attribute category, loading drawing control information between the target loading interaction event and the target control drawing object includes:
determining a target drawing attribute type corresponding to each control drawing object according to the drawing incidence relation between the target loading interaction event and the control drawing object;
calling drawing parameters of the target loading interaction event and at least one control drawing object in the determined target drawing attribute category based on the determined target drawing attribute category, and determining the control drawing object with the drawing parameters meeting a preset drawing service range as a target control drawing object;
and generating loading drawing control information between the target loading interaction event and the target control drawing object according to the drawing parameters of the target loading interaction event and the target control drawing object under at least one drawing attribute category.
In a possible implementation manner of the first aspect, the invoking, based on the determined target rendering attribute category, a rendering parameter of the target loading interaction event and at least one control rendering object in the determined target rendering attribute category, and determining, as the target control rendering object, the control rendering object whose rendering parameter meets a preset rendering service range includes:
calling a first drawing parameter of the target loading interaction event and at least one control drawing object under the same drawing attribute category, and determining the control drawing object of which the first drawing parameter meets a preset drawing service range as a first target control drawing object;
calling a second drawing parameter of the target loading interaction event and at least one control drawing object under the hierarchy drawing attribute category, and determining the control drawing object of which the second drawing parameter meets a preset drawing service range as a second target control drawing object;
calling a third drawing parameter of the target loading interaction event and at least one control drawing object in a partition drawing attribute category, and determining the control drawing object of which the third drawing parameter meets a preset drawing service range as a third target control drawing object;
the calling of the first drawing parameter of the target loading interaction event and at least one control drawing object under the same drawing attribute category, and determining the control drawing object of which the first drawing parameter meets a preset drawing service range as a first target control drawing object, includes:
selecting a same-drawing attribute sequence from the target simulation drawing control, wherein the same-drawing attribute sequence comprises a plurality of same-drawing attribute lists, and each same-drawing attribute list comprises at least two control drawing objects with the same drawing attribute description vector;
determining a same drawing attribute list which has the same drawing attribute as the loading level drawing attribute of the target loading interaction event to obtain a target same drawing attribute list;
calling a first drawing parameter between the target loading interaction event and each control drawing object in the target same drawing attribute list, and determining a first target control drawing object by using the control drawing object of which the first drawing parameter meets a preset drawing service range;
the generating of the loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category includes:
generating first loading drawing control information between the target loading interaction event and a first target control drawing object according to a first drawing parameter between the target loading interaction event and the first target control drawing object;
or, the calling the target loading interaction event and the second drawing parameter of the at least one control drawing object in the hierarchy drawing attribute category, and determining the control drawing object of which the second drawing parameter meets a preset drawing service range as the second target control drawing object, includes:
determining a level drawing attribute relation between the target loading interaction event and at least one control drawing object according to the loading level drawing attribute of the target loading interaction event and the loading level drawing attribute of each control drawing object;
calling a second drawing parameter between the target loading interaction event and the corresponding upper control drawing object based on the determined hierarchy drawing attribute relation, and determining a control drawing object of which the second drawing parameter meets a preset drawing service range as a second target control drawing object;
the generating of the loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category includes:
generating second loading drawing control information between the target loading interaction event and a second target control drawing object according to a second drawing parameter between the target loading interaction event and the second target control drawing object;
or, the calling the third drawing parameter of the target loading interaction event and the at least one control drawing object in the partition drawing attribute category, and determining the control drawing object of which the third drawing parameter meets a preset drawing service range as a third target control drawing object, includes:
collecting a pre-established drawing attribute partition of each control drawing object;
calling a mapping rendering value between the target loading interaction event and each control rendering object, and determining the control rendering object with the mapping rendering value larger than a preset value as a key loading interaction event to be selected;
calling a third drawing parameter of the target loading interactive event and the key loading interactive event to be selected, of which the drawing attribute partition covers a preset partition, and determining the key loading interactive event to be selected, of which the third drawing parameter meets a preset drawing service range, as a third target control drawing object;
the generating of the loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category includes:
and generating third loading drawing control information between the target loading interaction event and a third target control drawing object according to a third drawing parameter between the target loading interaction event and the third target control drawing object.
In a possible implementation manner of the first aspect, the entering, in each of the target simulation rendering controls, load rendering control information between the target load interaction event and the target control rendering object under each of the rendering attribute categories includes:
acquiring a preset partition template corresponding to each drawing attribute type;
calculating partition template matching information of loading drawing control information between the obtained partition template and the target loading interaction event and the target control drawing object under the corresponding drawing attribute category to obtain partition loading drawing control information corresponding to each drawing attribute category;
and inputting partition loading drawing control information corresponding to each drawing attribute type into each target simulation drawing control.
In a possible implementation manner of the first aspect, the screening and matching the multiple interactive event slices to be loaded to obtain a target loading interactive event having a drawing association relationship with at least one control drawing object includes:
determining, according to the drawing operation track data corresponding to the interaction event fragments to be loaded, target drawing operation track data whose drawing participation confidence coefficient with at least one control drawing object in the interaction event fragments to be loaded is greater than a set confidence coefficient, and a first drawing operation track node and a second drawing operation track node which take the target drawing operation track data as reference drawing operation track data, wherein the drawing participation element information of the first drawing operation track node does not overlap with the drawing participation element information of the second drawing operation track node, and a loading interaction relationship exists between the two pieces of drawing participation element information;
determining a drawing operation track component meeting a first target requirement in the first drawing operation track node, and determining first interactive scene interface window information corresponding to the first drawing operation track node according to scene area modeling information of a multi-level scene area between a visual scene object of the drawing operation track component meeting the first target requirement and an associated preset scene object; the drawing operation track component meeting the first target requirement is a drawing operation track component of a visual scene object matched with the associated preset scene object;
determining a drawing operation track component meeting a second target requirement in the second drawing operation track node, and determining second interactive scene interface window information corresponding to the second drawing operation track node according to scene area modeling information of a multi-level scene area between a visual scene object of the drawing operation track component meeting the second target requirement and an associated preset scene object; the drawing operation track component meeting the second target requirement is a drawing operation track component of a visual scene object matched with the associated preset scene object;
obtaining window flow information of the drawing operation track component in each first drawing participation element information according to first interactive scene interface window information corresponding to the first drawing operation track node, and obtaining window flow information of the drawing operation track component in each second drawing participation element information according to second interactive scene interface window information in the second drawing operation track node;
according to the window flow information of each first drawing participating element information and each second drawing participating element information, respectively carrying out event tracing on the drawing operation track component in each drawing participating element information to obtain first event tracing information of each first drawing participating element information and second event tracing information of each second drawing participating element information;
obtaining corresponding event tracing information according to the first event tracing information of each first drawing participating element information and the second event tracing information of each second drawing participating element information;
and obtaining a target loading interaction event having a drawing association relation with the at least one control drawing object according to the event tracing information.
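For illustration only, the screening described above can be restated in a grossly simplified Python sketch: fragments are kept only if their drawing participation confidence with a control drawing object exceeds the set confidence, the kept records are split into two non-overlapping node groups, and the traced groups are merged into one target loading interaction event. Every identifier, the even/odd split, and the string-based "tracing" are illustrative assumptions, not the method itself.

from typing import Dict, List

def screen_target_load_event(fragments: List[str],
                             participation_confidence: Dict[str, float],
                             set_confidence: float = 0.7) -> Dict[str, object]:
    """Grossly simplified reading of the screening step: keep fragments whose
    drawing participation confidence exceeds the set confidence, split them
    into two non-overlapping node groups, 'trace' each group, and merge the
    traces into one target loading interaction event."""
    kept = [f for f in fragments if participation_confidence.get(f, 0.0) > set_confidence]
    first_nodes = kept[0::2]          # stand-in for the first drawing operation track node group
    second_nodes = kept[1::2]         # stand-in for the second, non-overlapping group
    trace = {"first": [f"trace({f})" for f in first_nodes],
             "second": [f"trace({f})" for f in second_nodes]}
    return {"target_load_event": kept, "event_tracing": trace}

print(screen_target_load_event(
    fragments=["frag-1", "frag-2", "frag-3"],
    participation_confidence={"frag-1": 0.9, "frag-2": 0.4, "frag-3": 0.8},
))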
In a possible implementation manner of the first aspect, the obtaining of the interaction event update information of the interaction window trajectory in the multi-object interaction information stream of the information stream node terminal includes:
acquiring a multi-object interactive information stream of the information stream node terminal, and performing independent movable window extraction processing on the multi-object interactive information stream to obtain independent movable window information of interactive window tracks in the multi-object interactive information stream, wherein the multi-object interactive information stream is an interactive information stream formed by object interactive information recorded by each interactive window track acquired based on a single interactive request;
carrying out interactive behavior tracking extraction based on the independent movable window information of the interactive window track to obtain target interactive behavior migration characteristics of the interactive window track;
extracting interactive content migration characteristics of the multi-object interactive information stream based on an artificial intelligence model to obtain interactive content migration characteristic information of the interactive window track;
and performing interactive linkage event synthesis on the target interactive behavior migration characteristic of the interactive window track in the multi-object interactive information stream and the interactive content migration characteristic information of the interactive window track to obtain interactive linkage event synthesis information of the interactive window track, and performing interactive event update on the interactive event record control of the multi-object interactive information stream based on the interactive linkage event synthesis information of the interactive window track to obtain interactive event update information of the interactive window track.
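Read end to end, the four operations above form a small pipeline: independent movable window extraction, interactive behavior tracking, content migration feature extraction, and linkage event synthesis with an update of the event record control. The sketch below only mirrors that control flow under assumed data shapes; each stage body is a placeholder rather than the computation described in this implementation.

from typing import Dict, List

def interaction_event_update_info(stream: List[Dict[str, str]]) -> Dict[str, object]:
    """High-level paraphrase of the four stages; every stage is a placeholder."""
    # 1. Independent movable window extraction per interaction window trajectory.
    windows = [{"trajectory": rec.get("trajectory", ""), "window": i}
               for i, rec in enumerate(stream)]
    # 2. Interactive behavior tracking extraction -> behavior migration features.
    behavior_features = [f"behavior({w['trajectory']})" for w in windows]
    # 3. Model-based interactive content migration feature extraction.
    content_features = [f"content({rec.get('content', '')})" for rec in stream]
    # 4. Synthesize interactive linkage events and update the event record control.
    linkage = list(zip(behavior_features, content_features))
    return {"windows": windows, "linkage_events": linkage,
            "update_info": {"linkage_count": len(linkage)}}

print(interaction_event_update_info(
    [{"trajectory": "traj-1", "content": "vote"},
     {"trajectory": "traj-2", "content": "comment"}]))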
In a possible implementation manner of the first aspect, the performing independent movable window extraction processing on a multi-object interactive information stream to obtain independent movable window information of an interactive window trajectory in the multi-object interactive information stream includes:
acquiring an interactive graphic element set in an interactive graphic track recorded by a window service of each object interaction event in the multi-object interactive information stream, wherein the interactive graphic element set in the interactive graphic track comprises interactive graphic elements taking each interactive graphic track as an interaction area, and the interactive graphic elements comprise graphic interaction trigger information and graphic attribute information of the interactive graphic track and interactive graphic records in the interactive graphic track;
for each interactive graphic track, according to each content editing graphic in a plurality of content editing graphics in an updating graphic record of the interactive graphic track of each object interaction event, according to page interaction elements of the content editing interaction pages in the content editing graphics, determining whether each content editing interaction page in the content editing graphics is a reference target content editing interaction page, according to the number of the reference target content editing interaction pages in the content editing graphics, determining each reference interaction indication control corresponding to the content editing graphics, for each reference interaction indication control, dividing the reference interaction indication control into a plurality of sub-interaction indication controls, according to editing objects and preset object ranges of the content editing interaction pages in each sub-interaction indication control, determining whether the reference interaction indication control is a target interaction indication control, wherein each content editing interaction page corresponds to each content editing interaction behavior;
acquiring interaction window partition information of a preset interaction window rule matched with each content editing interaction page in the target interaction indication control, wherein the interaction window partition information comprises interaction window calling information and interaction window component information, and the preset interaction window rule comprises matching modes corresponding to different interaction window services;
determining an interactive window drawing attribute feature of each interactive window drawing attribute map and an interactive window precondition of each interactive window abstract map according to the interactive window partition information of each updated graphic record of each different interactive graphic track in the interactive graphic element set of the interactive graphic track; determining an interactive window label object of each object interaction event in the interactive graphic track according to the interactive window drawing attribute feature of each interactive window drawing attribute map and the interactive window precondition of each interactive window abstract map in the target interaction indication control; taking the features within the interactive window range of the interactive window label object, together with the features outside that interactive window range, as the independent movable window feature of each object interaction event in the interactive graphic track; and fusing the independent movable window features of each object interaction event in all interactive graphic tracks to obtain the independent movable window information of the interaction window track in the multi-object interaction information stream.
In a possible implementation manner of the first aspect, the step of performing interactive behavior tracking extraction based on the independent movable window information of the interactive window trajectory to obtain a target interactive behavior migration feature of the interactive window trajectory includes:
acquiring interactive behavior request information of a user interaction behavior set added to window demonstration information of each independent movable window characteristic in independent movable window information of the interactive window track, and determining a first interactive flow segment list corresponding to the interactive behavior request information, wherein the interactive behavior request information comprises interactive behavior result information of interactive behavior running information determined according to interactive behavior input information and interactive behavior output information of the user interaction behavior set, and the first interactive flow segment list comprises a sequence of a plurality of interactive flow segments of the interactive behavior result information;
determining window presentation information for each of the independent movable window features based on a first interactive behavior vector of the interactive behavior input information and a second interactive behavior vector of the interactive behavior output information;
determining migration analysis parameters for performing migration analysis on the first interactive flow segment list according to the interactive flow segment sequence relation of the first interactive behavior vector and the second interactive behavior vector;
performing migration analysis on the first interactive flow segment list based on the migration analysis parameters to obtain a second interactive flow segment list;
carrying out migration node positioning on the second interactive flow segment list to obtain a plurality of migration node positioning portions, and carrying out feature extraction on each migration node positioning portion to obtain migration node positioning features;
determining the interactive behavior migration characteristics of each independent movable window characteristic according to the interactive behavior migration characteristics corresponding to the plurality of migration node positioning characteristics corresponding to the second interactive flow segment list;
and obtaining the target interactive behavior migration characteristic of the interactive window track based on the interactive behavior migration characteristic of each independent movable window characteristic.
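As a toy illustration of the migration analysis described above, the sketch below reduces an interactive flow segment list to a list of numbers, treats the migration analysis as a first-order difference, the migration node positioning as a threshold on that difference, and aggregates per-window features into a trajectory-level feature. The numeric choices and function names are assumptions made only to keep the example runnable.

from statistics import mean
from typing import List

def behavior_migration_feature(flow_segments: List[float]) -> float:
    """Toy stand-in for one window feature: difference the segment list,
    keep the changes whose magnitude exceeds a threshold (migration node
    positioning), and use their mean as the migration feature."""
    second_list = [b - a for a, b in zip(flow_segments, flow_segments[1:])]
    nodes = [d for d in second_list if abs(d) > 0.1]
    return mean(nodes) if nodes else 0.0

def target_migration_feature(per_window_segments: List[List[float]]) -> float:
    """Aggregate per-window migration features into the trajectory-level
    target interactive behavior migration feature."""
    features = [behavior_migration_feature(seg) for seg in per_window_segments]
    return mean(features) if features else 0.0

print(target_migration_feature([[0.1, 0.5, 0.4], [0.2, 0.2, 0.9]]))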
For example, in a possible implementation manner of the first aspect, the step of performing interactive content migration feature extraction on the multi-object interactive information stream based on the artificial intelligence model to obtain the interactive content migration feature information of the interactive window trajectory includes:
inputting the multi-object interaction information stream into a pre-trained artificial intelligence model, and obtaining the confidence coefficient of the multi-object interaction information stream matched with each interactive content label, wherein the artificial intelligence model is obtained by training based on a training sample and training label information corresponding to the training sample, the training sample is a multi-object interaction information stream sample, and the training label information is interactive content index information label information;
determining target interactive content index information corresponding to the multi-object interactive information stream according to the confidence coefficient of the multi-object interactive information stream matched with each interactive content label;
and extracting interactive content migration characteristic information matched with each interactive window track from the interactive content index information description information of the target interactive content index information corresponding to the multi-object interactive information stream.
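A minimal sketch of this label-confidence flow is given below, with the trained artificial intelligence model replaced by a stand-in callable; the label names, index descriptions and function signature are assumptions for illustration rather than the model actually trained in this implementation.

from typing import Callable, Dict, List

def content_migration_features(stream: List[str],
                               model: Callable[[List[str]], Dict[str, float]],
                               index_descriptions: Dict[str, Dict[str, str]],
                               trajectories: List[str]) -> Dict[str, str]:
    """Take the label confidences, pick the best-matching content index, then
    read the per-trajectory migration feature from that index's description."""
    confidences = model(stream)                              # label -> confidence
    target_index = max(confidences, key=confidences.get)     # target content index info
    description = index_descriptions.get(target_index, {})
    return {t: description.get(t, "") for t in trajectories}

# Toy usage: the 'model' is a fixed lookup, not a trained network.
features = content_migration_features(
    stream=["msg-1", "msg-2"],
    model=lambda s: {"commerce": 0.9, "gaming": 0.2},
    index_descriptions={"commerce": {"traj-1": "product-focus", "traj-2": "price-focus"}},
    trajectories=["traj-1", "traj-2"],
)
print(features)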
In a possible implementation manner of the first aspect, the updating, based on the interactive linkage event synthesis information of the interactive window trajectory, the interactive event of the interactive event record control of the multi-object interactive information stream to obtain the interactive event update information of the interactive window trajectory includes:
acquiring interaction event information of the interaction window track under an interaction event recording control of the multi-object interaction information stream;
acquiring interaction event items under the interaction event information and event relation configuration information corresponding to each interaction event item;
and overlapping and configuring the interactive linkage event synthesis information of the interactive window track under the event relation configuration information corresponding to each interactive event item to obtain the interactive event update information of the interactive window track.
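The overlay configuration step can be pictured as merging the linkage event synthesis information of the trajectory into the event relation configuration of every interaction event item, as in the sketch below (dictionary shapes and names are assumed):

from typing import Dict

def update_interaction_events(event_items: Dict[str, Dict[str, str]],
                              linkage_synthesis: Dict[str, str]) -> Dict[str, Dict[str, str]]:
    """Overlay the linkage event synthesis information onto each item's event
    relation configuration; the merged result is the update information."""
    update_info: Dict[str, Dict[str, str]] = {}
    for item, relation_config in event_items.items():
        merged = dict(relation_config)        # existing event relation configuration
        merged.update(linkage_synthesis)      # overlaid linkage synthesis information
        update_info[item] = merged
    return update_info

print(update_interaction_events(
    event_items={"item-1": {"relation": "follows"}},
    linkage_synthesis={"linkage": "vote+comment"},
))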
In a second aspect, an embodiment of the present application further provides an information flow interaction processing apparatus based on cloud computing, which is applied to a cloud computing verification interaction center, where the cloud computing verification interaction center is in communication connection with a plurality of information flow node terminals, and the apparatus includes:
the first acquisition module is used for acquiring interactive event updating information of an interactive window track in the multi-object interactive information stream of the information stream node terminal;
a second obtaining module, configured to obtain, based on the interaction event update information, a to-be-simulated drawing and loading element that matches multiple to-be-loaded interaction event fragments and a target simulated drawing control corresponding to the to-be-simulated drawing and loading element, where the target simulated drawing control is a simulated drawing control that serves an interaction component to which event loading information of the to-be-simulated drawing and loading element belongs, and the target simulated drawing control includes at least one control drawing object;
the generating module is used for screening and matching the multiple interactive event fragments to be loaded to obtain a target loading interactive event which has a drawing association relation with at least one control drawing object, and generating loading drawing control information between the target loading interactive event and the target control drawing object according to drawing parameters of the target loading interactive event and the at least one control drawing object under a target drawing attribute category;
and the verification module is used for inputting the loading and drawing control information between the target loading interactive event and the target control drawing object under each drawing attribute category in each target simulation drawing control, selecting a target simulation drawing resource matched with the drawing and loading element to be simulated from a preset target simulation drawing resource set according to an input result, and pushing an interactive event verification stream of the target simulation drawing resource to the information flow node terminal, so that the interactive event verification stream is used for information mining of a user of the information flow node terminal after the information flow node terminal verifies and confirms the interactive event verification stream.
In a third aspect, an embodiment of the present application further provides an information flow interaction processing system based on cloud computing, where the information flow interaction processing system based on cloud computing includes a cloud computing verification interaction center and a plurality of information flow node terminals communicatively connected to the cloud computing verification interaction center;
the cloud computing verification interaction center is used for:
acquiring interactive event updating information of an interactive window track in the multi-object interactive information stream of the information stream node terminal;
acquiring a drawing and loading element to be simulated which is matched with a plurality of interaction event fragments to be loaded and a target simulation drawing control corresponding to the drawing and loading element to be simulated based on the interaction event update information, wherein the target simulation drawing control is a simulation drawing control which is served by an interaction component to which the event loading information of the drawing and loading element to be simulated belongs, and the target simulation drawing control comprises at least one control drawing object;
screening and matching a plurality of interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relation with at least one control drawing object, and generating loading drawing control information between the target loading interaction event and the target control drawing object according to drawing parameters of the target loading interaction event and the at least one control drawing object under a target drawing attribute category;
and inputting loading drawing control information between the target loading interactive event and a target control drawing object under each drawing attribute category in each target simulation drawing control, selecting a target simulation drawing resource matched with the drawing loading element to be simulated from a preset target simulation drawing resource set according to an input result, and pushing an interactive event verification stream of the target simulation drawing resource to the information flow node terminal, so that the interactive event verification stream is used for information mining of a user of the information flow node terminal after the information flow node terminal verifies and confirms the interactive event verification stream.
In a fourth aspect, an embodiment of the present application further provides a cloud computing verification interaction center, where the cloud computing verification interaction center includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected by a bus system, the network interface is used for being in communication connection with at least one information flow node terminal, the machine-readable storage medium is used for storing a program, an instruction, or a code, and the processor is used for executing the program, the instruction, or the code in the machine-readable storage medium to execute the cloud computing-based information flow interaction processing method in the first aspect or any one of possible implementation manners in the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are executed, the computer is caused to execute the cloud computing-based information flow interaction processing method in the first aspect or any one of the possible implementation manners of the first aspect.
Based on any one of the above aspects, after the loading rendering control information between the target loading interaction event and the target control rendering object is generated according to the rendering parameters of the target loading interaction event and at least one control rendering object, the loading rendering control information between the target loading interaction event and the target control rendering object under each rendering attribute category is entered into each of the target simulated rendering controls. A large number of reference bases built on loading interaction events can thus be utilized to ensure that more target control rendering objects are obtained, which helps improve the accuracy of information flow matching of subsequent simulation rendering resources. In addition, errors that occur during simulation rendering of the loading element to be simulated and rendered when the loading interaction event is taken as the independent interaction processing unit can be avoided, and therefore the accuracy of information flow matching of the simulation rendering resources is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic application scenario diagram of an information flow interaction processing system based on cloud computing according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an information flow interaction processing method based on cloud computing according to an embodiment of the present application;
fig. 3 is a schematic functional module diagram of an information flow interaction processing apparatus based on cloud computing according to an embodiment of the present application;
fig. 4 is a schematic block diagram of structural components of a cloud computing verification interaction center for implementing the above-described information flow interaction processing method based on cloud computing according to the embodiment of the present application.
Detailed Description
The present application will now be described in detail with reference to the drawings, and the specific operations in the method embodiments may also be applied to the apparatus embodiments or the system embodiments.
Fig. 1 is an interaction schematic diagram of an information flow interaction processing system 10 based on cloud computing according to an embodiment of the present application. The cloud computing-based information flow interaction processing system 10 may include a cloud computing verification interaction center 100 and an information flow node terminal 200 communicatively connected to the cloud computing verification interaction center 100. The cloud computing-based information flow interaction processing system 10 shown in fig. 1 is only one possible example, and in other possible embodiments, the cloud computing-based information flow interaction processing system 10 may also include only one of the components shown in fig. 1 or may also include other components.
Based on the inventive concept of the technical solution provided by the present application, the cloud computing verification interaction center 100 provided by the present application can be applied to scenes such as smart medical, smart city management, smart industrial internet, general service monitoring management, etc. in which a big data technology or a cloud computing technology is applied, and for example, can also be applied to scenes such as but not limited to new energy automobile system management, smart cloud office, cloud platform data processing, cloud game data processing, cloud live broadcast processing, cloud automobile management platform, block chain financial data service platform, etc., but is not limited thereto.
In this embodiment, the cloud computing verification interaction center 100 and the information flow node terminal 200 in the information flow interaction processing system 10 based on cloud computing may execute the information flow interaction processing method based on cloud computing described in the following method embodiment in a matching manner, and the detailed description of the following method embodiment may be referred to in the specific steps executed by the cloud computing verification interaction center 100 and the information flow node terminal 200.
In order to solve the technical problem in the foregoing background art, fig. 2 is a schematic flowchart of a cloud computing-based information flow interaction processing method provided in an embodiment of the present application, and the cloud computing-based information flow interaction processing method provided in the embodiment may be executed by the cloud computing verification interaction center 100 shown in fig. 1, and the cloud computing-based information flow interaction processing method is described in detail below.
Step S110, obtaining the interactive event updating information of the interactive window track in the multi-object interactive information flow of the information flow node terminal.
Step S120, acquiring, based on the interaction event update information, a drawing loading element to be simulated that matches the multiple interaction event fragments to be loaded, and a target simulation drawing control corresponding to the drawing loading element to be simulated.
In this embodiment, the target simulation drawing control may be understood as a simulation drawing control of an interaction component service to which event loading information of a to-be-simulated drawing loading element belongs, where the target simulation drawing control may include at least one control drawing object. In an alternative implementation manner, the interaction event update information may have one or more interaction event segments to be loaded, where an interaction event segment to be loaded may be understood as a specific interaction process included in a loading interaction event (for example, an e-commerce live broadcast interaction process for a specific e-commerce commodity), and a loading interaction event may be understood as a complete interaction process (for example, a process set of e-commerce live broadcast interaction processes for a plurality of specific e-commerce commodities in an overall live broadcast process), where each interaction process may be an interaction process for a certain individual object or an interaction process of an interaction task composed of a plurality of individual objects.
In addition, the drawing loading element to be simulated that matches each interaction event fragment to be loaded may be obtained from a current real-time simulation drawing loading element library based on the interaction event fragment to be loaded, or from a pre-configured simulation drawing loading element library, which is not specifically limited here. A simulation drawing loading element may be understood as information loaded during simulation drawing, such as a scene description, a voting question, background information or a social media interaction, and the target simulation drawing control corresponding to the drawing loading element to be simulated may be obtained based on a control parameter (for example, an SDK (software development kit)) associated in advance with each drawing loading element to be simulated.
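To keep the terminology of steps S110 to S140 straight, the following sketch models the main objects of this embodiment as plain data structures. It is only one possible reading; all class and field names are hypothetical and not drawn from the application itself.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class InteractionEventFragment:
    """A single interaction process inside a loading interaction event,
    e.g. the live-stream interaction around one specific commodity."""
    fragment_id: str
    payload: Dict[str, str] = field(default_factory=dict)

@dataclass
class LoadInteractionEvent:
    """A complete interaction process made up of several fragments."""
    event_id: str
    fragments: List[InteractionEventFragment] = field(default_factory=list)

@dataclass
class LoadingElementToRender:
    """Scene description, voting question, background information, social
    media interaction, etc. loaded during simulation drawing."""
    element_id: str
    description: str

@dataclass
class ControlRenderingObject:
    object_id: str
    attribute_category: str          # e.g. "same", "hierarchy", "partition"
    rendering_parameters: Dict[str, float] = field(default_factory=dict)

@dataclass
class SimulatedRenderingControl:
    """Control serving the interaction component to which the loading
    element belongs; it owns at least one control rendering object."""
    control_id: str
    rendering_objects: List[ControlRenderingObject] = field(default_factory=list)

# Example wiring of the objects described in steps S110-S140.
control = SimulatedRenderingControl(
    control_id="ctrl-1",
    rendering_objects=[ControlRenderingObject("obj-1", "same", {"score": 0.8})],
)
event = LoadInteractionEvent(
    event_id="evt-1",
    fragments=[InteractionEventFragment("frag-1", {"commodity": "item-42"})],
)
element = LoadingElementToRender("elem-1", "voting question overlay")
print(control, event, element, sep="\n")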
Step S130, the multiple interactive event fragments to be loaded are screened and matched to obtain a target loading interactive event which has a drawing association relation with at least one control drawing object, and loading drawing control information between the target loading interactive event and the target control drawing object is generated according to drawing parameters of the target loading interactive event and the at least one control drawing object under the target drawing attribute category.
In this embodiment, the rendering parameters may be understood as rendering instructions when the target loading interaction event and the at least one control rendering object match the same rendering attribute in the target rendering attribute category, and the specific determination manner may refer to an existing common rendering attribute algorithm model. In addition, the load rendering control information may be used to represent control instructions for the target load interaction event and the target control rendering object to be load rendered.
Step S140, inputting the loading drawing control information between the target loading interaction event and the target control drawing object under each drawing attribute category in each target simulation drawing control, selecting a target simulation drawing resource matched with the drawing loading element to be simulated from a preset target simulation drawing resource set according to an input result, and pushing an interaction event verification stream of the target simulation drawing resource to the information flow node terminal, so that the interaction event verification stream is used for information mining of a user of the information flow node terminal after the information flow node terminal verifies and confirms the interaction event verification stream.
In this embodiment, the simulation rendering resource may be understood as a specific rendering resource that is finally pushed to the information flow node terminal, and may include, but is not limited to, a content prompt rendering resource (such as some special effect schemes), a commodity object rendering resource, a session rendering resource, and the like, for example, but is not limited thereto.
For example, in the process of selecting a target simulation drawing resource matched with the drawing loading element to be simulated from a preset target simulation drawing resource set according to the input result and pushing an interactive event verification stream of the target simulation drawing resource to the information flow node terminal, the target simulation drawing loading elements matched with the loading interaction event fragments included in the target loading interaction event whose loading drawing control information is greater than the preset loading drawing control information may first be determined. The target simulation drawing resources corresponding to these target simulation drawing loading elements are then obtained from the preset target simulation drawing resource set, the target simulation drawing resources are drawn to generate an interactive event verification stream, and the interactive event verification stream of the target simulation drawing resources is pushed to the information flow node terminal.
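A condensed sketch of this selection-and-push flow is shown below; the threshold comparison, dictionary shapes and the push_to_terminal callback are assumptions used only to make the example self-contained.

from typing import Dict, List

def select_and_push(load_render_control: Dict[str, float],
                    fragment_to_element: Dict[str, str],
                    resource_set: Dict[str, str],
                    threshold: float,
                    push_to_terminal) -> List[str]:
    """Pick simulation drawing resources for fragments whose load drawing
    control value exceeds the preset threshold, build a verification stream,
    and push it to the information flow node terminal."""
    # Fragments of the target loading interaction event that pass the preset
    # loading drawing control information.
    selected_elements = [fragment_to_element[f]
                         for f, value in load_render_control.items()
                         if value > threshold and f in fragment_to_element]
    # Target simulation drawing resources from the preset resource set.
    resources = [resource_set[e] for e in selected_elements if e in resource_set]
    # "Drawing" the resources here is just packaging them into a
    # verification stream record for the terminal to confirm.
    verification_stream = [f"verify:{r}" for r in resources]
    push_to_terminal(verification_stream)
    return verification_stream

# Toy usage with made-up values.
stream = select_and_push(
    load_render_control={"frag-1": 0.9, "frag-2": 0.3},
    fragment_to_element={"frag-1": "elem-1", "frag-2": "elem-2"},
    resource_set={"elem-1": "special-effect-scheme-A"},
    threshold=0.5,
    push_to_terminal=lambda s: print("pushed", s),
)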
Based on the above steps, after the loading rendering control information between the target loading interaction event and the target control rendering object is generated according to the rendering parameters of the target loading interaction event and at least one control rendering object, the loading rendering control information between the target loading interaction event and the target control rendering object under each rendering attribute category is entered into each of the target simulated rendering controls. A large number of reference bases built on loading interaction events can thus be utilized to ensure that more target control rendering objects are obtained, which helps improve the accuracy of information flow matching of subsequent simulation rendering resources. In addition, errors that occur during simulation rendering of the loading element to be simulated and rendered when the loading interaction event is taken as the independent interaction processing unit can be avoided, and therefore the accuracy of information flow matching of the simulation rendering resources is improved.
In one possible implementation manner, for step S130, in the process of generating the load rendering control information between the target load interaction event and the target control rendering object according to the rendering parameters of the target load interaction event and the at least one control rendering object in the target rendering property category, the load rendering control information may be generated through the following exemplary sub-steps, which are described in detail below.
And a substep S131, determining a target rendering attribute category corresponding to each control rendering object according to the rendering incidence relation existing between the target loading interaction event and the control rendering object.
And a substep S132, based on the determined target drawing attribute type, calling drawing parameters of the target loading interaction event and at least one control drawing object in the determined target drawing attribute type, and determining the control drawing object with the drawing parameters meeting the preset drawing service range as the target control drawing object.
And a substep S133 of generating loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category.
For example, in the sub-step S132, the following three implementation manners can be implemented.
And (I) calculating first drawing parameters of the target loading interaction event and at least one control drawing object in the same drawing attribute category, and determining the control drawing object with the first drawing parameters meeting a preset drawing service range as a first target control drawing object.
And (II) calculating a second drawing parameter of the target loading interaction event and at least one control drawing object under the hierarchy drawing attribute category, and determining the control drawing object of which the second drawing parameter meets a preset drawing service range as a second target control drawing object.
And (III) calculating a third drawing parameter of the target loading interaction event and at least one control drawing object in the partition drawing attribute category, and determining the control drawing object of which the third drawing parameter meets a preset drawing service range as a third target control drawing object.
In (one), a same-drawing-attribute sequence may be selected in the target simulation drawing control, and the same-drawing-attribute sequence may include a plurality of same-drawing-attribute lists, each of which includes at least two control drawing objects with the same drawing attribute description vector. Then, the same-drawing-attribute list whose drawing attribute is the same as the loading level drawing attribute of the target loading interaction event is determined, to obtain a target same-drawing-attribute list. On this basis, a first drawing parameter between the target loading interaction event and each control drawing object in the target same-drawing-attribute list is calculated, and the control drawing object whose first drawing parameter meets a preset drawing service range is determined as the first target control drawing object.
In this way, in sub-step S133, first loading drawing control information between the target loading interaction event and the first target control drawing object may be generated according to the first drawing parameter between the target loading interaction event and the first target control drawing object.
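As a rough illustration of manner (I), the sketch below picks the same-drawing-attribute list matching the event's loading level drawing attribute and then range-screens the first drawing parameter; the dictionary layout, keys, and the service range are assumptions, not the patent's API.

```python
# Hypothetical sketch of manner (I); field names and thresholds are assumed.
def first_target_control_objects(load_event, attribute_sequence,
                                 service_range=(0.5, 1.0)):
    low, high = service_range
    # pick the same-drawing-attribute list whose attribute matches the
    # loading level drawing attribute of the target loading interaction event
    target_list = next(
        (lst for lst in attribute_sequence
         if lst["drawing_attribute"] == load_event["level_drawing_attribute"]),
        None)
    if target_list is None:
        return []
    # keep control drawing objects whose first drawing parameter falls in
    # the preset drawing service range
    return [obj for obj in target_list["objects"]
            if low <= obj["first_parameter"] <= high]
```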
Alternatively, in manner (II), the hierarchy drawing attribute relationship between the target loading interaction event and the at least one control drawing object may be determined according to the loading hierarchy drawing attribute of the target loading interaction event and the loading hierarchy drawing attribute of each control drawing object. Then, based on the determined hierarchy drawing attribute relationship, the second drawing parameter between the target loading interaction event and the corresponding upper-level control drawing object is called, and the control drawing object whose second drawing parameter meets the preset drawing service range is determined as the second target control drawing object.
In this way, in sub-step S133, second loading drawing control information between the target loading interaction event and the second target control drawing object may be generated according to the second drawing parameter between the target loading interaction event and the second target control drawing object.
Or, in manner (III), the drawing attribute partition pre-established for each control drawing object may be collected, the drawing mapping occurrence value between the target loading interaction event and each control drawing object is then calculated, and the control drawing object whose drawing mapping occurrence value is greater than a preset value is determined as a key loading interaction event to be selected. On this basis, the third drawing parameter between the target loading interaction event and each key loading interaction event to be selected whose drawing attribute partition covers the preset partition can be calculated, and the key loading interaction event to be selected whose third drawing parameter meets the preset drawing service range is determined as the third target control drawing object.
In this way, in sub-step S133, third loading drawing control information between the target loading interaction event and the third target control drawing object may be generated according to the third drawing parameter between the target loading interaction event and the third target control drawing object.
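Manner (III) can likewise be pictured as a two-stage filter: a drawing mapping occurrence threshold selects the candidate key loading interaction events, and partition coverage plus the third drawing parameter select the final targets. The sketch below is illustrative only; the keys, threshold, and range are assumed.

```python
# Minimal, assumption-laden sketch of manner (III).
def third_target_control_objects(drawing_objects, preset_partition,
                                 occurrence_threshold=0.7,
                                 service_range=(0.5, 1.0)):
    low, high = service_range
    # stage 1: objects whose drawing mapping occurrence value exceeds the
    # preset value become candidate key loading interaction events
    candidates = [obj for obj in drawing_objects
                  if obj["mapping_occurrence"] > occurrence_threshold]
    # stage 2: among candidates whose drawing attribute partition covers the
    # preset partition, keep those whose third drawing parameter lies in range
    return [obj for obj in candidates
            if set(preset_partition).issubset(obj["attribute_partition"])
            and low <= obj["third_parameter"] <= high]
```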
Further, in a possible implementation manner, for step S140, the process of entering, in each target simulation drawing control, the loading drawing control information between the target loading interaction event and the target control drawing object under each drawing attribute category may be implemented through the following exemplary sub-steps, which are described in detail below.
And a substep S141 of obtaining a preset partition template corresponding to each drawing attribute type.
And a substep S142, for the obtained partition template, calculating partition template matching information against the loading drawing control information between the target loading interaction event and the target control drawing object under the corresponding drawing attribute category, so as to obtain partition loading drawing control information corresponding to each drawing attribute category.
And a substep S143, recording the partition loading drawing control information corresponding to each drawing attribute type in each target simulation drawing control.
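A compact way to picture sub-steps S141 to S143 is a per-category template filter followed by recording into every target simulation drawing control. The structures below are hypothetical stand-ins for what the patent leaves abstract.

```python
# Hedged sketch of sub-steps S141-S143; the template format is an assumption.
def enter_partitioned_control_info(control_info_by_category, partition_templates,
                                   simulation_controls):
    for category, entries in control_info_by_category.items():
        template = partition_templates.get(category)            # S141
        if template is None:
            continue
        # S142: match the loading drawing control information against the
        # partition template to get partition loading drawing control info
        partitioned = [e for e in entries
                       if e["target_object"] in template["allowed_targets"]]
        # S143: record the partitioned info in each target simulation control
        for control in simulation_controls:
            control.setdefault(category, []).extend(partitioned)
    return simulation_controls
```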
Further, in a possible implementation manner, for step S130, the process of screening and matching the plurality of interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relationship with at least one control drawing object may be implemented through the following exemplary sub-steps, which are described in detail below.
And a substep S131, identifying the page pushing category of each interaction event fragment to be loaded.
And a substep S132, removing the interaction event fragments to be loaded whose page pushing category is a blacklist loading category, and arranging, screening, and matching the retained interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relationship with at least one control drawing object.
For example, the arranging, screening, and matching may follow a rule or configuration information of the page layout for the current interaction request, or may be performed adaptively according to the size of each interaction event fragment to be loaded; both options fall within the protection scope of the embodiments of the present application, and in either case a target loading interaction event having a drawing association relationship with at least one control drawing object is obtained.
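A simple blacklist filter followed by a size-based arrangement is sketched below as a non-authoritative illustration; the category value and the sort key are invented for the example.

```python
# Illustrative screening of interaction event fragments to be loaded.
def screen_fragments(fragments, blacklist=("blacklist_loading",)):
    # sub-step S131: read each fragment's page pushing category and drop the
    # fragments whose category is a blacklist loading category
    kept = [f for f in fragments if f["page_push_category"] not in blacklist]
    # sub-step S132: arrange the retained fragments, here adaptively by size,
    # before matching them against the control drawing objects
    return sorted(kept, key=lambda f: f["size"], reverse=True)
```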
In a possible implementation manner, for step S110, the process of obtaining the interaction event update information of the interaction window track in the multi-object interaction information stream of the information stream node terminal may be implemented through the following exemplary sub-steps, which are described in detail below.
Step S111, acquiring the multi-object interactive information stream of the information stream node terminal 200, and performing independent movable window extraction processing on the multi-object interactive information stream to obtain independent movable window information of an interactive window trajectory in the multi-object interactive information stream.
And step S112, carrying out interactive behavior tracking extraction based on the independent movable window information of the interactive window track to obtain the target interactive behavior migration characteristic of the interactive window track.
And S113, extracting interactive content migration characteristics of the multi-object interactive information stream based on the artificial intelligence model to obtain interactive content migration characteristic information of the interactive window track.
Step S114, carrying out interactive linkage event synthesis on the target interactive behavior migration characteristic of the interactive window track in the multi-object interactive information stream and the interactive content migration characteristic information of the interactive window track to obtain interactive linkage event synthesis information of the interactive window track, and carrying out interactive event updating on the interactive event record control of the multi-object interactive information stream based on the interactive linkage event synthesis information of the interactive window track to obtain interactive event updating information of the interactive window track.
In this embodiment, the multi-object interaction information stream may be understood as an interaction information stream formed by the object interaction information recorded in each interaction window track acquired under a single interaction request. The interaction request may be an interaction instruction initiated by the information flow node terminal 200 to another interaction object. The interaction window track refers to a track formed by the interaction windows created under an interaction request; an interaction window may be understood as an interaction unit that provides the various functions required for interaction. Since different service functions generally have association relationships with one another, interaction windows having an association relationship can be organized into an interaction window track on the basis of that relationship. In addition, the independent movable window information can be used to characterize the service interaction condition of the independent movable window related to the interactive window data.
In this embodiment, the interactive behavior migration feature may be used to represent the behavior feature information of the migration of an interaction behavior initiated by a user at a certain time node or space node. For example, during an e-commerce live broadcast, after a user initiates a live interaction behavior for live commodity A, the interactive behavior migration feature may be recorded when another user initiates a response interaction (for example, but not limited to, purchasing, collecting, or subscribing) to that live interaction behavior. In addition, the interactive content migration feature information may be the content feature information characterizing what the user-initiated interaction behavior concerns when such a migration occurs at a certain time node or space node; in the above example, it may be understood as the content of the specific interaction when the other user initiates the response interaction to the live interaction behavior of live commodity A.
In this embodiment, after the interaction event update information of the interaction window track is obtained, the subsequent service information pushing operations are performed on the basis of this update information. By synthesizing the target interactive behavior migration feature of the interaction window track and the interactive content migration feature information of the interaction window track into an interactive linkage event, this embodiment integrates the independent movable window information of the interaction window track with its interactive content migration feature information, extracts rich service relationship feature information of the interaction window track, and thereby provides data support for accurate interaction mining. In addition, the interaction event record of the interaction window track is updated on the basis of the target interactive behavior migration feature of the interaction window track, yielding the interaction event update information of the interaction window track and realizing an accurate interaction mining process.
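The four steps S111 to S114 can be read as a linear pipeline. The sketch below fixes only that control flow; every processing stage is passed in as a callable because the patent leaves the concrete computations open, so nothing here should be read as the actual implementation.

```python
# Structural sketch of steps S111-S114; the five callables are hypothetical.
def update_interaction_events(stream, extract_windows, track_behavior,
                              extract_content, synthesize_linkage, apply_update):
    window_info = extract_windows(stream)                 # S111
    behavior_features = track_behavior(window_info)       # S112
    content_features = extract_content(stream)            # S113
    linkage_info = synthesize_linkage(behavior_features,  # S114: synthesis
                                      content_features)
    return apply_update(stream, linkage_info)             # S114: event update
```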
In a possible implementation manner, for step S111, in the process of performing independent movable window extraction processing on the multi-object interactive information stream to obtain independent movable window information of an interactive window trajectory in the multi-object interactive information stream, the following exemplary sub-steps may be implemented, which are described in detail below.
Sub-step S1111, obtaining a set of interactive graphic elements in the interactive graphic track recorded by the window service of each object interaction event in the multi-object interaction information stream.
In this embodiment, it is worth noting that the interactive graphic element set in the interactive graphic track contains interactive graphic elements that each take an interactive graphic track as their interaction area, and each interactive graphic element includes the graphic interaction trigger information and graphic attribute information of the interactive graphic track, as well as the interactive graphic records in the interactive graphic track. For example, the interactive graphic track may be used to represent a time recording interval related to the interactive window updating process, the graphic interaction trigger information may represent the trigger node at which the graphic is captured (for example, a click or browse operation by the user may serve as a trigger node), and the graphic attribute information may represent the graphic attribute information indicated after the graphic is captured.
Substep S1112, for each interactive graphic track and for each content editing graphic among the plurality of content editing graphics in the updated graphic record of the interactive graphic track of each object interaction event: determining, according to the page interaction elements of the content editing interactive pages in the content editing graphic, whether each content editing interactive page in the content editing graphic is a reference target content editing interactive page; determining, according to the number of reference target content editing interactive pages in the content editing graphic, each reference interaction indication control corresponding to the content editing graphic; and, for each reference interaction indication control, dividing the reference interaction indication control into a plurality of sub-interaction indication controls and determining, according to the editing object and the preset object range of each content editing interactive page in each sub-interaction indication control, whether the reference interaction indication control is a target interaction indication control, where each content editing interactive page corresponds to a content editing interaction behavior.
And a substep S1113, obtaining the interactive window partition information obtained by matching a preset interactive window rule with each content editing interactive page in the target interaction indication control, where the interactive window partition information includes interactive window calling information and interactive window component information, and the preset interactive window rule includes the matching modes corresponding to different interactive window services.
Substep S1114, determining the interactive window drawing attribute feature of each interactive window drawing attribute map and the interactive window precondition of each interactive window abstract map according to the interactive window partition information of each updated graphic record of each interactive graphic element set in the interactive graphic track; determining the interactive window label object of each object interaction event in the interactive graphic track according to the interactive window drawing attribute feature of each interactive window drawing attribute map and the interactive window precondition of each interactive window abstract map in the target interaction indication control; taking the features within the interactive window range of the interactive window label object, together with the features outside that range that are associated with the interactive window label object, as the independent movable window features of each object interaction event in the interactive graphic track; and aggregating the independent movable window features of each object interaction event in all interactive graphic tracks to obtain the independent movable window information of the interaction window track in the multi-object interaction information stream.
In a possible implementation manner, for step S112, in the process of performing interactive behavior tracking extraction based on the independent movable window information of the interactive window trajectory to obtain the target interactive behavior migration feature of the interactive window trajectory, the following exemplary sub-steps may be implemented, which are described in detail below.
And a substep S1121, obtaining interaction behavior request information of a user interaction behavior set added to window demonstration information of each independent movable window feature in the independent movable window information of the interaction window trajectory, and determining a first interaction flow segment list corresponding to the interaction behavior request information, where the interaction behavior request information includes interaction behavior result information of interaction behavior running information determined according to interaction behavior input information and interaction behavior output information of the user interaction behavior set, and the first interaction flow segment list includes a sequence of a plurality of interaction flow segments of the interaction behavior result information.
In sub-step S1122, window presentation information for each of the independently movable window features is determined based on a first interactive behavior vector of the interactive behavior input information and a second interactive behavior vector of the interactive behavior output information.
And a substep S1123, determining a migration analysis parameter for performing migration analysis on the first interactive flow segment list according to the interactive flow segment sequence relationship between the first interactive behavior vector and the second interactive behavior vector.
The sub-step S1124 is to perform migration analysis on the first interactive flow segment list based on the migration analysis parameters to obtain a second interactive flow segment list.
And S1125, performing migration node positioning on the second interactive flow segment list to obtain a plurality of migration node positioning portions, and performing feature extraction on each migration node positioning portion to obtain migration node positioning features.
And a substep S1126 of determining the interactive behavior migration characteristic of each independent movable window characteristic according to the interactive behavior migration characteristics corresponding to the plurality of migration node positioning characteristics corresponding to the second interactive flow segment list.
And a substep S1127 of obtaining a target interactive behavior migration characteristic of the interactive window trajectory based on the interactive behavior migration characteristic of each independent movable window characteristic.
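Sub-steps S1121 to S1127 follow the same pattern of per-window processing followed by aggregation. The sketch below captures only that structure; the analysis functions are injected as callables and are assumptions, since the patent does not specify them.

```python
# Hedged structural sketch of sub-steps S1121-S1127.
def extract_behavior_migration(window_features, get_request_info,
                               migration_analyze, locate_nodes, featurize):
    per_window = {}
    for window_id, window in window_features.items():
        request_info, first_segments = get_request_info(window)   # S1121
        second_segments = migration_analyze(first_segments,       # S1123-S1124
                                            request_info)
        nodes = locate_nodes(second_segments)                     # S1125
        per_window[window_id] = [featurize(n) for n in nodes]     # S1126
    return per_window                                             # S1127: aggregate
```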
Further, in a possible implementation manner, for step S113, in the process of performing interactive content migration feature extraction on the multi-object interactive information stream based on the artificial intelligence model to obtain the interactive content migration feature information of the interactive window trajectory, the following exemplary sub-steps may be implemented, which are described in detail below.
And a substep S1131, inputting the multi-object interaction information stream into a pre-trained artificial intelligence model, and obtaining the confidence that the multi-object interaction information stream matches each interactive content tag.
It is worth noting that the artificial intelligence model is trained on training samples and the training label information corresponding to those samples, where the training samples are multi-object interaction information stream samples and the training label information is interactive content index information labels. The specific training process may follow a conventional training approach provided in the prior art; the training process does not belong to the technical problem this embodiment intends to solve and is therefore not described in detail here.
And a sub-step S1132, determining target interactive content index information corresponding to the multi-object interactive information stream according to the confidence degree that the multi-object interactive information stream is matched with each interactive content tag.
For example, the interactive content tag with the confidence degree greater than the preset confidence degree threshold value may be determined as the target interactive content index information corresponding to the multi-object interactive information stream.
And a substep S1133, extracting the interactive content migration characteristic information matched with each interactive window track from the interactive content index information description information of the target interactive content index information corresponding to the multi-object interactive information stream.
For example, in the extraction process, the feature information with a migration node description that matches each interactive window track may be extracted from the interactive content index information description information.
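Sub-steps S1132 and S1133 amount to thresholding the per-tag confidences and then looking up the matching description features. Below is a minimal sketch, assuming the model output is already a tag-to-confidence mapping and that 0.8 stands in for the preset confidence threshold.

```python
# Illustrative tag selection and feature lookup; threshold and layout assumed.
def select_content_migration_features(tag_confidences, description_info,
                                      threshold=0.8):
    # S1132: keep the interactive content tags whose confidence exceeds the
    # preset confidence threshold as target interactive content index info
    target_tags = [tag for tag, conf in tag_confidences.items() if conf > threshold]
    # S1133: extract the migration feature information recorded for each tag
    return {tag: description_info.get(tag, []) for tag in target_tags}
```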
In a possible implementation manner, still for step S114, the process of updating the interactive event record control of the multi-object interaction information stream based on the interactive linkage event synthesis information of the interaction window track to obtain the interaction event update information of the interaction window track may be implemented through the following exemplary sub-steps, which are described in detail below.
And a substep S1145 of obtaining the interactive event information of the interactive window track under the interactive request.
And a substep S1146 of obtaining the interactive event items under the interactive event information and the event relationship configuration information corresponding to each interactive event item.
And a substep S1147, overlaying the interactive linkage event synthesis information of the interaction window track onto the event relation configuration information corresponding to each interaction event item to obtain the interaction event update information of the interaction window track.
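Sub-steps S1145 to S1147 can be pictured as overlaying the linkage-event synthesis information onto each event item's relation configuration; the dictionary shapes below are illustrative assumptions rather than the patent's data model.

```python
# Hypothetical sketch of sub-steps S1145-S1147.
def update_event_records(event_items, relation_config, linkage_synthesis_info):
    updated = {}
    for item in event_items:                                   # S1146: each event item
        config = dict(relation_config.get(item, {}))
        config.update(linkage_synthesis_info)                  # S1147: overlay synthesis info
        updated[item] = config
    return updated                                             # interaction event update information
```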
Fig. 3 is a schematic diagram of the functional modules of an information flow interaction processing apparatus 300 based on cloud computing according to an embodiment of the present disclosure. In this embodiment, the functional modules of the cloud computing-based information flow interaction processing apparatus 300 may be divided according to the method embodiments executed by the cloud computing verification interaction center 100; that is, the following functional modules of the apparatus 300 may be used to execute the method embodiments executed by the cloud computing verification interaction center 100. The cloud computing-based information flow interaction processing apparatus 300 may include a first obtaining module 310, a second obtaining module 320, a generating module 330, and a verification module 340, whose functions are described in detail below.
The first obtaining module 310 is configured to obtain interaction event update information of an interaction window track in a multi-object interaction information stream of the information stream node terminal 200. The first obtaining module 310 may be configured to perform the step S110, and for a detailed implementation of the first obtaining module 310, reference may be made to the detailed description of the step S110.
The second obtaining module 320 is configured to obtain, based on the interaction event update information, a to-be-simulated drawing load element that matches the multiple to-be-loaded interaction event fragments and a target simulated drawing control corresponding to the to-be-simulated drawing load element, where the target simulated drawing control is a simulated drawing control that is served by an interaction component to which event loading information of the to-be-simulated drawing load element belongs, and the target simulated drawing control includes at least one control drawing object. The second obtaining module 320 may be configured to perform the step S120, and for a detailed implementation of the second obtaining module 320, reference may be made to the detailed description of the step S120.
The generating module 330 is configured to screen and match the multiple interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relationship with the at least one control drawing object, and to generate loading drawing control information between the target loading interaction event and the target control drawing object according to the drawing parameters of the target loading interaction event and the at least one control drawing object under the target drawing attribute category. The generating module 330 may be configured to perform the step S130, and for a detailed implementation of the generating module 330, reference may be made to the detailed description of the step S130.
The verification module 340 is configured to enter, in each target simulation drawing control, loading drawing control information between the target loading interaction event and the target control drawing object under each drawing attribute category, select, according to an entry result, a target simulation drawing resource that matches the to-be-simulated drawing loading element from a preset target simulation drawing resource set, and push an interaction event verification stream of the target simulation drawing resource to the information flow node terminal 200, so that after the information flow node terminal 200 performs verification confirmation on the interaction event verification stream, the interaction event verification stream is used for information mining of a user of the information flow node terminal 200. The verification module 340 may be configured to perform the step S140, and the detailed implementation of the verification module 340 may refer to the detailed description of the step S140.
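For orientation, the four functional modules might be grouped into a single processor class as sketched below; the class and method names are invented for illustration and only mirror the module and step numbers, not any real code base.

```python
# Illustrative skeleton of apparatus 300; all signatures are assumptions.
class CloudInfoFlowInteractionProcessor:
    def acquire_update_info(self, node_terminal_stream):
        """First obtaining module 310 / step S110."""
        ...

    def acquire_simulation_targets(self, update_info):
        """Second obtaining module 320 / step S120."""
        ...

    def generate_control_info(self, fragments, drawing_objects):
        """Generating module 330 / step S130."""
        ...

    def verify_and_push(self, control_info, resource_set):
        """Verification module 340 / step S140."""
        ...
```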
It should be noted that the above division of the modules of the apparatus is only a logical division; in an actual implementation, the modules may be wholly or partially integrated into one physical entity or may be physically separated. The modules may all be implemented in the form of software invoked by a processing element, may all be implemented in hardware, or some modules may be implemented as software invoked by a processing element while others are implemented in hardware. For example, the first obtaining module 310 may be a separately arranged processing element, may be integrated into a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code that a processing element of the apparatus calls to execute the functions of the first obtaining module 310. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
Fig. 4 is a schematic diagram illustrating a hardware structure of a cloud computing verification interaction center 100 for implementing the above-mentioned cloud computing-based information stream interaction processing method according to an embodiment of the present disclosure, and as shown in fig. 4, the cloud computing verification interaction center 100 may include a processor 110, a machine-readable storage medium 120, a bus 130, and a transceiver 140.
In a specific implementation process, at least one processor 110 executes the computer-executable instructions stored in the machine-readable storage medium 120 (for example, the instructions corresponding to the first obtaining module 310, the second obtaining module 320, the generating module 330, and the verification module 340 of the cloud computing-based information flow interaction processing apparatus 300 shown in Fig. 3), so that the processor 110 can execute the cloud computing-based information flow interaction processing method of the above method embodiments. The processor 110, the machine-readable storage medium 120, and the transceiver 140 are connected through the bus 130, and the processor 110 may be configured to control the transceiving actions of the transceiver 140 so as to exchange data with the aforementioned information flow node terminal 200.
For a specific implementation process of the processor 110, reference may be made to the above-mentioned method embodiments executed by the cloud computing verification interaction center 100, and implementation principles and technical effects thereof are similar, and details of this embodiment are not described herein again.
In the embodiment shown in fig. 4, it should be understood that the Processor may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of hardware and software modules within the processor.
The machine-readable storage medium 120 may comprise high-speed RAM memory and may also include non-volatile storage NVM, such as at least one disk memory.
The bus 130 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus 130 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
In addition, an embodiment of the present application further provides a readable storage medium in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the information flow interaction processing method based on cloud computing described above is implemented.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, particular terms are used in this description to describe its embodiments. References to "one embodiment," "an embodiment," and/or "some embodiments" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present description may be illustrated and described in terms of several patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of this description may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present description may be embodied as a computer product, including computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
The computer program code required for the operation of various portions of this specification may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), in a cloud computing environment, or as a service such as software as a service (SaaS).
Additionally, the order in which the elements and sequences are processed, the use of alphanumeric characters, or the use of other designations in this specification is not intended to limit the order of the processes and methods in this specification, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that, in the preceding description of the embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, claimed embodiments may lie in less than all features of a single embodiment disclosed above.
Finally, it should be understood that the examples in this specification are only intended to illustrate the principles of the examples in this specification. Other variations are also possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (10)

1. An information flow interaction processing method based on cloud computing is characterized by being applied to a cloud computing verification interaction center which is in communication connection with a plurality of information flow node terminals, and the method comprises the following steps:
acquiring interactive event updating information of an interactive window track in the multi-object interactive information stream of the information stream node terminal;
acquiring a drawing and loading element to be simulated which is matched with a plurality of interaction event fragments to be loaded and a target simulation drawing control corresponding to the drawing and loading element to be simulated based on the interaction event update information, wherein the target simulation drawing control is a simulation drawing control which is served by an interaction component to which the event loading information of the drawing and loading element to be simulated belongs, and the target simulation drawing control comprises at least one control drawing object;
screening and matching a plurality of interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relation with at least one control drawing object, and generating loading drawing control information between the target loading interaction event and the target control drawing object according to drawing parameters of the target loading interaction event and the at least one control drawing object under a target drawing attribute category;
and inputting loading drawing control information between the target loading interactive event and a target control drawing object under each drawing attribute category in each target simulation drawing control, selecting a target simulation drawing resource matched with the drawing loading element to be simulated from a preset target simulation drawing resource set according to an input result, and pushing an interactive event verification stream of the target simulation drawing resource to the information flow node terminal, so that the interactive event verification stream is used for information mining of a user of the information flow node terminal after the information flow node terminal verifies and confirms the interactive event verification stream.
2. The information flow interaction processing method based on cloud computing according to claim 1, wherein the generating of the load rendering control information between the target load interaction event and the target control rendering object according to the rendering parameters of the target load interaction event and at least one control rendering object in the target rendering property category includes:
determining a target drawing attribute type corresponding to each control drawing object according to the drawing incidence relation between the target loading interaction event and the control drawing object;
calling drawing parameters of the target loading interaction event and at least one control drawing object in the determined target drawing attribute category based on the determined target drawing attribute category, and determining the control drawing object with the drawing parameters meeting a preset drawing service range as a target control drawing object;
and generating loading drawing control information between the target loading interaction event and the target control drawing object according to the drawing parameters of the target loading interaction event and the target control drawing object under at least one drawing attribute category.
3. The information flow interaction processing method based on cloud computing according to claim 2, wherein:
the method comprises the following steps of calling the drawing parameters of the target loading interaction event and at least one control drawing object in the determined target drawing attribute category based on the determined target drawing attribute category, and determining the control drawing object with the drawing parameters meeting a preset drawing service range as a target control drawing object, wherein the steps comprise:
calling a first drawing parameter of the target loading interaction event and at least one control drawing object under the same drawing attribute category, and determining the control drawing object of which the first drawing parameter meets a preset drawing service range as a first target control drawing object;
calling a second drawing parameter of the target loading interaction event and at least one control drawing object under the hierarchy drawing attribute category, and determining the control drawing object of which the second drawing parameter meets a preset drawing service range as a second target control drawing object;
calling a third drawing parameter of the target loading interaction event and at least one control drawing object in a partition drawing attribute category, and determining the control drawing object of which the third drawing parameter meets a preset drawing service range as a third target control drawing object;
the calling of the first drawing parameter of the target loading interaction event and the drawing object of at least one control under the same drawing attribute category, and determining the control drawing object with the first drawing parameter meeting a preset drawing service range as a first target control drawing object, includes:
selecting a same-drawing attribute sequence from the target simulation drawing control, wherein the same-drawing attribute sequence comprises a plurality of same-drawing attribute lists, and each same-drawing attribute list comprises at least two control drawing objects with the same drawing attribute description vector;
determining a same drawing attribute list which has the same drawing attribute as the loading level drawing attribute of the target loading interaction event to obtain a target same drawing attribute list;
calling a first drawing parameter between the target loading interaction event and each control drawing object in the target same drawing attribute list, and determining the control drawing object of which the first drawing parameter meets a preset drawing service range as a first target control drawing object;
the generating of the loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category includes:
generating first loading drawing control information between the target loading interaction event and a first target control drawing object according to a first drawing parameter between the target loading interaction event and the first target control drawing object;
or, the calling the target loading interaction event and the second drawing parameter of the at least one control drawing object in the hierarchy drawing attribute category, and determining the control drawing object of which the second drawing parameter meets a preset drawing service range as the second target control drawing object, includes:
determining a level drawing attribute relation between the target loading interaction event and at least one control drawing object according to the loading level drawing attribute of the target loading interaction event and the loading level drawing attribute of each control drawing object;
calling a second drawing parameter between the target loading interaction event and the corresponding upper control drawing object based on the determined hierarchy drawing attribute relation, and determining a control drawing object of which the second drawing parameter meets a preset drawing service range as a second target control drawing object;
the generating of the loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category includes:
generating second loading drawing control information between the target loading interaction event and a second target control drawing object according to a second drawing parameter between the target loading interaction event and the second target control drawing object;
or, the calling the third drawing parameter of the target loading interaction event and the at least one control drawing object in the partition drawing attribute category, and determining the control drawing object of which the third drawing parameter meets a preset drawing service range as a third target control drawing object, includes:
collecting a pre-established drawing attribute partition of each control drawing object;
calling a drawing mapping occurrence value between the target loading interaction event and each control drawing object, and determining the control drawing object with the drawing mapping occurrence value larger than a preset value as a key loading interaction event to be selected;
calling a third drawing parameter of the target loading interactive event and the key loading interactive event to be selected, of which the drawing attribute partition covers a preset partition, and determining the key loading interactive event to be selected, of which the third drawing parameter meets a preset drawing service range, as a third target control drawing object;
the generating of the loading rendering control information between the target loading interaction event and the target control rendering object according to the rendering parameters of the target loading interaction event and the target control rendering object in at least one rendering attribute category includes:
and generating third loading drawing control information between the target loading interaction event and a third target control drawing object according to a third drawing parameter between the target loading interaction event and the third target control drawing object.
4. The information flow interaction processing method based on cloud computing according to any one of claims 1 to 3, wherein the entering of the load rendering control information between the target load interaction event and the target control rendering object in each rendering attribute category in each target simulation rendering control includes:
acquiring a preset partition template corresponding to each drawing attribute type;
calculating partition template matching information of loading drawing control information between the obtained partition template and the target loading interaction event and the target control drawing object under the corresponding drawing attribute category to obtain partition loading drawing control information corresponding to each drawing attribute category;
and inputting partition loading drawing control information corresponding to each drawing attribute type into each target simulation drawing control.
5. The information flow interaction processing method based on cloud computing according to any one of claims 1 to 3, wherein the step of screening and matching the multiple interaction event fragments to be loaded to obtain a target loading interaction event having a drawing association relationship with at least one control drawing object includes:
determining target drawing operation track data, the drawing participation confidence coefficient of which between the target drawing operation track data and at least one control drawing object in the interaction event fragments to be loaded is greater than a set confidence coefficient, and a first drawing operation track node and a second drawing operation track node which take the target drawing operation track data as reference drawing operation track data according to the drawing operation track data corresponding to the interaction event fragments to be loaded, wherein the drawing participation element information of the first drawing operation track node is not overlapped with the drawing participation element information of the second drawing operation track node, and a loading interaction relationship exists between the drawing participation element information and the drawing participation element information of the second drawing operation track node;
determining a drawing operation track component meeting a first target requirement in the first drawing operation track node, and determining first interactive scene interface window information corresponding to the first drawing operation track node according to scene area modeling information of a multi-level scene area between a visual scene object of the drawing operation track component meeting the first target requirement and an associated preset scene object; the drawing operation track component meeting the first target requirement is a drawing operation track component of a visual scene object matched with the associated preset scene object;
determining a drawing operation track component meeting a second target requirement in the second drawing operation track node, and determining second interactive scene interface window information corresponding to the second drawing operation track node according to scene area modeling information of a multi-level scene area between a visual scene object of the drawing operation track component meeting the second target requirement and an associated preset scene object; the drawing operation track component meeting the second target requirement is a drawing operation track component of a visual scene object matched with the associated preset scene object;
obtaining window flow information of the drawing operation track component in each first drawing participation element information according to first interactive scene interface window information corresponding to the first drawing operation track node, and obtaining window flow information of the drawing operation track component in each second drawing participation element information according to second interactive scene interface window information in the second drawing operation track node;
according to the window flow information of each first drawing participating element information and each second drawing participating element information, respectively carrying out event tracing on the drawing operation track component in each drawing participating element information to obtain first event tracing information of each first drawing participating element information and second event tracing information of each second drawing participating element information;
obtaining corresponding event tracing information according to the first event tracing information of each first drawing participating element information and the second event tracing information of each second drawing participating element information;
and obtaining a target loading interaction event having a drawing association relation with the at least one control drawing object according to the event tracing information.
6. The information flow interaction processing method based on cloud computing according to any one of claims 1 to 5, wherein the obtaining of the interaction event update information of the interaction window trajectory in the multi-object interaction information flow of the information flow node terminal includes:
acquiring a multi-object interactive information stream of the information stream node terminal, and performing independent movable window extraction processing on the multi-object interactive information stream to obtain independent movable window information of interactive window tracks in the multi-object interactive information stream, wherein the multi-object interactive information stream is an interactive information stream formed by object interactive information recorded by each interactive window track acquired based on a single interactive request;
carrying out interactive behavior tracking extraction based on the independent movable window information of the interactive window track to obtain target interactive behavior migration characteristics of the interactive window track;
extracting interactive content migration characteristics of the multi-object interactive information stream based on an artificial intelligence model to obtain interactive content migration characteristic information of the interactive window track;
and performing interactive linkage event synthesis on the target interactive behavior migration characteristic of the interactive window track in the multi-object interactive information stream and the interactive content migration characteristic information of the interactive window track to obtain interactive linkage event synthesis information of the interactive window track, and performing interactive event update on the interactive event record control of the multi-object interactive information stream based on the interactive linkage event synthesis information of the interactive window track to obtain interactive event update information of the interactive window track.
7. The information flow interaction processing method based on cloud computing according to claim 6, wherein the independent movable window extraction processing is performed on the multi-object interaction information flow to obtain independent movable window information of an interaction window track in the multi-object interaction information flow, and the method comprises:
acquiring an interactive graphic element set in an interactive graphic track recorded by a window service of each object interaction event in the multi-object interactive information stream, wherein the interactive graphic element set in the interactive graphic track comprises interactive graphic elements taking each interactive graphic track as an interaction area, and the interactive graphic elements comprise graphic interaction trigger information and graphic attribute information of the interactive graphic track and interactive graphic records in the interactive graphic track;
for each interactive graphic track, according to each content editing graphic in a plurality of content editing graphics in an updating graphic record of the interactive graphic track of each object interaction event, according to page interaction elements of the content editing interaction pages in the content editing graphics, determining whether each content editing interaction page in the content editing graphics is a reference target content editing interaction page, according to the number of the reference target content editing interaction pages in the content editing graphics, determining each reference interaction indication control corresponding to the content editing graphics, for each reference interaction indication control, dividing the reference interaction indication control into a plurality of sub-interaction indication controls, according to editing objects and preset object ranges of the content editing interaction pages in each sub-interaction indication control, determining whether the reference interaction indication control is a target interaction indication control, wherein each content editing interaction page corresponds to each content editing interaction behavior;
acquiring interaction window partition information of a preset interaction window rule matched with each content editing interaction page in the target interaction indication control, wherein the interaction window partition information comprises interaction window calling information and interaction window component information, and the preset interaction window rule comprises matching modes corresponding to different interaction window services;
determining an interactive window drawing attribute feature of each interactive window drawing attribute map and an interactive window precondition of each interactive window abstract map according to the interactive window partition information of each updated graphic record of each interactive graphic element set in the interactive graphic track; determining an interactive window label object of each object interaction event in the interactive graphic track according to the interactive window drawing attribute feature of each interactive window drawing attribute map and the interactive window precondition of each interactive window abstract map in the target interaction indication control; taking the features within the interactive window range of the interactive window label object, and the features outside the interactive window range of the interactive window label object that are associated with the interactive window label object, as the independent movable window features of each object interaction event in the interactive graphic track; and fusing the independent movable window features of each object interaction event in all interactive graphic tracks to obtain independent movable window information of an interaction window track in the multi-object interaction information stream.
8. The information flow interaction processing method based on cloud computing according to claim 6, wherein the step of performing interaction behavior tracking extraction based on the independent movable window information of the interaction window track to obtain the target interaction behavior migration feature of the interaction window track comprises:
acquiring interactive behavior request information of a user interaction behavior set added to the window presentation information of each independent movable window feature in the independent movable window information of the interaction window track, and determining a first interactive flow segment list corresponding to the interactive behavior request information, wherein the interactive behavior request information comprises interactive behavior result information of interactive behavior running information determined according to interactive behavior input information and interactive behavior output information of the user interaction behavior set, and the first interactive flow segment list comprises a sequence of a plurality of interactive flow segments of the interactive behavior result information;
determining window presentation information for each of the independent movable window features based on a first interactive behavior vector of the interactive behavior input information and a second interactive behavior vector of the interactive behavior output information;
determining migration analysis parameters for performing migration analysis on the first interactive flow segment list according to the interactive flow segment sequence relation of the first interactive behavior vector and the second interactive behavior vector;
performing migration analysis on the first interactive flow segment list based on the migration analysis parameters to obtain a second interactive flow segment list;
carrying out migration node positioning on the second interactive flow segment list to obtain a plurality of migration node positioning portions, and carrying out feature extraction on each migration node positioning portion to obtain migration node positioning features;
determining the interactive behavior migration feature of each independent movable window feature according to the interactive behavior migration features corresponding to the plurality of migration node positioning features of the second interactive flow segment list;
and obtaining the target interaction behavior migration feature of the interaction window track based on the interactive behavior migration feature of each independent movable window feature.
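A minimal sketch, under stated assumptions, of the migration analysis pipeline outlined in claim 8. The index reordering used as the "migration analysis parameters", the fixed-size node positioning windows, and the length-based node features are placeholders chosen only to make each step concrete; they are not the disclosed algorithm.

```python
# Hypothetical sketch of the migration-analysis steps in claim 8.
from typing import List


def migration_parameters(vec_in: List[float], vec_out: List[float]) -> List[int]:
    # Derive a reordering (the "migration analysis parameters") from the
    # pairwise relation between the input and output behavior vectors.
    deltas = [o - i for i, o in zip(vec_in, vec_out)]
    return sorted(range(len(deltas)), key=lambda k: deltas[k])


def apply_migration(segments: List[str], order: List[int]) -> List[str]:
    # Re-sequence the first interactive flow segment list into the second list.
    return [segments[k] for k in order if k < len(segments)]


def locate_migration_nodes(segments: List[str]) -> List[List[str]]:
    # Split the re-sequenced list into migration node "positioning portions"
    # (fixed-size windows here, purely for illustration).
    return [segments[i:i + 2] for i in range(0, len(segments), 2)]


def node_feature(portion: List[str]) -> float:
    # Extract one scalar feature per positioning portion.
    return sum(len(s) for s in portion) / max(len(portion), 1)


def track_migration_feature(per_window: List[List[float]]) -> List[float]:
    # Aggregate per-window migration features into the track-level target feature.
    return [sum(f) / len(f) for f in per_window if f]


if __name__ == "__main__":
    order = migration_parameters([0.1, 0.5, 0.3], [0.4, 0.2, 0.9])
    second = apply_migration(["open", "edit", "submit"], order)
    feats = [node_feature(p) for p in locate_migration_nodes(second)]
    print(second, feats, track_migration_feature([feats]))
```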
9. The information flow interaction processing method based on cloud computing according to claim 6, wherein the performing interaction event update on the interaction event recording control of the multi-object interaction information stream based on the interaction linkage event synthesis information of the interaction window track to obtain the interaction event update information of the interaction window track comprises:
acquiring interaction event information of the interaction window track under an interaction event recording control of the multi-object interaction information stream;
acquiring interaction event items under the interaction event information and event relation configuration information corresponding to each interaction event item;
and superimposing and configuring the interaction linkage event synthesis information of the interaction window track onto the event relation configuration information corresponding to each interaction event item to obtain the interaction event update information of the interaction window track.
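A minimal sketch, with assumed data shapes, of the overlay step in claim 9: the interaction linkage event synthesis information of the interaction window track is superimposed onto the event relation configuration information of each interaction event item to produce the interaction event update information. The dictionary keys are hypothetical.

```python
# Illustrative overlay of linkage-event synthesis information onto the
# per-item event relation configuration; keys and values are assumptions.
from typing import Any, Dict


def update_interaction_events(event_items: Dict[str, Dict[str, Any]],
                              linkage_synthesis: Dict[str, Any]) -> Dict[str, Dict[str, Any]]:
    # For each interaction event item, superimpose the linkage-event
    # synthesis information onto its event relation configuration.
    updated = {}
    for item, relation_config in event_items.items():
        merged = dict(relation_config)     # start from the existing configuration
        merged.update(linkage_synthesis)   # overlay the synthesis information
        updated[item] = merged
    return updated


if __name__ == "__main__":
    items = {"click_event": {"order": 1}, "drag_event": {"order": 2}}
    synthesis = {"window_track": "track_a", "linked": True}
    print(update_interaction_events(items, synthesis))
```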
10. A cloud computing verification interaction center, comprising a processor, a machine-readable storage medium, and a network interface, wherein the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is configured to be communicatively connected to at least one information flow node terminal, the machine-readable storage medium is configured to store a program, an instruction, or a code, and the processor is configured to execute the program, the instruction, or the code in the machine-readable storage medium to perform the cloud computing-based information flow interaction processing method according to any one of claims 1 to 9.
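A minimal structural sketch of the verification interaction center of claim 10, assuming a plain TCP socket stands in for the network interface and a Python callable stands in for the program, instruction, or code held on the machine-readable storage medium; the wiring is illustrative only and does not depict the claimed hardware.

```python
# Hypothetical wiring of a "verification interaction center": a handler
# (the stored program) executed by the processor and exposed over a socket
# (the network interface) to information flow node terminals.
import socket
from typing import Callable


class VerificationInteractionCenter:
    def __init__(self, handler: Callable[[bytes], bytes],
                 host: str = "0.0.0.0", port: int = 9000):
        self.handler = handler        # the "program, instruction, or code"
        self.host, self.port = host, port

    def serve_once(self) -> None:
        # Accept one connection from an information flow node terminal,
        # run the stored handler on the request, and return the result.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((self.host, self.port))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                request = conn.recv(4096)
                conn.sendall(self.handler(request))


if __name__ == "__main__":
    center = VerificationInteractionCenter(lambda req: b"processed:" + req)
    # center.serve_once()  # blocking call; uncomment to listen on port 9000
```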
CN202011417255.XA 2020-12-04 2020-12-04 Information flow interaction processing method based on cloud computing and cloud computing verification interaction center Active CN112433655B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011417255.XA CN112433655B (en) 2020-12-04 2020-12-04 Information flow interaction processing method based on cloud computing and cloud computing verification interaction center

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011417255.XA CN112433655B (en) 2020-12-04 2020-12-04 Information flow interaction processing method based on cloud computing and cloud computing verification interaction center

Publications (2)

Publication Number Publication Date
CN112433655A (en) 2021-03-02
CN112433655B CN112433655B (en) 2021-09-07

Family

ID=74691972

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011417255.XA Active CN112433655B (en) 2020-12-04 2020-12-04 Information flow interaction processing method based on cloud computing and cloud computing verification interaction center

Country Status (1)

Country Link
CN (1) CN112433655B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104700086A (en) * 2015-03-20 2015-06-10 清华大学 Excavating method of topic actions of man-machine interaction for video analysis
CN105574159A (en) * 2015-12-16 2016-05-11 浙江汉鼎宇佑金融服务有限公司 Big data-based user portrayal establishing method and user portrayal management system
CN106354862A (en) * 2016-09-06 2017-01-25 山东大学 Multidimensional individualized recommendation method in heterogeneous network
CN108268547A (en) * 2016-12-29 2018-07-10 北京国双科技有限公司 User's portrait generation method and device
CN109145909A (en) * 2017-06-27 2019-01-04 江苏华扬信息科技有限公司 A method of based on cloud computing identifying code
CN107992598A (en) * 2017-12-13 2018-05-04 北京航空航天大学 A kind of method that colony's social networks excavation is carried out based on video data
CN110688368A (en) * 2019-09-28 2020-01-14 武汉工程大学 Component behavior model mining method and device
US10771354B1 (en) * 2019-11-05 2020-09-08 LotusFlare, Inc. Digital platform for multiple network deployments
CN111079028A (en) * 2019-12-04 2020-04-28 上海财经大学 Collaborative filtering recommendation system and method based on multi-source auxiliary information
CN111626816A (en) * 2020-05-10 2020-09-04 石伟 Image interaction information processing method based on e-commerce live broadcast and cloud computing platform
CN111787081A (en) * 2020-06-21 2020-10-16 张伟 Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform
CN111930902A (en) * 2020-06-22 2020-11-13 合肥易知谷信息科技有限公司 Man-machine interaction method based on artificial intelligence
CN111931064A (en) * 2020-08-28 2020-11-13 张坚伟 Information analysis method based on big data and artificial intelligence and cloud service information platform

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FREIMUT BODENDORF ET AL.: "Mining Customer Opinions on the Internet - A Case Study in the Automotive Industry", 2010 Third International Conference on Knowledge Discovery and Data Mining *
李佳慧 et al.: "Research on the Construction of E-commerce User Portraits Based on Big Data" (基于大数据的电子商务用户画像构建研究), E-Commerce (电子商务) *

Also Published As

Publication number Publication date
CN112433655B (en) 2021-09-07

Similar Documents

Publication Publication Date Title
US11605226B2 (en) Video data processing method and apparatus, and readable storage medium
CN112184872B (en) Game rendering optimization method based on big data and cloud computing center
CN113326426A (en) Information pushing method and system based on big data positioning and artificial intelligence
CN101268505B (en) Method and system for classifying a video
CN110347872B (en) Video cover image extraction method and device, storage medium and electronic equipment
CN112464105A (en) Internet platform information pushing method based on big data positioning and cloud computing center
CN112434086B (en) Information flow mining method based on cloud computing and big data and cloud computing interaction center
CN112308627B (en) Advertisement data access method based on block chain and artificial intelligence and big data center
CN112329816A (en) Data classification method and device, electronic equipment and readable storage medium
US20240104006A1 (en) Method for automatically generating interactive test cases
WO2022262719A1 (en) Live streaming processing method and apparatus, storage medium, and electronic device
CN114661994B (en) User interest data processing method and system based on artificial intelligence and cloud platform
CN114004700A (en) Service data processing method and device, electronic equipment and storage medium
CN113051346A (en) Hot spot information processing method based on cloud computing and block chain financial cloud center
CN112164132B (en) Game compatible processing method based on big data and cloud computing center
CN112433655B (en) Information flow interaction processing method based on cloud computing and cloud computing verification interaction center
CN113765909A (en) Big data detection method and system for coping with intelligent education data wind control
CN115346145A (en) Method, device, storage medium and computer program product for identifying repeated video
CN112135175A (en) Advertisement pushing method and system based on big data and smart city and cloud platform
CN115049963A (en) Video classification method and device, processor and electronic equipment
CN112507214B (en) User name-based data processing method, device, equipment and medium
CN114462417A (en) Comment text processing method applied to big data and storage medium
CN114332716A (en) Method and device for clustering scenes in video, electronic equipment and storage medium
CN113297498A (en) Internet-based food attribute mining method and system
CN112286724B (en) Data recovery processing method based on block chain and cloud computing center

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Pan Chenglin

Inventor after: Liu Kehua

Inventor after: Zhou Zixin

Inventor after: Cui Xiufen

Inventor before: Cui Xiufen

GR01 Patent grant
TA01 Transfer of patent application right

Effective date of registration: 20210825

Address after: 430000 No. 101, building B1, phase 2-1, Wuhan Software New Town, No. 8, Huacheng Avenue, Donghu New Technology Development Zone, Wuhan, Hubei

Applicant after: Wuhan Maiyi Information Technology Co.,Ltd.

Address before: No. 669, Kegao Road, high tech Industrial Development Zone, Wuhua District, Kunming, Yunnan 650106

Applicant before: Cui Xiufen

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Information flow interactive processing method based on cloud computing and cloud computing verification interactive center

Effective date of registration: 20210928

Granted publication date: 20210907

Pledgee: Guanggu Branch of Wuhan Rural Commercial Bank Co.,Ltd.

Pledgor: Wuhan Maiyi Information Technology Co.,Ltd.

Registration number: Y2021420000108

PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20230925

Granted publication date: 20210907

Pledgee: Guanggu Branch of Wuhan Rural Commercial Bank Co.,Ltd.

Pledgor: Wuhan Maiyi Information Technology Co.,Ltd.

Registration number: Y2021420000108

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Information Flow Interaction Processing Method Based on Cloud Computing and Cloud Computing Verification Interaction Center

Effective date of registration: 20230927

Granted publication date: 20210907

Pledgee: Guanggu Branch of Wuhan Rural Commercial Bank Co.,Ltd.

Pledgor: Wuhan Maiyi Information Technology Co.,Ltd.

Registration number: Y2023980059417