CN112597420A - Method and device for realizing unified data management


Info

Publication number
CN112597420A
Authority
CN
China
Prior art keywords
data
target
scene
portrait
item
Prior art date
Legal status
Pending
Application number
CN202011567725.0A
Other languages
Chinese (zh)
Inventor
陈佳慧
白杨
Current Assignee
4Paradigm Beijing Technology Co Ltd
Original Assignee
4Paradigm Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by 4Paradigm Beijing Technology Co Ltd filed Critical 4Paradigm Beijing Technology Co Ltd
Priority to CN202011567725.0A priority Critical patent/CN112597420A/en
Publication of CN112597420A publication Critical patent/CN112597420A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/958Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present disclosure relates to a method and an apparatus for implementing unified data management. The method includes: for a target scene externally selected from at least one scene, creating a target scenario corresponding to the target scene, wherein a scene and scenarios are in a one-to-N relationship, N being a positive integer; for each target scenario, in the process of creating the target scenario, subscribing to target data corresponding to the target scenario from public data and allocating the target data to the target scenario, wherein the public data is collected raw data common to the at least one scene; and managing the target data through the target scenario.

Description

Method and device for realizing unified data management
Technical Field
The disclosed embodiments relate to the field of computer technologies, and in particular, to a method and an apparatus for implementing unified data management.
Background
For an operation optimization system that depends on data-driven decisions, sound data management is crucial: on the one hand, it greatly reduces the threshold and cost of using the product, and on the other hand, it provides strong support for subsequent service quality (such as the recommendation effect). Looking at the digital operation and marketing systems on the market, data management forms the foundation of such products.
At present, data production processes are often initiated ad hoc and at random, so data management suffers from poor uniformity.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide a new technical solution for implementing unified data management.
According to a first aspect of the present disclosure, there is provided a method for implementing unified data management, including: for a target scene externally selected from at least one scene, creating a target scenario corresponding to the target scene, wherein a scene and scenarios are in a one-to-N relationship, N being a positive integer; for each target scenario, in the process of creating the target scenario, subscribing to target data corresponding to the target scenario from public data and allocating the target data to the target scenario, wherein the public data is collected raw data common to the at least one scene; and managing the target data through the target scenario.
According to a second aspect of the present disclosure, there is also provided an apparatus for implementing unified data management, including:
the scenario creating module is used for creating, for a target scene externally selected from at least one scene, a target scenario corresponding to the target scene; wherein a scene and scenarios are in a one-to-N relationship, and N is a positive integer;
the data distribution module is used for subscribing, for each target scenario and in the process of creating the target scenario, to target data corresponding to the target scenario from public data and allocating the target data to the target scenario; wherein the public data is collected raw data common to the at least one scene;
and the data management module is used for managing the target data through the target scenario.
According to a third aspect of the present disclosure, there is also provided a system comprising at least one computing device and at least one storage device, wherein the at least one storage device is configured to store instructions for controlling the at least one computing device to perform the method according to the first aspect of the present disclosure.
According to a fourth aspect of the present disclosure, there is also provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to the first aspect of the present disclosure.
The method and the device construct, for any scene among multiple scenes, at least one scenario corresponding to that scene, allocate to the scenario, in the process of creating it, the corresponding raw data subscribed from the common data shared by the multiple scenes, and manage the allocated data with the scenario as the basic unit of data management, thereby implementing unified data management with better data management uniformity.
Other features of embodiments of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which is to be read in connection with the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the embodiments of the disclosure.
FIG. 1 is a schematic diagram of an implementation environment to which a method for implementing unified data management and a system component structure capable of implementing the method according to one embodiment can be applied;
FIG. 2 is a flow diagram of a method of implementing unified data management, according to one embodiment;
FIGS. 3-20 show schematic diagrams of interface effects of the present invention;
FIG. 21 is a flowchart illustration of a method of implementing unified data management, according to another embodiment;
FIG. 22 is a block schematic diagram of an apparatus to implement unified data management, according to one embodiment;
FIG. 23 is a hardware architecture diagram of a system implementing unified data management, according to one embodiment.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
One application scenario of the embodiments of the present disclosure is the unified management of data. In practice, the inventors found that when data production processes are initiated ad hoc and at random, data management uniformity is poor.
In view of the above technical problem, the inventors propose a method for implementing unified data management which takes a scenario as the basic unit of data management: for any scene among a plurality of scenes, at least one scenario corresponding to the scene is constructed, and the corresponding raw data subscribed from the common data is allocated to the scenario for management, so that better data management uniformity is achieved.
< Implementation Environment and Hardware Configuration >
Fig. 1 is a schematic diagram of the component structure of a system 100 for implementing unified data management (hereinafter referred to as the system 100), and the system 100 is capable of applying the method for implementing unified data management according to an embodiment. As shown in fig. 1, the system 100 includes a device 1000 for implementing unified data management (hereinafter referred to as the device 1000) and a client 2000, and the system 100 can be applied to the unified management of data.
The apparatus 1000 may exist in the form of a server. A server is a service point that provides processing, database, and communication facilities. The server may be a monolithic server, a distributed server spanning multiple computers, a computer data center, a cloud server, or a server cluster deployed in the cloud. Servers may be of various types, such as, but not limited to, a web server, a news server, a mail server, a message server, an advertisement server, a file server, an application server, an interaction server, a database server, or a proxy server. In some embodiments, each server may include hardware, software, or embedded logic components, or a combination of two or more such components, for performing the appropriate functions supported or implemented by the server. For example, the server may be a blade server or a cloud server, or may be a server group consisting of a plurality of servers, which may include one or more of the above server types.
In one embodiment, the apparatus 1000 may be as shown in fig. 1, including a processor 1100, a memory 1200, an interface apparatus 1300, and a communication apparatus 1400.
Processor 1100 is used to execute computer programs, which may be written in instruction sets of architectures such as x86, Arm, RISC, MIPS, SSE, and the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, various bus interfaces such as a serial bus interface (including a USB interface), a parallel bus interface, and the like. Communication device 1400 is capable of wired or wireless communication, for example.
As applied to the disclosed embodiments, the memory 1200 of the apparatus 1000 is used to store a computer program for controlling the processor 1100 of the apparatus 1000 to operate so as to implement the method of implementing unified data management according to any of the embodiments. A skilled person can design a computer program according to the solution of the embodiments of the present disclosure. How the computer program controls the processor to operate is well known in the art and will not be described in detail here.
Although multiple components of the device 1000 are shown in fig. 1, the present invention may relate to only some of them; for example, the device 1000 may involve only the memory 1200 and the processor 1100.
In this embodiment, the client 2000 is, for example, a desktop computer, a mobile phone, a portable computer, a tablet computer, a palmtop computer, and the like. As shown in fig. 1, the client 2000 may include a processor 2100, a memory 2200, an interface device 2300, a communication device 2400, a display device 2500, an input device 2600, a speaker 2700, a microphone 2800, and the like.
The processor 2100 may be a central processing unit CPU, a microprocessor MCU, or the like, for executing a computer program, which may be written in an instruction set of architectures such as x86, Arm, RISC, MIPS, SSE, or the like. The memory 2200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 2300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 2400 can perform wired or wireless communication, for example, the communication device 2400 may include at least one short-range communication module, for example, any module that performs short-range wireless communication based on a short-range wireless communication protocol such as a Hilink protocol, WiFi (IEEE 802.11 protocol), Mesh, bluetooth, ZigBee, Thread, Z-Wave, NFC, UWB, LiFi, and the like, and the communication device 2400 may also include a long-range communication module, for example, any module that performs WLAN, GPRS, 2G/3G/4G/5G long-range communication. The display device 2500 is, for example, a liquid crystal display, an LED display, a touch display, or the like. The input device 2600 may include, for example, a touch screen, a keyboard, and the like. The client 2000 may output an audio signal through the speaker 2700 and capture the audio signal through the microphone 2800.
As applied to the embodiments of the present disclosure, the memory 2200 of the client 2000 is used to store a computer program for controlling the processor 2100 of the client 2000 to operate, and providing support for implementing a method for implementing unified data management according to any of the embodiments. A skilled person can design a computer program according to the solution of the embodiments of the present disclosure. How the computer program controls the processor to operate is well known in the art and will not be described in detail here.
The network 3000 may be a wireless or a wired communication network, and may be a local area network or a wide area network. In the system 100 shown in fig. 1, the client 2000 and the device 1000 can communicate with each other via the network 3000. The networks 3000 connecting different clients 2000 to the device 1000 may be the same or different.
It should be understood that although fig. 1 shows only one device 1000 and one client 2000, this does not limit their respective numbers; the system 100 may include multiple devices 1000 and multiple clients 2000.
The system 100 shown in FIG. 1 is illustrative only and is not intended to limit the invention, its application, or uses in any way.
Various embodiments and examples according to the present invention are described below with reference to the accompanying drawings.
< Method Examples >
FIG. 2 is a flow diagram of a method of implementing unified data management, according to one embodiment. The main implementation body of the present embodiment is, for example, the apparatus 1000 in fig. 1.
In a possible implementation, the apparatus 1000 may include a service center station and a technology center station. The service center station is oriented to specific services; it encapsulates the technical services provided by the technology center station and provides corresponding services for different scenes, for example a search service scene, a push service scene, a recommendation service scene, and the like. The technology center station is technology-oriented and provides services to the service center station; for example, the technology center station specifically includes a recommendation center station, a search center station, a push center station, a data center station, and the like.
In this embodiment, a scene is a concept of the service center station. As a unit for providing a service, a scene may be understood as a service plan describing one service requirement, composed of three elements: service content (content), service type (type), and service layout (layout). A scene corresponds to a service position; for example, for a commodity recommendation scene on the home page of the app client, the scene elements are: a personalized recommendation service (service type) of commodities (service content) at a fixed feed-stream (service layout) position.
In detail, the data center station can provide functions such as data acquisition, data access, data management, data production, and data operation and maintenance. For example, the data center station can directly obtain raw public data from the client side (such as the client 2000 shown in fig. 1) through the data acquisition function; these data can be used to support the services provided by the above recommendation, search, and push scenes. After data acquisition is finished, data access can be performed, and data production and management are then carried out based on the accessed public data. After data production is completed, data task scheduling and management can be provided, runtime indices can be viewed, and anomalies can be monitored and warned about, thereby implementing data operation and maintenance.
Since the collected raw public data cannot be directly used by the underlying algorithm models of the scene services, in this embodiment the public data may be processed and computed during data production so as to be converted into application data. The application data can be used as direct input of the algorithm models and consumed by the algorithm model underlying each scene service.
Based on the above, in the present embodiment, each scenario corresponding to a scene is constructed, and the corresponding public data is allocated to the constructed scenario, so that data is produced and managed in units of scenarios.
In this embodiment, a scenario is a concept of the data center station. As a unit of application-data isolation management, a scenario may be understood as a data solution carrying a definite optimization problem, composed of elements such as a channel (channel), a page (page), a position (position), a target (goal), and a feedback period (delay). These five aspects are the basic elements of an optimization problem and are also the boundary of measurable scenario effect evaluation.
For example, consider a scenario example: through personalized delivery in the information flow (position) of the home page (page) of the APP (channel), the purchase rate (target) within a single day (feedback period) is improved. This scenario defines: channel: APP; page: home page; position: information flow; target: purchase rate; feedback period: a single day.
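As an illustrative sketch only (not part of the disclosure), the scene and scenario elements described above could be modeled as the following Python data structures; all names are assumptions made for illustration:

from dataclasses import dataclass

@dataclass
class Scene:
    # Service-center-station concept: a service plan for one service requirement.
    content: str  # service content, e.g. "commodity"
    type: str     # service type, e.g. "personalized recommendation"
    layout: str   # service layout, e.g. "fixed feed stream"

@dataclass
class Scenario:
    # Data-center-station concept: a data solution carrying one optimization problem.
    channel: str   # e.g. "APP"
    page: str      # e.g. "home page"
    position: str  # e.g. "information flow"
    goal: str      # e.g. "purchase rate"
    delay: str     # feedback period, e.g. "a single day"
    scene: Scene = None  # one scene may correspond to N scenarios

# The example scenario described above:
home_feed = Scenario("APP", "home page", "information flow", "purchase rate", "a single day",
                     Scene("commodity", "personalized recommendation", "fixed feed stream"))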
In detail, the raw public data may include at least three types, namely user data, material data, and behavior data, which are respectively as follows:
Based on the user data, unified user data in the enterprise operation process can be managed; in cooperation with the data acquisition and access components, the user data content and the meta-information (schema) are managed in a unified way.
Based on the material data, unified material data in the enterprise operation process can be managed; in cooperation with the data acquisition and access components, the material data content and the meta-information are managed in a unified way. Unlike user data, material data may include different types, such as commodities, information, and short videos.
Based on the behavior data, the behavior data generated during online and offline user interaction in the enterprise operation process can be managed; in cooperation with the acquisition and access components, the behavior data content and the meta-information are managed in a unified way. Behaviors can be classified into different types according to the channel in which they actually occur, such as online behaviors like clicking, browsing, and playing, and offline behaviors like trading, pushing, and ordering.
In detail, the application data includes at least four types, namely the (subscribed) data, the portrait, the sample, and the index, which are respectively as follows:
for data (which is subscribed data), for a specified type of service scene, pushing, recommending and searching scenes, material and behavior data are selected from original public data according to scene needs. And simultaneously, system data produced by a business system, such as a recommended service log, a material user updating request and the like, are also included.
The portrait is divided into the user portrait and the material portrait. In the operation of scenario applications, it is often necessary to directly process the raw data of users and materials, or to perform specified calculations on the behavior history, to generate the portrait attributes of users and materials; the processed data provides data support for scene services such as recommendation and search.
A sample is a data instance obtained by assembling features; it comprises two parts, the model features and the sample label, and is used for training the model. In the process of modeling and model optimization, sample production is an indispensable data preparation step. The sample label is derived backwards from the modeling behavior target, and the sample is then produced by joining the associated features. The samples are supplied to the recommendation center station, the search center station, and the push center station for model training.
For the index: in the whole process of channel operation, experience, and growth, the user behavior data needs to be calculated and diagnosed on dashboards in each dimension. Therefore, a unified index template and extension mechanism can be provided, which helps scenario builders generate indices quickly.
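Purely as a hedged sketch (assumed names, Python), the public-data and application-data categories above might be represented as follows:

from enum import Enum

class PublicDataType(Enum):       # raw data common to all scenes
    USER = "user"
    MATERIAL = "material"
    BEHAVIOR = "behavior"

class ApplicationDataType(Enum):  # data managed per scenario
    SUBSCRIBED = "subscribed"     # data selected from the public data for the scenario
    PORTRAIT = "portrait"         # user portrait / material portrait
    SAMPLE = "sample"             # model features + label, used for training
    INDEX = "index"               # metrics for dashboard diagnosis

# Portraits, samples, and indices are produced from the subscribed data,
# so each scenario manages its own application data in isolation.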
Based on the above, as shown in fig. 2, the method for implementing unified data management of the present embodiment may include the following steps S201 to S203:
Step S201, for a target scene externally selected from at least one scene, creating a target scenario corresponding to the target scene; wherein a scene and scenarios are in a one-to-N relationship, and N is a positive integer.
As described above, there may be various scenes, such as a search service scene, a push service scene, and a recommendation service scene. The user can request to create a scenario for any scene, and, as regards scenario creation, any scene can correspond to a plurality of scenarios; that is, scenes and scenarios are in a one-to-many relationship.
For example, taking the recommendation service as an example, if the scene only considers a single optimization problem, such as only optimizing the purchase rate of commodities, then one scene may correspond to one scenario. A scene with multiple targets (e.g. simultaneously optimizing the click rate and the viewing duration of a short-video recommendation service) can be associated with multiple scenarios, for example using the data of two scenarios for scene optimization.
In an embodiment of the present disclosure, to illustrate a possible implementation of creating a scenario, step S201 of creating, for a target scene externally selected from at least one scene, a target scenario corresponding to the target scene includes steps S2011 to S2012:
Step S2011, in response to a new scenario request, determining a target scene externally selected from at least one scene, and obtaining target scenario configuration information.
In detail, the new scenario request may be issued by the user, for example by triggering a corresponding function button provided on the interactive interface. When creating a new scenario, the user can select the scene to which the scenario corresponds, and the corresponding target scenario configuration information can then be obtained.
Step S2012, a target scenario corresponding to the target scene is created according to the target scenario configuration information.
After the scenario configuration information is obtained, a corresponding scenario can be created accordingly; the created scenario corresponds to the scene selected by the user, so that the selected scene can be optimized based on the created scenario.
Furthermore, after the scenario configuration information is obtained, the corresponding public data can be subscribed to accordingly and allocated to the created scenario, so that the corresponding data can be managed on a per-scenario basis. Based on this, in step S202, subscribing to target data corresponding to the target scenario from public data includes: subscribing, according to the target scenario configuration information, to target data corresponding to the target scenario from the public data.
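The creation-and-subscription flow of steps S2011-S2012 and S202 can be sketched roughly as follows; the helper names (PublicDataStore, subscribe, create_scenario) are illustrative assumptions, not an interface defined by the disclosure:

from typing import Dict, List

class PublicDataStore:
    # Holds the collected raw data common to all scenes (assumed helper).
    def subscribe(self, material_library: str, behavior_types: List[str]) -> Dict:
        # Select the material and behavior data the scenario needs from the common pool.
        return {"materials": f"all materials of {material_library}",
                "behaviors": behavior_types}

def create_scenario(scene: str, config: Dict, store: PublicDataStore) -> Dict:
    # S2011/S2012: create a target scenario for the externally selected scene,
    # then S202: subscribe and allocate its target data during creation.
    scenario = {"scene": scene, "config": config}
    scenario["data"] = store.subscribe(config["material_library"],
                                       config["behavior_types"])
    return scenario

# Example: a commodity recommendation scene optimizing the purchase rate.
cfg = {"material_library": "material library 32",
       "behavior_types": ["expose", "click", "add_to_cart", "create_order", "purchase"]}
scenario_2 = create_scenario("commodity recommendation", cfg, PublicDataStore())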
In detail, a scenario may be created in at least two ways: personalized creation, and creation using an existing scenario as a template. The two creation methods are described below.
For the implementation of personalized scenario creation:
In an embodiment of the present disclosure, to illustrate a possible implementation of creating a scenario in a personalized manner, referring to fig. 3, step S2011 of determining, in response to a new scenario request, a target scene externally selected from at least one scene and obtaining target scenario configuration information includes steps S20111 to S20113:
Step S20111, in response to the new scenario request, providing a personalized creation item for each of the at least one scene.
As shown in fig. 3, in response to the new scenario request, personalized creation items of three scenes are provided among the type options: recommendation personalization, push personalization, and search personalization.
Step S20112, in a case where the personalized creation item of the target scene is triggered externally, providing a scenario creation configuration item corresponding to the target scene; wherein the target scene is any one of the at least one scene.
Among the provided personalized creation items of the scenes, the user selects the personalized creation item of the scene for which a scenario is to be created. An interface of the scenario creation configuration items corresponding to that scene can then be provided, so that the user can configure the scenario as required.
Typically, the scenario creation configuration items may be provided by way of a display interface such as that shown in fig. 4. As shown in fig. 4, the provided scenario creation configuration items may include a delivery channel configuration item, an optimization target configuration item, a feedback period configuration item, and a material library configuration item. Furthermore, clicking the advanced settings button shown in fig. 4 can also associate the behavior characteristics that the scenario needs to subscribe to; for example, as shown in fig. 5, the provided scenario creation configuration items may further include a behavior type configuration item, a behavior channel configuration item, and an interaction field configuration item.
Based on the above, in an embodiment of the present disclosure, please refer to fig. 4 and fig. 5, where the scenario creation configuration item includes at least one of a delivery channel configuration item, an optimization target configuration item, a feedback period configuration item, a material library configuration item, a behavior type configuration item, a behavior channel configuration item, and an interaction field configuration item.
Referring to fig. 4, the specific contents of the delivery channel configuration item, the optimization target configuration item, the feedback period configuration item, and the material library configuration item may be as follows:
Firstly, the delivery channel and content delivery position are used for locating the scene optimized by the scenario; the content delivery position comprises the channel client (APP, applet, Web, and the like) and the delivery column.
Secondly, the optimization target is the target delivery information selected for the recommendation, search, and push scenes, including the click rate, purchase rate, open rate, churn rate, and the like.
Thirdly, the feedback period is the time window for behavior target feedback.
Fourthly, the material library selection selects the material library that the associated scenario needs to use.
Referring to fig. 5, the specific contents of the behavior type configuration item, the behavior channel configuration item, and the interaction field configuration item may be as follows:
Firstly, the behavior type selects all behaviors on the behavior path for achieving the scenario optimization goal. If the optimization goal of the scenario is purchase, then the behavior path (expose, click, add to shopping cart, create order, purchase) needs to be added to the behavior types of the scenario.
Secondly, the behavior channel selects the channels in which the behaviors of interest are generated; multiple selection is supported, including iOS, Android, mp, and others.
Thirdly, the interaction column selects all positions where the behaviors are generated, such as the commodity list page, the commodity detail page, the shopping cart detail page, and the order payment page.
In this embodiment, based on the associated configuration of the behavior data, the corresponding behavior data can be allocated to the scenario (see the sketch below), thereby avoiding redundant management of the entire behavior data and reducing the application data management cost.
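A hedged sketch of the configuration that might be collected through the items of fig. 4 and fig. 5 and obtained in step S20113 below (all field names are assumptions):

scenario_config = {
    # fig. 4: basic configuration items
    "delivery_channel": "APP",
    "delivery_position": "home page feed",
    "optimization_target": "purchase rate",
    "feedback_period": "1 day",
    "material_library": "material library 32",
    # fig. 5: advanced (behavior subscription) configuration items
    "behavior_types": ["expose", "click", "add_to_cart", "create_order", "purchase"],
    "behavior_channels": ["iOS", "Android"],
    "interaction_fields": ["commodity list page", "commodity detail page",
                           "shopping cart detail page", "order payment page"],
}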
Step S20113, obtaining the target scenario configuration information through the scenario creation configuration items provided in step S20112. As described above, on-demand configuration can be performed based on the scenario creation configuration items, thereby obtaining the scenario configuration information.
For the implementation of creating a scenario using an existing scenario as a template:
In an embodiment of the present disclosure, to illustrate a possible implementation of creating a scenario with an existing scenario as a template, referring to fig. 6, step S2011 of determining, in response to a new scenario request, a target scene externally selected from at least one scene and obtaining target scenario configuration information includes steps S20114 to S20117:
step S20114, in response to the new scenario request, provides a scenario template creation item.
Referring to fig. 3 and fig. 6, in response to the new scenario request, among the type options, a scenario template creation item may also be provided in addition to the personalized creation items of the above three scenes.
Step S20115, in case the scenario template creation item is externally triggered, providing at least one constructed scenario.
As shown in fig. 6, when a user wishes to create a scenario using an existing scenario as a template, the type "existing scenario as template" may be selected. The existing scenario can be any scenario that has already been created. In this way, the constructed scenarios can be provided.
Step S20116, determining a first scenario externally selected from the at least one constructed scenario, and taking the scene corresponding to the first scenario as the target scene.
Step S20117, obtaining the target scenario configuration information according to the scenario configuration information of the first scenario.
In detail, when a scenario is created using an existing scenario as a template, no additional configuration is needed; by default, the behavior data is processed according to the corresponding schema and calculation logic in effect when the template scenario was saved.
In this embodiment, the precipitation and reuse of scenario templates can be supported, so as to accumulate the client's data assets and improve the efficiency of implementing related projects, such as AI (Artificial Intelligence) projects.
After creating a scenario, the user may also view the constructed scenario as desired. Based on this, in an embodiment of the present disclosure, please refer to fig. 7; the method further includes step S214:
Step S214, in response to a scenario display request, providing a scenario list, wherein the scenario list includes a scenario presentation item and a scenario operation item of the target scenario.
As shown in fig. 7, in response to the scenario display request, a scenario list may be provided. Based on the provided scenario list, the user may view information of the constructed scenarios; specifically, there may be scenario presentation items and scenario operation items.
In an embodiment of the present disclosure, please refer to fig. 7, the scenario presentation item includes: at least one of a number presentation item, a name presentation item, a status presentation item, a type presentation item, an update time presentation item, and a create time presentation item.
As shown in fig. 7, the scenario list may have display items such as the number, name, status, type, update time, and creation time of each constructed scenario.
In one embodiment of the present disclosure, the scenario operation item includes: a start-stop operation item, a publish-to-scenario-library operation item, a setting operation item, and a delete operation item.
As shown in fig. 7, the scenario list may provide operation items such as start, stop, publish to the scenario library, setting, and deletion for each constructed scenario.
For example, if the user clicks the delete operation item of a certain scenario, the application data corresponding to that scenario is deleted; for example, the portraits corresponding to the scenario and the services corresponding to those portraits can be cleaned up, and the indices corresponding to the scenario and the services corresponding to those indices can be cleaned up, but a public portrait derived from the scenario is not destroyed.
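A minimal sketch of this deletion semantics, assuming the scenario records which of its portraits have been exported to the public portrait pool (all helpers are placeholders):

def stop_services(services):  # placeholder: stop the services depending on the item
    pass

def drop(item):               # placeholder: remove the stored data of the item
    pass

def delete_scenario(scenario: dict) -> None:
    # Clean the scenario's application data, keeping portraits already made public.
    for portrait in scenario.get("portraits", []):
        if portrait.get("exported_to_public"):
            continue  # a public portrait derived from the scenario is not destroyed
        stop_services(portrait["services"])
        drop(portrait)
    for index in scenario.get("indices", []):
        stop_services(index["services"])
        drop(index)
    scenario.clear()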
Based on the above, in one embodiment of the present disclosure, the scenario operation item includes a start-stop operation item; the method further comprises step S215:
Step S215, in response to an external trigger of the start-stop operation item of the target scenario, in the process of starting the target scenario, executing the step of subscribing, according to the target scenario configuration information, to the target data corresponding to the target scenario from the public data, and allocating the target data to the target scenario.
In this embodiment, after a scenario is created, the scenario may be started automatically or manually by the user, or a previously stopped scenario may be started manually by the user. After the scenario is started, the corresponding public data can be subscribed to according to the corresponding scenario configuration information, and the subscribed public data can then be allocated to the scenario.
Based on the above, in one embodiment of the present disclosure, the scenario operation item includes a start-stop operation item; the method further comprises step S216:
Step S216, in response to an external trigger of the start-stop operation item of the target scenario, in the case that the target scenario is stopped, stopping the step of subscribing, according to the target scenario configuration information, to the target data corresponding to the target scenario from the public data and allocating the target data to the target scenario.
Corresponding to the above scenario start, in this embodiment, for a started scenario, the user may manually stop it. After the scenario is stopped, the corresponding public data is no longer subscribed to according to the corresponding scenario configuration information, and the corresponding data is no longer allocated to the scenario.
In one embodiment of the present disclosure, the scenario operation item includes a publish-to-scenario-library operation item; the method further comprises step S216:
Step S216, in the case that the publish-to-scenario-library operation item of the target scenario is triggered externally, publishing the target scenario to the scenario library.
In this embodiment, for any constructed scenario, the publish-to-scenario-library operation item is triggered by the user; for example, when viewing the scenario list, for a certain scenario displayed on the interface, the user can publish that scenario to the scenario library by triggering its publish-to-scenario-library operation item. Publishing a scenario to the scenario library may, for example, make it convenient for users to quickly view the scenario in the scenario library.
As shown in fig. 7, if a scenario has been published to the scenario library, this may be displayed correspondingly in the scenario list; specifically, "published to scenario library" may be displayed for the scenario.
Based on the above, after any scenario is created, the following step S202 may be performed to allocate the corresponding public data to the created scenario.
Step S202, for each target scenario created in step S201, in the process of creating the target scenario, subscribing to target data corresponding to the target scenario from the public data, and allocating the target data to the target scenario; wherein the public data is the raw data collected for common use by the at least one scene.
In detail, in the process of creating a scenario, the creation of the scenario is supported by subscribing to the corresponding raw public data and allocating the subscribed data to the scenario. The subscribed data is data for the scenario, so it can serve as one kind of application data of the scenario, and the other application data of the scenario, such as the portraits, samples, and indices mentioned above, can be constructed based on it.
In addition, on the basis of a created scenario, by setting the related configuration items, the corresponding raw public data can be subscribed to again according to those configuration items, and the newly subscribed data can be allocated to a scenario so as to support the creation of a new scenario; that is, a new scenario can be created using the existing scenario as a scenario template.
In detail, the raw public data collected from the client side can be commonly used by the services of a plurality of scenes (such as recommendation, search, and push scenes) through the scenario subscription mechanism. In this embodiment, subscribing to the corresponding part of the raw public data through scenario subscription helps reduce data redundancy and improves scenario building efficiency.
As can be seen from the above, the scenario configuration information may be obtained through the scenario creation configuration items, and the scenario is then created according to the scenario configuration information. The scenario creation configuration items may include an optimization target configuration item, so the optimization target of the service scene can be configured accordingly, allowing one scene to be associated with one or more scenarios.
In this embodiment, the corresponding material data, behavior data, and the like may be selected from the raw public data according to the optimization target in the scenario configuration information and allocated to the corresponding scenario, so as to implement data management in units of scenarios.
For example, as shown in table 1 below, assume that the personalized commodity recommendation scenarios whose optimization targets are the click rate and the purchase rate correspond to scenario 1 and scenario 2, respectively. First, scenario 1 and scenario 2 may each be allocated all material information in material library 32 (which may be configured through the material library configuration item mentioned above). Then, according to the optimization targets, the required exposure and click behavior data are allocated to scenario 1, and the required exposure, click, add-to-shopping-cart, create-order, order-payment, and other behavior data are allocated to scenario 2.
TABLE 1
Scenario     Optimization target   Material data                           Behavior data
Scenario 1   Click rate            All materials in material library 32    Expose, click
Scenario 2   Purchase rate         All materials in material library 32    Expose, click, add to shopping cart, create order, order payment
It can be seen that part of the public data can be used in common among different scenarios of the same scene. In addition, when extending to different types of scenes, part of the application data can also be used in common among different scenarios.
In addition, besides the allocation of the material data and the behavior data, the system data produced by the business system, such as recommendation service logs and material/user update requests, can also be allocated to the corresponding scenarios as needed.
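To make the allocation of table 1 concrete, the behavior subscription could be sketched as follows; the mapping and helper names are assumptions for illustration only:

# Behavior types on the behavior path of each optimization target (assumed mapping).
TARGET_BEHAVIORS = {
    "click rate":    ["expose", "click"],
    "purchase rate": ["expose", "click", "add_to_cart", "create_order", "order_payment"],
}

def allocate_to_scenario(scenario_id: str, target: str, behavior_log: list) -> dict:
    # Allocate all of material library 32 plus only the behaviors the target needs.
    wanted = set(TARGET_BEHAVIORS[target])
    return {"scenario": scenario_id,
            "materials": "all materials in material library 32",
            "behaviors": [b for b in behavior_log if b["type"] in wanted]}

log = [{"type": "expose"}, {"type": "click"}, {"type": "create_order"}]
scenario_1 = allocate_to_scenario("scenario 1", "click rate", log)     # expose + click only
scenario_2 = allocate_to_scenario("scenario 2", "purchase rate", log)  # full purchase path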
Based on the above, after the data is allocated to the scenario, the following step S203 is executed to implement the unified data management in units of scenarios.
Step S203, after the step S202, managing the target data by the target scenario.
As mentioned above, the subscribed raw public data cannot be directly used by the underlying algorithm model of each scene service, so in this step the public data can be processed and computed during data production to convert it into application data. The application data can be used as direct input of the algorithm models and consumed by the algorithm model underlying each scene service.
In this embodiment, the scenario is used as the basic unit for managing application data, and the production process and the production results of the application data are managed in a unified manner. For example, based on the subscribed data, the configuration, construction, and management of user portraits, material portraits, scenario indices, and algorithm optimization samples can be provided around the business scene, a data access interface is opened, and the extension of calculation methods is supported.
Based on this, in an embodiment of the present disclosure, step S203 of managing the target data through the target scenario includes steps S2031 to S2032:
Step S2031, constructing, according to the target data, the application data corresponding to the target scenario.
In this step, based on the corresponding subscribed data allocated to the scenario, data processing may be performed to construct the application data corresponding to the scenario, such as portraits, samples, and indices.
Step S2032, providing the application data to the model corresponding to the target scenario for processing.
In this step, the corresponding model may be processed based on the application data of the scenario. For example, the model of the scene corresponding to the scenario may be optimized by using application data such as the samples of the scenario. In this way, the application data is produced from the subscribed public data, the produced application data further optimizes the model, and when the optimized model is used to provide the corresponding business service for the client, the service can be provided more accurately.
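Steps S2031-S2032 can be sketched end to end as follows; the sample-labeling rule and the model interface are assumptions used only to illustrate the flow:

def build_application_data(subscribed: dict) -> dict:
    # S2031: turn the scenario's subscribed data into application data (sketch).
    behaviors = subscribed["behaviors"]
    return {
        "samples": [  # label derived backwards from the modeling behavior target
            {"features": b, "label": 1 if b["type"] == "order_payment" else 0}
            for b in behaviors
        ],
        "indices": {"behavior_count": len(behaviors)},
    }

class RecommendationModel:  # assumed model interface
    def train(self, samples):
        print(f"training on {len(samples)} samples")

def manage_scenario_data(scenario: dict, model: RecommendationModel) -> None:
    app_data = build_application_data(scenario["data"])  # S2031
    model.train(app_data["samples"])                     # S2032

subscribed = {"behaviors": [{"type": "expose"}, {"type": "click"}, {"type": "order_payment"}]}
manage_scenario_data({"data": subscribed}, RecommendationModel())  # training on 3 samples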
As can be seen from the above, the application data may be a portrait, a sample, an index, and the like. Based on this, in one embodiment of the present disclosure, the application data includes any one or more of the portrait, sample, and index types; step S2031 of constructing, according to the target data, the application data corresponding to the target scenario includes steps S20311 to S20313:
Step S20311, in response to a new-creation request for any type of application data of the target scenario, displaying a first configuration item corresponding to that type of application data.
In this step, the user issues a new-creation request corresponding to the application data that needs to be created. To facilitate setting the configuration information by the user, the corresponding configuration item can then be displayed.
Step S20312, determining data configuration information externally set by the first configuration item.
In this step, the configuration item is configured as required by the user, and corresponding configuration information can be obtained.
Step S20313, processing the target data according to the data configuration information, and constructing the application data of that type corresponding to the target scenario.
In this step, corresponding application data may be constructed based on the configuration information.
The construction of three types of application data, namely portraits, samples, and indices, is described below.
For the portrait type of application data, please refer to fig. 8-11 and fig. 14-15. In one embodiment of the present disclosure, the application data includes the portrait, and the first configuration item includes at least one of a portrait name configuration item, at least one portrait type configuration item, and a portrait description configuration item.
In detail, in the operation of a scenario application, it is often necessary to directly process the raw data of users and materials, or to perform specified calculations on the behavior history, to generate the portraits of users and materials. The user portrait and the material portrait can be directly used as part of the sample data and provide data support for scene services such as recommendation and search.
In one embodiment of the present disclosure, the portrait includes a user portrait and a material portrait. The configuration items corresponding to the user portrait and the material portrait are described below.
In this embodiment, when the user requests to generate user-portrait application data, a page as shown in any one of fig. 8 to 11 may be displayed. Referring to fig. 8-11, when the application data is a user portrait, the first configuration item may include the name, type, description, and the like of the user portrait, wherein the type of the user portrait can be configured as a basic, statistical, model, or public portrait as required.
In this embodiment, when the user requests to generate material-portrait application data, a page as shown in fig. 14 or fig. 15 may be displayed. Referring to fig. 14-15, when the application data is a material portrait, the first configuration item may include the name, type, description, and the like of the material portrait, wherein the type of the material portrait can be configured as a basic, statistical, or public portrait as required.
In this regard, in one embodiment of the present disclosure, please refer to fig. 8-10: the application data includes the portrait, the first configuration item comprises at least one portrait type configuration item, and step S20312 of determining the data configuration information externally set through the first configuration item includes steps S203121 to S203123:
Step S203121, determining a target portrait type configuration item externally selected from the at least one portrait type configuration item.
As shown in fig. 8-11, on the display page of the first configuration item corresponding to the user portrait, a plurality of portrait type configuration items may be displayed, such as basic feature, statistical feature, model tag, and public portrait, and the user may click any type as required to generate a user portrait of that type.
As shown in fig. 14-15, on the display page of the first configuration item corresponding to the material portrait, a plurality of portrait type configuration items may be displayed, such as basic feature, statistical feature, and public portrait, and the user may click any type as required to generate a material portrait of that type.
Step S203122, providing a second configuration item corresponding to the target portrait type configuration item.
In this step, based on the portrait type selected by the user, the corresponding second configuration item may be displayed. The second configuration items corresponding to the user portrait and the material portrait are described below.
For the second configuration item of the user portrait:
As described above, the user portrait types may include the basic feature, statistical feature, model tag, and public portrait types; these four types are described below.
For the basic feature portrait type:
The basic features are static, fact-based attributes of the user, such as the user ID, IP, age, and gender. Thus, the corresponding configuration items may include the following field and method items:
(1) Field: selects an attribute of the user, such as "IP".
(2) Method: selects the method acting on the attribute, such as "region information extraction"; user-defined extraction methods are supported, and the UDFs (User-Defined Functions) are uniformly managed as system functions.
As shown in fig. 8, assuming that the type currently clicked by the user is the basic type, the second configuration items corresponding to this type, namely the field and method items, are displayed.
For the statistical feature portrait type:
The statistical features are features obtained by statistics over the user behaviors. Thus, the corresponding configuration items may include the behavior type, attribute, time window, calculation method, and service filtering items described below:
(1) Behavior type: the behavior to be accumulated, such as "purchase behavior".
(2) Attribute: the attribute to be accumulated, which may be an attribute field in the user, material, or behavior data.
(3) Time window, i.e. calculation period.
(4) Calculation method: user-defined UDAF (User-Defined Aggregate Function) extraction methods are supported and uniformly managed as system functions; three aggregation methods are built in, namely accumulation, counting, and segmented counting.
(5) Service filtering: in the calculation of exposure behavior data, determines whether behaviors generated by the paradigm service are counted, for example with the three options "paradigm service only (which can be understood as a specific service)", "non-paradigm service", and "all".
For example, one statistical feature may be: within 3 days (time window), the sum (calculation method) of the purchase price (attribute) of purchase behaviors (behavior type).
As shown in fig. 9, assuming that the type currently clicked by the user is the statistical type, the second configuration items corresponding to this type, namely the behavior type, attribute, time window, calculation method, and service filtering items, are displayed.
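A hedged sketch of how the statistical feature in the example above (the 3-day sum of the purchase price of purchase behaviors) might be computed from the subscribed behavior data; the record layout is an assumption:

from datetime import datetime, timedelta

def statistical_feature(behaviors, behavior_type, attribute, window_days, now=None):
    # Accumulate `attribute` over behaviors of `behavior_type` inside the time window.
    now = now or datetime.now()
    start = now - timedelta(days=window_days)
    return sum(b[attribute] for b in behaviors
               if b["type"] == behavior_type and b["time"] >= start)

behaviors = [
    {"type": "purchase", "price": 30.0, "time": datetime.now() - timedelta(days=1)},
    {"type": "purchase", "price": 50.0, "time": datetime.now() - timedelta(days=5)},  # outside window
    {"type": "click",    "price": 0.0,  "time": datetime.now()},
]
purchase_sum_3d = statistical_feature(behaviors, "purchase", "price", window_days=3)  # 30.0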
For the model tag portrait type:
This type is used to connect an external custom script/calculation process and generate the user portrait result. The related configuration functions are provided, and a USER_INTEREST calculation script is built in. Scripts involving offline calculation may have their run time configured at the back end of the system.
For example: with the built-in USER_INTEREST script, it is calculated that the type of goods (material attribute) most likely to be purchased (behavior type) by the user is sports.
As shown in fig. 10, assuming that the type currently clicked by the user is the model type, the second configuration items corresponding to this type, namely the behavior type, material attribute, and calculation method items, are displayed.
For the public portrait type:
This type is used to select portrait attributes that have already been exported to a public portrait. For example, a copy that can be used in the present scenario is created for the portrait, and meta-information such as a description and a name is attached in the current scenario, so that the portrait description information of the current scenario can be modified.
Referring to fig. 11, assuming that the user currently clicks the type of public portrait, the corresponding presentation interface may be as shown in fig. 11.
Based on the foregoing, in one embodiment of the present disclosure, please refer to FIGS. 8-11; the portrait includes a user portrait;
the at least one portrait type configuration item comprises at least one of a basic feature type configuration item, a statistical feature type configuration item, a model tag type configuration item and a public portrait configuration item;
in the case that the target portrait type configuration item is a basic feature type configuration item, the second configuration item comprises at least one of a field configuration item and an extraction method configuration item;
under the condition that the target portrait type configuration item is a statistical feature type configuration item, the second configuration item comprises at least one of a behavior type configuration item, a calculation attribute configuration item, a time window configuration item, a calculation method configuration item and a service screening configuration item;
and under the condition that the target portrait type configuration item is a model tag type configuration item, the second configuration item comprises at least one of a calculation method configuration item, a behavior type configuration item and a material attribute configuration item.
For the second configuration item of the material portrait:
As described above, the material portrait may be of the basic feature, statistical feature, or public portrait type; these three types are described below.
For the basic feature portrait type:
The corresponding configuration items may include the following field and method items:
(1) Field: selects a static attribute of the material, such as the material ID, type, title, or URL (Uniform Resource Locator).
(2) Method: selects the method acting on the attribute, such as "keyword extraction"; user-defined UDF extraction methods are supported, and the UDFs are uniformly managed as system functions.
As shown in fig. 14, assuming that the type currently clicked by the user is the basic type, the second configuration items corresponding to this type, namely the field and method items, are displayed.
For the statistical feature portrait type:
The corresponding configuration items may include the behavior type, cumulative amount, time window, and service filtering items described below:
(1) Behavior type: the target behavior type to be accumulated; only a single selection is allowed.
(2) Cumulative amount: the behavior field to be accumulated; for example, the duration can be used to accumulate the playing time of the play behavior, and the amount can be used to accumulate the number of purchases.
(3) Time window: the statistical time interval, which may be, for example, the full history, a recent time range, or a range bounded by behavior (e.g., "the last 100 exposures").
(4) Service filtering: in the calculation of exposure behavior data, it is necessary to determine whether the behavior was generated by the paradigm service, with the options "paradigm service only", "non-paradigm service", and "all".
In addition, as the attributes of users, materials, and behaviors in the public data change (e.g., increase) and the update is published, the corresponding selectable items in the portrait configuration increase accordingly. For example, if region information (geolocation) is added to the reported material data, a region field is added to the selectable material attributes in the portrait.
Referring to fig. 15, assuming that the type currently clicked by the user is the statistical type, the second configuration items corresponding to this type are displayed; the display interface may be as shown in fig. 15.
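A hedged sketch of a material statistical feature whose time window is bounded by behavior count ("the last 100 exposures"), as described above; the field names and window semantics are assumptions:

def accumulate_over_last_exposures(behaviors, material_id, field, last_n=100):
    # Sum `field` over a material's behaviors occurring since its last N exposures.
    mine = [b for b in behaviors if b["material_id"] == material_id]
    exposure_seqs = [b["seq"] for b in mine if b["type"] == "expose"][-last_n:]
    if not exposure_seqs:
        return 0
    window_start = exposure_seqs[0]
    return sum(b.get(field, 0) for b in mine if b["seq"] >= window_start)

behaviors = [
    {"material_id": "m1", "type": "expose", "seq": 1},
    {"material_id": "m1", "type": "play",   "seq": 2, "duration": 12},
    {"material_id": "m1", "type": "expose", "seq": 3},
    {"material_id": "m1", "type": "play",   "seq": 4, "duration": 30},
]
play_duration = accumulate_over_last_exposures(behaviors, "m1", "duration")  # 42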
For the public portrait type:
this type is used to select portrait attributes that have already been saved to the public portrait. A copy usable in the present scene is created for the selected portrait attribute, and meta-information such as its description and name is attached to the current scene, so that the portrait description information of the current scene can be modified.
Referring to FIGS. 14-15, the user may also click the public portrait type to generate a material portrait of that type.
Based on the foregoing, in one embodiment of the present disclosure, please refer to fig. 14-15, wherein the portrait includes a material portrait;
the at least one portrait type configuration item comprises at least one of a basic feature type configuration item, a statistical feature type configuration item and a public portrait configuration item;
in the case that the target portrait type configuration item is a basic feature type configuration item, the second configuration item comprises at least one of a field configuration item and an extraction method configuration item;
under the condition that the target portrait type configuration item is a statistical feature type configuration item, the second configuration item comprises at least one of a behavior type configuration item, an accumulation amount configuration item, a time window configuration item and a service screening configuration item;
in the event that the target portrait type configuration item is a public portrait configuration item, the second configuration item comprises a user portrait attribute selection configuration item.
Step S203123, determining the data configuration information externally set by the second configuration item.
In this step, the user can set the second configuration item as desired, so that the corresponding data configuration information can be obtained, and further, the corresponding portrait can be constructed based on the data configuration information.
After the representation is built, the user can also view the built representation as desired. Based on this, in one embodiment of the present disclosure, please refer to fig. 12 and 16, the application data includes a portrait; the method further comprises step S204:
Step S204, in response to a portrait viewing request for the target scene, providing a portrait attribute list; wherein the portrait attribute list comprises: a portrait attribute display item and a portrait attribute operation item for each target portrait, the target portrait being a constructed portrait corresponding to the target scene.
In this embodiment, a portrait viewing request may be issued when a user needs to view a built portrait. A portrait can be viewed in two ways: by viewing the attribute list and by exploring the content. The two portrait viewing modes are described below.
For the attribute-list portrait viewing mode:
in response to the portrait viewing request, a portrait attribute list may be provided. For example, the presentation of the portrait attribute list may be as shown in FIG. 12 when a user portrait is viewed, or as shown in FIG. 16 when a material portrait is viewed. Referring to FIG. 12 and FIG. 16, a portrait attribute display item and a portrait attribute operation item may be presented in the portrait attribute list.
Based on the above, in one embodiment of the present disclosure, please refer to fig. 12 and 16, the representation attribute showing item includes at least one of an attribute name showing item, an attribute type showing item, an update time showing item, and a description showing item;
the portrait attribute operation includes at least one of an update operation, a view set operation, and a save to public portrait operation.
Based on the above, in one embodiment of the present disclosure, please refer to fig. 12 and 16, the portrait attribute operation includes an update operation; the method further comprises steps S205-S206:
step S205, responding to the trigger of the update operation item of the first target portrait from the outside, executing the step of updating the first target portrait during the process of starting the first target portrait; wherein the first target image is any one of the target images.
In this step, a newly built portrait may not be started by default. After the user selects the attributes to be started and clicks "refresh", the corresponding periodic/real-time computation task is started to update the portrait.
Step S206, in response to the trigger of the update operation item of the first target portrait from the outside, stopping the execution of the step of updating the first target portrait under the condition of stopping the first target portrait.
In this step, for a portrait whose update is being executed, clicking "stop update" stops the corresponding task; when multiple attributes are associated with the task, a prompt indicates which portrait attributes may be affected.
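As a hedged illustration of the start/stop behavior described in steps S205-S206, the following sketch toggles a periodic update task; the scheduler shown (Python's threading.Timer) is an assumption, since a production system would use its own task engine.

```python
import threading

class PortraitUpdateTask:
    """Illustrative periodic task that recomputes one portrait attribute."""

    def __init__(self, portrait_name: str, update_fn, period_s: float = 3600.0):
        self.portrait_name = portrait_name
        self.update_fn = update_fn
        self.period_s = period_s
        self._timer = None
        self.running = False

    def start(self):                      # triggered by the "refresh" operation item
        self.running = True
        self._schedule()

    def stop(self):                       # triggered by the "stop update" operation item
        self.running = False
        if self._timer:
            self._timer.cancel()

    def _schedule(self):
        if not self.running:
            return
        self.update_fn(self.portrait_name)                      # recompute the portrait
        self._timer = threading.Timer(self.period_s, self._schedule)
        self._timer.daemon = True
        self._timer.start()
```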
Based on the foregoing, in one embodiment of the present disclosure, referring to FIGS. 12 and 16, the portrait attribute operations include a save to public portrait operation; the method further comprises step S207:
Step S207, in response to an external trigger of the save-to-public-portrait operation item for a second target portrait, saving the second target portrait to the public portrait, wherein the second target portrait is any one of the target portraits.
In this embodiment, by saving a constructed portrait to the public portrait, its portrait attributes are distributed to the public data; these attributes are thereafter supplied to different scenes as public user/material portraits.
Considering attribute naming, an attribute saved to the system public portrait may be named by prefixing the attribute name with the name of the scene from which the portrait is derived. For example, for a user portrait, assuming the scene name of the derived portrait is "home_recommend" and the saved attribute name is "user_receiver_purchase", the attribute is named "${scene name}-${attribute name}" in the system's built-in public portrait library, here "home_recommend-user_receiver_purchase".
Similarly, for a material portrait, assuming the scene name of the derived portrait is "home_recommend" and the saved attribute name is "item_receiver_purchase", the attribute is named "${scene name}-${attribute name}" in the system's built-in public portrait library, here "home_recommend-item_receiver_purchase".
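The naming rule above can be captured in a one-line helper; this is only a sketch of the "${scene name}-${attribute name}" convention, and the function name is hypothetical.

```python
def public_portrait_name(scene_name: str, attribute_name: str) -> str:
    """Name an attribute saved to the public portrait library by prefixing the scene name."""
    return f"{scene_name}-{attribute_name}"

# Usage, following the examples above:
assert public_portrait_name("home_recommend", "user_receiver_purchase") \
    == "home_recommend-user_receiver_purchase"
assert public_portrait_name("home_recommend", "item_receiver_purchase") \
    == "home_recommend-item_receiver_purchase"
```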
This embodiment supports the consolidation and reuse of portraits, so that a client's data assets are accumulated and the implementation efficiency of related projects, such as AI projects, can be improved.
For the content-exploration portrait viewing mode:
in one embodiment of the present disclosure, please refer to fig. 13, the application data includes a portrait; the method further comprises step S208:
step S208, responding to the portrait searching request, and providing the constructed portrait corresponding to the searching identifier according to the searching identifier included in the portrait searching request.
In this embodiment, portrait attributes can be searched by user/material ID or user/material name, and the portrait content can be viewed. The most recently updated portrait content is provided by default. For example, for a user portrait, the display interface for viewing a portrait through content exploration may be as shown in fig. 13.
In detail, samples are data instances and are packages of features; labeled samples can be used to train a model, and unlabeled samples can be used to test a model. A large number of well-characterized labeled samples is critical to machine learning. Based on this, after application data such as a portrait is generated from the data assigned to a scene, application data such as a sample can also be generated from the portrait. The generated samples may be used to train the base model that optimizes the corresponding scene.
Thus, in one embodiment of the present disclosure, the application data includes a portrait and a sample;
the step S20313 is to process the target data according to the data configuration information, and construct any type of application data corresponding to the target scenario, including: and generating a sample corresponding to the target scene according to the data configuration information and the portrait corresponding to the target scene.
In this embodiment, a sample may be wide-table data composed of seven parts: user information, material information, behavior information, user portrait, material portrait, label (attribute of the optimization target), and score (from the estimation service).
The user information is the raw data of the user in the public data; the material information is the raw data of the material in the public data; the behavior information is the environment information at the time the exposure/display behavior occurred; the user portrait consists of the long-term and short-term portraits configured for the user; the material portrait consists of the portrait attributes configured for the material.
For the label, attribution deduction is performed on the target behavior to obtain the sample mark. If the label is positive, the sample is a positive sample; if the label is negative, the sample is a negative sample. For example, when the optimization goal is click-through rate, whether a recommendation result was clicked is the label.
For the score, if the entry is a recommendation result of the system, the scoring data is associated with the other request information in the upstream system log.
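A minimal sketch of assembling one wide-table sample row from the seven parts just listed is given below; the attribution rule shown (label = whether the target behavior occurred) assumes a click-through-rate style goal, and every key name is illustrative rather than taken from the disclosure.

```python
def build_sample(user, material, behavior, user_portrait, material_portrait,
                 target_behavior="click", score_log=None) -> dict:
    """Assemble one wide-table sample row from the seven parts described above."""
    row = {}
    row.update({f"user_{k}": v for k, v in user.items()})
    row.update({f"item_{k}": v for k, v in material.items()})
    row.update({f"ctx_{k}": v for k, v in behavior.items()})      # context of the exposure
    row.update({f"up_{k}": v for k, v in user_portrait.items()})
    row.update({f"ip_{k}": v for k, v in material_portrait.items()})
    # Label by attribution: positive if the target behavior occurred after exposure.
    row["label"] = 1 if behavior.get(target_behavior) else 0
    # Score: join the estimate from the upstream log when the exposure came from
    # the system's own recommendation.
    if score_log is not None:
        row["score"] = score_log.get((user.get("id"), material.get("id")))
    return row

row = build_sample(
    user={"id": "u1", "age": 30},
    material={"id": "a1", "title": "shoes"},
    behavior={"channel": "home_feed", "click": 1},
    user_portrait={"recent_purchase": 2},
    material_portrait={"ctr_7d": 0.05},
    score_log={("u1", "a1"): 0.12},
)
# row["label"] == 1 because the click behavior occurred
```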
As mentioned above, the application data may include a portrait, a sample, and an index. For application data of the sample type, please refer to fig. 17; in one embodiment of the present disclosure, the application data includes the sample, and the first configuration item includes at least one of a delivery channel configuration item, a delivery column configuration item, a target behavior configuration item, and a feedback cycle configuration item.
In detail, the delivery channel is the channel source for display, such as a WeChat applet (MP) or a mobile application (Android/iOS); the delivery column is the content column for display, such as the home information stream home_feed; the target behavior is the behavior to be predicted, such as purchase (purchase), purchase any (purchase_any), purchase same class (purchase_similar), or click (click); the feedback period is the time window of the feedback delay.
For example, when the user requests to generate application data of a sample, the corresponding presentation interface may be as shown in fig. 17.
In detail, different scenes often share similar calculation logic; for example, the index data of a recommendation scene and that of a push scene are computed in similar ways. Considering also that subsequent scenes need to support operation diagnosis for users, with corresponding dashboards and index insight, a more efficient index development framework can be provided at the data base level, and the same calculation logic can be consolidated and reused through a unified development view. Therefore, application data of the index type can be created.
For application data of the index type, please refer to fig. 18; in one embodiment of the present disclosure, the application data includes an index, and the first configuration item includes: a template index configuration item corresponding to the at least one scene, and a custom index configuration item.
In this embodiment, built-in template indices are supported, with custom indices as a supplement; a template index is an index currently defined for a given type of scene, and custom indices can be extended according to scene needs.
For example, the template index may be a push index for a push service scenario, a recommendation index for a recommended service scenario, or a search index for a search service scenario.
For the custom index, after clicking "add", the indices can be extended according to scene needs; for example, an extension index can be configured at the levels of name, statistical method, screening condition, grouping statistics, description, and so on. Based on this, in an embodiment of the present disclosure, please refer to fig. 18, the custom index configuration items include: at least one of a name configuration item, a statistical method configuration item, a screening condition configuration item, a grouping statistics configuration item, and a description configuration item.
For the statistical method, the user can configure, as required, the number of users, the total count, the total amount, or the total duration of the specified behavior, and can also configure a custom index formula.
For grouping statistics, the user can specify the grouping field as desired.
For the screening condition, the user can restrict statistics to a specified range of a target attribute, in the same way as material and user data are screened. The screening rules, for numeric, floating-point, text, date/time, and boolean fields, may be as follows (a rule-evaluation sketch is given after these lists):
(1) Numeric (short integer, long integer, integer) field; the query input box defaults to "please input an integer, e.g. 0" and may have the following operators:
Equal to (=)
Not equal to (≠)
Greater than (>)
Less than (<)
Greater than or equal to (≥)
Less than or equal to (≤)
Is null (is null)
Is not null (is not null)
(2) Floating-point (floating point number) field; the query input box defaults to "please input a decimal, e.g. 0.0" and may have the following operators:
Equal to (=)
Not equal to (≠)
Greater than (>)
Less than (<)
Greater than or equal to (≥)
Less than or equal to (≤)
Is null (is null)
Is not null (is not null)
(3) Text (string) field; the query input box defaults to "please input text, e.g. abc" and may have the following operators:
Equal to (=)
Not equal to (≠)
Contains (contains)
Does not contain (not contains)
Like (like)
Not like (not like)
Is null (is null)
Is not null (is not null)
(4) Date and time fields; the query input box defaults to "please enter a date, e.g. 2020-01-01" or "please enter a time, e.g. 2020-01-01 00:00:00" and may have the following operators:
Earlier than (before)
Later than (after)
Not earlier than (equal or after)
Not later than (equal or before)
Is null (is null)
Is not null (is not null)
(5) Boolean field; the query input box defaults to "please enter a boolean value, e.g. true" and may have the following operators:
Equal to (=)
Not equal to (≠)
Is null (is null)
Is not null (is not null)
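The rule-evaluation sketch promised above maps each operator in the lists to a predicate; the dispatch-table structure is an assumption for illustration, symbolic operators (≠, ≥, ≤) are spelled in ASCII, and the simplified handling of "like" is noted in the comments.

```python
OPERATORS = {
    "=": lambda a, b: a == b,
    "!=": lambda a, b: a != b,
    ">": lambda a, b: a > b,
    "<": lambda a, b: a < b,
    ">=": lambda a, b: a >= b,
    "<=": lambda a, b: a <= b,
    "contains": lambda a, b: b in a,
    "not contains": lambda a, b: b not in a,
    "like": lambda a, b: a == b,          # simplified; a real system might use patterns
    "not like": lambda a, b: a != b,
    "before": lambda a, b: a < b,
    "after": lambda a, b: a > b,
    "equal or after": lambda a, b: a >= b,
    "equal or before": lambda a, b: a <= b,
    "is null": lambda a, b: a is None,
    "is not null": lambda a, b: a is not None,
}

def matches(record: dict, field_name: str, op: str, value=None) -> bool:
    """Return True if record[field_name] satisfies the screening rule (field, op, value)."""
    return OPERATORS[op](record.get(field_name), value)

# Example: keep only materials whose price is >= 10.0 and whose title contains "sale".
rules = [("price", ">=", 10.0), ("title", "contains", "sale")]
keep = lambda rec: all(matches(rec, f, op, v) for f, op, v in rules)
```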
For example, when the user selects to create a custom index, the corresponding configuration interface may be as shown in FIG. 19.
In one embodiment of the present disclosure, please refer to fig. 19, the application data includes an index; the method further comprises step S209-step S211:
step S209, in response to the index display configuration request, provides an index display configuration item.
In one embodiment of the present disclosure, the index display configuration item includes: at least one of a time granularity configuration item, a start time configuration item, and an end time configuration item.
In this embodiment, the user can configure the time granularity, the start time, and the end time as needed, that is, select the corresponding interval of the index to be displayed. For example, the time granularity supports four options: "15 minutes", "30 minutes", "hour", and "day", with the start time and end time as the interval boundaries of the trend chart. The index is updated in real time at the corresponding granularity.
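As a small illustration of the granularity and interval boundaries described above, the following sketch enumerates the trend-chart buckets; the function and dictionary names are assumptions.

```python
from datetime import datetime, timedelta

GRANULARITY = {"15min": timedelta(minutes=15), "30min": timedelta(minutes=30),
               "hour": timedelta(hours=1), "day": timedelta(days=1)}

def trend_buckets(start: datetime, end: datetime, granularity: str):
    """Yield the bucket boundaries of the trend chart for the chosen granularity."""
    step = GRANULARITY[granularity]
    t = start
    while t < end:
        yield t
        t += step

buckets = list(trend_buckets(datetime(2020, 12, 25), datetime(2020, 12, 26), "hour"))
# -> 24 hourly boundaries over which the index values would be aggregated and plotted
```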
Step S210, determining the index display configuration information externally set through the index display configuration item.
Step S211, responding to the index display request for the target scenario, displaying the data change trend of the index corresponding to the target scenario through the index display panel according to the index display configuration information.
In this step, the corresponding data change trend is displayed for the user based on the index display configuration information set by the user. Preferably, a card-style index display panel may be provided, which contains the variation trend of the index and gives an overview chart for the corresponding group; this presentation interface may be as shown in fig. 19. The user can thus visually check how the index changes.
In addition to creating a scenario, viewing a scenario list, configuring various types of application data, and the like as described above, in order for a user to view data assigned to a scenario, one embodiment of the present disclosure may also have a data viewing function.
Based on this, in an embodiment of the present disclosure, please refer to fig. 20, the method further includes steps S212 to S213:
Step S212, in response to a request for viewing the target data, displaying a data list, wherein the data list includes at least one data type.
In detail, a user may request to view the data assigned to a certain scene, and a data list listing that data may then be presented. As shown in fig. 20, the data list may include data types such as behavior data, user data, and material data. In addition, the user can search fields by field name and issue a data viewing request accordingly.
Step S213, in the case of externally triggering the target data of any data type in the data list, displaying the data attribute of the target data of any data type.
In this step, please refer to fig. 20, the user is supported to view the attributes of the data, such as field name, type, null rate, description information, and the like.
Based on this, in one embodiment of the present disclosure, please refer to fig. 20, the at least one data type includes at least one of behavior data, user data, and material data; the behavior data comprises data corresponding to at least one behavior of detail viewing, shopping cart, exposure, comment and request; the data attribute comprises at least one of name, type, null rate and description.
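For illustration, the sketch below computes the per-field attributes shown in the data list (field name, type, null rate, description); the schema format and function name are assumptions.

```python
def field_attributes(rows: list, schema: dict) -> list:
    """schema maps field name -> (type, description); rows are dict records."""
    out = []
    n = max(len(rows), 1)
    for name, (ftype, desc) in schema.items():
        nulls = sum(1 for r in rows if r.get(name) is None)
        out.append({"field": name, "type": ftype,
                    "null_rate": round(nulls / n, 4), "description": desc})
    return out

attrs = field_attributes(
    rows=[{"item_id": "a1", "title": None}, {"item_id": "a2", "title": "shoes"}],
    schema={"item_id": ("string", "material ID"), "title": ("string", "material title")},
)
# -> e.g. the "title" field has null_rate 0.5
```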
In one embodiment of the present disclosure, unified data management operations may be performed through the UI interface. This embodiment supports visually creating, publishing, and managing scenes through UI interface design, so that real-time, closed-loop, individually personalized intelligent operation decision-making applications can be easily configured and built at the minute or even second level.
In this embodiment, the production and management of application data are handled in a unified manner. This addresses problems of the existing data management approach in which application data production processes are initiated ad hoc, leading to production redundancy, an uncontrollable production process, and computing tasks and states scattered everywhere; it also addresses the low reusability caused by production results that are never consolidated and integrated, and the storage redundancy and poor consistency of application data scattered across systems.
Therefore, the embodiments of the disclosure provide a new, scene-based concept for application data management and, combined with an open real-time data update mode, realize scene-level data management for intelligent operation. The embodiments of the disclosure uniformly manage the production of application data, can efficiently consolidate data assets, and directly empower the AI model.
According to the method provided by the embodiments of the disclosure, scene-based management of application data can be realized, serving the online intelligent operation of an enterprise and helping the enterprise rely on artificial intelligence to quickly establish the client's own application and service system.
Fig. 21 is a flowchart illustrating a method according to an embodiment, and a method for implementing unified data management according to the embodiment will now be described by taking the system 100 shown in fig. 1 as an example.
As shown in fig. 21, the method of this embodiment may include steps S301 to S317 as follows:
step S301, responding to the new scene request, providing a scene template creation item and at least one scene personalized creation item, and executing step S302 and step S303.
In detail, the apparatus 1000 shown in fig. 1 may provide a UI interface to facilitate a user's operation as desired. The user may issue the request by performing an operation on the UI interface.
Step S302, under the condition that an individualized establishment item of a target scene is triggered externally, providing a scene establishment configuration item corresponding to the target scene, and acquiring target scene configuration information through the scene establishment configuration item; wherein the target scene is any one of the at least one scene, and step S304 is executed.
Step S303, in a case where the scene template creating item is triggered externally, providing at least one constructed scene, determining a first scene selected from the at least one constructed scene externally, acquiring target scene configuration information according to scene configuration information of the first scene, and executing step S304.
Step S304, a target scenario corresponding to the target scene is created according to the target scene configuration information, wherein the scene and the scenario are in a relation of 1 to N, N being a positive integer; step S305 and step S309 are then executed.
Step S305, for each target scenario, in the process of creating the target scenario, subscribing, according to the target scene configuration information, target data corresponding to the target scenario from the public data, and distributing the target data to the target scenario; wherein the public data is the collected raw data common to the at least one scene; step S306 and step S311 are then performed.
In detail, data may be collected as common data from the client 2000 shown in fig. 1 by the apparatus 1000 shown in fig. 1.
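A hedged sketch of the subscription in step S305 follows: scenario-specific target data is subscribed from the shared public data and distributed to the scenario. The topic-per-data-type pub/sub shape is an assumption, not the patent's actual protocol.

```python
from collections import defaultdict

class PublicDataBus:
    """Toy public-data source: topics keyed by data type, fanned out to scenarios."""

    def __init__(self):
        self._subscribers = defaultdict(list)   # data type -> list of scenario queues

    def subscribe(self, data_type: str, scenario_queue: list):
        self._subscribers[data_type].append(scenario_queue)

    def publish(self, data_type: str, record: dict):
        for q in self._subscribers[data_type]:
            q.append(record)                     # distribute to each subscribed scenario

# A scenario created from configuration subscribes only the data types it declared.
bus = PublicDataBus()
scenario_data = []                               # target data assigned to this scenario
for dtype in ("behavior", "user", "material"):   # from the target scene configuration
    bus.subscribe(dtype, scenario_data)
bus.publish("behavior", {"user_id": "u1", "item_id": "a1", "type": "click"})
```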
Step S306, responding to a new request of any type of application data of the target scene, and displaying a first configuration item corresponding to any type of application data, wherein the application data is a portrait, a sample or an index.
Step S307, determining data configuration information set by the first configuration item from the outside; and processing the target data according to the data configuration information, constructing any type of application data corresponding to the target scene, and executing step S308, step S313, step S315 and step S317.
Step S308, providing the sample corresponding to the target scenario to the model corresponding to the target scenario for processing, and ending the current process.
Step S309, responding to the scene display request, providing a scene list, wherein the scene list comprises the scene display item and the scene operation item of the target scene, and the scene operation item comprises an operation item issued to a scene library.
Step S310, in the case that the release of the target scenario to the scenario library operation item is triggered externally, releasing the target scenario to the scenario library, and ending the current process.
Step S311, responding to the request for viewing the target data, displaying a data list, where the data list includes at least one data type.
Step S312, in the case of externally triggering the target data of any data type in the data list, displaying the data attribute of the target data of any data type, and ending the current process.
Step S313, responding to the index display configuration request, providing an index display configuration item; and determining index display configuration information which is externally set through the index display configuration items.
Step S314, responding to the index display request for the target scenario, displaying the data change trend of the index corresponding to the target scenario through the index display panel according to the index display configuration information, and ending the current process.
Step S315, responding to the portrait viewing request aiming at the target scene, providing a portrait attribute list; wherein the representation attribute list comprises: a representation attribute representation item and a representation attribute operation item for each target representation, the target representation being a constructed representation corresponding to the target scene, the representation attribute operation item including a save to public representation operation item.
Step S316, in response to an external trigger of the operation item for saving a second target portrait to the public portrait, saving the second target portrait to the public portrait and ending the current process, wherein the second target portrait is any one of the target portraits.
Step S317, responding to the portrait searching request, and providing the constructed portrait corresponding to the searching identification according to the searching identification included in the portrait searching request.
< apparatus embodiment >
FIG. 22 is a functional block diagram of an apparatus 40 that implements unified data management, according to one embodiment. As shown in fig. 22, the data management apparatus 40 may include a scenario creation module 401, a data distribution module 402, and a data management module 403. The apparatus 40 for implementing unified data management may be the apparatus 1000 in fig. 1.
The scenario creation module 401 creates a target scenario corresponding to a target scene selected externally from at least one scene; the scene and the scenario are in a relation of 1 to N, and N is a positive integer. For each target scenario, in the process of creating the target scenario, the data distribution module 402 subscribes target data corresponding to the target scenario from public data and distributes the target data to the target scenario; wherein the public data is raw data collected for common use by the at least one scene. The data management module 403 manages the target data through the target scenario.
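Purely as a structural illustration of apparatus 40, the sketch below wires the three modules together; class and method names are assumptions, and the real construction logic is only stubbed.

```python
class _StubBus:
    """Stand-in for the public data source; only the subscribe call is sketched."""
    def subscribe(self, data_type, queue):
        pass

class ScenarioCreationModule:
    def create(self, scene, config):
        # One scene may map to N scenarios; each call creates one scenario for the scene.
        return {"scene": scene, "config": config, "data": []}

class DataDistributionModule:
    def __init__(self, bus):
        self.bus = bus
    def assign(self, scenario):
        # Subscribe only the data types declared in the scenario configuration.
        for dtype in scenario["config"].get("data_types", []):
            self.bus.subscribe(dtype, scenario["data"])

class DataManagementModule:
    def build_application_data(self, scenario, kind, config_info):
        # kind is "portrait", "sample", or "index"; actual construction is delegated.
        return {"scenario": scenario["scene"], "kind": kind, "config": config_info}

class UnifiedDataManagementApparatus:
    def __init__(self, bus=None):
        self.creation = ScenarioCreationModule()
        self.distribution = DataDistributionModule(bus or _StubBus())
        self.management = DataManagementModule()
```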
In an embodiment of the present disclosure, the data management module 403 includes: a module for constructing application data corresponding to the target scene according to the target data; and the module is used for providing the application data to the model corresponding to the target scene for processing.
In one embodiment of the present disclosure, the application data includes any one or more types of a portrait, a sample, and an index;
the constructing of the application data corresponding to the target scenario according to the target data includes: responding to a new request of any type of application data of the target scene, and displaying a first configuration item corresponding to any type of application data; determining data configuration information externally set through the first configuration item; and processing the target data according to the data configuration information to construct any type of application data corresponding to the target scene.
In one embodiment of the present disclosure, the application data includes the representation;
the first configuration item includes: at least one of a representation name configuration item, at least one representation type configuration item, and a representation description configuration item.
In one embodiment of the present disclosure, the application data includes the representation;
the first configuration item comprises at least one portrait type configuration item;
the determining of the data configuration information externally set through the first configuration item includes: determining a target portrait type configuration item externally selected from the at least one portrait type configuration item; providing a second configuration item corresponding to the target portrait type configuration item; and determining the data configuration information externally set through the second configuration item.
In one embodiment of the present disclosure, the application data includes a representation;
the apparatus 40 for implementing unified data management further includes: means for providing a list of portrait attributes in response to a portrait view request for the target scene;
wherein the portrait attribute list comprises: a portrait attribute display item and a portrait attribute operation item for each target portrait, the target portrait being a constructed portrait corresponding to the target scene.
In one embodiment of the present disclosure, the representation attribute showing item comprises at least one of an attribute name showing item, an attribute type showing item, an update time showing item and a description showing item;
the portrait attribute operation includes at least one of an update operation, a view set operation, and a save to public portrait operation.
In one embodiment of the present disclosure, the portrait attribute operation includes an update operation;
the apparatus 40 for implementing unified data management further includes: means for performing the step of updating a first target representation during start-up of said first target representation in response to an external trigger to an update operation on said first target representation; wherein the first target portrait is any portrait in the target portraits;
means for ceasing execution of the step of updating the first target representation in the event that the first target representation is stopped in response to a trigger of an external update operation on the first target representation.
In one embodiment of the present disclosure, the portrait attribute operations include a save to public portrait operation;
the apparatus 40 for implementing unified data management further includes: means for saving a second target representation to a common representation in response to an external trigger to a save to common representation operation, wherein the second target representation is any one of the respective target representations.
In one embodiment of the present disclosure, the application data includes a representation;
the apparatus 40 for implementing unified data management further includes: means for providing, in response to a portrait search request, a constructed portrait corresponding to a search identifier according to the search identifier included in the portrait search request.
In one embodiment of the present disclosure, the representation includes a user representation and a material representation.
In one embodiment of the present disclosure, the representation comprises a user representation;
the at least one portrait type configuration item comprises at least one of a basic feature type configuration item, a statistical feature type configuration item, a model tag type configuration item and a public portrait configuration item;
in the case that the target portrait type configuration item is a basic feature type configuration item, the second configuration item includes at least one of a field configuration item and an extraction method configuration item;
in the case that the target portrait type configuration item is a statistical feature type configuration item, the second configuration item includes at least one of a behavior type configuration item, a calculation attribute configuration item, a time window configuration item, a calculation method configuration item, and a service screening configuration item;
in the case that the target portrait type configuration item is a model tag type configuration item, the second configuration item includes at least one of a calculation method configuration item, a behavior type configuration item, and a material attribute configuration item.
In one embodiment of the present disclosure, the representation includes a material representation;
the at least one portrait type configuration item comprises at least one of a basic feature type configuration item, a statistical feature type configuration item and a public portrait configuration item;
in the case that the target portrait type configuration item is a basic feature type configuration item, the second configuration item includes at least one of a field configuration item and an extraction method configuration item;
under the condition that the target portrait type configuration item is a statistical feature type configuration item, the second configuration item comprises at least one of a behavior type configuration item, an accumulation amount configuration item, a time window configuration item and a service screening configuration item;
in the event that the target portrait type configuration item is a public portrait configuration item, the second configuration item comprises a user portrait attribute selection configuration item.
In one embodiment of the present disclosure, the application data includes a representation and a sample;
the processing the target data according to the data configuration information to construct any type of application data corresponding to the target scene includes:
and generating a sample corresponding to the target scene according to the data configuration information and the portrait corresponding to the target scene.
In one embodiment of the present disclosure, the application data comprises a sample;
the first configuration item includes: and at least one of a delivery channel configuration item, a delivery column configuration item, a target behavior configuration item and a feedback cycle configuration item.
In one embodiment of the present disclosure, the application data includes metrics;
the first configuration item includes: a template index configuration item corresponding to the at least one scene, and a custom index configuration item.
In one embodiment of the present disclosure, the custom index configuration item includes: at least one of a name configuration item, a statistical method configuration item, a screening condition configuration item, a grouping statistics configuration item, and a description configuration item.
In one embodiment of the present disclosure, the application data includes metrics;
the apparatus 40 for implementing unified data management further includes: means for providing an indicator display configuration item in response to an indicator display configuration request;
a module for determining index display configuration information externally set by the index display configuration item;
and the module is used for responding to the index display request aiming at the target scene, displaying the data change trend of the index corresponding to the target scene through the index display panel according to the index display configuration information.
In one embodiment of the present disclosure, the index display configuration item includes: at least one of a time granularity configuration item, a start time configuration item, and an end time configuration item.
In an embodiment of the present disclosure, the apparatus 40 for implementing unified data management further includes: means for presenting a data list in response to a request to view the target data, wherein the data list includes at least one data type;
and the module is used for showing the data attribute of the target data of any data type under the condition of externally triggering the target data of any data type in the data list.
In an embodiment of the present disclosure, the scenario creation module 401 includes: a module for responding to the new scene request, determining a target scene selected from at least one scene from the outside, and acquiring target scene configuration information;
a module for creating a target scenario corresponding to the target scenario according to the target scenario configuration information;
the data distribution module 402 is configured to subscribe, according to the target scenario configuration information, target data corresponding to the target scenario from public data.
In an embodiment of the present disclosure, the determining, in response to the new scenario request, a target scenario selected from at least one scenario externally, and acquiring target scenario configuration information includes: responding to the new scene request, and providing a personalized creation item of the at least one scene; under the condition that the personalized creation item of the target scene is triggered externally, providing a scene creation configuration item corresponding to the target scene; wherein the target scene is any one of the at least one scene; and acquiring the target scene configuration information through the scene creation configuration item.
In an embodiment of the present disclosure, the scenario creation configuration item includes at least one of an input channel configuration item, an optimization target configuration item, a feedback period configuration item, a material library configuration item, a behavior type configuration item, a behavior channel configuration item, and an interaction field configuration item.
In an embodiment of the present disclosure, the determining, in response to the new scenario request, a target scenario selected from at least one scenario externally, and acquiring target scenario configuration information includes: responding to the new scene request, and providing a scene template creating item; providing at least one constructed scene in case of externally triggering the scene template creation item; determining a first scene selected from the at least one constructed scene from the outside, and taking a scene corresponding to the first scene as the target scene; and acquiring the target scene configuration information according to the scene configuration information of the first scene.
In an embodiment of the present disclosure, the apparatus 40 for implementing unified data management further includes: means for providing a scene list in response to a scene display request, wherein the scene list includes a scene display item and a scene operation item of the target scene.
In one embodiment of the present disclosure, the scenario presentation item includes: at least one of a number presentation item, a name presentation item, a status presentation item, a type presentation item, an update time presentation item, and a create time presentation item.
In one embodiment of the present disclosure, the scenario operation item includes: and starting and stopping the operation item, releasing the operation item to the scene library, setting the operation item and deleting the operation item.
In one embodiment of the present disclosure, the scenario operation item includes a start-stop operation item;
the data allocating module 402 is configured to respond to an external trigger on the start-stop operation item of the target scenario, and execute the step of subscribing, according to the target scenario configuration information, target data corresponding to the target scenario from public data and allocating the target data to the target scenario in a process of starting the target scenario.
In one embodiment of the present disclosure, the scenario operation item includes a start-stop operation item;
the data allocating module 402 is configured to, in response to an external trigger to the start-stop operation item of the target scenario, stop executing the step of subscribing, according to the target scenario configuration information, target data corresponding to the target scenario from common data and allocating the target data to the target scenario when the target scenario is stopped.
In one embodiment of the present disclosure, the scenario operation item includes an issue to scenario library operation item;
the apparatus 40 for implementing unified data management further includes: and the module is used for issuing the target scene to a scene library under the condition that the issuing of the target scene to a scene library operation item is triggered externally.
< System embodiment >
Fig. 23 is a hardware configuration diagram of a system 50 implementing unified data management according to another embodiment.
As shown in fig. 23, the system 50 for implementing unified data management includes at least one computing device 501 and at least one storage device 502. The at least one storage device 502 is configured to store instructions for controlling the at least one computing device 501 to perform a method as any of the above method embodiments.
The system 50 for implementing unified data management may include at least one apparatus 1000 as shown in fig. 1, or have a hardware structure the same as or similar to the apparatus 1000, and is not limited herein.
Furthermore, an embodiment of the present disclosure also provides a computer-readable storage medium on which a computer program is stored, which, when being executed by a processor, implements the method according to any one of the method embodiments of the present disclosure.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (10)

1. A method of implementing unified data management, comprising:
for a target scene selected externally from at least one scene, creating a target scenario corresponding to the target scene; wherein the scene and the scenario are in a relation of 1 to N, and N is a positive integer;
for each target scenario, in the process of creating the target scenario, subscribing target data corresponding to the target scenario from public data, and distributing the target data to the target scenario; wherein the public data is collected raw data that is common to the at least one scene;
and managing the target data through the target scenario.
2. The method of claim 1, wherein the managing the target data through the target scenario comprises:
according to the target data, application data corresponding to the target scene is constructed;
and providing the application data to the model corresponding to the target scene for processing.
3. The method of claim 2, wherein the application data comprises any one or more types of a portrait, a sample, and an index;
the constructing of the application data corresponding to the target scenario according to the target data includes:
responding to a new request of any type of application data of the target scene, and displaying a first configuration item corresponding to any type of application data;
determining data configuration information externally set through the first configuration item;
and processing the target data according to the data configuration information to construct any type of application data corresponding to the target scene.
4. The method of claim 3, wherein the application data comprises the representation;
the first configuration item includes: a representation name configuration item, and/or at least one representation type configuration item, and/or a representation description configuration item.
5. The method of claim 3, wherein the application data comprises the representation;
the first configuration item comprises at least one portrait type configuration item;
the determining data configuration information externally set by the first configuration item includes:
determining a target portrait type configuration item externally selected from the at least one portrait type configuration item;
providing a second configuration item corresponding to the target portrait type configuration item;
and determining the data configuration information externally set through a second configuration item.
6. The method of claim 3, wherein the application data comprises a representation;
the method further comprises the following steps: providing a picture attribute list in response to a portrait viewing request for the target scene;
wherein the portrait attribute list comprises: a portrait attribute display item and a portrait attribute operation item for each target portrait, the target portrait being a constructed portrait corresponding to the target scene.
7. The method of claim 6, wherein said representation property representation item includes at least one of a property name representation item, a property type representation item, an update time representation item, a description representation item;
the portrait attribute operation includes at least one of an update operation, a view set operation, and a save to public portrait operation.
8. An apparatus for implementing unified data management, comprising:
the scenario creation module is used for creating, for a target scene selected externally from at least one scene, a target scenario corresponding to the target scene; wherein the scene and the scenario are in a relation of 1 to N, and N is a positive integer;
the data distribution module is used for, for each target scenario, in the process of creating the target scenario, subscribing target data corresponding to the target scenario from public data and distributing the target data to the target scenario; wherein the public data is collected raw data that is common to the at least one scene;
and the data management module is used for managing the target data through the target scenario.
9. A system comprising at least one computing device and at least one storage device, wherein the at least one storage device is to store instructions for controlling the at least one computing device to perform the method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.