CN111061767B - Data processing method based on in-memory computation and SQL computation - Google Patents

Data processing method based on in-memory computation and SQL computation

Info

Publication number
CN111061767B
CN111061767B
Authority
CN
China
Prior art keywords
computing
calculation
model
expression
sql
Prior art date
Legal status
Active
Application number
CN201911254622.6A
Other languages
Chinese (zh)
Other versions
CN111061767A (en)
Inventor
程宏亮
穆宇浩
郭联伟
苏魁
王海亮
李旭
刘国杰
Current Assignee
Meritdata Technology Co ltd
Original Assignee
Meritdata Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Meritdata Technology Co ltd
Priority to CN201911254622.6A
Publication of CN111061767A
Application granted
Publication of CN111061767B
Legal status: Active (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 - Querying
    • G06F16/245 - Query processing
    • G06F16/2455 - Query execution
    • G06F16/24564 - Applying rules; Deductive queries
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/02 - Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201 - Market modelling; Market analysis; Collecting market data
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides a data processing method based on in-memory computation and SQL computation, which comprises the following steps: reconstructing the analysis expression semantic system to establish inner core grammar rules for cross-granularity expressions and outer-layer grammar rules for the objects returned by cross-granularity expressions; and reconstructing the analysis expression computing framework, which comprises a computational logic generator and a computing executor, wherein the computational logic generator identifies the computational logic from the input semantic model and the model information of the metadata model and constructs a DAG flow according to the computational logic, and the computing executor receives the DAG flow and a plurality of computing models, generates computing tasks, and submits and dispatches them for execution. The method effectively improves the flexibility of BI in business computation: it is not limited to a data warehouse, requires no advance pre-computation and no data extraction to build a data center, and can realize complex business calculations by means of SQL and an in-memory computing mechanism alone.

Description

Data processing method based on in-memory computation and SQL computation
Technical Field
The invention relates to the field of BI (Business Intelligence), and in particular to a data processing method based on in-memory computation and SQL computation.
Background
With the rapid development of agile BI (Business Intelligence) and increasingly flexible, fast-changing customer needs, providing more agile and flexible computing power has become the focus of current BI development, while the traditional data-warehouse-based model greatly limits the flexibility of BI computation.
At present most BI relies on a data warehouse. For a relatively complex statistic (e.g., the ratio of a certain province's sales to nationwide sales), a dimension table and a measure table must be built first, a data mart is created based on those dimensions and measures, and subsequent queries run against the data mart. In BI this mode is called "pre-computation", and its core idea is "trading space for time": results are computed in advance and persisted, which effectively improves query efficiency and reduces computational complexity. However, this technique has the following unavoidable problems:
the development cost is huge, and the demand response is low. Because the data warehouse needs to store the calculation result in advance in a solidified data structure, when the requirement is changed, the modification cost is relatively high, and the flexible and changeable calculation requirement cannot be met well. If all the various demands are covered in advance to improve the pre-computed hit rate, huge space waste is brought. Therefore, the pre-calculation mode is suitable for application scenes solidified by the analysis dimension, but cannot meet the requirements of self-service analysis and impromptu analysis query. ( Examples: the ratio of the provincial sales to the national sales is calculated by pre-calculation to form a result set, but if the demand of the clients changes, the ratio of each class of products and the national ratio of each provincial needs to be known, one dimension is added, the previous result table needs to be reconstructed and pre-calculated again, and the larger space waste is brought )
Detail data is lost, which affects analysis conclusions. Since most BI calculations aggregate along some N dimensions and return summarized results, a user who only has the pre-computed aggregates cannot know how the detail data is composed, which hinders deeper exploration in business analysis (for example, the share of a province's sales appears in result table A, but which orders and which customers contributed that revenue cannot be obtained from the result table).
Meanwhile, conventional analysis expressions are computed within a single dimensional frame (e.g., addition, subtraction, multiplication, and division over fixed dimensions), which is rather limited; computing analysis expressions across dimensions, with multi-dimensional data extraction, summarization, and addressed recalculation, requires more advanced function expressions.
To address these problems, this application provides a design method and a technical implementation for data analysis expressions.
Disclosure of Invention
The invention provides a data processing method based on in-memory computation and SQL computation, which effectively improves the flexibility of BI in business computation: it is not limited by a data warehouse, requires no advance pre-computation and no data extraction to build a data center, and can realize complex business calculations by means of SQL and an in-memory computing mechanism alone.
The technical solution that achieves the purpose of the invention is as follows:
a data processing method based on memory calculation and SQL calculation comprises the following steps:
reconstructing an analysis expression semantic system to establish an internal core grammar rule of a database across granularity expressions and an external layer grammar rule across objects returned by the granularity expressions;
the reconstruction analysis expression computing framework comprises a computing logic generator and a computing executor, wherein the computing logic generator identifies computing logic according to the input semantic model and model information of the metadata model, and constructs a DAG stream according to the computing logic;
and the computing executor receives the DAG stream and the plurality of computing models, generates computing tasks and submits the tasks to release and execute.
The metadata model refers to the metadata information of the data to be analyzed. Combining the semantic model with the metadata model produces the computational model; the semantic model is essentially grammatical, but it must work together with the metadata model.
In a preferred embodiment of the present invention, the metadata information of the data to be analyzed is automatically extracted from the input content, and a metadata model is built.
In a preferred embodiment of the present invention, the computing executor converts computing models whose parent view is a basic physical table into SQL and dispatches them for execution, performs multi-stage, step-by-step computation on the grid model and the multiple view models in memory, and returns the computation results as one or more DataFrame structures.
In a preferred embodiment of the invention, the multi-stage, step-by-step calculation comprises a splitting step: according to the metadata and semantic input, determining whether the keyword Fixed exists and how many times it occurs; for each Fixed occurrence, splitting out a view, then generating the corresponding SQL according to the definition in the view and delivering it to the grid model for calculation.
This yields N view models and N grid models, and the DAG flow is generated according to the parent-child relationships between the views.
In a preferred embodiment of the present invention, the DAG flow includes, but is not limited to, view models and grid models; the lineage of each model is traced to determine each model's inputs, outputs, and associated fields, which are expanded in SQL in a hierarchically nested manner.
In a preferred embodiment of the present invention, the cross-granularity expression includes: pre-computing a result at a certain granularity level, and then applying that result in the display view to perform a new query.
In a preferred embodiment of the invention, a cross-granularity expression is constructed that includes a plurality of keywords and a plurality of aggregation functions, each keyword corresponding to a different function.
In a preferred embodiment of the present invention, the generation, recognition, and validation of the analysis expression grammar system are accomplished based on AntLR4 and the APIs it provides.
In a preferred embodiment of the present invention, the cross-granularity expression inner core grammar rules include the following:
● The whole expression is wrapped in { }, and { } nesting is supported inside (i.e. the fields inside the expression can themselves be expressions);
● The expression must include a colon ":", which divides the expression into two parts, a left segment and a right segment;
● The left segment must include exactly one keyword, which is case-insensitive;
● The left segment must include 0 or more fields, which will affect the view granularity;
● The left segment can also be a function expression, and operators are supported;
● The right segment must include an aggregation field and an aggregation function, which determine the statistical indicator of the view;
● The right segment can also be a function expression, and operators are supported.
In a preferred embodiment of the present invention, the object returned by the cross-granularity expression is a detail-level data column.
Compared with the prior art, the invention has the beneficial effects that:
the invention effectively improves the flexibility of BI in service calculation, is not limited by a data warehouse, is not required to be pre-calculated in advance, or is used for extracting data to establish a data center, and can realize complex service calculation by means of SQL and a memory calculation mechanism.
Drawings
FIG. 1 is a functional block diagram of the present invention;
FIG. 2 is a flow chart of the computational logic of the present invention.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application may, however, be embodied in many other ways than those described herein, and similar generalizations can be made by those skilled in the art without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
A data processing method based on in-memory computation and SQL computation comprises the following steps: reconstructing the analysis expression semantic system to establish inner core grammar rules for cross-granularity expressions and outer-layer grammar rules for the objects returned by cross-granularity expressions;
reconstructing the analysis expression computing framework, which comprises a computational logic generator and a computing executor, wherein the computational logic generator identifies the computational logic from the input semantic model and the model information of the metadata model and constructs a DAG flow according to the computational logic; and the computing executor receives the DAG flow and a plurality of computing models, generates computing tasks, and submits and dispatches them for execution.
Embodiment 1:
Referring to FIG. 1, a data processing method based on in-memory computation and SQL computation includes: reconstructing the analysis expression semantic system to establish inner core grammar rules for cross-granularity expressions and outer-layer grammar rules for the objects returned by cross-granularity expressions;
reconstructing the analysis expression computing framework, which comprises a computational logic generator and a computing executor, wherein the computational logic generator identifies the computational logic from the input semantic model and the model information of the metadata model and constructs a DAG flow according to the computational logic; and the computing executor receives the DAG flow and a plurality of computing models, generates computing tasks, and submits and dispatches them for execution.
1.1 Analysis expression grammar system
1.1.1 Basic concepts
Understanding analysis expressions first requires understanding the concepts of view and granularity.
View: in data analysis, a view is the set of statistical data formed for a certain analysis scenario with determined dimensions, measures, and calculation methods. In essence, a view is a dataset.
Particle size: the smallest unit is represented on the business in a row of data for that view. In a general sense, the minimum granularity of the view is constituted by a specific set of dimensions in the view.
Assume a sales order detail table exists; from the sales order data we produce the following two query results:
Result A:
Region | Sales amount
Northwest | 500000
Northeast | 600000
Result B:
Region | Province | Sales amount
Northwest | Shanxi | 100000
Northeast | Liaoning | 100000
Northwest | Shanxi | 120000
Northeast | Liaoning | 120000
Results A and B are two views of the sales order detail table.
For view A, the object represented by one row of data is an indicator record for a certain region, and its granularity is the region.
For view B, the object represented by one row of data is an indicator record for a province within a region, and its granularity is region + province.
Analysis expressions fall broadly into several types:
● Detail-level expression: for example, cost = [sales] - [profit]; a detail-level computational expression created on the basis of the original dataset.
● Aggregate computational expression: for example, profit margin = sum([profit]) / sum([sales]); relative to the detail-level expression, this is a computational expression based on the result dataset (i.e., the result dataset is generated by summarizing profit and sales along some dimension, and the expression is calculated on the basis of that summarized view).
● Cross-granularity expression: an expression that pre-computes data at a certain granularity level (in this example, the total amount each customer has spent historically is obtained in advance), and then applies the computed result (the customer's total sales) to a new query (such as averaging or summing) in the front view, as shown in the figure below.
● The front view (parent view) is the data query view.
[Figure: the per-customer aggregation result that yields the new CLV field; reproduced only as an image in the original publication.]
A new field, CLV, is obtained; based on it, we can calculate the mean or maximum CLV per region in the front view.
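As an illustration only (not part of the patent text), the Fixed-style cross-granularity calculation above can be sketched with pandas; the column names region, customer, and sales and the sample values are assumed:

```python
import pandas as pd

# Hypothetical order-detail data; column names and values are assumed.
orders = pd.DataFrame({
    "region":   ["Northwest", "Northwest", "Northeast", "Northeast"],
    "customer": ["A", "A", "B", "C"],
    "sales":    [100, 150, 200, 50],
})

# {Fixed [customer]: sum([sales])} -- aggregate at a fixed granularity (customer),
# independent of the dimensions used in the front view.
clv = (orders.groupby("customer", as_index=False)["sales"]
             .sum()
             .rename(columns={"sales": "CLV"}))

# The returned object is a detail-level data column: join the customer-level
# result back onto the detail rows, then query it in the front view,
# e.g. the mean CLV per region.
detail = orders.merge(clv, on="customer")
print(detail.groupby("region")["CLV"].mean())
```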
1.1.2 Keyword definitions
The keywords of the cross-granularity analysis expression mainly include Fixed, Exclude, Include, and Lookup.
●Fixed
This keyword indicates that the granularity of the view represented by the expression is fixed; the fixed dimensions are determined by the dimension fields defined after Fixed and are not affected by the dimensions in the main view, as in the customer lifetime value (CLV) calculation above.
●Exclude
This keyword indicates that the view represented by the expression has a coarser granularity than the main view; when determining the expression's granularity, part of the main view's query granularity is removed, for example to perform a higher-level summary aggregation. If the front view counts sales by region and product type, the expression {Exclude [region]: sum([sales])} removes the region dimension before aggregating (i.e., it aggregates sales by product).
●Include
This keyword indicates that the view represented by the expression has a finer granularity than the main view; a dimension is added to the main view's query granularity to determine the expression's granularity. If the front view counts sales by region and product type, the expression {Include [province]: sum([sales])} yields the sales of a given region, product, and province.
●Lookup
This keyword indicates that the view represented by the expression refines the front view's granularity by the dimension specified in the expression (currently only date is supported), and realizes cross-row fetching and cross-row calculation according to the row-offset parameter defined in the expression, for example to calculate year-over-year or period-over-period sales ratios.
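A minimal sketch (an editor's illustration under assumed column names, not the patent's implementation) of how these keywords change granularity, again using pandas:

```python
import pandas as pd

# Hypothetical detail data; column names are assumed for illustration.
df = pd.DataFrame({
    "region":  ["NW", "NW", "NE", "NE"],
    "product": ["P1", "P2", "P1", "P2"],
    "month":   ["2019-01", "2019-02", "2019-01", "2019-02"],
    "sales":   [10, 20, 30, 40],
})

# Front view: sales by region and product.
front = df.groupby(["region", "product"], as_index=False)["sales"].sum()

# {Exclude [region]: sum([sales])} -- coarser than the front view: drop the region dimension.
by_product = df.groupby("product")["sales"].sum()

# {Include [month]: sum([sales])} -- finer than the front view: add the month dimension.
by_region_product_month = df.groupby(["region", "product", "month"])["sales"].sum()

# {Lookup ...} -- cross-row fetch along the date dimension, e.g. previous-period
# sales used for a period-over-period ratio.
monthly = df.groupby("month")["sales"].sum().sort_index()
period_over_period = monthly / monthly.shift(1)
print(period_over_period)
```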
1.1.3 Supported function definitions
The aggregation functions supported by cross-granularity analysis expressions include:
● Sum: summation; supports numeric fields
● Avg: average; supports numeric fields
● Count: count; supports date, numeric, and character fields
● Dcount: distinct count; supports date, numeric, and character fields
● Max: maximum; supports date, numeric, and character fields
● Min: minimum; supports date, numeric, and character fields
The functions supported by the cross-granularity analysis expression are independent of the aggregate functions of the front view.
● In addition, common detail-level functions are supported for the fields that follow the keywords Fixed, Exclude, and Include, including but not limited to: character functions (e.g. substr, concat), numeric functions (e.g. abs), and date functions (year/month/day/now). The operators +, -, *, / and the like are supported.
1.1.4 Inner grammar definition
The cross-granularity expression inner core grammar rules include the following:
● The whole expression is wrapped in { }, and { } nesting is supported inside (i.e. the fields inside the expression can themselves be expressions).
● The expression must include a colon ":", which divides the expression into two parts, a left segment and a right segment.
● The left segment must include exactly one keyword, which is case-insensitive.
● The left segment must include 0 or more fields (these fields will affect the view granularity).
● The left segment can also be a function expression, and operators are supported.
● The right segment must include an aggregation field and an aggregation function (these determine the statistical indicator of the view).
● The right segment can also be a function expression, and operators are supported.
Based on ANTLR4 (ANother Tool for Language Recognition), a cross-granularity inner grammar file is defined (an excerpt follows), and grammar generation, recognition, and validation are completed based on ANTLR4 and the APIs it provides.
(Note: this application is based on the application and implementation of this tool in the present scenario.)
[Grammar definition file excerpt; reproduced only as an image in the original publication.]
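Since the grammar file itself is only shown as an image, the following illustrative expressions (editor's examples with assumed field names, not taken from the patent) conform to the inner grammar rules listed above; the regular-expression check is only a crude shape test, not the patent's ANTLR4 grammar:

```python
import re

# Editor's illustrative examples; field names ([customer], [sales], [region], [province]) are assumed.
examples = [
    "{Fixed [customer]: sum([sales])}",        # fixed granularity: one value per customer
    "{Exclude [region]: sum([sales])}",        # coarser: remove region from the front view
    "{Include [province]: sum([sales])}",      # finer: add province to the front view
    "{Fixed [customer]: sum({Include [province]: sum([sales])})}",  # { } nesting
]

# Crude check of the inner rules: wrapped in { }, one leading keyword,
# a colon splitting the left and right segments.
INNER = re.compile(r"^\{\s*(fixed|exclude|include|lookup)\b[^:]*:.+\}$",
                   re.IGNORECASE | re.DOTALL)
for e in examples:
    print(bool(INNER.match(e)), e)
```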
1.1.5 Outer-layer grammar definition
The object returned by a cross-granularity expression is a detail-level data column, which can be used in the following positions:
● It may be nested in a new cross-granularity expression.
● It may be nested in a new aggregation function.
● It can be used in detail-level functions.
● It can participate in detail-level +, -, *, / operations.
An excerpt of the grammar definition file follows:
[Outer-layer grammar definition file excerpt; reproduced only as an image in the original publication.]
1.1.6 Semantic model generation
A grammar-checking instance is generated based on AntLR4, expression semantic analysis is carried out in visitor mode, and a semantic model is generated.
For example, the expression avg({Fixed [customer name]: sum([sales amount])}) (average customer lifetime value) is parsed by applying the detailfun.g4 grammar file, and the parsing result is returned.
The returned parsing result typically includes the following:
Expression content: avg({Fixed [customer name]: sum([sales amount])})
Expression type: Fixed
Aggregation dimension: [customer name]
Measure: sales amount
Measure method: sum
Outer-layer measure of the expression: average
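A sketch of what such a parsing result might look like as a data structure (the patent lists only the fields above; the dictionary layout and key names below are assumed):

```python
# Editor's sketch of the semantic model returned for
#   avg({Fixed [customer name]: sum([sales amount])})
# Key names and structure are assumed for illustration.
semantic_model = {
    "expression":       "avg({Fixed [customer name]: sum([sales amount])})",
    "expression_type":  "Fixed",
    "aggregation_dims": ["customer name"],   # dimensions fixed by the keyword
    "measure":          "sales amount",
    "measure_method":   "sum",               # inner aggregation
    "outer_measure":    "avg",               # aggregation applied outside the expression
}
print(semantic_model["expression_type"], semantic_model["aggregation_dims"])
```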
2.2 Analysis expression computing framework
The analysis expression computing framework is a parsing and execution framework that mixes in-memory computing with SQL computing, and is an enhancement of the existing computing framework.
The computing framework mainly comprises the following contents:
2.2.1 Metadata model
The metadata model is the basic input model that defines the calculation process; it is generated automatically from the content entered by the user and the semantic model produced from the analysis expression.
The primary content of the metadata model covers the basic descriptive information of the data, mainly including:
● The list of fields involved in the expression
● The data source each field belongs to
● The physical table each field belongs to
● The view each field belongs to
● The type, precision, length, name, alias, comment, and similar information of each field
● The semantic definition (dimension/measure) of each field in the calculation
● Field calculation rules and grammar
● Derived fields and their lineage
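A minimal sketch of how this per-field metadata could be represented (editor's illustration; the attribute names are assumed, the patent only enumerates the categories of information):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Editor's sketch of one metadata record per field; names are assumptions.
@dataclass
class FieldMeta:
    name: str                        # field name
    data_source: str                 # data source the field belongs to
    physical_table: str              # physical table the field belongs to
    view: Optional[str] = None       # view the field belongs to
    dtype: str = "string"            # type; precision and length would also live here
    alias: Optional[str] = None
    comment: Optional[str] = None
    role: str = "dimension"          # semantic definition: dimension or measure
    calc_rule: Optional[str] = None  # field calculation rule / grammar
    lineage: List[str] = field(default_factory=list)  # fields this one is derived from

@dataclass
class MetadataModel:
    fields: List[FieldMeta] = field(default_factory=list)

meta = MetadataModel(fields=[FieldMeta("sales_amount", "sales_db", "sales_order", role="measure")])
print(meta.fields[0])
```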
2.2.2 Computational model
The computational model mainly records the calculation logic of a certain data object; a corresponding computational model is generated from the semantics and the metadata model, and it generally comprises the following:
● Core components: the computational logic generator, view model, grid model, DAG flow model, and executor
● View model: used to describe a certain view in the sub-computation process; a computation is typically composed of several views. It generally includes the following (see the sketch after this section):
■ Parent view: the base data view on which this view is built
■ Group fields: placed in the GROUP BY clause when assembling the SQL; they represent the granularity of the view
■ Filter fields: placed in the filter clauses (both inner and outer layers) when assembling the SQL
■ Ordering fields: placed in the ORDER BY clause when assembling the SQL
■ Measure fields: placed in the SELECT clause when assembling the SQL
■ Association fields: placed on the JOIN keywords when assembling the SQL
■ Summary information of the generated SQL
● Grid model: typically includes the following:
■ Partition fields: core fields of the in-memory computing model
■ Addressing fields: core fields of the in-memory computing model
■ Memory grid: a memory-based data structure that temporarily stores intermediate computation results.
● DAG flow model: based on the semantic model of the expression, the program automatically determines and generates the DAG task flow; it is the technical realization of converting the semantics into a computation process.
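As referenced in the view model description above, the following is an editor's sketch of how a view model's fields could be assembled into SQL; the dictionary layout, key names, and table names are all assumed:

```python
# Editor's sketch: assembling SQL from a view-model-like dict. The mapping of
# fields to clauses follows the description above; all names are assumptions.
def build_view_sql(view: dict) -> str:
    select_cols = view["group_fields"] + view["measure_fields"]   # measures go into SELECT
    sql = f"SELECT {', '.join(select_cols)} FROM ({view['parent_view_sql']}) t"
    if view.get("join"):                                          # association fields -> JOIN ... ON
        sql += f" JOIN {view['join']['table']} ON {view['join']['on']}"
    if view.get("filters"):                                       # filter fields -> WHERE
        sql += " WHERE " + " AND ".join(view["filters"])
    if view["group_fields"]:                                      # group fields -> GROUP BY
        sql += " GROUP BY " + ", ".join(view["group_fields"])
    if view.get("order_fields"):                                  # ordering fields -> ORDER BY
        sql += " ORDER BY " + ", ".join(view["order_fields"])
    return sql

clv_view = {
    "parent_view_sql": "SELECT * FROM sales_order",
    "group_fields":    ["customer_name"],
    "measure_fields":  ["SUM(sales_amount) AS clv"],
    "filters":         [],
    "order_fields":    [],
}
print(build_view_sql(clv_view))
```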
2.2.3 Computational logic generator
The generation of the computational logic is shown in FIG. 2:
the computational logic generator will identify computational logic based on the entered semantic model and model information for the metadata model.
When keywords such as Fixed or Exclude exist in the analysis expression, multiple view models with parent-child relationships are organized by default, generating 1 to N view models.
The parent-child relationships among the view models are fully recorded; the computational logic generator traces the lineage of each model, determines each model's inputs, outputs, and associated fields, and expands them in SQL in a hierarchically nested manner.
Meanwhile, for models using aggregate expressions, table calculation expressions, and the like, a grid model is generated by default.
The computational logic generator generates a DAG task flow that describes the overall computational process and computational steps.
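A minimal sketch (editor's illustration, not the patent's code) of building such a DAG task flow from parent-child view relations and deriving an execution order:

```python
from collections import defaultdict, deque

# Editor's sketch: a DAG flow as an adjacency list built from parent -> child
# view relations, plus a topological order used as the execution sequence.
def build_dag(parent_child_pairs):
    children = defaultdict(list)
    indegree = defaultdict(int)
    nodes = set()
    for parent, child in parent_child_pairs:
        children[parent].append(child)
        indegree[child] += 1
        nodes.update((parent, child))
    order = []
    queue = deque(n for n in nodes if indegree[n] == 0)
    while queue:
        node = queue.popleft()
        order.append(node)
        for c in children[node]:
            indegree[c] -= 1
            if indegree[c] == 0:
                queue.append(c)
    return children, order

# e.g. the base physical view feeds a Fixed sub-view, which feeds a grid model.
dag, exec_order = build_dag([("base_view", "fixed_view"), ("fixed_view", "grid_model")])
print(exec_order)  # ['base_view', 'fixed_view', 'grid_model']
```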
2.2.4 Computing executor
The computing executor receives the input DAG flow and the plurality of computing models, generates computing tasks, and submits and dispatches them for execution.
During execution, computing models whose parent view is a basic physical table are first converted into SQL (Structured Query Language) and dispatched for execution; the grid model and the more complex view models are then computed in memory in multiple stages and steps.
The computing executor has a certain optimization capability: repeated view models are appropriately merged and compressed to reduce the number of queries and the memory footprint.
The calculation results are returned as one or more DataFrame structures. A DataFrame is a tabular data structure containing an ordered set of columns, each of which may hold a different value type (numeric, string, boolean, etc.). The DataFrame has both row and column indices.
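A hedged sketch of this execution loop (editor's illustration with an assumed node layout and names): views backed by a physical table are pushed down as SQL, the other nodes are computed in memory, and the results come back as DataFrames:

```python
import sqlite3
import pandas as pd

# Editor's sketch of the execution loop. Nodes whose parent view is a physical
# table are pushed down as SQL; other nodes run in memory on the DataFrames
# produced so far. All names and the node layout are assumptions.
def execute(exec_order, nodes, conn) -> dict:
    results: dict[str, pd.DataFrame] = {}
    for name in exec_order:
        node = nodes[name]
        if node["kind"] == "sql":                     # physical-table-backed view -> SQL push-down
            results[name] = pd.read_sql_query(node["sql"], conn)
        else:                                         # grid / complex view -> in-memory step
            results[name] = node["compute"](results)  # multi-stage, step-by-step computation
    return results

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_order (customer TEXT, region TEXT, sales REAL)")
conn.executemany("INSERT INTO sales_order VALUES (?, ?, ?)",
                 [("A", "NW", 100), ("A", "NW", 150), ("B", "NE", 200)])

nodes = {
    "clv_view": {"kind": "sql",
                 "sql": "SELECT customer, region, SUM(sales) AS clv "
                        "FROM sales_order GROUP BY customer, region"},
    "clv_by_region": {"kind": "memory",
                      "compute": lambda r: r["clv_view"].groupby("region", as_index=False)["clv"].mean()},
}
frames = execute(["clv_view", "clv_by_region"], nodes, conn)
print(frames["clv_by_region"])
```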
The detailed description above is specific to practical embodiments of the present invention and is not intended to limit its scope; all equivalent embodiments or modifications that do not depart from the spirit of the present invention shall fall within its scope.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only a single independent technical solution; this manner of description is adopted only for clarity, and the specification should be taken as a whole. The technical solutions in the embodiments may be combined as appropriate to form other implementations that will be apparent to those skilled in the art.

Claims (8)

1. A data processing method based on in-memory computation and SQL computation, characterized by comprising the following steps:
reconstructing the analysis expression semantic system to establish inner core grammar rules for cross-granularity expressions and outer-layer grammar rules for the objects returned by cross-granularity expressions; wherein the cross-granularity expression inner core grammar rules include: the whole expression is wrapped in { }, and { } nesting is supported inside; the expression must include a colon ":", which divides it into two parts, a left segment and a right segment; the left segment includes exactly one keyword, which is case-insensitive; the left segment includes 0 or more fields, or the left segment is a function expression, with operators supported; the right segment includes an aggregation field and an aggregation function, or the right segment is a function expression, with operators supported;
reconstructing the analysis expression computing framework, which comprises a computational logic generator and a computing executor, wherein the computational logic generator identifies the computational logic from the input semantic model and the model information of the metadata model and constructs a DAG flow according to the computational logic;
the computing executor receives the DAG flow and a plurality of computing models, generates computing tasks, and submits and dispatches them for execution; wherein generating the computing tasks comprises: the computing executor receives the input DAG flow and the plurality of computing models, generates computing tasks, and submits and dispatches them for execution; during execution, computing models whose parent view is a basic physical table are first converted into SQL (Structured Query Language) and dispatched for execution, and the grid model and the more complex view models are computed in memory in multiple stages and steps; the computing executor has a certain optimization capability, and repeated view models are appropriately merged and compressed to reduce the number of queries and the memory footprint; the calculation result is returned as one or more DataFrame structures; a DataFrame is a tabular data structure containing an ordered set of columns, each of which may hold a different value type; the DataFrame has both a row index and a column index.
2. The data processing method based on in-memory computation and SQL computation according to claim 1, wherein the metadata information of the data to be analyzed is automatically extracted from the input content, and a metadata model is built.
3. The data processing method based on in-memory computation and SQL computation according to claim 2, wherein the multi-stage, step-by-step calculation comprises a splitting step: according to the metadata and semantic input, determining whether the keyword Fixed exists and how many times it occurs; for each Fixed occurrence, splitting out a view, then generating the corresponding SQL according to the definition in the view, and delivering it to the grid model for calculation.
4. The method according to claim 1, wherein the DAG flow includes, but is not limited to, view models and grid models; the lineage of each model is traced to determine each model's inputs, outputs, and associated fields, and the DAG flow is expanded in SQL in a hierarchically nested manner.
5. The data processing method based on in-memory computation and SQL computation according to any one of claims 1 to 4, wherein the cross-granularity expression includes: pre-computing a result at a certain granularity level, and then applying that result in the display view to perform a new query.
6. The method according to claim 5, wherein a cross-granularity expression is constructed that includes a plurality of keywords and a plurality of aggregation functions, each keyword corresponding to a different function.
7. The method according to claim 1, wherein the generation, recognition, and validation of the analysis expression grammar system are performed based on AntLR4 and the APIs provided by AntLR4.
8. The data processing method based on in-memory computation and SQL computation according to claim 1, wherein the object returned by the cross-granularity expression is a detail-level data column.
CN201911254622.6A 2019-12-10 2019-12-10 Data processing method based on in-memory computation and SQL computation Active CN111061767B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911254622.6A CN111061767B (en) Data processing method based on in-memory computation and SQL computation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911254622.6A CN111061767B (en) Data processing method based on in-memory computation and SQL computation

Publications (2)

Publication Number Publication Date
CN111061767A CN111061767A (en) 2020-04-24
CN111061767B true CN111061767B (en) 2023-05-05

Family

ID=70300205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911254622.6A Active CN111061767B (en) Data processing method based on in-memory computation and SQL computation

Country Status (1)

Country Link
CN (1) CN111061767B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112269792B (en) * 2020-12-11 2021-07-02 腾讯科技(深圳)有限公司 Data query method, device, equipment and computer readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106325869A (en) * 2016-08-24 2017-01-11 浪潮(北京)电子信息产业有限公司 Configuration file operation processing method and device
CN107704265A (en) * 2017-09-30 2018-02-16 电子科技大学 A kind of configurable rule generating method of service-oriented stream

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6341288B1 (en) * 1998-07-29 2002-01-22 Sybase, Inc. Database system with methodology for accessing a database from portable devices
US20140289280A1 (en) * 2013-03-15 2014-09-25 Perforce Software, Inc. System and Method for Bi-directional Conversion of Directed Acyclic Graphs and Inter-File Branching
CN103279358B (en) * 2013-06-08 2016-04-27 北京首钢自动化信息技术有限公司 A kind of explanation type Service Component dynamic fixing method of Industry-oriented application
CN103577590A (en) * 2013-11-12 2014-02-12 北京润乾信息系统技术有限公司 Data query method and system
EP3475887B1 (en) * 2016-08-22 2023-07-19 Oracle International Corporation System and method for dynamic lineage tracking, reconstruction, and lifecycle management
CN109492083A (en) * 2018-11-05 2019-03-19 北京奥法科技有限公司 A method of more wheel human-computer intellectualizations are realized based on list content
CN110018829B (en) * 2019-04-01 2022-11-11 北京东方国信科技股份有限公司 Method and device for improving execution efficiency of PL/SQL language interpreter

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106325869A (en) * 2016-08-24 2017-01-11 浪潮(北京)电子信息产业有限公司 Configuration file operation processing method and device
CN107704265A (en) * 2017-09-30 2018-02-16 电子科技大学 A kind of configurable rule generating method of service-oriented stream

Also Published As

Publication number Publication date
CN111061767A (en) 2020-04-24

Similar Documents

Publication Publication Date Title
CN110618983B (en) JSON document structure-based industrial big data multidimensional analysis and visualization method
CN108292315B (en) Storing and retrieving data in a data cube
CN106997386B (en) OLAP pre-calculation model, automatic modeling method and automatic modeling system
US10366100B2 (en) Aggregation framework system architecture and method
Phipps et al. Automating data warehouse conceptual schema design and evaluation.
US20180300381A1 (en) Aggregation framework system architecture and method
CN105718565B (en) The construction method and construction device of data warehouse model
US20110119300A1 (en) Method Of Generating An Analytical Data Set For Input Into An Analytical Model
CN104700190B (en) One kind is for project and the matched method and apparatus of professional
KR20010072019A (en) Method and apparatus for selecting aggregate levels and cross product levels for a data warehouse
CN109710663A (en) A kind of data statistics chart generation method
CN111061767B (en) Data processing method based on in-memory computation and SQL computation
CN108182204A (en) The processing method and processing device of data query based on house prosperity transaction multi-dimensional data
CN114490571A (en) Modeling method, server and storage medium
CA3130648A1 (en) Data processing query method and device based on olap pre-calculation model
CN113342843A (en) Big data online analysis method and system
CN113255309A (en) Index calculation engine implementation method based on multi-dimensional model
CN112667859A (en) Data processing method and device based on memory
CN109933622A (en) A kind of data visualisation system and implementation method
CN110134729A (en) Data calculate analysis system and method
Zhang Parameter Curation and Data Generation for Benchmarking Multi-model Queries.
CN114090627B (en) Data query method and device
CN117056350A (en) Multi-dimensional analysis dynamic acceleration method, system and device based on self-adaption
CN108052522A (en) A kind of method and system that dynamic optimization is carried out to OLAP precomputations model
Yarygina et al. Bi-objective optimization for approximate query evaluation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant