CN109684370A - Daily record data processing method, system, equipment and storage medium - Google Patents
Info
- Publication number
- CN109684370A CN109684370A CN201811041735.3A CN201811041735A CN109684370A CN 109684370 A CN109684370 A CN 109684370A CN 201811041735 A CN201811041735 A CN 201811041735A CN 109684370 A CN109684370 A CN 109684370A
- Authority
- CN
- China
- Prior art keywords
- data
- log data
- transmission component
- message
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/06—Management of faults, events, alarms or notifications
- H04L41/069—Management of faults, events, alarms or notifications using logs of notifications; Post-processing of notifications
Abstract
The present invention discloses a log data processing method, system, device and storage medium, relating to big data analysis and processing. The method comprises: a data transmission component determines, according to a received data acquisition request, the target data source and the data type of the data stored in the target data source, calls the preset script corresponding to that data type to perform data acquisition, performs initialization processing on the collected log data, and sends the result to a preset message cluster; the preset message cluster generates a message queue from the initialized log data and sends the message queue to the data transmission component; the data transmission component determines a consumption order from the message queue, consumes the log data stored in the preset message cluster in that order, and saves the consumed data through a data search component. Because the script used to acquire log data from a data source is selected according to the type of the data stored there, the coverage of data acquisition is increased and diversified acquisition of log data is realized.
Description
Technical field
The present invention relates to the technical field of information processing, and more particularly to a log data processing method, system, device and storage medium.
Background art
The ELK platform is a widely used log analysis platform, where E, L and K are the initial letters of three open-source products: Elasticsearch, Logstash and Kibana. Elasticsearch is a distributed search engine built on an open-source search engine library that also provides data storage; Logstash is mainly used for filtering, modifying and collecting log data and supports many different data acquisition modes; Kibana presents the results to users in a friendly form over the HyperText Transfer Protocol (HTTP). When an existing ELK platform acquires and analyzes data from sources such as an external system's database or application logs, it typically collects, processes and analyzes only the error-related data of the application, so its coverage is low and it cannot support large-scale data.
The above content is provided only to facilitate understanding of the technical solution of the present invention, and does not constitute an admission that it is prior art.
Summary of the invention
The main purpose of the present invention is to provide a log data processing method, system, device and storage medium, aiming to solve the technical problem that existing log analysis platforms have low coverage and cannot support large-scale data when acquiring and analyzing log data.
To achieve the above object, the present invention provides a log data processing method comprising the following steps:

a data transmission component determines, according to a received data acquisition request, the target data source of the current acquisition and the data type of the data stored in that target data source;

the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source, performs initialization processing on the collected log data to obtain initialized log data, and sends the initialized log data to a preset message cluster;

the preset message cluster generates a message queue from the received initialized log data and, upon receiving a message pull request sent by the data transmission component, sends the message queue to the data transmission component;

the data transmission component determines a consumption order from the message queue, consumes the log data stored in the preset message cluster in that order, and sends the consumed log data to a data search component for saving.
Preferably, the step in which the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source, performs initialization processing on the collected log data to obtain initialized log data, and sends the initialized log data to the preset message cluster comprises:

the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source;

the data transmission component obtains a predetermined filtering rule and filters the collected log data according to that rule;

the data transmission component converts the filtered log data into initialized log data in a preset format and sends the initialized log data to the preset message cluster.
Preferably, the step in which the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source comprises:

when the data transmission component determines that the data type is database data or application log, it calls a preset monitoring script to acquire data from the target data source.
Preferably, the step in which the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source, performs initialization processing on the collected log data to obtain initialized log data, and sends the initialized log data to the preset message cluster comprises:

when the data transmission component determines that the data type is access log, it calls the preset script to acquire data from the target data source and obtains user access logs;

the data transmission component performs information filtering on the user access logs according to a preconfigured security strategy to obtain initialized log data, and sends the initialized log data to the preset message cluster.
Preferably, the step in which the preset message cluster generates a message queue from the received initialized log data and, upon receiving a message pull request sent by the data transmission component, sends the message queue to the data transmission component comprises:

the preset message cluster receives the initialized log data and determines the message category to which it belongs;

the preset message cluster stores the initialized log data by category according to the determined message category, and generates from the storage result a message queue containing the consumption order;

upon receiving a message pull request sent by the data transmission component, the preset message cluster sends the message queue to the data transmission component.
Preferably, the step in which the data transmission component determines a consumption order from the message queue, consumes the log data stored in the preset message cluster in that order, and sends the consumed log data to the data search component for saving comprises:

the data transmission component determines the consumption order from the message queue and consumes the log data stored in the preset message cluster in that order;

upon obtaining a data acquisition instruction sent by the data search component, the data transmission component determines the sending-time parameter at which the data search component sent the data acquisition instruction;

the data transmission component calls a preset plug-in to extract the original time parameter contained in the initialized log data and redefines the sending-time parameter according to the original time parameter;

when the parameter redefinition is complete, the data transmission component sends the consumed log data to the data search component for saving.
Preferably, the step in which the data transmission component calls a preset plug-in to extract the original time parameter contained in the initialized log data and redefines the sending-time parameter according to the original time parameter comprises:

the data transmission component calls the preset plug-in to extract the original time parameter contained in the initialized log data and formats the original time parameter to obtain a date field in the target format;

the data transmission component replaces the date field in the sending-time parameter with that date field, thereby redefining the sending-time parameter.
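The extract-format-replace sequence for the time parameter can be illustrated with a minimal Python sketch. This is not the patent's preset plug-in; the field names (`original_time`, `date_field`) and the timestamp formats are invented for illustration only:

```python
from datetime import datetime

def extract_original_time(log_entry: dict) -> str:
    # "original_time" is an illustrative field name, not one the patent specifies.
    return log_entry["original_time"]

def reformat_to_target(raw: str, target_fmt: str = "%Y-%m-%d %H:%M:%S") -> str:
    # Parse an ISO-8601-like source timestamp and emit the target-format date field.
    return datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S").strftime(target_fmt)

def redefine_send_time(send_params: dict, log_entry: dict) -> dict:
    # Replace the date field of the sending-time parameter with the one
    # derived from the log entry's original time parameter.
    updated = dict(send_params)
    updated["date_field"] = reformat_to_target(extract_original_time(log_entry))
    return updated
```

A copy is returned rather than mutating the input, so the original sending-time parameter remains available until the redefinition is known to be complete.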
In addition, to achieve the above object, the present invention also proposes a log data processing system comprising a data transmission component, a data search component and a preset message cluster;

wherein the data transmission component is configured to determine, according to a received data acquisition request, the target data source of the current acquisition and the data type of the data stored in the target data source;

the data transmission component is further configured to call the preset script corresponding to the data type to acquire data from the target data source, perform initialization processing on the collected log data to obtain initialized log data, and send the initialized log data to the preset message cluster;

the preset message cluster is configured to generate a message queue from the received initialized log data and, upon receiving a message pull request sent by the data transmission component, send the message queue to the data transmission component;

the data transmission component is further configured to determine a consumption order from the message queue, consume the log data stored in the preset message cluster in that order, and send the consumed log data to the data search component for saving.
In addition, to achieve the above object, the present invention also proposes a log data processing device comprising a memory, a processor, and a log data processing program stored on the memory and runnable on the processor, the log data processing program being configured to implement the steps of the log data processing method described above.

In addition, to achieve the above object, the present invention also proposes a storage medium on which a log data processing program is stored, the log data processing program implementing the steps of the log data processing method described above when executed by a processor.
In the present invention, the data transmission component determines, according to a received data acquisition request, the target data source of the current acquisition and the data type of the data stored in it, calls the preset script corresponding to the data type to acquire data from the target data source, performs initialization processing on the collected log data to obtain initialized log data, and sends the initialized log data to the preset message cluster; the preset message cluster generates a message queue from the received initialized log data and sends it to the data transmission component upon receiving a message pull request; the data transmission component determines a consumption order from the message queue, consumes the log data stored in the preset message cluster in that order, and sends the consumed log data to the data search component for saving. Because the data transmission component first determines the target data source and then selects the acquisition script according to the type of the data stored there, the coverage of data acquisition is increased and diversified acquisition of log data is realized; at the same time, having the preset message cluster receive the initialized log data and generate a message queue provides an effective way to acquire large-scale log data.
Detailed description of the invention
Fig. 1 is a structural schematic diagram of the log data processing device of the hardware running environment involved in the embodiments of the present invention;

Fig. 2 is a flow diagram of the first embodiment of the log data processing method of the present invention;

Fig. 3 is a flow diagram of the second embodiment of the log data processing method of the present invention;

Fig. 4 is a flow diagram of the third embodiment of the log data processing method of the present invention;

Fig. 5 is a flow diagram of the fourth embodiment of the log data processing method of the present invention;

Fig. 6 is a structural block diagram of the first embodiment of the log data processing system of the present invention.
The realization of the object, the functional characteristics and the advantages of the present invention will be further described in conjunction with the embodiments and with reference to the accompanying drawings.
Specific embodiment
It should be appreciated that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.
Referring to Fig. 1, Fig. 1 is a structural schematic diagram of the log data processing device of the hardware running environment involved in the embodiments of the present invention.

As shown in Fig. 1, the log data processing device may include: a processor 1001, such as a central processing unit (CPU); a communication bus 1002; a user interface 1003; a network interface 1004; and a memory 1005. The communication bus 1002 realizes the connection and communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally also include standard wired and wireless interfaces. The network interface 1004 may optionally include standard wired and wireless interfaces (such as a Wireless Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (RAM) or a stable non-volatile memory (NVM) such as a disk memory, and may optionally be a storage device independent of the aforementioned processor 1001.

Those skilled in the art will understand that the structure shown in Fig. 1 does not limit the log data processing device, which may include more or fewer components than illustrated, combine certain components, or arrange the components differently.

As shown in Fig. 1, the memory 1005, as a storage medium, may include an operating system, a data storage module, a network communication module, a user interface module and a log data processing program.

In the log data processing device shown in Fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with the user. The processor 1001 and the memory 1005 of the log data processing device of the present invention may be set in the device, and the device calls the log data processing program stored in the memory 1005 through the processor 1001 and executes the log data processing method provided by the embodiments of the present invention.
An embodiment of the present invention provides a log data processing method. Referring to Fig. 2, Fig. 2 is a flow diagram of the first embodiment of the log data processing method of the present invention.

In this embodiment, the log data processing method comprises the following steps:
Step S10: the data transmission component determines, according to the received data acquisition request, the target data source of the current acquisition and the data type of the data stored in the target data source;
It should be noted that the log data processing method of this embodiment is implemented on an ELK log analysis platform (hereinafter referred to as the ELK platform), a log analysis system built from the three open-source tools Elasticsearch, Logstash and Kibana. Elasticsearch is an open-source distributed search engine providing the three functions of collecting, analyzing and storing data; Logstash is a tool mainly used for collecting, analyzing and filtering logs and supports a large number of data acquisition modes; Kibana provides a log-analysis-friendly Web interface for Logstash and Elasticsearch that helps summarize, analyze and search important log data. In this embodiment and the following embodiments, the data transmission component is the Logstash tool and the data search component is the Elasticsearch tool.
It should be understood that the data acquisition request can be a data acquisition instruction sent by a user through a terminal device such as a mobile phone, tablet computer or PC, or can be triggered and generated by a preset timing task in the terminal device. The target data source can be a device storing the data required by the data acquisition request (such as a database) or an original medium (such as an application program or a web page server).
In addition, in this embodiment the data type of the data stored in a database is defined as database data, the data type of the data stored in an application program is defined as application log, and the data type of the data stored in a web page server is defined as access log (or web access log). Before this step is executed, staff can write corresponding data acquisition scripts or configure plug-ins in advance for the different data types, so as to realize diversified, targeted data acquisition with higher coverage.
Of course, in order for the data transmission component Logstash to quickly determine, according to the received data acquisition request, the data type of the data stored in the data source targeted by the current acquisition, mapping relations between data source identifiers and data types can be established in Logstash in advance, so that after determining the target data source of the current acquisition, Logstash can quickly determine the data type of its stored data from the identifier corresponding to that data source.
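The identifier-to-type mapping described above can be sketched as a simple lookup table. This minimal Python sketch is not part of the patent; the source identifiers and type names are invented for illustration:

```python
# Hypothetical registry mapping data-source identifiers to data types,
# pre-built so the transmission component can resolve the type without
# probing the source itself.
SOURCE_TYPE_MAP = {
    "orders_db": "database",
    "payment_app": "application_log",
    "web_frontend": "web_access_log",
}

def resolve_data_type(source_id: str) -> str:
    # Resolve the stored-data type for a target data source from its identifier.
    if source_id not in SOURCE_TYPE_MAP:
        raise KeyError(f"no data type registered for source {source_id!r}")
    return SOURCE_TYPE_MAP[source_id]
```

The point of the pre-established table is that type resolution becomes a constant-time lookup at acquisition time rather than an inspection of the data source.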
In the concrete realization, the data transmission component Logstash first determines the target data source of the current acquisition according to the received data acquisition request, and then quickly determines the data type of the data stored in the target data source through the pre-established mapping relations between data source identifiers and data types, using the identifier corresponding to the target data source.
Step S20: the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source, performs initialization processing on the collected log data to obtain initialized log data, and sends the initialized log data to the preset message cluster;
It should be noted that, before this step is executed, staff can configure corresponding scripts for data sources of different data types to acquire their data. For example, for database data a monitoring script can be written in advance to monitor the event information in the database and obtain log data such as security or business logs; for the application log of an application program, data can be acquired through a monitoring script written in advance, or the log4j configuration file in the application program can be modified to control the output target of the log information and acquire the log data; for the web access log of a web page server, the access log can be acquired through a preset script plug-in.
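The per-type script configuration described above amounts to a dispatch table keyed on data type. The following Python sketch is illustrative only: the handler bodies are placeholders, not the patent's actual monitoring scripts, log4j hooks or script plug-ins:

```python
def collect_from_database(source: str) -> str:
    # Stand-in for a monitoring script watching event information in a database.
    return f"db-monitor:{source}"

def collect_application_log(source: str) -> str:
    # Stand-in for acquisition via a pre-written script or modified log4j config.
    return f"log4j-output:{source}"

def collect_web_access_log(source: str) -> str:
    # Stand-in for a preset script plug-in reading a web server's access log.
    return f"access-plugin:{source}"

SCRIPT_BY_TYPE = {
    "database": collect_from_database,
    "application_log": collect_application_log,
    "web_access_log": collect_web_access_log,
}

def acquire(source: str, data_type: str) -> str:
    # Select the preset acquisition script according to the data type.
    handler = SCRIPT_BY_TYPE.get(data_type)
    if handler is None:
        raise ValueError(f"no acquisition script configured for type {data_type!r}")
    return handler(source)
```

Registering a new data type then only requires adding one entry to the table, which is what gives the scheme its extensible coverage.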
Further, considering that among current script engines the Lua scripting language is the fastest and is easy to maintain, in this embodiment, when the data transmission component Logstash determines that the data type is web access log, it can call a preset Lua script to acquire data from the target data source and obtain user access logs; it then performs information filtering on the user access logs according to a preconfigured security strategy to obtain initialized log data, and sends the initialized log data to the preset message cluster. The information filtering of the user access logs according to the preconfigured security strategy can be realized by calling a Web Application Firewall (WAF) module, and the security strategy can be an anomaly detection protocol, written by staff according to working experience, that performs security verification on user access activity.
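One way to picture the WAF-style filtering of user access logs is a small rule-driven filter. The rules below are toy examples standing in for the staff-written security strategy; a real anomaly detection rule set would be far larger and is not specified by the patent:

```python
import re

# Toy anomaly-detection rules; each pattern marks an access-log line as suspect.
SUSPICIOUS = [
    re.compile(r"(?i)union\s+select"),  # SQL injection probe
    re.compile(r"(?i)<script"),         # cross-site scripting probe
    re.compile(r"\.\./"),               # path traversal probe
]

def filter_access_logs(lines):
    # Keep only access-log lines that trip no anomaly rule; the survivors
    # form the initialized log data to be sent on to the message cluster.
    return [line for line in lines
            if not any(rule.search(line) for rule in SUSPICIOUS)]
```

Whether flagged lines are dropped, quarantined or forwarded to a monitoring channel is a policy decision; the sketch simply drops them.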
In the concrete realization, after determining the data type of the data stored in the target data source, the data transmission component Logstash calls the preset script corresponding to that data type to acquire data from the target data source, performs initialization processing on the collected log data to obtain initialized log data, and then sends the initialized log data to the preset message cluster.
Step S30: the preset message cluster generates a message queue from the received initialized log data and, upon receiving a message pull request sent by the data transmission component, sends the message queue to the data transmission component;
It should be noted that the preset message cluster can be a distributed open-source message middleware or server cluster with advantages such as high performance, high reliability and high real-time capability, for example a RocketMQ cluster or a Kafka cluster. In practical applications, the preset message cluster can store the messages obtained from the producer (Producer) by category to obtain a message queue, which the consumer (Consumer) then consumes in the order the queue defines; while consuming, the consumer can continuously pull messages from the cluster by establishing a long connection with the message cluster and then consume those messages.
In the concrete realization, the preset message cluster receives the initialized log data sent by the data transmission component Logstash, stores the initialized log data by category according to preset partitions, generates the corresponding message queue from the result of the classified storage, and, upon receiving a message pull request sent by Logstash, sends the message queue to Logstash.
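The classify-then-queue step can be sketched in a few lines of Python. This is a self-contained simulation, not the cluster's actual partition mechanism (a Kafka or RocketMQ deployment handles storage and ordering internally); the `category` field and the sorted-category ordering are illustrative assumptions:

```python
from collections import defaultdict, deque

def build_message_queue(entries):
    # Group initialized log entries by message category (analogous to the
    # preset partitions), then emit one queue whose order fixes the
    # consumption order: categories in sorted order, arrival order
    # preserved within each category.
    partitions = defaultdict(list)
    for entry in entries:
        partitions[entry["category"]].append(entry)
    queue = deque()
    for category in sorted(partitions):
        queue.extend(partitions[category])
    return queue
```

The consumer-side component only needs the resulting queue: reading it front to back reproduces the consumption order the cluster decided on.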
Step S40: the data transmission component determines a consumption order from the message queue, consumes the log data stored in the preset message cluster in that order, and sends the consumed log data to the data search component for saving.
It should be noted that the consumption order can be the message-processing order in which the log data or messages are received and processed by the message receiving end (such as the data transmission component Logstash).
In the concrete realization, the data transmission component Logstash determines the consumption order from the message queue sent by the preset message cluster, consumes the log data stored in the preset message cluster in that order, and then sends the consumed log data to the data search component Elasticsearch for saving.
In this embodiment, the data transmission component determines, according to the received data acquisition request, the target data source of the current acquisition and the data type of the data stored in it; calls the preset script corresponding to the data type to acquire data from the target data source; performs initialization processing on the collected log data to obtain initialized log data; and sends the initialized log data to the preset message cluster. The preset message cluster generates a message queue from the received initialized log data and sends it to the data transmission component upon receiving a message pull request. The data transmission component determines the consumption order from the message queue, consumes the log data stored in the preset message cluster in that order, and sends the consumed log data to the data search component for saving. Because the data transmission component in the ELK platform first determines the target data source and then calls the script corresponding to the type of the data stored there to acquire the log data, the coverage of data acquisition is increased and diversified acquisition of log data is realized; at the same time, having the preset message cluster receive the initialized log data and generate a message queue provides an effective way to acquire large-scale log data.
With reference to Fig. 3, Fig. 3 is a flow diagram of the second embodiment of the log data processing method of the present invention.

Based on the first embodiment above, in this embodiment step S20 comprises:
Step S201: the data transmission component calls the preset script corresponding to the data type to acquire data from the target data source;
It should be understood that in this embodiment staff can configure corresponding scripts (i.e. the preset scripts) in advance for data sources of different data types to acquire their data, where a preset script can be a log4j configuration file modified according to the acquisition demand or a pre-configured monitoring file. Specifically, when the data transmission component Logstash determines that the data type of the data saved or generated in the target data source is database data or application log, it can call a preset monitoring script to acquire data from the target data source.

Further, if the data transmission component Logstash determines that the data type of the data saved or generated in the target data source is application log, it can also acquire the log files in the application program through the preset log4j file and send the collected log files directly to the preset message cluster.
Step S202: the data transmission component obtains the predetermined filtering rule and filters the collected log data according to the filtering rule;
It should be noted that the filtering rule, which can be preset resource, intercepts rule, such as to daily record data
In sensitive vocabulary, picture file or resource request path etc. intercepted.
In a specific implementation, the data transfer component Logstash obtains the predefined filtering rule, then filters the collected log data according to the filtering rule to obtain the filtered log data.
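A minimal sketch of such an interception rule follows; the sensitive-word list and picture-file pattern are assumptions for illustration, and real rules would be far more extensive:

```python
import re

# Hypothetical interception rules: drop entries containing sensitive words
# or requests for picture files. Both lists are illustrative assumptions.
SENSITIVE_WORDS = {"password", "secret"}
PICTURE_RE = re.compile(r"\.(png|jpe?g|gif)\b", re.IGNORECASE)

def keep_entry(entry: str) -> bool:
    """Return True if the log entry passes the filtering rule."""
    lowered = entry.lower()
    if any(word in lowered for word in SENSITIVE_WORDS):
        return False
    if PICTURE_RE.search(entry):
        return False
    return True

def apply_filter(entries):
    """Filter collected log entries, keeping only those that pass the rule."""
    return [e for e in entries if keep_entry(e)]
```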
Step S203: the data transfer component converts the filtered log data into initialized log data in a preset format, and sends the initialized log data to the preset message cluster.
It should be noted that the preset format described in this embodiment can be the JavaScript Object Notation (JSON) format, and the preset message cluster can be a high-throughput distributed message server cluster. Converting the filtered log data into initialized log data in the preset format means that the data transfer component Logstash unifies the format of the filtered log data to obtain initialized log data in JSON format, so that in the resulting initialized log data the data formats of parameters such as the thread name, the log level, the class name containing the log information, the message content, the output execution method, the occurrence time and/or the line number in the code are consistent.
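The format unification described above can be sketched as mapping each filtered entry onto a fixed set of JSON fields; the field names and defaults below are assumptions for illustration, not the patent's actual schema:

```python
import json

def to_initialized_record(raw: dict) -> str:
    """Normalize a filtered log entry into a fixed-field JSON record so that
    every initialized record carries the same fields in the same format.
    Field names and defaults are illustrative assumptions."""
    record = {
        "thread": raw.get("thread", ""),       # thread name
        "level": raw.get("level", "INFO"),     # log level
        "class": raw.get("class", ""),         # class containing the log statement
        "message": raw.get("message", ""),     # message content
        "method": raw.get("method", ""),       # output execution method
        "timestamp": raw.get("timestamp", ""), # occurrence time
        "line": raw.get("line", 0),            # line number in code
    }
    return json.dumps(record, sort_keys=True)
```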
In a specific implementation, the data transfer component Logstash converts the filtered log data into initialized log data in the preset format, and then sends the initialized log data to the preset message cluster.
In this embodiment, the data transfer component calls the corresponding preset script according to the data type to acquire data from the target data source; it obtains a predefined filtering rule and filters the collected log data according to the filtering rule, which ensures the safety of data acquisition; meanwhile, by converting the filtered log data into initialized log data in the preset format and sending the initialized log data to the preset message cluster, the consistency of the initialized log data format is ensured and the efficiency with which the preset message cluster processes the initialized log data is improved.
With reference to Fig. 4, Fig. 4 is a flow diagram of a third embodiment of the log data processing method of the present invention.
Based on the above embodiments, in this embodiment, the step S30 includes:
Step S301: the preset message cluster receives the initialized log data, and determines the message category to which the initialized log data belongs;
It should be noted that, considering that Kafka is a high-throughput distributed publish-subscribe messaging system whose purpose is to unify online and offline message processing, i.e. to provide real-time messages through the cluster by means of the parallel loading mechanism of Hadoop, the preset message cluster described in this embodiment is preferably a Kafka cluster.
It should be understood that, under normal circumstances, the message cluster, after receiving a message, classifies and stores the message according to its designated message subject (topic); that is, messages are categorized by their designated topics, so that a consumer can focus only on the messages in the topics it needs.
In a specific implementation, when the preset message cluster receives the initialized log data, it can determine the message category to which the log data belongs according to the subject information carried in the initialized log data.
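Reading the carried subject information can be sketched as a field lookup on the initialized JSON record; the field name `topic` and the fallback category are assumptions for illustration:

```python
import json

def message_category(initialized_record: str, default_topic: str = "uncategorized") -> str:
    """Determine the message category from the subject information carried
    in an initialized JSON log record (field name 'topic' is an assumption)."""
    payload = json.loads(initialized_record)
    return payload.get("topic", default_topic)
```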
Step S302: the preset message cluster stores the initialized log data by category according to the determined message category, and generates a message queue containing the consuming order according to the storage result;
It can be understood that after the preset message cluster determines the message category of the initialized log data, it can first classify the initialized log data according to its message category, then store the classified initialized log data in partitions, and finally generate the corresponding message queue according to the result of the partitioned storage. Specifically, the message queue can be a message pull queue in which the partition identifiers of the partitions holding the messages are arranged in a predetermined order.
In this embodiment, before classifying messages according to the message category (topic), the preset message cluster can first create several topics and set the corresponding number of partitions for each topic; the more partitions, the greater the throughput. Similarly, the preset message cluster can also store messages evenly in different partitions according to an even-allocation balance policy.
In a specific implementation, the preset message cluster stores the initialized log data by category according to the determined message category, and generates a message queue containing the consuming order according to the storage result.
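The partitioned storage and queue generation described above can be sketched as follows. This is a toy in-memory model under stated assumptions (round-robin balancing, queue ordered by partition identifier), not Kafka itself:

```python
from itertools import cycle

class TopicStore:
    """Minimal sketch of topic-partitioned storage with round-robin
    (evenly balanced) partition assignment and a pull queue ordered
    by partition identifier. Illustrative only, not a real broker."""

    def __init__(self, topics_partitions):
        # topics_partitions: {topic_name: number_of_partitions}
        self.partitions = {t: [[] for _ in range(n)] for t, n in topics_partitions.items()}
        self._rr = {t: cycle(range(n)) for t, n in topics_partitions.items()}

    def store(self, topic, message):
        """Store a message in the next partition of its topic; return the
        partition identifier recorded for the message queue."""
        p = next(self._rr[topic])
        self.partitions[topic][p].append(message)
        return p

    def message_queue(self, topic):
        """Pull queue: (partition_id, message) pairs arranged in a
        predetermined (partition-id) order."""
        return [(p, m)
                for p, part in enumerate(self.partitions[topic])
                for m in part]
```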
Step S303: the preset message cluster, upon receiving a message pull request sent by the data transfer component, sends the message queue to the data transfer component.
In a specific implementation, when the preset message cluster receives a message pull request sent by the data transfer component Logstash, it sends the pre-generated message queue to the data transfer component Logstash, so that the data transfer component Logstash can subsequently consume the messages stored in the preset message cluster in order according to the received message queue.
In this embodiment, the preset message cluster receives the initialized log data and determines the message category to which the initialized log data belongs; it stores the initialized log data by category according to the determined message category and generates a message queue containing the consuming order according to the storage result; upon receiving a message pull request sent by the data transfer component, the preset message cluster sends the message queue to the data transfer component, which makes larger-scale data acquisition and processing possible.
With reference to Fig. 5, Fig. 5 is a flow diagram of a fourth embodiment of the log data processing method of the present invention.
Based on the above embodiments, in this embodiment, the step S40 may specifically include the following steps:
Step S401: the data transfer component determines the consuming order according to the message queue, and consumes the log data stored in the preset message cluster in the consuming order;
In a specific implementation, the data transfer component Logstash determines the consuming order according to the received message queue, and then pulls and consumes the log data stored in the preset message cluster in turn according to the determined consuming order.
Step S402: when obtaining a data acquisition instruction sent by the data search component, the data transfer component determines the sending time parameter at which the data search component sent the data acquisition instruction;
It should be understood that, in some cases, when a data search component such as ElasticSearch obtains consumed log data from the data transfer component, it may take the acquisition time at which it collected the log data as the actual generation time of the log data that the data transfer component Logstash obtained from the data source (for example, if the actual generation time of log data B generated by data source A is 2018-05-11 16:30:30 and the acquisition time of ElasticSearch is 2018-05-12 10:30:30, ElasticSearch would take 2018-05-12 10:30:30 as the actual generation time of log data B). Therefore, to prevent the index that the data search component ElasticSearch subsequently establishes according to its own collection time from distorting the relation between the log data and its acquisition time, the time parameter of the acquisition time must be redefined, so as to guarantee the accuracy and reliability of the index that is finally established.
It should be noted that in this embodiment, the sending time parameter at which the data search component ElasticSearch sends the data acquisition instruction to the data transfer component Logstash serves as the acquisition time at which it obtains the log data; that is, when the data transfer component Logstash obtains a data acquisition instruction sent by ElasticSearch, it takes the receiving time of the data acquisition instruction (i.e. the sending time parameter) as the acquisition time at which the data search component ElasticSearch obtains the consumed log data.
Step S403: the data transfer component calls a preset plug-in to extract the original time parameter contained in the initialized log data, and redefines the sending time parameter according to the original time parameter;
It should be noted that the preset plug-in can be a pre-configured script for screening and filtering data. In this embodiment, the data transfer component Logstash can call a filter plug-in pre-configured by staff to extract the original time parameter contained in the initialized log data, and then redefine the sending time parameter according to the original time parameter. Specifically, the data transfer component Logstash can call the preset filter plug-in to extract the original time parameter contained in the initialized log data, and format the original time parameter to obtain a date field in a target format; the data transfer component Logstash then replaces the date field in the sending time parameter according to the obtained date field, thereby redefining the sending time parameter.
When formatting the original time parameter, the data transfer component Logstash can convert an original time parameter in string format (e.g. 2018-05-11 16:30:30.830) or Unix timestamp format into a date field in ISO 8601 layout; for example, 16:30:30 on 11 May 2018 Beijing time is converted into the ISO 8601 date field 2018-05-11T16:30:30+08:00 or 20180511T163030+08.
In a specific implementation, the data transfer component replaces the date field in the sending time parameter according to the formatted date field, thereby redefining the sending time parameter and guaranteeing the accuracy and reliability of the index finally established by the data search component.
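The time-parameter redefinition can be sketched with stdlib datetime handling. The input layout, the `date` field name of the sending time parameter, and the fixed UTC+08:00 offset (matching the Beijing-time example in the text) are assumptions:

```python
from datetime import datetime, timedelta, timezone

BEIJING = timezone(timedelta(hours=8))  # UTC+08:00, as in the example above

def original_to_date_field(original: str) -> str:
    """Format a string-format original time parameter such as
    '2018-05-11 16:30:30.830' (assumed layout) as an ISO 8601 date field."""
    dt = datetime.strptime(original.split(".")[0], "%Y-%m-%d %H:%M:%S")
    return dt.replace(tzinfo=BEIJING).isoformat()

def redefine_sending_time(sending_time: dict, original: str) -> dict:
    """Replace the date field of the sending time parameter with the date
    field derived from the original time parameter (field name assumed)."""
    redefined = dict(sending_time)
    redefined["date"] = original_to_date_field(original)
    return redefined
```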
Step S404: when the parameter redefinition is completed, the data transfer component sends the consumed log data to the data search component for saving.
In a specific implementation, when the data transfer component Logstash completes the redefinition of the time parameter, it sends the consumed log data to the data search component ElasticSearch for saving.
In this embodiment, the data transfer component determines the consuming order according to the message queue and consumes the log data stored in the preset message cluster in the consuming order; when obtaining a data acquisition instruction sent by the data search component, it determines the sending time parameter at which the data search component sent the data acquisition instruction; it calls the preset plug-in to extract the original time parameter contained in the initialized log data and redefines the sending time parameter according to the original time parameter; when the parameter redefinition is completed, it sends the consumed log data to the data search component for saving, which guarantees the accuracy and reliability of the index finally established by the data search component.
In addition, an embodiment of the present invention also proposes a storage medium on which a log data processing program is stored, and the log data processing program, when executed by a processor, implements the steps of the log data processing method described above.
Referring to Fig. 6, Fig. 6 is a structural block diagram of a first embodiment of the log data processing system of the present invention.
As shown in Fig. 6, the log data processing system proposed by the embodiment of the present invention includes: a data transfer component 601, a data search component 602 and a preset message cluster 603;
wherein the data transfer component 601 is configured to determine, according to a received data acquisition request, the target data source corresponding to this data acquisition and the data type of the data stored in the target data source;
It should be noted that the data acquisition request can be a data acquisition instruction sent by a user through a terminal device such as a mobile phone, tablet computer or PC, or can be triggered and generated by a preset timing task in the terminal device. The target data source can be a device (e.g. a database) storing the data required by the data acquisition request, or an original medium (e.g. an application program or a web page server).
In addition, in this embodiment, the data type of the data stored in a database can be defined as database data, the data type of the data stored in an application program can be defined as application log, and the data type of the data stored in a web page server can be defined as web access log; staff can write corresponding data acquisition scripts or configure plug-ins for different data types in advance, so as to achieve diversified, well-targeted data acquisition with higher coverage.
Of course, in order to enable the data transfer component 601 to quickly determine, according to the received data acquisition request, the data type of the data stored in the data source targeted by this acquisition, a mapping between data source identifiers and data types can be established in the data transfer component 601 in advance, so that after determining the target data source of this acquisition, the data transfer component 601 can quickly determine the data type of the stored data according to the data source identifier corresponding to the target data source.
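Such a pre-established mapping can be sketched as a lookup table inside the data transfer component; the identifiers and type labels below are hypothetical:

```python
# Hypothetical pre-established mapping from data source identifier to
# data type; all identifiers and labels are illustrative assumptions.
SOURCE_TYPE_MAP = {
    "db-orders": "database_data",
    "app-payment": "application_log",
    "web-frontend": "web_access_log",
}

def data_type_of(source_id: str) -> str:
    """Quickly determine the data type of a target data source from its
    identifier, falling back to 'unknown' for unregistered sources."""
    return SOURCE_TYPE_MAP.get(source_id, "unknown")
```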
The data transfer component 601 is also configured to call the corresponding preset script according to the data type to acquire data from the target data source, to perform initialization processing on the collected log data to obtain initialized log data, and to send the initialized log data to the preset message cluster 603;
It should be noted that in this embodiment staff can configure a corresponding script for data sources of different data types to acquire data from the data source. For example, for database data, a monitoring script can be written in advance to monitor the event information in the database and obtain log data such as safety or business logs from the database; for the application log of an application program, data can be acquired through a monitoring script written in advance, or the log4j configuration file in the application program can be modified to acquire log data and control the output destination of log information; the web access log of a web page server can be acquired through a preset script plug-in.
Further, considering that among current script engines the Lua script is the fastest and easy to maintain, in this embodiment, when the data transfer component 601 determines that the data type is web access log, it can call a preset Lua script to acquire data from the target data source and obtain the user access log; it then performs information filtering on the user access log according to a pre-configured security policy to obtain the initialized log data, and sends the initialized log data to the preset message cluster. The information filtering of the user access log according to the pre-configured security policy can be implemented by calling a web application firewall (Web Application Firewall, WAF) module of the website, and the security policy can be an anomaly detection protocol written by staff according to working experience for verifying the security of user access behavior.
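A toy version of such security-policy filtering is sketched below. The two attack patterns are naive stand-ins for the checks a real WAF module would perform, and are assumptions for illustration only:

```python
import re

# Hypothetical security-policy rules standing in for a WAF module's checks;
# real WAF rule sets are far more extensive and more precise.
ATTACK_PATTERNS = [
    re.compile(r"(union\s+select|or\s+1=1)", re.IGNORECASE),  # naive SQL injection
    re.compile(r"<script\b", re.IGNORECASE),                  # naive XSS probe
]

def security_filter(access_logs):
    """Drop access-log lines matching an attack pattern; what remains
    forms the initialized log data sent onward."""
    return [line for line in access_logs
            if not any(p.search(line) for p in ATTACK_PATTERNS)]
```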
The preset message cluster 603 is configured to generate a message queue according to the received initialized log data, and to send the message queue to the data transfer component 601 upon receiving a message pull request sent by the data transfer component 601;
It should be noted that the preset message cluster 603 can be a distributed open-source message middleware or server cluster with advantages such as high performance, high reliability and strong real-time capability, such as a RocketMQ cluster or a Kafka cluster. In practical applications, the preset message cluster 603 can classify and store the messages obtained from the producer (Producer) to obtain the message queue, and the consumer (Consumer) then consumes the messages in the order given by the message queue; when consuming messages, the consumer can establish a long-lived connection with the message cluster, continuously pull messages from the cluster and then consume them.
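The consumer's continuous-pull behavior can be sketched with a local queue standing in for the broker connection; a real client would keep a long-lived network connection instead, so this is purely illustrative:

```python
import queue

def consumer_loop(broker_queue, handle, max_messages):
    """Continuously pull messages and consume them in order. broker_queue
    is a local queue.Queue standing in for the long-lived broker
    connection; handle is the per-message consumption function."""
    consumed = []
    while len(consumed) < max_messages:
        try:
            msg = broker_queue.get(timeout=0.1)  # pull the next message
        except queue.Empty:
            break  # nothing left to pull for now
        consumed.append(handle(msg))
    return consumed
```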
The data transfer component 601 is also configured to determine the consuming order according to the message queue, to consume the log data stored in the preset message cluster 603 in the consuming order, and to send the consumed log data to the data search component 602 for saving.
It should be noted that the consuming order can be the message processing order in which the log data or messages are received and processed by the message receiving end (such as the data transfer component 601).
In a specific implementation, the data transfer component 601 determines the consuming order according to the message queue sent by the preset message cluster 603, then consumes the log data stored in the preset message cluster 603 in the consuming order, and then sends the consumed log data to the data search component 602 for saving.
In this embodiment, the data transfer component determines, according to the received data acquisition request, the target data source corresponding to this data acquisition and the data type of the data stored in the target data source, calls the corresponding preset script according to the data type to acquire data from the target data source, then performs initialization processing on the collected log data to obtain initialized log data, and sends the initialized log data to the preset message cluster; the preset message cluster generates a message queue according to the received initialized log data and, upon receiving a message pull request sent by the data transfer component, sends the message queue to the data transfer component; the data transfer component determines the consuming order according to the message queue, consumes the log data stored in the preset message cluster in the consuming order, and sends the consumed log data to the data search component for saving. Since the data transfer component first determines the target data source and then calls the corresponding script to acquire the log data according to the data type of the data stored in the target data source, the coverage of data acquisition is increased and diversified acquisition of log data is achieved; meanwhile, the preset message cluster receives the initialized log data and generates the message queue, which provides an effective way to achieve large-scale log data acquisition.
Based on the above first embodiment of the log data processing system of the present invention, a second embodiment of the log data processing system of the present invention is proposed.
In this embodiment, the data transfer component 601 is also configured to call the corresponding preset script according to the data type to acquire data from the target data source; to obtain a predefined filtering rule and filter the collected log data according to the filtering rule; and to convert the filtered log data into initialized log data in a preset format and send the initialized log data to the preset message cluster 603.
Further, the data transfer component 601 is also configured to call a preset monitoring script to acquire data from the target data source when it determines that the data type is database data or application log.
Further, the data transfer component 601 is also configured to, when determining that the data type is access log, call the preset script to acquire data from the target data source and obtain the user access log; to perform information filtering on the user access log according to a pre-configured security policy to obtain initialized log data; and to send the initialized log data to the preset message cluster 603.
Further, the preset message cluster 603 is also configured to receive the initialized log data and determine the message category to which the initialized log data belongs; to store the initialized log data by category according to the determined message category and generate a message queue containing the consuming order according to the storage result; and, upon receiving a message pull request sent by the data transfer component 601, to send the message queue to the data transfer component 601.
Further, the data transfer component 601 is also configured to determine the consuming order according to the message queue and consume the log data stored in the preset message cluster 603 in the consuming order; when obtaining a data acquisition instruction sent by the data search component 602, to determine the sending time parameter at which the data search component 602 sent the data acquisition instruction; to call the preset plug-in to extract the original time parameter contained in the initialized log data and redefine the sending time parameter according to the original time parameter; and, when the parameter redefinition is completed, to send the consumed log data to the data search component 602 for saving.
Further, the data transfer component 601 is also configured to call the preset plug-in to extract the original time parameter contained in the initialized log data, to format the original time parameter to obtain a date field in a target format, and to replace the date field in the sending time parameter according to the obtained date field, thereby redefining the sending time parameter.
For other embodiments or specific implementations of the log data processing system of the present invention, reference may be made to the above method embodiments, which will not be repeated here.
It should be noted that, in this document, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or system that includes a series of elements includes not only those elements, but also other elements not explicitly listed, or also includes elements inherent to such a process, method, article or system. In the absence of further limitations, an element defined by the sentence "including a ..." does not exclude the presence of other identical elements in the process, method, article or system that includes that element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by means of software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product, which is stored in a storage medium (such as a read-only memory/random access memory, a magnetic disk or an optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the embodiments of the present invention.
The above is only a preferred embodiment of the present invention and is not intended to limit the scope of the invention; any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of protection of the present invention.
Claims (10)
1. A log data processing method, characterized in that the method includes:
a data transfer component determining, according to a received data acquisition request, a target data source corresponding to this data acquisition and a data type of data stored in the target data source;
the data transfer component calling a corresponding preset script according to the data type to acquire data from the target data source, performing initialization processing on the collected log data to obtain initialized log data, and sending the initialized log data to a preset message cluster;
the preset message cluster generating a message queue according to the received initialized log data and, upon receiving a message pull request sent by the data transfer component, sending the message queue to the data transfer component;
the data transfer component determining a consuming order according to the message queue, consuming the log data stored in the preset message cluster in the consuming order, and sending the consumed log data to a data search component for saving.
2. The method according to claim 1, characterized in that the step of the data transfer component calling the corresponding preset script according to the data type to acquire data from the target data source, performing initialization processing on the collected log data to obtain initialized log data, and sending the initialized log data to the preset message cluster includes:
the data transfer component calling the corresponding preset script according to the data type to acquire data from the target data source;
the data transfer component obtaining a predefined filtering rule, and filtering the collected log data according to the filtering rule;
the data transfer component converting the filtered log data into initialized log data in a preset format, and sending the initialized log data to the preset message cluster.
3. The method according to claim 2, characterized in that the step of the data transfer component calling the corresponding preset script according to the data type to acquire data from the target data source includes:
the data transfer component, when determining that the data type is database data or application log, calling a preset monitoring script to acquire data from the target data source.
4. The method according to claim 1, characterized in that the step of the data transfer component calling the corresponding preset script according to the data type to acquire data from the target data source, performing initialization processing on the collected log data to obtain initialized log data, and sending the initialized log data to the preset message cluster includes:
the data transfer component, when determining that the data type is access log, calling the preset script to acquire data from the target data source and obtain a user access log;
the data transfer component performing information filtering on the user access log according to a pre-configured security policy to obtain initialized log data, and sending the initialized log data to the preset message cluster.
5. The method according to claim 3 or 4, characterized in that the step of the preset message cluster generating a message queue according to the received initialized log data and, upon receiving a message pull request sent by the data transfer component, sending the message queue to the data transfer component includes:
the preset message cluster receiving the initialized log data, and determining a message category to which the initialized log data belongs;
the preset message cluster storing the initialized log data by category according to the determined message category, and generating a message queue containing the consuming order according to the storage result;
the preset message cluster, upon receiving a message pull request sent by the data transfer component, sending the message queue to the data transfer component.
6. The method according to claim 5, characterized in that the step of the data transfer component determining a consuming order according to the message queue, consuming the log data stored in the preset message cluster in the consuming order, and sending the consumed log data to the data search component for saving includes:
the data transfer component determining the consuming order according to the message queue, and consuming the log data stored in the preset message cluster in the consuming order;
the data transfer component, when obtaining a data acquisition instruction sent by the data search component, determining a sending time parameter at which the data search component sent the data acquisition instruction;
the data transfer component calling a preset plug-in to extract an original time parameter contained in the initialized log data, and redefining the sending time parameter according to the original time parameter;
the data transfer component, when the parameter redefinition is completed, sending the consumed log data to the data search component for saving.
7. The method according to claim 6, wherein the step of the data transfer component invoking a preset plug-in to extract the original time parameter contained in the initialized log data and redefining the sending-time parameter according to the original time parameter comprises:
the data transfer component invoking the preset plug-in to extract the original time parameter contained in the initialized log data, and formatting the original time parameter to obtain a date field in the target format; and
the data transfer component replacing the date field in the sending-time parameter with the obtained date field, thereby redefining the sending-time parameter.
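The extract-format-replace step in claim 7 is analogous to what a Logstash-style date filter does: parse the timestamp embedded in the log record itself, reformat it into a target format, and overwrite the event's time field so downstream indexing reflects production time rather than send time. A minimal sketch, with hypothetical field names (`raw_time`, `@timestamp`) and hypothetical source/target formats:

```python
from datetime import datetime

def redefine_sending_time(event: dict) -> dict:
    """Replace the event's sending-time date field with a reformatted
    version of the original timestamp found in the log data itself."""
    # Extract the original time parameter embedded in the log record
    # (field name and source format are assumptions for illustration).
    original = datetime.strptime(event["raw_time"], "%d/%b/%Y:%H:%M:%S")
    # Format it into the target date format expected downstream.
    date_field = original.strftime("%Y-%m-%dT%H:%M:%S")
    # Replace the date field of the sending-time parameter, so the search
    # component indexes the event under the time it was produced, not sent.
    event["@timestamp"] = date_field
    return event

evt = redefine_sending_time(
    {"raw_time": "07/Sep/2018:10:15:30", "@timestamp": "2018-09-08T01:02:03"}
)
print(evt["@timestamp"])  # → 2018-09-07T10:15:30
```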
8. A log data processing system, comprising: a data transfer component, a data search component, and a preset message cluster, wherein:
the data transfer component is configured to determine, according to a received data collection request, the target data source for the current collection and the data type of the data stored in the target data source;
the data transfer component is further configured to invoke the corresponding preset script, according to the data type, to collect data from the target data source, to perform initialization processing on the collected log data to obtain initialized log data, and to send the initialized log data to the preset message cluster;
the preset message cluster is configured to generate a message queue according to the received initialized log data and, upon receiving a message pull request sent by the data transfer component, to send the message queue to the data transfer component; and
the data transfer component is further configured to determine the consuming order according to the message queue, to consume the log data stored in the preset message cluster in the consuming order, and to send the consumed log data to the data search component for storage.
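The three-component pipeline of claim 8 (collect by data type, initialize, queue, consume in order, save to the search component) can be wired together in a toy end-to-end sketch; here the message cluster is collapsed to plain list order, and every name is illustrative rather than taken from the patent:

```python
def collect(target_source: str, data_type: str) -> list:
    """Data transfer component: pick a collection script by data type
    (two hypothetical script stubs stand in for real collectors)."""
    scripts = {
        "file": lambda src: [f"{src}: line {i}" for i in range(3)],
        "db":   lambda src: [f"{src}: row {i}" for i in range(2)],
    }
    return scripts[data_type](target_source)

def initialize(raw_records: list) -> list:
    """Initialization processing: normalize each raw record into a dict."""
    return [{"category": "app", "msg": r} for r in raw_records]

class SearchComponent:
    """Stand-in for an Elasticsearch-like data search component."""
    def __init__(self):
        self.index = []
    def save(self, record: dict) -> None:
        self.index.append(record)

search = SearchComponent()
records = initialize(collect("/var/log/app.log", "file"))
# Message-cluster step elided: list order plays the role of consuming order.
for record in records:       # consume in the queue's order
    search.save(record)      # send consumed data to the search component
print(len(search.index))  # → 3
```

The design point the claim makes is the decoupling: the collector never writes to the search component directly, so the broker can absorb bursts and preserve ordering between the two.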
9. A log data processing device, comprising: a memory, a processor, and a log data processing program stored on the memory and executable on the processor, the log data processing program being configured to implement the steps of the log data processing method according to any one of claims 1 to 7.
10. A storage medium having a log data processing program stored thereon, wherein the log data processing program, when executed by a processor, implements the steps of the log data processing method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811041735.3A CN109684370A (en) | 2018-09-07 | 2018-09-07 | Log data processing method, system, device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109684370A true CN109684370A (en) | 2019-04-26 |
Family
ID=66184513
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811041735.3A Pending CN109684370A (en) | 2018-09-07 | 2018-09-07 | Log data processing method, system, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109684370A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103401934A (en) * | 2013-08-06 | 2013-11-20 | 广州唯品会信息科技有限公司 | Method and system for acquiring log data |
US20160098325A1 (en) * | 2013-06-19 | 2016-04-07 | Hewlett-Packard Development Company, L.P. | Unifying application log messages using runtime instrumentation |
CN106776693A (en) * | 2016-11-10 | 2017-05-31 | 福建中金在线信息科技有限公司 | Website data collection method and device |
CN107229556A (en) * | 2017-06-09 | 2017-10-03 | 环球智达科技(北京)有限公司 | Log Analysis System based on elastic components |
CN107273267A (en) * | 2017-06-09 | 2017-10-20 | 环球智达科技(北京)有限公司 | Log analysis method based on elastic components |
CN107480277A (en) * | 2017-08-22 | 2017-12-15 | 北京京东尚科信息技术有限公司 | Method and device for web log file collection |
CN107547589A (en) * | 2016-06-27 | 2018-01-05 | 腾讯科技(深圳)有限公司 | Data collection and processing method and device |
CN107622084A (en) * | 2017-08-10 | 2018-01-23 | 深圳前海微众银行股份有限公司 | Log management method, system and computer-readable storage medium |
CN108365971A (en) * | 2018-01-10 | 2018-08-03 | 深圳市金立通信设备有限公司 | Log parsing method, device and computer-readable medium |
2018-09-07: application CN201811041735.3A filed in China (CN), published as CN109684370A; status: Pending.
Non-Patent Citations (1)
Title |
---|
LUO Dongfeng; LI Fang; HAO Wangyang; WU Zhongcheng: "A Docker-based Large-scale Log Collection and Analysis System", Computer Systems & Applications (计算机系统应用), vol. 26, no. 10, 15 October 2017 (2017-10-15), pages 82-88 *
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110489657A (en) * | 2019-07-05 | 2019-11-22 | 五八有限公司 | Information filtering method, apparatus, terminal device and storage medium |
CN110716813A (en) * | 2019-09-17 | 2020-01-21 | 贝壳技术有限公司 | Data stream processing method and device, readable storage medium and processor |
CN110688383A (en) * | 2019-09-26 | 2020-01-14 | 中国银行股份有限公司 | Data acquisition method and system |
CN112579639A (en) * | 2019-09-29 | 2021-03-30 | 北京国双科技有限公司 | Data processing method and device, electronic equipment and storage medium |
CN111158643A (en) * | 2019-11-29 | 2020-05-15 | 石化盈科信息技术有限责任公司 | Data processing system and method |
CN111049899A (en) * | 2019-12-11 | 2020-04-21 | 贝壳技术有限公司 | kafka message storage system, method, apparatus, and computer-readable storage medium |
CN111262915A (en) * | 2020-01-10 | 2020-06-09 | 北京东方金信科技有限公司 | Cross-Kafka-cluster data conversion system and method |
CN111262915B (en) * | 2020-01-10 | 2020-09-22 | 北京东方金信科技有限公司 | Cross-Kafka-cluster data conversion system and method |
CN111309793A (en) * | 2020-01-15 | 2020-06-19 | 北大方正集团有限公司 | Data processing method, device and equipment |
CN111427903A (en) * | 2020-03-27 | 2020-07-17 | 四川虹美智能科技有限公司 | Log information acquisition method and device |
CN111427903B (en) * | 2020-03-27 | 2023-04-21 | 四川虹美智能科技有限公司 | Log information acquisition method and device |
CN111913821A (en) * | 2020-08-17 | 2020-11-10 | 武汉众邦银行股份有限公司 | Method for real-time production and consumption of data streams across data sources |
CN111913821B (en) * | 2020-08-17 | 2021-07-16 | 武汉众邦银行股份有限公司 | Method for real-time production and consumption of data streams across data sources |
CN112182160A (en) * | 2020-09-30 | 2021-01-05 | 中国民航信息网络股份有限公司 | Log data processing method and device, storage medium and electronic equipment |
CN112182160B (en) * | 2020-09-30 | 2023-12-26 | 中国民航信息网络股份有限公司 | Log data processing method and device, storage medium and electronic equipment |
CN112702415A (en) * | 2020-12-21 | 2021-04-23 | 广州华资软件技术有限公司 | Method for converting Kafka long connection consumption into service |
CN112579326A (en) * | 2020-12-29 | 2021-03-30 | 北京五八信息技术有限公司 | Offline data processing method and device, electronic equipment and computer readable medium |
CN112667476B (en) * | 2020-12-30 | 2023-02-14 | 平安普惠企业管理有限公司 | Task-based message filtering method, device, equipment and storage medium |
CN112667476A (en) * | 2020-12-30 | 2021-04-16 | 平安普惠企业管理有限公司 | Task-based message filtering method, device, equipment and storage medium |
CN113067883A (en) * | 2021-03-31 | 2021-07-02 | 建信金融科技有限责任公司 | Data transmission method and device, computer equipment and storage medium |
CN113067883B (en) * | 2021-03-31 | 2023-07-28 | 建信金融科技有限责任公司 | Data transmission method, device, computer equipment and storage medium |
CN114070720A (en) * | 2021-10-20 | 2022-02-18 | 浪潮金融信息技术有限公司 | Data preposition system, method and medium based on asynchronous long connection technology |
CN114070720B (en) * | 2021-10-20 | 2024-03-29 | 浪潮金融信息技术有限公司 | Data prepositive system, method and medium based on asynchronous long connection technology |
CN114363042A (en) * | 2021-12-30 | 2022-04-15 | 爱集微咨询(厦门)有限公司 | Log analysis method, device, equipment and readable storage medium |
CN115150418A (en) * | 2022-08-26 | 2022-10-04 | 北京蔚领时代科技有限公司 | Data storage method of server cluster |
CN115150418B (en) * | 2022-08-26 | 2024-01-26 | 北京蔚领时代科技有限公司 | Data storage method of server cluster |
CN115391325A (en) * | 2022-10-31 | 2022-11-25 | 深圳曼顿科技有限公司 | Energy data management method, device, equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109684370A (en) | Log data processing method, system, device and storage medium | |
Lai et al. | Fedscale: Benchmarking model and system performance of federated learning at scale | |
US10681060B2 (en) | Computer-implemented method for determining computer system security threats, security operations center system and computer program product | |
CN111367187B (en) | Method for improving the processing of sensor flow data in a distributed network | |
Strohbach et al. | Towards a big data analytics framework for IoT and smart city applications | |
US10797987B1 (en) | Systems and methods for switch stack emulation, monitoring, and control | |
US10567409B2 (en) | Automatic and scalable log pattern learning in security log analysis | |
US20170012838A1 (en) | Automatically generating service documentation based on actual usage | |
US11201835B1 (en) | Systems and methods for multi-tier resource and subsystem orchestration and adaptation | |
US20210385251A1 (en) | System and methods for integrating datasets and automating transformation workflows using a distributed computational graph | |
US10986012B1 (en) | System for generating alerts based on alert condition and optimistic concurrency control procedure | |
US10666666B1 (en) | Security intelligence automation platform using flows | |
JP2017504121A (en) | Measuring device of user behavior and participation using user interface in terminal device | |
CN112632135A (en) | Big data platform | |
US11546380B2 (en) | System and method for creation and implementation of data processing workflows using a distributed computational graph | |
US11579860B2 (en) | Model driven state machine transitions to configure an installation of a software program | |
CN107003910B (en) | Method and device for classifying virtual activities of mobile users | |
CN111046022A (en) | Database auditing method based on big data technology | |
CN106330990A (en) | B/S structure performance monitoring analysis system and method | |
CN115033876A (en) | Log processing method, log processing device, computer device and storage medium | |
Ribeiro et al. | A data integration architecture for smart cities | |
CN113590604A (en) | Service data processing method and device and server | |
Assenmacher et al. | Openbots | |
CN109684159A (en) | State monitoring method, device, equipment and storage medium for a distributed information system |
CN109684158A (en) | State monitoring method, device, equipment and storage medium for a distributed coordination system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |