CN113204376A - File analysis method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN113204376A
CN113204376A (application CN202110460564.3A)
Authority
CN
China
Prior art keywords
analysis
target
model
file
configuration parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110460564.3A
Other languages
Chinese (zh)
Inventor
赵忠孝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chinabank Payments Beijing Technology Co Ltd
Original Assignee
Chinabank Payments Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chinabank Payments Beijing Technology Co Ltd filed Critical Chinabank Payments Beijing Technology Co Ltd
Priority to CN202110460564.3A priority Critical patent/CN113204376A/en
Publication of CN113204376A publication Critical patent/CN113204376A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/4401Bootstrapping
    • G06F9/4411Configuring for operating with peripheral devices; Loading of device drivers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/16File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0633Lists, e.g. purchase orders, compilation or processing
    • G06Q30/0635Processing of requisition or of purchase orders

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Stored Programmes (AREA)

Abstract

The present disclosure provides a file parsing method, apparatus, computer device, and storage medium. The method includes: determining a program identifier of a target application program, the target application program being used to parse a file; acquiring running environment information corresponding to the program identifier; generating a target thread model matched with the running environment information; and parsing the file according to the target thread model. In this way, the application program's parsing logic for the file is matched to its running environment, which effectively improves the application program's parsing performance on the file, makes the parsing application portable, and reduces the operation and maintenance cost of deployment across multiple running environments.

Description

File analysis method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a file parsing method and apparatus, a computer device, and a storage medium.
Background
At present, the transaction volume of online transactions keeps growing; for example, the various promotion campaigns run by large e-commerce platforms can generate a huge number of orders in a single day. These orders require reconciliation operations between the initiator and the receiver on the following day, so a reconciliation file must be acquired and parsed to support those operations. Generally, to ensure information security, the initiator encrypts the file, and the acquirer correspondingly decrypts it before parsing. Outside promotion periods, the transaction volume is relatively small and the volume of files to be parsed is small as well, so an ordinary computing server can be used.
In the related art, the parsing logic for the reconciliation file is easily constrained by the running environment that hosts the application program, which affects the application program's parsing performance on the reconciliation file and results in low file-parsing efficiency.
Disclosure of Invention
The present disclosure provides a file parsing method, apparatus, computer device and storage medium, which are intended to solve at least one of the technical problems in the related art to a certain extent.
According to a first aspect, there is provided a file parsing method, including: determining a program identifier of a target application program, wherein the target application program is used for analyzing a file; acquiring running environment information corresponding to the program identifier; generating a target thread model matched with the running environment information; and analyzing the file according to the target thread model.
According to a second aspect, there is provided a file parsing apparatus, including: the determining module is used for determining a program identifier of a target application program, wherein the target application program is used for analyzing the file; the first acquisition module is used for acquiring the running environment information corresponding to the program identifier; the generating module is used for generating a target thread model matched with the running environment information; and the analysis module is used for analyzing the file according to the target thread model.
According to a third aspect, there is provided a computer device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform a file parsing method provided by embodiments of the present disclosure.
According to a fourth aspect, a non-transitory computer-readable storage medium is presented, having stored thereon computer instructions for causing a computer to perform a file parsing method provided by embodiments of the present disclosure.
According to the method and apparatus of the embodiments, a program identifier of a target application program is determined, where the target application program is used to parse a file; running environment information corresponding to the program identifier is acquired; a target thread model matched with the running environment information is generated; and the file is parsed according to the target thread model. In this way, the application program's parsing logic for the file is matched to its running environment, which effectively improves the application program's parsing performance on the file, makes the parsing application portable, and reduces the operation and maintenance cost of deployment across multiple running environments.
Additional aspects and advantages of the disclosure will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the disclosure.
Drawings
The foregoing and/or additional aspects and advantages of the present disclosure will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart diagram of a file parsing method according to a first embodiment of the present disclosure;
FIG. 2 is a schematic flow chart diagram of a file parsing method according to a second embodiment of the disclosure;
FIG. 3 is a schematic diagram of a file parsing apparatus according to a third embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a file parsing apparatus according to a fourth embodiment of the present disclosure; and
FIG. 5 is a block diagram of a computer device used to implement the file parsing method of an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer throughout to elements that are the same or similar or have the same or similar functions. The embodiments described below with reference to the drawings are exemplary, serve only to illustrate the present disclosure, and should not be construed as limiting it. On the contrary, the embodiments of the disclosure include all changes, modifications, and equivalents coming within the spirit and scope of the appended claims.
In view of the technical problems in the related art that the parsing logic for a reconciliation file is easily constrained by the running environment hosting the application program, which affects the application program's parsing performance on the reconciliation file and results in low file-parsing efficiency, the technical solution of this embodiment provides a file parsing method and apparatus, a computer device, and a storage medium. The method is described below with reference to specific embodiments.
It should be noted that the execution subject of the file parsing method of this embodiment may be a file parsing apparatus. The apparatus may be implemented by software and/or hardware and may be configured in a computer device, and the computer device may include, but is not limited to, a terminal, a server, and the like.
Fig. 1 is a schematic flow chart of a file parsing method according to a first embodiment of the present disclosure.
As shown in fig. 1, the file parsing method includes:
s101: a program identification of a target application is determined, wherein the target application is used to parse the file.
In some application scenarios, for example, when files are transmitted over the Internet, the files may be encrypted during transmission; correspondingly, after being received, the files may be decrypted and parsed, for example, to read the file contents.
A specific example is reconciliation analysis performed on an order file.
An application used to parse a file (e.g., the order file) may be referred to as a target application program, which performs the corresponding parsing operation on the file.
In some embodiments, the target application is, for example and without limitation, a decompression application, a decryption application, or any other application having a function of parsing a file, and the target application is not specifically limited herein.
The program identifier uniquely identifies the target application program; that is, one program identifier corresponds to one target application program. A program identifier may be, for example, an icon, a number, or a name, and different application programs can be distinguished by their program identifiers.
In the embodiment of the disclosure, in the process of parsing the file, the program identifier of the target application program for parsing the file may be determined, and then the corresponding operating environment information may be determined based on the program identifier.
S102: and acquiring the running environment information corresponding to the program identifier.
After the program identifier is determined, further, the operating environment information corresponding to the program identifier may be acquired.
It can be understood that the target application program may have different operation environment information in different environments, and the target application program corresponds to the program identifier, so that the embodiment of the present disclosure may obtain the corresponding operation environment information according to the program identifier.
The runtime environment information may be configuration information required for the target application to run in different environments, such as runtime environment information on different hardware devices or runtime environment information on different platforms, and is not limited herein.
The running environment information may be, for example and without limitation, hardware configuration information or software configuration information of the platform on which the target application runs.
Optionally, in some embodiments, when obtaining the running environment information corresponding to the program identifier, first, the server identifier of the server on which the target application is running may be determined according to the program identifier.
That is to say, the running environment of the target application provided in the embodiments of the present disclosure may be the environment of the server on which the target application is installed. When obtaining the running environment information corresponding to the program identifier, this embodiment may determine, according to the program identifier, the server identifier of the server on which the target application runs; in other words, the identifier of the target application's server can be determined from among one or more servers, and the environment information of the corresponding server is then read based on the server identifier and used as the running environment information.
A server may correspond to a server identifier, such as a number or a name; server identifiers can be used to distinguish different servers. The corresponding server identifier may be determined from the program identifier of the target application program, for example, by looking it up in a preconfigured identifier correspondence table.
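As a concrete illustration, such a preconfigured identifier correspondence table could be as simple as a static map. The program and server names below are hypothetical, not from the patent:

```java
import java.util.Map;

public class IdRegistry {
    // Hypothetical correspondence table: program identifier -> server identifier.
    private static final Map<String, String> PROGRAM_TO_SERVER = Map.of(
            "recon-parser", "server-01",
            "bill-decryptor", "server-02");

    // Look up the server identifier for a program identifier; "unknown"
    // stands in for whatever fallback a real deployment would use.
    public static String serverIdFor(String programId) {
        return PROGRAM_TO_SERVER.getOrDefault(programId, "unknown");
    }
}
```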
In some embodiments, the software resource configuration parameter corresponding to the server identifier may also be obtained and used as the operating environment information, which is not limited to this.
In some embodiments, the hardware resource configuration parameters may be the full set of hardware configuration parameters of the server; they may also include any other possible parameters, which are not specifically limited herein.
In other embodiments, the hardware resource configuration parameters may be obtained, for example, through the class libraries of the Java programming language itself, such as the Runtime class.
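A minimal sketch of reading such parameters through the standard java.lang.Runtime API (the description names Java's class libraries but no specific calls, so the selection of parameters below is illustrative):

```java
public class HardwareProbe {
    // Number of processors available to the JVM.
    public static int availableCores() {
        return Runtime.getRuntime().availableProcessors();
    }

    // Maximum heap memory, in bytes, the JVM will attempt to use.
    public static long maxHeapBytes() {
        return Runtime.getRuntime().maxMemory();
    }

    public static void main(String[] args) {
        System.out.println("cores=" + availableCores()
                + " maxHeapBytes=" + maxHeapBytes());
    }
}
```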
By determining, according to the program identifier, the server identifier of the server on which the target application runs, acquiring the hardware resource configuration parameters corresponding to the server identifier, and using them as the running environment information, the running environment information corresponding to the program identifier can be acquired quickly. This prevents the file-parsing target application from being constrained by the hardware resource configuration, effectively helps extend the application's deployment scenarios, and ensures efficient file parsing.
It should be understood that the above example is only an exemplary illustration for obtaining the operating environment information, and in practical applications, the operating environment information corresponding to the program identifier may also be obtained in any other possible manner, which is not limited specifically herein.
S103: and generating a target thread model matched with the running environment information.
The thread model refers to a thread-pool-related model in a development language that supports multithreading. The thread model may be applied to file parsing; for example, it may be invoked by the target application program to process the file in parallel, without being limited thereto.
The thread model matched with the running environment information may be referred to as a target thread model, that is, after the running environment information is obtained, the embodiment of the present disclosure may generate the target thread model adapted to the running environment information with reference to the running environment information.
S104: and analyzing the file according to the target thread model.
After the target thread model is determined, the target thread model may be called based on the target application program to support parsing of the file.
For example, the encrypted bill file may be read first, and then the encrypted bill file may be decrypted and analyzed step by step according to the target thread model to obtain the bill content.
According to the method and apparatus of this embodiment, a program identifier of a target application program is determined, where the target application program is used to parse a file; running environment information corresponding to the program identifier is acquired; a target thread model matched with the running environment information is generated; and the file is parsed according to the target thread model. In this way, the application program's parsing logic for the file is matched to its running environment, which effectively improves the application program's parsing performance on the file, makes the parsing application portable, and reduces the operation and maintenance cost of deployment across multiple running environments.
Fig. 2 is a flowchart illustrating a file parsing method according to a second embodiment of the disclosure.
As shown in fig. 2, the file parsing method includes:
s201: a program identification of a target application is determined, wherein the target application is used to parse the file.
S202: and determining the server identification of the server operated by the target application program according to the program identification.
S203: and acquiring a hardware resource configuration parameter corresponding to the server identifier, and taking the hardware resource configuration parameter as the running environment information.
For the description of S201 to S203, reference may be made to the above embodiments, which are not described herein again.
S204: obtaining an initial thread model, wherein the initial thread model comprises: a plurality of thread parameters.
After determining the running environment information, the embodiment of the present disclosure may obtain an initial thread model, where the initial thread model includes: a plurality of thread parameters.
The thread model with the initial thread parameters may be referred to as an initial thread model, and for example, the thread model with the empty parameter values of the thread parameters may be referred to as an initial thread model.
Wherein, a plurality of thread parameters can be as shown in table 1:
TABLE 1
No. | Name | Type | Meaning
1 | corePoolSize | int | Core thread pool size
2 | maximumPoolSize | int | Maximum thread pool size
3 | keepAliveTime | long | Maximum idle time of a thread
4 | unit | TimeUnit | Time unit for keepAliveTime
5 | workQueue | BlockingQueue<Runnable> | Thread wait queue
6 | threadFactory | ThreadFactory | Thread creation factory
7 | handler | RejectedExecutionHandler | Rejection policy
As shown in Table 1, the plurality of thread parameters include, for example and without limitation, the core thread pool size corePoolSize, the maximum thread pool size maximumPoolSize, the maximum thread idle time keepAliveTime, the time unit unit, the thread wait queue workQueue, the thread creation factory threadFactory, and the rejection policy handler.
In the initial thread model of the embodiment of the present application, the thread parameters may be null.
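The seven parameters of Table 1 correspond exactly to the arguments of the JDK's java.util.concurrent.ThreadPoolExecutor constructor, so a "target thread model" can be sketched as one such executor. The sizing policy below (core = CPU count, max = 2×, bounded queue, caller-runs rejection) is an illustrative assumption, not the patent's rule:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class TargetThreadModel {
    // Build an executor from the seven Table 1 parameters; the concrete
    // values are placeholders for what steps S205/S206 would compute.
    public static ThreadPoolExecutor build(int cores) {
        return new ThreadPoolExecutor(
                cores,                                      // corePoolSize
                cores * 2,                                  // maximumPoolSize
                60L, TimeUnit.SECONDS,                      // keepAliveTime, unit
                new LinkedBlockingQueue<>(1024),            // workQueue
                Executors.defaultThreadFactory(),           // threadFactory
                new ThreadPoolExecutor.CallerRunsPolicy()); // handler
    }
}
```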
S205: and determining a plurality of target thread parameter values respectively corresponding to the plurality of thread parameters according to the hardware resource configuration parameters.
The parameter values corresponding to the thread parameters that are determined according to the hardware resource configuration parameters may be called target thread parameter values. Different target thread parameter values can be determined for different hardware resource configuration parameters, so that the target application program can load a target thread model adapted to the hardware resource configuration of the server on which it runs.
In combination with the plurality of thread parameters in the initial thread model, the embodiments of the present disclosure support determining a plurality of target thread parameter values corresponding to the plurality of thread parameters, respectively, according to the hardware resource configuration parameter, that is, determining an actual parameter value of the thread parameter in the initial thread model.
In some embodiments, when determining the plurality of target thread parameter values corresponding to the plurality of thread parameters respectively according to the hardware resource configuration parameters, the plurality of target thread parameter values may be determined by adopting a rule matching or model calculation method.
For example, the embodiments of the present disclosure may provide a mapping relationship table between hardware resource configuration parameters and thread parameter values; when determining the plurality of target thread parameter values corresponding to the plurality of thread parameters, a target thread parameter value may be determined from the mapping table according to the hardware resource configuration parameters in a rule-matching manner.
Alternatively, a model calculation mode, for example, a pre-trained artificial intelligence-based model, may be adopted to input hardware resource configuration parameters and output corresponding target thread parameter values. Furthermore, the values of the target thread parameters may be determined according to any other possible manner, which is not specifically limited herein.
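One minimal way to realize the rule-matching alternative is a sorted lookup table keyed by CPU count, where the largest key not exceeding the actual CPU count wins. The tiers below are invented for illustration only:

```java
import java.util.Map;
import java.util.TreeMap;

public class ThreadParamRules {
    // Hypothetical mapping table: CPU count threshold -> core pool size.
    private static final TreeMap<Integer, Integer> CORE_POOL_BY_CPUS =
            new TreeMap<>(Map.of(1, 2, 4, 4, 8, 8, 16, 12));

    // Floor lookup: pick the entry for the largest threshold <= cpus.
    public static int corePoolSizeFor(int cpus) {
        return CORE_POOL_BY_CPUS.floorEntry(Math.max(1, cpus)).getValue();
    }
}
```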
S206: and respectively assigning the target thread parameter values to the corresponding thread parameters to obtain a target thread model.
After the target thread parameter values are determined, the target thread parameter values can be respectively assigned to the corresponding thread parameters, and therefore the assigned thread model is used as the target thread model.
In some embodiments, for example, a plurality of target thread parameter values are used to assign values to the initial parameter values in the initial thread model, so as to obtain the target thread model. The target thread parameter value is determined according to the hardware resource configuration parameter, so that the obtained target thread model can be matched with the hardware resource configuration parameter, the file analysis efficiency can be improved, and the file analysis effect is improved.
It is understood that the above example is only an exemplary illustration for obtaining the target thread model, and in practical applications, the target thread model may also be obtained in any other possible manner, which is not specifically limited herein.
S207: and acquiring analysis configuration parameters corresponding to the file.
Configuration parameters related to the file parsing processing logic may be referred to as parsing configuration parameters, and the parsing configuration parameters may be configured with reference to service scenario requirements of file parsing in advance.
For example: the analytical configuration parameters are shown in table 2:
TABLE 2
No. | Name | Type | Meaning
1 | batchSize | int | Number of records submitted per batch
2 | Pages | int | Number of pages into which the file is split
3 | PageSize | long | Total number of pages of the file after splitting
In conjunction with Table 2, the parsing configuration parameters include, for example and without limitation, the number of records submitted per batch batchSize, the number of split pages Pages, and the total page count after splitting PageSize; they may further include parameters such as the file size and the file line count, which are not specifically limited herein.
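A hypothetical holder for the Table 2 parameters, with a helper that derives a page count from a file's line count; the field semantics follow the table's translated descriptions as closely as possible, and the lines-per-page interpretation of the split size is an assumption:

```java
public class ParseConfig {
    final int batchSize;   // batchSize: records submitted per batch
    final long linesPerPage; // assumed split granularity: lines on each page

    ParseConfig(int batchSize, long linesPerPage) {
        this.batchSize = batchSize;
        this.linesPerPage = linesPerPage;
    }

    // Pages after splitting, via ceiling division over the line count.
    int pagesFor(long totalLines) {
        return (int) ((totalLines + linesPerPage - 1) / linesPerPage);
    }
}
```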
In some embodiments, if the target application already has configured parsing configuration parameters, that is, if corresponding parsing configuration parameters have been preconfigured locally on the server hosting the target application, those parameters may be obtained directly and used as the parsing configuration parameters.
In other embodiments, if the target application has no configured parsing configuration parameters, the parsing configuration parameters are generated from the hardware resource configuration parameters in combination with the requirements of the parsing service scenario. In other words, the parsing configuration parameters are generated by combining the hardware resource configuration parameters with the actual application scenario, so that they can be acquired flexibly, match the hardware resource configuration parameters, and meet the requirements of the parsing service scenario.
S208: and generating a target analysis model matched with the analysis configuration parameters.
The analysis model comprises the parameters and structures that are set when the file-parsing program is executed, and it can help integrate service scenario requirements into the parsing logic.
In this embodiment, an analysis model adapted to the service scenario requirements of file parsing may be referred to as a target analysis model. When the parsing configuration parameters are obtained, the encryption and decryption algorithm corresponding to the file can also be obtained, and a target analysis model is then generated from the parsing configuration parameters and the encryption and decryption algorithm to assist the file parsing operation.
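The patent does not name the encryption and decryption algorithm. As a stand-in, AES via the standard javax.crypto API shows where such an algorithm would plug into the target analysis model:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class FileCrypto {
    // mode is Cipher.ENCRYPT_MODE or Cipher.DECRYPT_MODE; key16 must be a
    // 16-byte AES key. ECB is used only to keep this sketch short -- a real
    // deployment would use an authenticated mode such as AES/GCM.
    public static byte[] aes(int mode, byte[] key16, byte[] data) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(mode, new SecretKeySpec(key16, "AES"));
        return c.doFinal(data);
    }
}
```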
Optionally, in some embodiments, when generating the target analytic model matching the analytic configuration parameters, an initial analytic model may be obtained, where the initial analytic model includes: a plurality of resolution parameters.
The parsing model can be used for assisting file parsing, such as splitting a file, submitting the file in batches, calculating the number of pages of the split file, and the like.
The analysis model with the initial analysis parameters may be referred to as an initial analysis model, for example, an analysis model with a plurality of analysis parameters with empty parameter values may be referred to as an initial analysis model, that is, a plurality of analysis parameters included in the initial analysis model may be empty values.
Further, a plurality of target analysis parameter values corresponding to the plurality of analysis parameters may be determined according to the analysis configuration parameters.
The analysis parameter value determined according to the analysis configuration parameter may be referred to as a target analysis parameter value, that is, the present disclosure may correspondingly generate a target analysis model adapted to a service scene requirement of file analysis according to the analysis configuration parameter corresponding to the service scene requirement, so that the file is analyzed based on the target analysis model in combination with the target thread model.
In some embodiments, multiple target resolution parameter values may be determined, such as by rule matching or model calculation.
For example, the embodiments of the present disclosure may provide a mapping relationship table between parsing configuration parameters and analysis parameter values; when determining the plurality of target analysis parameter values corresponding to the plurality of analysis parameters, the target analysis parameter values may be determined from the mapping table according to the parsing configuration parameters in a rule-matching manner.
Alternatively, a model calculation mode, for example, a pre-trained artificial intelligence-based model, may be adopted to input the analysis configuration parameters and output the corresponding target analysis parameter values. In addition, the target resolution parameter values may be determined according to any other possible manner, which is not specifically limited herein.
Further, the target analysis parameter values are respectively assigned to the corresponding analysis parameters to obtain a target analysis model. For example, a plurality of target analysis parameter values are used to assign a plurality of corresponding analysis parameters in the initial analysis model, so as to obtain the target analysis model. By determining the target analysis parameter value according to the analysis configuration parameter, the model parameter of the target analysis model can be adapted to the file analysis, and the subsequent auxiliary file analysis operation is facilitated.
S209: parse the file according to the target thread model and the target parsing model.
Further, the file is parsed according to the target thread model and the target parsing model; for example, an encrypted bill file is decrypted and parsed step by step according to the two models to obtain the bill content.
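A minimal sketch of that step-by-step decrypt-then-parse flow is below. The base64 "cipher", record layout, and parameter names are all invented for illustration; a real bill file would use its own encryption scheme and format:

```python
import base64
from concurrent.futures import ThreadPoolExecutor

def parse_bill_file(lines, thread_model, parse_model):
    """Decrypt and parse each bill record using a worker pool sized
    according to the target thread model, with the record layout taken
    from the target parsing model."""
    def decrypt_and_parse(line):
        plain = base64.b64decode(line).decode("utf-8")       # step 1: decrypt
        order_id, amount = plain.split(parse_model["separator"])  # step 2: parse
        return {"order_id": order_id, "amount": float(amount)}

    with ThreadPoolExecutor(max_workers=thread_model["pool_size"]) as pool:
        # pool.map preserves input order, so records stay in file order
        return list(pool.map(decrypt_and_parse, lines))

encrypted = [base64.b64encode(b"A001|12.50"), base64.b64encode(b"A002|3.00")]
bills = parse_bill_file(encrypted, {"pool_size": 4}, {"separator": "|"})
```

The point of the sketch is the division of labour: concurrency comes entirely from the thread model, while the per-record logic comes entirely from the parsing model.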
In addition, this embodiment may record the execution result and the model parameters in effect for each run, and calculate the execution efficiency of the various models, so as to analyze the overall parsing performance.
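Recording execution results alongside the model parameters and deriving an efficiency figure could look like the following; the records-per-second metric is one plausible choice, not one mandated by the disclosure:

```python
class ModelStats:
    """Record per-run execution results together with the model parameters
    that were in effect, and compute a simple records-per-second figure."""
    def __init__(self):
        self.runs = []

    def record(self, model_params: dict, record_count: int, elapsed_s: float):
        self.runs.append((model_params, record_count, elapsed_s))

    def efficiency(self) -> float:
        # records parsed per second, aggregated over all recorded runs
        total_records = sum(n for _, n, _ in self.runs)
        total_time = sum(t for _, _, t in self.runs)
        return total_records / total_time if total_time else 0.0

stats = ModelStats()
stats.record({"pool_size": 4}, record_count=1000, elapsed_s=2.0)
stats.record({"pool_size": 8}, record_count=3000, elapsed_s=2.0)
```

Comparing `efficiency()` across runs with different model parameters is what supports the overall performance analysis mentioned above.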
Optionally, in some embodiments, if the target application program has configured parsing configuration parameters, the target thread model and the target parsing model are updated according to those parameters. That is, the embodiments of the present disclosure also support updating the two models, which reduces the software and hardware resource consumption they generate, facilitates reuse of the target thread model and the target parsing model, and improves file parsing efficiency.
In practical applications, the target thread model and the target parsing model of the embodiments of the present disclosure also support manual configuration. For example, a configuration center may be provided to store manually configured parameters. The manual configuration is pulled once from the configuration center: if no manual configuration exists, the target thread model and the target parsing model are generated from the parameters loaded at startup; if the configuration center does store manually configured thread-model parameters, those parameters are used to update the target thread model and the target parsing model. In addition, both models can be manually adjusted according to actual requirements.
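The configuration-center fallback can be sketched in a few lines; the dictionary shapes and the `thread_model` key are assumptions made for illustration:

```python
def resolve_model_params(config_center: dict, startup_params: dict) -> dict:
    """Pull the manual configuration once from the configuration center.
    If nothing has been manually configured, fall back to the parameters
    loaded at startup; otherwise let the manual values override them."""
    manual = config_center.get("thread_model")   # single pull from the center
    if manual is None:
        return dict(startup_params)              # generate from startup load
    merged = dict(startup_params)
    merged.update(manual)                        # manually configured values win
    return merged
```

Keeping the merge in one place means later manual adjustments only need to repopulate the configuration center, not touch the startup path.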
According to the embodiments of the present disclosure, the program identifier of a target application program used to parse a file is determined, the running-environment information corresponding to the program identifier is acquired, a target thread model matched with the running-environment information is generated, and the file is parsed according to the target thread model. The parsing logic of the application program is thereby matched with its running environment, which effectively improves the parsing performance of the application program, gives the file-parsing application portability, and reduces the operation and maintenance cost of deployment across multiple running environments. In addition, different target thread parameter values can be determined for different hardware resource configuration parameters, so the target application program can load a target thread model adapted to the hardware resources of the server it runs on. Furthermore, the software and hardware resource consumption generated by the models can be reduced, the target thread model and the target parsing model can be conveniently reused, and file parsing efficiency is improved.
Fig. 3 is a schematic diagram of a file parsing apparatus according to a third embodiment of the present disclosure.
As shown in fig. 3, the file parsing apparatus 30 includes:
a determining module 301, configured to determine a program identifier of a target application program, where the target application program is used to parse a file; a first obtaining module 302, configured to obtain running environment information corresponding to a program identifier; a generating module 303, configured to generate a target thread model matched with the running environment information; and an analysis module 304, configured to analyze the file according to the target thread model.
Optionally, in some embodiments, as shown in fig. 4, fig. 4 is a schematic diagram of a file parsing apparatus according to a fourth embodiment of the present disclosure, where the first obtaining module 302 includes: a first determining submodule 3021, configured to determine, according to the program identifier, a server identifier of a server on which the target application program operates; the first obtaining sub-module 3022 is configured to obtain a hardware resource configuration parameter corresponding to the server identifier, and use the hardware resource configuration parameter as the operating environment information.
Optionally, in some embodiments, as shown in fig. 4, the generating module 303 includes: a second obtaining submodule 3031, configured to obtain an initial threading model, where the initial threading model includes: a plurality of thread parameters; a second determining submodule 3032, configured to determine, according to the hardware resource configuration parameter, a plurality of target thread parameter values corresponding to the plurality of thread parameters, respectively; the assignment submodule 3033 is configured to assign the multiple target thread parameter values to the corresponding multiple thread parameters, respectively, so as to obtain a target thread model.
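The second determining submodule's mapping from hardware resource configuration parameters to target thread parameter values is not prescribed by the disclosure; one minimal, hypothetical rule (all names and scaling factors invented for illustration) might be:

```python
def derive_thread_params(hardware: dict) -> dict:
    """Hypothetical sizing rule: derive the thread parameters of the target
    thread model from the server's hardware resource configuration
    parameters (CPU core count and memory in GiB)."""
    cores = hardware["cpu_cores"]
    mem_gib = hardware["memory_gib"]
    return {
        "core_pool_size": cores,
        "max_pool_size": cores * 2,
        # cap the work queue so a small-memory server is not overwhelmed
        "queue_capacity": min(1000, mem_gib * 100),
    }
```

Under a rule like this, the same application binary loads a larger pool on a bigger server and a smaller one elsewhere, which is the portability benefit the disclosure claims.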
Optionally, in some embodiments, as shown in fig. 4, the apparatus 30 further comprises: a second obtaining module 305, configured to obtain an initial threading model, where the initial threading model includes: a plurality of thread parameters; the parsing module 304, comprising: a generating submodule 3041 for generating a target analysis model matched with the analysis configuration parameters; the parsing submodule 3042 is configured to parse the file according to the target thread model and the target parsing model.
Optionally, in some embodiments, the generating sub-module 3041 is specifically configured to: obtaining an initial analytical model, wherein the initial analytical model comprises: a plurality of resolution parameters; determining a plurality of target analysis parameter values respectively corresponding to the plurality of analysis parameters according to the analysis configuration parameters; and respectively assigning the target analysis parameter values to the corresponding analysis parameters to obtain a target analysis model.
Optionally, in some embodiments, the second obtaining module 305 is specifically configured to: if the target application program has the configured analysis configuration parameters, taking the configured analysis configuration parameters as the corresponding analysis configuration parameters; and if the target application program does not have the configured analysis configuration parameters, generating the analysis configuration parameters according to the hardware resource configuration parameters and the analysis service scene requirements.
Optionally, in some embodiments, the second obtaining module 305 is specifically configured to: and if the target application program has the configured analysis configuration parameters, updating the target thread model and the target analysis model according to the configured analysis configuration parameters.
It should be noted that the explanation of the file parsing method is also applicable to the file parsing apparatus of the present embodiment, and is not repeated herein.
According to the embodiments of the present disclosure, the program identifier of a target application program used to parse a file is determined, the running-environment information corresponding to the program identifier is acquired, a target thread model matched with the running-environment information is generated, and the file is parsed according to the target thread model. The parsing logic of the application program is thereby matched with its running environment, which effectively improves the parsing performance of the application program, gives the file-parsing application portability, and reduces the operation and maintenance cost of deployment across multiple running environments.
In order to implement the foregoing embodiments, the present disclosure also provides a computer device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the file parsing method proposed by the foregoing embodiments of the present disclosure is implemented when the processor executes the program.
In order to achieve the above embodiments, the present disclosure also proposes a non-transitory computer-readable storage medium on which a computer program is stored, which when executed by a processor implements the file parsing method as proposed by the foregoing embodiments of the present disclosure.
In order to implement the foregoing embodiments, the present disclosure further provides a computer program product which, when the instructions in the computer program product are executed by a processor, performs the file parsing method proposed in the foregoing embodiments of the present disclosure.
FIG. 5 illustrates a block diagram of an exemplary computer device suitable for use in implementing embodiments of the present disclosure. The computer device 12 shown in fig. 5 is only one example and should not bring any limitations to the functionality or scope of use of the embodiments of the present disclosure.
As shown in FIG. 5, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. Such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 28 may include computer system readable media in the form of volatile Memory, such as Random Access Memory (RAM) 30 and/or cache Memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive").
Although not shown in FIG. 5, a disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a Compact disk Read Only Memory (CD-ROM), a Digital versatile disk Read Only Memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the disclosure.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally perform the functions and/or methodologies of the embodiments described in this disclosure.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Moreover, computer device 12 may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public Network such as the Internet) via Network adapter 20. As shown, network adapter 20 communicates with the other modules of computer device 12 via bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, implementing the file parsing method mentioned in the foregoing embodiments.
Other embodiments of the present disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
It should be noted that, in the description of the present disclosure, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Further, in the description of the present disclosure, "a plurality" means two or more unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present disclosure includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present disclosure.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present disclosure have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present disclosure, and that changes, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present disclosure.

Claims (16)

1. A file parsing method, the method comprising:
determining a program identifier of a target application program, wherein the target application program is used for analyzing a file;
acquiring running environment information corresponding to the program identifier;
generating a target thread model matched with the running environment information; and
and analyzing the file according to the target thread model.
2. The method of claim 1, wherein the obtaining operating environment information corresponding to the program identification comprises:
determining a server identifier of a server operated by the target application program according to the program identifier;
and acquiring a hardware resource configuration parameter corresponding to the server identifier, and taking the hardware resource configuration parameter as the running environment information.
3. The method of claim 2, wherein generating a target thread model that matches the runtime environment information comprises:
obtaining an initial threading model, the initial threading model comprising: a plurality of thread parameters;
determining a plurality of target thread parameter values respectively corresponding to the plurality of thread parameters according to the hardware resource configuration parameters;
and respectively assigning the target thread parameter values to the corresponding thread parameters to obtain the target thread model.
4. The method of claim 2, prior to said parsing said file according to said target thread model, further comprising:
acquiring analysis configuration parameters corresponding to the file;
then, the parsing the file according to the target thread model includes:
generating a target analysis model matched with the analysis configuration parameters;
and analyzing the file according to the target thread model and the target analysis model.
5. The method of claim 4, wherein the generating a target analytic model that matches the analytic configuration parameters comprises:
obtaining an initial analytical model, wherein the initial analytical model comprises: a plurality of resolution parameters;
determining a plurality of target analysis parameter values respectively corresponding to the analysis parameters according to the analysis configuration parameters;
and respectively assigning the target analysis parameter values to the corresponding analysis parameters to obtain the target analysis model.
6. The method of claim 4, wherein the obtaining parsing configuration parameters corresponding to the file comprises:
if the target application program has configured analysis configuration parameters, taking the configured analysis configuration parameters as the corresponding analysis configuration parameters;
and if the target application program does not have the configured analysis configuration parameters, generating the analysis configuration parameters according to the hardware resource configuration parameters and the analysis service scene requirements.
7. The method of claim 6, further comprising:
and if the target application program has configured analysis configuration parameters, updating the target thread model and the target analysis model according to the configured analysis configuration parameters.
8. A file parsing apparatus, comprising:
the system comprises a determining module, a determining module and a processing module, wherein the determining module is used for determining a program identifier of a target application program, and the target application program is used for analyzing a file;
the first acquisition module is used for acquiring the running environment information corresponding to the program identifier;
the generating module is used for generating a target thread model matched with the running environment information; and
and the analysis module is used for analyzing the file according to the target thread model.
9. The apparatus of claim 8, wherein the first obtaining module comprises:
the first determining submodule is used for determining the server identification of the server operated by the target application program according to the program identification;
and the first obtaining submodule is used for obtaining the hardware resource configuration parameters corresponding to the server identification and taking the hardware resource configuration parameters as the running environment information.
10. The apparatus of claim 9, wherein the generating module comprises:
a second obtaining sub-module, configured to obtain an initial thread model, where the initial thread model includes: a plurality of thread parameters;
a second determining submodule, configured to determine, according to the hardware resource configuration parameter, a plurality of target thread parameter values corresponding to the plurality of thread parameters, respectively;
and the assignment submodule is used for assigning the target thread parameter values to the corresponding thread parameters respectively so as to obtain the target thread model.
11. The apparatus of claim 9, wherein the apparatus further comprises:
the second acquisition module is used for acquiring the analysis configuration parameters corresponding to the file;
the analysis module comprises:
the generation submodule is used for generating a target analysis model matched with the analysis configuration parameters;
and the analysis submodule is used for analyzing the file according to the target thread model and the target analysis model.
12. The apparatus of claim 11, wherein the generation submodule is specifically configured to:
obtaining an initial analytical model, wherein the initial analytical model comprises: a plurality of resolution parameters;
determining a plurality of target analysis parameter values respectively corresponding to the analysis parameters according to the analysis configuration parameters;
and respectively assigning the target analysis parameter values to the corresponding analysis parameters to obtain the target analysis model.
13. The apparatus of claim 11, wherein the second obtaining module is specifically configured to:
if the target application program has configured analysis configuration parameters, taking the configured analysis configuration parameters as the corresponding analysis configuration parameters;
and if the target application program does not have the configured analysis configuration parameters, generating the analysis configuration parameters according to the hardware resource configuration parameters and the analysis service scene requirements.
14. The apparatus of claim 13, wherein the second obtaining module is specifically configured to:
and if the target application program has configured analysis configuration parameters, updating the target thread model and the target analysis model according to the configured analysis configuration parameters.
15. A computer device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-7.
CN202110460564.3A 2021-04-27 2021-04-27 File analysis method and device, computer equipment and storage medium Pending CN113204376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110460564.3A CN113204376A (en) 2021-04-27 2021-04-27 File analysis method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110460564.3A CN113204376A (en) 2021-04-27 2021-04-27 File analysis method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113204376A true CN113204376A (en) 2021-08-03

Family

ID=77029095

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110460564.3A Pending CN113204376A (en) 2021-04-27 2021-04-27 File analysis method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113204376A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011143852A1 (en) * 2010-05-21 2011-11-24 中兴通讯股份有限公司 Managing method, device and terminal for application program
CN103678396A (en) * 2012-09-20 2014-03-26 阿里巴巴集团控股有限公司 Data backup method and device based on data models
CN107168726A (en) * 2017-03-30 2017-09-15 武汉斗鱼网络科技有限公司 A kind of method and apparatus of dynamic configuration application program
CN107659632A (en) * 2017-09-19 2018-02-02 咪咕数字传媒有限公司 A kind of file encryption-decryption method, device and computer-readable recording medium
CN110688828A (en) * 2019-09-20 2020-01-14 京东数字科技控股有限公司 File processing method and device, file processing system and computer equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114157566A (en) * 2021-11-08 2022-03-08 中信科移动通信技术股份有限公司 Base station performance file analysis method and system
CN114157566B (en) * 2021-11-08 2023-09-08 中信科移动通信技术股份有限公司 Base station performance file analysis method and system
CN114860738A (en) * 2022-07-05 2022-08-05 中航信移动科技有限公司 Data processing system for determining order number environment category

Similar Documents

Publication Publication Date Title
US8533720B2 (en) Offloading work from one type to another type of processor based on the count of each type of service call instructions in the work unit
US10175977B2 (en) User profile based code review
CN111158741B (en) Method and device for monitoring dependency relationship change of service module on third party class library
US9898258B2 (en) Versioning of build environment information
CN113204376A (en) File analysis method and device, computer equipment and storage medium
CN110659210A (en) Information acquisition method and device, electronic equipment and storage medium
CN110688111A (en) Configuration method, device, server and storage medium of business process
US20120060150A1 (en) High performance execution in workflow bpm engine
TW202034155A (en) Negative zero control in instruction execution
US10620952B2 (en) Conversion of boolean conditions
US20210406150A1 (en) Application instrumentation and event tracking
US10684939B2 (en) Using workload profiling and analytics to understand and score complexity of test environments and workloads
WO2021044502A1 (en) Test design device, test design method, test design program, and test design system
CN110070383B (en) Abnormal user identification method and device based on big data analysis
US11182272B2 (en) Application state monitoring
US10353795B2 (en) Standardizing run-time and historical customer and test environments and workloads comparisons using specific sets of key platform data points
JP6697486B2 (en) Garbage collection without special instructions
US20140282461A1 (en) Concurrent patching of shared libraries
US11392371B2 (en) Identification of a partial code to be refactored within a source code
CN109634636B (en) Application processing method, device, equipment and medium
US20180095835A1 (en) Resilient analytics utilizing dark data
CN112925523A (en) Object comparison method, device, equipment and computer readable medium
US20220229766A1 (en) Development of applications using telemetry data and performance testing
CN116661758B (en) Method, device, electronic equipment and medium for optimizing log framework configuration
US9250870B2 (en) Automated creation of shim programs and interfaces

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination