WO2015078248A1 - Dynamic code instrumentation - Google Patents

Dynamic code instrumentation

Info

Publication number
WO2015078248A1
Authority
WO
WIPO (PCT)
Prior art keywords: test, response, request, configuration file, mut
Prior art date
Application number
PCT/CN2014/089379
Other languages
English (en)
French (fr)
Inventor
Xiaowei Chen
Yilin Wang
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Publication of WO2015078248A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3466: Performance evaluation by tracing or monitoring
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F 11/3414: Workload generation, e.g. scripts, playback
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2201/00: Indexing scheme relating to error detection, to error correction, and to monitoring
    • G06F 2201/865: Monitoring of software
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs

Definitions

  • the disclosed implementations relate generally to the field of computer technology, and in particular, to methods and systems for monitoring performance of a software module under test (MUT) .
  • MUT: software module under test
  • Protocol Buffers (sometimes called Protobuf) are used to encode structured data in an efficient yet extensible format. Google has used Protocol Buffers for almost all of its internal remote procedure call (RPC) protocols and file formats. More importantly, Google has open-sourced Protocol Buffers so that developers can use them for serialization of structured data.
  • Protocol Buffers are associated with a specific interface definition language that is used to create message structures. For each message structure, interface code is generated using a dedicated compiler, i.e., protoc, associated with Protocol Buffers, which thereby creates communication interfaces or ports to access the corresponding message structure.
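  • As an illustration only, the following Python fragment sketches how a protoc-generated message class is used to serialize and deserialize structured data; the .proto definition, the package name mut, the message name ParseRequest and the module name test_pb2 are assumptions, not part of this application.

```python
# Hypothetical .proto definition, compiled with:  protoc --python_out=. test.proto
#
#   syntax = "proto3";
#   package mut;
#   message ParseRequest {
#     string payload = 1;
#   }

import test_pb2  # assumed protoc output; not part of the application

request = test_pb2.ParseRequest()
request.payload = "hello"

wire_bytes = request.SerializeToString()  # encode the structured data for transport

decoded = test_pb2.ParseRequest()
decoded.ParseFromString(wire_bytes)       # restore the structured data on the receiving side
assert decoded.payload == "hello"
```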
  • FIG. 1A illustrates a conventional module measurement system 100 including a module under test (MUT) 102 that is driven by a test drive module 104.
  • MUT 102 needs to be tested.
  • Module measurement system 100 further includes a code instrumentation module 106 coupled to MUT 102 to monitor the performance of MUT 102.
  • Module measurement system 100 has become a typical test model, and has been adopted by many servers that rely on Google’s Protocol Buffers to encode structured data.
  • test drive module 104 sends a parsing request to MUT 102.
  • Code instrumentation module 106 simulates a load to MUT 102, monitors the performance of MUT 102, and returns a test response including test results to test drive module 104.
  • in some situations, a plurality of code instrumentation modules 106 are needed, each being applied to simulate a specific load condition for a specific MUT in response to a specific request.
  • each code instrumentation module 106 could end up having a distinct request analysis and response method in order to meet the respective requirements of the corresponding MUT 102 or parsing request.
  • the process of creating code instrumentation modules 106 could be complicated.
  • code instrumentation module 106 functions in a static manner when it simulates the load on MUT 102 and generates the response. Such static operation prevents code instrumentation module 106 from expanding its functionality when MUT 102 or the parsing request changes. There exists a need to increase the extensibility of code instrumentation module 106.
  • the application is implemented in a computer system that has one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. Instructions for performing these functions may be included in a computer program product configured for execution by one or more processors.
  • One aspect of the application is a computer-implemented method for monitoring performance of a module under test (MUT) .
  • the method is performed by a computer system that includes at least the MUT.
  • the method includes obtaining a test configuration file, wherein the test configuration file includes a plurality of test scenarios, each of which further includes a test condition and a set of test rules for generating a test response.
  • the method further includes receiving a first test request including a first test condition from the MUT, wherein the MUT is configured to generate the first test request in response to a test instruction from a test driving module.
  • the method further includes identifying, in the test configuration file, a test scenario whose test condition corresponds to the first test condition, and generating a first test response corresponding to the first test request using a first set of test rules in the identified test scenario.
  • Another aspect of the application is a computer system that includes at least a MUT.
  • the computer system includes one or more processors, and memory having instructions stored thereon, which when executed by the one or more processors cause the processors to perform operations to implement the above methods for monitoring performance of the MUT.
  • Another aspect of the application is a non-transitory computer-readable medium, having instructions stored thereon, which when executed by one or more processors cause the processors to perform operations to implement the above methods for monitoring performance of a MUT.
  • FIG. 1A is a block diagram of a conventional module measurement system including a MUT that is driven by a test drive module and needs to be tested.
  • FIG. 1B is a block diagram of a module measurement system including a MUT that is driven by a test drive module and needs to be tested in accordance with some embodiments of the application.
  • FIGS. 2-4 are flow charts of three exemplary methods for monitoring performance of a MUT in configurable instrumentation environments in accordance with some embodiments of the application.
  • FIGS. 5-7 are block diagrams of code instrumentation modules that monitor performance of a MUT in configurable instrumentation environments in accordance with some embodiments of the application.
  • FIG. 8 is a block diagram of an exemplary computer system that monitors performance of a MUT in accordance with some embodiments.
  • FIG. 1B is a block diagram of a module measurement system 120 including a MUT 10 that needs to be tested in accordance with some embodiments of the application.
  • module measurement system 120 includes a test drive module 20 that drives MUT 10, a code instrumentation module 30 and a test configuration file 40.
  • code instrumentation module 30 creates a testing environment to measure performance of MUT 10. Specifically, code instrumentation module 30 identifies a test condition according to a test instruction received from test drive module 20, resorts to test configuration file 40, and automatically and dynamically configures the testing environment (e.g., creating a load for MUT 10) according to the test condition as instructed by test configuration file 40. In this way, code instrumentation module 30 does not need to predetermine and store program code for testing MUT 10 in response to every possible test instruction.
  • Test drive module 20 is coupled to MUT 10, and generates a test instruction to test MUT 10 under a certain condition.
  • the condition specifies what kind of load MUT 10 has to support.
  • In response to the test instruction, MUT 10 generates a first test request that includes a first test condition.
  • MUT 10 is further coupled to code instrumentation module 30, and sends the first test request thereto.
  • Code instrumentation module 30 obtains test configuration file 40 optionally from a local memory or a remote computer, and test configuration file 40 includes a plurality of test scenarios each of which further includes a test condition and a set of test rules for generating a respective test response.
  • In response to the first test request made by MUT 10, code instrumentation module 30 identifies, in test configuration file 40, a test scenario whose test condition corresponds to the first test condition. Then, code instrumentation module 30 sets up a test environment for MUT 10 according to the test scenario, e.g., by enforcing a first set of rules, and generates a first test response in response to the first test request using the first set of test rules in the identified test scenario. MUT 10 further generates a test reply based on the first test response, and returns the test reply to test drive module 20 in response to the test instruction.
  • MUT 10 receives a second test instruction from test drive module 20, and generates a second test request.
  • Code instrumentation module 30 updates the first test response in accordance with the second test request, and returns the updated first test response to MUT 10.
  • MUT 10 is configured to update the test reply in response to the test instruction received from test drive module 20 in accordance with the updated first test response.
  • the first test response includes a response message.
  • Code instrumentation module 30 receives at least part of the response message (e.g., content or value for a field of the response message) via a remote procedure call (RPC).
  • the first test response is generated dynamically and automatically to update the part of the response message further in accordance with the first set of test rules.
  • the first test response further includes a response message determined according to a response rule that is not defined in test configuration file 40.
  • this response rule is retained within code instrumentation module 30 or received from a remote machine (optionally via a RPC) .
  • the first set of test rules in test configuration file 40 optionally has a priority over the response rule.
  • test configuration file 40 is provided via a RPC.
  • module measurement system 120 is implemented in a distributed computer system in which at least test configuration file 40 is stored at a remote server.
  • Test configuration file 40 is transferred to code instrumentation module 30 which is located at a local client machine, when the local machine makes a remote procedure call to fetch test configuration file 40.
  • the test configuration file is stored in a local memory.
  • the first test response includes a test response message, and at least part of the response message (e.g., a field of the response message) is provided by a remote machine via the RPC independently of the test configuration file.
  • Google Protocol Buffer is applied to structure and transfer at least one of test configuration file 40, the test instruction, the first test request, the first test response, the test reply and the like.
  • test configuration file 40 is stored in a remote server according to a certain data structure.
  • Protobuf includes a request/response protocol that allows a local client machine that stores code instrumentation module 30 to communicate with the remote server. The client machine handles the marshaling and un-marshaling procedure: data received from the remote server arrives in a serialized format, and the client machine deserializes that data for further use. In this example, Protobuf allows code instrumentation module 30 to send a request for test configuration file 40.
  • Part or all of test configuration file 40 is transferred from the remote server to the client machine in a serial data format, and deserialized for use in code instrumentation module 30.
  • each of the test instruction, the first test request, the first test response and the test reply could be managed by Protobuf.
  • the respective test condition includes receiving a test request that includes a respective request message, and the respective generated test response includes a respective response message.
  • code instrumentation module 30 extracts the test condition corresponding to a test request by parsing the respective request message.
  • the respective condition of each test scenario in test configuration file 40 requires that the respective test request include at least one keyword associated with the type of the respective test scenario.
  • the method further includes creating test configuration file 40. Specifically, for each test scenario, the respective at least one keyword is obtained based on an introspection feature of Google Protocol Buffer, and the respective test scenario is associated with the respective at least one keyword.
  • the method further includes deserializing the first test request and identifying at least one first keyword in the first test request. The first test request is associated with the test scenario in test configuration file 40 by comparing the at least one first keyword with the keywords associated with the plurality of test scenarios in test configuration file 40.
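  • A minimal sketch of this keyword matching follows; the helper names and the dictionary-based scenario structure are illustrative assumptions, while the introspection calls (DESCRIPTOR, ListFields) are standard protobuf APIs.

```python
from google.protobuf.message import Message


def extract_keywords(request_msg: Message) -> set:
    """Collect candidate keywords from a deserialized request using protobuf introspection."""
    keywords = {request_msg.DESCRIPTOR.name}              # the request message type name
    for field, value in request_msg.ListFields():         # only fields that are actually set
        keywords.add(field.name)
        if (field.type == field.TYPE_STRING
                and field.label != field.LABEL_REPEATED):
            keywords.add(value)                            # scalar string content as a keyword
    return keywords


def match_scenario(request_msg: Message, scenarios: list):
    """Return the first test scenario whose associated keywords overlap the request keywords."""
    request_keywords = extract_keywords(request_msg)
    for scenario in scenarios:                             # e.g. {'keywords': [...], 'rules': [...]}
        if request_keywords & set(scenario['keywords']):
            return scenario
    return None
```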
  • FIG. 1B is intended more as functional description of the various features which may be present in a set of devices or computer systems than as a structural schematic of the implementations described herein.
  • modules or components shown separately could be combined and some modules could be separated.
  • some modules shown separately in FIG. 1B could be implemented on single machines and single modules could be implemented by one or more machines.
  • Some modules or components may be omitted, and modules or components that are not shown may be added.
  • the actual number of modules used to monitor performance of MUT 10 and how features are allocated among them will vary from one implementation to another.
  • FIGS. 2-4 are flow charts of exemplary methods 200, 300 and 400 for monitoring performance of a MUT 10 in configurable instrumentation environments in accordance with some embodiments of the application, respectively.
  • Each of methods 200, 300 and 400 is, optionally, governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a computer system.
  • Each of the operations shown in FIGS. 2-4 may correspond to instructions stored in a computer memory or non-transitory computer readable storage medium.
  • the computer readable storage medium may include a magnetic or optical disk storage device, solid state storage devices such as Flash memory, or other non-volatile memory device or devices.
  • the instructions stored on the computer readable storage medium may include one or more of: source code, assembly language code, object code, or other instruction format that is interpreted by one or more processors. Some operations in each of methods 200, 300 and 400 may be combined and/or the order of some operations may be changed.
  • MUT performance monitoring method 200 includes (S01) acquiring a mapping relationship between a request message of a test request and a response message of a corresponding test response from a response protocol file.
  • the request message is part of a test request, and is optionally identified by a request message identifier (e.g., a request message name) .
  • the response message is part of a test response, and is optionally identified by a response message identifier (e.g., a response message name) .
  • method 200 is implemented by a computer system to acquire the mapping relationship from the response protocol file, e.g., a .proto file used in Protobuf.
  • the response protocol file is part of test configuration file 40.
  • a service interface is defined in the response protocol file (e.g., a .proto file), and a compiling option is added based on the defined service interface for the purpose of providing the mapping relationship between the test requests and their corresponding test responses.
  • the response protocol file (e.g., a .proto file) is parsed directly to obtain the mapping relationship between test requests and test responses.
  • each test request includes a respective request message identified by a request message name, and each test response includes a respective response message identified by a response message name. Therefore, the .proto file is parsed to obtain the mapping relationship indicated by the request message names and the corresponding response message names.
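  • A minimal sketch of deriving that mapping from a compiled .proto file that defines a service interface follows; the module name test_pb2 and the method shown in the comment are assumptions, while the descriptor attributes are standard protobuf APIs.

```python
import test_pb2  # assumed protoc output for the response protocol (.proto) file


def build_request_response_mapping(pb2_module):
    """Map each request message name to its response message name via the service methods."""
    mapping = {}
    for service in pb2_module.DESCRIPTOR.services_by_name.values():
        for method in service.methods:
            # e.g.  rpc Parse (ParseRequest) returns (ParseResponse);
            mapping[method.input_type.name] = method.output_type.name
    return mapping


# mapping might look like {'ParseRequest': 'ParseResponse'}
mapping = build_request_response_mapping(test_pb2)
```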
  • the computer system receives (S02) a test request that includes a test request message, and obtains the request message name corresponding to the test request message.
  • the computer system further determines the response message name corresponding to the request message name according to the mapping relationship acquired from test configuration file 40.
  • a test response is dynamically generated (S03) to include a test response message according to the response message name determined based on the response protocol file.
  • the request message name corresponding to the test request is obtained and used to create the test response based on serialization routines (e.g., using protobuf’s introspection feature, which allows the computer system to dynamically generate a test response message without knowing explicitly the format of the test response message) .
  • the response message name corresponding to the request message name is identified in and obtained from the response protocol file.
  • a test response message is dynamically generated according to the response message name associated with the request message name, for example as follows:
  • msg = RespClass()
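  • One way to resolve RespClass from a response message name at run time is protobuf's symbol database; the sketch below assumes a hypothetical fully qualified name mut.ParseResponse and that the generated module has already been imported so the symbol is registered.

```python
from google.protobuf import symbol_database

sym_db = symbol_database.Default()
RespClass = sym_db.GetSymbol('mut.ParseResponse')  # look up the generated class by full name
msg = RespClass()                                  # its response fields are still undetermined
```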
  • test response message includes at least one response field whose content or value has not been determined.
  • the computer system dynamically sets (S04) the content or value for the at least one response field of the test response message according to a set of test rules in test configuration file 40.
  • test configuration file 40 dynamically sets up an instrumentation environment for testing the MUT under different test scenarios or conditions.
  • test configuration file 40 includes the set of test rules which defines a test response associated with each test request.
  • test configuration file 40 is searched to identify the corresponding test response, which oftentimes includes a test response message.
  • the test response message includes at least one response field that has undetermined content or value.
  • Test configuration file 40 is referenced to determine the undetermined content or value for at least one response field of the test response message according to the test rules that are associated with the corresponding test scenario as determined by the test request.
  • Each test rule in test configuration file 40 includes a Trigger condition which, when satisfied, would trigger a corresponding Action, e.g., setting content or value to the at least one undetermined response field in a test response message.
  • the Trigger condition is defined according to the content of the request fields in the corresponding deserialized test request message, and the corresponding Action is implemented to determine one or more response fields in the test response message.
  • the Trigger condition and the corresponding Action are represented in a code script that could be directly implemented in a dynamic programming language.
  • the computer system deserializes the test request and checks each test rule of the set of test rules in test configuration file 40.
  • when a Trigger condition of a specific test rule is satisfied, a corresponding Action is implemented; otherwise, the computer system moves on to check the next test rule in the test rule set. This rule check process is repeated until all rules in the test rule set are checked.
  • the at least one undetermined response field of the test response message is set to a default response field. When none of the Trigger conditions of the test rules is satisfied, the test response includes the default response field.
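  • A minimal sketch of this rule-check loop follows, assuming each test rule is represented as a dictionary holding a Trigger predicate over the deserialized request and an Action that fills response fields; the structure and names are illustrative only.

```python
def apply_test_rules(request_msg, response_msg, test_rules, default_fields):
    """Check every rule; fire the Action of each rule whose Trigger matches the request."""
    matched = False
    for rule in test_rules:
        if rule['trigger'](request_msg):        # Trigger condition over the request fields
            rule['action'](response_msg)        # Action sets undetermined response fields
            matched = True
        # otherwise move on to the next rule; every rule in the set is checked
    if not matched:                             # no Trigger satisfied: fall back to defaults
        for name, value in default_fields.items():
            setattr(response_msg, name, value)  # scalar protobuf fields accept plain assignment
    return response_msg
```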
  • the mapping relationship is acquired between a request message and a response message from a response protocol file which is optionally part of a test configuration file.
  • After receiving a test request, the computer system obtains the request message name corresponding to the test request, identifies the response message name corresponding to the request message name according to the mapping relationship, dynamically generates a response message corresponding to the response message name, and dynamically sets content or value for one or more response fields of the corresponding test response message according to the set of rules in test configuration file 40.
  • in conventional systems, the relationship between each test request and a corresponding test response has to be programmed separately in respective program codes.
  • the computer system identifies one or more test rules in a test configuration file 40 according to the test request, and creates the corresponding test response according to the identified test rules. As such, no custom program codes need to be created separately for each test request, and the amount of work is reduced for monitoring performance of the MUT under different scenarios.
  • the computer system generates a test response according to test configuration file 40, and moreover includes a communication interface to receive information from a remote machine (e.g., a response rule).
  • a code instrumentation module of the computer system is dynamically controlled by the remote machine via a Remote Procedure Call (RPC) , and generates the test response in response to a test request at least partially according to information provided via the RPC.
  • the test response could be dynamically edited to include or comply with the information received via the RPC in response to the corresponding test request. Therefore, performance monitoring method 300 allows both the information provided via the RPC and test configuration file 40 to provide for configurable test responses associated with different test requests.
  • a response field of the test response has to be modified according to different test scenarios.
  • for example, the response field of the test response is set to null in a normal scenario, and is not set to null in an abnormal scenario. Therefore, the response field of the test response is dynamically modified according to different test scenarios, and both the information provided via the RPC and test configuration file 40 could be applied to modify the response field in the test response dynamically.
  • the MUT performance monitoring method 300 also includes (S01) acquiring a mapping relationship between a request message of a test request and a response message of a corresponding test response from a response protocol file.
  • method 300 is implemented by a computer system to acquire the mapping relationship from a response protocol file, e.g., a .proto file used in Protobuf.
  • a service interface is defined in the response protocol file (e.g., the .proto file), and a compiling option is added based on the defined service interface for the purpose of providing the mapping relationship between the test requests and their corresponding test responses.
  • Python-based program code is generated by the protocol buffer compiler to enable a reflection function and thereby obtain a request message name of a test request.
  • the response protocol file (e.g., a .proto file) is parsed directly to obtain the mapping relationship between test requests and test responses.
  • each test request includes a respective request message identified by a request message name, and each test response includes a respective response message identified by a response message name. Therefore, the .proto file is parsed to obtain the mapping relationship indicated by the request message names and the corresponding response message names.
  • the computer system identifies (S12) a communication interface for receiving content or value for at least one response field in the test response via a RPC.
  • After receiving a test request, the computer system obtains (S13) the request message name corresponding to the test request, identifies a corresponding response message or message name, and modifies the response fields of the corresponding response message according to test rules associated with the test request.
  • the RPC is an inter-process communication that allows a computer program to cause a subroutine or procedure to execute in another computer without the programmer explicitly coding the details for this remote interaction.
  • the RPC is established on a C/S (Client/Server) framework in which a client machine calls the computer program located on the server via a communication interface, just as a program would be called locally on the client machine.
  • the RPC is implemented as a remote Python call (RPYC) or the like.
  • the value of a response field is received at the communication interface, before a test request is received.
  • the value of the response field is used in a corresponding test response that would be created in response to the subsequent test request.
  • the computer device receives a request message name corresponding to the test request via the RPC. Further, in accordance with the test rules corresponding to the test request, the computer system modifies one or more response fields of the corresponding test response created in response to the request message name. Stated another way, according to the specific requirements of the test rules corresponding to the test request, the computer system dynamically returns a response field value corresponding to the response message name associated with the request message name of the test request.
  • code instrumentation module 30 of the computer system directly uses the RPC to provide the content or values for the response fields in the test response. Code instrumentation module 30 does not need to be restarted to receive the content or values via the RPC. Further, in some situations, the response fields are preferably defined using the RPC provided content or values before using those provided by test configuration file 40.
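  • A minimal sketch of such a communication interface, using RPyC as the transport, is shown below; the service class, the port number and the shared pending_fields store are assumptions made for illustration, not part of the application.

```python
import rpyc
from rpyc.utils.server import ThreadedServer

pending_fields = {}  # response field values pushed by the remote machine ahead of a request


class FieldControlService(rpyc.Service):
    def exposed_set_response_field(self, field_name, value):
        # The remote machine pre-sets a field value; code instrumentation module 30 can
        # apply it to the next test response it generates, without being restarted.
        pending_fields[field_name] = value


if __name__ == '__main__':
    ThreadedServer(FieldControlService, port=18861).start()

# On the remote (controlling) machine:
#   conn = rpyc.connect('instrumentation-host', 18861)
#   conn.root.set_response_field('result_code', -1)   # e.g. force an abnormal scenario
```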
  • the computer system acquires a mapping relationship between a request message of a test request and a response message of a corresponding test response from a test configuration file. Further, the computer system identifies a communication interface for receiving content or value for at least one response field in the test response via a RPC. After receiving a test request, the computer system obtains the request message name corresponding to the test request, identifies a corresponding response message or message name, and modifies the response fields of the corresponding response message according to test rules associated with the test request. The test response is thereby dynamically modified according to the MUT and the test conditions associated with the test request.
  • each test request is associated with different test responses under different test conditions (e.g., when the respective test request is coupled to different loads) .
  • Code instrumentation module 30 often needs to set special response fields when the MUT is tested under some abnormal conditions. Therefore, the value of the undetermined response field is dynamically modified according to the MUT and the corresponding test conditions.
  • the test response is automatically controlled in accordance with a test request of a remote machine, and the RPC is optionally used to provide at least part of the content or value for some response fields in the test response created in response to the test request.
  • the performance monitoring method 400 further includes (S10) creating a test configuration file 40 including a set of test rules.
  • test configuration file 40 is created before the computer device acquires a mapping relationship between a request message name of a test request and a response message name of a corresponding test response from a test configuration file.
  • test configuration file 40 is introduced, such that code instrumentation module 30 could return different test responses in accordance with a determination whether a test request satisfies one or more test rules in test configuration file 40.
  • Test configuration file 40 provides a convenient solution for code instrumentation, because it includes a plurality of test scenarios within a single file and allows code instrumentation module 30 to generate the test responses dynamically in different test scenarios without programmers creating separate code or manually intervening in the test processes.
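  • Purely as an illustration of the idea, a test configuration file gathering several scenarios could be organized as sketched below; the application does not prescribe a concrete format, so the structure and key names are assumptions.

```python
# Illustrative structure for test configuration file 40: several scenarios in one file,
# each with keywords for matching and Trigger/Action test rules. Assumed format only.
TEST_CONFIGURATION = {
    'scenarios': [
        {
            'keywords': ['ParseRequest', 'normal'],
            'rules': [
                {'trigger': {'field': 'mode', 'equals': 'normal'},
                 'action': {'set': {'result_code': 0, 'detail': ''}}},
            ],
        },
        {
            'keywords': ['ParseRequest', 'overload'],
            'rules': [
                {'trigger': {'field': 'mode', 'equals': 'overload'},
                 'action': {'set': {'result_code': -1, 'detail': 'server busy'}}},
            ],
        },
    ],
}
```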
  • a test request includes a test request message.
  • Field names in different types of test request messages are obtained using the protobuf’s introspection feature.
  • after a test request message is received and deserialized, the content of its fields is extracted and associated with the corresponding field names as determined by the protobuf.
  • a mapping relationship is established for the field names and the field content of the received test request message.
  • in some embodiments, field content of one or more fields in the test request message is extracted and copied into fields of a response message created in response to the test request, and functions in a dynamic programming language, e.g., setattr(), are used to extract and copy the desired field content from the test request message.
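  • A minimal sketch of that copy step, combining protobuf introspection with Python's built-in setattr(), is shown below; restricting the copy to same-named scalar fields is an assumption kept for simplicity.

```python
def copy_shared_scalar_fields(request_msg, response_msg):
    """Copy values of same-named scalar fields from the request message into the response."""
    response_fields = response_msg.DESCRIPTOR.fields_by_name
    for field, value in request_msg.ListFields():          # fields set on the request
        target = response_fields.get(field.name)
        if target is None:                                  # no same-named response field
            continue
        if (target.label != target.LABEL_REPEATED
                and target.type != target.TYPE_MESSAGE):    # scalars only, for simplicity
            setattr(response_msg, field.name, value)        # plain assignment works for scalars
    return response_msg
```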
  • test configuration file 40 provides a plurality of test scenarios associated with different test conditions for code instrumentation module 30 in the computer system, such that performance of the MUT could be monitored with improved flexibility under these test scenarios.
  • FIGS. 5-7 are block diagrams of code instrumentation modules 30 that monitor performance of a MUT 10 in configurable instrumentation environments in accordance with some embodiments of the application.
  • code instrumentation module 30 includes parameter acquisition module 01, response generation module 02 and first setting module 03.
  • Parameter acquisition module 01 is configured to obtain a mapping relationship between a request message in a test request and a response message in a corresponding test response from a response protocol file.
  • An exemplary response protocol file includes a .proto file written in the protocol buffer language. The .proto file defines the properties of the request message and the response message (including their mapping relationship).
  • a service interface is defined in the response protocol file (e.g., the .proto file).
  • a compiling option is added based on the defined service interface, such that parameter acquisition module 01 obtains information related to the mapping relationship between the test requests and their corresponding test responses.
  • Python-based program code is generated by the protocol buffer compiler to enable a reflection function and thereby obtain a request message name of a test request.
  • the response protocol file (e.g., a .proto file) is parsed directly by parameter acquisition module 01 to obtain the mapping relationship between test requests and test responses.
  • each test request includes a respective request message identified by a request message name, and each test response includes a respective response message identified by a response message name. Therefore, the .proto file is parsed by parameter acquisition module 01 to obtain the mapping relationship indicated by the request message names and the corresponding response message names.
  • Response generation module 02 is configured to receive a test request that includes a test request message, and to obtain the request message name corresponding to the test request message.
  • the computer system further determines the response message name corresponding to the request message name according to the mapping relationship acquired from test configuration file 40. Then, a test response is dynamically generated to include a test response message according to the response message name determined based on the response protocol file.
  • after response generation module 02 receives the test request, it also obtains the request message name corresponding to the test request.
  • the request message name is further used to create the test response based on serialization routines (e. g. , using protobuf’s introspection feature, which allows the computer system to dynamically generate a test response message without knowing explicitly the format of the test response message) .
  • response generation module 02 identifies and obtains the response message name corresponding to the request message name from the response protocol file.
  • a test response message is dynamically generated according to the response message name associated with the request message name, for example as follows:
  • msg = RespClass()
  • test response message includes at least one response field whose content or value has not been determined.
  • the computer system dynamically sets the content or value for the at least one response field of the test response message according to a set of test rules in test configuration file 40.
  • test configuration file 40 dynamically sets up an instrumentation environment for testing the MUT under different test scenarios or conditions.
  • test configuration file 40 includes the set of test rules which defines a test response associated with each test request.
  • test configuration file 40 is searched to identify the corresponding test response, which oftentimes includes a test response message.
  • the test response message includes at least one response field that has undetermined content or value.
  • Test configuration file 40 is again referenced to determine the undetermined content or value for at least one response field of the test response message according to the test rules that are associated with the corresponding test scenario as determined by the test request.
  • Each test rule in test configuration file 40 includes a Trigger condition which, when satisfied, would trigger a corresponding Action, e.g., setting content or value to the at least one undetermined response field in a test response message.
  • the Trigger condition is defined according to the content of the request fields in the corresponding deserialized test request message, and the corresponding Action is implemented to determine one or more response fields in the test response message.
  • the Trigger condition and the corresponding Action are represented in a code script that could be directly implemented in a dynamic programming language.
  • the computer system deserializes the test request and checks each test rule of the set of test rules in test configuration file 40.
  • when a Trigger condition of a specific test rule is satisfied, a corresponding Action is implemented; otherwise, the computer system moves on to check the next test rule in the test rule set. This rule check process is repeated until all rules in the test rule set are checked.
  • the at least one undetermined response field of the test response message is set to a default response field. When none of the Trigger conditions of the test rules is satisfied, the test response includes the default response field.
  • the mapping relationship is acquired between a request message and a response message from a response protocol file which is optionally part of a test configuration file.
  • After receiving a test request, the computer system obtains the request message name corresponding to the test request, identifies the response message name corresponding to the request message name according to the mapping relationship, dynamically generates a response message corresponding to the response message name, and dynamically sets content or value for one or more response fields of the corresponding test response message according to the set of rules in test configuration file 40.
  • in conventional systems, the relationship between each test request and a corresponding test response has to be programmed separately in respective program codes.
  • the computer system identifies one or more test rules in a test configuration file 40 according to the test request, and creates the corresponding test response according to the identified test rules. As such, no custom program codes need to be created separately for each test request, and the amount of work is reduced for monitoring performance of the MUT under different scenarios.
  • code instrumentation module 30 generates a test response according to a test configuration file, and moreover, includes a communication interface to receive information from a remote machine.
  • code instrumentation module 30 is dynamically controlled by the remote machine via a Remote Procedure Call (RPC), and generates the test response in response to a test request at least partially according to information provided via the RPC.
  • the test response could be dynamically edited to include or comply with the information received via the RPC in response to the corresponding test request. Therefore, code instrumentation module 30 allows both the information provided via the RPC and test configuration file 40 to provide for configurable test responses associated with different test requests.
  • a response field of the test response has to be modified according to different test scenarios.
  • for example, the response field of the test response is set to null in a normal scenario, and is not set to null in an abnormal scenario. Therefore, the response field of the test response is dynamically modified according to different test scenarios.
  • Both the information provided via the RPC and test configuration file 40 could be applied to modify the response field in the test response dynamically.
  • Code instrumentation module 30 includes parameter acquisition module 01, communication interface module 04 and second setting module 05.
  • Parameter acquisition module 01 is configured to obtain a mapping relationship between a request message name of a test request and a response message name of a corresponding test response from a response protocol file.
  • parameter acquisition module 01 acquires the mapping relationship from a response protocol file, e.g., a .proto file used in Protobuf.
  • a service interface is defined in the response protocol file (e.g., the .proto file), and a compiling option is added based on the defined service interface for the purpose of providing the mapping relationship between the test requests and their corresponding test responses.
  • Python-based program code is generated by the protocol buffer compiler to enable a reflection function and thereby obtain a request message name of a test request.
  • parameter acquisition module 01 directly parses the response protocol file (e.g., a .proto file) to obtain the mapping relationship between test requests and test responses.
  • each test request includes a respective request message identified by a request message name, and each test response includes a respective response message identified by a response message name. Therefore, the .proto file is parsed to obtain the mapping relationship indicated by the request message names and the corresponding response message names.
  • communication interface module 04 is configured to identify a communication interface for receiving content or value for at least one response field in the test response via a RPC.
  • second setting module 05 is configured to obtain (S13) the request message name corresponding to the test request, identify a corresponding response message or message name, and modify the response fields of the corresponding response message according to test rules associated with the test request.
  • the RPC is an inter-process communication that allows a computer program to cause a subroutine or procedure to execute in another computer without the programmer explicitly coding the details for this remote interaction.
  • the RPC is established on a C/S (Client/Server) framework in which a client machine calls the computer program located on the server via a communication interface, just as a program would be called locally on the client machine.
  • the RPC is implemented as a remote Python call (RPYC) or the like.
  • the value of a response field is received at the communication interface, before a test request is received.
  • the value of the response field is used in a corresponding test response that would be created in response to the subsequent test request.
  • the computer device receives a request message name corresponding to the test request via the RPC. Further, in accordance with the test rules corresponding to the test request, the computer system modifies one or more response fields of the corresponding test response created in response to the request message name. Stated another way, according to the specific requirements of the test rules corresponding to the test request, the computer system dynamically returns a response field value corresponding to the response message name associated with the request message name of the test request.
  • code instrumentation module 30 of the computer system directly uses the RPC to provide the content or values for the response fields in the test response. Code instrumentation module 30 does not need to be restarted to receive the content or values via the RPC. Further, in some situations, the response fields are preferably defined using the RPC provided content or values before using those provided by test configuration file 40.
  • the computer system acquires a mapping relationship between a request message name of a test request and a response message name of a corresponding test response from a test configuration file. Further, the computer system identifies a communication interface for receiving content or value for at least one response field in the test response via a RPC. After receiving a test request, the computer system obtains the request message name corresponding to the test request, identifies a corresponding response message or message name, and modifies the response fields of the corresponding response message according to test rules associated with the test request. The test response is thereby dynamically modified according to the MUT and the test conditions associated with the test request.
  • each test request is associated with different test responses under different test conditions (e.g., when the respective test request is coupled to different loads) .
  • Code instrumentation module 30 often needs to set special response fields when the MUT is tested under some abnormal conditions. Therefore, the value of the response field is dynamically modified according to the MUT and the corresponding test conditions.
  • the test response is automatically controlled in accordance with a test request of a remote machine, and the RPC is optionally used to provide at least part of the content or value for some response fields in the test response created in response to the test request.
  • code instrumentation module 30 further includes a rule setting module 06 in addition to parameter acquisition module 01, response generation module 02 and first setting module 03. Rule setting module 06 defines test configuration file 40.
  • test configuration file 40 is introduced, such that code instrumentation module 30 could return different test responses in accordance with a determination whether a test request satisfies one or more test rules in test configuration file 40.
  • Test configuration file 40, which includes test rules created by rule setting module 06, provides a convenient solution for code instrumentation, because it includes a plurality of test scenarios within a single file and allows code instrumentation module 30 to generate the test responses dynamically in different test scenarios without programmers creating separate code or manually intervening in the test processes.
  • a test request includes a test request message.
  • in rule setting module 06, field names in different types of test request messages are obtained using the protobuf’s introspection feature. After a test request message is received and deserialized by rule setting module 06, the content of its fields is extracted and associated with the corresponding field names as determined by the protobuf. Thereby, a mapping relationship is established between the field names and the field content of the received test request message. In accordance with the mapping relationship, rule setting module 06 determines whether a Trigger condition is satisfied. Furthermore, in some embodiments, rule setting module 06 extracts field content of one or more fields in the test request message, and copies the extracted field content to fields in a response message created in response to the test request. Specifically, functions in a dynamic programming language, e.g., setattr(), are used to extract and copy the desired field content from the test request message.
  • Test configuration file 40 provides a plurality of test scenarios associated with different test conditions for code instrumentation module 30 in the computer system, such that performance of the MUT could be monitored with improved flexibility under these test scenarios.
  • FIG. 8 is a block diagram of an exemplary computer system 800 that monitors performance of a MUT 10 in accordance with some embodiments.
  • computer system 800 at least includes one or more processors 810 (e.g., central processing units) and a memory 820 for storing data, programs and instructions for execution by the one or more processors 810.
  • computer system 800 further includes one or more communication interfaces 830, an input/output (I/O) interface 840, and one or more communication buses 850 that interconnect these components.
  • I/O interface 840 includes an input unit 842 and a display unit 844.
  • Examples of input unit 842 include a keyboard, a mouse, a touch pad, a game controller, a function key, a trackball, a joystick, a microphone, a camera and the like.
  • display unit 844 displays information that is inputted by the user or provided to the user for review. Examples of display unit 844 include, but are not limited to, a liquid crystal display (LCD) and an organic light-emitting diode (OLED) display.
  • input unit 842 and display unit 844 are integrated on a touch-sensitive display that displays a graphical user interface (GUI) .
  • communication buses 850 include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
  • communication interfaces 830 further include a receiver 832 and a transmitter 834.
  • memory 820 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • memory 820 includes one or more storage devices remotely located from the one or more processors 810.
  • memory 820, or alternatively the non-volatile memory device(s) within memory 820, includes a non-transitory computer readable storage medium.
  • memory 820 or alternatively the non-transitory computer readable storage medium of memory 820 stores the following programs, modules and data structures, instructions, or a subset thereof:
  • Operating System 801 that includes procedures for handling various basic system services and for performing hardware dependent tasks;
  • I/O interface module 802 that includes procedures for handling various basic input and output functions through one or more input and output devices, wherein I/O interface module 802 further includes an interface display module that controls displaying of a graphical user interface;
  • Communication module 803 that is used for connecting computer system 800 to other computational devices (e.g., servers and client devices), via one or more network communication interfaces 830 (wired or wireless) and one or more communication networks, such as the Internet, other wide area networks, local area networks, metropolitan area networks, and so on;
  • a module under test, i.e., MUT 10, that needs to be tested;
  • Test drive module 20 that generates a test instruction to control the test on MUT 10;
  • Code instrumentation module 30 that receives the test instruction and tests MUT 10 according to test rules in test configuration file 40. More details on code instrumentation module 30 are explained above with reference to FIGS. 1B and 2-7.
  • Computer system 800 optionally stores test configuration file 40 in a local memory.
  • test configuration file 40 is stored remotely on another computer machine.
  • the methods of the embodiments may be realized by means of software and a necessary general hardware platform, or by hardware only. In most cases, the former is preferred.
  • the technical solution of the present application, or the part which makes a contribution to the prior art, may be substantially embodied in the form of a software product.
  • the software product may be stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) of a computer, and comprises instructions for enabling terminal equipment (such as a mobile phone, computer, server, network device, or the implementation apparatus of the response-customizable code instrumentation module illustrated in FIG. 5 to FIG. 7) to carry out the method of each embodiment of the present application.
  • although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another.
  • first ranking criteria could be termed second ranking criteria, and, similarly, second ranking criteria could be termed first ranking criteria, without departing from the scope of the present invention.
  • First ranking criteria and second ranking criteria are both ranking criteria, but they are not the same ranking criteria.
  • the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting, ” that a stated condition precedent is true, depending on the context.
  • the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
  • stages that are not order dependent may be reordered and other stages may be combined or broken out. While some reorderings or other groupings are specifically mentioned, others will be obvious to those of ordinary skill in the art, so the alternatives presented here are not exhaustive. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.
PCT/CN2014/089379 2013-11-26 2014-10-24 Dynamic code instrumentation WO2015078248A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310615914.4 2013-11-26
CN201310615914.4A CN104683386B (zh) 2013-11-26 2013-11-26 可定制响应的桩服务实现方法及装置

Publications (1)

Publication Number Publication Date
WO2015078248A1 true WO2015078248A1 (en) 2015-06-04

Family

ID=53198326

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/089379 WO2015078248A1 (en) 2013-11-26 2014-10-24 Dynamic code instrumentation

Country Status (2)

Country Link
CN (1) CN104683386B (zh)
WO (1) WO2015078248A1 (zh)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109151037A (zh) * 2018-09-04 2019-01-04 政采云有限公司 一种通信方法及装置
CN110147320A (zh) * 2019-04-19 2019-08-20 平安普惠企业管理有限公司 接口测试方法、装置及电子设备
US10915426B2 (en) 2019-06-06 2021-02-09 International Business Machines Corporation Intercepting and recording calls to a module in real-time
US10929126B2 (en) 2019-06-06 2021-02-23 International Business Machines Corporation Intercepting and replaying interactions with transactional and database environments
US11016762B2 (en) 2019-06-06 2021-05-25 International Business Machines Corporation Determining caller of a module in real-time
US11036619B2 (en) 2019-06-06 2021-06-15 International Business Machines Corporation Bypassing execution of a module in real-time
US11074069B2 (en) 2019-06-06 2021-07-27 International Business Machines Corporation Replaying interactions with transactional and database environments with re-arrangement
CN113434147A (zh) * 2021-06-25 2021-09-24 北京达佳互联信息技术有限公司 基于ProtoBuf协议的消息解析方法及装置
CN113704079A (zh) * 2020-05-22 2021-11-26 北京沃东天骏信息技术有限公司 基于Protobuf的接口测试方法和装置
CN113704087A (zh) * 2021-07-09 2021-11-26 奇安信科技集团股份有限公司 一种跨域传输设备的文件业务测试方法、装置及电子设备
CN113434147B (zh) * 2021-06-25 2024-05-14 北京达佳互联信息技术有限公司 基于ProtoBuf协议的消息解析方法及装置

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107122292A (zh) * 2016-10-21 2017-09-01 北京小度信息科技有限公司 一种模拟服务的方法及系统
CN106603283B (zh) * 2016-12-13 2019-09-13 广州品唯软件有限公司 一种模拟服务的方法、装置及集中管理平台
CN107168993A (zh) * 2017-03-29 2017-09-15 广州优视网络科技有限公司 处理响应数据的方法、设备、客户端设备和电子设备
CN112445700B (zh) * 2019-09-05 2023-10-13 腾讯科技(深圳)有限公司 测试方法和装置
CN111078571B (zh) * 2019-12-20 2024-02-02 广州品唯软件有限公司 模拟响应的测试方法、终端设备及计算机可读存储介质
CN112328222A (zh) * 2020-11-26 2021-02-05 天津市鑫联兴科技有限公司 一种动态流程服务接口方法及动态流程服务接口引擎
CN112799713B (zh) * 2020-12-31 2022-06-28 江苏苏宁银行股份有限公司 一种通用可配置测试桩平台
CN113468388B (zh) * 2021-06-30 2024-05-03 深圳集智数字科技有限公司 控制方法、装置、服务器及存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101515301A (zh) * 2008-02-23 2009-08-26 炬力集成电路设计有限公司 一种片上系统芯片验证的方法和装置
CN102214140A (zh) * 2011-06-21 2011-10-12 中兴通讯股份有限公司 软件自动测试的方法及系统
CN103135011A (zh) * 2011-11-28 2013-06-05 爱德万测试株式会社 测试模块生成装置、测试步骤生成装置、生成方法及测试装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102143523B (zh) * 2010-11-18 2014-06-25 华为技术有限公司 基于业务递送平台的应用测试方法和业务递送平台
CN103136095A (zh) * 2011-11-28 2013-06-05 阿里巴巴集团控股有限公司 一种测试应用程序接口的方法、装置及系统
CN102622237B (zh) * 2012-03-14 2015-09-09 北京思特奇信息技术股份有限公司 一种业务功能流程化的配置方法及系统

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101515301A (zh) * 2008-02-23 2009-08-26 炬力集成电路设计有限公司 一种片上系统芯片验证的方法和装置
CN102214140A (zh) * 2011-06-21 2011-10-12 中兴通讯股份有限公司 软件自动测试的方法及系统
CN103135011A (zh) * 2011-11-28 2013-06-05 爱德万测试株式会社 测试模块生成装置、测试步骤生成装置、生成方法及测试装置

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109151037A (zh) * 2018-09-04 2019-01-04 政采云有限公司 一种通信方法及装置
CN109151037B (zh) * 2018-09-04 2022-03-04 政采云有限公司 一种通信方法及装置
CN110147320A (zh) * 2019-04-19 2019-08-20 平安普惠企业管理有限公司 接口测试方法、装置及电子设备
US11074069B2 (en) 2019-06-06 2021-07-27 International Business Machines Corporation Replaying interactions with transactional and database environments with re-arrangement
US11016762B2 (en) 2019-06-06 2021-05-25 International Business Machines Corporation Determining caller of a module in real-time
US11036619B2 (en) 2019-06-06 2021-06-15 International Business Machines Corporation Bypassing execution of a module in real-time
US10929126B2 (en) 2019-06-06 2021-02-23 International Business Machines Corporation Intercepting and replaying interactions with transactional and database environments
US10915426B2 (en) 2019-06-06 2021-02-09 International Business Machines Corporation Intercepting and recording calls to a module in real-time
CN113704079A (zh) * 2020-05-22 2021-11-26 北京沃东天骏信息技术有限公司 基于Protobuf的接口测试方法和装置
CN113434147A (zh) * 2021-06-25 2021-09-24 北京达佳互联信息技术有限公司 基于ProtoBuf协议的消息解析方法及装置
CN113434147B (zh) * 2021-06-25 2024-05-14 北京达佳互联信息技术有限公司 基于ProtoBuf协议的消息解析方法及装置
CN113704087A (zh) * 2021-07-09 2021-11-26 奇安信科技集团股份有限公司 一种跨域传输设备的文件业务测试方法、装置及电子设备
CN113704087B (zh) * 2021-07-09 2024-01-19 奇安信科技集团股份有限公司 一种跨域传输设备的文件业务测试方法、装置及电子设备

Also Published As

Publication number Publication date
CN104683386A (zh) 2015-06-03
CN104683386B (zh) 2019-01-04

Similar Documents

Publication Publication Date Title
WO2015078248A1 (en) Dynamic code instrumentation
EP3030969B1 (en) Automated application test system
US9910759B2 (en) Logging framework and methods
JP2018139106A (ja) クラウド接続された自動テスティング
US9652220B2 (en) Zero down-time deployment of new application versions
US9146779B2 (en) System and method for migrating an application
CN107608901B (zh) 基于Jmeter的测试方法及装置、存储介质、电子设备
US9304894B2 (en) Code-free testing framework
US10606659B2 (en) Acquiring location information for logical partition within virtual machine
US11893367B2 (en) Source code conversion from application program interface to policy document
CN111782519A (zh) 测试方法、装置和电子设备
US20240146650A1 (en) Creating endpoints
CN108197024B (zh) 嵌入式浏览器调试方法、调试终端及计算机可读存储介质
CN107122203B (zh) 一种配置文件的设置方法及装置
CN113032244A (zh) 接口测试方法、装置、计算机系统和计算机可读存储介质
CN103019900B (zh) 终端性能的检测结果显示方法和装置
KR20180061589A (ko) 소프트웨어 빌드 시스템 및 이를 이용한 소프트웨어 빌드 방법
US20170004064A1 (en) Actions test automation
US9442818B1 (en) System and method for dynamic data collection
CN109408376B (zh) 一种配置数据的生成方法、装置、设备及存储介质
CN113296758B (zh) 一种前端组件库构建方法、装置及存储介质
CN111782520A (zh) 测试方法、装置和电子设备
US11755665B2 (en) Identification of a computer processing unit
CN113726902B (zh) 微服务的调用方法、系统、服务器、设备和存储介质
CN113542436B (zh) 基于请求触发的Web后端全量API自发现方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14866488

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.10.2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14866488

Country of ref document: EP

Kind code of ref document: A1