CN111639032B - Method and apparatus for testing applications - Google Patents

Method and apparatus for testing applications

Info

Publication number
CN111639032B
CN111639032B (application CN202010488006.3A)
Authority
CN
China
Prior art keywords
initial
test
application
label
modified
Prior art date
Legal status
Active
Application number
CN202010488006.3A
Other languages
Chinese (zh)
Other versions
CN111639032A (en)
Inventor
李欢
逄增耀
杜英豪
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202010488006.3A
Publication of CN111639032A
Application granted
Publication of CN111639032B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Abstract

The application discloses a method and an apparatus for testing an application, and relates to the technical field of testing. A specific implementation is as follows: acquiring a test request of an application, wherein the test request comprises configuration information for configuring test data for the application, and the configuration information comprises an initial test flow ratio; randomly generating, according to the initial test flow ratio, labels of the initial test flow in a label interval of the total flow to form an initial label set, wherein the label interval is an interval generated by dividing the total flow according to a preset granularity, and the initial label set is used for determining the users participating in the application test; and generating a configuration file based on the initial label set and the configuration information to test the application. In this implementation, non-repeating labels are randomly generated in the label interval of the total flow and used as the labels of the initial test flow, which ensures the randomness of the users participating in different tests, guarantees that user flow is dispersed uniformly, and improves the accuracy of the test results.

Description

Method and apparatus for testing applications
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular, to the field of testing technology.
Background
Adjustments to an internet company's application, such as changes to its functions or to its background recommendation algorithm, require a large number of tests so that it can be determined from the test results whether the adjustment will be accepted by users.
In the related art, when an application is tested, the users participating in the test need to be selected reasonably in order to obtain accurate test results. How the users participating in a test are chosen may therefore directly influence the test results.
Disclosure of Invention
A method, apparatus, device, and storage medium for testing an application are provided.
According to a first aspect, there is provided a method for testing an application, the method comprising: acquiring a test request of an application, wherein the test request comprises configuration information for configuring test data for the application, the configuration information comprises an initial test flow ratio, and the initial test flow ratio represents the ratio of the initial test flow to the total flow; randomly generating, according to the initial test flow ratio, labels of the initial test flow in a label interval of the total flow to form an initial label set, wherein the label interval is an interval generated by dividing the total flow according to a preset granularity, and the initial label set is used for determining the users participating in the application test; and generating a configuration file based on the initial label set and the configuration information to test the application.
According to a second aspect, there is provided an apparatus for testing an application, the apparatus comprising: an acquisition unit configured to acquire a test request of an application, wherein the test request comprises configuration information for configuring test data for the application, the configuration information comprises an initial test flow ratio, and the initial test flow ratio represents the ratio of the initial test flow to the total flow; a label generation unit configured to randomly generate, according to the initial test flow ratio, labels of the initial test flow in a label interval of the total flow to form an initial label set, wherein the label interval is an interval generated by dividing the total flow according to a preset granularity, and the initial label set is used for determining the users participating in the application test; and a configuration file generation unit configured to generate a configuration file based on the initial label set and the configuration information to test the application.
In a third aspect, an electronic device is provided, the electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method described above.
In a fourth aspect, a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the above method is provided.
According to the technology, the problem of poor randomness of users participating in different tests is solved, and the accuracy of test results is improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for better understanding of the present solution and do not constitute a limitation of the present application. Wherein:
FIG. 1 is a schematic diagram of a first embodiment of a method for testing an application according to the present application;
FIG. 2 is a schematic diagram of one implementation of a method for testing an application in accordance with the first embodiment;
FIG. 3 is a schematic diagram of a second embodiment of a method for testing an application according to the present application;
FIG. 4 is a schematic diagram of an embodiment of an apparatus for testing applications according to the present application;
FIG. 5 is a block diagram of an electronic device for implementing a method for testing an application according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Referring to fig. 1, a schematic diagram of a first embodiment of a method for testing an application according to the present application is shown. The method for testing an application may comprise the steps of:
step 101, a test request of an application is obtained.
In this embodiment, the execution body of the method for testing an application may be an apparatus for testing an application, which may be an electronic entity (e.g., a server) or an application program implemented in software. In use, a test request for an application may be input into the apparatus for testing an application, and the apparatus can then test the application by using the method for testing an application of this embodiment.
In this embodiment, the execution body (e.g., a server) may obtain the test request of the application from another electronic device through a wired or wireless connection. As an example, a configuration platform for test tasks may be installed on the electronic device used by a tester, so that the tester can generate, through the configuration platform, configuration information for configuring test data related to the test task, and the execution body can obtain a test request containing the configuration information from the tester's electronic device. It will of course be understood that the test request of the application may also be stored locally on the execution body, in which case the execution body may obtain the test request directly from local storage. The application to be tested includes, but is not limited to, an application used on a mobile terminal such as a mobile phone or a tablet computer; the application may be software used on the terminal device, or may be one of the functions within such software. For example, the application to be tested may be a post bar (an online forum) or a post bar catalog. It should be noted that the test request may include configuration information for configuring test data for the application to be tested, and the configuration information may include an initial test flow ratio. Here, the initial test flow ratio may represent the ratio of the initial test flow to the total flow. As an example, the total flow may be the sum of the flows of all tests, taken as 100%, and the initial test flow may be the flow allocated to the test from the total flow before the test starts, for example 20%.
In general, before an application test is performed, a tester may edit test data on the electronic device used by the tester to generate a test request. The configuration information in the generated test request may include not only the initial test flow ratio but also, for example, a test validity period and a test flow type (e.g., post-bar-dimension flow or search-dimension flow) edited by the tester on the electronic device. Here, the configuration information of the test request may be determined according to actual requirements and is not limited here.
In some optional implementations of this embodiment, the configuration information may further include traffic layer information. To address the problem of limited access traffic, the traffic may be logically layered to form different traffic layers, so that the traffic can be multiplexed and shared. The traffic layer information may indicate the traffic layer in which the test request is located. In this case, the total flow may be the total flow of the traffic layer in which the test is located.
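As a rough illustration, the test request and its configuration information could be represented as follows. This is only a sketch; all class and field names are assumptions made for illustration, not data structures defined by the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestConfig:
    initial_flow_ratio: float              # e.g. 0.20 for 20% of the total flow
    validity_period_days: Optional[int] = None
    flow_type: Optional[str] = None        # e.g. "post-bar" or "search" dimension
    flow_layer: Optional[str] = None       # optional traffic-layer information
    whitelist: List[str] = field(default_factory=list)

@dataclass
class TestRequest:
    application: str
    config: TestConfig

request = TestRequest(
    application="app_A",
    config=TestConfig(initial_flow_ratio=0.20, flow_type="post-bar"),
)
```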
Step 102, according to the initial test flow ratio, randomly generating the labels of the initial test flow in the label interval of the total flow to form an initial label set.
In this embodiment, based on the initial test flow ratio obtained in step 101, the execution body may randomly generate the labels of the initial test flow in the label interval of the total flow by various means. Here, the total flow may be divided according to a preset granularity to generate the label interval of the total flow. For example, if the total flow is 100% and is divided at a granularity of 1%, the label interval [0, 99] is obtained. It can be understood that the labels of the total flow are the basis for dividing the total flow: for example, if the label interval of the total flow is [0, 99], the labels of the total flow may include 0, 1, ..., 99, and the total flow may be divided into 100 parts. The execution body may determine the number of labels of the initial test flow according to the ratio of the initial test flow to the total flow, and may then randomly generate the labels of the initial test flow in the label interval of the total flow, for example by a Monte Carlo method, such that the generated labels do not repeat. For example, if the label interval of the total flow is [0, 99] and the initial test flow ratio is 20%, then 20 non-repeating numbers meeting the label requirement may be randomly generated in the interval [0, 99], and the generated numbers are the labels of the initial test flow. The set formed by the generated labels of the initial test flow is the initial label set.
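A minimal sketch of this label generation step, assuming a 1% granularity and Python's standard random module; this is an illustration, not the patent's prescribed implementation.

```python
import random

def generate_initial_labels(flow_ratio: float, granularity: int = 100) -> set:
    """Draw `flow_ratio * granularity` distinct labels from [0, granularity)."""
    count = round(flow_ratio * granularity)
    # random.sample guarantees the drawn labels do not repeat.
    return set(random.sample(range(granularity), count))

initial_label_set = generate_initial_labels(0.20)  # e.g. 20 distinct labels in [0, 99]
```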
It should be noted that the initial label set may be used to determine the users participating in the application test. Specifically, for a user accessing the application, if the user identifier of the user (for example, the user ID, the device identifier of the device used by the user, or the IP address), after being processed according to a preset rule, falls in the initial label set, the user may be determined to be a user participating in the application test. It can be understood that, when the configuration file is generated, each test randomly generates the labels of its initial test flow in the label interval of the total flow according to its own initial test flow ratio, so the labels of the initial test flow are random for different tests. This ensures that the users participating in each test are also random, thereby avoiding the problem of uneven dispersion of user flow across different tests.
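One common "preset rule" for mapping a user identifier into the label interval is to hash it and take the remainder; the sketch below assumes such a hashing rule, which is an illustrative choice rather than something the patent specifies.

```python
import hashlib

GRANULARITY = 100  # label interval [0, 99]

def user_label(user_id: str) -> int:
    # Hash the user identifier and map it into the label interval.
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % GRANULARITY

def participates(user_id: str, label_set: set) -> bool:
    # The user takes part in the test if the processed identifier falls in the set.
    return user_label(user_id) in label_set

print(participates("user-12345", {3, 17, 42}))
```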
Step 103, generating a configuration file based on the initial tag set and the configuration information to test the application.
In this embodiment, based on the initial tag set formed in step 102, the execution body may generate the configuration file by combining the formed initial tag set with the other configuration information in the test request. The application may then be tested using the configuration items in the generated configuration file.
In some optional implementations of this embodiment, after generating the configuration file, the execution body may synchronize the configuration file to a redis storage system, and the redis storage system may issue the configuration items in the configuration file, so that the application can be tested according to the configuration items. In this implementation, issuing the configuration items of the configuration file through the redis storage system can improve testing efficiency.
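A hedged sketch of such a synchronization step using the redis-py client; the key name, channel name, and the use of set/publish are assumptions for illustration, not details given by the patent.

```python
import json
import redis

def sync_config_to_redis(test_id: str, config: dict,
                         host: str = "localhost", port: int = 6379) -> None:
    client = redis.Redis(host=host, port=port)
    # Store the whole configuration file under a per-test key ...
    client.set(f"test:config:{test_id}", json.dumps(config))
    # ... and notify subscribers (e.g. the application's background server)
    # that a new configuration item is available.
    client.publish("test:config:updates", test_id)
```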
With continued reference to fig. 2, fig. 2 is a schematic diagram of an application scenario of the method for testing an application according to the present embodiment. In the application scenario of fig. 2, a tester may send a test request for an application A (e.g., a post bar catalog), and the execution body (e.g., a server) may obtain the test request sent by the tester. The execution body may then randomly generate the labels of the initial test flow in the label interval of the total flow according to the initial test flow ratio contained in the test request, and form an initial label set. Finally, the execution body may generate a configuration file based on the formed initial label set and the configuration information. The background server of application A can then test application A according to the configuration items of the configuration file and obtain a test result.
The method for testing an application provided by this embodiment of the present application obtains the test request of the application, then randomly generates the labels of the initial test flow in the label interval of the total flow according to the initial test flow ratio contained in the test request to form an initial label set, and finally generates a configuration file based on the initial label set and the configuration information, so that the application can be tested. In the method provided by this embodiment, non-repeating labels are randomly generated in the label interval of the total flow and used as the labels of the initial test flow, which ensures the randomness of the users participating in the application test. Because the labels of the initial test flow are generated randomly for each test, the situation in which the users participating in different tests are the same batch of users can be avoided, the uniformity of user flow dispersion is guaranteed, and the accuracy of the test is improved.
With continued reference to fig. 3, fig. 3 is a schematic diagram of a second embodiment of a method for testing an application according to the present application. The method for testing an application may comprise the steps of:
step 301, a test request of an application is obtained.
Step 302, according to the initial test flow ratio, randomly generating the labels of the initial test flow in the label interval of the total flow to form an initial label set.
Step 303, generating a configuration file based on the initial tag set and the configuration information.
In this embodiment, the disclosure of steps 301 to 303 is the same as or similar to that of steps 101 to 103 in the above embodiment, and will not be repeated here.
Step 304, in response to receiving a modification instruction for the initial test flow, the initial test flow is modified.
In this embodiment, the above application may have a need for initial test flow adjustment during the test. If the initial test flow needs to be adjusted, the tester can modify the initial test flow in a configuration platform or the like, so that a modification instruction of the initial test flow can be generated. If the execution body receives a modification instruction of the initial test flow, the initial test flow can be modified. It will be appreciated that the modification instructions for the initial test flow may be used to increase the test flow based on the initial test flow, and may also be used to decrease the test flow based on the initial test flow.
Step 305, determining the ratio of the modified initial test flow to the total flow as the modified test flow ratio.
In this embodiment, a modified initial test flow may be obtained based on step 304, and then, the execution body may calculate a ratio of the modified initial test flow to the total flow, and determine the calculated ratio as the ratio of the modified test flow.
Step 306, based on the modified test traffic proportion, the initial tag set is modified to obtain a modified tag set.
In this embodiment, based on the modified test traffic proportion obtained in step 305, the execution body may modify the initial tag set in various manners and determine the modified initial tag set as the modified tag set. As an example, the execution body may randomly generate the labels of the modified test flow in the label interval of the total flow according to the modified test flow ratio, and replace the labels in the initial label set with the newly obtained labels to obtain the modified label set.
In some optional implementations of this embodiment, after determining the modified test flow ratio, the execution body may determine the magnitude relation between the modified test flow ratio and the initial test flow ratio. If the modified test flow ratio is less than the initial test flow ratio, part of the tags may be deleted from the initial tag set. Specifically, the execution body may first calculate the difference between the modified test flow ratio and the initial test flow ratio, then determine, according to the difference, the number of tags to be deleted from the initial tag set and take this number as a first number, then randomly select the first number of tags in the initial tag set as tags to be deleted, and finally delete each tag to be deleted from the initial tag set to obtain the modified tag set. As an example, the initial test flow ratio may be 20%, and the modified test flow ratio obtained by modifying the initial test flow may be 10%; if the initial tag set includes 20 tags, the first number is determined to be 10, 10 tags are randomly selected in the initial tag set as tags to be deleted, and each tag to be deleted is deleted from the initial tag set, so as to obtain a modified tag set containing 10 tags. With the scheme disclosed in this implementation, when the test flow is reduced, tags are randomly removed on the basis of the initial tag set, which ensures both the randomness and the consistency of the tags in the modified tag set.
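A minimal sketch of this shrinking step, under the assumption of a 1% granularity; the helper name and the use of random.sample are illustrative, not the patent's implementation.

```python
import random

def shrink_label_set(labels: set, initial_ratio: float,
                     modified_ratio: float, granularity: int = 100) -> set:
    # Number of tags to delete (the "first number"), derived from the ratio difference.
    first_number = round((initial_ratio - modified_ratio) * granularity)
    to_delete = set(random.sample(sorted(labels), first_number))
    return labels - to_delete

# 20% -> 10%: 10 of the 20 tags are removed at random, the other 10 are kept.
modified_set = shrink_label_set(set(range(20)), 0.20, 0.10)
```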
In some optional implementations of this embodiment, if it is determined that the modified test flow ratio is greater than the initial test flow ratio, tags may be added to the initial tag set. Specifically, the execution body may first calculate the difference between the modified test flow ratio and the initial test flow ratio, then determine the number of tags to be added according to the difference and take this number as a second number, then randomly determine the second number of tags to be added in the tag interval of the total flow, and finally add each tag to be added to the initial tag set to obtain the modified tag set. It can be understood that the second number of tags to be added may be obtained as follows: a tag is randomly generated from the tag interval of the total flow; if the generated tag does not already exist in the initial tag set, it is added to the initial tag set; this is repeated until the second number of tags have been added, so that the modified tag set is obtained. The second number of tags to be added to the initial tag set may also be obtained in other ways, which are not limited here. With the scheme disclosed in this implementation, when the test flow is increased, the tags to be added are randomly determined in the tag interval of the total flow and added on the basis of the initial tag set, which ensures both the randomness and the consistency of the tags in the modified tag set.
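Correspondingly, a sketch (again an illustrative assumption, not the patent's code) of growing the tag set when the modified ratio exceeds the initial one, drawing only tags that are not already in the set.

```python
import random

def grow_label_set(labels: set, initial_ratio: float,
                   modified_ratio: float, granularity: int = 100) -> set:
    # Number of tags to add (the "second number"), derived from the ratio difference.
    second_number = round((modified_ratio - initial_ratio) * granularity)
    candidates = [label for label in range(granularity) if label not in labels]
    return labels | set(random.sample(candidates, second_number))

# 20% -> 30%: 10 new tags from [0, 99] that were not in the initial set.
modified_set = grow_label_set(set(range(20)), 0.20, 0.30)
```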
Step 307, a modified configuration file is generated to test the application.
In this embodiment, the execution body may generate the modified configuration file based on the modified tag set obtained by modifying the initial tag set in step 306. The application may be tested using the modified configuration file.
In some optional implementations of this embodiment, the execution body may issue the configuration items of the configuration file to a background server of the application, where the issued configuration items may include a configuration item corresponding to a target tag set. When a user accesses the application, the background server can obtain the user identifier of the user, process the user identifier according to a preset rule, and, when the processed user identifier is determined to be in the target tag set, determine that the user identifier matches the target tag set; in this case, the user can be determined to be a user participating in the application test. The target tag set may be either the initial tag set or the modified tag set. The scheme disclosed in this implementation can reasonably determine the users participating in the application test according to the determined tags, which further ensures the randomness of the users participating in different tests and the consistency of the users participating in the same test.
Typically, different test identifiers may be set for different test requests and written into a specified log file. A user participating in a test carries the test identifier of that test, so the user can be tracked, the user's behaviors such as clicks and browsing can be obtained, and statistics of the test results in various dimensions can be derived from those behaviors. The resulting statistics may characterize the extent to which the tested application is accepted by users.
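For illustration only, behaviour events of participating users might be logged together with the test identifier roughly as follows; the file name, field names, and identifiers are assumptions, not values from the patent.

```python
import json
import time

def log_test_event(log_path: str, test_id: str, user_id: str, action: str) -> None:
    # Each event carries the test identifier so results can be aggregated per test.
    record = {"ts": time.time(), "test_id": test_id,
              "user_id": user_id, "action": action}
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

log_test_event("test_events.log", "exp-001", "user-12345", "click")
```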
In some optional implementations of this embodiment, the configuration information of the test request may further include whitelist information. A whitelist may include user identifiers that are always selected to participate in the test; the whitelist information may be used, for example, by internal testers to debug the application to be tested.
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 1, the method for testing an application in this embodiment can randomly add or delete tags in the initial tag set when the initial test flow ratio changes. This ensures the randomness of the tags in the modified tag set and thereby further improves the accuracy of the test.
With further reference to fig. 4, as an implementation of the method shown in the above figures, the present disclosure provides an embodiment of an apparatus for testing applications, which corresponds to the method embodiment shown in fig. 1, and which is particularly applicable in various electronic devices.
As shown in fig. 4, the apparatus 400 for testing an application of the present embodiment includes: an acquisition unit 401, a tag generation unit 402, and a profile generation unit 403. The obtaining unit 401 is configured to obtain a test request of the application, where the test request includes configuration information for configuring test data for the application, the configuration information includes an initial test traffic proportion, and the initial test traffic proportion represents the ratio of the initial test traffic to the total traffic; the label generating unit 402 is configured to randomly generate labels of the initial test flow in a label interval of the total flow according to the initial test flow ratio to form an initial label set, where the label interval is an interval generated by dividing the total flow according to a preset granularity, and the initial label set is used for determining the users participating in the application test; and the profile generation unit 403 is configured to generate a configuration file based on the initial tag set and the configuration information to test the application.
In some optional implementations of this embodiment, the apparatus 400 further includes: an initial test flow modification unit configured to modify the initial test flow in response to receiving a modification instruction of the initial test flow; a determining unit configured to determine the ratio of the modified initial test flow to the total flow as a modified test flow ratio; and an initial tag set modification unit configured to modify the initial tag set based on the modified test flow ratio to obtain a modified tag set.
In some optional implementations of the present embodiment, the initial tag set modification unit is further configured to: in response to determining that the modified test flow ratio is less than the initial test flow ratio, determining a difference between the modified test flow ratio and the initial test flow ratio; determining the number of tags to be deleted in the initial tag set as a first number based on the determined difference; randomly determining a first number of tags as tags to be deleted in the initial tag set; and deleting each tag to be deleted from the initial tag set to obtain a modified tag set.
In some optional implementations of the present embodiment, the initial tag set modification unit is further configured to: in response to determining that the modified test flow ratio is greater than the initial test flow ratio, determining a difference between the modified test flow ratio and the initial test flow ratio; determining the number of tags to be added in the initial tag set as a second number based on the determined difference, wherein the tags to be added are different from the tags already in the initial tag set; randomly determining a second number of tags to be added in the tag interval of the total flow; and adding each tag to be added to the initial tag set to obtain a modified tag set.
In some optional implementations of the present embodiment, the configuration file generation unit 403 is further configured to: generating a configuration file of the test task based on the initial tag set and the configuration information; synchronizing the configuration file to the redis storage system so that the redis storage system issues configuration items in the configuration file to test the application.
In some optional implementations of this embodiment, the apparatus 400 further includes: a sending unit configured to send the configuration items in the configuration file to the background server of the application, so that the background server determines, based on the configuration items, that a user accessing the application participates in the test of the application when the user identifier of that user matches a label in a target label set, where the target label set is the initial label set or the modified label set.
The elements recited in apparatus 400 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations and features described above with respect to the method are equally applicable to the apparatus 400 and the units contained therein, and are not described in detail herein.
According to embodiments of the present application, there is also provided an electronic device, a readable storage medium and a computer program product.
As shown in fig. 5, a block diagram of an electronic device for a method for testing an application according to an embodiment of the present application is illustrated. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 5, the electronic device includes: one or more processors 501, a memory 502, and interfaces for connecting the components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 501 is illustrated in fig. 5.
Memory 502 is a non-transitory computer readable storage medium provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the methods for testing applications provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the methods for testing applications provided herein.
The memory 502 is used as a non-transitory computer readable storage medium, and may be used to store a non-transitory software program, a non-transitory computer executable program, and modules, such as program instructions/modules (e.g., the acquisition unit 401, the tag generation unit 402, and the configuration file generation unit 403 shown in fig. 4) corresponding to a method for testing an application in an embodiment of the present application. The processor 501 executes various functional applications of the server and data processing, i.e., implements the method for testing applications in the above-described method embodiments, by running non-transitory software programs, instructions, and modules stored in the memory 502.
Memory 502 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created according to the use of the electronic device for the test application, etc. In addition, memory 502 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 502 may optionally include memory located remotely from processor 501, which may be connected to the electronic device for the test application via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for the method of testing an application may further comprise: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503, and the output device 504 may be connected by a bus or in other manners; connection by a bus is taken as an example in fig. 5.
The input device 503 may receive entered numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for the test application, such as a touch screen, a keypad, a mouse, a trackpad, a touchpad, a pointer stick, one or more mouse buttons, a trackball, a joystick, and the like. The output devices 504 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibration motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical scheme of the embodiments of the present application, the test request of the application is obtained; then the labels of the initial test flow are randomly generated in the label interval of the total flow according to the initial test flow ratio contained in the test request to form an initial label set; and finally a configuration file is generated based on the initial label set and the configuration information, so that the application can be tested. Because non-repeating labels are randomly generated in the label interval of the total flow as the labels of the initial test flow, the randomness of the users participating in the application test is ensured; the labels of the initial test flow are generated randomly for each test, so the situation in which the users participating in different tests are the same batch of users can be avoided, the uniformity of user flow dispersion is ensured, and the accuracy of the test results is improved.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (8)

1. A method for testing an application, comprising:
acquiring a test request of the application, wherein the test request comprises configuration information for configuring test data for the application, the configuration information comprises an initial test flow ratio, and the initial test flow ratio represents the ratio of initial test flow to total flow;
randomly generating labels of the initial test flow in a label interval of the total flow according to the initial test flow ratio to form an initial label set, wherein the label interval is an interval generated by dividing the total flow according to a preset granularity, and the initial label set is used for determining users participating in the application test;
generating a configuration file based on the initial tag set and the configuration information to test the application;
modifying the initial test flow in response to receiving a modification instruction of the initial test flow;
determining the ratio of the modified initial test flow to the total flow as a modified test flow ratio;
modifying the initial tag set based on the modified test flow ratio to obtain a modified tag set, comprising: responsive to determining that the modified test flow ratio is less than the initial test flow ratio, determining a difference between the modified test flow ratio and the initial test flow ratio; determining the number of tags to be deleted in the initial tag set as a first number based on the determined difference; randomly determining a first number of tags as the tags to be deleted in the initial tag set; deleting each tag to be deleted from the initial tag set to obtain a modified tag set;
after generating a configuration file based on the initial set of tags and the configuration information, the method further comprises:
and sending a configuration item in the configuration file to a background server of the application, so that the background server determines, based on the configuration item, that a user accessing the application participates in the test of the application when the user identification of the user matches a label in a target label set, wherein the target label set is the initial label set or the modified label set.
2. The method of claim 1, wherein the modifying the initial tag set based on the modified test flow ratio to obtain a modified tag set comprises:
responsive to determining that the modified test flow ratio is greater than the initial test flow ratio, determining a difference between the modified test flow ratio and the initial test flow ratio;
determining the number of tags to be added in the initial tag set as a second number based on the determined difference, wherein the tags to be added are different from the tags in the initial tag set;
randomly determining a second number of tags to be added in a tag interval of the total flow;
and adding each label to be added in the initial label set to obtain a modified label set.
3. The method of claim 1, wherein the generating a configuration file based on the initial tag set and the configuration information to test the application comprises:
generating a configuration file of the test task based on the initial tag set and the configuration information;
synchronizing the configuration file to a redis storage system so that the redis storage system issues configuration items in the configuration file to test the application.
4. An apparatus for testing an application, comprising:
an obtaining unit configured to obtain a test request of the application, wherein the test request includes configuration information for configuring test data for the application, the configuration information includes an initial test traffic proportion, and the initial test traffic proportion represents a ratio of initial test traffic to total traffic;
the label generating unit is configured to randomly generate labels of the initial test flow in a label interval of the total flow according to the initial test flow proportion to form an initial label set, wherein the label interval is an interval generated by dividing the total flow according to a preset granularity, and the initial label set is used for determining users participating in the application test;
a configuration file generation unit configured to generate a configuration file based on the initial tag set and the configuration information to test the application;
an initial test flow rate modification unit configured to modify the initial test flow rate in response to receiving a modification instruction of the initial test flow rate;
a determining unit configured to determine a ratio of the modified initial test flow to the total flow as a modified test flow ratio;
an initial tag set modifying unit configured to modify the initial tag set based on the modified test flow ratio, resulting in a modified tag set, and further configured to: responsive to determining that the modified test flow ratio is less than the initial test flow ratio, determining a difference between the modified test flow ratio and the initial test flow ratio; determining the number of tags to be deleted in the initial tag set as a first number based on the determined difference; randomly determining a first number of tags as the tags to be deleted in the initial tag set; deleting each tag to be deleted from the initial tag set to obtain a modified tag set;
wherein the apparatus further comprises:
and the sending unit is configured to send the configuration item in the configuration file to a background server of the application, so that the background server determines, based on the configuration item, that a user accessing the application participates in the test of the application when the user identification of the user matches a label in a target label set, wherein the target label set is the initial label set or the modified label set.
5. The apparatus of claim 4, wherein the initial tag set modification unit is further configured to:
responsive to determining that the modified test flow ratio is greater than the initial test flow ratio, determining a difference between the modified test flow ratio and the initial test flow ratio;
determining the number of tags to be added in the initial tag set as a second number based on the determined difference, wherein the tags to be added are different from the tags in the initial tag set;
randomly determining a second number of tags to be added in a tag interval of the total flow;
and adding each label to be added in the initial label set to obtain a modified label set.
6. The apparatus of claim 4, wherein the profile generation unit is further configured to:
generating a configuration file of the test task based on the initial tag set and the configuration information;
synchronizing the configuration file to a redis storage system so that the redis storage system issues configuration items in the configuration file to test the application.
7. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-3.
8. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-3.
CN202010488006.3A 2020-06-02 2020-06-02 Method and apparatus for testing applications Active CN111639032B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010488006.3A CN111639032B (en) 2020-06-02 2020-06-02 Method and apparatus for testing applications

Publications (2)

Publication Number Publication Date
CN111639032A CN111639032A (en) 2020-09-08
CN111639032B true CN111639032B (en) 2023-08-01

Family

ID=72331428

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010488006.3A Active CN111639032B (en) 2020-06-02 2020-06-02 Method and apparatus for testing applications

Country Status (1)

Country Link
CN (1) CN111639032B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109903109A (en) * 2017-12-08 2019-06-18 北京京东尚科信息技术有限公司 Test method and device
CN111124850A (en) * 2019-11-12 2020-05-08 上海移远通信科技有限公司 MQTT server performance testing method, system, computer equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104102576A (en) * 2013-04-12 2014-10-15 阿里巴巴集团控股有限公司 Multi-version test method and device
CN104281611B (en) * 2013-07-08 2018-04-03 阿里巴巴集团控股有限公司 Customer flow distribution method and device in Website testing system
TW201405503A (en) * 2013-09-14 2014-02-01 Sense Digital Co Ltd Random embossment tactile feeling counterfeiting method
CN109308255B (en) * 2017-07-28 2021-11-30 北京京东尚科信息技术有限公司 Method and device for A/B test experiment
CN108491267B (en) * 2018-03-13 2022-02-08 百度在线网络技术(北京)有限公司 Method and apparatus for generating information
CN109446442B (en) * 2018-10-15 2020-04-24 北京字节跳动网络技术有限公司 Method and apparatus for processing information
CN110209579B (en) * 2019-05-28 2023-06-06 Oppo广东移动通信有限公司 Test method and electronic device

Also Published As

Publication number Publication date
CN111639032A (en) 2020-09-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant