CN112416751A - Processing method and device for interface automation test and storage medium


Info

Publication number
CN112416751A
CN112416751A
Authority
CN
China
Prior art keywords
interface
information
target
test
description data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011104883.2A
Other languages
Chinese (zh)
Inventor
仝瑶
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority: CN202011104883.2A
Publication: CN112416751A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The disclosure relates to a processing method and device for interface automation testing, and a storage medium. The method includes the following steps: parsing initial script information containing a plurality of interface test cases, and extracting a plurality of pieces of target information for each interface test case; writing the target information into corresponding field positions of a database to generate case description data for the interface test cases, and displaying the case description data on a target device interface; and, in response to a test command issued by a user for target case description data on the target device interface, executing the target script information in the initial script information that corresponds to the target case description data. By visualizing the interface test cases in the test case script, a user can select a test case from the visualized cases according to his or her own test requirements, making it convenient to flexibly use and operate test case information.

Description

Processing method and device for interface automation test and storage medium
Technical Field
The present disclosure relates to the field of computers, and in particular, to a processing method and apparatus for an interface automation test, and a storage medium.
Background
Testing of back-end services is essential to ensure their correctness and stability. However, software iteration cycles are now very short and services are updated frequently, so purely manual verification can hardly meet quality-assurance requirements. Interface automation testing helps testers quickly verify functional correctness and inspect online services. For example, test cases are written in advance based on interface automation testing techniques, and functional correctness verification and online service inspection are then carried out according to those test cases.
In the related art, the test cases used in interface automation testing are usually written as scripts. However, because script-based test cases require specialized knowledge, their content is typically complex and inconvenient to manage and maintain.
Disclosure of Invention
The present disclosure provides a processing method, apparatus, and storage medium for interface automation testing, so as to at least solve the problem in the related art that script test case content is inconvenient to manage and maintain. The technical solution of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a processing method for interface automation test, including:
analyzing initial script information containing a plurality of interface test cases, and extracting a plurality of target information of each interface test case;
respectively writing the target information into corresponding field positions of a database, generating case description data of the interface test cases, and displaying the case description data on a target equipment interface;
and responding to a test command sent by a user to the target use case description data in the target equipment interface, and executing the target script information corresponding to the target use case description data in the initial script information.
In some embodiments of the present disclosure, before the parsing the initial script information including a plurality of interface test cases and extracting a plurality of target information of each of the interface test cases, the method further includes:
responding to a submission event of the initial script information;
starting a monitoring program set in a case running program in a test framework;
after the writing the target information into the corresponding field positions of the database respectively and generating the case description data of the interface test cases, the method further includes:
receiving a write completion message fed back by the monitoring program;
and exiting the use case running program in the test framework.
In some embodiments of the present disclosure, the parsing the initial script information including a plurality of interface test cases, and extracting a plurality of target information of each of the interface test cases includes:
matching the initial script information with a preset name keyword;
and if the matching is successful, extracting the class name and the method name of each interface test case from the initial script information according to the name key words.
In some embodiments of the present disclosure, the parsing the initial script information including a plurality of interface test cases, and extracting a plurality of target information of each of the interface test cases, further includes:
matching the initial script information with preset extended keywords;
and if the matching is successful, extracting the extended information of each interface test case from the initial script information according to the extended keywords.
In some embodiments of the present disclosure, after displaying the use case description data on the target device interface, the method further includes:
responding to the modification operation of the user on the use case description data in the target equipment interface, and synchronously modifying corresponding information in the database according to the modification operation;
and when the case description data stored in the database is detected to be inconsistent with the initial script information, sending modification prompt information.
According to a second aspect of the embodiments of the present disclosure, there is provided a processing apparatus for interface automation test, including:
the analysis module is configured to analyze initial script information containing a plurality of interface test cases and extract a plurality of target information of each interface test case;
the writing module is configured to write the target information into corresponding field positions of a database respectively, and generate case description data of the interface test cases;
the display module is configured to display the use case description data on a target device interface;
and the execution module is configured to respond to a test command sent by a user to the target use case description data in the target equipment interface, and execute the target script information corresponding to the target use case description data in the initial script information.
In some embodiments of the present disclosure, the apparatus further comprises:
the starting module is configured to respond to a submission event of the initial script information and start a monitoring program set in a case running program in a testing frame before the analysis module analyzes the initial script information containing a plurality of interface test cases and extracts a plurality of target information of each interface test case;
the processor module is configured to write the target information into corresponding field positions of a database respectively by the write-in module, receive write-in completion information fed back by the monitoring program after generating case description data of the interface test cases, and quit a case running program in the test framework.
In some embodiments of the present disclosure, the parsing module is specifically configured to:
matching the initial script information with a preset name keyword;
and if the matching is successful, extracting the class name and the method name of each interface test case from the initial script information according to the name key words.
In some embodiments of the disclosure, the parsing module is further specifically configured to:
matching the initial script information with preset extended keywords;
and if the matching is successful, extracting the extended information of each interface test case from the initial script information according to the extended keywords.
In some embodiments of the present disclosure, the apparatus further comprises:
the modification module is configured to respond to modification operation of the user on the use case description data in the target equipment interface, and synchronously modify corresponding information in the database according to the modification operation;
a sending module configured to send modification prompt information when detecting that the use case description data stored in the database is inconsistent with the initial script information.
According to a third aspect of the embodiments of the present disclosure, there is provided another processing apparatus for automated interface testing, including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the processing method for interface automation test according to the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a storage medium, wherein when instructions in the storage medium are executed by a processor of a processing apparatus for interface automation test, the processing apparatus for interface automation test is enabled to execute the processing method for interface automation test of the first aspect.
According to a fifth aspect of the embodiments of the present disclosure, there is provided a computer program product, wherein when the instructions in the computer program product are executed by a processor, the processing method of the interface automation test of the first aspect is executed.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
the method comprises the steps of analyzing initial script information containing a plurality of interface test cases, extracting a plurality of target information of each interface test case, writing the plurality of target information into corresponding field positions of a database respectively, generating case description data of the plurality of interface test cases, and displaying the case description data on a target equipment interface, so that the visualization of the interface test cases is realized, a user can see and operate information of the test cases on the interface conveniently, for example, case logic and operation case states can be displayed by utilizing a visual interface, and the management and maintenance of the test cases are facilitated. In addition, the target script information corresponding to the target case description data in the initial script information can be executed by responding to the test command sent by the user to the target case description data in the target equipment interface, so that the method and the device can be seen.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow chart illustrating a method of processing interface automation tests in accordance with an exemplary embodiment.
Fig. 2a and 2b are exemplary diagrams illustrating an initial script information visualization in accordance with an exemplary embodiment.
FIG. 3 is a flow chart illustrating a method of processing an interface automation test in accordance with another exemplary embodiment.
FIG. 4 is an exemplary diagram illustrating a method of processing an interface automation test in accordance with one illustrative embodiment.
FIG. 5 is a block diagram of a processing device illustrating automated testing of an interface, according to an example embodiment.
FIG. 6 is a block diagram of a processing device for automated testing of an interface, according to another exemplary embodiment.
FIG. 7 is a block diagram of a processing device for automated testing of an interface, according to yet another exemplary embodiment.
FIG. 8 is a block diagram illustrating a processing device for automated testing of an interface, according to an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
FIG. 1 is a flow chart illustrating a method of processing interface automation tests in accordance with an exemplary embodiment. It should be noted that the processing method for interface automation test according to the embodiment of the present disclosure is used in the processing apparatus for interface automation test, for example, the processing apparatus for interface automation test may be test tool software configured on an electronic device. As shown in fig. 1, the processing method of the interface automation test includes the following steps.
In step S11, the initial script information including a plurality of interface test cases is analyzed, and a plurality of target information for each interface test case is extracted.
It should be noted that the initial script information can be understood as a code script written in advance by a tester with a script case-writing tool, and the code script may include a plurality of interface test cases. An interface here is an interface used to realize a service function; for example, the web page login function of a website is realized through such an interface.
In this step, when the initial script information including the interface test cases is obtained, the initial script information may be parsed and a plurality of pieces of target information for each interface test case may be extracted from it. In the embodiment of the present disclosure, the target information may include, but is not limited to, the case description, dependencies, priority, grouping, case state, execution order, class name, and method name of an interface test case; this information is what is actually used when performing a test. In the prior art, such target information can be extracted from the script information of a test case only when the case actually runs. In the present disclosure, when the initial script information containing a plurality of interface test cases is obtained, and before the test case script is executed, all target information of each interface test case is extracted from the initial script information by static analysis, so that the extracted target information can subsequently be used to visualize the test case script.
It should be noted that the target information of an interface test case may be extracted from the initial script information by keyword matching. In some embodiments of the present disclosure, the initial script information may be matched against preset name keywords; if the matching succeeds, the class name and method name of each interface test case are extracted from the initial script information according to the name keywords. The name keywords may include, but are not limited to, a class-name keyword and a method-name keyword, since the class name and method name together locate a unique test case.
It should be further noted that, to make it easier for testers to view the test situation, information intended for front-end presentation may also exist in a test case; for example, the priority, case state, and running environment are all custom annotations and may be extensible. Therefore, when parsing the initial script information, it can also be matched against extended keywords so that all useful information in an interface test case is extracted. Specifically, in other embodiments of the present disclosure, the initial script information may be matched against preset extended keywords; if the matching succeeds, the extended information of each interface test case is extracted from the initial script information according to the extended keywords.
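The patent describes the name-keyword and extended-keyword matching only abstractly. As an illustrative sketch, assuming Java/TestNG-style annotation syntax and hypothetical regex patterns (none of which appear in the patent), the static extraction might look like:

```python
import re

# Hypothetical keyword patterns; the patent names "name keywords" and
# "extended keywords" but fixes no concrete syntax.
NAME_KEYWORDS = {
    "class": re.compile(r"public\s+class\s+(\w+)"),
    "method": re.compile(r"public\s+void\s+(\w+)\s*\("),
}
EXT_KEYWORDS = {
    "priority": re.compile(r'priority\s*=\s*"([^"]+)"'),
    "case_state": re.compile(r'state\s*=\s*"([^"]+)"'),
    "run_env": re.compile(r'env\s*=\s*"([^"]+)"'),
    "description": re.compile(r'description\s*=\s*"([^"]+)"'),
}

def extract_cases(script: str) -> list:
    """Statically extract each test case's target information without
    executing the script."""
    class_match = NAME_KEYWORDS["class"].search(script)
    class_name = class_match.group(1) if class_match else None
    cases = []
    # Split into per-method chunks so each case keeps its own annotations.
    for chunk in re.split(r"(?=@Test)", script)[1:]:
        method = NAME_KEYWORDS["method"].search(chunk)
        if not method:
            continue
        case = {"class_name": class_name, "method_name": method.group(1)}
        for field, pattern in EXT_KEYWORDS.items():
            match = pattern.search(chunk)
            if match:
                case[field] = match.group(1)
        cases.append(case)
    return cases

SCRIPT = '''
public class LoginRegisterTest {
    @Test(priority = "P0", state = "online", env = "staging", description = "register")
    public void testRegister() { }
    @Test(priority = "P1", state = "online", env = "staging", description = "login")
    public void testLogin() { }
}
'''
cases = extract_cases(SCRIPT)
```

Because the matching is purely textual, it can run at commit time, before any test framework is loaded, which is exactly the static-analysis property the step above relies on.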
In step S12, the target information is written into the corresponding field positions of the database, so as to generate case description data of the interface test cases, and the case description data is displayed on the target device interface.
In the embodiment of the disclosure, the database may store the target information of the interface test case through the data table. The data table contains attribute names and attribute values thereof, wherein the attributes may include a class name attribute and a method name attribute, or may further include extended attributes, wherein the extended attributes may include but are not limited to one or more of a use case description attribute, a dependency attribute, a priority attribute, a grouping attribute, a use case status attribute, an execution sequence attribute, a running environment attribute, and the like. As an example, the above field position may be understood as a field in a data table for representing an attribute name.
In this step, after the target information of each interface test case is extracted from the initial script information, the target information of each interface test case may be input into a corresponding field position of the database, so as to generate case description data of each interface test case, and then the case description data may be displayed on the target device interface. That is, the target information of all interface test cases analyzed from the initial script information is all integrated together in a manner of writing into a database to obtain case description data, and the case description data is displayed on a target device interface, so that the visualization of the interface test cases is realized.
For example, as shown in fig. 2a, the initial script information is a test case script for a login/registration module and contains an interface test case for the registration function and an interface test case for the login function. Each test case includes its own priority information, case state information, running environment information, case description information, and method name, while the two cases share one class name, so each case can be located by its class name and method name. In the present disclosure, all target information of the registration-function test case and of the login-function test case can be parsed from the initial script information and written into the corresponding field positions of the database, forming the case description data shown in fig. 2b. The first row in fig. 2b represents the attribute names in the data table, and the content in each column is the corresponding attribute value; the second row represents all target information of the registration-function test case; and the third row represents all target information of the login-function test case. That is, the generated case description data contains two entries: one corresponding to the registration-function test case and one corresponding to the login-function test case. Accordingly, the number of interface test cases contained in the initial script information matches the number of entries in the case description data.
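The data table of figs. 2a/2b could be sketched as follows, using SQLite with hypothetical field names mirroring the attributes above (the patent does not fix a schema or a database product):

```python
import sqlite3

# Hypothetical schema; field names mirror the attributes named in the text.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE case_description (
        class_name   TEXT,
        method_name  TEXT,
        description  TEXT,
        priority     TEXT,
        case_state   TEXT,
        run_env      TEXT,
        PRIMARY KEY (class_name, method_name)  -- class + method locate a unique case
    )
""")

def write_target_info(case: dict) -> None:
    """Write one case's target information into the corresponding field positions."""
    conn.execute(
        "INSERT OR REPLACE INTO case_description VALUES (?, ?, ?, ?, ?, ?)",
        (case.get("class_name"), case.get("method_name"), case.get("description"),
         case.get("priority"), case.get("case_state"), case.get("run_env")),
    )
    conn.commit()

write_target_info({"class_name": "LoginRegisterTest", "method_name": "testLogin",
                   "description": "login", "priority": "P1",
                   "case_state": "online", "run_env": "staging"})
rows = conn.execute(
    "SELECT class_name, method_name, priority FROM case_description").fetchall()
```

Each row of this table is one entry of case description data, so rendering the table on the target device interface yields exactly the visualization of fig. 2b.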
In step S13, in response to the test command sent by the user to the target use case description data in the target device interface, the target script information corresponding to the target use case description data in the initial script information is executed.
Optionally, in some embodiments of the present disclosure, when the case description data of the multiple interface test cases is displayed on the target device interface, a control button may be generated for each entry of the case description data based on the corresponding interface test case in the initial script information, and each control button may be displayed next to its case description data on the target device interface, so that the user can run the test case script by clicking the button. For example, when a trigger operation by the user on the control button corresponding to target case description data is received, this is treated as a test command issued by the user for that target case description data, and the target script information in the initial script information that corresponds to the target case description data is executed. In this way, the interface test cases in the test case script are visualized, so that a user can select a test case from the visualized cases according to his or her own test requirements, making it convenient to flexibly use and operate test case information.
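The patent only states that the target script information is executed in response to the test command; one plausible realization, assuming a Maven/TestNG project where Surefire's `-Dtest=Class#method` syntax selects a single case, is:

```python
import subprocess

def run_target_case(class_name: str, method_name: str, dry_run: bool = True):
    """Build (and optionally run) the command that executes exactly one case.

    The Maven/Surefire invocation is an assumption for illustration; the
    patent does not name a build tool or command.
    """
    cmd = ["mvn", "test", f"-Dtest={class_name}#{method_name}"]
    if not dry_run:
        subprocess.run(cmd, check=True)  # real deployment: actually run it
    return cmd

# The class name and method name stored in the database row are sufficient
# to map a clicked control button back to one case in the original script.
cmd = run_target_case("LoginRegisterTest", "testLogin")
```

This is why the class-name and method-name fields are mandatory in the table: they are the key that turns a row of case description data back into a runnable piece of the initial script.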
According to the processing method for interface automation testing of the embodiment of the disclosure, initial script information containing a plurality of interface test cases can be parsed, a plurality of pieces of target information for each interface test case can be extracted and written into corresponding field positions of a database, case description data for the interface test cases can be generated, and the case description data can be displayed on the target device interface. Visualization of the interface test cases is thereby achieved, and a user can conveniently view and operate test case information on the interface; for example, case logic and running-state information can be displayed on the visual interface, facilitating the management and maintenance of test cases. In addition, by responding to a test command issued by a user for target case description data on the target device interface, the target script information corresponding to the target case description data can be executed, so that the user can select a test case according to his or her own test requirements and flexibly use and operate test case information.
FIG. 3 is a flow chart illustrating a method of processing an interface automation test in accordance with another exemplary embodiment. As shown in fig. 3, the processing method of the interface automation test includes the following steps.
In step S31, in response to the submission event of the initial script information, a listener set in the use case execution program in the test framework is started.
Optionally, after writing the initial script information with script-writing tool software, the tester may submit it to the processing apparatus for interface automation testing of the embodiment of the present disclosure. In response to the commit event for the initial script information, the processing apparatus may start a listener set in the case running program of its test framework. The listener is used to persist the test cases, that is, to write case information into the database.
In step S32, the initial script information including a plurality of interface test cases is analyzed, and a plurality of target information for each interface test case is extracted.
In the embodiment of the disclosure, when a submission event for the initial script information is monitored, the initial script information can be acquired, the initial script information containing a plurality of interface test cases is analyzed, and a plurality of target information of each interface test case is extracted.
In step S33, the target information is written into the corresponding field positions of the database, and case description data of the interface test cases is generated.
In step S34, a write completion message fed back by the listener is received, and the use case running program in the test framework is exited.
That is, after the target information is written into the corresponding field positions of the database and the case description data of the interface test cases is generated, it can be determined whether a write completion message has been fed back by the listener. If so, the database write operation has completed, and the case running program in the test framework can be exited, so that the case information is obtained without the cases actually being executed.
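The write-then-exit flow of steps S31 to S34 can be sketched as follows; `PersistenceListener` and the dictionary-shaped `test_context` are simplifications standing in for the framework's real listener interface and TestContext object, which the patent references but does not define:

```python
class PersistenceListener:
    """Persist case information, then signal completion so the runner
    can exit before any case actually executes."""
    def __init__(self, db: list):
        self.db = db
        self.write_complete = False

    def on_start(self, test_context: dict) -> None:
        # test_context stands in for the framework's TestContext object.
        for case in test_context["cases"]:
            self.db.append(case)      # persistence work: write case info
        self.write_complete = True    # the "write completion" message

def run_cases(test_context: dict, listener: PersistenceListener) -> str:
    listener.on_start(test_context)   # listener runs before case execution
    if listener.write_complete:
        return "exited-before-execution"  # quit the runner; cases never run
    return "executed"

db = []
ctx = {"cases": [{"class_name": "LoginRegisterTest", "method_name": "testLogin"}]}
result = run_cases(ctx, PersistenceListener(db))
```

The key design choice, as the text explains, is that the listener hook fires before case execution, which is what lets the commit pipeline reuse the framework's own metadata without ever running a test.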
In step S35, the use case description data is displayed on the target device interface.
In step S36, in response to the test command sent by the user to the target use case description data in the target device interface, the target script information corresponding to the target use case description data in the initial script information is executed.
For example, as shown in fig. 4, a tester writes a test case locally and then submits the code to a code management tool (e.g., GitLab). The commit event causes the code management tool to automatically trigger packaging and compilation, which is the core step. The TestContext object provided by the test framework (e.g., TestNG) contains the case description, dependencies, priority, grouping, and other information of each test case. In the prior art, however, this information can be obtained only when the cases are actually run; it is not statically resolved, so a compromise is needed that obtains the information without actually running the cases. In the present disclosure, a custom listener is defined in the case running program (e.g., testng.xml) provided by the test framework to perform the persistence work, that is, to write case information (such as the class name and method name) into the database. The commit event triggers compilation and then execution of the case running program; in the prior art the case running program is used to run the cases, but in the present disclosure the listener is executed before the cases would run. All target information in the interface test cases is written into the database to generate the corresponding case description data, the case description data is displayed on the target device interface, and case persistence is complete. A message can then be sent to the tester, who can see the test cases on the visual test platform interface; the displayed information includes the case description, online/offline state, priority, and so on.
In some embodiments of the present disclosure, after the case description data is displayed on the target device interface, corresponding information in the database may be modified synchronously in response to a modification operation by the user on the case description data in the target device interface; and when it is detected that the case description data stored in the database is inconsistent with the initial script information, a modification prompt may be sent. That is, when the case description data is displayed on the target device interface, the user can modify it according to his or her own test requirements, and the modified data is synchronized into the database. Whether the case description data stored in the database is consistent with the initial script information can be monitored at scheduled times or periodically; if they are inconsistent, the visualized test case content has been modified while the initial script information has not. In that situation, a modification prompt can be sent to the device on which the initial script information was written, so that that device can modify the initial script information accordingly, enabling a scenario in which the user can flexibly edit test cases.
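A minimal sketch of the consistency check described above, using hypothetical `class#method` keys and field names (the patent does not specify how the comparison or the prompt is encoded):

```python
def detect_drift(db_rows: dict, script_cases: dict) -> list:
    """Compare persisted case description data against the information parsed
    from the original script; return one modification prompt per drifted field.
    Keys are 'ClassName#methodName' strings, values are field dicts."""
    prompts = []
    for key, db_fields in db_rows.items():
        script_fields = script_cases.get(key, {})
        for field, value in db_fields.items():
            if script_fields.get(field) != value:
                prompts.append(
                    f"{key}: field '{field}' modified on platform "
                    f"({script_fields.get(field)!r} -> {value!r}); "
                    f"please update the script")
    return prompts

# The platform raised the priority, but the script still says P1.
db_rows = {"LoginRegisterTest#testLogin": {"priority": "P0"}}
script_cases = {"LoginRegisterTest#testLogin": {"priority": "P1"}}
prompts = detect_drift(db_rows, script_cases)
```

Running such a check on a schedule is what lets the database act as the editable source of truth for the platform while still keeping the script, which is what actually executes, from silently drifting out of date.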
Therefore, the embodiment of the present disclosure separates script writing from script persistence: a user only needs to write a test case and submit the code, without manually performing any persistence operation, to see the updated content on the platform. This supports scenarios in which users flexibly write test cases and can conveniently view and operate on test case information in the interface.
FIG. 5 is a block diagram illustrating a processing apparatus for interface automation testing according to an exemplary embodiment. Referring to fig. 5, the apparatus includes a parsing module 510, a writing module 520, a display module 530, and an execution module 540.
Specifically, the parsing module 510 is configured to parse initial script information including a plurality of interface test cases, and extract a plurality of target information of each interface test case. In some embodiments of the present disclosure, parsing module 510 is specifically configured to:
matching the initial script information with a preset name keyword; and if the matching is successful, extracting the class name and the method name of each interface test case from the initial script information according to the name keywords.
In other embodiments of the present disclosure, the parsing module 510 is further specifically configured to: matching the initial script information with preset extended keywords; and if the matching is successful, extracting the extended information of each interface test case from the initial script information according to the extended keywords.
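The keyword-based extraction performed by the parsing module can be illustrated as follows. The sketch assumes a TestNG-style Java script as the initial script information; the regular expressions stand in for the "preset name keywords" (`class`, `void`) and the "preset extended keywords" (the `@Test` annotation fields) and are assumptions made for the example.

```python
import re

SCRIPT = '''
public class LoginTest {
    @Test(description = "login works", priority = 1)
    public void testLoginSuccess() { }
}
'''

# Name keywords: locate the class name and each test method name.
CLASS_RE = re.compile(r"class\s+(\w+)")
METHOD_RE = re.compile(r"@Test\(([^)]*)\)\s*public\s+void\s+(\w+)")

def parse_cases(script):
    class_name = CLASS_RE.search(script).group(1)
    cases = []
    for ext, method in METHOD_RE.findall(script):
        # Extended keywords: pull annotation fields such as the
        # description and priority out of the @Test(...) text.
        info = {}
        for pair in ext.split(","):
            key, value = pair.split("=", 1)
            info[key.strip()] = value.strip().strip('"')
        cases.append({"class": class_name, "method": method, **info})
    return cases

cases = parse_cases(SCRIPT)
```

The resulting dictionaries correspond to the plural target information of each interface test case, ready to be written into the field positions of the database.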
The writing module 520 is configured to write the plurality of target information into corresponding field positions of the database, respectively, and generate case description data of the plurality of interface test cases.
The display module 530 is configured to display the use case description data on the target device interface.
The execution module 540 is configured to execute the target script information corresponding to the target use case description data in the initial script information in response to a test command sent by a user to the target use case description data in the target device interface.
In some embodiments of the present disclosure, as shown in fig. 6, the processing apparatus 500 for interface automation testing may further include: a starting module 550 and a processor module 560. The starting module 550 is configured to, before the parsing module 510 parses the initial script information containing the plurality of interface test cases and extracts the plurality of target information of each interface test case, start a listener set in the case running program in the test framework in response to a submission event of the initial script information. The processor module 560 is configured to, after the writing module 520 writes the plurality of target information into the corresponding field positions of the database and generates the case description data of the plurality of interface test cases, receive a write completion message fed back by the listener and exit the case running program in the test framework.
In some embodiments of the present disclosure, as shown in fig. 7, the processing apparatus 500 for interface automation testing may further include: a modification module 570 and a sending module 580. The modification module 570 is configured to, in response to a modification operation performed by a user on the case description data in the target device interface, synchronously modify the corresponding information in the database according to the modification operation. The sending module 580 is configured to send modification prompt information when detecting that the case description data stored in the database is inconsistent with the initial script information.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
According to the processing apparatus for interface automation testing of the present disclosure, initial script information containing a plurality of interface test cases can be parsed, a plurality of target information of each interface test case can be extracted, the target information can be written into corresponding field positions of the database to generate case description data of the interface test cases, and the case description data can be displayed on the target device interface. The interface test cases are thereby visualized, so that a user can conveniently view and operate on test case information in the interface; for example, case logic and case running states can be displayed on the visual interface, which facilitates management and maintenance of the test cases. In addition, in response to a test command issued by the user for the target case description data in the target device interface, the target script information corresponding to the target case description data in the initial script information can be executed, so that test cases can be run conveniently from the visual interface.
FIG. 8 is a block diagram illustrating a processing device for automated testing of an interface, according to an example embodiment. It should be noted that the processing apparatus for automatically testing the interface may be an electronic device, and the electronic device may be installed with a client corresponding to the multimedia resource social platform. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 8, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a touch-sensitive display screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the touch display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in the position of the electronic device 800 or of a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described processing method for the interface automation test.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the electronic device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium, wherein instructions of the storage medium, when executed by a processor of an electronic device 800, enable the electronic device 800 to perform the interface automation test processing method according to the above embodiment.
A computer program product enabling an electronic device 800 to perform a processing method of interface automation testing when instructions in the computer program product are executed by a processor of the electronic device 800.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A processing method for interface automation test is characterized by comprising the following steps:
analyzing initial script information containing a plurality of interface test cases, and extracting a plurality of target information of each interface test case;
respectively writing the target information into corresponding field positions of a database, generating case description data of the interface test cases, and displaying the case description data on a target equipment interface;
and responding to a test command sent by a user to the target use case description data in the target equipment interface, and executing the target script information corresponding to the target use case description data in the initial script information.
2. The method of claim 1, wherein before parsing the initial script information including a plurality of interface test cases and extracting a plurality of target information of each of the interface test cases, the method further comprises:
responding to a submission event of the initial script information;
starting a monitoring program set in a case running program in a test framework;
after the writing the target information into the corresponding field positions of the database respectively and generating the case description data of the interface test cases, the method further includes:
receiving a write completion message fed back by the monitoring program;
and exiting the use case running program in the test framework.
3. The method of claim 1, wherein parsing the initial script information containing a plurality of interface test cases and extracting a plurality of target information for each of the interface test cases comprises:
matching the initial script information with a preset name keyword;
and if the matching is successful, extracting the class name and the method name of each interface test case from the initial script information according to the name keywords.
4. The method of claim 3, wherein parsing initial script information comprising a plurality of interface test cases and extracting a plurality of target information for each of the interface test cases, further comprises:
matching the initial script information with preset extended keywords;
and if the matching is successful, extracting the extended information of each interface test case from the initial script information according to the extended keywords.
5. The method of any of claims 1-4, after displaying the use case description data on a target device interface, further comprising:
responding to the modification operation of the user on the use case description data in the target equipment interface, and synchronously modifying corresponding information in the database according to the modification operation;
and when the case description data stored in the database is detected to be inconsistent with the initial script information, sending modification prompt information.
6. A processing apparatus for automated testing of an interface, comprising:
the analysis module is configured to analyze initial script information containing a plurality of interface test cases and extract a plurality of target information of each interface test case;
the writing module is configured to write the target information into corresponding field positions of a database respectively, and generate case description data of the interface test cases;
the display module is configured to display the use case description data on a target device interface;
and the execution module is configured to respond to a test command sent by a user to the target use case description data in the target equipment interface, and execute the target script information corresponding to the target use case description data in the initial script information.
7. The apparatus of claim 6, further comprising:
the starting module is configured to respond to a submission event of the initial script information and start a monitoring program set in a case running program in a testing frame before the analysis module analyzes the initial script information containing a plurality of interface test cases and extracts a plurality of target information of each interface test case;
the processor module is configured to, after the writing module writes the target information into the corresponding field positions of the database and generates the case description data of the interface test cases, receive a write completion message fed back by the monitoring program and exit the case running program in the test framework.
8. The apparatus of claim 6, wherein the parsing module is specifically configured to:
matching the initial script information with a preset name keyword;
and if the matching is successful, extracting the class name and the method name of each interface test case from the initial script information according to the name keywords.
9. A processing apparatus for automated testing of an interface, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the processing method of interface automation testing of any one of claims 1 to 5.
10. A storage medium in which instructions, when executed by a processor of a processing device for interface automation test, enable the processing device for interface automation test to perform the processing method for interface automation test as recited in any one of claims 1 to 5.
CN202011104883.2A 2020-10-15 2020-10-15 Processing method and device for interface automation test and storage medium Pending CN112416751A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011104883.2A CN112416751A (en) 2020-10-15 2020-10-15 Processing method and device for interface automation test and storage medium

Publications (1)

Publication Number Publication Date
CN112416751A true CN112416751A (en) 2021-02-26

Family

ID=74854746

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113806229A (en) * 2021-09-27 2021-12-17 工银科技有限公司 Interface change test script multiplexing method, device, equipment, medium and product
CN114924991A (en) * 2022-07-19 2022-08-19 深圳市亿联无限科技有限公司 Method and system for reproducing probabilistic problem under specific operation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109753276A (en) * 2018-12-29 2019-05-14 北京天际启游科技有限公司 A kind of control method and relevant apparatus based on illusory engine
CN110413524A (en) * 2019-07-26 2019-11-05 中国工商银行股份有限公司 For generating method and apparatus, the automated testing method of test script
CN111078580A (en) * 2019-12-31 2020-04-28 贵阳货车帮科技有限公司 Test case management method and device, storage medium and electronic equipment
CN111176996A (en) * 2019-12-25 2020-05-19 平安普惠企业管理有限公司 Test case generation method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination