CN113505082A - Application program testing method and device - Google Patents


Publication number
CN113505082A
Authority
CN
China
Prior art keywords: test, code, case, data, target
Legal status: Granted
Application number: CN202111055191.8A
Other languages: Chinese (zh)
Other versions: CN113505082B (en)
Inventors: 刘楚蓉, 曾辉, 王健, 谢宗兴
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111055191.8A priority Critical patent/CN113505082B/en
Publication of CN113505082A publication Critical patent/CN113505082A/en
Application granted granted Critical
Publication of CN113505082B publication Critical patent/CN113505082B/en
Current legal status: Active

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06F ELECTRIC DIGITAL DATA PROCESSING › G06F11/00 Error detection; Error correction; Monitoring › G06F11/36 Preventing errors by testing or debugging software › G06F11/3668 Software testing › G06F11/3672 Test management
        • G06F11/3684 Test management for test design, e.g. generating new test cases
        • G06F11/3688 Test management for test execution, e.g. scheduling of test suites

Abstract

The application discloses an application program testing method and device, relates to the technical field of data testing, and can be applied to scenes such as cloud technology, artificial intelligence, intelligent traffic, and the Internet of Vehicles. The method comprises the following steps: acquiring attribute information of a target application program corresponding to a test case; determining a test scene corresponding to the test case based on the attribute information; sending the test scene and the automated test code corresponding to the test case to a test terminal, so that the test terminal executes the automated test code in the test scene and plays back target data during code execution to obtain test data of the target application program; and sending the test data and the target data to a server, so that the server obtains a test result of the target application program based on the test data and the target data. The automated test code is generated based on the test case. The method and device improve code writing efficiency and reduce the cost of testing the application program.

Description

Application program testing method and device
Technical Field
The present application relates to the field of data testing technologies, and in particular, to a method and an apparatus for testing an application program.
Background
Existing automated testing technology requires the test code for each test case to be written manually. Writing such code is costly: the writer must have a certain level of coding skill, and constructing test scenes, such as test data and mock data, is particularly expensive.
Therefore, it is necessary to provide an application program testing method and apparatus that rapidly generate the automated test code corresponding to a test case, avoiding manual writing of test code, improving code writing efficiency, and reducing the cost of testing the application program.
Disclosure of Invention
The application provides an application program testing method and device, which can improve code writing efficiency and reduce the cost of testing an application program.
In one aspect, the present application provides an application program testing method, including:
acquiring attribute information of a target application program corresponding to a test case; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors;
determining a test scene corresponding to the test case based on the attribute information;
sending the test scene and the automated test code corresponding to the test case to a test terminal so that the test terminal executes the automated test code in the test scene, and playing back the target data in the code execution process to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case.
Another aspect provides an application testing apparatus, comprising:
the attribute information acquisition module is used for acquiring the attribute information of the target application program corresponding to the test case; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors;
the test scene determining module is used for determining a test scene corresponding to the test case based on the attribute information;
the code sending module is used for sending the test scene and the automatic test codes corresponding to the test cases to a test terminal so that the test terminal executes the automatic test codes in the test scene, and in the code execution process, the target data is played back to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case.
Another aspect provides an application testing device, which includes a processor and a memory, where at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded and executed by the processor to implement the application testing method as described above.
Another aspect provides a computer storage medium storing at least one instruction or at least one program, the at least one instruction or the at least one program being loaded and executed by a processor to implement the application testing method as described above.
Another aspect provides a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device implements the application testing method as described above.
The application program testing method and device have the following technical effects:
the method includes the steps that attribute information of a target application program corresponding to a test case is obtained; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors; determining a test scene corresponding to the test case based on the attribute information; therefore, scene information corresponding to the test case can be determined in time, and the requirements of various automatic test scenes are met; sending the test scene and the automated test code corresponding to the test case to a test terminal so that the test terminal executes the automated test code in the test scene, and playing back the target data in the code execution process to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case; therefore, automatic testing can be performed through automatic testing codes generated based on the recorded test cases, testing cost is reduced, and testing efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions and advantages of the embodiments of the present application or the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an application test system according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an application testing method according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for determining a file name of a test case according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a method for generating automated test code according to an embodiment of the present disclosure;
FIG. 5 is a flowchart illustrating a method for generating automated test code according to test cases according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a method for displaying the target page element by the target terminal according to an embodiment of the present application;
fig. 7 is a display interface diagram of a use case recording control provided in the embodiment of the present application;
fig. 8 is a display interface diagram of a filename editing popup window corresponding to a recording case generation code according to the embodiment of the present application;
FIG. 9 is a display interface diagram of a filename editing control provided by an embodiment of the present application;
fig. 10 is a display interface diagram of a use case recording prompt control provided in the embodiment of the present application;
FIG. 11 is a display interface diagram of a page element extraction control provided by an embodiment of the present application;
FIG. 12 is a display interface diagram of a floating frame according to an embodiment of the present disclosure;
FIG. 13 is a display interface diagram illustrating a pop-up window in accordance with an embodiment of the present disclosure;
FIG. 14 is a flowchart illustrating another test code generation method according to an embodiment of the present application;
FIG. 15 is a timing flow diagram for test code generation provided by embodiments of the present application;
FIG. 16 is a diagram comparing use case writing and execution using the automated test code of an embodiment of the present application with conventional test techniques;
FIG. 17 is a schematic structural diagram of an application testing apparatus according to an embodiment of the present application;
fig. 18 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances, such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, system, article, or apparatus.
Referring to fig. 1, fig. 1 is a schematic diagram of an application test system according to an embodiment of the present disclosure, and as shown in fig. 1, the application test system may include at least a first client 01, a second client 02, and a server 03.
Specifically, in this embodiment of the application, the first client 01 may include a smart phone, a desktop computer, a tablet computer, a notebook computer, a digital assistant, a smart wearable device, a smart speaker, a vehicle-mounted terminal, a smart television, and other types of physical devices, and may also include software running in the physical devices, for example, web pages provided by some service providers to users, and applications provided by the service providers to users. Specifically, the first client 01 may be configured to send a test scenario and an automation test code corresponding to a test case to the second client 02.
Specifically, in this embodiment of the application, the second client 02 may include a type of physical device such as a smart phone, a desktop computer, a tablet computer, a notebook computer, a digital assistant, a smart wearable device, a smart speaker, a vehicle-mounted terminal, and a smart television, and may also include software running in the physical device, for example, a web page provided by some service providers to a user, and an application provided by the service providers to the user. Specifically, the second client 02 may be configured to execute the automated test code in the test scenario, and in a code execution process, playback the target data to obtain the test data of the target application program.
Specifically, in this embodiment of the application, the server 03 may include an independently operating server, or a distributed server, or a server cluster including a plurality of servers, and may also be a cloud server that provides basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a CDN (Content Delivery Network), and a big data and artificial intelligence platform. The server 03 may comprise a network communication unit, a processor, a memory, etc. Specifically, the server 03 may be configured to obtain a test result of the target application program according to the test data and the target data.
An application testing method of the present application is described below. Fig. 2 is a flowchart of an application testing method provided in an embodiment of the present application; the embodiments and flowcharts provide the method operation steps described, but more or fewer operation steps may be included based on conventional or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. In practice, the system or server product may execute the steps sequentially or in parallel (for example, in a parallel-processor or multi-threaded environment) according to the methods shown in the embodiments or figures. Specifically, as shown in fig. 2, the method may be applied to a local terminal and may include:
s201: acquiring attribute information of a target application program corresponding to a test case; the test case is obtained by recording target data; the target data includes a plurality of object operation behavior data generated in accordance with a time sequence of object operation behaviors.
In the embodiment of the application, the attribute information of the target application program may include, but is not limited to, a use case ID, a version number, a mock ID generated during test case recording, and the like. The mock ID is used to obtain mock data for links such as network and IO (Input/Output) stored on the background server. Mock testing is a test method in which virtual objects are created for objects that are not easy to construct or obtain during testing. During testing, the test page data of every test terminal (page like count, comment count, etc.) must be completely consistent; that is, all network requests need to be intercepted and the mock data automatically backfilled, which ensures data consistency across the test terminals during test case playback.
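As a rough illustration of the mock mechanism described above, the following Python sketch records responses under a mock ID and answers intercepted requests from the recording during playback. The `MockStore` class and its method names are illustrative assumptions, not part of the patent.

```python
class MockStore:
    """Maps a mock ID to the responses recorded for each request URL."""

    def __init__(self):
        self._recordings = {}  # mock_id -> {url: response}

    def record(self, mock_id, url, response):
        # Called while the test case is being recorded.
        self._recordings.setdefault(mock_id, {})[url] = response

    def replay(self, mock_id, url):
        # During playback every network request is intercepted and answered
        # from the recording, so all test terminals see identical page data.
        return self._recordings[mock_id].get(url)


store = MockStore()
store.record("mock-42", "/api/like_count", {"likes": 17, "comments": 3})
assert store.replay("mock-42", "/api/like_count") == {"likes": 17, "comments": 3}
```

Keying the store by mock ID lets several recorded cases coexist without their backfilled responses colliding.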
S203: and determining a test scene corresponding to the test case based on the attribute information.
In the embodiment of the application, the test scene of the use case can be simulated according to the mock data, so that the test scene can be quickly replayed, and the automatic test can be ensured to be carried out in the set test scene.
S205: sending the test scene and the automated test code corresponding to the test case to a test terminal so that the test terminal executes the automated test code in the test scene, and playing back the target data in the code execution process to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case.
Specifically, in the embodiment of the present application, when the test data is completely consistent with the target data, it is determined that the target application program runs stably; when the test data is not completely consistent with the target data, it is determined that the target application program runs unstably. There may be a plurality of test terminals. When the target application program runs stably on all of the test terminals, the stability of the target application program is good and the test passes; when the target application program runs unstably on any test terminal, the stability of the target application program is poor, the test fails, and the target application program needs to be further optimized.
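The pass/fail rule above — each terminal's played-back test data must exactly match the recorded target data, and every terminal must pass — can be sketched as follows; the function names are hypothetical:

```python
def terminal_stable(test_data, target_data):
    # One terminal is considered stable only when its played-back test data
    # is completely consistent with the recorded target data.
    return test_data == target_data


def overall_result(per_terminal_results):
    # The test passes only if the target application ran stably on every
    # test terminal; one unstable terminal fails the whole test.
    return all(per_terminal_results)


target = ["click:login", "swipe:feed"]
assert overall_result([terminal_stable(target, target),
                       terminal_stable(target, target)])
assert not overall_result([terminal_stable(["click:login"], target)])
```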
In an embodiment of the present application, the method further includes:
acquiring target data;
the acquiring target data includes:
s101: responding to a first operation instruction triggered based on a case recording prompt control, and sending a test case recording instruction to a target terminal; the test case recording instruction carries identification information of the target application program.
In the embodiment of the present application, the local terminal and the target terminal are communicatively connected, for example, through a USB data line.
In a specific embodiment, the use case recording prompt control may include, but is not limited to, a text prompt identifier, a picture prompt identifier, a symbol prompt identifier, and the like; as shown in fig. 10, the use case recording prompt control may be a "start recording" key identifier in fig. 10, and when the user triggers the identifier, a test case recording instruction may be sent to the target terminal, where the test case recording instruction is specific to a specific application program and carries identification information of the target application program, and the identification information may include, but is not limited to, an icon, a name, and the like of the target application program.
In this embodiment of the present application, the target data further includes a plurality of object operation associated data, each object operation associated data corresponds to one object operation behavior data, and the method further includes:
generating a plurality of step use case information according to the time sequence corresponding to each object operation behavior data by using the object operation behavior data;
in the embodiment of the present application, the object operation behavior data may include, but is not limited to, data corresponding to various behaviors of a user in sliding, clicking, dragging, and the like at a target terminal; in the recording executing process, recording all operation behaviors of the target terminal, such as clicking, sliding, double clicking and the like, so that all operation flows can be recorded, and meanwhile, information of the control can be captured; representing the execution sequence of each operation behavior by the time sequence corresponding to the object operation behavior data; multiple pieces of step use case information can be constructed according to the execution sequence of the operation behaviors, and each operation behavior corresponds to one piece of step use case information.
Generating a plurality of step use case information according to the time sequence corresponding to each object operation behavior data by the plurality of object operation behavior data, including:
generating a plurality of step use case information from the plurality of object operation behavior data and the corresponding plurality of object operation associated data based on the corresponding time sequence of each object operation behavior data; each step use case information is generated based on one object operation behavior data and the corresponding object operation associated data.
In the embodiment of the present application, the object operation associated data may include, but is not limited to, data such as text, picture, coordinate position, path (xpath), whether to pop window, operation delay, and the like in the use case step list; the path corresponds to control identification information in the display page.
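A minimal sketch of how step use case information might be assembled from the two recorded streams, assuming each behavior record carries a `ts` timestamp; the field names are illustrative, not fixed by the patent:

```python
def build_step_infos(behaviors, associated):
    # behaviors: one dict per operation behavior, with a 'ts' timestamp and
    #            an 'action' such as click/slide/drag.
    # associated: a parallel list of associated data (text, coordinates,
    #             xpath, pop-up flag, operation delay, ...).
    # Each step-use-case entry pairs one behavior with its associated data,
    # ordered by the time the operation occurred.
    paired = sorted(zip(behaviors, associated), key=lambda p: p[0]["ts"])
    return [{"action": b["action"], **a} for b, a in paired]


behaviors = [{"ts": 2, "action": "swipe"}, {"ts": 1, "action": "click"}]
associated = [{"xpath": "//feed"}, {"xpath": "//login"}]
steps = build_step_infos(behaviors, associated)
assert [s["action"] for s in steps] == ["click", "swipe"]
```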
In this embodiment of the present application, as shown in fig. 3, before sending a test case recording instruction to a target terminal in response to a first operation instruction triggered based on a use case recording prompt control, the method further includes:
s301: and displaying a use case recording control on a target page based on a preset plug-in.
In the embodiment of the application, the method can be applied in an Android development tool (Android Studio); Android Studio is an integrated Android development tool provided by Google for development and debugging. The preset plug-in may be installed in Android Studio; specifically, it may be an Integrated Development Environment (IDE) plug-in. An IDE is an application program that provides a program development environment and generally includes tools such as a code editor, a compiler, a debugger, and a graphical user interface.
In a specific embodiment, the use case recording control may include, but is not limited to, identifiers such as text, pictures, and symbols; for example, as shown in fig. 7, the use case recording control may be the "record use case to generate code" identifier in fig. 7.
S303: responding to a second operation instruction triggered based on the case recording control, and displaying a file name editing popup window of the test case; the file name editing popup window comprises a file name editing control and a file name submitting control.
In the embodiment of the present application, the filename editing control is used to input a filename, the filename submission control is used to confirm the filename, the filename editing control may include, but is not limited to, a filename input box, a selectable filename list, and the like, and the filename submission control may include, but is not limited to, a text, a picture, a symbol, and the like.
In a specific embodiment, as shown in fig. 8, fig. 8 is a filename editing popup corresponding to the recording case generation code, where a blank input box is a filename editing control, and a "confirm" button is a filename submission control.
S305: and responding to a third operation instruction triggered based on the file name editing control, and displaying the file name of the test case.
In a specific embodiment, as shown in fig. 9, the user may trigger the filename editing control to input a filename, where the filename is a class name; for example, the class name corresponding to a video author test case may be AuthorTest.
S307: and determining the file name of the test case in response to a fourth operation instruction triggered based on the file name submission control.
In a particular embodiment, as shown in FIG. 9, after the user enters the filename, the "OK" may be clicked on to submit the filename.
S309: displaying a case recording prompt page; the use case recording prompt page comprises the use case recording prompt control.
In the embodiment of the application, the use case recording prompt control may include, but is not limited to, a text prompt identifier, a picture prompt identifier, a symbol prompt identifier, and the like; after the user submits the file name, the current display page jumps to a case recording prompt page; such as the page shown in fig. 10, includes a use case recording prompt control of "start recording".
In an embodiment of the present application, as shown in fig. 4, the method further includes:
s401: converting the target data into a plurality of step use case information based on the time sequence corresponding to each object operation behavior data;
s403: constructing a test case based on the case information of the plurality of steps;
in the embodiment of the application, the multiple step case information is spliced according to the time sequence corresponding to each step case information to obtain the test case. In a particular embodiment, the test case may be used to test a video author. The test cases are stored in a JSON format. JSON (JavaScript Object Notation) is a lightweight data exchange format. The Test Case (Test Case) refers to the description of a Test task performed on a specific software product, and embodies Test schemes, methods, techniques and strategies. The contents of the test object, the test environment, the input data, the test steps, the expected results, the test scripts and the like are included, and finally, a document is formed. Simply considered, a test case is a set of test inputs, execution conditions, and expected results tailored for a particular purpose to verify whether a particular software requirement is met.
S405: and generating automatic test codes based on the test cases.
In this embodiment of the present application, as shown in fig. 5, the generating an automated test code based on the test case includes:
s4051: carrying out serialization processing on the case information of each step to obtain a step class object;
in the embodiment of the application, the step use case information in the JSON format can be serialized and converted into the step class object.
In this embodiment of the present application, the performing serialization processing on the use case information of each step to obtain a step class object includes:
s40511: serializing the object operation behavior data in the case information of each step to obtain a first class of objects;
s40513: and serializing the object operation associated data in the case information of each step to obtain a second class of objects.
In the embodiment of the application, object operation behavior data in a JSON format can be converted into a first class of objects in a serialized mode; serializing object operation associated data in a JSON format to convert the object operation associated data into a second class of objects; the first class of objects and the second class of objects are objects of target syntax; the essence of the conversion of the class object is to convert the data in JSON format into an instance object of the target syntax.
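The conversion of JSON step data into first-class and second-class instance objects can be sketched with Python dataclasses; the class and field names here are assumptions for illustration:

```python
import json
from dataclasses import dataclass


@dataclass
class BehaviorObject:
    """First class of object: what operation the user performed."""
    action: str


@dataclass
class AssociatedObject:
    """Second class of object: the context of the step (xpath, delay, ...)."""
    xpath: str
    delay_ms: int


def deserialize_step(step_json):
    # Turn one piece of JSON step use case information into the pair of
    # instance objects used by the code generator.
    step = json.loads(step_json)
    return (BehaviorObject(step["action"]),
            AssociatedObject(step["xpath"], step.get("delay_ms", 0)))


b, a = deserialize_step('{"action": "click", "xpath": "//btn", "delay_ms": 100}')
assert b.action == "click" and a.xpath == "//btn"
```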
S4053: converting each step class object into a code of a target grammar;
in the embodiment of the present application, before code conversion begins, a target syntax type that needs to be converted is determined, for example, the target syntax may include, but is not limited to, a syntax corresponding to Espresso, and other syntax customized by a user. Espresso is an automated testing framework for the android application display interface offered by Google officials.
In this embodiment of the present application, the code for converting each step class object into the target syntax includes:
s40531: determining a first code corresponding to each first-class object based on the first mapping information; the first code is a code of the target grammar, and the first mapping information represents a mapping relation between a class object and the code;
in the embodiment of the application, the test code corresponding to the first class object corresponding to each kind of object operation behavior data can be predefined through the key class; key classes may include, but are not limited to, a convert assist class (Converter Helper), a Code Creator Registry class (Code Creator Registry), and the like; the mapping relation between the class object and the code is constructed in advance; therefore, the object operation behavior data can be quickly converted into corresponding test codes.
In this embodiment of the application, after determining the first code corresponding to each first class object based on the first mapping information, the method further includes:
determining annotation information of the first code based on the object operation behavior data;
and updating the first code based on the annotation information to obtain a first updated code.
In the embodiment of the present application, after the first code conversion is completed, annotation information may be added to the first code; the annotation information may include, but is not limited to, text information, picture information, and the like. For example, if the first code corresponds to a sliding operation, the annotation information may be "sliding operation", which helps the user quickly understand the meaning of the code. In addition, blank lines and similar formatting can be added to the code to separate the first codes corresponding to different operation behaviors, making it easier for the user to distinguish the code for each operation behavior.
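The annotation-and-blank-line update might look like the following sketch; the comment strings and function name are illustrative:

```python
def annotate(code_line, behavior):
    # Prepend a human-readable comment derived from the operation behavior,
    # and end with a newline so consecutive steps are visually separated in
    # the generated file (the "first updated code").
    comments = {
        "swipe": "// sliding operation",
        "click": "// click operation",
    }
    return f"{comments.get(behavior, '// operation')}\n{code_line}\n"


updated = annotate("swipeUp();", "swipe")
assert updated.startswith("// sliding operation")
```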
S40533: and converting each second class object into second code of the target grammar.
In the embodiment of the present application, converting each second class object into the second code means converting the instance object corresponding to the object operation associated data into the second code, for example, converting text, pictures, coordinate positions, and the like into test code.
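This second conversion can be pictured as follows — a hedged sketch assuming three kinds of associated data (text, picture, coordinate position) and invented snippet names:

```python
# Hypothetical sketch: convert the object-operation-associated data
# recorded with each step (text / picture / coordinate position) into
# a check or locate snippet, the "second code".

def to_second_code(assoc):
    kind = assoc["kind"]
    if kind == "text":
        return f'checkText("{assoc["value"]}")'
    if kind == "picture":
        return f'checkImage("{assoc["value"]}")'
    if kind == "coordinate":
        x, y = assoc["value"]
        return f"tapAt({x}, {y})"
    raise ValueError(f"unsupported associated data kind: {kind}")
```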
S4055: and splicing the codes corresponding to the case information of each step to generate the automatic test codes.
In this embodiment of the application, the splicing the codes corresponding to the case information in each step to generate the automated test code includes:
splicing the first updating code corresponding to each step case information with the second code to generate a target code corresponding to each step case information;
and splicing the target codes corresponding to the case information of each step to generate the automatic test codes.
In the embodiment of the application, the first update code and the second code corresponding to each piece of step case information can be combined to obtain the target code corresponding to that step case information; the multiple groups of target codes corresponding to the multiple pieces of step case information are then spliced according to the time sequence corresponding to the target codes, thereby generating the automatic test codes.
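The combine-then-splice step might look like this minimal sketch, which assumes (for illustration only) that each piece of step case information carries a timestamp along with its two codes:

```python
# Hypothetical sketch: join each step's (first update code, second code)
# pair into a target code, then splice the target codes in the time
# order of the recorded steps to form the automated test code.

def build_target_code(step):
    return step["first_update_code"] + "\n" + step["second_code"]

def splice_automated_test_code(steps):
    ordered = sorted(steps, key=lambda s: s["timestamp"])
    return "\n".join(build_target_code(s) for s in ordered)
```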
In this embodiment, before the splicing the codes corresponding to the case information in each step to generate the automated test code, the method further includes:
determining a class template corresponding to the test case based on the file name of the test case;
and acquiring the description information of the test case.
In the embodiment of the application, the class template corresponding to the test case is used for generating an automatic test code; the description information of the test case can be used for representing information such as the use, application scene and the like of the test case.
In this embodiment of the application, the splicing the codes corresponding to the case information in each step to generate the automated test code includes:
splicing the codes corresponding to the case information of each step to obtain spliced codes;
acquiring a code corresponding to the pause time in the splicing code;
in this embodiment of the present application, the pause time may be a time difference between an object operation behavior and a response behavior of the terminal; for example, when a user triggers a search command, the time for displaying the search box by the terminal usually lags behind the time for triggering the search command by the user, and the time difference between the two is the pause time.
And modifying the codes corresponding to the pause time to obtain the automatic test codes corresponding to the test cases.
In the embodiment of the application, when the code corresponding to the pause time is modified, the pause time can be set to 0 or shortened.
In the embodiment of the application, the pause time can be modified by acquiring the code corresponding to the pause time, so that the running speed of the case to be tested in the automatic test process can be increased.
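One way to realize this pause-time modification, assuming pauses appear in the spliced code as `sleep(<milliseconds>)` statements (an invented convention, used only for illustration), is a regular-expression pass that clamps each duration:

```python
import re

# Hypothetical sketch: locate pause statements in the spliced code and
# clamp their duration, so playback does not wait out the recorded lag
# between the user's operation and the terminal's response.

def shorten_pauses(spliced_code, max_ms=0):
    def clamp(match):
        ms = int(match.group(1))
        return f"sleep({min(ms, max_ms)})"
    return re.sub(r"sleep\((\d+)\)", clamp, spliced_code)
```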
In this embodiment of the application, the splicing the codes corresponding to the case information in each step to generate the automated test code includes:
splicing the codes corresponding to the case information of each step to obtain spliced codes;
and generating the automatic test code in the class template based on the description information of the test case and the spliced code.
In the embodiment of the application, the spliced code is the operation code; class structures such as a package name and import (Import) statements can be added to the operation code, and then the operation code with the added class structures, the class name, the function names related to the class objects in the code, and the like are substituted into the class template, thereby generating runnable automatic test code.
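The template substitution can be sketched as follows; the template text, placeholder names, and example values are all invented for illustration and are not the patent's actual template:

```python
import textwrap

# Hypothetical class template: package name, import statements, class
# name, function name, and the spliced operation code are substituted
# in to produce a runnable test class.

CLASS_TEMPLATE = """\
package {package_name};
{imports}

// {description}
public class {class_name} {{
    public void {function_name}() {{
{body}
    }}
}}
"""

def render_test_class(package_name, imports, description, class_name,
                      function_name, spliced_code):
    return CLASS_TEMPLATE.format(
        package_name=package_name,
        imports="\n".join(imports),
        description=description,
        class_name=class_name,
        function_name=function_name,
        body=textwrap.indent(spliced_code, " " * 8),
    )
```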
S103: and receiving the target data sent by the target terminal, wherein the target data is generated by the target terminal based on the test case recording instruction in the target application program corresponding to the identification information of the target application program.
In the embodiment of the present application, the target terminal may include, but is not limited to, a mobile phone, a tablet computer, and the like; the object may include, but is not limited to, a user, a robot, and the like. After receiving the test case recording instruction, if the test case is a video author test case, as shown in fig. 6, the method may include:
S601: the target terminal displays a preset page in the target application program corresponding to the identification information of the target application program based on the test case recording instruction; the preset page comprises a page element extraction control;
In the embodiment of the application, the page element extraction control is used for extracting elements such as text and pictures in the current page; the extracted page elements may include, but are not limited to, text, pictures, and the like; as shown in fig. 11, the page element extraction control may be a "check" button in the page; when the recording is finished, the user can click a "finish" button in the page to end the recording process.
S603: the target terminal responds to an operation instruction triggered by the page element extraction control and displays a suspension frame in the preset page; the floating frame comprises a plurality of page elements of the preset page;
In the embodiment of the application, the floating frame can be displayed at any position in the current page; the size of the floating frame can be adjusted according to the actual display content; as shown in fig. 12, the floating frame in the page is displayed alongside the page and includes all the elements presented in the current page.
S605: the target terminal responds to an operation instruction triggered to the target page element and determines the target page element from the multiple page elements;
In an embodiment of the present application, the user may select the target page element and click a submit (e.g., "determine") button to determine the target page element. In one specific embodiment, as shown in fig. 12, the video author "show pavilion modeler" may be selected among the page elements presented in the floating frame.
S607: and the target terminal displays the target page element in a target area of the preset page.
In the embodiment of the present application, as shown in fig. 12, when the user selects the video author "show pavilion modeler" in the floating frame and clicks "ok", the floating frame is hidden and the video author "show pavilion modeler" is displayed in the target area of the page. The target area can be any display area in the preset page.
S609: the target terminal responds to a trigger instruction of a use case description popup window, and determines the description information of the test use case; and the use case description popup is a popup displayed in the preset page.
In the embodiment of the present application, as shown in fig. 13, the use case description popup is a use case name popup, and a user may input description information of a test case in the popup, for example, description information such as a use of the use case may be input. After the target terminal determines the description information of the test case, the target terminal may send the description information to the local terminal.
In the embodiment of the present application, the target data may include the operation behavior data and the operation associated data corresponding to steps S401 to S408 in the target terminal.
Specifically, in the embodiment of the present application, a use case entry is created through an IDE plug-in, mock data acquisition and code generation for the recorded operation flow are realized through a playback Software Development Kit (playback SDK), and the mock data is stored through a background server (background for short). The interaction flow of the three modules is shown in fig. 14: the IDE plug-in is used for creating an IDE file and writing code, and first sends a use case recording request to the playback SDK; the playback SDK records the operation process of the target terminal based on the request, generates the code, returns the automation code to the IDE plug-in, and simultaneously uploads the mock data to the background server; the background server is used for storing the mock data, which facilitates subsequent automated testing in the target application program on the test terminal.
Specifically, in the embodiment of the application, a Software Development Kit (SDK) may be integrated, and the SDK records the operation process together with the mock data; the capability of the SDK is divided into three parts: the first part is the mock capability, which mainly mocks the data on the mobile phone at recording time, such as network data, the local database, local files, ABTest data, and the like, so that the recorded and played-back data remain basically consistent; the second part is the recording and checking capability, which mainly records the specific operations of the testers, such as clicking, sliding, double clicking, long pressing, and the like, and text or pictures can be selected as check points during the operation process; the third part is the code generation capability, which generates the corresponding automation code according to the recorded operation process.
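The mock capability can be pictured with this minimal record/replay sketch (all names invented for illustration): during recording, real data is captured; during playback, the captured data is served instead, keeping both runs consistent.

```python
# Hypothetical sketch of the mock capability: responses (network data,
# database rows, files, ABTest flags, ...) are captured while recording
# and returned verbatim during playback.

class DataMock:
    def __init__(self):
        self.recorded = {}
        self.replaying = False

    def fetch(self, key, real_fetch):
        if self.replaying:
            return self.recorded[key]      # playback: serve recorded data
        value = real_fetch(key)
        self.recorded[key] = value         # recording: capture real data
        return value
```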
The timing diagram for the third part, code generation, is shown in fig. 15, where the record manager, automatic use case conversion distributor, code converter factory, code converter, conversion auxiliary class, code conversion registration class, and code generation class are all class names; the conversion flow of the test code is as follows:
1. the record manager starts case recording and stores the recording data during the recording process;
2. the record manager sends a request for converting the Android test code, carrying the recording data, to the automatic case conversion distributor;
3. the automatic case conversion distributor sends a request for creating a code converter according to the type of the recorded data to a code converter factory;
4. the code converter factory determines a corresponding code converter according to the data type;
5. the code converter converts the step code and sends the step code to the conversion auxiliary class;
6. the conversion auxiliary class sends a request for obtaining the code converter to the code conversion registration class;
7. the code conversion registration class returns a code converter to the conversion auxiliary class;
8. the conversion auxiliary class determines a code generation class through a code converter and sends a test code generation request to the code generation class;
9. the code generation class splices the step codes to generate the automatic test codes.
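The nine steps above can be condensed into a small sketch of the factory-plus-generator core (all names invented for illustration; the dispatcher, helper, and registry roles are folded into the factory for brevity):

```python
# Hypothetical condensation of the nine-step flow: a factory hands out
# a code converter per recorded-data type, each step is converted, and
# the code generation class splices the step codes together.

class CodeConverterFactory:
    def __init__(self):
        self._converters = {}

    def register(self, data_type, converter):
        self._converters[data_type] = converter

    def create(self, data_type):
        return self._converters[data_type]

def convert_recording(recorded_steps, factory):
    step_codes = [factory.create(step["type"])(step) for step in recorded_steps]
    return "\n".join(step_codes)   # splice step codes into the test code
```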
Specifically, in the embodiment of the present application, a comparison of the effects of case writing and case execution using the automated test code of the present application versus the conventional test technology is shown in fig. 16; it can be seen that, compared with the conventional test technology, the case test time using the automated test code of the embodiment of the application is obviously shortened; in addition, since the test code does not need to be written manually, the time for writing a case through the automated test code is also obviously shortened.
In some embodiments, the performance during use case testing is compared as follows:
1. manual testing:
- black box test: 5-6 min/case, no coding skills required;
- maintenance cost: 3-5 min/case, cases maintained manually;
- verification cost: depends on test time and the number of verifications;
- verification process: visual confirmation, mechanical labor;
2. traditional automated testing:
- writing and debugging: 30 min/case, coding skills required;
- maintenance time: 20 min/case, coding skills required;
- maintenance cost: depends on maintenance time and the number of service changes;
- verification cost: zero labor time cost, runs automatically;
- compatibility: needs to be adapted to different device models.
Traditional automated testing includes random testing with Monkey, which has no fixed path and automatically runs random paths, or automated case writing using a related test framework. Since the Monkey random test is random operation, although the labor cost of intervention is low, it cannot meet the requirements of daily automation scenarios; mainstream test frameworks can meet the requirements of custom paths and assertions, but the writing cost is high, and the cost of constructing test scenarios, such as preparing test data and mock data, is very high.
3. the terminal recording-and-playback test of the present application:
- recording cost: 6-8 min/case, no coding skills required;
- maintenance time: 3-8 min/case, no coding skills required;
- maintenance cost: depends on maintenance time and the number of service changes;
- verification cost: zero labor time cost, runs automatically;
- compatibility: supports cross-device playback after recording.
According to the technical scheme provided by the embodiment of the application, the embodiment of the application acquires the attribute information of the target application program corresponding to the test case; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors; determining a test scene corresponding to the test case based on the attribute information; therefore, scene information corresponding to the test case can be determined in time, and the requirements of various automatic test scenes are met; sending the test scene and the automated test code corresponding to the test case to a test terminal so that the test terminal executes the automated test code in the test scene, and playing back the target data in the code execution process to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case; therefore, automatic testing can be performed through automatic testing codes generated based on the recorded test cases, testing cost is reduced, and testing efficiency is improved.
An embodiment of the present application further provides an application program testing apparatus, as shown in fig. 17, the apparatus includes:
an attribute information obtaining module 1710, configured to obtain attribute information of a target application corresponding to a test case; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors;
a test scenario determining module 1720, configured to determine, based on the attribute information, a test scenario corresponding to the test case;
a code sending module 1730, configured to send the test scenario and the automated test code corresponding to the test case to a test terminal, so that the test terminal executes the automated test code in the test scenario, and in a code execution process, plays back the target data to obtain test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case.
In some embodiments, the apparatus may further comprise:
the target data acquisition module is used for acquiring target data;
the target data acquisition module may include:
the recording instruction sending unit is used for responding to a first operation instruction triggered based on the use case recording prompt control and sending a test use case recording instruction to the target terminal; the test case recording instruction carries identification information of the target application program;
and the target data receiving unit is used for receiving the target data sent by the target terminal, and the target data is data generated by the target terminal based on the test case recording instruction in the target application program corresponding to the identification information of the target application program.
In some embodiments, the apparatus may further comprise:
the step case information conversion module is used for converting the target data into a plurality of step case information based on the time sequence corresponding to each object operation behavior data;
the test case construction module is used for constructing a test case based on the step case information;
and the automatic test code generation module is used for generating automatic test codes based on the test cases.
In some embodiments, the target data further includes a plurality of object operation association data, each object operation association data corresponding to one object operation behavior data, and the apparatus may further include:
the step case information generating module is used for generating a plurality of step case information according to the time sequence corresponding to each object operation behavior data by the plurality of object operation behavior data;
in some embodiments, the step use case information generating module may include:
a step use case information generating unit, configured to generate a plurality of step use case information from the plurality of object operation behavior data and the corresponding plurality of object operation associated data based on a time sequence corresponding to each object operation behavior data; each step use case information is generated based on one object operation behavior data and the corresponding object operation associated data.
In some embodiments, the automated test code generation module may include:
a step class object determining unit, configured to perform serialization processing on the case information of each step to obtain a step class object;
the code conversion unit is used for converting each step class object into a code of a target grammar;
and the code generation unit is used for splicing the codes corresponding to the case information of each step to generate the automatic test codes.
In some embodiments, the step class object determining unit may include:
the first class object determining subunit is configured to perform serialization processing on the object operation behavior data in each step use case information to obtain a first class object;
a second class object determining subunit, configured to serialize the object operation associated data in each step use case information to obtain a second class object;
in some embodiments, the code translation unit may include:
the first code determining subunit is used for determining a first code corresponding to each first-class object based on the first mapping information; the first code is a code of the target grammar, and the first mapping information represents a mapping relation between a class object and the code;
the second code determines a subunit for translating each second class object into the second code of the target grammar.
In some embodiments, the apparatus may further comprise:
the annotation information determination module is used for determining annotation information of the first code based on the object operation behavior data;
the first updating code determining module is used for updating the first code based on the annotation information to obtain a first updating code;
in some embodiments, the automated test code generation module may include:
the object code generating unit is used for splicing the first updating code and the second code corresponding to each step case information to generate an object code corresponding to each step case information;
and the first code generation unit is used for splicing the target codes corresponding to the case information of each step to generate the automatic test codes.
In some embodiments, the apparatus may further comprise:
the case recording control display module is used for displaying a case recording control on a target page based on a preset plug-in;
the file name editing popup window display module is used for responding to a second operation instruction triggered based on the case recording control and displaying the file name editing popup window of the test case; the file name editing popup window comprises a file name editing control and a file name submitting control;
the file name display module is used for responding to a third operation instruction triggered based on the file name editing control and displaying the file name of the test case;
the file name determining module is used for responding to a fourth operation instruction triggered by the file name submitting control and determining the file name of the test case;
the case recording prompting page display module is used for displaying a case recording prompting page; the use case recording prompt page comprises the use case recording prompt control.
In some embodiments, the apparatus may further comprise:
the class template determining module is used for determining a class template corresponding to the test case based on the file name of the test case;
and the description information acquisition module is used for acquiring the description information of the test case.
In some embodiments, the automated test code generation module may include:
the code splicing unit is used for splicing the codes corresponding to the case information of each step to obtain spliced codes;
and the second code generation unit is used for generating the automatic test code in the class template based on the description information of the test case and the spliced code.
The device embodiments and method embodiments described in the present application are based on the same inventive concept.
The embodiment of the present application further provides an application program testing device, which includes a processor and a memory, where at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded and executed by the processor to implement the application program testing method provided by the method embodiments.
Embodiments of the present application further provide a computer storage medium, where the storage medium may be disposed in a terminal to store at least one instruction or at least one program for implementing an application program testing method in the method embodiments, and the at least one instruction or the at least one program is loaded and executed by the processor to implement the application program testing method provided in the method embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the computer device executes the application program testing method provided by the method embodiment.
Alternatively, in an embodiment of the present application, the storage medium may be located in at least one of a plurality of network servers of a computer network. Optionally, in this embodiment, the storage medium may include, but is not limited to: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
The memory according to the embodiments of the present application may be used to store software programs and modules, and the processor executes various functional applications and data processing by running the software programs and modules stored in the memory. The memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, application programs needed by functions, and the like, and the data storage area may store data created according to the use of the device, and the like. Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. Accordingly, the memory may also include a memory controller to provide the processor access to the memory.
The application program testing method provided by the embodiment of the present application can be executed in a mobile terminal, a computer terminal, a server, or a similar computing device. Taking running on a server as an example, fig. 18 is a hardware structure block diagram of a server for the application testing method provided in the embodiment of the present application. As shown in fig. 18, the server 1800 may vary considerably depending on configuration or performance, and may include one or more Central Processing Units (CPUs) 1810 (a central processing unit 1810 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 1830 for storing data, and one or more storage media 1820 (such as one or more mass storage devices) for storing applications 1823 or data 1822. The memory 1830 and the storage medium 1820 may be transitory or persistent storage. The program stored in the storage medium 1820 may include one or more modules, each of which may include a series of instruction operations on the server. Further, the central processor 1810 may be configured to communicate with the storage medium 1820 to execute a series of instruction operations in the storage medium 1820 on the server 1800. The server 1800 may also include one or more power supplies 1860, one or more wired or wireless network interfaces 1850, one or more input/output interfaces 1840, and/or one or more operating systems 1821, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The input/output interface 1840 may be used to receive or transmit data via a network. A specific example of the network described above may include a wireless network provided by the communication provider of the server 1800. In one example, the input/output interface 1840 includes a Network Interface Controller (NIC) that may be connected to other network devices via a base station to communicate with the Internet. In one example, the input/output interface 1840 may be a Radio Frequency (RF) module, which is used to communicate with the Internet wirelessly.
It will be understood by those skilled in the art that the structure shown in fig. 18 is merely an illustration and is not intended to limit the structure of the electronic device. For example, the server 1800 may also include more or fewer components than shown in FIG. 18, or have a different configuration than shown in FIG. 18.
As can be seen from the embodiments of the application program testing method, apparatus, device, or storage medium provided by the present application, the present application obtains attribute information of a target application program corresponding to a test case; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors; determining a test scene corresponding to the test case based on the attribute information; therefore, scene information corresponding to the test case can be determined in time, and the requirements of various automatic test scenes are met; sending the test scene and the automated test code corresponding to the test case to a test terminal so that the test terminal executes the automated test code in the test scene, and playing back the target data in the code execution process to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case; therefore, automatic testing can be performed through automatic testing codes generated based on the recorded test cases, testing cost is reduced, and testing efficiency is improved.
It should be noted that: the sequence of the embodiments of the present application is only for description, and does not represent the advantages and disadvantages of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present application are described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus, device, and storage medium embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer storage medium, and the above storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. An application testing method, the method comprising:
acquiring attribute information of a target application program corresponding to a test case; the test case is obtained by recording target data; the target data comprises a plurality of object operation behavior data generated according to the time sequence of the object operation behaviors;
determining a test scene corresponding to the test case based on the attribute information;
sending the test scene and the automated test code corresponding to the test case to a test terminal so that the test terminal executes the automated test code in the test scene, and playing back the target data in the code execution process to obtain the test data of the target application program; sending the test data and the target data to a server so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case.
2. The method of claim 1, further comprising acquiring the target data, wherein the acquiring the target data comprises:
in response to a first operation instruction triggered by a case recording prompt control, sending a test case recording instruction to a target terminal, the test case recording instruction carrying identification information of the target application program; and
receiving the target data sent by the target terminal, the target data being generated by the target terminal, based on the test case recording instruction, in the target application program corresponding to the identification information.
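The recording handshake of claim 2 might look like the sketch below. The wire format (`cmd`, `app_id`) and both function names are assumptions for illustration; the patent specifies only that the instruction carries the application's identification information.

```python
import json

def build_recording_instruction(app_id):
    # Hypothetical wire format: the test case recording instruction
    # carries the target application's identification information (claim 2).
    return json.dumps({"cmd": "start_case_recording", "app_id": app_id})

def handle_recording_instruction(instruction, observed_events):
    # Stand-in for the target terminal: on receiving the instruction it
    # records object operation behavior data inside the identified
    # application and returns it as the target data, ordered by time.
    msg = json.loads(instruction)
    return {"app_id": msg["app_id"],
            "target_data": sorted(observed_events, key=lambda e: e["ts"])}
```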
3. The method of claim 1, further comprising:
converting the target data into a plurality of pieces of step use case information based on the time sequence corresponding to each piece of object operation behavior data;
constructing the test case based on the step use case information; and
generating the automated test code based on the test case.
4. The method of any of claims 1-3, wherein the target data further comprises a plurality of pieces of object operation associated data, each piece corresponding to one piece of object operation behavior data, and the method further comprises:
generating a plurality of pieces of step use case information from the plurality of pieces of object operation behavior data according to the time sequence corresponding to each piece of object operation behavior data;
wherein the generating comprises generating the plurality of pieces of step use case information from the plurality of pieces of object operation behavior data and the corresponding pieces of object operation associated data, based on the time sequence corresponding to each piece of object operation behavior data, each piece of step use case information being generated from one piece of object operation behavior data and the object operation associated data corresponding to it.
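The one-to-one pairing in claim 4 can be sketched as below, assuming positionally aligned lists and a page-context example of "associated data" (both assumptions for illustration):

```python
def merge_step_infos(behavior_data, associated_data):
    # Claim 4: the i-th piece of object operation associated data (here, an
    # assumed page context) corresponds to the i-th behavior datum; each
    # piece of step use case information combines one such pair, ordered
    # by the behavior data's time sequence.
    paired = sorted(zip(behavior_data, associated_data), key=lambda p: p[0]["ts"])
    return [{"action": b["action"], "context": a} for b, a in paired]
```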
5. The method of claim 3, wherein the generating the automated test code based on the test case comprises:
serializing each piece of step use case information to obtain step class objects;
converting each step class object into code in a target syntax; and
splicing the codes corresponding to the pieces of step use case information to generate the automated test code.
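The serialize-convert-splice chain of claim 5, as a sketch. The `StepObject` fields and the Python-call target syntax are assumptions; the patent leaves the concrete class shape and target syntax open.

```python
from dataclasses import dataclass

@dataclass
class StepObject:
    # A "step class object" obtained by serializing one piece of
    # step use case information (field names are illustrative).
    action: str
    target: str

def to_code(step):
    # Convert one step class object into code in the target syntax;
    # a Python-call syntax is assumed here for illustration.
    return "driver.%s(%r)" % (step.action, step.target)

def splice(step_objects):
    # Claim 5: splice the per-step fragments into the automated test code.
    return "\n".join(to_code(s) for s in step_objects)
```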
6. The method of claim 5, wherein the serializing each piece of step use case information to obtain step class objects comprises:
serializing the object operation behavior data in each piece of step use case information to obtain first class objects; and
serializing the object operation associated data in each piece of step use case information to obtain second class objects;
and wherein the converting each step class object into code in the target syntax comprises:
determining a first code corresponding to each first class object based on first mapping information, the first code being code in the target syntax and the first mapping information representing a mapping relationship between class objects and codes; and
converting each second class object into second code in the target syntax.
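One way to realize the "first mapping information" of claim 6 is a lookup table from the kind of first class object to a code template, with associated data converted separately into assertion-style second code. The table contents, field names, and output syntax are assumptions for illustration:

```python
# Hypothetical "first mapping information": kind of first class object
# -> code template in the target syntax (claim 6).
FIRST_MAPPING = {
    "click": "driver.click({target!r})",
    "input": "driver.type({target!r}, {text!r})",
}

def first_code(first_object):
    # Look up the code for a first class object via the mapping information.
    return FIRST_MAPPING[first_object["kind"]].format(**first_object)

def second_code(second_object):
    # Convert a second class object (serialized associated data, e.g. an
    # assumed expected page state) into code in the same target syntax.
    return "assert driver.page() == %r" % second_object["page"]
```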
7. The method of claim 6, further comprising, after the determining the first code corresponding to each first class object based on the first mapping information:
determining annotation information of the first code based on the object operation behavior data; and
updating the first code based on the annotation information to obtain first updated code;
wherein the splicing the codes corresponding to the pieces of step use case information to generate the automated test code comprises:
splicing the first updated code and the second code corresponding to each piece of step use case information to generate a target code for each piece of step use case information; and
splicing the target codes corresponding to the pieces of step use case information to generate the automated test code.
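The annotation and two-level splicing of claim 7 can be sketched as follows; rendering the annotation as a comment line, and the `desc` field, are illustrative assumptions:

```python
def annotate(code, behavior):
    # Claim 7: derive annotation information from the behavior data and
    # update the first code with it (rendered as a comment line here).
    return "# %s\n%s" % (behavior["desc"], code)

def splice_step(first_updated, second):
    # Target code for one step = first updated code + second code.
    return first_updated + "\n" + second

def splice_all(target_codes):
    # Splice the per-step target codes into the automated test code.
    return "\n".join(target_codes)
```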
8. The method of claim 2, further comprising, before the sending the test case recording instruction to the target terminal in response to the first operation instruction triggered by the case recording prompt control:
displaying a case recording control on a target page based on a preset plug-in;
in response to a second operation instruction triggered by the case recording control, displaying a file name editing popup of the test case, the popup comprising a file name editing control and a file name submitting control;
in response to a third operation instruction triggered by the file name editing control, displaying the file name of the test case;
in response to a fourth operation instruction triggered by the file name submitting control, determining the file name of the test case; and
displaying a case recording prompt page, the page comprising the case recording prompt control.
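The control flow of claim 8 is essentially a small state machine over the UI controls; a hypothetical sketch (states, names, and transitions are all assumptions for illustration):

```python
class RecordingUI:
    # Hypothetical state machine for the claim-8 interaction flow:
    # record control -> file-name popup -> edit -> submit -> prompt page.
    def __init__(self):
        self.state = "target_page"   # case recording control shown via plug-in
        self.file_name = None

    def click_record_control(self):  # second operation instruction
        assert self.state == "target_page"
        self.state = "name_popup"    # popup with edit + submit controls

    def edit_file_name(self, name):  # third operation instruction
        assert self.state == "name_popup"
        self.file_name = name

    def submit_file_name(self):      # fourth operation instruction
        assert self.state == "name_popup" and self.file_name
        self.state = "prompt_page"   # page with the recording prompt control
```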
9. The method of claim 5, further comprising, before the splicing the codes corresponding to the pieces of step use case information to generate the automated test code:
determining a class template corresponding to the test case based on the file name of the test case; and
acquiring description information of the test case;
wherein the splicing comprises:
splicing the codes corresponding to the pieces of step use case information to obtain a spliced code; and
generating the automated test code in the class template based on the description information of the test case and the spliced code.
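The template step of claim 9 can be sketched as filling a class skeleton with a name derived from the file name, the case description, and the spliced code. The template text and naming rule are assumptions for illustration:

```python
# Hypothetical class template keyed off the test case's file name (claim 9).
CLASS_TEMPLATE = '''class {class_name}:
    """{description}"""
    def run(self, driver):
{body}
'''

def render_case(file_name, description, spliced_code):
    # Place the case description and the spliced per-step code into the
    # class template to produce the final automated test code.
    class_name = "".join(part.title() for part in file_name.split("_"))
    body = "\n".join("        " + line for line in spliced_code.splitlines())
    return CLASS_TEMPLATE.format(class_name=class_name,
                                 description=description, body=body)
```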
10. An application program testing apparatus, the apparatus comprising:
an attribute information acquisition module, configured to acquire attribute information of a target application program corresponding to a test case, the test case being obtained by recording target data, and the target data comprising a plurality of pieces of object operation behavior data generated in the time sequence of the object operations;
a test scene determining module, configured to determine a test scene corresponding to the test case based on the attribute information; and
a code sending module, configured to send the test scene and the automated test code corresponding to the test case to a test terminal, so that the test terminal executes the automated test code in the test scene, plays back the target data during code execution to obtain test data of the target application program, and sends the test data and the target data to a server, so that the server obtains a test result of the target application program based on the test data and the target data; wherein the automated test code is generated based on the test case.
CN202111055191.8A 2021-09-09 2021-09-09 Application program testing method and device Active CN113505082B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111055191.8A CN113505082B (en) 2021-09-09 2021-09-09 Application program testing method and device


Publications (2)

Publication Number Publication Date
CN113505082A true CN113505082A (en) 2021-10-15
CN113505082B CN113505082B (en) 2021-12-14

Family

ID=78016855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111055191.8A Active CN113505082B (en) 2021-09-09 2021-09-09 Application program testing method and device

Country Status (1)

Country Link
CN (1) CN113505082B (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040025083A1 (en) * 2002-07-31 2004-02-05 Murthi Nanja Generating test code for software
CN101867501A (en) * 2010-05-25 2010-10-20 北京宜富泰网络测试实验室有限公司 Method and system for automatically testing consistence of SNMP (Simple Network Management Protocol) interface information model
CN102176200A (en) * 2009-09-25 2011-09-07 南京航空航天大学 Software test case automatic generating method
CN103984628A (en) * 2014-05-15 2014-08-13 中国南方航空股份有限公司 Automatic function test method and system applied to BS (browser/server) framework
CN106055481A (en) * 2016-06-02 2016-10-26 腾讯科技(深圳)有限公司 Computer program test method and device
CN108062276A (en) * 2017-12-19 2018-05-22 北京小度信息科技有限公司 The generation method and device of test case and the generation method and device of test report
CN110716849A (en) * 2018-07-11 2020-01-21 亿度慧达教育科技(北京)有限公司 Method and device for recording test cases of application programs
CN111078548A (en) * 2019-12-06 2020-04-28 上海励驰半导体有限公司 Test case analysis method and device, storage medium and verification platform
CN112148579A (en) * 2019-06-26 2020-12-29 腾讯科技(深圳)有限公司 User interface testing method and device
CN112463605A (en) * 2020-11-26 2021-03-09 杭州网易云音乐科技有限公司 Automatic testing method and device, storage medium and electronic equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG XINYU: "DRF-Serializer Serialization", CSDN: https://blog.csdn.net/qq_35876972/article/details/104826676 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114064475A (en) * 2021-11-11 2022-02-18 中国联合网络通信集团有限公司 Cloud native application testing method, device, equipment and storage medium
WO2023098077A1 (en) * 2021-12-01 2023-06-08 中国电信股份有限公司 Network intent processing method and related device
CN115941834A (en) * 2022-11-25 2023-04-07 深圳心启科技有限公司 Automatic operation method, device and equipment for smart phone and storage medium
CN115941834B (en) * 2022-11-25 2024-04-02 深圳朴数智能科技有限公司 Automatic operation method, device, equipment and storage medium of smart phone


Similar Documents

Publication Publication Date Title
CN113505082B (en) Application program testing method and device
US9400784B2 (en) Integrated application localization
US11126938B2 (en) Targeted data element detection for crowd sourced projects with machine learning
US11237948B2 (en) Rendering engine component abstraction system
CN111708528A (en) Method, device and equipment for generating small program and storage medium
CN110955409B (en) Method and device for creating resources on cloud platform
JP7280388B2 (en) Apparatus and method, equipment and medium for implementing a customized artificial intelligence production line
CN115658529A (en) Automatic testing method for user page and related equipment
CN113778897A (en) Automatic test method, device, equipment and storage medium of interface
CN111078529B (en) Client writing module testing method and device and electronic equipment
CN111158648B (en) Interactive help system development method based on live-action semantic understanding and platform thereof
US8000952B2 (en) Method and system for generating multiple path application simulations
CN112446948A (en) Virtual reality courseware processing method and device, electronic equipment and storage medium
CN113296759B (en) User interface processing method, user interface processing system, device and storage medium
JP7029557B1 (en) Judgment device, judgment method and judgment program
CN112346736B (en) Data processing method and system
CN115113850A (en) Cross-platform application construction and operation method, server, terminal and system
CN112860587A (en) UI automatic test method and device
KR101987183B1 (en) Apparatus, method and server for app development supporting front-end development using app development utility
CN104503992A (en) Question bank construction method
CN111352637B (en) Method, device and equipment for deploying machine learning system
CN113836037B (en) Interface interaction testing method, device, equipment and storage medium
JP7319516B2 (en) Program, information processing device, and control method thereof
JP5683209B2 (en) Client computer with automatic document generation function
CN113742240A (en) User interface testing method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant