CN117370141A - Test report generation method and device, electronic equipment and storage medium - Google Patents

Test report generation method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN117370141A
CN117370141A CN202210761460.0A
Authority
CN
China
Prior art keywords
test
test result
result
preset
error code
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210761460.0A
Other languages
Chinese (zh)
Inventor
何若溥
李文宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
If Technology Co Ltd
Original Assignee
If Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by If Technology Co Ltd filed Critical If Technology Co Ltd
Priority to CN202210761460.0A priority Critical patent/CN117370141A/en
Publication of CN117370141A publication Critical patent/CN117370141A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides a test report generation method, a device, electronic equipment and a storage medium, wherein the method comprises the following steps: acquiring requirement information of a test item; performing an iterative operation, the iterative operation comprising: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct based on at least one preset assertion statement, and if not, replacing one sub-demand in the demand information until the test result is correct; and generating the test report according to the test scheme and the test result. According to the method and the device, the test result is checked in a loop against the preset assertions until it is correct, and automatic generation of the test report is thereby achieved.

Description

Test report generation method and device, electronic equipment and storage medium
Technical Field
The invention relates to the field of automated testing, in particular to a test report generation method, a device, electronic equipment and a storage medium.
Background
Traditional software test reports are provided as documents or in a defect management tool such as ZenTao; their customization capability is poor, and the displayed content is not necessarily what is needed. To respond to the company's overall digital transformation strategy, the test report is optimized by combining a digital dashboard with parameterized charts, which, compared with the traditional test report form, is more readable and easier to manage.
In addition, even though chart data and non-chart data are already combined in the industry to generate test reports, most such reports only show the results of a single test type (such as functional testing, interface testing, UI automation testing, non-functional testing, performance testing, special testing and the like). Multiple types of test reports then need to be manually summarized and written up before the project goes live; if the project is a large one, the test type coverage and the report content are very complex, a large amount of repeated work is inevitable, efficiency is low, and the prior art cannot generate complex automated test reports.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a test report generation method and device, electronic equipment and a storage medium, which aim to solve the problems that automatic generation of test reports is currently inefficient and that the prior art cannot generate complex automated test reports.
In order to solve the technical problems, the invention provides the following technical scheme:
an embodiment of one aspect of the present invention provides a test report generating method, including:
acquiring requirement information of a test item;
performing an iterative operation, the iterative operation comprising: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct or not based on at least one preset assertion statement, and if not, replacing one sub-demand in the demand information until the test result is correct;
and generating the test report according to the test scheme and the test result, so as to assist testers in testing.
In a preferred embodiment, the preset assertion statement includes a feature character that normalizes a test result, and the determining whether the test result is correct based on at least one preset assertion includes:
and determining whether the characteristic characters in the test result are consistent with the characteristic characters in the preset assertion statement, and if not, determining that the test result is incorrect.
In a preferred embodiment, there are a plurality of preset assertion statements, each preset assertion statement includes a feature character that normalizes a test result, and the determining whether the test result is correct based on at least one preset assertion includes:
and determining whether more than half of all the characteristic characters in the test result are consistent with the characteristic characters in the preset assertion statement, and if not, determining that the test result is incorrect.
In a preferred embodiment, the requirement information includes: test targets, test contents, test constraints, test phases, test strategies, defect level definitions, test admission criteria, test passing criteria, test suspension principles and resume test conditions; the test scheme comprises a test result display framework, a global test environment and a test flow document;
the generating a test scheme according to the requirement information comprises the following steps:
generating a test flow document according to the test target, the test content, the test constraint, the test stage, the test strategy, the defect level definition, the test admission standard, the test passing standard, the test suspension principle and the resume test condition;
generating the type of the test result according to the test data and the test flow document;
and determining a test result display architecture and configuring a global test environment based on the type of the test result.
In a preferred embodiment, the test result comprises an error code, the method further comprising:
and aiming at the whole test process, determining whether the test result is abnormal or not according to the occurrence time points of all error codes in the test result and combining the preset error code quantity trend.
In a preferred embodiment, the assertion statement is configured to: if T1 is greater than T2, X1 is less than X2, wherein T1 and T2 are both time points, X1 is the number of error codes in the error code set corresponding to T1, and X2 is the number of error codes in the error code set corresponding to T2;
determining whether the test result is abnormal according to the occurrence time points of all error codes in the test result and by combining a preset error code quantity trend, wherein the determining comprises the following steps:
classifying all error codes according to time points to obtain an error code set corresponding to each time point;
and determining the number of error codes in each error code set according to the sequence of the time points, judging whether each error code set meets the assertion statement, and if not, determining that the test result is abnormal.
In a preferred embodiment, the assertion statement is configured to: if the right end point of the time axis interval of T1 is greater than the right end point of the time axis interval of T2, or if the left end point of the time axis interval of T1 is greater than the left end point of the time axis interval of T2, X1 is less than X2, wherein T1 and T2 are both time intervals, the interval length of each time interval is the same, X1 is the number of error codes in the error code set corresponding to T1, and X2 is the number of error codes in the error code set corresponding to T2;
determining whether the test result is abnormal according to the occurrence time points of all error codes in the test result and by combining a preset error code quantity trend, wherein the determining comprises the following steps:
classifying all error codes according to time points to obtain an error code set corresponding to each time point;
and determining the number of error codes in each error code set according to the sequence of the time points, judging whether each error code set meets the assertion statement, and if not, determining that the test result is abnormal.
An embodiment of a second aspect of the present application provides a test report generating device, including:
the acquisition module is used for acquiring the requirement information of the test item;
the iteration operation module is used for executing iteration operation, and the iteration operation comprises the following steps: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct or not based on at least one preset assertion, and if not, replacing one sub-demand in the demand information until the test result is correct;
and the report generation module is used for generating the test report according to the test scheme and the test result, so as to assist testers in testing.
In yet another aspect of the present invention, an electronic device is provided that includes a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the test report generating method when executing the program.
In yet another aspect of the present invention, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements a test report generating method.
According to the technical scheme, the test report generation method and device, electronic equipment and storage medium provided by the invention check the test result in a loop against the preset assertions until the test result is correct, and thereby achieve the purpose of automatic test report generation.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a test report generating method according to an embodiment of the present invention.
Fig. 2 is a first flowchart of specific steps of a test report generating method according to an embodiment of the present invention.
Fig. 3 is a second flowchart of specific steps of a test report generating method according to an embodiment of the present invention.
Fig. 4 is a third flowchart of specific steps of a test report generating method according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of a test report generating apparatus.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
At present, chart data and non-chart data are commonly combined in the industry to generate test reports, but most such reports only show the results of a single test type (such as functional testing, interface testing, UI automation testing, non-functional testing, performance testing, special testing and the like), and multiple types of test reports need to be manually summarized and written up before the project goes live; if the project is a large one, the test type coverage and the report content are very complex, a large amount of repeated work is inevitable, and efficiency is low.
In view of this, the present application provides a technical scheme that checks the test result in a loop against preset assertions and automatically generates the test report, so as to adapt to the requirements of different projects and test types.
As shown in fig. 1, an embodiment of the present invention provides a test report generating method, including:
S1: acquiring requirement information of a test item;
S2: performing an iterative operation, the iterative operation comprising: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct or not based on at least one preset assertion statement, and if not, replacing one sub-demand in the demand information until the test result is correct;
S3: generating the test report according to the test scheme and the test result, so as to assist testers in testing.
According to the test report generation method, the test result is checked in a loop against the preset assertions until the test result is correct, and the purpose of automatic test report generation is thereby achieved.
It will be appreciated that the testing in the present application is directed primarily at software testing, which is a process used to verify the correctness, integrity, security and quality of software; it is an audit or comparison between actual and expected output. The classical definition of software testing is: running the program under specified conditions to find program errors, measure software quality, and evaluate whether the software meets the design requirements.
The software test report is one of the important deliverables; it aims to record the test process and results, analyze the problems and defects found, provide a basis for correcting software quality problems and for software acceptance, and can also explicitly present risks and preventive measures, which is of some significance for quality retrospectives.
In this embodiment of the present application, the requirement information is the requirements for testing, for example, a test target, test content, test constraints, test stages, a test policy, defect level definitions, test admission criteria, test passing criteria, test suspension principles, resume-test conditions and the like; test documents (test reference documents, test delivery documents, test cases, test data and the like) are written accordingly.
Further, an assertion is a piece of first-order logic within a program (e.g., a logical predicate that evaluates to true or false), intended to express that the statement should hold when the program executes to the location of the assertion, thereby verifying the result expected by the software developer. If the assertion does not hold, the program aborts execution and gives an error message.
In the embodiment of the application, the automatically generated test result is judged by combining the preset assertion statement, and if the result is found to be incorrect, the test result is regenerated by replacing part of input, so that under the condition of continuous iteration, correct demand information is easily found, and a test report is automatically generated.
In this embodiment of the present application, the requirement information includes a plurality of sub-requirements, for example, a test target, a test content, a test constraint, a test stage, a test policy, a defect level definition, a test admission standard, a test passing standard, a test suspension principle, a resume test condition, and the like may be defined as sub-requirements.
The embodiments of the present application are described in detail below.
In a preferred embodiment, the preset assertion statement includes a feature character that normalizes a test result, and the determining whether the test result is correct based on at least one preset assertion includes:
and determining whether the characteristic characters in the test result are consistent with the characteristic characters in the preset assertion statement, and if not, determining that the test result is incorrect.
In a preferred embodiment, there are a plurality of preset assertion statements, each preset assertion statement includes a feature character that normalizes a test result, and the determining whether the test result is correct based on at least one preset assertion includes:
and determining whether more than half of all the characteristic characters in the test result are consistent with the characteristic characters in the preset assertion statement, and if not, determining that the test result is incorrect.
In the above two preferred embodiments, the test result is anchored and measured by the feature characters it contains. In practical applications, certain specific requirements necessarily produce certain feature characters. For example, for software A, its speed must be measured during the test, so a feature character such as "speed" should exist in the test result; if the character "speed" cannot be found in the test result, the test result is incorrect, and a sub-requirement needs to be replaced and the test rerun until the character is found. Clearly, for a complex test, more feature characters allow a more accurate determination of whether the test result is correct.
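As an illustration only, the following minimal Python sketch (the function names and the "speed" example are assumptions, not part of this application) shows how the feature-character matching and the more-than-half rule described above could be checked:
def check_single_assertion(test_result, feature_char):
    # The test result is correct only if the feature character required by
    # the preset assertion statement appears in it.
    return feature_char in test_result

def check_multiple_assertions(test_result, feature_chars):
    # With several preset assertion statements, the result is treated as correct
    # only when more than half of the required feature characters are matched.
    matched = sum(1 for ch in feature_chars if ch in test_result)
    return matched > len(feature_chars) / 2

result = "average speed: 35 ms, memory: 120 MB"
print(check_single_assertion(result, "speed"))                           # True
print(check_multiple_assertions(result, ["speed", "memory", "crash"]))   # True (2 of 3 matched)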
Further, in a preferred embodiment, the requirement information includes: test targets, test contents, test constraints, test phases, test strategies, defect level definitions, test admission criteria, test passing criteria, test suspension principles and resume test conditions; the test scheme comprises a test result display framework, a global test environment and a test flow document;
In this embodiment, as shown in fig. 2, the generating a test scheme according to the requirement information in step S2 includes:
S21: generating a test flow document according to the test target, the test content, the test constraint, the test stage, the test strategy, the defect level definition, the test admission standard, the test passing standard, the test suspension principle and the resume test condition;
S22: generating the type of the test result according to the test data and the test flow document;
S23: determining a test result display architecture and configuring a global test environment based on the type of the test result.
Specifically, a concrete example is given below:
1. Overall evaluation of project requirements
After a product has been sorted out by the product manager, the product value is evaluated, the product requirements are split, and the required resources are applied for. After the project is approved, the R&D department is organized to hold a requirement review meeting, at which the testers evaluate the overall requirements of the product prototype and the PRD requirement document from the perspectives of the product, development and the user respectively; these may include: core function coverage, performance requirements, compatibility requirements, usability requirements, accurate user tracking points (buried points) and other types of product requirements. After the review meeting, the development engineers produce the technical design, including the software architecture, interface design, database table design and the like. Products are divided by form into B-end and C-end products such as Web, App and PC, and by attribute into 2B (facing enterprises), 2C (facing consumers), 2G (facing governments) and 2S (facing themselves); the test group therefore needs to design test schemes covering different test types, and accordingly the test reports also have different contents and different points of emphasis.
2. Test plan overall design
According to the requirements and the research and development technical design, the test requirements (test targets, test contents, test constraints, test stages, test strategies, defect level definitions, test admission criteria, test passing criteria, test suspension principles, resume-test conditions and the like) are defined, the test documents (test reference documents, test delivery documents, test cases, test data and the like) are written, and the test resources (testers, test equipment, test tools, test plans and the like) are defined, so as to guide the test engineers to conduct standardized and well-regulated tests.
3. Report structure template design
The content to be displayed in the report and its display form (pie chart, donut chart, bar chart, line chart, radar chart, heat map, liquid-fill chart, etc.) are specified explicitly, and the content is structured: for example, a functional chart should not be followed by a performance chart and then another functional chart; that is, charts of different types are kept separate so that the report content is clearly laid out.
4. Configuring a global test environment
Software development generally has four environments: the development environment, the test environment, the pre-production environment and the production environment.
The test environment refers to a description of the software and hardware environment on which the test runs, as well as any other software that interacts with the software under test, including drivers and stubs. It is the collective term for the computer hardware, software, network devices and historical data necessary to complete the software testing work. Test data and test results are closely related to the test activity; the global environment is a stable and controllable test environment in which testers can complete test case execution in less time, need not spend extra time maintaining test cases and the test process, and can ensure that every submitted defect can be accurately reproduced at any time.
5. Configure the test end button
This button may be a code script or a front-end button associated with the relevant interface. Its function is to issue an instruction to acquire the test result: the code automatically acquires the buried-point (instrumentation) data of the test result during the test, formats the result, and records it in a result file.
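As a minimal illustrative sketch only (the file name result.json and the record fields are assumptions), the instruction issued by the test end button could be implemented as follows:
import json
import time

def collect_and_record(buried_point_data, result_file="result.json"):
    # Format the buried-point (instrumentation) data of the test result
    # and append it as one record per line in the result file.
    record = {
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
        "data": buried_point_data,
    }
    with open(result_file, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

collect_and_record({"case": "login", "status": "pass", "elapsed_ms": 182})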
6. Obtaining result data
There is no strong requirement on the language; Java or Python may be used, depending on the tester's technical stack. In a specific implementation, a config.ini or .conf file can be created for the different test types. The approach is to read the ini file, which can be done with Python's configparser library, or to read the configuration from a YAML file.
Example codes are as follows:
from configparser import ConfigParser
import os

conn = ConfigParser()
# Get the path of the config.ini file
file_path = os.path.join(os.path.abspath('.'), 'config.ini')
conn.read(file_path)
# Read each configured value by section and option
url = conn.get('api', 'url')
method = conn.get('api', 'method')
port = conn.get('mysql', 'port')
print(url, method, port)
After running, the following values are obtained, showing that the read succeeded:
www.txxxxx.com get 3306
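If a YAML file is used instead, a minimal sketch (assuming PyYAML is installed and a config.yaml file with the same sections exists) is:
import yaml

# Read the same configuration keys from a YAML file instead of an ini file
with open("config.yaml", encoding="utf-8") as f:
    cfg = yaml.safe_load(f)

url = cfg["api"]["url"]
method = cfg["api"]["method"]
port = cfg["mysql"]["port"]
print(url, method, port)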
7. Intelligent analysis of test results
Assertions are added, in assert form, to the test results whose values are required to be standardized; if-based flow control is added, and the operation after the judgment is specified, i.e., whether the result is re-acquired or the test ends.
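By way of illustration (the field name "status" and the return values are assumptions, not part of this application), an assert combined with if flow control could look like this:
def analyse_result(result):
    try:
        # Assertion on the field whose value is required to be standardized
        assert result.get("status") == "pass", "test result is not correct"
    except AssertionError:
        # Flow control after the judgment: notify and re-acquire the result
        print("assertion failed, the test result will be re-acquired")
        return "reacquire"
    print("assertion passed, the test ends")
    return "end"

print(analyse_result({"status": "pass"}))   # end
print(analyse_result({"status": "fail"}))   # reacquire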
8. Automatic generation of test reports
This step is combined with automated testing and the intelligent quality platform: when the final development code is committed and merged, a build is triggered that runs the task of automatically generating the test report, thereby realizing automatic generation of the test report.
Further, the layout of the report, i.e., the test result display architecture, may be a bar graph, etc.
Exemplary drawing of a bar chart: the bar chart is suitable for statistics on the number of bugs, the amount of test case data, and the number of requirements per module.
Example: the report data is the bug distribution by module; there are 6 modules, and the corresponding bug counts are 16, 20, 25, 11, 19 and 13 respectively.
The bar chart code is as follows:
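The referenced code is not reproduced in the text above; the following is a minimal sketch (assuming matplotlib is used, which is not stated in the original) that plots the example data:
import matplotlib.pyplot as plt

modules = ["Module 1", "Module 2", "Module 3", "Module 4", "Module 5", "Module 6"]
bug_counts = [16, 20, 25, 11, 19, 13]

# Draw one bar per module showing its bug count
plt.bar(modules, bug_counts, color="steelblue")
plt.xlabel("Module")
plt.ylabel("Number of bugs")
plt.title("Bug distribution by module")
plt.tight_layout()
plt.savefig("bug_distribution.png")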
exemplary, examples: the test is performed for three times, the first round of test executes 200 cases, and 100 cases are passed; the second round of test performs case200, passing 150; the third round of testing performed case200, passing 200.
The code is as follows:
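Again the referenced code is not reproduced in the text; a minimal sketch (assuming matplotlib, an assumption) comparing executed and passed cases per round is:
import matplotlib.pyplot as plt
import numpy as np

rounds = ["Round 1", "Round 2", "Round 3"]
executed = [200, 200, 200]
passed = [100, 150, 200]

# Grouped bars: executed vs. passed cases for each test round
x = np.arange(len(rounds))
width = 0.35
plt.bar(x - width / 2, executed, width, label="Executed cases")
plt.bar(x + width / 2, passed, width, label="Passed cases")
plt.xticks(x, rounds)
plt.ylabel("Number of cases")
plt.title("Case execution and pass counts per test round")
plt.legend()
plt.tight_layout()
plt.savefig("round_results.png")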
In a preferred embodiment, the test result comprises an error code, the method further comprising:
and aiming at the whole test process, determining whether the test result is abnormal or not according to the occurrence time points of all error codes in the test result and combining the preset error code quantity trend.
For a detailed description, please refer to fig. 3 and fig. 4. In a preferred embodiment, the assertion statement is configured to: if T1 is greater than T2, X1 is less than X2, wherein T1 and T2 are both time points, X1 is the number of error codes in the error code set corresponding to T1, and X2 is the number of error codes in the error code set corresponding to T2;
determining whether the test result is abnormal according to the occurrence time points of all error codes in the test result and by combining a preset error code quantity trend, wherein the determining comprises the following steps:
S221: classifying all error codes according to time points to obtain an error code set corresponding to each time point;
S222: determining the number of error codes in each error code set according to the sequence of the time points, judging whether each error code set meets the assertion statement, and if not, determining that the test result is abnormal.
In a preferred embodiment, the assertion statement is configured to: if the right end point of the time axis interval of T1 is greater than the right end point of the time axis interval of T2, or if the left end point of the time axis interval of T1 is greater than the left end point of the time axis interval of T2, X1 is less than X2, wherein T1 and T2 are both time intervals, the interval length of each time interval is the same, X1 is the number of error codes in the error code set corresponding to T1, and X2 is the number of error codes in the error code set corresponding to T2;
determining whether the test result is abnormal according to the occurrence time points of all error codes in the test result and by combining a preset error code quantity trend, wherein the determining comprises the following steps:
S231: classifying all error codes according to time points to obtain an error code set corresponding to each time point;
S232: determining the number of error codes in each error code set according to the sequence of the time points, judging whether each error code set meets the assertion statement, and if not, determining that the test result is abnormal.
The two embodiments described above are further preferred embodiments of the present application that combine preset assertion statements with the number of error codes in the test report, i.e., with the expected law of the bug count: as the test proceeds, the number of bugs should show a decreasing trend. The bug count is thereby used to judge whether the test result is abnormal.
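As an illustrative sketch only (the function name and the input format are assumptions), the configured statement "if T1 is greater than T2, X1 is less than X2" over time points could be checked as follows:
from collections import defaultdict

def check_error_code_trend(error_codes):
    # error_codes: a list of (error_code, time_point) pairs from the test result
    groups = defaultdict(list)
    for code, t in error_codes:
        groups[t].append(code)
    counts = [len(groups[t]) for t in sorted(groups)]
    # Each later error code set must contain strictly fewer error codes than
    # the earlier one; otherwise the test result is considered abnormal.
    for earlier, later in zip(counts, counts[1:]):
        if later >= earlier:
            return False
    return True

codes = [("E01", 1), ("E02", 1), ("E03", 1), ("E01", 2), ("E04", 2), ("E02", 3)]
print(check_error_code_trend(codes))  # True: the counts 3, 2, 1 decrease over time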
Illustratively, a line graph may be used to represent the trend of bug counts, performance, and the like.
The example codes are as follows:
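The referenced example code is not reproduced in the text; a minimal sketch (assuming matplotlib, with illustrative data) drawing the bug trend as a line graph is:
import matplotlib.pyplot as plt

days = ["Day 1", "Day 2", "Day 3", "Day 4", "Day 5"]
new_bugs = [25, 19, 14, 8, 3]

# Plot the number of new bugs per test day to show the decreasing trend
plt.plot(days, new_bugs, marker="o")
plt.xlabel("Test day")
plt.ylabel("Number of new bugs")
plt.title("Bug trend during the test")
plt.tight_layout()
plt.savefig("bug_trend.png")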
Furthermore, the data obtained by the assertion can be corrected based on actual conditions. For example, it may be found that fewer resources were invested in the early stage and more in the later stage, so that more bugs are found later; or the personnel responsible for early testing may lack testing capability and experience, so that many bugs were missed; or a change to a certain module during the test may cause a large number of bugs and require a rollback operation. In such cases, the abnormality detected by the assertion can be excluded in combination with the above.
The application provides a test report generating device at a software level, as shown in fig. 5, including:
the acquisition module 1 acquires the requirement information of the test item;
an iterative operation module 2, which performs iterative operations including: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct or not based on at least one preset assertion, and if not, replacing one sub-demand in the demand information until the test result is correct;
and the report generation module 3 is used for generating the test report according to the test scheme and the test result, so as to assist testers in testing.
According to the test report generating device, through the configured acquisition module, iterative operation module and report generation module, the test result is checked in a loop against the preset assertions until the test result is correct, and the purpose of automatic test report generation is thereby achieved.
In terms of hardware level, in order to provide an embodiment of an electronic device for implementing all or part of the content in the test report generating method, the electronic device specifically includes the following contents:
a processor (processor), a memory (memory), a communication interface (Communications Interface), and a bus; the processor, the memory and the communication interface complete communication with each other through the bus; the communication interface is used for realizing information transmission among the server, the device, the distributed message middleware cluster device, various databases, user terminals and other related equipment; the electronic device may be a desktop computer, a tablet computer, a mobile terminal, etc., and the embodiment is not limited thereto. In this embodiment, the electronic device may refer to an embodiment of the test report generating method in the embodiment and an embodiment of the test report generating apparatus, and the contents thereof are incorporated herein, and the repetition is omitted.
Fig. 6 is a schematic block diagram of a system configuration of an electronic device 9600 according to an embodiment of the present invention. As shown in fig. 6, the electronic device 9600 may include a central processor 9100 and a memory 9140; the memory 9140 is coupled to the central processor 9100. Notably, this fig. 6 is exemplary; other types of structures may also be used in addition to or in place of the structures to implement telecommunications functions or other functions.
In one embodiment, the test report generating functionality may be integrated into the central processor 9100.
In another embodiment, the test report generating device may be configured separately from the central processor 9100, for example, the test report generating device may be configured as a chip connected to the central processor 9100, and the test report generating function is implemented by control of the central processor.
As shown in fig. 6, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 need not include all of the components shown in fig. 6; in addition, the electronic device 9600 may further include components not shown in fig. 6, and reference may be made to the related art.
As shown in fig. 6, the central processor 9100, sometimes referred to as a controller or operation controller, may include a microprocessor or other processor devices and/or logic devices; the central processor 9100 receives inputs and controls the operation of the various components of the electronic device 9600.
The memory 9140 may be, for example, one or more of a buffer, a flash memory, a hard drive, removable media, a volatile memory, a non-volatile memory, or other suitable devices. It may store information about failures as well as a program for processing that information, and the central processor 9100 can execute the program stored in the memory 9140 to realize information storage, processing, and the like.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. The power supply 9170 is used to provide power to the electronic device 9600. The display 9160 is used for displaying display objects such as images and characters. The display may be, for example, but not limited to, an LCD display.
The memory 9140 may be a solid state memory such as Read Only Memory (ROM), Random Access Memory (RAM), a SIM card, and the like. It may also be a memory that holds information even when powered down and that can be selectively erased and provided with further data, an example of which is sometimes referred to as an EPROM or the like. The memory 9140 may also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer). The memory 9140 may include an application/function storage portion 9142, the application/function storage portion 9142 storing application programs and function programs or a flow for executing operations of the electronic device 9600 by the central processor 9100.
The memory 9140 may also include a data store 9143, the data store 9143 for storing data, such as contacts, digital data, pictures, sounds, and/or any other data used by an electronic device. The driver storage portion 9144 of the memory 9140 may include various drivers of the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, address book applications, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. A communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, as in the case of conventional mobile communication terminals.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, etc., may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and to receive audio input from the microphone 9132 to implement usual telecommunications functions. The audio processor 9130 can include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100 so that sound can be recorded locally through the microphone 9132 and sound stored locally can be played through the speaker 9131.
The embodiment of the present invention also provides a computer-readable storage medium capable of implementing all the steps of the test report generating method in the above embodiments in which the server is the execution subject; the computer-readable storage medium stores a computer program which, when executed by a processor, implements all the steps of the test report generating method in the above embodiments.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principles and embodiments of the present invention have been described in detail with reference to specific examples, which are provided to facilitate understanding of the method and core ideas of the present invention; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in accordance with the ideas of the present invention, the present description should not be construed as limiting the present invention in view of the above.

Claims (10)

1. A test report generation method, comprising:
acquiring requirement information of a test item;
performing an iterative operation, the iterative operation comprising: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct or not based on at least one preset assertion statement, and if not, replacing one sub-demand in the demand information until the test result is correct;
and generating the test report according to the test scheme and the test result, so as to assist testers in testing.
2. The test report generating method of claim 1, wherein the preset assertion statement includes a feature character that normalizes a test result, and wherein the determining whether the test result is correct based on at least one preset assertion includes:
and determining whether the characteristic characters in the test result are consistent with the characteristic characters in the preset assertion statement, and if not, determining that the test result is incorrect.
3. The test report generating method of claim 1, wherein there are a plurality of preset assertion statements, each preset assertion statement comprising a feature character that normalizes a test result, and the determining whether the test result is correct based on at least one preset assertion comprises:
and determining whether more than half of all the characteristic characters in the test result are consistent with the characteristic characters in the preset assertion statement, and if not, determining that the test result is incorrect.
4. The test report generating method according to claim 2, wherein the demand information includes: test targets, test contents, test constraints, test phases, test strategies, defect level definitions, test admission criteria, test passing criteria, test suspension principles and resume test conditions; the test scheme comprises a test result display framework, a global test environment and a test flow document;
the generating a test scheme according to the requirement information comprises the following steps:
generating a test flow document according to the test target, the test content, the test constraint, the test stage, the test strategy, the defect level definition, the test admission standard, the test passing standard, the test suspension principle and the resume test condition;
generating the type of the test result according to the test data and the test flow document;
and determining a test result display architecture and configuring a global test environment based on the type of the test result.
5. The test report generating method of claim 1, wherein the test result comprises an error code, the method further comprising:
and aiming at the whole test process, determining whether the test result is abnormal or not according to the occurrence time points of all error codes in the test result and combining the preset error code quantity trend.
6. The test report generating method of claim 5, wherein the assertion statement is configured to: if T1 is greater than T2, X1 is less than X2, wherein T1 and T2 are both time points, X1 is the number of error codes in the error code set corresponding to T1, and X2 is the number of error codes in the error code set corresponding to T2;
determining whether the test result is abnormal according to the occurrence time points of all error codes in the test result and by combining a preset error code quantity trend, wherein the determining comprises the following steps:
classifying all error codes according to time points to obtain an error code set corresponding to each time point;
and determining the number of error codes in each error code set according to the sequence of the time points, judging whether each error code set meets the assertion statement, and if not, determining that the test result is abnormal.
7. The test report generating method of claim 1, wherein the assertion statement is configured to: if the right end point of the time axis interval of T1 is greater than the right end point of the time axis interval of T2, or if the left end point of the time axis interval of T1 is greater than the left end point of the time axis interval of T2, X1 is less than X2, wherein T1 and T2 are both time intervals, the interval length of each time interval is the same, X1 is the number of error codes in the error code set corresponding to T1, and X2 is the number of error codes in the error code set corresponding to T2;
determining whether the test result is abnormal according to the occurrence time points of all error codes in the test result and by combining a preset error code quantity trend, wherein the determining comprises the following steps:
classifying all error codes according to time points to obtain an error code set corresponding to each time point;
and determining the number of error codes in each error code set according to the sequence of the time points, judging whether each error code set meets the assertion statement, and if not, determining that the test result is abnormal.
8. A test report generating apparatus, comprising:
the acquisition module is used for acquiring the requirement information of the test item;
the iteration operation module is used for executing iteration operation, and the iteration operation comprises the following steps: generating a test scheme according to the demand information, testing the test item according to the test scheme to obtain a test result, determining whether the test result is correct or not based on at least one preset assertion, and if not, replacing one sub-demand in the demand information until the test result is correct;
and the report generation module is used for generating the test report according to the test scheme and the test result, so as to assist testers in testing.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the test report generating method of any of claims 1 to 7 when the program is executed by the processor.
10. A computer readable storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the test report generating method of any of claims 1 to 7.
CN202210761460.0A 2022-06-30 2022-06-30 Test report generation method and device, electronic equipment and storage medium Pending CN117370141A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210761460.0A CN117370141A (en) 2022-06-30 2022-06-30 Test report generation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210761460.0A CN117370141A (en) 2022-06-30 2022-06-30 Test report generation method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117370141A true CN117370141A (en) 2024-01-09

Family

ID=89391505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210761460.0A Pending CN117370141A (en) 2022-06-30 2022-06-30 Test report generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117370141A (en)

Similar Documents

Publication Publication Date Title
CN109871326B (en) Script recording method and device
CN108763076A (en) A kind of Software Automatic Testing Method, device, equipment and medium
CN112783793B (en) Automatic interface test system and method
CN112905459B (en) Service interface testing method and device, electronic equipment and storage medium
CN107451112B (en) Form tool data checking method, device, terminal equipment and storage medium
CN110990274B (en) Data processing method, device and system for generating test cases
CN105786695A (en) Data test method and system
CN112905262B (en) Configuration method and device of aerospace measurement and control system
CN111814354B (en) Simulation test method, system, medium and electronic device for instrument performance
CN115422065A (en) Fault positioning method and device based on code coverage rate
CN115794641A (en) Method, device and equipment for making number based on business process and storage medium
CN112905460B (en) Device and method for simulating three-party receipt by automatic interface test
CN113095782A (en) Automatic approval decision-making method and device
KR20120111618A (en) Apparatus and method for testing plc command
CN116244202A (en) Automatic performance test method and device
CN117370141A (en) Test report generation method and device, electronic equipment and storage medium
CN115952079A (en) Method and system for recording mobile application automation behaviors and analyzing and positioning defects
CN111949510A (en) Test processing method and device, electronic equipment and readable storage medium
CN105653445A (en) Implementation method capable of meeting DO-178C test result
CN113238940B (en) Interface test result comparison method, device, equipment and storage medium
CN113986753A (en) Interface test method, device, equipment and storage medium
CN113886222A (en) Test case design method, device and equipment and readable storage medium
CN113515447A (en) System automation test method and device
CN116736025B (en) Closed loop automatic test device and method for analog quantity acquisition equipment
CN117056211A (en) Low-code automatic test method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination