WO2017181591A1 - Test method and system - Google Patents

Test method and system

Info

Publication number
WO2017181591A1
WO2017181591A1 (PCT/CN2016/100153)
Authority
WO
WIPO (PCT)
Prior art keywords
test
result
request
feedback
test request
Prior art date
Application number
PCT/CN2016/100153
Other languages
English (en)
French (fr)
Inventor
李洪福
Original Assignee
乐视控股(北京)有限公司
乐视云计算有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视云计算有限公司
Publication of WO2017181591A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 — Error detection; Error correction; Monitoring
    • G06F 11/36 — Preventing errors by testing or debugging software
    • G06F 11/3668 — Software testing
    • G06F 11/3672 — Test management
    • G06F 11/3688 — Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 — Test management for test results analysis

Definitions

  • Embodiments of the present invention relate to the field of testing technologies, and in particular, to a testing method and system.
  • In the prior art, software testing is usually performed either on the test end (usually a service terminal) or on the server side.
  • A test method that invades the test end not only requires test software to be deployed on the test end, but also corrupts the running data of the test end; testing on the server side likewise returns request feedback to the test end, which also intrudes on it. In addition, the objects served by modern servers and the volume of service business are gradually increasing, and testers have to judge the operation of the software manually from work logs of great variety and enormous volume, which is inefficient and heavily affected by human factors.
  • The embodiments of the present invention provide a test method and system to solve the problems in the prior art of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it.
  • In a first aspect, an embodiment of the present invention provides a testing method, the method including: intercepting test request feedback directed to a test terminal; parsing the test request feedback to determine a test request; determining an expected running result using a rule engine library according to the determined test request, wherein the rule engine library includes test requests and corresponding expected running results; executing a test task according to the test request to generate a test result; and comparing the expected running result with the test result to generate a test report.
  • an embodiment of the present invention provides a test system, where the system includes:
  • a feedback intercepting unit for intercepting test request feedback to the test terminal
  • test request determining unit configured to parse the test request feedback, and determine a test request
  • an expected result determining unit, configured to determine an expected running result using a rule engine library according to the determined test request, wherein the rule engine library includes test requests and corresponding expected running results;
  • test result generating unit configured to execute a test task according to the test request, and generate a test result
  • a test report generating unit is configured to compare the expected running result and the test result to generate a test report.
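The units above can be sketched as a minimal pipeline in Python. Everything here is an illustrative assumption rather than the patent's actual implementation: the name `RULE_ENGINE_LIBRARY`, the dictionary shapes for requests and reports, and the simulated test task.

```python
# Minimal sketch of the pipeline: intercept -> parse -> look up expected
# result -> execute test task -> compare and report. All names and data
# shapes are illustrative assumptions, not the patent's implementation.

# Hypothetical rule engine library: maps a test request to its expected
# running result (expected completion time and expected content).
RULE_ENGINE_LIBRARY = {
    "open_first_interface": {"max_seconds": 2.0, "content": "first interface"},
}

def intercept_feedback(feedback):
    """Intercept the test request feedback instead of returning it to the test terminal."""
    return feedback  # in the patent, the feedback is redirected, not forwarded

def parse_test_request(feedback):
    """Parse the intercepted feedback and determine the test request."""
    return feedback["request"]

def determine_expected_result(request):
    """Look up the expected running result in the rule engine library."""
    return RULE_ENGINE_LIBRARY[request]

def execute_test_task(request):
    """Execute the test task and produce a test result (simulated measurement here)."""
    return {"seconds": 3.0, "content": "first interface"}

def generate_report(expected, result):
    """Compare expected and actual results and build a test report."""
    consistent = (result["seconds"] <= expected["max_seconds"]
                  and result["content"] == expected["content"])
    return {"expected": expected, "result": result, "consistent": consistent}

feedback = {"request": "open_first_interface"}
request = parse_test_request(intercept_feedback(feedback))
report = generate_report(determine_expected_result(request),
                         execute_test_task(request))
print(report["consistent"])  # the simulated 3 s response misses the 2 s expectation
```

A real deployment would replace `execute_test_task` with the server-side execution of the intercepted request; the comparison and report structure would stay the same.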
  • an embodiment of the present invention further provides a non-volatile computer storage medium storing computer-executable instructions for performing any of the above test methods of the present invention.
  • In a fourth aspect, an embodiment of the present invention further provides an electronic device, including: at least one processor; and a memory; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform any of the above test methods of the present invention.
  • In a fifth aspect, an embodiment of the present invention further provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform any of the above test methods.
  • By performing interception testing on the server side and generating a test report, the invention overcomes the problems of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it. The operation is simple and convenient, and testing is automated.
  • FIG. 1 is a schematic diagram of a system architecture used in an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a test method according to a first embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of a test method according to a second embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of a test method according to a third embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a test system according to a first embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a test system according to a second embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a test system according to a third embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
  • Referring first to FIG. 1, an exemplary system architecture 100 to which embodiments of the present invention may be applied is shown.
  • system architecture 100 can include terminal devices 101, 102, network 103, and servers 104, 105.
  • the user 110 can interact with the servers 104, 105 over the network 103 using the terminal devices 101, 102 to receive or transmit messages and the like.
  • Various communication client applications such as instant messaging tools, email clients, social platform software, audio and video software, etc., can be installed on the terminal devices 101, 102.
  • the terminal devices 101, 102 may be various electronic devices, including but not limited to personal computers, smart phones, tablet computers, personal digital assistants, wearable devices, etc., and the terminal devices 101, 102 may also be smart home appliances, routers, and the like.
  • the network 103 is used to provide a medium for communication links between the terminal devices 101, 102 and the servers 104, 105.
  • Network 103 may include various types of connections, such as wired, wireless communication links, or fiber optic cables.
  • the servers 104, 105 may be servers that provide various services.
  • the server can store, analyze, and the like the received data.
  • In this embodiment, in order to prevent intrusion into the terminal devices, the server does not feed the processing result back to the terminal devices.
  • It should be understood that the numbers of terminal devices, networks, and servers in FIG. 1 are merely illustrative. Depending on implementation needs, there can be any number of terminal devices, networks, and servers.
  • Referring next to FIG. 2, a flow chart 200 of a first embodiment of the testing method of the present invention is shown.
  • test method can include:
  • Step 201: Intercept test request feedback directed to the test terminal.
  • the test terminal may be the terminal devices 101, 102 shown in FIG. 1. Specifically, it may be various electronic devices, including but not limited to personal computers, smart phones, tablet computers, personal digital assistants, wearable devices, etc., and the test terminals may also be smart home appliances, routers, and the like.
  • test request may be, for example, a request to link a certain web page, a request to acquire file data, and the present invention is not limited in this regard.
  • Step 202: Parse the test request feedback and determine a test request.
  • In this embodiment, the test request may be, for example: return a certain interface within 2 seconds.
  • Step 203: Determine an expected running result using a rule engine library according to the determined test request.
  • the rules engine library can include test requests and corresponding expected run results.
  • the rule engine library may be a library storing related functions, a corresponding table in the database, or other forms of correspondence, and the present invention is not limited in this respect.
  • The rule engine library can store the configuration files and expected running results required for testing the application.
  • a configuration file can contain test requests, and the expected run results can include the expected completion time and completed content.
  • The specific values may be represented by md5 (Message Digest Algorithm 5) values; the present invention is not limited in this regard.
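As an illustration of representing expected content by an md5 digest, the standard-library sketch below hashes a response body. Comparing digests instead of full payloads is an assumption about how such a "specific value" would be stored in the rule engine library, not something the patent spells out:

```python
import hashlib

def content_digest(payload: bytes) -> str:
    """Return the md5 hex digest standing in for the full expected content."""
    return hashlib.md5(payload).hexdigest()

# The stored expected value and the value computed from the test result
# match exactly when the returned content is identical byte-for-byte.
expected_md5 = content_digest(b"first interface")  # stored in the rule engine library
actual_md5 = content_digest(b"first interface")    # computed from the test result

print(expected_md5 == actual_md5)  # True: the contents match
```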
  • Step 204: Compare the test result generated by executing the test task corresponding to the test request with the expected running result, and generate a test report.
  • In this embodiment, the test report includes at least a comparison of the test result and the expected running result. For example:
  • Expected result: the first interface is returned within 2 s.
  • Test result 1: the first interface is returned only after 3 s.
  • Test result 2: the second interface is returned within 2 s.
  • Comparison result: the test results are inconsistent with the expected result.
  • The server can receive this comparison and store it for later collation of the data.
  • Depending on the actual situation, a threshold may be set for the number of errors. When the number of errors is greater than the predetermined threshold, the newly released function contains too many bugs, the release is not suitable, and the test server should not go live officially.
  • By collecting statistics on failed test reports, the test quality of the test server can be well controlled, and when a problem occurs the component can be cancelled in time, ensuring the reliability of the test.
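The release gate described above (counting failed reports against a threshold) might look like the following sketch. The threshold value of 5 and the report structure are illustrative assumptions; the patent does not fix either:

```python
# Gate a release on the number of failed test reports.
# The threshold of 5 and the report shape are illustrative assumptions.
ERROR_THRESHOLD = 5

def release_allowed(reports) -> bool:
    """Allow the test server to go live only if failures stay at or below the threshold."""
    errors = sum(1 for r in reports if not r["consistent"])
    return errors <= ERROR_THRESHOLD

reports = [{"consistent": True}] * 10 + [{"consistent": False}] * 7
print(release_allowed(reports))  # False: 7 errors exceed the threshold of 5
```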
  • By performing interception testing on the server side and generating a test report, the invention overcomes the problems of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it; the operation is simple and convenient, and automated testing is realized.
  • In some optional embodiments, intercepting test request feedback to the test terminal may include: adding, to the test request feedback, a drainage address pointing to the rule engine library.
  • Thus, this embodiment introduces the data stream into the rule engine library by adding to the test request feedback a drainage address pointing to the rule engine library (for example, by using an add function). This prevents the test feedback from returning to the test terminal and disturbing the normal operation of the client terminal.
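One way to picture the drainage address is as a rewrite of the feedback's destination so the data stream flows into the rule engine library instead of back to the test terminal. The helper name, the field names, and the drainage URL below are hypothetical, sketched only to make the redirection concrete:

```python
# Hedged sketch: redirect test request feedback into the rule engine library
# instead of letting it reach the test terminal. The field names and the
# drainage URL are hypothetical.
RULE_ENGINE_ADDRESS = "http://rule-engine.internal/ingest"

def add_drainage_address(feedback: dict) -> dict:
    """Add a drainage address so the feedback is drained to the rule engine library."""
    redirected = dict(feedback)  # leave the original feedback untouched
    redirected["destination"] = RULE_ENGINE_ADDRESS  # overrides the test terminal
    return redirected

feedback = {"destination": "http://test-terminal.local",
            "request": "open_first_interface"}
print(add_drainage_address(feedback)["destination"])
```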
  • In some optional embodiments, when the working status information of the server side is the test state, it is determined that test request feedback directed to the test terminal is to be intercepted.
  • the server side of this embodiment can be configured with two states: a test state and an operational state. For example, when the debug parameter is 1, the corresponding server is in the test state. When the debug parameter is 0, the corresponding server is in the operational state.
  • the test system can set up two servers, one is a running server, which is responsible for communication with the test terminal, and obtains test request feedback of the test terminal. The other is a test server, which is responsible for testing.
  • Alternatively, the test system can set up only one server, which handles both the test communication with the test terminal and the test work itself.
  • the specific setting manner can be freely set according to the performance and actual needs of the server, and the present invention is not limited in this respect.
  • After the test is over, if the test state is satisfactory, for example there are relatively few bugs, the test version is suitable for release. At this point, only the debug parameter needs to be switched from 1 to 0 to convert the server from the test state to the operational state. Thus, after the test is completed, the test server officially goes live and becomes an operational server. If the actual operation is unsatisfactory, the debug parameter can be switched back from 0 to 1 to convert the server from the operational state to the test state. This allows free switching, reducing hardware investment and saving capital.
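The debug-parameter switch can be sketched as a two-state toggle. The class and method names below are assumptions for illustration; only the convention "debug = 1 means test state, debug = 0 means operational state" comes from the text:

```python
# Hedged sketch of the debug-parameter state switch: 1 = test state,
# 0 = operational state. Names are illustrative, not from the patent.
class Server:
    def __init__(self) -> None:
        self.debug = 1  # start in the test state

    @property
    def state(self) -> str:
        return "test" if self.debug == 1 else "operational"

    def go_live(self) -> None:
        """Switch from the test state to the operational state (debug 1 -> 0)."""
        self.debug = 0

    def back_to_test(self) -> None:
        """Switch from the operational state back to the test state (debug 0 -> 1)."""
        self.debug = 1

server = Server()
server.go_live()
print(server.state)   # operational
server.back_to_test()
print(server.state)   # test
```

Because the same machine simply flips a flag, no second set of hardware is needed to move between testing and operation, which is the cost saving the passage describes.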
  • Referring next to FIG. 3, a flow chart 300 of a second embodiment of the testing method of the present invention is shown.
  • test method can include:
  • Step 301: Generate a one-way connection from the test terminal to the server side.
  • Step 302: Intercept test request feedback directed to the test terminal.
  • Step 303: Parse the test request feedback and determine the test request.
  • Step 304: Determine an expected running result using a rule engine library according to the determined test request,
  • wherein the rule engine library includes each test request and the corresponding expected running results.
  • Step 305: Compare the test result generated by executing the test task corresponding to the test request with the expected running result, and generate a test report.
  • The embodiment shown in FIG. 3 differs from the embodiment shown in FIG. 2 in that, before intercepting the test request feedback directed to the test terminal (i.e., step 201 in FIG. 2), it includes: generating a one-way connection from the test terminal to the server side. This further prevents the test feedback from returning to the test terminal and disturbing the normal operation of the client terminal.
  • Referring next to FIG. 4, a flow chart 400 of a third embodiment of the test method of the present invention is shown.
  • test method can include:
  • Step 401: Intercept test request feedback directed to the test terminal.
  • Step 402: Parse the test request feedback and determine the test request.
  • Step 403: Determine an expected running result using a rule engine library according to the determined test request, wherein the rule engine library includes each test request and the corresponding expected running results.
  • Step 404: Compare the test result generated by executing the test task corresponding to the test request with the expected running result, and generate a test report.
  • Step 405: When it is determined that the test result in the test report is consistent with the expected running result, switch the working state information of the server side from the test state to the operational state.
  • The embodiment shown in FIG. 4 differs from the embodiment shown in FIG. 2 in that, after generating the test report (i.e., step 204 in FIG. 2), it includes: when it is determined that the test result in the test report is consistent with the expected running result, switching the working state information of the server side from the test state to the operational state.
  • Thus, the server side of this embodiment can freely switch the server between the test state and the operational state by setting the debug parameter to 1 or 0. For example, after the test is over, if the test state is satisfactory, for example there are relatively few bugs, the test version is suitable for release; at this point, only the debug parameter needs to be switched from 1 to 0 to convert the server from the test state to the operational state. Thus, after the test is completed, if the test results in the test reports corresponding to the test tasks are all consistent with the expected running results, the test server can officially go live and become an operational server. If the actual operation is unsatisfactory, the debug parameter can be switched back from 0 to 1 to convert the server from the operational state to the test state. This allows free switching, reducing hardware investment and saving capital.
  • FIG. 5 a first embodiment block diagram 500 of the test system of the present invention is shown.
  • As shown in FIG. 5, the test system 500 may include a feedback interception unit, a test request determination unit, an expected result determination unit, a test result generation unit, and a test report generation unit. Specifically:
  • the feedback interception unit is configured to intercept test request feedback to the test terminal.
  • the test request determining unit is configured to parse the test request feedback intercepted by the feedback intercepting unit to determine the test request.
  • the expected result determining unit is configured to determine an expected running result by using the rule engine library according to the test request determined by the test request determining unit.
  • The rule engine library includes test requests and corresponding expected running results.
  • the test result generating unit is configured to execute the test task according to the test request determined by the test request determining unit, and generate the test result.
  • The test report generating unit is configured to compare the expected running result determined by the expected result determining unit with the test result generated by the test result generating unit, and generate a test report.
  • In some optional embodiments, the feedback intercepting unit is configured to: add, to the test request feedback, a drainage address pointing to the rule engine library.
  • the feedback intercepting unit is configured to: when the working status information of the server end is the test status, determine to intercept the test request feedback to the test terminal.
  • By performing interception testing on the server side and generating a test report, the invention overcomes the problems of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it; the operation is simple and convenient, and automated testing is realized.
  • Referring then to FIG. 6, a block diagram 600 of a second embodiment of the test system of the present invention is shown.
  • As shown in FIG. 6, the test system 600 can include a connection generation unit, a feedback interception unit, a test request determination unit, an expected result determination unit, a test result generation unit, and a test report generation unit. Specifically:
  • the connection generation unit is configured to generate a one-way connection from the test terminal to the server.
  • the feedback interception unit is configured to intercept test request feedback to the test terminal.
  • the test request determination unit is configured to parse the test request feedback and determine the test request.
  • the expected result determining unit is configured to determine an expected running result using the rule engine library according to the determined test request.
  • The rule engine library includes test requests and corresponding expected running results.
  • the test result generating unit is configured to execute the test task according to the test request and generate the test result.
  • the test report generation unit is used to compare the expected running result and the test result to generate a test report.
  • The embodiment shown in FIG. 6 differs from the embodiment shown in FIG. 5 in that it further includes a connection generation unit for generating a one-way connection from the test terminal to the server side. This further prevents the test feedback from returning to the test terminal and disturbing the normal operation of the client terminal.
  • FIG. 7 a block diagram 700 of a third embodiment of the test system of the present invention is shown.
  • As shown in FIG. 7, the test system 700 may include a feedback interception unit, a test request determination unit, an expected result determination unit, a test result generation unit, a test report generation unit, and a state switching unit. Specifically:
  • the feedback interception unit is configured to intercept test request feedback to the test terminal.
  • the test request determination unit is configured to parse the test request feedback and determine the test request.
  • the expected result determining unit is configured to determine an expected running result using the rule engine library according to the determined test request.
  • The rule engine library includes test requests and corresponding expected running results.
  • the test result generating unit is configured to execute the test task according to the test request and generate the test result.
  • the test report generation unit is used to compare the expected running result and the test result to generate a test report.
  • The state switching unit is configured to switch the server-side working state information from the test state to the operational state when it is determined that the test result in the test report is consistent with the expected running result.
  • Thus, the server side of this embodiment can freely switch the server between the test state and the operational state by setting the debug parameter to 1 or 0.
  • After the test is over, if the test state is satisfactory, for example there are relatively few bugs, the test version is suitable for release. At this point, only the debug parameter needs to be switched from 1 to 0 to convert the server from the test state to the operational state. Thus, after the test is completed, the test server officially goes live and becomes an operational server. If the actual operation is unsatisfactory, the debug parameter can be switched back from 0 to 1 to convert the server from the operational state to the test state. This allows free switching, reducing hardware investment and saving capital.
  • a related function module can be implemented by a hardware processor.
  • the embodiment of the present invention provides a non-volatile computer storage medium, where the computer storage medium stores computer executable instructions, and the computer executable instructions can execute the test method in any of the foregoing method embodiments;
  • The non-volatile computer storage medium of the present invention stores computer-executable instructions that are set to: intercept test request feedback directed to the test terminal; parse the test request feedback to determine a test request; determine an expected running result using a rule engine library according to the determined test request; execute a test task according to the test request to generate a test result; and compare the expected running result with the test result to generate a test report.
  • As a non-transitory computer-readable storage medium, it can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the test method in the embodiments of the present invention (for example, the feedback intercepting unit, test request determining unit, expected result determining unit, test result generating unit, and test report generating unit shown in FIG. 5).
  • the one or more modules are stored in the non-transitory computer readable storage medium, and when executed by a processor, perform the test method of any of the above method embodiments.
  • The non-transitory computer-readable storage medium may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application required for at least one function, and the storage data area may store data created according to the use of the testing device, and the like.
  • the non-transitory computer readable storage medium may include a high speed random access memory, and may also include a nonvolatile memory such as at least one magnetic disk storage device, flash memory device, or other nonvolatile solid state storage device.
  • the non-transitory computer readable storage medium optionally includes a memory remotely disposed relative to the processor, the remote memory being connectable to the test device over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • The embodiment of the present invention further provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform any of the above test methods.
  • FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in FIG. 8, the device includes:
  • One or more processors 810 and a memory 820; one processor 810 is taken as an example in FIG. 8.
  • the apparatus of the test method may further include: an input device 830 and an output device 840.
  • The processor 810, the memory 820, the input device 830, and the output device 840 may be connected by a bus or other means, as exemplified by the bus connection in FIG. 8.
  • the memory 820 is the above-described nonvolatile computer readable storage medium.
  • the processor 810 executes various functional applications of the server and data processing by running non-volatile software programs, instructions, and modules stored in the memory 820, that is, implementing the above-described method embodiment test method.
  • The input device 830 can receive input numeric or character information, and generate key signal inputs related to user settings and function control of the test device.
  • the output device 840 can include a display device such as a display screen.
  • the above product can perform the method provided by the embodiment of the present invention, and has the corresponding functional modules and beneficial effects of the execution method.
  • The electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to:
  • intercept test request feedback directed to the test terminal; parse the test request feedback to determine a test request; determine an expected running result using a rule engine library according to the determined test request; execute a test task according to the test request to generate a test result; and compare the expected running result with the test result to generate a test report.
  • the electronic device of the embodiment of the invention exists in various forms, including but not limited to:
  • Mobile communication devices: these devices are characterized by mobile communication functions and are mainly aimed at providing voice and data communication. Such terminals include smart phones (such as the iPhone), multimedia phones, functional phones, and low-end phones.
  • Ultra-mobile personal computer devices: these devices belong to the category of personal computers, have computing and processing functions, and generally have mobile Internet access. Such terminals include PDAs, MIDs, and UMPC devices, such as the iPad.
  • Portable entertainment devices: these devices can display and play multimedia content. Such devices include audio and video players (such as the iPod), handheld game consoles, e-book readers, smart toys, and portable car navigation devices.
  • Server: composed of a processor, a hard disk, a memory, a system bus, and the like, a server is similar to a general-purpose computer in architecture, but because it must provide highly reliable services, it has higher requirements in terms of processing power, stability, reliability, security, scalability, manageability, and the like.
  • The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement it without creative effort.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

A testing method and a testing system. The method includes: intercepting test request feedback directed to a test terminal (201); parsing the test request feedback to determine a test request (202); determining an expected running result using a rule engine library according to the determined test request (203); and comparing the test result generated by executing the test task corresponding to the test request with the expected running result to generate a test report (204). By performing interception testing on the server side and generating a test report, the method overcomes the problems in the prior art of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it; the operation is simple and convenient, and testing is automated.

Description

Test method and system
Technical field
The embodiments of the present invention relate to the field of testing technologies, and in particular to a test method and system.
Background
With the rapid development of network technology, intelligent service applications have permeated people's entertainment, study, and work. As the complexity and scale of existing service applications keep increasing, the development of a new version requires multiple rounds of testing at each stage. Software testing is the process of using manual operation or automatic software execution to check whether the software meets the specified requirements, or to check the difference between the expected results and the actual results.
In the prior art, software testing is usually performed on the test end (usually a service terminal) or on the server side. However, a test method that invades the test end not only requires test software to be deployed on the test end, but also corrupts the running data of the test end; testing on the server side likewise returns the request feedback to the test end, which also intrudes on it. In addition, the objects served by modern servers and the volume of service business are gradually increasing, and testers have to judge the operation of the software manually from work logs of great variety and enormous volume, which is inefficient and heavily affected by human factors.
Summary of the invention
The embodiments of the present invention provide a test method and system to solve the problems in the prior art of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it.
In a first aspect, an embodiment of the present invention provides a test method, the method including:
intercepting test request feedback directed to a test terminal;
parsing the test request feedback to determine a test request;
determining an expected running result using a rule engine library according to the determined test request, wherein the rule engine library includes test requests and corresponding expected running results;
executing a test task according to the test request to generate a test result;
comparing the expected running result with the test result to generate a test report.
In a second aspect, an embodiment of the present invention provides a test system, the system including:
a feedback intercepting unit, configured to intercept test request feedback directed to a test terminal;
a test request determining unit, configured to parse the test request feedback and determine a test request;
an expected result determining unit, configured to determine an expected running result using a rule engine library according to the determined test request, wherein the rule engine library includes test requests and corresponding expected running results;
a test result generating unit, configured to execute a test task according to the test request and generate a test result;
a test report generating unit, configured to compare the expected running result with the test result and generate a test report.
In a third aspect, an embodiment of the present invention further provides a non-volatile computer storage medium storing computer-executable instructions, the computer-executable instructions being used to perform any of the above test methods of the present invention.
In a fourth aspect, an embodiment of the present invention further provides an electronic device, including: at least one processor; and a memory; wherein the memory stores a program executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform any of the above test methods of the present invention.
In a fifth aspect, an embodiment of the present invention further provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform any of the above test methods.
By performing interception testing on the server side and generating a test report, the invention overcomes the problems of inefficiency and poor accuracy caused by testers manually analyzing log files, and of having to invade the test end for testing and thereby damaging it; the operation is simple and convenient, and testing is automated.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a system architecture used in an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a test method according to a first embodiment of the present invention;
FIG. 3 is a schematic flow chart of a test method according to a second embodiment of the present invention;
FIG. 4 is a schematic flow chart of a test method according to a third embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a test system according to a first embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a test system according to a second embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a test system according to a third embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring first to FIG. 1, an exemplary system architecture 100 to which embodiments of the present invention may be applied is shown.
As shown in FIG. 1, the system architecture 100 may include terminal devices 101 and 102, a network 103, and servers 104 and 105.
A user 110 may use the terminal devices 101 and 102 to interact with the servers 104 and 105 over the network 103 to receive or send messages and the like. Various communication client applications, such as instant messaging tools, email clients, social platform software, and audio and video software, may be installed on the terminal devices 101 and 102.
The terminal devices 101 and 102 may be various electronic devices, including but not limited to personal computers, smart phones, tablet computers, personal digital assistants, and wearable devices; the terminal devices 101 and 102 may also be smart home appliances, routers, and the like.
The network 103 is the medium used to provide communication links between the terminal devices 101 and 102 and the servers 104 and 105. The network 103 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
The servers 104 and 105 may be servers that provide various services. A server may store, analyze, and otherwise process the received data. In this embodiment, in order to prevent intrusion into the terminal devices, the server does not feed the processing result back to the terminal devices.
It should be understood that the numbers of terminal devices, networks, and servers in FIG. 1 are merely illustrative. Depending on implementation needs, there may be any number of terminal devices, networks, and servers.
接着参考图2,其示出了本发明测试方法的第一实施例流程图200。
如图2所示,测试方法可以包括:
步骤201:拦截向测试终端的测试请求反馈。
在本实施例中,测试终端可以是图1所示的终端设备101、102。具体可以是各种电子设备,包括但不限于个人电脑、智能手机、平板电脑、个人数字助理、可穿戴设备等,测试终端还可以是智能家电、路由器等。
在本实施例中,测试请求例如可以是链接某个网页的请求、获取文件数据的请求,本发明在此方面不做限制。
步骤202:解析所述测试请求反馈,确定测试请求。
在本实施例中,测试请求例如可以是:在2s内返回某一个界面。
步骤203:根据所确定的测试请求,利用规则引擎库确定预期运行结果。
在本实施例中,规则引擎库可以包括测试请求和对应的预期运行结果。规则引擎库可以是存储着关联的函数的库,也可以是数据库中的对应表,还可以是其他形式的对应关系,本发明在此方面没有限制。规则引擎库中可以存储测试应用时所需的配置文件和预期的运行结果。例如,配置文件可以包含测试请求,预期的运行结果可以包含预期完成的时间和完成的内容。具体数值可以用md5(Message Digest Algorithm,消息摘要算法第五版)值来表示,本发明在此方面不做限制。
步骤204:比较执行所述测试请求对应的测试任务而生成的测试结果和所述预期运行结果,生成测试报告。
在本实施例中,该测试报告中至少包括测试结果和预期的运行结果的对比。例如:
预期结果:2s内返回第一个界面。
测试结果1:3s才返回第一界面。
测试结果2:2s内返回第二个界面。
对比结果:预期结果与测试结果不一致。
在本实施例中,服务器可以接收该对比并将其存储起来以备之后对数据进行整理。具体可以根据实际情况,例如对错误的数量设定一个阈值,当错误的数量大于预定阈值时,则说明此次发布的新功能BUG较多,不宜发布,测试服务器不宜正式上线。
在本实施例中,通过统计错误的测试报告,可以很好地把控测试服务器的测试质量,当出现问题时可以及时取消该组件,确保测试的可靠性。
本发明通过在服务器端进行拦截测试,并生成测试报告,可以克服测试人员人工分析日志文件导致的低效、准确性差和需要侵入测试端进行测试而破坏测试端等问题,操作简单方便,实现了测试的自动化。
在一些可选的实施例中,拦截向测试终端的测试请求反馈(即图1中步骤201)可以包括:在测试请求反馈中添加指向规则引擎库的引流地址。
由此,本实施例通过在测试请求反馈中添加指向所述规则引擎库的引流地址(例如,具体可以利用add函数来实现),将数据流引入规则引擎库。如此操作可以防止测试反馈返回测试终端而侵扰客户终端的正常工作。
In some optional embodiments, when the working-state information of the server side indicates the test state, it is determined that the test request feedback directed to the test terminal is to be intercepted.
Thus, the server side of this embodiment may have two states: a test state and an operating state. For example, a debug parameter of 1 corresponds to the server side being in the test state, and a debug parameter of 0 corresponds to the server side being in the operating state. This allows the test system to be configured flexibly. For example, the test system may use two servers: an operating server responsible for communicating with the test terminal and obtaining its test request feedback, and a test server responsible for the testing work. Alternatively, the test system may use a single server that handles both the test communication with the test terminal and the testing work. The concrete setup can be chosen freely according to server performance and actual needs; the present invention is not limited in this respect.
After testing, if the results are satisfactory — for example, there are few bugs and the tested version is fit for release — the server is switched from the test state to the operating state simply by changing the debug parameter from 1 to 0. Once testing is complete, the test server thus formally goes live as an operating server. If actual operation proves unsatisfactory, switching the debug parameter from 0 back to 1 returns the server from the operating state to the test state. This free switching reduces hardware investment and saves money.
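The debug-parameter switching can be sketched as follows (a hypothetical class, not taken from the patent; it only illustrates the 1/0 state convention described above):

```python
class Server:
    """Server whose working state is driven by a single debug parameter:
    debug == 1 means test state, debug == 0 means operating state."""

    def __init__(self, debug: int = 1):
        self.debug = debug

    @property
    def state(self) -> str:
        return "test" if self.debug == 1 else "operating"

    def go_live(self) -> None:
        """Release was judged fit: switch debug from 1 to 0."""
        self.debug = 0

    def back_to_test(self) -> None:
        """Operation proved unsatisfactory: switch debug from 0 back to 1."""
        self.debug = 1
```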
Referring next to Fig. 3, which shows a flowchart 300 of a second embodiment of the test method of the present invention.
As shown in Fig. 3, the test method may include:
Step 301: generate a one-way connection from the test terminal to the server side.
Step 302: intercept the test request feedback directed to the test terminal.
Step 303: parse the test request feedback and determine the test request.
Step 304: determine the expected running result from the determined test request by using the rule engine library, wherein the rule engine library includes the test requests and their corresponding expected running results.
Step 305: compare the test result generated by executing the test task corresponding to the test request with the expected running result, and generate a test report.
The embodiment of Fig. 3 differs from that of Fig. 2 in that, before intercepting the test request feedback directed to the test terminal (i.e., step 201 in Fig. 2), it generates a one-way connection from the test terminal to the server side. This further prevents the test feedback from being returned to the test terminal and disturbing the normal operation of the client terminal.
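One way to model the one-way connection in software (a sketch under stated assumptions; all names are hypothetical and the patent does not prescribe an implementation) is a link that accepts terminal-to-server messages but rejects any send in the opposite direction:

```python
class OneWayConnection:
    """One-way link from the test terminal to the server side: the server can
    receive test request feedback, but any attempt to push data back to the
    terminal is rejected."""

    def __init__(self):
        self.inbox = []  # messages received from the test terminal

    def send_from_terminal(self, message) -> None:
        self.inbox.append(message)

    def send_to_terminal(self, message) -> None:
        raise PermissionError("one-way connection: server-to-terminal blocked")
```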
Referring next to Fig. 4, which shows a flowchart 400 of a third embodiment of the test method of the present invention.
As shown in Fig. 4, the test method may include:
Step 401: intercept the test request feedback directed to the test terminal.
Step 402: parse the test request feedback and determine the test request.
Step 403: determine the expected running result from the determined test request by using the rule engine library, wherein the rule engine library includes the test requests and their corresponding expected running results.
Step 404: compare the test result generated by executing the test task corresponding to the test request with the expected running result, and generate a test report.
Step 405: when it is determined that the test result in the test report is consistent with the expected running result, switch the working-state information of the server side from the test state to the operating state.
The embodiment of Fig. 4 differs from that of Fig. 2 in that, after generating the test report (i.e., step 204 in Fig. 2), it switches the working-state information of the server side from the test state to the operating state when the test result in the test report is determined to be consistent with the expected running result.
Thus, the server side of this embodiment can switch freely between the test state and the operating state by setting the debug parameter to 1 or 0. For example, after testing, if the results are satisfactory — few bugs, a version fit for release — changing the debug parameter from 1 to 0 switches the server from the test state to the operating state. Once testing is complete and the test results in the test reports of the corresponding test tasks all match the expected running results, the test server can formally go live as an operating server. If actual operation proves unsatisfactory, switching the debug parameter from 0 back to 1 returns the server from the operating state to the test state. This free switching reduces hardware investment and saves money.
Referring now to Fig. 5, which shows a structural diagram 500 of a first embodiment of the test system of the present invention.
As shown in Fig. 5, the test system 500 may include a feedback interception unit, a test request determination unit, an expected result determination unit, a test result generation unit and a test report generation unit, wherein:
the feedback interception unit is configured to intercept the test request feedback directed to the test terminal;
the test request determination unit is configured to parse the test request feedback intercepted by the feedback interception unit and determine the test request;
the expected result determination unit is configured to determine the expected running result, by using the rule engine library, from the test request determined by the test request determination unit,
wherein the rule engine library includes the test requests and their corresponding expected running results;
the test result generation unit is configured to execute the test task according to the test request determined by the test request determination unit, and generate the test result;
the test report generation unit is configured to compare the expected running result determined by the expected result determination unit with the test result generated by the test result generation unit, and generate a test report.
In some optional embodiments, the feedback interception unit is configured to: add, to the test request feedback, a redirect address pointing to the rule engine library.
In some optional embodiments, the feedback interception unit is configured to: when the working-state information of the server side indicates the test state, determine that the test request feedback directed to the test terminal is to be intercepted.
By performing interception-based testing on the server side and generating test reports, the present invention overcomes the inefficiency and poor accuracy of testers manually analyzing log files, as well as the damage caused by having to intrude into the test terminal in order to test it; the operation is simple and convenient, and testing is automated.
Referring now to Fig. 6, which shows a structural diagram 600 of a second embodiment of the test system of the present invention.
As shown in Fig. 6, the test system 600 may include a connection generation unit, a feedback interception unit, a test request determination unit, an expected result determination unit, a test result generation unit and a test report generation unit, wherein:
the connection generation unit is configured to generate a one-way connection from the test terminal to the server side;
the feedback interception unit is configured to intercept the test request feedback directed to the test terminal;
the test request determination unit is configured to parse the test request feedback and determine the test request;
the expected result determination unit is configured to determine the expected running result from the determined test request by using the rule engine library,
wherein the rule engine library includes the test requests and their corresponding expected running results;
the test result generation unit is configured to execute the test task according to the test request and generate the test result;
the test report generation unit is configured to compare the expected running result with the test result and generate a test report.
The embodiment of Fig. 6 differs from that of Fig. 5 in that it additionally includes a connection generation unit configured to generate a one-way connection from the test terminal to the server side. This further prevents the test feedback from being returned to the test terminal and disturbing the normal operation of the client terminal.
Referring now to Fig. 7, which shows a structural diagram 700 of a third embodiment of the test system of the present invention.
As shown in Fig. 7, the test system 700 may include a feedback interception unit, a test request determination unit, an expected result determination unit, a test result generation unit, a test report generation unit and a state switching unit, wherein:
the feedback interception unit is configured to intercept the test request feedback directed to the test terminal;
the test request determination unit is configured to parse the test request feedback and determine the test request;
the expected result determination unit is configured to determine the expected running result from the determined test request by using the rule engine library,
wherein the rule engine library includes the test requests and their corresponding expected running results;
the test result generation unit is configured to execute the test task according to the test request and generate the test result;
the test report generation unit is configured to compare the expected running result with the test result and generate a test report;
the state switching unit is configured to switch the working-state information of the server side from the test state to the operating state when it is determined that the test result in the test report is consistent with the expected running result.
The embodiment of Fig. 7 differs from that of Fig. 5 in that it additionally includes the state switching unit. Thus, the server side of this embodiment can switch freely between the test state and the operating state by setting the debug parameter to 1 or 0. For example, after testing, if the results are satisfactory — few bugs, a version fit for release — changing the debug parameter from 1 to 0 switches the server from the test state to the operating state, so that once testing is complete the test server formally goes live as an operating server. If actual operation proves unsatisfactory, switching the debug parameter from 0 back to 1 returns the server from the operating state to the test state. This free switching reduces hardware investment and saves money.
Since the test system of the above embodiments corresponds functionally to the test method, the functions and technical effects achievable by the test method also apply to the test system, and the content common to both is not repeated here. In the embodiments of the present invention, the relevant functional modules may be implemented by a hardware processor.
An embodiment of the present invention provides a non-volatile computer storage medium storing computer-executable instructions that can perform the test method of any of the above method embodiments.
As one implementation, the non-volatile computer storage medium of the present invention stores computer-executable instructions configured to:
intercept the test request feedback directed to the test terminal;
parse the test request feedback and determine the test request;
determine the expected running result from the determined test request by using the rule engine library, wherein the rule engine library includes the test requests and their corresponding expected running results;
execute the test task according to the test request and generate a test result;
compare the expected running result with the test result and generate a test report.
As a non-volatile computer-readable storage medium, it may be used to store non-volatile software programs, non-volatile computer-executable programs and modules, such as the program instructions/modules corresponding to the test method in the embodiments of the present invention (for example, the feedback interception unit, test request determination unit, expected result determination unit, test result generation unit and test report generation unit shown in Fig. 5). The one or more modules are stored in the non-volatile computer-readable storage medium and, when executed by a processor, perform the test method of any of the above method embodiments.
The non-volatile computer-readable storage medium may include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required by at least one function, and the data storage area may store data created according to the use of the test apparatus, and the like. In addition, the non-volatile computer-readable storage medium may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the non-volatile computer-readable storage medium may optionally include memory disposed remotely from the processor; such remote memory may be connected to the test apparatus via a network. Examples of such networks include but are not limited to the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
An embodiment of the present invention further provides a computer program product comprising a computer program stored on a non-volatile computer-readable storage medium; the computer program comprises program instructions which, when executed by a computer, cause the computer to perform any of the above test methods.
Fig. 8 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention. As shown in Fig. 8, the device includes:
one or more processors 810 and a memory 820; one processor 810 is taken as an example in Fig. 8.
The device for the test method may further include an input means 830 and an output means 840.
The processor 810, memory 820, input means 830 and output means 840 may be connected by a bus or in other ways; connection by a bus is taken as an example in Fig. 8.
The memory 820 is the non-volatile computer-readable storage medium described above. By running the non-volatile software programs, instructions and modules stored in the memory 820, the processor 810 executes the various functional applications and data processing of the server, i.e., implements the test method of the above method embodiments.
The input means 830 may receive input digital or character information and generate key-signal input related to the user settings and function control of the test apparatus. The output means 840 may include a display device such as a display screen.
The above product can perform the method provided by the embodiments of the present invention and has the functional modules and beneficial effects corresponding to performing the method. For technical details not described exhaustively in this embodiment, refer to the method provided by the embodiments of the present invention.
As one implementation, the above electronic device includes: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to:
intercept the test request feedback directed to the test terminal;
parse the test request feedback and determine the test request;
determine the expected running result from the determined test request by using the rule engine library, wherein the rule engine library includes the test requests and their corresponding expected running results;
execute the test task according to the test request and generate a test result;
compare the expected running result with the test result and generate a test report.
The electronic device of the embodiments of the present invention exists in various forms, including but not limited to:
(1) Mobile communication devices: characterized by mobile communication capability, with voice and data communication as their primary purpose. Such terminals include smartphones (e.g., iPhone), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices: belonging to the category of personal computers, with computing and processing capability and generally with mobile Internet access. Such terminals include PDA, MID and UMPC devices, e.g., iPad.
(3) Portable entertainment devices: able to display and play multimedia content. Such devices include audio and video players (e.g., iPod), handheld game consoles, e-book readers, smart toys, and portable in-vehicle navigation devices.
(4) Servers: devices providing computing services; a server consists of a processor, hard disk, memory, system bus, etc. A server is architecturally similar to a general-purpose computer, but because it must provide highly reliable services, it has higher requirements in terms of processing capability, stability, reliability, security, scalability and manageability.
(5) Other electronic apparatuses with data interaction capability.
The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement them without creative effort.
From the description of the above embodiments, those skilled in the art can clearly understand that the embodiments can be implemented by software plus a necessary general-purpose hardware platform, or, of course, by hardware. Based on this understanding, the above technical solutions, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk or optical disc, and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the method described in each embodiment or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (13)

  1. A test method, applied to an electronic device, comprising:
    intercepting test request feedback directed to a test terminal;
    parsing the test request feedback to determine a test request;
    determining an expected running result from the determined test request by using a rule engine library, wherein the rule engine library comprises test requests and their corresponding expected running results;
    comparing a test result generated by executing a test task corresponding to the test request with the expected running result, and generating a test report.
  2. The method according to claim 1, wherein intercepting the test request feedback directed to the test terminal comprises: adding, to the test request feedback, a redirect address pointing to the rule engine library.
  3. The method according to claim 1 or 2, comprising, before intercepting the test request feedback directed to the test terminal: generating a one-way connection from the test terminal to a server side.
  4. The method according to claim 1 or 2, wherein, when working-state information of a server side indicates a test state, it is determined that the test request feedback directed to the test terminal is to be intercepted.
  5. The method according to claim 4, comprising, after generating the test report:
    when it is determined that the test result in the test report is consistent with the expected running result, switching the working-state information of the server side from the test state to an operating state.
  6. A test system, comprising:
    a feedback interception unit configured to intercept test request feedback directed to a test terminal;
    a test request determination unit configured to parse the test request feedback and determine a test request;
    an expected result determination unit configured to determine an expected running result from the determined test request by using a rule engine library, wherein the rule engine library comprises test requests and their corresponding expected running results;
    a test result generation unit configured to execute a test task according to the test request and generate a test result;
    a test report generation unit configured to compare the test result generated by executing the test task corresponding to the test request with the expected running result, and generate a test report.
  7. The system according to claim 6, wherein the feedback interception unit is configured to: add, to the test request feedback, a redirect address pointing to the rule engine library.
  8. The system according to claim 6 or 7, further comprising:
    a connection generation unit configured to generate a one-way connection from the test terminal to a server side.
  9. The system according to claim 6 or 7, wherein the feedback interception unit is configured to:
    when working-state information of the server side indicates a test state, determine that the test request feedback directed to the test terminal is to be intercepted.
  10. The system according to claim 9, further comprising:
    a state switching unit configured to switch the working-state information of the server side from the test state to an operating state when it is determined that the test result in the test report is consistent with the expected running result.
  11. An electronic device, comprising:
    at least one processor; and,
    a memory communicatively connected to the at least one processor; wherein,
    the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the method according to any one of claims 1-5.
  12. A non-transitory computer-readable storage medium, storing computer instructions for causing a computer to perform the method according to any one of claims 1-5.
  13. A computer program product, comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method according to any one of claims 1-5.
PCT/CN2016/100153 2016-04-20 2016-09-26 Test method and system WO2017181591A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610248234.7 2016-04-20
CN201610248234.7A CN105955878A (zh) 2016-04-20 2016-04-20 Server-side test method and system

Publications (1)

Publication Number Publication Date
WO2017181591A1 true WO2017181591A1 (zh) 2017-10-26

Family

ID=56917912

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/100153 WO2017181591A1 (zh) 2016-04-20 2016-09-26 测试方法及系统

Country Status (2)

Country Link
CN (1) CN105955878A (zh)
WO (1) WO2017181591A1 (zh)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109981193A (zh) * 2017-12-28 2019-07-05 北京松果电子有限公司 LTE-based image-transmission module test method, apparatus, storage medium and device
CN110502444A (zh) * 2019-08-28 2019-11-26 北京达佳互联信息技术有限公司 Test method and test apparatus for an image processing algorithm
CN110569287A (zh) * 2019-09-11 2019-12-13 上海移远通信技术股份有限公司 Control method and system for product sampling tests, electronic device and storage medium
CN110674047A (zh) * 2019-09-26 2020-01-10 北京字节跳动网络技术有限公司 Software test method and apparatus, and electronic device
CN110737209A (zh) * 2019-11-04 2020-01-31 成都锐能科技有限公司 Method and apparatus for simulating a control display component, electronic device and storage medium
CN110888816A (zh) * 2019-12-11 2020-03-17 广州品唯软件有限公司 Program test method, program test apparatus and storage medium
CN111104332A (zh) * 2019-12-20 2020-05-05 广州品唯软件有限公司 Coverage test method, test apparatus, service device and readable storage medium
CN111163134A (zh) * 2019-12-11 2020-05-15 浙江极智通信科技股份有限公司 Device test method and system
CN111432045A (zh) * 2020-03-19 2020-07-17 杭州迪普科技股份有限公司 Test method, apparatus and device for a domain name system server scheduling algorithm
CN111858296A (zh) * 2019-12-31 2020-10-30 北京嘀嘀无限科技发展有限公司 Interface test method, apparatus, device and storage medium
CN111858295A (zh) * 2019-12-31 2020-10-30 北京骑胜科技有限公司 Firmware test method and apparatus, electronic device and storage medium
CN113138914A (zh) * 2020-01-19 2021-07-20 腾讯科技(深圳)有限公司 Resource interaction system test method and apparatus, storage medium and computer device
CN114071119A (zh) * 2020-07-31 2022-02-18 北京达佳互联信息技术有限公司 Resource test method and apparatus, electronic device and storage medium
CN116737483A (zh) * 2023-08-11 2023-09-12 成都飞机工业(集团)有限责任公司 Assembly test interaction method and apparatus, device and storage medium

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
CN105955878A (zh) 2016-04-20 2016-09-21 乐视控股(北京)有限公司 Server-side test method and system
CN109672790B (zh) * 2018-09-20 2021-10-01 平安科技(深圳)有限公司 Traffic request redirection method, apparatus and device, and readable storage medium
CN116340187B (zh) * 2023-05-25 2023-08-15 建信金融科技有限责任公司 Rule engine migration test method and apparatus, electronic device and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103023711A (zh) * 2011-09-22 2013-04-03 腾讯科技(深圳)有限公司 Service reliability verification method and system
CN103136099A (zh) * 2011-12-02 2013-06-05 腾讯科技(深圳)有限公司 Method for testing software, simulated terminal, background server and system
US20130246853A1 (en) * 2012-03-13 2013-09-19 Truemetrics Llc System and methods for automated testing of functionally complex systems
CN105955878A (zh) * 2016-04-20 2016-09-21 乐视控股(北京)有限公司 Server-side test method and system

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US6769114B2 (en) * 2000-05-19 2004-07-27 Wu-Hon Francis Leung Methods and apparatus for preventing software modifications from invalidating previously passed integration tests
CN103473174A (zh) * 2013-09-10 2013-12-25 四川长虹电器股份有限公司 Cloud test method for smart-TV application software
CN105354140B (zh) * 2015-11-02 2018-09-25 上海聚力传媒技术有限公司 Automated test method and system


Cited By (23)

Publication number Priority date Publication date Assignee Title
CN109981193B (zh) * 2017-12-28 2021-06-29 北京小米松果电子有限公司 LTE-based image-transmission module test method, apparatus, storage medium and device
CN109981193A (zh) * 2017-12-28 2019-07-05 北京松果电子有限公司 LTE-based image-transmission module test method, apparatus, storage medium and device
CN110502444A (zh) * 2019-08-28 2019-11-26 北京达佳互联信息技术有限公司 Test method and test apparatus for an image processing algorithm
CN110502444B (zh) * 2019-08-28 2023-08-18 北京达佳互联信息技术有限公司 Test method and test apparatus for an image processing algorithm
CN110569287A (zh) * 2019-09-11 2019-12-13 上海移远通信技术股份有限公司 Control method and system for product sampling tests, electronic device and storage medium
CN110674047A (zh) * 2019-09-26 2020-01-10 北京字节跳动网络技术有限公司 Software test method and apparatus, and electronic device
CN110737209A (zh) * 2019-11-04 2020-01-31 成都锐能科技有限公司 Method and apparatus for simulating a control display component, electronic device and storage medium
CN110888816A (zh) * 2019-12-11 2020-03-17 广州品唯软件有限公司 Program test method, program test apparatus and storage medium
CN111163134A (zh) * 2019-12-11 2020-05-15 浙江极智通信科技股份有限公司 Device test method and system
CN110888816B (zh) * 2019-12-11 2023-08-22 广州品唯软件有限公司 Program test method, program test apparatus and storage medium
CN111104332B (zh) * 2019-12-20 2024-01-30 广州品唯软件有限公司 Coverage test method, test apparatus, service device and readable storage medium
CN111104332A (zh) * 2019-12-20 2020-05-05 广州品唯软件有限公司 Coverage test method, test apparatus, service device and readable storage medium
CN111858296A (zh) * 2019-12-31 2020-10-30 北京嘀嘀无限科技发展有限公司 Interface test method, apparatus, device and storage medium
CN111858295A (zh) * 2019-12-31 2020-10-30 北京骑胜科技有限公司 Firmware test method and apparatus, electronic device and storage medium
CN111858295B (zh) * 2019-12-31 2024-05-14 北京骑胜科技有限公司 Firmware test method and apparatus, electronic device and storage medium
CN113138914A (zh) * 2020-01-19 2021-07-20 腾讯科技(深圳)有限公司 Resource interaction system test method and apparatus, storage medium and computer device
CN113138914B (zh) * 2020-01-19 2024-04-26 腾讯科技(深圳)有限公司 Resource interaction system test method and apparatus, storage medium and computer device
CN111432045B (zh) * 2020-03-19 2022-05-31 杭州迪普科技股份有限公司 Test method, apparatus and device for a domain name system server scheduling algorithm
CN111432045A (zh) * 2020-03-19 2020-07-17 杭州迪普科技股份有限公司 Test method, apparatus and device for a domain name system server scheduling algorithm
CN114071119B (zh) * 2020-07-31 2024-03-19 北京达佳互联信息技术有限公司 Resource test method and apparatus, electronic device and storage medium
CN114071119A (zh) * 2020-07-31 2022-02-18 北京达佳互联信息技术有限公司 Resource test method and apparatus, electronic device and storage medium
CN116737483B (zh) * 2023-08-11 2023-12-08 成都飞机工业(集团)有限责任公司 Assembly test interaction method and apparatus, device and storage medium
CN116737483A (zh) * 2023-08-11 2023-09-12 成都飞机工业(集团)有限责任公司 Assembly test interaction method and apparatus, device and storage medium

Also Published As

Publication number Publication date
CN105955878A (zh) 2016-09-21

Similar Documents

Publication Publication Date Title
WO2017181591A1 (zh) Test method and system
US10394697B2 (en) Focus area integration test heuristics
US9720753B2 (en) CloudSeer: using logs to detect errors in the cloud infrastructure
WO2020233369A1 (zh) Method for improving a software integration system based on simulated ports, and related device
WO2017219589A1 (zh) Method and system for processing program crash messages
US10592237B2 (en) Efficient detection of architecture related bugs during the porting process
WO2018000607A1 (zh) Method for identifying the cause of a test case failure, and electronic device
CN112187585B (zh) Network protocol test method and apparatus
CN108228628B (zh) Wide-table generation method in a structured query language database, and apparatus therefor
CN110750592B (zh) Data synchronization method, apparatus and terminal device
KR102488582B1 (ko) Method and apparatus for verifying the execution state of an application
CN110928770B (zh) Software test method, apparatus and system, storage medium and electronic device
WO2019205555A1 (zh) Message pushing method and apparatus
US9811447B2 (en) Generating a fingerprint representing a response of an application to a simulation of a fault of an external service
CN110830500B (zh) Network attack tracing method and apparatus, electronic device and readable storage medium
CN114546830A (zh) Regression test method and apparatus, electronic device and storage medium
US9626268B2 (en) Controlling a byte code transformer on detection of completion of an asynchronous command
US10558556B2 (en) Introspective fault and workload injection for service assurance
US20150378710A1 (en) Patching Auto-Stop
CN110795330A (zh) Monkey stress test method and apparatus
JP6592450B2 (ja) System and method for data synchronization and failover management
Hine et al. Reac2o: a runtime for enterprise system models
CN109522187B (zh) Method and apparatus for fast extraction of state information
CN107276852B (zh) Data security detection method and terminal
CN111694686A (zh) Method and apparatus for handling an abnormal service, electronic device and storage medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16899177

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16899177

Country of ref document: EP

Kind code of ref document: A1