CN114791875A - Usability testing method and device, electronic equipment and storage medium

Publication number: CN114791875A
Application number: CN202110099106.1A
Authority: CN (China)
Prior art keywords: test, data, control, state, link address
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 支尚, 杨涛
Assignee (current and original): Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd; priority to CN202110099106.1A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3664: Environments for testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

Embodiments of the disclosure relate to a usability testing method and apparatus, an electronic device, and a storage medium. In at least one embodiment of the disclosure, a data collection script for the usability test is obtained and embedded into the page of the link address of a test object, so that test data are collected automatically by the script; a link address for the usability test is then generated based on the link address of the test object and the test task, so that a test user can independently complete an online test of the test object simply by accessing that link address, without manual intervention by a third party. Compared with existing offline testing, there is no need to arrange test tasks manually on site or to transcribe and organize test data by hand, which improves test efficiency and reduces labor cost; compared with existing online testing, there is no need to communicate remotely with the test user to arrange the online test tasks or to transcribe and organize the test data manually, which likewise improves test efficiency.

Description

Usability testing method and device, electronic equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of usability testing, in particular to a usability testing method and device, electronic equipment and a non-transitory computer readable storage medium.
Background
Traditionally, to learn how users actually use a product, offline usability testing of the product is required. The process of an offline usability test is typically as follows: the user performs typical operations on the product while an observer and a product developer watch, listen, and take notes nearby, so as to obtain the user's operation records, feedback, problems, and other data, which are then used to improve the product and its usability.
However, offline usability testing has the following problems: the user must operate the product on site, the observer and the product developer must also transcribe and organize data on site, and the usability test tasks are usually arranged on site as well, so the observer and the developer have to travel to the user's location and observe and record manually. The whole testing process is inefficient, time-consuming, and costly in labor, and the test user cannot complete the test independently. In addition, offline testing cannot be carried out during the COVID-19 pandemic.
Usability testing can also be performed online, for example via remote video and desktop sharing. However, an observer is still required to arrange the usability test tasks, communicate remotely with the user, and transcribe and organize the user's operation records, feedback, problems, and other data, so test efficiency remains low and the test user still cannot complete the test independently.
The above description of the discovery process of the problems is only for the purpose of assisting understanding of the technical solutions of the present disclosure, and does not represent an admission that the above is prior art.
Disclosure of Invention
To solve at least one problem of the prior art, at least one embodiment of the present disclosure provides an ease-of-use testing method, apparatus, electronic device, and non-transitory computer-readable storage medium.
In a first aspect, an embodiment of the present disclosure provides an ease of use testing method, including:
acquiring a first link address and a test task of a test object;
acquiring a data acquisition script for the usability test;
embedding the data acquisition script into a page of the first link address;
generating a second link address based on the first link address and the test task;
and responding to the access request of the second link address, jumping to the page of the first link address to enable the data acquisition script to guide usability testing based on the testing task, and acquiring testing data based on the data acquisition script.
In a second aspect, an embodiment of the present disclosure further provides an ease of use testing apparatus, including:
the first acquisition unit is used for acquiring a first link address and a test task of a test object;
the second acquisition unit is used for acquiring a data acquisition script of the usability test;
the embedding unit is used for embedding the data acquisition script into the page of the first link address;
the generating unit is used for generating a second link address based on the first link address and the test task;
and the response unit is used for responding to the access request of the second link address, jumping to the page of the first link address to enable the data acquisition script to guide usability testing based on the testing task, and acquiring testing data based on the data acquisition script.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor and a memory; the processor is configured to perform the steps of the usability testing method according to any of the embodiments of the first aspect by calling a program or instructions stored in the memory.
In a fourth aspect, embodiments of the present disclosure further provide a non-transitory computer-readable storage medium for storing a program or instructions, where the program or instructions cause a computer to perform the steps of the usability testing method according to any one of the embodiments of the first aspect.
Therefore, in at least one embodiment of the present disclosure, a data collection script for the usability test is obtained and embedded into the page of the link address of the test object, so that test data are collected automatically by the script, and a link address for the usability test is generated based on the link address of the test object and the test task, so that the test user can independently complete the online test of the test object by accessing that link address. Compared with existing offline testing, there is no need to arrange test tasks manually on site or to transcribe and organize test data by hand, which improves test efficiency and reduces labor cost; compared with existing online testing, there is no need to communicate remotely with the test user to arrange the online test tasks or to transcribe and organize the test data manually, which likewise improves test efficiency.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present disclosure, and those skilled in the art can derive other drawings from them.
FIG. 1 is a diagram of an exemplary application scenario provided by an embodiment of the present disclosure;
FIG. 2 is a flow chart of a method for ease of use testing provided by embodiments of the present disclosure;
FIG. 3 is an interaction diagram of a method for usability testing provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first interface for adding a test object according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a second interface for embedding a test object according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a third interface for adding a test task according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a fourth interface for creating a test item according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of an interface for successfully creating a test project according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of an interface for starting a test provided by an embodiment of the present disclosure;
FIG. 10 is a schematic illustration of a floating page provided by an embodiment of the present disclosure;
FIG. 11 is a first state diagram of the floating page shown in FIG. 10;
FIG. 12 is a second state diagram of the floating page shown in FIG. 10;
FIG. 13 is a third state diagram of the floating page shown in FIG. 10;
FIG. 14 is a fourth state diagram of the floating page shown in FIG. 10;
FIG. 15 is a schematic diagram of a video display area and a video timeline area in an analysis interface provided by an embodiment of the present disclosure;
FIG. 16 is a first state diagram of a multidimensional data detail tree region in an analysis interface provided by an embodiment of the present disclosure;
FIG. 17 is a second state diagram of a multidimensional data detail tree region in an analysis interface provided by an embodiment of the present disclosure;
FIG. 18 is a page change diagram of the conversion of feedback data to problem marking provided by an embodiment of the present disclosure;
FIG. 19 is a schematic diagram of a problem recording and scoring area in an analysis interface according to an embodiment of the present disclosure;
FIG. 20 is an interface diagram of an ease of use test conclusion provided by an embodiment of the present disclosure;
FIG. 21 is a block diagram of an ease-of-use testing apparatus provided by an embodiment of the present disclosure;
fig. 22 is an exemplary block diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features, and advantages of the present disclosure may be more clearly understood, the present disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the described embodiments are only some, not all, of the embodiments of the present disclosure. The specific embodiments described herein are merely illustrative of the disclosure and are not intended to limit it. All other embodiments derived by one of ordinary skill in the art from the described embodiments of the disclosure fall within the scope of the disclosure.
It is noted that, in this document, relational terms such as "first" and "second," and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
It should be noted that, in this document, the layout, style, and color of the page, and the layout, style, and color of the interface may all be flexibly changed according to actual needs.
Existing offline testing requires manual arrangement of test tasks and manual transcription and organization of test data on site, which makes testing inefficient and labor-intensive and prevents the test user from completing the test independently; existing online testing requires remote communication with the test user to arrange the online test tasks and to transcribe and organize the test data manually, which likewise makes testing inefficient and prevents the test user from completing the test independently. To address these problems, in at least one embodiment of the present disclosure, a data collection script for the usability test is obtained and embedded into the page of the link address of the test object, so that test data are collected automatically by the script; and a link address for the usability test is generated based on the link address of the test object and the test task, so that the test user can independently complete the online test of the test object by accessing that link address, without manual intervention by a third party.
Compared with existing offline testing, there is no need to arrange test tasks manually on site or to transcribe and organize test data by hand, which improves test efficiency and reduces labor cost; compared with existing online testing, there is no need to communicate remotely with the test user to arrange the online test tasks or to transcribe and organize the test data manually, which likewise improves test efficiency.
Fig. 1 is a diagram of an exemplary application scenario provided by an embodiment of the present disclosure. In fig. 1, the test object 11 may be understood as a product to be tested, for example, a product that can be displayed by a page of a link address, such as a software system, an application program, a page code, and the like.
The usability testing device 12 can automatically collect the test data by using the data collection script by acquiring the data collection script for the usability test and embedding the data collection script into the page of the link address of the test object 11, thereby solving the problem of low testing efficiency caused by manual transcription of the test data.
In some embodiments, the data acquisition script is embedded into the page of the link address of the test object 11 in an automatic manner, that is, the usability testing device 12 may directly access the link address of the test object 11 and embed the data acquisition script into the code corresponding to the page of the link address of the test object 11.
In some embodiments, the data collection script may be embedded into the page of the link address of the test object 11 manually, that is, the usability test device 12 may inform a third party, such as a project associator, of the data collection script, and the project associator may embed the data collection script into the page of the link address of the test object 11. Where a project associate may be understood as an associate of an ease of use test project, such as the observer and/or developer mentioned in the background.
In fig. 1, the usability testing apparatus 12 may further generate a link address for the usability test based on the link address of the test object 11 and the test task, so that the test user may access the link address to independently complete the online test of the test object 11 without manual intervention of a third party. The problem of low testing efficiency caused by the fact that field communication or online communication with a testing user is needed to arrange a testing task is solved.
In fig. 1, a plurality of test user devices 13 are shown, each test user device 13 having a network connection function and being capable of performing an ease-of-use online test. Therefore, there may be a plurality of users participating in the usability test, and each user may perform an independent test on the test object 11 through the respective test user's device 13 using the network.
Taking the device 13 of one test user as an example, the device 13 of the test user can obtain the link address of the usability test from the usability test device 12 through the network, and further access the link address of the usability test through the network to realize the online test on the test object 11.
In some embodiments, multiple users may collaboratively complete a collaborative test on the test object 11 using a network through respective test user's devices 13. For example, the devices 13 of the plurality of test users can acquire the link address of the usability test from the usability test apparatus 12 via the network, access the link address of the usability test via the network, and edit the link address on the same page corresponding to the link address. In the editing process, for each test user's device 13, the editing contents are displayed in a page in real time, and the editing contents are transmitted to the ease-of-use testing apparatus 12 through a network. The usability testing apparatus 12 may display the edited contents of different users in a page according to a certain page editing policy, so that the device 13 of each testing user may display the edited contents of all users based on the display layout. The page editing strategy can follow the mature strategy in the technical field of multi-person collaborative editing of pages, and is not described any further. In this way, the device 13 of each test user can see not only the contents edited by itself on the page in real time but also the contents edited by other users on the same page.
Fig. 2 is a flowchart of a usability testing method provided by an embodiment of the present disclosure. In some embodiments, the usability test method shown in fig. 2 may be applied to the application scenario shown in fig. 1, and accordingly, the usability test apparatus 12 in fig. 1 is an executing subject of the usability test method shown in fig. 2. For convenience of description, the following embodiments describe the flow of the usability test method with the usability test apparatus as the execution subject.
As shown in fig. 2, in step 201, the usability test apparatus obtains a first link address and a test task of a test object. The test task may include, but is not limited to, a name of the test task and an operational description of the test task. In some embodiments, in addition to obtaining the first link address of the test object, the name of the test object may also be obtained. In some embodiments, the link address may be a URL (Uniform Resource Locator), a unique Identifier (ID), or any other form of character string.
In some embodiments, the ease-of-use testing apparatus may present an interactive interface including an input box for the first link address of the test object, an input box for the test task, and a confirmation (or submission) control. And the project associator inputs the first link address and the test task of the test object in the interactive interface at the same time, clicks the confirmation (or submission) control, and responds to the clicking operation of the confirmation (or submission) control by the usability testing device to acquire the first link address and the test task of the test object input by the test user in the input frame of the interactive interface.
In some embodiments, the ease of use testing apparatus may present two interactive interfaces, one for adding test objects and the other for adding test tasks. For example, an input box and a confirmation (or submission) control in the interactive interface for adding the test object, wherein the input box comprises a first link address of the test object; the interactive interface for adding the test task comprises an input box and a confirmation (or submission) control of the test task. The usability testing device firstly displays an interactive interface for adding a test object, and then displays the interactive interface for adding the test task after acquiring the first link address of the test object, so as to acquire the test task.
In step 202, the usability testing apparatus obtains a data collection script for the usability test. In some embodiments, the data collection script may be the front-end script of a collection probe, where the collection probe may be a Web script program, that is, a script file that implements collection functions through a Web page programming language (such as ASP, PHP, and the like). The collection functions include, for example, screen recording and creating a floating page for collecting text, images, and other data input by the user.
In some embodiments, the usability testing apparatus may automatically generate the data collection script for the test object after acquiring the first link address of the test object. In some embodiments, the ease-of-use testing apparatus may invoke a pre-set data collection script.
In step 203, the usability testing apparatus embeds the data collection script in the page of the first link address. In some embodiments, the usability testing apparatus adds the data collection script to the <body/> tag of the page of the first link address, for example as the first line of the <body/> tag content.
In some embodiments, the ease of use testing device may directly access the link address of the test object, embedding the data collection script in the page of the link address of the test object.
In some embodiments, the ease-of-use testing device may inform the project associator of the data collection script, which is embedded by the project associator in a page of the link address of the test object.
For example, after the usability testing apparatus generates the data collection script, a prompt interface is displayed; the prompt information shown in the prompt interface prompts the project associate to copy the data collection script and add it to the first line of the <body/> tag content of the page of the first link address. In this way, the project associate knows that the data collection script should be copied, opens the page of the first link address, and adds the data collection script to the first line of the <body/> tag content of that page.
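As a purely illustrative sketch of the automatic embedding described above, the following TypeScript snippet injects a collection script as the first child of the target page's <body/>; the script URL, the data attribute, and the function name are assumptions and are not taken from the patent.

```typescript
// Minimal sketch of automatic embedding. The script URL and the project-id
// attribute are hypothetical; the patent only specifies that the collection
// script is added as the first line of the <body/> tag content.
function embedCollectionScript(doc: Document, projectId: string): void {
  const probe = doc.createElement("script");
  probe.src = "https://etest.example.com/collect.js"; // hypothetical probe location
  probe.dataset.projectId = projectId;                // tells the probe which test project to report to
  probe.async = true;
  doc.body.insertBefore(probe, doc.body.firstChild);  // "first line" of the <body/> content
}
```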
In step 204, the usability test apparatus generates a second link address based on the first link address and the test task. Specifically, the ease-of-use testing apparatus may package a name of the test object, the first link address, a name of the test task, and an operation description of the test task as the second link address.
In some embodiments, the usability testing apparatus, after obtaining the testing task, displays an interactive interface for creating the testing item, and the item associator may create the testing item for the testing object using the interactive interface, so that the item associator manages each usability test in the manner of the testing item.
In some embodiments, the name of the test object, the first link address, and the test task are displayed in the interactive interface for creating the test project so that the project associate can verify whether they are correct; an input box for the project name and a create (or submit) control are also provided in the interface, so that the project associate can enter a project name for the usability test. After confirming that all the information displayed in the interface is correct and entering the project name, the project associate can click the create (or submit) control to complete the creation of the test project. Accordingly, the usability testing apparatus encapsulates the project name, the name of the test object, the first link address, and the test task as the second link address in response to the click operation on the create (or submit) control.
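The patent does not fix a concrete encoding for this "packaging"; the sketch below assumes, for illustration only, that the information is carried as query parameters of a URL, and the host name is hypothetical.

```typescript
// A sketch of packaging the project name, test object name, first link address,
// and test tasks into a "second link address". The encoding (query parameters)
// and the host are assumptions.
interface TestTask {
  name: string;        // name of the test task
  description: string; // operation description of the test task
}

function buildSecondLinkAddress(
  projectName: string,
  objectName: string,
  firstLinkAddress: string,
  tasks: TestTask[],
): string {
  const url = new URL("https://etest.example.com/run"); // hypothetical entry page of the testing apparatus
  url.searchParams.set("project", projectName);
  url.searchParams.set("object", objectName);
  url.searchParams.set("target", firstLinkAddress);
  url.searchParams.set("tasks", JSON.stringify(tasks));
  return url.toString(); // the second link address sent to the test user
}
```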
After the usability testing device generates the second link address, the usability testing device can prompt the project correlation party to copy the second link address and send the second link address to the testing user, so that the testing user can access the second link address through the equipment 13 of the testing user shown in fig. 1 to independently complete the online test of the test object without manual intervention of a third party.
In step 205, the usability test apparatus responds to the access request of the second link address, jumps to the page of the first link address, so that the data collection script guides the usability test based on the test task, and collects the test data based on the data collection script. Therefore, the data acquisition script guides the test user to independently complete the usability test based on the test task without manual intervention.
In some embodiments, the usability testing device responds to the access request of the second link address, presents the start testing interface, the testing user inputs a name in the start testing interface, and clicks the start testing control in the start testing interface after confirming that the usability testing service agreement is read. Correspondingly, the usability testing device responds to the click operation of the starting testing control, jumps to the page of the first link address, so that the data acquisition script guides the usability testing based on the testing task, and acquires the testing data based on the data acquisition script.
In some embodiments, the data collection script directs the ease of use test based on the test task, for example: the data acquisition script creates a floating layer page in the page of the first link address; the floating layer page is used for guiding the execution of the test task. For example, the name of the test task and the operation description of the test task can be displayed in the created floating layer page, so that a test user can know the details of the test task and further independently operate according to the operation description of the test task in the page of the first link address, and meanwhile, the data acquisition script can automatically acquire test data generated in the operation process of the test user.
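For illustration, a minimal sketch of how a collection script might create such a floating layer is given below; the element id, styling, and layout are assumptions and do not reproduce the interfaces shown in the figures.

```typescript
// A sketch of creating a floating layer that shows the current test task.
// Ids and styles are illustrative only.
function createFloatingLayer(task: { name: string; description: string }): HTMLElement {
  const layer = document.createElement("div");
  layer.id = "etest-floating-layer";
  layer.style.cssText =
    "position:fixed;right:16px;bottom:16px;z-index:99999;width:320px;" +
    "background:#fff;border:1px solid #ddd;border-radius:8px;padding:12px;";
  const title = document.createElement("strong");
  title.textContent = task.name;        // name of the test task
  const body = document.createElement("p");
  body.textContent = task.description;  // operation description of the test task
  layer.append(title, body);
  document.body.appendChild(layer);
  return layer;
}
```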
To illustrate the relationship of interaction between different subjects more clearly, the interaction flow of the ease-of-use testing method is described below in conjunction with fig. 3. Fig. 3 is an interaction diagram of an ease of use testing method provided by an embodiment of the present disclosure, and it should be noted that fig. 3 is only some embodiments, but not all embodiments, of the present disclosure.
As shown in fig. 3, four types of subjects are involved in the interaction flow of the whole usability testing method: the device of the test user, the device of the project correlation side, the usability test device and the test object.
In some embodiments, the ease-of-use testing device may be a software device, a hardware device, or a combination of software and hardware. For example, the usability testing apparatus is a software apparatus running on an operating system, and the operating system may be an operating system (IOS system, android system, etc.) of a mobile device, so that the usability testing apparatus may be installed on the mobile device, such as a mobile device like a smart phone, a tablet computer, etc.; the operating system may be an operating system of a fixed device (Windows system, Linux system, etc.), and thus, the usability testing apparatus may be installed on the fixed device, such as a desktop computer, a cloud server, etc.
In some embodiments, if the ease of use testing device is a software device, the ease of use testing device may be installed in a facility of the project associate; if the usability testing device is a hardware device or a device combining software and hardware, the usability testing device may be provided inside the equipment of the project-related party or may be implemented as a device independent from the equipment of the project-related party.
In fig. 3, the test user uses the equipment of the test user, and the project associator uses the equipment of the project associator. The interaction between the equipment of the test user, the equipment of the project associator, the ease of use testing device and the test object is as follows:
in fig. 3, the device of the project associator adds the test object in the interactive interface for adding the test object, which is displayed by the usability testing apparatus, including the name of the test object and the first link address of the test object. Thus, the usability test apparatus receives the name of the test object and the first link address of the test object through the interactive interface.
In fig. 3, the usability test apparatus receives the name of the test object and the first link address of the test object, and then generates a data collection script for the test object. The ease of use testing apparatus may inform the data collection script to the equipment of the project associate.
For example, after the usability testing apparatus generates the data collection script, a prompt interface is displayed; the prompt information shown in the prompt interface prompts the project associate to copy the data collection script and add it to the first line of the <body/> tag content of the page of the first link address. In this way, the project associate knows that the data collection script should be copied, opens the page of the first link address, adds the data collection script to the first line of the <body/> tag content of that page, and thereby completes the embedding of the data collection script shown in fig. 3.
In fig. 3, an embedding test may be performed manually on the test object with the embedded data collection script; after the embedding test passes, the project associate can be notified to add the test task and create the test project associated with the test task.
In some embodiments, after the embedded test passes, the ease-of-use testing device may expose an interactive interface for adding test tasks, so that the project associator adds test tasks, including names of the test tasks and operational descriptions of the test tasks. The usability testing device receives the name and the operation description of the testing task, then displays an interactive interface for creating a testing project, so that a project correlation party creates the testing project correlated with the testing task, the project correlation party clicks a creating control in the interactive interface after the testing project is created, and the usability testing device responds to the clicking operation of the creating control to generate a testing link, namely a second link address.
In fig. 3, the usability test apparatus may inform the device of the project related party of the test link after generating the test link. For example, after the usability testing apparatus generates the test link, the interface on which the test item is successfully created is displayed, the generated test link is displayed in the interface, and the item associated party is prompted to copy the test link and send the test link to the test user, so that the device of the test user can access the test link to perform the usability test.
In FIG. 3, the device of the project associate sends the test link to the device of the test user; testing equipment access test links of a user; the usability testing device responds to the access request of the test link and displays a test starting interface; the test user inputs a name in the test starting interface, and clicks the test starting control in the test starting interface after confirming that the usability test service protocol is read. And the usability testing device responds to the click operation of the starting testing control, carries out privacy authorization on equipment of a testing user, and then jumps to a page of the first link address, so that the data acquisition script guides the usability test based on the testing task, and acquires the testing data based on the data acquisition script.
In fig. 3, the data acquisition script may start screen recording after the test starts, acquire test data, and send the test data to the usability testing apparatus. And after all the test tasks are completed, closing screen recording by the data acquisition script, and ending the usability test.
In order to more intuitively understand the flow of the usability test method according to the embodiment of the present disclosure, a process of generating test items in the usability test method is described below with reference to fig. 4 to 9.
Test item generation
Take as an example the case where the usability testing apparatus is a software apparatus installed on the equipment of the project associate. The project associate starts the usability testing apparatus on that equipment, for example by clicking or double-clicking the icon of the usability testing apparatus to access it.
The usability testing device responds to a starting operation (clicking or double-clicking operation) and shows a first interface for adding a testing object. The first interface comprises an input box of test object configuration information and a first control. Test object configuration information may include, but is not limited to: the name of the test object and the first link address of the test object.
Fig. 4 is a schematic view of a first interface for adding a test object according to an embodiment of the present disclosure. In fig. 4, the product is the test object, the add product is the add test object, the product name is the test object name, the product link address is the first link address of the test object, and the "next" control is the first control. The first interface shown in fig. 4 includes an input box for a product name, an input box for a product link address, and a "next" control.
In response to the click operation on the "next" control (in fig. 4), the usability testing apparatus obtains the product name and the product link address in the input boxes and displays a second interface for embedding the test object. The second interface displays embedding prompt information and a second control. The prompt information prompts the project associate to copy the data collection script and add it to the first line of the <body/> tag content of the page of the product link address.
Fig. 5 is a schematic diagram of a second interface for embedding a test object according to an embodiment of the present disclosure. The embedding prompt information displayed in the second interface shown in fig. 5 is: please paste the following code on the first line of the product's <body/> content to embed the product in the ETest tool. Here the ETest tool is the usability testing apparatus. Fig. 5 also shows the code of the data collection script and a "copy code" control; the project associate can click the "copy code" control to copy the code quickly. In addition, fig. 5 includes a "next" control, which is the second control.
The usability testing device responds to clicking operation of the 'next' control (in fig. 5) and displays a third interface for adding the testing task. And the third interface comprises an input box of the test task configuration information and a third control. Test task configuration information includes, but is not limited to: name of the test task and operational description of the test task.
Fig. 6 is a schematic diagram of a third interface for adding a test task according to an embodiment of the present disclosure. In fig. 6, the task name is the name of the test task, the task description is the operation description of the test task, and the "determination" control is the third control. The third interface shown in fig. 6 includes an input box of task name, an input box of task description, a "determine" control, and an "add test task" control. And clicking the 'add test task' control by the project associator, and adding a plurality of test tasks.
The ease of use testing apparatus presents a fourth interface for creating the test item in response to a click operation of the "OK" control (FIG. 6). And displaying the test object configuration information, the test task configuration information, the input box of the test project name and a fourth control in a fourth interface.
Fig. 7 is a schematic diagram of a fourth interface for creating a test item according to an embodiment of the present disclosure. In fig. 7, the product is the name of the test object, the item name is the test item name, the test start page is the first link address of the test object, and the "submit" control is the fourth control. The fourth interface shown in fig. 7 includes an input box of the product to which the user belongs, an input box of the project name, an input box of the test start page, a label of the test task, a "add test task" control, and a "submit" control. The number of the labels of the test tasks can be multiple, and one label corresponds to one test task. And the input box of the belonged product and the input box of the test starting page display the belonged product and the test starting page acquired by the usability testing device by default. The project correlation party clicks the input box of the product or the input box of the test starting page to which the project correlation party belongs, and the product or the test starting page to which the project correlation party belongs can be modified. And the project associator clicks the input box of the project name to name the project. And clicking x in the label of the test task by the project correlation party to delete the label, namely deleting the test task. And clicking the 'add test task' control by the project associator, and adding a plurality of test tasks.
The usability testing device responds to the clicking operation of the 'submitting' control (in fig. 7) to acquire the project name, and packages the product, the test starting page, the test task and the project name as the second link address. And the usability testing device displays the interface for successfully creating the test item after packaging the second link address, displays the packaged second link address in the interface, prompts the item association party to copy the second link address and sends the second link address to the test user, so that the equipment of the test user can access the second link address to perform usability testing.
Fig. 8 is a schematic diagram of an interface for successfully creating a test item according to an embodiment of the present disclosure. The interface shown in fig. 8 includes a copy control; the project associate can click the copy control to copy the second link address quickly. In addition, fig. 8 displays the following information: "please send the following link address to the target user, the widget assistant will guide them to complete the usability test", prompting the project associate to copy the second link address and send it to the test user. The target user is the test user, and the widget assistant can be understood as the auxiliary testing function provided by the data collection script.
And the project correlation party copies the second link address and sends the second link address to the test user. And after the test user acquires the second link address, the test user enters a test starting interface by clicking the second link address to test the test object.
For example, when the test user clicks the second link address, the usability test apparatus responds to the access request of the second link address, and displays the start test interface as shown in fig. 9, where fig. 9 includes the name input box and the "start test" control. The test user inputs the name in the name input box, and clicks the 'start test' control to test the test object after confirming that the 'usability test service agreement' is read.
In order to more intuitively understand the flow of the usability test method according to the embodiment of the present disclosure, a process of collecting test data in the usability test method is described below with reference to fig. 10 to 14.
Test data collection
The usability testing device responds to the clicking operation of the 'start testing' control in the figure 9, jumps to the page of the first link address, leads the usability testing of the data acquisition script based on the testing task and acquires the testing data based on the data acquisition script.
In some embodiments, after jumping to the page of the first link address, the data acquisition script creates a floating page in the page of the first link address; the floating layer page is used for guiding the execution of the test task. In some embodiments, after the data collection script creates the floating-layer page, a screen recording is performed to generate a screen recording file, wherein the screen recording is a html-based web screen recording.
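The patent describes an HTML-based web screen recording but does not spell the technique out; the sketch below only approximates the start/stop behavior with the browser's MediaRecorder and getDisplayMedia APIs, and the callback name is an assumption.

```typescript
// Approximate sketch: start recording when the floating page is created and
// return a stop function to call once all test tasks are finished. The patent's
// HTML-based recording (e.g. replaying DOM changes) would differ in mechanism.
async function startScreenRecording(onFile: (file: Blob) => void): Promise<() => void> {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => onFile(new Blob(chunks, { type: "video/webm" })); // the screen-recording file
  recorder.start();
  return () => {
    recorder.stop();                              // close screen recording after the last task
    stream.getTracks().forEach((t) => t.stop());
  };
}
```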
In some embodiments, a chat interface is displayed in the floating page, the chat interface including a dialog display area, a dialog input box, and a send control. The data collection script sends the test tasks (including the names of the test tasks and the operation descriptions of the test tasks) to the dialog display area through a virtual assistant, which takes the place of a human in welcoming the test user, arranging the test tasks, and guiding the test user through the usability test. A prompt such as "hit a pain point? tell me about it" is displayed in the dialog input box by default, so the test user knows that, on encountering a problem or pain point, it can be typed into the dialog input box and sent to the dialog display area by clicking the send control.
Fig. 10 is a schematic view of a floating page according to an embodiment of the disclosure. In fig. 10, a chat interface is shown, which includes a dialog display area 101, a dialog input box 102, and a send control 103. The data collection script sends the test task to the dialog display area 101 via the virtual assistant 104. The data collection script responds to the click operation on the send control 103 by sending the content of the dialog input box 102 to the dialog display area 101, as in the first state diagram of the floating page shown in fig. 11. The content of the dialog input box 102 is what the test user enters during the test, such as impressions, evaluations, and complaints.
In some embodiments, a test toolbar may be further included in the floating page, the test toolbar including a first control. The states of the first control include a task switching state and a task completing state.
When the state of the first control is the task switching state, the data collection script responds to the click operation on the first control by sending another test task to the dialog display area through the virtual assistant.
When the state of the first control is the task completion state, the data collection script responds to the click operation on the first control by displaying the usability evaluation interface. The test user can give a score on the usability evaluation interface and put forward opinions and suggestions based on their own impressions.
For example, in fig. 10, a test toolbar 105 is also included in the floating page, and a first control 106 is included in the test toolbar. The state of the first control 106 includes a task switch state (e.g., "next task" as shown in fig. 10). The data collection script sends another test task to the dialog display area 101 via the virtual assistant 104 in response to the click operation of the first control 106 based on the status of the first control 106 being "next task".
In some embodiments, task state information is displayed in the test toolbar. The task state information includes the total number of test tasks and the sequence number of the task currently being tested. For example, in fig. 10, the task state information 107 is displayed in the test toolbar 105. The task state information 107 is, for example, "task 1/4" as shown in fig. 10, indicating that the total number of test tasks is 4 and that the sequence number of the task currently being tested is 1. When the state of the first control 106 is "next task" and the test user clicks the first control 106, the task state information 107 becomes "task 2/4", indicating that the sequence number of the task currently being tested is 2.
In some embodiments, if all the test tasks complete the test, that is, the serial number of the task currently being tested in the task state information is the serial number of the last task, and the test user completes the test of the last task, the data acquisition script modifies the state of the first control to the task completion state, and the "next task" in fig. 10 is displayed as "complete and score", and the usability evaluation interface is displayed in response to the click operation of the first control.
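A minimal sketch of the first control's two states is given below; the state names, callback parameters, and the point at which the label flips to "complete and score" are simplifications made for illustration.

```typescript
// Sketch of the first control: in the task-switching state a click advances to
// the next task; in the task-completion state a click opens the evaluation interface.
type FirstControlState = "switch-task" | "task-complete";

function onFirstControlClick(
  state: FirstControlState,
  currentIndex: number,
  tasks: { name: string; description: string }[],
  sendTaskToDialog: (task: { name: string; description: string }) => void,
  showEvaluation: () => void,
): { state: FirstControlState; currentIndex: number } {
  if (state === "switch-task") {
    const next = currentIndex + 1;
    sendTaskToDialog(tasks[next]);                 // virtual assistant posts the next task, e.g. "task 2/4"
    const isLast = next === tasks.length - 1;
    return { state: isLast ? "task-complete" : "switch-task", currentIndex: next };
  }
  showEvaluation();                                // "complete and score" opens the usability evaluation interface
  return { state, currentIndex };
}
```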
In some embodiments, a voice control is also included in the test toolbar. And the data acquisition script responds to the clicking operation of the voice control, acquires and identifies voice data, converts the identification result into characters and displays the characters in the dialog input box. After the test user confirms that the characters recognized by the voice are correct, the user can click the sending control to display the characters recognized by the voice in the dialogue display area; after the user confirms that the characters recognized by the voice are wrong, the characters can be directly modified in the dialogue input box, and the user clicks the sending button after modification so as to display the modified characters in the dialogue display area. And responding the click operation of the sending control by the data acquisition script, and sending the characters in the dialog input box to the dialog display area.
For example, in FIG. 10, a voice control 108 is also included in the test toolbar 105. After the test user clicks the voice control 108, the test user's device may collect the test user's voice data. The data acquisition script responds to the click operation of the voice control 108, acquires voice data, identifies the voice data, converts the identification result into characters and displays the characters in the dialog input box 102; when the test user clicks the sending control 103, the data collection script sends the text in the dialog input box 102 to the dialog display area 101 in response to the click operation of the sending control 103, as shown in the first state diagram of the floating page in fig. 11.
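The patent does not name a recognition engine; the following sketch assumes the browser's Web Speech API (SpeechRecognition, prefixed in Chromium-based browsers) purely to illustrate collecting voice, recognizing it, and placing the recognized text in the dialog input box.

```typescript
// Sketch: recognize one utterance and show the text in the dialog input box so
// the test user can confirm or correct it before sending. The engine is assumed.
function recognizeVoiceToInputBox(inputBox: HTMLInputElement): void {
  const Recognition =
    (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
  if (!Recognition) return;                   // no built-in recognition in this browser
  const recognizer = new Recognition();
  recognizer.lang = "zh-CN";                  // the example dialogs are in Chinese
  recognizer.onresult = (event: any) => {
    inputBox.value = event.results[0][0].transcript; // recognized text, editable by the user
  };
  recognizer.start();
}
```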
In some embodiments, a second control is also included in the floating page. The states of the second control include a collapsed state and an expanded state.
When the state of the second control is the collapsed state, the data collection script responds to the click operation on the second control by collapsing the chat interface and changing the state of the second control to the expanded state.
When the state of the second control is the expanded state, the data collection script responds to the click operation on the second control by expanding the chat interface and changing the state of the second control back to the collapsed state.
For example, in fig. 10, the floating page also includes a second control 109 whose state is the collapsed state (an upward arrow). After finishing viewing the test task, the test user may click the second control 109 to collapse the chat interface; the data collection script responds to the click operation on the second control 109 by collapsing the chat interface and changing its state to the expanded state (a downward arrow), as in the second state diagram of the floating page shown in fig. 12, that is, the state of the floating page after the chat interface has been collapsed.
In fig. 12, only the test toolbar is retained and the state of the second control is the expanded state (a downward arrow). The user can click the second control to expand the chat interface so that the floating page shown in fig. 10 is displayed again. The data collection script responds to the click operation on the second control by expanding the chat interface and changing the state of the second control back to the collapsed state (an upward arrow), as with the second control 109 shown in fig. 10, whose state is the collapsed state (an upward arrow).
In fig. 12, the "complete and score" control is the first control 106 of fig. 10, and the state of the first control 106 is the task completion state. The data collection script responds to a click on the "complete and score" control by displaying the usability evaluation interface, on which the test user can give a score and put forward opinions and suggestions based on their own impressions.
In fig. 12, "task 4/4" indicates that the total number of test tasks is 4, and the test user has completed testing of all test tasks.
In some embodiments, after the chat interface is retracted, the data acquisition script responds to the click operation of the voice control to acquire voice data and perform recognition, and the recognition result is converted into characters. The data collection script displays a dialog input box and a send control below the test toolbar and displays the recognized text in the dialog input box. And responding to the click operation of the sending control by the data acquisition script, displaying a preview area, and displaying characters in the dialog input box in the preview area. In this embodiment, the preview area displays only the speech-recognized text, and does not display the history dialog.
For example, fig. 13 is a third state diagram of the floating page shown in fig. 10, that is, the state after the chat interface has been collapsed and the test user has clicked the voice control. In fig. 13, the data collection script responds to the click operation on the voice control by collecting voice data, recognizing it, and converting the recognition result into text. The data collection script displays a dialog input box and a send control below the test toolbar and shows the recognized text in the dialog input box. In fig. 13, the data collection script sets the state of the second control to the collapsed state (an upward arrow), so the test user can click the second control to collapse the dialog input box and the send control and return to the second state diagram of the floating page shown in fig. 12. Correspondingly, the data collection script responds to the click operation on the second control by collapsing the dialog input box and the send control and changing the state of the second control to the expanded state (a downward arrow).
In fig. 13, if the test user clicks the sending control, the data collection script displays a preview area in response to the clicking operation of the sending control, and displays the text in the dialog input box in the preview area, as shown in fig. 14. In fig. 14, the preview area displays only the speech-recognized text, and does not display the history dialogue.
After the preview area is displayed in fig. 14, the state of the second control remains the collapsed state (an upward arrow), so the test user can click the second control to collapse the dialog input box, the send control, and the preview area, leaving only the test toolbar and returning to the second state diagram of the floating page shown in fig. 12. Correspondingly, the data collection script responds to the click operation on the second control by collapsing the dialog input box, the send control, and the preview area, and changing the state of the second control to the expanded state (a downward arrow).
In some embodiments, the data collection script expands the chat interface in response to a click operation of the second control based on the state of the second control being an expanded state, and the text displayed in the history dialog and the preview area is displayed in the dialog display area.
In some embodiments, the test data includes screen-recording files, front-end behavior data, feedback data obtained based on the floating page, front-end exception error reporting, and/or usability assessment data. When the test user clicks the "complete and score" control as shown in fig. 12, the data acquisition script uploads the test data, for example, to the cloud server; or the data acquisition script directly sends the test data to the usability testing device.
The screen recording file is a video file, and the data acquisition script can set video parameters such as a frame rate, a resolution and the like corresponding to the video file; the data collection script may also obtain front-end system parameters before recording the screen instead of setting the video parameters, and determine the video parameters based on the front-end system parameters, where the front-end system parameters include at least one of screen resolution, remaining storage space, and processor utilization.
For example, the resolution of the recording screen is determined based on the screen resolution in the front-end system parameters, and the like. For another example, it may be determined whether there is enough storage space to store the recording screen file based on the remaining storage space, and if not, the resolution of the recording screen should be reduced. For another example, whether the processor can support the recording of the screen recording file can be determined based on the utilization rate of the processor, and if not, the frame rate of the recording screen should be reduced.
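As an illustration of determining video parameters from front-end system parameters, the sketch below uses arbitrary thresholds; the parameter names and cut-off values are assumptions, not figures from the patent.

```typescript
// Sketch: pick recording resolution and frame rate from screen resolution,
// remaining storage space, and processor utilization. Thresholds are illustrative.
interface FrontEndSystemParams {
  screenWidth: number;
  screenHeight: number;
  remainingStorageMB: number;
  cpuUtilization: number; // 0..1
}

function chooseVideoParams(p: FrontEndSystemParams) {
  let width = p.screenWidth;
  let height = p.screenHeight;
  let frameRate = 30;
  if (p.remainingStorageMB < 500) {   // not enough space: lower the resolution
    width = Math.floor(width / 2);
    height = Math.floor(height / 2);
  }
  if (p.cpuUtilization > 0.8) {       // busy processor: lower the frame rate
    frameRate = 15;
  }
  return { width, height, frameRate };
}
```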
The front-end behavior data includes: clicks, inputs, movements, jumps, execution times, and the like. A click may be a mouse click, an input may be keyboard input, a jump may be a page jump, and the execution time may be the time at which the front-end behavior was executed.
The feedback data are, for example, data entered by the user in the floating page, such as complaints, impressions, evaluations, pictures, voice, and the like.
The usability evaluation data is data input by the test user in the usability evaluation interface, such as scoring, opinion and suggestion.
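For the front-end behavior data described above, a minimal collection sketch is shown below; the event choices (for example, using hashchange to detect a page jump) and the record shape are assumptions for illustration.

```typescript
// Sketch: buffer clicks, inputs, mouse moves, and page jumps together with the
// time at which each front-end behavior was executed. Event mapping is an assumption.
interface BehaviorRecord {
  type: "click" | "input" | "move" | "jump";
  target: string;     // e.g. tag name of the element operated on, or the jump URL
  timestamp: number;  // execution time of the front-end behavior
}

function collectFrontEndBehavior(buffer: BehaviorRecord[]): void {
  const record = (type: BehaviorRecord["type"]) => (e: Event) =>
    buffer.push({
      type,
      target: (e.target as HTMLElement)?.tagName ?? "",
      timestamp: Date.now(),
    });
  document.addEventListener("click", record("click"), true);
  document.addEventListener("input", record("input"), true);
  document.addEventListener("mousemove", record("move"), true);
  window.addEventListener("hashchange", () =>
    buffer.push({ type: "jump", target: location.href, timestamp: Date.now() }),
  );
}
```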
For a more intuitive understanding of the flow of the usability test method according to the embodiment of the present disclosure, the following describes a process of analyzing test data in the usability test method with reference to fig. 15 to 20.
Analysis of test data
After the usability testing device obtains the test data of the test user, the test data can be analyzed, manual analysis is replaced, and the analysis efficiency is improved. The test data comprises a screen recording file, front-end behavior data, feedback data acquired based on a floating layer page, front-end abnormal error reporting and/or usability evaluation data.
In some embodiments, the test data is collected by the data collection script and uploaded to the cloud server, and the usability test device may obtain the test data from the cloud server. In some embodiments, test data is collected by a data collection script and sent directly to the ease of use testing device.
In some embodiments, after the usability testing device obtains the test data of the test user, the analysis interface is displayed based on the test data; wherein the analysis interface includes a video display area.
The usability testing device generates a front-end behavior trace based on the front-end behavior data. For example, the ease-of-use testing apparatus may generate the front-end behavior trajectory based on a click, an input, a movement, a jump, an execution time, and the like in the front-end behavior data.
The usability testing device matches the front-end behavior track with the video frames of the screen-recording file, so that the time of the video frames is aligned with the time of each behavior in the front-end behavior track, i.e., each behavior is matched with the video frame at which it occurred.
The usability testing device plays the video frames of the screen-recording file together with the matched front-end behavior track in the video display area, i.e., the front-end behavior track is displayed in the video frames as they are played.
For example, fig. 15 shows a video display area in the analysis interface, where the front-end behavior track is synchronously displayed in the video frames played in the video display area.
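As a non-limiting illustration of the alignment between the front-end behavior track and the video frames, the following TypeScript sketch selects, for the current playback time, the behaviors that occurred around that moment so that they can be overlaid on the video; the tolerance window and record shape are illustrative assumptions.

interface TraceBehavior {
  type: string;
  timestamp: number; // milliseconds relative to the start of the recording
}

function behaviorsAt(
  trace: TraceBehavior[],
  playbackTimeMs: number,
  toleranceMs = 200,
): TraceBehavior[] {
  // Select the behaviors that occurred around the current playback moment.
  return trace.filter((b) => Math.abs(b.timestamp - playbackTimeMs) <= toleranceMs);
}

// During playback, for example on the video element's "timeupdate" event:
//   const visible = behaviorsAt(trace, video.currentTime * 1000);
// The selected behaviors can then be drawn as an overlay on the video frame.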
In some embodiments, the analysis interface includes a video timeline area, for example the video timeline area shown in fig. 15. A time axis of preset length is displayed in the video timeline area; the time axis carries a play positioning mark used to indicate the current playing progress of the video.
The usability testing device determines the correspondence between the length of the time axis and the video duration based on the video duration of the screen-recording file, so that the playing progress indicated by the play positioning mark on the time axis corresponds to the video frame of the screen-recording file displayed in the video display area, avoiding any mismatch between the time axis and the video frames.
The usability testing device marks the time axis based on this correspondence and on the front-end behavior data, the feedback data, and/or the front-end exception error reports. In some embodiments, the front-end behavior data, the feedback data, and the front-end exception error reports are each associated with a different icon and/or color.
In some embodiments, after the usability testing device marks the time axis, it displays each mark and its corresponding icon above the time axis, so that the project-related party can tell which type of test data each mark corresponds to and can click a mark to quickly locate the corresponding test data.
For example, in fig. 15, the positions of the marks for different test data are shown above the time axis; the marks are circles, a flag indicates that a test task has been completed, and icon A indicates that a question has been marked.
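By way of a non-limiting illustration, the following TypeScript sketch maps each data item's timestamp to a horizontal position on a time axis of preset pixel length, using the correspondence between the time-axis length and the video duration described above; the type names and the mark shape are assumptions made for illustration.

type MarkType = "behavior" | "feedback" | "frontEndError" | "taskComplete" | "questionMark";

interface TimelineMark {
  type: MarkType;
  timestampMs: number;
  positionPx: number; // horizontal position of the mark on the time axis
}

function markTimeline(
  items: { type: MarkType; timestampMs: number }[],
  videoDurationMs: number,
  timelineLengthPx: number,
): TimelineMark[] {
  // Correspondence between the preset time-axis length and the video duration.
  const pxPerMs = timelineLengthPx / videoDurationMs;
  return items.map((item) => ({ ...item, positionPx: item.timestampMs * pxPerMs }));
}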
In some embodiments, in response to a click operation on a mark, the usability testing device moves the play positioning mark along the time axis to the moment corresponding to that mark, then plays the video frame at that moment together with the matched front-end behavior track in the video display area, and continues playback from there.
In some embodiments, the analysis interface includes a multidimensional data detail tree region. The usability testing device constructs a time sequence detail tree based on the front-end behavior data, the feedback data and/or the front-end abnormal error report; and displaying the time sequence detail tree in the multidimensional data detail tree area.
For example, fig. 16 is a first state diagram of a multidimensional data detail tree region in an analysis interface provided by the embodiment of the present disclosure. In fig. 16, a time sequence detail tree 161 is displayed in the multidimensional data detail tree region, and front-end behavior data, feedback data, front-end abnormal error report, and question marking are sorted in time sequence in the time sequence detail tree 161. Wherein the front-end behavior data comprises: page jump, mouse click, keyboard entry, execution time of front-end behavior, etc.
In some embodiments, the timing detail tree includes a plurality of nodes, each node including the details of the data and an icon, and the hierarchical relationship between the nodes is determined based on the timing of the data. For example, in fig. 16, one node of the timing detail tree corresponds to a question that has been marked; the node includes the question-marking icon 161-1 and the details 161-2.
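As a non-limiting illustration, the following TypeScript sketch shows one possible node structure for such a timing detail tree and a simple construction that orders the collected items by time; the field names and the flat hierarchy are assumptions made for illustration.

interface DetailTreeNode {
  kind: "pageJump" | "click" | "input" | "feedback" | "frontEndError" | "questionMark";
  icon: string;        // identifier of the icon shown next to the node
  details: string;     // human-readable description of the data
  timestampMs: number;
  children: DetailTreeNode[];
}

function buildTimingDetailTree(
  items: Omit<DetailTreeNode, "children">[],
): DetailTreeNode[] {
  // Order all collected items by time; in this simplified sketch every item
  // becomes a top-level node, so the hierarchy is flat.
  return [...items]
    .sort((a, b) => a.timestampMs - b.timestampMs)
    .map((item) => ({ ...item, children: [] }));
}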
In some embodiments, a third control is included in the multidimensional data detail tree region. The states of the third control include a first state, which describes the user behavior track, and a second state, which describes multidimensional auxiliary analysis. For example, in fig. 16, the third control 162 may be a drop-down list box whose options include "user behavior trace" and "multidimensional aided analysis".
The usability testing device responds to a trigger operation of the first state by displaying the timing detail tree in the multidimensional data detail tree area. In fig. 16, a trigger operation of the first state is, for example, the project-related party clicking the third control 162 and selecting "user behavior trace"; the state of the third control 162 is then the first state, i.e., the user behavior track, and the multidimensional data detail tree area shown in fig. 16 is displayed.
The usability testing device responds to a trigger operation of the second state by displaying the feedback data and the front-end abnormal error reports of the timing detail tree in the multidimensional data detail tree area, while hiding the front-end behavior data of the timing detail tree. In fig. 16, a trigger operation of the second state is, for example, the project-related party clicking the third control 162 and selecting "multidimensional aided analysis", which displays the second state of the multidimensional data detail tree region shown in fig. 17. In fig. 17, the front-end behavior data in the timing detail tree, such as page jumps, mouse clicks, keyboard inputs, and execution times of front-end behaviors, is hidden, and only the feedback data (i.e., the user feedback in fig. 17) and the front-end abnormal error reports (i.e., the front-end error reports in fig. 17) are retained. In fig. 17, a control for converting user feedback into a problem record and a deletion control are added for the user feedback, a deletion control (and a "modify" control) is added for the front-end error reports, and a "modify" control and a deletion control are added for the question marks.
In some embodiments, a fourth control is included in the multidimensional data detail tree region. The usability testing device responds to a click operation on the fourth control by marking the time axis based on the timing detail tree. For example, the fourth control 163 in fig. 16 is "mark and record problem". If no node of the timing detail tree 161 in fig. 16 is selected, the usability testing apparatus responds to a click operation on the "mark and record problem" control by marking the time axis in fig. 15 and displaying the positions of the marks for the different data above the time axis; the marks are dots, a flag indicates that a test task has been completed, and icon A indicates that a question has been marked.
In some embodiments, the usability testing device responds to a click operation on the node corresponding to the feedback data in the timing detail tree by modifying the state of that node to the selected state, and, based on the selected state, responds to a click operation on the fourth control by calling up a pop-up layer page. The pop-up layer page is used to convert the feedback data into a question and mark it. In fig. 17, the node corresponding to the feedback data (i.e., the user feedback in fig. 17) is selected, for example, by the project-related party clicking the user-feedback icon 171-1 or the user-feedback details 171-2. Correspondingly, the usability testing device responds to a click operation on the "mark and record problem" control by calling up the pop-up layer page, which converts the feedback data into a question to be marked.
In some embodiments, the node corresponding to the feedback data in the timing detail tree includes a question-marking control. The usability testing device responds to a click operation on the question-marking control by calling up the pop-up layer page, which is used to convert the feedback data into a question mark. In fig. 17, when the mouse hovers over the question-marking control, the prompt "convert the user feedback into a problem record" is displayed, so that the project-related party knows the function of the question-marking control.
In some embodiments, the pop-up layer page is, for example, the page 181 shown in fig. 18, and fig. 18 shows the page changes involved in converting feedback data into a question mark according to an embodiment of the disclosure. In fig. 18, the project-related party clicks the question-marking control, i.e., the control labeled "convert the user feedback into a problem record", and the usability testing apparatus calls up the pop-up layer page 181 in response to this click operation. The pop-up layer page 181 includes a selection box for the question type, an input box for the question description, and a determination control. The selection box for the question type may be a drop-down list box or a radio box. After selecting the question type and entering the question description, the project-related party clicks the determination control. The usability testing device responds to the click operation on the determination control by obtaining the question type and the question description, and then displays the question type and the question description in association with the user feedback. In the associated display, for example in fig. 18, the user-feedback icon is replaced by the question-mark icon 182-1, and the question description 182-2 is displayed above the details of the user feedback, i.e., the question description 182-2 and the user-feedback details are associated in a waterfall manner.
In some embodiments, the analysis interface includes a question record and scoring area, in which the usability testing device displays the marked questions and the usability evaluation data. For example, fig. 19 is a schematic diagram of the question record and scoring area in the analysis interface provided by an embodiment of the present disclosure. In fig. 19, the question record and scoring area includes a question record control and a usability scoring control. The usability testing device responds to a click operation on the question record control by displaying the marked questions in the question record and scoring area, and responds to a click operation on the usability scoring control by displaying the usability evaluation data in the question record and scoring area.
In some embodiments, after completing the analysis of the test data, or in response to a trigger operation for the usability test conclusion, the usability testing device displays the usability measurement conclusion interface shown in fig. 20, which enables unified management and tracking of the question records so that the project-related party can quickly track and fix the problems. The usability measurement conclusion interface may present a variety of information, for example an overall usability rating, a usability metric score, an ease-of-operation score, an ease-of-learning score, the number of test users, the number of usability issues, a usability test profile, and the like. Such information may be obtained from the data (e.g., scores, opinions, and suggestions) entered by different test users in the usability evaluation interface.
Fig. 21 is a block diagram of an ease-of-use testing apparatus provided in an embodiment of the present disclosure, and as shown in fig. 21, the ease-of-use testing apparatus includes: a first acquisition unit 211, a second acquisition unit 212, an embedding unit 213, a generation unit 214, and a response unit 215.
The first obtaining unit 211 is configured to obtain a first link address and a test task of a test object.
And a second obtaining unit 212, configured to obtain a data acquisition script for the usability test.
The embedding unit 213 is configured to embed the data collection script in the page of the first link address.
A generating unit 214 is configured to generate a second link address based on the first link address and the test task.
And the response unit 215 is configured to jump to the page of the first link address in response to the access request of the second link address, so that the data collection script guides the usability test based on the test task, and collects the test data based on the data collection script.
In some embodiments, embedding the data collection script in the page of the first link address by the embedding unit 213 comprises: and adding the data acquisition script to a body tag of the page of the first link address.
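By way of a non-limiting illustration, the following TypeScript sketch shows one possible way the embedding step could insert a reference to the data acquisition script just before the closing body tag of the fetched page; the function and parameter names are assumptions made for illustration.

async function embedDataCollectionScript(
  firstLinkAddress: string,
  collectionScriptUrl: string,
): Promise<string> {
  const response = await fetch(firstLinkAddress);
  const html = await response.text();

  const scriptTag = `<script src="${collectionScriptUrl}"></script>`;

  // Add the data acquisition script inside the body tag of the page.
  if (html.includes("</body>")) {
    return html.replace("</body>", `${scriptTag}</body>`);
  }
  // Fallback: append the script at the end if no closing body tag is found.
  return html + scriptTag;
}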
In some embodiments, the first obtaining unit 211 also obtains the name of the test object. The test task includes the name of the test task and the operation description of the test task. The generating unit 214 generating the second link address based on the first link address and the test task includes: packaging the name of the test object, the first link address, the name of the test task, and the operation description of the test task into the second link address.
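As a non-limiting illustration, the following TypeScript sketch packages the test object name, the first link address, the test task name, and the task operation description into a second link address as URL query parameters; the test-entry origin and the parameter names are assumptions made for illustration.

function generateSecondLinkAddress(
  testObjectName: string,
  firstLinkAddress: string,
  taskName: string,
  taskDescription: string,
): string {
  const url = new URL("https://usability-test.example.com/run");
  url.searchParams.set("objectName", testObjectName);
  url.searchParams.set("firstLink", firstLinkAddress);
  url.searchParams.set("taskName", taskName);
  url.searchParams.set("taskDescription", taskDescription);
  return url.toString();
}

// Example usage with hypothetical values:
//   generateSecondLinkAddress("Order console", "https://example.com/orders",
//     "Create an order", "Create a new order from the console and submit it");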
In some embodiments, the data collection script directing the ease of use test based on the test task comprises: the data acquisition script creates a floating layer page in the page of the first link address; the floating layer page is used for guiding the execution of the test task.
In some embodiments, a chat interface is displayed in the floating layer page, and the chat interface comprises a dialog display area, a dialog input box and a sending control. The data acquisition script sends the test task to the dialogue display area through the virtual assistant; and the data acquisition script responds to the click operation of the sending control and sends the content in the dialog input box to the dialog display area.
In some embodiments, the floating layer page further includes a test toolbar, and the test toolbar includes a first control; the states of the first control include a task switching state and a task completion state. Based on the state of the first control being the task switching state, the data acquisition script responds to a click operation on the first control by sending another test task to the dialog display area through the virtual assistant. Based on the state of the first control being the task completion state, the data acquisition script responds to a click operation on the first control by displaying the usability evaluation interface.
In some embodiments, task state information is displayed in the test toolbar, the task state information including: the total number of test tasks and the sequence number of the task currently being tested.
In some embodiments, a voice control is also included in the test toolbar. And the data acquisition script responds to the clicking operation of the voice control, acquires voice data, identifies the voice data, converts the identification result into characters and displays the characters in the dialog input box. And responding the click operation of the sending control by the data acquisition script, and sending the characters in the dialog input box to the dialog display area.
In some embodiments, a second control is also included in the floating page. The states of the second control include a retracted state and an expanded state. Based on the state of the second control being the retracted state, the data acquisition script responds to a click operation on the second control by retracting the chat interface, and modifies the state of the second control to the expanded state. Based on the state of the second control being the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface, and modifies the state of the second control to the retracted state.
In some embodiments, after the data collection script retracts the chat interface, it responds to a click operation on the voice control by acquiring voice data, recognizing the voice data, and converting the recognition result into text. The data collection script displays the dialog input box and the sending control, and displays the recognized text in the dialog input box. In response to a click operation on the sending control, the data collection script displays a preview area and shows the text from the dialog input box in the preview area.
In some embodiments, after the data collection script displays the dialog input box and/or the preview area, it modifies the state of the second control to the retracted state; based on the state of the second control being the retracted state, the data collection script responds to a click operation on the second control by retracting the dialog input box, the sending control, and/or the preview area, and modifies the state of the second control to the expanded state.
In some embodiments, based on the state of the second control being the expanded state, the data collection script expands the chat interface in response to a click operation on the second control, and displays the historical dialog and the text previously shown in the preview area in the dialog display area.
In some embodiments, collecting test data based on the data collection script comprises: and after a floating layer page is created in the page of the first link address, screen recording is carried out based on the data acquisition script, and a screen recording file is generated.
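By way of a non-limiting illustration, the following TypeScript sketch shows one possible browser-side screen recording using the standard getDisplayMedia and MediaRecorder APIs; the chunk handling and the video/webm MIME type are assumptions made for illustration, and browser support for specific MIME types varies.

async function startScreenRecording(frameRate: number): Promise<MediaRecorder> {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: { frameRate },
    audio: false,
  });

  const chunks: Blob[] = [];
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) {
      chunks.push(event.data); // accumulate the screen-recording data in memory
    }
  };

  recorder.onstop = () => {
    const screenRecordingFile = new Blob(chunks, { type: "video/webm" });
    // The assembled file can then be uploaded together with the other test data.
    console.log("screen recording size (bytes):", screenRecordingFile.size);
  };

  recorder.start(1000); // emit a data chunk roughly every second
  return recorder;
}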
In some embodiments, the test data includes the screen-recording file, front-end behavior data, feedback data obtained based on a floating page, front-end exception error reporting, and/or usability evaluation data.
In some embodiments, the ease of use testing device further comprises an analysis unit, not shown in fig. 21, for analyzing the test data.
In some embodiments, the analyzing unit analyzing the test data comprises: displaying an analysis interface based on the test data; wherein the analysis interface comprises a video display area; generating a front-end behavior trace based on the front-end behavior data; matching the front-end behavior track with a video picture of a screen recording file; and playing the video picture of the screen recording file and the matched front-end behavior track in the video display area.
In some embodiments, the analysis interface includes a video timeline area; a time axis of preset length is displayed in the video timeline area, the time axis carries a play positioning mark, and the play positioning mark is used to indicate the current playing progress of the video.
The analysis unit determines the corresponding relation between the length of a time axis and the video duration based on the video duration of the screen recording file; and marking the time axis based on the corresponding relation, the front-end behavior data, the feedback data and/or the front-end abnormal error report.
In some embodiments, the front-end behavior data, the feedback data, and the front-end exception error reports each correspond to a different icon. After marking the time axis, the analysis unit displays each mark and its corresponding icon above the time axis.
In some embodiments, in response to a click operation on a mark, the analysis unit moves the play positioning mark along the time axis to the moment corresponding to that mark, and plays the video frame at that moment together with the matched front-end behavior track in the video display area.
In some embodiments, the analysis interface includes a multidimensional data detail tree region. The analysis unit constructs a time sequence detail tree based on the front-end behavior data, the feedback data and/or the front-end abnormal error report; and displaying the time sequence detail tree in the multidimensional data detail tree area.
In some embodiments, the timing detail tree includes a plurality of nodes, each node including a detail of the data and an icon, the hierarchical relationship between the nodes being determined based on the timing of the data.
In some embodiments, a third control is included in the multidimensional data detail tree region. The states of the third control include a first state describing the user behavior track and a second state describing multidimensional auxiliary analysis.
The analysis unit responds to the trigger operation of the first state and displays the time sequence detail tree in the multidimensional data detail tree area.
And the analysis unit responds to the triggering operation of the second state, displays the feedback data and the front-end abnormal error report in the time sequence detail tree in the multidimensional data detail tree area, and hides the front-end behavior data in the time sequence detail tree.
In some embodiments, a fourth control is included in the multidimensional data detail tree region. And the analysis unit responds to the click operation of the fourth control and marks the time axis based on the time sequence detail tree.
In some embodiments, the analysis unit modifies the state of the node to a selected state in response to a click operation of the node corresponding to the feedback data in the time series detail tree. And the analysis unit responds to the click operation of the fourth control based on the selected state and calls out a popup layer page, and the popup layer page is used for converting the feedback data into a problem and marking.
In some embodiments, the nodes corresponding to the feedback data in the timing detail tree include: and (5) marking a control part by a problem. The analysis unit responds to the clicking operation of the problem marking control and calls out a pop-up layer page, and the pop-up layer page is used for converting the feedback data into a problem marking.
In some embodiments, the popup layer page includes a selection box for the question type, an input box for the question description, and a determination control. The analysis unit responds to a click operation on the determination control by obtaining the question type and the question description, and displays the question type and the question description in association with the feedback data.
In some embodiments, the analysis interface includes a question log and scoring area. And displaying the contents of the marking of the problems and the usability evaluation data in a problem recording and scoring area.
In some embodiments, the issue records and scoring area includes an issue records control and a scoring control. The analysis unit responds to a click operation on the issue records control by displaying the marked questions in the issue records and scoring area, and responds to a click operation on the scoring control by displaying the usability evaluation data in the issue records and scoring area.
In some embodiments, the division of the units in the usability testing apparatus is only a logical functional division; other divisions are possible in actual implementation. For example, at least two of the first obtaining unit 211, the second obtaining unit 212, the embedding unit 213, the generating unit 214, and the response unit 215 may be implemented as one unit, or any of these units may be divided into a plurality of sub-units. It will be understood that the various units or sub-units may be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application.
Fig. 22 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 22, the electronic device includes at least one processor 221, at least one memory 222, and at least one communication interface 223. The various components in the electronic device are coupled together by a bus system 224. The communication interface 223 is used for information exchange with external devices. It will be understood that the bus system 224 is used to enable connection and communication between these components. In addition to a data bus, the bus system 224 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, the various buses are labeled in fig. 22 as the bus system 224.
It will be appreciated that the memory 222 in the present embodiment can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory.
In some embodiments, memory 222 stores elements, executable units or data structures, or a subset thereof, or an expanded set thereof as follows: an operating system and an application program.
The operating system includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic tasks and processing hardware-based tasks. The application programs, including various application programs such as a Media Player (Media Player), a Browser (Browser), etc., are used to implement various application tasks. The program for implementing the usability test method provided by the embodiment of the present disclosure may be included in an application program.
In the embodiment of the present disclosure, the processor 221 is configured to execute the steps of the usability test method provided by the embodiment of the present disclosure by calling a program or an instruction stored in the memory 222, specifically, a program or an instruction stored in an application program.
The usability test method provided by the embodiment of the present disclosure may be applied to, or implemented by, the processor 221. The processor 221 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 221 or by instructions in the form of software. The processor 221 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The steps of the usability testing method provided by the embodiment of the present disclosure may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software units in a decoding processor. The software units may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 222, and the processor 221 reads the information in the memory 222 and completes the steps of the method in combination with its hardware.
It should be noted that for simplicity of description, the above-mentioned method embodiments are described as a series of acts, but those skilled in the art can understand that the disclosed embodiments are not limited by the described order of acts, as some steps can be performed in other orders or simultaneously according to the disclosed embodiments. In addition, those skilled in the art can appreciate that the embodiments described in the specification all belong to alternative embodiments.
Embodiments of the present disclosure also provide a non-transitory computer-readable storage medium, where the non-transitory computer-readable storage medium stores a program or an instruction, and the program or the instruction causes a computer to execute steps of each embodiment of the usability testing method, which are not described herein again to avoid repeated descriptions.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
It will be understood by those skilled in the art that although some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the disclosure and form different embodiments.
Those skilled in the art will appreciate that the descriptions of the various embodiments have different emphasis, and reference may be made to the related descriptions of other embodiments for those parts of one embodiment that are not described in detail.
Although the embodiments of the present disclosure have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the present disclosure, and such modifications and variations are within the scope defined by the appended claims.

Claims (31)

1. An ease of use testing method comprising:
acquiring a first link address and a test task of a test object;
acquiring a data acquisition script for the usability test;
embedding the data acquisition script into a page of the first link address;
generating a second link address based on the first link address and the test task;
and responding to the access request of the second link address, jumping to the page of the first link address to enable the data acquisition script to guide usability testing based on the testing task, and acquiring testing data based on the data acquisition script.
2. The method of claim 1, wherein said embedding the data collection script in the page of the first link address comprises:
and adding the data acquisition script to a body tag of the page of the first link address.
3. The method of claim 1, wherein the method further comprises: acquiring the name of the test object;
the test task comprises the following steps: the name of the test task and the operation description of the test task;
the generating a second link address based on the first link address and the test task comprises: and packaging the name of the test object, the first link address, the name of the test task and the operation description of the test task into a second link address.
4. The method of claim 1, wherein the data collection script directing usability testing based on the test task comprises:
the data acquisition script creates a floating layer page in the page of the first link address; wherein the floating page is used for guiding the execution of the test task.
5. The method of claim 4, wherein a chat interface is displayed in the floating layer page, and the chat interface comprises a dialog display area, a dialog input box and a sending control;
the data acquisition script sends the test task to the dialogue display area through a virtual assistant;
and the data acquisition script responds to the click operation of the sending control and sends the content in the dialog input box to the dialog display area.
6. The method of claim 5, wherein the floating page further comprises a test toolbar comprising a first control;
the states of the first control comprise a task switching state and a task completing state;
the data acquisition script is in a task switching state based on the state of the first control, responds to the clicking operation of the first control, and sends another test task to the dialogue display area through the virtual assistant;
and the data acquisition script responds to the clicking operation of the first control and displays a usability evaluation interface based on the state of the first control being the task completion state.
7. The method of claim 6, wherein task state information is displayed in the test toolbar, the task state information comprising: the total number of test tasks and the sequence number of the task currently being tested.
8. The method of claim 6, wherein the test toolbar further includes a voice control;
the data acquisition script responds to the clicking operation of the voice control, acquires voice data, identifies the voice data, converts an identification result into characters and displays the characters in the dialog input box;
and the data acquisition script responds to the click operation of the sending control and sends the characters in the dialog input box to the dialog display area.
9. The method of claim 8, wherein the floating page further comprises a second control therein; the state of the second control comprises a stowed state and a deployed state;
the data acquisition script responds to the click operation of the second control to pack the chat interface based on the state of the second control being a packing state, and modifies the state of the second control into an expansion state;
and the data acquisition script responds to the click operation of the second control based on the fact that the state of the second control is an expansion state, expands the chat interface and modifies the state of the second control into a retraction state.
10. The method of claim 9, wherein upon collapsing the chat interface, the method further comprises:
the data acquisition script responds to the clicking operation of the voice control, acquires voice data, identifies the voice data and converts an identification result into characters;
displaying the dialog input box and the sending control, and displaying the characters in the dialog input box;
and the data acquisition script responds to the click operation of the sending control, displays a preview area and displays the characters in the dialog input box in the preview area.
11. The method of claim 10, wherein after displaying the dialog input box and/or the preview region, the method further comprises:
the data acquisition script modifies the state of the second control into a retracted state;
and the data acquisition script responds to the click operation of the second control based on the fact that the state of the second control is a folding state, folds the dialog input box, the sending control and/or the preview area, and modifies the state of the second control into a spreading state.
12. The method of claim 11, wherein the method further comprises:
and the data acquisition script responds to the click operation of the second control to expand the chat interface based on the state of the second control as an expansion state, and the characters displayed in the historical dialog and the preview area are displayed in the dialog display area.
13. The method of claim 4, wherein said collecting test data based on said data collection script comprises:
and after a floating layer page is created in the page of the first link address, screen recording is carried out based on the data acquisition script, and a screen recording file is generated.
14. The method of claim 13, wherein the test data comprises the screen-recording file, front-end behavior data, feedback data obtained based on the floating page, front-end exception error reporting, and/or usability assessment data.
15. The method of claim 14, wherein the method further comprises: and analyzing the test data.
16. The method of claim 15, wherein the analyzing the test data comprises:
displaying an analysis interface based on the test data; wherein the analysis interface comprises a video display area;
generating a front-end behavior trace based on the front-end behavior data;
matching the front-end behavior track with the video picture of the screen recording file;
and playing the video picture of the screen recording file and the matched front-end behavior track in the video display area.
17. The method of claim 16, wherein the analysis interface comprises a video timeline area; a time axis with a preset length is displayed in the video timeline area, the time axis is provided with a playing positioning mark, and the playing positioning mark is used for marking the current playing progress of the video;
determining the corresponding relation between the length of the time axis and the video duration based on the video duration of the screen recording file;
marking the time axis based on the corresponding relation and the front-end behavior data, the feedback data and/or the front-end abnormal error report.
18. The method of claim 17, wherein the front-end behavior data, the feedback data, and the front-end exception error are respectively associated with different icons;
and after marking the time shaft, displaying a marking mark and a corresponding icon above the time shaft.
19. The method of claim 18, wherein the method further comprises:
responding to the clicking operation of the marking mark, and moving the playing positioning mark to the moment corresponding to the marking mark along the time axis;
and playing the video picture at the moment and the front-end behavior track matched with the video picture in the video display area.
20. The method of claim 17, wherein the analysis interface comprises a multidimensional data detail tree region;
constructing a time sequence detail tree based on the front-end behavior data, the feedback data and/or the front-end abnormal error report;
and displaying the time sequence detail tree in the multidimensional data detail tree area.
21. The method of claim 20, wherein the timing details tree comprises a plurality of nodes, each node comprising details of the data and an icon, the hierarchical relationship between nodes being determined based on the timing of the data.
22. The method of claim 20, wherein a third control is included in the multidimensional data detail tree region; the states of the third control comprise a first state describing a behavior trace of the user and a second state describing multidimensional auxiliary analysis;
responding to the triggering operation of the first state, and displaying the time sequence detail tree in the multidimensional data detail tree area;
responding to the triggering operation of the second state, displaying the feedback data and the front end abnormal error report in the time sequence detail tree in the multidimensional data detail tree area, and hiding the front end behavior data in the time sequence detail tree.
23. The method of claim 20, wherein a fourth control is included in the multidimensional data detail tree region;
and responding to the clicking operation of the fourth control, and marking the time axis based on the time sequence detail tree.
24. The method of claim 23, wherein the method further comprises:
responding to the clicking operation of the node corresponding to the feedback data in the time sequence detail tree, and modifying the state of the node into a selected state;
and responding to the click operation of the fourth control based on the selected state, and calling a popup layer page, wherein the popup layer page is used for converting the feedback data into a problem and marking.
25. The method of claim 20, wherein the nodes corresponding to the feedback data in the timing detail tree comprise: problem marking controls;
and responding to the clicking operation of the problem marking control, and calling out a popup layer page, wherein the popup layer page is used for converting the feedback data into a problem marking.
26. The method of claim 24 or 25, wherein the popup layer page includes a selection box of a question type, an input box of a question description, and a determination control;
responding to the click operation of the determination control, and acquiring a problem type and a problem description;
and displaying the question type and the question description in association with the feedback data.
27. The method of claim 24 or 25, wherein the analysis interface comprises a question recording and scoring area;
and displaying the marking content of the problems and the usability evaluation data in the problem recording and scoring area.
28. The method of claim 27, wherein the issue recording and scoring area includes an issue recording control and a scoring control;
responding to the click operation of the question recording control, and displaying the content marked by the question in the question recording and scoring area;
and responding to the clicking operation of the scoring control, and displaying the usability evaluation data in the question record and scoring area.
29. An ease-of-use testing device comprising:
the first acquisition unit is used for acquiring a first link address and a test task of a test object;
the second acquisition unit is used for acquiring a data acquisition script of the usability test;
the embedding unit is used for embedding the data acquisition script into the page of the first link address;
a generating unit, configured to generate a second link address based on the first link address and the test task;
and the response unit is used for responding to the access request of the second link address, jumping to the page of the first link address to enable the data acquisition script to guide usability testing based on the testing task, and acquiring testing data based on the data acquisition script.
30. An electronic device, comprising: a processor and a memory;
the processor is configured to perform the steps of the method of any one of claims 1 to 28 by calling a program or instructions stored in the memory.
31. A non-transitory computer readable storage medium storing a program or instructions for causing a computer to perform the steps of the method of any one of claims 1 to 28.
CN202110099106.1A 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium Pending CN114791875A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110099106.1A CN114791875A (en) 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110099106.1A CN114791875A (en) 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114791875A true CN114791875A (en) 2022-07-26

Family

ID=82459458

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110099106.1A Pending CN114791875A (en) 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114791875A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105868096A (en) * 2015-01-22 2016-08-17 阿里巴巴集团控股有限公司 Methods and apparatuses used for displaying web page test result in browser and device
CN107133180A (en) * 2017-06-07 2017-09-05 腾讯科技(深圳)有限公司 Method of testing, test device and the storage medium of dynamic page
CN107908552A (en) * 2017-10-30 2018-04-13 阿里巴巴集团控股有限公司 A kind of test method based on link, device and equipment
US20190303274A1 (en) * 2018-03-30 2019-10-03 Atlassian Pty Ltd Systems and methods for monitoring performance of applications
CN108647141A (en) * 2018-04-26 2018-10-12 腾讯科技(深圳)有限公司 Automatic test approach, device, computer-readable medium and electronic equipment
CN110825618A (en) * 2019-10-10 2020-02-21 重庆金融资产交易所有限责任公司 Method and related device for generating test case

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ALIBABADESIGN: "如何量化产品体验?来看阿里出品的度量模型", Retrieved from the Internet <URL:https://www.uisdc.com/ali-metric-model> *
ALIBABADESIGN: "设计是玄学?如何科学度量100+款「中后台技术类产品」体验", Retrieved from the Internet <URL:https://www.shejitk.com/archives/5274> *
BARRET李靖: "走进 phantomjs 嵌入式测试", Retrieved from the Internet <URL:https://www.barretlee.com/blog/2015/09/25/move-on-phantomjs/> *
KATE MORAN 等: "Remote Moderated Usability Tests: How to Do Them", Retrieved from the Internet <URL:https://www.nngroup.com/articles/moderated-remote-usability-test/> *

Similar Documents

Publication Publication Date Title
US6308146B1 (en) System and method for simulating user input to control the operation of an application
CN106844217B (en) Method and device for embedding point of applied control and readable storage medium
CN103098051B (en) Search engine optmization assistant
CN110928772B (en) Test method and device
CN108345456A (en) Page code generation method, device, computer equipment and storage medium
CN105264474B (en) The NI Vision Builder for Automated Inspection program editing environment with paste feature is replicated including operation context-aware
CN108491205A (en) A kind of front end web development methods and system based on component tree
US9459780B1 (en) Documenting interactive graphical designs
CN109739855B (en) Method and system for realizing data sheet splicing and automatically training machine learning model
CN105740153A (en) Cloud testing method and device
JPS62212837A (en) Interactive software tester
CN107844331A (en) Generate the method, apparatus and equipment of boot configuration file
CN111881036A (en) Test case management method and device and electronic equipment
US20220139075A1 (en) Deep learning guide device and method
Silva et al. A comparative study of milestones for featuring GUI prototyping tools
CN115843374A (en) System and method for capturing, indexing and extracting digital workflows from videos using artificial intelligence
JP2001005690A (en) Program test system
CN114897296A (en) RPA flow labeling method, execution process playback method and storage medium
JP2013182410A (en) Business analysis design support device, business analysis design support method, and business analysis design support program
CN104077669B (en) The universal method that a kind of autonomous customization of computer operation stream is performed with driving
CN114416516A (en) Test case and test script generation method, system and medium based on screenshot
CN113448845A (en) UI automation test method and system
CN112667517A (en) Method, device, equipment and storage medium for acquiring automatic test script
CN114791875A (en) Usability testing method and device, electronic equipment and storage medium
Lopes et al. Solution discovery over feature toggling with built-in abstraction in outsystems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination