CN114791875B - Usability testing method and device, electronic equipment and storage medium - Google Patents

Info

Publication number
CN114791875B
CN114791875B
Authority
CN
China
Prior art keywords
test
data
control
state
link address
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110099106.1A
Other languages
Chinese (zh)
Other versions
CN114791875A (en)
Inventor
支尚
杨涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202110099106.1A priority Critical patent/CN114791875B/en
Publication of CN114791875A publication Critical patent/CN114791875A/en
Application granted granted Critical
Publication of CN114791875B publication Critical patent/CN114791875B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3664 Environments for testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure relate to a usability testing method and apparatus, an electronic device, and a storage medium. In at least one embodiment of the present disclosure, a data acquisition script for the usability test is acquired and embedded into the page of the link address of a test object, so that test data is collected automatically by the script, and a link address for the usability test is generated based on the link address of the test object and the test task, so that a test user can access that link address and complete an online test of the test object independently, without third-party manual intervention. Compared with the current offline test, there is no need to arrange test tasks manually on site or to transcribe and organize test data by hand, which improves test efficiency and reduces labor cost; compared with the current online test, there is no need to arrange test tasks or to transcribe and organize test data through remote communication with the test user, which likewise improves test efficiency.

Description

Usability testing method and device, electronic equipment and storage medium
Technical Field
Embodiments of the present disclosure relate to the technical field of usability testing, and in particular to a usability testing method and apparatus, an electronic device, and a non-transitory computer-readable storage medium.
Background
Traditionally, to learn how users actually use a product, users are asked to perform an offline usability test on the product. The process of an offline usability test is typically as follows: the user performs typical operations on the product while an observer and a product developer watch, listen, and take notes nearby, obtaining data such as the user's operation records, feedback, and problems, which are then used to improve the product and its usability.
However, offline usability testing has the following problems: the user's operations on the product must be performed on site, the observer and the product developer must transcribe and organize the data on site, and the usability test tasks are usually also arranged on site, so the observer and the product developer must travel to the user's location and observe and record manually. The whole test process is therefore inefficient, time-consuming, and costly in labor, and the test user cannot complete the test independently. Moreover, such testing cannot proceed normally during circumstances such as the COVID-19 epidemic.
Usability tests can also be performed online, for example through remote video and desktop sharing. However, the observer still has to arrange the test tasks, communicate remotely with the user, and transcribe and organize data such as the user's operation records, feedback, and problems; test efficiency remains low, and the test user still cannot complete the test independently.
The above description of the discovery process of the problem is merely for aiding in understanding the technical solution of the present disclosure, and does not represent an admission that the above is prior art.
Disclosure of Invention
To address at least one of the above problems, at least one embodiment of the present disclosure provides a usability testing method, apparatus, electronic device, and non-transitory computer-readable storage medium.
In a first aspect, an embodiment of the present disclosure provides a usability testing method, including:
acquiring a first link address and a test task of a test object;
acquiring a data acquisition script for the usability test;
embedding the data acquisition script into a page of the first link address;
generating a second link address based on the first link address and the test task; and
responding to an access request for the second link address by jumping to the page of the first link address, so that the data acquisition script guides the usability test based on the test task and test data is collected by the data acquisition script.
In a second aspect, an embodiment of the present disclosure further provides a usability testing apparatus, including:
a first acquisition unit, configured to acquire a first link address and a test task of a test object;
a second acquisition unit, configured to acquire a data acquisition script for the usability test;
an embedding unit, configured to embed the data acquisition script into a page of the first link address;
a generating unit, configured to generate a second link address based on the first link address and the test task; and
a response unit, configured to respond to an access request for the second link address by jumping to the page of the first link address, so that the data acquisition script guides the usability test based on the test task and test data is collected by the data acquisition script.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor and a memory; the processor is configured to perform the steps of the usability testing method according to any embodiment of the first aspect by invoking a program or instructions stored in the memory.
In a fourth aspect, embodiments of the present disclosure further provide a non-transitory computer-readable storage medium storing a program or instructions for causing a computer to perform the steps of the usability testing method according to any embodiment of the first aspect.
In at least one embodiment of the present disclosure, the automatic collection of test data is realized by acquiring the data acquisition script for the usability test and embedding it into the page of the link address of the test object, and a link address for the usability test is generated based on the link address of the test object and the test task, so that the test user can complete the online test of the test object independently by accessing that link address. Compared with the current online test, there is no need to arrange test tasks or to transcribe and organize test data through remote communication with the test user, which improves test efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure, and other drawings may be obtained from these drawings by those of ordinary skill in the art without creative effort.
FIG. 1 is an exemplary application scenario diagram provided by an embodiment of the present disclosure;
FIG. 2 is a flowchart of a usability testing method provided by an embodiment of the present disclosure;
FIG. 3 is an interaction diagram of a usability testing method provided by an embodiment of the present disclosure;
FIG. 4 is a first interface schematic for adding test objects provided by an embodiment of the present disclosure;
FIG. 5 is a second interface schematic for embedding test objects provided by embodiments of the present disclosure;
FIG. 6 is a third interface schematic for adding test tasks provided by an embodiment of the present disclosure;
FIG. 7 is a fourth interface schematic for creating a test project provided by an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of an interface shown after a test project is created successfully, provided by an embodiment of the present disclosure;
FIG. 9 is a schematic illustration of an interface for initiating a test provided by an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of a floating layer page provided by an embodiment of the present disclosure;
FIG. 11 is a first state diagram of the floating layer page shown in FIG. 10;
FIG. 12 is a second state diagram of the floating layer page shown in FIG. 10;
FIG. 13 is a third state diagram of the floating layer page shown in FIG. 10;
FIG. 14 is a fourth state diagram of the floating layer page shown in FIG. 10;
FIG. 15 is a schematic diagram of a video display area and a video timeline area in an analysis interface provided by embodiments of the present disclosure;
FIG. 16 is a first state diagram of a multidimensional data detail tree region in an analysis interface provided in an embodiment of the present disclosure;
FIG. 17 is a second state diagram of a multidimensional data detail tree region in an analysis interface provided in an embodiment of the present disclosure;
FIG. 18 is a diagram of the page changes when feedback data is converted into a problem record, provided by an embodiment of the present disclosure;
FIG. 19 is a schematic diagram of a problem recording and scoring area in an analysis interface provided by an embodiment of the present disclosure;
FIG. 20 is an interface diagram of an ease-of-use test conclusion provided by an embodiment of the present disclosure;
FIG. 21 is a block diagram of a usability testing apparatus provided by an embodiment of the present disclosure;
fig. 22 is an exemplary block diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order that the above-recited objects, features, and advantages of the present disclosure may be more clearly understood, the disclosure is described in further detail below with reference to specific embodiments illustrated in the appended drawings. It is to be understood that the described embodiments are some, but not all, of the embodiments of the present disclosure, and that the specific embodiments described herein are illustrative rather than restrictive. All other embodiments derived by a person of ordinary skill in the art from the described embodiments fall within the scope of the present disclosure.
It should be noted that in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions.
In this document, the layout, style and color of the page, and the layout, style and color of the interface may all be flexibly changed according to actual needs.
The current offline test requires test tasks to be arranged manually and test data to be transcribed and organized by hand on site, so its efficiency is low, its labor cost is high, and the test user cannot complete the test independently; the current online test requires remote communication with the test user to arrange test tasks and to transcribe and organize test data manually, so its efficiency is also low and the test user likewise cannot complete the test independently. To address these problems, in at least one embodiment of the present disclosure, a data acquisition script for the usability test is acquired and embedded into the page of the link address of the test object, so that test data is collected automatically by the script; and a link address for the usability test is generated based on the link address of the test object and the test task, so that the test user can complete the online test of the test object independently by accessing that link address, without third-party manual intervention.
In at least one embodiment of the present disclosure, compared with the current offline test, there is no need to arrange test tasks manually or to transcribe and organize test data by hand on site, which improves test efficiency and reduces labor cost; compared with the current online test, there is no need to arrange test tasks or to transcribe and organize test data through remote communication with the test user, which improves test efficiency.
Fig. 1 is an exemplary application scenario diagram provided by an embodiment of the present disclosure. In fig. 1, a test object 11 may be understood as a product to be tested, such as a software system, an application, a page code, etc. that may be presented by linking pages of an address.
The usability testing apparatus 12 can collect test data automatically by acquiring the data acquisition script for the usability test and embedding it into the page of the link address of the test object 11, thereby solving the problem of low test efficiency caused by the need to transcribe test data manually.
In some embodiments, the manner of embedding the data acquisition script into the page of the link address of the test object 11 may be automatic, i.e. the usability testing apparatus 12 may directly access the link address of the test object 11, and embed the data acquisition script into the code corresponding to the page of the link address of the test object 11.
In some embodiments, the data acquisition script may be embedded into the page of the link address of the test object 11 manually, i.e., the usability testing apparatus 12 may provide the data acquisition script to a third party, e.g., a project-associated party, which then embeds the script into the page of the link address of the test object 11. A project-associated party may be understood as a person associated with the usability testing project, such as the observer and/or developer mentioned in the Background.
In fig. 1, the usability testing apparatus 12 may further generate a link address for the usability test based on the link address of the test object 11 and the test task, so that the test user can complete the online test of the test object 11 independently by accessing that link address, without third-party manual intervention. This solves the problem of low test efficiency caused by the need for on-site or online communication to arrange test tasks for the test user.
In fig. 1, a plurality of test user devices 13 are shown, each test user device 13 having a network connection function, and being capable of performing an online test of usability. Thus, there may be a plurality of users participating in the usability test, and each user may complete an independent test of the test object 11 using the network through the respective test user's device 13.
Taking a device 13 of a test user as an example, the device 13 of the test user can obtain a link address of the usability test from the usability test device 12 through a network, and then access the link address of the usability test through the network to realize online test on the test object 11.
In some embodiments, multiple users may utilize network collaboration to complete collaborative testing of test object 11 through respective test user's devices 13. For example, the devices 13 of the plurality of test users may acquire the link address of the usability test from the usability test apparatus 12 via the network, and then access the link address of the usability test via the network, and edit the same page corresponding to the link address. During editing, for each test user's device 13, the editing content is displayed in real time in a page, and the editing content is transmitted to the usability testing apparatus 12 via a network. The usability testing apparatus 12 may display the edited content of different users in a page according to a certain page editing policy so that the device 13 of each test user may display the edited content of all users based on the displayed layout. The page editing policy may follow a mature policy in the technical field of collaborative editing of pages by multiple people, which is not described in detail. Thus, for each device 13 of the test user, not only the content edited by the user himself for the page, but also the content edited by other users for the same page can be seen in real time.
Fig. 2 is a flowchart of a usability testing method provided by an embodiment of the present disclosure. In some embodiments, the usability testing method shown in fig. 2 is suitable for the application scenario shown in fig. 1; accordingly, the usability testing apparatus 12 in fig. 1 is the execution subject of the method shown in fig. 2. For convenience of description, the following embodiments describe the flow of the usability testing method with the usability testing apparatus as the execution subject.
As shown in fig. 2, in step 201, the usability testing apparatus acquires a first link address and a test task of a test object. The test task may include, but is not limited to, the name of the test task and the operation description of the test task. In some embodiments, in addition to the first link address of the test object, the name of the test object may also be acquired. In some embodiments, the link address may be a URL (Uniform Resource Locator), or may be a character string of any form, such as a unique identifier (ID).
In some embodiments, the usability testing apparatus may present an interactive interface including an input box for the first link address of the test object, an input box for the test task, and a confirmation (or submission) control. The project-associated party inputs the first link address of the test object and the test task in the interactive interface and clicks the confirmation (or submission) control; in response to the click on the control, the usability testing apparatus acquires the first link address and the test task entered in the input boxes of the interactive interface.
In some embodiments, the usability testing apparatus may expose two interactive interfaces, one for adding the test object and the other for adding the test task. For example, the interface for adding the test object includes an input box for the first link address of the test object and a confirmation (or submission) control, while the interface for adding the test task includes an input box for the test task and a confirmation (or submission) control. The usability testing apparatus first displays the interface for adding the test object and, after obtaining the first link address of the test object, displays the interface for adding the test task to obtain the test task.
In step 202, the usability testing apparatus obtains the data acquisition script for the usability test. In some embodiments, the data acquisition script may be the front-end script of a collection probe. The collection probe may be a Web script program, essentially a script file that implements collection functions through a Web page programming language (ASP, PHP, ASP.NET, etc.), and can detect and collect relevant data in a page, such as user operation data, time data, dynamically displayed Web page data, video data, and page jump data. Collection functions include, for example, screen recording and creating a floating-layer page for collecting text, images, and other data entered by the user.
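As a purely illustrative sketch of the kinds of records such a probe might emit (the class, field names, and event kinds below are assumptions for illustration and are not prescribed by the disclosure):

```python
# Hypothetical sketch: records a collection probe might emit for a page.
# The disclosure names the data categories (user operations, time data,
# page jumps, etc.); the concrete shape here is invented for illustration.
import time
from dataclasses import dataclass, field

@dataclass
class CollectedEvent:
    kind: str                 # e.g. "click", "input", "page_jump", "feedback"
    timestamp: float = field(default_factory=time.time)
    detail: dict = field(default_factory=dict)

events = [
    CollectedEvent("click", detail={"selector": "#submit"}),
    CollectedEvent("page_jump", detail={"from": "/cart", "to": "/checkout"}),
]
print([e.kind for e in events])
```

A real probe would run in the page itself and stream such records back to the usability testing apparatus over the network.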
In some embodiments, the usability testing apparatus may automatically generate the data collection script for the test object after acquiring the first link address of the test object. In some embodiments, the usability testing apparatus may invoke a pre-set data collection script.
In step 203, the usability testing apparatus embeds the data acquisition script into the page of the first link address. In some embodiments, the usability testing apparatus adds the data acquisition script inside the <body> tag of the page of the first link address, e.g., as the first line of the <body> tag content.
In some embodiments, the usability testing apparatus may directly access the link address of the test object, embed the data collection script in a page of the link address of the test object.
In some embodiments, the usability testing apparatus may provide the data acquisition script to the project-associated party, which then embeds it into the page of the link address of the test object.
For example, after generating the data acquisition script, the usability testing apparatus displays a prompt interface containing prompt information that instructs the project-associated party to copy the data acquisition script and add it to the first line of the <body> tag content of the page of the first link address. The project-associated party then copies the data acquisition script, opens the page of the first link address, and adds the script to the first line of the <body> tag content.
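The embedding step can be sketched as follows. This is a minimal, hypothetical illustration (the page content and script URL are invented for the example); it inserts a script tag as the first line of the page's <body> content:

```python
# Hypothetical sketch: insert a data-acquisition <script> tag as the first
# line of the <body> content of the test object's page. Assumes the page
# source is available as a string containing a <body ...> tag.
import re

def embed_acquisition_script(page_html: str, script_tag: str) -> str:
    # Find the opening <body> tag (with or without attributes) and place
    # the script tag immediately after it.
    match = re.search(r"<body[^>]*>", page_html, flags=re.IGNORECASE)
    if match is None:
        raise ValueError("page has no <body> tag")
    insert_at = match.end()
    return page_html[:insert_at] + "\n" + script_tag + page_html[insert_at:]

page = "<html><body><h1>Test object</h1></body></html>"
tag = '<script src="https://example.com/collect.js"></script>'
print(embed_acquisition_script(page, tag))
```

Whether this insertion is done automatically by the apparatus or manually by the project-associated party, the result is the same: the script loads before the rest of the body content.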
In step 204, the usability testing apparatus generates a second link address based on the first link address and the test task. Specifically, the usability testing apparatus may package the name of the test object, the first link address, the name of the test task, and the operation description of the test task as the second link address.
In some embodiments, after acquiring the test task, the usability testing apparatus displays an interactive interface for creating a test project; the project-associated party can use this interface to create a test project for the test object, and thus manage each usability test in terms of test projects.
In some embodiments, the name of the test object, the first link address, and the test task are displayed in the interactive interface for creating the test project so that the project-associated party can verify that they are correct, and an input box for the project name and a create (or submit) control are provided in the interface so that the project-associated party can enter a project name for the current usability test. After confirming that the displayed information is correct and entering the project name, the project-associated party clicks the create (or submit) control to complete creation of the test project. Accordingly, in response to the click on the create (or submit) control, the usability testing apparatus encapsulates the project name, the name of the test object, the first link address, and the test task as the second link address.
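One hedged way to encapsulate this information as a second link address is to encode it as URL query parameters. The base URL and the parameter names (`project`, `object`, `target`, `task`, `desc`) are illustrative assumptions; the disclosure does not fix a particular encoding:

```python
# Hypothetical sketch: package the test-project information into a single
# "second link address" as URL query parameters. Names are illustrative.
from urllib.parse import urlencode

def build_second_link(base: str, project: str, obj_name: str,
                      first_link: str, task_name: str, task_desc: str) -> str:
    params = {
        "project": project,
        "object": obj_name,
        "target": first_link,   # the first link address of the test object
        "task": task_name,
        "desc": task_desc,
    }
    return base + "?" + urlencode(params)

link = build_second_link("https://test.example.com/run",
                         "Demo project", "Checkout page",
                         "https://shop.example.com/checkout",
                         "Place an order", "Add an item and pay")
print(link)
```

An opaque token that the apparatus resolves server-side would serve equally well; query parameters are used here only because they make the encapsulation visible.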
After the second link address is generated, the usability testing apparatus can prompt the project-associated party to copy the second link address and send it to the test user, so that the test user can access the second link address through the device 13 of the test user shown in fig. 1 and complete the online test of the test object independently, without third-party manual intervention.
In step 205, in response to the access request for the second link address, the usability testing apparatus jumps to the page of the first link address, so that the data acquisition script guides the usability test based on the test task and test data is collected by the script. In this way, the data acquisition script guides the test user to complete the usability test independently based on the test task, without manual intervention.
In some embodiments, in response to the access request for the second link address, the usability testing apparatus displays a start-test interface; the test user enters a name in the start-test interface, confirms that the usability testing service agreement has been read, and clicks the start-test control. Accordingly, in response to the click on the start-test control, the usability testing apparatus jumps to the page of the first link address, so that the data acquisition script guides the usability test based on the test task and collects the test data.
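Assuming the second link address encodes the test information as URL query parameters (an illustrative assumption; the parameter names `target`, `task`, and `desc` are invented for this sketch), resolving an access request back to the first link address and test task could look like:

```python
# Hypothetical sketch: on access to the second link address, recover the
# first link address and test task from the query string so the apparatus
# can jump to the test object's page. Parameter names are illustrative.
from urllib.parse import urlparse, parse_qs

def resolve_second_link(second_link: str) -> dict:
    query = parse_qs(urlparse(second_link).query)
    return {
        "first_link": query["target"][0],
        "task_name": query["task"][0],
        "task_desc": query["desc"][0],
    }

sample = ("https://test.example.com/run?"
          "target=https%3A%2F%2Fshop.example.com%2Fcheckout"
          "&task=Place+an+order&desc=Add+an+item+and+pay")
info = resolve_second_link(sample)
print(info["first_link"])
```

The apparatus would then issue a redirect to `info["first_link"]`, while the embedded script uses the task name and description to drive the floating-layer guidance.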
In some embodiments, the data acquisition script guides the usability test based on the test task, for example, in the following manner: the data acquisition script creates a floating-layer page within the page of the first link address, and the floating-layer page is used to guide execution of the test task. For example, the name and the operation description of the test task can be displayed in the created floating-layer page, so that the test user learns the details of the test task and then operates independently in the page of the first link address according to the operation description, while the data acquisition script automatically collects the test data generated during the test user's operations.
To more clearly illustrate the interactions between the different principals, the interaction flow of the usability testing method is described below in conjunction with FIG. 3. Fig. 3 is an interaction diagram of a usability testing method provided by an embodiment of the present disclosure; it should be noted that fig. 3 shows only some embodiments of the present disclosure, not all of them.
As shown in fig. 3, four types of principals are involved in the interaction flow of the whole usability testing method: the device of the test user, the device of the project-associated party, the usability testing apparatus, and the test object.
In some embodiments, the usability testing apparatus may be a software apparatus, a hardware apparatus, or a combination of software and hardware. For example, if the usability testing apparatus is a software apparatus running on an operating system, the operating system may be that of a mobile device (iOS, Android, etc.), in which case the apparatus may be installed on a mobile device such as a smartphone or a tablet computer; or the operating system may be that of a fixed device (Windows, Linux, etc.), in which case the apparatus may be installed on a fixed device such as a desktop computer or a cloud server.
In some embodiments, if the usability testing apparatus is a software apparatus, the usability testing apparatus may be installed in the device of the project associate; if the usability testing device is a hardware device or a device combining hardware and software, the usability testing device may be disposed inside the equipment of the project-related party, or may be implemented as a device independent of the equipment of the project-related party.
In fig. 3, the test user uses the device of the test user, and the project-associated party uses the device of the project-associated party. The interactions among the device of the test user, the device of the project-associated party, the usability testing apparatus, and the test object are as follows:
In fig. 3, the device of the project-associated party adds a test object, including the name of the test object and its first link address, in the interactive interface for adding a test object displayed by the usability testing apparatus. In this way, the usability testing apparatus receives the name of the test object and the first link address of the test object through the interactive interface.
In fig. 3, after receiving the name of the test object and the first link address of the test object, the usability testing apparatus generates a data acquisition script for the test object and may provide the data acquisition script to the device of the project-associated party.
For example, after generating the data acquisition script, the usability testing apparatus displays a prompt interface containing prompt information that instructs the project-associated party to copy the data acquisition script and add it to the first line of the <body> tag content of the page of the first link address. The project-associated party then copies the data acquisition script, opens the page of the first link address, adds the script to the first line of the <body> tag content, and thereby completes the script embedding shown in fig. 3.
In fig. 3, the test object with the embedded data acquisition script may be tested manually to verify the embedding; after the embedding test passes, the project-associated party may be notified to add a test task and create a test project associated with the test task.
In some embodiments, after the embedding test passes, the usability testing apparatus may present an interactive interface for adding the test task so that the project-associated party can add the test task, including its name and operation description. After receiving the name and operation description of the test task, the usability testing apparatus displays an interactive interface for creating a test project so that the project-associated party can create a test project associated with the test task; after creating the project, the project-associated party clicks the create control in the interface, and in response to the click, the usability testing apparatus generates a test link, i.e., the second link address.
In fig. 3, the usability testing apparatus may inform the device of the project associate of the test link after generating the test link. For example, after the test link is generated, the usability test device displays an interface for successfully creating the test item, displays the generated test link in the interface, prompts the item association party to copy the test link and send the test link to the test user, and then the equipment of the test user can access the test link to perform the usability test.
In FIG. 3, the device of the project-associated party sends the test link to the device of the test user; the device of the test user accesses the test link; and the usability testing apparatus responds to the access request for the test link by displaying a start-test interface. The test user inputs a name in the start-test interface and, after confirming that the usability test service agreement has been read, clicks a start-test control in that interface. In response to the click operation on the start-test control, the usability testing apparatus performs privacy authorization on the device of the test user and then jumps to the page at the first link address, so that the data acquisition script guides the usability test based on the test task, and test data are acquired by the data acquisition script.
In fig. 3, the data acquisition script may initiate screen recording after the test is started, acquire test data, and send the test data to the usability testing apparatus. And after all the test tasks are completed, the data acquisition script closes the screen recording, and the usability test is finished.
In order to more intuitively understand the flow of the usability testing method according to the embodiment of the present disclosure, the process of generating the test item in the usability testing method is described below with reference to fig. 4 to 9.
Test item generation
Taking the usability testing apparatus as a software apparatus for example, the usability testing apparatus is installed on the device of the project-associated party. The device of the project-associated party starts the usability testing apparatus; for example, the project-associated party clicks or double-clicks an icon of the usability testing apparatus to access it.
The usability testing apparatus displays a first interface for adding the test object in response to the start operation (click or double click operation). The first interface comprises an input box of test object configuration information and a first control. Test object configuration information may include, but is not limited to: the name of the test object and the first link address of the test object.
Fig. 4 is a schematic diagram of a first interface for adding a test object according to an embodiment of the disclosure. In fig. 4, a product is a test object, an added product is an added test object, a product name is a test object name, a product link address is a first link address of the test object, and a "next" control is a first control. The first interface shown in fig. 4 includes an input box for the product name, an input box for the product link address, and a "next" control.
In response to the click operation on the "next" control (in fig. 4), the usability testing apparatus acquires the product name and the product link address from the input boxes and displays a second interface for embedding the test object. Embedding prompt information and a second control are displayed in the second interface. The prompt information instructs the project-associated party to copy the data acquisition script and add it as the first line of the <body> tag content of the page at the product link address.
Fig. 5 is a schematic diagram of a second interface for embedding a test object according to an embodiment of the present disclosure. The embedding prompt information displayed in the second interface shown in fig. 5 is: "Please paste the following code into the first line of the product's <body> content to embed the product in the ETest tool." The ETest tool is the usability testing apparatus. Also shown in FIG. 5 are the code of the data acquisition script and a "copy code" control. The project-associated party clicks the "copy code" control to quickly copy the code. In addition, fig. 5 also includes a "next" control, which is the second control.
The usability testing apparatus presents a third interface for adding test tasks in response to clicking of the "next" control (in fig. 5). The third interface comprises an input box of the test task configuration information and a third control. Test task configuration information includes, but is not limited to: the name of the test task and the operational description of the test task.
Fig. 6 is a schematic diagram of a third interface for adding a test task according to an embodiment of the disclosure. In fig. 6, the task name is the name of the test task, the task description is the operation description of the test task, and the "determine" control is the third control. The third interface shown in fig. 6 includes an input box for the task name, an input box for the task description, a "determine" control, and an "add test task" control. By clicking the "add test task" control, the project-associated party may add multiple test tasks.
In response to the click operation on the "determine" control (in fig. 6), the usability testing apparatus presents a fourth interface for creating a test item. The test object configuration information, the test task configuration information, an input box for the test item name, and a fourth control are displayed in the fourth interface.
Fig. 7 is a schematic diagram of a fourth interface for creating a test item provided by an embodiment of the present disclosure. In fig. 7, the product is the name of the test object, the project name is the test item name, the test start page is the first link address of the test object, and the "submit" control is the fourth control. The fourth interface shown in fig. 7 includes an input box for the product, an input box for the project name, an input box for the test start page, labels of the test tasks, an "add test task" control, and a "submit" control. There may be multiple test task labels, one label per test task. The input box for the product and the input box for the test start page display, by default, the product and the test start page acquired by the usability testing apparatus. The project-associated party clicks the input box for the product or the test start page to modify the corresponding value, and clicks the input box for the project name to name the project. Clicking the × in a test task's label deletes that label, i.e., deletes the test task. By clicking the "add test task" control, the project-associated party may add multiple test tasks.
In response to the click operation on the "submit" control (in fig. 7), the usability testing apparatus obtains the project name and encapsulates the associated product, the test start page, the test tasks, and the project name into the second link address. After the second link address is generated, the usability testing apparatus displays an interface indicating that the test item was created successfully, displays the second link address in that interface, and prompts the project-associated party to copy the second link address and send it to the test user; the device of the test user can then access the second link address to perform the usability test.
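One plausible way to encapsulate the configuration into a test link is to encode it as query parameters. This is a sketch under assumptions: the base URL, parameter names, and JSON encoding of tasks are invented for illustration; the disclosure does not specify an encoding.

```javascript
// Sketch: encapsulate product, start page, tasks, and project name
// into a "second link address". All names here are illustrative.
function buildTestLink(base, { product, startPage, projectName, tasks }) {
  const params = new URLSearchParams({
    product,
    start: startPage,
    project: projectName,
    tasks: JSON.stringify(tasks), // name + operation description per task
  });
  return `${base}?${params.toString()}`;
}

const link = buildTestLink("https://etest.example.com/run", {
  product: "Demo product",
  startPage: "https://example.com/index.html",
  projectName: "Checkout usability study",
  tasks: [{ name: "Task 1", description: "Add an item to the cart" }],
});
```

A device that opens such a link can recover the start page and task list with `new URL(link).searchParams`, which is what the start-test flow described below would need.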
FIG. 8 is a schematic diagram of an interface for successfully creating a test item according to an embodiment of the present disclosure. The interface shown in fig. 8 includes a copy control. The project-associated party clicks the copy control to quickly copy the second link address. In addition, in fig. 8, a prompt message is also displayed: "Please send the following link address to the target user; the tool assistant will guide them through the usability test", prompting the project-associated party to copy the second link address and send it to the test user. The target user is the test user, and the tool assistant can be understood as the auxiliary testing function provided by the data acquisition script.
The project-associated party copies the second link address and sends it to the test user. After obtaining the second link address, the test user clicks it to enter the start-test interface and test the test object.
For example, the test user clicks the second link address, and the usability testing apparatus responds to the access request for the second link address by displaying a start-test interface as shown in fig. 9, which includes a name input box and a "start test" control. After inputting a name in the name input box and confirming that the usability test service agreement has been read, the test user clicks the "start test" control to test the test object.
In order to more intuitively understand the flow of the usability testing method according to the embodiments of the present disclosure, the process of testing data collection in the usability testing method is described below with reference to fig. 10 to 14.
Test data acquisition
The usability testing device responds to clicking operation of the "start test" control in fig. 9, jumps to the page of the first link address, so that the data acquisition script guides usability testing based on the testing task, and acquires testing data based on the data acquisition script.
In some embodiments, after jumping to the page of the first link address, the data acquisition script creates a floating layer page in the page of the first link address; the floating layer page is used for guiding the execution of the test task. In some embodiments, after the floating layer page is created, the data acquisition script performs screen recording to generate a screen recording file, wherein the screen recording is an html-based web screen recording.
In some embodiments, a chat interface is displayed in the floating layer page, the chat interface including a dialog display area, a dialog input box, and a send control. The data acquisition script sends the test task (comprising the name of the test task and the operation description of the test task) to the dialog display area through a virtual assistant, replacing a human in welcoming the test user and assigning the test tasks, and guides the test user through the usability test. The dialog input box displays prompt information by default, such as "Hit a pain point? Complain about it here", so that when the test user encounters a problem or pain point, the user knows to enter it in the dialog input box and click the send control to send it to the dialog display area.
Fig. 10 is a schematic diagram of a floating layer page according to an embodiment of the disclosure. In fig. 10, a chat interface is shown, which includes a dialog display area 101, a dialog input box 102, and a send control 103. The data acquisition script sends the test tasks to the dialog display area 101 via the virtual assistant 104. The data acquisition script sends the content of the dialog input box 102 to the dialog display area 101 in response to a click operation on the send control 103, as in the first state diagram of the floating layer page shown in fig. 11. The content in the dialog input box 102 is information entered by the test user during the test, such as feelings, evaluations, and complaints.
In some embodiments, a test toolbar may also be included in the floating layer page, with the test toolbar including the first control. The states of the first control include a task switch state and a task complete state.
When the state of the first control is the task switching state, the data acquisition script responds to a click operation on the first control by sending another test task to the dialog display area through the virtual assistant.

When the state of the first control is the task completion state, the data acquisition script responds to a click operation on the first control by displaying a usability evaluation interface. The test user can score on the usability evaluation interface and make comments and suggestions based on intuitive impressions.
For example, in FIG. 10, a test toolbar 105 is also included in the floating layer page, including a first control 106. The state of the first control 106 includes a task switch state (e.g., the "next task" shown in fig. 10). The data acquisition script responds to the clicking operation of the first control 106 based on the state of the first control 106 being the "next task" and sends another test task to the dialog display area 101 via the virtual assistant 104.
In some embodiments, task state information is displayed in the test toolbar. The task state information includes: the total number of test tasks and the serial number of the task currently being tested. For example, in FIG. 10, task state information 107 is displayed in the test toolbar 105. The task status information 107 is, for example, "task 1/4" shown in fig. 10, indicating that the total number of test tasks is 4 and the serial number of the task currently being tested is 1. When the state of the first control 106 is "next task", and the test user clicks the first control 106, the task state information 107 becomes "task 2/4", indicating that the serial number of the task currently being tested is 2.
In some embodiments, if all the test tasks have been tested, that is, the serial number of the task currently being tested in the task state information is the serial number of the last task and the test user has completed the test of the last task, the data acquisition script modifies the state of the first control to the task completion state; in fig. 10, "next task" will then be displayed as "complete and score", and the usability evaluation interface is displayed in response to a click operation on the first control.
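The first control's state logic described above can be summarized in a minimal sketch: the control reads "next task" until the last task is reached, then switches to the task completion state. This is an illustrative assumption about the implementation, not the patented code; the class and method names are invented.

```javascript
// Minimal sketch of the first control's state machine.
class TaskController {
  constructor(taskCount) {
    this.total = taskCount;
    this.current = 1; // serial number of the task currently being tested
  }
  get stateInfo() {
    // e.g. "task 1/4" as shown in the test toolbar
    return `task ${this.current}/${this.total}`;
  }
  get controlState() {
    // task switching state until the last task, then task completion state
    return this.current < this.total ? "next task" : "complete and score";
  }
  clickFirstControl() {
    if (this.controlState === "next task") {
      this.current += 1;        // task switch: send the next task
      return "switched";
    }
    return "show-evaluation";   // task complete: show the evaluation interface
  }
}

const ctl = new TaskController(4);
```

With four tasks, three clicks advance through "task 2/4" to "task 4/4", at which point the control label changes and a further click opens the evaluation interface.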
In some embodiments, a voice control is also included in the test toolbar. In response to a click operation on the voice control, the data acquisition script acquires voice data, recognizes the voice data, converts the recognition result into text, and displays the text in the dialog input box. After confirming that the recognized text is correct, the test user clicks the send control to display the text in the dialog display area; after finding that the recognized text is wrong, the test user can modify it directly in the dialog input box and click the send control after modification to display the modified text in the dialog display area. In response to the click operation on the send control, the data acquisition script sends the text in the dialog input box to the dialog display area.
For example, in FIG. 10, a voice control 108 is also included in the test toolbar 105. After the test user clicks the voice control 108, the test user's device may collect the test user's voice data. The data acquisition script responds to clicking operation of the voice control 108, acquires voice data, identifies the voice data, converts an identification result into characters, and displays the characters in the dialogue input box 102; when the test user clicks the send control 103, the data acquisition script responds to the clicking operation of the send control 103 to send the text in the dialog input box 102 to the dialog display area 101, as shown in the first state diagram of the floating layer page in fig. 11.
In some embodiments, a second control is also included in the floating layer page. The states of the second control include a collapsed state and an expanded state.

When the state of the second control is the collapsed state, the data acquisition script responds to a click operation on the second control by collapsing the chat interface and modifying the state of the second control to the expanded state.

When the state of the second control is the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface and modifying the state of the second control to the collapsed state.
For example, in fig. 10, a second control 109 is also included in the floating layer page, and the state of the second control 109 is the collapsed state (upward arrow). After finishing a test task, the test user can click the second control 109 to collapse the chat interface; the data acquisition script collapses the chat interface in response to the click operation on the second control 109 and changes the state of the second control to the expanded state (downward arrow), as in the second state diagram of the floating layer page shown in fig. 12, that is, the state of the floating layer page after the chat interface is collapsed.

In fig. 12, only the test toolbar is retained, and the state of the second control is modified to the expanded state (downward arrow). When the user clicks the second control, the chat interface expands and is redisplayed as the floating layer page shown in fig. 10: the data acquisition script expands the chat interface in response to the click operation on the second control and modifies the state of the second control to the collapsed state (upward arrow), like the second control 109 shown in fig. 10, whose state is the collapsed state (upward arrow).
In fig. 12, the "complete and score" control is the first control 106 in fig. 10, and the state of the first control 106 is the task completion state. In response to a click operation on the "complete and score" control, the data acquisition script displays the usability evaluation interface. The test user can score on the usability evaluation interface and make comments and suggestions based on intuitive impressions.
In fig. 12, "task 4/4" means that the total number of test tasks is 4, and the test user has completed the test of all the test tasks.
In some embodiments, after the chat interface is collapsed, the data acquisition script responds to a click operation on the voice control by acquiring and recognizing voice data and converting the recognition result into text. The data acquisition script displays a dialog input box and a send control under the test toolbar and displays the recognized text in the dialog input box. In response to a click operation on the send control, the data acquisition script displays a preview area and shows the text from the dialog input box there. In this embodiment, the preview area displays only the speech-recognized text and does not display the dialog history.
For example, fig. 13 is a third state diagram of the floating layer page shown in fig. 10, that is, the state after the chat interface is collapsed and the test user clicks the voice control. In fig. 13, the data acquisition script acquires and recognizes voice data in response to the click operation on the voice control and converts the recognition result into text. The data acquisition script displays a dialog input box and a send control under the test toolbar and displays the recognized text in the dialog input box. In fig. 13, the data acquisition script modifies the state of the second control to the collapsed state (upward arrow) so that the test user can click the second control to collapse the dialog input box and the send control, returning to the second state diagram of the floating layer page shown in fig. 12. Accordingly, the data acquisition script, in response to the click operation on the second control, collapses the dialog input box and the send control and modifies the state of the second control to the expanded state (downward arrow).
In fig. 13, if the test user clicks the send control, the data acquisition script displays a preview area in response to the click operation and shows the text from the dialog input box there, as shown in fig. 14. In fig. 14, the preview area displays only the speech-recognized text and does not display the dialog history.

After the preview area is displayed in fig. 14, the state of the second control remains unchanged (upward arrow), so the test user can click the second control to collapse the dialog input box, the send control, and the preview area, leaving only the test toolbar and returning to the second state diagram of the floating layer page shown in fig. 12. Accordingly, the data acquisition script, in response to the click operation on the second control, collapses the dialog input box, the send control, and the preview area, and modifies the state of the second control to the expanded state (downward arrow).
In some embodiments, when the state of the second control is the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface and displaying, in the dialog display area, both the dialog history and the text shown in the preview area.
In some embodiments, the test data include a screen recording file, front-end behavior data, feedback data acquired through the floating layer page, front-end exception errors, and/or usability evaluation data. When the test user clicks the "complete and score" control shown in fig. 12, the data acquisition script uploads the test data, e.g., to a cloud server; alternatively, the data acquisition script sends the test data directly to the usability testing apparatus.
The screen recording file is a video file, and the data acquisition script can set video parameters such as frame rate, resolution and the like corresponding to the video file; the data acquisition script may also not set video parameters, but acquire front-end system parameters before recording the screen, and determine the video parameters based on the front-end system parameters, where the front-end system parameters include at least one of screen resolution, remaining storage space, and processor utilization.
For example, the resolution of the screen recording may be determined based on the screen resolution in the front-end system parameters. For another example, whether there is enough space to store the screen recording file may be determined based on the remaining storage space; if not, the resolution of the screen recording should be reduced. For another example, whether the processor can support recording of the screen recording file may be determined based on the processor utilization; if not, the frame rate of the screen recording should be reduced.
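These heuristics can be sketched as a small parameter-selection function. The thresholds, field names, and fallback values below are illustrative assumptions, not values specified by the disclosure.

```javascript
// Sketch: choose recording parameters from front-end system parameters.
// Thresholds (500 MB, 80% CPU) are invented for illustration.
function chooseVideoParams({ screenWidth, screenHeight, freeStorageMB, cpuUtilization }) {
  let width = screenWidth;
  let height = screenHeight;
  let frameRate = 30; // assumed default frame rate

  // Not enough storage headroom: halve the recording resolution.
  if (freeStorageMB < 500) {
    width = Math.round(width / 2);
    height = Math.round(height / 2);
  }
  // Processor already busy: lower the frame rate instead of dropping frames later.
  if (cpuUtilization > 0.8) frameRate = 15;

  return { width, height, frameRate };
}

// A constrained device: low storage and a busy CPU.
const constrained = chooseVideoParams({
  screenWidth: 1920, screenHeight: 1080, freeStorageMB: 200, cpuUtilization: 0.9,
});
// An unconstrained device keeps the native resolution and default frame rate.
const relaxed = chooseVideoParams({
  screenWidth: 1280, screenHeight: 720, freeStorageMB: 2000, cpuUtilization: 0.2,
});
```

In a browser, the chosen values would feed whatever recording mechanism the script uses; the decision logic itself is independent of that mechanism.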
The front-end behavior data include: clicks, inputs, movements, jumps, execution times, and the like. A click may be a mouse click, an input may be a keyboard input, a jump may be a page jump, and the execution time may be the execution time of a front-end behavior.
The feedback data are data input by the user in the floating layer page, such as complaints, feelings, evaluations, pictures, and voice.
The usability evaluation data are data input by the test user in the usability evaluation interface, such as scores, comments, and suggestions.
In order to more intuitively understand the flow of the usability testing method according to the embodiments of the present disclosure, the process of analyzing test data in the usability testing method is described below with reference to fig. 15 to 20.
Test data analysis
After the usability testing device obtains the test data of the test user, it can analyze the test data automatically, replacing manual analysis and improving analysis efficiency. The test data comprise a screen recording file, front-end behavior data, feedback data acquired through the floating layer page, front-end exception errors, and/or usability evaluation data.
In some embodiments, the test data is collected by the data collection script and uploaded to the cloud server, and the usability testing apparatus may obtain the test data from the cloud server. In some embodiments, the test data is collected by a data collection script and sent directly to the usability testing apparatus.
In some embodiments, after the usability testing device obtains the test data of the test user, the analysis interface is displayed based on the test data; wherein the analysis interface includes a video display area.
The usability testing apparatus generates a front-end behavior trace based on the front-end behavior data. For example, the usability testing apparatus may generate the front-end behavior trace based on clicks, inputs, movements, jumps, execution times, etc. in the front-end behavior data.
The usability testing device matches the front-end behavior track with the video picture of the screen recording file, so that the time alignment of the video picture with each behavior in the front-end behavior track is realized, namely, the video picture when a certain behavior occurs is matched with the behavior.
The usability testing device plays the video picture of the screen recording file and the matched front-end behavior track in the video display area, namely, the front-end behavior track is displayed in the video picture while the video picture is played.
For example, fig. 15 shows a video display area in the analysis interface, and the front-end behavior trace is synchronously displayed in the video picture played in the video display area.
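The time alignment between the front-end behavior trace and the video picture can be sketched as follows. This is a minimal illustration assuming each behavior carries a millisecond timestamp and that the recording start time is known; the field names are invented for the sketch.

```javascript
// Sketch: convert behavior timestamps into offsets within the screen
// recording so each behavior can be overlaid on the matching video frame.
function alignTrace(behaviors, recordingStartMs) {
  return behaviors
    .filter((b) => b.timestamp >= recordingStartMs) // drop pre-recording events
    .map((b) => ({ ...b, videoOffsetMs: b.timestamp - recordingStartMs }))
    .sort((a, b) => a.videoOffsetMs - b.videoOffsetMs);
}

const trace = alignTrace(
  [
    { type: "click", timestamp: 1000 },
    { type: "jump", timestamp: 4000 },
    { type: "input", timestamp: 2500 },
  ],
  500 // recording started at t = 500 ms
);
```

During playback, the player would show each behavior when the video's current offset reaches that behavior's `videoOffsetMs`, which is the "time alignment" described above.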
In some embodiments, the analysis interface includes a video timeline area. For example, a video timeline area in an analysis interface is shown in FIG. 15. The video time axis region displays a time axis with preset length, and the time axis is provided with a play positioning mark which is used for marking the current play progress of the video.
Based on the video duration of the screen recording file, the usability testing device determines the correspondence between the length of the time axis and the video duration, so that the current playing progress marked by the play positioning mark on the time axis corresponds to the video picture of the screen recording file displayed in the video display area, with no mismatch between the time and the video picture.
Based on the correspondence between the length of the time axis and the video duration, the usability testing device marks the time axis with the front-end behavior data, the feedback data, and/or the front-end exception errors. In some embodiments, the front-end behavior data, the feedback data, and the front-end exception errors correspond to different icons and/or colors, respectively.
In some embodiments, after the usability testing device marks the time axis, each mark and its corresponding icon are displayed above the time axis, so that the project-associated party can determine what kind of test data each mark corresponds to, and can also click a mark to quickly locate the corresponding test data.
For example, in fig. 15, the positions of the marks of different test data are shown above the time axis; the marks are dots, a flag indicates that one test task has finished testing, and icon A indicates a problem mark.
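Placing a mark on the fixed-length time axis reduces to mapping a timestamp through the length-to-duration correspondence described above. The sketch below assumes a pixel-based axis; the names and the clamping behavior are illustrative assumptions.

```javascript
// Sketch: map a test-data timestamp (ms into the recording) to a pixel
// position on a time axis of fixed length.
function markPosition(timestampMs, videoDurationMs, axisLengthPx) {
  // Clamp so out-of-range data still lands on the axis ends.
  const clamped = Math.min(Math.max(timestampMs, 0), videoDurationMs);
  return Math.round((clamped / videoDurationMs) * axisLengthPx);
}

// A mark halfway through a 60 s video on an 800 px axis.
const midMark = markPosition(30000, 60000, 800);
// A timestamp past the end of the video is clamped to the axis end.
const endMark = markPosition(90000, 60000, 800);
```

The same mapping, run in reverse, lets a click on a mark move the play positioning mark to the corresponding moment, as the next paragraph describes.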
In some embodiments, in response to a click operation on a mark, the usability testing device moves the play positioning mark along the time axis to the moment corresponding to the mark, then plays, in the video display area, the video picture at that moment together with the matched front-end behavior trace, and continues playing the video.
In some embodiments, the analysis interface includes a multi-dimensional data detail tree area. The usability testing device constructs a time sequence detail tree based on front-end behavior data, feedback data and/or front-end abnormal error reporting; and further displaying the timing detail tree in a multi-dimensional data detail tree area.
For example, fig. 16 is a first state diagram of the multidimensional data detail tree area in the analysis interface provided by an embodiment of the present disclosure. In fig. 16, a timing detail tree 161 is shown in the multidimensional data detail tree area; the timing detail tree 161 sorts the front-end behavior data, the feedback data, the front-end exception errors, and the problem marks in time order. The front-end behavior data include: page jumps, mouse clicks, keyboard inputs, execution times of front-end behaviors, and the like.
In some embodiments, the timing detail tree includes a plurality of nodes, each node including details of the data and an icon, and the hierarchical relationship between nodes is determined based on the timing of the data. For example, in FIG. 16, one node in the timing detail tree is a problem mark, and that node includes a problem-mark icon 161-1 and details 161-2.
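A simplified sketch of building the timing detail tree: the behavior data, feedback, and front-end errors are merged and sorted by timestamp, each becoming a node with an icon and details. The icon names and the flat (single-level) structure are assumptions made for illustration.

```javascript
// Sketch: merge the three data sources into time-ordered detail-tree nodes.
function buildTimingDetailTree(behaviors, feedback, errors) {
  const tag = (items, icon) => items.map((it) => ({ icon, ...it }));
  return [
    ...tag(behaviors, "behavior"),
    ...tag(feedback, "feedback"),
    ...tag(errors, "error"),
  ].sort((a, b) => a.timestamp - b.timestamp);
}

const tree = buildTimingDetailTree(
  [{ timestamp: 100, details: "page jump" }],
  [{ timestamp: 250, details: "hard to find the cart" }],
  [{ timestamp: 180, details: "TypeError in checkout.js" }] // invented example error
);

// The "multidimensional auxiliary analysis" view hides behavior nodes and
// keeps only user feedback and front-end errors.
const auxiliaryView = tree.filter((n) => n.icon !== "behavior");
```

The two views correspond to the two states of the third control described below: the full tree is the user behavior trace view, and the filtered list is the auxiliary analysis view.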
In some embodiments, a third control is included in the multi-dimensional data detail tree region. The states of the third control include a first state describing a user behavior trace and a second state describing a multidimensional auxiliary analysis. For example, in FIG. 16, the third control 162 may be a drop-down list box in which options include "user behavior trace" and "multidimensional auxiliary analysis".
In response to a triggering operation of the first state, the usability testing apparatus displays the timing detail tree in the multidimensional data detail tree area. In fig. 16, a triggering operation of the first state is, for example, the project-associated party clicking the third control 162 and selecting "user behavior trace". The state of the third control 162 is then the first state, i.e., user behavior trace. After the project-associated party clicks the third control 162 and selects "user behavior trace", the multidimensional data detail tree area shown in fig. 16 is displayed.
In response to a triggering operation of the second state, the usability testing apparatus displays the feedback data and front-end exception errors of the timing detail tree in the multidimensional data detail tree area and hides the front-end behavior data of the timing detail tree. In fig. 16, a triggering operation of the second state is, for example, the project-associated party clicking the third control 162 and selecting "multidimensional auxiliary analysis", which displays the second state diagram of the multidimensional data detail tree area shown in fig. 17. In fig. 17, the front-end behavior data in the timing detail tree are hidden (for example, page jumps, mouse clicks, keyboard inputs, and execution times of front-end behaviors are all hidden), and only the feedback data (i.e., user feedback in fig. 17) and front-end exception errors (i.e., front-end errors in fig. 17) are retained. In fig. 17, a "convert user feedback into a problem record" control and a delete control are added for user feedback, a delete control (and a "modify" control) is added for front-end errors, and a "modify" control and a delete control are added for problem marks.
In some embodiments, a fourth control is included in the multidimensional data detail tree area. In response to a click operation on the fourth control, the usability testing apparatus marks the time axis based on the timing detail tree. For example, the fourth control 163 in fig. 16 is "mark and record problem". If no node of the timing detail tree 161 in fig. 16 is selected, the usability testing apparatus responds to the click operation on the "mark and record problem" control by marking the time axis in fig. 15 and displaying the positions of the marks of different data above the time axis; the marks are dots, the flag in fig. 15 indicates that one test task has finished testing, and icon A indicates a problem mark.
In some embodiments, in response to a click operation on a node corresponding to the feedback data in the timing detail tree, the usability testing apparatus modifies the state of the node to a selected state; further, based on the selected state and in response to a click operation on the fourth control, the usability testing apparatus calls out a pop-up layer page. The pop-up layer page is used for converting the feedback data into a problem mark. In fig. 17, the node corresponding to the feedback data (i.e., the user feedback in fig. 17) is selected, for example by the project-associated party clicking the icon 171-1 of the user feedback or clicking the details 171-2 of the user feedback. Correspondingly, in response to a click operation on the "mark and record problem" control, the usability testing apparatus calls out a pop-up layer page for converting the feedback data into a problem mark.
In some embodiments, the node corresponding to the feedback data in the timing detail tree includes a problem marking control. In response to a click operation on the problem marking control, the usability testing apparatus calls out a pop-up layer page for converting the feedback data into a problem mark. In fig. 17, when the mouse hovers over the problem marking control, a prompt such as "convert user feedback to problem record" is displayed, so that the project-associated party can learn the function of the problem marking control.
In some embodiments, the pop-up layer page is shown, for example, at 181 in fig. 18, and fig. 18 is a diagram of the page change when feedback data is converted into a problem mark according to an embodiment of the present disclosure. In fig. 18, the project-associated party clicks the problem marking control, that is, the control pointed to by "convert user feedback to problem record", and the usability testing apparatus calls out the pop-up layer page 181 in response to the click operation on the problem marking control. The pop-up layer page 181 includes a selection box for the problem type, an input box for the problem description, and a determination control. The selection box for the problem type may be a drop-down list box or a radio-button box. After selecting the problem type and entering the problem description, the project-associated party clicks the determination control. In response to the click operation on the determination control, the usability testing apparatus acquires the problem type and the problem description, and then displays the problem type and the problem description in association with the user feedback. As to the associated display, for example, in fig. 18 the icon of the user feedback is replaced by the problem-mark icon 182-1, and the problem description 182-2 is displayed above the details of the user feedback; that is, the problem description 182-2 and the details of the user feedback are associated in a "waterfall" manner.
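The conversion from a piece of user feedback into a problem mark described above can be pictured as a small data transformation. The sketch below is illustrative only — the field names (`problem_type`, `description`, `feedback_details`) and the helper name are assumptions, not the patent's actual data model:

```python
def feedback_to_problem(feedback: dict, problem_type: str, description: str) -> dict:
    """Turn a user-feedback node into a problem mark, keeping the original
    feedback details for the associated ("waterfall") display."""
    if not description:
        raise ValueError("a problem description is required")
    return {
        "kind": "problem_mark",        # the icon switches to the problem-mark icon
        "problem_type": problem_type,  # chosen from the selection box
        "description": description,    # entered in the input box
        "feedback_details": feedback.get("details", ""),
        "time": feedback.get("time"),  # keeps its position on the time axis
    }
```

The original feedback details are retained so the problem description can be shown above them, as in fig. 18.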
In some embodiments, the analysis interface includes a problem recording and scoring area. The usability testing apparatus displays the content of the problem marks and the usability evaluation data in the problem recording and scoring area. For example, fig. 19 is a schematic diagram of the problem recording and scoring area in the analysis interface according to an embodiment of the present disclosure. In fig. 19, the problem recording and scoring area includes a problem record control and a usability scoring control. In response to a click operation on the problem record control, the usability testing apparatus displays the content of the problem marks in the problem recording and scoring area. In response to a click operation on the usability scoring control, the usability testing apparatus displays the usability evaluation data in the problem recording and scoring area.
In some embodiments, after completing the analysis of the test data, or in response to a triggering operation on the usability test conclusion, the usability testing apparatus displays the usability measurement conclusion interface shown in fig. 20, so as to realize unified management and tracking of the problem records and make it convenient for the project-associated party to quickly track and fix problems. The usability measurement conclusion interface may present a variety of information including, for example, an overall usability rating, a usability metric score, an operability score, a learnability score, the number of test users, the number of usability problems, a usability test profile, and the like. Such information may be obtained in connection with the data (e.g., scores, opinions and suggestions) entered by different test users on the usability evaluation interface.
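The patent does not give a formula for the figures on the conclusion interface. As one hedged illustration, the per-metric scores could simply be averaged across the evaluation data entered by the test users; the function and field names below are hypothetical:

```python
def summarize(evaluations: list[dict]) -> dict:
    """Aggregate per-user evaluation data into the figures shown on a
    usability measurement conclusion interface (illustrative only)."""
    if not evaluations:
        raise ValueError("no evaluation data")

    def avg(key: str) -> float:
        return sum(e[key] for e in evaluations) / len(evaluations)

    return {
        "test_users": len(evaluations),          # number of test users
        "usability_score": avg("usability"),     # usability metric score
        "operability_score": avg("operability"),
        "learnability_score": avg("learnability"),
    }
```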
Fig. 21 is a block diagram of a usability testing apparatus according to an embodiment of the present disclosure. As shown in fig. 21, the usability testing apparatus includes: a first acquisition unit 211, a second acquisition unit 212, an embedding unit 213, a generation unit 214, and a response unit 215.
The first obtaining unit 211 is configured to obtain a first link address of the test object and a test task.
The second obtaining unit 212 is configured to obtain a data acquisition script for the usability test.
An embedding unit 213, configured to embed the data acquisition script in the page of the first link address.
The generating unit 214 is configured to generate a second link address based on the first link address and the test task.
The response unit 215 is configured to respond to an access request for the second link address by jumping to the page of the first link address, so that the data acquisition script guides the usability test based on the test task, and test data is collected based on the data acquisition script.
In some embodiments, the embedding unit 213 embedding the data acquisition script in the page of the first link address includes: adding the data acquisition script to the body tag of the page of the first link address.
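A minimal sketch of this embedding step, assuming the page of the first link address is available as an HTML string and the data acquisition script is referenced by URL (the function name and the fallback behavior when no body tag exists are assumptions):

```python
def embed_script(page_html: str, script_url: str) -> str:
    """Insert a <script> tag into the body of the page so that the data
    acquisition script loads whenever the page is served."""
    tag = f'<script src="{script_url}"></script>'
    closing = "</body>"
    if closing in page_html:
        # Place the script just before the closing body tag, after the
        # page's own content.
        return page_html.replace(closing, tag + closing, 1)
    # No body tag found: append the script at the end as a fallback.
    return page_html + tag
```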
In some embodiments, the first obtaining unit 211 is further configured to obtain the name of the test object. The test task includes: the name of the test task and the operation description of the test task. The generating unit 214 generating the second link address based on the first link address and the test task includes: packaging the name of the test object, the first link address, the name of the test task and the operation description of the test task into the second link address.
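One plausible way to "package" these four fields into a second link address is as URL query parameters; the base address and parameter names below are assumptions for illustration only:

```python
from urllib.parse import urlencode, parse_qs, urlparse

def build_second_link(test_base: str, object_name: str, first_link: str,
                      task_name: str, task_desc: str) -> str:
    """Package the test-object name, first link address, test-task name and
    operation description into a single shareable second link address."""
    query = urlencode({
        "object": object_name,  # name of the test object
        "target": first_link,   # first link address to jump to
        "task": task_name,      # name of the test task
        "desc": task_desc,      # operation description of the task
    })
    return f"{test_base}?{query}"

def unpack_second_link(second_link: str) -> dict:
    """Recover the packaged fields when the second link address is accessed."""
    fields = parse_qs(urlparse(second_link).query)
    return {key: values[0] for key, values in fields.items()}
```

On access, the recovered `target` field would drive the jump to the page of the first link address.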
In some embodiments, the data acquisition script guiding the usability test based on the test task includes: the data acquisition script creating a floating layer page in the page of the first link address, where the floating layer page is used for guiding the execution of the test task.
In some embodiments, a chat interface is displayed in the floating layer page; the chat interface includes a dialogue display area, a dialogue input box, and a send control. The data acquisition script sends the test task to the dialogue display area through a virtual assistant and, in response to a click operation on the send control, sends the content in the dialogue input box to the dialogue display area.
In some embodiments, the floating layer page further includes a test toolbar, and the test toolbar includes a first control. The states of the first control include a task switching state and a task completion state. When the state of the first control is the task switching state, the data acquisition script responds to a click operation on the first control by sending another test task to the dialogue display area through the virtual assistant. When the state of the first control is the task completion state, the data acquisition script responds to a click operation on the first control by displaying a usability evaluation interface.
In some embodiments, task state information is displayed in the test toolbar, the task state information including: the total number of test tasks and the serial number of the task currently being tested.
In some embodiments, a voice control is also included in the test toolbar. In response to a click operation on the voice control, the data acquisition script collects voice data, recognizes the voice data, converts the recognition result into text, and displays the text in the dialogue input box. In response to a click operation on the send control, the data acquisition script sends the text in the dialogue input box to the dialogue display area.
In some embodiments, a second control is also included in the floating layer page. The states of the second control include a collapsed state and an expanded state. When the state of the second control is the collapsed state, the data acquisition script responds to a click operation on the second control by collapsing the chat interface and modifying the state of the second control to the expanded state. When the state of the second control is the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface and modifying the state of the second control to the collapsed state.
In some embodiments, after the data acquisition script collapses the chat interface, the data acquisition script responds to a click operation on the voice control by collecting voice data, recognizing it, and converting the recognition result into text. The data acquisition script displays the dialogue input box and the send control, and displays the recognized text in the dialogue input box. In response to a click operation on the send control, the data acquisition script displays a preview area and displays the text in the dialogue input box in the preview area.
In some embodiments, after the data acquisition script displays the dialogue input box and/or displays the preview area, the data acquisition script modifies the state of the second control to the collapsed state. When the state of the second control is the collapsed state, the data acquisition script responds to a click operation on the second control by collapsing the dialogue input box, the send control and/or the preview area, and modifying the state of the second control to the expanded state.
In some embodiments, when the state of the second control is the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface and displaying the history dialogue and the text shown in the preview area in the dialogue display area.
In some embodiments, collecting test data based on the data acquisition script includes: after the floating layer page is created in the page of the first link address, recording the screen based on the data acquisition script and generating a screen recording file.
In some embodiments, the test data includes the screen recording file, front-end behavior data, feedback data obtained based on the floating layer page, front-end exception errors, and/or usability evaluation data.
In some embodiments, the usability testing apparatus further comprises an analysis unit, not shown in fig. 21, for analyzing the test data.
In some embodiments, the analysis unit analyzing the test data includes: displaying an analysis interface based on the test data, where the analysis interface includes a video display area; generating a front-end behavior track based on the front-end behavior data; matching the front-end behavior track with the video pictures of the screen recording file; and playing the video pictures of the screen recording file and the matched front-end behavior track in the video display area.
In some embodiments, the analysis interface includes a video time axis area. The video time axis area displays a time axis with a preset length, and the time axis is provided with a play positioning mark for marking the current play progress of the video.
The analysis unit determines the corresponding relation between the length of the time axis and the video duration based on the video duration of the screen recording file, and marks the time axis based on the corresponding relation, the front-end behavior data, the feedback data and/or the front-end exception errors.
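The corresponding relation between the fixed-length time axis and the video duration is a linear mapping from timestamps to positions on the axis. A minimal sketch, in which the pixel unit and the clamping of out-of-range timestamps are assumptions:

```python
def mark_position(event_time_s: float, video_duration_s: float,
                  axis_length_px: int) -> int:
    """Map an event timestamp to an offset on the preset-length time axis,
    using the duration-to-length correspondence."""
    if video_duration_s <= 0:
        raise ValueError("video duration must be positive")
    ratio = event_time_s / video_duration_s
    ratio = min(max(ratio, 0.0), 1.0)  # clamp events outside the recording
    return round(ratio * axis_length_px)
```

The same mapping can place the marks for front-end behavior data, feedback data, and front-end exception errors above the axis, and position the play positioning mark during playback.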
In some embodiments, the front-end behavior data, the feedback data, and the front-end exception errors correspond to different icons, respectively. After marking the time axis, the analysis unit displays the marks and the corresponding icons above the time axis.
In some embodiments, in response to a click operation on a mark, the analysis unit moves the play positioning mark along the time axis to the moment corresponding to the mark, and plays, in the video display area, the video picture at that moment and the front-end behavior track matched with the video picture.
In some embodiments, the analysis interface includes a multidimensional data detail tree area. The analysis unit builds a timing detail tree based on the front-end behavior data, the feedback data and/or the front-end exception errors, and displays the timing detail tree in the multidimensional data detail tree area.
In some embodiments, the timing detail tree includes a plurality of nodes, each node including details of the data and an icon, and the hierarchical relationship between the nodes is determined based on the timing of the data.
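A hedged sketch of how such a timing detail tree might be assembled: events from the three data sources are merged and ordered by timestamp, and each node carries its details plus an icon for its data type. The icon mapping and the flat node layout are assumptions; the patent leaves the exact hierarchy construction open:

```python
ICONS = {"behavior": "cursor", "feedback": "chat", "error": "warning"}

def build_detail_tree(behavior, feedback, errors):
    """Merge front-end behavior data, feedback data and front-end exception
    errors (each a list of (timestamp, details) pairs) into
    timestamp-ordered nodes."""
    events = ([("behavior", t, d) for t, d in behavior] +
              [("feedback", t, d) for t, d in feedback] +
              [("error", t, d) for t, d in errors])
    events.sort(key=lambda e: e[1])  # the timing of the data drives the order
    return [{"type": kind, "time": t, "icon": ICONS[kind], "details": d}
            for kind, t, d in events]
```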
In some embodiments, a third control is included in the multi-dimensional data detail tree region. The states of the third control include a first state describing a user behavior trace and a second state describing a multidimensional auxiliary analysis.
In response to a triggering operation of the first state, the analysis unit displays the timing detail tree in the multidimensional data detail tree area.
In response to a triggering operation of the second state, the analysis unit displays the feedback data and the front-end exception errors in the timing detail tree in the multidimensional data detail tree area, and hides the front-end behavior data in the timing detail tree.
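The effect of the third control's two states on the displayed tree amounts to a filter over node types. A minimal sketch, in which the state names and type labels are illustrative:

```python
def visible_nodes(tree, state):
    """In the first state (user behavior trace) every node is shown; in the
    second state (multidimensional auxiliary analysis) front-end behavior
    data is hidden, leaving only feedback and front-end errors."""
    if state == "user_behavior_trace":
        return list(tree)
    return [n for n in tree if n["type"] in ("feedback", "error")]
```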
In some embodiments, a fourth control is included in the multidimensional data detail tree area. In response to a click operation on the fourth control, the analysis unit marks the time axis based on the timing detail tree.
In some embodiments, in response to a click operation on the node corresponding to the feedback data in the timing detail tree, the analysis unit modifies the state of the node to a selected state. Based on the selected state and in response to a click operation on the fourth control, the analysis unit calls out a pop-up layer page, which is used for converting the feedback data into a problem mark.
In some embodiments, the node corresponding to the feedback data in the timing detail tree includes a problem marking control. In response to a click operation on the problem marking control, the analysis unit calls out a pop-up layer page, which is used for converting the feedback data into a problem mark.
In some embodiments, the pop-up layer page includes a selection box for the problem type, an input box for the problem description, and a determination control. In response to a click operation on the determination control, the analysis unit acquires the problem type and the problem description, and displays the problem type and the problem description in association with the feedback data.
In some embodiments, the analysis interface includes a problem recording and scoring area, in which the content of the problem marks and the usability evaluation data are displayed.
In some embodiments, the problem recording and scoring area includes a problem record control and a scoring control. In response to a click operation on the problem record control, the analysis interface displays the content of the problem marks in the problem recording and scoring area. In response to a click operation on the scoring control, the analysis interface displays the usability evaluation data in the problem recording and scoring area.
In some embodiments, the division of the units in the usability testing apparatus is merely a division by logical function, and other division manners are possible in actual implementation; for example, at least two of the first obtaining unit 211, the second obtaining unit 212, the embedding unit 213, the generating unit 214 and the response unit 215 may be implemented as one unit, or any of them may be divided into a plurality of sub-units. It is understood that each unit or sub-unit can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints of the solution. Those skilled in the art can implement the described functionality in different ways for each particular application.
Fig. 22 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 22, the electronic device includes: at least one processor 221, at least one memory 222, and at least one communication interface 223. The various components in the electronic device are coupled together by a bus system 224. The communication interface 223 is used for information transfer with an external device. It is to be appreciated that the bus system 224 is used to implement connection and communication between these components. In addition to a data bus, the bus system 224 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 224 in fig. 22.
It will be appreciated that the memory 222 in this embodiment may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory.
In some implementations, the memory 222 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system and application programs.
The operating system includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic tasks and processing hardware-based tasks. The application programs, including various applications such as a media player, a browser, and the like, are used to implement various application tasks. A program implementing the usability testing method provided by the embodiments of the present disclosure may be included in the application programs.
In the embodiment of the present disclosure, the processor 221 is configured to execute the steps of each embodiment of the usability testing method provided in the embodiment of the present disclosure by calling a program or an instruction stored in the memory 222, specifically, a program or an instruction stored in an application program.
The usability testing method provided by the embodiments of the present disclosure may be applied to, or implemented by, the processor 221. The processor 221 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware in the processor 221 or by instructions in the form of software. The processor 221 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the usability testing method provided by the embodiment of the present disclosure may be directly embodied in the execution of the hardware decoding processor, or may be executed by a combination of hardware and software units in the decoding processor. The software elements may be located in a random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in the memory 222 and the processor 221 reads the information in the memory 222 and performs the steps of the method in combination with its hardware.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art will appreciate that the disclosed embodiments are not limited by the described order of actions, as some steps may occur in other orders or concurrently in accordance with the disclosed embodiments. In addition, those skilled in the art will appreciate that the embodiments described in the specification are all optional embodiments.
The embodiments of the present disclosure further provide a non-transitory computer-readable storage medium storing a program or instructions that cause a computer to perform the steps of the embodiments of the usability testing method; to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a…" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the disclosure and form different embodiments.
Those skilled in the art will appreciate that the descriptions of the various embodiments are each focused on, and that portions of one embodiment that are not described in detail may be referred to as related descriptions of other embodiments.
Although embodiments of the present disclosure have been described with reference to the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the disclosure, and such modifications and variations fall within the scope defined by the appended claims.

Claims (31)

1. A usability testing method, comprising:
acquiring a first link address and a test task of a test object;
acquiring a data acquisition script of the usability test;
embedding the data acquisition script into a page of the first link address;
Generating a second link address based on the first link address and the test task;
responding to an access request for the second link address by jumping to the page of the first link address, so that the data acquisition script guides the usability test based on the test task, and acquiring test data based on the data acquisition script.
2. The method of claim 1, wherein the embedding the data acquisition script in the page of the first link address comprises:
adding the data acquisition script to a body tag of the page of the first link address.
3. The method of claim 1, wherein the method further comprises: acquiring the name of the test object;
the test tasks include: the name of the test task and the operation description of the test task;
The generating a second link address based on the first link address and the test task includes: packaging the name of the test object, the first link address, the name of the test task and the operation description of the test task into the second link address.
4. The method of claim 1, wherein the data acquisition script guiding the usability test based on the test task comprises:
the data acquisition script creates a floating layer page in the page of the first link address; the floating layer page is used for guiding the execution of the test task.
5. The method of claim 4, wherein a chat interface is displayed in the floating layer page, wherein the chat interface comprises a dialogue display area, a dialogue input box and a sending control;
the data acquisition script sends the test task to the dialogue display area through a virtual assistant;
and the data acquisition script responds to the click operation of the sending control and sends the content in the dialogue input box to the dialogue display area.
6. The method of claim 5, wherein the floating layer page further comprises a test toolbar comprising a first control therein;
the states of the first control comprise a task switching state and a task completion state;
when the state of the first control is the task switching state, the data acquisition script responds to a click operation on the first control by sending another test task to the dialogue display area through the virtual assistant;
and when the state of the first control is the task completion state, the data acquisition script responds to a click operation on the first control by displaying a usability evaluation interface.
7. The method of claim 6, wherein task state information is displayed in the test toolbar, the task state information comprising: the total number of test tasks and the serial number of the task currently being tested.
8. The method of claim 6, wherein the test tool bar further comprises a voice control therein;
in response to a click operation on the voice control, the data acquisition script collects voice data, recognizes the voice data, converts the recognition result into text, and displays the text in the dialogue input box;
and in response to a click operation on the sending control, the data acquisition script sends the text in the dialogue input box to the dialogue display area.
9. The method of claim 8, wherein the floating layer page further comprises a second control therein; the states of the second control comprise a collapsed state and an expanded state;
when the state of the second control is the collapsed state, the data acquisition script responds to a click operation on the second control by collapsing the chat interface and modifying the state of the second control to the expanded state;
and when the state of the second control is the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface and modifying the state of the second control to the collapsed state.
10. The method of claim 9, wherein after said collapsing the chat interface, the method further comprises:
in response to a click operation on the voice control, the data acquisition script collects voice data, recognizes the voice data, and converts the recognition result into text;
displaying the dialogue input box and the sending control, and displaying the text in the dialogue input box;
and in response to a click operation on the sending control, the data acquisition script displays a preview area and displays the text in the dialogue input box in the preview area.
11. The method of claim 10, wherein after displaying the dialogue input box and/or displaying the preview area, the method further comprises:
the data acquisition script modifying the state of the second control to the collapsed state;
and when the state of the second control is the collapsed state, the data acquisition script responding to a click operation on the second control by collapsing the dialogue input box, the sending control and/or the preview area, and modifying the state of the second control to the expanded state.
12. The method of claim 11, wherein the method further comprises:
when the state of the second control is the expanded state, the data acquisition script responds to a click operation on the second control by expanding the chat interface and displaying the history dialogue and the text shown in the preview area in the dialogue display area.
13. The method of claim 4, wherein the collecting test data based on the data acquisition script comprises:
and after a floating layer page is created in the page of the first link address, recording a screen based on the data acquisition script, and generating a screen recording file.
14. The method of claim 13, wherein the test data comprises the screen recording file, front-end behavior data, feedback data obtained based on the floating layer page, front-end exception errors, and/or usability evaluation data.
15. The method of claim 14, wherein the method further comprises: and analyzing the test data.
16. The method of claim 15, wherein the analyzing the test data comprises:
displaying an analysis interface based on the test data; wherein the analysis interface comprises a video display area;
Generating a front-end behavior track based on the front-end behavior data;
matching the front-end behavior track with the video picture of the screen recording file;
and playing the video picture of the screen recording file and the matched front-end behavior track in the video display area.
17. The method of claim 16, wherein the analysis interface comprises a video timeline area; displaying a time axis with preset length in the video time axis area, wherein the time axis is provided with a play positioning mark which is used for marking the current play progress of the video;
Based on the video duration of the screen recording file, determining the corresponding relation between the length of the time axis and the video duration;
and marking the time axis based on the corresponding relation, the front-end behavior data, the feedback data and/or the front-end exception errors.
18. The method of claim 17, wherein the front-end behavior data, the feedback data and the front-end exception errors correspond to different icons, respectively;
and after marking the time axis, displaying marks and the corresponding icons above the time axis.
19. The method of claim 18, wherein the method further comprises:
in response to a click operation on a mark, moving the play positioning mark along the time axis to the moment corresponding to the mark;
and playing, in the video display area, the video picture at the moment and the front-end behavior track matched with the video picture.
20. The method of claim 17, wherein the analysis interface comprises a multi-dimensional data detail tree area;
constructing a timing detail tree based on the front-end behavior data, the feedback data and/or the front-end exception errors;
the timing detail tree is displayed in the multi-dimensional data detail tree area.
21. The method of claim 20, wherein the timing detail tree comprises a plurality of nodes, each node comprising details of the data and an icon, the hierarchical relationship between the nodes being determined based on the timing of the data.
22. The method of claim 20, wherein a third control is included in the multi-dimensional data detail tree region; the states of the third control comprise a first state describing a user behavior track and a second state describing multidimensional auxiliary analysis;
in response to a triggering operation of the first state, displaying the timing detail tree in the multi-dimensional data detail tree area;
and in response to a triggering operation of the second state, displaying the feedback data and the front-end exception errors in the timing detail tree in the multi-dimensional data detail tree area, and hiding the front-end behavior data in the timing detail tree.
23. The method of claim 20, wherein the multi-dimensional data detail tree area includes a fourth control;
and in response to a click operation on the fourth control, marking the timeline based on the time-sequenced detail tree.
24. The method of claim 23, wherein the method further comprises:
in response to a click operation on a node of the time-sequenced detail tree corresponding to feedback data, changing the state of that node to a selected state;
and, based on the selected state and in response to a click operation on the fourth control, invoking a pop-up layer page for converting the feedback data into an issue mark.
25. The method of claim 20, wherein the nodes of the time-sequenced detail tree corresponding to feedback data comprise an issue-marking control;
and in response to a click operation on the issue-marking control, invoking a pop-up layer page for converting the feedback data into an issue mark.
26. The method of claim 24 or 25, wherein the pop-up layer page includes a selection box for an issue type, an input box for an issue description, and a confirm control;
in response to a click operation on the confirm control, obtaining the issue type and the issue description;
and displaying the issue type and the issue description in association with the feedback data.
27. The method of claim 24 or 25, wherein the analysis interface includes an issue record and scoring area;
and the content of the issue marks and the usability evaluation data are displayed in the issue record and scoring area.
28. The method of claim 27, wherein the issue record and scoring area includes an issue record control and a scoring control;
in response to a click operation on the issue record control, displaying the content of the issue marks in the issue record and scoring area;
and in response to a click operation on the scoring control, displaying the usability evaluation data in the issue record and scoring area.
29. A usability testing apparatus, comprising:
a first acquisition unit configured to acquire a first link address of a test object and a test task;
a second acquisition unit configured to acquire a data collection script for the usability test;
an embedding unit configured to embed the data collection script into the page at the first link address;
a generating unit configured to generate a second link address based on the first link address and the test task;
and a response unit configured to, in response to an access request for the second link address, jump to the page at the first link address, so that the data collection script guides the usability test based on the test task and collects test data.
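The generating unit of claim 29 derives a second, shareable link from the first link address and the test task. A minimal sketch of one way to do this with a query parameter (the parameter name `ut_task` and the function name are assumptions, not from the patent):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def make_test_link(first_link, task_id):
    """Derive a second link address from the test object's link and a task id.

    The task id travels as a query parameter; when the test platform serves
    a request for this second link, it jumps to the page at first_link with
    the data-collection script embedded, which reads the task id back out.
    """
    parts = urlsplit(first_link)
    query = dict(parse_qsl(parts.query))   # keep the page's own parameters
    query["ut_task"] = task_id
    return urlunsplit(parts._replace(query=urlencode(query)))

link = make_test_link("https://example.com/app?x=1", "t42")
```

Encoding the task reference in the URL keeps the test object's page untouched: only the embedded script behaves differently when the parameter is present.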
30. An electronic device, comprising: a processor and a memory;
wherein the processor is configured to perform the steps of the method of any one of claims 1 to 28 by invoking a program or instructions stored in the memory.
31. A non-transitory computer-readable storage medium storing a program or instructions that cause a computer to perform the steps of the method of any one of claims 1 to 28.
CN202110099106.1A 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium Active CN114791875B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110099106.1A CN114791875B (en) 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110099106.1A CN114791875B (en) 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114791875A CN114791875A (en) 2022-07-26
CN114791875B true CN114791875B (en) 2024-07-02

Family

ID=82459458

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110099106.1A Active CN114791875B (en) 2021-01-25 2021-01-25 Usability testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114791875B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105868096A (en) * 2015-01-22 2016-08-17 阿里巴巴集团控股有限公司 Methods and apparatuses used for displaying web page test result in browser and device
CN107133180A (en) * 2017-06-07 2017-09-05 腾讯科技(深圳)有限公司 Method of testing, test device and the storage medium of dynamic page

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107908552A (en) * 2017-10-30 2018-04-13 阿里巴巴集团控股有限公司 A kind of test method based on link, device and equipment
US10657032B2 (en) * 2018-03-30 2020-05-19 Atlassian Pty Ltd Systems and methods for monitoring performance of applications
CN108647141B (en) * 2018-04-26 2022-09-09 腾讯科技(深圳)有限公司 Automatic test method, device, computer readable medium and electronic equipment
CN110825618B (en) * 2019-10-10 2024-01-26 天航长鹰(江苏)科技有限公司 Method and related device for generating test case

Also Published As

Publication number Publication date
CN114791875A (en) 2022-07-26

Similar Documents

Publication Publication Date Title
KR101120756B1 (en) Automatic text generation
US6308146B1 (en) System and method for simulating user input to control the operation of an application
CN103098051B (en) Search engine optmization assistant
KR101087312B1 (en) Importation of automatically generated content
CN108170611A (en) Automated testing method and device, storage medium, electronic equipment
CN109739855B (en) Method and system for realizing data sheet splicing and automatically training machine learning model
US20120081371A1 (en) Dialog design tool and method
JPS62212837A (en) Interactive software tester
JPS62212867A (en) Multi-mode simulation system
US20100114791A1 (en) Automated Interview Systems and Methods
CN110928763A (en) Test method, test device, storage medium and computer equipment
KR20090058409A (en) Method and system for providing and using editable personal dictionary
Silva et al. A comparative study of milestones for featuring GUI prototyping tools
CN115658529A (en) Automatic testing method for user page and related equipment
CN116450202A (en) Page configuration method, page configuration device, computer equipment and computer readable storage medium
US7574625B2 (en) Active content wizard testing
CN114791875B (en) Usability testing method and device, electronic equipment and storage medium
CN112988580A (en) Test process reproduction method, device, equipment and storage medium
CN111143205B (en) Android platform-oriented test case automatic generation method and generation system
JP2021174096A (en) Test support device, test support program and test support method
TW201015309A (en) Method for testing controls of application software automatically
US8346560B2 (en) Dialog design apparatus and method
JP2019106107A (en) Program, information processing apparatus, and screen test system
CN114764560A (en) Flow form generation method, equipment, storage medium and device
Dhanoa et al. D-Tour: Semi-Automatic Generation of Interactive Guided Tours for Visualization Dashboard Onboarding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant