WO2015039566A1 - Method and system for facilitating automated web page testing - Google Patents


Info

Publication number
WO2015039566A1
Authority
WO
WIPO (PCT)
Prior art keywords
test
web page
detecting
user interface
plug
Prior art date
Application number
PCT/CN2014/085934
Other languages
French (fr)
Inventor
Yue Lin
Li Xu
Xiang Li
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Publication of WO2015039566A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3684 Test management for test design, e.g. generating new test cases

Definitions

  • the present disclosure relates to the field of Web testing technologies and, in particular, to a method and system for facilitating automated web page testing and debugging.
  • Conventional web testing proceeds as follows: for a web page to be tested, the web page code is tagged, test logic is defined, and an automated test is performed on the tagged web page code according to the test logic.
  • Because the test logic is compiled together with the code, the test logic is readable only by developers and is difficult for others to understand.
  • a method of facilitating automated web page testing and debugging is performed at a device (e.g., client device 102, Figures 1 and 3) with one or more processors and memory.
  • the method includes detecting a user input activating a plug-in associated with web page testing and debugging while displaying the web page in a web browser executed on the device.
  • the method includes: identifying a plurality of web page components of the web page; and extracting respective location information and respective configuration information for the plurality of web page components of the web page.
  • the method includes displaying a first user interface corresponding to the plug-in, the first user interface including respective graphical representations for the plurality of web page components of the web page.
  • the method includes detecting one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of components of the web page into a respective test flow and, after detecting a user input to submit the respective test flow, saving the respective test flow to a test flow database including zero or more previously submitted test flows, where the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time.
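The test-flow arrangement described above can be modeled with a short sketch. The class and field names below are hypothetical, chosen only to illustrate the idea of arranging component representations (each carrying extracted location and configuration information) into an ordered flow and saving it to a flow database for later test-case construction; they are not the patent's actual data structures.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ComponentStep:
    """One web page component selected into a test flow (names are illustrative)."""
    component_id: str                # identifier for the component, e.g. a selector
    location: dict                   # extracted location information
    config: dict                     # extracted configuration information
    test_input: Optional[str] = None # optional test data, e.g. text to type

@dataclass
class TestFlow:
    """An ordered arrangement of two or more component steps."""
    name: str
    steps: list = field(default_factory=list)

class TestFlowDatabase:
    """Stores zero or more submitted test flows for later use in test cases."""
    def __init__(self):
        self._flows = {}

    def save(self, flow):
        self._flows[flow.name] = flow

    def get(self, name):
        return self._flows[name]

# Arrange a two-step flow (bookmark a story, then enter a comment) and submit it.
db = TestFlowDatabase()
flow = TestFlow(name="bookmark_and_comment", steps=[
    ComponentStep("bookmark_420", {"xpath": "//button[1]"},
                  {"type": "functional_button"}),
    ComponentStep("comment_422", {"xpath": "//input[1]"},
                  {"type": "editable_field"}, test_input="Great story!"),
])
db.save(flow)
```

A later test-case builder would then retrieve flows from the database by name, which is the role the patent assigns to the test flow database.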
  • a computing device (e.g., client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3)) includes one or more processors and memory storing one or more programs for execution by the one or more processors, the one or more programs including instructions for performing, or controlling performance of, the operations of any of the methods described herein.
  • a non-transitory computer readable storage medium stores one or more programs, the one or more programs comprising instructions which, when executed by a computing device (e.g., client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3)), cause the computing device to perform, or control performance of, the operations of any of the methods described herein.
  • a computing device (e.g., client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3)) includes means for performing, or controlling performance of, the operations of any of the methods described herein.
  • Figure 1 is a block diagram of a server-client environment in accordance with some embodiments.
  • Figure 2 is a block diagram of a server system in accordance with some embodiments.
  • Figure 3 is a block diagram of a client device in accordance with some embodiments.
  • Figures 4A-4G illustrate exemplary user interfaces for facilitating automated web page testing and debugging in accordance with some embodiments.
  • Figure 5 is a flowchart diagram of an automated web page testing method in accordance with some embodiments.
  • Figure 6 is a flow diagram of an automated web page testing process in accordance with some embodiments.
  • Figures 7A-7C illustrate a flowchart diagram of a method of facilitating automated web page testing and debugging in accordance with some embodiments.
  • server-client environment 100 includes client devices 102-1 and 102-2 and server system 108.
  • a web browser 104 is executed on a client device 102, and web browser 104 includes a plug-in 106 that communicates with server system 108 through one or more networks 110.
  • Plug-in 106 provides client-side functionalities for the automated web page testing and debugging application and communications with server system 108.
  • Server system 108 provides server-side functionalities for the automated web page testing and debugging application for any number of plug-ins 106 each being executed from a web browser 104 on a respective client device 102.
  • server system 108 includes one or more processors 112, test flow/case library 114, test results database 116, an I/O interface to one or more clients 118, and an optional I/O interface to one or more external services 120.
  • I/O interface to one or more clients 118 facilitates the client-facing input and output processing for server system 108.
  • processor(s) 112 obtain one or more test cases submitted by a client device 102 and, in response, execute the one or more test cases or cause one or more test machines 122 to execute them.
  • Test flow/case library 114 stores test flows and test cases saved and/or submitted by client devices 102.
  • test results database 116 stores results for completed test cases and also expected results for test flows and test cases.
  • I/O interface to one or more external services 120 optionally facilitates communications with one or more test machines 122.
  • server system 108 queries one or more test machines 122 to determine their current workloads and causes the one or more test cases to be executed by select test machine(s) of one or more test machines 122 according to their corresponding workloads.
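The workload-based dispatch above reduces to ranking machines by their reported load. The sketch below is an assumed policy, not the patent's implementation; the workload representation (machine id mapped to a load figure such as queued test cases) is hypothetical.

```python
def select_test_machines(workloads, num_needed=1):
    """Pick the test machine(s) with the lowest reported workloads.

    `workloads` maps a machine id to its current workload (e.g. number of
    queued test cases). This least-loaded policy is a stand-in for how
    server system 108 might choose among test machines 122.
    """
    ranked = sorted(workloads, key=workloads.get)
    return ranked[:num_needed]

# Three machines report their current workloads; the least-loaded one is chosen.
machines = {"machine-a": 5, "machine-b": 1, "machine-c": 3}
print(select_test_machines(machines))  # ['machine-b']
```

Any other policy (round-robin, capability-aware matching) would slot into the same place; the patent only requires that selection follow the queried workloads.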
  • Examples of client device 102 include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA), a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices or other data processing devices.
  • Examples of one or more networks 110 include local area networks (LAN) and wide area networks (WAN) such as the Internet.
  • One or more networks 110 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB), FIREWIRE, Long Term Evolution (LTE), Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocol.
  • Server system 108 is implemented on one or more standalone data processing apparatuses or a distributed network of computers.
  • server system 108 also employs various virtual devices and/or services of third party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 108.
  • the automated web page testing and debugging application includes both a client-side portion (e.g., plug-in 106) and a server-side portion (e.g., server system 108).
  • data processing is implemented as a standalone application installed on client device 102, and the databases are created and stored locally at the client device.
  • the division of functionalities between the client and server portions of client environment data processing can vary in different embodiments.
  • plug-in 106 is a thin client that provides only user-facing input and output processing functions, and delegates all other data processing functionalities to a backend server (e.g., server system 108).
  • In one example, in some embodiments, the automated web page testing and debugging is performed entirely by plug-in 106, and client device 102 stores saved test flows, test cases, and test results. In another example, in some embodiments, the automated web page testing and debugging is performed entirely by server system 108, and server system 108 stores saved test flows, test cases, and test results. In yet another example, the automated web page testing and debugging is performed by plug-in 106, and server system 108 stores saved test flows, test cases, and test results.
  • Figure 2 is a block diagram illustrating server system 108 in accordance with some embodiments.
  • Server system 108 typically includes one or more processing units (CPUs) 112, one or more network interfaces 204 (e.g., including I/O interface to one or more clients 118 and I/O interface to one or more external services 120), memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset).
  • Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 112. Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable storage medium. In some implementations, memory 206, or the non-transitory computer readable storage medium of memory 206, stores the following programs, modules, and data structures, or a subset or superset thereof:
  • ◦ operating system 210 including procedures for handling various basic system services and for performing hardware-dependent tasks;
  • ◦ network communication module 212 for connecting server system 108 to other computing devices (e.g., client devices 102 and test machine(s) 122) connected to one or more networks 110 via one or more network interfaces 204 (wired or wireless);
  • ◦ test platform 214, which provides server-side data processing and functionalities for the automated web page testing and debugging application, including but not limited to:
  • ▪ request handling module 216 for receiving a request from a client device 102 to execute one or more test cases;
  • ▪ (optional) workload determination module 218 for determining the current workload of one or more test machines 122;
  • ▪ test execution module 220 for executing the one or more test cases or causing select test machine(s) from among the one or more test machines 122 to execute the one or more test cases;
  • ▪ (optional) screenshot module 222 for capturing screenshots while executing the one or more test cases;
  • ▪ test result analyzing module 224 for performing analysis on the results of the one or more test cases;
  • ▪ sending module 226 for sending the test results and/or the analysis of the test results to the client device 102;
  • ◦ server data 240 storing data for the automated web page testing and debugging application, including but not limited to:
  • ▪ test flow/case library 114 storing test flows and test cases saved and/or submitted by client devices 102; and
  • ▪ test results database 116 storing results for completed test cases and expected results for test flows and test cases.
  • Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
  • The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments.
  • memory 206, optionally, stores a subset of the modules and data structures identified above.
  • memory 206, optionally, stores additional modules and data structures not described above.
  • Figure 3 is a block diagram illustrating a representative client device 102 in accordance with some embodiments.
  • Client device 102 typically includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset).
  • Client device 102 also includes a user interface 310.
  • User interface 310 includes one or more output devices 312 that enable presentation of media content, including one or more speakers and/or one or more visual displays.
  • User interface 310 also includes one or more input devices 314, including user interface components that facilitate user input such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a gesture capturing camera, or other input buttons or controls. Furthermore, some client devices 102 use a microphone and voice recognition or a camera and gesture recognition to supplement or replace the keyboard.
  • Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices.
  • Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302.
  • Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium.
  • In some implementations, memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
  • ◦ operating system 316 including procedures for handling various basic system services and for performing hardware-dependent tasks;
  • ◦ network communication module 318 for connecting client device 102 to other computing devices (e.g., server system 108) connected to one or more networks 110 via one or more network interfaces 304 (wired or wireless);
  • ◦ presentation module 320 for enabling presentation of information (e.g., a user interface for application(s) 326, widgets, websites and web pages thereof, and/or games, audio and/or video content, text, etc.) at client device 102 via one or more output devices 312 (e.g., displays, speakers, etc.) associated with user interface 310;
  • ◦ input processing module 322 for detecting one or more user inputs or interactions from one of the one or more input devices 314 and interpreting the detected input or interaction;
  • ◦ web browser module 104 for navigating, requesting (e.g., via HTTP), and displaying websites and web pages thereof;
  • ◦ one or more applications 326 for execution by client device 102 (e.g., games, application marketplaces, payment platforms, and/or other web or non-web based applications);
  • ◦ plug-in 106, which provides client-side data processing and functionalities for the automated web page testing and debugging application, including but not limited to:
  • ▪ trigger detection module 330 for triggering plug-in 106 based on an input detected in web browser module 104;
  • ▪ component identifying module 332 for identifying web page components for a respective web page displayed by web browser module 104;
  • ▪ information extracting module 334 for extracting location information and configuration information for the identified web page components;
  • ▪ user interface (UI) displaying module 336 for displaying a first UI for arranging a test flow, a second UI for arranging a test case, a third UI for causing execution of one or more test cases, and optionally a UI displaying test results;
  • ▪ generating module 338 for generating graphical representations for the identified web page components that correspond to test scripts for the identified web page components;
  • ▪ saving module 340 for saving test flows to test flow(s) library 362 and test cases to test case(s) library 364;
  • ▪ expected results module 342 for obtaining expected results for a test flow or test case and optionally saving the expected results in expected results library 366;
  • ▪ submitting module 344 for sending one or more test cases to server system 108 for execution;
  • ▪ test results receiving module 346 for receiving test results from server system 108 and optionally saving the test results in test results library 368;
  • ▪ analyzing module 348 for analyzing the test results received from server system 108 (i.e., against the expected results);
  • ◦ client data 360 optionally storing data associated with the automated web page testing and debugging application, including but not limited to:
  • ▪ test flow(s) library 362 storing test flows submitted by the user of client device 102;
  • ▪ test case(s) library 364 storing test cases submitted by the user of client device 102;
  • ▪ expected results library 366 storing expected results for test flows and/or test cases; and
  • ▪ test results library 368 storing test results for executed test cases.
  • Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
  • The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments.
  • memory 306, optionally, stores a subset of the modules and data structures identified above.
  • memory 306, optionally, stores additional modules and data structures not described above.
  • In some embodiments, at least some of the functions of plug-in 106 are performed by server system 108, and the corresponding sub-modules of these functions may be located within server system 108 rather than plug-in 106.
  • the functions of generating module 338 and saving module 340 are performed by server system 108.
  • at least some of the functions of server system 108 are performed by plug-in 106, and the corresponding sub-modules of these functions may be located within plug-in 106 rather than server system 108.
  • the functions of test execution module 220, screenshot module 222, and test result analyzing module 224 are performed by plug-in 106.
  • Server system 108 and client device 102 shown in Figures 2-3, respectively, are merely illustrative, and different configurations of the modules for implementing the functions described herein are possible in various embodiments.
  • the display is a touch screen (sometimes also herein called a “touch screen display”) enabled to receive one or more contacts and display information (e.g., media content, websites and web pages thereof, and/or user interfaces for application(s) 326).
  • Figures 4A-4G illustrate exemplary user interfaces for facilitating automated web page testing and debugging in accordance with some embodiments.
  • the device detects inputs on a touch-sensitive surface that is separate from the display.
  • the touch sensitive surface has a primary axis that corresponds to a primary axis on the display.
  • the device detects contacts with the touch-sensitive surface at locations that correspond to respective locations on the display. In this way, user inputs detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display of the device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
  • While the following examples are given primarily with reference to contacts (e.g., finger inputs such as finger contacts, finger tap gestures, finger swipe gestures, etc.), in some embodiments, one or more of the contacts are replaced with input from another input device (e.g., a mouse-based, stylus-based, or physical button-based input).
  • a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact).
  • a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact) or depression of a physical button.
  • Figures 4A-4G show user interface 404 displayed on client device 102 (e.g., a mobile phone); however, one skilled in the art will appreciate that the user interfaces shown in Figures 4A-4G may be implemented on other similar computing devices.
  • the user interfaces in Figures 4A-4G are used to illustrate the processes described herein, including the process described with respect to Figures 5, 6, and 7A-7C.
  • Figure 4A illustrates client device 102 executing a web browser (e.g., web browser module 104, Figures 1 and 3).
  • the web browser is displaying a landing page (or any accessible page) for a website (e.g., a news aggregation outlet).
  • the web browser includes a web address bar 406 showing a URL for the landing page of the website as the current web address, refresh affordance 408 for reloading the current web page, back navigation affordance 410-A for displaying the last web page, and forward navigation affordance 410-B for displaying the next web page.
  • the web browser also includes plug-in affordance 402, which, when activated (e.g., with a tap gesture), causes execution of a plug-in (e.g., plug-in 106, Figures 1 and 3) associated with an automated web page testing and debugging application.
  • the landing page for the website includes a plurality of webpage components, such as a logo picture 412 associated with the website, a search field 414 for searching the website, and advertisements 434-A and 434-B.
  • the landing page for the website also includes other webpage components, such as a first content section corresponding to “Today’s Top News Story” 416 with a snippet or preview 418 of the top news story, a bookmark affordance 420 for bookmarking the top news story, and a comment entry field 422 for entering a comment related to the top news story.
  • the landing page for the website further includes other webpage components, such as a second content section corresponding to “Today’s Top Sports Story” 424 with a snippet or preview 426 of the top sports story, a bookmark affordance 428 for bookmarking the top sports story, a user comments section 430 with user comments related to the top news story, and a comment entry field 432 for entering a comment related to the top sports story.
  • the webpage components shown in Figure 4A are merely exemplary; many other webpage components may be included in a given webpage for testing.
  • Figure 4A further illustrates client device 102 detecting contact 436 at a location corresponding to plug-in affordance 402.
  • in response to detecting selection of plug-in affordance 402, plug-in 106 identifies web page components of the web page displayed in Figure 4A and also extracts location information and configuration information for each of the identified web page components.
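Component identification of this kind can be sketched with the standard-library HTML parser. The tag-to-component mapping below is an assumption made for illustration (it mirrors the component kinds shown in Figure 4B: links, editable fields, functional buttons); it is not the plug-in's actual logic, and a real plug-in would work against the live DOM instead.

```python
from html.parser import HTMLParser

class ComponentExtractor(HTMLParser):
    """Collects testable web page components with location and configuration info."""

    # Hypothetical mapping from HTML tags to component kinds.
    TESTABLE = {"a": "link", "input": "editable_field", "button": "functional_button"}

    def __init__(self):
        super().__init__()
        self.components = []

    def handle_starttag(self, tag, attrs):
        if tag in self.TESTABLE:
            # getpos() supplies coarse location info (line, column);
            # the tag attributes stand in for configuration information.
            self.components.append({
                "kind": self.TESTABLE[tag],
                "location": self.getpos(),
                "config": dict(attrs),
            })

page = """<a href="/top-news">Today's Top News Story</a>
<button id="bookmark">Bookmark</button>
<input name="comment" type="text">"""
extractor = ComponentExtractor()
extractor.feed(page)
print([c["kind"] for c in extractor.components])
```

Each collected entry carries exactly the two pieces of information the method calls for, location and configuration, which the plug-in then turns into the graphical representations of Figure 4B.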
  • Figure 4B illustrates client device 102 displaying a first user interface 438 of plug-in 106 that prompts the user of client device 102 to arrange a test flow in response to selection of plug-in affordance 402 in Figure 4A.
  • first user interface 438 includes a first region 439 with a plurality of graphical representations corresponding to the web page components on the web page displayed in Figure 4A.
  • each one of the graphical representations is associated with a test script corresponding to a web page component.
  • search representation 440 represents an editable field corresponding to search field 414 in Figure 4A
  • icon representation 442 represents a link corresponding to logo picture 412 in Figure 4A
  • advertisement representation 444 represents a link corresponding to advertisement 434-A in Figure 4A
  • advertisement representation 446 represents a link corresponding to advertisement 434-B in Figure 4A
  • story representation 448 represents a link corresponding to the top news story associated with snippet 418 in Figure 4A
  • bookmark representation 450 represents a functional button corresponding to bookmark affordance 420 in Figure 4A
  • commentary representation 452 represents an editable field corresponding to comment entry field 422 in Figure 4A
  • story representation 454 represents a link corresponding to the top sports story associated with snippet 426 in Figure 4A
  • bookmark representation 456 represents a functional button corresponding to bookmark affordance 428 in Figure 4A
  • comments representation 458 represents a link corresponding to user comments section 430 in Figure 4A
  • commentary representation 460 represents an editable field corresponding to comment entry field 432 in Figure 4A
  • first user interface 438 further includes “Record Expected Results” affordance 461, which, when activated (e.g., with a tap gesture), causes the plug-in to display a user interface that prompts the user of client device 102 to perform actions indicating the expected results for the test flow.
  • first user interface 438 further includes “Submit Test Flow” affordance 462, which, when activated (e.g., with a tap gesture), causes plug-in 106 to submit the test flow arranged in first user interface 438.
  • first user interface 438 further includes “Other Options” affordance 463, which, when activated (e.g., with a tap gesture), causes the plug-in to display a user interface that enables the user of client device 102 to view previously saved test flows and test cases (e.g., a second user interface 479 for arranging a test case, Figure 4F) and to submit test case(s) for execution (e.g., a third user interface 493 for executing test case(s), Figure 4G).
  • Figure 4C illustrates client device 102 displaying graphical representations 450, 452, 458, and 460 arranged in a test flow sequence in second region 465 of first user interface 438.
  • Figure 4C also illustrates client device 102 detecting contact 464 at a location corresponding to comment field representation 452.
  • contact 464 is associated with a long-press gesture (e.g., a press gesture for greater than X seconds) on comment field representation 452.
  • Figure 4D illustrates client device 102 displaying options panel 466 in response to selection of comment field representation 452 in Figure 4C.
  • options panel 466 allows the user of client device 102 to edit options associated with comment field representation 452 for the test flow.
  • options panel 466 includes “Enter Test Text” affordance 468, which, when activated (e.g., with a tap gesture), causes client device 102 to display a virtual keyboard for entering test text for executing the test script corresponding to comment field representation 452.
  • options panel 466 also includes “Remove from Test Flow” affordance 470, which, when activated (e.g., with a tap gesture), causes client device 102 to remove comment field representation 452 from the test flow.
  • options panel 466 further includes “Other Options” affordance 472, which, when activated (e.g., with a tap gesture), causes client device 102 to display a set of options for adjusting and/or manipulating the test script corresponding to comment field representation 452 and the like.
  • Figure 4D also illustrates client device 102 detecting contact 474 at a location corresponding to “Enter Test Text” affordance 468.
  • plug-in 106 determines at least some of the options available for each webpage component based on the configuration information associated with the webpage component. In one example, if the web page component is a text input field, one of the options available to the graphical representation of the text input field is for collecting a text input test pattern from the user. In another example, if the web page component is a drop-down menu, one of the options available to the graphical representation of the drop-down menu is for collecting a selection input for the drop-down menu. In some embodiments, one of the options provided for a graphical representation is for the user to identify a storage location where the required test input for the corresponding web page component may be found.
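The configuration-driven options described above amount to a simple dispatch on component type. The option strings and the `type` key in this sketch are assumptions drawn from the examples in the paragraph, not the plug-in's actual names.

```python
def options_for_component(config):
    """Return options panel entries for a component, based on its configuration.

    Mirrors the examples above: a text input field gets a test-text entry and
    a drop-down menu gets a selection entry; the strings are illustrative.
    """
    options = []
    kind = config.get("type")
    if kind == "text_input":
        options.append("Enter Test Text")
    elif kind == "drop_down":
        options.append("Select Menu Option")
    # Options common to every graphical representation, per the paragraph above.
    options += ["Identify Test Input Location", "Remove from Test Flow", "Other Options"]
    return options

print(options_for_component({"type": "text_input"})[0])  # Enter Test Text
```

The same pattern extends to any further component types: the extracted configuration information decides which input-collection option leads the panel.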
  • Figure 4E illustrates client device 102 displaying first user interface 438 of the plug-in.
  • Figure 4E illustrates client device 102 detecting contact 462 at a location corresponding to “Submit Test Flow” affordance 462.
  • plug-in 106 submits the test flow by locally saving the test flow arranged in second region 465 to test flow(s) library 362 (Figure 3) and/or submitting the test flow to server system 108, where the test flow arranged in second region 465 is saved remotely in test flow/case library 114 (Figures 1-2).
  • Figure 4F illustrates client device 102 displaying a second user interface 479 of plug-in 106 that prompts the user of client device 102 to arrange a test case.
  • second user interface 479 is displayed in response to selection of “Submit Test Flow” affordance 462 in Figure 4E.
  • second user interface 479 is displayed at a time subsequent to Figure 4E in response to selection of plug-in affordance 402 in the web browser (e.g., in Figure 4A) and, then, selection of “Other Options” affordance 463 in first user interface 438 of plug-in 106 (e.g., in Figure 4B).
  • second user interface 479 includes flow library region 477 with graphical representations 478-A, 478-B, and 478-C of previously submitted test flows (e.g., stored in test flow(s) library 362, Figure 3, and/or test flow/case library 114, Figures 1-2) and test case region 481 for arranging a test case with the graphical representations 478-A, 478-B, and 478-C.
  • the user of client device 102 arranges a test case in test case region 481 by dragging graphical representations 478-A, 478-B, and 478-C from flow library region 477 into test case region 481 where the user may further reorder the sequence of graphical representations and/or remove graphical representations from the sequence by selecting the minus affordance overlaid on the graphical representations.
  • graphical representations 478-A and 478-B are arranged in a test case sequence.
  • second user interface 479 includes a home button 480 for returning to a home interface for plug-in 106 (e. g. , first user interface 438 in Figure 4B) , expand affordance 482 for adjusting the size of second user interface 479 and/or displaying second user interface 479 in full screen mode, options affordance 484 for adjusting configuration options for plug-in 106, new window affordance 486 for displaying second user interface 479 in a new window, and exit affordance 488 for exiting plug-in 106.
  • second user interface 479 also includes case name entry field 484 for a case name for the test case sequence in test case region 481.
  • second user interface 479 also includes reset affordance 490, which, when activated (e. g. , with a tap gesture) , causes plug-in 106 to reset the test case sequence in test case region 481 and submit affordance 492, which, when activated (e. g. , with a tap gesture) , causes plug-in 106 to submit the test case in test case region 481 by locally saving the test case arranged in test case region 481 to test case (s) library 364 ( Figure 3) and/or submitting the test case to server system 108 where the test case is saved remotely in test flow/case library 114 ( Figures 1-2) .
  • Figure 4F also illustrates client device 102 detecting contact 491 at a location corresponding to submit affordance 492.
  • Figure 4G illustrates client device 102 displaying a third user interface 493 of plug-in 106 that prompts the user of client device 102 to execute one or more test cases in response to or at a time subsequent to selection of submit affordance 492 in Figure 4F.
  • the third user interface 493 includes test case 3 submitted in response to selection of submit affordance 492 in Figure 4F and also previously submitted test cases 1 and 2.
  • the third user interface 493 includes a home button 480 for returning to a home interface for plug-in 106 (e.g., first user interface 438 in Figure 4B), expand affordance 482 for adjusting the size of third user interface 493 and/or displaying third user interface 493 in full screen mode, options affordance 484 for adjusting configuration options for plug-in 106, new window affordance 486 for displaying third user interface 493 in a new window, and exit affordance 488 for exiting plug-in 106.
  • the third user interface 493 also includes toggle boxes for selecting test cases for execution (e.g., a column of check boxes preceding the number and test case name columns).
  • the third user interface 493 further includes a plurality of options for the execution of each test case, including an affordance for adjusting a number of execution loops for a respective test case, a toggle affordance identifying whether to take screenshots during execution of the respective test case, toggle affordances for selecting web browsers (e.g., Google Chrome™, Mozilla Firefox™, and Microsoft Internet Explorer™) on which to execute the respective test case, and a set of options for editing the respective test case, including a view test case affordance.
  • the third user interface 493 further includes an execute affordance 498, which, when activated (e. g. , with a tap gesture) , causes plug-in 106 to cause execution of the test cases that have been selected for execution according to the plurality of options for execution by server system 108.
  • Figure 5 illustrates a flowchart diagram of a method 500 of automated web testing in accordance with some embodiments.
  • operations 502-510 of method 500 are performed by a device with one or more processors and memory and operations 512-514 of method 500 are performed by a server with one or more processors and memory.
  • operations 502-510 of method 500 are performed by client device 102 ( Figures 1 and 3) or a component thereof (e. g. , plug-in 106, Figures 1 and 3) and operations 512-514 of method 500 are performed by server system 108 ( Figures 1-2) or a component thereof (e. g. , test platform 214, Figure 2) .
  • method 500 is governed by instructions that are stored in a non-transitory computer readable storage medium of the device and/or server and the instructions are executed by one or more processors of the device and/or server.
  • a web browser 104 ( Figures 1 and 3) is executed on a client device 102, and web browser 104 includes a plug-in 106 that communicates with server system 108 through one or more networks 110 ( Figure 1) .
  • Plug-in 106 provides client-side functionalities for the automated web page testing and debugging application and communications with server system 108.
  • Server system 108 provides server-side functionalities for the automated web page testing and debugging application.
  • For a respective web page, client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3): identifies (502) web page components and extracts information associated with the web page components, where the extracted information comprises position and configuration information for the web page components; and displays graphical representations of the identified web page components.
  • displaying the graphical representations comprises: organizing the web page components according to the extracted information in JavaScript object notation (JSON), where each component's position is represented in the XML path language (XPath); and displaying the organized graphical representations.
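The JSON-with-XPath organization described above can be sketched as follows. The field names (`xpath`, `type`, `description`, `action`) are illustrative assumptions, not the schema actually used by plug-in 106:

```python
import json

# A minimal sketch: identified web page components organized as JSON, with
# each component's position expressed as an XPath expression.
components = [
    {"xpath": "//input[@id='comment']", "type": "text_input",
     "description": "Comment field", "action": "enter_text"},
    {"xpath": "//button[@id='submit']", "type": "button",
     "description": "Submit button", "action": "click"},
]

def organize_components(comps):
    """Serialize the extracted component list as JSON for display or storage."""
    return json.dumps({"components": comps}, indent=2)

print(organize_components(components))
```

A consumer can later recover the list with `json.loads` and locate each element in the live page via its XPath.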
  • Client device 102 or a component thereof acquires (504) a respective test flow, where the respective test flow is a sequence of one or more of the graphical representations.
  • acquiring the respective test flow comprises the following steps: detecting selection information input by the user of client device 102 for one or more of the graphical representations, where the selection information comprises arranging one or more of the graphical representations into a flow sequence; and organizing the corresponding nodes into a test flow according to the selection information.
  • In Figure 4E, graphical representations 450, 452, 458, and 460 are arranged into a test flow sequence in second region 465.
  • Client device 102 or a component thereof acquires (506) a respective test case, where the respective test case is a sequence of one or more test flows at least including the respective test flow.
  • the respective test case is a sequence of one or more test flows at least including the respective test flow.
  • In Figure 4F, graphical representations 478-B and 478-C are arranged into a test case sequence in test case region 481.
  • Client device 102 or a component thereof acquires (508) a test task, where the test task is a sequence of one or more test cases at least including the respective test case.
  • the test task is a sequence of one or more test cases at least including the respective test case.
  • each of the one or more test cases is transformed into a test logic compiled as code, and the one or more test logics are compiled into a test task.
  • the test task includes test cases 1, 2, and 3 displayed in third user interface 493.
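The flow → case → task hierarchy acquired in steps 504-508 can be sketched as nested structures that flatten into an ordered list of operations; the class and step names below are illustrative assumptions, not the patent's actual data model:

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch of the hierarchy: a test flow is an ordered sequence of
# component operations, a test case is a sequence of flows, and a test task
# is a sequence of cases.
@dataclass
class TestFlow:
    name: str
    steps: List[str]      # ordered operations on web page components

@dataclass
class TestCase:
    name: str
    flows: List[TestFlow]

@dataclass
class TestTask:
    cases: List[TestCase]

    def compile_steps(self):
        """Flatten the task into the ordered operations to execute."""
        return [s for c in self.cases for f in c.flows for s in f.steps]

login = TestFlow("login", ["enter_user", "enter_password", "click_login"])
comment = TestFlow("comment", ["enter_comment", "click_submit"])
task = TestTask([TestCase("case 1", [login]),
                 TestCase("case 2", [login, comment])])
print(task.compile_steps())
```

Because a flow appears by reference inside a case, the same submitted flow (e.g., a login sequence) can be reused across many cases, which is the reuse the test flow database is meant to enable.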
  • Client device 102 or a component thereof submits (510) the test task to server system 108 ( Figures 1-2) or a component thereof (e. g. , test platform 214, Figure 2) .
  • client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) pushes the test task to server system 108 or a component thereof (e.g., test platform 214, Figure 2) in JSON format.
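Pushing the task in JSON might look like the sketch below; the endpoint URL and payload fields are hypothetical, since the patent does not specify a wire format:

```python
import json
import urllib.request

# Hedged sketch of submitting a test task to the server as JSON.
# The payload fields ("task", "cases") and the URL are assumptions.
def build_payload(task_name, case_names):
    return json.dumps({"task": task_name, "cases": case_names}).encode("utf-8")

def push_task(url, payload):
    """POST the JSON payload to the test platform and return the response."""
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

payload = build_payload("nightly", ["case 1", "case 2", "case 3"])
# push_task("http://server.example/test-tasks", payload)  # hypothetical endpoint
```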
  • server system 108 or a component thereof executes (512) the test task or sends the test task to one or more test machines (e. g. , test machine (s) 122, Figure 1) for execution.
  • server system 108 or a component thereof performs the test task and captures screenshots of key operations of the test task.
  • the one or more test machines perform the test task, capture screenshots on key operations of the test task, and report test results and screenshot information to server system 108 or a component thereof (e. g. , test platform 214, Figure 2) .
  • the server uses WebDriver to drive a browser on the one or more test machines to perform the test task.
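A minimal sketch of how a test machine might dispatch each step of a test task onto a WebDriver-style driver is below. The element methods mirror the Selenium WebDriver API (`find_element`, `click`, `send_keys`), but the step schema is an assumption, and any object implementing those methods will work:

```python
# Hedged sketch: mapping test-task steps onto a WebDriver-like driver object.
def run_step(driver, step):
    """Locate the element by XPath and perform the step's action on it."""
    element = driver.find_element("xpath", step["xpath"])
    if step["action"] == "click":
        element.click()
    elif step["action"] == "enter_text":
        element.send_keys(step["text"])
    else:
        raise ValueError("unknown action: %s" % step["action"])

def run_task(driver, steps, screenshot_hook=None):
    """Perform each step; capture a screenshot after key operations."""
    for step in steps:
        run_step(driver, step)
        if screenshot_hook and step.get("key_operation"):
            screenshot_hook(step["xpath"])
```

With a real Selenium driver, `screenshot_hook` could call `driver.save_screenshot(...)`; here it is left injectable so the dispatch logic can be exercised without a browser.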
  • server system 108 or a component thereof provides (514) the test results and the screenshot information for the test task to the device.
  • prior to providing the test results and the screenshot information for the test task to the device, the server performs intelligent analysis on the test results and provides the intelligent analysis and the screenshot information for the test task to the device.
  • FIG. 6 illustrates a flow diagram of a process 600 for an automated web test in accordance with some embodiments.
  • process 600 is performed in a data processing environment (e.g., server-client environment 100, Figure 1) that includes a device with one or more processors and memory that is associated with a user (e.g., client device 102, Figures 1 and 3), a server with one or more processors and memory (e.g., server system 108, Figures 1-2), and optionally one or more test machines each with one or more processors and memory (e.g., test machine(s) 122, Figure 1).
  • a web browser 104 ( Figures 1 and 3) is executed on a client device 102, and web browser 104 includes a plug-in 106 that communicates with server system 108 through one or more networks 110 ( Figure 1) .
  • Plug-in 106 provides client-side functionalities for the automated web page testing and debugging application and communications with server system 108.
  • Server system 108 provides server-side functionalities for the automated web page testing and debugging application.
  • a user of client device 102 accesses (602) a web page via web browser 104 and selects an affordance associated with automated test plug-in 106.
  • plug-in 106 detects selection of the affordance or receives an indication of selection of the affordance.
  • while displaying a web page for a website corresponding to web address bar 406 in web browser 104, client device 102 detects contact 436 at a location corresponding to plug-in affordance 402.
  • the web browser sends a trigger to initiate plug-in 106 or plug-in 106 detects selection of the affordance.
  • plug-in 106 identifies (604) web page components in the displayed web page, extracts location and configuration information associated with the web page components, displays graphical representations of the web page components, and organizes a test flow with the graphical representations of the web page components according to user inputs.
  • client device 102 displays a first user interface 438 of plug-in 106 that prompts the user of client device 102 to arrange a test flow in response to selection of plug-in affordance 402 in Figure 4A.
  • first user interface 438 includes graphical representations 440-460 corresponding to the web page components of the web page displayed in Figure 4A.
  • Figure 4E shows graphical representations 450, 452, 458, and 460 arranged in a test flow sequence in second region 465 of first user interface 438.
  • this test flow sequence is submitted in response to selection of “Submit Test Flow” affordance 462 in Figure 4E.
  • plug-in 106 organizes (606) a test case at least including the test flow and organizes a test task at least including the test case.
  • client device 102 displays a second user interface 479 of plug-in 106 that prompts the user of client device 102 to arrange a test case.
  • client device 102 displays a third user interface 493 of plug-in 106 that prompts the user of client device 102 to execute one or more test cases.
  • plug-in 106 sends a test task comprising the one or more selected test cases to server system 108 or a component thereof (e. g. , test platform 214, Figure 2) .
  • plug-in 106 in response to selection of execute affordance 498, causes execution of the test cases that have been selected for execution according to the plurality of options for execution by server system 108 (e. g. , by sending a test task including the one or more selected test cases to server system 108) .
  • server system 108 or a component thereof determines the current workload of one or more test machines 122, and server system 108 or a component thereof (e.g., test execution module 220, Figure 2) sends the test cases in the test task to selected test machine(s) from among the one or more test machines 122 to execute the one or more test cases.
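One simple way to realize workload-based dispatch is to send each case to the currently least-loaded machine; the greedy policy and the workload numbers below are illustrative, since the patent does not specify the scheduling rule:

```python
# Hedged sketch of workload-based dispatch of test cases to test machines.
def assign_cases(machines, cases):
    """machines: dict of machine name -> current workload.
    Returns a dict mapping each case to the machine chosen for it."""
    load = dict(machines)
    assignment = {}
    for case in cases:
        target = min(load, key=load.get)   # pick the least-loaded machine
        assignment[case] = target
        load[target] += 1                  # account for the newly assigned case
    return assignment

print(assign_cases({"m1": 2, "m2": 0}, ["case 1", "case 2", "case 3"]))
```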
  • the test machine(s) 122 perform (608) the test task and capture screenshots for key operations of the test cases comprising the test task. After performing the one or more test cases comprising the test task, the test machine(s) 122 send the test results and screenshots to server system 108 or a component thereof (e.g., test platform 214, Figure 2).
  • server system 108 or a component thereof performs (610) intelligent analysis on the results of the one or more test cases.
  • server system 108 or a component thereof sends the intelligent analysis and the screenshots to the client device 102 or a component thereof (e. g. , plug-in 106, Figures 1 and 3) .
  • Figures 7A-7C illustrate a flowchart diagram of a method 700 of facilitating automated web page testing and debugging in accordance with some embodiments.
  • method 700 is performed by a device with one or more processors.
  • method 700 is performed by client device 102 ( Figures 1 and 3) or a component thereof (e. g. , plug-in 106, Figures 1 and 3) .
  • method 700 is governed by instructions that are stored in a non-transitory computer readable storage medium of the device and the instructions are executed by one or more processors of the device.
  • While displaying a web page in a web browser executed on the device, the device detects (702) a user input activating a plug-in associated with web page testing and debugging.
  • client device 102 executes a web browser (e.g., web browser module 104, Figures 1 and 3) displaying a landing page for a website (e.g., a news aggregation outlet).
  • the web browser also includes plug-in affordance 402, which, when activated (e.g., with a tap gesture), causes execution of a plug-in (e.g., plug-in 106, Figures 1 and 3).
  • client device 102 detects contact 436 at a location corresponding to plug-in affordance 402.
  • client device 102 or a component thereof (e.g., trigger detection module 330, Figure 3) detects the user input activating the plug-in.
  • plug-in 106 is a client-side portion of an automated web page testing and debugging application.
  • in response to detecting the user input, the device (704): identifies a plurality of web page components of the web page; and extracts respective location information and respective configuration information for the plurality of components of the web page.
  • client device 102 or a component thereof (e.g., component identifying module 332, Figure 3) identifies the plurality of web page components, and client device 102 or a component thereof (e.g., information extracting module 334, Figure 3) extracts the location information and configuration information (e.g., component type, what kinds of input are expected for the webpage component, appearance, size, etc.) for the identified web page components.
  • plug-in 106 identifies web page components for the web page displayed in Figure 4A and also extracts location information and configuration information for each of the identified web page components.
  • the location information indicates an HTML tag for the component within the web page or the coordinates of the component within the web page.
  • the configuration information includes information associated with the web page component such as a “component type” (e.g., whether it is a text input field, a button, an icon, a link to a web address, a script, etc.), a “description text” (e.g., the text or description associated with the component), or an expected “action type.”
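Component identification of this kind can be sketched with the standard-library HTML parser; the tag-to-component-type mapping and the crude XPath-like location string below are illustrative assumptions, not the plug-in's actual extraction logic:

```python
from html.parser import HTMLParser

# Hedged sketch: walk the page's HTML and record an XPath-like location plus
# configuration information for each interesting component.
class ComponentExtractor(HTMLParser):
    INTERESTING = {"input": "text input field", "button": "button", "a": "link"}
    VOID = {"input", "img", "br", "hr", "meta"}   # tags with no closing tag

    def __init__(self):
        super().__init__()
        self.path = []          # stack of open tags, used to build locations
        self.components = []

    def handle_starttag(self, tag, attrs):
        if tag in self.INTERESTING:
            self.components.append({
                "location": "/" + "/".join(self.path + [tag]),
                "component_type": self.INTERESTING[tag],
                "config": dict(attrs),   # tag attributes as configuration info
            })
        if tag not in self.VOID:
            self.path.append(tag)

    def handle_endtag(self, tag):
        if self.path and self.path[-1] == tag:
            self.path.pop()

extractor = ComponentExtractor()
extractor.feed('<html><body><input id="comment"><button>Go</button></body></html>')
print(extractor.components)
```

A production extractor would build real, index-qualified XPath expressions and richer configuration (expected input kinds, size, appearance), but the structure is the same: location plus per-component configuration.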
  • the device displays (706) a first user interface corresponding to the plug-in, the first user interface including respective graphical representations (i. e. , nodes) for the plurality of components of the web page.
  • client device 102 or a component thereof (e.g., generating module 338, Figure 3) generates the respective graphical representations, and client device 102 or a component thereof (e.g., user interface (UI) displaying module 336, Figure 3) displays the first user interface.
  • client device 102 displays a first user interface 438 of plug-in 106 that prompts the user of client device 102 to arrange a test flow in response to selection of plug-in affordance 402 in Figure 4A.
  • first user interface 438 includes a first region 439 with a plurality of graphical representations 440-460 corresponding to the web page components of the web page displayed in Figure 4A.
  • the user of client device 102 arranges a test flow in second region 465 of first user interface 438 by dragging graphical representations from first region 439 into second region 465 where the user may further reorder the sequence of graphical representations and/or remove graphical representations from the sequence.
  • the graphical representations are (708) associated with respective test script operations corresponding to the two or more components of the web page.
  • the graphical representations correspond to test script operations corresponding to web page components.
  • the test script is based on the extracted location and configuration information.
  • the user can interact with the graphical representation to further configure the test script operations corresponding to the webpage components. For example, if a webpage component is a text input field, the graphical representation for the webpage component corresponds to test script operations for filling out the text input field with certain text input, and the graphical representation can provide options for the user to select which types of test text input to use for the test script operations.
  • client device 102 displays options panel 466 in response to selection of comment field representation 452 in Figure 4C.
  • options panel 466 allows the user of client device 102 to edit options associated with comment field representation 452 for the test flow.
  • options panel 466 includes: (A) “Enter Test Text” affordance 468, which, when activated (e.g., with a tap gesture), causes client device 102 to display a virtual keyboard for entering test text for executing the test script corresponding to comment field representation 452; and (B) “Remove from Test Flow” affordance 470, which, when activated (e.g., with a tap gesture), causes client device 102 to remove comment field representation 452 from the test flow.
  • the device detects (710) one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of components of the web page into a respective test flow. For example, the user is able to arrange one or more of the plurality of the graphical representations into a custom test flow sequence. Some graphical representations may not be used to create the test flow.
  • the plug-in breaks the web page into nodes and the user selects which nodes to manipulate and include in the test flow by dragging them into the first user interface.
  • the user further interacts with each of the graphical representations to configure the node before it is added to the test flow.
  • the graphical representation can provide a drop-down menu for the user to select the configuration options available for the node corresponding to the graphical representation.
  • Figure 4C shows client device 102 displaying graphical representations 450, 452, 458, and 460 arranged in a test flow sequence in second region 465 of first user interface 438.
  • the device After detecting a user input to submit the respective test flow, the device saves (712) the respective test flow to a test flow database including zero or more previously submitted test flows, where the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time.
  • client device 102 or a component thereof e. g. , saving module 340, Figure 3
  • the device (714) detects one or more manual test inputs provided by the user to the web page; detects a change in the web page displayed in the web browser in response to the one or more manual test input; identifies a second plurality of web page components based on the change in the webpage; and displays respective graphical representations for the second plurality of web page components in the first user interface.
  • operation 714 generates graphical representations for expected results of certain test inputs.
  • These graphical representations can be compiled and associated with a test case or test flow.
  • the change can be the loading of a new page, or an update made to the current webpage.
  • the loading of a new page can be indicated by a change of the URL of the page (e.g., a login-success page).
  • an update in the page can be a modification to a portion of the page (e.g., the number of items shown in the shopping cart, etc.).
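The two kinds of change described above (new page vs. in-page update) can be sketched as a comparison of before/after snapshots; the snapshot fields (`url`, `components`) are illustrative assumptions:

```python
# Hedged sketch of change detection after manual test inputs: a new page is
# signalled by a URL change, while an in-page update is found by diffing
# the component sets before and after the input.
def detect_change(before, after):
    """before/after: dicts with a 'url' string and a 'components' set."""
    if before["url"] != after["url"]:
        return ("new_page", after["url"])
    added = after["components"] - before["components"]
    removed = before["components"] - after["components"]
    if added or removed:
        return ("update", added, removed)
    return ("no_change",)

before = {"url": "/login", "components": {"user", "password", "login_btn"}}
after = {"url": "/login-success", "components": {"welcome_banner"}}
print(detect_change(before, after))
```

The components added by the change are exactly the "second plurality of web page components" for which new graphical representations would be displayed.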
  • the device detects (716) one or more additional user inputs to select at least one of the respective representations for the second plurality of web page components in the first user interface as expected result for the respective test flow and, in response to the one or more additional user inputs to submit the expected result for the respective test flow, saves the expected result in association with the respective test flow in the test flow database.
  • the one or more additional inputs are detected when plug-in 106 is active.
  • there is an affordance in the user interface of plug-in 106 (e.g., not shown in the user interfaces in Figures 4B and 4F-4G) for showing a user interface for recording result representations, similar to the interface for recording test flows.
  • the same user interface (e.g., the user interface in Figures 4B and 4F-4G) can be used for building the test flow and adding the result representations for the test flow.
  • the manual test inputs can be a series of inputs, such as providing a user name and password on a login page, or adding one or more items to a shopping cart on a shopping webpage.
  • the expected results are saved locally in expected results library 366 ( Figure 3) and/or remotely in test results database 116 ( Figures 1-2) .
  • the expected result is (718) used to analyze test results generated by the respective test case when the respective test case is run by a machine.
  • server system 108 compares test results to the expected results to verify whether the test result was normal.
  • client device 102 or a component thereof (e.g., test results receiving module 346, Figure 3) compares the received test results to the expected results to verify whether the test result was normal.
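The comparison described above can be sketched as a per-step verdict over saved expected results; the result format here is an assumption, and real "intelligent analysis" would add screenshot comparison and richer diagnostics:

```python
# Hedged sketch: verify each step's actual result against the expected result.
def analyze_results(actual, expected):
    """Return per-step pass/fail verdicts plus an overall pass flag."""
    verdicts = {step: actual.get(step) == want for step, want in expected.items()}
    return verdicts, all(verdicts.values())

verdicts, ok = analyze_results(
    actual={"login": "success", "comment": "error 500"},
    expected={"login": "success", "comment": "posted"},
)
print(verdicts, ok)
```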
  • the device (720) displays a second user interface corresponding to the plug-in that includes the respective test flow and one or more previously submitted test flows; while displaying the second user interface, detects one or more user inputs to arrange at least one of the respective test flow and one or more previously submitted test flows into a respective test case; and, after detecting a user input to submit the respective test case, saves the respective test case to a test case database including zero or more previously submitted test cases, where the test case database is configured to provide the respective test case for execution by a test machine.
  • the second user interface is provided immediately after the submission of the respective test flow, or at a later time when the library is opened.
  • client device 102 displays a second user interface 479 of plug-in 106 that prompts the user of client device 102 to arrange a test case.
  • second user interface 479 is displayed in response to selection of “Submit Test Flow” affordance 462 in Figure 4E.
  • second user interface 479 is displayed at a time subsequent to Figure 4E in response to selection of plug-in affordance 402 in the web browser (e. g. , in Figure 4A) and, then, selection of “Other Options” affordance 463 in first user interface 438 of plug-in 106 (e. g. , in Figure 4B) .
  • second user interface 479 includes flow library region 477 with graphical representations 478-A, 478-B, and 478-C of previously submitted test flows (e.g., stored in test flow(s) library 362 (Figure 3) and/or test flow/case library 114 (Figures 1-2)) and test case region 481 for arranging a test case with graphical representations 478-A, 478-B, and 478-C.
  • the user of client device 102 arranges a test case in test case region 481 by dragging graphical representations 478-A, 478-B, and 478-C from flow library region 477 into test case region 481.
  • the first and second user interfaces displayed within the web browser are (722) overlaid on the web page.
  • the first and second user interfaces are pop-up or floating windows that can be resized and moved.
  • the first and second user interfaces are displayed in a window distinct from the web browser.
  • first user interface 438 is displayed within web browser 104 and overlaid on the web page displayed in Figure 4A.
  • second user interface 479 is displayed within web browser 104 and overlaid on the web page displayed in Figure 4A.
  • the device displays (724) a third user interface corresponding to the plug-in that includes the respective test case and the zero or more previously submitted test cases, where the third user interface includes a plurality of options for executing each of the respective test case and the one or more previously submitted test cases.
  • client device 102 displays a third user interface 493 of plug-in 106 that prompts the user of client device 102 to execute one or more test cases in response to or at a time subsequent to selection of submit affordance 492 in Figure 4F.
  • the user is able to prioritize the order in which test cases are run, the number of iterations per test case, the browser on which to run the test case, removing test cases from the execution list, and/or whether to capture screenshots during execution of the test cases.
  • the third user interface 493 includes options for the execution of test cases 1, 2, and 3 including adjusting the execution order of the test cases 1, 2, and 3, toggling execution of test cases 1, 2, and 3, adjusting a number of loops of execution for test cases 1, 2, and 3, toggling screenshots during execution of test cases 1, 2, and 3, and selecting web browsers in which to execute test cases 1, 2, and 3.
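The per-test-case execution options listed above (enable/disable, loop count, screenshots, target browsers) can be sketched as a small, serializable configuration record; the field names are assumptions, not the plug-in's actual option schema:

```python
from dataclasses import dataclass, asdict, field
from typing import List
import json

# Illustrative sketch of per-test-case execution options.
@dataclass
class ExecutionOptions:
    enabled: bool = True                 # toggle box: include in execution
    loops: int = 1                       # number of execution loops
    screenshots: bool = False            # capture screenshots of key operations
    browsers: List[str] = field(default_factory=lambda: ["chrome"])

opts = ExecutionOptions(loops=3, screenshots=True,
                        browsers=["chrome", "firefox"])
print(json.dumps(asdict(opts)))
```

Serializing the options alongside each selected test case lets them travel with the test task when it is submitted to the server.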
  • after detecting a user input to execute at least one of the respective test case and zero or more previously submitted test cases, the device sends (726) the at least one of the respective test case and the one or more previously submitted test cases to a testing platform.
  • client device 102 or a component thereof (e.g., submitting module 344, Figure 3) sends the at least one test case to the testing platform.
  • the third user interface 493 in Figure 4G further includes an execute affordance 498, which, when activated (e. g. , with a tap gesture) , causes plug-in 106 to cause execution of the test cases that have been selected for execution according to the plurality of options for execution by server system 108.
  • plug-in 106 in response to selection of execute affordance 498 in Figure 4G, plug-in 106 sends the selected test cases to server system 108 for execution.
  • server system 108 or a component thereof (e.g., test execution module 220, Figure 2) executes the selected test cases or causes one or more test machines 122 to execute the selected test cases.
  • server system 108 sends the test cases to various test machines 122 to execute the test cases based on the work load of the test machines 122 as determined by workload determination module 218 ( Figure 2) .
  • after sending the at least one of the respective test case and the one or more previously submitted test cases to the testing platform, the device (728): obtains results from the testing platform for the at least one of the respective test case and the one or more previously submitted test cases; and displays a fourth user interface with the results obtained from the testing platform.
  • client device 102 or a component thereof (e.g., test results receiving module 346, Figure 3) obtains the results from the testing platform.
  • the test results are emailed or sent to the user via another communication method (e. g. , SMS, MMS, or the like) .
  • the results obtained from the testing platform further include (730) intelligent analysis on test results for the at least one of the respective test case and the one or more previously submitted test cases and one or more screenshots corresponding to execution of the at least one of the respective test case and the one or more previously submitted test cases.
  • server system 108 or a component thereof (e.g., test result analyzing module 224, Figure 2) performs intelligent analysis on the results of the one or more test cases and the screenshots and, in some circumstances, expected results for the one or more test cases submitted by the user of client device 102.

Abstract

A method and system for facilitating web page testing are disclosed. While displaying a web page, a device with processor(s) and memory detects a user input activating a plug-in for web page testing. In response to detecting the user input, the device: identifies web page components of the web page; and extracts location and configuration information for the web page components. The device displays an interface corresponding to the plug-in including graphical representations for the web page components. The device detects user inputs to select and arrange the graphical representations for two or more of the web page components into a test flow and, after detecting a user input to submit the test flow, the device saves the test flow to a test flow database. The test flow database provides the respective test flow for use in constructing one or more test cases at a later time.

Description

METHOD AND SYSTEM FOR FACILITATING AUTOMATED WEB PAGE TESTING
PRIORITY CLAIM AND RELATED APPLICATION
This application claims priority to Chinese Patent Application No. 201310432090.7, entitled “Method, System, and Apparatus for Automated Web Testing, ” filed on September 22, 2013, which is incorporated by reference in its entirety.
FIELD OF THE TECHNOLOGY
The present disclosure relates to the field of Web testing technologies and, in particular, to a method and system for facilitating automated web page testing and debugging.
BACKGROUND
Conventional web testing is as follows: For a web page to be tested, code of the web page is tagged, a test logic is defined, and an automated test is performed on the tagged web page code according to the test logic.
This manner at least has the following defects:
1) Difficult to understand: The test logic is written in code, so it is readable only by developers and difficult for others to understand.
2) Poor independence: Markup code must be inserted into the code of the original Web project page, which may affect the functionality or performance of the original page.
3) High use cost: This testing manner has a high technical threshold; it can be implemented only by test personnel with strong coding skills early in the development process, and it also requires developer cooperation.
4) High maintenance cost: During agile development, business requirements are modified frequently, so the cost of keeping the test logic code synchronized is high.
5) Low applicability: Each web page requires independently written test logic code, so the code typically cannot be reused.
SUMMARY
In some embodiments, a method of facilitating automated web page testing and debugging is performed at a device (e.g., client device 102, Figures 1 and 3) with one or more processors and memory. The method includes detecting a user input activating a plug-in associated with web page testing and debugging while displaying the web page in a web browser executed on the device. In response to detecting the user input activating the plug-in, the method includes: identifying a plurality of web page components of the web page; and extracting respective location information and respective configuration information for the plurality of web page components of the web page. The method includes displaying a first user interface corresponding to the plug-in, the first user interface including respective graphical representations for the plurality of web page components of the web page. The method includes detecting one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of components of the web page into a respective test flow and, after detecting a user input to submit the respective test flow, saving the respective test flow to a test flow database including zero or more previously submitted test flows, where the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time.
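The component-identification and information-extraction steps above can be sketched as follows. This is a hedged illustration using Python's standard `html.parser` rather than the plug-in's actual mechanism; the set of "testable" tag names and the record schema are assumptions.

```python
from html.parser import HTMLParser

# Tag names assumed to correspond to testable web page components (links,
# editable fields, functional buttons, pictures); this list is illustrative.
TESTABLE_TAGS = {"a", "input", "button", "select", "textarea", "img"}

class ComponentExtractor(HTMLParser):
    """Identify web page components and extract their location and
    configuration information, loosely mirroring the method's two steps."""

    def __init__(self):
        super().__init__()
        self.components = []

    def handle_starttag(self, tag, attrs):
        if tag in TESTABLE_TAGS:
            line, col = self.getpos()  # location information in the source
            self.components.append({
                "tag": tag,
                "location": {"line": line, "column": col},
                "config": dict(attrs),  # configuration information
            })

# A tiny page fragment standing in for the displayed web page.
page = '<div><input id="search" type="text"><a href="/top">Top Story</a></div>'
extractor = ComponentExtractor()
extractor.feed(page)
```

Each entry in `extractor.components` could then back one graphical representation in the first user interface.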
In some embodiments, a computing device (e.g., client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3)) includes one or more processors and memory storing one or more programs for execution by the one or more processors, the one or more programs including instructions for performing, or controlling performance of, the operations of any of the methods described herein. In some embodiments, a non-transitory computer readable storage medium stores one or more programs, the one or more programs comprising instructions, which, when executed by a computing device (e.g., client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3)) with one or more processors, cause the computing device to perform, or control performance of, the operations of any of the methods described herein. In some embodiments, a computing device (e.g., client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3)) includes means for performing, or controlling performance of, the operations of any of the methods described herein.
Various advantages of the present application are apparent in light of the descriptions below.
BRIEF DESCRIPTION OF DRAWINGS
The aforementioned features and advantages of the disclosed technology as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of preferred embodiments when taken in conjunction with the drawings.
To describe the technical solutions in the embodiments of the present disclosed technology or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show merely some embodiments of the present disclosed technology, and persons of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
Figure 1 is a block diagram of a server-client environment in accordance with some embodiments.
Figure 2 is a block diagram of a server system in accordance with some embodiments.
Figure 3 is a block diagram of a client device in accordance with some embodiments.
Figures 4A-4G illustrate exemplary user interfaces for facilitating automated web page testing and debugging in accordance with some embodiments.
Figure 5 is a flowchart diagram of an automated web page testing method in accordance with some embodiments.
Figure 6 is a flow diagram of an automated web page testing process in accordance with some embodiments.
Figures 7A-7C illustrate a flowchart diagram of a method of facilitating automated web page testing and debugging in accordance with some embodiments.
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
DESCRIPTION OF EMBODIMENTS
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one skilled in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The following clearly and completely describes the technical solutions in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application. Apparently, the described embodiments are merely a part rather than all of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative efforts shall fall within the protection scope of the present application.
As shown in Figure 1, data processing for an automated web page testing and debugging application is implemented in a server-client environment 100 in accordance with some embodiments. Server-client environment 100 includes client devices 102-1 and 102-2 and server system 108. In some embodiments, a web browser 104 is executed on a client device 102, and web browser 104 includes a plug-in 106 that communicates with server system 108 through one or more networks 110. Plug-in 106 provides client-side functionalities for the automated web page testing and debugging application and communications with server system 108. Server system 108 provides server-side functionalities for the automated web page testing and debugging application for any number of plug-ins 106, each being executed from a web browser 104 on a respective client device 102.
In some embodiments, server system 108 includes one or more processors 112, test flow/case library 114, test results database 116, an I/O interface to one or more clients 118, and an optional I/O interface to one or more external services 120. I/O interface to one or more clients 118 facilitates the client-facing input and output processing for server system 108. In some embodiments, processor(s) 112 obtain one or more test cases submitted by a client device 102 and, in response, execute the one or more test cases or cause one or more test machines 122 to execute them. Test flow/case library 114 stores test flows and test cases saved and/or submitted by client devices 102, and test results database 116 stores results for completed test cases as well as expected results for test flows and test cases. I/O interface to one or more external services 120 optionally facilitates communications with one or more test machines 122. For example, in some embodiments, after receiving one or more test cases from a client device 102, server system 108 queries one or more test machines 122 to determine their current workloads and causes the one or more test cases to be executed by selected test machine(s) of one or more test machines 122 according to their corresponding workloads.
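The workload-based dispatch described in the last sentence might look like the following sketch; the machine names and the simple count-based load metric are assumptions, not details from the disclosure.

```python
# Illustrative sketch of workload-based test dispatch: the server queries test
# machines for their current workloads and assigns each test case to the least
# loaded machine, updating the tally as cases are assigned.

def select_machine(workloads):
    """Return the machine with the lowest reported workload."""
    return min(workloads, key=workloads.get)

def dispatch(test_cases, workloads):
    """Assign each test case to a machine according to current workloads."""
    assignments = {}
    for case in test_cases:
        machine = select_machine(workloads)
        assignments[case] = machine
        workloads[machine] += 1  # account for the newly assigned case
    return assignments
```

For example, with machines "m1" (idle) and "m2" (one running case), the first cases go to "m1" until its load catches up, after which assignment alternates.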
Examples of client device 102 include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA), a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices or other data processing devices.
Examples of one or more networks 110 include local area networks (LAN) and wide area networks (WAN) such as the Internet. One or more networks 110 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB), FIREWIRE, Long Term Evolution (LTE), Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocol.
Server system 108 is implemented on one or more standalone data processing apparatuses or a distributed network of computers. In some embodiments, server system 108 also employs various virtual devices and/or services of third party service providers (e. g. , third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 108.
In Figure 1, the automated web page testing and debugging application includes both a client-side portion (e.g., plug-in 106) and a server-side portion (e.g., server system 108). In some embodiments, data processing is implemented as a standalone application installed on client device 102, and the databases are created and stored locally at the client device. In addition, the division of functionalities between the client and server portions of client environment data processing can vary in different embodiments. For example, in some embodiments, plug-in 106 is a thin-client that provides only user-facing input and output processing functions, and delegates all other data processing functionalities to a backend server (e.g., server system 108). For example, in some embodiments, the automated web page testing and debugging is performed entirely by plug-in 106, and client device 102 stores saved test flows, test cases, and test results. In another example, in some embodiments, the automated web page testing and debugging is performed entirely by server system 108, and server system 108 stores saved test flows, test cases, and test results. In another example, the automated web page testing and debugging is performed by plug-in 106, and server system 108 stores saved test flows, test cases, and test results.
Figure 2 is a block diagram illustrating server system 108 in accordance with some embodiments. Server system 108, typically, includes one or more processing units (CPUs) 112, one or more network interfaces 204 (e.g., including I/O interface to one or more clients 118 and I/O interface to one or more external services 120), memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset). Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 112. Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable storage medium. In some implementations, memory 206, or the non-transitory computer readable storage medium of memory 206, stores the following programs, modules, and data structures, or a subset or superset thereof:
· operating system 210 including procedures for handling various basic system services and for performing hardware dependent tasks;
· network communication module 212 for connecting server system 108 to other computing devices (e.g., client devices 102 and test machine(s) 122) connected to one or more networks 110 via one or more network interfaces 204 (wired or wireless);
· test platform 214, which provides server-side data processing and functionalities for the automated web page testing and debugging application, including but not limited to:
o request handling module 216 for receiving a request from a client device 102 to execute one or more test cases;
o (optional) workload determination module 218 for determining the current workload of one or more test machines 122;
o test execution module 220 for executing the one or more test cases or causing selected test machine(s) from among the one or more test machines 122 to execute the one or more test cases;
o (optional) screenshot module 222 for capturing screenshots while executing the one or more test cases;
o test result analyzing module 224 for performing analysis on the results of the one or more test cases; and
o sending module 226 for sending the test results and/or the analysis of the test results to the client device 102; and
· server data 240 storing data for the automated web page testing and debugging application, including but not limited to:
o test flow/case library 114, which stores test flows and test cases saved and/or submitted by client devices 102; and
o test results database 116, which stores results for completed test cases and expected results for test flows and test cases.
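As a rough illustration of what test result analyzing module 224 could do, the sketch below compares each test case's actual result against the expected result stored in the test results database and summarizes pass/fail; the dictionary schema is an assumption.

```python
# Hypothetical sketch of result analysis: compare actual test case results
# against expected results (as stored in test results database 116) and
# produce a pass/fail report per test case.

def analyze_results(actual, expected):
    """Map each test case ID to "pass" or "fail" by comparing results."""
    report = {}
    for case_id, result in actual.items():
        want = expected.get(case_id)
        report[case_id] = "pass" if result == want else "fail"
    return report

summary = analyze_results(
    actual={"tc1": "ok", "tc2": "error"},
    expected={"tc1": "ok", "tc2": "ok"},
)
```

Sending module 226 could then forward `summary` alongside the raw results to the client device.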
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 206, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 206, optionally, stores additional modules and data structures not described above.
Figure 3 is a block diagram illustrating a representative client device 102 in accordance with some embodiments. Client device 102, typically, includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset) . Client device 102 also includes a user interface 310. User interface 310 includes one or more output devices 312 that enable presentation of media content, including one or more speakers and/or one or more visual displays. User interface 310 also includes one or more input devices 314, including user interface components that facilitate user input such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a gesture capturing camera, or other input buttons or controls. Furthermore, some client devices 102 use a microphone and voice recognition or a camera and gesture recognition to supplement or replace the keyboard. Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302. Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium. In some implementations, memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
· operating system 316 including procedures for handling various basic system services and for performing hardware dependent tasks;
· network communication module 318 for connecting client device 102 to other computing devices (e.g., server system 108) connected to one or more networks 110 via one or more network interfaces 304 (wired or wireless);
· presentation module 320 for enabling presentation of information (e.g., a user interface for application(s) 326, widgets, websites and web pages thereof, and/or games, audio and/or video content, text, etc.) at client device 102 via one or more output devices 312 (e.g., displays, speakers, etc.) associated with user interface 310;
· input processing module 322 for detecting one or more user inputs or interactions from one of the one or more input devices 314 and interpreting the detected input or interaction;
· web browser module 104 for navigating, requesting (e.g., via HTTP), and displaying websites and web pages thereof;
· one or more applications 326 for execution by client device 102 (e.g., games, application marketplaces, payment platforms, and/or other web or non-web based applications);
· plug-in 106, which provides client-side data processing and functionalities for the automated web page testing and debugging application, including but not limited to:
o trigger detection module 330 for triggering plug-in 106 based on an input detected in web browser module 104;
o component identifying module 332 for identifying web page components for a respective web page displayed by web browser module 104;
o information extracting module 334 for extracting location information and configuration information for the identified web page components;
o user interface (UI) displaying module 336 for displaying a first UI for arranging a test flow, a second UI for arranging a test case, a third UI for causing execution of one or more test cases, and optionally a UI displaying test results;
o generating module 338 for generating graphical representations for the identified web page components that correspond to test scripts for the identified web page components;
o saving module 340 for saving test flows to test flow(s) library 362 and test cases to test case(s) library 364;
o expected results module 342 for obtaining expected results for a test flow or test case and optionally saving the expected results in expected results library 366;
o submitting module 344 for sending one or more test cases to server system 108 for execution;
o test results receiving module 346 for receiving test results from server system 108 and optionally saving the test results in test results library 368; and
o analyzing module 348 for analyzing the test results received from server system 108 (i.e., against the expected results); and
· client data 360 optionally storing data associated with the automated web page testing and debugging application, including but not limited to:
o test flow(s) library 362 storing test flows submitted by the user of client device 102;
o test case(s) library 364 storing test cases submitted by the user of client device 102;
o expected results library 366 storing expected results for test flows and/or test cases; and
o test results library 368 storing test results for executed test cases.
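One plausible, purely illustrative realization of saving module 340 and test flow(s) library 362 is a small JSON file keyed by flow name; the file layout and schema here are assumptions, not details from the disclosure.

```python
import json
import os
import tempfile

# Hypothetical local library file standing in for test flow(s) library 362.
LIBRARY_PATH = os.path.join(tempfile.mkdtemp(), "test_flows.json")

def save_test_flow(library_path, flow_name, steps):
    """Persist a named test flow (an ordered list of steps) to the library."""
    library = {}
    if os.path.exists(library_path):
        with open(library_path) as f:
            library = json.load(f)
    library[flow_name] = steps
    with open(library_path, "w") as f:
        json.dump(library, f)

def load_test_flow(library_path, flow_name):
    """Retrieve a previously saved test flow by name."""
    with open(library_path) as f:
        return json.load(f)[flow_name]

# Save a flow resembling the arrangement in Figure 4C (numerals illustrative).
save_test_flow(LIBRARY_PATH, "comment-flow", ["click 450", "type 452", "click 458"])
```

Submitting module 344 could serialize the same structure when sending flows to server system 108 for storage in test flow/case library 114.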
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, modules, or data structures, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 306, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 306, optionally, stores additional modules and data structures not described above.
In some embodiments, at least some of the functions of plug-in 106 are performed by server system 108, and the corresponding sub-modules of these functions may be located within server system 108 rather than plug-in 106. For example, the functions of generating module 338 and saving module 340 are performed by server system 108. In some embodiments, at least some of the functions of server system 108 are performed by plug-in 106, and the corresponding sub-modules of these functions may be located within plug-in 106 rather than server system 108. For example, the functions of test execution module 220, screenshot module 222, and test result analyzing module 224 are performed by plug-in 106. Server system 108 and client device 102 shown in Figures 2-3, respectively, are merely illustrative, and different configurations of the modules for implementing the functions described herein are possible in various embodiments.
Attention is now directed towards embodiments of user interfaces and associated processes that may be implemented on a client device 102 with zero or more speakers, zero or more microphones, and a display. For example, the display is a touch screen (sometimes also herein called a “touch screen display”) enabled to receive one or more contacts and display information (e.g., media content, websites and web pages thereof, and/or user interfaces for application(s) 326). Figures 4A-4G illustrate exemplary user interfaces for facilitating automated web page testing and debugging in accordance with some embodiments.
Although some of the examples that follow will be given with reference to inputs on a touch screen (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display. In some embodiments, the touch-sensitive surface has a primary axis that corresponds to a primary axis on the display. In accordance with these embodiments, the device detects contacts with the touch-sensitive surface at locations that correspond to respective locations on the display. In this way, user inputs detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display of the device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
Additionally, while the following examples are given primarily with reference to contacts (e.g., finger inputs such as finger contacts, finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of the contacts are replaced with input from another input device (e.g., a mouse-based, stylus-based, or physical button-based input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact) or depression of a physical button. Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
Figures 4A-4G show user interface 404 displayed on client device 102 (e.g., a mobile phone); however, one skilled in the art will appreciate that the user interfaces shown in Figures 4A-4G may be implemented on other similar computing devices. The user interfaces in Figures 4A-4G are used to illustrate the processes described herein, including the processes described with respect to Figures 5-6 and 7A-7C.
Figure 4A illustrates client device 102 executing a web browser (e.g., web browser module 104, Figures 1 and 3). In Figure 4A, the web browser is displaying a landing page (or any accessible page) for a website (e.g., a news aggregation outlet). In Figure 4A, the web browser includes a web address bar 406 showing a URL for the landing page of the website as the current web address, refresh affordance 408 for reloading the current web page, back navigation affordance 410-A for displaying the last web page, and forward navigation affordance 410-B for displaying the next web page. In Figure 4A, the web browser also includes plug-in affordance 402, which, when activated (e.g., with a tap gesture), causes execution of a plug-in (e.g., plug-in 106, Figures 1 and 3) associated with an automated web page testing and debugging application.
In Figure 4A, the landing page for the website includes a plurality of webpage components, such as a logo picture 412 associated with the website, a search field 414 for searching the website, and advertisements 434-A and 434-B. In Figure 4A, the landing page for the website also includes other webpage components, such as a first content section corresponding to “Today’s Top News Story” 416 with a snippet or preview 418 of the top news story, a bookmark affordance 420 for bookmarking the top news story, and a comment entry field 422 for entering a comment related to the top news story. In Figure 4A, the landing page for the website further includes other webpage components, such as a second content section corresponding to “Today’s Top Sports Story” 424 with a snippet or preview 426 of the top sports story, a bookmark affordance 428 for bookmarking the top sports story, a user comments section 430 with user comments related to the top news story, and a comment entry field 432 for entering a comment related to the top sports story. The webpage components shown in Figure 4A are merely exemplary; many other webpage components may be included in a given webpage for testing.
Figure 4A further illustrates client device 102 detecting contact 436 at a location corresponding to plug-in affordance 402. In some embodiments, in response to detecting selection of plug-in affordance 402, plug-in 106 identifies web page components of the web page displayed in Figure 4A and also extracts location information and configuration information for each of the identified web page components.
Figure 4B illustrates client device 102 displaying a first user interface 438 of plug-in 106 that prompts the user of client device 102 to arrange a test flow in response to selection of plug-in affordance 402 in Figure 4A. In Figure 4B, first user interface 438 includes a first region 439 with a plurality of graphical representations corresponding to the web page components on the web page displayed in Figure 4A. In some embodiments, each one of the graphical representations is associated with a test script corresponding to a web page component.
For example, search representation 440 represents an editable field corresponding to search field 414 in Figure 4A, icon representation 442 represents a link corresponding to logo picture 412 in Figure 4A, advertisement representation 444 represents a link corresponding to advertisement 434-A in Figure 4A, advertisement representation 446 represents a link corresponding to advertisement 434-B in Figure 4A, story representation 448 represents a link corresponding to the top news story associated with snippet 418 in Figure 4A, bookmark representation 450 represents a functional button corresponding to bookmark affordance 420 in Figure 4A, commentary representation 452 represents an editable field corresponding to comment entry field 422 in Figure 4A, story representation 454 represents a link corresponding to the top sports story associated with snippet 426 in Figure 4A, bookmark representation 456 represents a functional button corresponding to bookmark affordance 428 in Figure 4A, comments representation 458 represents a link corresponding to user comments section 430 in Figure 4A, and commentary representation  460 represents an editable field corresponding to comment entry field 432 in Figure 4A. In Figure 4B, first user interface 438 also includes second region 465 for arranging a test flow.
For example, the user of client device 102 arranges a test flow in second region 465 of first user interface 438 by dragging graphical representations from first region 439 into second region 465, where the user may further reorder the sequence of graphical representations and/or remove graphical representations from the sequence. In Figure 4B, first user interface 438 further includes “Record Expected Results” affordance 461, which, when activated (e.g., with a tap gesture), causes the plug-in to display a user interface that prompts the user of client device 102 to perform actions indicating the expected results for the test flow. In Figure 4B, first user interface 438 further includes “Submit Test Flow” affordance 462, which, when activated (e.g., with a tap gesture), causes the plug-in to submit the test flow by locally saving the test flow arranged in second region 465 in test flow(s) library 362 and/or remotely saving the test flow arranged in second region 465 in test flow/case library 114 (Figures 1-2). In Figure 4B, first user interface 438 further includes “Other Options” affordance 463, which, when activated (e.g., with a tap gesture), causes the plug-in to display a user interface that enables the user of client device 102 to view previously saved test flows and test cases (e.g., a second user interface 479 for arranging a test case, Figure 4F) and to submit test case(s) for execution (e.g., a third user interface 493 for executing test case(s), Figure 4G).
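The drag-to-arrange interaction described above reduces to operations on an ordered sequence. The following minimal model is illustrative only, reusing reference numerals from Figure 4B as step identifiers:

```python
# Minimal model of arranging a test flow: an ordered list of graphical-
# representation IDs the user can append to, reorder, and remove from.

class TestFlow:
    def __init__(self):
        self.steps = []

    def add(self, rep_id):
        """Drag a graphical representation into the test flow region."""
        self.steps.append(rep_id)

    def move(self, old_index, new_index):
        """Reorder a step within the sequence."""
        self.steps.insert(new_index, self.steps.pop(old_index))

    def remove(self, rep_id):
        """Remove a graphical representation from the sequence."""
        self.steps.remove(rep_id)

flow = TestFlow()
for rep in (450, 452, 458, 460):  # arrangement resembling Figure 4C
    flow.add(rep)
flow.move(3, 2)    # reorder the last two steps
flow.remove(452)   # akin to the "Remove from Test Flow" option
```

Submitting the flow would then serialize `flow.steps` to the test flow library.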
Figure 4C illustrates client device 102 displaying graphical representations 450, 452, 458, and 460 arranged in a test flow sequence in second region 465 of first user interface 438. Figure 4C also illustrates client device 102 detecting contact 464 at a location corresponding to comment field representation 452. For example, contact 464 is associated with a long-press gesture (e.g., a press gesture for greater than X seconds) on comment field representation 452.
Figure 4D illustrates client device 102 displaying options panel 466 in response to selection of comment field representation 452 in Figure 4C. In Figure 4D, options panel 466 allows the user of client device 102 to edit options associated with comment field representation 452 for the test flow. In Figure 4D, options panel 466 includes “Enter Test Text” affordance 468, which, when activated (e. g. , with a tap gesture) , causes client device 102 to display a virtual keyboard for entering test text for executing the test script corresponding to comment field representation 452. In Figure 4D, options panel 466 also  includes “Remove from Test Flow” affordance 470, which, when activated (e. g. , with a tap gesture) , causes plug-in 106 to remove comment field representation 452 from the test flow (e. g. , shown in second region 465 in Figure 4C) . In Figure 4D, options panel 466 further includes “Other Options” affordance 472 which, when activated (e. g. , with a tap gesture) , causes client device 102 to display a set of options for adjusting and/or manipulating the test script corresponding to comment field representation 452 and the like. Figure 4D also illustrates client device 102 detecting contact 474 at a location corresponding to “Enter Test Text” affordance 468. For example, after selecting “Enter Test Text” affordance 468, ” the user of client device 102 enters test text for the comment entry field. In some embodiments, the plug-in 106 determines at least some of the options available for each webpage component based on the configuration information associated with the webpage component. In one example, if the web page component is a text input field, one of the options available to the graphical representation of the text input field is for collecting a text input test pattern from the user. 
In another example, if the web page component is a drop-down menu, the options available to the graphical representation of the drop-down menu include collecting a selection input for the drop-down menu. In some embodiments, one of the options provided for a graphical representation allows the user to identify a storage location where the required test input for the corresponding web page component may be found.
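The per-type options described above can be derived mechanically from a component's extracted configuration information. The following Python sketch is purely illustrative and not part of the disclosure; the component type keys and option labels are assumptions chosen to mirror the affordances shown in Figure 4D.

```python
# Illustrative sketch (not the disclosed implementation): derive the
# options shown in a graphical representation's options panel from the
# component's configuration information. Type names are assumptions.

def options_for_component(config):
    """Return the editing options for a component's graphical representation."""
    options = ["Remove from Test Flow", "Other Options"]  # common to all nodes
    component_type = config.get("component_type")
    if component_type == "text_input":
        # Text fields need a test input pattern from the user.
        options.insert(0, "Enter Test Text")
    elif component_type == "drop_down_menu":
        # Drop-down menus need a selection input instead.
        options.insert(0, "Select Menu Option")
    return options

print(options_for_component({"component_type": "text_input"}))
# ['Enter Test Text', 'Remove from Test Flow', 'Other Options']
```

A component type with no input requirement simply keeps the two common options.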
Continuing with the example above, after entering the test text, Figure 4E illustrates client device 102 displaying first user interface 438 of plug-in 106. Figure 4E illustrates client device 102 detecting contact 462 at a location corresponding to “Submit Test Flow” affordance 462. For example, in response to selection of “Submit Test Flow” affordance 462, plug-in 106 submits the test flow by locally saving the test flow arranged in second region 465 to test flow(s) library 362 (Figure 3) and/or submitting the test flow to server system 108, where the test flow arranged in second region 465 is saved remotely in test flow/case library 114 (Figures 1-2).
Figure 4F illustrates client device 102 displaying a second user interface 479 of plug-in 106 that prompts the user of client device 102 to arrange a test case. For example, second user interface 479 is displayed in response to selection of “Submit Test Flow” affordance 462 in Figure 4E. In another example, second user interface 479 is displayed at a time subsequent to Figure 4E in response to selection of plug-in affordance 402 in the web browser (e.g., in Figure 4A) and, then, selection of “Other Options” affordance 463 in first user interface 438 of plug-in 106 (e.g., in Figure 4B).
In Figure 4F, second user interface 479 includes flow library region 477 with graphical representations 478-A, 478-B, and 478-C of previously submitted test flows (e.g., stored in test flow(s) library 362, Figure 3 and/or test flow/case library 114, Figures 1-2) and test case region 481 for arranging a test case with graphical representations 478-A, 478-B, and 478-C. For example, the user of client device 102 arranges a test case in test case region 481 by dragging graphical representations 478-A, 478-B, and 478-C from flow library region 477 into test case region 481, where the user may further reorder the sequence of graphical representations and/or remove graphical representations from the sequence by selecting the minus affordance overlaid on the graphical representations. In Figure 4F, graphical representations 478-A and 478-B are arranged in a test case sequence.
In Figure 4F, second user interface 479 includes a home button 480 for returning to a home interface for plug-in 106 (e.g., first user interface 438 in Figure 4B), expand affordance 482 for adjusting the size of second user interface 479 and/or displaying second user interface 479 in full screen mode, options affordance 484 for adjusting configuration options for plug-in 106, new window affordance 486 for displaying second user interface 479 in a new window, and exit affordance 488 for exiting plug-in 106. In Figure 4F, second user interface 479 also includes case name entry field 484 for entering a case name for the test case sequence in test case region 481. In Figure 4F, second user interface 479 also includes reset affordance 490, which, when activated (e.g., with a tap gesture), causes plug-in 106 to reset the test case sequence in test case region 481, and submit affordance 492, which, when activated (e.g., with a tap gesture), causes plug-in 106 to submit the test case in test case region 481 by locally saving the test case arranged in test case region 481 to test case(s) library 364 (Figure 3) and/or submitting the test case to server system 108, where the test case is saved remotely in test flow/case library 114 (Figures 1-2). Figure 4F also illustrates client device 102 detecting contact 491 at a location corresponding to submit affordance 492.
Figure 4G illustrates client device 102 displaying a third user interface 493 of plug-in 106 that prompts the user of client device 102 to execute one or more test cases in response to or at a time subsequent to selection of submit affordance 492 in Figure 4F. In Figure 4G, the third user interface 493 includes test case 3, submitted in response to selection of submit affordance 492 in Figure 4F, and also previously submitted test cases 1 and 2. In Figure 4G, third user interface 493 includes a home button 480 for returning to a home interface for plug-in 106 (e.g., first user interface 438 in Figure 4B), expand affordance 482 for adjusting the size of third user interface 493 and/or displaying third user interface 493 in full screen mode, options affordance 484 for adjusting configuration options for plug-in 106, new window affordance 486 for displaying third user interface 493 in a new window, and exit affordance 488 for exiting plug-in 106.
In Figure 4G, the third user interface 493 also includes toggle boxes for selecting test cases for execution (e.g., a column of check boxes preceding the number and test case name columns). In Figure 4G, the third user interface 493 further includes a plurality of options for the execution of each test case, including an affordance for adjusting a number of execution loops for a respective test case, a toggle affordance identifying whether to take screenshots during execution of the respective test case, a toggle affordance for selecting web browsers (e.g., Google Chrome™, Mozilla Firefox™, and Microsoft Internet Explorer™) on which to execute the respective test case, and a set of options for editing the respective test case, including a view test case affordance (e.g., corresponding to a magnifying glass icon), an edit test case icon (e.g., corresponding to a pad-and-pencil icon), and a delete test case icon (e.g., corresponding to a trashcan icon). In Figure 4G, the third user interface 493 further includes an execute affordance 498, which, when activated (e.g., with a tap gesture), causes plug-in 106 to cause execution of the test cases that have been selected for execution according to the plurality of options for execution by server system 108.
Figure 5 illustrates a flowchart diagram of a method 500 of automated web testing in accordance with some embodiments. In some embodiments, operations 502-510 of method 500 are performed by a device with one or more processors and memory, and operations 512-514 of method 500 are performed by a server with one or more processors and memory. For example, in some embodiments, operations 502-510 of method 500 are performed by client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3), and operations 512-514 of method 500 are performed by server system 108 (Figures 1-2) or a component thereof (e.g., test platform 214, Figure 2). In some embodiments, method 500 is governed by instructions that are stored in a non-transitory computer readable storage medium of the device and/or server, and the instructions are executed by one or more processors of the device and/or server.
In some embodiments, a web browser 104 (Figures 1 and 3) is executed on a client device 102, and web browser 104 includes a plug-in 106 that communicates with server system 108 through one or more networks 110 (Figure 1). Plug-in 106 provides client-side functionalities for the automated web page testing and debugging application and communication with server system 108. Server system 108 provides server-side functionalities for the automated web page testing and debugging application.
For a respective web page, client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3): identifies (502) web page components and extracts information associated with the web page components, where the extracted information comprises position and configuration information for the web page components; and displays graphical representations of the identified web page components. In some embodiments, client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) performs operation 502 in response to selection of an affordance corresponding to an automated test plug-in (e.g., selection of plug-in affordance 402 in Figure 4A). In some embodiments, displaying the graphical representations comprises: organizing the web page components according to the extracted information in JavaScript Object Notation (JSON), where the position is represented in the XML Path Language (XPath); and displaying the organized graphical representations.
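Operation 502 can be approximated in a short Python sketch: walk the page's markup, record each interactive component with an XPath-style position and its attribute-level configuration, and organize the result as JSON. This is an illustrative approximation, not the disclosed implementation; the set of tags treated as components and the output field names are assumptions.

```python
import json
from html.parser import HTMLParser

# Tags treated as testable components here are an assumption.
INTERACTIVE_TAGS = {"input", "button", "select", "textarea", "a"}

class ComponentExtractor(HTMLParser):
    """Collect interactive components with XPath positions and attributes."""
    def __init__(self):
        super().__init__()
        self.path = []        # stack of (tag, index-among-same-tag-siblings)
        self.counts = [{}]    # per-depth sibling counters
        self.components = []

    def handle_starttag(self, tag, attrs):
        n = self.counts[-1].get(tag, 0) + 1
        self.counts[-1][tag] = n
        self.path.append((tag, n))
        self.counts.append({})
        if tag in INTERACTIVE_TAGS:
            xpath = "/" + "/".join(f"{t}[{i}]" for t, i in self.path)
            self.components.append({
                "component_type": tag,
                "position": xpath,             # position as an XPath
                "configuration": dict(attrs),  # e.g. type, name, value
            })

    def handle_endtag(self, tag):
        if self.path and self.path[-1][0] == tag:
            self.path.pop()
            self.counts.pop()

page = "<html><body><input type='text' name='comment'/><button>Post</button></body></html>"
extractor = ComponentExtractor()
extractor.feed(page)
print(json.dumps(extractor.components, indent=2))
```

Each entry in the resulting JSON list can then back one graphical representation in the first user interface.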
Client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) acquires (504) a respective test flow, where the respective test flow is a sequence of one or more of the graphical representations. In some embodiments, acquiring the respective test flow comprises the following steps: detecting selection information inputted by the user of client device 102 for one or more of the graphical representations, where the selection information comprises an arrangement of one or more of the graphical representations into a flow sequence; and organizing the corresponding nodes into a test flow according to the selection information. For example, in Figure 4E, graphical representations 450, 452, 458, and 460 are arranged into a test flow sequence in second region 465.
Client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) acquires (506) a respective test case, where the respective test case is a sequence of one or more test flows at least including the respective test flow. For example, in Figure 4F, graphical representations 478-A and 478-B are arranged into a test case sequence in test case region 481.
Client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) acquires (508) a test task, where the test task is a sequence of one or more test cases at least including the respective test case. In some embodiments, each of the one or more test cases is transformed into test logic compiled as code, and the one or more test logics are compiled into a test task. For example, in Figure 4G, the test task includes test cases 1, 2, and 3 displayed in third user interface 493.
Client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) submits (510) the test task to server system 108 (Figures 1-2) or a component thereof (e.g., test platform 214, Figure 2). For example, client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3) pushes the test task to server system 108 or a component thereof (e.g., test platform 214, Figure 2) in JSON format.
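The flow/case/task hierarchy of operations 504-510 can be sketched as nested JSON. The field names below (`flow_name`, `steps`, `case_name`, and so on) are illustrative assumptions, not a schema taken from the disclosure.

```python
import json

# Illustrative sketch: a test flow is a sequence of component-level
# steps, a test case is a sequence of flows, and a test task is a
# sequence of cases pushed to the server as JSON.

def make_flow(name, steps):
    return {"flow_name": name, "steps": steps}

def make_case(name, flows):
    return {"case_name": name, "flows": flows}

def make_task(cases):
    return {"test_task": {"cases": cases}}

login_flow = make_flow("login", [
    {"xpath": "/html[1]/body[1]/input[1]", "action": "enter_text", "value": "user1"},
    {"xpath": "/html[1]/body[1]/button[1]", "action": "click"},
])
task = make_task([make_case("smoke test", [login_flow])])

payload = json.dumps(task)  # serialized for submission to the server
print(payload)
```

Because flows are saved independently (operation 712), the same flow dictionary can be reused across several cases without re-recording it.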
In some embodiments, after receiving the test task, server system 108 or a component thereof (e.g., test platform 214, Figure 2) executes (512) the test task or sends the test task to one or more test machines (e.g., test machine(s) 122, Figure 1) for execution. In some embodiments, server system 108 or a component thereof (e.g., test platform 214, Figure 2) performs the test task and captures screenshots of key operations of the test task. In some embodiments, the one or more test machines perform the test task, capture screenshots of key operations of the test task, and report test results and screenshot information to server system 108 or a component thereof (e.g., test platform 214, Figure 2). In some embodiments, the server uses a WebDriver to drive a browser in the one or more test machines to perform the test task.
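The disclosure leaves the replay details to WebDriver. As a rough, runnable sketch of how a test machine might iterate a task and capture a screenshot after each key operation, the following uses a stub driver in place of a real browser session; the driver methods and the task schema are assumptions carried over from the earlier sketches.

```python
# Illustrative sketch (not the disclosed implementation): replay each
# step of a test task through a driver, recording a screenshot after
# every key operation. StubDriver stands in for a real WebDriver.

class StubDriver:
    """Minimal stand-in for a browser automation driver."""
    def __init__(self):
        self.log = []

    def click(self, xpath):
        self.log.append(f"click {xpath}")

    def enter_text(self, xpath, value):
        self.log.append(f"type {value!r} into {xpath}")

    def screenshot(self):
        return f"screenshot-{len(self.log)}.png"  # placeholder image name

def run_task(task, driver):
    results = []
    for case in task["test_task"]["cases"]:
        for flow in case["flows"]:
            for step in flow["steps"]:
                if step["action"] == "click":
                    driver.click(step["xpath"])
                elif step["action"] == "enter_text":
                    driver.enter_text(step["xpath"], step["value"])
                results.append({"step": step, "screenshot": driver.screenshot()})
    return results

task = {"test_task": {"cases": [{"case_name": "smoke", "flows": [
    {"flow_name": "login", "steps": [
        {"xpath": "/html[1]/body[1]/input[1]", "action": "enter_text", "value": "user1"},
        {"xpath": "/html[1]/body[1]/button[1]", "action": "click"},
    ]}]}]}}
driver = StubDriver()
report = run_task(task, driver)
print(len(report), "steps executed")
```

With a real WebDriver session substituted for `StubDriver`, the same loop would locate each element by its XPath and report the step results and screenshots back to the server.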
In some embodiments, server system 108 or a component thereof (e.g., test platform 214, Figure 2) provides (514) the test results and the screenshot information for the test task to the device. In some embodiments, prior to providing the test results and the screenshot information for the test task to the device, the server performs intelligent analysis on the test results and provides the intelligent analysis and the screenshot information for the test task to the device.
It should be understood that the particular order in which the operations in Figure 5 have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., process 600 and method 700) are also applicable in an analogous manner to method 500 described above with respect to Figure 5.
Figure 6 illustrates a flow diagram of a process 600 for an automated web test in accordance with some embodiments. In some embodiments, process 600 is performed in a data processing environment (e.g., server-client environment 100, Figure 1) that includes a device with one or more processors and memory that is associated with a user (e.g., client device 102, Figures 1 and 3), a server with one or more processors and memory (e.g., server system 108, Figures 1-2), and optionally one or more test machines each with one or more processors and memory (e.g., test machine(s) 122, Figure 1). In some embodiments, a web browser 104 (Figures 1 and 3) is executed on a client device 102, and web browser 104 includes a plug-in 106 that communicates with server system 108 through one or more networks 110 (Figure 1). Plug-in 106 provides client-side functionalities for the automated web page testing and debugging application and communication with server system 108. Server system 108 provides server-side functionalities for the automated web page testing and debugging application.
A user of client device 102 accesses (602) a web page via web browser 104 and selects an affordance associated with automated test plug-in 106. For example, plug-in 106 detects selection of the affordance or receives an indication of selection of the affordance. In Figure 4A, for example, while displaying a web page for a website corresponding to web address bar 406 in web browser 104, client device 102 detects contact 436 at a location corresponding to plug-in affordance 402. In some embodiments, after detecting selection of an affordance associated with automated test plug-in 106, the web browser sends a trigger to initiate plug-in 106 or plug-in 106 detects selection of the affordance.
In response to the trigger, plug-in 106 identifies (604) web page components in the displayed web page, extracts location and configuration information associated with the web page components, displays graphical representations of the web page components, and organizes a test flow with the graphical representations of the web page components according to user inputs. For example, in Figure 4B, client device 102 displays a first user interface 438 of plug-in 106 that prompts the user of client device 102 to arrange a test flow in response to selection of plug-in affordance 402 in Figure 4A. Continuing with this example, first user interface 438 includes graphical representations 440-460 corresponding to the web page components of the web page displayed in Figure 4A. Figure 4E, for example, shows graphical representations 450, 452, 458, and 460 arranged in a test flow sequence in second region 465 of first user interface 438. For example, this test flow sequence is submitted in response to selection of “Submit Test Flow” affordance 462 in Figure 4E.
After organizing the test flow, plug-in 106 organizes (606) a test case at least including the test flow and organizes a test task at least including the test case. In Figure 4F, for example, client device 102 displays a second user interface 479 of plug-in 106 that prompts the user of client device 102 to arrange a test case. In Figure 4G, for example, client device 102 displays a third user interface 493 of plug-in 106 that prompts the user of client device 102 to execute one or more test cases. In some embodiments, after selecting one or more test cases for execution, plug-in 106 sends a test task comprising the one or more selected test cases to server system 108 or a component thereof (e.g., test platform 214, Figure 2). For example, in Figure 4G, in response to selection of execute affordance 498, plug-in 106 causes execution of the test cases that have been selected for execution according to the plurality of options for execution by server system 108 (e.g., by sending a test task including the one or more selected test cases to server system 108).
In some embodiments, after receiving the test task, server system 108 or a component thereof (e.g., workload determination module 218, Figure 2) determines the current workload of one or more test machines 122, and server system 108 or a component thereof (e.g., test execution module 220, Figure 2) sends the test cases in the test task to selected test machine(s) from among the one or more test machines 122 to execute the one or more test cases.
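One simple workload-based assignment is to send each test case to the machine currently running the fewest cases. The sketch below is illustrative only; the disclosure does not specify the workload metric, so "number of running cases" is an assumption.

```python
# Illustrative sketch (assumed metric: number of running cases per
# machine): assign each test case to the least-loaded test machine.

def assign_cases(cases, machine_loads):
    """Assign each case to the machine with the fewest running cases."""
    loads = dict(machine_loads)
    assignment = {}
    for case in cases:
        machine = min(loads, key=loads.get)
        assignment[case] = machine
        loads[machine] += 1  # that machine now runs one more case
    return assignment

print(assign_cases(["case1", "case2", "case3"],
                   {"machine-a": 2, "machine-b": 0}))
```

The greedy choice keeps the work roughly balanced without requiring the server to know how long each case will run.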
The test machine(s) 122 perform (608) the test task and capture screenshots for key operations of the test cases comprising the test task. After performing the one or more test cases comprising the test task, the test machine(s) 122 send the test results and screenshots to server system 108 or a component thereof (e.g., test platform 214, Figure 2).
In some embodiments, after receiving the test results and the screenshots, server system 108 or a component thereof (e.g., test result analyzing module 224, Figure 2) performs (610) intelligent analysis on the results of the one or more test cases. After performing the intelligent analysis, server system 108 or a component thereof sends the intelligent analysis and the screenshots to client device 102 or a component thereof (e.g., plug-in 106, Figures 1 and 3).
It should be understood that the particular order in which the operations in Figure 6 have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., methods 500 and 700) are also applicable in an analogous manner to process 600 described above with respect to Figure 6.
Figures 7A-7C illustrate a flowchart diagram of a method 700 of facilitating automated web page testing and debugging in accordance with some embodiments. In some embodiments, method 700 is performed by a device with one or more processors. For example, in some embodiments, method 700 is performed by client device 102 (Figures 1 and 3) or a component thereof (e.g., plug-in 106, Figures 1 and 3). In some embodiments, method 700 is governed by instructions that are stored in a non-transitory computer readable storage medium of the device, and the instructions are executed by one or more processors of the device.
While displaying a web page in a web browser executed on the device, the device detects (702) a user input activating a plug-in associated with web page testing and debugging. In Figure 4A, for example, client device 102 executes a web browser (e.g., web browser module 104, Figures 1 and 3) displaying a landing page for a website (e.g., a news aggregation outlet). In Figure 4A, the web browser also includes plug-in affordance 402, which, when activated (e.g., with a tap gesture), causes execution of a plug-in (e.g., plug-in 106, Figures 1 and 3). In Figure 4A, for example, client device 102 detects contact 436 at a location corresponding to plug-in affordance 402. For example, client device 102 or a component thereof (e.g., trigger detection module 330, Figure 3) triggers plug-in 106 in response to selection of plug-in affordance 402 in Figure 4A. In some embodiments, plug-in 106 is a client-side portion of an automated web page testing and debugging application.
In response to detecting the user input activating the plug-in, the device (704): identifies a plurality of web page components of the web page; and extracts respective location information and respective configuration information for the plurality of components of the web page. In some embodiments, in response to detecting the user input, client device 102 or a component thereof (e.g., component identifying module 332, Figure 3) identifies web page components for the web page displayed by web browser module 104 and, also, client device 102 or a component thereof (e.g., information extracting module 334, Figure 3) extracts location information and configuration information (e.g., component type, what kinds of input are expected for the web page component, appearance, size, etc.) for the identified web page components.
For example, in response to detecting selection of plug-in affordance 402 in Figure 4B, plug-in 106 identifies web page components for the web page displayed in Figure 4A and also extracts location information and configuration information for each of the identified web page components. For example, the location information indicates an HTML tag for the component within the web page or the coordinates of the component within the web page. For example, the configuration information includes information associated with the web page component such as a “component type” (e.g., whether it is a text input field, a button, an icon, a link to a web address, a script, etc.), a “description text” (e.g., the text or description associated with the component), or an expected “action type” (e.g., whether it can be clicked, dragged, copied, deleted, enlarged, zoomed, rotated, or any restrictions imposed on the action (e.g., only numerical inputs, only Roman character input, only zoomed to a certain level, only dragged within a given window, etc.)).
The device displays (706) a first user interface corresponding to the plug-in, the first user interface including respective graphical representations (i.e., nodes) for the plurality of components of the web page. In some embodiments, client device 102 or a component thereof (e.g., generating module 338, Figure 3) generates graphical representations for the identified web page components that correspond to test scripts for those components, and client device 102 or a component thereof (e.g., user interface (UI) displaying module 336, Figure 3) displays a first user interface for arranging a test flow. In Figure 4B, for example, client device 102 displays a first user interface 438 of plug-in 106 that prompts the user of client device 102 to arrange a test flow in response to selection of plug-in affordance 402 in Figure 4A. In Figure 4B, first user interface 438 includes a first region 439 with a plurality of graphical representations 440-460 corresponding to the web page components of the web page displayed in Figure 4A. For example, with reference to Figure 4B, the user of client device 102 arranges a test flow in second region 465 of first user interface 438 by dragging graphical representations from first region 439 into second region 465, where the user may further reorder the sequence of graphical representations and/or remove graphical representations from the sequence.
In some embodiments, the graphical representations are (708) associated with respective test script operations corresponding to the two or more components of the web page. In some embodiments, the graphical representations correspond to test script operations corresponding to web page components. In some embodiments, the test script is based on the extracted location and configuration information. In some embodiments, the user can interact with the graphical representation to further configure the test script operations corresponding to the web page components. For example, if a web page component is a text input field, the graphical representation for the web page component corresponds to test script operations for filling out the text input field with certain text input, and the graphical representation can provide options for the user to select which types of test text input to use for the test script operations.
In Figure 4D, for example, client device 102 displays options panel 466 in response to selection of comment field representation 452 in Figure 4C. In Figure 4D, options panel 466 allows the user of client device 102 to edit options associated with comment field representation 452 for the test flow. In Figure 4D, options panel 466 includes: (A) “Enter Test Text” affordance 468, which, when activated (e.g., with a tap gesture), causes client device 102 to display a virtual keyboard for entering test text for executing the test script corresponding to comment field representation 452; (B) “Remove from Test Flow” affordance 470, which, when activated (e.g., with a tap gesture), causes plug-in 106 to remove comment field representation 452 from the test flow (e.g., shown in second region 465 in Figure 4C); and (C) “Other Options” affordance 472, which, when activated (e.g., with a tap gesture), causes client device 102 to display a set of options for adjusting and/or manipulating the test script corresponding to comment field representation 452 and the like.
The device detects (710) one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of components of the web page into a respective test flow. For example, the user is able to arrange one or more of the plurality of graphical representations into a custom test flow sequence. Some graphical representations may not be used to create the test flow. In some embodiments, the plug-in breaks the web page into nodes, and the user selects which nodes to manipulate and include in the test flow by dragging them into the first user interface. In some embodiments, the user further interacts with each of the graphical representations to configure the node before it is added to the test flow. For example, the graphical representation can provide a drop-down menu for the user to select the configuration options available for the node corresponding to the graphical representation. For example, after dragging some of the graphical representations shown in Figure 4B from first region 439 into second region 465, Figure 4C shows client device 102 displaying graphical representations 450, 452, 458, and 460 arranged in a test flow sequence in second region 465 of first user interface 438.
After detecting a user input to submit the respective test flow, the device saves (712) the respective test flow to a test flow database including zero or more previously submitted test flows, where the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time. In Figure 4E, for example, in response to selection of “Submit Test Flow” affordance 462, client device 102 or a component thereof (e.g., saving module 340, Figure 3) locally saves the test flow arranged in second region 465 to test flow(s) library 362 (Figure 3) and/or submits the test flow arranged in second region 465 to server system 108, where the test flow is saved remotely in test flow/case library 114 (Figures 1-2).
In some embodiments, the device (714): detects one or more manual test inputs provided by the user to the web page; detects a change in the web page displayed in the web browser in response to the one or more manual test inputs; identifies a second plurality of web page components based on the change in the web page; and displays respective graphical representations for the second plurality of web page components in the first user interface. For example, operation 714 generates graphical representations for expected results of certain test inputs. These graphical representations can be compiled and associated with a test case or test flow. For example, the change can be the loading of a new page, or an update made to the current web page. For example, the loading of a new page can be indicated by the change of URL of the page (e.g., a login-success page). For example, an update in the page can be a modification to a portion of the page (e.g., the number of items shown in the shopping cart, etc.).
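Detecting the second plurality of components can reduce to re-running the component identification after the manual input and diffing against the components seen before. The following sketch is illustrative only; it reuses the `position`/`component_type` record shape assumed in the earlier sketches.

```python
# Illustrative sketch of operation 714 (not the disclosed
# implementation): after a manual test input, re-identify the page's
# components and treat anything new as a candidate expected-result
# representation.

def detect_change(before, after):
    """Return components present after the input but not before."""
    seen = {c["position"] for c in before}
    return [c for c in after if c["position"] not in seen]

before = [{"position": "/html[1]/body[1]/input[1]", "component_type": "input"}]
after = before + [{"position": "/html[1]/body[1]/div[1]", "component_type": "banner"}]

new_components = detect_change(before, after)
print(new_components)
```

A URL change (e.g., navigation to a login-success page) can be handled the same way, as a separate field compared before and after the input.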
In some embodiments, the device detects (716) one or more additional user inputs to select at least one of the respective representations for the second plurality of web page components in the first user interface as an expected result for the respective test flow and, in response to the one or more additional user inputs to submit the expected result for the respective test flow, saves the expected result in association with the respective test flow in the test flow database. The one or more additional inputs are detected when plug-in 106 is active. In some embodiments, there is an affordance in the user interface of plug-in 106 (e.g., not shown in the user interfaces in Figures 4B and 4F-4G) for showing a user interface for recording result representations, similar to the interface for recording test flows. In some embodiments, the same user interface (e.g., the user interface in Figures 4B and 4F-4G) can be used for building the test flow and adding the result representations for the test flow. For example, the manual test inputs can be a series of inputs, such as providing a user name and password on a login page, or adding one or more items to a shopping cart on a shopping web page. In some embodiments, the expected results are saved locally in expected results library 366 (Figure 3) and/or remotely in test results database 116 (Figures 1-2).
In some embodiments, the expected result is (718) used to analyze test results generated by the respective test case when the respective test case is run by a machine. In some embodiments, after running a test case or test flow, server system 108 compares test results to the expected results to verify whether the test result was normal. In some embodiments, after receiving test results from server system 108, client device 102 or a component thereof (e.g., test results receiving module 346, Figure 3) compares the received test results to the expected results to verify whether the test result was normal.
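The comparison itself can be a field-by-field check of recorded outcomes against actual outcomes. The sketch below is illustrative; the outcome encoding (a URL plus page-state fields) is an assumption, since the disclosure does not fix a result format.

```python
# Illustrative sketch (assumed outcome encoding): a test result is
# "normal" when every expected outcome recorded for the flow matches
# the actual result reported by the test machine.

def verify(expected, actual):
    """Report which expected outcomes the test run satisfied."""
    return {key: actual.get(key) == value for key, value in expected.items()}

expected = {"url": "https://example.com/login-success", "cart_items": 1}
actual = {"url": "https://example.com/login-success", "cart_items": 2}

report = verify(expected, actual)
print(report)  # {'url': True, 'cart_items': False}
```

Any `False` entry flags the corresponding key operation for review alongside its captured screenshot.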
In some embodiments, the device (720): displays a second user interface corresponding to the plug-in that includes the respective test flow and one or more previously submitted test flows; while displaying the second user interface, detects one or more user inputs to arrange at least one of the respective test flow and one or more previously submitted test flows into a respective test case; and, after detecting a user input to submit the respective test case, saves the respective test case to a test case database including zero or more previously submitted test cases, where the test case database is configured to provide the respective test case for execution by a test machine. In some embodiments, the second user interface is provided immediately after the submission of the respective test flow, or at a later time when the library is opened. In Figure 4F, for example, client device 102 displays a second user interface 479 of plug-in 106 that prompts the user of client device 102 to arrange a test case. For example, second user interface 479 is displayed in response to selection of “Submit Test Flow” affordance 462 in Figure 4E. In another example, second user interface 479 is displayed at a time subsequent to Figure 4E in response to selection of plug-in affordance 402 in the web browser (e.g., in Figure 4A) and, then, selection of “Other Options” affordance 463 in first user interface 438 of plug-in 106 (e.g., in Figure 4B).
In Figure 4F, second user interface 479 includes flow library region 477 with graphical representations 478-A, 478-B, and 478-C of previously submitted test flows (e.g., stored in test flow(s) library 362 (Figure 3) and/or test flow/case library 114 (Figures 1-2)) and test case region 481 for arranging a test case with graphical representations 478-A, 478-B, and 478-C. For example, in Figure 4F, the user of client device 102 arranges a test case in test case region 481 by dragging graphical representations 478-A, 478-B, and 478-C from flow library region 477 into test case region 481.
In some embodiments, the first and second user interfaces displayed within the web browser are (722) overlaid on the web page. For example, the first and second user interfaces are pop-up or floating windows that can be resized and moved. In some embodiments, the first and second user interfaces are displayed in a window distinct from the web browser. For example, in Figure 4B, first user interface 438 is displayed within web browser 104 and overlaid on the web page displayed in Figure 4A. Similarly, in Figure 4F, second user interface 479 is displayed within web browser 104 and overlaid on the web page displayed in Figure 4A.
In some embodiments, the device displays (724) a third user interface corresponding to the plug-in that includes the respective test case and the zero or more previously submitted test cases, where the third user interface includes a plurality of options for executing each of the respective test case and the one or more previously submitted test cases. In Figure 4G, for example, client device 102 displays a third user interface 493 of plug-in 106 that prompts the user of client device 102 to execute one or more test cases in response to, or at a time subsequent to, selection of submit affordance 492 in Figure 4F. In some embodiments, the user is able to prioritize the order in which test cases are run, set the number of iterations per test case, select the browser on which to run each test case, remove test cases from the execution list, and/or choose whether to capture screenshots during execution of the test cases. For example, in Figure 4G, the third user interface 493 includes options for the execution of test cases 1, 2, and 3, including adjusting the execution order of the test cases, toggling execution of individual test cases, adjusting the number of loops of execution for each test case, toggling screenshots during execution, and selecting the web browsers in which to execute test cases 1, 2, and 3.
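The per-case execution options described above (priority order, on/off toggle, loop count, screenshot capture, target browsers) can be sketched as a small data model. This is an illustrative assumption about how such options might be represented, not the disclosed implementation; all field and case names are hypothetical:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ExecutionOptions:
    """Per-test-case settings comparable to those offered in a
    third user interface: priority order, enable/disable toggle,
    number of execution loops, screenshot capture, and browsers."""
    order: int = 0
    enabled: bool = True
    loops: int = 1
    screenshots: bool = False
    browsers: Tuple[str, ...] = ("chrome",)

def build_execution_list(options_by_case: Dict[str, ExecutionOptions]) -> List[str]:
    """Return the names of enabled test cases sorted by priority order;
    disabled cases are effectively removed from the execution list."""
    enabled = [(opts.order, name)
               for name, opts in options_by_case.items() if opts.enabled]
    return [name for _, name in sorted(enabled)]

opts = {
    "case-1": ExecutionOptions(order=2, loops=3),
    "case-2": ExecutionOptions(order=1, screenshots=True),
    "case-3": ExecutionOptions(order=3, enabled=False),  # removed from the list
}
execution_list = build_execution_list(opts)
```

Here `execution_list` contains only the enabled cases, ordered by their priority setting.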
In some embodiments, after detecting a user input to execute at least one of the respective test case and zero or more previously submitted test cases, the device sends (726) the at least one of the respective test case and the one or more previously submitted test cases to a testing platform. In some embodiments, device 102 or a component thereof (e.g., submitting module 344, Figure 3) sends the one or more test cases to server system 108 for execution. For example, the third user interface 493 in Figure 4G further includes an execute affordance 498, which, when activated (e.g., with a tap gesture), causes plug-in 106 to initiate execution of the test cases that have been selected for execution according to the plurality of options for execution by server system 108. For example, in some embodiments, in response to selection of execute affordance 498 in Figure 4G, plug-in 106 sends the selected test cases to server system 108 for execution. In some embodiments, server system 108 or a component thereof (e.g., test execution module 220, Figure 2) executes the selected test cases or causes one or more test machines 122 to execute the selected test cases. In some embodiments, server system 108 sends the test cases to various test machines 122 to execute the test cases based on the work load of the test machines 122 as determined by workload determination module 218 (Figure 2).
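Workload-based dispatch of test cases to test machines can be sketched with a simple greedy policy: each case goes to the machine with the smallest current load. This is one plausible policy a workload determination module might apply, offered as an illustration only; machine names and the unit-cost assumption are hypothetical:

```python
from typing import Dict, List

def pick_least_loaded(machines: Dict[str, int]) -> str:
    """Choose the test machine with the smallest current workload."""
    return min(machines, key=machines.get)

def dispatch(test_cases: List[str], machines: Dict[str, int]) -> Dict[str, str]:
    """Greedily assign each test case to the least-loaded machine,
    assuming each case adds one unit of load to its machine."""
    assignment = {}
    for case in test_cases:
        target = pick_least_loaded(machines)
        assignment[case] = target
        machines[target] += 1  # account for the newly assigned case
    return assignment

# Three test machines with differing initial workloads.
machines = {"tm-1": 2, "tm-2": 0, "tm-3": 1}
plan = dispatch(["case-1", "case-2", "case-3"], machines)
```

Each assignment updates the tracked load, so later cases see the effect of earlier dispatch decisions.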
In some embodiments, after sending the at least one of the respective test case and the one or more previously submitted test cases to the testing platform, the device (728): obtains results from the testing platform for the at least one of the respective test case and the one or more previously submitted test cases; and displays a fourth user interface with the results obtained from the testing platform. In some embodiments, client device 102 or a component thereof (e.g., test results receiving module, Figure 3) saves the test results in test results library 368 (Figure 3) for future viewing. In some embodiments, the test results are emailed or sent to the user via another communication method (e.g., SMS, MMS, or the like).
In some embodiments, the results obtained from the testing platform further include (730) intelligent analysis on test results for the at least one of the respective test case and the one or more previously submitted test cases and one or more screenshots corresponding to execution of the at least one of the respective test case and the one or more previously submitted test cases. In some embodiments, after causing execution of the one or more test cases and obtaining test results and/or screenshots for the one or more test cases, server system 108 or a component thereof (e.g., test result analyzing module 224, Figure 2) performs intelligent analysis on the results of the one or more test cases, the screenshots, and, in some circumstances, the expected results for the one or more test cases submitted by the user of client device 102.
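One simple form such analysis could take is comparing each case's actual outcome against the user-submitted expected result. The sketch below is a simplified stand-in for a test-result analysis step, not the disclosed module; the verdict labels and sample outcomes are illustrative:

```python
from typing import Dict

def analyze(results: Dict[str, str], expected: Dict[str, str]) -> Dict[str, str]:
    """Classify each test case by comparing its actual outcome to the
    user-submitted expected result: 'pass', 'fail', or 'no-expectation'
    when the user supplied no expected result for that case."""
    verdicts = {}
    for case, actual in results.items():
        if case not in expected:
            verdicts[case] = "no-expectation"
        elif actual == expected[case]:
            verdicts[case] = "pass"
        else:
            verdicts[case] = "fail"
    return verdicts

actual = {"case-1": "Welcome, user!", "case-2": "Error 500"}
expected = {"case-1": "Welcome, user!", "case-2": "Search results"}
verdicts = analyze(actual, expected)
```

A fuller implementation would also attach the screenshots captured during execution to each verdict for review.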
It should be understood that the particular order in which the operations in Figures 7A-7C have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., method 500 and process 600) are also applicable in an analogous manner to method 700 described above with respect to Figures 7A-7C.
While particular embodiments are described above, it will be understood that it is not intended to limit the application to these particular embodiments. On the contrary, the application includes alternatives, modifications and equivalents that are within the spirit and scope of the appended claims. Numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one of ordinary skill in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

Claims (20)

  1. A method of facilitating automated web page testing and debugging, the method comprising:
    at a device with one or more processors and memory:
    while displaying a web page in a web browser executed on the device, detecting a user input activating a plug-in associated with web page testing and debugging;
    in response to detecting the user input activating the plug-in:
    identifying a plurality of web page components of the web page; and
    extracting respective location information and respective configuration information for the plurality of web page components of the web page;
    displaying a first user interface corresponding to the plug-in, the first user interface including respective graphical representations for the plurality of web page components of the web page;
    detecting one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of web page components of the web page into a respective test flow; and
    after detecting a user input to submit the respective test flow, saving the respective test flow to a test flow database including zero or more previously submitted test flows, wherein the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time.
  2. The method of claim 1, wherein the graphical representations are associated with respective test script operations corresponding to the two or more components of the web page.
  3. The method of any of claims 1-2, further comprising:
    displaying a second user interface corresponding to the plug-in that includes the respective test flow and one or more previously submitted test flows;
    while displaying the second user interface, detecting one or more user inputs to arrange at least one of the respective test flow and one or more previously submitted test flows into a respective test case; and 
    after detecting a user input to submit the respective test case, saving the respective test case to a test case database including zero or more previously submitted test cases, wherein the test case database is configured to provide the respective test case for execution by a test machine.
  4. The method of claim 3, further comprising:
    displaying a third user interface corresponding to the plug-in that includes the respective test case and the zero or more previously submitted test cases, wherein the third user interface includes a plurality of options for executing each of the respective test case and the one or more previously submitted test cases.
  5. The method of claim 4, further comprising:
    after detecting a user input to execute at least one of the respective test case and zero or more previously submitted test cases, sending the at least one of the respective test case and the one or more previously submitted test cases to a testing platform.
  6. The method of any of claims 1-5, further comprising:
    detecting one or more manual test inputs provided by the user to the web page;
    detecting a change in the web page displayed in the web browser in response to the one or more manual test inputs;
    identifying a second plurality of web page components based on the change in the web page; and
    displaying respective graphical representations for the second plurality of web page components in the first user interface.
  7. The method of claim 6, further comprising:
    detecting one or more additional user inputs to select at least one of the respective representations for the second plurality of web page components in the first user interface as an expected result for the respective test flow; and
    in response to the one or more additional user inputs to submit the expected result for the respective test flow, saving the expected result in association with the respective test flow in the test flow database.
  8. A device, comprising:
    one or more processors; and
    memory storing one or more programs to be executed by the one or more processors, the one or more programs comprising instructions for:
    while displaying a web page in a web browser executed on the device, detecting a user input activating a plug-in associated with web page testing and debugging;
    in response to detecting the user input activating the plug-in:
    identifying a plurality of web page components of the web page; and
    extracting respective location information and respective configuration information for the plurality of web page components of the web page;
    displaying a first user interface corresponding to the plug-in, the first user interface including respective graphical representations for the plurality of web page components of the web page;
    detecting one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of web page components of the web page into a respective test flow; and
    after detecting a user input to submit the respective test flow, saving the respective test flow to a test flow database including zero or more previously submitted test flows, wherein the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time.
  9. The device of claim 8, wherein the graphical representations are associated with respective test script operations corresponding to the two or more components of the web page.
  10. The device of any of claims 8-9, wherein the one or more programs further comprise instructions for:
    displaying a second user interface corresponding to the plug-in that includes the respective test flow and one or more previously submitted test flows;
    while displaying the second user interface, detecting one or more user inputs to arrange at least one of the respective test flow and one or more previously submitted test flows into a respective test case; and 
    after detecting a user input to submit the respective test case, saving the respective test case to a test case database including zero or more previously submitted test cases, wherein the test case database is configured to provide the respective test case for execution by a test machine.
  11. The device of claim 10, wherein the one or more programs further comprise instructions for:
    displaying a third user interface corresponding to the plug-in that includes the respective test case and the zero or more previously submitted test cases, wherein the third user interface includes a plurality of options for executing each of the respective test case and the one or more previously submitted test cases.
  12. The device of claim 11, wherein the one or more programs further comprise instructions for:
    after detecting a user input to execute at least one of the respective test case and zero or more previously submitted test cases, sending the at least one of the respective test case and the one or more previously submitted test cases to a testing platform.
  13. The device of any of claims 8-12, wherein the one or more programs further comprise instructions for:
    detecting one or more manual test inputs provided by the user to the web page;
    detecting a change in the web page displayed in the web browser in response to the one or more manual test inputs;
    identifying a second plurality of web page components based on the change in the web page; and
    displaying respective graphical representations for the second plurality of web page components in the first user interface.
  14. The device of claim 13, wherein the one or more programs further comprise instructions for:
    detecting one or more additional user inputs to select at least one of the respective representations for the second plurality of web page components in the first user interface as an expected result for the respective test flow; and 
    in response to the one or more additional user inputs to submit the expected result for the respective test flow, saving the expected result in association with the respective test flow in the test flow database.
  15. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a device with one or more processors, cause the device to perform operations comprising:
    while displaying a web page in a web browser executed on the device, detecting a user input activating a plug-in associated with web page testing and debugging;
    in response to detecting the user input activating the plug-in:
    identifying a plurality of web page components of the web page; and
    extracting respective location information and respective configuration information for the plurality of web page components of the web page;
    displaying a first user interface corresponding to the plug-in, the first user interface including respective graphical representations for the plurality of web page components of the web page;
    detecting one or more user inputs to select and arrange the respective graphical representations for two or more of the plurality of web page components of the web page into a respective test flow; and
    after detecting a user input to submit the respective test flow, saving the respective test flow to a test flow database including zero or more previously submitted test flows, wherein the test flow database is configured to provide the respective test flow for use in constructing one or more test cases at a later time.
  16. The non-transitory computer readable storage medium of claim 15, wherein the instructions cause the device to perform operations further comprising:
    displaying a second user interface corresponding to the plug-in that includes the respective test flow and one or more previously submitted test flows;
    while displaying the second user interface, detecting one or more user inputs to arrange at least one of the respective test flow and one or more previously submitted test flows into a respective test case; and
    after detecting a user input to submit the respective test case, saving the respective test case to a test case database including zero or more previously submitted test cases, wherein  the test case database is configured to provide the respective test case for execution by a test machine.
  17. The non-transitory computer readable storage medium of claim 16, wherein the instructions cause the device to perform operations further comprising:
    displaying a third user interface corresponding to the plug-in that includes the respective test case and the zero or more previously submitted test cases, wherein the third user interface includes a plurality of options for executing each of the respective test case and the one or more previously submitted test cases.
  18. The non-transitory computer readable storage medium of claim 17, wherein the instructions cause the device to perform operations further comprising:
    after detecting a user input to execute at least one of the respective test case and zero or more previously submitted test cases, sending the at least one of the respective test case and the one or more previously submitted test cases to a testing platform.
  19. The non-transitory computer readable storage medium of any of claims 15-18, wherein the instructions cause the device to perform operations further comprising:
    detecting one or more manual test inputs provided by the user to the web page;
    detecting a change in the web page displayed in the web browser in response to the one or more manual test inputs;
    identifying a second plurality of web page components based on the change in the web page; and
    displaying respective graphical representations for the second plurality of web page components in the first user interface.
  20. The non-transitory computer readable storage medium of claim 19, wherein the instructions cause the device to perform operations further comprising:
    detecting one or more additional user inputs to select at least one of the respective representations for the second plurality of web page components in the first user interface as an expected result for the respective test flow; and
    in response to the one or more additional user inputs to submit the expected result for the respective test flow, saving the expected result in association with the respective test flow in the test flow database.
PCT/CN2014/085934 2013-09-22 2014-09-04 Method and system for facilitating automated web page testing WO2015039566A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310432090.7 2013-09-22
CN201310432090.7A CN104461855B (en) 2013-09-22 2013-09-22 A kind of Web automated testing method, system and device

Publications (1)

Publication Number Publication Date
WO2015039566A1 2015-03-26

Family

ID=52688225


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107783902A (en) * 2017-09-26 2018-03-09 甘肃万维信息技术有限责任公司 A kind of Selenium automated testing methods and system from coding
CN108399129A (en) * 2018-02-28 2018-08-14 车智互联(北京)科技有限公司 H5 page method for testing performance
CN108595339A (en) * 2018-05-09 2018-09-28 成都致云科技有限公司 Automated testing method, apparatus and system
CN108595321A (en) * 2018-04-04 2018-09-28 北京潘达互娱科技有限公司 A kind of application testing method and device
CN108628741A (en) * 2018-04-10 2018-10-09 平安科技(深圳)有限公司 Webpage test method, device, electronic equipment and medium
CN109918288A (en) * 2019-01-16 2019-06-21 北京互金新融科技有限公司 Use-case test method and device
CN115145464A (en) * 2022-07-28 2022-10-04 重庆长安汽车股份有限公司 Page testing method and device, electronic equipment and storage medium

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104750471B (en) * 2013-12-30 2020-05-05 格尔软件股份有限公司 WEB page performance detection, acquisition and analysis plug-in and method based on browser
CN104991777B (en) * 2015-07-14 2018-04-13 普元信息技术股份有限公司 Realize that web application automatic test view melts the system and method for hair
CN106547679B (en) * 2015-09-17 2021-03-23 腾讯科技(深圳)有限公司 Script management method and script management platform
CN105183657A (en) * 2015-09-30 2015-12-23 上海斐讯数据通信技术有限公司 System and method for testing WEB interface
CN106897204A (en) * 2015-12-17 2017-06-27 中国电信股份有限公司 The automatic monitoring method and system of operation flow
CN106970870B (en) * 2016-01-14 2023-02-24 腾讯科技(北京)有限公司 Webpage test platform, webpage test method and webpage test system
CN106201618B (en) * 2016-07-14 2019-03-05 中电长城网际系统应用有限公司 A kind of APP task executing method and system based on inserting mechanism
CN106874204A (en) * 2017-02-15 2017-06-20 广州神马移动信息科技有限公司 Automatic test method for customizing and custom-built system
CN107688529B (en) * 2017-02-20 2020-07-21 平安科技(深圳)有限公司 Component debugging method and device
CN106878328A (en) * 2017-03-22 2017-06-20 福建中金在线信息科技有限公司 A kind of website testing method and device
CN107145448A (en) * 2017-05-09 2017-09-08 携程旅游信息技术(上海)有限公司 Test middleware, test system and method based on selenium
CN108255702A (en) * 2017-09-21 2018-07-06 平安科技(深圳)有限公司 A kind of test case creation method, apparatus, equipment and storage medium
CN107665171B (en) * 2017-10-11 2020-08-04 中国民生银行股份有限公司 Automatic regression testing method and device
CN109960624A (en) * 2017-12-26 2019-07-02 航天信息股份有限公司 A kind of test method and system of JsDriver
CN110347577B (en) * 2018-04-04 2024-04-09 阿里巴巴集团控股有限公司 Page testing method, device and equipment thereof
CN108845929A (en) * 2018-05-07 2018-11-20 北京三快在线科技有限公司 Page performance test method and apparatus
CN109857668A (en) * 2019-02-03 2019-06-07 苏州市龙测智能科技有限公司 UI automated function test method, test device, test equipment and storage medium
CN110297759B (en) * 2019-05-22 2022-04-12 深圳壹账通智能科技有限公司 Method, device, equipment and storage medium for manufacturing test page script
CN113360365B (en) * 2020-03-03 2024-04-05 北京同邦卓益科技有限公司 Flow test method and flow test system
CN111752828A (en) * 2020-06-04 2020-10-09 武汉迎风聚智科技有限公司 Performance test method and device for Web application

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090006897A1 (en) * 2007-06-27 2009-01-01 Microsoft Corporation Automated service testing
CN101339532A (en) * 2007-07-06 2009-01-07 中国银联股份有限公司 Web application system automatized test method and apparatus
US20130042222A1 (en) * 2011-08-08 2013-02-14 Computer Associates Think, Inc. Automating functionality test cases
CN103309806A (en) * 2013-05-03 2013-09-18 上海证券交易所 Device and method for fast developing and testing

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101241466B (en) * 2007-02-08 2010-09-29 深圳迈瑞生物医疗电子股份有限公司 Embedded software test method and system
CN103268226B (en) * 2013-05-17 2016-07-06 瑞斯康达科技发展股份有限公司 A kind of test script file generates method and device



Also Published As

Publication number Publication date
CN104461855A (en) 2015-03-25
CN104461855B (en) 2019-03-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14846386

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14846386

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 01/06/2016)