WO2012027886A1 - Platform specific application test - Google Patents

Platform specific application test

Info

Publication number
WO2012027886A1
Authority
WO
WIPO (PCT)
Prior art keywords
target device
development
software application
output
operating environment
Prior art date
Application number
PCT/CN2010/076489
Other languages
French (fr)
Inventor
Feng He
Original Assignee
Google Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google Inc. filed Critical Google Inc.
Priority to PCT/CN2010/076489 priority Critical patent/WO2012027886A1/en
Publication of WO2012027886A1 publication Critical patent/WO2012027886A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3696Methods or tools to render software testable
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W24/00Supervisory, monitoring or testing arrangements
    • H04W24/06Testing, supervising or monitoring using simulated traffic
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W88/00Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02Terminal devices

Definitions

  • the disclosure relates generally to techniques for providing testing and test verification of applications on computing device platforms.
  • Mobile devices including cell phones, smart phones, smart books, net books, tablet computers, and the like, have become increasingly prevalent in recent years.
  • mobile devices may operate using a different operating environment (e.g., operating system) that is specifically catered to use less computing power and/or less electrical power (e.g., to reduce reliance on a device battery).
  • Many mobile devices also employ unique mechanisms for user input, such as utilizing multi-touch gestures on a device display to detect user input as opposed to a more classical keyboard and mouse/touchpad combination.
  • This disclosure is directed to techniques for the verification of application software on a target device on which the application software is to be executed.
  • the techniques of this disclosure may be advantageous, because application software may be tested and automatically verified by a development device without requiring any additional hardware to act as an interpreter between a development device and an actual output of the target device. Furthermore, this disclosure is directed to advantageous techniques for the comparison of actual device output to desired output generated by a development device.
  • a method is described herein. The method includes executing, on a development device and according to a test program, a software application to generate a desired output for the software application.
  • the software application is configured to execute on a target device with a different operating environment than the development device.
  • the method further includes receiving, from the target device and by the development device, an actual output of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device.
  • the one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device.
  • the method further includes comparing, by the development device, the actual output to the desired output to verify execution of the software application on the target device.
  • the method further includes providing a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
  • a development computing device includes a memory/storage module configured to store a software application configured to operate on a target device that uses a different operating environment than the development computing device.
  • the development computing device further includes a desired output generator configured to execute the software application on the development computing device according to a test program to generate a desired output for the software application.
  • the development computing device uses one or more characteristics of the operating environment of the development computing device that are different than one or more characteristics of the operating environment of the target device to generate the desired output for the software application.
  • the development computing device further includes means for receiving, from the target device, an actual output of the target device generated based on execution of the software application according to the test program and using one or more characteristics installed on the target device.
  • the one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device.
  • the development computing device further includes a test verification module configured to compare the actual output to the desired output to verify execution of the software application on the target device.
  • the development device is configured to provide a user with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
  • an article of manufacture comprising a computer-readable storage medium storing instructions.
  • the computer-readable storage medium stores instructions that cause a computing device to execute, on a development device and according to a test program, a software application to generate a desired output for the software application.
  • the software application is configured to execute on a target device with a different operating environment than the development device.
  • the computer-readable storage medium also stores instructions that cause the computing device to receive, from the target device and by the development device, an actual output of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device.
  • the one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device.
  • the computer-readable storage medium also stores instructions that cause the computing device to compare, by the development device, the actual output to the desired output to verify execution of the software application on the target device.
  • the computer-readable storage medium also stores instructions that cause the computing device to provide a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
  • FIG. 1 is a conceptual diagram illustrating one example of a development device and a target device consistent with the techniques of this disclosure.
  • FIG. 2 is a block diagram illustrating one example of a development device and a target device consistent with the techniques of this disclosure.
  • FIG. 3 is a flowchart diagram illustrating one example of a method of testing a software application on a target device consistent with the techniques of this disclosure.
  • FIGS. 4-6 are conceptual diagrams that illustrate various examples of techniques for verifying actual output of a target device consistent with this disclosure.
  • FIG. 1 is a conceptual diagram illustrating one example of application software testing consistent with the techniques of this disclosure.
  • FIG. 1 shows a development device 110 and at least one target device 120.
  • the development device 110 may comprise a classical and/or desktop computing device as shown in FIG. 1.
  • Development device 110 as shown in FIG. 1 is provided merely for exemplary purposes; other types of computing devices are also contemplated as a development device.
  • development device 110 may be any type of device, including a mobile device such as a cellular phone, smartphone, netbook, laptop computer, tablet computer, or any similar device.
  • Development device 110 may instead be any other type of computing device, such as a mainframe computing device.
  • development device 110 may differ from target device 120 in that development device 110 uses a different operating environment than target device 120.
  • an operating environment 115 of development device 110 may differ from an operating environment 125 of target device 120 in that development device 110 uses a different operating system than target device 120 (e.g., operating systems 112, 122).
  • Non-limiting examples of operating systems 112, 122 that may be used by either of a development device 110 or a target device 120 include Microsoft's Windows® (XP®, Vista®, Windows 7®), Apple's MacOS® (Tiger®, Leopard®, Snow Leopard®), and Linux.
  • Other non-limiting examples of operating systems 112, 122 include Google's Android®, Microsoft's Windows Mobile®, Nokia's Symbian®, Palm's WebOS®, and Apple's iPhone OS®.
  • development device 110 includes different hardware (e.g., processor, signal processing, memory, storage, communications capabilities, peripheral devices, and other like hardware differences) than target device 120.
  • a target device 120 as described herein may differ from development device 110 in that it may have more or less computing power, short-term memory (e.g., volatile memory such as random access memory (RAM)), or long-term memory (e.g., magnetic hard disc, Flash memory, or other non-volatile storage components) than development device 110.
  • respective operating environments 115, 125 of target device 120 and development device 110 may also or instead differ in that they are configured to execute different non-operating system software.
  • target device 120 may include a computer readable storage medium (e.g., short or long-term data storage) that stores instructions to cause the target device 120 (e.g., a processor of target device 120) to perform operations. The same instructions may not be stored on a computer readable storage medium of development device 110.
  • Development device 110 and target device 120 may further or instead store different data.
  • software executable by respective processors of the development 110 and target 120 devices may utilize different data to cause functions to be performed by the devices.
  • Respective operating environments 115, 125 of target device 120 and development device 110 may differ in ways instead of or in addition to any of the differences described above.
  • the operating environments 115, 125 may differ in any combination of different peripheral or internal hardware, software, or firmware.
  • FIG. 1 also illustrates a few examples of target devices 120.
  • the examples shown in FIG. 1 include a net book or smart book computing device, a cellular or smart phone, and a tablet computer.
  • the target devices 120 shown in FIG. 1 are provided merely for exemplary purposes and not intended to limit target devices 120 to the examples shown.
  • target device 120 may be a desktop computer, laptop computer, or any other device described above.
  • an application developer may install and execute emulator software 114 on a development device 110 that executes within, or otherwise operatively communicates with, OS 112 and replicates an operating environment (e.g., OS 122) of a particular target device 120.
  • an application developer may write application code executable by the emulator to simulate how the application will operate on the target device 120 for which the application is written.
  • an application developer may, through the emulator 114, simulate user input to the target device 120.
  • the software developer may simulate user input specific to a particular target device 120 or target device operating environment via a development device 110 that does not include the same or similar input mechanisms.
  • keystrokes, text entry, or mouse input via a development device 110 that is a desktop computer may simulate touch screen commands, detection of accelerometer input, or other input commands specific to input mechanisms of a target device 120 (e.g., a mobile device).
  • An application developer may, to a certain degree, verify operation of application software by entering individual commands manually via an emulator (e.g., enter in each command and manually verify a desired response through viewing a graphical representation of an emulated target device display).
  • an application developer may exhaustively test an application prior to any customer release.
  • a software developer may create a test program to simulate a series of user inputs applied in sequence to an emulated execution of a software application. The results of the test program may be automatically verified to determine whether the application responded as expected to the user input defined by the test program.
  • Verifying operation of software application code via an emulator and test software executing on a development device 110 may be advantageous, because an application developer may run a large number of application software tests relatively quickly; in some examples, development device 110 may have superior computing power in comparison to target device 120.
  • although emulator 114 may provide some feedback to an application developer regarding operation of an application in a target device 120 operating environment, emulator 114 may not be able to emulate exactly an environment of a particular target device 120.
  • a particular application may not execute the same (e.g., generate the same device output) via emulator 114 due to the above-mentioned differences between operating environments 115, 125 of development device 110 and target device 120.
  • an expected or desired output of an application may be generated for comparison to an actual output of the target device 120.
  • an application developer may create a test program as described above, run that test program on development device 110 emulator 114, and capture one or more screen shots that represent an emulated display output of target device 120.
  • An application to be tested may be loaded onto target device 120, and executed according to the same test program as used to generate the desired output.
  • One or more screen shots that represent an actual output of target device 120 in response to the application executed according to the test program may then be captured.
  • target device 120 may use different hardware, software or data than development device 110 to perform different operations to cause output, e.g., the display of images, text, or video.
  • target device 120 may use different primitive libraries or graphics processing hardware or software to display images or video.
  • target device 120 may use different fonts or font libraries to generate text for display.
  • testing of an application may include installing, on target device 120, one or more characteristics of an operating environment of development device 110 that are different than one or more corresponding characteristics of an operating environment of target device 120.
  • the one or more characteristics may be used to perform one or more output operations, such as the display of images, text, or video.
  • a library, such as one or more font libraries of development device 110, may be installed on target device 120.
  • one or more libraries or software such as graphics primitives, image or video processing software, or other like data or software may be installed on target device 120.
  • one or more software instructions may be installed to cause target device 120 to utilize one or more particular hardware components (e.g., a particular processor, digital signal processor, or a memory component or portion of a memory component, such as a register or a long-term or short-term storage component or address) for the processing of images, text, or video.
  • the one or more operating environment characteristics installed on target device 120 may not be solely for the presentation of images, video and/or text via a display.
  • the one or more operating environment characteristics may include data or program instructions related to the processing of user input (e.g., whether to accept a particular type of user input, what action to perform in response to the user input).
  • a characteristic of an operating environment may include data and/or program instructions related to forms of device output other than display output.
  • a characteristic installed on target device 120 may relate to the generation of audio output (e.g., audio processing, audio output component used), tactile feedback (e.g., frequency or strength of vibrations), or any other type of device output.
  • Such other forms of device output may be compared as described above.
  • one or more indications of audio or tactile output (e.g., a digital or analog representation of an audio or vibration waveform) may be compared to verify execution of an application on target device 120.
  • an application may be loaded on target device 120 and executed according to a test program. An actual output of the application in response to the test program may be captured from target device 120.
  • the actual output of the application may include one or more screen shots that represent display output of target device 120 in response to the test program.
  • a desired output as described above may be compared to an actual output of the target device 120, as also described above.
  • screen shots of the actual output of the target device 120 may be compared to screen shots of the desired output.
  • the screen shots may be compared pixel-by-pixel to determine whether there are differences between the desired and actual outputs.
  • the screen shots may be processed and/or analyzed for comparison. For example, screen shots may be processed and/or analyzed to determine specific aspects for comparison as described in further detail below. If the actual output differs from the desired output, this may indicate that the application did not respond as expected and that the application software may require modification to operate as desired.
  • Because, as discussed above, one or more characteristics of an operating environment of development device 110 are installed on target device 120, the desired output and actual output may be substantially identical to provide for an accurate comparison. As such, a comparison as described herein may reduce or eliminate false negatives in output comparison, because target device 120 output generated based on the same characteristics may be substantially identical to corresponding output generated by development device 110 also based on the same characteristics.
  • Screen shots of desired and/or actual output may be processed and/or analyzed to determine specific aspects for comparison.
  • screen shots may be converted from color to black and white.
  • screen shots may be analyzed to determine desirable image characteristics for comparison.
  • screen shots may be analyzed to determine defining characteristics for comparison.
  • screen shots may be analyzed to determine distinct portions of the respective screen shots that include defining characteristics that may be directly compared to determine whether a match exists.
  • a development device 110 may be configured to identify text in actual and/or desired output, and compare only the text portions of desired and actual output to determine whether an application operates as expected on target device 120.
  • development device 110 may be configured to analyze screen shots of desired and actual output to determine portions of the screen shot that include text.
  • the development device 110 may compare text portions of desired and actual output pixel-by-pixel as described above. This technique may be advantageous, because less information may need to be compared than in a pixel-by-pixel comparison of entire screen shots as described above.
  • OCR may be applied to an actual output of target device 120 to capture a series of text symbols presented in screen shots of target device 120 output for comparison to desired output.
  • OCR may also be applied to the desired output to capture a series of text symbols presented in desired output for comparison to the text symbols of the actual output.
  • Test comparison according to this example may be desirable, because comparison of text symbols may require less processing power than pixel-by-pixel comparison of screen shots or portions of screen shots.
  • the emulator may instead generate a series of expected text symbols that represent sequential desired output. OCR may be applied to actual target device 120 output as described above, and the series of expected text symbols may be compared to a series of corresponding text symbols of the actual output of target device 120. Test comparison according to this example may be advantageous, because the step of generating screen shots to represent a desired output may be skipped, thus further reducing the processing power needed to perform test comparisons.
  • FIG. 2 is a block diagram illustrating one example of components of a development device 210 and a target device 220 configured to enable automated testing of application software on the target device 220 consistent with the techniques of this disclosure.
  • development device 210 utilizes a different operating environment (operating environment 215) than target device 220 (operating environment 225).
  • operating environment 215 differs from operating environment 225 with respect to operating systems 212, 222.
  • development device 210 may utilize one or more characteristics of operating system 212 to generate display output.
  • Target device 220 operating environment 225 may utilize software or data that correspond to, but differ from, the software or data of operating environment 215 of development device 210 to generate output (e.g., display output).
  • development device 210 may include one or more processors 270, memory/storage modules 272, communications modules 274, and peripheral devices 276.
  • the one or more processors 270 include one or more electrical circuits configured to execute program instructions to carry out operations of development device 210.
  • processor 270 may be configured to execute emulator software 114 on development device 210.
  • Processor 270 may further be configured to execute program instructions to carry out various functionality of development device 210 described herein.
  • processor 270 may be configured to execute program instructions to carry out functionality associated with one or more of operating system 212, library installer 230, desired output generator 260, test program module 240, and/or test verification module 250 as described below with respect to FIG. 2.
  • as also shown in FIG. 2, development device 210 may include one or more memory/storage modules 274.
  • Memory/storage module 274 may include any form of short-term memory (e.g., random access memory (RAM) or other volatile memory component) or long-term memory (e.g., magnetic hard disc, Flash, or any other non-volatile memory component).
  • Memory/storage module 274 may be used by processor 270 or other components of device 210 to temporarily or persistently store information.
  • memory/storage module 274 may be configured to store program instructions such as software that may execute on processor 270 to cause emulator 114 to function, or to store program instructions that define application software 214.
  • development device 210 may include one or more communications modules 276.
  • the one or more communications modules 276 may be operative to facilitate communication by development device 210 via a network, e.g., a wireless (e.g., Wi-Fi®, cellular, Bluetooth®) or wired (e.g., Ethernet) connection.
  • development device 210 may be coupled to one or more peripheral devices 278.
  • the one or more peripheral devices 278 may include various input/output mechanisms of device 210, such as a keyboard, mouse, monitor, printer, or the like. Other types of peripheral devices 278 are also contemplated.
  • target device 220 may also include the same or similar (or different) components as development device 210.
  • target device 220 may include a processor, a memory storage module, a communications module, and/or one or more peripheral devices.
  • Development device 210 includes a characteristic installer module 230.
  • Characteristic installer module 230 may communicate with target device 220 in order to install, on target device 220, one or more characteristics of an operating environment of development device 210 (e.g., install one or more libraries 232 or software (not shown) of an environment of development device 210 on target device 220). For example, characteristic installer module 230 may communicate commands to cause operating system 222 of target device 220 to "point" to a library 232 or software of development device 210 (e.g., identify a memory location of development device in which a library 232 or software is stored). In another example, characteristic installer 230 may communicate at least a portion of a library 232 or software to target device 220 to be stored in temporary or long-term storage of target device 220. In one example, library installer 230 may overwrite a corresponding library of target device 220 stored in temporary or long term storage of target device 220 with library 232 or software.
  • Characteristic installer module 230 may further be operative to, upon completion of generating actual target device 220 output as described herein, reset target device 220 to utilize a native library or software of target device 220.
  • characteristic installer module 230 may configure a "pointer" of target device 220 as described above to point to a native library or software of target device 220 (e.g., a native library stored in a memory of target device 220), or may overwrite a library of development device with a native library or software of target device 220.
  • development device 210 may not include a characteristic installer module.
  • one or more characteristics of operating environment 215 may be installed on target device 220 by other mechanisms.
  • the one or more characteristics may be installed by target device 220 accessing a library or software of development device 210 via network communications.
  • the one or more characteristics of operating environment 215 may be stored on one or more computing devices (e.g., servers) coupled to target device 220 via a network.
  • the one or more characteristics may be stored on a computer readable storage medium (e.g., an external storage drive such as a magnetic or Flash memory device). The computer readable storage medium may then be coupled to transfer the one or more characteristics to target device 220 for temporary or permanent storage and/or use.
  • Development device 210 further includes a test program module 240.
  • Test program module 240 may generate a test program that includes one or more inputs to application 214.
  • test program module 240 may generate a test program that includes a series of inputs in a manner expected for target device 220, such as a series of touch-screen interactions, accelerometer detection of movement or orientation changes, or other input depending on target device 220.
  • test program module 240 may enable an application developer to manually generate a series of inputs for purposes of testing application 214.
  • test program module 240 may generate a test program according to the SenseTalk® language provided by Google.
  • test program module 240 may generate a test program according to the Java® language.
  • Test program module 240 may also or instead generate test programs according to any language consistent with the techniques of this disclosure.
  • Development device 210 also includes a desired output generator 260.
  • Desired output generator 260 may be a portion of emulator software such as emulator 114 depicted in FIG. 1.
  • desired output generator 260 may execute application software 214 based on a test program from test program module 240 in order to generate a desired output 262.
  • desired output generator 260 generates a desired output 262 in the form of a series of screen shots that represent a display output of target device 220 in response to the test program. In another example, desired output generator 260 generates a desired output 262 in the form of a series of text symbols that might be present in a display output of target device 220 in response to the test program.
  • application 214 may be loaded onto target device 220, along with test program 242.
  • Target device 220 may execute application 214 in accordance with a series of target device 220 inputs defined by test program 242.
  • Test verification module 250 may be operable to capture actual output 264 of target device 220 in response to executing application 214 in accordance with test program 242. In one example, test verification module 250 may capture actual output 264 as a series of screen shots representing display output of target device 220 in response to the test program.
  • Test verification module 250 may further compare desired output 262 to actual output 264 to provide an application developer with an indication of whether application 214 executed as expected or desired on target device 220 in response to test program 242.
  • FIGS. 4-6 are block diagrams that illustrate generally examples of comparisons that may be performed to verify operation of a software application 214 on a target device 220.
  • test verification module 250 may perform a pixel-by-pixel comparison of corresponding screen shots of desired output 262 and actual output 264.
  • the pixel-by-pixel comparison of the corresponding screen shots may include one or more text portions 420 and/or one or more image portions 422. If the comparison indicates that pixels of corresponding screen shots do not match, an application developer may be provided with an indication of the mismatch.
  • Test verification module 250 may analyze and/or process one or more screen shots of actual and desired output for comparison. For example, test verification module 250 may process one or more screen shots to reduce a complexity of the screen shots for comparison. For example, test verification module 250 may convert screen shots to black and white. In another example, test verification module 250 may remove undesirable noise from screen shots. In another example, test verification module 250 may analyze one or more screen shots to determine distinguishing characteristics of the screen shot. According to these examples, the distinguishing characteristics themselves (e.g., a portion of a screen shot that includes a distinguishing characteristic) may be compared. An illustrative comparison sketch is provided after this list.
  • FIG. 5 is a block diagram that illustrates an alternative example of a comparison that may be performed consistent with the techniques of this disclosure. According to the example of FIG. 5, test verification module 250 may be configured to analyze screen shots of desired output 262 and actual output 264 to determine text regions 420 of the screen shots. According to this example, test verification module 250 may compare the text regions 420 of the screen shots to determine whether application 214 executed as expected or desired in response to test program 242.
  • FIG. 6 is a block diagram that illustrates another alternative example of a comparison that may be performed consistent with the techniques of this disclosure.
  • test verification module 250 may perform optical character recognition on some or all of one or more screen shots of either or both of desired output 262 and actual output 264 to identify text symbols (e.g., text of text portions 420) in the screen shots.
  • An OCR operation may provide, as an output, a series of text characters in the order in which the text symbols appeared in the screen shots. Text characters representing actual output 264 and desired output 262 may be compared to determine whether application 214 executed as expected or desired in response to test program 242.
  • desired output generator 260 may not generate screen shots that represent desired output. Instead, desired output generator 260 may generate a series of text characters that correspond to desired output 262. According to this example, test verification module 250 may perform OCR on actual output 264 to generate text characters that represent actual output 264. Text characters that correspond to actual output 264 and desired output 262 may then be directly compared (e.g., character by character) to determine whether application 214 executed as expected or desired in response to test program 242.
  • FIG. 3 is a flow chart illustrating one example of testing an application program on a target device consistent with the techniques of this disclosure.
  • the method illustrated in FIG. 3 is described below as being performed according to the particular examples of development device 110, 210 and target device 120, 220 as depicted in FIGS. 1 and 2; however, one of ordinary skill in the art will recognize that the method of FIG. 3 may be implemented according to any combination of devices.
  • as shown in FIG. 3, the method includes executing, on a development device (e.g., development device 110, 210) and according to a test program, a software application (e.g., software application 214) to generate a desired output (e.g., desired output 262) for the software application (301).
  • the software application is configured to execute on a target device (e.g., target device 120, 220) with a different operating environment (e.g., operating environments 115/215, 125/225) than the development device.
  • the method further includes receiving, from the target device and by the development device, an actual output (e.g., actual output 264) of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device (302).
  • the one or more characteristics installed on the target device may comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device.
  • the method further includes comparing, by the development device, the actual output to the desired output to verify execution of the software application on the target device (303).
  • the method further includes providing a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output (304).
  • comparing the actual output to the desired output to verify operation of the software application includes performing pixel-by-pixel comparison of at least one visual indication of the actual output and at least one visual indication of the desired output.
  • comparing the actual output to the desired output includes identifying one or more portions of actual output and desired output that include defining characteristics and comparing the portions that include the defining characteristics.
  • comparing includes identifying at least one text portion of the actual output, and performing pixel-by-pixel comparison of an image of the at least one text portion of the actual output to an associated image of at least one text portion of the desired output.
  • comparing includes performing pixel-by-pixel comparison of non-text portions of the actual and desired outputs.
  • comparing includes performing optical character recognition (OCR) on the actual output.
  • the method may further comprise comparing a result of the OCR to one or more characters of the desired output.
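The comparison techniques summarized in the bullets above (pixel-by-pixel comparison of screen shots, comparison restricted to text regions, and OCR-based comparison of text symbols) can be illustrated with a short sketch. The disclosure does not specify an implementation; the sketch below is a minimal example that assumes the desired and actual outputs are available as PNG screen shots and uses the Pillow imaging library and the pytesseract OCR wrapper, neither of which is named in the source.

```python
# Minimal illustration of two of the comparison strategies described above.
# Assumptions: screen shots saved as PNG files; Pillow and pytesseract installed.
from PIL import Image, ImageChops
import pytesseract


def pixels_match(desired_path, actual_path):
    """Pixel-by-pixel comparison of corresponding desired and actual screen shots."""
    desired = Image.open(desired_path).convert("RGB")
    actual = Image.open(actual_path).convert("RGB")
    if desired.size != actual.size:
        return False
    # The difference image has an empty bounding box only if every pixel matches.
    return ImageChops.difference(desired, actual).getbbox() is None


def text_symbols_match(desired_path, actual_path):
    """OCR-based comparison: extract text symbols from both screen shots and compare them."""
    desired_text = pytesseract.image_to_string(Image.open(desired_path)).split()
    actual_text = pytesseract.image_to_string(Image.open(actual_path)).split()
    return desired_text == actual_text


if __name__ == "__main__":
    # Hypothetical file names for one step of a test program.
    print("pixels:", pixels_match("desired_step1.png", "actual_step1.png"))
    print("text:  ", text_symbols_match("desired_step1.png", "actual_step1.png"))
```

Converting the images to grayscale or black and white before the pixel comparison, or cropping to detected text regions 420 before comparing, would correspond to the reduced-complexity variants described in the bullets above.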

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Debugging And Monitoring (AREA)

Abstract

This disclosure is directed to techniques for the automated testing and verification of a software application on a target device. The software application may be executed according to a test program on a development device that uses a different operating environment than the target device to generate a desired output for the software application. The development device may receive, from the target device, an actual output of the target device generated based on execution of the software application according to the test program and using one or more characteristics installed on the target device. The one or more characteristics installed on the target device may be one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device. The actual output may be automatically compared to the desired output.

Description

PLATFORM SPECIFIC APPLICATION TEST
TECHNICAL FIELD
[0001] The disclosure relates generally to techniques for providing testing and test verification of applications on computing device platforms.
BACKGROUND
[0002] Mobile devices, including cell phones, smart phones, smart books, net books, tablet computers, and the like, have become increasingly prevalent in recent years. To improve the portability and usability of mobile devices, many are configured to operate differently than more classical computing devices such as desktop or laptop computers. For example, mobile devices may operate using a different operating environment (e.g., operating system) that is specifically catered to use less computing power and/or less electrical power (e.g., to reduce reliance on a device battery). Many mobile devices also employ unique mechanisms for user input, such as utilizing multi-touch gestures on a device display to detect user input as opposed to a more classical keyboard and mouse/touchpad combination.
[0003] Developing applications for mobile devices (e.g., cell phones, smartphones, tablet computers) with limited processing power and unique operating environments presents unique challenges in comparison to developing applications for more classical devices such as desktop computers, laptops, and the like.
SUMMARY
[0002] This disclosure is directed to techniques for the verification of application software on a target device on which the application software is to be executed. The techniques of this disclosure may be advantageous, because application software may be tested and automatically verified by a development device without requiring any additional hardware to act as an interpreter between a development device and an actual output of the target device. Furthermore, this disclosure is directed to advantageous techniques for the comparison of actual device output to desired output generated by a development device.
[0003] In one example, a method is described herein. The method includes executing, on a development device and according to a test program, a software application to generate a desired output for the software application. The software application is configured to execute on a target device with a different operating environment than the development device. The method further includes receiving, from the target device and by the development device, an actual output of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device. The one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device. The method further includes comparing, by the development device, the actual output to the desired output to verify execution of the software application on the target device. The method further includes providing a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
[0004] In another example, a development computing device is described herein. The development computing device includes a memory/storage module configured to store a software application configured to operate on a target device that uses a different operating environment than the development computing device. The development computing device further includes a desired output generator configured to execute the software application on the development computing device according to a test program to generate a desired output for the software application. The development computing device uses one or more characteristics of the operating environment of the development computing device that are different than one or more characteristics of the operating environment of the target device to generate the desired output for the software application. The development computing device further includes means for receiving, from the target device, an actual output of the target device generated based on execution of the software application according to the test program and using one or more characteristics installed on the target device. The one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device. The development computing device further includes a test verification module configured to compare the actual output to the desired output to verify execution of the software application on the target device. The development device is configured to provide a user with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
[0005] In another example, an article of manufacture comprising a computer-readable storage medium storing instructions is described herein. The computer-readable storage medium stores instructions that cause a computing device to execute, on a development device and according to a test program, a software application to generate a desired output for the software application. The software application is configured to execute on a target device with a different operating environment than the development device. The computer-readable storage medium also stores instructions that cause the computing device to receive, from the target device and by the development device, an actual output of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device. The one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device. The computer-readable storage medium also stores instructions that cause the computing device to compare, by the development device, the actual output to the desired output to verify execution of the software application on the target device. The computer-readable storage medium also stores instructions that cause the computing device to provide a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
[0006] The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0007] FIG. 1 is a conceptual diagram illustrating one example of a development device and a target device consistent with the techniques of this disclosure.
[0008] FIG. 2 is a block diagram illustrating one example of a development device and a target device consistent with the techniques of this disclosure.
[0009] FIG. 3 is a flowchart diagram illustrating one example of a method of testing a software application on a target device consistent with the techniques of this disclosure.
[0010] FIGS. 4-6 are conceptual diagrams that illustrate various examples of techniques for verifying actual output of a target device consistent with this disclosure.
DETAILED DESCRIPTION
[0011] FIG. 1 is a conceptual diagram illustrating one example of application software testing consistent with the techniques of this disclosure. FIG. 1 shows a development device 110 and at least one target device 120. The development device 110 may comprise a classical and/or desktop computing device as shown in FIG. 1. Development device 110 as shown in FIG. 1 is provided merely for exemplary purposes; other types of computing devices are also contemplated as a development device. For example, development device 110 may be any type of device, including a mobile device such as a cellular phone, smartphone, netbook, laptop computer, tablet computer, or any similar device. Development device 110 may instead be any other type of computing device, such as a mainframe computing device.
[0012] In one example, development device 110 may differ from target device 120 in that development device 110 uses a different operating environment than target device 120. In one example, an operating environment 115 of development device 110 may differ from an operating environment 125 of target device 120 in that development device 110 uses a different operating system than target device 120 (e.g., operating systems 112, 122). Non-limiting examples of operating systems 112, 122 that may be used by either of a development device 110 or a target device 120 include Microsoft's Windows® (XP®, Vista®, Windows 7®), Apple's MacOS® (Tiger®, Leopard®, Snow Leopard®), and Linux. Other non-limiting examples of operating systems 112, 122 include Google's Android®, Microsoft's Windows Mobile®, Nokia's Symbian®, Palm's WebOS®, and Apple's iPhone OS®.
[0013] In some examples, the respective operating environments may differ instead or in addition in that development device 110 includes different hardware (e.g., processor, signal processing, memory, storage, communications capabilities, peripheral devices, and other like hardware differences) than target device 120. For example, a target device 120 as described herein may differ from development device 110 in that it may have more or less computing power, short-term memory (e.g., volatile memory such as random access memory (RAM)), or long-term memory (e.g., magnetic hard disc, Flash memory, or other non-volatile storage components) than development device 110.
[0014] In still other examples, respective operating environments 115, 125 of target device 120 and development device 110 may also or instead differ in that they are configured to execute different non-operating system software. For example, target device 120 may include a computer readable storage medium (e.g., short or long-term data storage) that stores instructions to cause the target device 120 (e.g., a processor of target device 120) to perform operations. The same instructions may not be stored on a computer readable storage medium of development device 110.
[0015] Development device 110 and target device 120 may further or instead store different data. For example, software executable by respective processors of the development 110 and target 120 devices may utilize different data to cause functions to be performed by the devices. Respective operating environments 115, 125 of target device 120 and development device 110 may differ in ways instead of or in addition to any of the differences described above. The operating environments 115, 125 may differ in any combination of different peripheral or internal hardware, software, or firmware.
[0016] Any other difference between operating environments 115, 125 of development device 110 and target device 120 is also contemplated. For example, operating environments 115, 125 may further differ in that they employ different mechanisms for detecting user input, for example a touch screen or other similar mechanism for detecting user input instead of a classical keyboard and mouse/trackpad combination as is typical for desktop and laptop computers. In some examples, a difference between respective operating environments 115, 125 may directly or indirectly affect device output.
[0017] FIG. 1 also illustrates a few examples of target devices 120. The examples shown in FIG. 1 include a net book or smart book computing device, a cellular or smart phone, and a tablet computer. The target devices 120 shown in FIG. 1 are provided merely for exemplary purposes and are not intended to limit target devices 120 to the examples shown. For example, target device 120 may be a desktop computer, laptop computer, or any other device described above.
[0018] In some circumstances, it may be desirable for an application developer to develop and test application code intended to run on a target device 120 using development device 110. For this purpose, an application developer may install and execute emulator software 114 on a development device 110 that executes within, or otherwise operatively communicates with, OS 112 and replicates an operating environment (e.g., OS 122) of a particular target device 120. The application developer may write application code executable by the emulator to simulate how the application will operate on the target device 120 for which the application is written. To test an application, an application developer may, through the emulator 114, simulate user input to the target device 120. For example, the software developer may simulate user input specific to a particular target device 120 or target device operating environment via a development device 110 that does not include the same or similar input mechanisms. In one specific example, keystrokes, text entry, or mouse input via a development device 110 that is a desktop computer may simulate touch screen commands, detection of accelerometer input, or other input commands specific to input mechanisms of a target device 120 (e.g., a mobile device).
[0019] An application developer may, to a certain degree, verify operation of application software by entering individual commands manually via an emulator (e.g., enter each command and manually verify a desired response by viewing a graphical representation of an emulated target device display). However, it may be desirable for an application developer to exhaustively test an application prior to any customer release. As such, a software developer may create a test program to simulate a series of user inputs applied in sequence to an emulated execution of a software application. The results of the test program may be automatically verified to determine whether the application responded as expected to the user input defined by the test program.
[0020] Verifying operation of software application code via an emulator and test software executing on a development device 110 may be advantageous, because an application developer may run a large number of application software tests relatively quickly; in some examples, development device 110 may have superior computing power in comparison to target device 120. Although emulator 114 may provide some feedback to an application developer regarding operation of an application in a target device 120 operating environment, emulator 114 may not be able to emulate exactly an environment of a particular target device 120. For example, a particular application may not execute the same (e.g., generate the same device output) via emulator 114 due to the above-mentioned differences between operating environments 115, 125 of development device 110 and target device 120.
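As a concrete illustration of the kind of test program described in paragraph [0019], the following sketch represents a test as an ordered list of simulated user inputs that can be replayed against any driver exposing a common interface, whether that driver wraps emulator 114 or a physical target device 120. The driver interface, its method names, and the event types are hypothetical placeholders introduced for illustration; the disclosure does not define such an API.

```python
# Hedged sketch: a test program as a replayable sequence of simulated user inputs.
# The DeviceDriver protocol and its methods are hypothetical, not defined by the disclosure.
from dataclasses import dataclass
from typing import List, Protocol, Tuple


@dataclass
class InputEvent:
    kind: str                 # e.g., "tap", "key", "rotate"
    args: Tuple = ()          # coordinates, key codes, orientation, etc.


class DeviceDriver(Protocol):
    """Common interface so the same test program can drive an emulator or a real device."""
    def send_input(self, event: InputEvent) -> None: ...
    def capture_screen(self, path: str) -> None: ...


def run_test_program(driver: DeviceDriver, events: List[InputEvent], prefix: str) -> List[str]:
    """Replay the test program, capturing a screen shot after every simulated input."""
    shots = []
    for step, event in enumerate(events):
        driver.send_input(event)
        path = f"{prefix}_step{step}.png"
        driver.capture_screen(path)
        shots.append(path)
    return shots


# A test program is simply data: a series of inputs in the manner expected for the target.
TEST_PROGRAM = [
    InputEvent("tap", (120, 480)),          # simulated touch-screen interaction
    InputEvent("key", ("ENTER",)),          # simulated key press
    InputEvent("rotate", ("landscape",)),   # simulated orientation change
]
```

Running run_test_program once against an emulator driver and once against a target-device driver would yield the desired-output and actual-output screen shot sequences compared in the paragraphs that follow.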
[0021] Therefore, it may be advantageous to test an application written for a particular target device 120 or target device operating system 122 directly on the target device 120. To test an application directly on target device 120, an application developer may load the application on the target device 120, and manually operate target device 120 to provide user input and manually verify the results (e.g., provide touch-screen gestures as input, and verify a display output of the target device 120). However, it may be desirable to test an application on target device 120 more thoroughly than manual testing and verification can reasonably provide. As such, it may be desirable to automate testing and verification of an application executing on a desired target device 120.
[0022] To automate testing and verification of an application according to the techniques of this disclosure, an expected or desired output of an application may be generated for comparison to an actual output of the target device 120. For example, an application developer may create a test program as described above, run that test program on
emulator 114 of development device 110, and capture one or more screen shots that represent an emulated display output of target device 120. An application to be tested may be loaded onto target device 120, and executed according to the same test program used to generate the desired output. One or more screen shots that represent an actual output of target device 120 in response to the application executing according to the test program may then be captured.
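For illustration only, the following is a minimal sketch of how such screen shots could be captured. It is not part of the disclosed techniques; it assumes an Android-style target device 120 (and emulator) reachable through the Android SDK's adb tool, and the serial numbers and file names shown are hypothetical.

```python
# Sketch: capture corresponding screen shots from the emulator (desired output)
# and from the target device (actual output), assuming both are visible to adb.
import subprocess

def capture_screenshot(serial: str, out_path: str) -> None:
    """Grab a PNG screen shot from the device or emulator with the given adb serial."""
    png = subprocess.run(
        ["adb", "-s", serial, "exec-out", "screencap", "-p"],
        check=True, capture_output=True).stdout
    with open(out_path, "wb") as f:
        f.write(png)

# After each step of the test program, capture both outputs for later comparison.
capture_screenshot("emulator-5554", "desired_step1.png")    # desired output
capture_screenshot("0123456789ABCDEF", "actual_step1.png")  # actual output
```

Each captured pair of screen shots can then be compared as described in the paragraphs that follow.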
[0023] One problem in the testing of application software as described above for a target device is that, because operating environments 115, 125 of the development and target devices are different, certain aspects of output generated by emulator 114 and target device 120 may differ even though they represent the same response to input commands of a test program. For example, target device 120 may use different hardware, software, or data than development device 110 to perform different operations to cause output, e.g., the display of images, text, or video. For example, target device 120 may use different primitive libraries or graphics processing hardware or software to display images or video. In another example, target device 120 may use different fonts or font libraries to generate text for display.
[0024] In one example, as shown in FIG. 1, testing of an application may include installing, on target device 120, one or more characteristics of an operating environment of development device 110 that are different than one or more corresponding characteristics of an operating environment of target device 120. The one or more characteristics may be used to perform one or more output operations, such as the display of images, text, or video. In one specific example, a library, such as one or more font libraries of development device 110, may be installed on target device 120. In other examples, one or more libraries or software such as graphics primitives, image or video processing software, or other like data or software may be installed on target device 120. For example, one or more software instructions may be installed to cause target device 120 to utilize one or more particular hardware components (e.g., a particular processor, digital signal processor, memory component, or portion of a memory component such as a register, long-term or short-term storage component, or address) for the processing of images, text, or video.
[0025] In still other examples, the one or more operating environment characteristics installed on target device 120 may not be solely for the presentation of images, video, and/or text via a display. For example, the one or more operating environment characteristics may include data or program instructions related to the processing of user input (e.g., whether to accept a particular type of user input, or what action to perform in response to the user input). In another example, a characteristic of an operating environment may include data and/or program instructions related to forms of device output other than display output. For example, a characteristic installed on target device 120 may relate to the generation of audio output (e.g., audio processing, audio output component used), tactile feedback (e.g., frequency or strength of vibrations), or any other type of device output. Such other forms of device output may be compared as described above. For example, one or more indications of audio or tactile output (e.g., a digital or analog representation of an audio or vibration waveform) may be compared to verify execution of an application on target device 120.
[0026] As also shown in FIG. 1, an application may be loaded on target device 120 and executed according to a test program. An actual output of the application in response to the test program may be captured from target device 120. In one example, the actual output of the application may include one or more screen shots that represent display output of target device 120 in response to the test program.
[0027] As also shown in FIG. 1, according to the techniques of this disclosure, a desired output as described above may be compared to an actual output of the target device 120, as also described above. In one example, to verify results, screen shots of the actual output of the target device 120 may be compared to screen shots of the desired output. In one example, the screen shots may be compared pixel-by-pixel to determine whether there are differences between the desired and actual outputs. In other examples, the screen shots may be processed and/or analyzed for comparison. For example, screen shots may be processed and/or analyzed to determine specific aspects for comparison, as described in further detail below. If the actual output is different than the desired output, this may indicate that the application did not operate as expected, and that the application software may require modification to operate as desired.
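As one illustration of the pixel-by-pixel comparison described above, the following sketch uses the third-party Pillow imaging library (an assumption; any library with pixel access would serve) to report whether two screen shots differ. The file names continue the hypothetical example above.

```python
# Sketch: pixel-by-pixel comparison of a desired and an actual screen shot.
from PIL import Image, ImageChops

def screenshots_match(desired_path: str, actual_path: str) -> bool:
    desired = Image.open(desired_path).convert("RGB")
    actual = Image.open(actual_path).convert("RGB")
    if desired.size != actual.size:
        return False
    # ImageChops.difference is zero everywhere iff the images are identical;
    # getbbox() returns None when there is no non-zero pixel in the difference.
    diff = ImageChops.difference(desired, actual)
    return diff.getbbox() is None

if not screenshots_match("desired_step1.png", "actual_step1.png"):
    print("Mismatch: the application may not have executed as desired.")
```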
[0028] Because, as discussed above, one or more characteristics of an operating environment of development device 110 are installed on target device 120, desired output and actual output may be substantially identical, providing for an accurate comparison. As such, a comparison as described herein may reduce or eliminate spurious mismatches, because target device 120 output generated based on the same characteristics may be substantially identical to corresponding output generated by development device 110 based on those same characteristics.
[0029] Screen shots of desired and/or actual output may be processed and/or analyzed to determine specific aspects for comparison. In one example, screen shots may be converted from color to black and white. In another example, screen shots may be analyzed to determine desirable image characteristics for comparison. For example, screen shots may be analyzed to determine defining characteristics for comparison. In one specific example, screen shots may be analyzed to determine distinct portions of the respective screen shots that include defining characteristics that may be directly compared to determine whether a match exists.
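A simple form of the processing mentioned above, again sketched with Pillow under the same assumptions, converts each screen shot to grayscale and then to a 1-bit black-and-white image before comparison, discarding color differences that are irrelevant to the test. The threshold value is an illustrative choice.

```python
# Sketch: reduce screen shots to black and white prior to comparison.
from PIL import Image

def to_black_and_white(path: str, threshold: int = 128) -> Image.Image:
    gray = Image.open(path).convert("L")  # 8-bit grayscale
    # Pixels at or above the threshold become white, the rest become black.
    return gray.point(lambda p: 255 if p >= threshold else 0).convert("1")

desired_bw = to_black_and_white("desired_step1.png")
actual_bw = to_black_and_white("actual_step1.png")
print("match" if list(desired_bw.getdata()) == list(actual_bw.getdata()) else "mismatch")
```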
[0030] The description herein of techniques for screen shot processing, analysis, and/or comparison is merely provided for exemplary purposes. Any technique for comparing all or a portion of a screen shot to verify target device 120 output is contemplated and consistent with the techniques of this disclosure.
[0031] In one example according to the techniques of this disclosure, a development device 110 may be configured to identify text in actual and/or desired output, and compare only the text portions of desired and actual output to determine whether an application operates as expected on target device 120. For example, development device 110 may be configured to analyze screen shots of desired and actual output to determine portions of the screen shots that include text. The development device 110 may compare text portions of desired and actual output pixel-by-pixel as described above. This technique may be advantageous, because less information may need to be compared than in a pixel-by-pixel comparison of entire screen shots as described above.
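The sketch below illustrates this text-only comparison under two stated assumptions: the bounding boxes of the text portions are already known (for example, exported from the application's layout or found by a separate image-analysis pass), and Pillow is available as above. The coordinates are hypothetical.

```python
# Sketch: compare only known text regions of the desired and actual screen shots.
# Bounding boxes are hypothetical (left, top, right, bottom) pixel coordinates.
from PIL import Image, ImageChops

TEXT_REGIONS = [(10, 20, 300, 60), (10, 80, 300, 120)]

def text_regions_match(desired_path: str, actual_path: str) -> bool:
    desired = Image.open(desired_path).convert("RGB")
    actual = Image.open(actual_path).convert("RGB")
    for box in TEXT_REGIONS:
        if ImageChops.difference(desired.crop(box), actual.crop(box)).getbbox() is not None:
            return False  # this text portion differs pixel-by-pixel
    return True
```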
[0032] In another example according to the techniques of this disclosure, optical character recognition (OCR) may be employed for purposes of comparing actual and desired outputs. For example, OCR may be applied to an actual output of target device 120 to capture a series of text symbols presented in screen shots of target device 120 output for comparison to desired output. In one example, OCR may also be applied to the desired output to capture a series of text symbols presented in the desired output for comparison to the text symbols of the actual output. Test comparison according to this example may be desirable, because comparison of text symbols may require less processing power than pixel-by-pixel comparison of screen shots or portions of screen shots.
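One possible realization of the OCR-based comparison is sketched here with the third-party pytesseract wrapper around the Tesseract OCR engine (an assumption; any OCR component could be substituted). It extracts the text symbols from both outputs and compares the resulting strings.

```python
# Sketch: OCR both screen shots and compare the recognized text symbols.
# pytesseract and a Tesseract installation are assumed; file names are illustrative.
from PIL import Image
import pytesseract

def ocr_text(path: str) -> str:
    # Normalize whitespace so layout differences do not affect the comparison.
    return " ".join(pytesseract.image_to_string(Image.open(path)).split())

desired_text = ocr_text("desired_step1.png")
actual_text = ocr_text("actual_step1.png")
print("match" if desired_text == actual_text else "mismatch")
```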
[0033] In another example, instead of emulator 114 generating a series of output screen shots representing desired output as described above, emulator 114 may instead generate a series of expected text symbols that represent sequential desired output. OCR may be applied to actual target device 120 output as described above, and the series of expected text symbols may be compared to a series of corresponding text symbols of the actual output of target device 120. Test comparison according to this example may be advantageous, because the step of generating screen shots to represent a desired output may be skipped, thus further reducing the processing power needed to perform test comparisons.
[0034] FIG. 2 is a block diagram illustrating one example of components of a development device 210 and a target device 220 configured to enable automated testing of application software on the target device 220 consistent with the techniques of this disclosure. As shown in FIG. 2, development device 210 utilizes a different operating environment (operating environment 215) than target device 220 (operating environment 225). In one example, operating environment 215 differs from operating environment 225 with respect to operating systems 212, 222. In one example, development device 210 may utilize one or more characteristics of operating system 212 to generate display output. Operating environment 225 of target device 220 may utilize corresponding but different software or data than operating environment 215 of development device 210 to generate output (e.g., display output).
[0035] As shown in FIG. 2, development device 210 may include one or more processors 270, memory/storage modules 274, communications modules 276, and peripheral devices 278. The one or more processors 270 include one or more electrical circuits configured to execute program instructions to carry out operations of development device 210. For example, processor 270 may be configured to execute emulator software 114 on development device 210. Processor 270 may further be configured to execute program instructions to carry out various functionality of development device 210 described herein. For example, processor 270 may be configured to execute program instructions to carry out functionality associated with one or more of operating system 212, characteristic installer module 230, desired output generator 260, test program module 240, and/or test verification module 250 as described below with respect to FIG. 2. As also shown in FIG. 2, development device 210 may include one or more memory/storage modules 274. Memory/storage module 274 may include any form of short-term memory (e.g., random access memory (RAM) or other volatile memory component) or long-term memory (e.g., magnetic hard disc, Flash, or any other non-volatile memory component). Memory/storage module 274 may be used by processor 270 or other components of device 210 to temporarily or persistently store information. For example, memory/storage module 274 may be configured to store program instructions, such as software that may execute on processor 270 to cause emulator 114 to function, or program instructions that define application software 214. As also shown in FIG. 2, development device 210 may include one or more communications modules 276. The one or more communications modules 276 may be operative to facilitate communication by development device 210 via a network, e.g., a wireless (e.g., Wi-fi®, cellular, Bluetooth®) or wired (e.g., Ethernet) connection.
[0036] As also shown in FIG. 2, development device 210 may be coupled to one or more peripheral devices 278. The one or more peripheral devices 278 may include various input/output mechanisms of device 210, such as a keyboard, mouse, monitor, printer, or the like. Other types of peripheral devices 278 are also contemplated.
[0037] Although not depicted in FIG. 2, target device 220 may also include the same or similar (or different) components as development device 210. For example, target device 220 may include a processor, a memory storage module, a communications module, and/or one or more peripheral devices.
[0038] Development device 210 includes a characteristic installer module 230.
Characteristic installer module 230 may communicate with target device 220 in order to install, on target device 220, one or more characteristics of an operating environment of development device 210 (e.g., install one or more libraries 232 or software (not shown) of an environment of development device 210 on target device 220). For example, characteristic installer module 230 may communicate commands to cause operating system 222 of target device 220 to "point" to a library 232 or software of development device 210 (e.g., identify a memory location of development device 210 in which a library 232 or software is stored). In another example, characteristic installer module 230 may communicate at least a portion of a library 232 or software to target device 220 to be stored in temporary or long-term storage of target device 220. In one example, characteristic installer module 230 may overwrite a corresponding library of target device 220, stored in temporary or long-term storage of target device 220, with library 232 or software of development device 210.
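As a loose illustration of the overwrite approach, the following sketch assumes an Android-style target device whose font directory is writable for test purposes (a strong assumption, e.g., a rooted test device) and uses the adb tool to copy a development-device font library onto target device 220 and later restore the originals. All paths are hypothetical.

```python
# Sketch: install a development-device font library on the target device, then restore it.
import subprocess

DEV_FONT_DIR = "/usr/share/fonts/dev_fonts"  # fonts of development device 210 (hypothetical path)
TARGET_FONT_DIR = "/system/fonts"            # native fonts of target device 220 (hypothetical path)
BACKUP_DIR = "/sdcard/font_backup"

def run(*args):
    subprocess.run(list(args), check=True)

def install_dev_fonts(serial: str):
    # Back up the native fonts, then overwrite them with the development device's fonts.
    run("adb", "-s", serial, "shell", "cp", "-r", TARGET_FONT_DIR, BACKUP_DIR)
    run("adb", "-s", serial, "push", DEV_FONT_DIR + "/.", TARGET_FONT_DIR)

def restore_native_fonts(serial: str):
    # Reset the target device to use its native font library after testing.
    run("adb", "-s", serial, "shell", "cp", "-r", BACKUP_DIR + "/.", TARGET_FONT_DIR)
```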
[0039] Characteristic installer module 230 may further be operative to, upon completion of generating actual target device 220 output as described herein, reset target device 220 to utilize a native library or software of target device 220. For example, characteristic installer module 230 may configure a "pointer" of target device 220 as described above to point to a native library or software of target device 220 (e.g., a native library stored in a memory of target device 220), or may overwrite the previously installed library or software of development device 210 with a native library or software of target device 220.
[0040] In one example, development device 210 may not include a characteristic installer module. According to these examples, one or more characteristics of operating environment 215 may be installed on target device 220 by other mechanisms. For example, the one or more characteristics may be installed by target device 220 accessing a library or software of development device 210 via network communications. For example, the one or more characteristics of operating environment 215 may be stored on one or more computing devices (e.g., servers) coupled to target device 220 via a network. In other examples, the one or more characteristics may be stored on a computer readable storage medium (e.g., an external storage drive such as a magnetic or Flash memory device). The computer readable storage medium may then be coupled to target device 220 to transfer the one or more characteristics to target device 220 for temporary or permanent storage and/or use.
[0041] Development device 210 further includes a test program module 240. Test program module 240 may generate a test program that includes one or more inputs to application 214. For example, test program module 240 may generate a test program that includes a series of inputs in a manner expected for target device 220, such as a series of touch-screen interactions, accelerometer detection of movement or orientation changes, or other input depending on target device 220. In one example, test program module 240 may
automatically generate a test program. In other examples, test program module 240 may enable an application developer to manually generate a series of inputs for purposes of testing application 214. In one specific example, test program module 240 may generate a test program according to the SenseTalk® language. In other examples, test program module 240 may generate a test program according to the Java® language. Test program module 240 may also or instead generate test programs according to any language consistent with the techniques of this disclosure.
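For illustration only, a test program need not use any particular commercial language. The sketch below expresses a short series of target-device inputs as data and replays them with the Android SDK's adb "input" shell command, which is one assumed way of driving an Android-style target device 220; the coordinates and text are hypothetical.

```python
# Sketch: a test program as an ordered series of simulated user inputs,
# replayed on an Android-style target device via "adb shell input".
import subprocess
import time

TEST_PROGRAM = [
    ("tap", "540", "960"),                  # touch the center of the screen
    ("text", "hello"),                      # type text into the focused field
    ("swipe", "100", "800", "100", "200"),  # scroll the display
]

def run_test_program(serial: str, steps=TEST_PROGRAM, delay_s: float = 1.0):
    for step in steps:
        subprocess.run(["adb", "-s", serial, "shell", "input", *step], check=True)
        time.sleep(delay_s)  # allow the application to respond before the next input
```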
[0042] Development device 210 also includes a desired output generator 260. Desired output generator 260 may be a portion of emulator software such as emulator 114 depicted in FIG. 1. In one example, desired output generator 260 may execute application software 214 based on a test program from test program module 240 in order to generate a desired output
262 of application 214 for later comparison. In one example, desired output generator 260 generates a desired output 262 in the form of a series of screen shots that represent a display output of target device 220 in response to the test program. In another example, desired output generator 260 generates a desired output 262 in the form of a series of text symbols that might be present in a display output of target device 220 in response to the test program.
[0043] As also shown in FIG. 2, application 214 may be loaded onto target device 220, along with test program 242. Target device 220 may execute application 214 in accordance with a series of target device 220 inputs defined by test program 242. Test verification module 250 may be operable to capture actual output 264 of target device 220 in response to executing application 214 in accordance with test program 242. In one example, test verification module 250 may capture actual output 264 as a series of screen shots representing display output of target device 220 in response to the test program.
[0044] Test verification module 250 may further compare desired output 262 to actual output 264 to provide an application developer with an indication of whether application 214 executed as expected or desired on target device 220 in response to test program 242. FIGS. 4-6 are block diagrams that generally illustrate examples of comparisons that may be performed to verify operation of a software application 214 on a target device 220. For example, as shown in FIG. 4, test verification module 250 may perform a pixel-by-pixel comparison of corresponding screen shots of desired output 262 and actual output 264. The pixel-by-pixel comparison of the corresponding screen shots may include one or more text portions 420 and/or one or more image portions 422. If the comparison indicates that pixels of corresponding screen shots do not match, an application developer may be provided with an indication of the mismatch.
[0045] Test verification module 250 may analyze and/or process one or more screen shots of actual and desired output for comparison. For example, test verification module 250 may process one or more screen shots to reduce a complexity of the screen shots for comparison. For example, test verification module 250 may convert screen shots to black and white. In another example, test verification module 250 may remove undesirable noise from screen shots. In another example, test verification module 250 may analyze one or more screen shots to determine distinguishing characteristics of the screen shot. According to these examples, the distinguishing characteristics themselves (e.g., a portion of a screen shot that includes a distinguishing characteristic) may be compared.
[0046] FIG. 5 is a block diagram that illustrates an alternative example of a comparison that may be performed consistent with the techniques of this disclosure. According to the example of FIG. 5, test verification module 250 may be configured to analyze screen shots of desired output 262 and actual output 264 to determine text regions 420 of the screen shots. According to this example, test verification module 250 may compare the text regions 420 of the screen shots to determine whether application 214 executed as expected or desired in response to test program 242.
[0047] FIG. 6 is a block diagram that illustrates another alternative example of a comparison that may be performed consistent with the techniques of this disclosure. According to the example of FIG. 6, test verification module 250 may perform optical character recognition on some or all of one or more screen shots of either or both of desired output 262 and actual output 264 to identify text symbols (e.g., text of text portions 420) in the screen shots. An OCR operation may provide, as an output, a series of text characters in the order in which the text symbols appeared in the screen shots. Text characters representing actual output 264 and desired output 262 may be compared to determine whether application 214 executed as expected or desired in response to test program 242.
[0048] In a related example as described above, desired output generator 260 may not generate screen shots that represent desired output. Instead, desired output generator 260 may generate a series of text characters that correspond to desired output 262. According to this example, test verification module 250 may perform OCR on actual output 264 to generate text characters that represent actual output 264. Text characters that correspond to actual output 264 and desired output 262 may then be directly compared (e.g., character by character) to determine whether application 214 executed as expected or desired in response to test program 242.
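Continuing the Python sketches above, and assuming the desired output 262 is already a plain string of expected text symbols produced by desired output generator 260 (a hypothetical value below), the character-level comparison might look like the following; the standard-library difflib module is used only to report where the sequences diverge.

```python
# Sketch: compare expected text symbols (desired output) with OCR'd actual output.
import difflib

expected_text = "Welcome Sign in"           # hypothetical desired output text
actual_text = ocr_text("actual_step1.png")  # OCR of target device output, as sketched above

if expected_text == actual_text:
    print("Application produced the expected text.")
else:
    for line in difflib.ndiff([expected_text], [actual_text]):
        print(line)  # shows where the expected and actual text differ
```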
[0049] FIG. 3 is a flow chart illustrating one example of testing an application program on a target device consistent with the techniques of this disclosure. The method illustrated in FIG. 3 is described below as being performed according to the particular examples of development device 110, 210 and target device 120, 220 as depicted in FIGS. 1 and 2; however, one of ordinary skill in the art will recognize that the method of FIG. 3 may be implemented according to any combination of devices. As shown in FIG. 3, the method includes executing, on a development device (e.g., development device 110, 210) and according to a test program, a software application (e.g., software application 214) to generate a desired output (e.g., desired output 262) for the software application. The software application is configured to execute on a target device (e.g., target device 120, 220) with a different operating environment (e.g., operating environment 125, 225) than the operating environment (e.g., operating environment 115, 215) of the
development device (301).
[0050] The method further includes receiving, from the target device and by the development device, an actual output (e.g., actual output 264) of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device (302). The one or more characteristics installed on the target device may comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device. The method further includes comparing, by the development device, the actual output to the desired output to verify execution of the software application on the target device (303). The method further includes providing a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output (304).
[0051] In one example, comparing the actual output to the desired output to verify operation of the software application includes performing pixel-by-pixel comparison of at least one visual indication of the actual output and at least one visual indication of the desired output. In another example, comparing the actual output to the desired output includes identifying one or more portions of actual output and desired output that include defining characteristics and comparing the portions that include the defining characteristics. In another example, comparing includes identifying at least one text portion of the actual output, and performing pixel-by-pixel comparison of an image of the at least one text portion of the actual output to an associated image of at least one text portion of the desired output. In another example, comparing includes performing pixel-by-pixel comparison of non-text portions of the actual and desired outputs. In another example, comparing includes performing optical character recognition (OCR) on the actual output. In a related example, the method may further comprise comparing a result of the OCR to one or more characters of the desired output.
[0052] Various examples of this disclosure have been described. These and other examples are within the scope of the following claims.

Claims

CLAIMS:
1. A method, comprising:
executing, on a development device and according to a test program, a software application to generate a desired output for the software application, wherein the software application is configured to execute on a target device with a different operating environment than the development device;
receiving, from the target device and by the development device, an actual output of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device, wherein the one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device;
comparing, by the development device, the actual output to the desired output to verify execution of the software application on the target device; and
providing a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
2. The method of claim 1, wherein executing, on the development device and according to the test program, the software application to generate the desired output for the software application comprises executing the software application using an emulator configured to emulate the operating environment of the target device.
3. The method of claim 1, wherein the one or more characteristics installed on the target device comprise fonts of the operating environment of the development device that are different than corresponding fonts of the operating environment of the target device.
4. The method of claim 1, wherein the one or more characteristics installed on the target device comprise a font library of the development device.
5. The method of claim 1, wherein comparing the actual output to the desired output to verify operation of the application on the target device comprises:
performing pixel-by-pixel comparison of at least one visual indication of the actual output and at least one visual indication of the desired output.
6. The method of claim 1, wherein comparing the actual output to the desired output to verify operation of the application on the target device comprises:
identifying at least one text portion of the actual output; and
performing pixel-by-pixel comparison of an image of the at least one text portion of the actual output to an associated image of at least one text portion of the desired output.
7. The method of claim 1, wherein comparing the actual output to the desired output to verify operation of the application on the target device comprises:
performing optical character recognition (OCR) on the actual output.
8. The method of claim 7, further comprising:
comparing a result of the OCR to one or more characters of the desired output.
9. A development computing device, comprising:
a memory/storage module configured to store a software application configured to operate on a target device with a different operating environment than the development computing device;
a desired output generator configured to execute the software application on the development computing device according to a test program to generate a desired output for the software application, wherein the development computing device uses one or more characteristics of the operating environment of the development computing device that are different than one or more characteristics of the operating environment of the target device to generate the desired output for the software application;
means for receiving, from the target device, an actual output of the target device generated based on execution of the software application according to the test program and using one or more characteristics installed on the target device, wherein the one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development computing device that are different than one or more characteristics of the operating environment of the target device;
a test verification module configured to compare the actual output to the desired output to verify execution of the software application on the target device; and
wherein the development computing device is configured to provide a user with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
10. An article of manufacture comprising a computer-readable storage medium storing instructions that cause a computing device to:
execute, on a development device and according to a test program, a software application to generate a desired output for the software application, wherein the software application is configured to execute on a target device with a different operating environment than the development device;
receive, from the target device and by the development device, an actual output of the target device generated based on execution of the software application on the target device according to the test program and using one or more characteristics installed on the target device, wherein the one or more characteristics installed on the target device comprise one or more characteristics of the operating environment of the development device that are different than one or more characteristics of the operating environment of the target device;
compare, by the development device, the actual output to the desired output to verify execution of the software application on the target device; and
provide a user of the development device with at least one indication associated with verification of the execution of the software application on the target device based on the comparison of the actual output to the desired output.
11. The article of manufacture of claim 10, wherein the one or more
characteristics installed on the target device comprise fonts of the operating environment of the development device that are different than corresponding fonts of the operating environment of the target device.
12. The article of manufacture of claim 10, wherein the one or more
characteristics installed on the target device comprise a font library of the development device that is different than corresponding fonts of the operating environment of the target device.
13. The article of manufacture of claim 10, wherein the computer-readable storage medium further stores instructions that cause the computing device to:
perform pixel-by-pixel comparison of at least one visual indication of the actual output and at least one visual indication of the desired output.
14. The article of manufacture of claim 10, wherein the computer-readable storage medium further stores instructions that cause the computing device to:
identify at least one text portion of the actual output; and
perform pixel-by-pixel comparison of an image of the at least one text portion of the actual output to an associated image of at least one text portion of the desired output.
15. The article of manufacture of claim 10, wherein the computer-readable storage medium further stores instructions that cause the computing device to:
perform optical character recognition (OCR) on the actual output.
16. The article of manufacture of claim 15, wherein the computer-readable storage medium further stores instructions that cause the computing device to:
compare a result of the OCR to one or more characters of the desired output.
PCT/CN2010/076489 2010-08-31 2010-08-31 Platform specific application test WO2012027886A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2010/076489 WO2012027886A1 (en) 2010-08-31 2010-08-31 Platform specific application test

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2010/076489 WO2012027886A1 (en) 2010-08-31 2010-08-31 Platform specific application test

Publications (1)

Publication Number Publication Date
WO2012027886A1 true WO2012027886A1 (en) 2012-03-08

Family

ID=45772075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2010/076489 WO2012027886A1 (en) 2010-08-31 2010-08-31 Platform specific application test

Country Status (1)

Country Link
WO (1) WO2012027886A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140325482A1 (en) * 2013-04-25 2014-10-30 TestPlant Europe Limited Method for creating a label
WO2014179731A1 (en) * 2013-05-02 2014-11-06 Amazon Technologies, Inc. Program testing service
WO2018169573A1 (en) * 2017-03-17 2018-09-20 Google Llc Determining application test results using screenshot metadata
EP3433983A4 (en) * 2016-03-22 2019-10-30 T-Mobile USA, Inc. Mobile device validation
US10725890B1 (en) 2017-07-12 2020-07-28 Amazon Technologies, Inc. Program testing service

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1862508A (en) * 2005-05-13 2006-11-15 中兴通讯股份有限公司 Automatic testing system of person digital aid cell phone function and method thereof
CN101175285A (en) * 2006-11-01 2008-05-07 联想移动通信科技有限公司 Automatic testing method and system for mobile phone software
WO2008074526A2 (en) * 2006-12-18 2008-06-26 International Business Machines Corporation Method, system and computer program for testing software applications based on multiple data sources
CN101521899A (en) * 2009-03-31 2009-09-02 大连海事大学 System and method for on-computer test of mobile applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1862508A (en) * 2005-05-13 2006-11-15 中兴通讯股份有限公司 Automatic testing system of person digital aid cell phone function and method thereof
CN101175285A (en) * 2006-11-01 2008-05-07 联想移动通信科技有限公司 Automatic testing method and system for mobile phone software
WO2008074526A2 (en) * 2006-12-18 2008-06-26 International Business Machines Corporation Method, system and computer program for testing software applications based on multiple data sources
CN101521899A (en) * 2009-03-31 2009-09-02 大连海事大学 System and method for on-computer test of mobile applications

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140325482A1 (en) * 2013-04-25 2014-10-30 TestPlant Europe Limited Method for creating a label
GB2515386A (en) * 2013-04-25 2014-12-24 Testplant Europ Ltd Method for remotely testing the operation of a computer system
GB2515386B (en) * 2013-04-25 2015-08-26 Testplant Europ Ltd Method for remotely testing the operation of a computer system
US9317403B2 (en) * 2013-04-25 2016-04-19 Testplant Limited Method for creating a label
US9405656B2 (en) 2013-04-25 2016-08-02 TestPlanet Europe Limited Method for remotely testing the operation of a computer system
WO2014179731A1 (en) * 2013-05-02 2014-11-06 Amazon Technologies, Inc. Program testing service
EP3433983A4 (en) * 2016-03-22 2019-10-30 T-Mobile USA, Inc. Mobile device validation
WO2018169573A1 (en) * 2017-03-17 2018-09-20 Google Llc Determining application test results using screenshot metadata
CN110337641A (en) * 2017-03-17 2019-10-15 谷歌有限责任公司 It is determined using screenshot capture metadata and applies test result
US10725890B1 (en) 2017-07-12 2020-07-28 Amazon Technologies, Inc. Program testing service

Similar Documents

Publication Publication Date Title
US9342237B2 (en) Automated testing of gesture-based applications
US9841826B2 (en) Automatic test system and test method for computer, record medium, and program product
US8819630B2 (en) Automatic test tool for webpage design with micro-browsers on mobile platforms
US8627296B1 (en) Unified unit and integration test with automatic mock creation
US9720799B1 (en) Validating applications using object level hierarchy analysis
US20130117855A1 (en) Apparatus for automatically inspecting security of applications and method thereof
WO2013030674A2 (en) System and methods for generating and managing a virtual device
US9411711B2 (en) Adopting an existing automation script to a new framework
CN101192153B (en) Method and apparatus for obtaining user interface information from executable program code
US10705858B2 (en) Automatic import of third party analytics
WO2012027886A1 (en) Platform specific application test
US8984487B2 (en) Resource tracker
KR20190113050A (en) Method and system for automatic configuration test case generation of mobile application
CN112148594A (en) Script testing method and device, electronic equipment and storage medium
WO2017049649A1 (en) Technologies for automated application exploratory testing
US11054915B2 (en) Locally implemented terminal latency mitigation
CN111414309A (en) Automatic test method of application program, computer equipment and storage medium
CN103631702A (en) Automatic random key test method and device
KR101753314B1 (en) Method for testing application of using learning image matching and apparatus for executing the method
CN115687146A (en) BIOS (basic input output System) test method and device, computer equipment and storage medium
CN114625663A (en) Test method, test device, computer equipment and storage medium
CN114064010A (en) Front-end code generation method, device, system and storage medium
CN113821438A (en) Application response performance test method and system and computing equipment
Chu et al. Automated GUI testing for android news applications
CN113139190A (en) Program file detection method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10856573

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10856573

Country of ref document: EP

Kind code of ref document: A1