WO2013122586A2 - Automated perceptual quality assessment of touch screen devices - Google Patents

Automated perceptual quality assessment of touch screen devices

Info

Publication number
WO2013122586A2
Authority
WO
WIPO (PCT)
Prior art keywords
touch screen
frame
gesture
rate determination
update
Prior art date
Application number
PCT/US2012/025282
Other languages
French (fr)
Other versions
WO2013122586A3 (en)
Inventor
Eugene Kuznetsov
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corporation filed Critical Intel Corporation
Priority to DE112012005875.5T priority Critical patent/DE112012005875T5/en
Priority to CN201280069833.2A priority patent/CN104115097B/en
Priority to US13/997,044 priority patent/US9298312B2/en
Priority to PCT/US2012/025282 priority patent/WO2013122586A2/en
Publication of WO2013122586A2 publication Critical patent/WO2013122586A2/en
Publication of WO2013122586A3 publication Critical patent/WO2013122586A3/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/006 Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08 Cursor circuits

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)
  • Position Input By Displaying (AREA)

Abstract

Methods and systems may provide for receiving a video of an interaction with a touch screen, using a computer vision process to identify a gesture in the video, and determining a response time of the touch screen with respect to the gesture. Moreover, a display area update percentage can be tracked subsequent to a response of the touch screen to the gesture, wherein a frame rate determination may be conducted for the touch screen based on the display area update percentage.

Description

AUTOMATED PERCEPTUAL QUALITY ASSESSMENT OF TOUCH SCREEN DEVICES
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application is related to U.S. Patent Application No. 13/174,052, filed June 30, 2011.
BACKGROUND
Technical Field
Embodiments generally relate to the evaluation of consumer electronics devices. More particularly, embodiments relate to automated perceptual quality assessments of touch screen devices.
Discussion
Conventional approaches to evaluating the effectiveness of touch screen devices may involve manually observing video captured of the device during manipulation in order to quantify parameters related to perceptual quality. Such an approach may provide sub-optimal and/or inaccurate assessment results.
BRIEF DESCRIPTION OF THE DRAWINGS
The various advantages of the embodiments of the present invention will become apparent to one skilled in the art by reading the following specification and appended claims, and by referencing the following drawings, in which:
FIG. 1 is a block diagram of an example of a measurement system according to an embodiment;
FIG. 2A is a flowchart of an example of a method of evaluating response times for a touch screen according to an embodiment;
FIG. 2B is a flowchart of an example of a method of evaluating frame rates for a touch screen according to an embodiment;
FIG. 3 is a flowchart of an example of a method of evaluating a touch screen according to an embodiment; and
FIG. 4 is a block diagram of an example of a computing system according to an embodiment.
DETAILED DESCRIPTION
Embodiments may include a method in which a video of an interaction with a touch screen is received. The method may also provide for using a computer vision process to identify a gesture in the video, and determining a response time of the touch screen with respect to the gesture.
Other embodiments can include a computer readable storage medium having a set of instructions which, if executed by a processor, cause a computer to receive a video of an interaction with a touch screen. The instructions may also cause a computer to use a computer vision process to identify a gesture in the video, and determine a response time of the touch screen with respect to the gesture.
In addition, embodiments may include a computing system having a non-volatile memory to store a video of an interaction with a touch screen, and a chipset with logic to receive the video. The logic can also use a computer vision process to identify a gesture in the video, and determine a response time of the touch screen with respect to the gesture.
Additionally, embodiments can include a method in which a video of an interaction with a touch screen is received. The method may also provide for using a computer vision process to identify a gesture in the video, and detecting a first display update on the touch screen. In addition, the method can involve determining a response time of the touch screen with respect to the gesture, wherein determining the response time includes counting one or more frames between the gesture and the first display update. Moreover, the method may provide for tracking a display area update percentage subsequent to the first display update, wherein the display area update percentage is tracked on a frame-by-frame basis. Additionally, the method can conduct a frame rate determination for the touch screen based on the display area update percentage, and output the response time and a result of the frame rate determination.
FIG. 1 shows a measurement system 10 that may be used to evaluate a device 12 having a touch screen 14, wherein the device 12 could be a consumer electronics device such as a wireless smart phone, mobile Internet device (MID), tablet, notebook computer, desktop computer, television, etc. Thus, the device 12 might be used to browse web pages, play video games, view programming content, and conduct other user related activities. The touch screen 14 might be used to input data and control the functionality of the device 12, as well as to output information to the user of the device 12. The device 12 may also have other input devices (not shown) such as a mouse, touchpad, keypad, keyboard, and/or microphone.
In the illustrated example, a high speed camera 16 captures a video of the front of the touch screen 14 during various interactions between a robotic arm 15 and the touch screen 14. The camera 16 could be a digital camera capable of recording the touch screen 14 at high enough frame rates and resolutions to extract objective data from the resulting video. For example, a gesture, such as a zoom, pan, rotate or tap operation, could be performed on the device 12 via the touch screen 14, wherein the video captured by the camera 16 may document the visual output of the touch screen 14. Thus, the robotic arm 15 can mimic user gestures/interactions. Alternatively, the gestures may be conducted manually. The illustrated system 10 also includes a computing system 18 having logic 20 to receive the video, use a computer vision process to identify gestures in the video, and determine response times of the touch screen 14 with respect to the gestures, as will be discussed in greater detail. The logic 20 may also track display area update percentages subsequent to responses of the touch screen to gestures, conduct frame rate determinations for the touch screen based on the display area update percentages, and output the response times and results of the frame rate determinations, as will also be discussed in greater detail.
Turning now to FIG. 2A, a method 22 of evaluating response times for a touch screen is shown. The method 22 may be implemented in executable software as a set of logic instructions stored in a machine- or computer-readable medium of a memory such as random access memory (RAM), read only memory (ROM), programmable ROM (PROM), firmware, flash memory, etc., in configurable logic such as, for example, programmable logic arrays (PLAs), field programmable gate arrays (FPGAs), complex programmable logic devices (CPLDs), in fixed-functionality hardware logic using circuit technology such as, for example, application specific integrated circuit (ASIC), complementary metal oxide semiconductor (CMOS) or transistor-transistor logic (TTL) technology, or any combination thereof. For example, computer program code to carry out operations shown in the method 22 may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The functionality of the method 22 may also be implemented via proprietary and/or commercially available image/video processing tools such as OpenCV or Matlab.
Illustrated processing block 24 provides for receiving a video of an interaction with a touch screen, wherein a computer vision process may be used at block 26 to identify a gesture in the video. As already noted, the gesture could include a zoom, pan, rotate or tap operation conducted with one or more fingers of a user (or a robotic arm), wherein the computer vision process may be designed to recognize the finger/arm in various different orientations and at different locations on the touch screen. Such computer vision processes are well documented and are not described in detail herein so as not to obscure other aspects of the embodiments. Block 28 may determine the response time of the touch screen with respect to the gesture. As will be discussed in greater detail, in one example, determining the response time may include counting one or more frames between the gesture and a display update on the touch screen.
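By way of illustration of block 24 only, the received video might be opened with OpenCV, one of the tools the description names as suitable. The sketch below is a non-limiting assumption: the file name, the use of OpenCV's Python bindings, and the frame rate query are illustrative details rather than part of the disclosure.

```python
import cv2

# Open the high speed camera recording of the touch screen interaction
# (block 24). "touch_test.avi" is a hypothetical file name.
cap = cv2.VideoCapture("touch_test.avi")
if not cap.isOpened():
    raise IOError("could not open the capture video")

# The capture rate converts frame counts into response times in seconds.
camera_fps = cap.get(cv2.CAP_PROP_FPS)
print(f"capture rate: {camera_fps:.0f} frames per second")
```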
FIG. 2B shows a method 30 of evaluating frame rates for a touch screen. The method 30 may be implemented in executable software as a set of logic instructions stored in a machine- or computer-readable medium of a memory such as RAM, ROM, PROM, firmware, flash memory, etc., in configurable logic such as, for example, PLAs, FPGAs, CPLDs, in fixed-functionality hardware logic using circuit technology such as, for example, ASIC, CMOS or TTL technology, or any combination thereof. Illustrated block 32 provides for tracking a display area update percentage subsequent to a response of the touch screen to a gesture. As will be discussed in greater detail, the display area update percentage can provide a unique solution to the challenge presented by touch screens that do not update entirely at once. Rather, from frame-to-frame, typically only a portion of the display will be updated while the touch screen is responding to the gesture (e.g., panning the display in response to a finger swipe). Thus, tracking the display area update percentage may facilitate a determination of the number of frames encountered, and in turn the average frame rate. Accordingly, block 34 may conduct a frame rate determination for the touch screen based on the display area update percentage.
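Although the description does not prescribe a particular technique, the display area update percentage of blocks 32 and 34 could plausibly be computed by per-pixel frame differencing, as in the following sketch; the noise_floor parameter is an assumed tuning value, not a disclosed quantity.

```python
import cv2

def display_update_percentage(prev_gray, curr_gray, noise_floor=25):
    """Percentage of display pixels that changed between consecutive
    grayscale frames. noise_floor is an assumed per-pixel intensity
    delta used to reject camera noise."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, changed = cv2.threshold(diff, noise_floor, 255, cv2.THRESH_BINARY)
    return 100.0 * cv2.countNonZero(changed) / changed.size
```

Summing these per-frame percentages and dividing by one hundred yields the effective number of full display frames, which block 34 can divide by the elapsed time to obtain the average frame rate.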
FIG. 3 shows a more detailed example of a method 36 of evaluating a touch screen. In particular, the method 36 may be used to implement one or more aspects of the method 22 (FIG. 2A) and/or the method 30 (FIG. 2B), already discussed. Block 38 may load a video of an interaction with the touch screen, wherein the first frame of the video can be examined at block 40. Blocks 42, 44 and 46 provide for searching the video on a frame-by-frame basis for a finger (or robotic arm, etc.). As already noted, the search may involve using a computer vision process configured to analyze pixel data and determine whether one or more objects in the pixel data correspond to a finger. If so, illustrated block 48 determines whether the finger has moved since the last frame. Accordingly, the determination at block 48 can involve a comparative analysis between two or more frames in the video content. If movement of the finger is detected, a response time/delay counter may be started at block 50.
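The computer vision process itself is left open by the description. Purely as one possible realization of blocks 42-48, a skin-color segmentation pass might locate the finger in each frame; the HSV bounds and minimum blob area below are illustrative assumptions.

```python
import cv2
import numpy as np

def find_finger(frame_bgr, min_area=500.0):
    """Return the (x, y) centroid of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone bounds; a robotic arm or stylus would need its own range.
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x API
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area:
        return None
    m = cv2.moments(largest)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

Movement at block 48 could then be inferred by comparing the centroids returned for successive frames against a small displacement threshold.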
Block 52 may advance one frame, wherein the next frame can be examined at block 54. If it is determined at block 56 that the display has not been updated, it may be inferred that the touch screen has not yet responded to the gesture and the frame advance and examination at blocks 52 and 54 can be repeated. If a display update is detected (e.g., by virtue of a significant difference between the pixel data of successive frames), illustrated block 58 stops the response delay counter and reports the value of the counter as the response time for the touch screen.
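Blocks 50 through 58 amount to counting frames between the first detected finger motion and the first display update. A sketch of that loop follows; the motion and update tests are abstracted as caller-supplied functions because the description does not fix their implementation.

```python
def count_response_frames(cap, finger_moved, display_updated):
    """Frames between first finger motion and first display update
    (blocks 50-58). `cap` is a cv2.VideoCapture; the two callbacks each
    take (previous_frame, current_frame) and return a bool."""
    ok, prev = cap.read()
    delay = None
    while ok:
        ok, frame = cap.read()
        if not ok:
            break
        if delay is None:
            if finger_moved(prev, frame):     # block 48: start the counter
                delay = 0
        else:
            delay += 1                        # blocks 52-54: advance, examine
            if display_updated(prev, frame):  # block 56: first display update
                return delay                  # block 58: report response time
        prev = frame
    return None

# Response time in seconds would be the returned count divided by camera_fps.
```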
Block 60 may provide for calculating a display area update percentage and resetting a "no motion" counter. As already noted, the display area update percentage can enable a particularly useful approach to frame rate calculation. With regard to the no motion counter, it may be noted that display area updates could be conducted relatively slowly depending upon the hardware and/or software configuration of the touch screen device. Thus, simply advancing one frame as in block 62, examining the frame as in block 64, and determining whether another display update has taken place as in block 66, might yield "false negatives" with regard to the video content containing motion. The no motion counter may therefore be used in conjunction with a threshold to trigger the frame rate determination. In particular, if it is determined at block 66 that a display update has not occurred, then the no motion counter can be incremented at block 68 and compared to the threshold at block 70. The threshold, which may be set based on experimental or other data, can take into consideration device latencies that may cause the touch screen to appear as though it is done responding to the gesture. Thus, if the threshold is not exceeded, another frame advance may be conducted at block 62 and the frame rate determination is effectively deferred. Otherwise, illustrated block 72 provides for outputting the number of observed frames and the average frame rate.
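The interplay of blocks 60 through 72 can be summarized in the loop below, where NO_MOTION_LIMIT stands in for the experimentally derived threshold the description mentions, and the update_percentage callback could be the display_update_percentage sketch given earlier.

```python
NO_MOTION_LIMIT = 30  # assumed threshold (block 70); tuned experimentally

def observe_effective_frames(frames, display_updated, update_percentage):
    """Accumulate effective full-display frames until the no motion
    threshold trips (blocks 60-72). `frames` is an iterator of grayscale
    frames starting at the first detected display update."""
    frames_observed = 0.0
    no_motion = 0
    prev = next(frames)
    for curr in frames:
        if display_updated(prev, curr):                               # block 66
            frames_observed += update_percentage(prev, curr) / 100.0  # block 60
            no_motion = 0                               # reset no motion counter
        else:
            no_motion += 1                                            # block 68
            if no_motion > NO_MOTION_LIMIT:                           # block 70
                break                                       # stop deferring
        prev = curr
    return frames_observed                              # reported at block 72
```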
Of particular note is that if it is determined at block 66 that the display has been updated, another display area update percentage calculation can be conducted at block 60. For example, if 25% of the display area is updated each time a display update is detected, and one hundred display updates are detected, block 60 may determine that twenty-five frames have been observed (i.e., twenty-five percent times one hundred). Accordingly, block 72 may use the number of frames that have been observed and knowledge of the amount of time that has expired to determine the average frame rate for the touch screen.
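Following the description's own numbers, the arithmetic performed at block 72 might look as follows; the elapsed time is a made-up figure for illustration.

```python
frames_observed = 0.25 * 100   # 25% per update x 100 updates = 25 effective frames
elapsed_seconds = 2.0          # hypothetical: observed frame span / camera_fps
average_frame_rate = frames_observed / elapsed_seconds
print(f"average frame rate: {average_frame_rate:.1f} Hz")  # 12.5 Hz here
```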
FIG. 4 shows a computing system 74 having a processor 76, system memory 78, a platform controller hub (PCH) 80, mass storage (e.g., hard disk drive/HDD, optical disk, flash memory) 82, a network controller 84, and various other controllers (not shown). The computing system 74 could be readily substituted for the computing system 18 (FIG. 1), already discussed. The platform 74 could be part of a mobile platform such as a laptop, personal digital assistant (PDA), mobile Internet device (MID), wireless smart phone, etc., or any combination thereof. In addition, the platform 74 may also be part of a fixed platform such as a personal computer (PC), server, workstation, etc. Thus, the processor 76 may include one or more processor cores 86 capable of executing a set of stored logic instructions, and an integrated memory controller (IMC) 88 configured to communicate with the system memory 78. The system memory 78 could include dynamic random access memory (DRAM) configured as a memory module such as a dual inline memory module (DIMM), a small outline DIMM (SODIMM), etc.
The illustrated PCH 80, sometimes referred to as a Southbridge of a chipset, functions as a host device and may communicate with the network controller 84, which could provide off-platform wireless communication functionality for a wide variety of purposes such as cellular telephone (e.g., Wideband Code Division Multiple Access/W-CDMA (Universal Mobile Telecommunications System/UMTS), CDMA2000 (IS-856/IS-2000), etc.), Wi-Fi (Wireless Fidelity, e.g., Institute of Electrical and Electronics Engineers/IEEE 802.11-2007, Wireless Local Area Network/LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications), LR-WPAN (Low-Rate Wireless Personal Area Network, e.g., IEEE 802.15.4-2006), Bluetooth (e.g., IEEE 802.15.1-2005, Wireless Personal Area Networks), WiMax (e.g., IEEE 802.16-2004, LAN/MAN Broadband Wireless LANS), GPS (Global Positioning System), spread spectrum (e.g., 900 MHz), and other RF (radio frequency) telephony purposes. The network controller 84 may also provide off-platform wired communication (e.g., RS-232 (Electronic Industries Alliance/EIA), Ethernet (e.g., IEEE 802.3-2005), power line communication (e.g., X10, IEEE P1675), USB (e.g., Universal Serial Bus, e.g., USB Specification 3.0, Rev. 1.0, November 12, 2008, USB Implementers Forum), DSL (digital subscriber line), cable modem, T1 connection, etc., functionality.
The PCH 80 may also be coupled to a robotic arm 15 (FIG. 1) in order to facilitate the approximation of various gestures. In such a case, the processor 76 might instruct the robotic arm 15 (FIG. 1) to conduct various touch operations including, but not limited to, zoom, pan, rotate and/or tap operations on the display or other input component of the device.
In the illustrated example, the processor 76 may execute logic that receives a video 90 of an interaction with a touch screen from, for example, non-volatile memory such as the mass storage 82, the network controller 84 or other appropriate video source. The logic may also use a computer vision process to identify a gesture in the video, and determine a response time of the touch screen with respect to the gesture. In one example, the logic can track a display area update percentage subsequent to a response of the touch screen to the gesture, and conduct a frame rate determination for the touch screen based on the display area update percentage. The logic may also be configured to output the response time and frame rate 92 via, for example, a display (not shown) of the system 74.
Techniques described herein may therefore provide for automated, non-intrusive extraction of parameters of perceptual models for a wide variety of user scenarios and touch screen devices. Accordingly, accuracy challenges associated with manual extraction of parameters may be obviated through the use of a solution that is readily scalable and easily integrated into an overall automation framework of experience evaluation.
Embodiments of the present invention are applicable for use with all types of semiconductor integrated circuit ("IC") chips. Examples of these IC chips include but are not limited to processors, controllers, chipset components, programmable logic arrays (PLAs), memory chips, network chips, and the like. In addition, in some of the drawings, signal conductor lines are represented with lines. Some may be different, to indicate more constituent signal paths, have a number label, to indicate a number of constituent signal paths, and/or have arrows at one or more ends, to indicate primary information flow direction. This, however, should not be construed in a limiting manner. Rather, such added detail may be used in connection with one or more exemplary embodiments to facilitate easier understanding of a circuit. Any represented signal lines, whether or not having additional information, may actually comprise one or more signals that may travel in multiple directions and may be implemented with any suitable type of signal scheme, e.g., digital or analog lines implemented with differential pairs, optical fiber lines, and/or single-ended lines.
Example sizes/models/values/ranges may have been given, although embodiments of the present invention are not limited to the same. As manufacturing techniques (e.g., photolithography) mature over time, it is expected that devices of smaller size could be manufactured. In addition, well known power/ground connections to IC chips and other components may or may not be shown within the figures, for simplicity of illustration and discussion, and so as not to obscure certain aspects of the embodiments of the invention. Further, arrangements may be shown in block diagram form in order to avoid obscuring embodiments of the invention, and also in view of the fact that specifics with respect to implementation of such block diagram arrangements are highly dependent upon the platform within which the embodiment is to be implemented, i.e., such specifics should be well within purview of one skilled in the art. Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the invention, it should be apparent to one skilled in the art that embodiments of the invention can be practiced without, or with variation of, these specific details. The description is thus to be regarded as illustrative instead of limiting.
The term "coupled" may be used herein to refer to any type of relationship, direct or indirect, between the components in question, and may apply to electrical, mechanical, fluid, optical, electromagnetic, electromechanical or other connections. In addition, the terms "first", "second", etc. might be used herein only to facilitate discussion, and carry no particular temporal or chronological significance unless otherwise indicated.
Those skilled in the art will appreciate from the foregoing description that the broad techniques of the embodiments of the present invention can be implemented in a variety of forms. Therefore, while the embodiments of this invention have been described in connection with particular examples thereof, the true scope of the embodiments of the invention should not be so limited since other modifications will become apparent to the skilled practitioner upon a study of the drawings, specification, and following claims.

Claims

We claim:
1. A method comprising:
receiving a video of an interaction with a touch screen;
using a computer vision process to identify a gesture in the video;
detecting a first display update on the touch screen;
determining a response time of the touch screen with respect to the gesture, wherein determining the response time includes counting one or more frames between the gesture and the first display update;
tracking a display area update percentage subsequent to the first display update, wherein the display area update percentage is tracked on a frame-by-frame basis;
conducting a frame rate determination for the touch screen based on the display area update percentage; and
outputting the response time and a result of the frame rate determination.
2. The method of claim 1, further including deferring the frame rate determination if a second display update has occurred on the touch screen.
3. The method of claim 2, further including using a no motion counter and a threshold to trigger the frame rate determination.
4. The method of claim 2, further including detecting the second display update, wherein the first display update and the second display update are detected on a frame-by-frame basis.
5. A computing system comprising:
a non-volatile memory to store a video of an interaction with a touch screen; and
a chipset including logic to,
receive the video,
use a computer vision process to identify a gesture in the video, and
determine a response time of the touch screen with respect to the gesture.
6. The system of claim 5, wherein the logic is to count one or more frames between the gesture and a first display update on the touch screen.
7. The system of claim 6, wherein the logic is to detect the first display update on a frame-by-frame basis.
8. The system of claim 5, wherein the logic is to,
track a display area update percentage subsequent to a response of the touch screen to the gesture,
conduct a frame rate determination for the touch screen based on the display area update percentage, and
output the response time and a result of the frame rate determination.
9. The system of claim 8, wherein the display area update percentage is to be tracked on a frame-by-frame basis.
10. The system of claim 8, wherein the logic is to defer the frame rate determination if a second display update has occurred on the touch screen.
11. The system of claim 10, wherein the logic is to use a no motion counter and a threshold to trigger the frame rate determination.
12. The system of claim 10, wherein the logic is to detect the second display update on a frame-by-frame basis.
13. A computer readable storage medium comprising a set of instructions which, if executed by a processor, cause a computer to:
receive a video of an interaction with a touch screen;
use a computer vision process to identify a gesture in the video; and
determine a response time of the touch screen with respect to the gesture.
14. The medium of claim 13, wherein the instructions, if executed, cause a computer to count one or more frames between the gesture and a first display update on the touch screen.
15. The medium of claim 14, wherein the instructions, if executed, cause a computer to detect the first display update on a frame-by-frame basis.
16. The medium of claim 13, wherein the instructions, if executed, cause a computer to:
track a display area update percentage subsequent to a response of the touch screen to the gesture;
conduct a frame rate determination for the touch screen based on the display area update percentage; and
output the response time and a result of the frame rate determination.
17. The medium of claim 16, wherein the display area update percentage is to be tracked on a frame-by-frame basis.
18. The medium of claim 16, wherein the instructions, if executed, cause a computer to defer the frame rate determination if a second display update has occurred on the touch screen.
19. The medium of claim 18, wherein the instructions, if executed, cause a computer to use a no motion counter and a threshold to trigger the frame rate determination.
20. The medium of claim 18, wherein the instructions, if executed, cause a computer to detect the second display update on a frame-by-frame basis.
21. A method comprising:
receiving a video of an interaction with a touch screen;
using a computer vision process to identify a gesture in the video; and
determining a response time of the touch screen with respect to the gesture.
22. The method of claim 21, wherein determining the response time includes counting one or more frames between the gesture and a first display update on the touch screen.
23. The method of claim 22, further including detecting the first display update on a frame-by-frame basis.
24. The method of claim 21, further including:
tracking a display area update percentage subsequent to a response of the touch screen to the gesture;
conducting a frame rate determination for the touch screen based on the display area update percentage; and
outputting the response time and a result of the frame rate determination.
25. The method of claim 24, wherein the display area update percentage is tracked on a frame-by-frame basis.
26. The method of claim 24, further including deferring the frame rate determination if a second display update has occurred on the touch screen.
27. The method of claim 26, further including using a no motion counter and a threshold to trigger the frame rate determination.
28. The method of claim 26, further including detecting the second display update on a frame-by-frame basis.
PCT/US2012/025282 2011-06-30 2012-02-15 Automated perceptual quality assessment of touch screen devices WO2013122586A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
DE112012005875.5T DE112012005875T5 (en) 2012-02-15 2012-02-15 Automatic, continuous quality assessment of touchscreen devices
CN201280069833.2A CN104115097B (en) 2012-02-15 2012-02-15 The automation perceived quality assessment of touch panel device
US13/997,044 US9298312B2 (en) 2011-06-30 2012-02-15 Automated perceptual quality assessment of touchscreen devices
PCT/US2012/025282 WO2013122586A2 (en) 2012-02-15 2012-02-15 Automated perceptual quality assessment of touch screen devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2012/025282 WO2013122586A2 (en) 2012-02-15 2012-02-15 Automated perceptual quality assessment of touch screen devices

Publications (2)

Publication Number Publication Date
WO2013122586A2 true WO2013122586A2 (en) 2013-08-22
WO2013122586A3 WO2013122586A3 (en) 2013-10-17

Family

ID=48984872

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/025282 WO2013122586A2 (en) 2011-06-30 2012-02-15 Automated perceptual quality assessment of touch screen devices

Country Status (3)

Country Link
CN (1) CN104115097B (en)
DE (1) DE112012005875T5 (en)
WO (1) WO2013122586A2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107341818A (en) * 2016-04-29 2017-11-10 北京博酷科技有限公司 Image analysis algorithm for the test of touch-screen response performance

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7331523B2 (en) * 2001-07-13 2008-02-19 Hand Held Products, Inc. Adaptive optical image reader
CN101359367B (en) * 2008-09-11 2010-09-29 西安理工大学 Static gesture characteristic describing method based on tetragon skeleton structure
CN101763515B (en) * 2009-09-23 2012-03-21 中国科学院自动化研究所 Real-time gesture interaction method based on computer vision
CN101719015B (en) * 2009-11-03 2011-08-31 上海大学 Method for positioning finger tips of directed gestures
JP2011170856A (en) * 2010-02-22 2011-09-01 Ailive Inc System and method for motion recognition using a plurality of sensing streams

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060121330A (en) * 2005-05-24 2006-11-29 주식회사 신엽테크 A keyped and a color liquid crystal screen image inspection system of a post pc robot and goods's of wide range which uses the vision
US20070176871A1 (en) * 2006-01-27 2007-08-02 Mstar Semiconductor, Inc. Measurement device for measuring gray-to-gray response time
KR20110077536A (en) * 2009-12-30 2011-07-07 이종욱 Apparatus for measuring response time of electrostatic capacity type touch screen panel
US20110292288A1 (en) * 2010-05-25 2011-12-01 Deever Aaron T Method for determining key video frames
WO2011161316A1 (en) * 2010-06-22 2011-12-29 Pitkaenen Janne Apparatus and method for testing usability

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9652077B2 (en) 2010-12-09 2017-05-16 T-Mobile Usa, Inc. Touch screen testing platform having components for providing conductivity to a tip
US10120474B2 (en) 2010-12-09 2018-11-06 T-Mobile Usa, Inc. Touch screen testing platform for engaging a dynamically positioned target feature
US10953550B2 (en) 2010-12-09 2021-03-23 T-Mobile Usa, Inc. Touch screen testing platform for engaging a dynamically positioned target feature
US11724402B2 (en) 2010-12-09 2023-08-15 T-Mobile Usa, Inc. Touch screen testing platform for engaging a dynamically positioned target feature
US8823794B2 (en) 2011-06-30 2014-09-02 Intel Corporation Measuring device user experience through display outputs
US9298312B2 (en) 2011-06-30 2016-03-29 Intel Corporation Automated perceptual quality assessment of touchscreen devices
EP3039517A4 (en) * 2013-10-07 2017-08-02 Tactual Labs Co. Latency measuring and testing system and method
WO2015200025A1 (en) * 2014-06-25 2015-12-30 T-Mobile Usa, Inc. Touch screen testing platform having components for providing conductivity to a tip
CN106991034A (en) * 2017-04-01 2017-07-28 奇酷互联网络科技(深圳)有限公司 A kind of method and apparatus and mobile terminal for monitoring interim card

Also Published As

Publication number Publication date
WO2013122586A3 (en) 2013-10-17
DE112012005875T5 (en) 2014-10-30
CN104115097B (en) 2017-06-09
CN104115097A (en) 2014-10-22

Similar Documents

Publication Publication Date Title
US8823794B2 (en) Measuring device user experience through display outputs
WO2013122586A2 (en) Automated perceptual quality assessment of touch screen devices
US9182826B2 (en) Gesture-augmented speech recognition
TWI543069B (en) Electronic apparatus and drawing method and computer products thereof
US9298312B2 (en) Automated perceptual quality assessment of touchscreen devices
US11017296B2 (en) Classifying time series image data
US10048762B2 (en) Remote control of a desktop application via a mobile device
CN113436100B (en) Method, apparatus, device, medium, and article for repairing video
CN111947903A (en) Vibration abnormity positioning method and device
CN109962983B (en) Click rate statistical method and device
CN111507924A (en) Video frame processing method and device
US20150074597A1 (en) Separate smoothing filter for pinch-zooming touchscreen gesture response
CN112101109B (en) Training method and device for face key point detection model, electronic equipment and medium
CN104933688B (en) Data processing method and electronic equipment
CN108604142B (en) Touch screen device operation method and touch screen device
CN110956131B (en) Single-target tracking method, device and system
US20130335360A1 (en) Touch screen interaction methods and apparatuses
CN114740975A (en) Target content acquisition method and related equipment
CN108696722B (en) Target monitoring method, system and device and storage medium
JP7280142B2 (en) Information processing device, program, and imaging system
US10606419B2 (en) Touch screen control
WO2023024986A1 (en) Method, apparatus, device, and medium for determining video smoothness
WO2023005725A1 (en) Pose estimation method and apparatus, and device and medium
CN110210306B (en) Face tracking method and camera
US20230334903A1 (en) Techniques and Apparatuses that Implement Camera Manager Systems Capable of Generating Frame Suggestions from a Set of Frames

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12868450

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 1120120058755

Country of ref document: DE

Ref document number: 112012005875

Country of ref document: DE

WWE Wipo information: entry into national phase

Ref document number: 13997044

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 12868450

Country of ref document: EP

Kind code of ref document: A2