CN112000571A - Test method, test device and test device - Google Patents


Info

Publication number
CN112000571A
CN112000571A (application number CN202010747557.7A)
Authority
CN
China
Prior art keywords
video frame
pointer information
user interface
pixel data
information area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010747557.7A
Other languages
Chinese (zh)
Inventor
宫在军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sogou Technology Development Co Ltd
Original Assignee
Beijing Sogou Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sogou Technology Development Co Ltd
Priority to CN202010747557.7A
Publication of CN112000571A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
        • G06F11/3684 Test management for test design, e.g. generating new test cases
        • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
        • G06F11/3692 Test management for test results analysis

Abstract

Embodiments of the present application disclose a test method, a testing apparatus, and an apparatus for testing. An embodiment of the method comprises: enabling the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system; performing a preset operation at a designated position in the user interface; and determining the response time of the user interface to the preset operation based on the content change of the pointer information area and the content change of the target response area corresponding to the designated position. This embodiment improves testing efficiency and thereby reduces labor cost.

Description

Test method, testing apparatus, and apparatus for testing
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to a test method, a testing apparatus, and an apparatus for testing.
Background
When a user performs an operation through an electronic device (e.g., a mobile phone or tablet computer), the User Interface (UI) of the device must first respond to the operation. If the response time is too long, the user has to wait, which easily leads to user attrition. It is therefore necessary to test the response time of an electronic device's user interface so that the product can be improved.
Since the response time is the difference between the moment a test operation (e.g., a click) is performed on the user interface and the moment the user interface presents the response result, the test operation performed on the user interface must be identified accurately. In the prior art, whether an operation has occurred on the user interface is generally determined by obtaining root permission for the operating system (e.g., the Android operating system) and calling low-level functions. However, as operating systems become increasingly closed, the time cost for a tester to obtain root permission grows, making it harder to detect operation behavior on the user interface; this reduces testing efficiency and increases labor cost.
Disclosure of Invention
Embodiments of the present application provide a test method, a testing apparatus, and an apparatus for testing, so as to improve testing efficiency and reduce labor cost.
In a first aspect, an embodiment of the present application provides a test method, the method comprising: enabling the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system; performing a preset operation at a designated position in the user interface; and determining the response time of the user interface to the preset operation based on the change in the pointer information area and the change in the target response area corresponding to the designated position.
In a second aspect, an embodiment of the present application provides a testing apparatus, comprising: a starting unit configured to enable the pointer function of an Android system so that a pointer information area is presented in the user interface of the Android system; an operation unit configured to perform a preset operation at a designated position in the user interface; and a determining unit configured to determine the response time of the user interface to the preset operation based on the change in the pointer information area and the change in the target response area corresponding to the designated position.
In a third aspect, an embodiment of the present application provides an apparatus for testing, comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for: enabling the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system; performing a preset operation at a designated position in the user interface; and determining the response time of the user interface to the preset operation based on the change in the pointer information area and the change in the target response area corresponding to the designated position.
In a fourth aspect, embodiments of the present application provide a computer-readable medium on which a computer program is stored, which when executed by a processor, implements the method as described in the first aspect above.
According to the test method, the testing apparatus, and the apparatus for testing provided by the embodiments of the application, a pointer information area is presented in the user interface of the Android system by enabling the pointer function of the Android system. A preset operation is then performed at a designated position in the user interface. Since the pointer information area changes when the preset operation is performed on the user interface, and the target response area changes after the user interface responds to the preset operation, the response time can be determined from the change in the pointer information area and the change in the target response area. In the test process, the response time of the user interface to the preset operation can therefore be detected accurately without the tester obtaining root permission for the operating system, which improves testing efficiency and reduces labor cost.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a flow diagram of one embodiment of a testing method according to the present application;
FIG. 2 is a schematic illustration of a variation of a pointer information region according to the present application;
FIG. 3 is a flow chart of yet another embodiment of a testing method according to the present application;
FIG. 4 is a schematic block diagram of one embodiment of a test apparatus according to the present application;
FIG. 5 is a schematic diagram of an apparatus for testing according to the present application;
FIG. 6 is a schematic diagram of a server in accordance with some embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to FIG. 1, a flow 100 of one embodiment of a test method according to the present application is shown. The test method may be executed by various electronic devices, including but not limited to: a server, a smartphone, a tablet computer, a laptop computer, a desktop computer, and the like. The electronic device may serve as the test device, communicatively connected to the Android system under test.
The input method application mentioned in the embodiments of the application may support various input methods. An input method is an encoding scheme used to enter symbols into electronic devices such as computers and mobile phones, and a user can conveniently enter a desired character or character string into the electronic device using an input method application. It should be noted that, in addition to common Chinese input methods (such as the Pinyin, Wubi, Zhuyin, phonetic, and handwriting input methods), the input method may also support other languages (such as English, Japanese hiragana, or Korean input methods); the embodiments impose no limitation on the input method or its language.
The test method in this embodiment may include the following steps:
Step 101: enable the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system.
In this embodiment, the execution subject of the test method (e.g., a test device) may test the response time of the user interface of the Android system. In practice, the test may be performed on a physical device running the Android system, or on a virtual machine running the Android system.
In this embodiment, the execution subject may enable the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system. The settings page of the Android system usually contains developer options, and the pointer function can be enabled by turning on the pointer location switch in the developer options. Once the pointer function is enabled, the pointer information area is displayed in the user interface. The pointer information area may be located in a preset region of the user interface, such as the top of the user interface, which is not limited here.
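As a hedged illustration (not part of the patent text): on stock Android builds, the "Pointer location" developer option corresponds to the system setting `pointer_location`, so a test device connected over adb could toggle it programmatically. The sketch below only builds the adb command; actually running it against a device is left to the caller, and the setting name should be treated as an assumption on heavily customized builds.

```python
from typing import List, Optional

def pointer_location_cmd(enable: bool, serial: Optional[str] = None) -> List[str]:
    """Build the adb command that turns the pointer-location overlay on or off."""
    cmd = ["adb"]
    if serial:  # target a specific device/emulator if a serial is given
        cmd += ["-s", serial]
    cmd += ["shell", "settings", "put", "system",
            "pointer_location", "1" if enable else "0"]
    return cmd

# A real test would execute the command, e.g. subprocess.run(cmd, check=True).
```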
The pointer information area may contain real-time touch data. The touch data may include, but is not limited to, contact count information (which may be denoted as P), the current movement distance of the contact, the current movement speed of the contact, the current contact pressure (which may be denoted as Prs), and the current contact area (which may be denoted as Size). The current movement distance of the contact may be represented as a movement distance component dX along the X axis and a movement distance component dY along the Y axis; the current movement speed may be represented as a speed component Xv along the X axis and a speed component Yv along the Y axis. When there is no touch operation on the screen, each value may be 0.
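If the overlay text is recovered (for example by OCR on the cropped pointer information area), the fields above can be parsed into numbers. The sketch below is a minimal, hypothetical parser matching the "Prs: 1.0"-style pairs quoted in this description; the exact on-screen format varies across Android versions.

```python
import re
from typing import Dict

# Hypothetical parser for the pointer-overlay fields quoted in the description.
FIELDS = ("dX", "dY", "Xv", "Yv", "Prs", "Size")

def parse_pointer_info(text: str) -> Dict[str, float]:
    """Extract numeric touch data such as dX, dY, Xv, Yv, Prs, Size."""
    data = {}
    for name in FIELDS:
        m = re.search(rf"{name}\s*:\s*(-?\d+(?:\.\d+)?)", text, flags=re.IGNORECASE)
        if m:
            data[name] = float(m.group(1))
    return data

def is_touched(data: Dict[str, float]) -> bool:
    # With no touch on the screen every value is 0, so non-zero pressure
    # or contact size indicates an active contact.
    return data.get("Prs", 0.0) > 0 or data.get("Size", 0.0) > 0
```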
Step 102: perform a preset operation at a designated position in the user interface.
In this embodiment, the execution subject may send a control instruction to the Android system to perform the preset operation on the user interface of the Android system. The control instruction may include the coordinates of the designated position, so that the preset operation can be performed at the position indicated by the coordinates. The preset operation here may include, but is not limited to, a click operation, a long-press operation, a touch operation, and the like.
As an example, suppose the application under test is an input method application and the key response time of the input method application needs to be measured; the designated position may then be a position within a key to be tested on the keyboard interface of the input method application. If the preset operation is a click, the execution subject may obtain in advance the coordinates of the key under test in the user interface of the application under test, and then send a click instruction for that coordinate position to the application, thereby performing the click at the designated position in the user interface.
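To make the control instruction concrete: over adb, a click can be delivered with `input tap`, and a long press is commonly emulated as a zero-length swipe held for some duration (the `input` tool has no dedicated long-press subcommand). This is an illustrative sketch under those assumptions, not the patent's prescribed mechanism; the coordinates stand in for the pre-acquired key coordinates.

```python
from typing import List

def tap_cmd(x: int, y: int) -> List[str]:
    """adb command for a click at screen coordinate (x, y)."""
    return ["adb", "shell", "input", "tap", str(x), str(y)]

def long_press_cmd(x: int, y: int, duration_ms: int = 800) -> List[str]:
    """Long press emulated as a swipe that starts and ends at (x, y)."""
    return ["adb", "shell", "input", "swipe",
            str(x), str(y), str(x), str(y), str(duration_ms)]
```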
Step 103: determine the response time of the user interface to the preset operation based on the change in the pointer information area and the change in the target response area corresponding to the designated position.
In this embodiment, when the preset operation is performed at the designated position in the user interface, the pointer information area changes. The change in the pointer information area may include a change in the touch data within the area and a change in the presentation style of the area.
Taking fig. 2 as an example, fig. 2 shows a schematic diagram of the change process of the pointer information area. When no preset operation (such as a click) is performed, the touch data in the pointer information area are, in order, "P: 0/0", "dX: 0.0", "dY: 0.0", "Xv: 0.0", "Yv: 0.0", "Prs: 0", and "Size: 0", and the background color of the pointer information area is transparent. When a click is performed at a certain position, the touch data become, in order: "P: 1/1", "dX: 8.0", "dY: 9.0", "Xv: 0.3", "Yv: 0.2", "Prs: 1.0", and "Size: 0.1". Meanwhile, the background color of the region where the current contact pressure Prs is displayed changes from transparent to opaque.
In this embodiment, the change in the target response area may include a change in the characters in the area, a change in its presentation style, and the like. As an example, suppose the application under test is an input method application and the click response time of the key "h" on the input method keyboard needs to be measured: after the key "h" is clicked, the character "h" appears in the target response area (e.g., the screen-up area of the input method application, where committed text is displayed). At this point the characters in the target response area have changed.
As another example, if the click response time of the numeric-keyboard switch key of the input method application needs to be measured, then after the numeric-keyboard switch key is clicked, the target response area (e.g., the keyboard area of the input method application) is replaced by the numeric keyboard. Both the characters and the presentation style of the target response area change.
In this embodiment, since the pointer information area changes when the preset operation is performed at the designated position in the user interface, and the target response area changes when the user interface responds to the preset operation, the execution subject may determine the response time of the user interface to the preset operation based on the change in the pointer information area and the change in the target response area corresponding to the designated position. Here, the response time is specifically the interval from the moment the pointer information area changes to the moment the target response area changes.
In some optional implementations of this embodiment, the execution subject may first start screen recording and then perform the preset operation at the designated position in the user interface while recording. In this way, consecutive video frames are captured during the test. These frames record the whole process from the execution of the preset operation to the end of the user interface's response, so the response time of the user interface to the preset operation can be determined by analyzing them.
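One concrete way to capture the consecutive frames (an assumption; the description does not mandate a tool) is the stock `screenrecord` utility, available over adb on Android 4.4 and later. The sketch builds the command; the output path and limits are only examples.

```python
from typing import List

def screenrecord_cmd(out_path: str = "/sdcard/ui_test.mp4",
                     bit_rate: int = 8_000_000,
                     time_limit_s: int = 10) -> List[str]:
    """adb command that records the device screen to out_path."""
    return ["adb", "shell", "screenrecord",
            "--bit-rate", str(bit_rate),
            "--time-limit", str(time_limit_s),
            out_path]

# After recording, "adb pull /sdcard/ui_test.mp4" would fetch the video for
# frame-by-frame analysis on the test machine.
```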
In some optional implementations of the present embodiment, the execution subject may determine the response time of the user interface for the preset operation according to the following sub-steps S11 to S12:
Sub-step S11: search for a first video frame in which the pointer information area has changed and a second video frame in which the target response area has changed.
The pointer information area generally does not change while no operation is performed on the user interface. When the preset operation is performed, the display screen is touched and the pointer information area changes. Therefore, the captured video frames can be checked in sequence for changes in the pointer information area, and the frame in which the change appears is taken as the first video frame. In practice, if the pointer information area changes in multiple video frames, the earliest such frame may be taken as the first video frame.
For example, suppose the background color of the pointer information area is transparent and the touch data are, in order, "P: 0/0", "dX: 0.0", "dY: 0.0", "Xv: 0.0", "Yv: 0.0", "Prs: 0", and "Size: 0". When the preset operation is performed on the user interface, the display screen is touched and the touch data change, for example to "P: 1/1", "dX: -28.0", "dY: 169.0", "Xv: 0.117", "Yv: 0.258", "Prs: 1.0", and "Size: 0.05". Meanwhile, the background color of the region where "Prs: 1.0" is displayed becomes opaque. The execution subject can compare the pointer information areas across video frames using image recognition techniques to detect the first video frame in which the area has changed.
Sub-step S12: determine the interval between capturing the first video frame and capturing the second video frame, and take this interval as the response time of the user interface to the preset operation.
Here, the interval between capturing the first and second video frames can be determined from their order of capture and the capture time of each frame. As an example, if the recorded video contains 60 video frames and the recording duration is 1000 ms (milliseconds), the capture time of a single frame is 1000/60 ms. If image recognition determines that the pointer information area changed in the 10th frame and the target response area changed in the 15th frame, the response time is (15 - 10) × 1000/60 ms, i.e., approximately 83.3 ms.
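The arithmetic in the worked example above can be captured in a few lines (a direct transcription of the example; the function and parameter names are chosen here for illustration):

```python
def response_time_ms(pointer_change_frame: int, response_change_frame: int,
                     total_frames: int, duration_ms: float) -> float:
    """Interval between the first and second video frames, in milliseconds."""
    frame_ms = duration_ms / total_frames  # capture time of a single frame
    return (response_change_frame - pointer_change_frame) * frame_ms

# The example above: changes at frame 10 and frame 15 in a 60-frame, 1000 ms
# recording give (15 - 10) * 1000/60 ms, about 83.3 ms.
```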
In some optional implementations of this embodiment, an input method application under test may run in the Android system, and the designated position is a key to be tested on the keyboard interface of the input method application. In this way, the key response time of the input method application can be measured.
According to the method provided by the embodiments of the application, a pointer information area is presented in the user interface of the Android system by enabling the pointer function of the Android system. A preset operation is then performed at the designated position in the user interface. Since the pointer information area changes when the preset operation is performed on the user interface, and the target response area changes after the user interface responds to the preset operation, the response time can be determined from the changes in the two areas. In the test process, the response time of the user interface to the preset operation can therefore be detected accurately without the tester obtaining root permission for the operating system, which improves testing efficiency and reduces labor cost.
With further reference to fig. 3, a flow 300 of yet another embodiment of a testing method is shown. The process 300 of the test method includes the following steps:
Step 301: enable the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system.
Step 301 in this embodiment may refer to step 101 in the embodiment corresponding to fig. 1 and is not described again here.
Step 302: start screen recording, and perform the preset operation at the designated position in the user interface during recording.
In this embodiment, the execution subject may first start screen recording and then perform the preset operation at the designated position in the user interface while recording. In this way, consecutive video frames are captured during the test. These frames record the whole process from the execution of the preset operation to the end of the user interface's response, so the response time of the user interface to the preset operation can be determined by analyzing them.
The specific manner of performing the preset operation at the designated position in the user interface in this embodiment may refer to step 102 in the embodiment corresponding to fig. 1 and is not described again here.
Step 303: search for a first video frame in which the pointer information area has changed and a second video frame in which the target response area has changed.
In this embodiment, the pointer information area generally does not change while no operation is performed on the user interface. When the preset operation is performed, the display screen is touched and the pointer information area changes. Therefore, the captured video frames can be checked in sequence for changes in the pointer information area, and the frame in which the change appears is taken as the first video frame. In practice, if the pointer information area changes in multiple video frames, the earliest such frame may be taken as the first video frame.
For example, suppose the background color of the pointer information area is transparent and the touch data are, in order, "P: 0/0", "dX: 0.0", "dY: 0.0", "Xv: 0.0", "Yv: 0.0", "Prs: 0", and "Size: 0". When the preset operation is performed on the user interface, the display screen is touched and the touch data change, for example to "P: 1/1", "dX: -28.0", "dY: 169.0", "Xv: 0.117", "Yv: 0.258", "Prs: 1.0", and "Size: 0.05". Meanwhile, the background color of the region where "Prs: 1.0" is displayed becomes opaque. The execution subject can compare the pointer information areas across video frames using image recognition techniques to detect the first video frame in which the area has changed.
In some optional implementations of the present embodiment, the execution subject may determine the first video frame according to the following sub-steps S21 to S22:
Sub-step S21: extract, from the recorded video frames, first pixel data of a first sub-area located within the pointer information area. The first sub-area may be a sub-area whose style (e.g., background color) changes, such as the region where "Prs: 1.0" is displayed, mentioned above.
Sub-step S22: match the first pixel data against first reference pixel data acquired in advance, and determine, based on the matching result, the first video frame in which the pointer information area has changed.
Here, the first reference pixel data are the pixel data of the first sub-area before the preset operation is performed. Taking similarity calculation as the matching method: the execution subject may first determine a first similarity between the first pixel data and the first reference pixel data. Each may be regarded as a pixel matrix, and the first similarity between them determined by a matrix similarity calculation. Then, the video frame whose first similarity is smaller than a first preset threshold (e.g., 90%) is determined as the first video frame in which the pointer information area has changed. In this way, the first video frame can be determined accurately.
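A minimal sketch of sub-steps S21 and S22 (and, symmetrically, of the second-video-frame check on the target response area) under two simplifying assumptions: frames are already cropped to the sub-area, and similarity is the fraction of equal pixels. Production code would instead crop decoded video frames (e.g., numpy arrays) and could use a more robust matrix similarity.

```python
from typing import List, Optional, Sequence

Pixels = Sequence[Sequence[int]]  # a cropped sub-area as a 2-D pixel matrix

def similarity(a: Pixels, b: Pixels) -> float:
    """Fraction of matching pixels between two equal-sized pixel matrices."""
    total = sum(len(row) for row in a)
    same = sum(1 for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb) if pa == pb)
    return same / total

def first_changed_frame(regions: List[Pixels], reference: Pixels,
                        threshold: float = 0.9) -> Optional[int]:
    """Index of the first frame whose sub-area similarity drops below threshold."""
    for i, region in enumerate(regions):
        if similarity(region, reference) < threshold:
            return i
    return None
```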
In some optional implementations of the present embodiment, the execution subject may determine the first video frame according to the following sub-steps S31 to S33:
Sub-step S31: crop, from the recorded video frames, a second sub-area located within the pointer information area. The second sub-area may be a sub-area in which characters change, such as the region where "Prs: 1.0" is displayed, or the regions where "dX: -28.0" and "dY: 169.0" are displayed.
Sub-step S32: perform character recognition on the second sub-area to obtain the characters to be detected. Here, OCR (Optical Character Recognition) or a similar technique may be used to recognize the characters in the second sub-area.
Sub-step S33: determine the video frame in which the recognized characters are inconsistent with preset reference characters as the first video frame in which the pointer information area has changed.
In this way, the first video frame in which the pointer information area has changed can be determined accurately.
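Sub-steps S31 to S33 reduce to a string comparison once OCR has been run on each cropped second sub-area (the OCR call itself, e.g. pytesseract on the cropped image, is assumed and not shown here):

```python
from typing import List, Optional

def first_frame_with_changed_chars(ocr_per_frame: List[str],
                                   reference_chars: str) -> Optional[int]:
    """Earliest frame whose recognized characters differ from the reference
    characters captured before the preset operation."""
    for i, chars in enumerate(ocr_per_frame):
        if chars.strip() != reference_chars.strip():
            return i
    return None
```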
In some optional implementations of this embodiment, the execution subject may determine the second video frame according to the following sub-steps S41 to S42:
Sub-step S41: extract, from the recorded video frames, second pixel data located within the target response area.
Sub-step S42: match the second pixel data against second reference pixel data acquired in advance, and determine, based on the matching result, the second video frame in which the target response area has changed.
Here, the second reference pixel data are the pixel data of the target response area before the preset operation is performed. Taking similarity calculation as the matching method: the execution subject may first determine a second similarity between the second pixel data and the second reference pixel data. Each may be regarded as a pixel matrix, and the second similarity between them determined by a matrix similarity calculation. Then, the video frame whose second similarity is smaller than a second preset threshold (e.g., 90%) is determined as the second video frame in which the target response area has changed. In this way, the second video frame can be determined accurately.
Step 304: determine the interval between capturing the first video frame and capturing the second video frame, and take this interval as the response time of the user interface to the preset operation.
In this embodiment, the interval between capturing the first and second video frames can be determined from their order of capture and the capture time of each frame. As an example, if the recorded video contains 60 video frames and the recording duration is 1000 ms (milliseconds), the capture time of a single frame is 1000/60 ms. If image recognition determines that the pointer information area changed in the 10th frame and the target response area changed in the 15th frame, the response time is (15 - 10) × 1000/60 ms, i.e., approximately 83.3 ms.
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 1, the flow 300 of the test method in this embodiment involves recording the screen, automatically detecting, via image recognition, the first video frame in which the pointer information area changes and the second video frame in which the target response area changes, and determining the response time accordingly. Response-time testing can thus be automated, further improving testing efficiency.
With further reference to fig. 4, as an implementation of the method shown in the above figures, the present application provides an embodiment of a testing apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which can be applied to various electronic devices.
As shown in fig. 4, the testing apparatus 400 of this embodiment includes: a starting unit 401 configured to enable the pointer function of the Android system so that a pointer information area is presented in the user interface of the Android system; an operation unit 402 configured to perform a preset operation at a designated position in the user interface; and a determining unit 403 configured to determine the response time of the user interface to the preset operation based on the change in the pointer information area and the change in the target response area corresponding to the designated position.
In some optional implementations of this embodiment, the operation unit 402 is further configured to: start screen recording, and perform the preset operation at the designated position in the user interface during recording.
In some optional implementations of this embodiment, the determining unit 403 is further configured to: search for a first video frame in which the pointer information area has changed and a second video frame in which the target response area has changed; and determine the interval between capturing the first and second video frames, taking this interval as the response time of the user interface to the preset operation.
In some optional implementations of this embodiment, the determining unit 403 is further configured to: extract, from the recorded video frames, first pixel data of a first sub-area located in the pointer information area; match the first pixel data with first reference pixel data acquired in advance; and determine, based on the matching result, the first video frame in which the pointer information area changes; wherein the first reference pixel data is the pixel data of the first sub-area before the preset operation is performed.
In some optional implementations of this embodiment, the determining unit 403 is further configured to: determine a first similarity between the first pixel data and the first reference pixel data; and determine the video frame whose first similarity is smaller than a first preset threshold as the first video frame in which the pointer information area changes.
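One simple way to realize this matching is an exact per-pixel comparison against the pre-operation reference region, with the match fraction serving as the similarity. A sketch in Python (the similarity measure is an assumption, since the patent does not prescribe one; histogram or structural-similarity comparisons would serve equally):

```python
def pixel_similarity(pixels, reference):
    # Fraction of positions whose values match exactly. A deliberately
    # simple measure; the patent leaves the choice of measure open.
    if len(pixels) != len(reference):
        raise ValueError("regions must have the same size")
    matches = sum(1 for a, b in zip(pixels, reference) if a == b)
    return matches / len(reference)

def first_changed_frame(frame_regions, reference, threshold):
    # Index of the first frame whose sub-region similarity to the
    # pre-operation reference falls below the preset threshold.
    for i, region in enumerate(frame_regions):
        if pixel_similarity(region, reference) < threshold:
            return i
    return None  # the region never changed in the recording
```

With each frame's sub-region flattened to a list of pixel values, `first_changed_frame(regions, reference, threshold)` yields the index of the first video frame in which the pointer information area changed; the same routine applies unchanged to the target response area.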
In some optional implementations of this embodiment, the determining unit 403 is further configured to: crop, from the recorded video frames, a second sub-area located in the pointer information area; perform character recognition on the second sub-area to obtain characters to be detected; and determine the video frame whose characters to be detected are inconsistent with preset reference characters as the first video frame in which the pointer information area changes.
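The character-recognition variant can be sketched the same way; the OCR engine itself is left as a pluggable callable here, since the patent does not name a particular engine (a Tesseract wrapper would be one concrete choice):

```python
def first_frame_with_changed_text(pointer_regions, reference_text, recognize):
    # Runs character recognition on the cropped second sub-region of each
    # frame; the first frame whose recognized characters differ from the
    # preset reference characters marks the change. `recognize` is any OCR
    # callable -- an assumption for illustration.
    for i, region in enumerate(pointer_regions):
        if recognize(region) != reference_text:
            return i
    return None  # recognized characters never diverged from the reference
```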
In some optional implementations of this embodiment, the determining unit 403 is further configured to: extract, from the recorded video frames, second pixel data located in the target response area; match the second pixel data with second reference pixel data acquired in advance; and determine, based on the matching result, the second video frame in which the target response area changes; wherein the second reference pixel data is the pixel data of the target response area before the preset operation is performed.
In some optional implementations of this embodiment, the determining unit 403 is further configured to: determine a second similarity between the second pixel data and the second reference pixel data; and determine the video frame whose second similarity is smaller than a second preset threshold as the second video frame in which the target response area changes.
In some optional implementations of this embodiment, an input method application to be tested runs in the android system, and the designated position is a key to be tested in a keyboard interface of the input method application.
According to the apparatus provided by this embodiment of the application, the pointer function of the android system is started so that a pointer information area is presented in the user interface of the android system. A preset operation is then performed on a designated position in the user interface. Since the pointer information area changes the moment the preset operation acts on the user interface, and the target response area changes once the user interface has responded to the preset operation, the response time can be determined from the changes of the pointer information area and of the target response area. The response time of the user interface for the preset operation can therefore be accurately detected without acquiring root permission of the operating system and without manual measurement by testers, which improves testing efficiency and reduces labor cost.
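The flow the apparatus automates can be driven from a host machine over adb without root permission. The sketch below builds the relevant commands in Python; the helper names are assumptions, while the `settings put system pointer_location`, `screenrecord`, and `input tap` commands are standard Android tooling (the `pointer_location` system setting is what the pointer function toggles):

```python
# Host-side driver sketch, assuming standard adb tooling is installed.
# Run each command list with subprocess.run(...); names are illustrative.
ADB_SHELL = ["adb", "shell"]

def pointer_location_cmd(enabled):
    # Toggles Android's "Pointer location" developer option, which draws
    # the pointer information area across the top of the user interface.
    return ADB_SHELL + ["settings", "put", "system",
                        "pointer_location", "1" if enabled else "0"]

def screenrecord_cmd(path):
    # Records the screen so that the pointer-info and target-response
    # changes can later be located frame by frame.
    return ADB_SHELL + ["screenrecord", path]

def tap_cmd(x, y):
    # Injects the preset tap operation at the designated position.
    return ADB_SHELL + ["input", "tap", str(x), str(y)]
```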
Fig. 5 is a block diagram illustrating an apparatus 500 for testing according to an exemplary embodiment, where the apparatus 500 may be an intelligent terminal or a server. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 5, the apparatus 500 may include one or more of the following components: processing component 502, memory 504, power component 506, multimedia component 508, audio component 510, input/output (I/O) interface 512, sensor component 514, and communication component 516.
The processing component 502 generally controls the overall operation of the apparatus 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 502 may include one or more processors 520 to execute instructions so as to perform all or part of the steps of the methods described above. Further, the processing component 502 may include one or more modules that facilitate interaction between the processing component 502 and the other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operations at the apparatus 500. Examples of such data include instructions for any application or method operating on device 500, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 504 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 506 provides power to the various components of the device 500. The power components 506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 500.
The multimedia component 508 includes a screen that provides an output interface between the apparatus 500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 500 is in an operating mode, such as a shooting mode or a video mode. Each of the front-facing camera and the rear-facing camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, audio component 510 includes a Microphone (MIC) configured to receive external audio signals when apparatus 500 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 504 or transmitted via the communication component 516. In some embodiments, audio component 510 further includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 514 includes one or more sensors for providing status assessments of various aspects of the apparatus 500. For example, the sensor assembly 514 may detect the open/closed state of the apparatus 500 and the relative positioning of components, such as the display and keypad of the apparatus 500. The sensor assembly 514 may also detect a change in the position of the apparatus 500 or of a component of the apparatus 500, the presence or absence of user contact with the apparatus 500, the orientation or acceleration/deceleration of the apparatus 500, and a change in the temperature of the apparatus 500. The sensor assembly 514 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate communication between the apparatus 500 and other devices in a wired or wireless manner. The apparatus 500 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the aforementioned communication component 516 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 504 comprising instructions, executable by the processor 520 of the apparatus 500 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 6 is a schematic diagram of a server in some embodiments of the present application. The server 600 may vary considerably in configuration or performance, and may include one or more Central Processing Units (CPUs) 622 (e.g., one or more processors), memory 632, and one or more storage media 630 (e.g., one or more mass storage devices) storing applications 642 or data 644. The memory 632 and the storage media 630 may be transient or persistent storage. The program stored in a storage medium 630 may include one or more modules (not shown), and each module may include a series of instruction operations on the server. Still further, the central processing unit 622 may be configured to communicate with the storage medium 630 and to execute, on the server 600, the series of instruction operations in the storage medium 630.
The server 600 may also include one or more power supplies 626, one or more wired or wireless network interfaces 650, one or more input/output interfaces 658, one or more keyboards 656, and/or one or more operating systems 641, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
A non-transitory computer-readable storage medium having instructions therein which, when executed by a processor of an apparatus (an intelligent terminal or a server), enable the apparatus to perform a testing method, the method comprising: starting the pointer function of the android system to present a pointer information area in a user interface of the android system; performing a preset operation on a designated position in the user interface; and determining the response time of the user interface for the preset operation based on the change of the pointer information area and the change of the target response area corresponding to the designated position.
Optionally, the performing of the preset operation on the designated position in the user interface includes: starting screen recording, and performing the preset operation on the designated position in the user interface during the screen recording; and the determining of the response time of the user interface for the preset operation based on the change of the pointer information area and the change of the target response area corresponding to the designated position includes: searching for the first video frame in which the pointer information area changes and the second video frame in which the target response area changes; and determining the interval between the acquisition times of the first video frame and the second video frame, and taking the interval as the response time of the user interface for the preset operation.

Optionally, the searching for the first video frame in which the pointer information area changes includes: extracting, from the recorded video frames, first pixel data of a first sub-area located in the pointer information area; matching the first pixel data with first reference pixel data acquired in advance; and determining, based on the matching result, the first video frame in which the pointer information area changes; wherein the first reference pixel data is the pixel data of the first sub-area before the preset operation is performed.

Optionally, the matching of the first pixel data with the first reference pixel data acquired in advance and the determining, based on the matching result, of the first video frame in which the pointer information area changes include: determining a first similarity between the first pixel data and the first reference pixel data; and determining the video frame whose first similarity is smaller than a first preset threshold as the first video frame in which the pointer information area changes.

Optionally, the searching for the first video frame in which the pointer information area changes includes: cropping, from the recorded video frames, a second sub-area located in the pointer information area; performing character recognition on the second sub-area to obtain characters to be detected; and determining the video frame whose characters to be detected are inconsistent with preset reference characters as the first video frame in which the pointer information area changes.

Optionally, the searching for the second video frame in which the target response area changes includes: extracting, from the recorded video frames, second pixel data located in the target response area; matching the second pixel data with second reference pixel data acquired in advance; and determining, based on the matching result, the second video frame in which the target response area changes; wherein the second reference pixel data is the pixel data of the target response area before the preset operation is performed.

Optionally, the matching of the second pixel data with the second reference pixel data acquired in advance and the determining, based on the matching result, of the second video frame in which the target response area changes include: determining a second similarity between the second pixel data and the second reference pixel data; and determining the video frame whose second similarity is smaller than a second preset threshold as the second video frame in which the target response area changes.
Optionally, an input method application to be tested runs in the android system, and the designated position is a key to be tested in a keyboard interface of the input method application.
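For the input-method scenario, the designated position would be the center of the key under test. A hypothetical grid-layout helper (the grid model and all parameter names are assumptions for illustration; the patent only states that the designated position is the key to be tested):

```python
def key_center(keyboard_left, keyboard_top, key_w, key_h, row, col):
    # Center coordinates of a key in a simple grid keyboard layout,
    # suitable as the (x, y) arguments of the injected tap operation.
    x = keyboard_left + col * key_w + key_w // 2
    y = keyboard_top + row * key_h + key_h // 2
    return x, y
```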
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
The testing method, testing apparatus and apparatus for testing provided by the present application have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be changes in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A method of testing, the method comprising:
starting a pointer function of the android system to present a pointer information area in a user interface of the android system;
executing preset operation on a designated position in the user interface;
and determining the response time of the user interface for the preset operation based on the change condition of the pointer information area and the change condition of the target response area corresponding to the specified position.
2. The testing method according to claim 1, wherein the performing of the preset operation on the designated position in the user interface comprises:
and starting screen recording, and executing preset operation on the appointed position in the user interface in the screen recording process.
3. The test method according to claim 2, wherein the determining the response time of the user interface for the preset operation based on the change condition of the pointer information area and the change condition of the target response area corresponding to the designated position comprises:
searching for a first video frame in which the pointer information area changes and a second video frame in which the target response area changes;
and determining the interval between the acquisition times of the first video frame and the second video frame, and taking the interval as the response time of the user interface for the preset operation.
4. The method of claim 3, wherein the searching for the first video frame with the changed pointer information region comprises:
extracting first pixel data of a first sub-area located in the pointer information area from the recorded video frame;
matching the first pixel data with first reference pixel data acquired in advance, and determining a first video frame with the changed pointer information area based on a matching result;
wherein the first reference pixel data is pixel data of the first sub-region before the preset operation is performed.
5. The method according to claim 4, wherein the matching the first pixel data with a first reference pixel data obtained in advance, and determining the first video frame with the changed pointer information area based on the matching result comprises:
determining a first similarity of the first pixel data and the first reference pixel data;
and determining the video frame whose first similarity is smaller than a first preset threshold as the first video frame in which the pointer information area changes.
6. The method of claim 3, wherein the searching for the first video frame with the changed pointer information region comprises:
cropping a second sub-area located in the pointer information area from the recorded video frames;
performing character recognition on the second sub-area to obtain a character to be detected;
and determining the first video frame of which the character to be detected is inconsistent with the preset reference character as the first video frame of which the pointer information area changes.
7. The method of claim 3, wherein searching for the second video frame with the changed target response region comprises:
extracting second pixel data located in the target response area from the recorded video frame;
matching the second pixel data with second reference pixel data acquired in advance, and determining a second video frame with the changed target response area based on a matching result;
wherein the second reference pixel data is pixel data of the target response region before the preset operation is performed.
8. A test apparatus, the apparatus comprising:
the system comprises a starting unit and a display unit, wherein the starting unit is configured to start a pointer function of an android system so as to present a pointer information area in a user interface of the android system;
an operation unit configured to perform a preset operation on a designated position in the user interface;
a determining unit configured to determine a response time of the user interface for the preset operation based on a change condition of the pointer information area and a change condition of a target response area corresponding to the designated position.
9. An apparatus for testing, comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and are configured to be executed by one or more processors, the one or more programs including instructions for:
starting a pointer function of the android system to present a pointer information area in a user interface of the android system;
executing preset operation on a designated position in the user interface;
and determining the response time of the user interface for the preset operation based on the change condition of the pointer information area and the change condition of the target response area corresponding to the specified position.
10. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202010747557.7A 2020-07-29 2020-07-29 Test method, test device and test device Pending CN112000571A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010747557.7A CN112000571A (en) 2020-07-29 2020-07-29 Test method, test device and test device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010747557.7A CN112000571A (en) 2020-07-29 2020-07-29 Test method, test device and test device

Publications (1)

Publication Number Publication Date
CN112000571A true CN112000571A (en) 2020-11-27

Family

ID=73463500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010747557.7A Pending CN112000571A (en) 2020-07-29 2020-07-29 Test method, test device and test device

Country Status (1)

Country Link
CN (1) CN112000571A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113312967A (en) * 2021-04-22 2021-08-27 北京搜狗科技发展有限公司 Detection method, device and device for detection

Similar Documents

Publication Publication Date Title
CN109359056B (en) Application program testing method and device
CN107992257B (en) Screen splitting method and device
EP3232314A1 (en) Method and device for processing an operation
CN113705225A (en) Sensitive word data processing method and device and electronic equipment
CN113920293A (en) Information identification method and device, electronic equipment and storage medium
CN111596832B (en) Page switching method and device
CN113312967A (en) Detection method, device and device for detection
CN112000571A (en) Test method, test device and test device
CN106020694B (en) Electronic equipment, and method and device for dynamically adjusting selected area
CN112381091A (en) Video content identification method and device, electronic equipment and storage medium
CN112329480A (en) Area adjustment method and device and electronic equipment
CN109542244B (en) Input method, device and medium
RU2636673C2 (en) Method and device for line saving
CN110795014A (en) Data processing method and device and data processing device
CN112784858B (en) Image data processing method and device and electronic equipment
CN107526683B (en) Method and device for detecting functional redundancy of application program and storage medium
CN112115947A (en) Text processing method and device, electronic equipment and storage medium
CN108037875B (en) Method, device and storage medium for switching input modes
CN111814797A (en) Picture character recognition method and device and computer readable storage medium
CN112486603A (en) Interface adaptation method and device for adapting interface
CN113918078A (en) Word-fetching method and device and word-fetching device
CN114896000B (en) Component layout method and device, electronic equipment and storage medium
CN111078022B (en) Input method and device
CN113805707A (en) Input method, input device and input device
CN112464616A (en) Method and device for displaying personalized font

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination