CN108900776B - Method and apparatus for determining response time - Google Patents

Method and apparatus for determining response time

Info

Publication number
CN108900776B
CN108900776B (application number CN201810962004.6A)
Authority
CN
China
Prior art keywords
video, frame, determining, instruction, detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810962004.6A
Other languages
Chinese (zh)
Other versions
CN108900776A (en)
Inventor
彭义海
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201810962004.6A
Publication of CN108900776A
Priority to JP2019127386A (granted as JP6898968B2)
Priority to KR1020190083623A (granted as KR102158557B1)
Application granted
Publication of CN108900776B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/62 Control of parameters via user interfaces
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording

Abstract

The embodiment of the application discloses a method and an apparatus for determining a response time. One embodiment of the method comprises: acquiring a pre-captured video of the images displayed on a screen while a device to be detected executes a target operation instruction; parsing the video into a sequence of frames; determining a start frame and an end frame from the frame sequence, wherein the start frame comprises the image displayed on the screen when the device to be detected starts to execute the target operation instruction, and the end frame comprises the image displayed on the screen when the device to be detected finishes executing the target operation instruction; and determining, based on the start frame and the end frame, the response time of the device to be detected in executing the target operation instruction. This embodiment improves the accuracy of the resulting response time.

Description

Method and apparatus for determining response time
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method and a device for determining response time.
Background
With the continuous development of Internet technology and the rapid spread of mobile terminals, a wide variety of application programs (Apps) have emerged. When users run these Apps, stuttering and latency often occur. To address such problems, the response time of an App needs to be tested. Currently, the response time of an App is usually tested by manual testing, log analysis, or similar methods.
Disclosure of Invention
The embodiment of the application provides a method and a device for determining response time.
In a first aspect, an embodiment of the present application provides a method for determining a response time, the method comprising: acquiring a pre-captured video of the images displayed on a screen while a device to be detected executes a target operation instruction; parsing the video into a sequence of frames; determining a start frame and an end frame from the frame sequence, wherein the start frame comprises the image displayed on the screen when the device to be detected starts to execute the target operation instruction, and the end frame comprises the image displayed on the screen when the device to be detected finishes executing the target operation instruction; and determining, based on the start frame and the end frame, the response time of the device to be detected in executing the target operation instruction.
In some embodiments, the method further comprises: in response to determining that the response time is greater than the target threshold, outputting prompt information characterizing that the response time is greater than the target threshold.
In some embodiments, the target threshold is obtained by: acquiring a historical response time set; a target threshold is determined based on the set of historical response times.
In some embodiments, before acquiring the pre-captured video of the images displayed on the screen while the device to be detected executes the target operation instruction, the method further comprises: in response to receiving a signal characterizing that the device to be detected starts to execute the target operation instruction, sending the client a start instruction characterizing that recording of the video begins; in response to receiving a signal characterizing that the device to be detected finishes executing the target operation instruction, sending the client a stop instruction characterizing that recording of the video stops; and receiving a target video sent by the client and determining it as the video of the images displayed on the screen while the device to be detected executes the target operation instruction, wherein the target video is a video obtained by the client starting to record in response to receiving the start instruction and stopping recording in response to receiving the stop instruction.
In some embodiments, determining a start frame and an end frame from a sequence of frames comprises: and in response to determining that the similarity of two adjacent frames in the frame sequence is greater than a preset threshold, determining one of the two adjacent frames as a starting frame or an ending frame.
In a second aspect, an embodiment of the present application provides an apparatus for determining a response time, the apparatus comprising: an acquisition unit configured to acquire a pre-captured video of the images displayed on a screen while a device to be detected executes a target operation instruction; a parsing unit configured to parse the video into a sequence of frames; a key frame determining unit configured to determine a start frame and an end frame from the frame sequence, wherein the start frame comprises the image displayed on the screen when the device to be detected starts to execute the target operation instruction, and the end frame comprises the image displayed on the screen when the device to be detected finishes executing the target operation instruction; and a response time determining unit configured to determine, based on the start frame and the end frame, the response time of the device to be detected in executing the target operation instruction.
In some embodiments, the apparatus further comprises: an output unit configured to output prompt information characterizing that the response time is greater than the target threshold in response to determining that the response time is greater than the target threshold.
In some embodiments, the target threshold is obtained by: acquiring a historical response time set; a target threshold is determined based on the set of historical response times.
In some embodiments, the apparatus further comprises: a start instruction sending unit configured to, in response to receiving a signal characterizing that the device to be detected starts to execute the target operation instruction, send the client a start instruction characterizing that recording of the video begins; a stop instruction sending unit configured to, in response to receiving a signal characterizing that the device to be detected finishes executing the target operation instruction, send the client a stop instruction characterizing that recording of the video stops; and a receiving unit configured to receive a target video sent by the client and determine it as the video of the images displayed on the screen while the device to be detected executes the target operation instruction, wherein the target video is a video obtained by the client starting to record in response to receiving the start instruction and stopping recording in response to receiving the stop instruction.
In some embodiments, the key frame determination unit is further configured to: and in response to determining that the similarity of two adjacent frames in the frame sequence is greater than a preset threshold, determining one of the two adjacent frames as a starting frame or an ending frame.
In a third aspect, an embodiment of the present application provides a server, where the server includes: one or more processors; a storage device having one or more programs stored thereon; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the method and the device for determining the response time, the starting frame and the ending frame in the frame sequence of the video can be determined firstly, so that the response time is determined according to the starting frame and the ending frame. Compared with manual measurement, analysis can be performed at a granularity beyond what the human eye can perceive, thereby improving the accuracy of the resulting response time. In addition, compared with the method through log analysis, the method avoids the error caused by asynchronous operation and also improves the accuracy of the obtained response time.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for determining a response time according to the present application;
FIG. 3 is a schematic diagram of one application scenario of a method for determining a response time in accordance with an embodiment of the present application;
FIG. 4 is a flow diagram of yet another embodiment of a method for determining a response time according to the present application;
FIG. 5 is a schematic block diagram illustrating one embodiment of an apparatus for determining response times in accordance with the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely illustrate the relevant invention and do not limit it. It should also be noted that, for convenience of description, only the portions relevant to the invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture to which the method for determining a response time or the apparatus for determining a response time of the embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a server 101, a client 102, a device under test 103, and networks 104, 105. Network 104 is used, among other things, to provide a medium for communication links between server 101 and client 102. The network 105 is used to provide a medium for a communication link between the server 101 and the device 103 to be detected. The networks 104, 105 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The client 102 communicates with the server 101 through the network 104 to receive or transmit messages and the like. The device 103 to be detected communicates with the server 101 through the network 105 to receive or transmit messages and the like.
The server 101 may be a server providing various services, for example, a video processing server that can analyze and process a video captured by the client 102. The video processing server may perform processing such as parsing on the received video, so as to obtain a processing result (e.g., response time).
The device to be detected 103 may be hardware or software. When the device to be detected 103 is hardware, it may be any of various electronic devices that have a display screen and support the installation of applications, including but not limited to smartphones, tablet computers, laptop portable computers, desktop computers, and the like. When the device to be detected 103 is software, it can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module. It is not specifically limited here.
It should be noted that the method for determining the response time provided by the embodiment of the present application is generally executed by the server 101, and accordingly, the apparatus for determining the response time is generally disposed in the server 101.
The server 101 and the client 102 may be hardware or software. When the server 101 and the client 102 are hardware, they may be implemented as a distributed device cluster formed by multiple devices, or implemented as a single device. When the server 101 and the client 102 are software, they may be implemented as multiple software or software modules, or implemented as a single software or software module, and are not limited in this respect.
It should be understood that the number of servers, clients, devices to be detected, and networks in fig. 1 is merely illustrative. Any number of servers, clients, devices to be detected and networks may be provided according to implementation needs.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for determining a response time in accordance with the present application is shown.
Step 201, acquiring a pre-captured video of the images displayed on the screen while the device to be detected executes a target operation instruction.
In this embodiment, an execution body (for example, the server 101 shown in fig. 1) of the method for determining the response time may acquire, from a communicatively connected client (for example, the client 102 shown in fig. 1), a pre-captured video of the images displayed on the screen while a device to be detected (for example, the device to be detected 103 shown in fig. 1) executes a target operation instruction. An operation instruction may be a command instructing the computer to perform a certain operation. In practice, operation instructions may be expressed in any of a variety of computer languages (e.g., machine, assembly, or high-level languages). The target operation instruction may be an operation instruction input by a technician, or an instruction triggered by some operation performed by a user.
When the equipment to be detected executes the target operation instruction, the display screen of the equipment to be detected can display a corresponding image. As an example, before the device to be detected executes the target operation instruction, the technician may start the shooting function of the client, and start shooting by aiming at the display screen of the device to be detected. And when the equipment to be detected finishes executing the target operation instruction, the shooting can be stopped. Therefore, the video of the image displayed on the screen when the equipment to be detected executes the target operation instruction can be obtained. As yet another example, the photographing may also be done by an application with a screen recording function installed on the device under test. Therefore, the video of the image displayed on the screen when the equipment to be detected executes the target operation instruction can be obtained.
Alternatively, the video of the images displayed on the screen while the device to be detected executes the target operation instruction may be stored locally on the execution body. In that case, the execution body can acquire the video directly from local storage.
Step 202, parsing the video into a sequence of frames.
In this embodiment, the execution body may parse the video acquired in step 201 into a frame sequence by various methods. In practice, a video consists of a series of still images; owing to the persistence of vision of the human eye, when the number of frames displayed per second exceeds a certain value, the video appears continuous to a human observer. The video can therefore be restored to a sequence of frames by various existing video processing tools. In practice, the frame rate of the video represents the number of frames displayed per second. Optionally, the number of images in the frame sequence may be increased by raising the frame rate of the video (for example, to 240 fps), which facilitates subsequent processing.
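The effect of frame rate on measurement precision can be made concrete: the temporal granularity of this method is bounded by the interval between consecutive frames. A minimal sketch (the helper names are illustrative, not from the patent):

```python
def frame_interval_ms(fps: float) -> float:
    """Smallest time difference (in milliseconds) resolvable between
    two consecutive frames at the given frame rate."""
    return 1000.0 / fps

def frame_timestamp_ms(frame_index: int, fps: float) -> float:
    """Time point on the video's time axis for a given frame index."""
    return frame_index * 1000.0 / fps

# At a common 30 fps the granularity is about 33 ms; at 240 fps it is
# about 4.2 ms, far finer than what manual observation can resolve.
granularity_240 = frame_interval_ms(240)
```

This is why raising the frame rate to 240 fps, as suggested above, sharpens the resulting response time.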
Step 203, determining a start frame and an end frame from the frame sequence.
In this embodiment, the execution body may determine the start frame and the end frame in various ways. The starting frame comprises an image displayed by a screen when the to-be-detected device starts to execute the target operation instruction, and the ending frame comprises an image displayed by the screen when the to-be-detected device finishes executing the target operation instruction.
As an example, the execution body described above may determine the start frame and the end frame in the following manner.
In the first step, each image in the frame sequence is subjected to binarization processing to reduce noise information in the image, so that subsequent processing is facilitated.
In the second step, for each image in the binarized frame sequence, the connected components in the image are determined. This distinguishes the objects displayed in each image and facilitates subsequent operations.
In the third step, for the images in the frame sequence processed in the second step, the similarity between each pair of adjacent frames can be calculated. On this basis, the two pairs of adjacent frames with the smallest similarities are selected. According to the positions of the two selected pairs on the time axis, the earlier pair can be designated the first adjacent pair and the later pair the second adjacent pair. Further, as an example, of the two frames in the first adjacent pair, the earlier one may be determined as the start frame; of the two frames in the second adjacent pair, the earlier one may be determined as the end frame.
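The three-step procedure above can be sketched as follows. This is a simplified illustration with hypothetical helper names: the connected-component step is omitted, binarization uses a fixed threshold, and similarity is taken as the fraction of agreeing pixels; the patent does not fix any of these choices.

```python
def binarize(frame, threshold=128):
    """First step: binarize a grayscale frame (a list of pixel rows)
    to suppress noise information."""
    return [[1 if p >= threshold else 0 for p in row] for row in frame]

def similarity(a, b):
    """Fraction of pixels on which two binarized frames agree."""
    pixels = [(pa, pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb)]
    return sum(pa == pb for pa, pb in pixels) / len(pixels)

def find_start_end(frames):
    """Third step: pick the two adjacent-frame pairs with the LOWEST
    similarity (largest visual change); the earlier frame of the earlier
    pair is the start frame, the earlier frame of the later pair the
    end frame."""
    binarized = [binarize(f) for f in frames]
    sims = [similarity(binarized[i], binarized[i + 1])
            for i in range(len(binarized) - 1)]
    first, second = sorted(sorted(range(len(sims)), key=sims.__getitem__)[:2])
    return first, second

# Synthetic 2x2 frames: idle screen, a visual change when execution
# starts (frame index 2), and another when execution finishes (index 4).
idle, busy = [[0, 0], [0, 0]], [[255, 255], [255, 255]]
frames = [idle, idle, busy, busy, idle, idle]
start, end = find_start_end(frames)  # (1, 3)
```

A real implementation would operate on decoded video frames and a more robust similarity measure, but the selection logic is the same.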
In some optional implementations of this embodiment, determining the start frame and the end frame from the sequence of frames includes: and in response to determining that the similarity of two adjacent frames in the frame sequence is greater than a preset threshold, determining one of the two adjacent frames as a starting frame or an ending frame.
Step 204, determining the response time of the device to be detected in executing the target operation instruction based on the start frame and the end frame.
In this embodiment, the executing body may determine the response time of the device under test executing the target operation instruction based on the start frame and the end frame. In practice, since the time axis is included in the video, each frame in the video corresponds to a time point on the time axis. Accordingly, it can be determined that the start frame and the end frame correspond to two time points on the time axis. The difference between these two points in time can then be determined as the response time of the device under test for executing the target operating instruction.
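The computation in this step reduces to simple arithmetic on the time axis. A minimal sketch (the function name is illustrative):

```python
def response_time_ms(start_frame: int, end_frame: int, fps: float) -> float:
    """Response time: the difference between the two time points on the
    video's time axis corresponding to the start frame and end frame."""
    return (end_frame - start_frame) * 1000.0 / fps

# e.g. start frame 12 and end frame 132 in a 240 fps recording:
elapsed = response_time_ms(12, 132, 240.0)  # 500.0 ms
```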
In some optional implementations of this embodiment, the method may further include: in response to determining that the response time is greater than the target threshold, outputting prompt information characterizing that the response time is greater than the target threshold.
In some optional implementations of this embodiment, the target threshold is obtained by: acquiring a historical response time set; a target threshold is determined based on the set of historical response times.
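The embodiment leaves open how the target threshold is derived from the historical response time set. As one plausible interpretation (an assumption of this sketch, not the patent's rule), a high percentile of the history could serve:

```python
def target_threshold(history, percentile: float = 0.95) -> float:
    """Derive a target threshold from a set of historical response times.
    NOTE: using a high percentile is an assumption of this sketch;
    the embodiment does not fix the derivation rule."""
    ordered = sorted(history)
    k = min(len(ordered) - 1, int(percentile * len(ordered)))
    return ordered[k]

history_ms = [480.0, 510.0, 495.0, 530.0, 505.0]
threshold = target_threshold(history_ms)
exceeded = 540.0 > threshold  # a 540 ms response would trigger the prompt
```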
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for determining a response time according to the present embodiment. In the application scenario of fig. 3, the execution subject of the method may be the server 301, and the device to be detected may be the tablet computer 302, on which an instant messaging application is installed; the figure shows the login interface of that application. The server 301 is connected to the smartphone 300 via Bluetooth, and may be connected to the tablet computer 302 via a data line. The smartphone 300 may capture, in advance, the images displayed on the screen while the tablet computer 302 executes the login instruction, thereby obtaining a video of those images. Thereafter, the server 301 may obtain the video 3011 from the smartphone 300 and parse it into a frame sequence 3012. On this basis, a start frame 3013 and an end frame 3014 are determined from the frame sequence 3012 by calculating the similarity of adjacent images. From the difference between the two time points on the time axis corresponding to the start frame 3013 and the end frame 3014, the response time 3015 of the tablet computer 302 in executing the login operation instruction of the instant messaging application is obtained.
The method provided by the above-described embodiment of the present application first parses the video into a sequence of frames, so that a start frame and an end frame can be determined. On the basis, the response time of the device to be detected for executing the target operation instruction is determined based on the starting frame and the ending frame. Compared with manual measurement, analysis can be performed at a granularity beyond what the human eye can perceive, thereby improving the accuracy of the resulting response time. In addition, compared with the method through log analysis, the method avoids the error caused by asynchronous operation and also improves the accuracy of the obtained response time.
With further reference to fig. 4, a flow 400 of yet another embodiment of a method for determining a response time is shown. The flow 400 of the method for determining a response time includes the steps of:
Step 401, in response to receiving a signal characterizing that the device to be detected starts to execute the target operation instruction, sending a start instruction characterizing that video recording begins to the client.
In this embodiment, in response to receiving a signal characterizing that the device to be detected starts to execute the target operation instruction, the execution body of the method for determining the response time (for example, the server 101 shown in fig. 1) may send a start instruction characterizing that video recording begins to a client (for example, the client 102 shown in fig. 1). The signal characterizing that the device to be detected starts to execute the target operation instruction may be sent to the execution body by the device to be detected.
In practice, a technician may write test cases in advance for testing the device to be detected. One test case may correspond to one or more user operations. For example, a login test case may correspond to a user login operation, and a refresh test case to a user refresh operation. The pre-written test cases can be stored locally on the execution body.
For example, when the response time of the device to be detected in executing a login instruction (an instruction triggered by a user login operation) needs to be tested, the execution body may send the login test case to the device to be detected. The device to be detected then executes the login test case, achieving the same effect as a user logging in on the device. In response to starting execution of the login test case, the device to be detected may send the execution body a signal, for example an interrupt signal, characterizing that it has started executing the target operation instruction. The execution body thus receives the signal, and on this basis can send the client the start instruction characterizing that video recording begins.
Step 402, in response to receiving a signal characterizing that the device to be detected has finished executing the target operation instruction, sending a stop instruction characterizing that video recording stops to the client.
In this embodiment, in response to receiving a signal characterizing that the device to be detected has finished executing the target operation instruction, the execution body may send the client a stop instruction characterizing that video recording should stop.
Continuing with the login test case of step 401 as an example: upon finishing execution of the login test case, the device to be detected may send the execution body a signal, for example an interrupt signal, characterizing that it has finished executing the target operation instruction. The execution body thus receives the signal, and on this basis can send the client the stop instruction characterizing that video recording stops.
Step 403, receiving a target video sent by the client, and determining the target video as the video of the images displayed on the screen while the device to be detected executes the target operation instruction.
In this embodiment, the execution main body may receive a target video sent by the client, and determine the target video as a video of an image displayed on a screen when the device to be detected executes the target operation instruction. And the target video is a video obtained by starting recording by the client in response to receiving the starting instruction and stopping recording in response to receiving the stopping instruction.
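The control flow of steps 401 to 403 can be sketched with a hypothetical client interface; the class and method names below are illustrative only and not part of the patent:

```python
class RecordingClient:
    """Hypothetical client that records the screen of the device to be
    detected on command (an illustrative stand-in, not the patent's API)."""
    def __init__(self):
        self.recording = False
        self.events = []

    def start_recording(self):       # reacts to the start instruction
        self.recording = True
        self.events.append("start")

    def stop_recording(self):        # reacts to the stop instruction
        self.recording = False
        self.events.append("stop")

    def fetch_video(self):           # stands in for the target video
        return {"events": list(self.events)}

def run_test_case(client, execute_case):
    """Server-side flow: start recording on the 'execution started'
    signal, stop on the 'execution finished' signal, then retrieve the
    target video from the client."""
    client.start_recording()     # step 401: send the start instruction
    execute_case()               # device to be detected runs the test case
    client.stop_recording()      # step 402: send the stop instruction
    return client.fetch_video()  # step 403: receive the target video

video = run_test_case(RecordingClient(), lambda: None)
```

The point of the automation is that the recording window brackets exactly the execution of the test case, replacing manual start/stop of the camera.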
Step 404, acquiring a pre-captured video of the images displayed on the screen while the device to be detected executes the target operation instruction.
Step 405, the video is parsed into a sequence of frames.
Step 406, determining a start frame and an end frame from the frame sequence.
The starting frame comprises an image displayed by a screen when the to-be-detected device starts to execute the target operation instruction, and the ending frame comprises an image displayed by the screen when the to-be-detected device finishes executing the target operation instruction.
Step 407, determining the response time of the device to be detected when executing the target operation instruction based on the start frame and the end frame.
In this embodiment, for the specific implementation of steps 404 to 407 and their technical effects, reference may be made to steps 201 to 204 in the embodiment corresponding to fig. 2, which are not described here again.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the method for determining the response time in this embodiment controls the client to complete video recording by sending a start instruction and a stop instruction to the client. Compared with manual video recording, automatic recording is realized.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an apparatus for determining a response time, where an embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus may be applied to various electronic devices in particular.
As shown in fig. 5, the apparatus for determining a response time of the present embodiment includes: an acquisition unit 501, a parsing unit 502, a key frame determination unit 503, and a response time determination unit 504. Wherein the acquiring unit 501 is configured to acquire a video of an image displayed on a screen when the apparatus to be detected executes a target operation instruction, which is photographed in advance. The parsing unit 502 is configured to parse the video into a sequence of frames. The key frame determination unit 503 is configured to determine a start frame and an end frame from the sequence of frames. The starting frame comprises an image displayed by a screen when the to-be-detected device starts to execute the target operation instruction, and the ending frame comprises an image displayed by the screen when the to-be-detected device finishes executing the target operation instruction. The response time determination unit 504 is configured to determine a response time when the device under test executes the target operation instruction based on the start frame and the end frame.
In this embodiment, the specific processing and the technical effects of the acquisition unit 501, the parsing unit 502, the key frame determination unit 503, and the response time determination unit 504 in the apparatus 500 for determining a response time are similar to those of steps 201 to 204 in the embodiment corresponding to fig. 2, and are not described herein again.
In some optional implementations of this embodiment, the apparatus 500 may further include an output unit (not shown in the figure) configured to, in response to determining that the response time is greater than a target threshold, output prompt information indicating that the response time is greater than the target threshold.
In some optional implementations of this embodiment, the target threshold is obtained by: acquiring a set of historical response times; and determining the target threshold based on the set of historical response times.
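The description above leaves open how the threshold is derived from the historical set. As a hedged illustration only — the function name and the mean-plus-k-standard-deviations rule below are assumptions, not the claimed method — the threshold might be computed as:

```python
import statistics

def target_threshold(history, k=2.0):
    """Derive a pass/fail threshold from historical response times.

    Illustrative rule (an assumption, not the patented method): flag a
    response as slow when it exceeds the historical mean by more than
    k population standard deviations.
    """
    mean = statistics.mean(history)
    std = statistics.pstdev(history)
    return mean + k * std

# Example: historical response times in milliseconds.
history = [180, 200, 190, 210, 205]
threshold = target_threshold(history)  # mean 197 ms plus 2 sigma
```

A percentile-based rule (e.g. the 95th percentile of the history) would be an equally plausible choice; the embodiment does not commit to either.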
In some optional implementations of this embodiment, the apparatus 500 further includes: a start instruction sending unit (not shown in the figure), a stop instruction sending unit (not shown in the figure), and a receiving unit (not shown in the figure). The start instruction sending unit is configured to send a start instruction for starting video recording to the client in response to receiving a signal indicating that the device to be detected starts executing the target operation instruction. The stop instruction sending unit is configured to send a stop instruction for stopping video recording to the client in response to receiving a signal indicating that the device to be detected has finished executing the target operation instruction. The receiving unit is configured to receive the target video sent by the client and determine it as the video of the images displayed on the screen while the device to be detected executes the target operation instruction. The target video is the video that the client starts recording in response to receiving the start instruction and stops recording in response to receiving the stop instruction.
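The client side of this start/stop protocol can be sketched as follows. This is a minimal illustration; the class and method names are hypothetical and not taken from the patent, and transport details (how instructions reach the client) are omitted:

```python
class RecorderClient:
    """Minimal sketch of the recording client described above: it starts
    capturing frames on a start instruction, stops on a stop instruction,
    and returns the captured frames as the target video."""

    def __init__(self):
        self.recording = False
        self.frames = []

    def handle(self, instruction):
        # React to an instruction sent by the server.
        if instruction == "start":
            self.recording = True
            self.frames = []
        elif instruction == "stop":
            self.recording = False
            # The captured frames form the target video sent back.
            return list(self.frames)

    def capture(self, frame):
        # Called once per screen frame; records only between start and stop.
        if self.recording:
            self.frames.append(frame)
```

Because recording is bracketed exactly by the two signals, the resulting target video spans the execution of the target operation instruction without manual intervention.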
In some optional implementations of this embodiment, the key frame determination unit 503 is further configured to: in response to determining that the similarity between two adjacent frames in the frame sequence is greater than a preset threshold, determine one of the two adjacent frames as the start frame or the end frame.
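A concrete similarity test of this kind might look as follows. The pixel-difference metric and all names are illustrative assumptions — the patent does not fix a particular similarity measure (SSIM or histogram comparison would serve equally well):

```python
def frame_similarity(frame_a, frame_b):
    """Similarity of two equally sized grayscale frames, in [0, 1].

    Frames are flat lists of 0-255 pixel values; 1.0 means identical.
    Mean absolute pixel difference is an assumed stand-in metric.
    """
    diff = sum(abs(a - b) for a, b in zip(frame_a, frame_b))
    return 1.0 - diff / (255.0 * len(frame_a))

def first_stable_index(frames, threshold=0.99):
    """Index of the first frame whose successor is nearly identical,
    i.e. the point where the displayed screen has stopped changing."""
    for i in range(len(frames) - 1):
        if frame_similarity(frames[i], frames[i + 1]) > threshold:
            return i
    return None
```

High adjacent-frame similarity marks a stretch where the screen is static, which is why one of the two near-identical frames can serve as the start frame (screen stable before the operation) or the end frame (screen stable after it).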
In this embodiment, the parsing unit 502 in the apparatus 500 for determining a response time may first parse the video acquired by the acquisition unit 501 into a frame sequence, so that the key frame determination unit 503 can determine the start frame and the end frame. On this basis, the response time determination unit 504 may determine, based on the start frame and the end frame, the response time of the device to be detected when executing the target operation instruction. Compared with manual measurement, the analysis can be performed at a granularity finer than what the human eye can perceive, which improves the accuracy of the resulting response time. In addition, compared with log analysis, this approach avoids errors caused by asynchronous operations, likewise improving the accuracy of the obtained response time.
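Once the start and end key frames are located, the response time follows directly from the frame-index gap and the recording frame rate. A minimal sketch, with the function name assumed for illustration:

```python
def response_time_seconds(start_index, end_index, fps):
    """Response time implied by the start and end key frames.

    Frames recorded at `fps` frames per second are 1/fps apart, so the
    elapsed time is simply the index gap divided by the frame rate.
    """
    if end_index < start_index:
        raise ValueError("end frame must not precede start frame")
    return (end_index - start_index) / fps

# Example: key frames 60 indices apart in a 60 fps recording.
elapsed = response_time_seconds(12, 72, 60)  # 1.0 second
```

At 60 fps this resolves differences of roughly 16.7 ms — finer than a human observer with a stopwatch — which is the basis of the accuracy claim above.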
Referring now to fig. 6, shown is a block diagram of a computer system 600 suitable for implementing a server according to embodiments of the present application. The server shown in fig. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Liquid Crystal Display (LCD) and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor including an acquisition unit, a parsing unit, a key frame determination unit, and a response time determination unit. The names of these units do not in some cases constitute a limitation on the units themselves; for example, the acquisition unit may also be described as "a unit that acquires a pre-recorded video of the images displayed on a screen while the device to be detected executes a target operation instruction".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the server described in the above embodiments, or may exist separately without being assembled into the server. The computer-readable medium carries one or more programs which, when executed by the server, cause the server to: acquire a pre-recorded video of the images displayed on a screen while the device to be detected executes a target operation instruction; parse the video into a sequence of frames; determine a start frame and an end frame from the frame sequence, where the start frame contains the image displayed on the screen when the device to be detected starts executing the target operation instruction, and the end frame contains the image displayed on the screen when the device to be detected finishes executing it; and determine, based on the start frame and the end frame, the response time of the device to be detected when executing the target operation instruction.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. A method for determining a response time, comprising:
acquiring a video of images displayed on a screen while a device to be detected executes a target operation instruction, wherein the video is automatically recorded by a client based on a start instruction and a stop instruction generated when a pre-written test case for testing the device to be detected executes the target operation, and one test case corresponds to one or more operations of a user;
parsing the video into a sequence of frames;
determining a starting frame and an ending frame from the frame sequence, wherein the starting frame comprises an image displayed on a screen when the device to be detected starts to execute the target operation instruction, and the ending frame comprises an image displayed on the screen when the device to be detected finishes executing the target operation instruction;
and determining, based on the starting frame and the ending frame, the response time of the device to be detected when executing the target operation instruction.
2. The method of claim 1, wherein the method further comprises:
in response to determining that the response time is greater than a target threshold, outputting prompt information characterizing that the response time is greater than the target threshold.
3. The method of claim 2, wherein the target threshold is obtained by:
acquiring a historical response time set;
determining the target threshold based on the set of historical response times.
4. The method according to any one of claims 1-3, wherein prior to said acquiring a pre-captured video of an image displayed on a screen when the device under test performs the target operation instruction, the method further comprises:
responding to a received signal for representing that the device to be detected starts to execute the target operation instruction, and sending a starting instruction for representing that video recording starts to a client;
sending a stopping instruction for stopping the video recording to the client in response to receiving a signal indicating that the device to be detected has finished executing the target operation instruction;
receiving a target video sent by the client, and determining the target video as a video of an image displayed on a screen when the device to be detected executes a target operation instruction, wherein the target video is a video obtained by the client starting to record in response to receiving the starting instruction and stopping to record in response to receiving the stopping instruction.
5. The method of any of claims 1-3, wherein the determining a starting frame and an ending frame from the sequence of frames comprises:
in response to determining that the similarity between two adjacent frames in the frame sequence is greater than a preset threshold, determining one of the two adjacent frames as the start frame or the end frame.
6. An apparatus for determining a response time, comprising:
an acquisition unit configured to acquire a video of images displayed on a screen while a device to be detected executes a target operation instruction, wherein the video is automatically recorded by a client based on a start instruction and a stop instruction generated when a pre-written test case for testing the device to be detected executes the target operation, and one test case corresponds to one or more operations of a user;
a parsing unit configured to parse the video into a sequence of frames;
a key frame determining unit configured to determine a start frame and an end frame from the frame sequence, wherein the start frame includes an image displayed on a screen when the device to be detected starts to execute the target operation instruction, and the end frame includes an image displayed on the screen when the device to be detected finishes executing the target operation instruction;
a response time determination unit configured to determine a response time when the device to be detected executes the target operation instruction based on the start frame and the end frame.
7. The apparatus of claim 6, wherein the apparatus further comprises:
an output unit configured to output prompt information characterizing that the response time is greater than a target threshold in response to determining that the response time is greater than the target threshold.
8. The apparatus of claim 7, wherein the target threshold is obtained by:
acquiring a historical response time set;
determining the target threshold based on the set of historical response times.
9. The apparatus of any of claims 6-8, wherein the apparatus further comprises:
a starting instruction sending unit configured to send a starting instruction for representing the start of recording the video to the client in response to receiving a signal for representing the start of executing the target operation instruction by the device to be detected;
a stopping instruction sending unit configured to send a stopping instruction for stopping recording of the video to the client in response to receiving a signal indicating that the device to be detected has finished executing the target operation instruction;
and the receiving unit is configured to receive a target video sent by the client and determine the target video as a video of an image displayed on a screen when the device to be detected executes a target operation instruction, wherein the target video is a video obtained by starting recording by the client in response to receiving the starting instruction and stopping recording in response to receiving the stopping instruction.
10. The apparatus according to any of claims 6-8, wherein the key frame determination unit is further configured to:
in response to determining that the similarity between two adjacent frames in the frame sequence is greater than a preset threshold, determining one of the two adjacent frames as the start frame or the end frame.
11. A server, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN201810962004.6A 2018-08-22 2018-08-22 Method and apparatus for determining response time Active CN108900776B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810962004.6A CN108900776B (en) 2018-08-22 2018-08-22 Method and apparatus for determining response time
JP2019127386A JP6898968B2 (en) 2018-08-22 2019-07-09 Methods and devices for determining response time
KR1020190083623A KR102158557B1 (en) 2018-08-22 2019-07-11 Method and device for determining response time

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810962004.6A CN108900776B (en) 2018-08-22 2018-08-22 Method and apparatus for determining response time

Publications (2)

Publication Number Publication Date
CN108900776A CN108900776A (en) 2018-11-27
CN108900776B true CN108900776B (en) 2020-11-27

Family

ID=64358303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810962004.6A Active CN108900776B (en) 2018-08-22 2018-08-22 Method and apparatus for determining response time

Country Status (3)

Country Link
JP (1) JP6898968B2 (en)
KR (1) KR102158557B1 (en)
CN (1) CN108900776B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111245559B (en) * 2018-11-29 2023-04-18 阿里巴巴集团控股有限公司 Information determination method, information judgment method and device and computing equipment
CN111324521B (en) * 2018-12-13 2023-03-28 花瓣云科技有限公司 Graphical interface performance test method and test equipment
CN111949509B (en) * 2019-05-17 2023-08-15 百度在线网络技术(北京)有限公司 Response time testing method, device and equipment of application software and storage medium
CN110209581A (en) * 2019-05-31 2019-09-06 北京字节跳动网络技术有限公司 Trigger action analysis method and device based on application program
CN110442499B (en) * 2019-07-10 2023-08-04 创新先进技术有限公司 Method and device for testing and improving page response performance and terminal equipment
CN111405218A (en) * 2020-03-26 2020-07-10 深圳市微测检测有限公司 Touch screen time delay detection method, system, device, equipment and storage medium
CN111858318B (en) * 2020-06-30 2024-04-02 北京百度网讯科技有限公司 Response time testing method, device, equipment and computer storage medium
CN111798358A (en) * 2020-07-01 2020-10-20 北京梧桐车联科技有限责任公司 Method and device for determining route calculation time, electronic equipment and readable storage medium
CN112055237B (en) * 2020-08-31 2022-07-19 北京爱奇艺科技有限公司 Method, system, apparatus, device and storage medium for determining screen-to-screen delay
CN112203042B (en) * 2020-09-10 2022-08-05 福建升腾资讯有限公司 Cloud desktop operation response time testing method, system, equipment and medium
CN112218155A (en) * 2020-09-24 2021-01-12 北京达佳互联信息技术有限公司 Automatic detection method and device for switching time consumption and electronic equipment
CN112437289B (en) * 2020-09-28 2023-03-10 上海艾策通讯科技股份有限公司 Switching time delay obtaining method
CN112203150B (en) * 2020-09-30 2022-03-11 腾讯科技(深圳)有限公司 Time-consuming acquisition method, device, equipment and computer-readable storage medium
CN113032228B (en) * 2020-11-26 2023-09-05 北京字节跳动网络技术有限公司 Method, device, equipment and storage medium for determining time consumption of operation response
CN113225624A (en) * 2021-04-08 2021-08-06 腾讯科技(深圳)有限公司 Time-consuming determination method and device for voice recognition
CN113312967A (en) * 2021-04-22 2021-08-27 北京搜狗科技发展有限公司 Detection method, device and device for detection
CN113485579A (en) * 2021-06-30 2021-10-08 东莞市小精灵教育软件有限公司 Finger stable frame detection method and device and computer readable storage medium
CN113821438A (en) * 2021-09-23 2021-12-21 统信软件技术有限公司 Application response performance test method and system and computing equipment
CN113986172A (en) * 2021-10-27 2022-01-28 佛山市顺德区美的电子科技有限公司 Electronic equipment, screen display method and device thereof and storage medium
CN114051110B (en) * 2021-11-08 2024-04-02 北京百度网讯科技有限公司 Video generation method, device, electronic equipment and storage medium
CN114827754B (en) * 2022-02-23 2023-09-12 阿里巴巴(中国)有限公司 Video first frame time detection method and device
CN115695851B (en) * 2022-12-28 2023-03-28 海马云(天津)信息技术有限公司 End-to-end delay calculation method and device, storage medium and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090842A (en) * 2014-07-15 2014-10-08 深圳市金立通信设备有限公司 Detection method for application program running
JP2017156817A (en) * 2016-02-29 2017-09-07 株式会社日立製作所 User information management system
WO2018034535A1 (en) * 2016-08-18 2018-02-22 Samsung Electronics Co., Ltd. Display apparatus and content display method thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8644623B2 (en) * 2011-11-16 2014-02-04 Microsoft Corporation Measuring web page rendering time
KR101874078B1 (en) * 2016-10-05 2018-07-03 에스케이테크엑스 주식회사 Apparatus for measuring End-to-End delay of application and method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090842A (en) * 2014-07-15 2014-10-08 深圳市金立通信设备有限公司 Detection method for application program running
JP2017156817A (en) * 2016-02-29 2017-09-07 株式会社日立製作所 User information management system
WO2018034535A1 (en) * 2016-08-18 2018-02-22 Samsung Electronics Co., Ltd. Display apparatus and content display method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wu Jian. Accurate App Response Time Calculation Based on Frame Computation. https://www.sohu.com/a/168180235_744135. 2017. *
Accurate App Response Time Calculation Based on Frame Computation; Wu Jian; https://www.sohu.com/a/168180235_744135; August 29, 2017; pp. 1-5 *

Also Published As

Publication number Publication date
JP6898968B2 (en) 2021-07-07
KR102158557B1 (en) 2020-09-22
CN108900776A (en) 2018-11-27
KR20200022329A (en) 2020-03-03
JP2020030811A (en) 2020-02-27

Similar Documents

Publication Publication Date Title
CN108900776B (en) Method and apparatus for determining response time
WO2020000879A1 (en) Image recognition method and apparatus
WO2019242222A1 (en) Method and device for use in generating information
CN109255337B (en) Face key point detection method and device
CN109684188B (en) Test method and device
CN112306793A (en) Method and device for monitoring webpage
CN111309617A (en) Application program control method and device, storage medium and electronic equipment
CN110908922A (en) Application program testing method and device
CN110059064B (en) Log file processing method and device and computer readable storage medium
CN112954056B (en) Method and device for processing monitoring data, electronic equipment and storage medium
CN116662193A (en) Page testing method and device
CN112306826A (en) Method and apparatus for processing information for terminal
CN112306857A (en) Method and apparatus for testing applications
CN109194567B (en) Method and apparatus for retransmitting information
US11960703B2 (en) Template selection method, electronic device and non-transitory computer-readable storage medium
CN114510305B (en) Model training method and device, storage medium and electronic equipment
CN115391204A (en) Test method and device for automatic driving service, electronic equipment and storage medium
CN115033469A (en) Website system performance test method and device, equipment and storage medium
CN109960659B (en) Method and device for detecting application program
CN114840379A (en) Log generation method, device, server and storage medium
CN112084114A (en) Method and apparatus for testing an interface
CN108471635B (en) Method and apparatus for connecting wireless access points
CN112241372A (en) Terminal testing method and device and electronic equipment
CN110851150B (en) Method and apparatus for installing applications
CN111259697A (en) Method and apparatus for transmitting information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant