EP3975159A1 - Method and a system for measuring the latency of a graphical display output - Google Patents

Method and a system for measuring the latency of a graphical display output

Info

Publication number
EP3975159A1
Authority
EP
European Patent Office
Prior art keywords
display
time
time stamp
graphical
graphical symbol
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20197801.2A
Other languages
German (de)
French (fr)
Inventor
Damir SHAIKHUTDINOV
Sudip JAINE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OpenSynergy GmbH
Original Assignee
OpenSynergy GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OpenSynergy GmbH filed Critical OpenSynergy GmbH
Priority to EP20197801.2A priority Critical patent/EP3975159A1/en
Publication of EP3975159A1 publication Critical patent/EP3975159A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363 Graphics controllers
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/006 Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0693 Calibration of display systems
    • G09G2330/00 Aspects of power supply; Aspects of display protection and defect management
    • G09G2330/12 Test circuits or failure detection circuits included in a display system, as permanent part thereof
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/145 Detecting light within display terminals, e.g. using a single or a plurality of photosensors, the light originating from the display screen
    • G09G2360/16 Calculation or use of calculated indices related to luminance levels in display data


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention concerns a method for measuring the latency of a graphical display output, comprising the following steps: obtaining, by a processor (5), a time value; generating, by the processor (5), a time stamp based on the time value; transmitting, by the processor, the time stamp to a graphics processing unit (7), GPU; rendering, by the GPU (7), a display frame including at least one graphical symbol based on the time stamp; providing the rendered frame including the at least one graphical symbol to at least one display (10a, 10b); capturing, by at least one camera (12), at least a portion of the display comprising the at least one graphical symbol (18, 20) of the time stamp; determining from the at least one captured graphical symbol the rendered time stamp; and comparing the rendered time stamp with a time reference for obtaining the latency of the graphical display output.

Description

  • The present invention concerns a method and a system for measuring the latency of a graphical display output.
  • A system is known in which a high-speed camera is used to measure input and graphics latency by counting the frames between a visible mouse click and a display update. This system requires user interaction.
  • In other systems, a test display is compared with a control display, usually a CRT, and stopwatch software displays a running stopwatch concurrently on both displays. The time lag between the two displays is measured by taking a photograph of the displays.
  • Further, there exist some standards, in particular ISO 13406-2, where a monitor is switched from black to white and vice versa and the time between 10 and 90 percent brightness is measured. Another standard is ISO 9241-305, where not only black and white but also grey values are used.
  • The inconvenience of these systems is that they also take into account other lags or latencies, for example the latency of processing the input device (input latency), without the ability to determine the graphics latency alone. Further, they do not take into account the time a display needs to show a complete frame.
  • Update latency is important in the automotive domain, since various safety applications can require driver attention, and low latency can give the driver more time to handle a dangerous situation if the software is not able to do so.
  • The object of the invention is to provide a method and a system which provide an improved measurement of the latency of the graphical output.
  • According to one aspect, a method is provided for measuring the latency of a graphical display output, comprising the following steps:
    • obtaining, by a processor, a time value;
    • generating, by the processor, a time stamp based on the time value;
    • transmitting, by the processor, the time stamp to a graphics processor unit, GPU,
    • rendering, by the GPU, a display frame including at least one graphical symbol being based on the time stamp;
    • providing the rendered frame including the at least one graphical symbol to at least one display;
    • capturing, by at least one camera, at least a portion of the display comprising the at least one graphical symbol of the time stamp;
    • determining from the at least one captured graphical symbol the rendered time stamp; and
    • comparing the rendered time stamp with a time reference for obtaining the latency of the graphical display output.
  • Further embodiments may relate to one or more of the following features, which may be combined in any technically feasible combination:
    • the processor transmits the time stamp to the GPU as one or more characters and/or one or more number values, in particular as non-graphical number values;
    • the at least one graphical symbol is a visually readable graphically coded number;
    • the graphical symbol is a bar code, in particular a one dimensional or two dimensional bar code;
    • the at least one graphical symbol is a black and white bar code;
    • the camera is adapted to capture at least 120 frames per second and/or the camera capture rate is at least twice the display update frequency of the at least one display;
    • two graphical symbols depending on the time stamp are arranged at the top and the bottom of the display frame, in particular adjacent the top edge and adjacent the bottom edge of the display frame;
    • the coded time stamp is compared with the time reference only if the graphical symbol at the top of the display and the graphical symbol at the bottom of the display are based on the same time stamp;
    • the time value is obtained from a network time protocol;
    • the time stamp generated comprises reduced time information with respect to the time value;
    • capturing comprises capturing, by the camera, rendered frames of at least two displays, each of the displays displaying display frames including at least one graphical symbol based on a time stamp, wherein at least one of the at least two displays is a reference display, in particular a cathode ray tube display, adapted to display a frame within a predetermined time, for example 50 ms, wherein the time reference is obtained by determining, from the captured graphical symbol of the reference display, the rendered time stamp;
    • the method further comprises assigning to each captured frame of the camera a time value indicating when the frame was captured by the camera, wherein determining the time reference includes obtaining the time value assigned to the captured frame of the camera, wherein in particular the at least one camera obtains a time from the network time protocol and determines a time value depending on the obtained time; and/or
    • the method further comprises determining the workload of the processor and recording a plurality of latency values and a plurality of workload values in at least one database, such that the latency values are associated with workload values.
  • According to another aspect, a system is provided for measuring the latency of a graphical display output, comprising a first device having a processor and a graphics processing unit, GPU, connected to the at least one processor, the GPU being connected to at least one first display, wherein the processor is adapted to obtain a time value, to generate a time stamp based on the time value and to transmit the time stamp to the GPU, wherein the GPU is adapted to render a display frame including at least one graphical symbol based on the time stamp and to provide the rendered frame including the at least one graphical symbol to the at least one first display; wherein the system further comprises at least one camera adapted to capture at least a portion of the first display comprising the at least one graphical symbol of the time stamp, wherein the system is adapted to determine from the at least one captured graphical symbol the rendered time stamp, and to compare the rendered time stamp with a time reference for obtaining the latency of the graphical display output.
  • Further embodiments may relate to one or more of the following features, which may be combined in any technically feasible combination:
    The system further comprises a second device having a processor and a graphics processing unit, GPU, connected to the at least one processor, the GPU being connected to at least one second display, wherein the processor of the second device is adapted to obtain a time value, to generate a time stamp based on the time value and to transmit the time stamp to the GPU of the second device, wherein the GPU of the second device is adapted to render a display frame including at least one graphical symbol based on the time stamp and to provide the rendered frame including the at least one graphical symbol to the at least one second display, wherein the first and second device obtain a time value from the same time source, wherein the camera is adapted to capture concurrently at least a portion of the first display and at least a portion of the second display comprising the at least one graphical symbol of the time stamp, wherein the system is adapted to determine from the at least one captured graphical symbol of the second display the rendered time stamp corresponding to the time reference.
  • Embodiments are also directed to the system for carrying out the disclosed method steps and in particular including apparatus parts and/or devices for performing the described method steps.
  • The method steps may be performed by way of hardware components, firmware, software, a computer programmed by appropriate software, by any combination thereof or in any other manner.
  • According to a further aspect, a computer program product is provided comprising commands for executing the method according to an embodiment disclosed herein, when loaded and executed on a processor. According to an embodiment, a computer program product may be a physical software product, for example a hard disc, a solid state disc, a CD-ROM or a DVD, comprising the program.
  • According to other aspects, the present invention relates to a non-volatile memory, for example a hard disc, a solid state device, a CD-ROM or a DVD, including a program containing commands for executing the method according to an embodiment disclosed herein, when loaded and executed on a processor.
  • Further advantages, features, aspects and details are evident from the dependent claims, the description and the drawings.
  • So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be read by reference to embodiments. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
  • The accompanying drawings relate to embodiments of the invention and are described in the following:
    • Fig. 1 shows schematically a system for measuring the latency of a graphical display output according to an embodiment,
    • Fig. 2 shows schematically the display screen of such a system;
    • Fig. 3 shows schematically a system for measuring the latency of a graphical display output according to an embodiment using two displays;
    • Fig. 4 shows schematically a system for measuring the latency of a graphical display output according to an embodiment with a single display; and
    • Fig. 5 shows a flowchart of a method for measuring the latency of a graphical display output according to an embodiment.
  • Figure 1 shows schematically a system for measuring the latency of a graphical display output. The system includes a device 3 for generating at least one display output. According to an embodiment, the device 3 comprises a processor 5 and a graphics processing unit (GPU) 7. Further, the device 3 optionally includes one or more hardware controllers 6 for a network and/or a bus 6a. The GPU is connected to one or more displays 10a, 10b, and at least one camera 12 is provided to capture a picture of the one or more displays 10a, 10b.
  • The processor 5 may include one or more cores. For example, the processor is a central processing unit (CPU). Typically, each core is an independent processing unit, independent from the other cores. The cores enable parallel computing in the processor 5.
  • According to an embodiment, the device 3 may comprise an interface for connecting to one or more bus systems, for example one or more hardware controllers 6 for controller area network (CAN) busses 6a and/or one or more hardware controllers 6 for FlexRay busses 6a. In other embodiments, the device may also comprise further hardware controllers 6 for connecting to one or more wired or wireless networks 6a, for example a Bluetooth connection, an Ethernet network and/or a USB (Universal Serial Bus) connection.
  • According to embodiments, a time value is obtained by the processor 5 for example from an internal clock 9 of the device 3 and/or of the processor 5. In other embodiments, the processor 5 obtains a time value via a computer network 9, for example using a network time protocol (NTP). The network time protocol is defined in different standards, for example RFC 5905 or RFC 1305. The network time protocol is provided for synchronization between computer systems over variable latency data networks.
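  • As a purely illustrative sketch (not part of the claimed method), a time value could be obtained from an NTP time source as follows; the third-party Python library ntplib and the server address pool.ntp.org are assumptions of this sketch:

```python
# Illustrative sketch only: obtaining a time value from an NTP time source.
# Assumes the third-party "ntplib" package and a reachable NTP pool server.
import ntplib

def obtain_time_value_ntp(server: str = "pool.ntp.org") -> float:
    """Return the current time as seconds since the Unix epoch, queried via NTP."""
    client = ntplib.NTPClient()
    response = client.request(server, version=3)
    return response.tx_time  # server transmit time, in seconds since the epoch
```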
  • The processor 5 is adapted to generate the time stamp from the obtained time value and to transmit the time stamp to the GPU 7, in particular immediately after the generation of the time stamp. In other words, the time stamp is selected or generated by the processor 5 immediately before transmitting a rendering request to the GPU 7. The resulting latency is therefore the time between selecting or generating the time stamp on the processor 5 and displaying the time stamp in the form of a barcode on the one or more displays 10a, 10b.
  • The time value is for example the astronomic time. In an embodiment, the time value includes year, month, day, hours, minutes, seconds and/or milliseconds. For example, the time value may include all of this information. Instead of hours, minutes, seconds, and milliseconds, the time value may include the milliseconds elapsed from the start of the day.
  • According to embodiments, the time stamp generated comprises reduced time information compared to the time value. In other words, the processor 5 is adapted to reduce the time information with respect to the time value. For example, the time stamp includes the milliseconds elapsed from the start of the day. In an embodiment, the time stamp does not include information about the year, the month and/or the day. For example, the processor 5 removes the information about the year, the month and/or the day. Thus, the amount of information that needs to be transferred to the GPU 7 is reduced. Further, portions that are unlikely to change during the test, like the day, month or year, are filtered out. Therefore, a reduced time stamp is transferred to the GPU 7.
  • The time stamp includes for example an absolute time or a relative time, in particular of the time when the time stamp is created. A relative time is the time passed since a specific starting time. For example, the time stamp can include the number of seconds and/or milliseconds since the start of the day. In such a case, the specific starting time is the start of the day, for example midnight. The time stamp includes one or more characters and/or numerical values representing the time. In other words, the time stamp does not include a graphical representation.
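  • To make the idea of a reduced, relative time stamp concrete, the following minimal Python sketch converts an absolute time value into the number of milliseconds elapsed since the start of the day; it is an illustration only, not the claimed implementation:

```python
# Illustrative sketch: reduce an absolute time value to milliseconds since midnight.
from datetime import datetime, timezone

def reduced_time_stamp() -> int:
    """Milliseconds elapsed since the start of the current day (0..86_399_999)."""
    now = datetime.now(timezone.utc)
    midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
    return int((now - midnight).total_seconds() * 1000)
```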
  • The graphics processing unit (GPU) 7 is a device specifically adapted for calculating and/or rendering graphics, for example two or three dimensional graphics. For that purpose the GPU 7 has a highly parallel structure, where processing of data is done in parallel.
  • According to an embodiment, the GPU 7 is separate from the one or more processors 5. In other embodiments, the GPU 7 is integrated into the one or more CPUs 5. Typically, a GPU 7 transmits a rendered frame to the one or more displays 10a, 10b, for example directly via a specific connection.
  • In some embodiments, the GPU 7 is connected via the hardware controller 6 and the bus and/or the network 6a to the one or more displays 10a, 10b. The GPU 7 is then adapted to transmit the rendered frames via the bus and/or the network 6a to the one or more displays 10a, 10b.
  • According to an embodiment, rendering is the process of generating an image or a display frame for one or more displays from the information, here the time stamp, received from the processor 5. For example, models or scene files, which are virtual models defining the characteristics of objects and lights, are used as a basis for rendering the frame. For example, during rendering it is determined which objects are visible to the observer, whether the objects are shaded or not, the light distribution within the scene, etc.
  • According to an embodiment, the GPU is adapted to generate or render display frames from a textual and/or numerical input value, in particular the time stamp received from the processor 5. For that purpose, the GPU can be programmed.
  • For example, the GPU is adapted to generate or render, from the time stamp including the one or more characters and/or numerical values, at least one graphical symbol in a display frame. The graphical symbol is for example a bar code, in particular a one dimensional or two dimensional bar code. The bar code may be a black and white bar code. According to an embodiment, the time stamp is transformed into a UPC-A (Universal Product Code) barcode.
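  • As an illustration of the encoding step (a sketch only, not the patented GPU rendering and not a complete, standards-compliant UPC-A encoder: there is no check digit and no right-hand parity), the digits of the reduced time stamp could be mapped onto black and white modules using the UPC/EAN left-hand digit patterns plus start and end guard bars:

```python
# Illustrative sketch: encode the digits of a time stamp as a 1-D black/white
# bar pattern ('1' = black module, '0' = white module).  Uses the UPC/EAN
# left-hand ("L") digit patterns and guard bars, but is NOT a complete UPC-A
# encoder; it only shows the idea of turning a numeric time stamp into bars.
L_PATTERNS = {
    "0": "0001101", "1": "0011001", "2": "0010011", "3": "0111101",
    "4": "0100011", "5": "0110001", "6": "0101111", "7": "0111011",
    "8": "0110111", "9": "0001011",
}

def time_stamp_to_modules(ms_since_midnight: int, digits: int = 8) -> str:
    """Return a string of '0'/'1' modules representing the time stamp."""
    text = str(ms_since_midnight).zfill(digits)
    body = "".join(L_PATTERNS[d] for d in text)
    return "101" + body + "101"  # start guard + encoded digits + end guard

# Example: time_stamp_to_modules(34567890) yields the module pattern that the
# GPU would rasterise into black and white bars inside the display frame.
```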
  • Figure 2 shows schematically an embodiment of a display frame sent to a display 10a, 10b. The display frame has a top edge 14 and a bottom edge 16. Two graphical symbols 18, 20 calculated based on the time stamp are arranged at the top and the bottom of the display frame, in particular adjacent the top edge 14 and adjacent the bottom edge 16 of the display frame. In other embodiments, the display frame may include more or fewer graphical symbols based on the time stamp. For example, the display frame may include a single graphical symbol or three or more graphical symbols based on the time stamp.
  • In the embodiment shown in Figure 2, the graphical symbols 18 and 20 are different even though they are based on the same time stamp. For example, the top graphical symbol and the bottom graphical symbol encode the same time stamp in a different manner. In the example shown, the encoded digits of the top and bottom graphical symbols 18, 20 at the same position are complemented to a predetermined number, for example 9. In other words, if at a first position the digit is 1 in the top graphical symbol 18, the first position of the bottom graphical symbol 20 is 8.
  • This makes it possible to determine in a simple manner whether the frame has been displayed completely, by comparing the top graphical symbol 18 and the bottom graphical symbol 20. In other words, it is determined whether the graphical symbols 18, 20 are based on or refer to the same time stamp. In the example shown, the sum of the encoded time stamps equals a predetermined number, which in the example of Figure 2 is 9999999. If this is not the case, the camera 12 might have captured a state of the display 10a, 10b in the middle of a frame update.
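  • The completeness check described above can be sketched as follows; the fixed width of 7 digits matches the example of Figure 2 and is an assumption of this sketch:

```python
# Illustrative sketch: build the top/bottom digit strings for one time stamp and
# verify that a captured frame shows a completely updated display.
WIDTH = 7  # number of digits, as in the example of Figure 2

def top_and_bottom_digits(time_stamp: int) -> tuple[str, str]:
    top = str(time_stamp).zfill(WIDTH)
    bottom = "".join(str(9 - int(d)) for d in top)  # digit-wise complement to 9
    return top, bottom

def frame_is_complete(decoded_top: str, decoded_bottom: str) -> bool:
    """True if both decoded symbols stem from the same time stamp."""
    return int(decoded_top) + int(decoded_bottom) == 10**WIDTH - 1  # 9999999

# Example: top_and_bottom_digits(1234567) returns ("1234567", "8765432"),
# and 1234567 + 8765432 == 9999999, so such a frame would be accepted.
```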
  • The generated display frame is then transmitted by the GPU 7 to the one or more displays 10a, 10b, either via a direct connection as shown in Figure 1 or via a bus and/or network connection using the hardware controller 6 and the bus and/or network 6a.
  • The display or displays 10a, 10b then display the received display frame upon reception.
  • In case only a single display 10a is used, the display 10a is the display, where the latency has to be measured.
  • The latency is the time from generation of the time stamp provided to the GPU until the complete display of the display frame at the display. In other words, the graphical latency is determined in the present disclosure.
  • In case a plurality of displays 10a, 10b is used, the latency may be measured for all of the displays 10a, 10b.
  • In other embodiments, in case of a plurality of displays 10a, 10b, one of the displays, for example display 10b, is a reference display. The reference display is a display with known display capabilities, for example having a known display latency. In an example, the reference display is adapted to display a complete frame within 50 milliseconds after the display frame has been provided to the input port of the display. Then, the latency of a display may be measured with respect to the reference display. If, for example, the same GPU is used, the latency due to the GPU 7 can be considered being the same for all displays.
  • In other systems, two or more devices 3 are used, each connected to a respective display 10a, 10b. In such a case, the time stamps of the devices 3 are synchronized using the NTP protocol. Typically, in such systems the time difference between the different devices 3 is below 1 millisecond.
  • In an embodiment, the reference display is a cathode ray tube (CRT) display. A CRT is considered to have nearly no latency itself.
  • The system further includes the at least one camera 12. The camera 12 has a field of view 22. The field of view 22 of the camera(s) 12 is arranged such that one or more portions of the display frame including the graphical symbols 18, 20 are included in the field of view 22.
  • In an embodiment, a single camera 12 is used capturing all graphical symbols 18, 20 of all the displays 10a, 10b.
  • The at least one camera 12 is for example a camera capable of capturing 120 frames per second. In an embodiment, the required camera speed depends on the display update frequency. For example, for a display update rate of 60 Hz, the minimum camera speed is 120 frames per second, and preferably 240 frames per second. In some embodiments, the camera capture rate should be at least twice the display update frequency, in particular that of the fastest display 10a, 10b captured by the camera 12. This may for example be necessary in order to capture complete frames. In some embodiments, the capture rate of the camera is at least four times the display update rate.
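  • The capture-rate rule of thumb can be written down in one line; the factors 2 and 4 are the margins mentioned above, and the concrete numbers are only an example:

```python
# Illustrative sketch: minimum camera frame rate for a given display update rate.
def required_camera_fps(display_hz: float, factor: float = 2.0) -> float:
    """The camera capture rate should be at least `factor` times the display rate."""
    return factor * display_hz

# Example: a 60 Hz display needs at least 120 fps (factor 2), preferably 240 fps (factor 4).
assert required_camera_fps(60) == 120.0
assert required_camera_fps(60, factor=4) == 240.0
```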
  • In some embodiments, the precision of the disclosed method may be improved by using displays with a higher update frequency and cameras with a higher frames-per-second capture rate.
  • The at least one camera 12 may be connected to an evaluation device 24 in order to provide the captured frames to the evaluation device 24. The evaluation device 24 is provided to determine the latency of the at least one display 10a, 10b. For example, the at least one camera 12 is connected via the network and/or bus 6a to the evaluation device 24. In other embodiments, the at least one camera 12 is directly connected to the evaluation device 24, for example using a USB (Universal Serial Bus) connection.
  • In some embodiments, the evaluation device 24 is integrated into the device 3. In other words, the evaluation device 24 and the device 3 are the same device.
  • In some embodiments, which may be combined with other embodiments, the at least one camera 12 is adapted to obtain a time value, for example from the device 3, the evaluation device 24 and/or the bus and/or network 6a using the network time protocol.
  • The at least one camera 12 may associate the time value with a captured frame. For example, the time value can be stored with the captured frame.
  • According to some embodiments, the at least one camera 12 is capturing frames for a predetermined time, for example several minutes.
  • The captured frames are provided to the evaluation device 24. The evaluation device 24 is adapted to analyze each captured frame. For example, the evaluation device is adapted to convert the graphical symbols 18, 20 into numbers and/or a time stamp. For that purpose, in case the graphical symbol is a barcode, automated barcode recognition software may be used.
  • The evaluation device 24 is then adapted to compare the decoded time stamp with a time reference. In some embodiments, the decoded time stamp is compared with the time reference if, in particular only if, the graphical symbol at the top of the display and the graphical symbol at the bottom of the display refer to the same time stamp. For example, this is done by adding the numbers of the decoded symbols and comparing the sum with a predefined number, as explained above.
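  • As a sketch of what the evaluation device 24 could do for one captured frame, the snippet below decodes the two barcodes and applies the complement check; OpenCV and pyzbar are assumptions of this sketch, since the patent does not prescribe any particular barcode recognition software, and the symbols are assumed to encode the 7-digit numbers directly:

```python
# Illustrative sketch: decode the top and bottom barcodes in one captured camera
# frame and keep the time stamp only if the frame shows a complete display update.
# OpenCV + pyzbar are assumptions of this sketch, not requirements of the patent.
from typing import Optional

import cv2
from pyzbar.pyzbar import decode

def decode_time_stamp(frame) -> Optional[int]:
    """Return the rendered time stamp, or None if the frame is unusable."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    symbols = decode(gray)
    if len(symbols) != 2:
        return None                          # expect exactly a top and a bottom symbol
    symbols.sort(key=lambda s: s.rect.top)   # symbol nearer the top of the image first
    top = int(symbols[0].data.decode("ascii"))
    bottom = int(symbols[1].data.decode("ascii"))
    if top + bottom != 9999999:              # frame caught in the middle of an update
        return None
    return top                               # the top symbol carries the time stamp
```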
  • In other words, display frames that were captured in the middle of a display update, meaning that the graphical symbols at the top and the bottom of the display refer to different time stamps, are discarded.
  • Figure 3 shows a system with a first display 10a and a second display 10b, wherein the second display is a reference display. The displays 10a and 10b may be connected to the same device 3 or to two different devices 3. In the second case (two different devices 3, for example a first device 3 and a second device 3), each device 3 obtains the time value from the same time source, for example via the network time protocol and/or the clock 9. Thus, both devices 3 are synchronized to the same clock. Both displays 10a, 10b, for example a first display 10a connected to the GPU of the first device and a second display 10b connected to the GPU 7 of the second device 3, are within the field of view 22 of the camera 12.
  • The evaluation device 24 then determines, in a first step, the time stamps of a completely displayed display frame from the graphical symbols 18, 20. In a second step, the time stamp of the first display 10a is compared with the time stamp of the second display 10b. In case the second display 10b is a reference display, the latency of the first display 10a can be calculated. In some embodiments, the time stamp difference is calculated and analyzed. This allows an automatic comparison of the graphical latency between the two displays. In the embodiment of Figure 3, it is not necessary to associate a captured frame of the camera with a time stamp, but it is also possible.
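  • Assuming that both displays show time stamps in milliseconds since midnight and that display 10b is the reference display, the latency of display 10a in one captured frame could be computed as sketched below (the wrap-around handling at midnight is an added assumption):

```python
# Illustrative sketch: latency of the display under test relative to the reference
# display, from the two time stamps decoded in one and the same camera frame.
MS_PER_DAY = 24 * 60 * 60 * 1000

def latency_vs_reference(stamp_under_test_ms: int, stamp_reference_ms: int) -> int:
    """The reference display is ahead, so its time stamp is the newer one."""
    return (stamp_reference_ms - stamp_under_test_ms) % MS_PER_DAY

# Example: reference display shows 43200150 ms while the display under test still
# shows 43200030 ms -> 120 ms graphical latency relative to the reference display.
```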
  • Typical barcode recognition software handles many types of image distortions, for example camera angles, perspective distortions, and allows reliable timestamp recognition.
  • In another embodiment, for example as shown in Figure 4, as indicated above, the captured frames of the at least one camera 12 are associated with a time value indicating when the frame was captured by the at least one camera 12. For example, the camera 12 saves the time of the start of recording, and the time value or time stamp of each frame can then be calculated from the time of the start of recording, the number of the captured frame and a specified, predefined frame rate. In other words, a time value or a time stamp is embedded into or associated with each captured frame.
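  • A minimal sketch of this per-frame time calculation (frame numbering starting at 0 is an assumption):

```python
# Illustrative sketch: capture time of the n-th camera frame, computed from the
# recording start time and a fixed, predefined camera frame rate.
def frame_capture_time(start_time_s: float, frame_index: int, camera_fps: float) -> float:
    """Capture time of frame `frame_index` in seconds (frame 0 at start_time_s)."""
    return start_time_s + frame_index / camera_fps

# Example: recording started at t = 1000.0 s, so frame 30 of a 120 fps camera
# was captured at t = 1000.25 s.
assert frame_capture_time(1000.0, 30, 120.0) == 1000.25
```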
  • The evaluation device 24 is adapted to receive the captured frames from the at least one camera 12 and to determine, optionally, a completely displayed display frame.
  • Then, the time stamp of the captured display frame is determined from the at least one graphical symbol 18, 20 in the captured frame. The evaluation device is adapted to compare the determined time stamp with the time value associated with the captured frame from which the time stamp was determined, in order to determine the graphical latency. For example, a difference between the determined time stamp and the associated time value is calculated in order to determine the graphical latency.
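  • Continuing the sketches above and assuming that both the decoded time stamp and the frame time value are expressed in milliseconds since midnight of the same day, the graphical latency for one captured frame is then a simple difference:

```python
# Illustrative sketch: graphical latency without a reference display, as the
# difference between the capture time of the camera frame and the time stamp
# decoded from the barcode shown in that frame (both in ms since midnight).
MS_PER_DAY = 24 * 60 * 60 * 1000

def graphical_latency_ms(frame_time_ms: int, decoded_stamp_ms: int) -> int:
    return (frame_time_ms - decoded_stamp_ms) % MS_PER_DAY

# Example: frame captured at 43200150 ms, barcode decodes to 43200020 ms
# -> 130 ms between generating the time stamp and the complete display update.
```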
  • In such a case, no reference system is necessary. In other embodiments, also more than a single display can be evaluated at the same time with a single camera 12. In other embodiments, the systems described with respect to Figures 3 and 4 could be combined.
  • Figure 5 shows a flow chart of a method according to an embodiment. In a first step 100, the processor 5 obtains a time value, for example from the clock 9 or via the network and/or bus 6a using the network time protocol. Thus, the clock 9 and the network time protocol each serve as a time source.
  • In step 102, the processor 5 determines or creates a time stamp based on the time value and transmits the time stamp to the GPU 7.
  • In the next step 104, the GPU calculates at least one graphical symbol 18, 20 from the time stamp and renders a display frame including the at least one graphical symbol 18, 20. In an embodiment, the calculation of the at least one graphical symbol and the rendering of the display frame can take place at the same time during the same calculation.
  • For example, the rendered display frame includes two graphical symbols, a first graphical symbol 18 at the top edge of the display frame and a second graphical symbol 20 at the bottom edge of the display frame.
  • In the next step 106, the GPU 7 provides the rendered display frame including the at least one graphical symbol to at least one display 10a, 10b, for example via a respective direct connection, as shown in Figure 1, or via a network and/or bus connection 6a.
  • In step 108 the at least one camera 12 captures at least a portion of the at least one display 10a, 10b comprising the graphical symbol 18, 20 of the time stamp. The at least one camera 12 provides the captured frame to the evaluation device 24. Preferably, the camera captures all graphical symbols of the display frame displayed by the at least one display 10a, 10b.
  • Finally, for example by the evaluation device 24, the time stamp is determined or decoded from the captured graphical symbol(s) and the time stamp is compared with a time reference in step 110.
  • In some embodiments, which may be combined with other embodiments disclosed herein, the workload of the processor 5 and/or the GPU 7 may be varied over time and the latency of the display is determined over time. For that purpose, the evaluation device 24 may store the latency and workload values, in particular subsequent latency values of the same display 10a, 10b and/or subsequent workload values, in at least one database, in particular so that the workload values of the processor 5 can be associated with corresponding individual latency values. The workload is the amount of work that the processor 5 and/or the GPU 7 needs to do.
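  • A minimal sketch of how the evaluation device 24 might record associated workload and latency values; SQLite as the database and the column layout are assumptions of this sketch:

```python
# Illustrative sketch: store associated (workload, latency) measurements per display
# in an SQLite database so that latency can later be analysed as a function of workload.
import sqlite3

def open_db(path: str = "latency.db") -> sqlite3.Connection:
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS measurements ("
        " t_ms INTEGER, display TEXT, workload_percent REAL, latency_ms REAL)"
    )
    return con

def record(con: sqlite3.Connection, t_ms: int, display: str,
           workload_percent: float, latency_ms: float) -> None:
    con.execute("INSERT INTO measurements VALUES (?, ?, ?, ?)",
                (t_ms, display, workload_percent, latency_ms))
    con.commit()

# Example: record(open_db(), 43200150, "10a", 85.0, 130.0)
```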
  • Depending on the nature of the running applications, the workload can have peaks, when the processor 5 and/or the GPU 7 is suddenly very busy with calculations. During these peaks the latency of many parameters of the system usually also increases.
  • According to embodiments, in particular for safety applications, the latency is continuously measured.
  • The present disclosure enables a precise detection of graphical latency.
  • According to the invention, the task of comparing the performance of different solutions for transferring graphics, in particular over the network, as well as measuring the latency with which the display gets updated, is solved.
  • Further, the disclosure allows automatic measurement of latency, producing more data points at a much higher frequency and allowing a more complex analysis of the graphical latency distribution under different workloads.
  • Further, the method and the system according to the invention enable an automatic latency measurement. Any consumer-grade camera capable of capturing 120 frames per second or more can be used to measure graphics latency, so that it is not necessary to use industrial-grade high-speed cameras.
  • The present disclosure can be used as part of continuous integration regression tests for graphical system developers, detecting latency problems as soon as they are introduced.
  • In some examples of implementation, any feature of any embodiment described herein may be used in combination with any feature of any other embodiment described herein.

Claims (15)

  1. Method for measuring the latency of a graphical display output, comprising the following steps:
    - obtaining, by a processor (5), a time value;
    - generating, by the processor (5), a time stamp based on the time value;
    - transmitting, by the processor, the time stamp to a graphics processing unit (7), GPU;
    - rendering, by the GPU (7), a display frame including at least one graphical symbol being based on the time stamp;
    - providing the rendered frame including the at least one graphical symbol to at least one display (10a, 10b);
    - capturing, by at least one camera (12), at least a portion of the display comprising the at least one graphical symbol (18, 20) of the time stamp;
    - determining from the at least one captured graphical symbol the rendered time stamp; and
    - comparing the rendered time stamp with a time reference for obtaining the latency of the graphical display output.
  2. Method according to claim 1, wherein the processor transmits to the GPU the time stamp as one or more characters and/or one or more number values, in particular as non-graphical number values.
  3. Method according to claim 1 or 2, wherein the at least one graphical symbol (18, 20) is a visually readable graphically coded number.
  4. Method according to one of the preceding claims, wherein the graphical symbol (18, 20) is a bar code, in particular a one dimensional or two dimensional bar code.
  5. Method according to one of the preceding claims, wherein the at least one graphical symbol (18, 20) is a black and white bar code.
  6. Method according to one of the preceding claims, wherein the camera (12) is adapted to capture at least 120 frames per second and/or the camera capture rate is at least twice the display update frequency of the at least one display (10a, 10b).
  7. Method according to one of the preceding claims, wherein two graphical symbols (18, 20) depending on the time stamp are arranged at the top and the bottom of the display frame, in particular adjacent the top edge and adjacent the bottom edge of the display frame.
  8. Method according to claim 7, wherein the coded time stamp is compared with the time reference only if the graphical symbol (18) at the top of the display and the graphical symbol (20) at the bottom of the display (10a, 10b) are based on the same time stamp.
  9. Method according to one of the preceding claims, wherein the time value is obtained from a network time protocol.
  10. Method according to one of the preceding claims, wherein the time stamp generated comprises reduced time information with respect to the time value.
  11. Method according to one of the preceding claims, wherein capturing comprises capturing, by the camera (12), rendered frames of at least two displays (10a, 10b), each of the displays displaying display frames including at least one graphical symbol (18, 20) based on a time stamp, wherein at least one of the at least two displays is a reference display, in particular a cathode ray tube display, adapted to display a frame within a predetermined time, for example 50 ms, wherein the time reference is obtained by determining, from the captured graphical symbol of the reference display (10b), the rendered time stamp.
  12. Method according to one of the claims 1 to 10, further comprising:
    - assigning to each captured frame of the camera a time value indicating when the frame was captured by the camera (12);
    - determining the time reference includes obtaining the time value assigned to the captured frame of the camera (12), wherein, in particular, the at least one camera (12) obtains a time from the network time protocol and determines a time value depending on the obtained time.
  13. Method according to one of the preceding claims, further comprising:
    determining the workload of the processor (5);
    recording a plurality of latency values and a plurality of workload values in at least one database, such that the latency values are associated with the workload values.
  14. System for measuring the latency of a graphical display output, comprising a first device (3) having a processor (5) and a graphics processing unit (7), GPU, connected to the at least one processor (5), the GPU being connected to at least one first display (10a, 10b), wherein the processor is adapted to obtain a time value, to generate a time stamp based on the time value and to transmit the time stamp to the GPU, wherein the GPU (7) is adapted to render a display frame including at least one graphical symbol (18, 20) being based on the time stamp and to provide the rendered frame including the at least one graphical symbol to the at least one first display (10a); wherein the system further comprises at least one camera (12) adapted to capture at least a portion of the first display (10a) comprising the at least one graphical symbol (18, 20) of the time stamp, wherein the system is adapted to determine from the at least one captured graphical symbol (18, 20) the rendered time stamp, and to compare the rendered time stamp with a time reference for obtaining the latency of the graphical display output.
  15. System according to claim 14, further comprising a second device (3) having a processor (5) and a graphics processing unit (7), GPU, connected to the at least one processor (5), the GPU being connected to at least one second display (10b), wherein the processor (5) of the second device (3) is adapted to obtain a time value, to generate a time stamp based on the time value and to transmit the time stamp to the GPU of the second device (3), wherein the GPU (7) of the second device (3) is adapted to render a display frame including at least one graphical symbol (18, 20) being based on the time stamp and to provide the rendered frame including the at least one graphical symbol to the at least one second display (10b), wherein the first and second devices obtain a time value from the same time source, wherein the camera (12) is adapted to capture concurrently at least a portion of the first display (10a) and at least a portion of the second display (10b) comprising the at least one graphical symbol (18, 20) of the time stamp, wherein the system is adapted to determine from the at least one captured graphical symbol (18, 20) of the second display (10b) the rendered time stamp corresponding to the time reference.

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20197801.2A EP3975159A1 (en) 2020-09-23 2020-09-23 Method and a system for measuring the latency of a graphical display output

Publications (1)

Publication Number Publication Date
EP3975159A1 true EP3975159A1 (en) 2022-03-30

Family

ID=72658955


Country Status (1)

Country Link
EP (1) EP3975159A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016116107A (en) * 2014-12-16 2016-06-23 株式会社日立国際電気 Delay time measurement system and camera

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220928

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230509

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240103