WO2020140808A1 - Compensation method and compensation device for VR display, and display device - Google Patents

Compensation method and compensation device for VR display, and display device

Info

Publication number
WO2020140808A1
WO2020140808A1 (PCT/CN2019/128273; CN2019128273W)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
compensation
frame
synchronization signal
rendering
Prior art date
Application number
PCT/CN2019/128273
Other languages
English (en)
French (fr)
Inventor
索健文
许景涛
王雪丰
李文宇
苗京花
王亚坤
彭金豹
赵斌
李治富
李茜
范清文
孙玉坤
张浩
陈丽莉
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 and 北京京东方光电科技有限公司
Priority to US17/254,981 (granted as US11302280B2)
Publication of WO2020140808A1

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/30Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
    • G09G3/32Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
    • G09G3/3208Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED] organic, e.g. using organic light-emitting diodes [OLED]
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/36Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G5/005Adapting incoming signals to the display format of the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • G09G2340/0435Change or adaptation of the frame rate of the video stream
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/16Determination of a pixel data signal depending on the signal applied in the previous frame
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2350/00Solving problems of bandwidth in display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G2370/042Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification

Definitions

  • the present disclosure belongs to the field of display technology, and in particular relates to a VR display compensation method, a compensation device, and a display device.
  • VR (Virtual Reality) technology, using simulation technology together with computer graphics, human-machine interface technology, multimedia technology, sensor technology, network technology, and the like, can simulate a virtual environment and immerse users in that virtual environment.
  • Virtual reality (VR) technology focuses on the user's experience and can improve the user's visual effect with ultra-high resolution, thereby making the experience richer and more realistic.
  • An aspect of the present disclosure provides a VR display compensation method, including: calculating a first synchronization signal value to be output by a controller according to the rendering resolution, rendering frame rate, and bandwidth data of a display panel; comparing the first synchronization signal value with a pre-stored second synchronization signal value of the display panel; generating compensation image data between two adjacent frames of original image data when the first synchronization signal value is greater than the second synchronization signal value; and compensating the image to be displayed according to the generated compensation image data.
  • when the first synchronization signal value is compared to be greater than the second synchronization signal value, the generated compensation image data between the two adjacent frames of original image data is one frame of compensation image data;
  • the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1;
  • the step of generating the compensation image data between the two adjacent frames of original image data includes: acquiring estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it; and calculating a compensation matrix according to the image acceleration of the Nth frame of original image data and the estimated compensation image data, to obtain the final compensation image data.
  • when the first synchronization signal value is compared to be greater than the second synchronization signal value, the generated compensation image data between the two adjacent frames of original image data is multiple frames of compensation image data;
  • the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1;
  • the step of generating the compensation image data between the two adjacent frames of original image data includes: acquiring a first frame of estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it; calculating a compensation matrix of the first frame of estimated compensation image data according to the image acceleration of the Nth frame of original image data and the first frame of estimated compensation image data, to obtain the first frame of final compensation image data; acquiring the (M+1)th frame of estimated compensation image data according to the Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1; and calculating the compensation matrix of the (M+1)th frame of estimated compensation image data according to the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of estimated compensation image data, to obtain the (M+1)th frame of final compensation image data.
  • the number of frames of the generated compensation image data is positively correlated with the difference between the first synchronization signal value and the second synchronization signal value.
  • the step of calculating the value of the first synchronization signal to be output by the controller according to the rendering resolution, rendering frame rate, and bandwidth data of the display panel includes: obtaining the rendering resolution, rendering frame rate, and bandwidth data of the display panel; calculating a rendering bandwidth according to the rendering resolution and rendering frame rate of the display panel; and calculating the first synchronization signal value according to the formula 1/B+A*24/(A-C), where A is the rendering bandwidth, B is the rendering frame rate, and C is the bandwidth data of the display panel.
  • the step of calculating the rendering bandwidth according to the rendering resolution and the rendering frame rate of the display panel includes: calculating the product of the rendering resolution and the rendering frame rate as the rendering bandwidth.
  • the second synchronization signal value is a fixed value that the display panel has when shipped from the factory.
  • a dynamic compensation device for VR display including:
  • the calculation unit is used to calculate the first synchronization signal value to be output by the controller according to the rendering resolution, rendering frame rate and bandwidth data of the display panel;
  • a comparison unit configured to compare the first synchronization signal value with the second synchronization signal value of the display panel stored in advance
  • a compensation image data generating unit configured to generate compensation image data between original image data of two adjacent frames when the comparison unit compares that the first synchronization signal value is greater than the second synchronization signal value
  • the compensation unit is configured to compensate the image to be displayed according to the compensation image data generated by the compensation image data generation unit.
  • the compensation image data generation unit includes a speculation subunit, a compensation matrix calculation subunit, and a compensation image data generation subunit.
  • when the comparison unit determines that the first synchronization signal value is greater than the second synchronization signal value, the compensation image data generated by the compensation image data generation unit between the two adjacent frames of original image data is one frame of compensation image data;
  • the two adjacent frame original image data are the Nth frame original image data and the N+1th frame original image data, respectively, where N is an integer greater than or equal to 1;
  • the speculation subunit is used to obtain speculative compensation image data based on the N-th frame original image data and the original image data of a continuous preset number of frames before it;
  • the compensation matrix calculation sub-unit is used to calculate a compensation matrix based on the image acceleration of the original image data of the Nth frame and the estimated compensation image data acquired by the speculation sub-unit;
  • the compensation image data generation subunit is used for generating compensation image data according to the compensation matrix calculated by the compensation matrix calculation subunit.
  • when the comparison unit determines that the first synchronization signal value is greater than the second synchronization signal value, the compensation image data generated by the compensation image data generation unit between the two adjacent frames of original image data is multiple frames of compensation image data;
  • the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1;
  • the speculation subunit is used to acquire the first frame of speculative compensation image data based on the Nth frame of original image data and a preset number of consecutive frames of original image data before it, and to acquire the (M+1)th frame of speculative compensation image data based on the Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1;
  • the compensation matrix calculation subunit is used to calculate the compensation matrix of the first frame of speculative compensation image data based on the image acceleration of the Nth frame of original image data and the first frame of speculative compensation image data, and to calculate the compensation matrix of the (M+1)th frame of speculative compensation image data based on the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of speculative compensation image data obtained by the speculation subunit; and
  • the compensation image data generation subunit is used to generate the first frame of final compensation image data according to the compensation matrix of the first frame of speculative compensation image data calculated by the compensation matrix calculation subunit, and to generate the (M+1)th frame of final compensation image data according to the compensation matrix of the (M+1)th frame of speculative compensation image data calculated by the compensation matrix calculation subunit.
  • the number of frames of the compensation image data generated by the compensation image data generating unit is positively correlated with the difference between the first synchronization signal value and the second synchronization signal value.
  • the calculation unit includes:
  • An obtaining subunit used to obtain the rendering resolution, rendering frame rate and bandwidth data of the display panel
  • a first calculation subunit configured to calculate a rendering bandwidth according to the rendering resolution and rendering frame rate of the display panel
  • the second calculation subunit is used to calculate the value of the first synchronization signal to be output by the controller according to the formula: 1/B+A*24/(A-C); where A is the rendering bandwidth, B is the rendering frame rate, and C is the bandwidth data of the display panel.
  • the first calculation subunit is used to calculate the product of the rendering resolution and the rendering frame rate as the rendering bandwidth.
  • the second synchronization signal value is a fixed value that the display panel has when shipped from the factory.
  • the compensation image data generating unit includes a sensor.
  • the speculation subunit of the compensated image data generation unit includes a sensor.
  • the sensor is a gyroscope.
  • Another aspect of the present disclosure provides a display device including the VR display dynamic compensation device provided according to any one of the above-described embodiments of the present disclosure and the display panel.
  • FIG. 1 is a flowchart of a VR display compensation method according to an embodiment of the present disclosure
  • FIG. 2 is another flowchart of a VR display compensation method according to an embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of a VR display dynamic compensation device according to an embodiment of the present disclosure.
  • FIG. 4 is another schematic structural diagram of a VR display dynamic compensation device according to an embodiment of the present disclosure.
  • the inventors of the present disclosure found that, in the related art, the ultra-high resolution of VR display makes the rendering time too long and reduces the refresh rate, and the video transmission rate of related-art interfaces is at most 21.6 Gbps, which cannot meet the requirement of real-time transmission of ultra-high-resolution video or images. These two reasons currently make ultra-high-resolution VR products impossible to realize.
  • in the related art, hardware and software limit the ultra-high-resolution display effect of VR technology, so high-frame-rate, high-resolution display cannot be achieved.
  • embodiments of the present disclosure provide a VR display compensation method, a VR display compensation device, and a display device including the VR display compensation device .
  • this embodiment provides a VR display compensation method.
  • the method may include the following steps S11 to S14.
  • Step S11 Calculate the first synchronization signal value to be output by the controller according to the rendering resolution, rendering frame rate, and bandwidth data (that is, bandwidth value) of the VR display panel.
  • the pixel arrangement is fixed, and the corresponding rendering resolution and rendering frame rate are fixed.
  • the rendering resolution and rendering frame rate of the display panel are data provided by the software designer of the VR display, and the bandwidth data is also an inherent parameter of the display panel (provided by the manufacturer of the display panel and built into the display panel). In this way, the rendering resolution, rendering frame rate, and bandwidth data of the display panel can first be obtained and output to the calculation unit, and the calculation unit can calculate, based on the rendering resolution and bandwidth data of the display panel, the first synchronization signal value to be output by the controller, that is, the first synchronization signal value of the software.
  • the first synchronization signal refers to a vertical synchronization (Vsync) signal corresponding to the display panel at the rendering resolution, and the signal is a pulse signal.
  • the first synchronization signal value in this embodiment may refer to the pulse width of the first synchronization signal, that is, the time value.
  • Bandwidth data refers to the rate of data transmission in the display panel.
  • the controller may be a software layer of a display device including the display panel, and the display panel may be a hardware layer of the display device.
  • Step S12 Compare the first synchronization signal value calculated in step S11 with the pre-stored second synchronization signal value of the display panel.
  • the second synchronization signal refers to a corresponding vertical synchronization (Vsync) signal at the physical resolution of the display panel, that is, the second synchronization signal value is a fixed value that the display panel has when shipped from the factory.
  • the first synchronization signal value and the pre-stored second synchronization signal value of the display panel may be compared by the comparison unit, for example.
  • when the first synchronization signal value is compared to be greater than the second synchronization signal value, step S13 is executed.
  • when the first synchronization signal value is compared to be equal to the second synchronization signal value, the transmission of the video signal of the image to be displayed is not limited and the display will not freeze or show other defects, so the video signal can be transmitted according to the first synchronization signal value without compensating the displayed image.
  • since the rendering resolution of the display panel is greater than or equal to its physical resolution, the value of the first synchronization signal calculated by the calculation unit will not be less than the value of the second synchronization signal.
  • Step S13 When the value of the first synchronization signal is greater than the value of the second synchronization signal, compensating image data between the original image data of two adjacent frames is generated.
  • each frame of original image data refers to each frame of image data in the original image to be displayed.
  • the compensated image data refers to the data of the frame image to be inserted between the original image data.
  • a step of calculating the difference between the first synchronization signal value and the second synchronization signal value may also be included, and the difference between the two determines the number of frames of the generated compensation image data.
  • the difference between the first synchronization signal value and the second synchronization signal value is positively correlated with the number of frames of the compensation image data; that is, the larger the difference between the first synchronization signal value and the second synchronization signal value, the more frames of compensation image data are generated.
  • Step S14 Compensate the image to be displayed according to the compensated image data generated in step S13.
  • in this step, specifically, the compensation image data is inserted between the corresponding two adjacent frames of original image data to form the image data to be displayed, and the picture is then displayed according to the image data to be displayed, thereby completing the compensation of the original display data.
  • while a VR device is running, the rendering resolution and the rendering frame rate both affect its display effect.
  • when a VR device is worn, considering the rendering resolution and head movement, the rendering time may be too long and exceed the time of one frame, so that the information displayed on the screen is delayed by at least one frame, causing the picture to freeze.
  • in the compensation method provided in this embodiment, the value of the software synchronization signal (for example, its pulse width) is first calculated at the software layer according to the resolution information, and the hardware layer then determines, according to the synchronization signal value sent by the software, whether to compensate. When no compensation is needed, normal video signal processing is performed.
  • when compensation is needed, the dynamic compensation algorithm for VR display provided by the embodiments of the present disclosure can be embedded in the hardware layer (for example, the display panel), so that the compensation image data is generated in the hardware layer. This reduces the bandwidth required to transmit video signals (for example, the original image data and the compensation image data) from the software layer to the hardware layer, reduces the rendering pressure, enables high-resolution display, and improves the user experience of VR display.
  • the embodiment of the present disclosure provides another compensation method.
  • the original image data of two adjacent frames may be the Nth frame original image data and the N+1th frame original image data, respectively, where N is an integer greater than or equal to 1.
  • the compensation method may include the following steps S20 to S24.
  • Step S20 Obtain the rendering resolution, rendering frame rate and bandwidth data of the display panel.
  • for example, the software developer provides some information required by the user and related data of the display panel (for example, the rendering resolution, rendering frame rate, and bandwidth data of the display panel) to the user and sets them in the display panel. Therefore, in this step, the acquisition subunit in the calculation unit of the compensation device (for example, the dynamic compensation device for VR display described below with reference to FIGS. 3 and 4) can directly acquire the rendering resolution, rendering frame rate, and bandwidth data of the display panel.
  • Step S21 Calculate the first synchronization signal value to be output by the controller according to the rendering resolution, rendering frame rate, and bandwidth data of the display panel acquired in step S20.
  • in this step, the first calculation subunit in the calculation unit may calculate the rendering bandwidth of the display panel according to the acquired rendering resolution and rendering frame rate, that is, the bandwidth required to transmit display data at that rendering resolution and rendering frame rate.
  • the rendering bandwidth may be equal to the product of the obtained rendering resolution and the rendering frame rate (i.e., the number of frames rendered per second); in this case, the unit of the calculated rendering bandwidth is bits per second (bps).
  • in the case of color display (for example, including red R, green G, and blue B), the rendering bandwidth may be equal to the acquired rendering resolution × rendering frame rate × 24 ÷ 1024 ÷ 1024 ÷ 1024; in this case, the unit of the calculated rendering bandwidth is Gbps (gigabits per second), where the factor 24 reflects that the three primary colors R, G, and B each occupy 8 bits.
  • next, the second calculation subunit can calculate the value of the first synchronization signal to be output by the controller according to the formula 1/B+A*24/(A-C), where A is the rendering bandwidth, B is the rendering frame rate, and C is the bandwidth data of the display panel.
  • Step S22 Compare the first synchronization signal value calculated in step S21 with the pre-stored second synchronization signal value of the display panel, and execute step S23 when the first synchronization signal value is greater than the second synchronization signal value.
  • the comparison unit in the VR display dynamic compensation device shown in FIGS. 3 and 4 may be used to compare the first synchronization signal value calculated in step S21 with the second synchronization signal value inherent to the display panel, which is stored in advance (for example, in a storage unit of the display panel or of the dynamic compensation device).
  • Step S23 Generate compensated image data between the Nth frame original image data and the N+1th frame original image data.
  • the difference between the first synchronization signal value and the second synchronization signal value can be calculated, and the number of frames of the compensation image data to be generated is determined according to the difference between the two.
  • the difference between the first synchronization signal value and the second synchronization signal value may be positively correlated with the number of frames of the compensation image data to be generated; that is, the larger the difference between the first synchronization signal value and the second synchronization signal value, the greater the number of frames of compensation image data generated.
  • first, the estimated compensation image data is acquired according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it.
  • for example, if N is 100 and the preset number of frames is 10, the original image data from the 90th frame to the 100th frame may be used, and a sensor inside the display panel (i.e., the VR display panel) makes a prediction to obtain the estimated compensation image data.
  • the sensor may be a gyroscope, which can acquire motion data of the display panel (for example, a VR display panel or a user wearing the VR display panel), including parameters such as roll, pitch, and yaw.
  • the sensor may obtain the estimated compensation image data based on, for example, the original image data of the 90th frame to the 100th frame and the motion data, for example, using an asynchronous timewarp technique known in the art.
  • the display panel may include the sensor and/or the VR display dynamic compensation device may include the sensor, and the display device may include the display panel and/or the VR display dynamic compensation device. As the user wearing the display device moves, the sensor moves as well; therefore, the image displayed by the display device may have a motion speed and acceleration.
  • the "image acceleration” may refer to the motion acceleration of the sensor.
  • the “compensation matrix” may be an asynchronous time warping matrix (ie, a matrix used in the asynchronous time warping technique).
  • the "compensation matrix" may be an N×1 matrix, and includes matrix elements that respectively represent parameters such as roll, pitch, and yaw in the motion data.
  • for example, in this step the image acceleration of the Nth frame can be obtained from the Nth frame of original image data through the sensor inside the display panel, and the compensation matrix is then calculated based on the image acceleration of the Nth frame of original image data and the estimated compensation image data, so as to obtain the final compensated image data between the Nth frame of original image data and the (N+1)th frame of original image data.
  • the final compensated image data may be image data obtained by transforming the compensation matrix according to the asynchronous time warping technology.
  • first, the first frame of speculative compensation image data is acquired according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it.
  • specifically, if N is 100 and the preset number of frames is 10, the first frame of speculative compensation image data can be obtained from the 90th frame to the 100th frame of original image data through the sensor inside the display panel (the method of acquisition may be the same as described above).
  • then, the image acceleration of the Nth frame of original image data is likewise obtained through the sensor inside the display panel, and the texture coordinates of the first frame of speculative compensation image data are calculated based on the image acceleration of the Nth frame of original image data and the first frame of speculative compensation image data (the data structure obtained by integrating the texture coordinates is a matrix), that is, the compensation matrix is obtained, so as to obtain the first frame of final compensation image data.
  • next, according to the Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, the sensor inside the display panel is used to make an estimation to obtain the speculative (M+1)th frame of compensation image data, where M is an integer greater than or equal to 1 and less than or equal to K.
  • it should be noted that, when M = 1, the "preset number of frames of image data" before the first frame of final compensation image data refers to the preset number of frames of original image data before it (in this example, the 91st to 100th frames of original image data); when M > 1, taking M = 2 as an example, the "preset number of frames of image data" before the second frame of final compensation image data refers to the first frame of final compensation image data plus the (preset number - 1) frames of original image data before it (in this example, the first frame of final compensation image data and the 92nd to 100th frames of original image data). Cases where M is greater than or equal to 3 are understood in the same way.
  • for example, if M = 1, the second frame of speculative compensation image data is obtained from the first frame of final compensation image data and the 91st to 100th frames of original image data through the sensor inside the display panel; similarly, the third frame of speculative compensation image data can be calculated by the same method, until the last frame (for example, the Kth frame) of speculative compensation image data is calculated.
  • finally, the image acceleration of the Mth frame of final compensation image data is likewise obtained through the sensor inside the display panel, and the compensation matrix of the (M+1)th frame of speculative compensation image data is calculated based on the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of speculative compensation image data, so as to obtain the (M+1)th frame of final compensation image data.
  • for example, again taking M = 1, the image acceleration of the first frame of final compensation image needs to be obtained through the sensor inside the display panel according to the first frame of final compensation image data, and the second frame of final compensation image data is then obtained based on that image acceleration and the second frame of speculative compensation image data; similarly, the third frame of final compensation image data can be calculated in the same way, until the last frame (for example, the Kth frame) of final compensation image data is calculated.
  • Step S24 Compensate the image to be displayed using the final compensated image data calculated in step S23.
  • in this step, the compensation unit of the VR display dynamic compensation device may sequentially insert the frames of final compensated image data calculated in step S23 between the corresponding two adjacent frames of original image data, so as to compensate the image to be displayed.
  • the first frame final compensation image data, the second frame final compensation image data, the third frame final compensation image data, etc. may be sequentially inserted between the Nth frame original image data and the N+1th frame original image data.
  • the embodiments of the present disclosure provide a VR display dynamic compensation device, which can be used to implement the VR display compensation method in the embodiments shown in FIGS. 1 and 2.
  • the dynamic compensation device for VR display in this embodiment may include a calculation unit 31, a comparison unit 32, a compensation image data generation unit 33, and a compensation unit 34.
  • the calculation unit 31, the comparison unit 32, the compensation image data generation unit 33, and the compensation unit 34 may each be implemented by one central processing unit (CPU) or one application processor (AP), or respectively by multiple central processing units or multiple application processors.
  • the dynamic compensation device for VR display may further include a memory (for example, a non-volatile memory) in which one or more computer programs are stored; when the one or more computer programs are executed by the one or more central processors or the one or more application processors, they cause the one or more central processors or the one or more application processors to serve as the calculation unit 31, the comparison unit 32, the compensation image data generation unit 33, and the compensation unit 34.
  • the memory may also store various data involved in the VR display compensation method provided by the embodiments of the present disclosure, such as the rendering resolution, rendering frame rate, bandwidth data, the first synchronization signal value to be output by the controller, the pre-stored second synchronization signal value of the display panel, multiple frames of original image data, the estimated compensation image data, the final compensation image data, and other required computer programs and information.
  • the calculation unit 31 is configured to calculate the first synchronization signal value to be output by the controller according to the rendering resolution, rendering frame rate, and bandwidth data of the display panel.
  • the rendering resolution and rendering frame rate of the display panel are data provided by the software designer of the VR display, and the bandwidth data is also an inherent parameter of the display panel.
  • the calculation unit may include, for example, an acquisition subunit 311, a first calculation subunit 312, and a second calculation subunit 313, as shown in FIG. 4.
  • the acquisition subunit 311, the first calculation subunit 312, and the second calculation subunit 313 may be implemented by at least one central processor or application processor.
  • the obtaining subunit 311 is used to obtain the rendering resolution, rendering frame rate and bandwidth data of the display panel.
  • the first calculation subunit is used to calculate the rendering bandwidth according to the rendering resolution and the rendering frame rate of the display panel; the second calculation subunit is used to calculate the value of the first synchronization signal to be output by the controller according to the formula 1/B+A*24/(A-C), where A is the rendering bandwidth (which is, for example, equal to the product of the rendering resolution and the rendering frame rate of the display panel), B is the rendering frame rate, and C is the bandwidth data of the display panel (that is, the bandwidth value).
  • the comparison unit 32 is used to compare the first synchronization signal value with the pre-stored second synchronization signal value of the display panel.
  • the compensation image data generating unit 33 is configured to generate the compensation image data between the original image data of two adjacent frames when the comparison unit 32 compares that the first synchronization signal value is greater than the second synchronization signal value.
  • the compensation image data generating unit 33 may include the sensor (for example, a gyroscope).
  • specifically, the compensation image data generated by the compensation image data generating unit 33 between the original image data of two adjacent frames may be one frame of compensation image data, and the original image data of the two adjacent frames may be the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1.
  • the compensation image data generation unit 33 may include: a guess subunit 331, a compensation matrix calculation subunit 332, and a compensation image data generation subunit 333.
  • the speculation subunit 331, the compensation matrix calculation subunit 332, and the compensation image data generation subunit 333 may be implemented by at least one central processor or application processor.
  • the speculation subunit 331 may include the sensor (for example, a gyroscope).
  • the speculation sub-unit 331 is used to acquire speculative compensated image data based on the N-th frame original image data and the original image data of consecutive preset frames before it.
  • the compensation matrix calculation subunit 332 is configured to calculate a compensation matrix based on the image acceleration of the Nth frame original image data and the estimated compensation image data acquired by the estimation subunit.
  • the compensation image data generation subunit 333 is used to generate compensation image data according to the compensation matrix calculated by the compensation matrix calculation subunit.
  • alternatively, the compensation image data generated by the compensation image data generating unit 33 between the original image data of two adjacent frames may be multiple frames of compensation image data, and the original image data of the two adjacent frames may be the Nth frame of original image data and the (N+1)th frame of original image data, respectively.
  • in this case, the speculation subunit 331 is used to acquire the first frame of speculative compensation image data based on the Nth frame of original image data and a preset number of consecutive frames of original image data before it, where N is an integer greater than or equal to 1.
  • the speculation subunit 331 is also used to obtain the (M+1)th frame of speculative compensation image data according to the Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1.
  • the compensation matrix calculation subunit 332 is used to calculate the compensation matrix of the first frame of speculative compensation image data based on the image acceleration of the Nth frame of original image data and the first frame of speculative compensation image data, and to calculate the compensation matrix of the (M+1)th frame of speculative compensation image data based on the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of speculative compensation image data estimated by the speculation subunit 332.
  • the compensation image data generation subunit 333 is used to generate the first frame of final compensation image data according to the compensation matrix of the first frame of speculative compensation image data calculated by the compensation matrix calculation subunit 332, and to generate the (M+1)th frame of final compensation image data according to the compensation matrix of the (M+1)th frame of speculative compensation image data calculated by the compensation matrix calculation subunit 332.
  • the number of frames of the compensation image data (ie, the final compensation image data) generated by the compensation image data generating unit 33 may be positively correlated with the difference between the first synchronization signal value and the second synchronization signal value.
  • the compensation image data generating unit 33 may also be an FPGA (a programmable logic device).
  • the steps implemented by the compensation image data generating unit 33 in the above dynamic compensation algorithm may be embedded in the FPGA to facilitate dynamic compensation of the image.
  • high frame rate, smooth and complete VR display can be realized.
  • the compensation unit 34 is configured to compensate the original image to be displayed according to the compensation image data (for example, the final compensation image data) generated by the compensation image data generating unit 33.
  • the compensation unit 34 is used to sequentially insert the one-frame or multi-frame final compensated image data between the N-th frame original image data and the N+1-th frame image data, thereby completing the compensation of the original image.
  • in the dynamic compensation device for VR display provided in this embodiment, the value of the software synchronization signal (for example, its pulse width) is first calculated at the software layer according to the resolution information, and the hardware layer then determines, according to the synchronization signal value sent by the software, whether to perform dynamic compensation or normal video signal processing. This reduces the transmission bandwidth requirements of the video signal, reduces the rendering pressure, enables high-resolution display, and improves the user experience of VR display.
  • An embodiment of the present disclosure provides a display device, which includes a VR display dynamic compensation device and a display panel in the embodiment shown in FIG. 3 or FIG. 4. Since the display device in this embodiment includes the dynamic compensation device for VR display in the embodiment shown in FIG. 3 or FIG. 4, it can realize ultra-high resolution VR display.
  • the display device may be an OLED display device or a liquid crystal display device, such as a liquid crystal panel, a mobile phone, a tablet computer, a television, a display, a notebook computer, a digital photo frame, a navigator, and any other product or component with a display function.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Liquid Crystal Display Device Control (AREA)
  • Television Systems (AREA)

Abstract

A compensation method and a compensation device for VR display, and a display device. The compensation method for VR display includes: calculating, according to the rendering resolution, rendering frame rate, and bandwidth data of a display panel, a first synchronization signal value to be output by a controller (S11); comparing the first synchronization signal value with a pre-stored second synchronization signal value of the display panel (S12); when the first synchronization signal value is compared to be greater than the second synchronization signal value, generating compensation image data between two adjacent frames of original image data (S13); and compensating an image to be displayed according to the generated compensation image data (S14).

Description

Compensation method and compensation device for VR display, and display device
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese Patent Application No. 201910002157.0, filed on January 2, 2019, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present disclosure belongs to the field of display technology, and in particular relates to a compensation method and a compensation device for VR display, and a display device.
BACKGROUND
VR (Virtual Reality) technology uses simulation technology together with computer graphics, human-machine interface technology, multimedia technology, sensor technology, network technology, and the like to simulate a virtual environment and immerse the user in that environment. Virtual reality (VR) technology focuses on the user's experience and can improve the visual effect with ultra-high resolution, making the user's experience richer and more realistic.
SUMMARY
An aspect of the present disclosure provides a compensation method for VR display, including:
calculating, according to the rendering resolution, rendering frame rate, and bandwidth data of a display panel, a first synchronization signal value to be output by a controller;
comparing the first synchronization signal value with a pre-stored second synchronization signal value of the display panel;
when the first synchronization signal value is compared to be greater than the second synchronization signal value, generating compensation image data between two adjacent frames of original image data; and
compensating an image to be displayed according to the generated compensation image data.
In one embodiment, when the first synchronization signal value is compared to be greater than the second synchronization signal value, the generated compensation image data between the two adjacent frames of original image data is one frame of compensation image data;
the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1; and
the step of generating the compensation image data between the two adjacent frames of original image data includes:
acquiring estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it; and
calculating a compensation matrix according to the image acceleration of the Nth frame of original image data and the estimated compensation image data, to obtain final compensation image data.
In one embodiment, when the first synchronization signal value is compared to be greater than the second synchronization signal value, the generated compensation image data between the two adjacent frames of original image data is multiple frames of compensation image data;
the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1; and
the step of generating the compensation image data between the two adjacent frames of original image data includes:
acquiring a first frame of estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it;
calculating a compensation matrix of the first frame of estimated compensation image data according to the image acceleration of the Nth frame of original image data and the first frame of estimated compensation image data, to obtain a first frame of final compensation image data;
acquiring an (M+1)th frame of estimated compensation image data according to an Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1; and
calculating a compensation matrix of the (M+1)th frame of estimated compensation image data according to the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of estimated compensation image data, to obtain an (M+1)th frame of final compensation image data.
In one embodiment, the number of frames of the generated compensation image data is positively correlated with the difference between the first synchronization signal value and the second synchronization signal value.
In one embodiment, the step of calculating, according to the rendering resolution, rendering frame rate, and bandwidth data of the display panel, the first synchronization signal value to be output by the controller includes:
obtaining the rendering resolution, rendering frame rate, and bandwidth data of the display panel;
calculating a rendering bandwidth according to the rendering resolution and rendering frame rate of the display panel; and
calculating, according to the formula 1/B+A*24/(A-C), the first synchronization signal value to be output by the controller, where A is the rendering bandwidth, B is the rendering frame rate, and C is the bandwidth data of the display panel.
In one embodiment, the step of calculating the rendering bandwidth according to the rendering resolution and rendering frame rate of the display panel includes: calculating the product of the rendering resolution and the rendering frame rate as the rendering bandwidth.
In one embodiment, the second synchronization signal value is a fixed value that the display panel has when shipped from the factory.
Another aspect of the present disclosure provides a dynamic compensation device for VR display, including:
a calculation unit configured to calculate, according to the rendering resolution, rendering frame rate, and bandwidth data of a display panel, a first synchronization signal value to be output by a controller;
a comparison unit configured to compare the first synchronization signal value with a pre-stored second synchronization signal value of the display panel;
a compensation image data generation unit configured to generate compensation image data between two adjacent frames of original image data when the comparison unit determines that the first synchronization signal value is greater than the second synchronization signal value; and
a compensation unit configured to compensate an image to be displayed according to the compensation image data generated by the compensation image data generation unit.
In one embodiment, the compensation image data generation unit includes an estimation subunit, a compensation matrix calculation subunit, and a compensation image data generation subunit.
In one embodiment, when the comparison unit determines that the first synchronization signal value is greater than the second synchronization signal value, the compensation image data generated by the compensation image data generation unit between the two adjacent frames of original image data is one frame of compensation image data;
the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1;
the estimation subunit is configured to acquire estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it;
the compensation matrix calculation subunit is configured to calculate a compensation matrix according to the image acceleration of the Nth frame of original image data and the estimated compensation image data acquired by the estimation subunit; and
the compensation image data generation subunit is configured to generate compensation image data according to the compensation matrix calculated by the compensation matrix calculation subunit.
In one embodiment, when the comparison unit determines that the first synchronization signal value is greater than the second synchronization signal value, the compensation image data generated by the compensation image data generation unit between the two adjacent frames of original image data is multiple frames of compensation image data;
the two adjacent frames of original image data are the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1;
the estimation subunit is configured to acquire a first frame of estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it, and to acquire an (M+1)th frame of estimated compensation image data according to an Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1;
the compensation matrix calculation subunit is configured to calculate a compensation matrix of the first frame of estimated compensation image data according to the image acceleration of the Nth frame of original image data and the first frame of estimated compensation image data, and to calculate a compensation matrix of the (M+1)th frame of estimated compensation image data according to the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of estimated compensation image data estimated by the estimation subunit; and
the compensation image data generation subunit is configured to generate a first frame of final compensation image data according to the compensation matrix of the first frame of estimated compensation image data calculated by the compensation matrix calculation subunit, and to generate an (M+1)th frame of final compensation image data according to the compensation matrix of the (M+1)th frame of estimated compensation image data calculated by the compensation matrix calculation subunit.
In one embodiment, the number of frames of the compensation image data generated by the compensation image data generation unit is positively correlated with the difference between the first synchronization signal value and the second synchronization signal value.
In one embodiment, the calculation unit includes:
an obtaining subunit configured to obtain the rendering resolution, rendering frame rate, and bandwidth data of the display panel;
a first calculation subunit configured to calculate a rendering bandwidth according to the rendering resolution and rendering frame rate of the display panel; and
a second calculation subunit configured to calculate, according to the formula 1/B+A*24/(A-C), the first synchronization signal value to be output by the controller, where A is the rendering bandwidth, B is the rendering frame rate, and C is the bandwidth data of the display panel.
In one embodiment, the first calculation subunit is configured to calculate the product of the rendering resolution and the rendering frame rate as the rendering bandwidth.
In one embodiment, the second synchronization signal value is a fixed value that the display panel has when shipped from the factory.
In one embodiment, the compensation image data generation unit includes a sensor.
In one embodiment, the estimation subunit of the compensation image data generation unit includes a sensor.
In one embodiment, the sensor is a gyroscope.
Another aspect of the present disclosure provides a display device, including the dynamic compensation device for VR display provided according to any one of the above embodiments of the present disclosure, and the display panel.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flowchart of a compensation method for VR display according to an embodiment of the present disclosure;
FIG. 2 is another flowchart of a compensation method for VR display according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of a dynamic compensation device for VR display according to an embodiment of the present disclosure; and
FIG. 4 is another schematic structural diagram of a dynamic compensation device for VR display according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
To enable those skilled in the art to better understand the technical solutions of the present disclosure, the present disclosure is described in further detail below with reference to the accompanying drawings and specific embodiments.
The inventors of the present disclosure found that, in the related art, the ultra-high resolution of VR display leads to excessively long rendering times and a reduced refresh rate, and the video transmission rate of related-art interfaces is at most 21.6 Gbps, which cannot meet the requirement of real-time transmission of ultra-high-resolution video or images. For these two reasons, ultra-high-resolution VR products cannot currently be realized. In the related art, hardware and software limit the ultra-high-resolution display effect of VR technology, so high-frame-rate, high-resolution display cannot be achieved.
To solve at least the above technical problems in the related art, embodiments of the present disclosure provide a compensation method for VR display, a compensation device for VR display, and a display device including the compensation device for VR display.
As shown in FIG. 1, this embodiment provides a compensation method for VR display, which may include the following steps S11 to S14.
Step S11: Calculate, according to the rendering resolution, rendering frame rate, and bandwidth data (i.e., the bandwidth value) of a VR display panel, a first synchronization signal value to be output by a controller.
In this step, for the VR display panel (hereinafter simply referred to as the "display panel"), the pixel arrangement is fixed, and the corresponding rendering resolution and rendering frame rate are also fixed. The rendering resolution and rendering frame rate of the display panel are data provided by the software designer of the VR display, and the bandwidth data is likewise an inherent parameter of the display panel (provided by the manufacturer of the display panel and built into the display panel). Thus, the rendering resolution, rendering frame rate, and bandwidth data of the display panel can first be obtained and output to a calculation unit, and the calculation unit can calculate, according to the rendering resolution and bandwidth data of the display panel, the first synchronization signal value to be output by the controller, that is, the first synchronization signal value of the software.
It should be noted here that the first synchronization signal refers to the vertical synchronization (Vsync) signal corresponding to the display panel at the rendering resolution, and this signal is a pulse signal. The first synchronization signal value in this embodiment may refer to the pulse width of the first synchronization signal, that is, a time value. The bandwidth data refers to the rate at which data is transmitted in the display panel. The controller may be the software layer of a display device that includes the display panel, and the display panel may be the hardware layer of the display device.
Step S12: Compare the first synchronization signal value calculated in step S11 with a pre-stored second synchronization signal value of the display panel. The second synchronization signal refers to the vertical synchronization (Vsync) signal corresponding to the physical resolution of the display panel; that is, the second synchronization signal value is a fixed value that the display panel has when shipped from the factory.
In this step, the first synchronization signal value and the pre-stored second synchronization signal value of the display panel may be compared by, for example, a comparison unit. When the first synchronization signal value is compared to be greater than the second synchronization signal value, step S13 is executed. When the first synchronization signal value is compared to be equal to the second synchronization signal value, the transmission of the video signal of the image to be displayed is not limited and the display will not freeze or exhibit other defects, so the video signal can be transmitted according to the first synchronization signal value without compensating the displayed image.
It should be noted here that, since the rendering resolution of the display panel is greater than or equal to the physical resolution of the display panel, the first synchronization signal value calculated by the calculation unit will not be less than the second synchronization signal value.
Step S13: When the first synchronization signal value is compared to be greater than the second synchronization signal value, generate compensation image data between two adjacent frames of original image data. Here, each frame of original image data refers to one frame of image data in the original image to be displayed, and the compensation image data refers to the data of the frame image to be inserted between the original image data.
This step may further include calculating the difference between the first synchronization signal value and the second synchronization signal value, and this difference determines the number of frames of compensation image data to be generated. For example, the difference between the first synchronization signal value and the second synchronization signal value is positively correlated with the number of frames of compensation image data; that is, the larger the difference, the more frames of compensation image data are generated.
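The disclosure only requires that the number of compensation frames be positively correlated with this difference and does not fix a formula. The following is a minimal, hypothetical sketch of one such monotone mapping (the function name and the ceiling-of-ratio rule are assumptions made for illustration, not the disclosed method):

```python
import math

def compensation_frame_count(first_sync: float, second_sync: float) -> int:
    """Hypothetical monotone mapping from the synchronization-signal difference
    to a frame count; the disclosure only requires positive correlation."""
    diff = first_sync - second_sync
    if diff <= 0:
        return 0  # equal values: no compensation needed
    return math.ceil(diff / second_sync)
```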
Step S14: Compensate the image to be displayed according to the compensation image data generated in step S13.
In this step, specifically, the compensation image data is inserted between the corresponding two adjacent frames of original image data to form the image data to be displayed, and the picture is then displayed according to the image data to be displayed, thereby completing the compensation of the original display data.
While a VR device is running, the rendering resolution and rendering frame rate both affect its display effect. When a VR device is worn, considering the rendering resolution and head movement, the rendering time may be too long and exceed the time of one frame, so that the information displayed on the screen is delayed by at least one frame, causing the picture to freeze. In the compensation method for VR display provided in this embodiment, the value of the software synchronization signal (for example, its pulse width) is first calculated at the software layer according to the resolution information, and the hardware layer then determines, according to the synchronization signal value sent by the software, whether to perform compensation. When no compensation is needed, normal video signal processing is performed. When compensation is needed, for example, the dynamic compensation algorithm for VR display provided by the embodiments of the present disclosure can be embedded in the hardware layer (for example, the display panel), so that the compensation image data is generated in the hardware layer. This reduces the bandwidth required to transmit video signals (for example, the original image data and the compensation image data) from the software layer to the hardware layer, reduces the rendering pressure, enables high-resolution display, improves the user experience of VR display, and remedies the defects of the related art.
To understand the compensation method for VR display of this embodiment more clearly, an embodiment of the present disclosure provides another compensation method. In the compensation method of this embodiment, the two adjacent frames of original image data may be the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1. As shown in FIG. 2, the compensation method may include the following steps S20 to S24.
Step S20: Obtain the rendering resolution, rendering frame rate, and bandwidth data of the display panel.
For example, the software developer provides some information required by the user and related data of the display panel (for example, the rendering resolution, rendering frame rate, and bandwidth data of the display panel) to the user and sets them in the display panel. Therefore, in this step, the obtaining subunit in the calculation unit of the compensation device (for example, the dynamic compensation device for VR display described below with reference to FIGS. 3 and 4) can directly obtain the rendering resolution, rendering frame rate, and bandwidth data of the display panel.
Step S21: Calculate, according to the rendering resolution, rendering frame rate, and bandwidth data of the display panel obtained in step S20, the first synchronization signal value to be output by the controller.
In this step, the first calculation subunit in the calculation unit may calculate the rendering bandwidth of the display panel according to the obtained rendering resolution and rendering frame rate, that is, the bandwidth required to transmit display data at that rendering resolution and rendering frame rate. For example, the rendering bandwidth may be equal to the product of the obtained rendering resolution and the rendering frame rate (i.e., the number of frames rendered per second); in this case, the unit of the calculated rendering bandwidth is bits per second (bps). In the case of color display (for example, including red R, green G, and blue B), the rendering bandwidth may be equal to the obtained rendering resolution × rendering frame rate × 24 ÷ 1024 ÷ 1024 ÷ 1024; in this case, the unit of the calculated rendering bandwidth is Gbps (gigabits per second), where the factor 24 reflects that the three primary colors R, G, and B each occupy 8 bits. Next, the second calculation subunit may calculate, according to the formula 1/B+A*24/(A-C), the first synchronization signal value to be output by the controller, where A is the rendering bandwidth, B is the rendering frame rate, and C is the bandwidth data of the display panel.
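As a worked illustration of the arithmetic in step S21, the sketch below evaluates the rendering bandwidth and the formula above; the function names and the 7680×4320 @ 90 Hz panel figures are hypothetical examples, not values taken from the disclosure.

```python
def rendering_bandwidth_gbps(resolution_pixels: int, frame_rate_hz: float) -> float:
    """Rendering bandwidth A in Gbps: resolution x frame rate x 24 bits
    (8 bits each for R, G and B), divided by 1024^3, as described for step S21."""
    return resolution_pixels * frame_rate_hz * 24 / (1024 ** 3)

def first_sync_signal_value(a: float, b: float, c: float) -> float:
    """First synchronization signal value, computed exactly as the stated formula
    1/B + A*24/(A - C); A is the rendering bandwidth, B the rendering frame rate,
    C the bandwidth data of the display panel."""
    return 1.0 / b + a * 24 / (a - c)

# Hypothetical example: an 8K panel at 90 Hz, with a 21.6 Gbps interface bandwidth.
A = rendering_bandwidth_gbps(7680 * 4320, 90)        # roughly 66.7 Gbps
print(A, first_sync_signal_value(A, 90, 21.6))
```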
Step S22: Compare the first synchronization signal value calculated in step S21 with the pre-stored second synchronization signal value of the display panel, and execute step S23 when the first synchronization signal value is greater than the second synchronization signal value.
In this step, the comparison unit in the dynamic compensation device for VR display shown in FIGS. 3 and 4 may be used to compare the first synchronization signal value calculated in step S21 with the second synchronization signal value inherent to the display panel, which is stored in advance (for example, in a storage unit of the display panel or of the dynamic compensation device).
Step S23: Generate compensation image data between the Nth frame of original image data and the (N+1)th frame of original image data.
In this step, first, the difference between the first synchronization signal value and the second synchronization signal value can be calculated, and the number of frames of compensation image data to be generated is determined according to this difference. For example, the difference between the first synchronization signal value and the second synchronization signal value may be positively correlated with the number of frames of compensation image data to be generated; that is, the larger the difference between the two, the more frames of compensation image data are generated. Next, how the compensation image data is generated is described for the cases where the number of frames of generated compensation image data is one frame and multiple frames, respectively.
On the one hand, the case where the number of frames of compensation image data to be generated is one frame is described.
First, the estimated compensation image data is acquired according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it.
For example, if N is 100 and the preset number of frames is 10, the original image data from the 90th frame to the 100th frame may be used, and a sensor inside the display panel (i.e., the VR display panel) makes a prediction to obtain the estimated compensation image data. In one embodiment, the sensor may be a gyroscope, which can acquire motion data of the display panel (for example, a VR display panel or a user wearing the VR display panel), including parameters such as roll, pitch, and yaw. In addition, the sensor may obtain the estimated compensation image data based on, for example, the original image data of the 90th to 100th frames and the motion data, for example, by using an asynchronous timewarp technique known in the art.
Next, the image acceleration of the Nth frame of original image data can likewise be obtained through the sensor inside the display panel, and the texture coordinates of the compensation image data are calculated according to the image acceleration of the Nth frame of original image data and the estimated compensation image data (the data structure obtained by integrating the texture coordinates is a matrix), that is, the compensation matrix is obtained, so as to obtain the compensation image data (i.e., the final compensation image data) between the Nth frame of original image data and the (N+1)th frame of original image data. In one embodiment, the display panel may include the sensor and/or the dynamic compensation device for VR display may include the sensor, and the display device may include the display panel and/or the dynamic compensation device for VR display. As the user wearing the display device moves, the sensor moves as well; therefore, the image displayed by the display device may have a motion speed and acceleration. Herein, the "image acceleration" may refer to the motion acceleration of the sensor. In addition, the "compensation matrix" may be an asynchronous timewarp matrix (i.e., a matrix used in the asynchronous timewarp technique). Furthermore, the "compensation matrix" may be an N×1 matrix, and includes matrix elements respectively representing parameters such as roll, pitch, and yaw in the motion data.
For example, in this step the image acceleration of the Nth frame can be obtained from the Nth frame of original image data through the sensor inside the display panel, and the compensation matrix is then calculated according to the image acceleration of the Nth frame of original image data and the estimated compensation image data, thereby obtaining the final compensation image data between the Nth frame of original image data and the (N+1)th frame of original image data. For example, the final compensation image data may be image data obtained by transforming the compensation matrix according to the asynchronous timewarp technique.
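The following is a minimal sketch of this single-frame case, assuming the compensation matrix is an N×1 column of roll/pitch/yaw corrections built from gyroscope motion data; the helper names, the linear extrapolation, and the constant-acceleration term are assumptions made for illustration and are not the disclosed implementation.

```python
import numpy as np

def estimate_compensation_pose(pose_history: np.ndarray) -> np.ndarray:
    """Predict the pose (roll, pitch, yaw) of the compensation frame by linearly
    extrapolating the preset number of preceding frames (hypothetical rule).
    pose_history has shape (preset_frames, 3), one row per preceding frame."""
    return pose_history[-1] + (pose_history[-1] - pose_history[-2])

def compensation_matrix(pose_n: np.ndarray, estimated_pose: np.ndarray,
                        image_accel: np.ndarray, dt: float) -> np.ndarray:
    """Build an Nx1 compensation matrix combining the estimated pose with a term
    derived from the image acceleration of the Nth frame (assumed form)."""
    correction = (estimated_pose - pose_n) + 0.5 * image_accel * dt ** 2
    return correction.reshape(-1, 1)
```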
On the other hand, the case where the number of frames of compensation image data to be generated is multiple frames is described.
First, the first frame of estimated compensation image data is acquired according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it.
Specifically, if N is 100 and the preset number of frames is 10, the first frame of estimated compensation image data can be obtained from the 90th to 100th frames of original image data through the sensor inside the display panel (the method of acquisition may be the same as described above).
Then, the image acceleration of the Nth frame of original image data is likewise obtained through the sensor inside the display panel, and the texture coordinates of the first frame of estimated compensation image data are calculated according to the image acceleration of the Nth frame of original image data and the first frame of estimated compensation image data (the data structure obtained by integrating the texture coordinates is a matrix), that is, the compensation matrix is obtained, so as to obtain the first frame of final compensation image data.
Next, the value of N is changed (for example, increased), and the second frame of final compensation image data, the third frame of final compensation image data, ..., up to the Kth frame of final compensation image data are obtained by the same method as that used to obtain the first frame of final compensation image data, where K is an integer greater than or equal to 2.
Next, according to the Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, an estimation is made through the sensor inside the display panel to obtain the estimated (M+1)th frame of compensation image data, where M is an integer greater than or equal to 1 and less than or equal to K. It should be noted here that, when M = 1, the "preset number of frames of image data" before the first frame of final compensation image data refers to the "preset number of frames of original image data" before the first frame of final compensation image data (in this example, the 91st to 100th frames of original image data); when M > 1, taking M = 2 as an example, the "preset number of frames of image data" before the second frame of final compensation image data refers to the first frame of final compensation image data plus the (preset number - 1) frames of original image data before it (in this example, the first frame of final compensation image data and the 92nd to 100th frames of original image data). Cases where M is greater than or equal to 3 are understood in the same way.
For example, if M = 1, the second frame of estimated compensation image data is obtained according to the first frame of final compensation image data and the 91st to 100th frames of original image data, through an estimation made by the sensor inside the display panel; similarly, the third frame of estimated compensation image data can be calculated by the same method, until the last frame (for example, the Kth frame) of estimated compensation image data is calculated.
Finally, the image acceleration of the Mth frame of final compensation image data is likewise obtained through the sensor inside the display panel, and the compensation matrix of the (M+1)th frame of estimated image is calculated according to the image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of estimated compensation image data, so as to obtain the (M+1)th frame of final compensation image data.
For example, again taking M = 1, the image acceleration of the first frame of final compensation image is obtained through the sensor inside the display panel according to the first frame of final compensation image data, and the second frame of final compensation image data is then obtained according to that image acceleration and the second frame of estimated compensation image data; similarly, the third frame of final compensation image data can be calculated by the same method, until the last frame (for example, the Kth frame) of final compensation image data is calculated.
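A hedged sketch of the iterative multi-frame loop follows, reusing the hypothetical helpers from the previous sketch (so the arithmetic, window handling, and names are assumptions; only the structure, in which frame M+1 is estimated from the Mth final compensation frame and the frames before it, follows the description above):

```python
import numpy as np

def generate_compensation_frames(pose_history, image_accel, dt, k):
    """Produce K final compensation poses; each new estimate uses a sliding window
    that includes the most recent final compensation frame, per the description."""
    window = [np.asarray(p, dtype=float) for p in pose_history]
    finals = []
    for _ in range(k):
        estimated = estimate_compensation_pose(np.array(window))
        matrix = compensation_matrix(window[-1], estimated, image_accel, dt)
        final = window[-1] + matrix.ravel()   # apply the correction (assumed form)
        finals.append(final)
        window = window[1:] + [final]         # slide the preset-frame window forward
    return finals
```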
Step S24: Compensate the image to be displayed using the final compensation image data calculated in step S23.
In this step, the compensation unit of the dynamic compensation device for VR display may sequentially insert the frames of final compensation image data calculated in step S23 between the corresponding two adjacent frames of original image data, so as to compensate the image to be displayed. For example, the first frame of final compensation image data, the second frame of final compensation image data, the third frame of final compensation image data, and so on may be inserted in sequence between the Nth frame of original image data and the (N+1)th frame of original image data.
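Step S24 amounts to splicing the compensation frames between the Nth and (N+1)th original frames. A minimal sketch follows (the list-of-frames representation and the function name are assumptions):

```python
def insert_compensation_frames(original_frames, compensation_frames, n):
    """Return a new frame sequence with the final compensation frames inserted, in order,
    between the Nth and (N+1)th original frames (N is 1-indexed, as in the description)."""
    return list(original_frames[:n]) + list(compensation_frames) + list(original_frames[n:])

# Usage sketch: display_sequence = insert_compensation_frames(frames, finals, n=100)
```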
As shown in FIGS. 3 and 4, an embodiment of the present disclosure provides a dynamic compensation device for VR display, which can be used to implement the compensation method for VR display in the embodiments shown in FIGS. 1 and 2. The dynamic compensation device for VR display in this embodiment may include a calculation unit 31, a comparison unit 32, a compensation image data generation unit 33, and a compensation unit 34. For example, the calculation unit 31, the comparison unit 32, the compensation image data generation unit 33, and the compensation unit 34 may be implemented by one central processing unit (CPU) or one application processor (AP), or respectively by multiple central processing units or multiple application processors. For example, the dynamic compensation device for VR display may further include a memory (for example, a non-volatile memory) in which one or more computer programs are stored; when the one or more computer programs are executed by the one or more central processors or the one or more application processors, they cause the one or more central processors or the one or more application processors to serve as the calculation unit 31, the comparison unit 32, the compensation image data generation unit 33, and the compensation unit 34. The memory may also store various data involved in the compensation method for VR display provided by the embodiments of the present disclosure, such as the rendering resolution, rendering frame rate, bandwidth data, the first synchronization signal value to be output by the controller, the pre-stored second synchronization signal value of the display panel, multiple frames of original image data, the estimated compensation image data, the final compensation image data, and other required computer programs and information.
For example, the calculation unit 31 is configured to calculate, according to the rendering resolution, rendering frame rate, and bandwidth data of the display panel, the first synchronization signal value to be output by the controller.
Specifically, the rendering resolution and rendering frame rate of the display panel are data provided by the software designer of the VR display, and the bandwidth data is also an inherent parameter of the display panel. The calculation unit may include, for example, an obtaining subunit 311, a first calculation subunit 312, and a second calculation subunit 313, as shown in FIG. 4. For example, the obtaining subunit 311, the first calculation subunit 312, and the second calculation subunit 313 may be implemented by at least one central processor or application processor. The obtaining subunit 311 is configured to obtain the rendering resolution, rendering frame rate, and bandwidth data of the display panel. The first calculation subunit is configured to calculate the rendering bandwidth according to the rendering resolution and rendering frame rate of the display panel; the second calculation subunit is configured to calculate, according to the formula 1/B+A*24/(A-C), the first synchronization signal value to be output by the controller, where A is the rendering bandwidth (which is, for example, equal to the product of the rendering resolution and the rendering frame rate of the display panel), B is the rendering frame rate, and C is the bandwidth data of the display panel (that is, the bandwidth value).
For example, the comparison unit 32 is configured to compare the first synchronization signal value with the pre-stored second synchronization signal value of the display panel.
For example, the compensation image data generation unit 33 is configured to generate compensation image data between two adjacent frames of original image data when the comparison unit 32 determines that the first synchronization signal value is greater than the second synchronization signal value. In one embodiment, the compensation image data generation unit 33 may include the sensor (for example, a gyroscope).
Specifically, when the comparison unit 32 determines that the first synchronization signal value is greater than the second synchronization signal value, the compensation image data generated by the compensation image data generation unit 33 between the two adjacent frames of original image data may be one frame of compensation image data, and the two adjacent frames of original image data may be the Nth frame of original image data and the (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1. The compensation image data generation unit 33 may include an estimation subunit 331, a compensation matrix calculation subunit 332, and a compensation image data generation subunit 333. For example, the estimation subunit 331, the compensation matrix calculation subunit 332, and the compensation image data generation subunit 333 may be implemented by at least one central processor or application processor. In one embodiment, the estimation subunit 331 may include the sensor (for example, a gyroscope). For example, the estimation subunit 331 is configured to acquire estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it. The compensation matrix calculation subunit 332 is configured to calculate a compensation matrix according to the image acceleration of the Nth frame of original image data and the estimated compensation image data acquired by the estimation subunit. The compensation image data generation subunit 333 is configured to generate compensation image data according to the compensation matrix calculated by the compensation matrix calculation subunit.
Alternatively, when the comparison unit 32 determines that the first synchronization signal value is greater than the second synchronization signal value, the compensation image data generated by the compensation image data generation unit 33 between the two adjacent frames of original image data may be multiple frames of compensation image data, and the two adjacent frames of original image data may be the Nth frame of original image data and the (N+1)th frame of original image data, respectively.
In this case, the estimation subunit 331 is configured to acquire the first frame of estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it, where N is an integer greater than or equal to 1. In addition, the estimation subunit 331 is further configured to acquire the (M+1)th frame of estimated compensation image data according to the Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1. The compensation matrix calculation subunit 332 is configured to calculate the compensation matrix of the first frame of final compensation image data according to the image acceleration of the Nth frame of original image data and the first frame of estimated compensation image data, and to calculate the compensation matrix of the (M+1)th frame of estimated compensation image data according to the image acceleration of the Mth frame of final compensation image data and the estimated (M+1)th frame of compensation image data estimated by the estimation subunit 332. The compensation image data generation subunit 333 is configured to generate the first frame of final compensation image data according to the compensation matrix of the first frame of estimated compensation image data calculated by the compensation matrix calculation subunit 332, and to generate the (M+1)th frame of final compensation image data according to the compensation matrix of the (M+1)th frame of estimated compensation image data calculated by the compensation matrix calculation subunit 332.
For example, the number of frames of the compensation image data (i.e., the final compensation image data) generated by the compensation image data generation unit 33 may be positively correlated with the difference between the first synchronization signal value and the second synchronization signal value.
In addition, the compensation image data generation unit 33 may also be an FPGA (a programmable logic device). In this case, the steps implemented by the compensation image data generation unit 33 in the above dynamic compensation algorithm may be embedded in the FPGA to facilitate dynamic compensation of the image, so that a high-frame-rate, smooth, and complete VR display can be realized.
For example, the compensation unit 34 is configured to compensate the original image to be displayed according to the compensation image data (for example, the final compensation image data) generated by the compensation image data generation unit 33. For example, the compensation unit 34 is configured to sequentially insert the one or more frames of final compensation image data between the Nth frame of original image data and the (N+1)th frame of image data, thereby completing the compensation of the original image.
In the dynamic compensation device for VR display provided in this embodiment, the value of the software synchronization signal (for example, its pulse width) is first calculated at the software layer according to the resolution information, and the hardware layer then determines, according to the synchronization signal value sent by the software, whether to perform dynamic compensation or normal video signal processing. This reduces the transmission bandwidth requirements of the video signal, reduces the rendering pressure, enables high-resolution display, and improves the user experience of VR display.
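To summarize the unit composition of FIGS. 3 and 4 described above, the following structural sketch groups the four units as methods of one class; the class name, field names, and method bodies are hypothetical and simply delegate to the earlier sketches.

```python
from dataclasses import dataclass

@dataclass
class VRDisplayDynamicCompensationDevice:
    """Hypothetical structural sketch of the device of FIGS. 3/4 (not disclosed code)."""
    panel_bandwidth_gbps: float   # bandwidth data C of the display panel
    second_sync_value: float      # fixed factory value of the display panel

    def calculation_unit(self, resolution_pixels, frame_rate_hz):
        a = rendering_bandwidth_gbps(resolution_pixels, frame_rate_hz)
        return first_sync_signal_value(a, frame_rate_hz, self.panel_bandwidth_gbps)

    def comparison_unit(self, first_sync):
        return first_sync > self.second_sync_value

    def compensation_image_data_generation_unit(self, pose_history, image_accel, dt, k):
        return generate_compensation_frames(pose_history, image_accel, dt, k)

    def compensation_unit(self, original_frames, compensation_frames, n):
        return insert_compensation_frames(original_frames, compensation_frames, n)
```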
An embodiment of the present disclosure provides a display device, which includes the dynamic compensation device for VR display of the embodiment shown in FIG. 3 or FIG. 4 and a display panel. Since the display device in this embodiment includes the dynamic compensation device for VR display of the embodiment shown in FIG. 3 or FIG. 4, it can realize ultra-high-resolution VR display.
As an example, the display device may be an OLED display device or a liquid crystal display device, such as a liquid crystal panel, a mobile phone, a tablet computer, a television, a monitor, a notebook computer, a digital photo frame, a navigator, or any other product or component with a display function.
The above embodiments of the present disclosure may be combined with one another where there is no obvious conflict.
It should be understood that the above embodiments are merely exemplary embodiments adopted to illustrate the principles of the present disclosure, and the present disclosure is not limited thereto. Those of ordinary skill in the art can make various modifications and improvements without departing from the scope of protection of the present disclosure defined by the appended claims, and such modifications and improvements also fall within the scope of protection of the present disclosure.

Claims (19)

  1. A compensation method for VR display, comprising:
    calculating, according to a rendering resolution, a rendering frame rate, and bandwidth data of a display panel, a first synchronization signal value to be output by a controller;
    comparing the first synchronization signal value with a pre-stored second synchronization signal value of the display panel;
    when the first synchronization signal value is compared to be greater than the second synchronization signal value, generating compensation image data between two adjacent frames of original image data; and
    compensating an image to be displayed according to the generated compensation image data.
  2. The compensation method for VR display according to claim 1, wherein, when the first synchronization signal value is compared to be greater than the second synchronization signal value, the generated compensation image data between the two adjacent frames of original image data is one frame of compensation image data;
    the two adjacent frames of original image data are an Nth frame of original image data and an (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1; and
    the step of generating the compensation image data between the two adjacent frames of original image data comprises:
    acquiring estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it; and
    calculating a compensation matrix according to an image acceleration of the Nth frame of original image data and the estimated compensation image data, to obtain final compensation image data.
  3. The compensation method for VR display according to claim 1, wherein, when the first synchronization signal value is compared to be greater than the second synchronization signal value, the generated compensation image data between the two adjacent frames of original image data is multiple frames of compensation image data;
    the two adjacent frames of original image data are an Nth frame of original image data and an (N+1)th frame of original image data, respectively, where N is an integer greater than or equal to 1; and
    the step of generating the compensation image data between the two adjacent frames of original image data comprises:
    acquiring a first frame of estimated compensation image data according to the Nth frame of original image data and a preset number of consecutive frames of original image data before it;
    calculating a compensation matrix of the first frame of estimated compensation image data according to an image acceleration of the Nth frame of original image data and the first frame of estimated compensation image data, to obtain a first frame of final compensation image data;
    acquiring an (M+1)th frame of estimated compensation image data according to an Mth frame of final compensation image data and a preset number of consecutive frames of image data before it, where M is an integer greater than or equal to 1; and
    calculating a compensation matrix of the (M+1)th frame of estimated compensation image data according to an image acceleration of the Mth frame of final compensation image data and the (M+1)th frame of estimated compensation image data, to obtain an (M+1)th frame of final compensation image data.
  4. 根据权利要求1至3中任一项所述的VR显示的补偿方法,其中,所生成的补偿图像数据的帧数,与所述第一同步信号值和所述第二同步信号值的差值正相关。
  5. 根据权利要求1所述的VR显示的补偿方法,其中,所述根据显示面板的渲染分辨率、渲染帧频和带宽数据,计算出要由控制器输出的第一同步信号值的步骤,包括:
    获取所述显示面板的渲染分辨率、渲染帧频以及带宽数据;
    根据所述显示面板的渲染分辨率和渲染帧频,计算出渲染带宽;以及
    根据公式:1/B+A*24/(A-C),计算出要由所述控制器输出的所述第一同步信号值;其中,A为所述渲染带宽,B为所述渲染帧频,C为所述显示面板的所述带宽数据。
  6. 根据权利要求5所述的VR显示的补偿方法,其中,根据所述显示面板的渲染分辨率和渲染帧频,计算出渲染带宽的步骤 包括:计算出所述渲染分辨率和所述渲染帧频的乘积,作为所述渲染带宽。
  7. 根据权利要求1至6中任一项所述的VR显示的补偿方法,其中,所述第二同步信号值是所述显示面板出厂时具有的固定值。
  8. 一种VR显示的动态补偿装置,包括:
    计算单元,用于根据显示面板的渲染分辨率、渲染帧频和带宽数据,计算出要由控制器输出的第一同步信号值;
    比较单元,用于将所述第一同步信号值与预先存储的所述显示面板的第二同步信号值进行比较;
    补偿图像数据生成单元,用于当所述比较单元比较出所述第一同步信号值大于所述第二同步信号值时,生成两相邻帧原始图像数据之间的补偿图像数据;以及
    补偿单元,用于根据所述补偿图像数据生成单元生成的补偿图像数据,对待进行显示的图像进行补偿。
  9. 根据权利要求8所述的VR显示的动态补偿装置,其中,所述补偿图像数据生成单元包括推测子单元、补偿矩阵计算子单元和补偿图像数据生成子单元。
  10. 根据权利要求9所述的VR显示的动态补偿装置,其中,当所述比较单元比较出所述第一同步信号值大于所述第二同步信号值时,所述补偿图像数据生成单元所生成的两相邻帧原始图像数据之间的补偿图像数据为一帧补偿图像数据;
    所述两相邻帧原始图像数据分别为第N帧原始图像数据和第N+1帧原始图像数据,其中,N为大于或者等于1的整数;
    所述推测子单元用于根据第N帧原始图像数据和其之前的、连续的预设帧数原始图像数据,获取推测的补偿图像数据;
    所述补偿矩阵计算子单元用于根据第N帧原始图像数据的图 像加速度和所述推测子单元所获取的所述推测的补偿图像数据,计算出补偿矩阵;以及
    所述补偿图像数据生成子单元用于根据所述补偿矩阵计算子单元计算出的所述补偿矩阵生成补偿图像数据。
  11. 根据权利要求9所述的VR显示的动态补偿装置,其中,当所述比较单元比较出所述第一同步信号值大于所述第二同步信号值时,所述补偿图像数据生成单元所生成的两相邻帧原始图像数据之间的补偿图像数据为多帧补偿图像数据;
    所述两相邻帧原始图像数据分别为第N帧原始图像数据和第N+1帧原始图像数据,其中,N为大于或者等于1的整数;
    所述推测子单元用于根据第N帧原始图像数据和其之前的、连续的预设帧数原始图像数据,获取第一帧推测补偿图像数据;以及根据第M帧最终补偿图像数据和其之前的、连续的预设帧数图像数据,获取第M+1帧推测补偿图像数据;其中,M为大于或者等于1的整数;
    所述补偿矩阵计算子单元用于根据第N帧原始图像数据的图像加速度和所第一帧推测补偿图像数据,计算出第一帧推测补偿图像数据的补偿矩阵,以及根据第M帧最终补偿图像数据的图像加速度和所述推测子单元推测出的所述第M+1帧推测补偿图像数据,计算出第M+1帧推测补偿图像数据的补偿矩阵;以及
    所述补偿图像数据生成子单元用于根据所述补偿矩阵计算子单元计算出的第一帧推测补偿图像数据的补偿矩阵,生成第一帧最终补偿图像数据,以及根据所述补偿矩阵计算子单元计算出第M+1帧推测图像数据的补偿矩阵,生成第M+1帧最终补偿图像数据。
  12. 根据权利要求8至11中任一项所述的VR显示的动态补偿装置,其中,所述补偿图像数据生成单元生成的所述补偿图像数据的帧数与所述第一同步信号值和所述第二同步信号值的差值 正相关。
  13. 根据权利要求8至12中任一项所述的VR显示的动态补偿装置,其中,所述计算单元包括:
    获取子单元,用于获取所述显示面板的渲染分辨率、渲染帧频以及带宽数据;
    第一计算子单元,用于根据所述显示面板的渲染分辨率和渲染帧频,计算出渲染带宽;以及
    第二计算子单元,用于根据公式:1/B+A*24/(A-C),计算出要由所述控制器输出的所述第一同步信号值;其中,A为所述渲染带宽,B为所述渲染帧频,C为所述显示面板的所述带宽数据。
  14. 根据权利要求13所述的VR显示的动态补偿装置,其中,所述第一计算子单元用于计算所述渲染分辨率和所述渲染帧频的乘积,作为所述渲染带宽。
  15. 根据权利要求8至14中任一项所述的VR显示的动态补偿装置,其中,所述第二同步信号值是所述显示面板出厂时具有的固定值。
  16. 根据权利要求8至15中任一项所述的VR显示的动态补偿装置,其中,所述补偿图像数据生成单元包括感应器。
  17. 根据权利要求9至11中任一项所述的VR显示的动态补偿装置,其中,所述补偿图像数据生成单元的所述推测子单元包括感应器。
  18. 根据权利要求16或17所述的VR显示的动态补偿装置,其中,所述感应器是陀螺仪。
  19. 一种显示装置,包括根据权利要求8至18中任一项所述的VR显示的动态补偿装置以及所述显示面板。
PCT/CN2019/128273 2019-01-02 2019-12-25 Vr显示的补偿方法及补偿装置和显示装置 WO2020140808A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/254,981 US11302280B2 (en) 2019-01-02 2019-12-25 Compensation method and compensation device for VR display and display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910002157.0A CN109545122B (zh) 2019-01-02 2019-01-02 Vr显示的补偿方法及补偿装置、显示系统
CN201910002157.0 2019-01-02

Publications (1)

Publication Number Publication Date
WO2020140808A1 true WO2020140808A1 (zh) 2020-07-09

Family

ID=65834101

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/128273 WO2020140808A1 (zh) 2019-01-02 2019-12-25 Vr显示的补偿方法及补偿装置和显示装置

Country Status (3)

Country Link
US (1) US11302280B2 (zh)
CN (1) CN109545122B (zh)
WO (1) WO2020140808A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109545122B (zh) 2019-01-02 2021-01-29 京东方科技集团股份有限公司 Vr显示的补偿方法及补偿装置、显示系统
CN114356082A (zh) * 2021-12-20 2022-04-15 歌尔光学科技有限公司 增强现实设备的图像优化方法、装置、电子设备及系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8477848B1 (en) * 2008-04-22 2013-07-02 Marvell International Ltd. Picture rate conversion system architecture
WO2017036429A2 (en) * 2016-12-01 2017-03-09 Viewtrix Technology Co., Ltd. Zone-based display data processing and transmission

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009060371A (ja) * 2007-08-31 2009-03-19 Sony Corp 表示装置
US20110249135A1 (en) * 2010-04-08 2011-10-13 Canon Kabushiki Kaisha Image processing apparatus and method of controlling same
CN105825801A (zh) * 2016-03-21 2016-08-03 联想(北京)有限公司 一种显示控制方法及电子设备
CN106127843A (zh) * 2016-06-16 2016-11-16 福建数博讯信息科技有限公司 三维虚拟场景的渲染方法和装置
CN108109570A (zh) * 2016-11-14 2018-06-01 谷歌有限责任公司 用于有效传输的低分辨率rgb渲染
CN106658170A (zh) * 2016-12-20 2017-05-10 福州瑞芯微电子股份有限公司 一种降低虚拟现实延迟的方法和装置
CN109545122A (zh) * 2019-01-02 2019-03-29 京东方科技集团股份有限公司 Vr显示的补偿方法及补偿装置、显示系统

Also Published As

Publication number Publication date
CN109545122B (zh) 2021-01-29
US20210264872A1 (en) 2021-08-26
US11302280B2 (en) 2022-04-12
CN109545122A (zh) 2019-03-29

Similar Documents

Publication Publication Date Title
US10049642B2 (en) Sending frames using adjustable vertical blanking intervals
US9786255B2 (en) Dynamic frame repetition in a variable refresh rate system
TWI514367B (zh) 以顯示時間的估計爲函數而修改畫素値的系統、方法、與電腦程式產品
US9728166B2 (en) Refresh rate matching with predictive time-shift compensation
EP2622454B1 (en) Image synchronization for multiple displays
CN109920040B (zh) 显示场景处理方法和装置、存储介质
WO2019153723A1 (zh) 视频画面显示方法、装置、电视机及存储介质
US8711207B2 (en) Method and system for presenting live video from video capture devices on a computer monitor
WO2020140808A1 (zh) Vr显示的补偿方法及补偿装置和显示装置
US10957020B2 (en) Systems and methods for frame time smoothing based on modified animation advancement and use of post render queues
TWI749756B (zh) 借助於合成器生成一系列訊框方法和裝置
US6844879B2 (en) Drawing apparatus
WO2022089046A1 (zh) 虚拟现实显示方法、装置及存储介质
CN115151969A (zh) 用以补偿被延迟的图形处理单元渲染时间的被减少的显示处理单元传送时间
US20170075432A1 (en) Cursor handling in a variable refresh rate environment
US8194065B1 (en) Hardware system and method for changing a display refresh rate
US20230086916A1 (en) Image processing apparatus and image processing method
US12034786B2 (en) Image processing device, image data transfer device, and image generation method
US20240007612A1 (en) Virtual reality display method, device and storage medium
WO2023240699A1 (zh) 显示装置及电子设备
CN117612466A (zh) 一种显示方法和显示设备
CN117238244A (zh) 显示方法、终端设备及图像显示装置
JP2009008946A (ja) 表示装置および表示プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19907721

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19907721

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.02.2022)
