WO2022242704A1 - Method for refreshing a screen of a head-mounted display device, and head-mounted display device - Google Patents
Method for refreshing a screen of a head-mounted display device, and head-mounted display device
- Publication number
- WO2022242704A1 (PCT/CN2022/093762)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- screen
- data
- moment
- pose
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1431—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/18—Timing circuits for raster scan displays
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/39—Control of the bit-mapped memory
- G09G5/393—Arrangements for updating the contents of the bit-mapped memory
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/39—Control of the bit-mapped memory
- G09G5/399—Control of the bit-mapped memory using two or more bit-mapped memories, the operations of which are switched in time, e.g. ping-pong buffers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2310/00—Command of the display device
- G09G2310/06—Details of flat display driving waveforms
- G09G2310/061—Details of flat display driving waveforms for resetting or blanking
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0435—Change or adaptation of the frame rate of the video stream
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/04—Display device controller operating with a plurality of display units
Definitions
- the present disclosure relates to the technical field of image display, and in particular to a method for refreshing a screen of a head-mounted display device, a head-mounted display device, and a non-transitory computer-readable storage medium storing computer instructions.
- Head-mounted display devices such as augmented reality (AR), virtual reality (VR), or mixed reality (MR) glasses usually have two screens, corresponding respectively to the user's left eye and right eye.
- the above-mentioned head-mounted display device is usually connected to a smart terminal through a Type-c interface, so that the two screens of the head-mounted display device display the data transmitted by the smart terminal.
- common screen refresh methods include rolling shutter illumination and global illumination. If the rolling shutter illumination refresh method is used, the two screens of the head-mounted display device need to be refreshed at the same time. The rolling shutter illumination refresh method is usually realized by progressive (line-by-line) exposure: the start and end display times of each line on the screen are inconsistent, which easily distorts the displayed image, that is, the jelly effect. If the global illumination refresh method is used to avoid the jelly effect, all the pixels of the screen must light up and turn off at the same time.
- Embodiments of the present disclosure propose a method and apparatus for refreshing a screen of a head-mounted display device.
- the embodiments of the present disclosure provide a method for refreshing the first screen and the second screen of the head-mounted display device.
- the method includes: from the first moment to the second moment, sending first data to both the first screen and the second screen, writing the first data into the first screen to configure the first screen to be in a black-screen state while the second screen is in a bright-screen state at least part of the time; and from the second moment to the third moment, sending second data to both the first screen and the second screen, writing the second data into the second screen to configure the second screen to be in a black-screen state while the first screen is in a bright-screen state at least part of the time to display the first data.
- embodiments of the present disclosure provide an apparatus for refreshing a first screen and a second screen of a head-mounted display device, and the first screen and the second screen respectively correspond to two eyes of a user.
- the device includes: a data sending unit configured to send first data to both the first screen and the second screen from the first moment to the second moment, write the first data into the first screen, and configure the first screen to be in a black-screen state while the second screen is in a bright-screen state at least part of the time; and to send second data to both the first screen and the second screen from the second moment to the third moment, write the second data into the second screen, and configure the second screen to be in a black-screen state while the first screen is in a bright-screen state at least part of the time to display the first data.
- an embodiment of the present disclosure provides a wearable device, including the device for refreshing the first screen and the second screen of the head-mounted display device according to the second aspect.
- an embodiment of the present disclosure provides an electronic device, including: one or more processors; and a storage device on which one or more programs are stored, where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method in the first aspect.
- embodiments of the present disclosure provide a non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause a computer to execute the method in the first aspect.
- embodiments of the present disclosure provide a computer program product, including computer programs/instructions, where the computer program/instructions implement the method in the first aspect when executed by a processor.
- FIG. 1 is a schematic flowchart of an implementation of a method for refreshing a first screen and a second screen of a head-mounted display device according to an embodiment of the present disclosure.
- FIG. 2 is a schematic diagram of a connection between a main device and a head-mounted display device according to an embodiment of the present disclosure.
- Fig. 3 is a schematic diagram of an implementation flow for time synchronization provided by an embodiment of the present disclosure.
- Fig. 4 is a schematic structural diagram of first data and second data provided by an embodiment of the present disclosure.
- FIG. 5 is a schematic diagram of an application process of the method provided by the embodiment of the present disclosure in practice.
- FIG. 6 is a schematic diagram of another practical application flow of the method provided by the embodiment of the present disclosure.
- FIGS. 7A to 9C are schematic flowcharts of a method for reducing display delay of image data based on asynchronous time warping provided by an embodiment of the present disclosure.
- FIG. 10 is a schematic structural diagram of an apparatus for refreshing a first screen and a second screen of a head-mounted display device according to an embodiment of the present disclosure.
- FIG. 11 is a schematic structural diagram of an electronic device for refreshing a first screen and a second screen of a head-mounted display device according to an embodiment of the present disclosure.
- the term “comprising” and its variants represent open terms, meaning “including but not limited to”.
- the term “based on” means “based at least in part on”.
- the terms “one embodiment” and “an embodiment” mean “at least one embodiment.”
- the term “another embodiment” means “at least one other embodiment.”
- the terms “first”, “second”, etc. may refer to different or the same object. The following may include other definitions, either express or implied. Unless the context clearly indicates otherwise, the definition of a term is consistent throughout the specification.
- an embodiment of the present disclosure provides a method for refreshing the first screen and the second screen of a head-mounted display device; when the user wears the head-mounted display device, the first screen and the second screen respectively correspond to the user's two eyes.
- the first screen corresponds to the user's left eye
- the second screen corresponds to the user's right eye
- the first screen corresponds to the user's right eye
- the second screen corresponds to the user's left eye.
- the "corresponding" here means that when the user wears the head-mounted display device, for example, the user's left eye views the displayed content through the first screen, and the user's right eye views the displayed content through the second screen.
- first screen and second screen are the screens of the head-mounted display device, which is only an exemplary description of the embodiment of the present disclosure, and does not impose any limitation on the embodiment of the present disclosure.
- the execution subject of the method may be various types of computing devices, or may be an application program or an application (Application, APP) installed on the computing device.
- the computing device for example, may be a user terminal such as a mobile phone, a tablet computer, or a head-mounted display device, or may be a server or the like.
- a head-mounted display device may be connected to other devices for receiving, displaying or processing data from the device, such as data from an application of the device.
- the device connected to the head-mounted display device may be other terminals such as a mobile phone or a computer, and is referred to as a main device in this document.
- the head-mounted display device and the main device can be connected through a USB interface, so as to transmit the data of the main device to the head-mounted display device.
- Data described herein includes, but is not limited to, audio data, video data, image data, and the like.
- the head-mounted display device may not be connected with other devices, the head-mounted display device may have components for storing the above data, and the data may be directly transmitted from the internal components of the head-mounted display device itself.
- the main device and the head-mounted display device can be connected through a Type-c interface.
- the video data of the main device can be transmitted to the screen control chip of the head-mounted display device through the DisplayPort (DP) protocol, and the screen control chip can transmit the video data to the first screen and the second screen through low-voltage differential signaling (LVDS) technology; the screen control chip can also send signals to the first screen and the second screen through control lines to control other functions of the screens.
- a buffer area (Buffer) is usually required: the central processing unit (CPU) or graphics processing unit (GPU) generates an image in the buffer area in advance, and the screen control chip then reads the image from the buffer area and transfers it to the screen.
- this method is prone to transmission delays.
- data transmission may be performed in a manner that does not set a separate buffer area. For example, only a small buffer area is set on the screen control chip; this buffer area can only buffer a few lines of data and cannot buffer the data of a whole frame of image. In other words, the image data that the buffer area can store is less than one whole frame of image data. Data transferred from the main device therefore needs to be forwarded to the screen immediately, thereby reducing latency.
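The small-buffer streaming described above can be sketched as follows. This is an illustrative Python sketch only; the `LineBuffer` name and its capacity are assumptions chosen to show a buffer that holds a few lines rather than a whole frame, not the disclosed chip design:

```python
from collections import deque

class LineBuffer:
    """A tiny buffer that can hold only a few display lines, not a whole
    frame: lines arriving from the main device must be forwarded to the
    screen almost immediately, which is what keeps the latency low."""

    def __init__(self, capacity_lines: int = 4):
        self._lines = deque()
        self._capacity = capacity_lines

    def receive(self, line: bytes) -> None:
        # The main device may not run ahead of the screen by more than a
        # few lines; a real chip would apply back-pressure here instead.
        if len(self._lines) >= self._capacity:
            raise OverflowError("screen is not draining lines fast enough")
        self._lines.append(line)

    def drain_to_screen(self):
        # Forward buffered lines to the screen in arrival order.
        while self._lines:
            yield self._lines.popleft()
```

Because the buffer is smaller than a frame, the producer and consumer are forced into lock-step, which is how the no-whole-frame-buffer design trades buffering for latency.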
- the data can be written to the screen in advance, the screen is controlled to be in a black state during the process of writing data to the screen, and then the screen is lit to display the data that has been written to the screen.
- the refresh rate supported by the screen of the head-mounted display device is usually 60Hz, 90Hz, 120Hz and 144Hz, etc.
- the upper limit of the image data transmitted by the Type-c interface through the DP protocol is less than 1920*1200*2*120Hz. Therefore, when the transmission bandwidth is limited, related technologies cannot transmit high-frame-rate, high-resolution images such as 1920*1200*2*144, 3840*1200*2*120, or 3840*1200*2*144.
- the usual practice is to refresh the image data corresponding to the two screens at the same time, so that the data is transmitted from the main device to the two screens through the Type-c interface.
- a whole frame of image data is 3840*1200*60Hz or 1920*2400*60Hz
- the time for two screens to write a whole frame of image data is 16.666...ms
- the delay time is 16.666...ms + screen lighting time/2. Since the two screens need to be refreshed at the same time, the delay of each screen is also 16.666...ms + screen lighting time/2, and the refresh delay is relatively large.
- the present disclosure adopts a manner of refreshing two screens in turn, so as to reduce the refresh delay of each screen.
- the time to write data to a screen only needs 8.333...ms, that is, the delay time is only 8.333...ms + screen lighting time/2.
- the refresh delay of each screen can be reduced.
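The arithmetic behind these delay figures can be checked with a short sketch. The helper below is an assumption for illustration, not part of the disclosure; the screen lighting time is left as a parameter:

```python
def refresh_delay_ms(frame_rate_hz: float, screens_refreshed_in_turn: int,
                     lighting_ms: float) -> float:
    """Refresh delay = time to write one screen's data + half the lighting time.

    When both screens are written simultaneously, the write occupies the whole
    frame period; when the screens are refreshed in turn, each screen's write
    takes only a 1/n share of that period.
    """
    frame_period_ms = 1000.0 / frame_rate_hz
    return frame_period_ms / screens_refreshed_in_turn + lighting_ms / 2.0

# Simultaneous refresh at 60 Hz: 16.666... ms + lighting/2 per screen.
# Alternating refresh at 60 Hz:   8.333... ms + lighting/2 per screen.
```

Halving the write time halves the dominant term of the per-screen delay, which is the gain claimed for refreshing the two screens in turn.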
- a method for refreshing the first screen and the second screen of the display device of the present disclosure will be specifically described as follows.
- the first data and the second data may include image data for refreshing the first screen and the second screen.
- the first data and the second data may be, for example, image data sent by a master device connected to the head-mounted display device.
- the "black screen state" in the present disclosure means that the screen does not display any content; for example, the screen may not display image data and may appear black.
- the "bright screen state" in the present disclosure refers to a state where the screen is turned on and data has been written. During the time interval from the first moment to the second moment or from the second moment to the third moment, the screen may be in the bright-screen state the whole time, or in the bright-screen state for part of the time and the black-screen state for the rest. In the present disclosure, when data is written into the screen, the screen can be controlled to be in a black-screen state.
- in the bright-screen state, the screen may display written data. It should be noted that if, during the time interval from the first moment to the second moment or from the second moment to the third moment, the screen is controlled to be in the bright-screen state for part of the time and in the black-screen state for part of the time, then within that interval, even while the screen is in the black-screen state, the data transmitted at that time is not written to the screen.
- the first screen and the second screen may be controlled to be in a black screen state or a bright screen state by sending signals to the first screen and the second screen.
- the screen control chip continuously sends signals to the first screen and the second screen, and may continuously send the first signal to the first screen and the second screen between the first moment and the second moment, the first signal indicating that the first screen is in a black-screen state and the second screen is in a bright-screen state.
- the screen control chip sends signals to the first screen and the second screen at intervals, and may send a second signal to the first screen and the second screen at the first moment, the second signal indicating that, between the first moment and the second moment, the first screen is in a black-screen state and the second screen is in a bright-screen state.
- the screen control chip sends a third signal to the first screen and the second screen at the initial moment, such as the first moment, and the third signal indicates whether the first screen and the second screen are in a bright-screen state or a black-screen state during a preset period (such as the time intervals divided by the above-mentioned first moment, second moment, third moment, ... up to the nth moment).
- the first screen and the second screen can be set to be in a black screen state or a bright screen state according to a predetermined period.
- the first screen and the second screen can be set to be in the bright-screen state or the black-screen state according to the time intervals divided by the first moment, the second moment, the third moment, ... up to the nth moment, without sending signals to the first screen and the second screen.
- the first moment may be the initial moment, and may be the moment when the first vertical synchronization signal of the first screen and the second screen is obtained; at this moment, neither the first screen nor the second screen has started to display images.
- the screen control chip sends the first data to both the first screen and the second screen; within this time interval, the first data can be written into the first screen, the first screen is configured to be in a black-screen state, and the second screen is configured to be in a bright-screen state at least part of the time.
- if the first moment is the initial moment, since no data has been transmitted and written into the first screen or the second screen before this moment, the second screen does not display image data even though it is in the bright-screen state at least part of the time.
- the screen control chip sends the second data to both the first screen and the second screen.
- the second data can be written into the second screen, the second screen is configured to be in a black-screen state, and the first screen is in a bright-screen state at least part of the time to display the first data.
- the first screen, into which data was written in the time interval between the first moment and the second moment, is configured to be in a bright-screen state between the second moment and the third moment to display the first data. It can be understood that if the first moment is not the initial moment and data has already been written into the second screen, then between the first moment and the second moment the second screen can be in the bright-screen state at least part of the time and display the previously written data.
- in any time interval, data can only be written into one of the screens, screen A, and screen A is controlled to be in a black-screen state, while the other screen, screen B, is in a bright-screen state at least part of the time to display the data (if any) written in the previous time interval.
- in the next time interval, data can only be written into screen B, screen B is controlled to be in a black-screen state, and screen A is in a bright-screen state at least part of the time to display the data written in the previous time interval. That is, the two screens are refreshed in turn, so that the two screens alternately write data and display data, thereby reducing the refresh delay of each screen.
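The alternating scheme above can be illustrated with a minimal sketch. The `Screen` class and the interval loop are illustrative assumptions, not the disclosed implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Screen:
    name: str
    written: Optional[bytes] = None    # data written while in the black-screen state
    displayed: Optional[bytes] = None  # data shown while in the bright-screen state

def refresh_interval(black: Screen, lit: Screen, data: bytes) -> None:
    """One time interval: write `data` into the blacked-out screen while the
    other screen lights up to display what was written previously."""
    black.written = data          # written while black, nothing shown yet
    lit.displayed = lit.written   # bright-screen state: show previous data

first, second = Screen("first"), Screen("second")
for i, frame in enumerate([b"frame0", b"frame1", b"frame2"]):
    # Swap roles at each moment boundary: the two screens refresh in turn.
    if i % 2 == 0:
        refresh_interval(first, second, frame)
    else:
        refresh_interval(second, first, frame)
```

After three intervals the first screen is displaying frame0 and holding frame2, while the second screen is displaying frame1: each frame is shown one interval after it was written, on whichever screen was black during the write.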
- the time interval between the first moment and the second moment may be equal to the time interval between the second moment and the third moment. For example, if each of the aforementioned moments is the moment when the vertical synchronization signal is obtained, then the time interval between any two vertical synchronization signals is the same. In this way, the time for each screen to write data is equal, which facilitates the preparation of data for transmission to the head-mounted display device.
- the number of frames per second (Frames Per Second, FPS) of the screen can be synchronized with the refresh rate of the screen.
- the first moment, the second moment and the third moment can be the moments of obtaining the vertical synchronization (Vsync) signals of the first screen and the second screen of the head-mounted display device; alternatively, the first moment and the third moment may be the moments of obtaining the Vsync signals of the first screen and the second screen, and the second moment may be a moment between the moments of obtaining the two Vsync signals, for example the midpoint between them. It can be understood that at least one of the first moment, the second moment and the third moment is a moment at which the vertical synchronization signals of the first screen and the second screen are acquired.
- the Vsync signal is used to indicate the end of the previous frame of data and the start of the next frame of data.
- the timestamp of the screen's Vsync may be transmitted to the main device through the microcontroller unit (MCU).
- the head-mounted display device is connected to the main device through a Type-c interface to display the data of the App of the main device.
- the App of the main device can obtain the Vsync of the screen of the head-mounted display device through the main device. If the main device is not authorized, the App needs to obtain the Vsync of the screen through the MCU. It is necessary to align the system time of the head-mounted display device with the system time of the main device through a system time synchronization method. At the same time, the current refresh status of the screen can also be known, such as which screen data will be written to next and which screen should be controlled to be in the bright-screen state.
- the method for refreshing the screen of the present disclosure further includes:
- S210: Receive a time synchronization request sent by a main device connected to the head-mounted display device, where the time synchronization request includes a first time of the main device and a time synchronization protocol.
- the time deviation between the main device connected to the head-mounted display device and the head-mounted display device can be calibrated, thereby ensuring the accuracy of data transmission.
- the head-mounted display device is connected to the main device through the Type-c interface; when the main device initiates a time synchronization request, the system time t_h1 of the main device and the time synchronization protocol are transmitted to the MCU through the USB protocol, and the MCU receives the time synchronization request.
- the system time t_g1 of the MCU is obtained immediately, and t_h1 and t_g1 are sent back to the main device; the main device receives the MCU's time synchronization response and immediately obtains its own system time t_h2.
- t_h1 - t_g1 includes the difference delta_t between the system time of the main device and the system time of the head-mounted display device and the USB transmission time t_usb. delta_t can then be obtained by the following calculation.
- delta_t = (t_h1 - t_g1 + t_h2 - t_g1)/2
- the time synchronization protocol can be executed periodically, and the delta_t obtained each time is weighted and averaged to obtain a filtered delta_t, so that delta_t changes more smoothly; this also handles changes in delta_t caused by the frequency offset between the two systems.
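The offset computation and the filtering described above can be written out as a short sketch. The exponential weighting factor `alpha` is an assumed choice for the "weighted average"; the disclosure does not fix a particular filter:

```python
def clock_offset(t_h1: float, t_g1: float, t_h2: float) -> float:
    """delta_t = (t_h1 - t_g1 + t_h2 - t_g1) / 2, assuming the USB transit
    time t_usb is the same for the request and the response."""
    return (t_h1 - t_g1 + t_h2 - t_g1) / 2.0

def filtered_offset(samples: list, alpha: float = 0.25) -> float:
    """Weighted average of successive delta_t samples so the offset changes
    smoothly despite jitter and slow frequency drift between the two clocks."""
    value = samples[0]
    for s in samples[1:]:
        value = (1.0 - alpha) * value + alpha * s
    return value

# Example: true offset 6 ms with a symmetric 1 ms USB transit (values assumed).
# Main device sends at t_h1 = 100; MCU stamps t_g1 = 95; reply lands at t_h2 = 102.
```

With symmetric transit, the t_usb terms cancel in the sum, which is why the two round-trip halves average out to exactly delta_t.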
- the first data may include first full frame data
- the second data may include second full frame data.
- the first entire frame of data and the second entire frame of data may be configured as the same image data; or the first entire frame of data may be configured as left-eye image data, and the second entire frame of data may be configured as right-eye image data.
- the "whole frame of data" in the present disclosure refers to the data of a complete frame of image.
- the first data and the second data may also include partial data of a whole frame of data.
- the first data includes the first partial data of the third full frame of data
- the second data includes the second partial data of the third full frame of data as an example, wherein the first partial data and the second partial data constitute the third full frame of data.
- the first data may include a part of data of a complete frame of image, for example, it may be one-half, one-third, or one-fourth of one frame of complete image.
- the second data may include the remaining one-half, two-thirds, or three-quarters of the data of the complete frame of the image, so that the first data and the second data together form the complete frame of the image. It can be understood that when the first data and the second data are not each half of the whole frame of data, the time intervals between the first moment, the second moment, and the third moment should be adjusted accordingly.
- a schematic structural diagram of the first data and the second data is provided in an embodiment of the present disclosure.
- first data may include the left half data of the third full frame data
- second data may include the right half data of the third full frame data.
- first data may include data in the upper half of the third full frame of data
- second data may include data in the lower half of the third full frame of data.
- when the first data and the second data are configured as the same image data, the user wearing the head-mounted display device can only view a picture with a two-dimensional display effect.
- the first data and the second data may be configured as different image data, for example, the first full frame data is configured as left eye image data, and the second full frame data is configured as right eye image data.
- the "left-eye image data" mentioned in this disclosure refers to the data configured to be transmitted to the screen (such as the first screen) corresponding to the user's left eye.
- the image data for the left eye and the image data for the right eye may be configured as image data with parallax to achieve a stereoscopic display effect.
- when the user wears the head-mounted display device, he can watch a picture with a three-dimensional display effect.
- left-eye image data and right-eye image data are configured as image data with parallax
- the parallax between the left-eye image data and the right-eye image data is such that the user perceives a stereoscopic effect when viewing with both eyes.
- the first data is configured as left-eye image data
- the second data is configured as right-eye image data
- the first data and the second data are configured as different data
- if there is no parallax between the left-eye image data and the right-eye image data, a two-dimensional display effect can still be achieved when using this method.
- the first data and the second data may be different in areas where the visual fields of the left and right eyes overlap.
- FIG. 5 shows an embodiment of the screen refreshing method of the present disclosure.
- the nth data can be sent to both the first screen and the second screen, and the nth data can be written into the first screen, to configure the first screen to be in a black-screen state and the second screen to be in a bright-screen state at least part of the time.
- when the second screen is configured as a bright screen, it can display the data written earlier.
- n may refer to a positive integer such as 1, 2, 3, and so on.
- when n refers to 1, it can be understood as the initial state. It should be noted that since no data is written to the second screen before the first Vsync signal is acquired, in the time interval between the acquisition of the first Vsync signal and the acquisition of the second Vsync signal, although the second screen is on at least part of the time, it does not display data.
- the n+1th data can be sent to both the first screen and the second screen, and the n+1th data can be written into the second screen
- to configure the second screen to be in a black-screen state and the first screen to be in a bright-screen state at least part of the time; when the first screen is configured to be in the bright-screen state, it can display the nth data that has been written.
- the operations of continuously transmitting data, writing data in turn, and displaying data in turn are performed.
- the explanation about the nth data and the n+1th data can refer to the above explanation about the first data and the second data, for example, the nth data and the n+1th data can be whole frame data.
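The alternating scheme of FIG. 5 can be simulated with a short sketch. The screen names and the assumption of one Vsync interval per data item follow the text; everything else (function name, tuple layout) is illustrative.

```python
# Minimal simulation of the alternating refresh scheme: while the nth data is
# written into one screen (held black), the other screen lights up and shows
# the data written in the previous interval.

def refresh_schedule(num_frames):
    """Yield (interval, screen being written, frame shown on the other screen)."""
    last_written = {"first": None, "second": None}
    for n in range(1, num_frames + 1):
        write_to = "first" if n % 2 == 1 else "second"
        display_on = "second" if write_to == "first" else "first"
        # None before any write: the screen is lit but displays no data,
        # matching the initial-state note above.
        shown = last_written[display_on]
        yield n, write_to, shown
        last_written[write_to] = n

for n, write_to, shown in refresh_schedule(4):
    print(f"interval {n}: write frame {n} to {write_to} screen, other screen shows {shown}")
```

The first interval shows `None`, reflecting the note that the second screen is lit but displays no data before any write.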
- FIG. 6 shows an embodiment of the screen refreshing method of the present disclosure.
- the nth data can be sent to both the first screen and the second screen.
- Any moment between the acquisition of the nth Vsync signal and the acquisition of the n+1th Vsync signal may be used as a dividing point, for example, the middle moment of the time interval between acquiring two Vsync signals.
- the time when the nth Vsync signal is acquired is Tn
- the time when the n+1th Vsync signal is acquired is Tn+1
- the above intermediate time may be Tn + (Tn+1 - Tn)/2.
- the first part of the data of the nth data can be written into the first screen to configure the first screen to be in a black-screen state, and the second screen to be in a bright-screen state at least part of the time.
- when the second screen is configured as a bright screen, it can display the data written earlier.
- the second part of the nth data can be written into the second screen to configure the second screen to be in a black-screen state, and the first screen to be in a bright-screen state at least part of the time.
- when the first screen is configured as a bright screen, it can display the first part of the nth data that has been written.
- the operations of continuously transmitting data, writing data in turn, and displaying data in turn are performed.
- the first part of the nth data can be one-half of the entire frame of data
- the second part of the nth data can be the remaining half of the entire frame of data
- the first part of the data and the second part of the data together constitute the whole frame of data.
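The FIG. 6 variant, where the Vsync interval is split at its midpoint and the frame is written in two halves, can be sketched as below. The helper names are illustrative assumptions; only the midpoint formula and the half-frame split come from the text.

```python
# Sketch of the half-frame scheme: the interval between two Vsync signals is
# split at its midpoint; the first half of the nth frame is written to the
# first screen in the first sub-interval, the second half to the second
# screen in the second sub-interval.

def split_interval(t_n, t_n1):
    """Dividing moment Tn + (Tn+1 - Tn)/2 between two Vsync signals."""
    return t_n + (t_n1 - t_n) / 2.0

def split_frame(frame_rows):
    """Divide a full frame (here a list of rows) into the two half-frames
    that are written in turn."""
    half = len(frame_rows) // 2
    return frame_rows[:half], frame_rows[half:]
```

Any moment between the two Vsync signals could serve as the dividing point; the midpoint is just the example the text gives.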
- an asynchronous time warp (Asynchronous Timewarp, ATW) technology may be used in the embodiments of the present disclosure.
- the "pose" referred to in the following embodiments of the present disclosure may refer to the user's position and posture, for example, the user's 6-degree-of-freedom (6DoF) pose.
- 6dof 6-degree-of-freedom
- the real pose of the user at a preset moment, such as the position, motion direction, speed, acceleration, etc., can be obtained.
- the estimation operation in the following embodiments can be based on the acquired real pose of the user, or the estimated pose of the user at a preset moment can be calculated based on historical data of the user's real pose.
- the data can be transformed based on the deviation between different estimated poses. For example, all the rendered data can be transformed, or a part of the rendered data can be transformed.
- the transformed data can correspond to the pose of the user at the preset moment, where the preset moment refers to the moment when the screen displays the written data; in this way the user's pose at that moment corresponds to the picture displayed on the screen at that moment, avoiding the display delay problem.
- data may be rendered within a time interval between the first preset moment and the second preset moment.
- the second preset moment may be the moment when the data is rendered, or may be a moment after a period of time after the rendering is completed.
- ATW operations may be performed on the master device to transmit the transformed data to the head-mounted display device.
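As a rough illustration of the pose-estimation step behind ATW, the sketch below linearly extrapolates a scalar pose (for example a yaw angle) to the display moment and converts the residual pose error into a pixel shift. Real ATW operates on full 6DoF poses with quaternion rotations; this 1-D version is an assumption made purely for illustration, and both function names are invented here.

```python
# Hedged sketch of pose prediction and reprojection for asynchronous time
# warp: two sampled poses give a velocity, which is extrapolated to the
# display moment; the pose error is then applied as a horizontal shift.

def predict_pose(pose1, t1, pose2, t2, t3):
    """Linearly extrapolate a scalar pose (e.g. yaw in degrees) to moment t3
    from samples (pose1, t1) and (pose2, t2)."""
    velocity = (pose2 - pose1) / (t2 - t1)
    return pose2 + velocity * (t3 - t2)

def timewarp_shift(rendered_yaw, predicted_yaw, pixels_per_degree):
    """Horizontal pixel shift that re-aligns an already-rendered frame with
    the pose predicted for the moment it will actually be displayed."""
    return (predicted_yaw - rendered_yaw) * pixels_per_degree
```

This mirrors the text's scheme of sampling a pose at two preset moments, estimating the pose at the display moment, and transforming the rendered data accordingly.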
- an embodiment of the present disclosure provides a method for reducing display delay of image data based on asynchronous time warping.
- the first preset moment T1 and the second preset moment T2 are before the first moment t1
- the third preset moment T3 is between the second moment t2 and the third moment t3, which is the moment when the written data is displayed.
- when the first data includes the first full frame of data (for example, the nth full frame of data in FIG. 7B), before the first moment t1 at which the nth full frame of data is transmitted, the nth full frame of data may be rendered between the first preset moment and the second preset moment; the first pose of the user at the first preset moment T1 is obtained, and the first estimated pose of the user at the third preset moment T3 is calculated based on the first pose.
- the second pose of the user at the second preset moment T2 is obtained, and the second estimated pose of the user at the third preset moment T3 is calculated based on the second pose.
- at least a part of the data is then transformed based on the two estimated poses, so that the transformed data corresponds to the pose of the user at T3.
- the first data may include the first partial data of the third full frame of data (for example, of the nth full frame of data in FIG. 7C), and the first partial data of the nth full frame of data is transmitted.
- the first part of the data of the nth full frame of data can be rendered between the first preset moment and the second preset moment, and the first pose of the user at the first preset moment T1 can be obtained; based on the first pose, the first estimated pose of the user at the third preset moment T3 is calculated.
- the second pose of the user at the second preset moment T2 is obtained, and the second estimated pose of the user at the third preset moment T3 is calculated based on the second pose.
- the first partial data of the nth full frame of data is transformed, so that the transformed first data corresponds to the pose of the user at the third preset moment T3.
- the method shown in this embodiment can similarly perform rendering and transformation operations on the second data, and the timing of performing each operation can be deduced similarly according to the foregoing embodiments.
- the second data may be rendered between two preset moments before the second moment t2; the user's estimated pose at the estimated display moment of the second data is calculated based on the user's poses acquired at the two preset moments, and then at least a portion of the second data is transformed before the second moment t2.
- this embodiment renders and transforms the first data and the second data separately, each before its corresponding data transmission. This approach ensures that, when the user watches through each screen, the displayed data corresponds to the user's pose at the moment of display.
- Figure 8A shows another alternative embodiment.
- the main difference between the embodiment shown in FIG. 8B and the embodiment shown in FIG. 7B is that, between the first preset time T1 and the second preset time T2, the nth full frame of data and the n+1th full frame of data are rendered.
- the data corresponding to both screens is rendered entirely before the first moment t1, and the nth full frame of data and the n+1th full frame of data are transformed before the first moment t1.
- this method calculates the user's estimated pose at a data display moment based on the user's poses collected at the first preset time T1 and the second preset time T2; to take into account the two moments at which the nth full frame of data and the n+1th full frame of data are displayed, the third preset moment T3 will be closer to the third moment t3, and may even be the third moment t3.
- the main difference between the embodiment shown in FIG. 8C and the embodiment shown in FIG. 7C is that the nth full frame of data is rendered between the first preset time T1 and the second preset time T2, so that the data corresponding to both screens is rendered entirely before the first moment t1, and the nth whole frame of data is transformed before the first moment t1.
- for this embodiment, reference may be made to the description of the above-mentioned embodiment of FIG. 8B, which will not be repeated here.
- rendering and transformation operations are performed on the first data and the second data before the first time t1.
- the estimated pose of the user at a data display moment (the third preset moment T3) is calculated, where T3 takes into account the display moments of both frames of data
- to this end, the third preset time T3 will be closer to the third time t3, and may even be the third time t3.
- the embodiment shown in FIG. 8A adopts the method of rendering and transforming the data displayed on the two screens together, and this method is easy to operate.
- Figure 9A shows yet another alternative embodiment.
- the time interval between the third moment and the fourth moment is the next period of transmitting data, writing data, and displaying data.
- the understanding of the third moment and the fourth moment can refer to the description of the first moment and the second moment.
- when the first data includes the first full frame of data (for example, the nth full frame of data shown in FIG. 9B) and the second data includes the second full frame of data (for example, the n+1th full frame of data shown in FIG. 9B), before the first moment t1 at which the nth whole frame of data is transmitted, the nth whole frame of data and the n+1th whole frame of data can be jointly rendered between the first preset moment and the second preset moment.
- the first pose of the user at the first preset moment T1 is obtained; based on the first pose, the first estimated pose of the user at the third preset moment T3 and the third estimated pose of the user at the fifth preset moment T5 are calculated.
- the third preset time T3 is the time when the nth whole frame of data is displayed between the second time t2 and the third time t3
- the fifth preset time T5 is the time when the n+1th whole frame of data is displayed, between the third time t3 and the fourth time t4.
- the second pose of the user is obtained and the second estimated pose is calculated at the second preset time T2 (before the first moment t1 when the nth full frame of data is transmitted); based on the first estimated pose and the second estimated pose, at least a part of the nth full frame of data is transformed, so that the transformed nth whole frame of data corresponds to the pose of the user at the third preset time T3.
- the third pose of the user at the fourth preset moment T4 between the first moment t1 and the second moment t2 is obtained, and the fourth estimated pose of the user at the fifth preset moment T5 is calculated based on the third pose.
- at the fourth preset moment T4 (before the second moment t2 when the n+1th full frame of data is transmitted), at least a part of the n+1th full frame of data is transformed based on the third estimated pose and the fourth estimated pose, so that the transformed n+1th whole frame of data corresponds to the pose of the user at the fifth preset moment.
- for this embodiment, reference may be made to the description of the embodiment in FIG. 5.
- the first data may include the first partial data of the third full frame of data (for example, of the nth full frame of data shown in FIG. 9C), and the second data may include the second partial data of the third full frame of data (for example, of the nth full frame of data shown in FIG. 9C); the first partial data and the second partial data constitute the third full frame of data.
- before the first moment t1 when the first partial data of the nth full frame of data is transmitted, the nth full frame of data may be rendered between the first preset moment and the second preset moment.
- the first data and the second data are jointly rendered before the first time t1, but the first data and the second data are respectively transformed before the times at which they are transmitted.
- the transformation operation is performed on the two pieces of data separately, based on the estimated pose of the user at the display time of the corresponding data.
- the rendering operation of this method is simple, and when the user watches through each screen, the displayed data corresponds to the user's pose at the moment the data is displayed.
- the image data to be rendered can be predicted based on the asynchronous time warping technology; when the user's pose changes at the moment to be displayed, at least part of the first data and/or the second data can be changed in real time based on the change of the user's pose, so that the changed first data and/or second data match the changed pose, thereby reducing the delay.
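The FIG. 9 pipeline (joint rendering, separate warps) can be summarized as a small schedule. The choice of T3 and T5 as interval midpoints is an assumption made for this sketch; the disclosure only requires T3 to lie in (t2, t3) and T5 in (t3, t4), and the function name is invented here.

```python
# Illustrative-only schedule of the joint-render / separate-warp scheme:
# both frames are rendered together before t1, but each frame is warped
# just before its own transmission, toward the pose predicted for its own
# display moment.

def fig9_schedule(t1, t2, t3, t4):
    T3 = (t2 + t3) / 2.0  # assumed display moment of the nth frame
    T5 = (t3 + t4) / 2.0  # assumed display moment of the (n+1)th frame
    return {
        "render both frames before": t1,
        "warp nth frame for": T3,    # done at T2, before t1
        "warp n+1th frame for": T5,  # done at T4, before t2
    }
```

Each warp uses the pose predicted for that frame's own display moment, which is what keeps both screens consistent with the user's pose despite the shared render pass.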
- the present disclosure further provides a method for refreshing the first screen and the second screen of a head-mounted display device, wherein when the user wears the head-mounted display device, the image displayed on the first screen falls into the The visual range of one eye of the user, and the image displayed on the second screen falls into the visual range of the other eye of the user.
- the method includes: from the first moment to the second moment, sending the first image data to both the first screen and the second screen, and writing the first image data into the first screen, to configure the first screen not to display image data and the second screen to display the written image data; and from the second moment to the third moment, sending the second image data to both the first screen and the second screen, and writing the second image data into the second screen, to configure the second screen not to display image data and the first screen to display the written first image data; at least one of the first moment, the second moment, and the third moment is a moment for acquiring the vertical synchronization signals of the first screen and the second screen.
- An embodiment of the present disclosure also provides a device for refreshing the first screen and the second screen of a head-mounted display device.
- the specific structural diagram of the device is shown in FIG. 10, including a data sending unit configured to: from the first moment to the second moment, send the first data to both the first screen and the second screen, and write the first data into the first screen to configure the first screen to be in a black-screen state and the second screen to be in a bright-screen state at least part of the time; and from the second moment to the third moment, send the second data to both the first screen and the second screen, and write the second data into the second screen to configure the second screen to be in a black-screen state and the first screen to be in a bright-screen state at least part of the time to display the first data.
- with the device provided by the embodiment of the present disclosure, the first screen and the second screen can be controlled to refresh in turn in different time periods, so that the refresh delay of a single screen can be reduced, thereby reducing the refresh delay between the smart terminal and the wearable display device.
- an electronic device includes a processor, and optionally an internal bus, a network interface, and a memory.
- the memory may include internal memory, such as high-speed random-access memory (RAM), and may also include non-volatile memory, such as at least one disk memory.
- the electronic device may also include hardware required by other services.
- the processor, the network interface and the memory can be connected to each other through an internal bus, which can be an ISA (Industry Standard Architecture, industry standard architecture) bus, a PCI (Peripheral Component Interconnect, peripheral component interconnection standard) bus or an EISA (Extended Industry Standard Architecture, extended industry standard architecture) bus, etc.
- the bus can be divided into address bus, data bus, control bus and so on. For ease of representation, only one double-headed arrow is used in FIG. 11 , but it does not mean that there is only one bus or one type of bus.
- The memory is used for storing programs.
- the program may include program code, and the program code includes computer operation instructions.
- The storage, which can include internal memory and non-volatile storage, provides instructions and data to the processor.
- the processor reads the corresponding computer program from the non-volatile memory into the internal memory and runs it, forming, at the logical level, a device for refreshing the first screen and the second screen of the head-mounted display device.
- the processor executes the program stored in the memory, and is specifically used to perform the following operations:
- the above-mentioned method for refreshing the first screen and the second screen of the head-mounted display device as provided in this specification may be applied to a processor or implemented by the processor.
- a processor may be an integrated circuit chip with signal processing capabilities.
- each step of the above method can be completed by an integrated logic circuit of hardware in a processor or an instruction in the form of software.
- the above-mentioned processor can be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it can also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
- a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
- the steps of the method disclosed in the embodiments of this specification may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
- the software module can be located in a mature storage medium in the field, such as random-access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register.
- the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
- the embodiments of this specification also propose a computer-readable storage medium storing one or more programs that include instructions; when the instructions are executed by an electronic device including a plurality of application programs, the electronic device can be made to execute the method for refreshing the first screen and the second screen of the head-mounted display device, and is specifically used to execute:
- These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means, the instruction means realizing the functions specified in one or more procedures of the flowchart and/or one or more blocks of the block diagram.
- a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- Memory may include non-permanent storage in computer-readable media, in the form of random-access memory (RAM) and/or non-volatile memory such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
- Computer-readable media, including both permanent and non-permanent, removable and non-removable media, can implement information storage by any method or technology.
- Information may be computer readable instructions, data structures, modules of a program, or other data.
Claims (15)
- A method for refreshing a first screen and a second screen of a head-mounted display device, wherein, when a user wears the head-mounted display device, an image displayed on the first screen falls within the visual range of one eye of the user and an image displayed on the second screen falls within the visual range of the other eye of the user, the method comprising: from a first moment to a second moment, sending first data to both the first screen and the second screen, and writing the first data into the first screen, so as to configure the first screen to be in a black-screen state and the second screen to be in a bright-screen state at least part of the time; and from the second moment to a third moment, sending second data to both the first screen and the second screen, and writing the second data into the second screen, so as to configure the second screen to be in a black-screen state and the first screen to be in a bright-screen state at least part of the time to display the first data.
- The method according to claim 1, wherein the time interval between the first moment and the second moment is equal to the time interval between the second moment and the third moment.
- The method according to claim 1, wherein the first data and the second data are configured as image data, the number of frames of image data transmitted per second is synchronized with the refresh rate of the first screen and the second screen, and at least one of the first moment, the second moment, and the third moment is a moment at which vertical synchronization signals of the first screen and the second screen are acquired.
- The method according to claim 1 or 2, wherein the first data comprises first full-frame data and the second data comprises second full-frame data, and the first full-frame data and the second full-frame data are configured as one of the following two: the first full-frame data and the second full-frame data are configured as the same image data; and the first full-frame data is configured as left-eye image data and the second full-frame data is configured as right-eye image data.
- The method according to claim 1 or 2, wherein the first data comprises first partial data of third full-frame data and the second data comprises second partial data of the third full-frame data, the first partial data and the second partial data constituting the third full-frame data, and the first partial data and the second partial data are configured as one of the following two: the first partial data and the second partial data are configured as the same image data; and the first partial data is configured as left-eye image data and the second partial data is configured as right-eye image data.
- The method according to claim 4 or 5, wherein the left-eye image data and the right-eye image data are configured as image data with parallax, so as to achieve a stereoscopic display effect.
- The method according to any one of claims 1-6, further comprising: acquiring a first pose of the user at a first preset moment, and calculating a first estimated pose of the user at a third preset moment based on the first pose, wherein the first preset moment is before the first moment, and the third preset moment is between the second moment and the third moment; acquiring a second pose of the user at a second preset moment, and calculating a second estimated pose of the user at the third preset moment based on the second pose, wherein the second preset moment is between the first preset moment and the first moment; and at the second preset moment, transforming at least a part of the first data based on the first estimated pose and the second estimated pose, so that the transformed first data corresponds to the pose of the user at the third preset moment.
- The method according to any one of claims 1-6, further comprising: acquiring a first pose of the user at a first preset moment, and calculating a first estimated pose of the user at a third preset moment based on the first pose, wherein the first preset moment is before the first moment, and the third preset moment is between the second moment and the third moment or is the third moment; acquiring a second pose of the user at a second preset moment, and calculating a second estimated pose of the user at the third preset moment based on the second pose, wherein the second preset moment is between the first preset moment and the first moment; and at the second preset moment, transforming at least a part of the first data and at least a part of the second data based on the first estimated pose and the second estimated pose, so that the transformed first data and second data correspond to the pose of the user at the third preset moment.
- The method according to any one of claims 1-6, further comprising: from the third moment to a fourth moment, configuring the first screen to be in a black-screen state and the second screen to be in a bright-screen state at least part of the time to display the second data; acquiring a first pose of the user at a first preset moment, and calculating, based on the first pose, a first estimated pose of the user at a third preset moment and a third estimated pose of the user at a fifth preset moment, wherein the first preset moment is before the first moment, the third preset moment is between the second moment and the third moment, and the fifth preset moment is between the third moment and the fourth moment; acquiring a second pose of the user at a second preset moment, and calculating a second estimated pose of the user at the third preset moment based on the second pose, wherein the second preset moment is between the first preset moment and the first moment; at the second preset moment, transforming at least a part of the first data based on the first estimated pose and the second estimated pose, so that the transformed first data corresponds to the pose of the user at the third preset moment; acquiring a third pose of the user at a fourth preset moment, and calculating a fourth estimated pose of the user at the fifth preset moment based on the third pose, wherein the fourth preset moment is between the first moment and the second moment; and at the fourth preset moment, transforming at least a part of the second data based on the third estimated pose and the fourth estimated pose, so that the transformed second data corresponds to the pose of the user at the fifth preset moment.
- The method according to any one of claims 1-9, comprising, before at least one of the following two operations: receiving a time-synchronization request sent by a master device connected to the head-mounted display device, the time-synchronization request including a first time of the master device and a time-synchronization protocol; acquiring a second time of the head-mounted display device; sending the first time and the second time to the master device; and acquiring a third time of the master device, so as to obtain, based on the first time, the second time, and the third time, the difference between the system time of the master device and the system time of the head-mounted display device, wherein the two operations include: sending the first data to both the first screen and the second screen and writing the first data into the first screen, to configure the first screen to be in a black-screen state and the second screen to be in a bright-screen state at least part of the time; and sending the second data to both the first screen and the second screen and writing the second data into the second screen, to configure the second screen to be in a black-screen state and the first screen to be in a bright-screen state at least part of the time to display the first data.
- The method according to any one of claims 1-10, further comprising one of the following two: controlling the first screen and the second screen to be in a black-screen state or a bright-screen state by sending signals to the first screen and the second screen; and setting the first screen and the second screen to be in a black-screen state or a bright-screen state according to a predetermined period.
- A head-mounted display device, comprising: a first screen and a second screen, the first screen and the second screen respectively corresponding to the two eyes of a user; and a screen control chip coupled to the first screen and the second screen, the screen control chip being configured to: from a first moment to a second moment, send first image data to both the first screen and the second screen, and write the first image data into the first screen, so as to configure the first screen not to display image data and the second screen to display the written image data; and from the second moment to a third moment, send second image data to both the first screen and the second screen, and write the second image data into the second screen, so as to configure the second screen not to display image data and the first screen to display the written first image data.
- The head-mounted display device according to claim 12, wherein the screen control chip comprises a buffer area, and the buffer area can store less than one full frame of image data.
- The head-mounted display device according to claim 12, comprising: a USB interface configured to connect to a master device to receive data of the master device, the data being transmitted to the screen control chip via the USB interface, and the data including the first image data and the second image data.
- A non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause a computer to execute the method according to any one of claims 1-11.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/557,592 US20240212536A1 (en) | 2021-05-19 | 2022-05-19 | Method for refreshing screen of head-mounted display device, and head-mounted display device |
EP22804024.2A EP4343407A1 (en) | 2021-05-19 | 2022-05-19 | Method for refreshing screen of head-mounted display device and head-mounted display device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110546795.6 | 2021-05-19 | ||
CN202110546795.6A CN113219668B (zh) | 2021-05-19 | 2021-05-19 | 用于刷新头戴式显示设备的屏幕的方法、装置及电子设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022242704A1 true WO2022242704A1 (zh) | 2022-11-24 |
Family
ID=77093144
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/093762 WO2022242704A1 (zh) | 2021-05-19 | 2022-05-19 | 用于刷新头戴式显示设备的屏幕的方法和头戴式显示设备 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240212536A1 (zh) |
EP (1) | EP4343407A1 (zh) |
CN (1) | CN113219668B (zh) |
WO (1) | WO2022242704A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113219668B (zh) * | 2021-05-19 | 2023-09-08 | 闪耀现实(无锡)科技有限公司 | 用于刷新头戴式显示设备的屏幕的方法、装置及电子设备 |
CN113835526A (zh) * | 2021-09-28 | 2021-12-24 | 青岛歌尔声学科技有限公司 | 显示设备的控制方法、显示设备及介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150341624A1 (en) * | 2011-02-10 | 2015-11-26 | Seiko Epson Corporation | Head-mounted display device and control method for the head-mounted display device |
CN106998409A (zh) * | 2017-03-21 | 2017-08-01 | 华为技术有限公司 | 一种图像处理方法、头戴显示器以及渲染设备 |
CN110221432A (zh) * | 2019-03-29 | 2019-09-10 | 华为技术有限公司 | 头戴式显示器的图像显示方法及设备 |
CN111885265A (zh) * | 2020-07-31 | 2020-11-03 | Oppo广东移动通信有限公司 | 屏幕界面调整方法及相关装置 |
CN113219668A (zh) * | 2021-05-19 | 2021-08-06 | 闪耀现实(无锡)科技有限公司 | 用于刷新头戴式显示设备的屏幕的方法、装置及电子设备 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8421807B2 (en) * | 2010-06-03 | 2013-04-16 | Chimei Innolux Corporation | Display device |
CN112015264B (zh) * | 2019-05-30 | 2023-10-20 | 深圳市冠旭电子股份有限公司 | 虚拟现实显示方法、虚拟现实显示装置及虚拟现实设备 |
- 2021-05-19: CN application CN202110546795.6A, patent CN113219668B, active
- 2022-05-19: EP application EP22804024.2A, patent EP4343407A1, pending
- 2022-05-19: WO application PCT/CN2022/093762, publication WO2022242704A1, application filing
- 2022-05-19: US application US18/557,592, publication US20240212536A1, pending
Also Published As
Publication number | Publication date |
---|---|
US20240212536A1 (en) | 2024-06-27 |
EP4343407A1 (en) | 2024-03-27 |
CN113219668A (zh) | 2021-08-06 |
CN113219668B (zh) | 2023-09-08 |
Legal Events

- 121: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22804024; Country: EP; Kind code: A1)
- WWE: WIPO information: entry into national phase (Ref document number: 18557592; Country: US)
- WWE: WIPO information: entry into national phase (Ref document number: 2022804024; Country: EP)
- NENP: Non-entry into the national phase (Ref country code: DE)
- ENP: Entry into the national phase (Ref document number: 2022804024; Country: EP; Effective date: 20231219)