EP3677994A1 - Text display method and device in virtual reality, and virtual reality apparatus - Google Patents
- Publication number
- EP3677994A1 (application EP18869124.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- text information
- eye
- user
- text
- per degree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G02B27/0093—Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G02B27/0172—Head mounted characterised by optical features
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional
- G06F40/10—Text processing
- G06F40/106—Display of layout of documents; Previewing
- G06F40/109—Font handling; Temporal or kinetic typography
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014—Head-up displays comprising information/image processing systems
- G06F3/013—Eye tracking input arrangements
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
Definitions
- This application relates to the field of virtual reality display, and more specifically, to a text display method and apparatus in virtual reality, and a virtual reality device.
- Display devices of VR (virtual reality) hardware include a VR head mounted display (head mounted display) device, a binocular omnidirectional display, and the like; for example, the VR display devices may include VR glasses and VR helmets. Users obtain visual virtual reality effects by wearing VR display devices.
- This application provides a text display method and apparatus in virtual reality, and a virtual reality device. They resolve the problem of unclear text display in current virtual reality environments and can be applied directly to various virtual reality platforms. They also resolve the problem that a developer cannot quickly obtain comfortable, readable font heights for different VR devices and different distances in virtual environments, thereby improving user experience and reducing development costs.
- According to a first aspect, a text display method in virtual reality includes: receiving text information; determining a font size of the text information based on a pixels per degree of a virtual reality (VR) device, a pixels per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value θ; rendering the text information based on the font size of the text information; and displaying the rendered text information.
- In this way, clear and readable font sizes corresponding to different VR devices and different distances may be determined based on the pixels per degree PPD_human-eye of the eye of the user, the pixels per degree PPD_VR-device of the VR device, the distance between the text information in virtual space and the eye of the user, and the correction value θ.
- For designers and developers of display texts in VR systems, this prevents the individual differences caused by judging a font effect through the existing visual observation manner. Therefore, fonts designed and developed for display texts in VR systems by using the method in this embodiment of this application are universal, and for most users the font effect is clear.
- Because the font size can be adjusted with different pixels per degree, it can adapt to VR devices of different specifications. This prevents differences between text effects of the text information when the text information is migrated between VR devices of different specifications, thereby improving user experience and reducing the difficulty and costs of such migration.
- Before the determining of a font size of the text information based on a pixels per degree of a virtual reality (VR) device, a pixels per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value θ, the method further includes: obtaining the distance information between the text information in the VR and the eye of the user.
- The text information includes the distance information between the text information in the VR and the eye of the user.
- The determining of the font size includes: determining the font size of the text information based on the relationship between the font size H_VR of the text information and the pixels per degree PPD_VR-device of the VR device, the pixels per degree PPD_human-eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value θ.
- The pixels per degree PPD_human-eye of the eye of the user is 60 pixels per degree.
- The correction value θ is the visual angle subtended at the eye of the user by the text height, and the value range of θ is greater than or equal to 18 degrees and less than or equal to 22 degrees.
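The relationship itself is not reproduced in this excerpt, so the following Python sketch is only an assumed illustration of how the named inputs could combine: the text height subtends the visual angle θ at distance d, and is then scaled up by the ratio of the 60-PPD retinal value to the device's PPD.

```python
import math

# Hedged sketch: the excerpt names the inputs (d, theta, the two PPD values)
# but not the exact formula; this particular combination is an assumption.
def font_height_vr(d, ppd_device, ppd_eye=60.0, theta_deg=20.0):
    """Estimate a readable font height in VR world units.

    d          -- distance from the eye to the text in the virtual space
    ppd_device -- pixels per degree of the VR device screen (e.g. 10-20)
    ppd_eye    -- pixels per degree of the human eye (60 per the text)
    theta_deg  -- correction visual angle theta (range 18-22 per the text)
    """
    # Height of an object subtending theta_deg at distance d.
    physical_height = 2.0 * d * math.tan(math.radians(theta_deg) / 2.0)
    # A lower-PPD screen draws fewer pixels per degree, so the font is
    # enlarged by the retinal-to-device PPD ratio to stay legible.
    return physical_height * (ppd_eye / ppd_device)
```

With this assumed form, a 12-PPD device gets a proportionally larger font than a 14-PPD device at the same distance, consistent with the migration behaviour the summary describes.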
- According to a second aspect, a text display apparatus in virtual reality includes a text parameter control module, a 3D rendering module, and a display module.
- The text parameter control module is configured to receive text information; the text parameter control module is further configured to determine a font size of the text information based on a pixels per degree of a virtual reality (VR) device, a pixels per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value θ;
- the 3D rendering module is configured to render the text information based on the font size of the text information; and the display module is configured to display the rendered text information.
- The text display apparatus in virtual reality may determine, based on the pixels per degree PPD_human-eye of the eye of the user, the pixels per degree PPD_VR-device of the VR device, the distance between the text information in virtual space and the eye of the user, and the correction value θ, clear and readable font sizes corresponding to different VR devices and different distances.
- Because the font size determined by the text display apparatus in virtual reality according to the second aspect is related to the pixels per degree of the VR device, differences between text effects of the text information when the text information is migrated between VR devices of different specifications are prevented, thereby improving user experience and reducing the difficulty and costs of such migration.
- Before determining the font size of the text information, the text parameter control module is further configured to obtain the distance information between the text information in the VR and the eye of the user.
- The text information includes the distance information between the text information in the VR and the eye of the user.
- The text parameter control module is specifically configured to determine the font size of the text information based on the relationship between the font size H_VR of the text information and the pixels per degree PPD_VR-device of the VR device, the pixels per degree PPD_human-eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value θ.
- The pixels per degree PPD_human-eye of the eye of the user is 60 pixels per degree.
- The correction value θ is the visual angle subtended at the eye of the user by the text height, and the value range of θ is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- According to a third aspect, a text display apparatus in virtual reality includes a processor and a memory that are configured to support the apparatus in implementing the corresponding functions in the foregoing method.
- the memory is configured to store a program, and the processor is configured to invoke the program to implement the text display method in virtual reality according to the first aspect and the implementations of the first aspect.
- A VR device includes the text display apparatus in virtual reality according to any one of the second aspect or the third aspect and the possible implementations of the second aspect or the third aspect.
- The VR device may perform the text display method in virtual reality according to the first aspect. That is, the clear and readable font sizes corresponding to the different VR devices and the different distances are determined based on the pixels per degree PPD_human-eye of the eye of the user, the pixels per degree PPD_VR-device of the VR device, the distance between the text information in the virtual space and the eye of the user, and the correction value θ. For most users, when such a VR device is used, the viewed fonts or images are clear, thereby improving the user's VR experience.
- A computer-readable storage medium is configured to store a computer program, where the computer program includes an instruction used to perform the method according to any one of the first aspect or the possible implementations of the first aspect.
- A system chip includes a processing unit and a communications unit. The processing unit may be, for example, a processor, and the communications unit may be an input/output interface, a pin, a circuit, or the like.
- The processing unit may execute computer instructions, so that a chip in the terminal performs the text display method in virtual reality according to the first aspect.
- Embodiments of this application relate to the following key terms.
- Retinal resolution is also referred to as the pixels per degree of a human eye.
- Pixels per degree (pixels per degree, PPD) is the quantity of pixels contained within each degree of the visual angle.
- A larger quantity of pixels per degree indicates that a display screen can display an image at a higher density.
- For head mounted displays, achieving "retinal resolution" is the ultimate goal. Beyond a particular pixel density, even people with perfect visual acuity cannot distinguish any extra detail. In terms of visual quality, any screen with more than 60 pixels per degree is actually wasteful, because human eyes cannot distinguish the extra detail. This is referred to as the retinal resolution limit of human eyes.
- The pixels per degree of a head mounted display screen refers to the quantity of pixels contained within each degree of the visual field on the screen of a head mounted display device.
- The pixels per degree of the screen of an existing VR head mounted display device is 10 to 20 pixels per degree.
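As a quick sanity check, a screen's nominal pixels per degree can be approximated as the horizontal pixel count for one eye divided by the horizontal field of view; the split of the panel between the two eyes and the neglect of lens distortion (which makes real PPD nonuniform) are simplifying assumptions.

```python
def screen_ppd(pixels_per_eye, fov_degrees):
    """Approximate pixels per degree: pixels across the FOV / FOV angle."""
    return pixels_per_eye / fov_degrees

# Helmet from the description below: 2160 x 1080 panel, assumed split as
# 1080 horizontal pixels per eye, 90-degree field of view.
oculus_like_ppd = screen_ppd(1080, 90)  # 12 PPD, consistent with the text
```

The result, 12 PPD, is well below the 60-PPD retinal limit, which is why careful font sizing matters on such devices.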
- A three-dimensional (three dimensions, 3D) engine is a collection of algorithms that abstracts real-world materials into representations such as polygons or various curves, performs the related computation in a computer, and outputs the final images.
- As an underlying tool, the 3D engine supports higher-layer graphics software development; the 3D engine is like building a "real world" inside the computer.
- A three-dimensional model refers to a polygon representation of objects, and is usually displayed by using a computer or other video devices. The displayed objects may be entities in the real world, or may be fictional objects.
- The three-dimensional model is often generated by using dedicated software such as a three-dimensional modeling tool, but it may also be generated by using other methods. As a collection of point data and other information, the three-dimensional model may be generated manually or by a particular algorithm.
- Material mapping is also referred to as texture mapping.
- Material mapping wraps bitmaps stored in main memory around the surface of a 3D rendered object. Texture mapping provides abundant details for an object and simulates a complex appearance in a simple manner: an image (texture) is pasted onto (mapped to) a simple shape in the scene, like pasting a print onto a plane.
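The UV lookup at the heart of texture mapping can be sketched in a few lines; the function below is a minimal nearest-neighbour illustration for intuition, not an engine's actual sampler (real samplers add filtering, wrapping modes, and mipmapping).

```python
def sample_texture(texture, u, v):
    """Nearest-neighbour texture lookup.

    texture -- 2D list of texels (the bitmap stored in main memory)
    u, v    -- texture coordinates in [0, 1] assigned to a surface point
    """
    height = len(texture)
    width = len(texture[0])
    # Clamp so u = 1.0 or v = 1.0 maps to the last texel instead of overflowing.
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]
```

During rasterization, each surface point carries interpolated (u, v) coordinates, so "pasting" the image onto the shape reduces to many such lookups.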
- Oculus is a VR company acquired by Facebook; it owns VR content platforms and VR helmets and is a representative VR company in the industry.
- The resolution of its helmet is 2160 × 1080, the field of view is 90 degrees, and the helmet is connected to a computer.
- Google Daydream is a platform consolidating Android (Android) smartphones and a new generation of virtual reality head mounted devices.
- In addition to Google's own Daydream View head mounted devices, smartphone manufacturers can also develop and design head mounted devices, as long as the head mounted devices meet Google's standards.
- FIG. 1 is a schematic diagram of basic components of a virtual reality system.
- The basic components of the virtual reality system include an observer, a sensor module, an effect generator (including a detection module, a feedback module, and a control module), and a real view emulator (including a virtual environment, a modeling module, and a display environment).
- The VR display system is usually a head mounted VR display system, for example, a head mounted display (Head Mounted Display, HMD).
- Virtual reality systems may be classified, based on their different functions, into four types: an immersive virtual reality system, an augmented reality virtual reality system, a desktop virtual reality system, and a distributed virtual reality system.
- The immersive virtual reality system provides the user with a completely immersive experience, so that the user has the feeling of being in a virtual world.
- An obvious feature of the immersive virtual reality system is that the user's visual and auditory channels are closed off by using a head mounted display, to create virtual vision.
- In addition, a data glove is used in the immersive virtual reality system to close the user's tactile channel, to create a virtual tactile sensation.
- The system enables, by using a voice recognizer, the user to deliver operation commands to the system host; meanwhile, a head tracker, a hand tracker, and an eye gaze direction tracker track the head, hands, and eyes, so that the system achieves real-time performance as much as possible.
- A common immersive system is based on the head mounted display, and forms stereopsis by using the binocular parallax between the left eye and the right eye.
- A VR display system (device) on the user side generally includes a modeling module.
- The modeling module is configured to build the three-dimensional objects required in the virtual environment, for example, texts, cards, object models, and three-dimensional environments.
- A graphical user interface (Graphic User Interface, GUI) in the modeling module displays all buttons, menus, texts, cards, and the like.
- the GUI is also responsible for providing a reliable logical module for functions and interaction of the elements.
- The logical module includes a text parameter control module and a 3D rendering module.
- The text parameter control module is mainly configured to control the size of displayed texts, for example, the text height and the text width.
- The 3D rendering module mainly manages the entire 3D engine: the main camera of a VR scene determines which objects are to be rendered, the 3D engine encapsulates the rendering details and sends the objects and details to the 3D rendering module through a rendering pipeline, and access can further be provided through pixel and vertex shaders.
- The modeling module may further include other functional modules or units, for example, an input module and an artificial intelligence module. This is not limited in this embodiment of this application herein.
- FIG. 2 is a schematic flowchart of designing and developing display texts in a VR system in the prior art.
- A common practice in the industry is to set font parameters (a font size, a color, transparency, and the like) in the 3D engine of a modeling module, and then output a program file that can run in a VR device (for example, a glasses terminal).
- A developer determines the font effect through visual observation with the glasses terminal; if the font can be viewed clearly, the next functional module is developed; otherwise, the parameters are adjusted and the test is repeated.
- Pixels per degree (PPD) of the screens of different VR terminal devices are also inconsistent; for example, the pixels per degree of the screen of an Oculus terminal device is 12, and that of a Google Daydream terminal device is 14.
- If the developer experiments on a VR terminal device of one specification and then uses information such as the font setting parameters that produce a good effect on that device in a VR device of another specification, the display effects of the same content differ, because the screens of the different devices have different pixels per degree.
- For example, a font configured with the same font setting parameter may be just clear on the Google Daydream terminal device but unclear on the Oculus terminal device.
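The mismatch can be made concrete: for a font configured to subtend a fixed visual angle, the number of screen pixels available to draw it is that angle times the device PPD, so a lower-PPD screen renders the same font with fewer pixels and therefore less sharply. The 2-degree font height below is an illustrative value, not one taken from the text.

```python
def pixels_covering_font(angle_degrees, ppd_device):
    """Screen pixels available to draw a font spanning angle_degrees."""
    return angle_degrees * ppd_device

# Same 2-degree-high font on the two example screens described above:
daydream_px = pixels_covering_font(2.0, 14)  # 28 pixels
oculus_px = pixels_covering_font(2.0, 12)    # 24 pixels: visibly coarser glyphs
```

This is why a font tuned by eye on a 14-PPD device cannot simply be reused on a 12-PPD device.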
- Therefore, an embodiment of this application provides a text display method in virtual reality. It resolves the problem of unclear text display in current virtual reality environments and can be applied directly to various virtual reality platforms. It also resolves the problem that a developer cannot quickly obtain comfortable, readable font heights for different VR devices and different distances in virtual environments, thereby improving user experience and reducing development costs.
- FIG. 3 is a schematic diagram of a typical system architecture to which an embodiment of this application is applied.
- The system includes a server, a modeling module, an open graphics library for embedded systems (open graphics library embedded systems, OpenGL ES) module, a graphics processing unit (Graphics Processing Unit, GPU) module, and a display module.
- The modeling module, the OpenGL ES module, the GPU module, and the display module belong to the virtual reality system on the user side, and are disposed in a VR device.
- The server is configured to send text information to the modeling module in the virtual reality system on the user side. The modeling module is configured to model the text information according to the text display method in virtual reality provided in the embodiments of this application, that is, to build the three-dimensional objects required in the virtual environment, for example, texts, cards, object models, and three-dimensional environments.
- The modeling module may include a text parameter control module, a 3D rendering module, and the like.
- The OpenGL ES module is configured to compile the data output from the modeling module, convert the data into application programming interface (Application Programming Interface, API) data that can be recognized by the GPU module, and send the data to the GPU module.
- The GPU module is configured to perform image rendering, text rendering, and the like based on the received data.
- The display module is configured to display the image rendered by the GPU module to the user.
- FIG. 4 is a schematic flowchart of a text display method 100 in virtual reality according to an embodiment of this application.
- The method 100 may be applied to the system architecture shown in FIG. 3, and certainly may alternatively be applied to other similar system architectures. This is not limited in this embodiment of this application herein.
- The method 100 includes the following steps: receiving text information; determining a font size of the text information based on the pixels per degree of the VR device, the pixels per degree of the eye of the user, the distance information between the text information in VR and the eye of the user, and the correction value θ; rendering the text information based on the font size; and displaying the rendered text information.
- In this way, clear and readable font sizes corresponding to different VR devices and different distances may be determined based on the pixels per degree PPD_human-eye of the eye of the user, the pixels per degree PPD_VR-device of the VR device, the distance between the text information in virtual space and the eye of the user, and the correction value θ.
- For designers and developers of display texts in VR systems, this prevents the individual differences caused by judging a font effect through the existing visual observation manner. Therefore, fonts designed and developed for display texts in VR systems by using the method in this embodiment of this application are universal, and for most users the font effect is clear.
- Because the font size can be adjusted with different pixels per degree, it can adapt to VR devices of different specifications. This prevents differences between text effects of the text information when the text information is migrated between VR devices of different specifications, thereby improving user experience and reducing the difficulty and costs of such migration.
- The VR device may be any of various types of VR devices, for example, a head mounted display or VR glasses. This is not limited in this embodiment of this application herein.
- The server sends the text information to the VR device. A type of the text information may be a text file (text file, TXT), or may be another format type used to express the text information.
- The text information may further include other information related to the text, for example, the distance information between the text information in the virtual space and the eye of the user. This is not limited in this embodiment of this application herein.
- The server may further send image information or other types of information to the VR device. This is not limited in this embodiment of this application herein.
- a text parameter control module in the VR device determines, based on the received text information, the content of the text and other related information, for example, distance information between the text information in a virtual environment and the eye of the user, and determines the font size of the text information based on the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α.
- a height, a transparency, a color, and the like of a font in the text information may be determined. This is not limited in this embodiment of this application herein.
- the text parameter control module in the VR device may likewise determine a size of other received types of information such as an image, based on distance information between that information and the eye of the user, and with reference to the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance information between the image information in the VR and the eye of the user, and the correction value α.
- This is not limited in this embodiment of this application herein.
- the text parameter control module in the VR device sends a rendering instruction to a 3D rendering module in the VR device, where the rendering instruction includes data such as the determined font size of the text information.
- the 3D rendering module in the VR device renders each text in the text information based on the data obtained in the previous step, such as the font size of the text information (for example, text height information), to obtain the rendered text information, and sends the rendered text information to a display module in the VR device.
- the 3D rendering module in the VR device may further render a height of information such as the image based on size data of the image information obtained in the previous step, for example, a height, to obtain the rendered image information, and send the rendered image information to the display module in the VR device.
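The handoff described in the preceding steps — the text parameter control module issuing a rendering instruction that carries the computed font size, the 3D rendering module rendering it, and the display module showing the result — can be sketched as follows. All class and field names here are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class RenderingInstruction:
    """Data passed from the text parameter control module to the 3D rendering module."""
    text: str
    font_height: float  # determined font size, in scene units

class TextParameterControl:
    def build_instruction(self, text: str, font_height: float) -> RenderingInstruction:
        # In the patent, font_height is computed from the device PPD, the eye PPD,
        # the text-to-eye distance, and the correction value alpha.
        return RenderingInstruction(text=text, font_height=font_height)

class Renderer3D:
    def render(self, instruction: RenderingInstruction) -> dict:
        # Stand-in for real 3D text rendering: produce a record the display module can show.
        return {"text": instruction.text, "height": instruction.font_height}

class Display:
    def show(self, rendered: dict) -> str:
        return f"{rendered['text']} @ height {rendered['height']}"

# Wire the three modules together, mirroring the GUI module of FIG. 5.
control, renderer, display = TextParameterControl(), Renderer3D(), Display()
instruction = control.build_instruction("Hello VR", 0.05)
print(display.show(renderer.render(instruction)))
```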
- FIG. 5 is a schematic block diagram of a modeling module structure in the VR device.
- the modeling module in the VR device includes a GUI module
- the GUI module includes the text parameter control module and the rendering module.
- the modeling module may further include other functional modules, for example, an input module and an artificial intelligence module.
- the GUI module may further include other functional units or modules. This is not limited in this embodiment of this application herein.
- the display module in the VR device may display the rendered text information or image to the user.
- a font size of the rendered text information or image adapts to the VR device worn by the user. Therefore, the user can view a text in the text information clearly and has better user experience.
- the text information includes the distance information between the text information in the VR and the eye of the user.
- the server may obtain distance information between the text information in a VR environment and the eye of the user, and send the distance information and the text information to the VR device. In this way, a size of the displayed font can be more accurately determined, and a workload of the VR device is reduced, so that display work efficiency of the VR device is improved.
- the text information may include the distance information
- the server may separately send the distance information to the VR device. This is not limited by this embodiment of this application herein.
- before S120 in the method 100, that is, before the determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α, the method 100 further includes the following step: S111: The VR device obtains the distance information between the text information in the VR and the eye of the user.
- the modeling module in the VR device may obtain the distance information between the text information in the VR and the eye of the user for the user in real time, and determine the font size of the displayed text information by using the updated distance information.
- the font size of the text information in the virtual environment may be controlled in real time based on user location information, and a suitable font size may be determined. In this way, the font size corresponding to the user location can be more accurately determined, that is, the font size can be adjusted in real time, thereby further improving user experience.
- the modeling module in the VR device may also obtain the distance information between the text information in the VR and the eye of the user. This is not limited in this embodiment of this application herein.
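In the simplest case, the distance obtained here is the straight-line distance in the virtual scene between the eye position and the text anchor. A minimal sketch (the function name and the coordinate values are illustrative assumptions, not from the patent):

```python
import math

def eye_to_text_distance(eye_pos, text_pos):
    # Euclidean distance in the virtual scene between the user's eye
    # and the anchor point of the text information.
    return math.dist(eye_pos, text_pos)

# Eye at head height, text panel 2 scene units straight ahead:
print(eye_to_text_distance((0.0, 1.6, 0.0), (0.0, 1.6, 2.0)))  # -> 2.0
```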
- H VR is the font size of the text information
- PPD Human eye is the pixel per degree of the eye of the user
- PPD VR device is the pixel per degree of the VR device
- d is a distance between the text information in the VR and the eye of the user
- α is the correction value.
- the text parameter control module may determine the font size of the text information based on the formula (1).
- the modeling module in the VR device may determine, based on the formula (1), the font size H VR displayed in the VR device.
- PPD VR device is a pixel per degree of a screen of the VR device, pixels per degree of screens of VR devices of different specifications may be different
- PPD Human eye is the pixel per degree of the eye of the user
- d is the distance between the text information in the virtual environment and the eye of the user
- α is the correction value.
- the modeling module in the VR device may determine, by using the formula (1), the font size of the text information sent by the server, and render the font based on the font size. Finally, the rendered text information is displayed to the user, so that the user can view text content in the VR device clearly.
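A minimal Python sketch of this determination, assuming formula (1) has the form H_VR = d · tan α · (PPD_Human eye / PPD_VR device) — the formula's exact form is not reproduced in this text, and the function name, the unit of d, and the sample PPD values are assumptions rather than values from the patent:

```python
import math

def vr_font_height(d: float, ppd_vr_device: float,
                   ppd_human_eye: float = 60.0, alpha_deg: float = 20.0) -> float:
    """Assumed form of formula (1): the height that subtends the visual angle
    alpha at distance d, enlarged by the eye-to-device pixel-per-degree ratio."""
    return d * math.tan(math.radians(alpha_deg)) * (ppd_human_eye / ppd_vr_device)

# The same text rendered on a lower-PPD headset gets a larger height,
# and the height grows linearly with the text-to-eye distance:
for ppd in (11.0, 15.0, 20.0):        # illustrative device PPD values
    for d in (0.5, 1.0, 2.0):         # text-to-eye distance (assumed metres)
        print(f"PPD={ppd:>4} d={d} -> H_VR={vr_font_height(d, ppd):.3f}")
```

Under this form, halving the device's pixels per degree doubles the rendered height, which is what keeps the text equally legible when content migrates between devices of different specifications.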
- the correction value α is the visual angle subtended at the eye of the user by the text height, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- FIG. 7 is a diagram of a relationship between a font height and a distance between a human eye and a text.
- α is the visual angle subtended by the text height at the eye of the user, and may also be referred to as an image and text visual angle. Based on an international human-machine design specification, it may be concluded that when the value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees, a font viewed by a human is clear and meets comfort requirements of human vision.
- α may be less than 18 degrees or greater than 22 degrees. This is not limited in this embodiment of this application herein.
- the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- any screen with more than 60 pixels per degree is effectively wasted; that is, the limit at which human eyes can distinguish details is a screen pixel density of 60 pixels per degree. Therefore, the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- a calculation amount of the VR device is reduced and efficiency of the VR device is improved.
- a value of the pixel per degree PPD Human eye of the eye of the user may be less than 60 pixels per degree, for example, may be 50 pixels per degree. This is not limited in this embodiment of this application herein.
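To make the two pixel-per-degree quantities concrete, the device figure can be approximated from its per-eye panel resolution and field of view. This is a uniform-lens simplification that ignores distortion, and the numbers below are illustrative, not taken from the patent:

```python
def panel_ppd(pixels_across: int, fov_degrees: float) -> float:
    # Average pixels per degree across the field of view (uniform approximation,
    # ignoring lens distortion toward the edges).
    return pixels_across / fov_degrees

PPD_HUMAN_EYE = 60.0  # the 60-pixels-per-degree acuity limit cited above

# e.g. a 1080-pixel-wide eye buffer spread over a 100-degree field of view:
ppd_device = panel_ppd(1080, 100.0)
print(ppd_device)                      # device pixels per degree
print(PPD_HUMAN_EYE / ppd_device)      # enlargement ratio relative to the eye
```

Current headsets sit well below the 60 PPD eye limit, which is why the eye-to-device ratio in the font-size formula is an enlargement factor.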
- the text parameter control module may determine the font size of the text information based on the formula (3).
- the relationship between the font size H VR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α may also satisfy other function formulas, or arbitrary equivalent deformations of the foregoing formulas.
- the text parameter control module may determine the font size of the text information based on the other function formulas or the arbitrarily deformed versions of the foregoing formulas. This is not limited in this embodiment of this application herein.
- the method 200 includes the following steps:
- clear and readable font sizes corresponding to different VR devices and different distance information are determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between the text information in virtual space and the eye of the user, and a correction value α; the text in the text information is rendered based on the font size; and the rendered text information is displayed to the user through the VR device, to prevent a case in which there are differences between text effects of the text information in a process of migrating the text information between VR devices of different specifications, thereby improving user experience and reducing difficulty and costs of migrating the text information between the VR devices of different specifications.
- the method 300 includes the following steps:
- the font size displayed to the user can be adjusted in real time based on a distance between the text information in virtual space and the eye of the user and a pixel per degree of a screen of the VR device used by the user, so that the user can view the text information in the virtual environment clearly, thereby improving user experience.
- sequence numbers of the foregoing processes and steps do not imply a sequence of execution.
- the sequence of execution of the processes should be determined based on functions and inherent logic of the processes, and should not impose any limitations on implementation processes of the embodiments of this application.
- the text display method in virtual reality according to the embodiments of this application is described in detail above with reference to FIG. 1 to FIG. 10 , and the following describes a text display apparatus in virtual reality according to embodiments of this application in detail with reference to FIG. 11 and FIG. 12 .
- FIG. 11 is a schematic block diagram of a text display apparatus 400 in virtual reality according to an embodiment of this application.
- the text display apparatus 400 in virtual reality includes a text parameter control module 410, a 3D rendering module 420, and a display module 430.
- the text parameter control module 410 is configured to receive text information.
- the text parameter control module 410 is further configured to determine a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α.
- the 3D rendering module 420 is configured to render the text information based on the font size of the text information.
- the display module 430 is configured to display the rendered text information.
- the text display apparatus may determine, based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between text information in virtual space and the eye of the user, and the correction value α, clear and readable font sizes corresponding to different VR devices and different distances.
- the font size can be adjusted with the different pixels per degree, that is, can adapt to the VR devices of different specifications, to prevent a case in which there are differences between text effects of the text information in a process of migrating the text information between the VR devices of different specifications, thereby improving user experience and reducing difficulty and costs of migrating the text information between the VR devices of different specifications.
- the text parameter control module 410 and the 3D rendering module 420 are disposed in a modeling module, the modeling module is disposed in the VR device, and the display module 430 is also disposed in the VR device.
- the text parameter control module is further configured to obtain the distance information between the text information in the VR and the eye of the user.
- the text information includes the distance information between the text information in the VR and the eye of the user.
- the text parameter control module is specifically configured to determine the font size of the text information based on the relationship between the font size H VR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α.
- the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- the correction value α is the visual angle subtended at the eye of the user by the text height, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- FIG. 12 is a schematic block diagram of a text display apparatus 500 in virtual reality according to an embodiment of this application.
- the apparatus 500 includes a memory 510 and a processor 520.
- the memory 510 and the processor 520 communicate with each other through an internal connection path, to transfer a control and/or data signal.
- the memory 510 is configured to store program codes.
- the processor 520 is configured to invoke the program code to implement the methods in the embodiments of this application.
- the text display apparatus 500 in virtual reality shown in FIG. 12 may implement the processes implemented in the embodiments in FIG. 4 , FIG. 6 , FIG. 8 , and FIG. 9 , and for brevity, details are not described herein again.
- the processor 520 in this embodiment of this application may be a central processing unit (central processing unit, CPU), or the processor 520 may be another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application-specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA), or another programmable logical device, discrete gate or transistor logical device, discrete hardware component, or the like.
- the general-purpose processor may be a microprocessor or the processor may be any conventional processor and the like.
- the memory 510 may include a read-only memory and a random access memory, and provide an instruction and data to the processor 520. A part of the memory 510 may further include a non-volatile random access memory. For example, the memory 510 may further store information of a device type. It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, refer to a corresponding process in the foregoing method embodiments, and details are not described herein again.
- steps in the foregoing methods can be implemented by using a hardware integrated logical circuit in the processor 520, or by using an instruction in a form of software.
- the steps of the text display method in virtual reality disclosed with reference to the embodiments of this application may be directly performed and completed by a hardware processor, or may be performed and completed by using a combination of hardware and software modules in the processor 520.
- the software module may be located in a storage medium.
- the storage medium is located in the memory 510, and the processor 520 reads information in the memory 510 and completes the steps in the foregoing methods in combination with hardware of the processor. To avoid repetition, details are not described herein again.
- the memory in the embodiments of this application may be a volatile memory or a non-volatile memory, or may include a volatile memory and a non-volatile memory.
- the non-volatile memory may be a read-only memory (read-only memory, ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or a flash memory.
- the volatile memory may be a random access memory (random access memory, RAM), used as an external cache. It should be noted that the memory of the systems and methods described in this specification includes but is not limited to these and any other memory of a proper type.
- An embodiment of this application further provides a virtual reality VR device, including any one of the text display apparatuses in virtual reality provided in the foregoing embodiments of this application.
- the virtual reality VR device provided in this embodiment of this application may implement the text display methods in virtual reality in the embodiments of this application in FIG. 4 , FIG. 6 , FIG. 8 , and FIG. 9 . That is, the clear and readable font sizes corresponding to the different VR devices and the different distances are determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, the distance between the text information in the virtual space and the eye of the user, and the correction value α. For most users, when the VR device with the font is used, an effect of a viewed font or image is clear, thereby improving user VR experience.
- the VR device provided in this embodiment of this application may be various types of VR devices, for example, may be a head mounted display or VR glasses. This is not limited in this embodiment of this application herein.
- An embodiment of this application further provides a computer-readable medium, configured to store computer program code.
- the computer program includes an instruction used to implement the text display methods in virtual reality in the embodiments of this application in FIG. 4 , FIG. 6 , FIG. 8 , and FIG. 9 .
- the readable medium may be a ROM or a RAM, and is not limited in this embodiment of this application herein.
- An embodiment of this application further provides a system chip.
- the system chip includes a processing unit and a communications unit.
- the processing unit may be, for example, a processor, and the communications unit may be an input/output interface, a pin, a circuit, or the like.
- the processing unit may execute a computer instruction, so that a chip in the terminal performs any one of the text display methods in virtual reality.
- the computer instruction is stored in a storage unit.
- the storage unit is a storage unit in the chip, for example, a register or a cache.
- the storage unit may alternatively be a storage unit located outside the chip in the terminal, for example, a ROM or other types of static storage devices that can store static information and a static instruction, or a RAM.
- Any one of the processors mentioned above may be a CPU, a microprocessor, an ASIC, or one or more integrated circuits configured to control program execution of the text display method in virtual reality.
- the disclosed system, apparatus, and method may be implemented in other manners.
- the described apparatus embodiment is merely an example.
- the unit division is merely logical function division and may be other division during actual implementation.
- a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed.
- the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces.
- the indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
- the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual requirements to achieve the objectives of the solutions of the embodiments in this application.
- When the functions are implemented in the form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or some of the technical solutions may be implemented in a form of a software product.
- the computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application.
- the foregoing storage medium includes: any medium that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
Abstract
Description
- This application claims priority to Chinese Patent Application No. 201710975902.0 .
- This application relates to the field of virtual reality display, and more specifically, to a text display method and apparatus in virtual reality, and a virtual reality device.
- Virtual reality (virtual reality, VR) creates, through computer simulation, a virtual world in three-dimensional space, and provides a user with simulation of senses such as vision, an auditory sensation, and a tactile sensation, so that the user can observe things in the three-dimensional space in a timely manner and without limitation, as if the user were immersed in the scene. Display devices of VR hardware include a VR head mounted display (head mounted display) device, a binocular omnidirectional display, and the like. Users obtain visual virtual reality effects by wearing VR display devices. For example, the VR display devices may include VR glasses, VR helmets, and the like.
- Currently, limited by VR hardware performance, the most common problems in the VR experience process are that users' eyes become tired and that texts or images in a VR scene cannot be viewed clearly. In particular, when users view text information in the VR experience process, smaller texts are displayed less clearly, greatly degrading user experience. How to quickly find a smallest comfortable readable font and improve user experience has become a problem needing to be urgently resolved in the industry.
- This application provides a text display method and apparatus in virtual reality, and a virtual reality device that can resolve a problem of unclear text display in a current virtual reality environment, and can be directly applied to various virtual reality platforms; and can also resolve a problem that in virtual environments, a developer cannot quickly obtain various comfortable and readable font heights in different VR devices and at different distances in the virtual environments, thereby improving user experience and reducing development costs.
- According to a first aspect, a text display method in virtual reality is provided. The method includes: receiving text information; determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α; rendering the text information based on the font size of the text information; and displaying the rendered text information.
- According to the text display method in virtual reality according to the first aspect, clear and readable font sizes corresponding to different VR devices and different distances may be determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between text information in virtual space and the eye of the user, and the correction value α. For designers and developers of display texts in VR systems, individual differences caused by determining a font effect by using existing visual observation manners can be prevented. Therefore, fonts that are designed and developed by using the method in this embodiment of this application for the display texts in the VR systems are universal, and for most users, the font effect is clear. In addition, for VR devices of different specifications, when same content is migrated between VR devices with different pixels per degree, because the text display method in virtual reality according to the first aspect is related to the pixel per degree of the VR device, the font size can be adjusted with the different pixels per degree, that is, can adapt to the VR devices of different specifications, to prevent a case in which there are differences between text effects of the text information in a process of migrating the text information between the VR devices of different specifications, thereby improving user experience and reducing difficulty and costs of migrating the text information between the VR devices of different specifications.
- In a possible implementation of the first aspect, before the determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α, the method further includes: obtaining the distance information between the text information in the VR and the eye of the user.
- In a possible implementation of the first aspect, the text information includes the distance information between the text information in the VR and the eye of the user.
- In a possible implementation of the first aspect, a relationship between the font size of the text information and the pixel per degree of the VR device, the pixel per degree of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α is:
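The formula itself did not survive extraction here (the line ends at "is:"). A plausible reconstruction from the surrounding definitions of H VR, PPD VR device, PPD Human eye, d, and α — an assumption, not a form confirmed by the source — is:

```latex
H_{VR} = d \cdot \tan(\alpha) \cdot \frac{PPD_{\mathrm{Human\;eye}}}{PPD_{\mathrm{VR\;device}}}
```

Read this as the physical height d tan α that subtends the visual angle α at distance d, enlarged by the ratio of the eye's pixel per degree to the device's, so that text on a lower-resolution device is scaled up to remain legible.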
- In a possible implementation of the first aspect, the determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α includes: determining the font size of the text information based on the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α.
- In a possible implementation of the first aspect, the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- In a possible implementation of the first aspect, the correction value α is a high corresponding visual angle formed by a text height and the eye of the user, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- According to a second aspect, a text display apparatus in virtual reality is provided. The apparatus includes a text parameter control module, a 3D rendering module, and a display module. The text parameter control module is configured to receive text information; the text parameter control module is further configured to determine a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α; the 3D rendering module is configured to render the text information based on the font size of the text information; and the display module is configured to display the rendered text information.
- The text display apparatus in virtual reality according to the second aspect may determine, based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between text information in virtual space and the eye of the user, and the correction value α, clear and readable font sizes corresponding to different VR devices and different distances. For designers and developers of display texts in VR systems, individual differences caused by determining a font effect by using existing visual observation manners can be prevented. Therefore, fonts that are designed and developed by using the apparatus in this embodiment of this application for the display texts in the VR systems are universal, and for most users, the font effect is clear. In addition, for VR devices of different specifications, when same content is migrated between VR devices with different pixels per degree, because the text display apparatus in virtual reality according to the second aspect is related to the pixel per degree of the VR device, a case in which there are differences between text effects of the text information in a process of migrating the text information between the VR devices of different specifications is prevented, thereby improving user experience and reducing difficulty and costs of migrating the text information between the VR devices of different specifications.
- In a possible implementation of the second aspect, before determining the font size of the text information, the text parameter control module is further configured to obtain the distance information between the text information in the VR and the eye of the user.
- In a possible implementation of the second aspect, the text information includes the distance information between the text information in the VR and the eye of the user.
- In a possible implementation of the second aspect, a relationship between the font size of the text information and the pixel per degree of the VR device, the pixel per degree of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α is:
- In a possible implementation of the second aspect, the text parameter control module is specifically configured to determine the font size of the text information based on the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α.
- In a possible implementation of the second aspect, the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- In a possible implementation of the second aspect, the correction value α is the visual angle subtended at the eye of the user by the text height, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- According to a third aspect, a text display apparatus in virtual reality is provided. The apparatus includes a processor and a memory that are configured to support the apparatus in implementing corresponding functions in the foregoing method. The memory is configured to store a program, and the processor is configured to invoke the program to implement the text display method in virtual reality according to the first aspect and the implementations of the first aspect.
- According to a fourth aspect, a VR device is provided. The VR device includes the text display apparatus in virtual reality according to any one of the second aspect or the third aspect and the possible implementations of the second aspect or the third aspect.
- The VR device according to the fourth aspect may perform the text display method in virtual reality according to the first aspect. That is, the clear and readable font sizes corresponding to the different VR devices and the different distances are determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, the distance between the text information in the virtual space and the eye of the user, and the correction value α. For most users, when the VR device with the font is used, an effect of a viewed font or image is clear, thereby improving user VR experience.
- According to a fifth aspect, a computer-readable storage medium is provided. The computer-readable storage medium is configured to store a computer program, where the computer program includes an instruction used to perform the method according to any one of the first aspect or the possible implementations of the first aspect.
- According to a sixth aspect, a system chip is provided. The system chip includes a processing unit and a communications unit, where the processing unit may be, for example, a processor, and the communications unit may be an input/output interface, a pin, a circuit, or the like. The processing unit may execute a computer instruction, so that a chip in the terminal performs the text display method in virtual reality according to the first aspect.
FIG. 1 is a schematic diagram of basic components of a virtual reality system;
FIG. 2 is a schematic flowchart of designing and developing display texts in a VR system in the prior art;
FIG. 3 is a schematic diagram of a typical system architecture to which an embodiment of this application is applied;
FIG. 4 is a schematic flowchart of a text display method in virtual reality according to an embodiment of this application;
FIG. 5 is a schematic block diagram of a modeling module structure in a VR device;
FIG. 6 is a schematic flowchart of a text display method in virtual reality according to another embodiment of this application;
FIG. 7 is a diagram of a relationship between a font height and a distance between a human eye and a text;
FIG. 8 is a schematic flowchart of a text display method in virtual reality according to another embodiment of this application;
FIG. 9 is a schematic flowchart of a text display method in virtual reality according to still another embodiment of this application;
FIG. 10 is a schematic diagram of a font height displayed in a VR device according to an embodiment of this application;
FIG. 11 is a schematic block diagram of a text display apparatus in virtual reality according to an embodiment of this application; and
FIG. 12 is a schematic block diagram of a text display apparatus in virtual reality according to another embodiment of this application.
- The following describes technical solutions of this application with reference to accompanying drawings.
- Embodiments of this application relate to the following key terms.
- Retinal resolution: Retinal resolution is also referred to as a pixel per degree of a human eye. A pixel per degree (pixel per degree, PPD) is a quantity of pixels included in each degree of visual angle. A larger quantity of pixels per degree indicates that a display screen can display an image at a higher density. For head mounted displays, achieving "retinal resolution" is an ultimate goal. At a particular pixel density, even people with perfect visual acuity cannot distinguish any extra details. For visual quality, any screen with more than 60 pixels per degree actually causes a waste, because human eyes cannot distinguish any extra details. This is referred to as a limit of retinal resolution of human eyes.
- A pixel per degree of a head mounted display screen refers to pixels included in each degree on a screen of a head mounted display device. A pixel per degree of a screen of an existing VR head mounted display device is 10 to 20 pixels per degree.
- A three-dimensional (three dimensions, 3D) engine is a collection of algorithm implementations that abstracts real-world materials into forms of expression such as polygons or various curves, performs related computation in a computer, and outputs final images. Usually, the 3D engine, as an underlying tool, supports graphics software development of higher layers; the 3D engine is like building a "real world" inside the computer.
- A three-dimensional model refers to a polygon representation of objects, and is usually displayed by using a computer or other video devices. Displayed objects may be entities in the real world, or may be fictional objects. The three-dimensional model is often generated by using dedicated software such as a three-dimensional modeling tool, but the three-dimensional model may also be generated by using other methods. As data of points and other information sets, the three-dimensional model may be manually generated or may be generated based on a particular algorithm.
- Material mapping is also referred to as texture mapping. In computer graphics, material mapping is to wrap bitmaps stored in a main memory around a surface of a 3D rendering object. Texture mapping provides abundant details for an object, and simulates a complex appearance in a simple manner. An image (texture) is pasted onto (mapped to) a simple shape in a scene, like pasting a print onto a plane.
- Oculus (oculus) is a VR company that was acquired by Facebook (Facebook), owns VR content platforms and VR helmets, and is a current representative VR company in the industry. A resolution of the helmet is 2160×1080, a field of view is 90 degrees, and the helmet is connected to a computer.
- Google Daydream (Google Daydream) is a platform consolidating Android (Android) smartphones and a new generation of virtual reality head mounted devices. In addition to Daydream video (View) head mounted devices of Google, smartphone manufacturers can also develop and design head mounted devices, as long as the head mounted devices meet standards of Google.
- Virtual reality creates, through computer simulation, a virtual world in three-dimensional space, and provides a user with simulation of senses such as vision, an auditory sensation, and a tactile sensation, so that the user can observe things in the three-dimensional space in a timely manner and without limitations, as if immersed in that space.
FIG. 1 is a schematic diagram of basic components of a virtual reality system. As shown inFIG. 1 , the basic components of the virtual reality system include an observer, a sensor module, an effect generator (including a detection module, a feedback module, and a control module) and a real view emulator (including a virtual environment, a modeling module, and a display environment). On a user side, there is mainly a VR display system that is usually a head mounted VR display system, for example, a head mounted display (Head Mounted Display, HMD). - The virtual reality systems may be classified, based on different functions thereof, into four types including an immersive virtual reality system, an augmented reality virtual reality system, a desktop virtual reality system, and a distributed virtual reality system. The immersive virtual reality system provides a user with completely immersive experience, so that the user has a feeling of being in a virtual world. An obvious feature of the immersive virtual reality system is that, vision and an auditory sensation of a user are closed by using a head mounted display, to create virtual vision. In addition, a data glove is used in the immersive virtual reality system to close a hand feel channel of the user, to create a virtual tactile sensation. The system enables, by using a voice recognizer, the user to deliver an operation command to a system host, and meanwhile, a head tracker, a hand tracker, and an eyes vision direction tracker track a head, a hand, and an eye, so that the system achieves real-time performance as much as possible. A common immersive system is a system based on the head mounted display, and forms stereopsis by using a binocular parallax between a left eye and a right eye.
- For a virtual reality system, for example, an immersive virtual reality system, a VR display system (device) on a user side generally includes a modeling module. The modeling module is configured to build three-dimensional objects required in a virtual environmental world, for example, texts, cards, object models, and three-dimensional environments. A graphical user interface (Graphic User Interface, GUI) in the modeling module displays all buttons, menus, texts, cards, and the like. The GUI is also responsible for providing a reliable logical module for functions and interaction of the elements. For example, the logical module includes a text parameter control module and a 3D rendering module. The text parameter control module is mainly configured to control a size of displayed texts, for example, a text height and a text width. The 3D rendering module mainly manages an entire 3D engine, where a main camera of a VR scene determines an object to be rendered and details of encapsulated rendering of the 3D engine, sends the object and the details to the 3D rendering module through a rendering pipeline, and can further provide access through pixels and a vertex shader. The modeling module may further include other functional modules or units, for example, an input (input) module and an artificial intelligence module. This is not limited in this embodiment of this application herein.
-
FIG. 2 is a schematic flowchart of designing and developing display texts in a VR system in the prior art. As shown in FIG. 2, in current design and development of display texts in a VR system, a common practice in the industry is to set font parameters (a font size, a color, transparency, and the like) in a 3D engine of a modeling module, and then output a program file that can be run in a VR device (for example, a glasses terminal). A developer determines a font effect through visual observation with the glasses terminal; if the font effect can be viewed clearly, a next functional module is developed; otherwise, the developer repeats the test. - In the prior art, because of a lack of a correct standard, the developer needs to spend a lot of time in text testing. In addition, compared with graphic design of a conventional terminal, there is one more piece of dimensional information in a 3D environment, and the user has different requirements for a text height at different distances. Therefore, the developer needs to test texts at different distances in a virtual environment, leading to a high workload. Moreover, because of differences between individual developers, such as myopia, astigmatism, and hyperopia, test results are different. Consequently, it cannot be ensured that a final tested text height is universal. User experience is relatively poor, and the testing process is relatively complex.
- On the other hand, pixels per degree (PPD) of screens of different VR terminal devices are also inconsistent; for example, a pixel per degree of a screen of an Oculus terminal device is 12, and a pixel per degree of a screen of a Google Daydream terminal device is 14. When a developer experiments on a VR terminal device of one specification, and font setting parameters that produce a good effect on that device are used in a VR device of another specification, display effects of the same content differ, because the pixels per degree of the two screens are different. For example, a font specification that is set by using a same font setting parameter is just clear on the Google Daydream terminal device, but is unclear on the Oculus terminal device. As a result, the developer needs to reset a new text height based on the VR device when migrating the same content from a Daydream platform to an Oculus platform, and needs to re-test and re-verify the text height in a development process, greatly increasing difficulty and costs of content migration.
- Based on the foregoing problems, an embodiment of this application provides a text display method in virtual reality that can resolve a problem of unclear text display in a current virtual reality environment, and can be directly applied to various virtual reality platforms; and can also resolve a problem that in virtual environments, a developer cannot quickly obtain various comfortable and readable font heights in different VR devices and at different distances in the virtual environments, thereby improving user experience and reducing development costs.
-
FIG. 3 is a schematic diagram of a typical system architecture to which an embodiment of this application is applied. As shown inFIG. 3 , the system includes a server, a modeling module, an open graphics library embedded system (open graphics library embedded systems, OpenGL ES) module, a graphics processing unit (Graphics Processing Unit, GPU) module, and a display module. The modeling module, the OpenGL ES module, the GPU module, and the display module belong to a virtual reality system on a user side, and are disposed in a VR device. The server is configured to send text information to the modeling module in the virtual reality system on the user side, and the modeling module is configured to model the text information according to the text display method in virtual reality provided in the embodiments of this application, that is, configured to build three-dimensional objects required in a virtual environmental world, including, for example, texts, cards, object models, and three-dimensional environments. The modeling module may include a text parameter control module, a 3D rendering module, and the like. The OpenGL ES module is configured to: compile data output from the modeling module, convert the data into application programming interface (Application Programming Interface, API) data that can be recognized by the GPU module, and send the data to the GPU module. The GPU module is configured to perform image rendering, text rendering and the like based on the received data. The display module is configured to display an image rendered by the GPU module to a user. - It should be understood that, merely the system architecture shown in
FIG. 3 is used as an example for description in this embodiment of this application, but this embodiment of this application is not limited thereto, for example, the system may include other modules and the like. - The following describes a text display method in virtual reality provided in an embodiment of this application in detail with reference to
FIG. 4. FIG. 4 is a schematic flowchart of a text display method 100 in virtual reality according to an embodiment of this application. The method 100 may be applied to the system architecture shown in FIG. 3, and certainly, may alternatively be applied to other similar system architectures. This is not limited in this embodiment of this application herein. - As shown in
FIG. 4, the method 100 includes the following steps: - S110: A server sends text information to a VR device, and correspondingly, the VR device receives the text information.
- S120: The VR device determines a font size of the text information based on a pixel per degree of the VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α.
- S130: The VR device renders the text information based on the font size of the text information.
- S140: The VR device displays the rendered text information to the user.
- According to the text display method in virtual reality provided in this embodiment of this application, clear and readable font sizes corresponding to different VR devices and different distances may be determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between text information in virtual space and the eye of the user, and the correction value α. For designers and developers of display texts in VR systems, individual differences caused by determining a font effect by using existing visual observation manners can be prevented. Therefore, fonts that are designed and developed by using the method in this embodiment of this application for the display texts in the VR systems are universal, and for most users, the font effect is clear. In addition, for VR devices of different specifications, when same content is migrated between VR devices with different pixels per degree, because the text display method in virtual reality provided in this embodiment of this application is related to the pixel per degree of the VR device, the font size can be adjusted with the different pixels per degree, that is, can adapt to the VR devices of different specifications, to prevent a case in which there are differences between text effects of the text information in a process of migrating the text information between the VR devices of different specifications, thereby improving user experience and reducing difficulty and costs of migrating the text information between the VR devices of different specifications.
- It should be understood that, in this embodiment of this application, the VR device may be various types of VR devices, for example, may be a head mounted display or VR glasses. This is not limited in this embodiment of this application herein.
- Specifically, in S110, the server sends the text information to the VR device, and a type of the text information may be a text file (text file, TXT), or may be other format types used to express the text information. This is not limited in this embodiment of this application herein. The text information may further include other information related to a text, for example, distance information between the text information in the virtual space and the eye of the user. This is not limited by this embodiment of this application herein. - It should be understood that, in embodiments of this application, the server may further send image information or other types of information to the VR device. This is not limited in this embodiment of this application herein.
- In S120, a text parameter control module in the VR device determines, based on the received text information, contents and other information of the text information, for example, distance information between the text information in a virtual environment and the eye of the user, and determines the font size of the text information based on the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α. For example, a height, a transparency, a color, and the like of a font in the text information may be determined. This is not limited in this embodiment of this application herein.
- It should be understood that, in embodiments of this application, the text parameter control module in the VR device may determine a size of an image based on the received other types of information such as an image and the like and distance information between the information such as the image and the like and the eye of the user, and with reference to the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, distance information between the image information in the VR and the eye of the user, and the correction value α. This is not limited in this embodiment of this application herein.
- In S130, after determining the font size of the text information, the text parameter control module in the VR device sends a rendering instruction to a 3D rendering module in the VR device, where the rendering instruction includes data such as the determined font size of the text information. The 3D rendering module in the VR device renders each text height in the text information based on the data such as the font size of the text information obtained in the previous step, for example, text height information, to obtain the rendered text information, and sends the rendered text information to a display module in the VR device.
- It should be understood that, in embodiments of this application, the 3D rendering module in the VR device may further render a height of information such as the image based on size data of the image information obtained in the previous step, for example, a height, to obtain the rendered image information, and send the rendered image information to the display module in the VR device.
-
FIG. 5 is a schematic block diagram of a modeling module structure in the VR device. As shown inFIG. 5 , the modeling module in the VR device includes a GUI module, and the GUI module includes the text parameter control module and the rendering module. - It should be understood that, the modeling module may further include other functional modules, for example, an input module and an artificial intelligence module. The GUI module may further include other functional units or modules. This is not limited in this embodiment of this application herein.
- In S140, after receiving the rendered data, the display module in the VR device may display the rendered text information or image to the user. A font size of the rendered text information or image adapts to the VR device worn by the user. Therefore, the user can view a text in the text information clearly and has better user experience.
- Optionally, in an embodiment, the text information includes the distance information between the text information in the VR and the eye of the user.
- Specifically, when the user first wears the VR device or when a user location subsequently changes, because the server can communicate with the VR device, the server may obtain distance information between the text information in a VR environment and the eye of the user, and send the distance information and the text information to the VR device. In this way, a size of the displayed font can be more accurately determined, and a workload of the VR device is reduced, so that display work efficiency of the VR device is improved.
- It should be understood that, the text information may include the distance information, and the server may separately send the distance information to the VR device. This is not limited by this embodiment of this application herein.
- Optionally, in an embodiment, as shown in
FIG. 6, before S120, that is, before the determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α, the method 100 further includes the following step:
S111: The VR device obtains the distance information between the text information in the VR and the eye of the user. - Specifically, the distance between the text information in the VR and the eye of the user changes as the user location changes. Therefore, when the user subsequently uses the VR device and the user location changes, the modeling module in the VR device may obtain the distance information between the text information in the VR and the eye of the user in real time, and determine the font size of the displayed text information by using the updated distance information. The font size of the text information in the virtual environment may be controlled in real time based on user location information, and a suitable font size may be determined. In this way, the font size corresponding to the user location can be more accurately determined, that is, the font size can be adjusted in real time, thereby further improving user experience.
- It should be understood that, when the user location does not change, the modeling module in the VR device may also obtain the distance information between the text information in the VR and the eye of the user. This is not limited in this embodiment of this application herein.
- Optionally, in an embodiment, a relationship between the font size of the text information and the pixel per degree of the VR device, the pixel per degree of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α may be shown in a formula (1): HVR = (PPD Human eye / PPD VR device) × d × tan α (1)
- In the formula (1), HVR is the font size of the text information, PPD Human eye is the pixel per degree of the eye of the user, PPD VR device is the pixel per degree of the VR device, d is a distance between the text information in the VR and the eye of the user, and α is the correction value.
- Optionally, in S120, the text parameter control module may determine the font size of the text information based on the formula (1).
- Specifically, the modeling module in the VR device may determine, based on the formula (1), the font size HVR displayed in the VR device. In the formula (1), PPD VR device is a pixel per degree of a screen of the VR device, pixels per degree of screens of VR devices of different specifications may be different, PPD Human eye is the pixel per degree of the eye of the user, d is the distance between the text information in the virtual environment and the eye of the user, and α is the correction value. The modeling module in the VR device may determine, by using the formula (1), the font size of the text information sent by the server, and render the font based on the font size. Finally, the rendered text information is displayed to the user, so that the user can view text content in the VR device clearly.
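For illustration, the relationship described above might be sketched in code as follows. The exact functional form is an assumption reconstructed from the surrounding description (the font size grows with the ratio of the eye's pixel per degree to the device's pixel per degree, with the distance d, and with tan α), and the function name font_size_vr is hypothetical:

```python
import math

def font_size_vr(ppd_human_eye, ppd_vr_device, d, alpha_deg):
    """Sketch of a formula-(1)-style relationship:
    HVR = (PPD_human_eye / PPD_vr_device) * d * tan(alpha).

    d is the distance between the text information and the eye in the
    virtual environment; alpha_deg is the correction value, i.e. the
    visual angle of the text height, in degrees.
    """
    return (ppd_human_eye / ppd_vr_device) * d * math.tan(math.radians(alpha_deg))

# The same text at the same distance needs a larger font on a screen
# with fewer pixels per degree.
h_low_ppd = font_size_vr(60, 12, 2.0, 20)   # e.g. a 12-PPD screen
h_high_ppd = font_size_vr(60, 14, 2.0, 20)  # e.g. a 14-PPD screen
```

With these example values, the 12-pixel-per-degree screen requires a larger rendered font than the 14-pixel-per-degree screen, which matches the migration problem described earlier.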
- Optionally, the correction value α is the visual angle subtended at the eye of the user by the text height, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- Specifically,
FIG. 7 is a diagram of a relationship between a font height and a distance between a human eye and a text. As shown in FIG. 7, in a real environment, a relationship between a text height H and a distance S between a human eye and a font is a tangent function, that is, as shown in a formula (2): H = S × tan α (2) - That is, a minimum comfortable readable text height H in the real world for a human and the distance S between the human eye and the text meet the formula (2). α is the visual angle subtended at the eye of the user by the text height, and may also be referred to as an image and text visual angle. Based on an international human-machine design specification, it may be concluded that when the value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees, a font viewed by a human is clear and meets comfort requirements of human vision.
- It should be understood that, the value range of α may be less than 18 degrees or greater than 22 degrees. This is not limited in this embodiment of this application herein.
- Optionally, the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- Specifically, for retinal resolution of a human, any screen with more than 60 pixels per degree actually causes a waste; that is, the limit pixel per degree at which human eyes can distinguish details on a screen is 60 pixels per degree. Therefore, the pixel per degree PPD Human eye of the eye of the user is set to 60 pixels per degree. This ensures that the user can view the text clearly while reducing a calculation amount of the VR device and improving efficiency of the VR device.
- It should be understood that, a value of the pixel per degree PPD Human eye of the eye of the user may be less than 60 pixels per degree, for example, may be 50 pixels per degree. This is not limited in this embodiment of this application herein.
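As a rough numeric illustration of this limit, the magnification implied by the ratio of the eye's 60 pixels per degree to a device screen's pixel per degree can be computed directly. The device values below reuse the example figures given earlier (12 for an Oculus-class screen, 14 for a Daydream-class screen), and the helper name is illustrative:

```python
RETINAL_LIMIT_PPD = 60  # pixels per degree that a human eye can resolve

def scale_factor(ppd_vr_device):
    # How many times larger text must be drawn on a low-PPD VR screen
    # relative to the retinal limit, so the eye perceives comparable detail.
    return RETINAL_LIMIT_PPD / ppd_vr_device

print(scale_factor(12))  # 12-PPD screen: 5.0
print(scale_factor(14))  # 14-PPD screen: about 4.29
```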
- It should further be understood that, in this embodiment of this application, the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α may meet other formulas, for example, meet a formula (3): HVR = K × (PPD Human eye / PPD VR device) × d (3)
- In the formula (3), K is equivalent to the correction value. Optionally, in S120, the text parameter control module may determine the font size of the text information based on the formula (3).
- It should be understood that, in this embodiment of this application, the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α may meet other function formulas or formulas of the foregoing formulas after arbitrary deformation. The text parameter control module may determine the font size of the text information based on the other function formulas or the formulas of the foregoing formulas after arbitrary deformation. This is not limited in this embodiment of this application herein.
- The following describes a
text display method 200 in virtual reality provided in an embodiment of this application in detail with reference to FIG. 8. As shown in FIG. 8, the method 200 includes the following steps: - S201: A server sends text information to a VR device.
- S202: A GUI module in a modeling module in the VR device receives the text information. Optionally, the GUI module further receives distance information between the text information in a virtual environment and an eye of a user. The GUI module determines a type of the text information and the distance information, and sends a font size request at the distance to a text parameter control module, where the font size request carries the distance information and the text information.
- S203: The text parameter control module in the modeling module in the VR device receives the request; determines a font size of the text information based on a relationship between a font size HVR of the text information and a pixel per degree PPD VR device of the VR device, a pixel per degree PPD Human eye of the eye of the user, and a distance d between the text information in VR and the eye of the user, for example, the foregoing formula (1) or formula (3) or other similar function relationships; and sends a rendering instruction to a 3D rendering module.
- S204: The 3D rendering module in the modeling module in the VR device renders each font in the text information according to the rendering instruction, based on the font size determined by the text parameter control module, and sends rendered data to an OpenGL ES module.
- S205: The OpenGL ES module in the VR device performs format conversion on the data to convert the data into a data format that can be recognized by a GPU module, and sends the data to the GPU module.
- S206: The GPU module in the VR device performs calculation based on the data, performs graphical image rendering, and sends the rendered data to a display module.
- S207: The display module in the VR device displays a rendered image to the user.
- According to the text display method in virtual reality provided in this embodiment of this application, clear and readable font sizes corresponding to different VR devices and different distance information are determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between the text information in virtual space and the eye of the user, and a correction value α; the text in the text information is rendered based on the font size; and the rendered text information is displayed to the user through the VR device. This prevents differences between text effects when the text information is migrated between VR devices of different specifications, thereby improving user experience and reducing the difficulty and costs of migrating the text information between VR devices of different specifications.
- The following describes a text display method 300 in virtual reality provided in an embodiment of this application in detail by using an example in which a user wears a VR device to watch a movie in a VR environment. As shown in FIG. 9, the method 300 includes the following steps:
- S301: After the user chooses a movie in a virtual environment, for example, the movie Transformers: Age of Extinction, the VR device sends information about the movie to a server.
- S302: After receiving the information, the server sends text information of the movie to the VR device. For example, the text information may include: "Transformers: Age of Extinction; director: Michael Bay; country or region: America; synopsis: ...".
- S303: A GUI module in a modeling module in the VR device receives the text information; determines a type of the text information and a distance between the text information in the virtual environment and an eye of the user, for example, 10 meters; and sends a request for the font size at the distance to a text parameter control module in the modeling module, where the request carries the distance information and the text information.
- S304: The text parameter control module in the modeling module in the VR device receives the request, and calculates a font size of the text information based on a relationship between the font size and a pixel per degree PPD VR device of the VR device, a pixel per degree PPD Human eye of the eye of the user, a distance d between the text information in VR and the eye of the user, and a correction value α, for example, by using the functional relationship shown in the foregoing formula (1). For example, based on the foregoing formula (1), the value of α is set to 22 degrees, the value of PPD VR device of the VR device is 20 pixels per degree, the value of PPD Human eye is 60 pixels per degree, and the value of d is 10 meters, that is, 10000 millimeters. The font height HVR is calculated to be 190 millimeters after these values are substituted into the foregoing formula (1). The text parameter control module sends data of the font height HVR to a 3D rendering module in the modeling module, that is, sends a rendering instruction to the 3D rendering module. Optionally, when the user's location changes, the modeling module may further obtain the distance between the text information in the virtual environment and the eye of the user in real time, and determine the font size of the text information by using the updated distance information.
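As a sanity check on the worked numbers in S304, the following sketch substitutes the example values into a hypothetical reading of formula (1). The formula is published as an image and is not reproduced in this text; the reading below, which scales α by the ratio of the two pixel-per-degree values and evaluates the tangent in degrees, is an assumption inferred from the stated result of about 190 millimeters:

```python
import math

# Example values from S304 (the formula used below is an assumption,
# not the patent's verbatim formula (1)).
alpha = 22          # correction value
ppd_device = 20     # pixels per degree of the VR device
ppd_eye = 60        # pixels per degree of the human eye
d_mm = 10_000       # distance of 10 meters, in millimeters

# Hypothetical reading: scale alpha by the eye/device PPD ratio, treat
# the result as an angle in 1/60-degree units, and project it to a
# physical height at distance d.
angle_deg = (alpha / 60.0) * (ppd_eye / ppd_device)   # 1.1 degrees
h_vr = d_mm * math.tan(math.radians(angle_deg))

print(f"font height: {h_vr:.0f} mm")  # ~192 mm, close to the stated 190 mm
```

The small discrepancy (192 mm versus the 190 mm stated in S304) suggests the patent rounds the result; the exact published formula may differ in detail.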
- S305: The 3D rendering module in the modeling module renders each font in the text information according to the rendering instruction and based on the font size determined by the text parameter control module, for example, renders each font in the text information based on a height of 190 millimeters, and sends the rendered data to an OpenGL ES module.
- S306: The OpenGL ES module in the VR device performs format conversion on the data to convert the data into a data format that can be recognized by a GPU module, and sends the data to the GPU module.
- S307: The GPU module in the VR device performs calculation based on the data, performs graphical image rendering, and sends the rendered data to a display module.
- S308: The display module in the VR device displays rendered text information or a rendered image to the user, where a font size in the text information and the image corresponds to the VR device and the distance. For example, as shown in FIG. 10, the height of the single word "Play" displayed by the display module is 190 millimeters.
- According to the text display method in virtual reality provided in this embodiment of this application, the font size displayed to the user can be adjusted in real time based on a distance between the text information in virtual space and the eye of the user and a pixel per degree of a screen of the VR device used by the user, so that the user can view the text information in the virtual environment clearly, thereby improving user experience.
- It should be understood that, in the embodiments of this application, the sequence numbers of the foregoing processes and steps do not imply an order of execution. The order of execution of the processes should be determined based on their functions and inherent logic, and the sequence numbers shall not constitute any limitation on the implementation processes of the embodiments of this application.
- The text display method in virtual reality according to the embodiments of this application is described in detail above with reference to FIG. 1 to FIG. 10. The following describes a text display apparatus in virtual reality according to embodiments of this application in detail with reference to FIG. 11 and FIG. 12.
- FIG. 11 is a schematic block diagram of a text display apparatus 400 in virtual reality according to an embodiment of this application. As shown in FIG. 11, the text display apparatus 400 in virtual reality includes a text parameter control module 410, a 3D rendering module 420, and a display module 430.
- The text parameter control module 410 is configured to receive text information.
- The text parameter control module 410 is further configured to determine a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α.
- The 3D rendering module 420 is configured to render the text information based on the font size of the text information.
- The display module 430 is configured to display the rendered text information.
- The text display apparatus provided in this embodiment of this application may determine, based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, a distance between text information in virtual space and the eye of the user, and the correction value α, clear and readable font sizes corresponding to different VR devices and different distances. For designers and developers of display texts in VR systems, individual differences caused by determining a font effect through existing visual measurement manners can be prevented. Therefore, fonts that are designed and developed by using the apparatus in this embodiment of this application for display texts in VR systems are universal, and for most users the font effect is clear. In addition, for VR devices of different specifications, when the same content is migrated between VR devices with different pixels per degree, because the text display apparatus in virtual reality provided in this embodiment of this application takes the pixel per degree of the VR device into account, the font size can be adjusted for the different pixels per degree, that is, can adapt to VR devices of different specifications. This prevents differences between text effects when the text information is migrated between VR devices of different specifications, thereby improving user experience and reducing the difficulty and costs of migrating the text information between VR devices of different specifications.
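The migration property described above can be illustrated with a short sketch. Under one hypothetical reading of the font-size relationship (an assumption on our part; formula (1) is published only as an image), the physical font height shrinks as the device's pixel per degree grows, so the glyph always spans the same number of screen pixels and therefore looks equally sharp on either device. The device PPD values of 20 and 40 below are illustrative only:

```python
import math

def font_height_mm(d_mm: float, alpha: float, ppd_device: float,
                   ppd_eye: float = 60.0) -> float:
    """Hypothetical reading of the patent's relationship: scale the
    correction value alpha by the eye/device PPD ratio, interpret the
    result in 1/60-degree units, and project it to a height at d."""
    angle_deg = (alpha / 60.0) * (ppd_eye / ppd_device)
    return d_mm * math.tan(math.radians(angle_deg))

d = 10_000  # 10 m, in millimeters
for ppd in (20, 40):  # two illustrative VR devices
    h = font_height_mm(d, alpha=22, ppd_device=ppd)
    # Pixels the glyph spans on screen: visual angle (deg) x device PPD.
    px = math.degrees(math.atan(h / d)) * ppd
    print(f"device PPD {ppd}: height {h:.0f} mm, {px:.1f} px on screen")
```

Both devices end up drawing the glyph over the same number of pixels, which is one way the migration invariance claimed in this paragraph could be realized.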
- It should be understood that the text parameter control module 410 and the 3D rendering module 420 are disposed in a modeling module, the modeling module is disposed in the VR device, and the display module 430 is also disposed in the VR device.
- Optionally, in an embodiment, before determining the font size of the text information, the text parameter control module is further configured to obtain the distance information between the text information in the VR and the eye of the user.
- Optionally, in an embodiment, the text information includes the distance information between the text information in the VR and the eye of the user.
- Optionally, in an embodiment, a relationship between the font size of the text information and the pixel per degree of the VR device, the pixel per degree of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α is given by a formula that is reproduced as an image in the published document and is not available in this text.
- Optionally, in an embodiment, the text parameter control module is specifically configured to determine the font size of the text information based on the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α.
- Optionally, in an embodiment, the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- Optionally, in an embodiment, the correction value α is a visual angle corresponding to the text height, formed by the text height and the eye of the user, and the value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- It should be understood that the foregoing and other operations and/or functions of the modules of the text display apparatus 400 in virtual reality according to this embodiment of this application are respectively for implementing the corresponding procedures of the methods in FIG. 4, FIG. 6, FIG. 8, and FIG. 9. For brevity, details are not described herein again.
- FIG. 12 is a schematic block diagram of a text display apparatus 500 in virtual reality according to an embodiment of this application. As shown in FIG. 12, the apparatus 500 includes a memory 510 and a processor 520. The memory 510 and the processor 520 communicate with each other through an internal connection path, to transfer control and/or data signals.
- The memory 510 is configured to store program code.
- The processor 520 is configured to invoke the program code to implement the methods in the embodiments of this application.
- The text display apparatus 500 in virtual reality shown in FIG. 12 may implement the processes implemented in the embodiments in FIG. 4, FIG. 6, FIG. 8, and FIG. 9. For brevity, details are not described herein again.
- It should be understood that the processor 520 in this embodiment of this application may be a central processing unit (central processing unit, CPU), or the processor 520 may be another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application-specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
- The memory 510 may include a read-only memory and a random access memory, and provides instructions and data to the processor 520. A part of the memory 510 may further include a non-volatile random access memory. For example, the memory 510 may further store information about a device type.
- In an implementation process, the steps in the foregoing methods can be implemented by using a hardware integrated logic circuit in the processor 520, or by using instructions in a form of software. The steps of the text display method in virtual reality disclosed with reference to the embodiments of this application may be directly performed and completed by a hardware processor, or may be performed and completed by a combination of hardware and software modules in the processor 520. The software module may be located in a storage medium. The storage medium is located in the memory 510, and the processor 520 reads the information in the memory 510 and completes the steps in the foregoing methods in combination with its hardware. To avoid repetition, details are not described herein again.
- It may be understood that the memory in the embodiments of this application may be a volatile memory or a non-volatile memory, or may include both a volatile memory and a non-volatile memory. The non-volatile memory may be a read-only memory (read-only memory, ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (random access memory, RAM), used as an external cache. It should be noted that the memory of the systems and methods described in this specification includes but is not limited to these and any other memory of a proper type.
- An embodiment of this application further provides a virtual reality VR device, including any one of the text display apparatuses in virtual reality provided in the foregoing embodiments of this application.
- The virtual reality VR device provided in this embodiment of this application may implement the text display methods in virtual reality in the embodiments of this application in FIG. 4, FIG. 6, FIG. 8, and FIG. 9. That is, the clear and readable font sizes corresponding to the different VR devices and the different distances are determined based on the pixel per degree PPD Human eye of the eye of the user, the pixel per degree PPD VR device of the VR device, the distance between the text information in the virtual space and the eye of the user, and the correction value α. For most users, when such a VR device is used, the viewed fonts or images appear clear, thereby improving the user's VR experience.
- It should be understood that the VR device provided in this embodiment of this application may be any of various types of VR devices, for example, a head-mounted display or VR glasses. This is not limited in this embodiment of this application.
- An embodiment of this application further provides a computer-readable medium, configured to store computer program code. The computer program code includes instructions used to implement the text display methods in virtual reality in the embodiments of this application in FIG. 4, FIG. 6, FIG. 8, and FIG. 9. The readable medium may be a ROM or a RAM, and is not limited in this embodiment of this application.
- An embodiment of this application further provides a system chip. The system chip includes a processing unit and a communications unit. The processing unit may be, for example, a processor, and the communications unit may be an input/output interface, a pin, a circuit, or the like. The processing unit may execute a computer instruction, so that a chip in the terminal performs any one of the text display methods in virtual reality.
- Optionally, the computer instruction is stored in a storage unit.
- Optionally, the storage unit is a storage unit in the chip, for example, a register or a cache. The storage unit may alternatively be a storage unit located outside the chip in the terminal, for example, a ROM or other types of static storage devices that can store static information and a static instruction, or a RAM. Any one of the processors mentioned above may be a CPU, a microprocessor, an ASIC, or one or more integrated circuits configured to control program execution of the text display method in virtual reality.
- It should be understood that the terms "and/or" and "at least one of A or B" in this specification describe only an association relationship between associated objects and represent that three relationships may exist. For example, A and/or B may represent the following three cases: only A exists, both A and B exist, and only B exists. In addition, the character "/" in this specification generally indicates an "or" relationship between the associated objects.
- A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on particular applications and design constraint conditions of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each particular application, but it should not be considered that the implementation goes beyond the scope of this application.
- It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, refer to a corresponding process in the foregoing method embodiments, and details are not described herein again.
- In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiment is merely an example. For example, the unit division is merely logical function division and may be other division during actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
- The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual requirements to achieve the objectives of the solutions of the embodiments in this application.
- In addition, functional units in the embodiments of this application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit.
- When the functions are implemented in the form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or some of the technical solutions may be implemented in a form of a software product. The computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes: any medium that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
- The foregoing descriptions are merely specific implementations of this application, but are not intended to limit the protection scope of this application. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.
Claims (15)
- A text display method in virtual reality, comprising:
receiving text information;
determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α;
rendering the text information based on the font size of the text information; and
displaying the rendered text information.
- The method according to claim 1, wherein before the determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α, the method further comprises:
obtaining the distance information between the text information in the VR and the eye of the user. - The method according to claim 1, wherein the text information comprises the distance information between the text information in the VR and the eye of the user.
- The method according to any one of claims 1 to 3, wherein a relationship between the font size of the text information and the pixel per degree of the VR device, the pixel per degree of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α is given by a formula that is reproduced as an image in the published document and is not available in this text.
- The method according to claim 4, wherein the determining a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α comprises:
determining the font size of the text information based on the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α. - The method according to claim 4 or 5, wherein the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- The method according to any one of claims 1 to 6, wherein the correction value α is a visual angle corresponding to a text height, formed by the text height and the eye of the user, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- A text display apparatus in virtual reality, comprising:
a text parameter control module, configured to receive text information, wherein
the text parameter control module is further configured to determine a font size of the text information based on a pixel per degree of a virtual reality VR device, a pixel per degree of an eye of a user, distance information between the text information in VR and the eye of the user, and a correction value α;
a 3D rendering module, configured to render the text information based on the font size of the text information; and
a display module, configured to display the rendered text information.
- The apparatus according to claim 8, wherein before determining the font size of the text information, the text parameter control module is further configured to obtain the distance information between the text information in the VR and the eye of the user.
- The apparatus according to claim 8, wherein the text information comprises the distance information between the text information in the VR and the eye of the user.
- The apparatus according to any one of claims 8 to 10, wherein a relationship between the font size of the text information and the pixel per degree of the VR device, the pixel per degree of the eye of the user, the distance information between the text information in the VR and the eye of the user, and the correction value α is given by a formula that is reproduced as an image in the published document and is not available in this text.
- The apparatus according to claim 11, wherein the text parameter control module is specifically configured to determine the font size of the text information based on the relationship between the font size HVR of the text information and the pixel per degree PPD VR device of the VR device, the pixel per degree PPD Human eye of the eye of the user, the distance d between the text information in the VR and the eye of the user, and the correction value α.
- The apparatus according to claim 11 or 12, wherein the pixel per degree PPD Human eye of the eye of the user is 60 pixels per degree.
- The apparatus according to any one of claims 8 to 13, wherein the correction value α is a visual angle corresponding to a text height, formed by the text height and the eye of the user, and a value range of α is greater than or equal to 18 degrees and less than or equal to 22 degrees.
- A virtual reality VR device, comprising the text display apparatus in virtual reality according to any one of claims 8 to 14.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710975902.0A CN109696953B (en) | 2017-10-19 | 2017-10-19 | Virtual reality character display method and device and virtual reality equipment |
PCT/CN2018/110249 WO2019076264A1 (en) | 2017-10-19 | 2018-10-15 | Text display method and device in virtual reality, and virtual reality apparatus |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3677994A1 true EP3677994A1 (en) | 2020-07-08 |
EP3677994A4 EP3677994A4 (en) | 2020-09-02 |
EP3677994B1 EP3677994B1 (en) | 2023-04-26 |
Family
ID=66174298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18869124.0A Active EP3677994B1 (en) | 2017-10-19 | 2018-10-15 | Text display method and device in virtual reality, and virtual reality apparatus |
Country Status (4)
Country | Link |
---|---|
US (1) | US11394947B2 (en) |
EP (1) | EP3677994B1 (en) |
CN (1) | CN109696953B (en) |
WO (1) | WO2019076264A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110442486A (en) * | 2019-08-05 | 2019-11-12 | 北京远舢智能科技有限公司 | A kind of remote device diagnostics system and method based on mixed reality technology |
CN111429585A (en) * | 2020-03-30 | 2020-07-17 | 北京字节跳动网络技术有限公司 | Image generation method and device, electronic equipment and computer readable storage medium |
CN111506378B (en) * | 2020-04-17 | 2021-09-28 | 腾讯科技(深圳)有限公司 | Method, device and equipment for previewing text display effect and storage medium |
CN111885365B (en) * | 2020-07-17 | 2022-02-11 | 深圳市瑞立视多媒体科技有限公司 | Method, device and equipment for realizing subtitles based on illusion engine and storage medium |
CN114089829B (en) * | 2021-10-13 | 2023-03-21 | 深圳中青宝互动网络股份有限公司 | Virtual reality's meta universe system |
CN118569205A (en) * | 2024-08-02 | 2024-08-30 | 雷鸟创新技术(深圳)有限公司 | Augmented reality display method and device, storage medium and terminal |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005055189A1 (en) * | 2003-12-01 | 2005-06-16 | Volvo Technology Corporation | Perceptual enhancement displays based on knowledge of head and/or eye and/or gaze position |
JP4310330B2 (en) * | 2006-09-26 | 2009-08-05 | キヤノン株式会社 | Display control apparatus and display control method |
JP4142073B2 (en) * | 2006-10-13 | 2008-08-27 | 株式会社コナミデジタルエンタテインメント | Display device, display method, and program |
JP4888579B2 (en) * | 2010-04-21 | 2012-02-29 | パナソニック電工株式会社 | Visual function inspection device |
US20120287163A1 (en) * | 2011-05-10 | 2012-11-15 | Apple Inc. | Scaling of Visual Content Based Upon User Proximity |
KR101975906B1 (en) * | 2012-01-09 | 2019-05-08 | 삼성전자주식회사 | Apparatus and method for scaling layout of application program in visual display unit |
CN103165064A (en) * | 2012-04-27 | 2013-06-19 | 深圳市金立通信设备有限公司 | Anti-peep system and method for automatically adjusting resolution ratio of display screen based on distance of human eyes and screen |
US9934614B2 (en) * | 2012-05-31 | 2018-04-03 | Microsoft Technology Licensing, Llc | Fixed size augmented reality objects |
CN103076957B (en) * | 2013-01-17 | 2016-08-03 | 上海斐讯数据通信技术有限公司 | A kind of display control method and mobile terminal |
US10176639B2 (en) * | 2014-11-27 | 2019-01-08 | Magic Leap, Inc. | Virtual/augmented reality system having dynamic region resolution |
WO2017033777A1 (en) * | 2015-08-27 | 2017-03-02 | 株式会社コロプラ | Program for controlling head-mounted display system |
CN105916022A (en) * | 2015-12-28 | 2016-08-31 | 乐视致新电子科技(天津)有限公司 | Video image processing method and apparatus based on virtual reality technology |
CN105653032B (en) * | 2015-12-29 | 2019-02-19 | 小米科技有限责任公司 | Display adjusting method and device |
CN105975169A (en) * | 2016-04-27 | 2016-09-28 | 乐视控股(北京)有限公司 | Method and apparatus for displaying text in 3D space |
CN106200956A (en) * | 2016-07-07 | 2016-12-07 | 北京时代拓灵科技有限公司 | A kind of field of virtual reality multimedia presents and mutual method |
CN106445277B (en) * | 2016-08-31 | 2019-05-14 | 和思易科技(武汉)有限责任公司 | Text rendering method in virtual reality |
CN107247511B (en) * | 2017-05-05 | 2019-07-16 | 浙江大学 | A kind of across object exchange method and device captured based on eye movement in virtual reality |
-
2017
- 2017-10-19 CN CN201710975902.0A patent/CN109696953B/en active Active
-
2018
- 2018-10-15 EP EP18869124.0A patent/EP3677994B1/en active Active
- 2018-10-15 WO PCT/CN2018/110249 patent/WO2019076264A1/en unknown
-
2020
- 2020-04-10 US US16/845,230 patent/US11394947B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN109696953A (en) | 2019-04-30 |
WO2019076264A1 (en) | 2019-04-25 |
EP3677994B1 (en) | 2023-04-26 |
EP3677994A4 (en) | 2020-09-02 |
CN109696953B (en) | 2020-10-16 |
US20200244944A1 (en) | 2020-07-30 |
US11394947B2 (en) | 2022-07-19 |
Similar Documents
Publication | Title |
---|---|
US11394947B2 (en) | Text display method and apparatus in virtual reality, and virtual reality device |
US11645801B2 (en) | Method for synthesizing figure of virtual object, electronic device, and storage medium |
US11402634B2 (en) | Hand-locked rendering of virtual objects in artificial reality |
US11538229B2 (en) | Image processing method and apparatus, electronic device, and computer-readable storage medium |
EP3179448A1 (en) | Foveated rendering |
US20230039100A1 (en) | Multi-layer reprojection techniques for augmented reality |
JP2017531221A (en) | Countering stumbling when immersed in a virtual reality environment |
US20210358093A1 (en) | Method and device for correcting image distortion, display device, computer-readable medium, and electronic device |
US10127711B2 (en) | Method and apparatus for rendering caustics |
KR20210082242A (en) | Generation and modification of representations of objects in an augmented-reality or virtual-reality scene |
CN109985384B (en) | Method and device for dynamically adjusting a map |
CN109791431A (en) | Viewpoint rendering |
CN114026603B (en) | Rendering computer-generated real text |
CN108038816A (en) | Virtual reality image processing apparatus and method |
CN111754431A (en) | Image area replacement method, device, equipment, and storage medium |
TWI694355B (en) | Tracking system, tracking method for real-time rendering of an image, and non-transitory computer-readable medium |
CN115512014A (en) | Method for training an expression-driven generation model, and expression driving method and device |
CN106973283A (en) | Image display method and device |
US20120098833A1 (en) | Image processing program and image processing apparatus |
US20230260218A1 (en) | Method and apparatus for presenting object annotation information, electronic device, and storage medium |
US20180130262A1 (en) | Display device and control method therefor |
WO2019080870A1 (en) | Interaction interface display method and device, storage medium, and electronic device |
EP4425305A1 (en) | Processor, image processing method, and image processing program |
CN109864743B (en) | User height determination method and device in virtual reality system, and storage medium |
JP2022136548A (en) | Image processing system, method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20200330 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20200804 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/01 20060101AFI20200729BHEP |
|
DAV | Request for validation of the european patent (deleted) |
DAX | Request for extension of the european patent (deleted) |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20211004 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20221207 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602018049009 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1563314 Country of ref document: AT Kind code of ref document: T Effective date: 20230515 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20230426 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1563314 Country of ref document: AT Kind code of ref document: T Effective date: 20230426 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230828 |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230726 |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230826 |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230727 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602018049009 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20230830 Year of fee payment: 6 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20240129 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230426 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20231031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20231015 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20231031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20231031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20231015 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240829 Year of fee payment: 7 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20240909 Year of fee payment: 7 |