CN109189215A - Virtual content display method, apparatus, VR device and medium - Google Patents

Virtual content display method, apparatus, VR device and medium

Info

Publication number
CN109189215A
Authority
CN
China
Prior art keywords
lens
human eye
distance
screen
area
Prior art date
Legal status
Granted
Application number
CN201810935326.1A
Other languages
Chinese (zh)
Other versions
CN109189215B (en)
Inventor
余志雄
林明田
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201810935326.1A
Publication of CN109189215A
Application granted
Publication of CN109189215B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention discloses a virtual content display method, apparatus, VR device and medium, applied in the field of VR technology, to solve the problem of scene drift in the prior art. Specifically: obtain the distance between the screen and the lens of the VR device; determine the visible area of the lens on the screen according to the distance, and determine an on-screen display area according to the visible area; obtain the position of the human eye relative to the lens; determine the user field angle corresponding to that position, and adjust the field angle of the virtual content according to the user field angle; and display the adjusted virtual content in the on-screen display area. In this way, the on-screen display area and the field angle of the virtual content are adaptively adjusted so that the on-screen display area matches the visible area and the field angle of the virtual content matches the user field angle, which significantly reduces the probability of scene drift and improves the visual effect of the VR device as much as possible.

Description

Virtual content display method, apparatus, VR device and medium
Technical field
The present invention relates to the field of virtual reality (Virtual Reality, VR) technology, and in particular to a virtual content display method, apparatus, VR device and medium.
Background art
VR technology combines simulation technology with computer graphics, human-machine interface technology, multimedia technology, sensing technology, network technology and other techniques. With the continuous development of VR technology, a variety of VR devices such as VR headsets have appeared. A VR device generates a virtual scene in three-dimensional space through computational simulation, giving the user an immersive experience in vision, hearing, touch and other senses, and thereby delivering an entirely new kind of experience.
However, in practical applications, because of assembly tolerances of the VR device and deviations in how the user wears it, the user field angle may not match the field angle of the virtual content while the device is in use. This not only causes the scene to appear to drift when the user rotates their view, it also seriously degrades the visual effect of the VR device.
Summary of the invention
Embodiments of the present invention provide a virtual content display method, apparatus, VR device and medium, to solve the problem that VR devices in the prior art may exhibit scene drift, which degrades their visual effect.
The specific technical solutions provided by the embodiments of the present invention are as follows.
An embodiment of the present invention provides a virtual content display method, comprising:
obtaining the distance between the screen and the lens of a virtual reality (VR) device;
determining the visible area of the lens on the screen according to the distance, and determining an on-screen display area according to the visible area;
obtaining the position of the human eye relative to the lens;
determining the user field angle corresponding to the position, and adjusting the field angle of virtual content according to the user field angle;
displaying the virtual content with the adjusted field angle in the on-screen display area.
In the virtual content display method provided by the embodiments of the present invention, obtaining the distance between the screen and the lens of the VR device comprises:
emitting a physical light wave through a light-wave ranging sensor, and receiving the physical light wave reflected from the screen or the lens;
determining the time difference between the emission time at which the light-wave ranging sensor emits the physical light wave and the reception time at which the light-wave ranging sensor receives the reflected physical light wave;
determining the distance between the screen and the lens according to the time difference.
In the virtual content display method provided by the embodiments of the present invention, obtaining the distance between the screen and the lens of the VR device comprises:
sending a physical signal through a signal transmitter, and receiving the physical signal sent by the signal transmitter through a signal receiver;
determining the time difference between the sending time at which the signal transmitter sends the physical signal and the reception time at which the signal receiver receives the physical signal;
determining the distance between the screen and the lens according to the time difference.
In the virtual content display method provided by the embodiments of the present invention, determining the visible area of the lens on the screen according to the distance comprises:
obtaining the benchmark screen-lens distance between the screen and the lens;
determining the ratio between the distance between the screen and the lens and the benchmark screen-lens distance;
obtaining the visible area according to the distance ratio and the benchmark visible area corresponding to the benchmark screen-lens distance.
In the virtual content display method provided by the embodiments of the present invention, obtaining the position of the human eye relative to the lens comprises:
emitting infrared light through an infrared emitter;
obtaining the eye image captured by an infrared photosensor while the infrared emitter emits the infrared light;
determining the position of the human eye relative to the lens according to the eye-region area and the eye shape in the eye image.
In the virtual content display method provided by the embodiments of the present invention, determining the position of the human eye relative to the lens according to the eye-region area and the eye shape in the eye image comprises:
obtaining the benchmark position of the human eye relative to the lens;
determining the area difference between the eye-region area in the eye image and the benchmark eye-region area in the benchmark eye image corresponding to the benchmark position, and determining the distance between the human eye and the lens according to the area difference and the benchmark eye-lens distance between the human eye and the lens at the benchmark position;
determining the degree of shape difference between the eye shape in the eye image and the benchmark eye shape in the benchmark eye image corresponding to the benchmark position, and determining the angle between the human eye and the lens according to the degree of shape difference and the benchmark angle between the human eye and the lens at the benchmark position;
obtaining the position of the human eye relative to the lens according to the distance and the angle between the human eye and the lens.
In the virtual content display method provided by the embodiments of the present invention, determining the user field angle corresponding to the position comprises:
obtaining the benchmark position of the human eye relative to the lens;
determining the degree of position difference between the position of the human eye relative to the lens and the benchmark position;
obtaining the user field angle corresponding to the position according to the degree of position difference and the user field angle corresponding to the benchmark position.
An embodiment of the present invention further provides a virtual content display apparatus, comprising:
a distance sensing module, configured to obtain the distance between the screen and the lens of a VR device;
an eye tracking module, configured to obtain the position of the human eye relative to the lens;
a display control module, configured to determine the visible area of the lens on the screen according to the distance obtained by the distance sensing module and determine an on-screen display area according to the visible area; determine the user field angle corresponding to the position obtained by the eye tracking module and adjust the field angle of virtual content according to the user field angle; and display the virtual content with the adjusted field angle in the on-screen display area.
In the virtual content display apparatus provided by the embodiments of the present invention, the distance sensing module comprises:
a light-wave ranging sensor, configured to emit a physical light wave and receive the physical light wave reflected from the screen or the lens; and
the display control module is configured to determine the time difference between the emission time at which the light-wave ranging sensor emits the physical light wave and the reception time at which the light-wave ranging sensor receives the reflected physical light wave, and determine the distance between the screen and the lens according to the time difference.
In the virtual content display apparatus provided by the embodiments of the present invention, the distance sensing module comprises:
a signal transmitter, configured to send a physical signal;
a signal receiver, configured to receive the physical signal sent by the signal transmitter; and
the display control module is configured to determine the time difference between the sending time at which the signal transmitter sends the physical signal and the reception time at which the signal receiver receives the physical signal, and determine the distance between the screen and the lens according to the time difference.
In the virtual content display apparatus provided by the embodiments of the present invention, the display control module is configured to:
obtain the benchmark screen-lens distance between the screen and the lens;
determine the ratio between the distance between the screen and the lens and the benchmark screen-lens distance;
determine the visible area according to the distance ratio and the benchmark visible area corresponding to the benchmark screen-lens distance.
In the virtual content display apparatus provided by the embodiments of the present invention, the eye tracking module comprises:
an infrared emitter, configured to emit infrared light;
an infrared photosensor, configured to capture an eye image while the infrared emitter emits the infrared light; and
the display control module is configured to obtain the eye image captured by the infrared photosensor, and determine the position of the human eye relative to the lens according to the eye-region area and the eye shape in the eye image.
In the virtual content display apparatus provided by the embodiments of the present invention, the display control module is configured to:
obtain the benchmark position of the human eye relative to the lens;
determine the area difference between the eye-region area in the eye image and the benchmark eye-region area in the benchmark eye image corresponding to the benchmark position, and determine the distance between the human eye and the lens according to the area difference and the benchmark eye-lens distance between the human eye and the lens at the benchmark position;
determine the degree of shape difference between the eye shape in the eye image and the benchmark eye shape in the benchmark eye image corresponding to the benchmark position, and determine the angle between the human eye and the lens according to the degree of shape difference and the benchmark angle between the human eye and the lens at the benchmark position;
obtain the position of the human eye relative to the lens according to the distance and the angle.
In the virtual content display apparatus provided by the embodiments of the present invention, the display control module is configured to:
obtain the benchmark position of the human eye relative to the lens;
determine the degree of position difference between the position of the human eye relative to the lens and the benchmark position;
obtain the user field angle corresponding to the position according to the degree of position difference and the user field angle corresponding to the benchmark position.
An embodiment of the present invention provides a VR device, comprising: a memory, a processor and a computer program stored on the memory, wherein the processor implements the steps of the virtual content display method provided by the embodiments of the present invention when executing the computer program.
An embodiment of the present invention provides a non-volatile computer storage medium storing an executable program which, when executed by a processor, implements the steps of the virtual content display method provided by the embodiments of the present invention.
The embodiments of the present invention have the following beneficial effects:
In the embodiments of the present invention, the visible area of the lens on the screen can be determined from the distance between the screen and the lens, and the user field angle can be determined from the position of the human eye relative to the lens. The on-screen display area and the field angle of the virtual content can therefore be adaptively adjusted according to the determined visible area and user field angle, so that the on-screen display area matches the visible area and the field angle of the virtual content matches the user field angle. This reduces the probability of scene drift and effectively improves the visual effect of the VR device.
Brief description of the drawings
Fig. 1 is a schematic diagram of the optical architecture of a VR device provided in an embodiment of the present invention;
Fig. 2 is a schematic comparison of the visible area of the lens on the screen and the on-screen display area when the distance between the lens and the screen deviates from the benchmark screen-lens distance, provided in an embodiment of the present invention;
Fig. 3 is a schematic comparison of the user field angle and the field angle of the virtual content when the eye position deviates from the preset exit pupil position, provided in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the relationship between the user field angle, the field angle of the virtual content and the scene drift phenomenon, provided in an embodiment of the present invention;
Fig. 5A is a schematic flowchart of the virtual content display method provided in an embodiment of the present invention;
Fig. 5B is a schematic diagram of the relationship between the screen-lens distance and the visible area of the lens on the screen, provided in an embodiment of the present invention;
Fig. 5C is a schematic diagram of the relationship between the position of the human eye relative to the lens and the user field angle, provided in an embodiment of the present invention;
Fig. 6A is a schematic functional structure diagram of a virtual content display apparatus provided in an embodiment of the present invention;
Fig. 6B is a schematic functional structure diagram of another virtual content display apparatus provided in an embodiment of the present invention;
Fig. 6C is a schematic functional structure diagram of another virtual content display apparatus provided in an embodiment of the present invention;
Fig. 6D is a schematic functional structure diagram of another virtual content display apparatus provided in an embodiment of the present invention;
Fig. 7 is a schematic diagram of the hardware structure of the VR device provided in an embodiment of the present invention.
Detailed description of embodiments
At present, VR devices mainly fall into two types: those with a fixed viewing distance and those with an adjustable viewing distance. Compared with fixed-viewing-distance VR devices, adjustable-viewing-distance VR devices can better meet the needs of users who are nearsighted or farsighted. However, whether the viewing distance is fixed or adjustable, the problem that the user field angle does not match the field angle of the virtual content may arise, for two main reasons:
First, the assembly tolerance of the VR device.
As shown in Fig. 1, the optical architecture of most VR devices is a screen plus a lens. In practice, a VR device is generally assembled according to the preset benchmark screen-lens distance. However, because screens and lenses differ in specification and installation, the actual distance between the lens and the screen of most VR devices deviates from the benchmark screen-lens distance. As shown in Fig. 2, this deviation may cause the visible area of the lens on the screen to differ from the on-screen display area while the device is in use. Moreover, compared with fixed-viewing-distance VR devices, adjustable-viewing-distance VR devices allow the distance between the screen and the lens to be changed according to user demand, so the probability that the visible area of the lens on the screen and the on-screen display area do not match is even higher. In actual use, if the visible area of the lens on the screen does not match the on-screen display area, the user field angle is likely to differ from the field angle of the virtual content.
Second, deviations in how the user wears the device.
Normally, the virtual content shown to the user is rendered according to a preset benchmark field angle, where the benchmark field angle is the user field angle when the human eye is at the preset exit pupil position. As shown in Fig. 3, during actual use the eye position may deviate from the preset exit pupil position, which also causes the user field angle to differ from the field angle of the virtual content.
Whether the mismatch between the user field angle and the field angle of the virtual content is caused by the visible area of the lens on the screen differing from the on-screen display area, or by the eye position deviating from the preset exit pupil position, scene drift may occur when the user rotates their view. As shown in Fig. 4, when the field angle of the virtual content equals the user field angle there is no scene drift; when the field angle of the virtual content is larger than the user field angle the scene drifts backward; and when it is smaller than the user field angle the scene drifts forward.
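The relationship in Fig. 4 can be summarized with a short sketch (Python, for illustration only; the function name and the string labels are not from the patent):

```python
def drift_direction(content_fov_deg: float, user_fov_deg: float) -> str:
    """Classify scene drift from the two field angles, per the relationship above."""
    if content_fov_deg > user_fov_deg:
        return "scene drifts backward"   # content field angle larger than the user's
    if content_fov_deg < user_fov_deg:
        return "scene drifts forward"    # content field angle smaller than the user's
    return "no drift"                    # field angles match
```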
For this reason, the embodiments of the present invention determine the visible area of the lens on the screen from the distance between the screen and the lens and determine the on-screen display area from that visible area, determine the user field angle from the position of the human eye relative to the lens, adjust the field angle of the virtual content according to the user field angle, and display the adjusted virtual content in the on-screen display area. In this way, while the user is using the VR device, the on-screen display area and the field angle of the virtual content can be adaptively adjusted according to the acquired screen-lens distance and the position of the human eye relative to the lens, so that the on-screen display area matches the visible area and the field angle of the virtual content matches the user field angle. This reduces the probability of scene drift and effectively improves the visual effect of the VR device.
It should be noted that the virtual content display method provided by the embodiments of the present invention can be applied to a variety of virtual content display scenarios, for example VR games, VR films, VR education, road planning, venue simulation and real estate tours. The application scenarios mentioned above are given only to help understand the spirit and principles of the present invention, and the embodiments of the present invention are not limited in this respect; on the contrary, they can be applied to any applicable scenario in which virtual content is displayed.
Having briefly described the virtual content display method provided by the embodiments of the present invention and its application scenarios, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
To facilitate understanding of the present invention, some technical terms involved in the embodiments of the present invention are explained first.
1. VR device: a hardware product that presents an immersive experience using VR technology, for example a VR headset.
2. Physical light wave: an electromagnetic wave with a wavelength between 0.3 and 3 μm, for example infrared light, microwaves, ultrasonic waves and the like;
Physical signal: a carrier of a message, for example a voice signal, an electrical signal or a communication signal.
3. Benchmark screen-lens distance: the distance between the screen and the lens preset before the VR device is assembled;
Benchmark visible area of the lens on the screen: the visible area determined according to the benchmark screen-lens distance.
4. Benchmark position of the human eye relative to the lens: the preset optimal position of the human eye relative to the lens when the user uses the VR device;
Benchmark eye-lens distance: the horizontal distance between the human eye and the lens when the human eye is at the benchmark position;
Benchmark angle: the angle between the plane where the human eye is located and the lens when the human eye is at the benchmark position;
Benchmark eye-region area: the eye-region area in the eye image captured when the human eye is at the benchmark position;
Benchmark eye shape: the eye shape in the eye image captured when the human eye is at the benchmark position;
Benchmark user field angle: the user field angle determined according to the benchmark position.
Next, the virtual content display method provided by the embodiments of the present invention is described in detail. As shown in Fig. 5A, the flow of the virtual content display method of an exemplary embodiment of the present invention is as follows.
Step 501: obtain the distance between the screen and the lens of the VR device.
Specifically, step 501 may be performed in, but is not limited to, the following ways.
First way: emit a physical light wave through a light-wave ranging sensor and receive the physical light wave reflected from the screen or the lens; determine the time difference between the emission time at which the light-wave ranging sensor emits the physical light wave and the reception time at which it receives the reflected physical light wave; and determine the distance between the screen and the lens according to the time difference.
For example, assuming the light-wave ranging sensor and the screen are mounted on the same rigid structure, the sensor can emit infrared light and receive the infrared light reflected by the lens; after the time difference between the emission time of the infrared light and the reception time of the reflected infrared light is determined, the distance between the screen and the lens is determined according to the time difference, as sketched below.
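A minimal sketch of this time-of-flight calculation follows. It assumes the measured time difference covers the round trip from the sensor to the lens and back; the function name, variable names and the handling of the propagation speed are illustrative, not taken from the patent:

```python
def screen_lens_distance(emit_time_s: float, receive_time_s: float,
                         wave_speed_m_s: float = 299_792_458.0) -> float:
    """Distance between the screen and the lens from the round-trip time of the wave.

    wave_speed_m_s defaults to the speed of light (for infrared light); a much
    slower speed (e.g. the speed of sound) would be used for an ultrasonic wave.
    """
    time_diff_s = receive_time_s - emit_time_s
    # The wave travels from the sensor (mounted with the screen) to the lens
    # and back, so the one-way distance is half the round-trip path.
    return wave_speed_m_s * time_diff_s / 2.0
```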
Second way: send a physical signal through a signal transmitter, and receive the physical signal sent by the signal transmitter through a signal receiver; determine the time difference between the sending time at which the signal transmitter sends the physical signal and the reception time at which the signal receiver receives it; and determine the distance between the screen and the lens according to the time difference.
For example, assuming the signal transmitter and the screen are mounted on one rigid structure and the signal receiver and the lens on another, the signal transmitter can send an electrical signal and the signal receiver can receive it; after the time difference between the sending time of the electrical signal and its reception time is determined, the distance between the screen and the lens is determined according to that time difference.
Step 502: determine the visible area of the lens on the screen according to the distance, and determine the on-screen display area according to the visible area.
In practical applications, after the distance between the screen and the lens is determined, the visible area of the lens on the screen can be determined from that distance. For example, as shown in Fig. 5B, when the distance between the screen and the lens is distance 1, visible area 1 is determined from distance 1; when the distance is distance 2, visible area 2 is determined from distance 2. Specifically, the following way can be used, without being limited to it: first obtain the benchmark screen-lens distance between the screen and the lens, then determine the ratio between the distance between the screen and the lens and the benchmark screen-lens distance, and finally determine the visible area according to the distance ratio and the benchmark visible area corresponding to the benchmark screen-lens distance.
To improve the precision of the visible area, the visible area can also be calibrated with an area calibration coefficient. That is, when the visible area is determined from the distance ratio and the benchmark visible area corresponding to the benchmark screen-lens distance, the area calibration coefficient is used to adjust the size of the visible area, so that the finally determined visible area gives the user a better visual experience. The area calibration coefficient is obtained empirically; a sketch of this calculation is given below.
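A rough sketch of the calculation described above. The patent does not give the exact mapping from distance ratio to area, so the quadratic scaling of the area with the linear distance ratio is an assumption, as is the multiplicative use of the calibration coefficient:

```python
def visible_area_mm2(distance_mm: float,
                     benchmark_distance_mm: float,
                     benchmark_visible_area_mm2: float,
                     area_calibration_coeff: float = 1.0) -> float:
    """Estimate the lens's visible area on the screen from the measured
    screen-lens distance, the benchmark distance, the benchmark visible
    area and an empirical calibration coefficient."""
    ratio = distance_mm / benchmark_distance_mm
    # Assume the linear dimensions of the visible region scale with the
    # distance ratio, so the area scales with its square; then calibrate.
    return benchmark_visible_area_mm2 * ratio ** 2 * area_calibration_coeff
```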
Further, after the visible area of the lens on the screen is determined, the on-screen display area can be determined according to the visible area; in practical applications, the visible area can be used directly as the on-screen display area in which the virtual content is displayed.
Step 503: obtain the position of the human eye relative to the lens.
In practical applications, infrared light can be emitted through an infrared emitter, the eye image captured by an infrared photosensor while the infrared emitter emits the infrared light is obtained, and the position of the human eye relative to the lens is determined according to the eye-region area and the eye shape in the eye image.
Specifically, when the position of the human eye relative to the lens is determined according to the eye-region area and the eye shape in the eye image, the following way can be used, without being limited to it: obtain the benchmark position of the human eye relative to the lens; determine the area difference between the eye-region area in the eye image and the benchmark eye-region area in the benchmark eye image corresponding to the benchmark position, and determine the distance between the human eye and the lens according to the area difference and the benchmark eye-lens distance corresponding to the benchmark position; determine the degree of shape difference between the eye shape in the eye image and the benchmark eye shape in the benchmark eye image corresponding to the benchmark position, and determine the angle between the human eye and the lens according to the degree of shape difference and the benchmark angle between the human eye and the lens at the benchmark position; and obtain the position of the human eye relative to the lens from the determined distance and angle, as sketched below.
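A sketch of this eye-position estimate is given below. The patent states only that the area and shape differences are combined with the benchmark distance and benchmark angle; the scalar shape descriptor, the linear corrections and the two gain constants are illustrative assumptions:

```python
def eye_position(eye_area_px: float, eye_shape: float,
                 benchmark_area_px: float, benchmark_shape: float,
                 benchmark_distance_mm: float, benchmark_angle_deg: float,
                 distance_gain: float = 0.05, angle_gain: float = 10.0):
    """Estimate the eye's distance and angle relative to the lens from the
    differences between the measured and benchmark eye-region area/shape.

    eye_shape is assumed to be a scalar shape descriptor (e.g. the aspect
    ratio of the eye region); the gains would be fitted per device.
    """
    area_diff = eye_area_px - benchmark_area_px    # larger area -> eye is closer
    shape_diff = eye_shape - benchmark_shape       # shape change -> eye is off-axis
    distance_mm = benchmark_distance_mm - distance_gain * area_diff
    angle_deg = benchmark_angle_deg + angle_gain * shape_diff
    return distance_mm, angle_deg                  # the eye's position relative to the lens
```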
Step 504: determine the user field angle corresponding to the position, and adjust the field angle of the virtual content according to the user field angle.
In practical applications, after the position of the human eye relative to the lens is determined, the user field angle corresponding to that position can further be determined. For example, as shown in Fig. 5C, when the human eye is at position 1 relative to the lens, the user field angle determined from position 1 is the angle formed by the dashed lines; when the human eye is at position 2, the user field angle determined from position 2 is the angle formed by the solid lines; when the human eye is at position 3, the user field angle determined from position 3 is the angle formed by the dotted lines.
Specifically, the benchmark position of the human eye relative to the lens can first be obtained, then the degree of position difference between the position of the human eye relative to the lens and the benchmark position is determined, and finally the user field angle corresponding to the position of the human eye relative to the lens is obtained according to the degree of position difference and the user field angle corresponding to the benchmark position.
To improve the precision of the obtained user field angle, the user field angle can also be calibrated with a field-angle calibration factor. That is, when the user field angle corresponding to the position of the human eye relative to the lens is obtained according to the degree of position difference and the user field angle corresponding to the benchmark position, the field-angle calibration factor is used to adjust the size of that user field angle, so that the finally determined user field angle gives the user a better visual experience. The field-angle calibration factor is obtained empirically; a sketch of this step is given below.
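The calculation described in this step might look like the following sketch, where the linear mapping from position difference to field angle and the gain constant are assumptions; only the multiplicative calibration factor comes from the description above:

```python
def user_field_angle_deg(position_diff: float,
                         benchmark_field_angle_deg: float,
                         fov_gain_deg: float = 1.0,
                         fov_calibration_factor: float = 1.0) -> float:
    """Map the eye's deviation from the benchmark position to a user field
    angle, then apply the empirical field-angle calibration factor."""
    raw_angle = benchmark_field_angle_deg + fov_gain_deg * position_diff
    return raw_angle * fov_calibration_factor

# Per step 504, the virtual content's field angle is then set directly to the
# user field angle before the content is rendered.
```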
Further, after the user field angle is determined, the field angle of the virtual content can be adjusted according to the user field angle; in practical applications, the field angle of the virtual content can be set directly to the user field angle.
Step 505: display the virtual content with the adjusted field angle in the on-screen display area.
It should be noted that, in the virtual content display method of the exemplary embodiment of the present invention, the determination of the on-screen display area and the determination of the position of the human eye relative to the lens may be performed simultaneously or sequentially; that is, steps 501-502 and steps 503-504 may be executed in parallel or in order, which is not specifically limited here.
Based on the above embodiments, an embodiment of the present invention provides a virtual content display apparatus. As shown in Fig. 6A, the virtual content display apparatus of the exemplary embodiment of the present invention at least comprises:
a distance sensing module 610, configured to obtain the distance between the screen and the lens of the VR device;
an eye tracking module 620, configured to obtain the position of the human eye relative to the lens;
a display control module 630, configured to determine the visible area of the lens on the screen according to the distance obtained by the distance sensing module 610 and determine an on-screen display area according to the visible area; determine the user field angle corresponding to the position obtained by the eye tracking module 620 and adjust the field angle of the virtual content according to the user field angle; and display the virtual content with the adjusted field angle in the on-screen display area.
As shown in Fig. 6B, in the virtual content display apparatus of the exemplary embodiment of the present invention, the distance sensing module 610 may comprise:
a light-wave ranging sensor 611, configured to emit a physical light wave and receive the physical light wave reflected from the screen or the lens; and
the display control module 630 is configured to determine the time difference between the emission time at which the light-wave ranging sensor 611 emits the physical light wave and the reception time at which it receives the reflected physical light wave, and determine the distance between the screen and the lens according to the time difference.
It should be noted that, in a specific implementation, the light-wave ranging sensor 611 may be mounted on the same rigid structure as the screen or on the same rigid structure as the lens, which is not specifically limited here. Fig. 6B only shows the case where the light-wave ranging sensor 611 and the screen are on the same rigid structure.
As shown in Fig. 6C, in the virtual content display apparatus of the exemplary embodiment of the present invention, the distance sensing module 610 may comprise:
a signal transmitter 612, configured to send a physical signal;
a signal receiver 613, configured to receive the physical signal sent by the signal transmitter 612; and
the display control module 630 is configured to determine the time difference between the sending time at which the signal transmitter 612 sends the physical signal and the reception time at which the signal receiver 613 receives the physical signal, and determine the distance between the screen and the lens according to the time difference.
It should be noted that, in a specific implementation, the signal transmitter 612 may be mounted on the same rigid structure as the screen, with the signal receiver 613 on the same rigid structure as the lens; alternatively, the signal transmitter 612 may be on the same rigid structure as the lens, with the signal receiver 613 on the same rigid structure as the screen. The specific arrangement is not limited here. Fig. 6C only shows the case where the signal transmitter 612 and the screen share one rigid structure and the signal receiver 613 and the lens share another.
In a specific implementation, the display control module 630 is configured to:
obtain the benchmark screen-lens distance between the screen and the lens;
determine the ratio between the distance between the screen and the lens and the benchmark screen-lens distance;
determine the visible area according to the distance ratio and the benchmark visible area corresponding to the benchmark screen-lens distance.
As shown in Fig. 6D, in the virtual content display apparatus of the exemplary embodiment of the present invention, the eye tracking module 620 may comprise:
an infrared emitter 621, configured to emit infrared light;
an infrared photosensor 622, configured to capture an eye image while the infrared emitter 621 emits the infrared light; and
the display control module 630 is configured to obtain the eye image captured by the infrared photosensor 622, and determine the position of the human eye relative to the lens according to the eye-region area and the eye shape in the eye image.
It should be noted that, in a specific implementation, the infrared emitter 621 and the infrared photosensor 622 may be mounted on the same rigid structure as the lens either as an integrated device or as separate devices, which is not specifically limited here. Fig. 6D only shows the case where the infrared emitter 621 and the infrared photosensor 622 are separate devices on the same rigid structure as the lens.
In a specific implementation, the display control module 630 is configured to:
obtain the benchmark position of the human eye relative to the lens;
determine the area difference between the eye-region area in the eye image and the benchmark eye-region area in the benchmark eye image corresponding to the benchmark position, and determine the distance between the human eye and the lens according to the area difference and the benchmark eye-lens distance between the human eye and the lens at the benchmark position;
determine the degree of shape difference between the eye shape in the eye image and the benchmark eye shape in the benchmark eye image corresponding to the benchmark position, and determine the angle between the human eye and the lens according to the degree of shape difference and the benchmark angle between the human eye and the lens at the benchmark position;
obtain the position of the human eye relative to the lens according to the distance and the angle.
In a specific implementation, the display control module 630 is configured to:
obtain the benchmark position of the human eye relative to the lens;
determine the degree of position difference between the position of the human eye relative to the lens and the benchmark position;
obtain the user field angle corresponding to the position according to the degree of position difference and the user field angle corresponding to the benchmark position.
It should be noted that, since the principle by which the above virtual content display apparatus solves the technical problem is similar to that of the above virtual content display method, the implementation of the apparatus may refer to the implementation of the method, and repeated descriptions are omitted.
Having described the virtual content display method and apparatus of the exemplary embodiments of the present invention, the VR device of the exemplary embodiments of the present invention is briefly introduced next.
As shown in Fig. 7, the VR device 700 of the exemplary embodiment of the present invention may comprise a processor 71, a memory 72 and a computer program stored on the memory 72. When the processor 71 executes the computer program, the steps of the virtual content display method of the exemplary embodiments of the present invention are implemented.
It should be noted that the VR device 700 shown in Fig. 7 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present invention.
The VR device 700 of the exemplary embodiment of the present invention may also comprise a bus 73 connecting the different components (including the processor 71 and the memory 72). The bus 73 represents one or more of several types of bus structures, including a memory bus, a peripheral bus, a local bus and the like.
The memory 72 may comprise readable media in the form of volatile memory, such as a random access memory (RAM) 721 and/or a cache memory 722, and may further comprise a read-only memory (ROM) 723.
The memory 72 may also comprise a program means 725 having a set of (at least one) program modules 724, the program modules 724 including but not limited to: an operating subsystem, one or more application programs, other program modules and program data, each or some combination of which may include an implementation of a network environment.
The VR device 700 may also communicate with one or more external devices 74 (such as a keyboard or a remote control), with one or more devices that enable a user to interact with the VR device 700, and/or with any device (such as a router or modem) that enables the VR device 700 to communicate with one or more other VR devices 700. Such communication may take place through an input/output (I/O) interface 75. Moreover, the VR device 700 may also communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet, through a network adapter 76. As shown in Fig. 7, the network adapter 76 communicates with the other modules of the VR device 700 through the bus 73. It should be understood that, although not shown in Fig. 7, other hardware and/or software modules may be used in conjunction with the VR device 700, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, redundant arrays of independent disks (RAID) subsystems, tape drives, data backup storage subsystems and the like.
The non-volatile computer-readable storage medium of the exemplary embodiments of the present invention is introduced below. An embodiment of the present invention provides a non-volatile computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the steps of the above virtual content display method. Specifically, the executable program can be built into the VR device, so that the VR device can implement the steps of the above virtual content display method by executing the built-in executable program.
In addition, the virtual content display method provided by the embodiments of the present invention may also be implemented as a program product comprising program code. When the program product runs on a VR device, the program code causes the VR device to execute the steps of the above virtual content display method.
The program product provided by the embodiments of the present invention may employ any combination of one or more readable media, where a readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, a RAM, a ROM, an erasable programmable read-only memory (EPROM), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
The program product provided by the embodiments of the present invention may be a CD-ROM containing program code and run on a computing device. However, the program product is not limited to this; in the embodiments of the present invention, a readable storage medium may be any tangible medium that contains or stores a program, which may be used by or in combination with an instruction execution system, apparatus or device.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave and carrying readable program code. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device.
The program code contained on a readable medium may be transmitted over any suitable medium, including but not limited to wireless, wired, optical cable and the like, or any suitable combination of the above.
The program code for performing the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user device, partly on the user device, as a stand-alone software package, partly on the user device and partly on a remote computing device, or entirely on a remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, such as a LAN or WAN, or may be connected to an external computing device (for example through the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, this division is merely exemplary and not mandatory. In fact, according to the embodiments of the present invention, the features and functions of two or more units described above may be embodied in one unit; conversely, the features and functions of one unit described above may be further divided and embodied by multiple units.
In addition, although the operations of the method of the present invention are described in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed to achieve the desired results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce a manufactured article including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art can make additional changes and modifications to these embodiments once they know the basic inventive concept. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various modifications and variations to the embodiments of the present invention without departing from the spirit and scope of the embodiments of the present invention. In this way, if these modifications and variations of the embodiments of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these modifications and variations.

Claims (16)

1. A virtual content display method, characterized by comprising:
obtaining the distance between the screen and the lens of a virtual reality (VR) device;
determining the visible area of the lens on the screen according to the distance, and determining an on-screen display area according to the visible area;
obtaining the position of the human eye relative to the lens;
determining the user field angle corresponding to the position, and adjusting the field angle of virtual content according to the user field angle;
displaying the virtual content with the adjusted field angle in the on-screen display area.
2. The virtual content display method according to claim 1, characterized in that obtaining the distance between the screen and the lens of the VR device comprises:
emitting a physical light wave through a light-wave ranging sensor, and receiving the physical light wave reflected from the screen or the lens;
determining the time difference between the emission time at which the light-wave ranging sensor emits the physical light wave and the reception time at which the light-wave ranging sensor receives the reflected physical light wave;
determining the distance between the screen and the lens according to the time difference.
3. The virtual content display method according to claim 1, characterized in that obtaining the distance between the screen and the lens of the VR device comprises:
sending a physical signal through a signal transmitter, and receiving the physical signal sent by the signal transmitter through a signal receiver;
determining the time difference between the sending time at which the signal transmitter sends the physical signal and the reception time at which the signal receiver receives the physical signal;
determining the distance between the screen and the lens according to the time difference.
4. The virtual content display method according to claim 1, 2 or 3, characterized in that determining the visible area of the lens on the screen according to the distance comprises:
obtaining the benchmark screen-lens distance between the screen and the lens;
determining the ratio between the distance between the screen and the lens and the benchmark screen-lens distance;
obtaining the visible area according to the distance ratio and the benchmark visible area corresponding to the benchmark screen-lens distance.
5. The virtual content display method according to claim 1, characterized in that obtaining the position of the human eye relative to the lens comprises:
emitting infrared light through an infrared emitter;
obtaining the eye image captured by an infrared photosensor while the infrared emitter emits the infrared light;
determining the position of the human eye relative to the lens according to the eye-region area and the eye shape in the eye image.
6. The virtual content display method according to claim 5, characterized in that determining the position of the human eye relative to the lens according to the eye-region area and the eye shape in the eye image comprises:
obtaining the benchmark position of the human eye relative to the lens;
determining the area difference between the eye-region area in the eye image and the benchmark eye-region area in the benchmark eye image corresponding to the benchmark position, and determining the distance between the human eye and the lens according to the area difference and the benchmark eye-lens distance between the human eye and the lens at the benchmark position;
determining the degree of shape difference between the eye shape in the eye image and the benchmark eye shape in the benchmark eye image corresponding to the benchmark position, and determining the angle between the human eye and the lens according to the degree of shape difference and the benchmark angle between the human eye and the lens at the benchmark position;
obtaining the position of the human eye relative to the lens according to the distance and the angle between the human eye and the lens.
7. The virtual content display method according to any one of claims 1 to 6, characterized in that determining the user field angle corresponding to the position comprises:
obtaining the benchmark position of the human eye relative to the lens;
determining the degree of position difference between the position of the human eye relative to the lens and the benchmark position;
obtaining the user field angle corresponding to the position according to the degree of position difference and the user field angle corresponding to the benchmark position.
8. A virtual content display device, characterized by comprising:
A distance sensing module, configured to obtain the distance between the screen and the lens of a VR device;
A human eye tracking module, configured to obtain the position of the human eye relative to the lens;
A display control module, configured to: determine the visible area of the lens on the screen according to the distance obtained by the distance sensing module, and determine a screen display area according to the visible area; determine the user field angle corresponding to the position obtained by the human eye tracking module, and adjust the field angle of the virtual content according to the user field angle; and display the virtual content with the adjusted field angle on the screen display area.
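For orientation, a structural sketch of how the three modules of claim 8 might be wired together in software; the class and method names are hypothetical and the per-step calculations are stubbed (see the sketches after claims 3, 4, 6 and 7):

```python
# Structural sketch of the apparatus in claim 8; method bodies are stubs and
# the names are hypothetical, not patent terminology.

from typing import Tuple

class DistanceSensingModule:
    def screen_lens_distance_mm(self) -> float:
        raise NotImplementedError  # e.g. time-difference ranging (claims 9/10)

class EyeTrackingModule:
    def eye_position(self) -> Tuple[float, float, float]:
        raise NotImplementedError  # e.g. infrared eye-image analysis (claims 12/13)

class DisplayControlModule:
    def __init__(self, ranging: DistanceSensingModule, tracking: EyeTrackingModule):
        self.ranging = ranging
        self.tracking = tracking

    def refresh(self, virtual_content) -> None:
        # Claim 8: visible area -> screen display area, eye position -> field angle.
        distance = self.ranging.screen_lens_distance_mm()
        display_area = self.determine_display_area(distance)
        field_angle = self.determine_user_field_angle(self.tracking.eye_position())
        self.show(virtual_content, display_area, field_angle)

    # Placeholders for the calculations sketched after claims 4 and 7.
    def determine_display_area(self, distance_mm: float):
        raise NotImplementedError

    def determine_user_field_angle(self, eye_position):
        raise NotImplementedError

    def show(self, virtual_content, display_area, field_angle) -> None:
        raise NotImplementedError
```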
9. The virtual content display device according to claim 8, characterized in that the distance sensing module comprises:
A light wave distance measuring sensor, configured to emit a physical light wave and receive the physical light wave reflected by the screen or the lens; and
The display control module is configured to determine a time difference between the emission time at which the light wave distance measuring sensor emits the physical light wave and the reception time at which the light wave distance measuring sensor receives the reflected physical light wave, and to determine the distance between the screen and the lens according to the time difference.
10. The virtual content display device according to claim 8, characterized in that the distance sensing module comprises:
A signal sending end, configured to send a physical signal;
A signal receiving end, configured to receive the physical signal sent by the signal sending end; and
The display control module is configured to determine a time difference between the sending time at which the signal sending end sends the physical signal and the receiving time at which the signal receiving end receives the physical signal, and to determine the distance between the screen and the lens according to the time difference.
11. The virtual content display device according to claim 8, 9 or 10, characterized in that the display control module is configured to:
Obtain a reference screen-lens distance between the screen and the lens;
Determine a distance ratio between the distance between the screen and the lens and the reference screen-lens distance;
Determine the visible area according to the distance ratio and a reference visible area corresponding to the reference screen-lens distance.
12. The virtual content display device according to claim 8, characterized in that the human eye tracking module comprises:
An infrared emitter, configured to emit infrared light;
An infrared light sensing device, configured to capture a human eye image while the infrared emitter emits the infrared light; and
The display control module is configured to obtain the human eye image captured by the infrared light sensing device, and to determine the position of the human eye relative to the lens according to the area of the human eye region and the shape of the human eye in the human eye image.
13. The virtual content display device according to claim 12, characterized in that the display control module is configured to:
Obtain a reference position of the human eye relative to the lens;
Determine an area difference between the area of the human eye region in the human eye image and the area of the reference human eye region in a reference human eye image corresponding to the reference position, and determine the distance between the human eye and the lens according to the area difference and a reference human eye-lens distance corresponding to the reference position;
Determine a degree of shape difference between the human eye shape in the human eye image and the reference human eye shape in the reference human eye image corresponding to the reference position, and determine the angle between the human eye and the lens according to the degree of shape difference and a reference angle between the human eye and the lens corresponding to the reference position;
Obtain the position of the human eye relative to the lens according to the distance and the angle.
14. The virtual content display device according to claim 12 or 13, characterized in that the display control module is configured to:
Obtain a reference position of the human eye relative to the lens;
Determine a degree of position difference between the position of the human eye relative to the lens and the reference position;
Obtain the user field angle corresponding to the position according to the degree of position difference and a user field angle corresponding to the reference position.
15. A virtual reality (VR) device, characterized by comprising: a memory, a processor, and a computer program stored on the memory, wherein the processor, when executing the computer program, implements the steps of the virtual content display method according to any one of claims 1 to 7.
16. A non-volatile computer storage medium, characterized in that the non-volatile computer storage medium stores an executable program, and when the executable program is executed by a processor, the steps of the virtual content display method according to any one of claims 1 to 7 are implemented.
CN201810935326.1A 2018-08-16 2018-08-16 Virtual content display method and device, VR equipment and medium Active CN109189215B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810935326.1A CN109189215B (en) 2018-08-16 2018-08-16 Virtual content display method and device, VR equipment and medium

Publications (2)

Publication Number Publication Date
CN109189215A (en) 2019-01-11
CN109189215B CN109189215B (en) 2021-08-20

Family

ID=64918409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810935326.1A Active CN109189215B (en) 2018-08-16 2018-08-16 Virtual content display method and device, VR equipment and medium

Country Status (1)

Country Link
CN (1) CN109189215B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101950550A (en) * 2010-09-28 2011-01-19 冠捷显示科技(厦门)有限公司 Display device for displaying pictures at different angles based on visual angle of viewer
CN107037584A (en) * 2016-02-03 2017-08-11 深圳市易瞳科技有限公司 A kind of intelligent glasses perspective method and system
EP3240037A1 (en) * 2016-04-29 2017-11-01 LG Display Co., Ltd. Display for personal immersion apparatus
CN107844190A (en) * 2016-09-20 2018-03-27 腾讯科技(深圳)有限公司 Image presentation method and device based on Virtual Reality equipment
CN107610044A (en) * 2017-08-29 2018-01-19 歌尔科技有限公司 Image processing method, computer-readable recording medium and virtual reality helmet

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110308560A (en) * 2019-07-03 2019-10-08 南京玛克威信息科技有限公司 The control method of VR equipment
CN110308560B (en) * 2019-07-03 2022-09-30 南京玛克威信息科技有限公司 Control method of VR equipment
CN111240016A (en) * 2020-02-18 2020-06-05 北京京东方光电科技有限公司 Virtual reality glasses and adjusting device and adjusting method for display picture of virtual reality glasses
CN111596763A (en) * 2020-05-15 2020-08-28 京东方科技集团股份有限公司 Control method and device of virtual reality equipment
CN111596763B (en) * 2020-05-15 2023-12-26 京东方科技集团股份有限公司 Control method and device of virtual reality equipment
CN111562678B (en) * 2020-07-14 2020-12-08 深圳珑璟光电科技有限公司 Method for adjusting field angle and near-to-eye display device
WO2022012068A1 (en) * 2020-07-14 2022-01-20 深圳珑璟光电科技有限公司 Viewing angle adjusting method and near-eye display device
CN111562678A (en) * 2020-07-14 2020-08-21 深圳珑璟光电技术有限公司 Method for adjusting field angle and near-to-eye display device
US11709368B2 (en) 2020-07-14 2023-07-25 Shenzhen Lochn Optics Hi-Tech Co., Ltd. Method for adjusting field of view angle and near-eye display equipment
CN114648942A (en) * 2020-12-02 2022-06-21 深圳市奥拓电子股份有限公司 LED display screen, local brightness adjusting method thereof and LED display controller
CN113110908A (en) * 2021-04-20 2021-07-13 网易(杭州)网络有限公司 Display content adjusting method and device, computer equipment and storage medium
CN113110908B (en) * 2021-04-20 2023-05-30 网易(杭州)网络有限公司 Display content adjustment method, device, computer equipment and storage medium
CN114415368A (en) * 2021-12-15 2022-04-29 青岛歌尔声学科技有限公司 VR equipment regulation and control method, VR equipment regulation and control device, VR equipment system and storage medium
WO2023108744A1 (en) * 2021-12-15 2023-06-22 歌尔股份有限公司 Regulation and control method and apparatus for vr device, and vr device, system and storage medium

Also Published As

Publication number Publication date
CN109189215B (en) 2021-08-20

Similar Documents

Publication Publication Date Title
CN109189215A (en) A kind of virtual content display methods, device, VR equipment and medium
JP7335285B2 (en) Single Depth Tracking Accommodation-Binocular Diversion Solution
US11304017B2 (en) Reverberation fingerprint estimation
CN108701371B (en) Method and apparatus for providing virtual reality output and augmented reality output
KR102164723B1 (en) System and method for generating 3-d plenoptic video images
US8963956B2 (en) Location based skins for mixed reality displays
US20170221270A1 (en) Self calibration for smartphone goggles
US20160025982A1 (en) Smart transparency for holographic objects
US11423518B2 (en) Method and device of correcting image distortion, display device, computer readable medium, electronic device
WO2016191049A1 (en) Mixed-reality headset
WO2013155217A1 (en) Realistic occlusion for a head mounted augmented reality display
US11671784B2 (en) Determination of material acoustic parameters to facilitate presentation of audio content
WO2018000629A1 (en) Brightness adjustment method and apparatus
US20160041406A1 (en) Glasses with fluid-fillable membrane for adjusting focal length of one or more lenses of the glasses
US10846901B2 (en) Conversion of 2D diagrams to 3D rich immersive content
US20240037856A1 (en) Walkthrough view generation method, apparatus and device, and storage medium
US20170205628A1 (en) Wearable computing eyeglasses that provide unobstructed views
CN109814710A (en) Data processing method and device and virtual reality equipment
CN109996060A (en) A kind of virtual reality cinema system and information processing method
JP2023505235A (en) Virtual, Augmented, and Mixed Reality Systems and Methods
Yang et al. Augmented system for immersive 3D expansion and interaction
CN106445139A (en) Data display method, device and system
KR20160124985A (en) Method for providing mixed reality experience space
CN108986225A (en) Processing method and processing device, equipment when virtual reality device display scene
WO2018027015A1 (en) Single depth tracked accommodation-vergence solutions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant