WO2021227714A1 - Control method and apparatus for virtual reality device - Google Patents

Control method and apparatus for virtual reality device

Info

Publication number
WO2021227714A1
WO2021227714A1 (PCT/CN2021/085958)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
distance
lens
virtual reality
reality device
Prior art date
Application number
PCT/CN2021/085958
Other languages
English (en)
French (fr)
Inventor
刘梓荷
彭金豹
苗京花
王雪丰
王龙辉
李茜
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司, 北京京东方光电科技有限公司
Priority to US17/761,520 priority Critical patent/US20220365594A1/en
Publication of WO2021227714A1 publication Critical patent/WO2021227714A1/zh

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0081Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. enlarging, the entrance or exit pupil
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00Optical parts
    • G02C7/02Lenses; Lens systems ; Methods of designing lenses
    • G02C7/08Auxiliary lenses; Arrangements for varying focal length
    • G02C7/088Lens systems mounted to spectacles
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00Non-optical adjuncts; Attachment thereof
    • G02C11/10Electronic devices other than hearing aids

Definitions

  • The present disclosure relates to the field of virtual reality technology, and in particular to a control method and apparatus for a virtual reality device.
  • VR: virtual reality.
  • According to the vision information of the current user, the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras are adjusted.
  • Adjusting, according to the vision information of the current user, the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically includes: determining, according to the vision information, the image plane movement amount of the imaging plane of the lens relative to the eyes of the current user; and, according to that movement amount, adjusting the distance between the lens and the eyes while simultaneously adjusting the field of view of each virtual camera and the spacing between adjacent virtual cameras.
  • The simultaneous adjustment is performed synchronously, so that the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras reach their target values at the same time. The synchronous adjustment specifically includes:
  • according to the image plane movement amount, respectively determining the lens movement amount of the lens relative to the eyes of the current user, the field-of-view change amount of each virtual camera, and the distance change amount of the spacing between adjacent virtual cameras;
  • according to the lens movement amount and lens movement rate, the field-of-view change amount and change rate of each virtual camera, and the horizontal movement rate and distance change amount of each virtual camera, synchronously controlling the movement of the lens, the field-of-view change of each virtual camera, and the reverse movement of the adjacent virtual cameras.
  • The embodiments of the present disclosure also provide a virtual reality device, including a virtual reality device body and a processor. The processor is configured to: obtain the vision information of the current user using the virtual reality device; and, according to that vision information, adjust the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
  • The embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the control method of the virtual reality device.
  • Fig. 1 is a flowchart of a control method provided by an embodiment of the present invention;
  • Fig. 2 is a schematic diagram of a partial structure of a virtual reality device provided by an embodiment of the present invention;
  • Fig. 3 is a schematic diagram of the relationship between binocular disparity and object depth provided by an embodiment of the present invention;
  • Fig. 4a is an image taken by the left virtual camera before adjustment according to an embodiment of the present invention;
  • Fig. 4b is an image taken by the left virtual camera after adjustment according to an embodiment of the present invention;
  • Fig. 5a is an image taken by the right virtual camera before adjustment according to an embodiment of the present invention;
  • Fig. 5b is an image taken by the right virtual camera after adjustment according to an embodiment of the present invention;
  • Fig. 6a is an image taken when the spacing between the left and right virtual cameras is spacing 1, according to an embodiment of the present invention;
  • Fig. 6b is an image taken when the spacing between the left and right virtual cameras is spacing 2, according to an embodiment of the present invention;
  • Fig. 7a is a schematic diagram of some fields of view provided by an embodiment of the present invention;
  • Fig. 7b is a schematic diagram of other fields of view provided by an embodiment of the present invention;
  • Fig. 7c is a schematic diagram of still other fields of view provided by an embodiment of the present invention.
  • Virtual reality equipment is usually a product that integrates simulation technology with computer graphics, human-machine interface technology, multimedia technology, sensor technology, network technology, and other technologies; it is a brand-new means of human-computer interaction created with computers and sensor technology. Virtual reality equipment uses computational simulation to generate a three-dimensional virtual world, thereby providing users with simulated visual, auditory, and tactile sensations, making users feel as if they were on the scene and bringing them a brand-new experience.
  • In order to suit users with different degrees of myopia, virtual reality devices generally have a focal-length adjustment function. This function mainly adjusts the distance between the lens and the display screen via a gear on the virtual reality device, so that users with different degrees of myopia can all see a clear display picture.
  • However, adjusting the lens-screen distance pulls the virtual imaging plane of the lens within the user's distance of distinct vision, which shrinks the visible field-of-view content, so that content at the edge of the display screen is sacrificed. Moreover, the position of the imaging plane changes while the parallax remains unchanged, causing binocular image fusion errors; objects in the field of view may appear blurred or impossible to focus on. As a result, the user's immersion and the richness of the experience content are weakened, and the experience suffers.
  • The method for controlling a virtual reality device may include the following steps:
  • S100: acquiring the vision information of the current user using the virtual reality device;
  • S200: according to the vision information of the current user, adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
  • With this control method, the vision information of the current user of the virtual reality device can be obtained, so that the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras can be adjusted according to that vision information. This makes the virtual reality device suitable for users with different vision information.
  • Moreover, by adjusting the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, the image distance, the parallax, and the field of view in the VR experience can be coordinated. This solves the problems of lost field-of-view display and inability to focus when the image distance is reduced, and thereby optimizes the current user's experience of the virtual reality device.
  • the vision information may include vision data, for example, the degree of myopia.
  • The vision data of the current user of the virtual reality device can be obtained, and the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras can be determined according to that vision data. The virtual reality device can thus further suit people with different degrees of myopia, so that nearsighted users can clearly see the content on the display screen without wearing their usual myopia glasses, which does not hinder use and greatly satisfies users' needs.
  • The virtual reality device body may include a VR helmet, VR glasses, a motion seat, and the like.
  • the virtual reality device has lenses LS1 and LS2, the eye EY is located on the side of the lens LS1 away from the lens LS2, and the screen is located on the side of the lens LS2 away from the lens LS1.
  • The lenses LS1 and LS2 form one lens group, and the virtual reality device may include two lens groups, so that each eye corresponds to one lens group. The distance between the lens LS1 in the virtual reality device and each eye EY of the current user can thus be adjusted, which adjusts the position of the imaging screen.
  • the lens LS2 may be fixed.
  • the lens LS2 can also be made movable, which is not limited here.
  • a stepping motor can be used to control the distance BFL1 from the lens LS1 to the screen BP to change the object distance, thereby changing the image distance, and then changing the position of the imaging screen.
  • Fig. 3 represents the relationship between binocular disparity and object depth. As can be seen from Fig. 3, binocular disparity is inversely proportional to object depth. Therefore, the closer a point is to the imaging plane, the greater its disparity between the left and right virtual cameras, and the farther a point is from the imaging plane, the smaller that disparity.
  • the virtual reality device is provided with a virtual camera on the software side.
  • the main function of the virtual camera is to take pictures in the field of view according to the set rendering parameters in the virtual scene, and then render the obtained pictures on the screen.
  • the virtual camera has a variety of adjustable rendering parameters (such as field angle parameters, projection mode parameters, clipping plane parameters, depth parameters, etc.).
  • the virtual camera can also have rigid body properties and can be moved and rotated freely so as to be placed in any position in the virtual scene.
  • Both the built-in parameter attributes and the position attributes of the virtual camera can be controlled by scripts.
  • the virtual reality device may be provided with two virtual cameras. One of the virtual cameras corresponds to one eye of the user, and the other virtual camera corresponds to the other eye of the user. In practical applications, the working principle of the virtual camera can be basically the same as that in the related technology, and will not be repeated here.
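  • As a concrete illustration of the adjustable rendering parameters and scripted control described above, here is a minimal Python sketch; the field names and default values are hypothetical and not tied to any real engine API.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """Illustrative stand-in for the software-side virtual camera.

    The attribute set mirrors the adjustable rendering parameters named
    above (field of view, projection mode, clipping planes, depth) plus
    the freely movable position and rotation; names and defaults are
    hypothetical, not an actual engine API.
    """
    fov_deg: float = 60.0                 # field-of-view parameter
    projection: str = "perspective"       # projection mode parameter
    near_clip: float = 0.1                # clipping plane parameters
    far_clip: float = 1000.0
    depth: int = 0                        # render-order depth parameter
    position: tuple = (0.0, 0.0, 0.0)     # freely placeable in the scene
    rotation: tuple = (0.0, 0.0, 0.0)

    def move_horizontally(self, dx: float) -> None:
        """Script-controlled position attribute, as described above."""
        x, y, z = self.position
        self.position = (x + dx, y, z)

# One camera per eye, as in the two-camera setup described above.
left_cam = VirtualCamera(position=(-0.004, 0.0, 0.0))
right_cam = VirtualCamera(position=(+0.004, 0.0, 0.0))  # 8 mm apart
```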
  • The virtual reality device can have parameters in a default state: the distance between the lens LS1 and the user's eye EY has a default value BL_P, each virtual camera has a default field of view FOV_P, and adjacent virtual cameras have a default spacing Dis_P.
  • the parameters in the default state may be determined according to users who are not nearsighted or users who have a low degree of myopia (for example, the degree of myopia is less than 100 degrees).
  • the parameters in the default state can be designed and determined according to actual application requirements, and are not limited here.
  • Adjusting, according to the vision information of the current user, the distance between the lens in the virtual reality device and the current user's eyes, the field of view of each virtual camera, and the spacing between adjacent virtual cameras can specifically include: determining, according to the vision information, the image plane movement amount of the imaging plane of the lens relative to the eyes of the current user; and, according to that movement amount, adjusting the distance between the lens and the eyes of the current user while simultaneously adjusting the field of view of each virtual camera and the spacing between adjacent virtual cameras.
  • A relationship table between a number of different pieces of vision information and the image plane movement amount of the lens's imaging plane relative to the user's eyes is stored in advance. Suppose N pieces of vision information are pre-stored: Y-1, Y-2, Y-3, ..., Y-N. Each piece of vision information corresponds to one image plane movement amount: Y-1 corresponds to Δxz-1, Y-2 to Δxz-2, Y-3 to Δxz-3, and Y-N to Δxz-N. In specific implementation, the image plane movement amount of the lens's imaging plane relative to the current user's eyes can be determined by looking up this pre-stored table with the acquired vision information of the current user. Note that when the vision information is vision data, the pre-stored table maps the different vision data to image plane movement amounts.
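  • As a minimal Python sketch of this table lookup (the vision-data keys, the movement amounts, and the nearest-entry matching rule are all hypothetical stand-ins; the patent states only that such a table is pre-stored):

```python
# Hypothetical pre-stored relationship table: vision data (here, degrees
# of myopia) -> image plane movement amount (delta-xz, device units).
VISION_TO_SHIFT = {
    100: 1.0,  # Y-1 -> dxz-1
    200: 2.1,  # Y-2 -> dxz-2
    300: 3.3,  # Y-3 -> dxz-3
    400: 4.6,  # Y-4 -> dxz-4
}

def image_plane_shift(myopia_degree: float) -> float:
    """Look up the pre-stored movement amount for the nearest entry,
    mirroring the note that delta-z is set to a pre-stored value."""
    nearest = min(VISION_TO_SHIFT, key=lambda k: abs(k - myopia_degree))
    return VISION_TO_SHIFT[nearest]

delta_z = image_plane_shift(180.0)  # snaps to the 200-degree entry -> 2.1
```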
  • Simultaneously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras may specifically include: synchronously adjusting the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, so that all three reach their target values at the same time.
  • During adjustment, this improves the correlation among the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, which reduces the probability of image errors and increases image clarity, thereby better meeting users' needs and improving the user experience.
  • The synchronous adjustment of the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras may specifically include:
  • according to the image plane movement amount, respectively determining the lens movement amount of the lens relative to the eyes of the current user, the field-of-view change amount of each virtual camera, and the distance change amount of the spacing between adjacent virtual cameras;
  • according to the lens movement amount and lens movement rate, the field-of-view change amount and change rate of each virtual camera, and the horizontal movement rate and distance change amount of each virtual camera, synchronously controlling the movement of the lens, the field-of-view change of each virtual camera, and the reverse movement of adjacent virtual cameras.
  • the lens movement amount ⁇ ls1 of the lens LS1 relative to the current user's eye can be determined according to the image plane movement amount ⁇ z, so that the lens movement amount ⁇ ls1 can be adjusted by the distance BL P in the default state, and then the lens LS1 and the current
  • the field angle change amount 6 ⁇ z of the field of view of each virtual camera can be determined, so that the field angle change can be adjusted based on the field angle FOV P of the virtual camera in the default state
  • the distance change amount 0.02 ⁇ z of the distance between adjacent virtual cameras can be determined, so that the distance change amount 0.02 ⁇ z can be adjusted based on the distance Dis P of the virtual camera in the default state,
  • ⁇ z is set to the determined value of a pre-stored amount of image plane movement.
  • The stepping motor in the mechanical structure of the virtual reality device can be controlled to rotate according to the lens movement amount, so as to drive the lens to move. If the stepping motor rotates at angular velocity ω_0 and drives the gear at angular velocity ω_1, the rate at which the lens is driven, i.e., the lens movement rate v_Lens, can be obtained. Of course, in practical applications the specific value of v_Lens can be designed according to the requirements of the actual application, and is not limited here.
  • The field-of-view change rate satisfies v_FOV = 6·v_Lens, where v_Lens denotes the lens movement rate. This correlates v_FOV with v_Lens, so that the movement of the lens and the field-of-view change of each virtual camera can be controlled synchronously.
  • The horizontal movement rate of each virtual camera satisfies v_camera = 0.02·v_Lens, where v_Lens denotes the lens movement rate. This correlates v_camera with v_Lens, so that the movement of the lens and the reverse movement of adjacent virtual cameras can be controlled synchronously.
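  • The rate couplings above can be turned into a small synchronous control loop. One point worth spelling out: if all three changes proceed at constant rates and must finish at the same instant, the lens travel time Δls1/v_Lens must equal 6Δz/(6·v_Lens) = Δz/v_Lens, which implies Δls1 = Δz. The Python sketch below uses that implication; the tick size, units, and starting values are assumptions.

```python
def adjust_synchronously(delta_z, bl_p, fov_p, dis_p, v_lens=0.5, dt=0.01):
    """Drive lens distance, camera FOV, and camera spacing to their
    targets at the coupled rates, so all three arrive on the same tick.

    Implements v_FOV = 6*v_Lens and v_camera = 0.02*v_Lens from the text;
    delta_z, v_lens, and dt are in unspecified device units (assumption).
    """
    bl, fov, dis = bl_p, fov_p, dis_p
    moved = 0.0
    while moved < delta_z:                 # lens travels delta_z in total
        step = min(v_lens * dt, delta_z - moved)
        moved += step
        bl -= step                         # lens moves toward the eye
        fov -= 6 * step                    # v_FOV = 6*v_Lens, total 6*dz
        dis += 0.02 * step                 # v_camera = 0.02*v_Lens; the two
                                           # cameras separate, total 0.02*dz
    return bl, fov, dis  # = BL_P - dz, FOV_P - 6*dz, Dis_P + 0.02*dz
```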
  • The following describes the control method of the virtual reality device provided by the embodiments of the present invention with reference to a specific embodiment, taking vision data as the vision information by way of example.
  • the eyes of the current user who uses the virtual reality device are measured by the vision information acquisition device, so that vision data of the eyes of the current user can be acquired.
  • According to the vision data of the current user, the corresponding image plane movement amount is found from the pre-stored relationship table between different vision data and image plane movement amounts of the lens's imaging plane relative to the user's eyes; for example, if the found amount is Δxz-2, then Δz = Δxz-2.
  • the lens movement amount ⁇ ls1 of the lens LS1 relative to the eye of the current user can be determined according to the image plane movement amount ⁇ z.
  • the field angle change amount 6 ⁇ z of the field angle of each virtual camera can be determined.
  • the distance change amount 0.02 ⁇ z of the distance between adjacent virtual cameras can be determined.
  • The stepping motor in the mechanical structure of the virtual reality device can be controlled to rotate according to the lens movement amount, so as to drive the lens to move. If the stepping motor rotates at angular velocity ω_0 and drives the gear at angular velocity ω_1, the lens movement rate v_Lens that drives the lens is obtained.
  • When the lens stops moving, the fields of view of the two virtual cameras and the spacing between them also reach their target values at the same time: the adjusted target field of view is FOV_a = FOV_P − 6Δz, and the adjusted target spacing is Dis_a = Dis_P + 0.02Δz.
  • During adjustment, the field-of-view change corresponds to the virtual camera movement: for every 1° change in field of view, the relative position of the virtual cameras moves by a corresponding 300 mm.
  • Fig. 4a shows an image taken by the left virtual camera before adjustment
  • Fig. 4b shows an image taken by the left virtual camera after adjustment
  • Figure 5a shows the image taken by the virtual camera on the right before adjustment
  • Figure 5b shows the image taken by the virtual camera on the right after adjustment
  • Fig. 6a shows an image taken when the spacing between the left and right virtual cameras is spacing 1,
  • and Fig. 6b shows an image taken when the spacing between the left and right virtual cameras is spacing 2.
  • FOV1 represents the field of view of the virtual camera CAM1 on the left
  • FOV2 represents the field of view of the virtual camera CAM2 on the right.
  • Fig. 7a shows the field of view when the distance between two virtual cameras is 8mm and the field of view of each virtual camera is 60°.
  • Figure 7b shows the field of view when the distance between two virtual cameras is 6mm and the field of view of each virtual camera is 60°.
  • Fig. 7c shows the field of view when the distance between two virtual cameras is 8mm and the field of view of each virtual camera is 75°.
  • embodiments of the present invention also provide a virtual reality device, which may include a virtual reality device body and a processor.
  • The processor is configured to: obtain the vision information of the current user using the virtual reality device; and, according to that vision information, adjust the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
  • the processor may be integrated with the virtual reality device.
  • Other indispensable components of the virtual reality device are understood by those of ordinary skill in the art, and will not be repeated here, and should not be used as a limitation to the present disclosure.
  • embodiments of the present invention also provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the above-mentioned virtual reality device control method provided by the embodiment of the present invention is implemented.
  • the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage, etc.) containing computer-usable program codes.
  • the embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program codes.
  • These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in that memory produce an article of manufacture including an instruction device which implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A control method and apparatus for a virtual reality device, including: acquiring vision information of a current user using the virtual reality device (S100); and, according to the vision information of the current user, adjusting the distance between a lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras (S200).

Description

Control method and apparatus for virtual reality device
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to the Chinese patent application No. 202010414663.3, entitled "Control method and apparatus for virtual reality device" and filed with the China Patent Office on May 15, 2020, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to the field of virtual reality technology, and in particular to a control method and apparatus for a virtual reality device.
BACKGROUND
With the development of virtual reality (VR) technology, many virtual reality devices have appeared, allowing people to experience virtual scenes through a virtual reality device.
SUMMARY
A control method for a virtual reality device provided by embodiments of the present disclosure includes:
acquiring vision information of a current user using the virtual reality device; and
according to the vision information of the current user, adjusting the distance between a lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
Optionally, in embodiments of the present disclosure, the adjusting, according to the vision information of the current user, the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically includes:
according to the vision information of the current user, determining an image plane movement amount of the imaging plane of the lens relative to the eyes of the current user; and
according to the image plane movement amount, adjusting the distance between the lens in the virtual reality device and the eyes of the current user while simultaneously adjusting the field of view of each virtual camera and the spacing between adjacent virtual cameras.
Optionally, in embodiments of the present disclosure, the simultaneously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically includes:
synchronously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, so that the distance between the lens and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras reach their target values at the same time.
Optionally, in embodiments of the present disclosure, the synchronously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically includes:
according to the image plane movement amount, respectively determining a lens movement amount of the lens relative to the eyes of the current user, a field-of-view change amount of the field of view of each virtual camera, and a distance change amount of the spacing between adjacent virtual cameras; and
according to the lens movement amount and a lens movement rate, the field-of-view change amount and a field-of-view change rate of each virtual camera, and a horizontal movement rate of each virtual camera and the distance change amount, synchronously controlling the movement of the lens, the field-of-view change of each virtual camera, and the reverse movement of the adjacent virtual cameras.
Optionally, in embodiments of the present disclosure, the field-of-view change rate v_FOV satisfies the formula v_FOV = 6·v_Lens, where v_Lens denotes the lens movement rate.
Optionally, in embodiments of the present disclosure, the horizontal movement rate v_camera of the virtual camera satisfies the formula v_camera = 0.02·v_Lens, where v_Lens denotes the lens movement rate.
Optionally, in embodiments of the present disclosure, the adjusted target value FOV_a of the field of view of the virtual camera satisfies the formula FOV_a = FOV_P − 6Δz, where FOV_P denotes the field of view of the virtual camera in the default state and Δz denotes the image plane movement amount.
Optionally, in embodiments of the present disclosure, the adjusted target value Dis_a of the spacing between adjacent virtual cameras satisfies the formula Dis_a = Dis_P + 0.02Δz, where Dis_P denotes the spacing of the virtual cameras in the default state and Δz denotes the image plane movement amount.
Embodiments of the present disclosure also provide a virtual reality device, including:
a virtual reality device body; and
a processor configured to:
acquire vision information of a current user using the virtual reality device; and
according to the vision information of the current user, adjust the distance between a lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
Embodiments of the present disclosure also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above control method for a virtual reality device.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a flowchart of a control method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of a partial structure of a virtual reality device provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the relationship between binocular disparity and object depth provided by an embodiment of the present invention;
Fig. 4a is an image taken by the left virtual camera before adjustment according to an embodiment of the present invention;
Fig. 4b is an image taken by the left virtual camera after adjustment according to an embodiment of the present invention;
Fig. 5a is an image taken by the right virtual camera before adjustment according to an embodiment of the present invention;
Fig. 5b is an image taken by the right virtual camera after adjustment according to an embodiment of the present invention;
Fig. 6a is an image taken when the spacing between the left and right virtual cameras is spacing 1, according to an embodiment of the present invention;
Fig. 6b is an image taken when the spacing between the left and right virtual cameras is spacing 2, according to an embodiment of the present invention;
Fig. 7a is a schematic diagram of some fields of view provided by an embodiment of the present invention;
Fig. 7b is a schematic diagram of other fields of view provided by an embodiment of the present invention;
Fig. 7c is a schematic diagram of still other fields of view provided by an embodiment of the present invention.
DETAILED DESCRIPTION
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The embodiments of the present disclosure and the features in the embodiments may be combined with each other where no conflict arises. Based on the described embodiments, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present disclosure.
Unless otherwise defined, technical or scientific terms used in the present disclosure shall have the ordinary meaning understood by a person of ordinary skill in the art to which the present disclosure belongs. "First", "second", and similar words used in the present disclosure do not denote any order, quantity, or importance, but are merely used to distinguish different components. Words such as "include" or "comprise" mean that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. Words such as "connect" or "connected" are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
Note that the sizes and shapes of the figures in the drawings do not reflect true scale; they are only intended to illustrate the content of the present invention. Throughout, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions.
A virtual reality device is usually a product that integrates simulation technology with computer graphics, human-machine interface technology, multimedia technology, sensor technology, network technology, and other technologies; it is a brand-new means of human-computer interaction created with computers and sensor technology. A virtual reality device uses computational simulation to generate a three-dimensional virtual world, thereby providing the user with simulated visual, auditory, and tactile sensations, making the user feel as if present on the scene and bringing a brand-new experience.
As virtual reality devices become more and more widely used, improving the user experience and adding practical functions is one of the current directions of their development. Usually, in order to suit users with different degrees of myopia, virtual reality devices have a focal-length adjustment function: a gear on the device adjusts the distance between the lens and the display screen, so that users with different degrees of myopia can all see a clear display picture.
However, adjusting the distance between the lens and the display screen pulls the virtual imaging plane of the lens within the user's distance of distinct vision, which shrinks the visible field-of-view content, so that content at the edge of the display picture is sacrificed. Moreover, the position of the imaging plane changes while the parallax remains unchanged, causing binocular image fusion errors; objects in the field of view may appear blurred or impossible to focus on. This weakens the user's sense of immersion and the richness of the experience content, degrading the experience.
In view of this, the control method for a virtual reality device provided by embodiments of the present invention, as shown in Fig. 1, may include the following steps:
S100: acquiring vision information of a current user using the virtual reality device;
S200: according to the vision information of the current user, adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
With this control method, the vision information of the current user of the virtual reality device can be acquired, so that the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras can be adjusted according to that vision information. This makes the virtual reality device suitable for users with different vision information. Furthermore, by adjusting the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, the image distance, the parallax, and the field of view in the VR experience can be coordinated, which solves the problems of lost field-of-view display and inability to focus when the image distance is reduced, thereby optimizing the current user's experience of the virtual reality device.
In specific implementation, in embodiments of the present invention, the vision information may include vision data, for example, the degree of myopia. The vision data of the current user of the virtual reality device can thus be acquired, and the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras can be adjusted according to that vision data. The virtual reality device can thereby further suit people with different degrees of myopia, so that nearsighted users can clearly see the content on the display without wearing their usual myopia glasses, which does not hinder use and greatly satisfies users' needs.
In some examples, the virtual reality device body may include a VR helmet, VR glasses, a motion seat, and the like; the present disclosure places no particular limitation on the device body. As shown in Fig. 2, the virtual reality device has lenses LS1 and LS2; the eye EY is located on the side of lens LS1 away from lens LS2, and the screen is located on the side of lens LS2 away from lens LS1. Illustratively, lenses LS1 and LS2 form one lens group, and the virtual reality device may include two lens groups, so that each eye corresponds to one lens group; by adjusting the distance between the lens LS1 and each eye EY of the current user, the position of the imaging screen can be adjusted. Illustratively, lens LS2 may be fixed; of course, lens LS2 may also be movable, which is not limited here.
As shown in Fig. 2, a stepping motor can be used to control the distance BFL1 from lens LS1 to the screen BP to change the object distance, thereby changing the image distance and, in turn, the position of the imaging screen. Fig. 3 represents the relationship between binocular disparity and object depth; as can be seen from Fig. 3, binocular disparity is inversely proportional to object depth. Therefore, the closer a point is to the imaging plane, the greater its disparity between the left and right virtual cameras, and the farther a point is from the imaging plane, the smaller that disparity.
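The inverse relationship shown in Fig. 3 is consistent with the standard pinhole stereo model (an illustrative assumption, since the patent states only the proportionality, not this formula): for two parallel cameras with baseline b and focal length f, a point at depth Z yields a disparity

```latex
% Standard pinhole stereo relation (illustrative assumption, not stated
% in the patent): disparity d of a point at depth Z, for two parallel
% cameras with baseline b and focal length f.
d = \frac{b f}{Z} \quad\Longrightarrow\quad d \propto \frac{1}{Z}
```

Nearby points thus produce large disparity and distant points small disparity, which is the behavior that adjusting the camera spacing (the baseline b) exploits.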
In some examples, the virtual reality device is provided with software-side virtual cameras. The main function of a virtual camera is to capture the picture in the field of view in the virtual scene according to the set rendering parameters, and then render the captured picture on the screen. Illustratively, the virtual camera has a variety of adjustable rendering parameters (such as field-of-view parameters, projection mode parameters, clipping plane parameters, depth parameters, etc.). Moreover, the virtual camera can have rigid-body properties and can be moved and rotated freely, so as to be placed at any position in the virtual scene. Furthermore, both the built-in parameter attributes and the position attributes of the virtual camera can be controlled by scripts. In some examples, the virtual reality device may be provided with two virtual cameras, one corresponding to each of the user's eyes. In practical applications, the working principle of the virtual camera can be basically the same as in the related art, which is not repeated here.
Usually, some people are nearsighted and some are not, so the virtual reality device needs to suit most people. In practical applications, the virtual reality device can have parameters in a default state: for example, the distance between the lens LS1 and the user's eye EY has a default value BL_P, each virtual camera has a default field of view FOV_P, and adjacent virtual cameras have a default spacing Dis_P. The default parameters may be determined according to users who are not nearsighted or whose degree of myopia is low (for example, below 100 degrees). In this way, when the acquired vision data of the current user corresponds to a non-nearsighted user or a user with low myopia, no extra adjustment is needed, which reduces power consumption. In practical applications, the default parameters can be designed according to actual application requirements and are not limited here.
In specific implementation, in embodiments of the present invention, adjusting, according to the vision information of the current user, the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras may specifically include:
according to the vision information of the current user, determining the image plane movement amount of the imaging plane of the lens relative to the eyes of the current user; and
according to the image plane movement amount, adjusting the distance between the lens in the virtual reality device and the eyes of the current user while simultaneously adjusting the field of view of each virtual camera and the spacing between adjacent virtual cameras.
Illustratively, a relationship table between a number of different pieces of vision information and the image plane movement amount of the lens's imaging plane relative to the user's eyes is stored in advance. For example, N pieces of vision information are pre-stored: Y-1, Y-2, Y-3, ..., Y-N. Each piece of vision information corresponds to one image plane movement amount: Y-1 corresponds to Δxz-1, Y-2 to Δxz-2, Y-3 to Δxz-3, and Y-N to Δxz-N. In specific implementation, the image plane movement amount of the lens's imaging plane relative to the current user's eyes can be determined by looking up the pre-stored table with the acquired vision information of the current user. Note that when the vision information is vision data, the pre-stored table maps the different vision data to image plane movement amounts.
In specific implementation, in embodiments of the present invention, simultaneously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras may specifically include: synchronously adjusting the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, so that all three reach their target values at the same time. In this way, during adjustment, the correlation among the lens-eye distance, the field of view of each virtual camera, and the spacing between adjacent virtual cameras is improved, which reduces the probability of image errors and increases image clarity, thus better meeting users' needs and improving the user experience.
In specific implementation, in embodiments of the present invention, synchronously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras may specifically include:
according to the image plane movement amount, respectively determining the lens movement amount of the lens relative to the eyes of the current user, the field-of-view change amount of each virtual camera, and the distance change amount of the spacing between adjacent virtual cameras; and
according to the lens movement amount and lens movement rate, the field-of-view change amount and change rate of each virtual camera, and the horizontal movement rate and distance change amount of each virtual camera, synchronously controlling the movement of the lens, the field-of-view change of each virtual camera, and the reverse movement of adjacent virtual cameras.
Illustratively, the lens movement amount Δls1 of lens LS1 relative to the current user's eye can be determined from the image plane movement amount Δz; the default distance BL_P is then adjusted by Δls1, giving the adjusted target distance between lens LS1 and the current user's eye EY: BL_a = BL_P − Δls1.
Illustratively, from the image plane movement amount Δz, the field-of-view change amount 6Δz of each virtual camera can be determined; the default field of view FOV_P is then adjusted by 6Δz, giving the adjusted target field of view: FOV_a = FOV_P − 6Δz.
Illustratively, from the image plane movement amount Δz, the distance change amount 0.02Δz of the spacing between adjacent virtual cameras can be determined; the default spacing Dis_P is then adjusted by 0.02Δz, giving the adjusted target spacing: Dis_a = Dis_P + 0.02Δz.
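As a minimal Python sketch of these three target-value formulas (Δls1 is kept as an explicit input, since the patent does not give its mapping from Δz; all units are the patent's unspecified device units, and the numbers in the demo call are hypothetical):

```python
def target_values(delta_z, delta_ls1, bl_p, fov_p, dis_p):
    """Adjusted targets per the formulas above (units as in the patent)."""
    bl_a = bl_p - delta_ls1           # BL_a  = BL_P  - dls1
    fov_a = fov_p - 6 * delta_z       # FOV_a = FOV_P - 6*dz
    dis_a = dis_p + 0.02 * delta_z    # Dis_a = Dis_P + 0.02*dz
    return bl_a, fov_a, dis_a

# Hypothetical numbers: dz = 1, dls1 = 1, 60-degree default FOV, 8 mm spacing.
print(target_values(1.0, 1.0, 15.0, 60.0, 8.0))  # -> (14.0, 54.0, 8.02)
```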
Note that when, according to the vision information of the current user, the image plane movement amount corresponding to the current user is determined to be one of the pre-stored image plane movement amounts Δxz-1, Δxz-2, Δxz-3, ..., Δxz-N, Δz is set to that pre-stored value. For example, if the current user corresponds to the image plane movement amount Δxz-2, then Δz = Δxz-2. The other cases are analogous and are not repeated here.
In some examples, the stepping motor in the mechanical structure of the virtual reality device can be controlled to rotate according to the lens movement amount, so as to drive the lens to move. If the stepping motor rotates at angular velocity ω_0 and drives the gear at angular velocity ω_1, the rate at which the lens is driven, i.e., the lens movement rate v_Lens, can be obtained. Of course, in practical applications the specific value of v_Lens can be designed according to the requirements of the actual application and is not limited here.
In some examples, the field-of-view change rate v_FOV can satisfy the formula v_FOV = 6·v_Lens, where v_Lens denotes the lens movement rate. This correlates v_FOV with v_Lens, so that the movement of the lens and the field-of-view change of each virtual camera can be controlled synchronously.
In some examples, the horizontal movement rate v_camera of the virtual camera satisfies the formula v_camera = 0.02·v_Lens, where v_Lens denotes the lens movement rate. This correlates v_camera with v_Lens, so that the movement of the lens and the reverse movement of adjacent virtual cameras can be controlled synchronously.
In specific implementation, by making v_FOV = 6·v_Lens and v_camera = 0.02·v_Lens, the field-of-view change rate of each virtual camera, the horizontal movement rate of the virtual cameras, and the lens movement rate are all correlated, so that the lens movement, the field-of-view change of each virtual camera, and the reverse movement of adjacent virtual cameras can be controlled synchronously and in coordination.
The control method for a virtual reality device provided by embodiments of the present invention is described below with reference to a specific embodiment, taking vision data as the vision information by way of example.
The control method may include the following steps:
(1) The eyes of the current user of the virtual reality device are measured by a vision information acquisition apparatus, so that the vision data of the current user's eyes can be acquired.
(2) According to the vision data of the current user, the corresponding image plane movement amount is found from the pre-stored relationship table between different vision data and image plane movement amounts of the lens's imaging plane relative to the user's eyes. For example, if the found image plane movement amount is Δxz-2, then Δz = Δxz-2.
(3) From the image plane movement amount Δz, the lens movement amount Δls1 of lens LS1 relative to the current user's eye is determined. Also from Δz, the field-of-view change amount 6Δz of each virtual camera is determined, as is the distance change amount 0.02Δz of the spacing between adjacent virtual cameras.
(4) According to the lens movement amount, the stepping motor in the mechanical structure of the virtual reality device is controlled to rotate, driving the lens to move. If the stepping motor rotates at angular velocity ω_0 and drives the gear at angular velocity ω_1, the lens movement rate v_Lens that drives the lens is obtained.
While the lens moves, the fields of view of the two virtual cameras are changed linearly at the field-of-view change rate v_FOV (for example, v_FOV = 6·v_Lens).
And while the lens moves and the fields of view change, the two virtual cameras are moved in opposite directions at the horizontal movement rate v_camera (i.e., v_camera = 0.02·v_Lens), linearly changing the spacing between them so that the focal point falls on the imaging plane.
When the lens stops moving, the fields of view of the two virtual cameras and the spacing between them also reach their target values at the same time: the adjusted target field of view is FOV_a = FOV_P − 6Δz, and the adjusted target spacing is Dis_a = Dis_P + 0.02Δz.
Illustratively, during adjustment, the field-of-view change corresponds to the virtual camera movement: for every 1° change in field of view, the relative position of the virtual cameras moves by a corresponding 300 mm.
Moreover, as shown in Figs. 4a to 6b: Fig. 4a shows the image taken by the left virtual camera before adjustment, and Fig. 4b the image taken by the left virtual camera after adjustment; Fig. 5a shows the image taken by the right virtual camera before adjustment, and Fig. 5b the image after adjustment; Fig. 6a shows the image taken when the spacing between the left and right virtual cameras is spacing 1, and Fig. 6b the image when that spacing is spacing 2. FOV1 denotes the field of view of the left virtual camera CAM1, and FOV2 that of the right virtual camera CAM2. As Figs. 4a to 6b show, adjusting the fields of view and the spacing of the two virtual cameras can mitigate the problems of mismatched parallax and a shrinking field of view.
Moreover, as shown in Figs. 7a to 7c: Fig. 7a shows the field of view when the spacing between the two virtual cameras is 8 mm and each camera's field of view is 60°; Fig. 7b shows the field of view for a spacing of 6 mm and a field of view of 60°; and Fig. 7c shows the field of view for a spacing of 8 mm and a field of view of 75°. As Figs. 7a to 7c show, adjusting the fields of view and the spacing of the two virtual cameras changes the visible field, which can mitigate the problems of mismatched parallax and a shrinking field of view.
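To get a rough quantitative feel for the configurations in Figs. 7a to 7c, the combined horizontal coverage of two parallel pinhole cameras at a viewing distance D can be approximated as the camera spacing plus each camera's own coverage 2·D·tan(FOV/2). This model and the distance D are assumptions for illustration, not the patent's computation:

```python
import math

def binocular_width(spacing_mm: float, fov_deg: float, d_mm: float) -> float:
    """Approximate combined horizontal coverage of two parallel pinhole
    cameras at viewing distance d_mm (illustrative model, not from the
    patent)."""
    per_camera = 2 * d_mm * math.tan(math.radians(fov_deg) / 2)
    return spacing_mm + per_camera

D = 100.0  # hypothetical viewing distance in mm
for spacing, fov in [(8, 60), (6, 60), (8, 75)]:  # Figs. 7a, 7b, 7c
    print(f"{spacing} mm, {fov} deg -> {binocular_width(spacing, fov, D):.1f} mm")
# 8 mm, 60 deg -> 123.5 mm; 6 mm, 60 deg -> 121.5 mm; 8 mm, 75 deg -> 161.5 mm
```

Under this rough model, widening the field of view enlarges the covered region far more than a few millimetres of spacing do, which is consistent with the comparison in the figures.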
Based on the same inventive concept, embodiments of the present invention also provide a virtual reality device, which may include a virtual reality device body and a processor. The processor is configured to: acquire the vision information of the current user using the virtual reality device; and, according to that vision information, adjust the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
Illustratively, the processor may be integrated with the virtual reality device. The other indispensable components of the virtual reality device are understood by a person of ordinary skill in the art, are not described here, and should not be taken as limiting the present disclosure.
Based on the same inventive concept, embodiments of the present invention also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the above control method for a virtual reality device provided by embodiments of the present invention are implemented. Specifically, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage, etc.) containing computer-usable program code.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations thereof, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing equipment produce a device for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in that memory produce an article of manufacture including an instruction device which implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the present invention have been described, those skilled in the art can make additional changes and modifications to these embodiments once they learn the basic inventive concept. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various changes and variations to the embodiments of the present invention without departing from their spirit and scope. If these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.

Claims (10)

  1. A control method for a virtual reality device, comprising:
    acquiring vision information of a current user using the virtual reality device; and
    according to the vision information of the current user, adjusting the distance between a lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
  2. The control method for a virtual reality device according to claim 1, wherein the adjusting, according to the vision information of the current user, the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically comprises:
    according to the vision information of the current user, determining an image plane movement amount of the imaging plane of the lens relative to the eyes of the current user; and
    according to the image plane movement amount, adjusting the distance between the lens in the virtual reality device and the eyes of the current user while simultaneously adjusting the field of view of each virtual camera and the spacing between adjacent virtual cameras.
  3. The control method for a virtual reality device according to claim 2, wherein the simultaneously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically comprises:
    synchronously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras, so that the distance between the lens and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras reach their target values at the same time.
  4. The control method for a virtual reality device according to claim 3, wherein the synchronously adjusting the distance between the lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras specifically comprises:
    according to the image plane movement amount, respectively determining a lens movement amount of the lens relative to the eyes of the current user, a field-of-view change amount of the field of view of each virtual camera, and a distance change amount of the spacing between adjacent virtual cameras; and
    according to the lens movement amount and a lens movement rate, the field-of-view change amount and a field-of-view change rate of each virtual camera, and a horizontal movement rate of each virtual camera and the distance change amount, synchronously controlling the movement of the lens, the field-of-view change of each virtual camera, and the reverse movement of the adjacent virtual cameras.
  5. The control method for a virtual reality device according to claim 4, wherein the field-of-view change rate v_FOV satisfies the formula v_FOV = 6·v_Lens, where v_Lens denotes the lens movement rate.
  6. The control method for a virtual reality device according to claim 4 or 5, wherein the horizontal movement rate v_camera of the virtual camera satisfies the formula v_camera = 0.02·v_Lens, where v_Lens denotes the lens movement rate.
  7. The control method for a virtual reality device according to any one of claims 2-6, wherein the adjusted target value FOV_a of the field of view of the virtual camera satisfies the formula FOV_a = FOV_P − 6Δz, where FOV_P denotes the field of view of the virtual camera in the default state and Δz denotes the image plane movement amount.
  8. The control method for a virtual reality device according to any one of claims 2-7, wherein the adjusted target value Dis_a of the spacing between adjacent virtual cameras satisfies the formula Dis_a = Dis_P + 0.02Δz, where Dis_P denotes the spacing of the virtual cameras in the default state and Δz denotes the image plane movement amount.
  9. A virtual reality device, comprising:
    a virtual reality device body; and
    a processor configured to:
    acquire vision information of a current user using the virtual reality device; and
    according to the vision information of the current user, adjust the distance between a lens in the virtual reality device and the eyes of the current user, the field of view of each virtual camera, and the spacing between adjacent virtual cameras.
  10. A computer-readable storage medium on which a computer program is stored, wherein, when the computer program is executed by a processor, the steps of the control method for a virtual reality device according to any one of claims 1-8 are implemented.
PCT/CN2021/085958 2020-05-15 2021-04-08 Control method and apparatus for virtual reality device WO2021227714A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/761,520 US20220365594A1 (en) 2020-05-15 2021-04-08 Control method and apparatus for virtual reality device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010414663.3A 2020-05-15 2020-05-15 Control method and apparatus for virtual reality device
CN202010414663.3 2020-05-15

Publications (1)

Publication Number Publication Date
WO2021227714A1 true WO2021227714A1 (zh) 2021-11-18

Family

ID=72189732

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/085958 Control method and apparatus for virtual reality device 2020-05-15 2021-04-08

Country Status (3)

Country Link
US (1) US20220365594A1 (zh)
CN (1) CN111596763B (zh)
WO (1) WO2021227714A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111596763B (zh) * 2020-05-15 2023-12-26 京东方科技集团股份有限公司 Control method and apparatus for virtual reality device
CN115413999A (zh) * 2022-09-09 2022-12-02 天津新视光技术有限公司 Control method and control apparatus for a VR device for visual function diagnosis and treatment, and VR device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170340200A1 (en) * 2014-05-29 2017-11-30 Vivid Vision, Inc. Interactive System for Vision Assessment and Correction
US20180275367A1 (en) * 2017-03-21 2018-09-27 Nhn Entertainment Corporation Method and system for adjusting focusing length to enhance vision
CN109283997A (zh) * 2017-07-20 2019-01-29 中兴通讯股份有限公司 Display method, apparatus and system
CN109521871A (zh) * 2018-10-22 2019-03-26 广州视景医疗软件有限公司 Fusion function training method, apparatus, device, and storage medium
CN109964167A (zh) * 2016-10-28 2019-07-02 依视路国际公司 Method for determining an eye parameter of a user of a display device
CN111596763A (zh) * 2020-05-15 2020-08-28 京东方科技集团股份有限公司 Control method and apparatus for virtual reality device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205903239U (zh) * 2016-06-02 2017-01-25 北京伟世万联科技有限公司 Virtual-reality-based vision examination and training apparatus
US11099385B2 (en) * 2016-12-30 2021-08-24 Intel Corporation Virtual reality (VR) system with nearsightedness optometry adjustment
CN107049721A (zh) * 2017-02-14 2017-08-18 合肥中感微电子有限公司 Vision correction method and apparatus
CN107167924B (zh) * 2017-07-24 2019-07-19 京东方科技集团股份有限公司 Virtual reality device and lens adjustment method for the virtual reality device
EP3740809A4 (en) * 2017-11-01 2021-12-15 Vrgineers, Inc. INTERACTIVE AUGMENTED OR VIRTUAL REALITY DEVICES
CN109189215B (zh) * 2018-08-16 2021-08-20 腾讯科技(深圳)有限公司 Virtual content display method and apparatus, VR device, and medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170340200A1 (en) * 2014-05-29 2017-11-30 Vivid Vision, Inc. Interactive System for Vision Assessment and Correction
CN109964167A (zh) * 2016-10-28 2019-07-02 依视路国际公司 Method for determining an eye parameter of a user of a display device
US20180275367A1 (en) * 2017-03-21 2018-09-27 Nhn Entertainment Corporation Method and system for adjusting focusing length to enhance vision
CN109283997A (zh) * 2017-07-20 2019-01-29 中兴通讯股份有限公司 Display method, apparatus and system
CN109521871A (zh) * 2018-10-22 2019-03-26 广州视景医疗软件有限公司 Fusion function training method, apparatus, device, and storage medium
CN111596763A (zh) * 2020-05-15 2020-08-28 京东方科技集团股份有限公司 Control method and apparatus for virtual reality device

Also Published As

Publication number Publication date
CN111596763B (zh) 2023-12-26
CN111596763A (zh) 2020-08-28
US20220365594A1 (en) 2022-11-17

Similar Documents

Publication Publication Date Title
US11132056B2 (en) Predictive eye tracking systems and methods for foveated rendering for electronic displays
US11480715B2 (en) Methods and system for creating focal planes using an alvarez lens
JP7094266B2 (ja) Single-depth-tracked accommodation-vergence solutions
US20200051320A1 (en) Methods, devices and systems for focus adjustment of displays
WO2021227714A1 (zh) Control method and apparatus for virtual reality device
US20140375531A1 (en) Method of providing to the user an image from the screen of the smartphone or tablet at a wide angle of view, and a method of providing to the user 3D sound in virtual reality
CN108124509B (zh) Image display method, wearable smart device, and storage medium
CN104822061A (zh) Pupil-distance adjustment method, system, and module for a head-mounted 3D display
CN106959759A (zh) Data processing method and apparatus
CN104869389B (zh) Method and system for determining off-axis virtual camera parameters
US11601638B2 (en) Head-mounted display device
US20180374258A1 (en) Image generating method, device and computer executable non-volatile storage medium
JP2022511571A (ja) Dynamic convergence adjustment for augmented reality headsets
WO2022267694A1 (zh) Display adjustment method, apparatus, device, and medium
CN112236709A (zh) Fresnel-based varifocal lens assembly for VR or AR displays
JP2016192773A (ja) Interactive user interface for adjusting the stereoscopic effect
KR101818839B1 (ko) Method for producing and reproducing stereoscopic 3D image content for display
WO2015034453A1 (en) Providing a wide angle view image
CN109031667B (zh) Method for locating the lateral boundaries of the image display area of virtual reality glasses
Mikšícek Causes of visual fatigue and its improvements in stereoscopy
Wei et al. Color anaglyphs for panorama visualizations
US11164375B1 (en) Stereoscopic rendering of non-flat, reflective or refractive surfaces
CN115480410A (zh) Image processing method and image processing apparatus based on a lens array
Low Creating Better Stereo 3D For Animated Movies

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21803413

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21803413

Country of ref document: EP

Kind code of ref document: A1
