CN110928403A - Camera module and related system - Google Patents

Camera module and related system

Info

Publication number
CN110928403A
CN110928403A · CN201811559479.7A
Authority
CN
China
Prior art keywords
tracking
user
tracking data
optical module
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201811559479.7A
Other languages
English (en)
Inventor
周永明
林俊伟
谢毅刚
吴嘉伟
王铨彰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Future City Co Ltd
XRspace Co Ltd
Original Assignee
Future City Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Future City Co Ltd filed Critical Future City Co Ltd
Publication of CN110928403A
Current legal status: Withdrawn


Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G02B27/017 Head-up displays, head mounted
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • H04N13/257 Image signal generators, colour aspects
    • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/282 Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • H04N23/12 Cameras or camera modules for generating image signals from different wavelengths with one sensor only
    • H04N23/611 Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N2013/0077 Stereoscopic image analysis, colour aspects
    • H04N2213/005 Aspects relating to the "3D+depth" image format

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

A camera module and a related system are disclosed. The camera module, for use with a head-mounted display, includes a first optical module for tracking hand movements of a user to generate first tracking data; a second optical module for tracking gestures and steps of the user and a physical space to generate second tracking data; a third optical module for tracking three-dimensional physical objects in the physical space to generate third tracking data; and a processing unit for integrating the first, second, and third tracking data to reconstruct a virtual space and three-dimensional virtual objects based on the physical space and the three-dimensional physical objects, and to reconstruct body movements of the user.

Description

Camera Module and Related System
[Technical Field]
The present invention relates to a camera module and a related system, and more particularly, to a camera module integrating multiple optical modules into a head-mounted display, and a related system.
[Background]
With the development and advancement of technology, the demand for interaction between computer games and users has increased. Human-computer interaction technologies, such as motion-sensing games and virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR) environments, have become increasingly popular because of their physiological and entertainment functions. Existing human-computer interaction technology, such as a head-mounted display (HMD), motion peripheral controllers, or cameras, uses an outside-in tracking method to reconstruct virtual objects and track the scene coordinates of moving objects in real time, so as to track the user's movements and thereby virtualize the user. In this way, the head-mounted display, the motion peripheral controllers, or the cameras cooperate to respond to gestures the user makes in the VR/AR/MR/XR environment and thus interact with the user within that environment.
However, when the prior art lacks a motion peripheral controller or camera to track the user's movements or gestures, the head-mounted display cannot virtualize them, which degrades the user experience.
[Summary of the Invention]
Therefore, the present invention provides a camera module and a related system to give the user a better experience.
An embodiment of the present invention discloses a camera module for a head-mounted display, including a first optical module for tracking hand movements of a user to generate first tracking data; a second optical module for tracking gestures and steps of the user and a physical space to generate second tracking data; a third optical module for tracking three-dimensional physical objects in the physical space to generate third tracking data; and a processing unit for integrating the first, second, and third tracking data to reconstruct a virtual space and three-dimensional virtual objects based on the physical space and the three-dimensional physical objects, and to reconstruct body movements of the user.
Another embodiment of the present invention discloses a system including a head-mounted display, and a camera module disposed on the head-mounted display, wherein the camera module includes a first optical module for tracking hand movements of a user to generate first tracking data; a second optical module for tracking gestures and steps of the user and a physical space to generate second tracking data; a third optical module for tracking three-dimensional physical objects in the physical space to generate third tracking data; and a processing unit for integrating the first, second, and third tracking data to reconstruct a virtual space and three-dimensional virtual objects based on the physical space and the three-dimensional physical objects, and to reconstruct body movements of the user.
[Brief Description of the Drawings]
FIG. 1 is a schematic diagram of a system according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a camera module according to an embodiment of the present invention.
FIG. 3 and FIG. 4 are schematic diagrams of the system according to an embodiment of the present invention as applied to a user.
Reference numerals:
10 system
102 head-mounted display
104 camera module
C1 first optical module
C2 second optical module
C3 third optical module
P1 first position
P2 second position
[Detailed Description]
Please refer to FIG. 1, a schematic diagram of a system 10 according to an embodiment of the present invention. The system 10 includes a head-mounted display (HMD) 102 and a camera module 104. The head-mounted display 102 can be worn by a user, and its display portion can present a VR/AR/MR/XR environment containing at least one virtual object. As shown in FIG. 2, the camera module 104 may be disposed or mounted on the head-mounted display 102 and includes a first optical module C1 for tracking the user's hand movements to generate first tracking data, a second optical module C2 for tracking the user's gestures or steps and the physical space to generate second tracking data, a third optical module C3 for tracking three-dimensional physical objects in the physical space to generate third tracking data, and a processing unit for integrating the first, second, and third tracking data to reconstruct a virtual space and three-dimensional virtual objects based on the physical space and the physical objects, and to reconstruct the user's body movements. In addition, the camera module 104 is rotatable. For example, when the user wears the head-mounted display 102 carrying the camera module 104, the system 10 can reconstruct the three-dimensional virtual space and three-dimensional virtual objects with an inside-out tracking method, based on the user's gestures, steps, and physical space captured by the camera module 104. Therefore, the system 10 of the present invention reconstructs the user's body movements and the virtual objects in the VR/AR/MR/XR environment without deploying additional devices for reconstructing virtual objects and the virtual space.
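(Illustrative note: the minimal Python sketch below shows one way such an integration step could be organized. It is not part of the original disclosure; the class, method, and payload names are hypothetical.)

```python
from dataclasses import dataclass


@dataclass
class TrackingData:
    """Per-frame output of one optical module (hypothetical container)."""
    timestamp: float
    payload: dict  # e.g. hand joints, a depth map, or object photos


class ProcessingUnit:
    """Illustrative fusion layer: integrates the three tracking streams to
    rebuild the virtual space, virtual objects, and the user's body motion."""

    def rebuild_space(self, depth_map):
        return {"mesh": depth_map}               # placeholder reconstruction

    def rebuild_objects(self, photos):
        return [{"model": p} for p in photos]    # placeholder per-photo models

    def rebuild_motion(self, hand_pose, gesture, steps):
        return {"hands": hand_pose, "gesture": gesture, "steps": steps}

    def integrate(self, first: TrackingData, second: TrackingData,
                  third: TrackingData) -> dict:
        # Virtual space from the RGB-D stream (second), virtual objects from
        # the photo stream (third), body motion from the hand/gesture streams.
        return {
            "space": self.rebuild_space(second.payload["depth_map"]),
            "objects": self.rebuild_objects(third.payload["photos"]),
            "body": self.rebuild_motion(first.payload["hand_pose"],
                                        second.payload["gesture"],
                                        second.payload["steps"]),
        }


pu = ProcessingUnit()
frame = pu.integrate(
    TrackingData(0.0, {"hand_pose": [0.1, 0.2, 0.3]}),
    TrackingData(0.0, {"depth_map": [[2.0]], "gesture": "open", "steps": 2}),
    TrackingData(0.0, {"photos": ["obj1.jpg"]}),
)
print(frame["body"]["gesture"])  # open
```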
The above example only outlines that the system 10 of the present invention needs no additional devices to reconstruct the virtual space and virtual objects. Notably, those of ordinary skill in the art may make appropriate modifications accordingly. For example, the optical modules included in the camera module 104 are not limited to the above example; four or more optical modules may be integrated into the head-mounted display to reconstruct the virtual space and three-dimensional virtual objects, but this is not a limitation, and all such variations fall within the scope of the present invention.
In detail, the first optical module C1 may be a wide field-of-view (FOV) RGB camera used to perform simultaneous localization and mapping (SLAM) and to track the user's hand movements, so that the processing unit can build or update a map of an unknown environment and continuously track the user's position within that map in real time. In one embodiment, when the user's hand is moving, the wide-FOV RGB camera tracks the hand movements at a wide viewing angle. Notably, the FOV of the wide-FOV RGB camera is at least 100 degrees. Therefore, the system 10 of the present invention can perform SLAM and track the user's hand movements without trackers or outside-in devices (such as motion controllers and cameras that track or sense the user's movements).
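(Illustrative note: for intuition about the 100-degree figure, the hedged sketch below checks whether a 3D point, such as a hand joint in camera coordinates, falls inside a symmetric horizontal field of view; the function name and the z-forward convention are assumptions for illustration.)

```python
import numpy as np


def in_horizontal_fov(point_cam: np.ndarray, fov_deg: float = 100.0) -> bool:
    """True if a point in camera coordinates (z forward) lies within a
    symmetric horizontal FOV of fov_deg degrees."""
    x, _, z = point_cam
    if z <= 0:                                   # behind the camera plane
        return False
    return np.degrees(np.arctan2(abs(x), z)) <= fov_deg / 2.0


# A hand 0.55 m to the side and 0.5 m ahead (about 48 degrees off-axis)
# stays inside a 100-degree FOV but leaves a typical 70-degree one.
print(in_horizontal_fov(np.array([0.55, 0.0, 0.5])))        # True
print(in_horizontal_fov(np.array([0.55, 0.0, 0.5]), 70.0))  # False
```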
The second optical module C2 may be used to track the physical space in which the user is currently located, as well as the user's gestures and steps. In one embodiment, the second optical module C2 may be implemented as a depth RGB (RGB-D) camera. The processing unit can therefore combine the tracking results of the RGB-D camera with a depth model or depth algorithm to reconstruct the three-dimensional space and the user's gestures and steps. Specifically, in the three-dimensional physical space, the RGB-D camera captures images carrying depth information of the user's gestures and steps, so that the processing unit can recognize dynamic gestures and steps. Similarly, the three-dimensional virtual space can be reconstructed from the images taken by the RGB-D camera; in other words, the RGB-D camera scans the space around the user to build the three-dimensional virtual space. Therefore, the tracking results of the second optical module C2 allow the processing unit to reconstruct the three-dimensional virtual space and the user's gestures and steps without outside-in devices for sensing or tracking the user's movements.
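(Illustrative note: the disclosure leaves the depth algorithm unspecified. A common first step when reconstructing space from RGB-D frames is pinhole back-projection of the depth image into a point cloud, sketched below under assumed camera intrinsics.)

```python
import numpy as np


def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (meters) into an N x 3 point cloud using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    v, u = np.indices(depth.shape)       # pixel rows (v) and columns (u)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]            # keep only valid (positive) depths


# Example: a flat wall 2 m away seen by a 640x480 camera.
depth = np.full((480, 640), 2.0)
cloud = depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 3)
```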
The third optical module C3 may be a high-definition (HD) camera used to take photos of objects so as to reconstruct virtual objects in the three-dimensional space. In one embodiment, the HD camera can take photos of 24 megapixels or more. Specifically, the HD camera photographs objects around the user, so that the processing unit can build virtual objects in the virtual three-dimensional space without outside-in devices. Alternatively, the processing unit may combine the photos taken by the HD camera with an object-reconstruction algorithm to build the virtual objects. Therefore, the system 10 can reconstruct virtual objects based on the third optical module C3 without outside-in or other auxiliary devices.
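(Illustrative note: the object-reconstruction algorithm is left open in the text. One common building block for recovering 3D structure from multiple calibrated photos is linear DLT triangulation, sketched below as an assumed example rather than the patent's method.)

```python
import numpy as np


def triangulate_dlt(P1: np.ndarray, P2: np.ndarray,
                    x1: np.ndarray, x2: np.ndarray) -> np.ndarray:
    """Triangulate one 3D point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: (u, v) pixel coordinates."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)          # least-squares null vector of A
    X = vt[-1]
    return X[:3] / X[3]                  # dehomogenize


# Two identity-intrinsics cameras 0.2 m apart observe a point at (0.1, 0, 2).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
X_true = np.array([0.1, 0.0, 2.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_dlt(P1, P2, x1, x2))   # approx [0.1, 0.0, 2.0]
```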
Notably, the above embodiments only outline the concept of the present invention, and those of ordinary skill in the art may make appropriate modifications without being limited thereto. For example, please refer to FIG. 3 and FIG. 4, schematic diagrams of the system 10 applied to a user according to an embodiment of the present invention. As shown in FIG. 3 and FIG. 4, the dashed lines represent the tracking range of each optical module of the camera module 104, which covers the user's hands, feet, and limbs, so that the system 10 can reconstruct and virtualize the three-dimensional virtual objects and space without outside-in devices. Furthermore, since the camera module 104 disposed on the head-mounted display 102 is rotatable, its tracking range can be maximized. As shown in FIG. 4, the camera module 104 can rotate between a first position P1 and a second position P2. When the camera module 104 is at the first position P1, the tracking range is mainly in front of the user; when it is at the second position P2, the tracking range mainly covers the user's limbs and feet. Notably, the camera module 104 may be adjusted to or fixed at any position between the first position P1 and the second position P2, and is not limited to those two positions. In another embodiment, the optical modules C1, C2, and C3 of the camera module 104 can each rotate in different dimensions to maximize the tracking range and select appropriate shooting angles.
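(Illustrative note: as a toy sketch of steering the module between positions P1 and P2, the code below computes the pitch that centers a target, such as the user's hands or feet, and clamps it to an assumed mechanical range; all angle values are hypothetical.)

```python
import numpy as np


def module_pitch_deg(target: np.ndarray,
                     min_pitch: float = 0.0,     # P1: facing forward
                     max_pitch: float = 75.0     # P2: facing downward
                     ) -> float:
    """Pitch (degrees, positive downward) that centers a target given in
    HMD coordinates (z forward, y down), clamped to the rotation range."""
    pitch = np.degrees(np.arctan2(target[1], target[2]))
    return float(np.clip(pitch, min_pitch, max_pitch))


print(module_pitch_deg(np.array([0.0, 0.1, 0.6])))  # hands ahead: ~9.5
print(module_pitch_deg(np.array([0.0, 1.5, 0.4])))  # feet below: 75.0 (clamped)
```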
In summary, the present invention provides a camera module and system with an inside-out tracking method to give the user a better experience.
The above are only preferred embodiments of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention shall fall within the scope of the present invention.

Claims (10)

1. A camera module for a head-mounted display, comprising:
a first optical module for tracking hand movements of a user to generate first tracking data;
a second optical module for tracking gestures and steps of the user and a physical space to generate second tracking data;
a third optical module for tracking three-dimensional physical objects in the physical space to generate third tracking data; and
a processing unit for integrating the first tracking data, the second tracking data, and the third tracking data to reconstruct a virtual space and three-dimensional virtual objects based on the physical space and the three-dimensional physical objects, and to reconstruct body movements of the user.
2. The camera module of claim 1, wherein the first optical module, the second optical module, and the third optical module are rotatable, and their rotations are not synchronized.
3. The camera module of claim 1, wherein the first optical module is a wide field-of-view RGB camera for performing simultaneous localization and mapping and tracking the hand movements of the user.
4. The camera module of claim 1, wherein the second optical module is a depth RGB camera, and the processing unit reconstructs the gestures, the steps, and the virtual space based on the second tracking data of the depth RGB camera and a depth algorithm.
5. The camera module of claim 1, wherein the third optical module is a high-definition camera for taking a plurality of photos, and the processing unit reconstructs the three-dimensional virtual objects based on the third tracking data of the high-definition camera.
6. A system, comprising:
a head-mounted display; and
a camera module disposed on the head-mounted display, the camera module comprising:
a first optical module for tracking hand movements of a user to generate first tracking data;
a second optical module for tracking gestures and steps of the user and a physical space to generate second tracking data;
a third optical module for tracking three-dimensional physical objects in the physical space to generate third tracking data; and
a processing unit for integrating the first tracking data, the second tracking data, and the third tracking data to reconstruct a virtual space and three-dimensional virtual objects based on the physical space and the three-dimensional physical objects, and to reconstruct body movements of the user.
7. The system of claim 6, wherein the first optical module, the second optical module, and the third optical module are rotatable, and their rotations are not synchronized.
8. The system of claim 6, wherein the first optical module is a wide field-of-view RGB camera for performing simultaneous localization and mapping and tracking the hand movements of the user.
9. The system of claim 6, wherein the second optical module is a depth RGB camera, and the processing unit reconstructs the gestures, the steps, and the virtual space based on the second tracking data of the depth RGB camera and a depth algorithm.
10. The system of claim 6, wherein the third optical module is a high-definition camera for taking a plurality of photos, and the processing unit reconstructs the three-dimensional virtual objects based on the third tracking data of the high-definition camera.
CN201811559479.7A 2018-09-20 2018-12-19 Camera module and related system Withdrawn CN110928403A (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/136,257 2018-09-20
US16/136,257 US20200097707A1 (en) 2018-09-20 2018-09-20 Camera Module and Extended Reality System Using the Same

Publications (1)

Publication Number Publication Date
CN110928403A (zh) 2020-03-27

Family

ID=64744574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811559479.7A Camera module and related system

Country Status (5)

Country Link
US (1) US20200097707A1 (en)
EP (1) EP3627288A1 (en)
JP (1) JP2020048176A (ja)
CN (1) CN110928403A (zh)
TW (1) TW202013005A (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112527102B (zh) * 2020-11-16 2022-11-08 Qingdao Pico Technology Co., Ltd. Head-mounted all-in-one system and 6DoF tracking method and apparatus thereof
US11684848B2 (en) * 2021-09-28 2023-06-27 Sony Group Corporation Method to improve user understanding of XR spaces based in part on mesh analysis of physical surfaces

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10365711B2 (en) * 2012-05-17 2019-07-30 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for unified scene acquisition and pose tracking in a wearable display
US10026232B2 (en) * 2016-01-04 2018-07-17 Meta Compnay Apparatuses, methods and systems for application of forces within a 3D virtual environment
US10114465B2 (en) * 2016-01-15 2018-10-30 Google Llc Virtual reality head-mounted devices having reduced numbers of cameras, and methods of operating the same
KR102658303B1 (ko) * 2016-02-18 2024-04-18 Apple Inc. Head-mounted display for virtual and mixed reality with inside-out position, user-body, and environment tracking
US11074292B2 (en) * 2017-12-29 2021-07-27 Realwear, Inc. Voice tagging of video while recording

Also Published As

Publication number Publication date
EP3627288A1 (en) 2020-03-25
TW202013005A (zh) 2020-04-01
JP2020048176A (ja) 2020-03-26
US20200097707A1 (en) 2020-03-26


Legal Events

Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 2020-03-27)