CN109155084A - System and method for very large-scale communication and asynchronous documentation in virtual reality and augmented reality environments - Google Patents
System and method for very large-scale communication and asynchronous documentation in virtual reality and augmented reality environments
- Publication number
- CN109155084A (application CN201780024807.0A)
- Authority
- CN
- China
- Prior art keywords
- environment
- augmented reality
- participant
- annotation
- immersive environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/006—Details of the interface to the display terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/167—Synchronising or controlling image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/16—Use of wireless transmission of display information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/34—Network arrangements or protocols for supporting network services or applications involving the movement of software or configuration parameters
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Hardware Design (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
- Information Transfer Between Computers (AREA)
- Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
Abstract
The present disclosure provides systems and methods for simplifying virtual reality (VR)-, augmented reality (AR)-, or virtual augmented reality (VAR)-based communication and collaboration through a simplified user-interface framework that enables synchronous and asynchronous interaction in an immersive environment.
Description
Technical Field
The present disclosure provides systems and methods for simplifying virtual reality (VR)-, augmented reality (AR)-, or virtual augmented reality (VAR)-based communication and collaboration through a simplified user-interface framework that enables synchronous and asynchronous interaction in an immersive environment.
Summary of the Invention
VR, AR, and VAR systems viewed in spherical coordinates or other three-dimensional or immersive environments (hereinafter referred to, collectively or individually, as VAR) require complex, heavyweight files for use by all stakeholders who wish to collaborate in those environments. There is a need to simplify VAR environments for synchronous and asynchronous interaction and communication.
In general, a publisher as used in this disclosure may publish a VAR environment in an immersive environment for participants to view and/or annotate later, or asynchronously. A user may view the annotated VAR environment in the immersive environment. The publisher, a participant, a third party, or a combination thereof may be a user.
According to one embodiment, a participant's movement throughout the VAR immersive environment is recorded or tracked. According to one embodiment, the movement refers to the participant's focal points (FPs), beginning at a start point (SP) and passing through multiple FPs. According to one embodiment, a participant's FP is determined by the participant's head position and/or eye gaze. According to one embodiment, the participant annotates his movement throughout the VAR immersive environment.
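The movement tracking described above — a start point (SP) followed by a sequence of focal points (FPs) derived from head position and/or eye gaze — can be sketched as a simple timestamped data structure. This is an illustrative sketch only; the class and field names (`FocalPoint`, `MotionTrace`) are assumptions, not part of the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FocalPoint:
    azimuth_deg: float    # horizontal gaze angle within the spherical environment
    elevation_deg: float  # vertical gaze angle
    timestamp_s: float    # when the participant's gaze reached this point

@dataclass
class MotionTrace:
    """A participant's movement through a VAR immersive environment,
    from a start point (SP) through multiple focal points (FPs)."""
    start: FocalPoint
    focal_points: List[FocalPoint] = field(default_factory=list)

    def record(self, azimuth_deg: float, elevation_deg: float, t: float) -> None:
        self.focal_points.append(FocalPoint(azimuth_deg, elevation_deg, t))

    def duration_s(self) -> float:
        # Elapsed time from the SP to the most recent FP.
        if not self.focal_points:
            return 0.0
        return self.focal_points[-1].timestamp_s - self.start.timestamp_s

# Usage: record a short gaze path.
trace = MotionTrace(start=FocalPoint(0.0, 0.0, 0.0))
trace.record(15.0, 5.0, 0.5)
trace.record(40.0, -3.0, 1.2)
```

A trace recorded this way can later be replayed (as a crosshair) or coupled to a voice track for annotation.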
According to one embodiment, there is more than one participant. According to one embodiment, there is more than one user. According to one embodiment, the participant's tracked movement through the VAR immersive environment is displayed to users as a crosshair.
According to one embodiment, crosshairs may have different colors, shapes, icons, and so on. According to one embodiment, multiple users may view the annotated immersive environment synchronously or asynchronously.
According to one embodiment, the published and/or annotated VAR immersive environment may be viewed on a mobile computing device such as a smartphone or tablet. According to one embodiment, the participant may view the immersive environment using any attachable binocular optical system (e.g., Google Cardboard or other similar devices). According to one embodiment, the publisher, participant, or user may interact with the annotated or unannotated VAR immersive environment through a touch-sensitive screen or other touch-sensitive device.
Brief Description of the Drawings
Other features and advantages of the present invention will become apparent from the following detailed description of preferred embodiments with reference to the accompanying drawings, in which:
FIG. 1 is a flowchart illustrating an exemplary embodiment of the systems and methods described herein;
FIG. 1A is a flowchart illustrating an exemplary embodiment of the systems and methods described herein;
FIG. 1B is a flowchart illustrating an exemplary embodiment of the systems and methods described herein;
FIG. 2 is an exemplary VAR immersive environment shown in two-dimensional space;
FIG. 3 is an exemplary embodiment of a touch screen; and
FIG. 4 is an exemplary embodiment of a graphical representation.
Detailed Description
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar or identical symbols in different drawings typically identify similar or identical items, unless context dictates otherwise.
The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented herein.
Those skilled in the art will appreciate that the components (e.g., operations), devices, and objects described herein, and the discussion accompanying them, are used as examples for the sake of conceptual clarity, and that various configuration modifications are contemplated. Consequently, as used herein, the specific examples set forth and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific example is intended to be representative of its class, and the non-inclusion of specific components (e.g., operations), devices, and objects should not be taken to be limiting.
This application uses formal outline headings for clarity of presentation. However, it should be understood that the outline headings are for presentation purposes, and that different types of subject matter may be discussed throughout the application (e.g., devices/structures may be described under process/operation headings and/or processes/operations may be discussed under structure/process headings; and/or descriptions of a single topic may span two or more topic headings). Hence, the use of formal outline headings is not intended to be in any way limiting. By way of overview, the illustrative embodiments include systems and methods for simplifying VAR-based communication and collaboration through a simplified user-interface framework that enables synchronous and asynchronous interaction in an immersive environment.
Referring to FIGS. 1, 1A, 1B, and 2, as noted above, a publisher may publish a VAR environment in an immersive environment (1) for later or asynchronous viewing and/or annotation (2) by participants or users. A user may view the annotated VAR environment in the immersive environment. (8) The publisher, a participant, a third party, or a combination thereof may be a user.
According to one embodiment, a participant's movement throughout the VAR immersive environment is recorded or tracked. Movement throughout the VAR immersive environment refers to tracking or recording the participant's focal points (FPs), beginning at a start point (SP) and passing through multiple FPs within the VAR immersive environment. According to one embodiment, a participant's FP (30) is determined by head position and/or eye gaze. According to one embodiment, the participant annotates his movement throughout the VAR immersive environment. (5)
According to one embodiment, the annotation is a voice annotation beginning at the SP (20) and passing through multiple FPs (30). According to another embodiment, the annotation is the movement throughout the VAR environment. In yet another embodiment, the annotation is the movement throughout the VAR environment coordinated with a voice annotation through the same space. (5) According to one embodiment, the participant's annotation is labeled with a unique identifier, or UID. (6)
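One way to model an annotation that couples a motion trace and a voice track on a shared timeline, labeled with a UID, is sketched below. The names (`Annotation`, `motion`, `voice`) are hypothetical; the patent does not prescribe a storage format:

```python
import uuid
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Annotation:
    """A participant annotation: motion samples and/or voice chunks that
    share one timeline, labeled with a unique identifier (UID)."""
    participant: str
    # (timestamp_s, azimuth_deg, elevation_deg) gaze samples from SP onward.
    motion: List[Tuple[float, float, float]] = field(default_factory=list)
    # (timestamp_s, audio_chunk) voice samples aligned to the same clock.
    voice: List[Tuple[float, bytes]] = field(default_factory=list)
    # The UID that labels this annotation.
    uid: str = field(default_factory=lambda: uuid.uuid4().hex)

ann = Annotation(participant="stakeholder-1")
ann.motion.append((0.0, 0.0, 0.0))    # start point (SP)
ann.motion.append((0.5, 15.0, 5.0))   # a focal point (FP)
ann.voice.append((0.0, b"\x00\x01"))  # placeholder audio bytes
```

Because motion and voice share one timeline, a viewer can replay the crosshair and the narration in lock-step, and the UID lets the platform reference the annotation later (e.g., for notifications or voting).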
According to one embodiment, a user may view the annotated immersive environment. (8) According to one embodiment, the user receives a notification that a participant has annotated the immersive environment. (7) The user may then view the annotated immersive environment. (8)
According to one embodiment, the participant is multiple participants. (2) According to one embodiment, multiple participants may view the VAR immersive environment asynchronously on a VAR platform. (2) According to one embodiment, multiple participants may annotate the VAR immersive environment asynchronously. (5) According to one embodiment, multiple participants may view the VAR immersive environment synchronously (2) but annotate the environment asynchronously. (5) According to one embodiment, each annotated immersive environment is labeled with a UID. (6)
According to one embodiment, the user is multiple users. According to one embodiment, multiple users may synchronously view one annotated immersive environment on a VAR platform. (8) According to one embodiment, at least one user may join or leave the synchronous viewing group. (12) According to one embodiment, at least one user may view at least one UID-labeled annotated VAR immersive environment on the VAR platform. (8)
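The synchronous viewing group with join/leave semantics can be sketched as follows (a hypothetical `ViewingGroup` class; the patent does not specify an implementation):

```python
class ViewingGroup:
    """A synchronous viewing session for one annotated environment,
    identified by the annotation's UID; users may join or leave freely."""

    def __init__(self, annotation_uid: str):
        self.annotation_uid = annotation_uid
        self.users: set = set()

    def join(self, user: str) -> None:
        self.users.add(user)

    def leave(self, user: str) -> None:
        # discard() rather than remove(): leaving twice is not an error.
        self.users.discard(user)

# Usage: two users join; one leaves mid-session.
group = ViewingGroup("a" * 32)
group.join("user-1")
group.join("user-2")
group.leave("user-1")
```

A platform would broadcast playback position to everyone in `group.users` so that all members see the same point in the annotated environment.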
Referring to FIGS. 1 and 1A, according to one embodiment, the publisher may annotate the VAR immersive environment before publishing it (9). According to one embodiment, a UID is assigned to this published, annotated VAR immersive environment. (10)
Referring to FIG. 2, according to one embodiment, a participant's movement throughout the VAR immersive environment is displayed as a crosshair (40). According to one embodiment, each participant's and/or publisher's movement throughout the VAR immersive environment may be displayed as a distinct visible crosshair (40). According to one embodiment, each distinct visible crosshair (40) may be displayed with a different color, shape, size, icon, and so on.
According to one embodiment, the VAR immersive environment is viewed on a touch-sensitive device (50). A touch-sensitive device (50) is a device that responds to the touch of a finger by, for example, transmitting the coordinates of the touched point to a computer. The touch-sensitive area may be the screen itself, in which case it is called a touch screen. Alternatively, it may be integrated with the keyboard, or it may be a separate unit placed on a desk; movement of a finger on the touchpad causes the cursor to move on the screen.
According to one embodiment, a user may view the VAR immersive environment on a mobile computing device (50) with a touch screen, such as a smartphone or tablet. (2) According to one embodiment, the user may view the VAR immersive environment using any attachable binocular optical system (e.g., Google Cardboard or other similar devices).
According to one embodiment, the user may select an action that affects the VAR immersive environment by touching a portion of the screen outside the VAR immersive environment (51). According to one embodiment, the actions are located in the corners of the touch screen (51), giving the user flexibility in selecting actions. According to one embodiment, the user may select an action by manipulating a touchpad. Actions may include: selecting one of 1, 2, 3, 4; selecting publish, view, or annotate; selecting teleport; selecting to view a point of interest; selecting to view one of the annotations; selecting to enter or leave the VAR platform while synchronously viewing an annotated immersive VAR environment; and so on.
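A minimal sketch of the corner-based action selection: touches landing in a corner region of the screen, outside the rendered viewport, map to actions, while touches elsewhere fall through to the environment. The specific corner-to-action assignment below is an assumption for illustration, not taken from the patent:

```python
def corner_action(x: float, y: float, width: float, height: float,
                  corner_frac: float = 0.15):
    """Return the action bound to the touched corner, or None when the
    touch falls inside the central viewport showing the VAR environment."""
    in_left = x < width * corner_frac
    in_right = x > width * (1 - corner_frac)
    in_top = y < height * corner_frac
    in_bottom = y > height * (1 - corner_frac)
    # Hypothetical assignment of the four corners to four of the
    # actions named in the text.
    if in_top and in_left:
        return "publish"
    if in_top and in_right:
        return "view"
    if in_bottom and in_left:
        return "annotate"
    if in_bottom and in_right:
        return "teleport"
    return None

touch_corner = corner_action(10, 10, 1080, 1920)    # top-left corner
touch_center = corner_action(540, 960, 1080, 1920)  # center of viewport
```

Touches that return `None` would be forwarded to the environment itself (e.g., to select a hotspot).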
Referring to FIGS. 1, 1A, 1B, and 3, according to another embodiment, the user may select an action that affects the VAR immersive environment by selecting a hotspot (52) within the VAR immersive environment. According to another embodiment, the selected hotspot (52) determines the actions the user may select outside (51) the VAR immersive environment. According to one embodiment, the selection action means voting for at least one of a plurality of attributes. (11) According to one embodiment, the selected attributes are represented graphically (60). FIG. 4 shows an exemplary graphical representation. As those skilled in the art will appreciate, the graphical representation may be embodied in many designs.
Referring to FIGS. 1-4, according to one embodiment, a content publisher (e.g., a professional designer or engineer, or a consumer of user-generated content) publishes a VAR immersive environment to stakeholders (participants). (1) For example, the content publisher may ask the stakeholders for input on a particular room. The stakeholders view the published VAR immersive environment. (2) A participant may select a hotspot (52) or the touch screen (51), or a combination thereof, to annotate the VAR immersive environment (4). Multiple stakeholders may view and annotate the VAR immersive environment asynchronously. (8)
According to one embodiment, the content professional may ask at least one user to vote among the recommendations of the multiple stakeholders (11), where the vote is cast after viewing each annotated VAR immersive environment (5). According to one embodiment, each vote may be presented graphically. (14) According to one embodiment, the user may select a hotspot (53) or the touch screen (51), or a combination thereof, to cast a vote.
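The vote tallying and graphical presentation described above can be sketched as follows; the text bar chart stands in for the graphical representation (60) of FIG. 4, whose actual design the patent leaves open:

```python
from collections import Counter

def tally_votes(votes):
    """Count the votes cast for each attribute (e.g., each stakeholder's
    annotated recommendation)."""
    return Counter(votes)

def bar_chart(counts):
    """Render the tally as a minimal text bar chart, one bar per
    attribute, most-voted first."""
    return "\n".join(f"{attr:<12} {'#' * n} ({n})"
                     for attr, n in counts.most_common())

# Usage: five users vote between two annotated layouts.
votes = ["layout A", "layout B", "layout A", "layout A", "layout B"]
counts = tally_votes(votes)
chart = bar_chart(counts)
```

A production system would render this as a chart widget inside the immersive environment rather than as text.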
According to one embodiment, multiple stakeholders may synchronously view at least one annotated VAR environment on the VAR platform. (8) According to one embodiment, the multiple stakeholders may select one of multiple annotated VAR environments to view. (8) According to one embodiment, the multiple stakeholders may select multiple annotated VAR environments to view. (8) According to one embodiment, at least one of the multiple stakeholders may join or leave the synchronous viewing group. (12) According to one embodiment, at least one published VAR immersive environment, annotated immersive environment, vote, graphical representation, or combination thereof may be stored or processed on a server or in the cloud. (15) Those skilled in the art will appreciate that multiple servers or clouds may be used.
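Storing published environments, annotations, votes, and graphical representations under their UIDs on a server can be sketched as a simple keyed store. This in-memory stand-in is illustrative only; a real deployment would use a database or cloud object store:

```python
class VarStore:
    """Server-side store keyed by UID for published environments,
    annotations, votes, and graphical representations."""

    def __init__(self):
        self._objects = {}  # uid -> (kind, payload)

    def put(self, uid: str, kind: str, payload) -> None:
        self._objects[uid] = (kind, payload)

    def get(self, uid: str):
        return self._objects[uid]

    def list_uids(self, kind: str):
        """All stored UIDs of one kind, e.g. every annotation."""
        return [uid for uid, (k, _) in self._objects.items() if k == kind]

# Usage: store a published environment and one annotation of it.
store = VarStore()
store.put("env-001", "environment", {"name": "room"})
store.put("ann-001", "annotation", {"participant": "stakeholder-1"})
```

Keying everything by UID matches the disclosure's use of UIDs to label annotations and published environments, so any client on the platform can fetch a specific annotated environment by identifier.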
As those skilled in the art will appreciate, aspects of the present invention may be embodied as a system, method, or computer product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.), or an embodiment combining software and hardware aspects. Other aspects of the present invention may take the form of a computer program embodied in one or more readable media having computer-readable program code/instructions thereon. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. The computer code may execute entirely on the user's computer as a stand-alone software package or cloud service; partly on the user's computer; partly on the user's computer and partly on a remote computer; or entirely on a remote computer or a remote or cloud-based server.
Claims (25)
1. A method for providing asynchronous annotation in an augmented reality or virtual augmented reality environment within an immersive environment, the method comprising:
(a) publishing an augmented reality or virtual augmented reality environment in an immersive environment for viewing by a participant; and
(b) enabling a participant to annotate the participant's movement throughout the augmented reality or virtual augmented reality environment.
2. The method of claim 1, wherein the movement throughout the augmented reality or virtual augmented reality environment is the participant's trajectory beginning at a start point and passing through multiple focal points.
3. The method of claim 2, wherein the annotation is:
tracking or recording the participant's head position and/or focal point or eye gaze, beginning at a start point and passing through multiple focal points within the immersive environment;
recording the participant's voice annotation, beginning at a start point and passing through multiple focal points; or
a combination thereof.
4. The method of claim 3, further comprising enabling a user to view the annotated augmented reality or virtual augmented reality environment in the immersive environment, wherein the user is a publisher, a participant, a third party, or a combination thereof.
5. The method of claim 4, wherein the annotated path in the immersive environment is displayed as a crosshair.
6. The method of claim 3, wherein the user is more than one user.
7. The method of claim 6, further comprising enabling multiple users to synchronously view annotations in the immersive environment.
8. The method of claim 7, further comprising enabling at least one user to join or leave the synchronous viewing.
9. The method of claim 3, further comprising assigning a unique identifier to the annotation.
10. The method of claim 1, wherein the participant is more than one participant.
11. The method of claim 10, further comprising enabling the multiple participants to view the virtual reality or virtual augmented reality environment in the immersive environment synchronously or asynchronously.
12. The method of claim 10, further comprising enabling at least one participant to join or leave the synchronous viewing.
13. The method of claim 1, further comprising enabling annotation before publication.
14. The method of claim 1, further comprising enabling the user to view the augmented reality or virtual augmented reality environment in the immersive environment on a portable computing device.
15. The method of claim 14, wherein the portable computing device is a smartphone or tablet.
16. The method of claim 15, wherein the portable computing device comprises a touch screen.
17. The method of claim 16, wherein a portion of the touch screen enables the user to touch that portion of the screen to select an action that causes a change in the augmented reality or virtual augmented reality environment while the environment is being viewed in the immersive environment.
18. The method of claim 16, wherein the selection action means voting for at least one of a plurality of attributes.
19. The method of claim 18, wherein the selected attribute is represented graphically.
20. The method of claim 19, wherein at least one published VAR immersive environment, annotated immersive environment, vote, graphical representation, or combination thereof may be stored or processed on a server or in the cloud.
21. A computing device enabling a user to view an augmented reality or virtual augmented reality environment, the computing device comprising a touch screen, wherein a portion of the touch screen is uniquely identified for selecting an action that affects the augmented reality or virtual augmented reality environment.
22. The computing device of claim 21, wherein a portion of the augmented reality or virtual reality environment in the immersive environment further comprises a hotspot in the immersive environment that affects the actions permitted by the touch screen.
23. The computing device of claim 21, wherein the selection action means voting for at least one of a plurality of attributes.
24. The computing device of claim 23, wherein the selected attribute is represented graphically.
25. The computing device of claim 24, wherein at least one published VAR immersive environment, annotated immersive environment, vote, graphical representation, or combination thereof may be stored or processed on a server or in the cloud.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/134,326 US20170309070A1 (en) | 2016-04-20 | 2016-04-20 | System and method for very large-scale communication and asynchronous documentation in virtual reality and augmented reality environments |
US15/134,326 | 2016-04-20 | ||
PCT/US2017/028409 WO2017184763A1 (en) | 2016-04-20 | 2017-04-19 | System and method for very large-scale communication and asynchronous documentation in virtual reality and augmented reality environments |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109155084A true CN109155084A (zh) | 2019-01-04 |
Family
ID=60089589
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780024807.0A Withdrawn CN109155084A (zh) System and method for very large-scale communication and asynchronous documentation in virtual reality and augmented reality environments | 2016-04-20 | 2017-04-19 |
Country Status (4)
Country | Link |
---|---|
US (4) | US20170309070A1 (zh) |
EP (1) | EP3446291A4 (zh) |
CN (1) | CN109155084A (zh) |
WO (2) | WO2017184763A1 (zh) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10496156B2 (en) * | 2016-05-17 | 2019-12-03 | Google Llc | Techniques to change location of objects in a virtual/augmented reality system |
US10602133B2 (en) * | 2016-10-04 | 2020-03-24 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
IT201700058961A1 | 2017-05-30 | 2018-11-30 | Artglass S R L | Method and system for enjoying editorial content at a site, preferably a cultural, artistic, landscape, naturalistic, fair, or exhibition site |
US11087558B1 (en) | 2017-09-29 | 2021-08-10 | Apple Inc. | Managing augmented reality content associated with a physical location |
US10545627B2 (en) | 2018-05-04 | 2020-01-28 | Microsoft Technology Licensing, Llc | Downloading of three-dimensional scene data for asynchronous navigation |
CN108563395A (zh) * | 2018-05-07 | 2018-09-21 | 北京知道创宇信息技术有限公司 | 3D viewing-angle interaction method and device |
CN108897836B (zh) * | 2018-06-25 | 2021-01-29 | 广州视源电子科技股份有限公司 | Method and device for a robot to construct a map based on semantics |
US11087551B2 (en) | 2018-11-21 | 2021-08-10 | Eon Reality, Inc. | Systems and methods for attaching synchronized information between physical and virtual environments |
CN110197532A (zh) * | 2019-06-05 | 2019-09-03 | 北京悉见科技有限公司 | System, method, apparatus, and computer storage medium for augmented-reality venue arrangement |
CN115190996A (zh) * | 2020-03-25 | 2022-10-14 | Oppo广东移动通信有限公司 | Collaborative document editing using augmented reality |
US11358611B2 (en) * | 2020-05-29 | 2022-06-14 | Alexander Yemelyanov | Express decision |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100149093A1 (en) * | 2006-12-30 | 2010-06-17 | Red Dot Square Solutions Limited | Virtual reality system including viewer responsiveness to smart objects |
US20120212405A1 (en) * | 2010-10-07 | 2012-08-23 | Benjamin Zeis Newhouse | System and method for presenting virtual and augmented reality scenes to a user |
US20130293468A1 (en) * | 2012-05-04 | 2013-11-07 | Kathryn Stone Perez | Collaboration environment using see through displays |
CN105075246A (zh) * | 2013-02-20 | 2015-11-18 | Microsoft Corporation | Providing a tele-immersive experience using a mirror metaphor |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6119147A (en) * | 1998-07-28 | 2000-09-12 | Fuji Xerox Co., Ltd. | Method and system for computer-mediated, multi-modal, asynchronous meetings in a virtual space |
US7137077B2 (en) * | 2002-07-30 | 2006-11-14 | Microsoft Corporation | Freeform encounter selection tool |
US20050181340A1 (en) * | 2004-02-17 | 2005-08-18 | Haluck Randy S. | Adaptive simulation environment particularly suited to laparoscopic surgical procedures |
DE602007001600D1 (de) * | 2006-03-23 | 2009-08-27 | Koninkl Philips Electronics Nv | Hotspots for gaze-focused control of image manipulations |
US8095881B2 (en) * | 2008-03-24 | 2012-01-10 | International Business Machines Corporation | Method for locating a teleport target station in a virtual world |
US8095595B2 (en) * | 2008-04-30 | 2012-01-10 | Cisco Technology, Inc. | Summarization of immersive collaboration environment |
US8400548B2 (en) * | 2010-01-05 | 2013-03-19 | Apple Inc. | Synchronized, interactive augmented reality displays for multifunction devices |
US9204040B2 (en) * | 2010-05-21 | 2015-12-01 | Qualcomm Incorporated | Online creation of panoramic augmented reality annotations on mobile platforms |
US9071709B2 (en) * | 2011-03-31 | 2015-06-30 | Nokia Technologies Oy | Method and apparatus for providing collaboration between remote and on-site users of indirect augmented reality |
US8375085B2 (en) * | 2011-07-06 | 2013-02-12 | Avaya Inc. | System and method of enhanced collaboration through teleportation |
US20130293580A1 (en) * | 2012-05-01 | 2013-11-07 | Zambala Lllp | System and method for selecting targets in an augmented reality environment |
JP6131540B2 (ja) * | 2012-07-13 | 2017-05-24 | Fujitsu Limited | Tablet terminal, operation acceptance method, and operation acceptance program |
US20140181630A1 (en) * | 2012-12-21 | 2014-06-26 | Vidinoti Sa | Method and apparatus for adding annotations to an image |
WO2014149794A1 (en) * | 2013-03-15 | 2014-09-25 | Cleveland Museum Of Art | Guided exploration of an exhibition environment |
US9454220B2 (en) * | 2014-01-23 | 2016-09-27 | Derek A. Devries | Method and system of augmented-reality simulations |
US9264474B2 (en) * | 2013-05-07 | 2016-02-16 | KBA2 Inc. | System and method of portraying the shifting level of interest in an object or location |
US9633252B2 (en) * | 2013-12-20 | 2017-04-25 | Lenovo (Singapore) Pte. Ltd. | Real-time detection of user intention based on kinematics analysis of movement-oriented biometric data |
US20150205358A1 (en) * | 2014-01-20 | 2015-07-23 | Philip Scott Lyren | Electronic Device with Touchless User Interface |
KR20150108216A (ko) * | 2014-03-17 | 2015-09-25 | Samsung Electronics Co., Ltd. | Input processing method and electronic device therefor |
US10511551B2 (en) * | 2014-09-06 | 2019-12-17 | Gang Han | Methods and systems for facilitating virtual collaboration |
EP3201859A1 (en) * | 2014-09-30 | 2017-08-09 | PCMS Holdings, Inc. | Reputation sharing system using augmented reality systems |
US20160133230A1 (en) * | 2014-11-11 | 2016-05-12 | Bent Image Lab, Llc | Real-time shared augmented reality experience |
US10037312B2 (en) * | 2015-03-24 | 2018-07-31 | Fuji Xerox Co., Ltd. | Methods and systems for gaze annotation |
US20160300392A1 (en) * | 2015-04-10 | 2016-10-13 | VR Global, Inc. | Systems, media, and methods for providing improved virtual reality tours and associated analytics |
US10055888B2 (en) * | 2015-04-28 | 2018-08-21 | Microsoft Technology Licensing, Llc | Producing and consuming metadata within multi-dimensional data |
US9684305B2 (en) * | 2015-09-11 | 2017-06-20 | Fuji Xerox Co., Ltd. | System and method for mobile robot teleoperation |
US10338687B2 (en) * | 2015-12-03 | 2019-07-02 | Google Llc | Teleportation in an augmented and/or virtual reality environment |
US10048751B2 (en) * | 2016-03-31 | 2018-08-14 | Verizon Patent And Licensing Inc. | Methods and systems for gaze-based control of virtual reality media content |
-
2016
- 2016-04-20 US US15/134,326 patent/US20170309070A1/en not_active Abandoned
- 2016-07-22 US US15/216,981 patent/US20170308348A1/en not_active Abandoned
- 2016-12-31 US US15/396,590 patent/US20170309073A1/en not_active Abandoned
-
2017
- 2017-04-19 EP EP17786575.5A patent/EP3446291A4/en not_active Withdrawn
- 2017-04-19 CN CN201780024807.0A patent/CN109155084A/zh not_active Withdrawn
- 2017-04-19 WO PCT/US2017/028409 patent/WO2017184763A1/en active Application Filing
- 2017-08-04 US US15/669,711 patent/US20170337746A1/en not_active Abandoned
-
2018
- 2018-10-03 WO PCT/IB2018/001413 patent/WO2019064078A2/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2019064078A2 (en) | 2019-04-04 |
US20170337746A1 (en) | 2017-11-23 |
WO2019064078A3 (en) | 2019-07-25 |
EP3446291A1 (en) | 2019-02-27 |
US20170309070A1 (en) | 2017-10-26 |
US20170308348A1 (en) | 2017-10-26 |
EP3446291A4 (en) | 2019-11-27 |
US20170309073A1 (en) | 2017-10-26 |
WO2017184763A1 (en) | 2017-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109155084A (zh) | System and method for very large-scale communication and asynchronous documentation in virtual reality and augmented reality environments | |
Kim et al. | Evaluating the combination of visual communication cues for HMD-based mixed reality remote collaboration | |
Besançon et al. | The state of the art of spatial interfaces for 3D visualization | |
Hürst et al. | Gesture-based interaction via finger tracking for mobile augmented reality | |
CN106845335B (zh) | Gesture recognition method and apparatus for a virtual reality device, and virtual reality device | |
KR102319417B1 (ko) | Server and method for providing a collaboration service, and user terminal receiving the collaboration service | |
KR20210040474A (ko) | Providing a tele-immersive experience using a mirror metaphor | |
CN110070556A (zh) | Structural modeling using a depth sensor | |
CN105144072A (zh) | Simulating pressure sensitivity on multi-touch devices | |
CN106846496A (zh) | DICOM image viewing system based on mixed-reality technology and operating method | |
Kolb et al. | Towards gesture-based process modeling on multi-touch devices | |
CN108027663A (zh) | Combining a mobile device and person tracking for large display interactions | |
US11694413B2 (en) | Image editing and sharing in an augmented reality system | |
Menzner et al. | Above surface interaction for multiscale navigation in mobile virtual reality | |
Vock et al. | Idiar: Augmented reality dashboards to supervise mobile intervention studies | |
US20190378335A1 (en) | Viewer position coordination in simulated reality | |
Zocco et al. | Touchless interaction for command and control in military operations | |
García-Pereira et al. | MIME: A Mixed-Space Collaborative System with Three Immersion Levels and Multiple Users. | |
Zhang et al. | A hybrid 2D–3D tangible interface combining a smartphone and controller for virtual reality | |
Adhikarla et al. | Design and evaluation of freehand gesture interaction for light field display | |
US11410393B2 (en) | Auto arranging wall in an augmented reality system | |
Auda et al. | VRSketch: Investigating 2D sketching in virtual reality with different levels of hand and pen transparency | |
Arslan et al. | E-Pad: Large display pointing in a continuous interaction space around a mobile device | |
Kim et al. | Motion–display gain: A new control–display mapping reflecting natural human pointing gesture to enhance interaction with large displays at a distance | |
Belkacem et al. | Interactive Visualization on Large High-Resolution Displays: A Survey |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | Application publication date: 20190104 |