WO2018120090A1 - Implementation method for an augmented reality interface and head-mounted display device - Google Patents

Implementation method for an augmented reality interface and head-mounted display device Download PDF

Info

Publication number
WO2018120090A1
Authority
WO
WIPO (PCT)
Prior art keywords
target device
captured
mounted display
display device
head
Prior art date
Application number
PCT/CN2016/113662
Other languages
English (en)
French (fr)
Inventor
彭世平
Original Assignee
深圳市柔宇科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司 filed Critical 深圳市柔宇科技有限公司
Priority to PCT/CN2016/113662 priority Critical patent/WO2018120090A1/zh
Priority to CN201680036487.6A priority patent/CN107820706A/zh
Publication of WO2018120090A1 publication Critical patent/WO2018120090A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present invention relates to the field of head-mounted display, and in particular to an implementation method of an augmented reality interface and a head-mounted display device.
  • A head-mounted display (HMD) is a display device that can be worn on the head. HMDs fall into two categories: immersive and see-through. An immersive HMD provides the user with an immersive visual experience in application scenarios such as movie viewing and virtual reality (VR).
  • An object of the present invention is to provide an implementation method for an augmented reality interface and a head-mounted display device, so as to enhance the display effect of the head-mounted display device.
  • The implementation method for the augmented reality interface provided by an embodiment of the present invention is applicable to a head-mounted display device, and the method includes the following steps: establishing a communication connection with a target device; receiving 3D video sent by the target device; parsing the 3D video to obtain the environment in which the target device is currently located; generating, according to that environment, an augmented reality interface associated with it; and playing the 3D video through the augmented reality interface.
  • an embodiment of the present invention provides a head mounted display device, including:
  • a communication module configured to establish a communication connection with the target device
  • a receiving module configured to receive a 3D video sent by the target device
  • a parsing module configured to parse the 3D video to obtain an environment in which the target device is currently located
  • An interface generating module configured to generate an augmented reality interface associated with the environment according to an environment in which the target device is located;
  • a display control module configured to play the 3D video through the augmented reality interface.
  • an embodiment of the present invention provides a head mounted display device, including: a processor and a memory, where the processor is configured to perform the method of the first aspect.
  • With the solution provided by the present invention, the head-mounted display device can receive 3D video captured by another device, generate an augmented reality interface associated with that 3D video, and finally play the 3D video on the augmented reality interface. The generated augmented reality interface is thus associated with the information captured by the other device, which enhances the augmented reality display effect and improves the user's viewing experience.
  • FIG. 1 is a schematic diagram of a hardware environment in which an implementation system of an augmented reality interface is provided according to an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of a method for implementing an augmented reality interface according to an embodiment of the present invention, where the execution subject is a head mounted display device;
  • FIG. 3 is a schematic flowchart of a method for implementing an augmented reality interface according to an embodiment of the present invention, where the execution subject is a target device;
  • FIG. 4 is a schematic structural diagram of a head mounted display device according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of a hardware environment in which an implementation system of an augmented reality interface is disclosed according to an embodiment of the present invention.
  • the hardware environment includes a head mounted display device 110.
  • the head mounted display device 110 can be in communication connection with the target device 120 over a network.
  • the target device 120 can receive a control command sent by the remote controller 130.
  • the target device 120 is a drone.
  • the head-mounted display device 110 and the target device 120 communicate through the wireless network.
  • the wireless network may be WIFI, Bluetooth, Zigbee, WiMax, LTE, radio, etc., which is not limited by the present invention.
  • the target device 120 and the remote controller 130 are connected by wireless.
  • The target device 120 includes at least one processor 1201, at least one memory 1202, a communication interface 1203, a receiving/transmitting device 1204, a 3D camera 1205, and a driving device 1206.
  • The receiving/transmitting device 1204 is mounted on the driving device 1206, where the driving device 1206 may be a drive motor.
  • the processor 1201, the memory 1202, the communication interface 1203, the receiving/transmitting device 1204, the 3D camera 1205, and the driving device 1206 are connected by a communication bus and complete communication with each other.
  • the receiving/transmitting device 1204 may be an infrared sensor for transmitting an infrared signal.
  • When the infrared signal is blocked by another object, such as another user's target device, it is reflected back by that object and received again by the receiving/transmitting device 1204. Thus, when the infrared signal emitted by the receiving/transmitting device 1204 is blocked by another object and reflected back, the target device 120 is considered to have captured that object, and that object is considered to have been hit by the target device 120. In a game system, that object can be identified as the object the target device 120 is to hit.
  • The 3D camera 1205 is used to capture real-time 3D video.
  • the communication interface 1203 is used to establish a communication connection between the target device 120 and other devices.
  • the head mounted display device 110 includes at least one processor 1101, at least one memory 1102, a communication interface 1103, and a gyroscope 1104.
  • the processor 1101, the memory 1102, the communication interface 1103, and the gyroscope 1104 are connected by a communication bus and complete communication with each other.
  • The processor 1101 and the processor 1201 may each be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of the above programs.
  • The memory 1102 and the memory 1202 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but are not limited thereto.
  • the memory can exist independently and be connected to the processor via a bus.
  • the memory can also be integrated with the processor.
  • the gyroscope 1104 is used to acquire the rotation parameters when the user's head is rotated.
  • the memory 1102 is configured to store application code for executing the implementation method of the augmented reality interface according to the first embodiment of the present invention, and is controlled by the processor 1101.
  • the processor 1101 is configured to execute application code stored in the memory 1102.
  • the code stored in the memory 1102 can implement the implementation method of the augmented reality interface provided by the embodiment of the present invention performed by the head mounted display device provided below.
  • the memory 1202 is configured to store application code for executing the implementation method of the augmented reality interface according to the second embodiment of the present invention, and is controlled by the processor 1201 for execution.
  • the processor 1201 is configured to execute application code stored in the memory 1202.
  • The code stored in the memory 1202 can carry out the implementation method for the augmented reality interface provided by the embodiments of the present invention, as performed by the target device 120 and described below.
  • Referring to FIG. 2, a flowchart of an implementation method for an augmented reality interface according to the first embodiment of the present invention is shown.
  • the execution body is described by taking a head-mounted display device as an example.
  • the implementation method of the augmented reality interface includes the following steps:
  • Step S201 establishing a communication connection with the target device.
  • The specific implementation of step S201 is: the processor 1101 of the head-mounted display device sends a communication connection request to the target device through the communication interface 1103; the processor 1101 then receives, through the communication interface 1103, a connection-allowed response sent by the target device in reply to the communication connection request, completing the establishment of the communication connection with the target device.
  • Step S202 receiving a 3D video sent by the target device.
  • Before step S202, the method further includes: the processor 1101 of the head-mounted display device sends a video information acquisition request to the target device through the communication interface 1103, the request being used to control the target device to respond by sending the captured 3D video to the head-mounted display device.
  • The video information acquisition request may also carry a period with which the head-mounted display device controls the target device to periodically send 3D video, for example one 3D video segment every 2 s, so that the head-mounted display device can display the content captured by the target device in real time.
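The periodic-push behaviour described above can be sketched as follows. This is purely an illustration, not the patent's implementation; the names `VideoRequest` and `next_send_times` are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VideoRequest:
    """Hypothetical video information acquisition request.

    period_s carries the interval at which the target device should
    push 3D video back to the head-mounted display device.
    """
    period_s: float = 2.0  # e.g. send one 3D video segment every 2 seconds

def next_send_times(request: VideoRequest, start: float, count: int) -> list:
    """Return the first `count` times at which the target device pushes video."""
    return [start + i * request.period_s for i in range(count)]

# With the example period of 2 s, the first three pushes after t=0
# fall at t = 0, 2 and 4 seconds.
times = next_send_times(VideoRequest(period_s=2.0), start=0.0, count=3)
print(times)  # [0.0, 2.0, 4.0]
```

Here the period travels inside the request itself, matching the text's description that the request "carries" the sending period.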
  • Step S203 parsing the 3D video to obtain an environment in which the target device is currently located.
  • Step S204 Generate an augmented reality interface associated with the environment according to an environment in which the target device is located. For example, if the target device is currently in a playground, then the generated augmented reality interface is associated with the playground.
  • Step S205 Play the 3D video through the augmented reality interface.
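Steps S203 through S205 can be summarized in a short sketch. Everything below is hypothetical, since the patent does not define data formats or APIs; the environment "parsing" of step S203 is reduced to a stub keyword lookup over assumed video metadata tags.

```python
def parse_environment(video_tags):
    """Stub for step S203: derive the target device's current environment
    from (hypothetical) metadata tags of the received 3D video."""
    known = {"playground", "forest", "city"}
    for tag in video_tags:
        if tag in known:
            return tag
    return "unknown"

def generate_ar_interface(environment):
    """Stub for step S204: build an AR interface themed to the environment."""
    return {"theme": environment, "widgets": ["crosshair", "video_surface"]}

def play(video, interface):
    """Stub for step S205: render the 3D video inside the AR interface."""
    return f"playing {video} on a {interface['theme']}-themed interface"

# Example run for a target device currently over a playground:
env = parse_environment(["outdoor", "playground"])
ui = generate_ar_interface(env)
print(play("drone_feed_001", ui))
```

The point of the sketch is the data flow: the environment inferred in S203 parameterizes the interface generated in S204, which then hosts the playback of S205.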
  • the implementation method shown in FIG. 2 further includes:
  • The gyroscope 1104 of the head-mounted display device acquires a first rotation parameter of the user's head; the processor 1101 of the head-mounted display device 110 sends a control instruction to the target device 120 according to the first rotation parameter, the control instruction carrying the first rotation parameter and being used to control the target device 120 to capture 3D video according to the first rotation parameter.
  • The gyroscope 1104 is arranged on the head-mounted display device. When the user wears the device and turns his or her head, the gyroscope 1104 detects information such as the position and angle of the head rotation and thereby acquires the rotation parameters of the user's head. The processor 1101 of the head-mounted display device then sends, through the communication interface 1103, the rotation parameters acquired by the gyroscope 1104 to the target device 120; the processor 1201 of the target device 120 rotates the 3D camera 1205 according to these parameters, so that the 3D camera 1205 captures 3D video from another viewing angle, and sends that 3D video to the head-mounted display device 110. In this way, 3D video from different viewing angles can be obtained through the rotation of the user's head.
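The head-rotation control instruction described above can be sketched as a small message-building step. The wire format is an assumption (the patent does not specify one); `make_control_instruction` and the JSON field names are hypothetical.

```python
import json

def make_control_instruction(angle_deg, angular_velocity_rad_s, direction):
    """Package the gyroscope's rotation parameters into a (hypothetical)
    control instruction that the head-mounted display device sends to the
    target device, which then rotates its 3D camera accordingly."""
    return json.dumps({
        "type": "rotate_camera",
        "rotation": {
            "angle_deg": angle_deg,
            "angular_velocity_rad_s": angular_velocity_rad_s,
            "direction": direction,
        },
    })

# The worked example from the text: 90 deg, 5 rad/s, horizontal right.
instruction = make_control_instruction(90, 5.0, "horizontal_right")
decoded = json.loads(instruction)
print(decoded["rotation"]["angle_deg"])  # 90
```

Carrying the rotation parameters inside the instruction mirrors the text's statement that "the control instruction carries the first rotation parameter".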
  • The implementation method shown in FIG. 2 further includes: when the 3D video sent by the target device includes an object to be captured, displaying on the augmented reality interface a crosshair for identifying the object to be captured. The object to be captured is another target device contained in the video information acquired by the target device. The crosshair is an aiming mark used in optical sights, consisting of two thin lines intersecting at right angles. The crosshair identifies the object to be captured; that is, the crosshair is displayed based on the object to be captured.
  • the implementation method shown in FIG. 2 further includes:
  • The gyroscope 1104 of the head-mounted display device acquires a second rotation parameter of the user's head; and the processor 1101 of the head-mounted display device 110 moves the crosshair according to the second rotation parameter.
  • For example, if the rotation parameters acquired by the gyroscope 1104 are a rotation angle of 90°, a rotation angular velocity of 5 rad/s, and a horizontal rightward rotation direction, then the crosshair rotates with a rotation angle of 90°, an angular velocity of 5 rad/s, and a horizontal rightward direction.
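The crosshair movement in the example above can be sketched as a mapping from head-rotation parameters to an on-screen displacement. The patent does not specify this mapping, so the pixels-per-degree scale and the function name `move_crosshair` are purely illustrative assumptions.

```python
def move_crosshair(pos, angle_deg, direction):
    """Move the on-screen crosshair according to the head's rotation
    parameters. The rotation-angle-to-pixels mapping is an assumption
    (here: 4 px per degree), used only for illustration."""
    px_per_degree = 4
    dx = {"horizontal_right": 1, "horizontal_left": -1}.get(direction, 0)
    x, y = pos
    return (x + dx * angle_deg * px_per_degree, y)

# A 90-degree horizontal-right head rotation shifts the crosshair
# 90 * 4 = 360 px to the right.
print(move_crosshair((100, 100), 90, "horizontal_right"))  # (460, 100)
```

In a real system the angular velocity from the gyroscope would drive the animation speed of this displacement; here only the final position is computed.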
  • the implementation method shown in FIG. 2 further includes:
  • The processor 1101 of the head-mounted display device sends the second rotation parameter to the target device in real time through the communication interface 1103, so that the target device 120 controls the driving device 1206 to rotate according to the second rotation parameter, thereby driving the receiving/transmitting device 1204 to rotate.
  • Because the object to be captured is captured by the target device 120, the rotation parameters of the user's head must be sent to the target device in real time, so that the driving device 1206 of the target device 120 rotates and drives the receiving/transmitting device 1204 to rotate, thereby aiming the receiving/transmitting device 1204 at the object to be captured.
  • For example, given a rotation angle of 90°, a rotation angular velocity of 5 rad/s, and a horizontal rightward rotation direction, the target device controls the driving device 1206 to rotate based on those rotation parameters.
  • the method shown in FIG. 2 further includes:
  • When detecting a pressing operation by the user on the capture button, the processor 1101 of the head-mounted display device sends a capture instruction to the target device through the communication interface 1103, the capture instruction being used to control the receiving/transmitting device 1204 to emit a capture signal so as to capture the object to be captured.
  • The capture button may be a button for triggering the capture instruction arranged on the outer surface of the head-mounted display device, or a button on another device connected to the head-mounted display device for triggering the capture instruction, such as the trigger of a game gun connected to the head-mounted display device.
  • the method further includes:
  • The processor 1101 of the head-mounted display device 110 receives, through the communication interface 1103, a capture success instruction sent by the target device 120, the capture success instruction being triggered when the receiving/transmitting device 1204 receives the capture signal reflected back by the object to be captured.
  • When the head-mounted display device 110 receives the capture success instruction, the processor 1101 generates a screen indicating that the object to be captured has been captured, and displays the screen on the augmented reality interface.
  • The screen indicating that the object to be captured has been captured may be an explosion screen, a screen showing the object being shot, or the like; the present invention imposes no limitation.
  • the method further includes:
  • Upon determining that the object to be captured has been captured, the processor 1101 of the head-mounted display device acquires audio indicating that the object to be captured has been captured, and plays the audio when the screen is displayed.
  • Specifically, the processor 1101 of the head-mounted display device acquires, according to the screen, the audio indicating that the object to be captured has been captured. For example, if the screen is an explosion screen, the acquired audio is audio representing the explosion, such as a rumbling sound. As another example, if the screen is an explosion screen, the acquired audio may instead be a voice announcement, such as "the object to be captured has been hit".
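The screen-to-audio selection described above amounts to a lookup keyed on the kind of screen being shown. A minimal sketch, with entirely hypothetical filenames and keys:

```python
def audio_for_screen(screen_kind):
    """Pick capture-feedback audio based on the displayed screen,
    mirroring the examples in the text. All names are hypothetical."""
    mapping = {
        "explosion": "rumble.wav",            # a rumbling explosion sound
        "hit_voice": "target_hit_voice.wav",  # e.g. "the object has been hit"
    }
    return mapping.get(screen_kind, "default_capture.wav")

print(audio_for_screen("explosion"))  # rumble.wav
```

The fallback entry handles screens for which no dedicated audio has been authored, a detail the patent leaves open.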
  • the rotation parameter includes at least one of a rotation angle, a rotation direction, and a rotation angular rate.
  • It can be seen that with the solution provided by the present invention, the head-mounted display device can receive 3D video captured by another device, generate an augmented reality interface associated with that 3D video, and finally play the 3D video on that augmented reality interface. The generated augmented reality interface is thus associated with the information captured by the other device, which enhances the augmented reality display effect and improves the user's viewing experience.
  • FIG. 3 is a schematic flowchart of the implementation method for the augmented reality interface with the target device as the execution subject.
  • the implementation method includes the following steps:
  • Step S301 establishing a communication connection with the head mounted display device.
  • The specific implementation of step S301 is: the processor 1201 of the target device receives, through the communication interface 1203, a communication connection request sent by the head-mounted display device; in reply to the communication connection request, the processor 1201 of the target device sends a connection-allowed response to the head-mounted display device through the communication interface 1203, completing the establishment of the communication connection with the head-mounted display device.
  • Step S302 Receive a video information acquisition request sent by the head mounted display device and collect 3D video in response to the video information acquisition request.
  • The specific implementation of step S302 is: the processor 1201 of the target device controls the 3D camera 1205 of the target device to turn on and capture the 3D video.
  • Step S303 Send the 3D video to the head-mounted display device.
  • the implementation method shown in FIG. 3 further includes:
  • The processor 1201 of the target device receives, through the communication interface 1203, a control instruction sent by the head-mounted display device, the control instruction carrying the first rotation parameter and being used to instruct the target device to capture 3D video according to the first rotation parameter; the processor 1201 of the target device rotates the 3D camera 1205 according to the first rotation parameter; 3D video from another viewing angle is captured by the 3D camera 1205, and the processor 1201 of the target device sends this 3D video to the head-mounted display device through the communication interface 1203.
  • the implementation method shown in FIG. 3 further includes:
  • The processor 1201 of the target device receives, through the communication interface 1203, the second rotation parameter sent in real time by the head-mounted display device 110; the processor 1201 of the target device controls the driving device 1206 to rotate according to the second rotation parameter, thereby driving the receiving/transmitting device 1204 to rotate.
  • the implementation method shown in FIG. 3 further includes:
  • The processor 1201 of the target device receives, through the communication interface 1203, the capture instruction sent by the head-mounted display device; the processor 1201 of the target device controls the receiving/transmitting device 1204 of the target device to emit a capture signal so as to capture the object to be captured.
  • the implementation method shown in FIG. 3 further includes:
  • the processor 1201 of the target device transmits a capture success instruction through the communication interface 1203.
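The target-device side of the capture exchange, i.e. emitting the infrared capture signal on instruction and reporting success when the reflection comes back, can be sketched as follows. The class and method names are hypothetical, and the infrared reflection is reduced to a boolean input for illustration.

```python
class TargetDevice:
    """Minimal sketch of the target device's capture logic: it emits an IR
    capture signal on request, and when the receiving/transmitting device
    detects the signal reflected back by another object, it reports a
    capture success to the head-mounted display device."""

    def __init__(self):
        self.sent = []  # log of messages/signals, newest last

    def on_capture_instruction(self, ir_reflected):
        # Emit the capture signal via the receiving/transmitting device 1204.
        self.sent.append("capture_signal")
        # If the signal comes back reflected by another object, that object
        # is considered captured (hit), and a capture success instruction is
        # sent back over the communication interface 1203.
        if ir_reflected:
            self.sent.append("capture_success")
        return self.sent[-1]

device = TargetDevice()
print(device.on_capture_instruction(ir_reflected=True))  # capture_success
```

Note that when no reflection is detected, the sketch simply stops after emitting the capture signal, matching the text, which only defines behaviour for the reflected case.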
  • the embodiment of the invention further provides a head mounted display device 400, as shown in FIG. 4, comprising:
  • a communication module 401 configured to establish a communication connection with a target device
  • the receiving module 402 is configured to receive a 3D video sent by the target device.
  • a parsing module 403 configured to parse the 3D video to obtain an environment in which the target device is currently located;
  • the interface generating module 404 is configured to generate an augmented reality interface associated with the environment according to an environment in which the target device is located;
  • the display control module 405 is configured to play the 3D video through the augmented reality interface.
  • Optionally, the head-mounted display device includes a gyroscope used to acquire rotation parameters when the user's head rotates, the gyroscope acquiring a first rotation parameter of the user's head; the head-mounted display device further includes:
  • a sending module 406 configured to send a control instruction to the target device according to the first rotation parameter, the control instruction carrying the first rotation parameter and being used to instruct the target device to capture 3D video according to the first rotation parameter.
  • The display control module 405 is further configured to: when the 3D video sent by the target device includes an object to be captured, display on the augmented reality interface a crosshair for identifying the object to be captured.
  • the gyroscope acquires a second rotation parameter of the user's head; the display control module 405 is further configured to move the crosshair according to the second rotation parameter.
  • the target device includes a driving device and a receiving/transmitting device disposed on the driving device,
  • The sending module 406 is further configured to send the second rotation parameter to the target device in real time, so that the target device controls the driving device to rotate according to the second rotation parameter, thereby driving the receiving/transmitting device to rotate.
  • The sending module 406 is further configured to: when detecting a pressing operation by the user on the capture button, send a capture instruction to the target device, the capture instruction being used to control the receiving/transmitting device to emit a capture signal so as to capture the object to be captured.
  • The receiving module 402 is further configured to receive a capture success instruction sent by the target device, the capture success instruction being triggered when the receiving/transmitting device receives the capture signal reflected back from the object to be captured;
  • the head mounted display device further includes:
  • a screen generating module 407 configured to generate a screen for indicating that the object to be captured has been captured
  • the display control module 405 is further configured to display the screen on the augmented reality interface.
  • the head mounted display device further includes:
  • the audio acquisition module 408 is configured to: when determining that the object to be captured has been captured, acquire audio for indicating that the object to be captured has been captured;
  • the audio playing module 409 is configured to play the audio when the screen is displayed.
  • the rotation parameter includes at least one of a rotation angle, a rotation direction, and a rotation angular rate.
  • Each of the above modules (the communication module 401, receiving module 402, parsing module 403, interface generating module 404, display control module 405, sending module 406, screen generating module 407, audio acquiring module 408, and audio playing module 409) is configured to perform the relevant steps of the above method.
  • the head mounted display device 400 is presented in the form of a module.
  • a “module” herein may refer to an application-specific integrated circuit (ASIC), a processor and memory that executes one or more software or firmware programs, integrated logic circuits, and/or other devices that provide the above functionality.
  • The communication module 401, the parsing module 403, the interface generating module 404, the display control module 405, the screen generating module 407, the audio acquiring module 408, and the audio playing module 409 can be implemented by the processor 1101 of the head-mounted display device shown in FIG. 1.
  • The receiving module 402 and the sending module 406 can be implemented by the communication interface 1103 of the head-mounted display device shown in FIG. 1.
  • the embodiment of the present invention further provides a computer storage medium, wherein the computer storage medium can store a program, and the program includes some or all of the steps of implementing the augmented reality interface described in the foregoing method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An implementation method for an augmented reality interface, applicable to a head-mounted display device, the method comprising: establishing a communication connection with a target device (S201); receiving 3D video sent by the target device (S202); parsing the 3D video to obtain the environment in which the target device is currently located (S203); generating, according to the environment in which the target device is located, an augmented reality interface associated with that environment (S204); and playing the 3D video through the augmented reality interface (S205).

Description

Implementation method for an augmented reality interface and head-mounted display device
Technical Field
The present invention relates to the field of head-mounted displays, and in particular to an implementation method for an augmented reality interface and a head-mounted display device.
Background Art
A head-mounted display (Head Mounted Display, HMD) is a display device that can be worn on the head. HMDs fall into two categories: immersive and see-through. An immersive HMD provides the user with an immersive visual experience in application scenarios such as movie viewing and virtual reality (Virtual Reality, VR).
Summary of the Invention
The object of the present invention is to provide an implementation method for an augmented reality interface and a head-mounted display device, so as to enhance the display effect of the head-mounted display device.
To achieve the above object, the implementation method for an augmented reality interface provided by an embodiment of the present invention is applicable to a head-mounted display device, and the method comprises the steps of:
establishing a communication connection with a target device;
receiving 3D video sent by the target device;
parsing the 3D video to obtain the environment in which the target device is currently located;
generating, according to the environment in which the target device is located, an augmented reality interface associated with that environment; and
playing the 3D video through the augmented reality interface.
In a second aspect, an embodiment of the present invention provides a head-mounted display device, comprising:
a communication module configured to establish a communication connection with a target device;
a receiving module configured to receive 3D video sent by the target device;
a parsing module configured to parse the 3D video to obtain the environment in which the target device is currently located;
an interface generating module configured to generate, according to the environment in which the target device is located, an augmented reality interface associated with that environment; and
a display control module configured to play the 3D video through the augmented reality interface.
In a third aspect, an embodiment of the present invention provides a head-mounted display device comprising a processor and a memory, the processor being configured to perform the method of the first aspect.
With the solution provided by the present invention, the head-mounted display device can receive 3D video captured by another device, generate an augmented reality interface associated with that 3D video, and finally play the 3D video on that augmented reality interface. The generated augmented reality interface is thus associated with the information captured by the other device, which enhances the augmented reality display effect and improves the user's viewing experience.
Brief Description of the Drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of the hardware environment in which an implementation system for an augmented reality interface runs, according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an implementation method for an augmented reality interface according to an embodiment of the present invention, in which the execution subject is a head-mounted display device;
FIG. 3 is a schematic flowchart of an implementation method for an augmented reality interface according to an embodiment of the present invention, in which the execution subject is a target device;
FIG. 4 is a schematic structural diagram of a head-mounted display device according to an embodiment of the present invention.
Detailed Description of the Embodiments
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
It should be noted that the terms "first", "second", "third" and the like in the specification, claims and drawings of the present invention are used to distinguish different objects, not to describe a specific order.
The implementation method for an augmented reality interface described in the present invention is applicable to an air-combat battle game based on mixed display. The game can involve multiple players, and each participating player has at least one head-mounted display device, one target device, and a remote controller capable of remotely controlling the target device. Below, the implementation method for the augmented reality interface is described in detail using the equipment of a single player. Referring to FIG. 1, FIG. 1 is a schematic diagram of the hardware environment in which an implementation system for an augmented reality interface disclosed in an embodiment of the present invention runs. The hardware environment includes a head-mounted display device 110. The head-mounted display device 110 can be communicatively connected with a target device 120 over a network. The target device 120 can receive control commands sent by a remote controller 130. In one embodiment, the target device 120 is a drone. The head-mounted display device 110 and the target device 120 communicate over a wireless network, which may be WiFi, Bluetooth, Zigbee, WiMax, LTE, radio and the like; the present invention is not limited in this regard. The target device 120 and the remote controller 130 are connected wirelessly.
The target device 120 includes at least one processor 1201, at least one memory 1202, a communication interface 1203, a receiving/transmitting device 1204, a 3D camera 1205, and a driving device 1206. The receiving/transmitting device 1204 is mounted on the driving device 1206, where the driving device 1206 may be a drive motor. The processor 1201, the memory 1202, the communication interface 1203, the receiving/transmitting device 1204, the 3D camera 1205, and the driving device 1206 are connected via a communication bus and communicate with one another. The receiving/transmitting device 1204 may be an infrared sensor used to emit an infrared signal. When that infrared signal is blocked by another object, such as another user's target device, it is reflected back by that object and received again by the receiving/transmitting device 1204. Thus, when the infrared signal emitted by the receiving/transmitting device 1204 is blocked by another object and reflected back, the target device 120 is considered to have captured that object, and that object is considered to have been hit by the target device 120. In a game system, that object can be identified as the object the target device 120 is to hit. The 3D camera 1205 is used to capture real-time 3D video. The communication interface 1203 is used to establish communication connections between the target device 120 and other devices.
The head-mounted display device 110 includes at least one processor 1101, at least one memory 1102, a communication interface 1103, and a gyroscope 1104. The processor 1101, the memory 1102, the communication interface 1103, and the gyroscope 1104 are connected via a communication bus and communicate with one another.
The processor 1101 and the processor 1201 may each be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of the programs of the above solution.
The memory 1102 and the memory 1202 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but are not limited thereto. The memory may exist independently and be connected to the processor via a bus, or may be integrated with the processor.
The gyroscope 1104 is used to acquire rotation parameters when the user's head rotates.
The memory 1102 is used to store the application program code for executing the implementation method for the augmented reality interface according to the first embodiment of the present invention, and execution is controlled by the processor 1101. The processor 1101 is used to execute the application program code stored in the memory 1102. The code stored in the memory 1102 can carry out the implementation method for the augmented reality interface provided by the embodiments of the present invention, as performed by the head-mounted display device and described below.
The memory 1202 is used to store the application program code for executing the implementation method for the augmented reality interface according to the second embodiment of the present invention, and execution is controlled by the processor 1201. The processor 1201 is used to execute the application program code stored in the memory 1202. The code stored in the memory 1202 can carry out the implementation method for the augmented reality interface provided by the embodiments of the present invention, as performed by the target device 120 and described below.
The implementation method for the augmented reality interface provided by the embodiments of the present invention is described in detail below with reference to the hardware environment, shown in FIG. 1, in which the implementation system for the augmented reality interface runs.
Referring to FIG. 2, a schematic flowchart of an implementation method for an augmented reality interface according to the first embodiment of the present invention is shown. The execution subject is described using a head-mounted display device as an example, and the implementation method includes the following steps:
Step S201: Establish a communication connection with the target device.
In one embodiment, the specific implementation of step S201 is: the processor 1101 of the head-mounted display device sends a communication connection request to the target device through the communication interface 1103; the processor 1101 then receives, through the communication interface 1103, a connection-allowed response sent by the target device in reply to the communication connection request, completing the establishment of the communication connection with the target device.
Step S202: Receive the 3D video sent by the target device.
In one embodiment, before step S202, the method further includes: the processor 1101 of the head-mounted display device sends a video information acquisition request to the target device through the communication interface 1103, the video information acquisition request being used to control the target device to respond to the request by sending the captured 3D video to the head-mounted display device.
In one embodiment, the video information acquisition request also carries a period with which the head-mounted display device controls the target device to periodically send 3D video to the head-mounted display device, for example once every 2 s, so that the head-mounted display device can display the content captured by the target device in real time.
Step S203: Parse the 3D video to obtain the environment in which the target device is currently located.
Step S204: Generate, according to the environment in which the target device is located, an augmented reality interface associated with that environment. For example, if the target device is currently in a playground, the generated augmented reality interface is associated with the playground.
Step S205: Play the 3D video through the augmented reality interface.
In one embodiment, the implementation method shown in FIG. 2 further includes:
The gyroscope 1104 of the head-mounted display device acquires a first rotation parameter of the user's head; the processor 1101 of the head-mounted display device 110 sends a control instruction to the target device 120 according to the first rotation parameter, the control instruction carrying the first rotation parameter and being used to control the target device 120 to capture 3D video according to the first rotation parameter.
The head-mounted display device is equipped with the gyroscope 1104. When the user wears the head-mounted display device and turns his or her head, the gyroscope 1104 can detect information such as the position and angle of the head rotation and thereby acquire the rotation parameters of the user's head. The processor 1101 of the head-mounted display device then sends, through the communication interface 1103, the rotation parameters acquired by the gyroscope 1104 to the target device 120; the processor 1201 of the target device 120 rotates the 3D camera 1205 according to these rotation parameters, so that the 3D camera 1205 captures 3D video from another viewing angle, and sends the 3D video captured by the 3D camera to the head-mounted display device 110. In this way, 3D video from different viewing angles can be obtained through the rotation of the user's head.
In one embodiment, the implementation method shown in FIG. 2 further includes: when the 3D video sent by the target device includes an object to be captured, displaying on the augmented reality interface a crosshair for identifying the object to be captured. The object to be captured is another target device contained in the video information acquired by the target device. The crosshair is an aiming mark used in optical sights, consisting of two thin lines intersecting at right angles. The crosshair identifies the object to be captured; that is, the crosshair is displayed based on the object to be captured.
In one embodiment, the implementation method shown in FIG. 2 further includes:
The gyroscope 1104 of the head-mounted display device acquires a second rotation parameter of the user's head; and the processor 1101 of the head-mounted display device 110 moves the crosshair according to the second rotation parameter.
For example, if the rotation parameters acquired by the gyroscope 1104 are a rotation angle of 90°, a rotation angular velocity of 5 rad/s, and a horizontal rightward rotation direction, then the crosshair rotates with a rotation angle of 90°, a rotation angular velocity of 5 rad/s, and a horizontal rightward direction.
In one embodiment, the implementation method shown in FIG. 2 further includes:
The processor 1101 of the head-mounted display device sends the second rotation parameter to the target device in real time through the communication interface 1103, so that the target device 120 controls the driving device 1206 to rotate according to the second rotation parameter, thereby driving the receiving/transmitting device 1204 to rotate.
For example, because the object to be captured is captured by the target device 120, the rotation parameters of the user's head must be sent to the target device in real time, so that the driving device 1206 of the target device 120 rotates and drives the receiving/transmitting device 1204 of the target device to rotate, thereby aiming the receiving/transmitting device 1204 at the object to be captured. For instance, if the rotation parameters acquired by the gyroscope 1104 are a rotation angle of 90°, a rotation angular velocity of 5 rad/s, and a horizontal rightward rotation direction, the target device controls the driving device 1206 to rotate based on those rotation parameters.
In an embodiment, the method shown in FIG. 2 further includes:
When a pressing operation by the user on a capture button is detected, the processor 1101 of the head-mounted display device sends a capture instruction to the target device through the communication interface 1103, the capture instruction being used to cause the transceiver device 1204 to emit a capture signal to capture the object to be captured.
The capture button may be a button provided on the outer surface of the head-mounted display device for triggering the capture instruction, or a button on another device connected to the head-mounted display device for triggering the capture instruction, for example, the trigger of a gaming gun connected to the head-mounted display device.
In an embodiment, the method further includes:
The processor 1101 of the head-mounted display device 110 receives, through the communication interface 1103, a capture-success instruction sent by the target device 120, the capture-success instruction being triggered when the transceiver device 1204 receives the capture signal reflected back from the object to be captured. When the head-mounted display device 110 receives the capture-success instruction sent by the target device 120, the processor 1101 of the head-mounted display device 110 generates a picture indicating that the object to be captured has been captured, and displays the picture on the augmented reality interface.
The picture indicating that the object has been captured may be an explosion picture, a picture of the object being shot, or the like; the present invention does not impose a sole limitation.
In an embodiment, the method further includes:
When it is determined that the object to be captured has been captured, the processor 1101 of the head-mounted display device acquires audio indicating that the object has been captured, and plays the audio while displaying the picture.
Further, acquiring the audio indicating that the object has been captured is specifically implemented as follows: the processor 1101 of the head-mounted display device acquires the audio according to the picture. For example, if the picture is an explosion picture, the acquired audio is audio representing an explosion, such as a rumbling sound. For another example, if the picture shows the object being shot, the acquired audio may be speech, such as "the object to be captured has been hit".
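The picture-to-audio lookup described here could be as simple as the following sketch; the file names and the mapping itself are invented for illustration:

```python
# Hypothetical mapping from capture-picture type to the audio played with it.
AUDIO_FOR_PICTURE = {
    "explosion": "rumble.wav",       # e.g. a rumbling explosion sound
    "shot": "target_hit_voice.wav",  # e.g. spoken "the object has been hit"
}

def audio_for(picture_type):
    """Pick the audio clip matching the generated capture picture."""
    return AUDIO_FOR_PICTURE.get(picture_type, "generic_capture.wav")

print(audio_for("explosion"))
```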
In an embodiment, the rotation parameters include at least one of a rotation angle, a rotation direction, and an angular rotation rate.
It can be seen that, in the solution provided by the present invention, the head-mounted display device can receive 3D video captured by another device, generate an augmented reality interface associated with the 3D video, and play the 3D video on that interface. Since the generated augmented reality interface is associated with the information captured by the other device, the effect of the augmented reality picture is enhanced, which improves the user's viewing experience.
Referring to FIG. 3, a schematic flowchart of the implementation method of the augmented reality interface with the target device as the executing entity is shown. The method includes the following steps:
Step S301: establishing a communication connection with the head-mounted display device.
In an embodiment, step S301 is specifically implemented as follows: the processor 1201 of the target device receives, through the communication interface 1203, a communication connection request sent by the head-mounted display device; in reply to the communication connection request, the processor 1201 of the target device sends a connection-permitted response to the head-mounted display device through the communication interface 1203, thereby completing the establishment of the communication connection with the head-mounted display device.
Step S302: receiving a video information acquisition request sent by the head-mounted display device, and capturing 3D video in response to the request.
In an embodiment, step S302 is specifically implemented as follows: the processor 1201 of the target device turns on the 3D camera 1205 of the target device to capture the 3D video.
Step S303: sending the 3D video to the head-mounted display device.
In an embodiment, the implementation method shown in FIG. 3 further includes:
The processor 1201 of the target device receives, through the communication interface 1203, a control instruction sent by the head-mounted display device, the control instruction carrying the first rotation parameter and being used to instruct the target device to capture 3D video according to the first rotation parameter; the processor 1201 of the target device rotates the 3D camera 1205 according to the first rotation parameter; 3D video from another viewing angle is captured through the 3D camera 1205, and the processor 1201 of the target device sends this 3D video to the head-mounted display device through the communication interface 1203.
In an embodiment, the implementation method shown in FIG. 3 further includes:
The processor 1201 of the target device receives, through the communication interface 1203, the second rotation parameter sent in real time by the head-mounted display device 110; the processor 1201 of the target device controls the drive device 1206 to rotate according to the second rotation parameter, thereby rotating the transceiver device 1204.
In an embodiment, the implementation method shown in FIG. 3 further includes:
The processor 1201 of the target device receives, through the communication interface 1203, a capture instruction sent by the head-mounted display device; the processor 1201 of the target device controls the transceiver device 1204 of the target device to emit a capture signal to capture the object to be captured.
In an embodiment, the implementation method shown in FIG. 3 further includes:
When the transceiver device 1204 receives the capture signal reflected back from the object to be captured, the processor 1201 of the target device sends a capture-success instruction through the communication interface 1203.
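The target-device side of this capture exchange can be sketched as below; the transceiver is simulated with a dictionary and the message shape is an assumption, since real hardware behaviour is not specified by the patent:

```python
# Simulated target-device capture flow: emit a capture signal and, if the
# transceiver "receives" it reflected back from the object, send a
# capture-success instruction to the head-mounted display.

def emit_and_listen(transceiver):
    """Emit the capture signal and report whether a reflection came back."""
    transceiver["emitted"] = True
    return transceiver.get("reflection_received", False)

def handle_capture_instruction(transceiver, send):
    """Process a capture instruction received from the head-mounted display."""
    if emit_and_listen(transceiver):
        send({"type": "capture_success"})
        return True
    return False

outbox = []
hit = handle_capture_instruction({"reflection_received": True}, outbox.append)
print(hit, outbox)
```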
An embodiment of the present invention further provides a head-mounted display device 400, as shown in FIG. 4, including:
a communication module 401, configured to establish a communication connection with a target device;
a receiving module 402, configured to receive the 3D video sent by the target device;
a parsing module 403, configured to parse the 3D video to obtain the environment in which the target device is currently located;
an interface generation module 404, configured to generate, according to the environment in which the target device is located, an augmented reality interface associated with that environment; and
a display control module 405, configured to play the 3D video through the augmented reality interface.
Optionally, the head-mounted display device includes a gyroscope configured to acquire rotation parameters when the user's head rotates, the gyroscope acquiring a first rotation parameter of the user's head; the head-mounted display device further includes:
a sending module 406, configured to send a control instruction to the target device according to the first rotation parameter, the control instruction carrying the first rotation parameter and being used to instruct the target device to capture 3D video according to the first rotation parameter.
Optionally, the display control module 405 is further configured to display, on the augmented reality interface, a crosshair for marking the object to be captured when the 3D video sent by the target device contains such an object.
Optionally, the gyroscope acquires a second rotation parameter of the user's head; and the display control module 405 is further configured to move the crosshair according to the second rotation parameter.
Optionally, the target device includes a drive device and a transceiver device mounted on the drive device, and
the sending module 406 is further configured to send the second rotation parameter to the target device in real time, so that the target device controls the drive device to rotate according to the second rotation parameter, thereby rotating the transceiver device.
Optionally, the sending module 406 is further configured to send a capture instruction to the target device when a pressing operation by the user on the capture button is detected, the capture instruction being used to cause the transceiver device to emit a capture signal to capture the object to be captured.
Optionally, the receiving module 402 is further configured to receive a capture-success instruction sent by the target device, the capture-success instruction being triggered when the transceiver device receives the capture signal reflected back from the object to be captured;
the head-mounted display device further includes:
a picture generation module 407, configured to generate a picture indicating that the object to be captured has been captured; and
the display control module 405 is further configured to display the picture on the augmented reality interface.
Optionally, the head-mounted display device further includes:
an audio acquisition module 408, configured to acquire, when it is determined that the object has been captured, audio indicating that the object has been captured; and
an audio playback module 409, configured to play the audio while displaying the picture.
Optionally, the rotation parameters include at least one of a rotation angle, a rotation direction, and an angular rotation rate.
It should be noted that the above modules (the communication module 401, receiving module 402, parsing module 403, interface generation module 404, display control module 405, sending module 406, picture generation module 407, audio acquisition module 408, and audio playback module 409) are configured to perform the relevant steps of the above method.
In this embodiment, the head-mounted display device 400 is presented in the form of modules. A "module" here may refer to an application-specific integrated circuit (ASIC), a processor and memory executing one or more software or firmware programs, an integrated logic circuit, and/or other devices that can provide the above functions. Moreover, the communication module 401, parsing module 403, interface generation module 404, display control module 405, picture generation module 407, audio acquisition module 408, and audio playback module 409 may be implemented by the processor 1101 of the head-mounted display device shown in FIG. 1, and the receiving module 402 and sending module 406 may be implemented by the processor 1101 and the communication interface 1103 of the head-mounted display device shown in FIG. 1.
An embodiment of the present invention further provides a computer storage medium that stores a program which, when executed, performs some or all of the steps of any of the implementation methods of the augmented reality interface described in the above method embodiments.
It should be noted that, for brevity of description, each of the foregoing method embodiments is presented as a series of action combinations. However, those skilled in the art should understand that the present invention is not limited by the described order of actions, since according to the present invention some steps may be performed in other orders or simultaneously. Furthermore, those skilled in the art should also understand that the embodiments described in this specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
The embodiments of the present invention have been described in detail above, and specific examples have been used herein to explain the principles and implementations of the present invention. The description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes may be made to the specific implementations and application scope in accordance with the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (19)

  1. An implementation method of an augmented reality interface, applicable to a head-mounted display device, characterized in that the method comprises:
    establishing a communication connection with a target device;
    receiving 3D video sent by the target device;
    parsing the 3D video to obtain the environment in which the target device is currently located;
    generating, according to the environment in which the target device is located, an augmented reality interface associated with that environment; and
    playing the 3D video through the augmented reality interface.
  2. The implementation method according to claim 1, characterized in that the head-mounted display device comprises a gyroscope configured to acquire rotation parameters when the user's head rotates; the method further comprises:
    the gyroscope acquiring a first rotation parameter of the user's head; and
    sending a control instruction to the target device according to the first rotation parameter, the control instruction carrying the first rotation parameter and being used to instruct the target device to capture 3D video according to the first rotation parameter.
  3. The implementation method according to claim 1 or 2, characterized in that the method further comprises:
    when the 3D video sent by the target device contains an object to be captured, displaying, on the augmented reality interface, a crosshair for marking the object to be captured.
  4. The implementation method according to claim 2, characterized in that the method further comprises:
    the gyroscope acquiring a second rotation parameter of the user's head; and
    moving the crosshair according to the second rotation parameter.
  5. The implementation method according to claim 4, characterized in that the target device comprises a drive device and a transceiver device mounted on the drive device, and the method further comprises:
    sending the second rotation parameter to the target device in real time, so that the target device controls the drive device to rotate according to the second rotation parameter, thereby rotating the transceiver device.
  6. The implementation method according to claim 5, characterized in that the method further comprises:
    when a pressing operation by the user on a capture button is detected, sending a capture instruction to the target device, the capture instruction being used to cause the transceiver device to emit a capture signal to capture the object to be captured.
  7. The implementation method according to claim 6, characterized in that the method further comprises:
    receiving a capture-success instruction sent by the target device, the capture-success instruction being triggered when the transceiver device receives the capture signal reflected back from the object to be captured;
    generating a picture indicating that the object to be captured has been captured; and
    displaying the picture on the augmented reality interface.
  8. The implementation method according to claim 7, characterized in that the method further comprises:
    when it is determined that the object to be captured has been captured, acquiring audio indicating that the object has been captured; and
    playing the audio while displaying the picture.
  9. The implementation method according to claim 2 or 4, characterized in that the rotation parameters comprise at least one of a rotation angle, a rotation direction, and an angular rotation rate.
  10. A head-mounted display device, characterized by comprising:
    a communication module configured to establish a communication connection with a target device;
    a receiving module configured to receive 3D video sent by the target device;
    a parsing module configured to parse the 3D video to obtain the environment in which the target device is currently located;
    an interface generation module configured to generate, according to the environment in which the target device is located, an augmented reality interface associated with that environment; and
    a display control module configured to play the 3D video through the augmented reality interface.
  11. The head-mounted display device according to claim 10, characterized in that the head-mounted display device comprises a gyroscope configured to acquire rotation parameters when the user's head rotates, the gyroscope acquiring a first rotation parameter of the user's head; and the head-mounted display device further comprises:
    a sending module configured to send a control instruction to the target device according to the first rotation parameter, the control instruction carrying the first rotation parameter and being used to instruct the target device to capture 3D video according to the first rotation parameter.
  12. The head-mounted display device according to claim 10 or 11, characterized in that
    the display control module is further configured to display, on the augmented reality interface, a crosshair for marking the object to be captured when the 3D video sent by the target device contains such an object.
  13. The head-mounted display device according to claim 11, characterized in that the gyroscope acquires a second rotation parameter of the user's head, and the display control module is further configured to move the crosshair according to the second rotation parameter.
  14. The head-mounted display device according to claim 13, wherein the target device comprises a drive device and a transceiver device mounted on the drive device, characterized in that
    the sending module is further configured to send the second rotation parameter to the target device in real time, so that the target device controls the drive device to rotate according to the second rotation parameter, thereby rotating the transceiver device.
  15. The head-mounted display device according to claim 14, characterized in that
    the sending module is further configured to send a capture instruction to the target device when a pressing operation by the user on a capture button is detected, the capture instruction being used to cause the transceiver device to emit a capture signal to capture the object to be captured.
  16. The head-mounted display device according to claim 15, characterized in that
    the receiving module is further configured to receive a capture-success instruction sent by the target device, the capture-success instruction being triggered when the transceiver device receives the capture signal reflected back from the object to be captured;
    the head-mounted display device further comprises:
    a picture generation module configured to generate a picture indicating that the object to be captured has been captured; and
    the display control module is further configured to display the picture on the augmented reality interface.
  17. The head-mounted display device according to claim 16, characterized in that the head-mounted display device further comprises:
    an audio acquisition module configured to acquire, when it is determined that the object to be captured has been captured, audio indicating that the object has been captured; and
    an audio playback module configured to play the audio while displaying the picture.
  18. The head-mounted display device according to claim 11 or 13, characterized in that the rotation parameters comprise at least one of a rotation angle, a rotation direction, and an angular rotation rate.
  19. A head-mounted display device, comprising a processor and a memory, characterized in that the processor is configured to perform the method according to any one of claims 1 to 9.
PCT/CN2016/113662 2016-12-30 2016-12-30 Implementation method of augmented reality interface and head-mounted display device WO2018120090A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2016/113662 WO2018120090A1 (zh) 2016-12-30 2016-12-30 Implementation method of augmented reality interface and head-mounted display device
CN201680036487.6A CN107820706A (zh) 2016-12-30 2016-12-30 Implementation method of augmented reality interface and head-mounted display device


Publications (1)

Publication Number Publication Date
WO2018120090A1 true WO2018120090A1 (zh) 2018-07-05



Also Published As

Publication number Publication date
CN107820706A (zh) 2018-03-20


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16925620; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/10/2019))
122 Ep: pct application non-entry in european phase (Ref document number: 16925620; Country of ref document: EP; Kind code of ref document: A1)