US11455035B2 - Inputs to virtual reality devices from touch surface devices - Google Patents

Inputs to virtual reality devices from touch surface devices

Info

Publication number
US11455035B2
US11455035B2 (application US 16/970,513)
Authority
US
United States
Prior art keywords
touch screen
screen device
processor
hmd
instructions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/970,513
Other versions
US20220011861A1 (en)
Inventor
Hiroshi Horii
Mithra Vankipuram
Ian N. Robinson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. reassignment HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HORII, HIROSHI, ROBINSON, IAN N., VANKIPURAM, Mithra
Publication of US20220011861A1
Application granted
Publication of US11455035B2
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886 Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/038 Indexing scheme relating to G06F3/038
    • G06F 2203/0384 Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In example implementations, a virtual reality (VR) system is provided. The VR system includes a head mounted display (HMD), a hand wearable interface, a wireless communication interface, and a processor. The HMD is to display a computer generated artificial world. The wireless communication interface is to establish a wireless communication path. The processor is communicatively coupled to the HMD, the hand wearable interface, and the wireless communication interface. The processor is to receive an indication that a touch screen device is located from a locator device in the HMD or the hand wearable interface, to establish a wireless connection to the touch screen device via the wireless communication interface, and to receive an input via the touch screen device.

Description

BACKGROUND
Virtual reality (VR) systems are wearable interactive systems that allow a user to experience an artificial world. The user may visually see a computer generated world through a display of the VR system. The VR system may provide entertainment, simulations, and the like. For example, the artificial world may be part of a video game for entertainment. In another example, the VR world may be a simulation to train an employee for a procedure or process in a corporate setting.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of an example of a virtual reality system with a touch surface device of the present disclosure;
FIG. 2 illustrates a block diagram of the virtual reality system of the present disclosure;
FIG. 3 illustrates an example display of the touch surface device in a virtual reality display of the virtual reality system of the present disclosure;
FIG. 4 is a flow chart of an example method for connecting a virtual reality device to a touch surface device to receive an input from the touch surface device; and
FIG. 5 is a block diagram of an example non-transitory computer readable storage medium storing instructions executed by a processor of the present disclosure.
DETAILED DESCRIPTION
Examples described herein provide a virtual reality (VR) system that can connect to a touch surface device and receive inputs from the touch surface device. As discussed above, VR systems are wearable interactive systems that allow a user to experience an artificial world. The user may visually see a computer generated world through a display of the VR system.
In some instances, it may be more efficient to provide inputs using a touch screen device than using hand gestures in the artificial world of the VR system. However, when the user is wearing the head mounted display (HMD) of the VR system, the user may not be able to see the real world surroundings (which may include a touch screen device).
Examples herein provide a VR system that can be used to automatically locate a nearby touch surface device while the user is engaged in an artificial world of the VR system. Thus, the user may locate a touch screen device, automatically connect to the touch screen device, and use the touch screen device to provide inputs while in the artificial world of the VR system. In other words, the user does not need to remove the HMD of the VR system to find, and connect to, the touch screen device.
FIG. 1 illustrates a block diagram of a VR system 100 of the present disclosure. In one example, the VR system 100 may include a head mounted display (HMD) 102, a hand wearable interface 104, and a touch screen device 106. The VR system 100 may include a processor 108 that may be communicatively coupled to the hand wearable interface 104 via a wired or wireless connection. In one example, the processor 108 may be located external to the HMD 102. In one example, as shown in FIG. 1, the processor 108 may be integrated as part of the HMD 102. The processor 108 may execute various instructions stored in memory and/or functions, as described below.
As described above, sometimes using a touch screen device may be a more accurate way of providing inputs to the HMD 102 than using hand gestures via the hand wearable interface 104. However, when the HMD 102 is active and displaying a computer generated artificial world on a display 112 in a virtual reality (VR) mode, the user may not be able to see where the touch screen device 106 is located.
In one example, the touch screen device 106 may have a marker 120 that can be detected by the HMD 102 and/or the hand wearable interface 104. No other objects in the room may appear in the display 112 when the VR mode is active except the marker 120.
In one example, the marker 120 may be a pre-defined dynamic marker that can be detected by a camera 110 (e.g., a red, green, blue (RGB) video camera, an infrared camera, and the like) coupled to the HMD 102. For example, a user may look around a room with the HMD 102 to see if the pre-defined dynamic marker appears in the display 112. The pre-defined dynamic marker may be a constantly changing mark or code, as opposed to a static or fixed code. The shape of the pre-defined dynamic marker may change periodically, the colors of the pre-defined dynamic marker may change periodically, the codes within the pre-defined dynamic marker may change periodically, and the like.
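As one way to picture the pre-defined dynamic marker, it can be generated as a rotating code derived from a shared secret and the current time window, so that the code the HMD camera decodes is only valid briefly. The following Python sketch illustrates that idea only; the HMAC construction, the 10-second window, and the 8-character code length are assumptions, not details from the disclosure.

    import hashlib
    import hmac
    import time

    def current_marker_code(shared_secret: bytes, interval_s: int = 10) -> str:
        """Derive the marker code for the current time window.

        The code changes every interval_s seconds, standing in for the
        pre-defined dynamic marker whose shape, colors, or embedded code
        changes periodically.
        """
        window = int(time.time()) // interval_s
        digest = hmac.new(shared_secret, str(window).encode(), hashlib.sha256)
        return digest.hexdigest()[:8]  # short code rendered as the on-screen marker

    def marker_matches(shared_secret: bytes, observed_code: str, interval_s: int = 10) -> bool:
        """Check a code decoded from an HMD camera frame against the current
        and previous windows (tolerates a frame captured just before rotation)."""
        now = int(time.time()) // interval_s
        for window in (now, now - 1):
            digest = hmac.new(shared_secret, str(window).encode(), hashlib.sha256)
            if hmac.compare_digest(digest.hexdigest()[:8], observed_code):
                return True
        return False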
In one example, the marker 120 may be an emission of a particular wavelength of light. For example, the camera 110 may be an infrared camera and may detect an infrared wavelength of light emitted by the touch screen device 106 that is invisible to the human eye so as not to distract non-users of the VR system 100. The infrared wavelength of light may allow a non-user of the VR system 100 to use the touch screen device 106 as a traditional device, while allowing the camera 110 to locate the touch screen device 106. The infrared wavelength of light may be shown on the display 112 such that the user may move towards the touch screen device 106.
In one example, the touch screen device 106 may be registered with the processor 108 of the VR system 100. Thus, the VR system 100 may know the dimensions of the display of the touch screen device 106. The HMD 102 or the hand wearable interface 104 may emit a signal to wake the touch screen device 106. The marker 120 may be a glow of the display of the touch screen device 106. The camera 110 may then identify the touch screen device 106 by searching for a light emitted from a screen that has the same dimensions as the touch screen device 106 that was registered with the VR system 100.
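A hedged sketch of how the registered dimensions could be used follows: compare a glowing rectangular region found by the camera against the registered screen size, in either orientation. The camera pipeline that produces candidate regions (already converted to physical units) is assumed to exist and is not shown, and the tolerance value is illustrative.

    from dataclasses import dataclass

    @dataclass
    class RegisteredScreen:
        width_mm: float   # registered physical width of the touch screen display
        height_mm: float  # registered physical height of the touch screen display

    @dataclass
    class BrightRegion:
        width_mm: float   # apparent width of a glowing region, in physical units
        height_mm: float  # apparent height of a glowing region, in physical units

    def matches_registered_screen(region: BrightRegion,
                                  screen: RegisteredScreen,
                                  tolerance: float = 0.15) -> bool:
        """Return True if a glowing region has roughly the same dimensions
        (in either orientation) as the registered touch screen."""
        def close(a: float, b: float) -> bool:
            return abs(a - b) <= tolerance * b

        upright = close(region.width_mm, screen.width_mm) and close(region.height_mm, screen.height_mm)
        rotated = close(region.width_mm, screen.height_mm) and close(region.height_mm, screen.width_mm)
        return upright or rotated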
In one example, the hand wearable interface 104 may be used to locate the touch screen device 106 without the marker 120. In one example, the hand wearable interface 104 may include an emitter 114. The emitter 114 may be a wireless emitter that broadcasts a wireless signal.
When the touch screen device 106 receives the wireless signal, the touch screen device 106 may transmit a response signal that is received by a receiver 116. The receiver 116 may be a wireless receiver that receives the response signal from the touch screen device 106. Examples of such wireless links may include Bluetooth, radio frequency identification (RFID), near field communications (NFC), and the like. In one example, the distance to the touch screen device 106 may be calculated based on the time to receive the response signal from the touch screen device 106. When the response signal is received, an avatar of the touch screen device 106 may be shown in the display 112. The avatar may grow smaller or larger as the user moves further away from or closer to the touch screen device 106.
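The distance estimate follows the usual round-trip-time relation: distance is roughly (round-trip time minus the device's turnaround delay) times the propagation speed, divided by two. A minimal sketch, where the turnaround delay is an assumed calibration constant:

    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def distance_from_round_trip(t_sent_s: float,
                                 t_received_s: float,
                                 device_delay_s: float = 0.0) -> float:
        """Estimate the distance in meters to the touch screen device from the
        round-trip time of the locator signal.

        device_delay_s is an assumed, pre-calibrated turnaround delay inside the
        touch screen device; subtracting it leaves only the time of flight.
        """
        time_of_flight = max(0.0, (t_received_s - t_sent_s) - device_delay_s)
        return time_of_flight * SPEED_OF_LIGHT_M_PER_S / 2.0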
In one example, the hand wearable interface 104 may include a haptic feedback 118. The haptic feedback 118 may be a component that provides feedback (e.g., a vibration or a buzz) to the user when the hand wearable interface 104 is located over or within a pre-defined distance (e.g., a few inches) from the touch screen device 106. In one example, the strength of the haptic feedback may vary and be proportional to the detected proximity of the device. Thus, the user can locate and grab the touch screen device 106 when the haptic feedback 118 provides constant feedback.
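One plausible mapping from detected proximity to haptic strength is a linear ramp that saturates at full strength within a grab distance, which matches the constant feedback described above when the device is close enough to grab. The range values in this sketch are assumptions:

    def haptic_intensity(distance_m: float,
                         max_range_m: float = 1.0,
                         grab_range_m: float = 0.05) -> float:
        """Map the distance to the touch screen device to a vibration intensity in [0, 1].

        Beyond max_range_m there is no feedback; inside grab_range_m the feedback
        is constant full strength, which is the cue that the device can be grabbed.
        """
        if distance_m >= max_range_m:
            return 0.0
        if distance_m <= grab_range_m:
            return 1.0
        # Linear ramp between the grab range and the maximum sensing range.
        return (max_range_m - distance_m) / (max_range_m - grab_range_m)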
In one example, after the user is done with the touch screen device 106, the user may place the touch screen device 106 at a location. The VR system 100 may remember the location where the touch screen device 106 was placed such that the touch screen device 106 may be located more quickly the next time the user wants to locate and use the touch screen device 106.
Once the touch screen device 106 is located, the touch screen device 106 may be held. The camera 110 may be used to detect an orientation of the touch screen device 106. In one example, the touch screen device 106 may have sensors that can detect the orientation of the touch screen device 106. The orientation of the touch screen device 106 can be transmitted to the processor 108 for display. After the touch screen device 106 is located, held by the user, and the orientation is detected, the touch screen device 106 may be communicatively coupled to the HMD 102. The touch screen device 106 may also be shown in the computer generated artificial world in the orientation that is detected, as discussed in further detail below. The touch screen device 106 can then be used to provide inputs to the computer generated artificial world displayed on the display 112.
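If the touch screen device reports raw accelerometer samples, its static pitch and roll can be recovered from the gravity vector and transmitted to the processor 108; yaw is not observable from gravity alone. A small, hedged sketch of that computation (the sensor API that supplies the samples is assumed):

    import math

    def orientation_from_accelerometer(ax: float, ay: float, az: float) -> tuple:
        """Estimate (pitch_deg, roll_deg) from a single accelerometer sample.

        Assumes the device is roughly static, so the measured acceleration is
        dominated by gravity; yaw cannot be recovered from gravity alone.
        """
        pitch_deg = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        roll_deg = math.degrees(math.atan2(ay, az))
        return pitch_deg, roll_deg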
FIG. 2 illustrates a block diagram of a VR system 200. The VR system 200 may include similar components as the VR system 100 illustrated in FIG. 1. In one example, the HMD 102 may be communicatively coupled to the processor 108. The HMD 102 may display the computer generated artificial world when the HMD 102 is operating in a VR mode. In other words, a user may not be able to see his or her surroundings when wearing the HMD 102 and interacting with the computer generated artificial world in the VR mode.
The processor 108 may also be communicatively coupled to the hand wearable interface 104. As discussed above, the processor 108 may communicate with the hand wearable interface 104 via a wired or wireless connection. The hand wearable interface 104 may be worn around the back of a user's hand to provide motion detection, gesture detection, and the like to the processor 108.
The processor 108 may also be communicatively coupled to a wireless communication interface 202. The wireless communication interface 202 may establish a wireless connection 204 to the touch screen device 106. The processor 108 may then receive inputs from the touch screen device 106 over the wireless connection 204.
In one example, the touch screen device 106 may be registered with a VR application executed by the processor 108, as noted above. In one example, a plurality of different touch screen devices 106 that are associated with, or owned by, a user may be registered with the VR system 100. For example, the user may have a touch screen phone and a touch screen tablet device to use in the computer generated artificial world.
In addition to learning the dimensions of the touch screen device 106, the registration process may be used to download an application on the touch screen device 106 that works with the VR mode of the HMD 102. For example, the application may allow the touch screen device 106 to automatically establish the wireless connection 204 with the processor 108 when the touch screen device 106 is located.
The application may also allow the touch screen device 106 to track which locations (e.g., x-y coordinates) of the display are touched and transmit the location information to the processor 108. The processor 108 may then identify a touch input displayed on an avatar of the touch screen device 106 that is displayed in the computer generated artificial world. The touch input may be associated with a function and the function may be executed in the computer generated artificial world during the VR mode of the HMD 102.
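On the device side, the companion application only has to report where the screen was touched; it does not need to know what the avatar shows. A minimal sketch of that reporting path, with the transport left abstract; the send callable and the newline-delimited JSON message shape are assumptions, not the disclosure's protocol:

    import json
    from typing import Callable

    def report_touch(send: Callable[[bytes], None],
                     x_px: int, y_px: int,
                     screen_w_px: int, screen_h_px: int) -> None:
        """Forward one touch event from the companion application to the VR processor.

        Coordinates are normalized to [0, 1] so the processor can map them onto the
        avatar regardless of how large the avatar is rendered. One JSON object is
        sent per line (newline-delimited framing).
        """
        message = {"type": "touch", "x": x_px / screen_w_px, "y": y_px / screen_h_px}
        send((json.dumps(message) + "\n").encode("utf-8"))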
FIG. 3 illustrates an example display 112 of an avatar 306 of the touch screen device 106 in the HMD 102 of the VR system 100. For example, after the touch screen device 106 is located and an indication that the user is holding the touch screen device 106 is received, the display 112 may show the avatar 306 of the touch screen device 106. The avatar 306 may show the touch screen device 106 in an orientation in which the user is holding the touch screen device 106. The avatar 306 may also show touch inputs 310, 312, and 314 of a graphical user interface (GUI) 308.
The dimensions of the avatar 306 may be similar to the dimensions of the touch screen device 106. The size of the GUI 308 may be similar to the size of a GUI that would be displayed on the touch screen device 106. The number and size of the touch inputs 310, 312, and 314 may be a function of the size, or dimensions, of the touch screen device 106.
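One simple rule that makes the number and size of touch inputs a function of the registered screen dimensions is to fit as many fixed-minimum-width regions as the screen allows, up to a cap. The minimum width and the cap below are illustrative assumptions:

    from dataclasses import dataclass

    @dataclass
    class TouchInputRegion:
        x0: float  # normalized left edge on the avatar GUI
        x1: float  # normalized right edge on the avatar GUI

    def layout_touch_inputs(screen_width_mm: float,
                            min_button_mm: float = 25.0,
                            max_buttons: int = 6) -> list:
        """Divide the avatar GUI width into equal touch input regions.

        Wider registered screens get more (and relatively narrower) regions, capped
        at max_buttons; very small screens get a single full-width input.
        """
        count = max(1, min(max_buttons, int(screen_width_mm // min_button_mm)))
        width = 1.0 / count
        return [TouchInputRegion(i * width, (i + 1) * width) for i in range(count)]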
As shown in FIG. 3, the touch screen device 106 may be powered on and active, but display a blank screen. In other words, the touch screen device 106 may not show any information or GUI despite being activated and connected to the HMD 102. In some examples, the touch screen device 106 may be a touch screen device without a display. Since the touch screen device 106 does not show any information, a touch screen device without a display may be used with the VR system 100.
However, in the display 112 that shows the computer generated artificial world, the avatar 306 may display the GUI 308 with touch inputs 310, 312, and 314. It should be noted that although three touch inputs are illustrated in FIG. 3, any number of touch inputs may be displayed in the avatar 306. In the computer generated artificial world, the user may want to select the touch input 310. The user may touch a location on the touch screen device 106 that is associated with the location of the touch input 310 in the avatar 306. The touch screen device 106 may detect the touch and record a location that is touched. The location may be transmitted to the processor 108 of the HMD 102 via the wireless connection 204.
In one example, the touch screen device 106 may display images associated with an application or operating system executed by the touch screen device 106. The images on the touch screen device 106 may be transmitted to the processor 108 to be displayed on the display 112. Thus, the GUI 308 with the touch inputs 310, 312, and 314 may be what is shown on the touch screen device 106. In other words, the display 112 may show what is actually displayed by the touch screen device 106. This may allow the user to check messages, notifications, and the like, on the touch screen device 106 while using the touch screen device 106 in the computer generated artificial world.
The processor 108 may determine which touch input was selected based on the location information that is received from the touch screen device 106. For example, the processor 108 may determine that the touch input 310 was selected based on the location information from the touch screen device 106.
The processor 108 may then determine a function that is associated with the touch input 310. For example, the touch inputs 310, 312, and 314 may be associated with different functions that can be executed in the computer generated world. For example, the computer generated world may be a construction simulation. Each touch input 310, 312, and 314 may be a touch input to use a different tool, use a different vehicle, build a different structure, and the like. The touch input 310 may be associated with a function to demolish a structure. The processor 108 may determine that the touch input 310 has been selected to demolish a structure. The processor 108 may then prompt a user to select a structure in the computer generated artificial world and demolish the structure that is selected based on the selection of the touch input 310.
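Putting the selection and function steps together, the processor can hit-test the reported, normalized touch location against the avatar's touch input regions and run the function registered for whichever region contains it. The regions and the demolish/build handlers below are illustrative placeholders for whatever the VR application defines:

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class TouchInput:
        name: str
        x0: float  # normalized top-left corner on the avatar GUI
        y0: float
        x1: float  # normalized bottom-right corner on the avatar GUI
        y1: float
        action: Callable[[], None]

    def demolish_structure() -> None:
        print("prompting user to select a structure to demolish")

    def build_structure() -> None:
        print("prompting user to place a new structure")

    TOUCH_INPUTS = [
        TouchInput("demolish", 0.05, 0.70, 0.30, 0.95, demolish_structure),
        TouchInput("build", 0.35, 0.70, 0.60, 0.95, build_structure),
    ]

    def handle_touch(x: float, y: float) -> Optional[str]:
        """Find the touch input region containing the normalized location and
        execute the function associated with it in the artificial world."""
        for touch_input in TOUCH_INPUTS:
            if touch_input.x0 <= x <= touch_input.x1 and touch_input.y0 <= y <= touch_input.y1:
                touch_input.action()
                return touch_input.name
        return None  # the touch landed outside every input region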
It should be noted that the inputs 310, 312, and 314 may be associated with other functions in different applications. For example, the computer generated artificial world may be a video game, a travel simulator, and the like, and the inputs 310, 312, and 314 may execute different functions in the different computer generated artificial worlds.
As a result, the VR system 100 may allow a user to locate a touch screen device 106 while using the HMD 102. Thus, the user does not need to deactivate a VR mode and remove the HMD 102 to locate the touch screen device 106. In addition, the touch screen device 106 may be connected to the HMD 102 and used to provide touch inputs in the computer generated artificial world. For example, an avatar 306 of the touch screen device 106 may be shown in the computer generated artificial world and used to interact, or execute different functions, in the computer generated artificial world.
FIG. 4 illustrates a flow diagram of an example method 400 for connecting a virtual reality device to a touch surface device to receive an input from the touch surface device. In one example, the method 400 may be performed by the virtual reality system 100, or the apparatus 500 illustrated in FIG. 5 and described below.
At block 402, the method 400 begins. At block 404, the method 400 receives an indication that a touch screen device that is located is being held. For example, the user may want to use the touch screen device to provide inputs to a virtual reality world in a virtual reality system. The user may not be able to see his or her surroundings in the real world while in the virtual reality world.
As a result, one of the methods described above can be used to locate the touch screen device. In one example, the indication may be a haptic feedback that is triggered when the touch screen device is located and/or held. In one example, the indication may be an audible indication when the touch screen device is located and/or held. For example, a beep or tone may be played in the audio of the virtual reality system. In one example, the indication may be a visual indication when the touch screen device is located and/or held. For example, an outline of the touch screen device may flash or the touch screen device may be displayed in the virtual reality system.
At block 406, the method 400 generates an avatar of the touch screen device in a computer generated artificial world and causes the avatar to be displayed in a head mounted display (HMD) of a virtual reality (VR) system that displays the computer generated artificial world. For example, a graphical representation of the touch screen device may be generated and displayed in the HMD of the VR system.
In one example, the avatar of the touch screen device may display an image or interface that is not shown on the real touch screen device. For example, the touch screen device in reality may be powered on, but display a blank screen. However, the avatar of the touch screen device may display a menu with different buttons that can be selected. The VR system may know the dimensions of the touch screen device via a registration process. The registration process may also include downloading an application on the touch screen device that works with the VR system such that the touch screen device can identify and communicate selections of certain areas of the touch screen device to the VR system.
Thus, the VR system may know how to size and locate the menu and buttons in the avatar of the touch screen device. The location of a button in the avatar of the touch screen device may correspond to a same location on the real touch screen device.
At block 408, the method 400 establishes a wireless connection to the touch screen device. The touch screen device may communicate wirelessly with the virtual reality system to exchange data, inputs, and outputs. In one example, the wireless connection may be a Bluetooth low energy (BLE) connection, a Wi-Fi connection, a local area network (LAN) connection, and the like. The VR system may automatically initiate a pairing process, or another process to establish the wireless connection, when the touch screen device is located and confirmed to be held.
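The disclosure leaves the transport open (BLE, Wi-Fi, LAN, and the like). As one hedged illustration, a plain TCP connection over the local network is enough to carry the small touch messages sketched earlier; the port number and the newline-delimited JSON framing are assumptions:

    import json
    import socket

    TOUCH_PORT = 47810  # assumed port the companion application listens on

    def connect_to_touch_screen(host: str, timeout_s: float = 5.0) -> socket.socket:
        """Open the connection used to receive touch events once the device is
        located and confirmed to be held."""
        sock = socket.create_connection((host, TOUCH_PORT), timeout=timeout_s)
        sock.settimeout(None)  # block indefinitely while waiting for touch events
        return sock

    def receive_touch_events(sock: socket.socket):
        """Yield normalized (x, y) touch locations as they arrive, one JSON
        object per line."""
        with sock, sock.makefile("r", encoding="utf-8") as stream:
            for line in stream:
                message = json.loads(line)
                if message.get("type") == "touch":
                    yield message["x"], message["y"]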
At block 410, the method 400 receives a selection of a touch input on the avatar of the touch screen device in the computer generated artificial world. For example, a user may be playing a game in the computer generated artificial world. The touch screen device may be used to make a selection in the game. A user may touch an area of the screen of the real touch screen device via interaction with the avatar of the touch screen device in the computer generated artificial world.
The real touch screen device may detect a location of where the user touched the display of the real touch screen device. The location may be transmitted to the VR system. The VR system may associate a touch input on the avatar of the touch screen device with the location of the real touch screen device that was touched.
At block 412, the method 400 executes a function associated with the touch input in the computer generated artificial world. In one example, each button in the avatar of the touch screen device may be associated with a function within the computer generated artificial world. When the selection of the touch input is received in the block 410, the VR system may identify the function associated with the touch input that is selected. Using the game example above, selecting a touch input in the computer generated artificial world may equip a particular armament. Thus, when the armament touch input is selected in the computer generated artificial world, the particular armament may be equipped in the computer generated artificial world. At block 414, the method 400 ends.
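As a compact view of blocks 404 through 414, the method can be read as a short driver loop over the pieces sketched above; the helper names are illustrative, not an API defined by the disclosure:

    def run_method_400(locate, confirm_held, show_avatar, connect, touch_events, handle_touch):
        """Sketch of blocks 404-414: locate the device and confirm it is held,
        display its avatar, connect, then map every touch to a function."""
        device = locate()                 # supports block 404 (located indication)
        confirm_held(device)              # block 404: device is being held
        show_avatar(device)               # block 406: avatar in the artificial world
        connection = connect(device)      # block 408: wireless connection
        for x, y in touch_events(connection):   # block 410: selections on the avatar
            handle_touch(x, y)            # block 412: execute the associated function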
FIG. 5 illustrates an example of an apparatus 500. In one example, the apparatus 500 may be the virtual reality system 100. In one example, the apparatus 500 may include a processor 502 and a non-transitory computer readable storage medium 504. The non-transitory computer readable storage medium 504 may include instructions 506, 508, 510, 512, and 514 that, when executed by the processor 502, cause the processor 502 to perform various functions.
In one example, the instructions 506 may include instructions to locate a touch screen device while a head mounted display (HMD) of a virtual reality (VR) system is displaying a computer generated artificial world. The instructions 508 may include instructions to establish a wireless connection to the touch screen device. The instructions 510 may include instructions to display an avatar of the touch screen device in the computer generated artificial world, wherein the avatar displays a touch input that is different than what is displayed on the touch screen device. The instructions 512 may include instructions to receive a selection of the touch input displayed on the avatar. The instructions 514 may include instructions to execute a function associated with the touch input in the computer generated artificial world.
It will be appreciated that variants of the above-disclosed and other features and functions, or alternatives thereof, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.

Claims (16)

The invention claimed is:
1. A virtual reality (VR) system, comprising:
a head mounted display (HMD) to display a computer generated artificial world;
a hand wearable interface;
a wireless communication interface to establish a wireless communication path; and
a processor communicatively coupled to the HMD, the hand wearable interface, and the wireless communication interface, the processor to receive an indication that a touch screen device is located from a locator device in the HMD or the hand wearable interface, to establish a wireless connection to the touch screen device via the wireless communication interface, and to receive an input via the touch screen device, wherein the touch screen device is registered with the HMD such that the HMD can send a signal to the touch screen device to wake the touch screen device to emit an indicator that can be detected by the HMD.
2. The VR system of claim 1, wherein the indicator comprises a pre-defined dynamic marker and wherein the locator device comprises:
a camera coupled to the HMD and communicatively coupled to the processor to detect the pre-defined dynamic marker on the touch screen device and transmit the pre-defined dynamic marker to the processor as the indication.
3. The VR system of claim 1, wherein the indicator comprises a wavelength of light emitted by the touch screen device and wherein the locator device comprises:
a camera coupled to the HMD and communicatively coupled to the processor to detect the wavelength of light emitted by the touch screen device that is invisible to a human eye and to transmit the wavelength of the light that is detected to the processor as the indication.
4. The VR system of claim 1, wherein the locator device comprises:
a wireless emitter located in the hand wearable interface to emit a wireless signal; and
a wireless receiver located in the hand wearable interface to receive a response signal from the touch screen device in response to the touch screen device receiving the wireless signal.
5. The VR system of claim 4, wherein the hand wearable interface comprises a haptic feedback device to provide haptic feedback when the hand wearable interface is located over the touch screen device.
6. The VR system of claim 1, wherein the processor is to generate an avatar of the touch screen device when the touch screen device is located and the HMD is to display the avatar in the computer generated artificial world.
7. The VR system of claim 6, wherein the processor is to generate touch screen input buttons for the avatar of the touch screen device and the HMD is to display the touch screen input buttons on the avatar of the touch screen device in the computer generated artificial world.
8. A non-transitory computer readable storage medium encoded with instructions executable by a processor, the non-transitory computer-readable storage medium comprising:
instructions to locate a touch screen device while a head mounted display (HMD) of a virtual reality (VR) system is displaying a computer generated artificial world, wherein the touch screen device is registered with the VR system such that the VR system can send a signal to the touch screen device to wake the touch screen device to emit an indicator that can be detected by the HMD;
instructions to establish a wireless connection to the touch screen device;
instructions to display an avatar of the touch screen device in the computer generated artificial world, wherein the avatar displays a touch input that is different than what is displayed on the touch screen device;
instructions to receive a selection of the touch input displayed on the avatar; and
instructions to execute a function associated with the touch input in the computer generated artificial world.
9. The non-transitory computer readable storage medium of claim 8, wherein the indicator comprises a pre-defined dynamic marker and wherein the instructions to locate the touch screen device comprise:
instructions to detect the pre-defined dynamic marker on the touch screen device.
10. The non-transitory computer readable storage medium of claim 8, wherein the indicator comprises a wavelength of light emitted by the touch screen device and wherein the instructions to locate the touch screen device comprise:
instructions to detect the wavelength of light emitted by the touch screen device that is invisible to a human eye.
11. The non-transitory computer readable storage medium of claim 8, wherein the instructions to locate the touch screen device comprise:
instructions to emit a wireless signal that is received by the touch screen device;
instructions to receive a response signal from the touch screen device in response to the touch screen device receiving the wireless signal; and
instructions to calculate a distance to the touch screen device based on the response signal.
12. The non-transitory computer readable storage medium of claim 11, further comprising:
instructions to change a size of the avatar of the touch screen device based on the distance as the distance changes.
13. A method, comprising:
receiving, by a processor, an indication that a touch screen device that has been located is being held, wherein the touch screen device is registered with a virtual reality (VR) system such that the VR system can send a signal to the touch screen device to wake the touch screen device to emit an indicator that can be detected by a head mounted display (HMD) of the VR system;
generating, by the processor, an avatar of the touch screen device in a computer generated artificial world and causing the avatar to be displayed in the HMD that displays the computer generated artificial world;
establishing, by the processor, a wireless connection to the touch screen device;
receiving, by the processor, a selection of a touch input on the avatar of the touch screen device in the computer generated artificial world; and
executing, by the processor, a function associated with the touch input in the computer generated artificial world.
14. The method of claim 13, further comprising:
registering, by the processor, a plurality of touch screen devices associated with a user by storing in memory of the VR system an identification and a screen size of each one of the plurality of touch screen devices.
15. The method of claim 14, wherein the indicator comprises an area of light, the method further comprising:
detecting, by the processor, the area of light that is approximately equal to the screen size of the touch screen device to locate the touch screen device.
16. The VR system of claim 1, wherein the processor is further to:
record a location of the touch screen device where the touch screen device is set down.
US16/970,513 2018-04-19 2018-04-19 Inputs to virtual reality devices from touch surface devices Active US11455035B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2018/028377 WO2019203837A1 (en) 2018-04-19 2018-04-19 Inputs to virtual reality devices from touch surface devices

Publications (2)

Publication Number Publication Date
US20220011861A1 (en) 2022-01-13
US11455035B2 (en) 2022-09-27

Family

ID=68240283

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/970,513 Active US11455035B2 (en) 2018-04-19 2018-04-19 Inputs to virtual reality devices from touch surface devices

Country Status (4)

Country Link
US (1) US11455035B2 (en)
EP (1) EP3756074A4 (en)
CN (1) CN111989643A (en)
WO (1) WO2019203837A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112363616A (en) * 2020-10-27 2021-02-12 上海影创信息科技有限公司 Split VR/AR device
US11927752B2 (en) * 2022-02-16 2024-03-12 Htc Corporation Control method and mobile device in immersive system

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120218188A1 (en) 2011-02-24 2012-08-30 Tatsuki Kashitani Information processing apparatus, information processing method, and terminal apparatus
US20130335573A1 (en) 2012-06-15 2013-12-19 Qualcomm Incorporated Input method designed for augmented reality goggles
US20160140332A1 (en) * 2014-11-13 2016-05-19 Intel Corporation System and method for feature-based authentication
WO2016130895A1 (en) 2015-02-13 2016-08-18 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
US20160267712A1 (en) 2015-03-09 2016-09-15 Google Inc. Virtual reality headset connected to a mobile computing device
US20160274662A1 (en) * 2015-03-20 2016-09-22 Sony Computer Entertainment Inc. Dynamic gloves to convey sense of touch and movement for virtual objects in hmd rendered environments
US20170012972A1 (en) 2014-02-24 2017-01-12 Sony Corporation Proximity based and data exchange and user authentication between smart wearable devices
US20170011553A1 (en) 2015-07-07 2017-01-12 Google Inc. System for tracking a handheld device in virtual reality
US20170090744A1 (en) 2015-09-28 2017-03-30 Adobe Systems Incorporated Virtual reality headset device with front touch screen
EP3223061A1 (en) 2012-11-20 2017-09-27 Microsoft Technology Licensing, LLC Head mount display and method for controlling the same
WO2017171418A1 (en) 2016-03-31 2017-10-05 Samsung Electronics Co., Ltd. Method for composing image and electronic device thereof
US20170293351A1 (en) 2016-04-07 2017-10-12 Ariadne's Thread (Usa), Inc. (Dba Immerex) Head mounted display linked to a touch sensitive input device
US20170300116A1 (en) 2016-04-15 2017-10-19 Bally Gaming, Inc. System and method for providing tactile feedback for users of virtual reality content viewers
US9811184B2 (en) 2014-07-16 2017-11-07 DODOcase, Inc. Virtual reality viewer and input mechanism
US20170329419A1 (en) 2016-05-11 2017-11-16 Google Inc. Combining gyromouse input and touch input for navigation in an augmented and/or virtual reality environment
US9886086B2 (en) 2015-08-21 2018-02-06 Verizon Patent And Licensing Inc. Gesture-based reorientation and navigation of a virtual reality (VR) interface
US20180095542A1 (en) 2016-09-30 2018-04-05 Sony Interactive Entertainment Inc. Object Holder for Virtual Reality Interaction
US20190174088A1 (en) * 2016-08-05 2019-06-06 Apple Inc. Display System
US20190369726A1 (en) * 2016-11-16 2019-12-05 Samsung Electronics Co., Ltd. Electronic device and control method thereof
US20200064921A1 (en) * 2016-11-16 2020-02-27 Samsung Electronics Co., Ltd. Electronic device and control method thereof

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9183612B2 (en) * 2013-09-04 2015-11-10 Qualcomm Incorporated Wearable display device use-based data processing control
US10585478B2 (en) * 2013-09-13 2020-03-10 Nod, Inc. Methods and systems for integrating one or more gestural controllers into a head mounted wearable display or other wearable devices
KR101651535B1 (en) * 2014-12-02 2016-09-05 경북대학교 산학협력단 Head mounted display device and control method thereof
US10317997B2 (en) * 2016-03-11 2019-06-11 Sony Interactive Entertainment Inc. Selection of optimally positioned sensors in a glove interface object
CN105955453A (en) * 2016-04-15 2016-09-21 北京小鸟看看科技有限公司 Information input method in 3D immersion environment
KR20180010845A (en) * 2016-07-22 2018-01-31 엘지전자 주식회사 Head mounted display and method for controlling the same
CN106774888B (en) * 2016-12-15 2024-06-07 北京国承万通信息科技有限公司 System with positioning function and equipment thereof

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120218188A1 (en) 2011-02-24 2012-08-30 Tatsuki Kashitani Information processing apparatus, information processing method, and terminal apparatus
US20130335573A1 (en) 2012-06-15 2013-12-19 Qualcomm Incorporated Input method designed for augmented reality goggles
EP3223061A1 (en) 2012-11-20 2017-09-27 Microsoft Technology Licensing, LLC Head mount display and method for controlling the same
US20170012972A1 (en) 2014-02-24 2017-01-12 Sony Corporation Proximity based and data exchange and user authentication between smart wearable devices
US9811184B2 (en) 2014-07-16 2017-11-07 DODOcase, Inc. Virtual reality viewer and input mechanism
US20160140332A1 (en) * 2014-11-13 2016-05-19 Intel Corporation System and method for feature-based authentication
WO2016130895A1 (en) 2015-02-13 2016-08-18 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
US20160267712A1 (en) 2015-03-09 2016-09-15 Google Inc. Virtual reality headset connected to a mobile computing device
US20160274662A1 (en) * 2015-03-20 2016-09-22 Sony Computer Entertainment Inc. Dynamic gloves to convey sense of touch and movement for virtual objects in hmd rendered environments
US20170011553A1 (en) 2015-07-07 2017-01-12 Google Inc. System for tracking a handheld device in virtual reality
US9886086B2 (en) 2015-08-21 2018-02-06 Verizon Patent And Licensing Inc. Gesture-based reorientation and navigation of a virtual reality (VR) interface
US20170090744A1 (en) 2015-09-28 2017-03-30 Adobe Systems Incorporated Virtual reality headset device with front touch screen
WO2017171418A1 (en) 2016-03-31 2017-10-05 Samsung Electronics Co., Ltd. Method for composing image and electronic device thereof
US20170287060A1 (en) * 2016-03-31 2017-10-05 Samsung Electronics Co., Ltd. Method for composing image and electronic device thereof
US20170293351A1 (en) 2016-04-07 2017-10-12 Ariadne's Thread (Usa), Inc. (Dba Immerex) Head mounted display linked to a touch sensitive input device
US20170300116A1 (en) 2016-04-15 2017-10-19 Bally Gaming, Inc. System and method for providing tactile feedback for users of virtual reality content viewers
US20170329419A1 (en) 2016-05-11 2017-11-16 Google Inc. Combining gyromouse input and touch input for navigation in an augmented and/or virtual reality environment
US20190174088A1 (en) * 2016-08-05 2019-06-06 Apple Inc. Display System
US20180095542A1 (en) 2016-09-30 2018-04-05 Sony Interactive Entertainment Inc. Object Holder for Virtual Reality Interaction
US20190369726A1 (en) * 2016-11-16 2019-12-05 Samsung Electronics Co., Ltd. Electronic device and control method thereof
US20200064921A1 (en) * 2016-11-16 2020-02-27 Samsung Electronics Co., Ltd. Electronic device and control method thereof

Also Published As

Publication number Publication date
WO2019203837A1 (en) 2019-10-24
EP3756074A1 (en) 2020-12-30
CN111989643A (en) 2020-11-24
EP3756074A4 (en) 2021-10-20
US20220011861A1 (en) 2022-01-13

Similar Documents

Publication Publication Date Title
US10620699B2 (en) Head mounted display, mobile information terminal, image processing apparatus, display control program, display control method, and display system
CN107015638B (en) Method and apparatus for alerting a head mounted display user
US20170043244A1 (en) Interactive entertainment using a mobile device with object tagging and/or hyperlinking
US9950258B2 (en) System, program, and method for operating screen by linking display and plurality of controllers connected via network
JP6523233B2 (en) Information processing method, apparatus, and program for causing a computer to execute the information processing method
US20200005331A1 (en) Information processing device, terminal device, information processing method, information output method, customer service assistance method, and recording medium
JP2018200566A (en) Program executed by computer capable of communicating with head mounted device, information processing apparatus for executing that program, and method implemented by computer capable of communicating with head mounted device
US11278820B2 (en) Method and apparatus for detection of light-modulated signals in a video stream
US11455035B2 (en) Inputs to virtual reality devices from touch surface devices
JP6836379B2 (en) An information processing method, a device, and a program that causes a computer to execute the information processing method.
JP6113897B1 (en) Method for providing virtual space, method for providing virtual experience, program, and recording medium
JP2018200678A (en) Program executed by computer capable of communicating with head mounted device, information processing apparatus for executing that program, and method implemented by computer capable of communicating with head mounted device
CN103430131A (en) A method, system and electronic device for association based identification
JP6457446B2 (en) Method and apparatus for supporting communication in virtual space, and program for causing computer to execute the method
CN105913036B (en) Unmanned aerial vehicle identification method and device
CN109144598A (en) Electronics mask man-machine interaction method and system based on gesture
JP6921789B2 (en) Programs and methods that are executed on the computer that provides the virtual space, and information processing devices that execute the programs.
CN106647794A (en) Flight control method and apparatus
JP2019164216A (en) Instruction position transmission system and instruction position transmission method
JP6382772B2 (en) Gaze guidance device, gaze guidance method, and gaze guidance program
JP2018067297A (en) Information processing method, apparatus, and program for causing computer to implement information processing method
JP6974253B2 (en) A method for providing virtual space, a program for causing a computer to execute the method, and an information processing device for executing the program.
JPWO2020045254A1 (en) Display system, server, display method and equipment
JP6941715B2 (en) Display device, display program, display method and display system
US20230144941A1 (en) Computer-implemented method, computer, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HORII, HIROSHI;VANKIPURAM, MITHRA;ROBINSON, IAN N.;REEL/FRAME:053512/0927

Effective date: 20180418

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE