WO2018106675A1 - Method and apparatus for providing a virtual community - Google Patents

Method and apparatus for providing a virtual community

Info

Publication number
WO2018106675A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual
interface
physical
virtual device
Prior art date
Application number
PCT/US2017/064689
Other languages
English (en)
Inventor
Kent LYONS
Alan Zhang
Rushil KHURANA
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing
Publication of WO2018106675A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user

Definitions

  • The present disclosure relates to virtual reality, in particular, to haptic technology in a virtual reality scene.
  • VR: Virtual Reality
  • a person using VR equipment is typically able to "look around" the artificial world, move about in it and interact with features or items that are depicted on a screen or in goggles.
  • Virtual realities artificially create sensory experiences, which may include sight, touch, hearing, and, less commonly, smell.
  • Virtual realities may be displayed either on a computer monitor, a projector screen, or with a virtual reality headset (also called head- mounted display or HMD).
  • HMDs typically take the form of head-mounted goggles with a screen in front of the eyes. Some simulations may include additional sensory information and provide sounds through speakers or headphones. Additionally, gloves or hand wearable devices fitted with sensors may be utilized.
  • VR has become the subject of increased attention. This is because VR can be used practically in every field to perform various functions including test, entertain and teach. For example, engineers and architects can use VR in modeling and testing of new designs. Doctors can use VR to practice and perfect difficult operations ahead of time and military experts can develop strategies by simulating battlefield operations. VR is also used extensively in the gaming and entertainment industries to provide interactive experiences and enhance audience enjoyment. VR enables the creation of a simulated environment that feels real and can accurately duplicate real life experiences in real or imaginary worlds. Furthermore, VR covers remote communication environments which provide virtual presence of users with the concepts of telepresence and telexistence or virtual artifact (VA).
  • VR content, such as content up to 360°, may be called immersive or panoramic. Such content is potentially not fully visible by a user watching the content on immersive display devices such as head-mounted displays, smart glasses, PC screens, tablets, smartphones and the like. That means that at a given moment, a user may only be viewing a part of the content.
  • a user can typically navigate within the content by various means such as head movement, mouse movement, touch screen, voice and the like.
  • Advanced VR systems have evolved to include a sense of touch.
  • Some haptic VR systems (i.e., related to the sense of touch) incorporate tactile information, generally known as force feedback, in medical, video gaming and military training applications.
  • Some VR systems used in video games can transmit vibrations and other sensations to the user via the game controller.
  • One issue associated with haptic VR systems is the potential need for real objects that may represent each and every virtual object in a VR scene.
  • the need for multiple real objects increases the complexity of the system and impacts the advantages of having a VR system in the first place.
  • each time the virtual scene is changed the infrastructure must be modified and possibly rebuilt. This need limits the ability to develop complex virtual environments with passive haptics or objects.
  • Haptic retargeting is a framework originally created for repurposing passive haptics that leverages the dominance of vision when a user's senses conflict.
  • one passive object (a cube) touched by the user may represent multiple virtual objects (cubes) in the VR scene by applying warping techniques to the VR scene in order to create the illusion that the user is touching different virtual objects at different times.
  • a method including applying a first input received from an interface of a physical device to a first virtual device in a virtual reality scene, the physical device being coupled to the first virtual device, the interface being in a first mode, transforming the virtual reality scene based on haptic technology to associate a location of the physical device in a physical space with a location of a second virtual device in a virtual space of the virtual reality scene, coupling the physical device to the second virtual device in the virtual reality scene, and applying a second input received from the interface to the second virtual device, the interface being in a second mode.
  • an apparatus including a processor and at least one memory coupled to the processor, the processor being configured to apply a first input received from an interface of a physical device to a first virtual device in a virtual reality scene, the physical device being coupled to the first virtual device, the interface being in a first mode, transform the virtual reality scene based on haptic technology to associate a location of the physical device in a physical space with a location of a second virtual device in a virtual space of the virtual reality scene, couple the physical device to the second virtual device in the virtual reality scene, and apply a second input received from the interface to the second virtual device, the interface being in a second mode.
  • a computer-readable storage medium carrying a software program including program code instructions for performing any of the embodiments of the method described above.
  • a non-transitory article of manufacture tangibly embodying computer readable program code instructions which, when executed, cause a computer to perform any of the embodiments of the method described above.
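  • As an illustration only, the control flow summarized above might be sketched in code as follows; the class and method names (VirtualCommunityController, warp_to_align, set_interface_mode, apply_input) are assumptions for the sketch and are not defined by the disclosure.

```python
# Hypothetical sketch: one physical device is coupled to a first virtual device,
# the VR scene is transformed (haptic retargeting) so the physical device is
# co-located with a second virtual device, and inputs are then applied to that
# second virtual device. All names are illustrative assumptions.
class VirtualCommunityController:
    def __init__(self, scene, physical_device):
        self.scene = scene                    # the rendered VR scene
        self.physical_device = physical_device
        self.coupled_device = None            # currently coupled virtual device

    def couple(self, virtual_device, mode):
        """Couple the single physical device to a virtual device and switch the
        physical interface into the mode matching that virtual device."""
        self.coupled_device = virtual_device
        self.physical_device.set_interface_mode(mode)

    def on_input(self, event):
        """Apply an input received from the physical device interface to the
        currently coupled (first or second) virtual device."""
        if self.coupled_device is not None:
            self.coupled_device.apply_input(event)

    def retarget(self, second_virtual_device):
        """Transform the VR scene so the physical device's location in physical
        space is associated with the second virtual device's location in
        virtual space, then re-couple to that second virtual device."""
        self.scene.warp_to_align(self.physical_device.location,
                                 second_virtual_device.location)
        self.couple(second_virtual_device, second_virtual_device.interface_mode)
```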
  • FIG. 1 illustrates a simplified block diagram of an exemplary virtual reality system in accordance with an embodiment of the present disclosure
  • FIG. 2A illustrates a user interacting with a head-mounted display device, a hand sensor and a physical device in accordance with an embodiment of the present disclosure
  • FIG. 2B illustrates a user interacting with a head-mounted display device, a hand sensor and a physical device in accordance with an embodiment of the present disclosure
  • FIG. 2C illustrates a user interacting with a head-mounted display device, a hand sensor and a physical device in accordance with an embodiment of the present disclosure
  • FIG. 3 illustrates a flowchart of an exemplary method in accordance with an embodiment of the present disclosure
  • the elements shown in the figures may be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
  • the phrase "coupled" is defined to mean directly connected to or indirectly connected with, through one or more intermediate components. Such intermediate components may include both hardware and software based components.
  • The terms "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage.
  • any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
  • the disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
  • the present disclosure is directed to haptic VR systems that include dynamic repurposing of non-passive objects, e.g., electronic devices, electro-mechanical devices, etc.
  • A perceptual illusion or trick in virtual reality is used based on haptic retargeting such that one non-passive physical object (e.g., a single interactive physical device) in the physical world or space appears as multiple virtual devices in the VR world.
  • Haptic retargeting is applied to the VR scene in order to better align or co-locate the physical device with a selected virtual device.
  • the system of the present disclosure dynamically and automatically couples the physical device to the selected virtual device.
  • the user is able to interact with the virtual world through the physical device, experiencing a sense of touch and utilizing the physical device as a user interface. Therefore, the system provides the user with the illusion of interacting with multiple and potentially distinct virtual devices through the senses of touch and vision while handling just one physical device, instead of handling one physical device for each virtual device. As a result, the VR experience becomes more real with limited added complexity.
  • FIG. 1 illustrates a simplified block diagram of an exemplary virtual reality system 100 in accordance with an embodiment of the present disclosure.
  • System 100 may process virtual reality, augmented reality, augmented virtuality or immersive content, hereby referred to as virtual reality content.
  • System 100 may include a server or service provider 105 which is capable of receiving and processing user requests from one or more of user devices 160-1 to 160-n.
  • the server 105 may be, for example, a content server.
  • the content server in response to a user request for content, may provide program content including various multimedia assets such as, but not limited to, movies or TV shows for viewing, streaming or downloading by users using the devices 160-1 to 160-n, or coupled to the devices 160-1 to 160-n.
  • the devices 160-1 to 160-n may be any consumer electronic device, e.g., a gateway, a settop box, a television, a computer, a laptop, a tablet, a smart phone, etc.
  • the server or service provider may provide other services besides content delivery.
  • Various exemplary user devices 160-1 to 160-n may communicate with the exemplary server 105 and/or each other (e.g., in a multi-user VR game or VR experience) over a communication network 150 such as the Internet, a wide area network (WAN), and/or a local area network (LAN).
  • Server 105 may communicate with user devices 160-1 to 160-n in order to provide and/or receive relevant information such as recommendations, user ratings, metadata, web pages, media contents, sales offers, sales requests, etc., to and/or from user devices 160-1 to 160-n through the network connections. Server 105 may also provide additional processing of information and data when the processing is not available and/or not capable of being conducted on the local user devices 160-1 to 160-n.
  • server 105 may be a computer having a processor 110 such as, e.g., an Intel processor, running an appropriate operating system such as, e.g., Windows 2008 R2, Windows Server 2012 R2, Linux operating system, etc.
  • processor 110 may execute software to perform and control the various functions and components of server 105.
  • FIG. 1 also illustrates further details of server or service provider 105.
  • Processor 110 may control the various functions and components of the server 105 via a control bus 130.
  • Server 105 may also include a memory 125 which may represent at least one of a transitory memory such as RAM, and a non-transitory memory such as a ROM, a Hard Disk Drive (HDD), a Compact Disk (CD) drive or Digital Video Disk (DVD) drive, and/or a flash memory, for processing and storing different files and information as necessary, including computer program products and software, webpages, user interface information, user profiles, user recommendations, user ratings, metadata, electronic program listing information, databases, search engine software, etc., as needed.
  • Search engine and recommender software may be stored in the non-transitory memory 125 of server 105, as necessary, so that media recommendations may be provided, e.g., in response to a user's profile and rating of disinterest and/or interest in certain media assets, and/or for searching using criteria that a user specifies using textual input (e.g., queries using "sports", “adventure”, “Angelina Jolie”, etc.).
  • a server administrator may interact with and configure server 105 to run different applications using different user input/output (I/O) devices 115 as well known in the art.
  • the user I/O or interface devices 115 of the exemplary server 105 may represent e.g., a mouse, touch screen capabilities of a display, a touch and/or a physical keyboard for inputting user data.
  • the user interface devices 115 of the exemplary server 105 may also include a speaker or speakers, and/or other user indicator devices, for outputting visual and/or audio sound, user data and feedback.
  • server 105 may be connected to network 150 through a communication interface 120 for communicating with other servers or web sites (not shown) and one or more user devices 160-1 to 160-n, as shown in FIG. 1.
  • The communication interface 120 may also represent a television signal modulator and RF transmitter in the case when the content provider 105 represents a television station, cable or satellite television provider, or other wireless content provider.
  • server components such as, e.g., power supplies, cooling fans, etc., may also be needed, but are not shown in FIG. 1 to simplify the drawing.
  • User devices 160-1 to 160-n may be virtual reality, augmented reality or immersive video rendering devices including one or more displays.
  • the device may employ optics such as lenses in front of each display.
  • the display may also be a part of the immersive display device such as for example in the case of smartphones or tablets.
  • displays and optics may be embedded in a helmet, in glasses, or in a wearable visor which are a part of the device or coupled to the device.
  • the immersive video rendering or user device 160-1 to 160-n may also include one or more sensors and/or external auxiliary devices, as further described below.
  • User devices 160-1 to 160-n may be one or more of but are not limited to, e.g., a PC, a laptop, a tablet, a smart phone, a smart watch, a video receiver, a smart television (TV), an HMD device or smart glasses (such as, e.g., Oculus Rift (from Oculus VR), PlayStation VR (from Sony), Gear VR (from Samsung), Google Glass (from Google), etc.), a set-top box, a gateway, or the like.
  • An example of such devices may be, e.g., a Microsoft Windows 10 computer/tablet/laptop, an Android phone/tablet, an Apple IOS phone/tablet, a Sony TV receiver, or the like.
  • a simplified block diagram of an exemplary user device according to the present disclosure is illustrated in block 160-1 of FIG. 1 as Device 1, and is further described below. Similar components and features may also be present in the other user devices 160-2 to 160-n in FIG. 1.
  • User device 160-1 may be directly coupled to the network/Internet 150 by wired or wireless means through connection or link 155, or through gateway 156 and connections or links 154 and 158.
  • User device 160-1 may include a processor 172 representing at least one processor for processing various data and signals, and for controlling various functions and components of the device 160-1, including video encoding/decoding and processing capabilities in order to play, display, and/or transport video content.
  • The processor 172 may communicate with and control the various functions and components of the device 160-1 via a control bus 171.
  • User device 160-1 may also include a display 179 which is driven by a display driver/bus component 177 under the control of processor 172 via a display bus 178.
  • the display 179 may be a touch display.
  • the type of the display 179 may be, e.g., Liquid Crystal Display (LCD), Light Emitting Diode (LED), Organic Light Emitting Diode (OLED), etc.
  • an exemplary user device 160-1 according to the present disclosure may have its display outside of the user device, or an additional or a different external display may be used to display the content provided by the display driver/bus component 177. This is illustrated, e.g., by an exemplary external display 185 which is connected through an external display connection 195 of device 160-1.
  • the connection may be a wired or a wireless connection.
  • Exemplary user device 160-1 may also include a memory 176 which may represent at least one of a transitory memory such as a RAM, and a non-transitory memory such as a ROM, an HDD, a CD drive, a DVD drive, and/or a flash memory, for processing and storing different files and information as necessary, including computer program products and software (e.g., as represented by flow chart diagram 300 of FIG. 3 to be discussed below), webpages, user interface information, databases, etc., as needed.
  • Device 160-1 may also include a communication interface 170 for coupling and communicating to/from server 105 and/or other devices, via, e.g., the network 150 using the link 155. Communication interface 170 may also couple device 160-1 to gateway 156 using the link 158.
  • Links 155 and 158 may represent a connection through, e.g., an Ethernet network, a cable network, a FIOS network, a Wi-Fi network, and/or a cellphone network (e.g., 3G, 4G, LTE, 5G), etc.
  • One function of an immersive content rendering or user device 160-1 may be to control a virtual camera which captures at least a part of the content structured as a virtual volume.
  • the system may include one or more pose tracking sensors which totally or partially track the user's pose, for example, the pose of the user's head, in order to process the pose of the virtual camera.
  • One or more positioning sensors may be provided to track the displacement of the user.
  • the system may also include other sensors related to the environment for example to measure lighting, temperature or sound conditions. Such sensors may also be related to the body of a user, for instance, to detect or measure sweating or heart rate. Information acquired through these sensors may be used to process the content.
  • an exemplary device 160-1 may also include a sensor 175.
  • Sensor 175 may be at least one of an audio sensor such as a microphone, a visual sensor such as a camera (video or picture), a gyroscope, an accelerometer, a compass, a motion detector, a wearable hand/leg/arm/body band, a glove, a Global Positioning System (GPS) sensor, a Wi-Fi location tracking sensor, a Radio Frequency Identification (RFID) tag (or tracking tag), and/or other types of sensors as previously described.
  • an exemplary external sensor 182, 183 may be separate from and coupled to the user device 160-1 (e.g., placed in the room walls, ceiling, doors, inside another device, on the user, etc.).
  • The exemplary external sensor(s) 182, 183 may have wired or wireless connections 192, 193, respectively, to the device 160-1 via an external device interface 173 of the device 160-1, as shown in FIG. 1.
  • External sensor(s) 182, 183 may be, e.g., a microphone, a visual sensor such as a camera (video or picture), a gyroscope, an accelerometer, a compass, a motion detector, a wearable hand/leg/arm/body band, a glove, a Global Positioning System (GPS) sensor, a Wi-Fi location tracking sensor, a Radio Frequency Identification (RFID) tag (or tracking tag), etc.
  • sensor data e.g., from sensor 175, 182 and/or 183, may be provided to processor 172 of user device 160-1 via processor bus 171 for further processing.
  • the processor 172 may process the signals received from the sensor 175, 182, 183. Some of the measurements from the sensors may be used to compute the pose of the device and to control the virtual camera. Sensors which may be used for pose estimation include, for instance, gyroscopes, accelerometers or compasses. In more complex systems, a rig of cameras for example may also be used. The processor 172 may perform image processing to estimate the pose of an HMD. Some other measurements may be used to process the content according to environmental conditions or user reactions. Sensors used for detecting environment and user conditions include, for instance, one or more microphones, light sensor or contact sensors. More complex systems may also be used such as, for example, a video camera tracking eyes of a user.
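  • As a generic illustration of how gyroscope and accelerometer measurements can be fused for pose estimation (the disclosure does not prescribe any particular algorithm), a simple complementary filter might look like the following sketch; the axis conventions and the blending factor are assumptions.

```python
import numpy as np

def complementary_filter(prev_pitch, prev_roll, gyro_rates, accel, dt, alpha=0.98):
    """One step of a simple complementary filter estimating pitch and roll.

    gyro_rates: (pitch_rate, roll_rate) in rad/s; accel: (ax, ay, az) in m/s^2.
    The gyroscope integrates smoothly but drifts over time; the accelerometer
    gives an absolute but noisy gravity reference; alpha blends the two.
    """
    # Short-term estimate: integrate the angular rates.
    pitch_gyro = prev_pitch + gyro_rates[0] * dt
    roll_gyro = prev_roll + gyro_rates[1] * dt
    # Drift-free (but noisy) estimate from the measured gravity direction.
    ax, ay, az = accel
    pitch_acc = np.arctan2(-ax, np.sqrt(ay * ay + az * az))
    roll_acc = np.arctan2(ay, az)
    # Blend the two estimates.
    return (alpha * pitch_gyro + (1.0 - alpha) * pitch_acc,
            alpha * roll_gyro + (1.0 - alpha) * roll_acc)
```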
  • exemplary device 160-1 may also include user input/output (I/O) devices 174.
  • the user I/O or interface devices 174 of the exemplary device 160-1 may represent e.g., a mouse, a remote control, a joystick, a touch sensitive surface (e.g. a touchpad or a tactile screen), touch screen capabilities of a display (e.g., display 179 and/or 185), a touch screen and/or a physical keyboard for inputting user data.
  • the user interface devices 174 of the exemplary device 160-1 may also include a speaker or speakers, and/or other user indicator devices, for outputting visual and/or audio sound, user data and feedback. Information from user input devices may be used to process the content, manage user interfaces or to control the pose of the virtual camera.
  • sensors 175, 182, 183 and user input devices 174 communicate with the processor 172 within the immersive rendering or user device 160-1 through wired or wireless communication interfaces.
  • device 160-1 may be coupled to at least one external or auxiliary device 181, via external device interface 173 and link 191.
  • Device 181 may be, e.g., a smart phone, a tablet, a remote control, a keyboard device, etc.
  • the external device 181 may include a touch sensitive surface (e.g. a touchpad or a tactile screen) to be utilized as a user interface (UI).
  • device 160-1 may be coupled to a virtual reality, augmented reality (AR), or immersive HMD device or smart glasses 184 (such as, e.g., Oculus Rift (from Oculus VR), PlayStation VR (from Sony), Gear VR (from Samsung), Google Glass (from Google), etc.), via external device interface 173 and link 194.
  • user device 160-1 may itself be an HMD device or smart glasses.
  • the HMD device may include an embedded camera which may be utilized as a sensor, e.g., for localization (when observing the surroundings) or for user recognition when pointed to the user's eye (e.g., iris recognition).
  • the HMD device may also include an embedded microphone which may be utilized as a sensor or as a voice interface to accept voice commands.
  • the HMD device may also include a headphone or earbuds for providing audio.
  • exemplary user devices 160-1 to 160-n may access different media assets, recommendations, web pages, services or databases provided by server 105 using, e.g., Hypertext Transfer Protocol (HTTP).
  • a well-known web server software application which may be run by server 105 to service the HTTP protocol is Apache HTTP Server software.
  • examples of well-known media server software applications for providing multimedia programs include, e.g., Adobe Media Server and Apple HTTP Live Streaming (HLS) Server.
  • Server 105 may provide media content services similar to, e.g., Amazon, Netflix, or M-GO as noted before.
  • Server 105 may also use a streaming protocol such as e.g., Apple HTTP Live Streaming (HLS) protocol, Adobe Real-Time Messaging Protocol (RTMP), Microsoft Silverlight Smooth Streaming Transport Protocol, etc., to transmit various programs including various multimedia assets such as, e.g., movies, TV shows, software, games, electronic books, electronic magazines, etc., to the end-user device 160-1 for purchase and/or viewing via streaming, downloading, receiving or the like.
  • The sensor or sensors 175, 182 and/or 183 may also be connected to the server or service provider 105 by wired (e.g., Ethernet cable) or wireless (e.g., 802.11 standards or Bluetooth) means (e.g., LAN or WAN network) and processor 110 may remotely process some or all of the sensor data.
  • The connections or links in FIG. 1, including 140, 155, 154, 158 and 191-195, may each independently be a wired or a wireless connection.
  • haptic retargeting is a haptic technology framework originally created for repurposing passive haptics by leveraging the dominance of vision when a user's senses conflict.
  • Haptic retargeting dynamically aligns physical and virtual objects as the person is interacting in the environment.
  • warping techniques are applied to the VR scene in order to give the user the illusion of touching different virtual objects at different times, while actually touching just one physical object.
  • In world warping, the virtual world surrounding a user is manipulated, warped or transformed to better align the haptic object in the physical space with the virtual object in the virtual space.
  • world warping may shift a scene in a direction, e.g., to the left, to better align the virtual object with the location of the haptic object in the physical space.
  • In body warping, the visual representation of the user in the virtual space may be manipulated, warped or transformed to better align the haptic object in the physical space with the virtual object in the virtual space.
  • the illusion may be accomplished by, e.g., warping the user's arm in the virtual space.
  • body warping may provide the illusion that a user's arm is reaching to the left in the virtual space, when the user's arm is actually reaching to the front in the physical space.
  • body warping and/or world warping may be applied to the VR scene in order to accomplish haptic retargeting of non-passive objects, i.e., physical devices, as explained in greater detail below, in association with FIG. 2A, 2B and 2C.
  • Warping may include the operations of translation, scaling, rotation and/or deformation of elements of the VR scene, including bodies and/or objects.
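  • A minimal sketch of world warping as a gradual scene translation is shown below; it assumes the physical device's position has already been mapped into virtual-space coordinates and blends the offset in proportion to the hand's progress so the shift stays smooth, which is one common way to keep the warp unobtrusive. The function name and blending rule are assumptions, not the disclosed implementation.

```python
import numpy as np

def world_warp_offset(hand_start, hand_now, physical_device_virtual, target_device_virtual):
    """Return a translation (3-vector) to apply to the whole virtual scene.

    All positions are expressed in virtual-space coordinates (the physical
    device and hand positions via an assumed physical-to-virtual mapping).
    The full offset would place the target virtual device exactly where the
    physical device is; it is blended in according to the hand's progress
    toward the physical device so the shift happens gradually.
    """
    full_offset = np.asarray(physical_device_virtual, float) - np.asarray(target_device_virtual, float)
    total = np.linalg.norm(np.asarray(physical_device_virtual, float) - np.asarray(hand_start, float))
    travelled = np.linalg.norm(np.asarray(hand_now, float) - np.asarray(hand_start, float))
    progress = np.clip(travelled / max(total, 1e-6), 0.0, 1.0)
    return progress * full_offset
```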
  • The VR system of the present disclosure (in particular, processor 172 of user device 160-1) dynamically and automatically couples the physical device (e.g., 181 or 174) to the selected virtual device.
  • the user is able to interact with the virtual world through the physical device, experiencing a sense of touch and utilizing the physical device as a user interface.
  • the VR system dynamically and automatically warps or transforms the VR scene to better align the physical device with the second virtual device and couples the physical device with the second virtual device. The user is then allowed to interact with the second virtual device through the physical device.
  • FIG. 2A illustrates a user 265 interacting with a head-mounted device 270, a hand sensor 275 and a physical device 255 in accordance with an embodiment of the present disclosure.
  • The HMD 270 may be similar to user device 160-1, or similar to HMD 184 coupled to user device 160-1.
  • the hand sensor 275 may be similar to sensor 175, 182 or 183.
  • the physical device 255 may be similar to external device 181 or user I/O device 174.
  • FIG. 2A illustrates in drawing 200A the physical space 250 where a first interaction takes place between the user 265, table 260, HMD 270, tablet or smart phone 255 and hand sensor 275.
  • FIG. 2A also illustrates a virtual space 210 representing a VR scene displayed by HMD 270 including two virtual devices (a first virtual device 215 and a second virtual device 220) and a virtual arm/hand 225 corresponding to the user's arm 265 in the physical space.
  • Drawing 200A illustrates that the user's hand 265 in the physical space 250 is touching the physical device 255. In particular, the user's fingers 265 are touching the physical device 255 interface.
  • Drawing 200A also illustrates that the virtual user's hand 225 is touching the first virtual device 215. In particular, the virtual user's fingers 225 are touching the first virtual device 215 in the virtual space 210 to mimic the user's action and interaction with the devices 270, 255 and hand sensor 275 in the physical space 250.
  • When physical device 255 is coupled to the first virtual device 215, the physical device 255 interface may be in a first mode. When in the first mode, the physical device 255 interface may match the first virtual device 215 interface.
  • The physical device 255 and the virtual devices 215 and 220 are each one of a smart phone and a tablet.
  • the interface of the physical device 255 and the first virtual device 215 are touch screen interfaces of smart phones or tablets.
  • the physical device 255 does not need to match the virtual devices 215 and 220; only the interface of the physical device 255 matches the interface of a virtual device (215 or 220) when the physical device 255 is coupled to the respective virtual device (215 or 220).
  • In another example, the physical device 255 may be a tablet, the first virtual device 215 may be a remote control and the second virtual device 220 may be a smart phone. When coupled to the first virtual device 215, the user interface of the physical device 255 matches the buttons of the remote control 215; when coupled to the second virtual device 220, the user interface of the physical device 255 matches the user interface of the smart phone 220.
  • FIG. 2B illustrates a user 265 interacting with a head-mounted device 270, a hand sensor 275 and a physical device 255 in accordance with an embodiment of the present disclosure.
  • FIG. 2B illustrates the same physical space 250 as in FIG. 2A in a subsequent or second interaction of the user 265 with the HMD 270, hand sensor 275 and physical device 255.
  • FIG. 2B illustrates the same virtual space 210 in a subsequent interaction of the virtual arm/hand 225 with the first virtual device 215 and the second virtual device 220, where the subsequent interaction in the virtual space 210 corresponds to the subsequent or second interaction in the physical space 250.
  • Drawing 200B illustrates that the user's hand 265 in the physical space 250 is no longer touching the physical device 255 (as it was in FIG. 2A) and is, instead, moving to the left of physical device 255.
  • Drawing 200B also illustrates that the virtual user's hand 225 is not touching the first virtual device 215 any longer and is, instead, moving towards the second virtual device 220.
  • the virtual user's hand/arm 225 in the virtual space 210 is mimicking the user's action in the physical space 250.
  • Drawing 200B illustrates the case of a user 265 wanting to touch virtual device 220 and moving the user's hand 265 to the left in physical space 250.
  • the user sees the virtual space 210 on HMD 270, where the virtual hand 225 moves towards the second virtual device 220.
  • Wearable hand sensor 275 captures the movement of the user's hand 265 in the physical space and sends a signal to a processor, e.g., processor 172 of user device 160-1.
  • Processor 172 then processes the signal from hand sensor 275 to detect the hand movement, in this case, towards the second virtual device.
  • the physical space is mapped to the virtual space, so that movement in the former may be translated into movement in the latter.
  • The detection may be based on at least one of: a proximity or distance threshold to the second virtual device 220, the direction of the trajectory of the user's hand 265, and the user's hand "touching" the second virtual device 220 in the virtual space 210.
  • Processor 172 may determine that the user's hand is within a distance threshold from the second virtual device 220 when correlating the hand movement in the physical space 250 with the position of the second virtual device 220 in the virtual space 210. For example, if the user's hand 265, when mapped or translated to the virtual space 210 (that is, virtual hand 225), is within a distance threshold of the second virtual device 220, then the user intends to touch the second virtual device 220. Otherwise, if the user's hand 265, when mapped or translated to the virtual space 210, is outside the distance threshold of the second virtual device 220, then the user does not intend to touch the second virtual device 220. In this embodiment, the decision is anticipatory or predictive. The processor 172 predicts that the user intends to touch the second virtual device 220 ahead of time.
  • processor 172 may be further configured to determine a direction of the user's hand movement or trajectory in the physical space 250 and, when translated to the virtual space 210, compare the direction or trajectory against a position of the second virtual device 220. In this case, the detection may be based on an angle of direction of the trajectory which extrapolates the trajectory towards the second virtual device 220 in the virtual space 210. For example, if the user's hand trajectory, when translated to the virtual space 210, may be extrapolated to touch or come within a distance threshold of the second virtual device 220, then the user intends to touch the second virtual device.
  • Otherwise, if the user's hand trajectory, when translated to the virtual space 210, cannot be extrapolated to touch or come within a distance threshold of the second virtual device 220, then the user does not intend to touch the second virtual device 220.
  • the decision in this embodiment is anticipatory or predictive.
  • The processor 172 predicts that the user intends to touch the second virtual device 220 ahead of time.
  • the processor 172 may be further configured to detect that the user's hand is "touching" the second virtual device, that is, the location of the hand 265 in the physical space 250 is equivalent to the location of the second virtual device 220 in the virtual space 210, in a mapping between the physical and the virtual spaces.
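  • The detection logic described above (proximity threshold, trajectory extrapolation, and a virtual "touch") could be sketched as follows; the physical-to-virtual mapping and the numeric threshold are assumptions for illustration only.

```python
import numpy as np

def physical_to_virtual(p_phys, rotation, offset):
    """Map a tracked physical-space position into virtual-space coordinates
    using an assumed rigid calibration (rotation matrix and offset vector)."""
    return rotation @ np.asarray(p_phys, dtype=float) + offset

def intends_to_touch(hand_positions_virtual, target_pos, dist_threshold=0.15):
    """Predict whether the user intends to touch the target virtual device.

    hand_positions_virtual: recent hand samples already mapped into virtual
    space, oldest first. Returns True if the hand is within the distance
    threshold of the target (proximity test) or if its current trajectory,
    extrapolated forward, passes within that threshold.
    """
    current = np.asarray(hand_positions_virtual[-1], dtype=float)
    target = np.asarray(target_pos, dtype=float)
    # Proximity test: the hand is already close enough to the target device.
    if np.linalg.norm(target - current) < dist_threshold:
        return True
    if len(hand_positions_virtual) < 2:
        return False
    # Trajectory test: extrapolate the motion direction and measure how close
    # the resulting ray comes to the target (point-to-ray distance).
    previous = np.asarray(hand_positions_virtual[-2], dtype=float)
    direction = current - previous
    norm = np.linalg.norm(direction)
    if norm < 1e-6:
        return False
    direction = direction / norm
    to_target = target - current
    along = float(np.dot(to_target, direction))
    if along <= 0.0:  # the hand is moving away from the target device
        return False
    closest = current + along * direction
    return np.linalg.norm(target - closest) < dist_threshold
```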
  • processor 172 may dynamically transform or warp the VR scene in virtual space 210 to align or co-locate the physical device 255 with the second virtual device 220, in response, for example, to the user's hand movement towards the second virtual device 220.
  • the transformation may be a direct consequence of the user's hand movement towards the second virtual device 220, or may be a function of (or in response to receiving) another action by the user (e.g., voice command) or by another user.
  • The retargeting, transformation or warping of the VR scene may happen concurrently, dynamically and smoothly to the user's eyes. For example, from FIG. 2A to 2B, as the user's hand 265 moves to the left of the physical device 255, expressing the user's desire to touch the second virtual device 220, processor 172 warps the VR scene in the virtual space 210 to associate or align or co-locate the second virtual device 220 with the physical device 255, giving the user the illusion that they are the same device. Observe that in FIG. 2B, the VR scene has shifted the virtual devices 215 and 220 to the right. The shift happens dynamically and smoothly.
  • As the virtual device 220 shifts to the right, so does the virtual hand 225, prompting the user by visual cue to move his hand in the right direction and return his hand 265 towards the physical device 255. As a result, the user's original hand movement in the physical space to the left of the physical device 255 is followed by a movement to the right, back to the physical device 255, prompted by the transformation of the VR scene in the virtual space 210.
  • processor 172 may couple the physical device 255 with the second virtual device 220 allowing the user 265 to utilize the physical device 255 as the user interface for the second virtual device 220.
  • the coupling includes communicating between the physical device 255 and the processor 172 (e.g., through external device interface 173) and applying the inputs from the physical device interface to the second virtual device 220 in the virtual reality scene.
  • the coupling may be a direct consequence of the user's hand movement towards the second virtual device 220, or may be a function of (or in response to receiving) another action by the user (e.g., voice command) or by another user.
  • FIG. 2C illustrates a user 265 interacting with a head-mounted device 270, a hand sensor 275 and a physical device 255 in accordance with an embodiment of the present disclosure.
  • FIG. 2C illustrates the same physical space 250 as in FIG. 2A and 2B in another subsequent or third interaction of the user 265 with the HMD 270, hand sensor 275 and physical device 255.
  • FIG. 2C illustrates the same virtual space 210 in another subsequent interaction of the virtual arm/hand 225 with the first virtual device 215 and the second virtual device 220, where this other subsequent interaction in the virtual space 210 corresponds to the subsequent or third interaction in the physical space 250.
  • Drawing 200C illustrates that the user's hand 265 in the physical space 250 is again touching the physical device 255. In particular, the user's fingers 265 are touching the physical device 255 interface. Drawing 200C also illustrates that the virtual user's hand 225 is touching the second virtual device 220. In particular, the virtual user's fingers 225 are touching the second virtual device 220 to mimic in the virtual space 210 the user's action in the physical space 250.
  • physical device 255 is coupled to the second virtual device 220 and the physical device 255 interface is in a second mode.
  • the physical device 255 interface matches the second virtual device 220 interface.
  • The physical device 255 and the virtual devices 215 and 220 are each one of a smart phone and a tablet.
  • the interface of the physical device 255 and the second virtual device 220 are touch screen interfaces of smart phones or tablets.
  • processor 172 dynamically transformed or warped the VR scene in virtual space 210 to associate or align the physical device 255 with the second virtual device 220, in response to the user's hand movement towards the second virtual device 220.
  • the retargeting, transformation or warping of the VR scene happened concurrently, dynamically and smoothly to the user's eyes.
  • The user's original hand movement in the physical space to the left of the physical device 255 (as shown in FIG. 2B) was followed by a movement to the right, back to the physical device 255, prompted by the transformation of the VR scene in the virtual space 210.
  • In FIG. 2C, the user's hand 265 is touching the physical device 255 in the physical space 250.
  • processor 172 has coupled the physical device 255 with the second virtual device 220 allowing the user 265 to utilize the physical device 255 as the user interface for the second virtual device 220.
  • In FIG. 2C, the second virtual device 220 is approximately at the center of the VR scene 210, in contrast with FIG. 2A (where the first virtual device 215 is at the center of the VR scene) and FIG. 2B (where the virtual user's hand 225 is at the center of the VR scene).
  • the VR scenes from FIG. 2A to 2B and to 2C show the smooth transition and the visual cues that are presented to the user in order to direct his hand's movement back to the physical device 255. Therefore, from FIG. 2A to 2C, the user has been able to interact with two virtual devices 215 and 220 utilizing only one physical device 255 as a user interface to both virtual devices.
  • the user interface of physical device 255 may match the user interface of the first virtual device 215 in a first mode and may match the user interface of the second virtual device 220 in a second mode.
  • In the first mode, the user interface for physical device 255 will have the same configuration as the first virtual device 215, i.e., the upper left corner of the touch screen of the physical device 255 is configured to represent the number 1 key of the first virtual device 215.
  • In the second mode, the user interface for physical device 255 will have the same configuration as the second virtual device 220, i.e., the upper left corner of the touch screen of the physical device 255 is configured to represent the start button of the second virtual device 220.
  • In another example, the physical device 255 may be a tablet, the first virtual device 215 may be a keyboard and the second virtual device 220 may be a calculator. In the first mode, the user interface of the physical device 255 matches the keys of the keyboard 215; in the second mode, the user interface of the physical device 255 matches the buttons of the calculator 220.
  • the physical device has a user interface software application which is controlled by the VR rendering device (e.g., user device 160-1).
  • The VR rendering device processor (e.g., processor 172) sends an instruction to the physical device to switch to the appropriate user interface mode, matching the user interface of the virtual device.
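  • As an illustration only, the mode-switch instruction could be a small message sent from the rendering device to the user interface application running on the physical device; the JSON message format, the example layouts and the port number below are assumptions, not a protocol defined by the disclosure.

```python
import json
import socket

# Hypothetical interface layouts keyed by mode; in practice these would mirror
# the interfaces of the virtual devices in the VR scene.
LAYOUTS = {
    "remote_control": {"buttons": ["power", "vol+", "vol-", "ch+", "ch-"]},
    "calculator": {"buttons": [str(d) for d in range(10)] + ["+", "-", "=", "C"]},
}

def send_mode_switch(device_addr, mode, port=9000):
    """Instruct the physical device to switch to the user interface mode that
    matches the currently coupled virtual device."""
    message = json.dumps({
        "command": "set_interface_mode",
        "mode": mode,
        "layout": LAYOUTS[mode],
    }).encode("utf-8")
    with socket.create_connection((device_addr, port), timeout=1.0) as conn:
        conn.sendall(message)
```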
  • The tracked user's body movement need not be a hand movement and may include, e.g., movement of the user's head.
  • a sensor may be attached to the user's head or included in the HMD device.
  • the user's eye movements may also be tracked with a camera sensor in the HMD.
  • the user's legs or other body movements may also be tracked with wearable sensors.
  • The user may be walking in a physical space while the physical device 255 rests on a desk-like surface attached to the user's body.
  • more than one body part may be tracked at the same time, e.g., hand and eye, or hand and leg.
  • At least one fixed camera in the physical space may also be used as a sensor to observe the user's movements and/or the position of the physical device 255.
  • the various sensors may be, e.g., sensor 175, 182 or 183.
  • more than one physical device may be present in the VR system according to the present disclosure.
  • additional physical devices may be each coupled to one or more virtual devices.
  • a physical device may only couple to a subset of the virtual devices, depending on its physical features. For example, a wand may only couple to wand like virtual devices.
  • the location of the physical device in the physical space influences the choice of virtual devices that may be coupled to the physical device.
  • For example, a physical device in a particular quadrant of a table (e.g., 260) may only couple to a corresponding subset of virtual devices, a physical device on a first table (e.g., 260) may only couple to virtual devices in a first portion of an immersive virtual scene (e.g., a first room), and a second device on a second table may only couple to virtual devices in a second portion of the immersive virtual scene (e.g., a second room).
  • a user may, e.g., walk from table to table as if walking from room to room in a virtual space.
  • The selection of the second virtual device 220 (and consequent haptic retargeting and/or coupling of the physical device 255 to virtual device 220) is not prompted by a user's movement but made by processor 172, prompted by the VR software it runs (e.g., a VR game), or by another user's device 160-2 to 160-n.
  • In one embodiment, the selection of the virtual device and consequent retargeting of the VR scene may be performed by processor 172 as instructed by another user or by the VR software, while the coupling of the physical device may be performed by the processor 172 based on a user's action (e.g., touching the physical device).
  • In another embodiment, the selection of the virtual device and consequent retargeting of the VR scene, and the coupling of the physical device to the selected virtual device, may all be performed by processor 172 as instructed by another user or by the VR software.
  • The user device 160-1 and HMD 270, 184 are the same device.
  • A fixed camera (not shown in FIGs. 2A to 2C) focuses on the physical space 250 surrounding the physical device 255, providing visual information about the physical space 250, including movement of the user's hand 265.
  • the external device 181 need not be a full computer (e.g., tablet, smart phone, etc.).
  • the external device 181 may be a peripheral dedicated to input (touch screen, buttons, etc.) that is re-appropriated for different virtual devices/controls.
  • the peripheral may be e.g., user I/O device 174.
  • An apparatus 160-1 for providing a virtual reality scene/content including a processor 172 and at least one memory 176 coupled to the processor, the processor being configured to apply a first input received from an interface of a physical device to a first virtual device in a virtual reality scene/content, the physical device 174, 181 being coupled to the first virtual device, the physical device interface being in a first mode, transform the virtual reality scene based on or using haptic technology (e.g., haptic retargeting) to associate or align or co-locate a real or physical location/position of the physical device 174, 181 in a physical space with a virtual location/position of a second virtual device in the virtual space of the virtual reality scene, couple the physical device 181 to the second virtual device in the virtual reality scene and apply a second input received from the physical device interface to the second virtual device, the physical device interface being in a second mode.
  • By transforming the VR scene, the haptic technology enables a user to interact with the second virtual device in the virtual space through the sense of touch of the physical device in the physical space.
  • the haptic technology may be haptic retargeting or similar technology. Haptic retargeting dynamically warps elements of the virtual reality scene in order to associate (a user's sense of touch of) the physical device with (a user's view of) the second virtual device.
  • the warping may associate a user's sense of touch and view of the physical device with a user's view of the second virtual device.
  • the warping may be at least one of body warping and world warping.
  • Warping may include at least one of the operations of translation, scaling, rotation and deformation of elements of the VR scene, including bodies and/or objects.
  • the physical or virtual locations/positions may be identified with respect to a set of two-dimensional or three-dimensional axes in the respective 2D or 3D Cartesian coordinate system of the physical or virtual space.
  • the physical device interface may match a first virtual device interface in the first mode and may match a second virtual device interface in the second mode.
  • For example, the physical device may be a smart phone, the first virtual device may be a remote control for a television and the second virtual device may be a cell phone. In the first mode, the physical device interface matches the touch buttons of a remote control; in the second mode, the physical device interface matches the touch buttons of a cell phone.
  • the physical device interface may be adapted, modified, or changed to match a virtual device interface when the physical device 174, 181 is coupled to a virtual device.
  • the physical device 174, 181 may be coupled to a virtual device as a function of or based on or in response to (the processor) receiving at least one user's coupling action.
  • the user's coupling action may be one of a user's movement towards the virtual device, a user's voice command and the user touching the physical device 174, 181 in the physical space.
  • the user's movement may be, e.g., one of a user's hand movement, body movement and head movement.
  • the virtual reality scene may be transformed as a function of or based on or in response to (the processor) receiving at least one user's transforming action.
  • the user's transforming action may be one of a user's movement towards the second virtual device, a user's voice command and the user touching the second virtual device in the virtual space.
  • the user's movement may be one of a user's hand movement, body movement or head movement.
  • Motion sensors may be applied on the various body parts, and/or a camera may be used as a video and/or an audio sensor.
  • the user's movement may be a hand movement and when the user's hand breaks contact with the physical device 174, 181 and moves towards the second virtual device, the virtual reality scene is transformed to direct the user's hand back to the physical device 174, 181.
  • the processor 172 may be further configured to detect a user's hand movement towards the second virtual device in the virtual space. The detection may be based on at least one of a threshold of proximity or distance threshold to the second virtual device, a direction of the trajectory of the user's hand and a user's hand "touching" the second virtual device in the virtual space.
  • The processor 172 may be further configured to determine that the user's hand is within a distance threshold from the second virtual device when correlating the hand movement in the physical space with the position of the second virtual device in the virtual space. If the hand is within a distance threshold of the second virtual device, then the user intends to touch the second virtual device. Otherwise, if the hand is outside a distance threshold of the second virtual device, then the user does not intend to touch the second virtual device.
  • the processor 172 may be further configured to determine a direction of the user's hand movement or trajectory and compare the direction against a position of the second virtual device. In this case, the detection may be based on an angle of direction of the trajectory which extrapolates the trajectory towards the second virtual device. For example, if the hand trajectory may be extrapolated to touch or come within a distance threshold of the second virtual device, then the user intends to touch the second virtual device. Otherwise, if the hand trajectory may not be extrapolated to touch or come within a distance threshold of the second virtual device, then the user does not intend to touch the second virtual device.
  • the processor 172 may be further configured to detect that the user's hand is "touching" the second virtual device, that is, the location of the hand in the physical space is equivalent to the location of the second virtual device in the virtual space, in a mapping between the physical and the virtual spaces.
  • the processor 172 may be further configured to receive a signal based on the user's hand movement. The signal may be sent by, e.g., a hand wearable, a glove sensor or a camera, 182, 183, 175.
  • the apparatus may further include a sensor 175, 182, 183 coupled to the processor and configured to sense the user's hand movement and provide a signal based on the user's hand movement.
  • the sensor may be, e.g., one of a hand wearable, a glove sensor and a camera.
  • the user's transforming action and the user's coupling action may be the same action.
  • the processor 172 may be configured to couple based on the VR content.
  • For example, the VR content may be a video game and the processor is configured to couple based on the game actions or progression.
  • the processor 172 may be configured to transform based on the VR content.
  • the VR content may be a video game and the processor is configured to transform based on the game actions or progression.
  • the virtual reality scene may be transformed as a function of or in response to (the processor) receiving at least one transforming action by another user.
  • another user may be interacting with the user in a VR video game or another VR experience.
  • The transforming action by another user may be one of the other user's movement towards the second virtual device, the other user's voice command and the other user touching the second virtual device in the virtual space.
  • the physical device 181, 174 may be coupled to the second virtual device as a function of or in response to (the processor) receiving at least one coupling action by another user.
  • another user may be interacting with the user in a VR video game or another VR experience.
  • The coupling action by another user may be one of the other user's movement towards the second virtual device, the other user's voice command and the other user touching the physical device 181, 174 in the physical space.
  • the transforming action by another user and the coupling action by another user may be the same action.
  • the first input may be provided by the user or another user.
  • the second input may be provided by the user or another user.
  • The physical device interface may be, e.g., at least one of a touchscreen interface, a keyboard interface and a series of buttons.
  • the physical device 174, 181 may be a smart phone.
  • the physical device 174, 181 may be a remote control.
  • the physical device 174, 181 may be a keyboard device.
  • the physical device 174, 181 may be a tablet or computer.
  • the processor 172 may be further configured to output the transformed virtual reality scene for display.
  • apparatus 160-1 may output the VR scene to display 179, 184 and/or 185.
  • the apparatus may further include or be coupled to a display device 179, 184, 185 coupled to the processor 172 and configured to display the transformed virtual reality scene.
  • the processor may be further configured to couple the physical device 181, 174 to the first virtual device in the virtual reality scene.
  • Coupling to the first virtual device may be similar to coupling to the second virtual device and may be similarly performed, and may include similar features. For example, it may be a function of or in response to (the processor) receiving a user's coupling action or a coupling action by another user, or it may be a function of the VR content or progression (e.g., video game).
  • Coupling to the first virtual device may be optional, bypassed or removed. For example, it may be a default step in a VR content (e.g., video game) and may not require any particular action by the processor.
  • FIG. 3 illustrates a flowchart 300 of an exemplary method of providing a virtual reality scene/content in accordance with the present disclosure.
  • the method may include, at step 320, applying a first input received from an interface of a physical device to a first virtual device in a virtual reality scene/content, the physical device being coupled to the first virtual device, the physical device interface being in a first mode.
  • the method may include transforming the virtual reality scene based on or using haptic technology (e.g., haptic retargeting) to associate or align or co-locate a real or physical location/position of the physical device in a physical space with a virtual location/position of a second virtual device in the virtual space of the virtual reality scene.
  • the haptic technology may be, e.g., haptic retargeting.
  • the method may include coupling the physical device to the second virtual device in the virtual reality scene.
  • the method may include applying a second input received from the physical device interface to the second virtual device, the physical device interface being in a second mode.
  • the method 300 may be performed by, e.g., device 160-1, including any of the embodiments previously described. In particular, the steps of the method may be performed by processor 172.
  • the physical device may be, e.g., external physical device 181 (e.g., a smart phone, a tablet, a computer, a keyboard, a remote control, etc.), or user I/O devices 174 (e.g., a keyboard, a remote control, etc.).
  • the haptic technology may be haptic retargeting or similar technology.
  • Haptic retargeting dynamically warps elements of the virtual reality scene in order to associate (a user's sense of touch of) the physical device with (a user's view of) the second virtual device.
  • the warping may associate a user's sense of touch and view of the physical device with a user's view of the second virtual device.
  • the warping may be at least one of body warping and world warping. Warping may include at least one of the operations of translation, scaling, rotation and deformation of elements of the VR scene, including bodies and/or objects (see the illustrative sketches following this list).
  • the physical or virtual locations/positions may be identified with respect to a set of two-dimensional or three-dimensional axes in the respective physical or virtual space.
  • the physical or virtual locations/positions may be identified with respect to a set of two-dimensional or three-dimensional axes in the respective 2D or 3D Cartesian coordinate system of the physical or virtual space.
  • the physical device interface may match a first virtual device interface in the first mode and may match a second virtual device interface in the second mode (see the interface sketch following this list).
  • the physical device may be a smart phone.
  • the first virtual device may be a remote control for a television.
  • the second virtual device may be a cell phone.
  • in the first mode, the physical device interface matches the touch buttons of a remote control.
  • in the second mode, the physical device interface matches the touch buttons of a cell phone.
  • the step of coupling 340 to the second virtual device may further include adapting, modifying or changing the physical device interface to match the second virtual device interface.
  • the step of coupling 340 to the second virtual device may be a function of or based on or in response to receiving at least one user's coupling action.
  • the user's coupling action may be one of a user's movement towards the second virtual device, a user's voice command and the user touching the physical device in the physical space.
  • the user's movement may be one of, e.g., a user's hand movement, body movement and head movement.
  • the step of transforming 330 may be a function of or based on or in response to receiving a user's transforming action.
  • the user's transforming action may be one of a user's movement towards the second virtual device, a user's voice command and the user touching the second virtual device in the virtual space.
  • the user's movement may be one of a user's hand movement, body movement or head movement.
  • Motion sensors may be applied to the various body parts, and/or a camera may be used as a video and/or audio sensor.
  • the user's movement may be a hand movement and when the user's hand breaks contact with the physical device and moves towards the second virtual device, the step of transforming 330 may include directing the user's hand back to the physical device.
  • the step of transforming may further include detecting a user's hand movement towards the second virtual device.
  • the detection may be based on at least one of a proximity or distance threshold to the second virtual device, a direction of the trajectory of the user's hand and a user's hand "touching" the second virtual device in the virtual space.
  • the step of detecting may further include determining that the user's hand is within a distance threshold from the second virtual device when correlating the hand movement in the physical space with the position of the second virtual device in the virtual space. If the hand is within a distance threshold of the second virtual device, it may be determined that the user intends to touch the second virtual device. Otherwise, if the hand is outside the distance threshold, it may be determined that the user does not intend to touch the second virtual device.
  • the step of detecting may further include determining a direction of the user's hand movement or trajectory and comparing the direction against a position of the second virtual device. In this case, the detection may be based on the angle of the direction of the trajectory, which extrapolates the trajectory towards the second virtual device. For example, if the hand trajectory can be extrapolated to touch or come within a distance threshold of the second virtual device, then the user intends to touch the second virtual device. Otherwise, if the hand trajectory cannot be extrapolated to touch or come within a distance threshold of the second virtual device, then the user does not intend to touch the second virtual device (see the illustrative sketches following this list).
  • the step of detecting may further include detecting that the user's hand is "touching" the second virtual device, that is, the location of the hand in the physical space is equivalent to the location of the second virtual device in the virtual space, in a mapping between the physical and the virtual spaces.
  • the step of detecting may further include receiving a signal based on the user's hand movement.
  • the signal may be sent by a hand wearable, a glove sensor or a camera, e.g., 182, 183 or 175.
  • the step of detecting may further include sensing the user's hand movement and providing a signal based on the user's hand movement.
  • the sensor may be one of a hand wearable, a glove sensor and a camera, e.g., 182, 183 or 175.
  • the user's transforming action and the user's coupling action may be the same action.
  • the step of coupling 340 may be based on the VR content.
  • the VR content may be a video game and the coupling may be based on the game actions or progression.
  • the step of transforming 330 may be based on the VR content.
  • the VR content may be a video game and the transforming may be based on the game actions or progression.
  • the step of transforming 330 may be a function of or in response to receiving at least one transforming action by another user.
  • another user may be interacting with the user in a VR video game or another VR experience.
  • the transforming action by another user may be one of the another user's movement towards the second virtual device, the another user's voice command and the another user touching the second virtual device in the virtual space.
  • the step of coupling 340 may be a function of or in response to receiving at least one coupling action by another user.
  • another user may be interacting with the user in a VR video game or another VR experience.
  • the coupling action by another user may be one of the another user's movement towards the second virtual device, the another user's voice command and the another user touching the physical device (e.g., 181, 174) in the physical space.
  • the transforming action by another user and the coupling action by another user may be the same action.
  • the first input may be provided by the user or another user.
  • the second input may be provided by the user or another user.
  • the physical device interface may be at least one of, e.g., a touchscreen interface, a keyboard interface or a series of buttons.
  • the physical device may be a smart phone.
  • the physical device may be a remote control.
  • the physical device may be a keyboard device.
  • the physical device may be a tablet or computer.
  • the method may further include, at step 335, outputting the transformed virtual reality scene for display.
  • the step of outputting may be performed by, e.g., processor 172.
  • the VR scene may be outputted to display 179, 184 and/or 185.
  • the step of outputting may further include displaying the transformed virtual reality scene.
  • the step of displaying may be performed by, e.g., display device 179, 184 and/or 185.
  • the method may further include, at step 310, coupling the physical device to the first virtual device in the virtual reality scene.
  • the step 310 may be performed by, e.g., processor 172.
  • Step 310 may be similar to step 340 and may be similarly performed and include similar features. For example, it may be a function of or in response to a user's coupling action or a coupling action by another user, or it may be a function of the VR content or progression (e.g., video game).
  • Step 310 may be optional, bypassed or removed.
  • step 310 may be a default step in a VR content (e.g., video game) and may not require any particular action.
  • method 300 may be implemented as a computer program product including computer executable instructions which may be executed by a processor.
  • the computer program product having the computer-executable instructions may be stored in the respective non-transitory computer-readable storage media of the respective above mentioned device, e.g., 160-1.
  • a non-transitory computer-readable program product including program code instructions for performing any of the embodiments of the method 300 of providing a virtual reality scene.
  • a non-transitory article of manufacture tangibly embodying computer readable program code instructions which when executed cause a computer to perform any of the embodiments of the method 300 of providing a virtual reality scene.
  • aspects of the present disclosure can take the form of a computer-readable storage medium. Any combination of one or more computer-readable storage medium(s) may be utilized.
  • a computer-readable storage medium can take the form of a computer-readable program product embodied in one or more computer-readable medium(s) and having computer-readable program code embodied thereon that is executable by a computer.
  • a computer-readable storage medium as used herein is considered a non-transitory storage medium given the inherent capability to store the information therein as well as the inherent capability to provide retrieval of the information therefrom.
  • a computer-readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer-readable storage medium carrying a software program including program code instructions for performing any of the embodiments of the method 300 of providing a virtual reality scene.
  • the functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. Also, when provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
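
The following sketches are illustrative only and do not form part of the disclosure or its claims. They are written in Python, and every name in them (Vec3, DISTANCE_THRESHOLD, PhysicalDeviceInterface, the layout dictionaries, the example coordinates) is an assumption introduced for exposition, not an element of the described apparatus or method.

A minimal sketch, assuming a shared 3D Cartesian mapping between the physical and the virtual space, of how the detection of a reach towards the second virtual device (distance threshold and trajectory extrapolation) might be combined with a simplified body-warping correction that steers the rendered hand so that the real hand lands on the physical device. It is one possible reading of the detection and warping bullets above, not the disclosed implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class Vec3:
    """A point/vector in the shared 3D Cartesian coordinate system."""
    x: float
    y: float
    z: float

    def __add__(self, other):
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

    def __sub__(self, other):
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def scale(self, s):
        return Vec3(self.x * s, self.y * s, self.z * s)

    def dot(self, other):
        return self.x * other.x + self.y * other.y + self.z * other.z

    def norm(self):
        return math.sqrt(self.dot(self))


DISTANCE_THRESHOLD = 0.15   # metres: hand is "close enough" to the second virtual device
ANGLE_THRESHOLD = 20.0      # degrees: trajectory points towards the second virtual device


def hand_intends_to_touch(hand_pos, hand_velocity, target_pos):
    """Detect a reach towards the second virtual device using either the
    distance threshold or the extrapolated direction of the hand trajectory."""
    to_target = target_pos - hand_pos
    if to_target.norm() < DISTANCE_THRESHOLD:          # proximity test
        return True
    speed = hand_velocity.norm()
    if speed < 1e-6:                                   # hand is not moving
        return False
    cos_angle = hand_velocity.dot(to_target) / (speed * to_target.norm())
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle < ANGLE_THRESHOLD                     # trajectory test


def warp_rendered_hand(real_hand, virtual_target, physical_device, progress):
    """Simplified body warping: offset the rendered hand so that visually
    reaching the second virtual device lands the real hand on the physical
    device; progress in [0, 1] is how far along the reach the hand is."""
    correction = virtual_target - physical_device
    return real_hand + correction.scale(progress)


# Example: the hand reaches for a virtual cell phone while a real smart phone
# lies nearby; the rendered hand is gradually offset towards the virtual target.
hand = Vec3(0.0, 1.0, 0.3)
velocity = Vec3(0.1, 0.0, 0.4)
virtual_cell_phone = Vec3(0.2, 1.0, 1.0)    # second virtual device (virtual space)
real_smart_phone = Vec3(0.4, 0.9, 0.9)      # physical device (physical space)
if hand_intends_to_touch(hand, velocity, virtual_cell_phone):
    print(warp_rendered_hand(hand, virtual_cell_phone, real_smart_phone, 0.5))
```

A second sketch, reusing the smart phone / TV remote / cell phone example above, of how the physical device interface might be re-mapped when the coupling switches from the first virtual device (first mode) to the second virtual device (second mode). The layout dictionaries and the class are hypothetical.

```python
# First mode: the touchscreen mimics the buttons of the virtual TV remote control.
REMOTE_CONTROL_LAYOUT = {"power": "PWR", "volume_up": "VOL+", "volume_down": "VOL-",
                         "channel_up": "CH+", "channel_down": "CH-"}
# Second mode: the touchscreen mimics the touch buttons of the virtual cell phone.
CELL_PHONE_LAYOUT = {"answer": "ANSWER", "hang_up": "HANG UP", "keypad": "0-9"}


class PhysicalDeviceInterface:
    """Hypothetical controller for the touchscreen interface of the physical device."""

    def __init__(self):
        self.layout = REMOTE_CONTROL_LAYOUT            # coupled in the first mode
        self.coupled_to = "first virtual device (TV remote)"

    def couple_to_second_virtual_device(self):
        """Coupling step: adapt the interface to match the second virtual device."""
        self.layout = CELL_PHONE_LAYOUT
        self.coupled_to = "second virtual device (cell phone)"

    def apply_input(self, button):
        """Route a touch input to whichever virtual device is currently coupled."""
        if button not in self.layout:
            raise ValueError(f"'{button}' is not available in the current mode")
        return f"applied {self.layout[button]} to the {self.coupled_to}"


iface = PhysicalDeviceInterface()
print(iface.apply_input("volume_up"))        # first input, first mode
iface.couple_to_second_virtual_device()      # e.g. triggered by the user's coupling action
print(iface.apply_input("answer"))           # second input, second mode
```

Keeping the routing decision inside one interface object mirrors the idea that a single physical touchscreen serves both virtual devices, with only its mode (layout) changing at the coupling step.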

Abstract

The invention relates to a method of providing a virtual reality scene that includes applying (320) a first input received from an interface of a physical device to a first virtual device in a virtual reality scene, the physical device being coupled to the first virtual device, the interface being in a first mode, transforming (330) the virtual reality scene based on haptic technology to associate a physical location of the physical device with a virtual location of a second virtual device in the virtual reality scene, coupling (340) the physical device to the second virtual device in the virtual reality scene, and applying (350) a second input received from the interface to the second virtual device, the interface being in a second mode. The invention also relates to an apparatus (160-1), a computer-readable storage medium carrying a software program, and a non-transitory article of manufacture tangibly embodying a computer-readable program.
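
As a rough illustration of the flow summarized above, the four steps might be chained as in the following Python sketch; couple, route_input, haptic_retarget and render are placeholder stubs introduced here, not functions from the disclosure.

```python
def couple(physical_device, virtual_device):
    print(f"{physical_device} coupled to {virtual_device}")

def route_input(user_input, virtual_device):
    print(f"applied '{user_input}' to {virtual_device}")

def haptic_retarget(scene, physical_device, virtual_device):
    # e.g. warp the scene so the virtual device is co-located with the physical prop
    return scene + " (warped)"

def render(scene):
    print(f"displaying: {scene}")

def provide_vr_scene(scene, physical_device, first_vd, second_vd, inputs):
    couple(physical_device, first_vd)                            # step 310 (optional)
    route_input(inputs[0], first_vd)                             # step 320, first mode
    scene = haptic_retarget(scene, physical_device, second_vd)   # step 330
    render(scene)                                                # step 335
    couple(physical_device, second_vd)                           # step 340
    route_input(inputs[1], second_vd)                            # step 350, second mode
    return scene

provide_vr_scene("living-room scene", "smart phone", "virtual TV remote",
                 "virtual cell phone", ["volume up", "answer call"])
```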
PCT/US2017/064689 2016-12-09 2017-12-05 Procédé et appareil de fourniture d'une communauté virtuelle WO2018106675A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662431899P 2016-12-09 2016-12-09
US62/431,899 2016-12-09

Publications (1)

Publication Number Publication Date
WO2018106675A1 true WO2018106675A1 (fr) 2018-06-14

Family

ID=60703229

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/064689 WO2018106675A1 (fr) 2016-12-09 2017-12-05 Procédé et appareil de fourniture d'une communauté virtuelle

Country Status (1)

Country Link
WO (1) WO2018106675A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140139439A1 (en) * 2012-11-20 2014-05-22 Lg Electronics Inc. Head mount display and method for controlling the same
US20140237366A1 (en) * 2013-02-19 2014-08-21 Adam Poulos Context-aware augmented reality object commands

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CRISTIANO CARVALHEIRO ET AL: "User Redirection and Direct Haptics in Virtual Environments", PROCEEDINGS OF THE 2016 ACM ON MULTIMEDIA CONFERENCE, MM '16, 1 January 2016 (2016-01-01), New York, New York, USA, pages 1146 - 1155, XP055450338, ISBN: 978-1-4503-3603-1, DOI: 10.1145/2964284.2964293 *
ENRICO RUKZIO: "Physical Mobile Interactions: Mobile Devices as Pervasive Mediators for Interactions with the Real World", INTERNET CITATION, 13 February 2007 (2007-02-13), XP002471780, Retrieved from the Internet <URL:http://edoc.ub.uni-muenchen.de/6494/1/Rukzio_Enrico.pdf> [retrieved on 20080304] *
MAHDI AZMANDIAN ET AL: "Haptic Retargeting", HUMAN FACTORS IN COMPUTING SYSTEMS, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 7 May 2016 (2016-05-07), pages 1968 - 1979, XP058257314, ISBN: 978-1-4503-3362-7, DOI: 10.1145/2858036.2858226 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112739433A (zh) * 2018-09-27 2021-04-30 高通股份有限公司 用于远程渲染的vr的异步空间扭曲

Similar Documents

Publication Publication Date Title
US11925863B2 (en) Tracking hand gestures for interactive game control in augmented reality
US20210405761A1 (en) Augmented reality experiences with object manipulation
US10222981B2 (en) Holographic keyboard display
US10984595B2 (en) Method and apparatus for providing guidance in a virtual environment
US10200819B2 (en) Virtual reality and augmented reality functionality for mobile devices
US10789779B2 (en) Location-based holographic experience
US11520399B2 (en) Interactive augmented reality experiences using positional tracking
JP6478360B2 (ja) コンテンツ閲覧
US11704874B2 (en) Spatial instructions and guides in mixed reality
US11119567B2 (en) Method and apparatus for providing immersive reality content
US20210093977A1 (en) Methods and systems for facilitating intra-game communications in a video game environment
US20170185147A1 (en) A method and apparatus for displaying a virtual object in three-dimensional (3d) space
US20150123901A1 (en) Gesture disambiguation using orientation information
WO2019121654A1 (fr) Procédés, appareil, systèmes et programmes informatiques pour permettre une réalité avec intermédiation
US11126342B2 (en) Electronic device for controlling image display based on scroll input and method thereof
WO2018106675A1 (fr) Procédé et appareil de fourniture d'une communauté virtuelle
US11442268B2 (en) Augmented reality gaming using virtual eyewear beams
US20240079031A1 (en) Authoring tools for creating interactive ar experiences
US11863963B2 (en) Augmented reality spatial audio experience
US20230245410A1 (en) Systems and methods of guided instructions or support using a virtual object
US20240119928A1 (en) Media control tools for managing communications between devices
US20220355211A1 (en) Controller action recognition from video frames using machine learning
CN117742555A (zh) 控件交互方法、装置、设备和介质
KR20230124363A (ko) 전자 장치 및 전자 장치의 제어 방법
Afonso Interação em Realidade Virtual Usando Dispositivos Móveis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17817617

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17817617

Country of ref document: EP

Kind code of ref document: A1