WO2018002800A1 - Method and apparatus for creating a sub-content within a virtual reality content and sharing thereof


Info

Publication number
WO2018002800A1
Authority
WO
WIPO (PCT)
Prior art keywords
content
sub
user
input
media file
Prior art date
Application number
PCT/IB2017/053781
Other languages
English (en)
Inventor
Mithun Uliyar
Krishna Govindarao
Ravi Shenoy
Gururaj PUTRAYA
Soumik Ukil
Pushkar Patwardhan
Original Assignee
Nokia Technologies Oy
Nokia Usa Inc.
Priority date
Filing date
Publication date
Application filed by Nokia Technologies Oy, Nokia Usa Inc.
Publication of WO2018002800A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/25 - Output arrangements for video game devices
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/45 - Controlling the progress of the video game
    • A63F 13/49 - Saving the game status; Pausing or ending the game
    • A63F 13/497 - Partially or entirely replaying previous game actions
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525 - Changing parameters of virtual cameras
    • A63F 13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 - Providing additional services to players
    • A63F 13/86 - Watching games played by other players
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0179 - Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187 - Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/017 - Head mounted

Definitions

  • Various implementations relate generally to a method, an apparatus, and a computer program product for creating, sharing and playback of a sub-content within virtual reality video content.
  • VR content generated using computer technology creates a three-dimensional simulated world that a user can electronically explore (e.g., watching using a VR headset) while feeling as if the user existed in that world.
  • a VR user accessing the VR content may desire to share an event or clip within the VR content (e.g., a VR video content) with one or more users in the network of the VR user.
  • the user might want to share his/her performance during a particular or entire part of the VR video game with another user, so as to show how the user achieved a milestone in the VR video game.
  • VR content may be processed post viewing and clipped to generate a particular sequence of scenes of the VR content for sharing purposes; however, such a process is complex.
  • sharing parts of the VR content as seen by a VR user with other VR users in real-time remains a challenge.
  • a method comprising: determining, by a processor, a start of a sub-content within a virtual reality (VR) content in response to a first input received from a user while accessing the VR content; storing at least head movement information associated with the user from a time of the start of the sub-content while the VR content is being accessed by the user; and determining, by the processor, an end of the sub-content within the VR content in response to a second input received from the user while accessing the VR content, wherein the storing of the head movement information is stopped upon determining the end of the sub-content.
  • an apparatus comprising at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to at least perform: determining a start of a sub-content within a virtual reality (VR) content in response to a first input received from a user while accessing the VR content; storing at least head movement information associated with the user from a time of the start of the sub-content while the VR content is being accessed by the user; and determining an end of the sub-content within the VR content in response to a second input received from the user while accessing the VR content, wherein the storing of the head movement information is stopped upon determining the end of the sub-content.
  • a computer program product comprising at least one computer-readable storage medium, the computer-readable storage medium comprising a set of instructions, which, when executed by one or more processors, cause an apparatus to at least perform: determining a start of a sub-content within a virtual reality (VR) content in response to a first input received from a user while accessing the VR content; storing at least head movement information associated with the user from a time of the start of the sub-content while the VR content is being accessed by the user; and determining an end of the sub-content within the VR content in response to a second input received from the user while accessing the VR content, wherein the storing of the head movement information is stopped upon determining the end of the sub-content.
  • an apparatus comprising: means for determining a start of a sub-content within a virtual reality (VR) content in response to a first input received from a user while accessing the VR content; means for storing at least head movement information associated with the user from a time of the start of the sub-content while the VR content is being accessed by the user; and means for determining an end of the sub-content within the VR content in response to a second input received from the user while accessing the VR content, wherein the storing of the head movement information is stopped upon determining the end of the sub-content.
  • an apparatus comprising at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to at least perform: determining a start of a sub-content within a virtual reality (VR) content in response to a first input received from a user while accessing the VR content; storing, continuously, a user input pattern
  • FIGURE 1 illustrates an example representation of a setup for creating a sub-content within a VR content and sharing the sub-content, in accordance with an example embodiment
  • FIGURES 2A, 2B, 2C and 2D illustrate example schematic representations of displays at different timestamps for creating a sub-content, in accordance with an example embodiment
  • FIGURES 3A, 3B, and 3C illustrate example schematic representations of first inputs and second inputs provided by a user, in accordance with some example embodiments
  • FIGURES 4A, 4B, 4C, 4D and 4E illustrate example representations of media files, in accordance with some example embodiments
  • FIGURE 5 is a flowchart depicting an example method, in accordance with an example embodiment
  • FIGURE 6 is a flowchart depicting an example method of creation and sharing of the sub-content, in accordance with an example embodiment
  • FIGURE 7 is a flowchart depicting an example method of playback of the sub-content, in accordance with an example embodiment
  • FIGURE 8 illustrates an apparatus for creating a sub-content within a VR content, sharing the sub-content and/or performing playback of the sub-content, in accordance with some example embodiments
  • FIGURE 9 is a device, in accordance with another example embodiment.
  • Example embodiments and their potential effects are understood by referring to FIGURES 1 through 9 of the drawings.
  • FIGURE 1 illustrates an example setup 100 where at least some example embodiments of the present disclosure can be implemented. It should be understood, however, that the setup 100 as illustrated and hereinafter described is merely illustrative of an arrangement for describing some example embodiments, and therefore, should not be taken to limit the scope of the embodiments. As such, it should be appreciated that at least some of the components described below in connection with the setup 100 may be optional and thus in an example embodiment may include more, fewer or different components than those described in connection with the example embodiment of FIGURE 1.
  • the setup 100 includes a virtual reality (VR) content server 105 that stores a plurality of VR content.
  • VR content includes any electronic content that can be simulated by audio-visual means and/or means providing other special effects to create a 3-D display and other special effects of a virtual environment.
  • the displayed virtual environment may be an altered, a substituted or an augmented display that virtually surrounds a VR user who accesses the VR content on a suitable VR display using VR playback accessories.
  • While accessing the VR content, the VR user may look in all directions, for example in 360-degree directions in the scene, and the scene may accordingly be displayed to the VR user.
  • the scenes are 360 degrees in azimuth and 180 degrees in elevation, and the VR content is meant to create a virtual display in 360-degree directions; however, various embodiments disclosed herein may also be applied even when the VR content is created for enabling a less than 360-degree view of the scene, for example, a 270-degree display of a scene with respect to a reference point.
  • Examples of the VR content may include, but are not limited to, VR video content, a VR video game, a VR animation, or any form of VR graphical content with or without additional special effects.
  • the VR content server 105 may be a public VR content store where people can upload and/or access the VR content.
  • the VR content server 105 may be a subscription based server or a private server that can be accessed by only a few authorized VR users.
  • the setup 100 includes a VR content processing apparatus 110 that is configured to access the VR content received from the VR content server 105 via a communication network 115.
  • the communication network 115 may be a centralized network or may comprise a plurality of sub-networks that may offer a direct communication between the entities or may offer indirect communication between the entities. Examples of the communication network 115 include wireless networks, wired networks, and combinations thereof. Some non-exhaustive examples of wireless networks may include wireless local area networks (WLANs), Bluetooth® or Zigbee® networks, cellular networks and the like.
  • Examples of wired networks may include local area networks (LANs), Ethernet, fiber optic networks and the like.
  • An example of a combination of wired networks and wireless networks may include the Internet.
  • the VR content processing apparatus (hereinafter 'apparatus 110') is configured to process the VR content for a display to a user 120.
  • the apparatus 110 may include one or more processing systems for example a processor 125 and a memory 130, where the memory 130 stores executable instructions that are executed by the processor 125 to process the VR content.
  • An example configuration of the apparatus 110 is also provided with reference to FIGURE 8.
  • the apparatus 110 is configured to decode the VR content and provide the decoded VR content to the user 120, where the user 120 can access the decoded VR content via a VR kit 135 associated with the user 120.
  • the VR kit 135 includes a VR headset 140, a console 145 and one or more inertial measuring units (IMUs) for example an IMU 150.
  • the IMU 150 may non-exhaustively include an accelerometer, a gyroscope, a magnetometer etc.
  • the VR kit 135 may be a part of the apparatus 110.
  • the user 120 may suitably wear the VR headset 140 to view the VR content fed by the apparatus 110.
  • the user 120 can turn his head in various directions to effect a dynamic change in virtual scene associated with the VR content displayed in a display of the VR headset 140.
  • the console 145 may include any equipment that is used to provide user movement related inputs in any interactive VR content such as in a VR video game.
  • the console 145 may also be an example of a device that can be used for navigating through the VR content thereby enabling the user 120 to access the VR content.
  • the IMU 150 comprises one or more inertial sensors that are configured to detect a head movement pattern of the user 120.
  • the IMU 150 may also include inertial sensors to detect movement of one or more other limbs or body parts of the user 120, for example hand or foot movement, and the like.
  • outputs of the inertial sensors in the IMU 150 can be fused using a sensor fusion algorithm, and the fused output can be used for accurate estimation of the degree of head, hand or limb movement of the user 120.
  • the IMU 150 may also be used for tracking movement of a device that can be used for enabling the user 120 to navigate through the VR content (i.e. perform the same functions as performed by the VR headset 140) while the user 120 is accessing the VR content.
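  • The disclosure does not mandate any particular fusion algorithm; purely as an illustrative sketch, the Python snippet below shows one common possibility, a complementary filter that blends integrated gyroscope rates with an accelerometer tilt estimate to track head pitch and roll. The function name, axis conventions and blend factor are assumptions made for the example, not part of the disclosure.

```python
import math

def complementary_filter(pitch_deg, roll_deg, gyro_dps, accel_g, dt, alpha=0.98):
    """Illustrative sensor-fusion step: blend gyroscope integration (fast but
    drifting) with an accelerometer tilt estimate (noisy but drift-free)."""
    gx, gy, gz = gyro_dps          # angular rates in degrees/second
    ax, ay, az = accel_g           # acceleration in g, gravity included

    # Propagate the previous estimate with the gyroscope rates.
    pitch_gyro = pitch_deg + gx * dt
    roll_gyro = roll_deg + gy * dt

    # Estimate tilt from gravity as seen by the accelerometer.
    pitch_acc = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    roll_acc = math.degrees(math.atan2(-ax, az))

    # Weighted blend: high-pass the gyro path, low-pass the accelerometer path.
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    return pitch, roll
```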
  • the user 120 while accessing the VR content, can generate a media file associated with a sub-content within the VR content and also share the media file with other VR content users.
  • the sub-content refers to at least a portion of the VR content that is viewed by the user 120 during any time interval of his/her access of the VR content.
  • the setup 100 also includes a database 155 that may be used to temporarily or permanently store the media file associated with the sub-content.
  • the media file is generated such that the shared media file can be played back by another VR content user (see, user 160) using associated VR devices such as a VR kit 165.
  • the VR content user 160 can access the media file shared by the user 120 via the communication network 115.
  • a VR content processing apparatus (e.g., similar to the apparatus 110) may be associated with the VR content user 160 for enabling the VR content user 160 to access the sub-content constructed from the shared media file on the VR kit 165.
  • FIGURES 2A, 2B, 2C and 2D illustrate an example representation of accessing the VR content and sharing of a sub-content of the VR content, in accordance with an example embodiment.
  • the user 120 accesses the VR content, for example, by using the VR headset 140. It is noted that as the user turns or moves his head, the orientation of the VR headset 140 also changes, and in turn the scene displayed on the VR headset 140 also changes in a dynamic manner. In an example, as the VR content is created by capturing a 3-D representation of the scene in all 360 degrees of azimuth and 180 degrees of elevation, specific VR content is displayed on the VR headset 140 based on any head movement pattern of the user 120. In another example embodiment, the user 120 may also access the VR content using other forms of user input. For example, scenes of the VR content may also be changed by changing a user input which may or may not include the head movement.
  • the user input may be in the form of an input provided from a device configured to enable the user 120 to navigate through the VR content.
  • Examples of such a device include a console (e.g., the console 145), a joystick or a trackball, such that the movement of the device is linked to the apparatus 110 and/or the VR headset 140, and any movement of the device while accessing the VR content results in dynamic scene changes (i.e. a navigation through the VR content) in the VR content, and such dynamic scene changes are displayed on the VR headset 140.
  • Some example representations of the display of the VR content on the VR headset 140 while the user 120 accesses the VR content are shown in the form of displays 205, 210, 215 and 220 at different timestamps in FIGURES 2A to 2D, respectively.
  • displays 205, 210, 215 and 220 are shown as coming out from the VR headset 140 only for the example representation purposes and it should be understood that such displays are only on the VR headset 140.
  • only a limited view of a scene (of a few degrees only) of the VR content is shown for example purposes to facilitate the present description, and it should be understood that the user 120 is actually able to see 360-degree views of the scenes in the VR content using the VR headset 140 (a head mounted display).
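  • As an illustration of how a head orientation selects the portion of the scene shown on the VR headset 140, the sketch below maps a yaw/pitch pair onto the pixel of an equirectangular frame around which the displayed view would be centred. The equirectangular layout, the linear mapping and all names are assumptions made for the example; they are not prescribed by the disclosure.

```python
def viewport_center(yaw_deg, pitch_deg, frame_width, frame_height,
                    azimuth_coverage=360.0, elevation_coverage=180.0):
    """Map a head orientation to the pixel around which the displayed view is centred.
    Content covering less than 360 degrees (e.g. 270) can pass a smaller coverage."""
    # Wrap yaw into the covered azimuth and clamp pitch to the covered elevation.
    yaw = yaw_deg % azimuth_coverage
    half_elev = elevation_coverage / 2.0
    pitch = max(-half_elev, min(half_elev, pitch_deg))

    # Linear mapping from angles to pixel coordinates of the frame.
    x = yaw / azimuth_coverage * frame_width
    y = (half_elev - pitch) / elevation_coverage * frame_height
    return x, y

# Example: head turned 90 degrees right and tilted 10 degrees up, 4096x2048 frame.
print(viewport_center(90.0, 10.0, 4096, 2048))  # -> (1024.0, ~910.2)
```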
  • VR content is displayed in the form of the display 205 on the VR headset 140 at a timestamp of 00:12:10 (hour:minute:second) associated with the VR content.
  • the display 205 may have been generated in response to the head position/orientation of the user 120 at the timestamp of 00:12:10 - Δ (delta), where Δ may depend upon the electronic design of the VR headset 140 and other accessories of the VR kit 135.
  • the VR content is displayed in the form of the display 210 on the VR headset 140.
  • If the user 120 decides to create (or mark) a sub-content within the VR content, the user 120 provides a first input indicating a start of the sub-content.
  • the apparatus 110 is caused to determine the start of the sub-content within the VR content in response to the first input received from the user 120.
  • the processor 125 is configured, along with the content of the memory 130, to cause the apparatus 110 to determine the start of the sub-content within the VR content.
  • the first input is in the form of a hand gesture, for example a sign of thumbs up by the user 120.
  • the apparatus 110 may include sensors, for example, image sensors to recognize the hand gesture of the user 120, and determines a user signal meant for the start of the sub-content.
  • the apparatus 110 is caused to continuously store at least head movement information associated with the user 120 from a time of the start of the sub-content while the user 120 is accessing the VR content.
  • the IMU 150 may be associated with the VR headset 140, or even the VR headset 140 may have IMUs to track a pattern of the head movement of the user 120, and the tracked pattern of the head movement is provided to the apparatus 110.
  • the apparatus 110 may also have sensors to track the head movement pattern starting from the time (00:14:13) when the user 120 provided the first input (e.g., hand gesture), and the apparatus 110 is caused to store the information associated with the tracked head movement pattern (i.e. head movement information) in a suitable memory location, such as the database 155 or even in the memory 130.
  • the VR content is displayed in the form of the display 215 on the VR headset 140.
  • the apparatus 110 is caused to continuously store the head movement information until the apparatus 110 determines an end of the sub-content.
  • the apparatus 110 is caused to determine an end of the sub-content within the VR content in response to a second input received by the user 120.
  • the processor 125 is configured, along with the content of the memory 130, to cause the apparatus 110 to determine the end of the sub-content within the VR content.
  • the second input may be the same as the first input (e.g., in the form of the hand gesture, for example a sign of thumbs up by the user 120) or may be a different gesture than that of the gesture used for the first input.
  • the user 120 provides the hand gesture associated with the second input indicating the end of the sub-content.
  • the apparatus 110 is caused to determine the end of the sub-content based on sensing the hand gesture associated with the second input.
  • the apparatus 110 is caused to stop the storage of the head movement information of the user 120 at the timestamp of 00:25:10 (hour:minute:second), and accordingly, the apparatus 110 is caused to store the head movement information associated with the sub-content that spans between the timestamps 00:14:13 and 00:25:10.
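  • A minimal sketch of the record-and-stop behaviour described above is given below: once the first input is detected, head-orientation samples are appended with their content timestamps until the second input is detected (00:14:13 to 00:25:10 in the example of FIGURES 2B and 2D). The helper callables for reading the IMU, detecting gestures and reading the content clock are hypothetical placeholders.

```python
import time

def record_head_movement(read_head_orientation, gesture_detected, content_clock,
                         sample_period=0.02):
    """Illustrative recording loop gated by the first and second inputs.

    read_head_orientation() -> (yaw, pitch, roll) from the IMU        (assumed helper)
    gesture_detected(name)  -> True once the named gesture is sensed  (assumed helper)
    content_clock()         -> current timestamp within the VR content, in seconds
    """
    samples = []

    # Wait for the first input (e.g. a thumbs-up) marking the start of the sub-content.
    while not gesture_detected("start_sub_content"):
        time.sleep(sample_period)
    start_ts = content_clock()

    # Store head movement information until the second input marks the end.
    while not gesture_detected("end_sub_content"):
        samples.append((content_clock(), read_head_orientation()))
        time.sleep(sample_period)
    end_ts = content_clock()

    return {"start": start_ts, "end": end_ts, "head_movement": samples}
```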
  • the first input and the second input can be tracked or sensed by means present in the VR kit 135, or means present outside of the VR kit 135.
  • FIGURES 3A, 3B and 3C illustrate example representations of sensing the first and second inputs provided by the user 120, in accordance with some example embodiments.
  • the first input and the second input for the start and the end of the sub-content, respectively can be determined by means that are part of the VR kit 135, and such determination of the first input and the second input can be provided to the processor 125.
  • the first input and the second input may be provided by a press of a mechanical button 305 configured on the VR headset 140.
  • the inputs for the start and the end of the sub-content may also be provided by the user 120 by selecting an icon displayed on a display surface of VR headset 140.
  • the first input and the second input may be provided by the user 120 by pressing any button or providing a touch input configured in an input means 310 (e.g., button, trackball, touch surface, etc.).
  • Example of the input means 310 may be an input device such as a remote controller device linked with the VR kit 135 and/or the apparatus 110.
  • the first input and the second input may be provided using means provided in a console 315, where the console 315 is used by user 120 for playing VR video games.
  • the apparatus 110 is caused to generate a media file associated with the sub-content.
  • the media file 410 is generated if the VR content is a public VR content.
  • the public VR content may include a VR content stored on a server that can be accessed by users from all across the world, or by a group of users, or may be accessed by users who are subscribers of the server.
  • the media file 410 is generated such that it includes an identifier 412 of the VR content, information 414 associated with the start of sub-content, information 416 associated with end of sub-content and the head movement information 418.
  • Examples of the identifier 412 may include, but are not limited to a web uniform resource locator (URL) of the VR content or may represent a storage path of the VR content that can be used for electronically accessing the VR content from anywhere.
  • the information 414 may include a timestamp of the start of the sub-content (e.g., 00:14:13).
  • the information 414 may also include a starting position of the head (for example in degrees for azimuth and elevation) and a starting position of any other interactive objects (e.g., console) for interacting with the VR content.
  • starting positions of the head and any other interactive objects may be part of the head movement information 418.
  • the information 416 may include a timestamp of the end of the sub-content (e.g., 00:25:10).
  • the head movement information 418 may include an orientation and position of the head of the user 120 at each timestamp between the timestamps 00:14:13 and 00:25:10.
  • the orientation and position of the head of the user 120 may be measured by the IMU 150 with respect to a reference head orientation and position.
  • the reference could be as preset in the VR content, for example a global reference with respect to North-South-East-West and gravity.
  • the head movement information 418 may be a video of head movement of the user 120 between the timestamps 00:14:13 and 00:25:10 of the VR content.
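  • Purely to make the four elements of the media file 410 concrete, the sketch below packages them as JSON. The field names, the JSON encoding and the sample URL are illustrative assumptions; the disclosure does not define a specific file format.

```python
import json

def build_public_media_file(content_url, start_ts, end_ts, head_movement_samples):
    """Assemble an identifier of the public VR content, start information,
    end information and head movement information into one shareable file."""
    media_file = {
        "identifier": content_url,               # e.g. a web URL of the public VR content
        "start": start_ts,                       # e.g. "00:14:13"
        "end": end_ts,                           # e.g. "00:25:10"
        "head_movement": head_movement_samples,  # per-timestamp head orientation samples
    }
    return json.dumps(media_file)

example = build_public_media_file(
    "https://example.com/vr/concert360",         # hypothetical URL for illustration
    "00:14:13",
    "00:25:10",
    [{"t": "00:14:13", "yaw": 12.0, "pitch": -3.5, "roll": 0.4}],
)
```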
  • the apparatus 110 is caused to share the media file 410 with another VR user (e.g., the user 160) as selected by the user 120.
  • the user 160 may have an associated VR content processing apparatus (e.g., the apparatus 110) to decode the media file 410 and reconstruct the sub-content using the identifier 412, the start information 414, the end information 416 and the head movement information 418.
  • the VR content processing apparatus associated with the user 160 accesses the public VR content using the URL contained in the media file 410, and performs a playback of the sub-content within the VR content starting from the timestamp 00:14:13 associated with the start information 414 until the timestamp 00:25:10 associated with the end information 416.
  • playback of the sub-content includes reconstructing the sub-content based on the head movement information 418 between the timestamps 00:14:13 and 00:25:10.
  • the media file 420 is generated if the VR content is a private VR content.
  • the private VR content may include any VR content that is either locally stored with the user 120 or stored at a location that is not accessible for those users to whom the user 120 wants to share the sub-content.
  • the media file 420 includes the sub-content 422 and a head movement information 424.
  • the user 160 can access the sub-content based on the media file 420 using a VR content processing apparatus (e.g., the apparatus 110) and a VR kit. Since the sub-content 422 is already present in the media file 420, the sub-content 422 can be directly played back on a VR headset associated with the user 160. In some scenarios, the sub-content 422 may also be played back with additional effects generated based on the head movement information 424. Examples of the additional effects may include special sound effects, controlled sound effects (right or left speakers), or any special effect such as optical or mechanical effects that can be created based on the head movement information.
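  • For private VR content the media file 420 carries the viewed sub-content itself instead of an identifier. A small sketch of that variant follows; the base64/JSON packaging and the field names are again assumptions for illustration only.

```python
import base64
import json

def build_private_media_file(sub_content_bytes, head_movement_samples):
    """Media file 420 variant: embed the rendered sub-content in the file,
    optionally together with the head movement information used for effects."""
    return json.dumps({
        "sub_content": base64.b64encode(sub_content_bytes).decode("ascii"),
        "head_movement": head_movement_samples,
    })
```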
  • the VR content is a VR video game
  • the media file 430 includes an identifier 432 of the VR video game, information 434 associated with the start of the sub-content, information 436 associated with the end of the sub-content, head movement information 438 of the user 120 while playing the VR video game, and other movement information 440 of one or more limbs (e.g. hands, legs of the user) of the user 120.
  • the start information 434 may include all the associated movements that the VR user did to arrive at the point of start of the sub-content. In an example, all the associated movements up to the start of the sub-content may be included so that, at the playback side, the VR kit can reconstruct the whole VR video game until the start position.
  • the user 160 can access the sub-content based on the media file 430 using a VR content processing apparatus (e.g., the apparatus 110) and a VR kit.
  • the user 160 using the VR content processing apparatus (e.g., the apparatus 110) reconstructs the sub-content (a part of the VR video game) using the identifier 432, the start information 434, the end information 436, the head movement information 438 and the other movement information 440.
  • the user 160 is able to see the sub-content (clip) of the VR video game as was played by the user 120, where scenes associated with the VR video game along with the head movement and hands and/or legs movements of the user 120, are displayed.
  • the user 160 may have an option to continue playing the VR video game starting from the end of the sub-content in the VR video game.
  • the user 160 can also start playing from the start of the sub-content, wherein the VR kit will reconstruct the VR video game until the start of the sub-content and then the user 160 will start playing the game.
  • the user 160 may continue playing from the timestamp 00:25:10.
  • a media file 450 as shown in FIGURE 4D may be stored and shared with the user 160.
  • the media file 450 includes the sub-content 452 along with head movement information 454 and other movement information 456 associated with one or more limbs of the user 120.
  • the VR content may be a public VR content, a private VR content, a VR video game or any other VR content.
  • the media file 460 includes an identifier 462 of the VR content, information 464 associated with the start of the sub-content, information 466 associated with the end of the sub-content, and a user input pattern 468 of the user 120 while accessing the VR content.
  • An example of the user input pattern comprises head movement information.
  • Another example of the user input pattern comprises movement information of one or more limbs of the user that may be used for navigating through the VR content, or for providing user inputs while playing a VR video game.
  • Another example of the user input pattern comprises movement information of at least one device configured to enable the user to navigate through the VR content.
  • the at least one device comprises one or more of a joystick, a trackball, and a console.
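  • One way to represent the user input pattern 468 of the media file 460 is as a timestamped stream of typed samples, so that head movement, limb movement and device (e.g. joystick or trackball) movement can all be replayed from the same structure. The sketch below shows such a representation; the class and field names are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InputSample:
    timestamp: str             # timestamp within the VR content, e.g. "00:14:20"
    source: str                # "head", "left_hand", "right_leg", "joystick", "trackball", ...
    value: Tuple[float, ...]   # orientation or displacement reported by that source

@dataclass
class MediaFile460:
    identifier: str            # identifier 462 of the VR content
    start: str                 # information 464 (start of the sub-content)
    end: str                   # information 466 (end of the sub-content)
    user_input_pattern: List[InputSample] = field(default_factory=list)  # pattern 468
```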
  • FIGURE 5 is a flowchart depicting an example method 500, in accordance with an example embodiment. The method 500 is shown and explained with reference to FIGURES 1 to 4A-4E. The method 500 depicted in the flowchart may be executed by, for example, the apparatus 110 of FIGURE 1 or by a combination of the apparatus 110 and other components described in the setup 100.
  • the method 500 includes determining, by a processor (e.g., the processor 125), a start of a sub-content within a VR content in response to a first input received from a user while accessing the VR content.
  • the user may provide the first input indicating the start of the sub-content by suitable means such as hand gesture/head movement gesture, a button press input or a touch input.
  • the first input may also be provided by a voice input, a text input indicating a timestamp of start of the sub-content, etc.
  • the method 500 includes storing at least head movement information associated with the user from a time of the start of the sub-content while the VR content is being accessed by the user.
  • storing the head movement information is a continuous process, and specifically a pattern of the head movement of the user is stored while the user accesses the VR content.
  • the method 500 may include storing continuously, a user input pattern, where the user input pattern is utilized for the navigation through the VR content while accessing the VR content.
  • the method 500 includes determining, by the processor (e.g., the processor 125), an end of the sub-content within the VR content in response to a second input received from the user while accessing the VR content.
  • the user may provide the second input indicating the end of the sub-content by suitable means such as hand gesture/head movement gesture, a button press input or a touch input.
  • the second input may also be provided by a voice input, a text input indicating a timestamp of the end of the sub-content, etc.
  • the method 500 includes stopping the storage of the head movement information, at 520.
  • FIGURE 6 is a flowchart depicting an example method 600 of creating and sharing a sub-content of a VR content with other VR content users, in accordance with an example embodiment.
  • the method 600 is shown and explained with reference to FIGURES 1 to 4A-4E.
  • the method 600 depicted in the flowchart may be executed by, for example, the apparatus 110 of FIGURE 1 or by a combination of the apparatus 110 and other components described in the setup 100.
  • the user accesses a VR content.
  • the method 600 checks for any first input from the user. If the first input (e.g., gesture, voice input, touch input, etc.) is received from the user, then at 615, the method 600 determines a start of a sub-content within the VR content, which the user wants to create and/or share with one or more other VR content users, in response to the first input received from the user while the user is accessing the VR content (e.g., watching the VR content using a VR headset).
  • the method 600 includes storing at least head movement information associated with the user from a time of the start of the sub-content while the VR content is being accessed by the user.
  • storing the head movement information is a continuous process, and specifically a pattern of the head movement of the user while the user is accessing the VR content, is stored.
  • the method 600 may also include storing movement information of one or more limbs of the user, if the VR content is a VR video game. In some cases, the movement information of the one or more limbs (hands and/or legs) of the user may also be stored even if the VR content is not a VR video game.
  • the method 600 includes checking if there is any second input from the user.
  • If the second input is received from the user, then at 630, the method 600 determines the end of the sub-content within the VR content, which the user wants to create and/or share with one or more other users, in response to the second input received from the user while the user is accessing the VR content.
  • the method 600 includes generating a media file associated with the sub-content.
  • Some examples of generation of the media file are explained with reference to FIGURES 4A-4E. It is appreciated that the primary consideration while generating the media file is that the sub-content can be reproduced from the media file at the VR headsets of users with whom the media file is shared. Hence, if the VR content is a public VR content, only a link (identifier) to the VR content, along with the start and end information of the sub-content within the VR content and the head movement information, is stored in the media file.
  • the media file may include the sub-content (the view of the VR content that the user has seen between a timestamp associated with the start of the sub-content and a timestamp associated with the end of the sub-content) and may optionally include the head movement information.
  • the method 600 checks if a request for sharing the media file with one or more other VR content users (e.g., the user 160) is received from the user. If the request is received from the user, at 645, the method 600 shares the media file with the one or more VR content users via a communication network or through any other suitable means.
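  • A compact sketch tying the generation and sharing steps of method 600 together is given below: the appropriate media file form is built (link-based for public content, self-contained for private content, reusing the helpers sketched earlier) and then shared on request. The attribute and helper names (is_public, render_clip, send_to_users) are hypothetical.

```python
def create_and_share_sub_content(vr_content, recording, recipients, send_to_users):
    """Illustrative composition of the media-file generation and sharing operations.

    vr_content    : object exposing `is_public`, `url` and `render_clip`  (assumed shape)
    recording     : dict from the recording loop (start, end, head_movement)
    recipients    : VR content users selected by the sharing user
    send_to_users : assumed transport helper, e.g. an upload over the network
    """
    if vr_content.is_public:
        # Public content: store only the link plus start/end and head movement info.
        media_file = build_public_media_file(
            vr_content.url, recording["start"], recording["end"],
            recording["head_movement"])
    else:
        # Private content: embed the viewed sub-content itself in the media file.
        clip = vr_content.render_clip(recording["start"], recording["end"],
                                      recording["head_movement"])
        media_file = build_private_media_file(clip, recording["head_movement"])

    if recipients:
        send_to_users(media_file, recipients)
    return media_file
```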
  • FIGURE 7 is a flowchart depicting an example method 700 of playback of the sub-content, in accordance with an example embodiment.
  • the method 700 is shown and explained with reference to FIGURES 1 to 6.
  • the method 700 depicted in the flowchart may be executed by, for example, the apparatus 110 of FIGURE 1 or by a combination of the apparatus 110 and other components (e.g., VR kit of the user 160) described in the setup 100.
  • the method 700 includes receiving a media file associated with a sub-content of a VR content.
  • the media file is generated at operation 635 and is shared at operation 645, in an example embodiment.
  • the method 700 includes decoding the media file to access the identifier of the VR content, the information associated with the start of the sub-content, the information associated with the end of the sub-content and the head movement information contained in the media file. It is noted that if the VR content is a private VR content, the identifier of the VR content is not stored in the media file and instead the sub-content is stored in the media file. Hence, in case of the VR content being the private VR content, the method 700 includes decoding the sub-content along with the head movement information.
  • the method 700 includes locating the VR content based on the identifier of the VR content.
  • the VR content is located, for example, by identifying a source location or access location of the VR content based on the identifier, such as a web link.
  • the method 700 includes reconstructing the sub-content based on the information associated with the start of the sub-content, the information associated with the end of the sub-content and the head movement information.
  • the method 700 includes playing the reconstructed sub-content.
  • playing the reconstructed sub-content includes displaying the sub-content on a VR headset associated with the user with whom the media file is shared, and also includes creating sound effects or other special effects associated with the reconstructed VR content.
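  • A minimal sketch of the playback path of method 700 for a public-content media file follows: decode the file, locate the VR content from its identifier, then reconstruct each displayed view from the stored head movement samples. The fetch_vr_content and render_view helpers are hypothetical placeholders for locating the content and driving the VR headset.

```python
import json

def playback_sub_content(media_file_json, fetch_vr_content, render_view):
    """Illustrative playback of a public-content media file.

    fetch_vr_content(identifier) -> frames addressable by timestamp           (assumed)
    render_view(frame, yaw, pitch, roll) -> shows that viewport on the headset (assumed)
    """
    media_file = json.loads(media_file_json)                  # decode the media file
    vr_content = fetch_vr_content(media_file["identifier"])   # locate the VR content

    # Reconstruct the sub-content: replay each stored head pose against the content.
    for sample in media_file["head_movement"]:
        frame = vr_content[sample["t"]]
        render_view(frame, sample["yaw"], sample["pitch"], sample["roll"])
```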
  • the methods depicted in these flowcharts may be executed by, for example, the apparatus 110 of FIGURE 1 or apparatus 800 described with reference to FIGURE 8.
  • Operations of the flowcharts, and combinations of operations in the flowcharts, may be implemented by various means, such as hardware, firmware, processor, circuitry and/or other devices associated with execution of software including one or more computer program instructions.
  • one or more of the procedures described in various embodiments may be embodied by computer program instructions.
  • the computer program instructions, which embody the procedures, described in various embodiments may be stored by at least one memory device of an apparatus and executed by at least one processor in the apparatus.
  • Any such computer program instructions may be loaded onto a computer or other programmable apparatus (for example, hardware) to produce a machine, such that the resulting computer or other programmable apparatus embody means for implementing the operations specified in the flowchart.
  • These computer program instructions may also be stored in a computer-readable storage memory (as opposed to a transmission medium such as a carrier wave or electromagnetic signal) that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the operations specified in the flowchart.
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions, which execute on the computer or other programmable apparatus, provide operations for implementing the operations in the flowchart.
  • the operations of the methods are described with help of apparatus 110. However, the operations of the methods can be described and/or practiced by using any other apparatus.
  • FIGURE 8 illustrates an apparatus 800 for creating a sub-content within a VR content, sharing the sub-content and/or performing playback of the sub-content, in accordance with some example embodiments.
  • the apparatus 800 may be employed, for example, in one or more devices such as the apparatus 110 of FIGURE 1 or a device 900 of FIGURE 9.
  • the apparatus 800 may also be employed on a variety of other devices both mobile and fixed, and therefore, embodiments should not be limited to application on devices such as the apparatus 110 of FIGURE 1.
  • the apparatus 800 may be embodied in the form of a VR system comprising a plurality of components that assist a user in viewing the VR content and creating a sub-content within the VR content by providing various types of inputs such as gesture input, voice based input, touch inputs or inputs such as press of a button.
  • Various embodiments of the apparatus 800 may be embodied wholly at a single device, or in a combination of devices.
  • the devices or elements described below may not be mandatory and thus some may be omitted in certain embodiments.
  • the apparatus 800 includes or otherwise is in communication with at least one processor 802 and at least one memory 804.
  • Examples of the at least one memory 804 include, but are not limited to, volatile and/or non-volatile memories.
  • Examples of the volatile memory include, but are not limited to, random access memory, dynamic random access memory, static random access memory, and the like.
  • Examples of the non-volatile memory include, but are not limited to, hard disks, magnetic tapes, optical disks, programmable read only memory, erasable programmable read only memory, electrically erasable programmable read only memory, flash memory, phase-change memory and the like.
  • the memory 804 may be configured to store information, data, applications, instructions or the like for enabling the apparatus 800 to carry out various functions in accordance with various example embodiments.
  • the memory 804 may be configured to store image processing instructions and other instructions for determining the user inputs, processing of the VR content and generation of the media file, by the processor 802.
  • the processor 802 may include the processor 125.
  • the processor 802 may be embodied in a number of different ways.
  • the processor 802 may be embodied as a multi-core processor, a single-core processor, or a combination of multi-core processors and single-core processors.
  • the processor 802 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the multi-core processor may be configured to execute instructions stored in the memory 804 or otherwise accessible to the processor 802.
  • the processor 802 may be configured to execute hard coded functionality.
  • the processor 802 may represent an entity, for example, physically embodied in circuitry, capable of performing operations according to various embodiments while configured accordingly.
  • the processor 802 may be specifically configured hardware for conducting the operations described herein.
  • the instructions may specifically configure the processor 802 to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor 802 may be a processor of a specific device, for example, a mobile terminal or network device adapted for employing embodiments by further configuration of the processor 802 by instructions for performing the algorithms and/or operations described herein.
  • the processor 802 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 802.
  • a user interface 806 may be in communication with the processor 802.
  • Examples of the user interface 806 include, but are not limited to, an input interface and/or an output interface.
  • the input interface is configured to receive an indication of a user input.
  • the output user interface provides an audible, visual, mechanical or other output and/or feedback to the user.
  • Examples of the input interface may include, but are not limited to, a keyboard, a mouse, a joystick, a keypad, a touch screen, soft keys, and the like.
  • Examples of the output interface may include, but are not limited to, a display such as a light emitting diode display, a thin-film transistor (TFT) display, liquid crystal displays, an active-matrix organic light-emitting diode (AMOLED) display, a microphone, a speaker, ringers, vibrators, and the like.
  • the user interface 806 may include, among other devices or elements, any or all of a speaker, a microphone, a display, and a keyboard, touch screen, or the like.
  • the processor 802 may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface 806, such as, for example, a speaker, ringer, microphone, display, and/or the like.
  • the processor 802 and/or user interface circuitry comprising the processor 802 may be configured to control one or more functions of one or more elements of the user interface 806 through computer program instructions, for example, software and/or firmware, stored on a memory, for example, the at least one memory 804, and/or the like, accessible to the processor 802.
  • the apparatus 800 may include an electronic device.
  • Examples of the electronic device include a communication device, a VR content playback system, a media capturing device with or without communication capabilities, computing devices, and the like.
  • Some examples of the electronic device may include a mobile phone, a personal digital assistant (PDA), and the like.
  • Examples of the computing device may include a laptop, a personal computer, and the like.
  • the electronic device may include a user interface, for example, the user interface 806, having user interface circuitry and user interface software configured to facilitate a user to control at least one function of the electronic device through use of a display and further configured to respond to user inputs.
  • the electronic device may include a display circuitry configured to display at least a portion of the user interface 806 of the electronic device. The display and display circuitry may be configured to facilitate the user to control at least one function of the electronic device.
  • the apparatus 800 may include a VR headset 808 and one or more inertial measuring units (IMUs) such as an IMU 810 and other sensors 812.
  • An example of the VR headset 808 is the VR headset 140 described with reference to FIGURE 1.
  • An example of the IMU 810 may include the IMU 150 described with reference to FIGURE 1.
  • the VR headset 808 may be embodied as a head mounted display for displaying the VR content.
  • the head mounted display includes a display with 360 degree 3-D view of the video content being displayed on the head mounted display in response to the head movement of the user.
  • the head mounted display may be in communication with the processor 802 and/or other components of the apparatus 800 to display virtual reality content (e.g., VR video, VR video game) to the user.
  • examples of the sensors 812 may include devices that can detect the first and second inputs received from the user indicating the start and end of the sub-content within the VR content while the user is accessing the VR content.
  • the sensor 812 may be an image capturing device that can interpret the hand gesture inputs provided by the user, and can provide the receipt of such inputs to the processor 802.
  • the sensor 812 may be a voice sensor such as a microphone.
  • the centralized circuit system 814 may be various devices configured to, among other things, provide or enable communication between the components (802-812) of the apparatus 800.
  • the centralized circuit system 814 may be a central printed circuit board (PCB) such as a motherboard, a main board, a system board, or a logic board.
  • the centralized circuit system 814 may also, or alternatively, include other printed circuit assemblies (PCAs) or communication channel media.
  • the apparatus 800 is caused, by the processor 802 along with the content of memory 804 and other components of the apparatus 800, to perform the functions described in various FIGURES 1 to 7.
  • the apparatus 800 is caused to determine the start of the sub-content within the VR content, start storing the head movement information (and optionally other movement information of one or more limbs of the user) associated with the user once the start of the sub-content is determined.
  • the apparatus 800 is further caused to determine the second input indicating the end of the sub-content and upon determining the end of the sub- content, the apparatus 800 is caused to stop storing the head movement information and other movement information.
  • the apparatus 800 is caused to generate the media file associated with the sub-content as described with reference to FIGURES 4A-4E, and is caused to share the media file with other VR content users once a request for sharing the media file is received from the user.
  • the apparatus 800 is also configured to perform playback of the sub-content based on decoding the media file and reconstructing the sub- content as viewed by the user who has shared the media file.
  • FIGURE 9 illustrates a device 900, in accordance with an example embodiment. It should be understood, however, that the device 900 as illustrated and hereinafter described is merely illustrative of one type of device that may benefit from various embodiments and, therefore, should not be taken to limit the scope of the embodiments. As such, it should be appreciated that at least some of the components described below in connection with the device 900 may be optional and thus in an example embodiment may include more, fewer or different components than those described in connection with the example embodiment of FIGURE 9.
  • the device 900 could be any of a number of types of touch screen based mobile electronic devices, for example, a VR content playback system, portable digital assistants (PDAs), mobile televisions, gaming devices, cellular phones, all types of computers (for example, laptops, mobile computers or desktops), cameras, mobile digital assistants, or any combination of the aforementioned, and other types of communications devices.
  • the device 900 may include an antenna 902 (or multiple antennas) in operable communication with a transmitter 904 and a receiver 906.
  • the device 900 may further include an apparatus, such as a controller 908 or other processing devices that provides signals to and receives signals from the transmitter 904 and the receiver 906, respectively.
  • the signals may include signaling information in accordance with the air interface standard of the applicable cellular system, and/or may also include data corresponding to user speech, received data and/or user generated data.
  • the device 900 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types.
  • the device 900 may be capable of operating in accordance with any of a number of first, second, third and/or fourth-generation communication protocols or the like.
  • the device 900 may be capable of operating in accordance with second-generation (2G) wireless communication protocols such as IS-136 (time division multiple access (TDMA)), GSM (global system for mobile communication), and IS-95 (code division multiple access (CDMA)), or with third-generation (3G) wireless communication protocols, such as Universal Mobile Telecommunications System (UMTS), CDMA2000, wideband CDMA (WCDMA) and time division-synchronous CDMA (TD-SCDMA), with 3.9G wireless communication protocols such as evolved universal terrestrial radio access network (E-UTRAN), with fourth-generation (4G) wireless communication protocols, or the like.
  • computer networks such as the Internet, local area networks, wide area networks, and the like; short range wireless communication networks such as Bluetooth® networks, Zigbee® networks, Institute of Electrical and Electronics Engineers (IEEE) 802.11x networks, and the like; and wireline telecommunication networks such as the public switched telephone network (PSTN).
  • the controller 908 may include circuitry implementing, among others, audio and logic functions of the device 900.
  • the controller 908 may include, but is not limited to, one or more digital signal processor devices, one or more microprocessor devices, one or more processor(s) with accompanying digital signal processor(s), one or more processor(s) without accompanying digital signal processor(s), one or more special-purpose computer chips, one or more field-programmable gate arrays (FPGAs), one or more controllers, one or more application-specific integrated circuits (ASICs), one or more computer(s), various analog-to-digital converters, digital-to-analog converters, and/or other support circuits. Control and signal processing functions of the device 900 are allocated between these devices according to their respective capabilities.
  • the controller 908 thus may also include the functionality to convolutionally encode and interleave messages and data prior to modulation and transmission.
  • the controller 908 may additionally include an internal voice coder, and may include an internal data modem.
  • the controller 908 may include functionality to operate one or more software programs, which may be stored in a memory.
  • the controller 908 may be capable of operating a connectivity program, such as a conventional web browser.
  • the connectivity program may then allow the device 900 to transmit and receive web content, such as location-based content and/or other web page content, according to a Wireless Application Protocol (WAP), Hypertext Transfer Protocol (HTTP) and/or the like.
  • the controller 908 may be embodied as a multi-core processor such as a dual or quad core processor. However, any number of processors may be included in the controller 908.
  • the device 900 may also comprise a user interface including an output device such as a ringer 910, an earphone or speaker 912, a microphone 914, a display 916, and a user input interface, which may be coupled to the controller 908.
  • the user input interface, which allows the device 900 to receive data, may include any of a number of devices, such as a keypad 918, a touch display, a microphone or other input devices.
  • the keypad 918 may include numeric (0-9) and related keys (#, *), and other hard and soft keys used for operating the device 900.
  • the keypad 918 may include a conventional QWERTY keypad arrangement.
  • the keypad 918 may also include various soft keys with associated functions.
  • the device 900 may include an interface device such as a joystick or other user input interface.
  • the device 900 further includes a battery 920, such as a vibrating battery pack, for powering various circuits that are used to operate the device 900, as well as optionally providing mechanical vibration as a detectable output.
  • the device 900 includes a media capturing element, such as a camera, video and/or audio module, in communication with the controller 908.
  • the media capturing element may be any means for capturing an image, video and/or audio for storage, display or transmission.
  • in an example embodiment, the media capturing element is a camera module 922, which may include a digital camera capable of forming a digital image file from a captured image.
  • the camera module 922 includes all hardware, such as a lens or other optical component(s), and software for creating a digital image file from a captured image.
  • the camera module 922 may include the hardware needed to view an image, while a memory device of the device 900 stores instructions for execution by the controller 908 in the form of software to create a digital image file from a captured image.
  • the camera module 922 may further include a processing element such as a co-processor, which assists the controller 908 in processing image data and an encoder and/or a decoder for compressing and/or decompressing image data.
  • the encoder and/or the decoder may encode and/or decode according to a JPEG standard format or another like format.
  • the encoder and/or the decoder may employ any of a plurality of standard formats such as, for example, standards associated with H.261, H.262/ MPEG-2, H.263, H.264, H.264/MPEG-4, MPEG-4, and the like.
  • the camera module 922 may provide live image data to the display 916.
  • the display 916 may be located on one side of the device 900 and the camera module 922 may include a lens positioned on the opposite side of the device 900 with respect to the display 916 to enable the camera module 922 to capture images on one side of the device 900 and present a view of such images to the user positioned on the other side of the device 900.
  • the device 900 may further include a user identity module (UIM) 924.
  • the UIM 924 may be a memory device having a processor built in.
  • the UIM 924 may include, for example, a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), a removable user identity module (R-UIM), or any other smart card.
  • the UIM 924 typically stores information elements related to a mobile subscriber.
  • the device 900 may be equipped with memory.
  • the device 900 may include a volatile memory 926, such as volatile random access memory (RAM) including a cache area for the temporary storage of data.
  • the device 900 may also include other non-volatile memory 928, which may be embedded and/or may be removable.
  • the non-volatile memory 928 may additionally or alternatively comprise an electrically erasable programmable read only memory (EEPROM), flash memory, hard drive, or the like.
  • the memories may store any number of pieces of information, and data, used by the device 900 to implement the functions of the device 900.
  • a technical effect of one or more of the example embodiments disclosed herein is to provide users with an option of creating a sub-content within a VR content while accessing the VR content, and sharing the sub-content with other users in their network. In this manner, the users are able to share with other users exactly the sequence of scenes that they saw while accessing the VR content.
  • Various embodiments offer techniques to create a media file associated with the sub-content such that less space is required for storing the media file, and the shared media file can be played on the VR kit of other VR content users, who are then able to see the sub-content that was actually seen by the user who shared the media file with them (a hypothetical replay sketch is given at the end of this description).
  • Various embodiments provide multiple techniques by which the sub-content can be created while the user is accessing the VR content, which adds to the flexibility offered to the user accessing the VR content. A minimal capture sketch, under stated assumptions, follows below.
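  • For illustration only, the following Python sketch outlines one way such capture could work: the first user input marks the start of the sub-content, head-orientation samples are stored while the user keeps watching, the second input marks the end, and the result is serialized into a compact media file that references the VR content instead of re-encoding video frames. The class and method names (SubContentRecorder, on_first_input, and so on) are hypothetical and are not taken from the disclosure.

```python
# Illustrative sketch only -- not the disclosed implementation.
# All names used here are hypothetical.
import json
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Quaternion = Tuple[float, float, float, float]  # one head-orientation sample


@dataclass
class SubContentRecorder:
    content_id: str                                # identifies the full VR content being played
    start_time: Optional[float] = None             # playback time at the first user input
    end_time: Optional[float] = None               # playback time at the second user input
    head_track: List[Tuple[float, Quaternion]] = field(default_factory=list)

    def on_first_input(self, playback_time: float) -> None:
        """First user input (button, gesture, voice, ...) marks the start of the sub-content."""
        self.start_time = playback_time
        self.end_time = None
        self.head_track.clear()

    def on_head_pose(self, playback_time: float, orientation: Quaternion) -> None:
        """Head-movement information is stored only between the first and second inputs."""
        if self.start_time is not None and self.end_time is None:
            self.head_track.append((playback_time, orientation))

    def on_second_input(self, playback_time: float) -> None:
        """Second user input marks the end; storing of head movement stops here."""
        self.end_time = playback_time

    def to_media_file(self) -> bytes:
        """Compact media file: content reference, time range and head-orientation track.
        No rendered video frames are stored, so the file stays small."""
        return json.dumps({
            "content_id": self.content_id,
            "start": self.start_time,
            "end": self.end_time,
            "head_track": self.head_track,
        }).encode("utf-8")
```

  • In use, a VR playback loop would feed on_head_pose() with timestamped orientation samples from the VR kit's inertial sensors while the user watches, and call to_media_file() once the second input is received; the resulting file could then be shared with other users in the network.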
  • Various embodiments described above may be implemented in software, hardware, application logic or a combination of software, hardware and application logic.
  • the software, application logic and/or hardware may reside on at least one memory, at least one processor, an apparatus or a computer program product.
  • the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media.
  • a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of an apparatus described and depicted in FIGURES 1, 8 and/or 9.
  • a computer- readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
  • the embodiments illustrated and described herein, as well as embodiments not specifically described herein but within the scope of aspects of the present disclosure, constitute example apparatus means for determining a start of a sub-content within a VR content in response to a first input received from a user while accessing the VR content.
  • Such means may include any of, or combinations of, processors, memory, inertial measurement units (IMUs), other sensors such as a camera or a microphone, and a VR kit.
  • the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined.
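  • For illustration only, the following Python sketch shows how a shared media file of the kind sketched above could be replayed on a recipient's VR kit: it opens the same VR content, seeks to the recorded start time, and drives the rendered viewport from the stored head-orientation track rather than from the recipient's own head movement, so the recipient sees exactly the sequence of scenes that the sharing user saw. VRPlayer and its methods are hypothetical placeholders for whatever playback API a receiving VR kit exposes.

```python
# Illustrative sketch only -- VRPlayer, load_content, seek, wait_until,
# set_view_orientation and pause_at are hypothetical placeholders.
import json


def replay_sub_content(media_file: bytes, player: "VRPlayer") -> None:
    record = json.loads(media_file.decode("utf-8"))

    # Open the same full VR content that the sharing user was watching.
    player.load_content(record["content_id"])
    player.seek(record["start"])

    # Steer the rendered viewport from the recorded head-orientation track,
    # instead of the recipient's own head movement, so the recipient sees
    # exactly the scene sequence the sharer saw.
    for playback_time, orientation in record["head_track"]:
        player.wait_until(playback_time)
        player.set_view_orientation(orientation)

    player.pause_at(record["end"])
```

  • Because only the content reference, the time range and the orientation track are stored, the shared file stays small while still reproducing the sharer's original viewing experience.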

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an example embodiment, the invention relates to a method, an apparatus and a computer program product. The method comprises determining, by a processor, a start of a sub-content within a virtual reality (VR) content in response to a first input received from a user while accessing the VR content. The method further comprises storing at least head-movement information associated with the user from the time of the start of the sub-content while the user accesses the VR content. Furthermore, the method comprises determining, by the processor, an end of the sub-content within the VR content in response to a second input received from the user while accessing the VR content. The storing of the head-movement information is stopped upon determining the end of the sub-content.
PCT/IB2017/053781 2016-06-28 2017-06-23 Procédé et appareil pour créer un sous-contenu dans un contenu de réalité virtuelle et son partage WO2018002800A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201641022178 2016-06-28
IN201641022178 2016-06-28

Publications (1)

Publication Number Publication Date
WO2018002800A1 (fr) 2018-01-04

Family

Family ID: 60787016

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/053781 WO2018002800A1 (fr) 2016-06-28 2017-06-23 Procédé et appareil pour créer un sous-contenu dans un contenu de réalité virtuelle et son partage

Country Status (1)

Country Link
WO (1) WO2018002800A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002036225A1 (fr) * 2000-11-02 2002-05-10 Atlantis Cyberspace, Inc. Systeme virtuel de jeu simulant la realite, comprenant un pseudo pilote d'affichage tridimensionnel et un centre de commande
US20020154214A1 (en) * 2000-11-02 2002-10-24 Laurent Scallie Virtual reality game system using pseudo 3D display driver
US20140364228A1 (en) * 2013-06-07 2014-12-11 Sony Computer Entertainment Inc. Sharing three-dimensional gameplay
US20140364208A1 (en) * 2013-06-07 2014-12-11 Sony Computer Entertainment America Llc Systems and Methods for Reducing Hops Associated with A Head Mounted System
WO2015126643A2 (fr) * 2014-02-24 2015-08-27 Sony Computer Entertainment Inc. Procédés et systèmes de partage social du contenu d'un visiocasque (hmd) avec un second écran
US20160054797A1 (en) * 2014-08-22 2016-02-25 Sony Computer Entertainment Inc. Thumb Controller

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108553885A (zh) * 2018-02-28 2018-09-21 腾讯科技(深圳)有限公司 虚拟场景中的动画播放方法和装置及存储介质、电子装置

Similar Documents

Publication Publication Date Title
US20220301593A1 (en) Spherical video editing
CN106371782B (zh) 移动终端及其控制方法
KR20230096043A (ko) 실시간 3d 신체 모션 캡처로부터의 사이드-바이-사이드 캐릭터 애니메이션
CN109640125B (zh) 视频内容处理方法、装置、服务器及存储介质
KR20230127312A (ko) 멀티 비디오 클립 캡처를 위한 ar 콘텐츠
CN109922356B (zh) 视频推荐方法、装置和计算机可读存储介质
US20220206738A1 (en) Selecting an audio track in association with multi-video clip capture
US20140218370A1 (en) Method, apparatus and computer program product for generation of animated image associated with multimedia content
US20170329855A1 (en) Method and device for providing content
US11989348B2 (en) Media content items with haptic feedback augmentations
US11997422B2 (en) Real-time video communication interface with haptic feedback response
US20240184372A1 (en) Virtual reality communication interface with haptic feedback response
US20220317775A1 (en) Virtual reality communication interface with haptic feedback response
WO2022146798A1 (fr) Sélection audio pour capture de séquences multi-vidéo
KR20230116938A (ko) 안경류 디바이스 상의 미디어 콘텐츠 플레이어
US9269158B2 (en) Method, apparatus and computer program product for periodic motion detection in multimedia content
US20150325040A1 (en) Method, apparatus and computer program product for image rendering
CN111312207B (zh) 文本转音频方法、装置、计算机设备及存储介质
WO2018002800A1 (fr) Procédé et appareil pour créer un sous-contenu dans un contenu de réalité virtuelle et son partage
GB2513865A (en) A method for interacting with an augmented reality scene
US20220318303A1 (en) Transmitting metadata via inaudible frequencies
US20130107008A1 (en) Method, apparatus and computer program product for capturing images
US11922587B2 (en) Dynamic augmented reality experience
US11825276B2 (en) Selector input device to transmit audio signals
US20220377309A1 (en) Hardware encoder for stereo stitching

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17819450

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17819450

Country of ref document: EP

Kind code of ref document: A1