EP3622487B1 - Method for providing 360-degree video and device for supporting the same - Google Patents
Method for providing 360-degree video and device for supporting the same
- Publication number
- EP3622487B1 (application EP18802463.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- motion
- video
- degree
- frames
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/178—Metadata, e.g. disparity information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/597—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/23418—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
- H04N21/2353—Processing of additional data, e.g. scrambling of additional data or processing content descriptors specifically adapted to content descriptors, e.g. coding, compressing or processing of metadata
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
- H04N21/4854—End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the disclosure relates to a method and a device for providing video using virtual reality (VR).
- while VR systems may provide a realistic and immersive experience, they may also cause motion sickness in many users.
- the motion sickness typically occurs because the motion is perceived visually in VR while the body is physically at rest. Further, the motion sickness in VR may cause various sensations such as nausea, dizziness, headache, and sweating.
- motion perceived in the peripheral vision contributes to the motion sickness; accordingly, reducing the field of view (FOV) may reduce the motion sickness by restricting the peripheral vision.
- a retinal slip occurs when the retina in the eye is unable to register an object due to its high velocity. In the technique of stroboscopic illumination, a strobe (i.e., empty frames) is inserted, for example at 8 Hz, to counteract the retinal slip. In the static point of reference technique, the user's attention is drawn to a static marker during rotation of a scene.
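The stroboscopic illumination idea above can be sketched as a per-frame visibility schedule. The following is a minimal illustration, not the patent's implementation: at a given video frame rate, frames between strobe instants are blanked so visible frames appear at roughly the strobe frequency (e.g., 8 Hz). The function name and parameters are assumptions.

```python
def strobe_mask(fps, strobe_hz=8):
    """Return a per-frame visibility mask for one second of video:
    True = show the frame, False = present an empty (black) frame."""
    period = fps / strobe_hz          # frames per strobe cycle
    mask = []
    next_visible = 0.0
    for i in range(fps):
        if i >= next_visible:
            mask.append(True)         # visible frame at the strobe instant
            next_visible += period
        else:
            mask.append(False)        # blanked frame in between
    return mask

mask = strobe_mask(fps=60, strobe_hz=8)
```

At 60 fps with an 8 Hz strobe, 8 of every 60 frames remain visible; the rest are presented as empty frames.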
- Patent publication US 2016/0337630 A1 describes image encoding and display.
- Patent publication WO 2018/008991 A1 describes a display device and method for image processing.
- the expression, "at least one of a, b, and c,” should be understood as including only a, only b, only c, both a and b, both a and c, both b and c, or all of a, b, and c.
- the embodiments disclosed herein achieve a method and system for rendering a 360-degree video on a Virtual Reality (VR) display.
- the method includes identifying one or more objects across a plurality of frames of the 360-degree video.
- the method includes estimating one or more motion parameters associated with a camera by tracking a motion of the one or more objects across the plurality of frames.
- the method includes determining a type of motion of the camera across the plurality of frames based on the estimated motion parameters.
- the method includes selecting one or more motion sickness reduction schemes based on the determined type of motion of the camera across the plurality of frames.
- the method includes dynamically rendering the 360-degree video on the VR display by applying the one or more motion sickness reduction schemes across the plurality of frames based on the determined type of motion of the camera across the plurality of frames.
- the one or more motion sickness reduction schemes are applied on the plurality of frames based on the determined type of motion of the camera (i.e., translation of the camera, rotation of the camera or a combination of translation and rotation of the camera) across the plurality of frames.
- the proposed method may be used to enhance the user experience by automatically applying motion sickness reduction techniques while rendering the 360 video.
- the 360-degree video is captured by encoding the motion parameters in the 360-degree video.
- the 360-degree camera includes one or more of a plurality of sensors, such as an accelerometer, a gyroscope, an altimeter, and/or the like, for capturing the 360-degree video.
- the method includes obtaining data from a plurality of sensors of the 360-degree camera with respect to a plurality of frames of the video while capturing the video.
- the method includes estimating motion parameters associated with the 360-degree camera for the plurality of frames by using the data obtained from the plurality of sensors.
- the motion parameters may include any of position coordinates, velocity, acceleration, altitude, angle of rotation, and/or direction of the 360-degree camera.
- the method includes determining whether the 360-degree camera is in relative motion across the plurality of frames based on the estimated motion parameters.
- the method includes dynamically encoding the motion parameters as metadata in the video when it is determined that the 360-degree camera is in relative motion.
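The encoding step above can be illustrated with a small sketch. This is not the patent's actual container format; the dictionary-based "metadata" and its field names are assumptions chosen to show the conditional logic: motion parameters are attached per frame only when the camera is judged to be in relative motion.

```python
def encode_motion_metadata(frames, motion_params, is_moving):
    """frames: list of frame dicts; motion_params: dict with e.g.
    'speed', 'angular_rotation', 'altitude'; is_moving: bool result of
    the relative-motion check."""
    if not is_moving:
        return frames                      # no metadata when the camera is static
    for frame in frames:
        frame["motion_metadata"] = dict(motion_params)  # copy per frame
    return frames

frames = [{"index": i} for i in range(3)]
encode_motion_metadata(
    frames,
    {"speed": 1.2, "angular_rotation": 0.0, "altitude": 10.0},
    is_moving=True)
```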
- the 360-degree video is encoded to video formats such as MP4 and WebM.
- standard container formats such as Spherical Video V2 RFC are in the process of standardization.
- when the user captures the 360-degree video from a moving camera, metadata describing the camera motion is encoded in the video frames.
- the metadata is used to determine the type of camera motion (e.g., translation or rotation), and an appropriate motion sickness reduction technique is then automatically applied in order to counteract the motion sickness effect caused by the motion. This may result in an improved viewing experience for the user.
- when the metadata is not encoded in the video frame, the metadata is computed in real time by applying feature tracking techniques on the plurality of frames while rendering the 360-degree video on the VR display.
- the motion parameters of the 360-degree video are pre-computed before rendering the 360-degree video on the VR display.
- referring now to FIGS. 1 through 8, where similar reference characters denote corresponding features consistently throughout the figures, embodiments are shown.
- FIG. 1 illustrates a 360-degree camera 100 which is configured for capturing a 360-degree video with encoded motion parameters and a Virtual Reality (VR) device 200 which is configured for rendering the 360-degree video, according to an embodiment.
- the 360-degree camera 100 may implement any suitable 360° video-capture technology, e.g., multiple-lens direct capture, single-lens or multiple-lens compressed capture, and so on for capturing the 360-degree video (or a 360-degree still image).
- the 360-degree camera 100 may be an electronic device independent (or distinct) from the VR device 200 (e.g., VR headset, head mount display (HMD), etc.). In an embodiment, the 360-degree camera 100 may be included in the VR device 200.
- the 360-degree camera 100 includes a plurality of sensors, such as at least one from among an accelerometer, a gyroscope, a barometric sensor, and/or the like.
- the plurality of sensors in the 360-degree camera 100 track motion parameters associated with the 360-degree camera 100 while the 360-degree video is being captured.
- the motion parameters associated with the 360-degree camera 100 include any one or more of position coordinates, velocity, acceleration, altitude, angle of rotation, and/or direction of the 360-degree camera 100. These motion parameters indicate movements of the 360-degree camera 100 while the 360-degree video is being captured.
- the accelerometer measures the rate (or velocity) at which the camera has moved while capturing the 360-degree video.
- the gyroscope measures angular rotation (degrees) and direction of the 360-degree camera 100 while capturing the 360-degree video.
- the barometric sensor can be used to measure an altitude of the 360-degree camera 100 while capturing the 360-degree video.
- a Global Positioning System (GPS) sensor can be used to determine position coordinates of the 360-degree camera with respect to the video being captured.
- the data (i.e., speed, angular rotation, and altitude) is obtained from the sensors for a plurality of frames (for example, N frames).
- a change in velocity, angular rotation of the camera, and direction of the camera are computed by analyzing the plurality of frames.
- for example, a first frame of the 360-degree video and the next ten frames of the 360-degree video may be acquired.
- the motion parameters of the 360-degree camera 100 across the 10 frames may be computed by analyzing the frames with the data obtained from the sensors.
- the motion parameters of the 360-degree camera 100 are encoded (or inserted, or added) as metadata in each of the video frame(s) while capturing the 360-degree video, as shown in FIG. 1 .
- each frame of the 360-degree video is analyzed to determine (or detect) the presence of encoded metadata.
- one or more suitable motion sickness reduction schemes are selected for reducing an impact of the motion as indicated by the motion parameters of the 360-degree camera 100.
- the video frame(s) is rendered by applying the one or more suitable motion sickness reduction schemes. It should be noted that the one or more motion sickness reduction schemes are applied over N frames having the encoded metadata for reducing the motion sickness while rendering the 360-degree video on the VR device 200 as shown in FIG. 1 .
- the motion parameters of the 360-degree camera 100 are encoded within the video frame(s) while capturing the 360 video for reducing the motion sickness when the 360-degree video is rendered on the VR display.
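The rendering-side flow of FIG. 1 can be sketched as follows. This is a hedged illustration, not the patent's implementation: each frame is checked for encoded metadata, a reduction scheme is chosen for frames that carry it, and the scheme is applied before display. The callback helpers and field names are assumptions.

```python
def render_with_reduction(frames, choose_scheme, apply_scheme):
    """Render frames, applying a motion sickness reduction scheme only to
    frames that carry motion metadata encoded at capture time."""
    rendered = []
    for frame in frames:
        meta = frame.get("motion_metadata")    # present only when motion was detected
        if meta is not None:
            frame = apply_scheme(frame, choose_scheme(meta))
        rendered.append(frame)
    return rendered

frames = [{"index": 0},                                   # static: no metadata
          {"index": 1, "motion_metadata": {"speed": 3.0}}]  # camera in motion
out = render_with_reduction(
    frames,
    choose_scheme=lambda meta: "fov_reduction",
    apply_scheme=lambda frame, scheme: {**frame, "applied": scheme})
```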
- FIG. 2A illustrates various components of the 360-degree camera 100 for capturing the 360-degree video with encoded motion parameters, according to an embodiment.
- the 360-degree camera 100 includes sensors 110, a processor 120, a communicator 130, an image sensor 140, and a memory 150.
- the processor 120 includes a motion parameters estimator 122 and an encoder 124.
- the processor 120 may include a video capturing engine, and the video capturing engine may include the motion parameters estimator 122 and the encoder 124.
- the video capturing engine which includes the motion parameters estimator 122 and the encoder 124, may be implemented as a component that is independent from the processor 120.
- the sensors 110 may include at least one from among the accelerometer, the gyroscope, the altimeter, the barometric sensor, inertial sensors, and/or the like for tracking (or detecting) the movements of the 360-degree camera 100 while the 360-degree video is being captured.
- the data obtained from the sensors is communicated to the processor 120 for encoding the motion parameters of the 360-degree camera as metadata in the video frame(s).
- the motion parameters estimator 122 may be configured to estimate motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and altitude) across the video frames by analyzing the video frames based on the data obtained from the sensors 110.
- the motion parameters estimator 122 may be configured to determine whether the 360-degree camera 100 is in relative motion across the plurality of frames based on the estimated motion parameters of the 360-degree camera 100. Further, the motion parameters estimator 122 may be configured to provide an indication to the encoder 124 when the 360-degree camera 100 is in relative motion across the plurality of frames.
- the encoder 124 may be configured to encode the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and altitude) as metadata in the video frame(s) while the 360-degree video is being captured.
- the proposed method may include encoding camera motion as metadata which can be added (or inserted) to an existing header such as Spherical video header (svhd) or MeshBox header (Mesh).
- the processor(s) 120 execute instructions that may be loaded into a memory 150.
- the processor(s) 120 may include any suitable number(s) and type(s) of processors or other devices in any suitable arrangement.
- Example types of processor(s) 120 include microprocessors, microcontrollers, digital signal processors, field programmable gate arrays, application specific integrated circuits (i.e., ASICs), and discrete circuitry.
- the processor 120 may include an image signal processor.
- the communicator 130 may be configured to communicate the captured 360-degree video to one or more electronic devices, servers, VR devices, or the like for processing and/or rendering the 360-degree video.
- the communicator 130 may include a network interface card and/or a wireless transceiver configured for facilitating communications over a network.
- the communicator 130 may support communications via any suitable physical or wireless communication link(s).
- the image sensor 140 captures the 360-degree video from multiple view ports.
- the image sensor 140 may implement any suitable 360° video-capture technology, e.g., multiple-lens direct capture, single-lens or multiple-lens compressed capture, and so on for capturing the 360-degree video.
- the memory 150 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, and/or forms of electrically programmable read-only memories (EPROM) or electrically erasable and programmable read-only memories (EEPROM).
- the memory 150 may, in some examples, be considered a non-transitory storage medium.
- the term "non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted that the memory 150 is non-movable.
- the memory 150 may, in some examples, be configured to store larger amounts of information than a volatile memory.
- a non-transitory storage medium may store data that may, over time, change (e.g., in Random Access Memory (RAM) or cache).
- FIG. 2B is an example illustration in which the 360-degree camera 100 receives the 360-degree video with encoded motion parameters from a server 160, according to an embodiment. As depicted in FIG. 2B , the 360-degree camera 100 communicates the captured 360-degree video to a server 160 for processing and encoding the metadata in the video frame(s).
- the server 160 may implement an image or video processing unit that provides coding of 360-degree videos using region adaptive smoothing and motion parameters estimation for the 360-degree camera 100.
- the server 160 estimates the motion parameters of the camera 100 (i.e., speed, angular rotation, and/or altitude) across the video frames by analyzing the video frames based on the data obtained from the sensors 110 in conjunction with the capturing of the 360-degree video.
- the server 160 may implement a suitable encoder for encoding the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and altitude) as metadata in the video frame(s).
- the server 160 transmits the 360-degree video with encoded motion parameters to the 360-degree camera 100 as shown in FIG. 2B , or to one or more electronic devices and/or VR devices that are configured for rendering the 360-degree video with reduced motion sickness by applying the one or more suitable motion sickness reduction schemes.
- FIG. 3 is a flow chart 300 illustrating a method of capturing the 360-degree video by using the 360-degree camera 100, according to an embodiment.
- the various operations of the flow chart 300 are performed by the processor 120 of the 360-degree camera 100.
- the method includes obtaining data from the plurality of sensors while capturing the 360-degree video by using the image sensor 140.
- the data obtained from the sensors includes at least one of a speed of the camera 100 (as obtained from an accelerometer), an angular rotation of the camera 100 (as obtained from a gyroscope), an altitude of the camera 100 (as obtained from a barometric sensor), and/or the like.
- the data is obtained from the sensors 110 for a plurality of frames, for example from frame i to frame i+N (where N is a natural number).
- the sensor data for the frame i to frame i+N sequence is successively (or sequentially) obtained.
- alternatively, the data can be obtained from the sensors 110 for frame i and frame i+N which are not successively acquired (i.e., there is at least one frame between frame i and frame i+N for which the sensor data is not obtained).
- the method includes monitoring sensor data (e.g., GPS data, accelerometer data, altimeter data, gyroscope data of camera, and/or the like) while recording a video using the 360-degree camera 100.
- the method includes estimating (or determining, or extracting) the motion parameters for the plurality of frames based on the data obtained from the plurality of sensors.
- the motion parameters include at least one of the speed, direction, and altitude of the 360-degree camera 100.
- the motion parameters of the 360-degree camera 100 for the plurality of frames are estimated. For example, at frame i and at frame i+N (or for frames i to i+N), it is determined whether the 360-degree camera 100 has moved toward an object (or objects) in the frame i+N.
- the movement of the 360-degree camera 100 from frame i to frame i+N is determined by analyzing the frames i and i+N based on the data obtained from the sensors. Further, the speed with which the 360-degree camera 100 has moved across the frames i and i+N can be measured by using data from the accelerometer, and the angular rotation of the 360-degree camera 100 can be obtained by using data obtained from the gyroscope. Thus, the movement (i.e., speed and direction) of the 360-degree camera 100 across the frames i and i+N is obtained by using the data from the sensors 110.
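The accelerometer/gyroscope computation described here can be approximated with a simple numeric integration. This is a rough sketch under assumed units (accelerometer in m/s² along the direction of travel, gyroscope in deg/s); real devices would additionally need calibration and gravity compensation, which are omitted.

```python
def motion_between_frames(accel_samples, gyro_samples, dt):
    """Estimate the change in camera speed and the net angular rotation
    between frame i and frame i+N from sensor samples taken dt seconds apart."""
    delta_speed = sum(a * dt for a in accel_samples)   # integrate acceleration -> speed change
    rotation = sum(w * dt for w in gyro_samples)       # integrate angular rate -> rotation
    return delta_speed, rotation

# Ten samples over one second: constant 0.5 m/s^2 acceleration, 3 deg/s rotation.
dv, rot = motion_between_frames([0.5] * 10, [3.0] * 10, dt=0.1)
```

Here the camera gains 0.5 m/s of speed and rotates 3 degrees over the sampled interval.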
- the method includes obtaining the motion of the 360-degree camera 100 for the previous frames together with a direction of the motion.
- the motion and the direction of the 360-degree camera 100 for the previous frames are obtained from memory 150.
- the motion parameters of the 360-degree camera 100 for frames i-N to i may be obtained by using the data from the sensors 110.
- operation 306 can be omitted (or abbreviated).
- the method includes computing (or determining, or analyzing) a relative motion and direction of the 360-degree camera 100.
- the relative motion and direction of the 360-degree camera 100 is determined for the plurality of frames (e.g., frames i-N to i and frames i to i+N).
- the method includes determining whether the 360-degree camera 100 is in relative motion. If it is determined that the 360-degree camera 100 is in relative motion, then at operation 312, the method includes dynamically encoding the motion parameters of the 360-degree camera 100 in the video frame(s).
- the method loops back to operation 308 to compute the relative motion of the camera across a subsequent set of frames (for example, from frame i+N to frame i+2N).
- the proposed method can be used to track the sensor data of a set of N-frames.
- a change in velocity and direction of the camera is computed.
- This data is encoded in the video as metadata (i.e., speed, direction, altitude) only when camera motion is detected.
- This information may be used to determine the type of camera motion, and determine an applicable motion sickness reduction technique to overcome sickness that may be induced by the camera motion.
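The capture loop of flow chart 300 can be summarized in a short sketch. This is an assumption-laden illustration: sensor readings are aggregated per N-frame window, and metadata is recorded only for windows where relative motion is detected (operation 312); otherwise the loop moves on to the next window. The threshold value and field names are not from the patent.

```python
def capture_pipeline(sensor_windows, speed_threshold=0.1):
    """sensor_windows: list of dicts with 'speed', 'direction', 'altitude'
    aggregated per N-frame window. Returns per-window metadata, or None
    for windows in which no camera motion was detected."""
    encoded = []
    for window in sensor_windows:
        in_motion = abs(window["speed"]) > speed_threshold  # operation 310
        encoded.append(dict(window) if in_motion else None) # operation 312 or skip
    return encoded

meta = capture_pipeline([
    {"speed": 0.0, "direction": 0, "altitude": 5.0},   # static window
    {"speed": 2.4, "direction": 90, "altitude": 5.0},  # camera in motion
])
```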
- FIG. 4 illustrates various components of a VR device 200 which is configured for rendering the 360-degree video on a VR display 410, according to an embodiment.
- the VR device 200 may include a processor 401 and the VR display 410.
- the processor 401 may include a feature tracking and depth estimator 402, a motion parameters estimator 404, and a motion sickness reduction engine 408.
- at least one of the feature tracking and depth estimator 402, the motion parameters estimator 404, or the motion sickness reduction engine 408 may be implemented as a component that is independent from the processor 401 (i.e., a component which is not included in the processor 401).
- the feature tracking and depth estimator 402 may be configured to perform feature tracking and depth estimation with respect to the plurality of frames of the 360-degree video.
- the feature tracking includes the functions of tracking the object(s) between the plurality of sets of multi-focus frames and generating tracked object information that relates to a same object as shown in a plurality of scenes.
- the feature tracking and depth estimator 402 may be configured to generate depth for each of the scenes in order to produce a depth map of the scene, by using the set of frames that corresponds to the scene.
- the various features (such as objects, edges, contours, etc., using feature detection algorithms) and depth information across a set of video frames of the 360-degree video are extracted during the video playback (or after the video playback).
- the motion parameters estimator 404 may be configured to estimate the motion parameters of the 360-degree camera 100 for the plurality of frames. Based on a result of the feature tracking and depth estimation of the plurality of frames, the motion parameters of the 360-degree camera 100 for the plurality of frames are estimated, thereby providing an indication of the movement of the camera across the frames. In an embodiment, the motion parameters estimator 404 may be configured to determine the movement of each feature across the set of video frames, based on the identified features and depth information. Further, the method includes estimating motion parameters of the camera (i.e., a velocity and a direction of the motion of the camera) across the set of video frames based on determined feature movement.
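One way to turn tracked features and depth into a camera velocity is sketched below. This is an illustrative approximation, not the patent's algorithm: under a pinhole model, a feature at depth z that shifts by dx pixels implies a camera translation of roughly dx·z/f, and a median over features gives some robustness to outliers. The focal length and units are assumptions.

```python
from statistics import median

def estimate_translation(displacements_px, depths_m, focal_px, frame_dt):
    """Per-feature translation estimates t = dx * z / f, combined by median
    and divided by the inter-frame interval to yield a speed in m/s."""
    per_feature = [dx * z / focal_px
                   for dx, z in zip(displacements_px, depths_m)]
    return median(per_feature) / frame_dt

# Three consistent features plus one outlier, all at 2 m depth, 30 fps video.
speed = estimate_translation([10, 11, 9, 40], [2.0, 2.0, 2.0, 2.0],
                             focal_px=1000, frame_dt=1 / 30)
```

The median discounts the outlier feature, yielding a speed of about 0.63 m/s.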
- the motion parameters estimator 404 may be configured to determine a type of motion of the 360-degree camera 100 across the plurality of frames.
- the type of motion of the camera may include translation (or translation motion, or translational motion) of the camera, rotation of the camera, or a combination of translation and rotation of the camera across the plurality of frames.
- the motion sickness reduction engine 408 may be configured to select one or more motion sickness reduction schemes for reducing an impact of the motion of the 360-degree camera 100 according to the determined type of the motion of the 360-degree camera 100.
- the motion sickness reduction engine 408 may be configured to select from among various schemes, such as, for example, reduced field of view, stroboscopic illumination, and static point of reference, either individually or in combination based on the motion parameters of the camera across the frames according to the determined type of the motion of the 360-degree camera 100.
- the frames are rendered on the VR display 410 with a reduced motion sickness effect.
- FIG. 5 is a flow chart 500 illustrating a method for rendering a 360-degree video on the VR device 200, according to an embodiment.
- the various operations of the flow chart 500 are performed by the processor 401 of the VR device 200.
- the method includes obtaining a plurality of frames of a 360-degree video.
- the method includes identifying one or more objects in at least one of the plurality of frames of the 360-degree video.
- the method includes estimating motion parameters associated with a camera by tracking (or detecting) a motion of the one or more objects across the plurality of frames.
- the metadata may be extracted from the video frame(s), or the metadata may be computed "on the fly" (or in real time) during video playback.
- the estimated motion parameters are then used to determine at least one of the translation of the camera or rotation of the camera.
- the motion parameters are calculated by tracking the features from a current frame to an Nth frame after the current frame. At least one of the camera motion and the altitude is used to determine the type of camera motion (i.e., translation and/or rotation).
- the method includes determining a type of motion of the camera across the plurality of frames based on the estimated motion parameters.
- the type of camera motion across the plurality of frames includes at least one of translation, rotation, and a combination of translation and rotation.
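The type-of-motion decision described above can be sketched as a simple threshold test. The function and its epsilon thresholds are illustrative assumptions, not values from the patent:

```python
def classify_camera_motion(speed, angular_rate,
                           speed_eps=0.05, rotation_eps=0.5):
    """Classify camera motion for one window of frames.

    speed: translational speed (e.g., m/s); angular_rate: deg/s.
    The epsilon thresholds are illustrative placeholders.
    Returns one of "none", "translation", "rotation", "both".
    """
    translating = speed >= speed_eps
    rotating = abs(angular_rate) >= rotation_eps
    if translating and rotating:
        return "both"
    if translating:
        return "translation"
    if rotating:
        return "rotation"
    return "none"
```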
- the method includes selecting one or more motion sickness reduction schemes, based on the determined type of motion of the 360-degree camera 100 across the plurality of frames. For example, if there is a faster translation of the 360-degree camera 100 (or if an amount (or a change amount) of the translation of the 360-degree camera 100 is equal to or greater than a threshold), the stroboscopic illumination scheme may be selected. In another example, if there is a slower translation of the camera (or if the amount of the translation of the 360-degree camera 100 is less than the threshold), the FOV reduction scheme may be selected. In still another example, when there is a rotation across the plurality of frames, the static point of reference scheme may be selected and applied for reducing the motion sickness.
- both of the static point of reference scheme and the dynamic FOV reduction scheme may be selected and applied for reducing the motion sickness.
- the one or more motion sickness reduction schemes may be applied across all frames of the plurality of frames of the video.
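The selection logic in the examples above can be sketched as a small mapping. This is only an illustration of the stated examples (slow translation to FOV reduction, fast translation to stroboscopic illumination, rotation to static point of reference); the function name and threshold value are hypothetical:

```python
def select_schemes(motion_type, speed, fast_threshold=1.0):
    """Map a detected motion type to motion sickness reduction schemes.

    motion_type: "translation", "rotation", "both", or "none".
    fast_threshold separates slow from fast translation; the value
    is an illustrative placeholder, not taken from the patent.
    """
    schemes = []
    if motion_type in ("translation", "both"):
        if speed >= fast_threshold:
            schemes.append("stroboscopic_illumination")
        else:
            schemes.append("fov_reduction")
    if motion_type in ("rotation", "both"):
        schemes.append("static_point_of_reference")
    return schemes
```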
- the method includes dynamically rendering the 360-degree video on the VR display 410 by applying the selected one or more motion sickness reduction schemes. While rendering the video, depending on the intensity (or change degree) of the motion and the determined type of the motion, the best motion sickness reduction scheme is applied in order to counter the effects of the camera motion.
- for a slower translation of the camera, the dynamic FOV reduction scheme may be applied.
- for a faster translation of the camera, both the dynamic FOV reduction scheme and the stroboscopic illumination scheme may be applied.
- for a rotation of the camera, the static point of reference scheme is useful for reducing the motion sickness effect.
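The dynamic FOV reduction mentioned above can be sketched as a speed-dependent narrowing of the viewport. All numeric constants here are illustrative placeholders, not values from the patent:

```python
def reduced_fov(base_fov_deg, speed, slow=0.2, fast=2.0, min_fov_deg=60.0):
    """Shrink the field of view as camera translation speeds up,
    clamped to a comfortable minimum FOV.

    Below `slow` the full FOV is kept; above `fast` the minimum is
    used; in between the FOV is linearly interpolated.
    """
    if speed <= slow:
        return base_fov_deg
    if speed >= fast:
        return min_fov_deg
    # Linear interpolation between full and minimum FOV.
    t = (speed - slow) / (fast - slow)
    return base_fov_deg - t * (base_fov_deg - min_fov_deg)
```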
- the correction schemes described above are neither exhaustive nor mutually exclusive. Multiple correction schemes can be combined to achieve an improved VR experience.
- the proposed method and system can be extended to support other motion sickness reduction schemes which may be available at present or in the future.
- FIG. 6 illustrates various components of a 360-degree camera which are configured for processing the 360-degree video by the 360-degree camera 100 before rendering the 360-degree video, according to an embodiment.
- the 360-degree camera 100 may include a frame extractor 602, an object identifier 604, a feature identifier 606, an optical tracker 608, a motion parameters estimator 610, an encoder 612, and a memory 614.
- at least one of the frame extractor 602, the object identifier 604, the feature identifier 606, the optical tracker 608, the motion parameters estimator 610, or the encoder 612 may be included in the processor 120.
- the 360-degree camera 100 of FIG. 6 may further comprise at least one of the sensors 110 and/or the communicator 130 of FIG. 2A.
- the frame extractor 602 may be configured to extract the plurality of frames of the 360-degree video, which includes one or more objects.
- the object identifier 604 may be configured to identify the one or more objects included in the plurality of frames.
- the feature identifier 606 may be configured to identify and/or track the object(s) between the plurality of sets of multi-focus frames and to generate tracked object information that relates to the same object as shown in the plurality of scenes.
- the optical tracker 608 may be configured to: extract foreground blobs from frames of a tracking video, and determine locations of the foreground blobs based on their respective positions within the frames of the tracking video.
- the object identifier 604, the feature identifier 606, and the optical tracker 608 may be configured to operate in parallel.
- the motion parameters estimator 610 may be configured to estimate the motion parameters of the 360-degree camera 100 for the plurality of frames based on communication(s) received from the object identifier 604, the feature identifier 606 and the optical tracker 608.
- the encoder 612 may be configured to encode the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation and altitude) as metadata in the video frame(s).
- the video frames with encoded parameters of the 360-degree camera 100 are stored in the memory 614.
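A minimal sketch of the per-frame metadata round trip, assuming a JSON record as a stand-in for the container-level header (e.g., a spherical-video header box) that an actual encoder would use:

```python
import json

def encode_motion_metadata(frame_index, speed, angular_rotation, altitude):
    """Pack the camera motion parameters into a serializable metadata
    record for one video frame. A real encoder would place this in a
    container header; JSON is used here only for illustration."""
    record = {
        "frame": frame_index,
        "speed": speed,                       # m/s, from accelerometer
        "angular_rotation": angular_rotation, # deg/s, from gyroscope
        "altitude": altitude,                 # metres, barometric sensor
    }
    return json.dumps(record)

def decode_motion_metadata(blob):
    """Inverse of encode_motion_metadata, used at playback time."""
    return json.loads(blob)
```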
- the various operations involved in processing the 360-degree video in the video processing engine include the following:
- tracking techniques such as, for example, an optical flow technique
- machine learning based schemes for object identification (for example, identifying a static object, e.g., a tree)
- the video frames with encoded parameters of the 360-degree camera 100 may be rendered via the VR device 200 with a reduced motion sickness effect.
- FIG. 7 is a flow chart 700 illustrating various operations for rendering the 360-degree video, according to an embodiment.
- the various operations of the flow chart 700 are performed by the processor 401.
- the method includes obtaining a plurality of frames of the 360-degree video.
- the method includes obtaining the encoded motion parameters of the 360-degree camera 100 for each frame.
- the method includes rendering each frame by reducing an impact of motion parameters on the VR display.
- the video frame(s) are rendered by applying one or more suitable motion sickness reduction schemes.
- the one or more motion sickness reduction schemes are applied over N frames having the encoded metadata for reducing the motion sickness while rendering the 360 video on the VR display 410.
- FIG. 8 is an example illustration in which the 360-degree video is rendered with reduced motion sickness on the VR device 200, according to an embodiment.
- a 360-degree video is encoded to video formats such as MP4 and WebM.
- standard container formats such as Spherical Video V2 RFC are in the process of standardization.
- a user may select a motion sickness reduction mode on the electronic device 300, and as a result, the 360-degree video is rendered with a reduced motion sickness effect on the VR device 200.
- the processor 401 applies one or more motion sickness reduction techniques to render an optimized video with a reduced motion sickness effect on the VR display as shown in FIG. 8 .
- the embodiments disclosed herein may be implemented via at least one software program running on at least one hardware device and performing network management functions to control the elements.
- the software program may be stored, for example, on a non-transitory computer-readable storage medium.
Description
- The disclosure relates to a method and a device for providing video using virtual reality (VR).
- Virtual reality (VR) refers to a computer-simulated environment that may simulate a user's physical presence in real or imaginary environments. Although VR systems may provide a realistic and immersive experience, they also cause motion sickness in many users. The motion sickness typically occurs as the motion is perceived visually in VR, but the body is physically at rest. Further, the motion sickness in VR may cause various sensations such as nausea, dizziness, headache, sweating, and other sensations.
- Due to motion sickness induced by moving 360-degree videos, the users are unable to enjoy 360-degree content for extended periods of time. In order to reduce the motion sickness, various schemes (e.g., reducing a Field Of View (FOV) during camera translation, stroboscopic illumination at 8Hz during camera translation, static point of reference during camera rotation, or the like) have been developed.
- In some instances, due to high relative velocity, peripheral vision causes the motion sickness. During such instances, reducing the FOV may reduce the motion sickness by restricting the peripheral vision. A retinal slip occurs when the retina in the eye is unable to register the object due to high velocity. With the technique of stroboscopic illumination, a strobe (i.e., empty frames) is added at 8Hz, which reduces motion sickness due to the retinal slip. With usage of static point of reference, the user's attention is drawn to a static marker during rotation of a scene.
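For the stroboscopic technique, the strobe positions follow from simple arithmetic: at a playback rate of fps frames per second, blanking the display 8 times per second means one strobe every fps/8 frames. A sketch, assuming for simplicity that fps is a multiple of the strobe rate:

```python
def strobe_frame_indices(fps, duration_s, strobe_hz=8):
    """Return the frame indices at which blank (strobe) frames are
    shown. Blanking the display strobe_hz times per second suppresses
    retinal slip at high relative velocity."""
    total_frames = int(fps * duration_s)
    step = fps // strobe_hz  # frames between consecutive strobes
    return list(range(0, total_frames, step))
```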
- However, the above-described schemes hamper immersion of a VR scene. Further, these schemes have not been selectively applied to address the various motion sickness scenarios encountered during a 360-degree video playback. Patent publication
US 2016/0337630 A1 describes image encoding and display. Patent publication WO 2018/008991 A1 describes a display device and method for image processing. - Provided are a method and a device for providing video using Virtual Reality (VR), the method and the device as set out in the accompanying claims.
- The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 illustrates a 360-degree camera for capturing a 360-degree video with encoded motion parameters, and a Virtual Reality (VR) device for rendering, according to an embodiment; -
FIG. 2A illustrates various components of the 360-degree camera for capturing the 360-degree video with encoded motion parameters, according to an embodiment; -
FIG. 2B is an example illustration in which the 360-degree camera receives the 360 video with encoded motion parameters from a server, according to an embodiment; -
FIG. 3 is a flow chart illustrating a method of capturing the 360-degree video using a 360-degree camera, according to an embodiment; -
FIG. 4 illustrates various components of a VR device for rendering the 360-degree video on the VR display, according to an embodiment; -
FIG. 5 is a flow chart illustrating a method for rendering the 360-degree video on the VR device, according to an embodiment; -
FIG. 6 illustrates various components of a 360-degree camera which are configured for processing the 360-degree video before rendering the 360-degree video, according to an embodiment; -
FIG. 7 illustrates various operations for rendering the 360-degree video, according to an embodiment; and -
FIG. 8 is an example illustration in which the 360-degree video is rendered with reduced motion sickness on the VR device, according to an embodiment. - The embodiments disclosed herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments disclosed herein. Further, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term "or" as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments disclosed herein can be practiced and to further enable persons having ordinary skill in the art to practice the embodiments disclosed herein.
- As used herein, expressions such as "at least one of" and "at least one from among," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list. For example, the expression, "at least one of a, b, and c," should be understood as including only a, only b, only c, both a and b, both a and c, both b and c, or all of a, b, and c.
- The embodiments disclosed herein achieve a method and system for rendering a 360-degree video on a Virtual Reality (VR) display. The method includes identifying one or more objects across a plurality of frames of the 360-degree video. The method includes estimating one or more motion parameters associated with a camera by tracking a motion of the one or more objects across the plurality of frames. The method includes determining a type of motion of the camera across the plurality of frames based on the estimated motion parameters. The method includes selecting one or more motion sickness reduction schemes based on the determined type of motion of the camera across the plurality of frames. Further, the method includes dynamically rendering the 360-degree video on the VR display by applying the one or more motion sickness reduction schemes across the plurality of frames based on the determined type of motion of the camera across the plurality of frames. It should be noted that the one or more motion sickness reduction schemes are applied on the plurality of frames based on the determined type of motion of the camera (i.e., translation of the camera, rotation of the camera or a combination of translation and rotation of the camera) across the plurality of frames.
- The proposed method may be used to enhance the user experience by automatically applying motion sickness reduction techniques while rendering the 360 video.
- For example, if the camera is undergoing a relatively slow translation in a particular frame, then dynamic Field of View (FOV) reduction is applied. In case of a faster translation, dynamic FOV reduction and stroboscopic illumination can be applied.
- In various embodiments, the 360-degree video is captured by encoding the motion parameters in the 360-degree video. The 360-degree camera includes one or more of a plurality of sensors, such as an accelerometer, a gyroscope, an altimeter, and/or the like, for capturing the 360-degree video. The method includes obtaining data from a plurality of sensors of the 360-degree camera with respect to a plurality of frames of the video while capturing the video. The method includes estimating motion parameters associated with the 360-degree camera for the plurality of frames by using the data obtained from the plurality of sensors. For example, the motion parameters may include any of position coordinates, velocity, acceleration, altitude, angle of rotation, and/or direction of the 360-degree camera. The method includes determining whether the 360-degree camera is in relative motion across the plurality of frames based on the estimated motion parameters.
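The sensor-based estimation just described can be sketched as a per-window integration of the raw samples. The function name and the input format are illustrative assumptions:

```python
def motion_over_window(samples, dt):
    """Integrate per-frame sensor samples over a window of frames.

    samples: list of (accel, gyro_rate) pairs, one per frame;
    accel in m/s^2, gyro_rate in deg/s; dt is seconds per frame.
    Returns (velocity_change, total_rotation_deg) across the window.
    """
    dv = sum(a for a, _ in samples) * dt   # integrate acceleration
    rot = sum(g for _, g in samples) * dt  # integrate angular rate
    return dv, rot
```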
- Further, the method includes dynamically encoding the motion parameters as metadata in the video when it is determined that the 360-degree camera is in relative motion. In some embodiments, the 360-degree video is encoded to video formats such as MP4 and WebM. However, in order to add information about 360-degree video, standard container formats such as Spherical Video V2 RFC are in the process of standardization.
- In an embodiment, when the user captures the 360 video from a moving camera, metadata describing the camera motion is encoded in the video frames. During video playback, the metadata is used to determine the type of camera motion (e.g., translation or rotation), and an appropriate motion sickness reduction technique is then automatically applied in order to counteract the motion sickness effect caused by the motion. This may result in an improved viewing experience for the user.
- In some embodiments, when the metadata is not encoded in the video frame, the metadata is computed in real time by applying feature tracking techniques on the plurality of frames while rendering the 360-degree video on the VR display.
- In various embodiments, the motion parameters of the 360-degree video are pre-computed before rendering the 360-degree video on the VR display.
- Referring now to the drawings, and more particularly to
FIGS. 1 through 8, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments. -
FIG. 1 illustrates a 360-degree camera 100 which is configured for capturing a 360-degree video with encoded motion parameters and a Virtual Reality (VR) device 200 which is configured for rendering the 360-degree video, according to an embodiment. The 360-degree camera 100 may implement any suitable 360° video-capture technology, e.g., multiple-lens direct capture, single-lens or multiple-lens compressed capture, and so on for capturing the 360-degree video (or a 360-degree still image). - In an embodiment, the 360-
degree camera 100 may be an electronic device independent (or distinct) from the VR device 200 (e.g., VR headset, head mount display (HMD), etc.). In an embodiment, the 360-degree camera 100 may be included in the VR device 200. - In an embodiment, the 360-
degree camera 100 includes a plurality of sensors, such as at least one from among an accelerometer, a gyroscope, a barometric sensor, and/or the like. The plurality of sensors in the 360-degree camera 100 track motion parameters associated with the 360-degree camera 100 while the 360-degree video is being captured. - The motion parameters associated with the 360-
degree camera 100 include any one or more of position coordinates, velocity, acceleration, altitude, angle of rotation, and/or direction of the 360-degree camera 100. These motion parameters indicate movements of the 360-degree camera 100 while the 360-degree video is being captured. For example, the accelerometer measures the rate (or velocity) at which the camera has moved while capturing the 360-degree video. The gyroscope measures angular rotation (degrees) and direction of the 360-degree camera 100 while capturing the 360-degree video. The barometric sensor can be used to measure an altitude of the 360-degree camera 100 while capturing the 360-degree video. A Global Positioning System (GPS) sensor can be used to determine position coordinates of the 360-degree camera with respect to the video being captured. - As depicted in
FIG. 1, when recording the 360-degree video, data (i.e., speed, angular rotation, and altitude) from the plurality of sensors is obtained for a plurality of frames (for example, N frames). With the data from the plurality of sensors, a change in velocity, angular rotation of the camera, and direction of the camera are computed by analyzing the plurality of frames. - For example, a first frame of the 360 video and a next ten frames of the 360 video may be acquired. With the data obtained from the sensors with respect to the first frame and the next ten frames, the motion parameters of the 360-
degree camera 100 across the 10 frames may be computed by analyzing the frames with the data obtained from the sensors. - In an embodiment, it is determined whether the 360-
degree camera 100 is in relative motion across the plurality of frames, based on the estimated motion parameters of the 360-degree camera 100 (i.e., whether the 360-degree camera has moved across the frames with respect to the surrounding environment). - When it is determined that the 360-
degree camera 100 is in relative motion, the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and/or altitude) are encoded (or inserted, or added) as metadata in each of the video frame(s) while capturing the 360-degree video, as shown in FIG. 1. - In an embodiment, while rendering the 360-degree video on the
VR device 200, each frame of the 360-degree video is analyzed to determine (or detect) the presence of encoded metadata. In this case, when the video frame includes encoded metadata, one or more suitable motion sickness reduction schemes are selected for reducing an impact of the motion as indicated by the motion parameters of the 360-degree camera 100. Further, the video frame(s) are rendered by applying the one or more suitable motion sickness reduction schemes. It should be noted that the one or more motion sickness reduction schemes are applied over N frames having the encoded metadata for reducing the motion sickness while rendering the 360-degree video on the VR device 200 as shown in FIG. 1. - Thus, by implementing the proposed method, the motion parameters of the 360-
degree camera 100 are encoded within the video frame(s) while capturing the 360 video for reducing the motion sickness when the 360-degree video is rendered on the VR display. -
FIG. 2A illustrates various components of the 360-degree camera 100 for capturing the 360-degree video with encoded motion parameters, according to an embodiment. As depicted in FIG. 2A, the 360-degree camera 100 includes sensors 110, a processor 120, a communicator 130, an image sensor 140, and a memory 150. The processor 120 includes a motion parameters estimator 122 and an encoder 124. In an embodiment, the processor 120 may include a video capturing engine, and the video capturing engine may include the motion parameters estimator 122 and the encoder 124. In an embodiment, the video capturing engine, which includes the motion parameters estimator 122 and the encoder 124, may be implemented as a component that is independent from the processor 120. - The
sensors 110 may include at least one from among the accelerometer, the gyroscope, the altimeter, the barometric sensor, inertial sensors, and/or the like for tracking (or detecting) the movements of the 360-degree camera 100 while the 360-degree video is being captured. The data obtained from the sensors is communicated to the processor 120 for encoding the motion parameters of the 360-degree camera as metadata in the video frame(s). - In an embodiment, the
motion parameters estimator 122 may be configured to estimate motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and altitude) across the video frames by analyzing the video frames based on the data obtained from the sensors 110. - In an embodiment,
motion parameters estimator 122 may be configured to determine whether the 360-degree camera 100 is in relative motion across the plurality of frames based on the estimated motion parameters of the 360-degree camera 100. Further, the motion parameters estimator 122 may be configured to provide an indication to the encoder 124 when the 360-degree camera 100 is in relative motion across the plurality of frames. - The
encoder 124 may be configured to encode the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and altitude) as metadata in the video frame(s) while the 360-degree video is being captured. The proposed method may include encoding camera motion as metadata which can be added (or inserted) to an existing header such as Spherical video header (svhd) or MeshBox header (Mesh). - The processor(s) 120 execute instructions that may be loaded into a
memory 150. The processor(s) 120 may include any suitable number(s) and type(s) of processors or other devices in any suitable arrangement. Example types of processor(s) 120 include microprocessors, microcontrollers, digital signal processors, field programmable gate arrays, application specific integrated circuits (ASICs), and discrete circuitry. In an embodiment, the processor 120 may include an image signal processor. - The
communicator 130 may be configured to communicate the captured 360-degree video to one or more electronic devices, servers, VR devices, or the like for processing and/or rendering the 360-degree video. For example, the communicator 130 may include a network interface card and/or a wireless transceiver configured for facilitating communications over a network. The communicator 130 may support communications via any suitable physical or wireless communication link(s). - The
image sensor 140 captures the 360-degree video from multiple view ports. The image sensor 140 may implement any suitable 360° video-capture technology, e.g., multiple-lens direct capture, single-lens or multiple-lens compressed capture, and so on for capturing the 360-degree video. - The
memory 150 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, and/or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory 150 may, in some examples, be considered a non-transitory storage medium. The term "non-transitory" may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term "non-transitory" should not be interpreted to mean that the memory 150 is non-movable. In some examples, the memory 150 may be configured to store larger amounts of information than the memory. In certain examples, a non-transitory storage medium may store data that may, over time, change (e.g., in Random Access Memory (RAM) or cache). -
FIG. 2B is an example illustration in which the 360-degree camera 100 receives the 360-degree video with encoded motion parameters from a server 160, according to an embodiment. As depicted in FIG. 2B, the 360-degree camera 100 communicates the captured 360-degree video to a server 160 for processing and encoding the metadata in the video frame(s). - In an embodiment, the
server 160 may implement an image or video processing unit that provides coding of 360-degree videos using region adaptive smoothing and motion parameters estimation for the 360-degree camera 100. The server 160 estimates the motion parameters of the camera 100 (i.e., speed, angular rotation, and/or altitude) across the video frames by analyzing the video frames based on the data obtained from the sensors 110 in conjunction with the capturing of the 360-degree video. - Further, the
server 160 may implement a suitable encoder for encoding the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation, and altitude) as metadata in the video frame(s). The server 160 transmits the 360-degree video with encoded motion parameters to the 360-degree camera 100 as shown in FIG. 2B, or to one or more electronic devices and/or VR devices that are configured for rendering the 360-degree video with reduced motion sickness by applying the one or more suitable motion sickness reduction schemes. -
FIG. 3 is a flow chart 300 illustrating a method of capturing the 360-degree video by using the 360-degree camera 100, according to an embodiment. The various operations of the flow chart 300 are performed by the processor 120 of the 360-degree camera 100. - At
operation 302, the method includes obtaining data from the plurality of sensors while capturing the 360-degree video by using the image sensor 140. The data obtained from the sensors includes at least one of a speed of the camera 100 (as obtained from an accelerometer), an angular rotation of the camera 100 (as obtained from a gyroscope), an altitude of the camera 100 (as obtained from a barometric sensor), and/or the like. The data is obtained from the sensors 110 for a plurality of frames, for example from frame i to frame i+N (where N is a natural number). In an embodiment, the sensor data for the frame i to frame i+N sequence is successively (or sequentially) obtained. In an embodiment, the data can be obtained from the sensors 110 for frame i and frame i+N, which are not successively acquired (i.e., there is at least one frame in between frame i and frame i+N for which the sensor data is not obtained). In an embodiment, the method includes monitoring sensor data (e.g., GPS data, accelerometer data, altimeter data, gyroscope data of camera, and/or the like) while recording a video using the 360-degree camera 100. - At operation 304, the method includes estimating (or determining, or extracting) the motion parameters for the plurality of frames based on the data obtained from the plurality of sensors. The motion parameters include at least one of the speed, direction, and altitude of the 360-
degree camera 100. Using the data obtained from the sensors for the plurality of frames, the motion parameters of the 360-degree camera 100 for the plurality of frames are estimated. For example, at frame i and at frame i+N (or frames i to i+N), it is determined whether the 360-degree camera 100 has moved toward the objects (or an object) in the frame i+N. The movement of the 360-degree camera 100 from frame i to frame i+N is determined by analyzing the frames i and i+N based on the data obtained from the sensors. Further, the speed with which the 360-degree camera 100 has moved across the frames i and i+N can be measured by using data from the accelerometer, and the angular rotation of the 360-degree camera 100 can be obtained by using data obtained from the gyroscope. Thus, the movement (i.e., speed and direction) of the 360-degree camera 100 across the frames i and i+N is obtained by using the data from the sensors 110. - At
operation 306, the method includes obtaining the motion of the 360-degree camera 100 for the previous frames together with a direction of the motion. In an embodiment, the motion and the direction of the 360-degree camera 100 for the previous frames are obtained from the memory 150. For example, the parameters of the 360-degree camera 100 for frames i-N may be obtained by using the data from the sensors 110. In an embodiment, operation 306 can be omitted (or abbreviated). - At
operation 308, the method includes computing (or determining, or analyzing) a relative motion and direction of the 360-degree camera 100. The relative motion and direction of the 360-degree camera 100 are determined for the plurality of frames (e.g., frames i-N to i and frames i to i+N). - At
operation 310, the method includes determining whether the 360-degree camera 100 is in relative motion. If it is determined that the 360-degree camera 100 is in relative motion, then at operation 312, the method includes dynamically encoding the motion parameters of the 360-degree camera 100 in the video frame(s). The motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation and altitude) are encoded as metadata in the video frame(s) while capturing the 360-degree video or after capturing the 360-degree video. - If it is determined that the 360-
degree camera 100 is not in relative motion, the method loops back to operation 308 to compute the relative motion of the camera across a subsequent set of frames (for example, from frame i+N to frame i+2N). - In an embodiment, when the user is capturing or recording the 360-degree video, the proposed method can be used to track the sensor data of a set of N-frames. In the set of N-frames, a change in velocity and direction of the camera is computed. This data will be encoded in the video as metadata (i.e., speed, direction, altitude) only when camera motion is detected. This information may be used to determine the type of camera motion, and determine an applicable motion sickness reduction technique to overcome sickness that may be induced by the camera motion.
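Operations 308 through 312 amount to a loop that encodes metadata only for the windows in which relative motion is detected. A hypothetical sketch (the window format, field names, and threshold are illustrative assumptions):

```python
def capture_loop(windows, motion_threshold=0.1):
    """For each window of N frames, check for relative motion and
    encode the motion parameters as metadata only when the camera
    actually moved.

    windows: list of dicts with "speed", "rotation", "altitude".
    Returns one metadata record per window (None when static).
    """
    encoded = []
    for w in windows:
        in_motion = (w["speed"] > motion_threshold
                     or abs(w["rotation"]) > motion_threshold)
        if in_motion:
            # Counterpart of operation 312: encode the parameters.
            encoded.append({"speed": w["speed"],
                            "rotation": w["rotation"],
                            "altitude": w["altitude"]})
        else:
            # Not in relative motion: loop back with no metadata.
            encoded.append(None)
    return encoded
```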
-
FIG. 4 illustrates various components of a VR device 200 which is configured for rendering the 360-degree video on a VR display 410, according to an embodiment. The VR device 200 may include a processor 401 and the VR display 410. The processor 401 may include a feature tracking and depth estimator 402, a motion parameters estimator 404, and a motion sickness reduction engine 408. In an embodiment, at least one of the depth estimator 402, the motion parameters estimator 404, or the motion sickness reduction engine 408 may be a component that is independent from the processor 401 (i.e., a component which is not included in the processor 401). - The feature tracking and
depth estimator 402 may be configured to perform feature tracking and depth estimation with respect to the plurality of frames of the 360-degree video. The feature tracking includes the functions of tracking the object(s) between the plurality of sets of multi-focus frames and generating tracked object information that relates to a same object as shown in a plurality of scenes. - Further, the feature tracking and
depth estimator 402 may be configured to generate depth for each of the scenes in order to produce a depth map of the scene, by using the set of frames that corresponds to the scene. - The various features (such as objects, edges, and contours, identified by using feature detection algorithms) and the depth information across a set of video frames of the 360-degree video are extracted during the video playback (or after the video playback).
- The
motion parameters estimator 404 may be configured to estimate the motion parameters of the 360-degree camera 100 for the plurality of frames. Based on a result of the feature tracking and depth estimation of the plurality of frames, the motion parameters of the 360-degree camera 100 for the plurality of frames are estimated, thereby providing an indication of the movement of the camera across the frames. In an embodiment, the motion parameters estimator 404 may be configured to determine the movement of each feature across the set of video frames, based on the identified features and depth information. Further, the method includes estimating motion parameters of the camera (i.e., a velocity and a direction of the motion of the camera) across the set of video frames based on the determined feature movement. - In an embodiment, the
motion parameters estimator 404 may be configured to determine a type of motion of the 360-degree camera 100 across the plurality of frames. The type of motion of the camera may include translation (or translation motion, or translational motion) of the camera, rotation of the camera, or a combination of translation and rotation of the camera across the plurality of frames. - The motion
sickness reduction engine 408 may be configured to select one or more motion sickness reduction schemes for reducing an impact of the motion of the 360-degree camera 100 according to the determined type of the motion of the 360-degree camera 100. The motion sickness reduction engine 408 may be configured to select from among various schemes, such as, for example, reduced field of view, stroboscopic illumination, and static point of reference, either individually or in combination, based on the motion parameters of the camera across the frames and the determined type of the motion of the 360-degree camera 100. - As a result, the frames are rendered on the
VR display 410 with a reduced motion sickness effect. -
FIG. 5 is a flow chart 500 illustrating a method for rendering a 360-degree video on the VR device 200, according to an embodiment. The various operations of the flow chart 500 are performed by the processor 401 of the VR device 200. - At
operation 502, the method includes obtaining a plurality of frames of a 360-degree video. At operation 504, the method includes identifying one or more objects in at least one of the plurality of frames of the 360-degree video. - At
operation 506, the method includes estimating motion parameters associated with a camera by tracking (or detecting) a motion of the one or more objects across the plurality of frames. The metadata may be extracted from the video frame(s), or the metadata may be computed "on the fly" (or in real time) during video playback. The estimated motion parameters are then used to determine at least one of the translation of the camera or the rotation of the camera. - In an embodiment, the motion parameters are calculated by tracking the features in a current frame and in an Nth frame counted from the current frame (or from the current frame to the Nth frame). At least one of the camera motion and the altitude is used to determine the type of camera motion (i.e., translation and/or rotation).
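Determining the type of camera motion from the estimated parameters can be sketched as a simple classification. The function name and the epsilon thresholds below are illustrative assumptions; the patent does not fix them numerically.

```python
# Thresholds below which translation / rotation are treated as absent.
# Illustrative values only (assumptions, not from the patent text).
TRANSLATION_EPS = 0.05  # m/s
ROTATION_EPS = 0.5      # deg/s

def classify_camera_motion(velocity, angular_rate):
    """Return 'translation', 'rotation', 'translation+rotation', or 'none'."""
    translating = abs(velocity) > TRANSLATION_EPS
    rotating = abs(angular_rate) > ROTATION_EPS
    if translating and rotating:
        return "translation+rotation"
    if translating:
        return "translation"
    if rotating:
        return "rotation"
    return "none"
```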
- At
operation 508, the method includes determining a type of motion of the camera across the plurality of frames based on the estimated motion parameters. The type of camera motion across the plurality of frames includes at least one of translation, rotation, and a combination of translation and rotation. - At
operation 510, the method includes selecting one or more motion sickness reduction schemes, based on the determined type of motion of the 360-degree camera 100 across the plurality of frames. For example, if there is a faster translation of the 360-degree camera 100 (or if an amount (or a change amount) of the translation of the 360-degree camera 100 is equal to or greater than a threshold), the stroboscopic illumination scheme may be selected. In another example, if there is a slower translation of the camera (or if the amount of the translation of the 360-degree camera 100 is less than the threshold), the FOV reduction scheme may be selected. In still another example, when there is a rotation across the plurality of frames, the static point of reference scheme may be selected and applied for reducing the motion sickness. In still another example, when there is a rapid rotation together with translation across the plurality of frames, both of the static point of reference scheme and the dynamic FOV reduction scheme may be selected and applied for reducing the motion sickness. It should be noted that the one or more motion sickness reduction schemes may be applied across all frames of the plurality of frames of the video. - At
operation 512, the method includes dynamically rendering the 360-degree video on the VR display 410 by applying the selected one or more motion sickness reduction schemes. While rendering the video, depending on the intensity (or change degree) of the motion and the determined type of the motion, the best motion sickness reduction scheme is applied in order to counter the effects of the camera motion. - In an example, if the camera is undergoing a relatively slow translation, the dynamic FOV reduction scheme may be applied. In the case of a faster translation, both of the dynamic FOV reduction scheme and the stroboscopic illumination scheme may be applied.
- In another example, if the camera is undergoing a rotation, the static point of reference scheme is useful for reducing the motion sickness effect.
- However, the correction schemes are neither exhaustive nor exclusive. Multiple correction schemes can be combined to achieve an improved VR experience. The proposed method and system can be extended to support other motion sickness reduction schemes which may be available at present or in the future.
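The selection rules described above can be sketched as a mapping from motion parameters to schemes. This is a non-authoritative illustration: the fast-translation threshold, the rotation epsilon, the scheme labels, and the function name are all assumptions.

```python
FAST_TRANSLATION_THRESHOLD = 2.0  # m/s; illustrative, not specified in the text

def select_reduction_schemes(velocity, angular_rate, rotation_eps=0.5):
    """Map the detected camera motion to one or more reduction schemes."""
    schemes = []
    speed = abs(velocity)
    if speed > 0.0:
        if speed >= FAST_TRANSLATION_THRESHOLD:
            # Faster translation: stroboscopic illumination together with
            # dynamic FOV reduction.
            schemes.append("stroboscopic_illumination")
            schemes.append("fov_reduction")
        else:
            # Slower translation: dynamic FOV reduction alone.
            schemes.append("fov_reduction")
    if abs(angular_rate) > rotation_eps:
        # Rotation: static point of reference.
        schemes.append("static_point_of_reference")
    return schemes
```

Because the conditions are evaluated independently, a rapid rotation combined with translation naturally yields a combination of schemes, as in the examples above.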
-
FIG. 6 illustrates various components of a 360-degree camera which are configured for processing the 360-degree video by the 360-degree camera 100 before rendering the 360-degree video, according to an embodiment. The 360-degree camera 100 may include a frame extractor 602, an object identifier 604, a feature identifier 606, an optical tracker 608, a motion parameters estimator 610, an encoder 612, and a memory 614. In an embodiment, at least one of the frame extractor 602, the object identifier 604, the feature identifier 606, the optical tracker 608, the motion parameters estimator 610, or the encoder 612 may be included in the processor 120. In an embodiment, the 360-degree camera 100 of FIG. 6 further comprises at least one of the sensors 110 and/or the communicator 130 of FIG. 2A. - The
frame extractor 602 may be configured to extract the plurality of frames of the 360-degree video, which includes one or more objects. - The
object identifier 604 may be configured to identify the one or more objects included in the plurality of frames. - The
feature identifier 606 may be configured to identify and/or track the object(s) between the plurality of sets of multi-focus frames and to generate tracked object information that relates to the same object as shown in the plurality of scenes. - The
optical tracker 608 may be configured to: extract foreground blobs from frames of a tracking video, and determine locations of the foreground blobs based on their respective positions within the frames of the tracking video. - In an embodiment, the
object identifier 604, the feature identifier 606, and the optical tracker 608 may be configured to operate in parallel. - The
motion parameters estimator 610 may be configured to estimate the motion parameters of the 360-degree camera 100 for the plurality of frames based on communication(s) received from the object identifier 604, the feature identifier 606, and the optical tracker 608. - The
encoder 612 may be configured to encode the motion parameters of the 360-degree camera 100 (i.e., speed, angular rotation and altitude) as metadata in the video frame(s). The video frames with encoded parameters of the 360-degree camera 100 are stored in the memory 614. - The various operations involved in processing the 360-degree video in the video processing engine include the following:
- 1. Identifying features in the scene by using feature detection techniques.
- 2. Tracking the identified features in N successive frames.
- 3. Computing an average speed of the identified features.
- 4. Based on the computed average feature speed and camera calibration parameters, detecting a camera motion and storing motion parameters that relate to the detected camera motion.
- 5. Repeating steps 1 to 4.
- In various embodiments, other tracking techniques (such as, for example, an optical flow technique) and machine learning based schemes for object identification (for example, identifying a static object, e.g., a tree) may also be used to improve the estimation of camera velocity.
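Steps 2 to 4 above can be sketched as follows. The track format, function names, and motion threshold are assumptions for illustration; a full implementation would additionally fold in the camera calibration parameters mentioned in step 4.

```python
def average_feature_speed(tracks, fps):
    """tracks: {feature_id: [(x, y), ...]} positions over N successive frames.
    Returns the mean per-frame displacement scaled to pixels per second."""
    displacements = []
    for positions in tracks.values():
        for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
            displacements.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
    if not displacements:
        return 0.0
    return (sum(displacements) / len(displacements)) * fps

def camera_motion_detected(tracks, fps, speed_threshold=5.0):
    """Flag camera motion when the average feature speed exceeds a threshold."""
    return average_feature_speed(tracks, fps) > speed_threshold
```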
- As a result, the video frames with encoded parameters of the 360-
degree camera 100 may be rendered via the VR device 200 with a reduced motion sickness effect. -
FIG. 7 is a flow chart 700 illustrating various operations for rendering the 360-degree video, according to an embodiment. The various operations of the flow chart 700 are performed by the processor 401. - At
operation 702, the method includes obtaining a plurality of frames of the 360-degree video. - At
operation 704, the method includes obtaining the encoded motion parameters of the 360-degree camera 100 for each frame. - At
operation 706, the method includes rendering each frame by reducing an impact of the motion parameters on the VR display. The video frame(s) are rendered by applying one or more suitable motion sickness reduction schemes. The one or more motion sickness reduction schemes are applied over N frames having the encoded metadata for reducing the motion sickness while rendering the 360-degree video on the VR display 410. -
FIG. 8 is an example illustration in which the 360-degree video is rendered with reduced motion sickness on the VR device 200, according to an embodiment. - In an embodiment, a 360-degree video is encoded to video formats such as MP4 and WebM. However, in order to add information about the 360-degree video, standard container formats such as Spherical Video V2 RFC are in the process of standardization.
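As an illustration of the dynamic FOV reduction applied during rendering, one possible sketch narrows the rendered field of view linearly as camera velocity grows between two bounds. All numeric values and the function name are illustrative assumptions, not taken from the patent text.

```python
def fov_scale(velocity, v_slow=0.2, v_fast=2.0, full_fov=1.0, min_fov=0.6):
    """Return the fraction of the full FOV to render for a given camera velocity."""
    speed = abs(velocity)
    if speed <= v_slow:
        return full_fov   # little or no motion: no reduction
    if speed >= v_fast:
        return min_fov    # fast motion: maximum reduction
    # Linear interpolation between the two bounds.
    t = (speed - v_slow) / (v_fast - v_slow)
    return full_fov - t * (full_fov - min_fov)
```

The returned scale would drive a vignette or viewport mask per frame, so that peripheral vision is restricted only while the encoded metadata indicates camera motion.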
- By implementing the proposed method, a user may select a motion sickness reduction mode on the
electronic device 300, and as a result, the 360-degree video is rendered with a reduced motion sickness effect on the VR device 200. - The
processor 401 applies one or more motion sickness reduction techniques to render an optimized video with a reduced motion sickness effect on the VR display, as shown in FIG. 8. - The embodiments disclosed herein may be implemented via at least one software program running on at least one hardware device and performing network management functions to control the elements. The software program may be stored, for example, on a non-transitory computer-readable storage medium.
Claims (11)
- A method for reducing a motion sickness effect associated with a rendering of a video on a virtual reality device, the method comprising:
extracting, from a portion of the video, at least one motion parameter that relates to a motion of a camera used for capturing the video (506);
identifying, based on the extracted at least one motion parameter, a type of a motion of the camera, wherein the type of the motion of the camera includes at least one of a fast translation with a velocity of the camera equal to or greater than a predetermined threshold, a slow translation with the velocity of the camera less than the predetermined threshold, or a rotation; and
identifying, based on the identified type of the motion of the camera, at least one motion sickness reduction scheme to be applied to the video,
dynamically rendering the video to the virtual reality device by applying the identified motion sickness reduction scheme to the portion of the video,
wherein the identifying at least one motion sickness reduction scheme comprises:
selecting, in case that the identified type of the motion includes the fast translation, a stroboscopic illumination scheme;
selecting, in case that the identified type of the motion includes the slow translation, a field of view (FOV) reduction scheme; and
selecting, in case that the identified type of the motion includes the rotation, a static point of reference scheme.
- The method of claim 1, wherein the at least one motion parameter comprises at least one of position coordinates of the camera, the velocity of the camera, an acceleration of the camera, an altitude of the camera, an angle of rotation of the camera, or a direction of the camera.
- The method of claim 1, wherein the extracting the at least one motion parameter comprises extracting the at least one motion parameter that is included as a metadata in the portion of the video.
- The method of claim 3, wherein the at least one motion parameter is generated based on information that relates to the motion of the camera which is received from a sensor included in the camera.
- The method of claim 3, wherein the at least one motion parameter is generated based on information that relates to at least one object included in the portion of the video.
- The method of claim 1, wherein the camera is included in the virtual reality device.
- A virtual reality device (200) comprising:
a display (410); and
at least one processor (401),
wherein the at least one processor (401) is configured to:
extract, from a portion of the video, at least one motion parameter that relates to a motion of a camera used for capturing the video;
identify, based on the extracted at least one motion parameter, a type of the motion of the camera, wherein the type of the motion of the camera includes at least one of a fast translation with a velocity of the camera equal to or greater than a predetermined threshold, a slow translation with the velocity of the camera less than the predetermined threshold, or a rotation;
identify, based on the identified type of the motion of the camera, at least one motion sickness reduction scheme to be applied to the video; and
dynamically render the video to the virtual reality device by applying the identified motion sickness reduction scheme to the portion of the video,
wherein the at least one processor (401) is, in order to identify at least one motion sickness reduction scheme, configured to:
select, in case that the identified type of the motion includes the fast translation, a stroboscopic illumination scheme;
select, in case that the identified type of the motion includes the slow translation, a field of view (FOV) reduction scheme; and
select, in case that the identified type of the motion includes the rotation, a static point of reference scheme.
- The virtual reality device of claim 7, wherein the at least one motion parameter comprises at least one of position coordinates of the camera, the velocity of the camera, an acceleration of the camera, an altitude of the camera, an angle of rotation of the camera, or a direction of the camera.
- The virtual reality device of claim 7, wherein the at least one processor is, in order to extract the at least one motion parameter, configured to extract the at least one motion parameter that is included as a metadata in the portion of the video.
- The virtual reality device of claim 9, wherein the at least one motion parameter is generated based on information that relates to the motion of the camera which is received from a sensor included in the camera.
- The virtual reality device of claim 9, wherein the at least one motion parameter is generated based on information that relates to at least one object included in the portion of the video.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN201741017479 | 2017-05-18 | ||
PCT/KR2018/005705 WO2018212617A1 (en) | 2017-05-18 | 2018-05-18 | Method for providing 360-degree video and device for supporting the same |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3622487A1 EP3622487A1 (en) | 2020-03-18 |
EP3622487A4 EP3622487A4 (en) | 2020-06-24 |
EP3622487B1 true EP3622487B1 (en) | 2021-12-22 |
Family
ID=64272619
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18802463.2A Active EP3622487B1 (en) | 2017-05-18 | 2018-05-18 | Method for providing 360-degree video and device for supporting the same |
Country Status (3)
Country | Link |
---|---|
US (1) | US11258999B2 (en) |
EP (1) | EP3622487B1 (en) |
WO (1) | WO2018212617A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7027753B2 (en) * | 2017-09-20 | 2022-03-02 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment and programs |
US11399207B2 (en) * | 2018-02-02 | 2022-07-26 | Comcast Cable Communications, Llc | Image selection using motion data |
JP2019152980A (en) * | 2018-03-01 | 2019-09-12 | キヤノン株式会社 | Image processing system, image processing method and program |
JP2021182650A (en) * | 2018-07-20 | 2021-11-25 | ソニーグループ株式会社 | Image processing device and method |
KR102284266B1 (en) * | 2018-12-13 | 2021-08-02 | 한국과학기술원 | Method for vr sickness assessment considering neural mismatch model and the apparatus thereof |
US11064118B1 (en) * | 2019-12-18 | 2021-07-13 | Gopro, Inc. | Systems and methods for dynamic stabilization adjustment |
JP6801136B1 (en) * | 2020-05-14 | 2020-12-16 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | Remote control system and its remote work equipment, video processing equipment and programs |
US12044845B2 (en) * | 2021-03-29 | 2024-07-23 | Tencent America LLC | Towards subsiding motion sickness for viewport sharing for teleconferencing and telepresence for remote terminals |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6497649B2 (en) | 2001-01-21 | 2002-12-24 | University Of Washington | Alleviating motion, simulator, and virtual environmental sickness by presenting visual scene components matched to inner ear vestibular sensations |
US6623428B2 (en) * | 2001-10-11 | 2003-09-23 | Eastman Kodak Company | Digital image sequence display system and method |
WO2004042662A1 (en) | 2002-10-15 | 2004-05-21 | University Of Southern California | Augmented virtual environments |
US20160267720A1 (en) | 2004-01-30 | 2016-09-15 | Electronic Scripting Products, Inc. | Pleasant and Realistic Virtual/Augmented/Mixed Reality Experience |
US7722526B2 (en) * | 2004-07-16 | 2010-05-25 | Samuel Kim | System, method and apparatus for preventing motion sickness |
IL172797A (en) * | 2005-12-25 | 2012-09-24 | Elbit Systems Ltd | Real-time image scanning and processing |
US8218855B2 (en) * | 2007-10-04 | 2012-07-10 | Samsung Electronics Co., Ltd. | Method and apparatus for receiving multiview camera parameters for stereoscopic image, and method and apparatus for transmitting multiview camera parameters for stereoscopic image |
US9994228B2 (en) * | 2010-05-14 | 2018-06-12 | Iarmourholdings, Inc. | Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment |
US9298985B2 (en) * | 2011-05-16 | 2016-03-29 | Wesley W. O. Krueger | Physiological biosensor system and method for controlling a vehicle or powered equipment |
EP2577959A1 (en) * | 2010-05-26 | 2013-04-10 | Qualcomm Incorporated | Camera parameter- assisted video frame rate up conversion |
US8619005B2 (en) * | 2010-09-09 | 2013-12-31 | Eastman Kodak Company | Switchable head-mounted display transition |
US20120182206A1 (en) * | 2011-01-17 | 2012-07-19 | Ronald Steven Cok | Head-mounted display control with sensory stimulation |
US20140176296A1 (en) * | 2012-12-19 | 2014-06-26 | HeadsUp Technologies, Inc. | Methods and systems for managing motion sickness |
US9645395B2 (en) * | 2013-03-15 | 2017-05-09 | Mark Bolas | Dynamic field of view throttling as a means of improving user experience in head mounted virtual environments |
GB201305402D0 (en) | 2013-03-25 | 2013-05-08 | Sony Comp Entertainment Europe | Head mountable display |
US9536353B2 (en) * | 2013-10-03 | 2017-01-03 | Honda Motor Co., Ltd. | System and method for dynamic in-vehicle virtual reality |
GB2523555B (en) | 2014-02-26 | 2020-03-25 | Sony Interactive Entertainment Europe Ltd | Image encoding and display |
CN106664393A (en) | 2014-07-31 | 2017-05-10 | 索尼公司 | Information processing device, information processing method, and image display system |
WO2016107635A1 (en) * | 2014-12-29 | 2016-07-07 | Metaio Gmbh | Method and system for generating at least one image of a real environment |
EP3378221B1 (en) * | 2015-11-16 | 2022-01-12 | Google LLC | Stabilization based on accelerometer data |
JP6620063B2 (en) * | 2016-04-21 | 2019-12-11 | 株式会社ソニー・インタラクティブエンタテインメント | Image processing apparatus and image processing method |
KR20180005528A (en) | 2016-07-06 | 2018-01-16 | 삼성전자주식회사 | Display apparatus and method for image processing |
WO2018020735A1 (en) * | 2016-07-28 | 2018-02-01 | 株式会社コロプラ | Information processing method and program for causing computer to execute information processing method |
KR20180028796A (en) * | 2016-09-09 | 2018-03-19 | 삼성전자주식회사 | Method, storage medium and electronic device for displaying images |
WO2018117574A1 (en) * | 2016-12-22 | 2018-06-28 | Samsung Electronics Co., Ltd. | Method for displaying image, storage medium, and electronic device |
US10368047B2 (en) * | 2017-02-15 | 2019-07-30 | Adone Inc. | Six-degree of freedom video playback of a single monoscopic 360-degree video |
US10628994B2 (en) * | 2017-03-07 | 2020-04-21 | Google Llc | Reducing visually induced motion sickness in head mounted display systems |
Also Published As
Publication number | Publication date |
---|---|
US11258999B2 (en) | 2022-02-22 |
EP3622487A4 (en) | 2020-06-24 |
WO2018212617A1 (en) | 2018-11-22 |
EP3622487A1 (en) | 2020-03-18 |
US20180338132A1 (en) | 2018-11-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20191211 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20200528 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 21/435 20110101ALI20200520BHEP Ipc: G06T 19/20 20110101ALI20200520BHEP Ipc: H04N 19/00 20140101ALI20200520BHEP Ipc: H04N 5/232 20060101ALI20200520BHEP Ipc: G06T 7/20 20170101ALI20200520BHEP Ipc: H04N 13/344 20180101ALI20200520BHEP Ipc: H04N 21/234 20110101ALI20200520BHEP Ipc: H04N 19/46 20140101ALI20200520BHEP Ipc: G06T 19/00 20110101AFI20200520BHEP Ipc: G06T 7/70 20170101ALI20200520BHEP |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210113 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20210811 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: DAS, NACHIKETA Inventor name: RAO PADEBETTU, ROHIT |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602018028606 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1457567 Country of ref document: AT Kind code of ref document: T Effective date: 20220115 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220322 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20211222 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1457567 Country of ref document: AT Kind code of ref document: T Effective date: 20211222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220322 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220323 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220422
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602018028606 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220422 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
26N | No opposition filed |
Effective date: 20220923 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20220531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220518
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220531
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220518
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20180518 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240422 Year of fee payment: 7 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240422 Year of fee payment: 7 |