US20180255241A9 - Apparatus and method for generating an output video stream from a wide field video stream - Google Patents
Apparatus and method for generating an output video stream from a wide field video stream
- Publication number
- US20180255241A9 (application US14/887,122)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/06—Topological mapping of higher dimensional structures onto lower dimensional surfaces
- H04N5/23238
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
Definitions
- An apparatus and methods described herein generally relate to generating an output video stream from a wide field video stream.
- Video cameras can be assembled in numerous directions with multiple angles and viewpoints so as to simultaneously film a particular environment.
- Assembling the complementary streams may result in a wide field video stream with a field of view exceeding that of the normal human eye.
- The wide field video stream provides the advantage of streaming or creating a video of a particular environment with a 180-degree to 360-degree viewpoint.
- A portion of the wide field video stream may be selected for viewing or editing purposes.
- However, extracting and displaying a portion of the wide field video stream onto a display screen may result in lag time and significantly lower video stream pixel resolution, resulting in a poor visual display.
- The disclosure herein relates to an output video stream generated from a selected wide field video stream.
- The generated output video stream is displayed on a two-dimensional display space so that an image of the output video stream correlates to the corresponding image of the wide field video stream.
- Generating the output video stream may include utilizing a first geometric function to determine a particular portion of the wide field video stream to be displayed onto a particular display space.
- The application of the first geometric function may considerably limit the calculation time of the output video stream so that the stream is achieved in real time without loss of quality.
- A method and system configured to generate an output video stream from a selected wide field video stream may include one or more physical processors configured by machine-readable instructions to select a wide field video stream.
- The wide field video stream may correspond to an assembly of multiple video streams with different fields of view of the same environment.
- The wide field video stream may be selected with a portion of the video stream to be viewed according to a particular projection.
- The projection may include a modified view of the wide field video stream, created by distorting the viewing angle and orientation of the wide field video stream to produce a new immersive experience.
- The determined projection of the wide field video stream may then be transmitted onto a display space.
- The display space may include at least one screen. In other embodiments, the display space may include a plurality of display screens that correspond to different fields of view and projections of the wide field video stream.
- The physical processors may be configured to generate intermediate points by executing a first geometric function.
- The input parameters of the first geometric function include the specified projection and a specific field of view from a camera lens of the wide field video stream.
- The generated intermediate points may correspond to locations on the wide field video stream, such that the intermediate points contain the location of each point on the image of the wide field video stream.
- The intermediate points may correspond to at least one pixel of the wide field video stream to be displayed on the selected display space.
- The physical processors may also be configured to generate reference points by executing a second geometric function.
- The second geometric function takes as input parameters the intermediate points from the first geometric function.
- The generated reference points may associate at least one pixel of the wide field video stream with at least one pixel of the output video stream.
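For concreteness, the pair of geometric functions can be sketched as follows. This is an illustrative assumption rather than the claimed implementation: it models the wide field video stream as an equirectangular image sampled through a rectilinear projection, and the function names and coordinate conventions are hypothetical.

```python
import math

def first_geometric_function(u, v, width, height, fov_deg):
    """Map a display point (u, v) to an intermediate point: a unit
    direction in 3-D space, for a rectilinear projection with the
    given horizontal field of view (illustrative convention)."""
    # Focal length, in pixels, implied by the horizontal field of view.
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    x = u - width / 2.0
    y = v - height / 2.0
    norm = math.sqrt(x * x + y * y + f * f)
    return (x / norm, y / norm, f / norm)

def second_geometric_function(direction, src_width, src_height):
    """Map an intermediate point to a reference point: fractional pixel
    coordinates in an (assumed) equirectangular wide field image."""
    x, y, z = direction
    lon = math.atan2(x, z)                    # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))   # latitude in [-pi/2, pi/2]
    px = (lon / math.pi + 1.0) / 2.0 * (src_width - 1)
    py = (lat / (math.pi / 2.0) + 1.0) / 2.0 * (src_height - 1)
    return (px, py)

# The center of the display space looks straight ahead (lon = lat = 0),
# so its reference point falls at the center of the source image.
center = second_geometric_function(
    first_geometric_function(320, 240, 640, 480, 90.0), 3840, 1920)
```

In this sketch the intermediate point is a direction in three-dimensional space and the reference point is a fractional pixel coordinate in the source image, matching the two-stage mapping described above.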
- The generated output video may include visual effects. Some visual effects may include a contrast filter, luminosity filter, saturation filter, color filter, and an image deformation filter to further enhance the visual experience.
- The output video stream may be displayed on a device with a touchscreen, accelerometer, gyroscope, magnetometer, pointing device, and a movement-detecting gestural interface.
- The physical processors may be configured to modify the field of view and orientation of a virtual video camera.
- The field of view may be modified by updating the coordinates of the intermediate points.
- The field of view may be further modified by converting the coordinates of the intermediate points in a three-dimensional coordinate system into coordinates in a spherical coordinate system.
- Other modifications may include modifying the orientation of the virtual camera and normalizing the coordinates in the spherical coordinate system to create new coordinates in a three-dimensional coordinate system.
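The conversion between the three-dimensional and spherical coordinate systems described above might be sketched as follows. The function names are hypothetical, and offsetting the spherical angles directly is a simplification of the orientation change the text describes (a full implementation would apply a rotation):

```python
import math

def to_spherical(x, y, z):
    """Convert a 3-D intermediate point to spherical (latitude, longitude)."""
    r = math.sqrt(x * x + y * y + z * z)
    lat = math.asin(y / r)
    lon = math.atan2(x, z)
    return lat, lon

def to_cartesian(lat, lon):
    """Convert (latitude, longitude) back to a normalized 3-D point."""
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def reorient(point, d_lat, d_lon):
    """Modify the orientation of the virtual camera by offsetting the
    spherical coordinates of an intermediate point, then normalizing
    back into the 3-D coordinate system."""
    lat, lon = to_spherical(*point)
    return to_cartesian(lat + d_lat, lon + d_lon)
```

Because `to_cartesian` produces unit-length points, the round trip also performs the normalization step mentioned above.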
- FIG. 1 illustrates a system configured for generating an output video stream from a selected wide field video stream, in accordance with one or more implementations.
- FIG. 2A illustrates a representation of a desired projection, in accordance with one or more implementations.
- FIG. 2B illustrates a representation of another desired projection, in accordance with one or more implementations.
- FIG. 3 illustrates a method configured for generating an output video stream with a first geometric function, in accordance with one or more implementations.
- FIG. 4 illustrates a method configured for generating an output video stream with a second geometric function, in accordance with one or more implementations.
- FIG. 5 illustrates an example device for managing an output video stream, in accordance with one or more implementations.
- FIG. 6 illustrates an example computing module that may be used to implement various features of the technology as disclosed herein, in accordance with one or more implementations.
- FIG. 1 illustrates an example system 100 that is configured for generating an output video stream from a selected wide field video stream.
- The system may include a central processing unit 107, or main processor, configured to execute specified instructions.
- Specified instructions executed by the central processing unit 107 may include selecting a wide field video stream 101.
- The wide field video stream 101 may correspond to a plurality of video streams pertaining to different fields of view of one particular environment.
- The plurality of video streams may be obtained by utilizing a plurality of video cameras oriented in different directions so as to generate a plurality of complementary films that are combined. As such, the plurality of video streams may capture a field of view exceeding the human field of view, resulting in a 180- to 360-degree viewpoint of the filmed environment.
- the wide field video stream 101 may exhibit a resolution in terms of pixels such that the resolution is greater than the resolution of the output video stream.
- the wide field video stream 101 may constitute a video file filmed from a single lens system.
- the wide field video stream 101 may also be a high definition video file.
- Other specified instructions executed by the control processing unit 107 may include selecting a display space 102 .
- a display space 102 may correspond to a display area of a two-dimensional screen.
- the display space 102 may include display points to display content from the wide field video stream 101 .
- the display points may be associated with a pixel on the screen of the display space 102 .
- Each point on the display space 102 may correspond to a pixel of the image of the output video stream.
- Other specified instructions that may be executed by the control processing unit 107 may further include selecting a desired projection 103 .
- the projection 103 may include selecting a portion of the wide field video stream 101 to be displayed in the determined display space 102 . More specifically, a projection 103 of the wide field video stream 101 includes modifying the current representation of the wide field video stream 101 . As illustrated in FIG. 2A , the wide field video stream is selected to depict a rectilinear projection. In FIG. 2B , the wide field video stream is further selected to depict a stereographic projection.
- Other projections that may be applied onto the wide field video stream may include a flat projection, mirror ball projection, or map projection by way of example only. Other various types of projections may be applied as would be appreciated by one of ordinary skill in the art. Such projections may create an immersive experience for the viewer.
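The rectilinear and stereographic projections named above differ in how an off-axis angle maps to a radial distance in the image. The standard formulas can be compared in a short sketch (f is an assumed focal length; this is background mathematics, not a limitation of the disclosure):

```python
import math

def rectilinear_radius(theta, f):
    """Rectilinear (gnomonic) projection: straight lines in the scene stay
    straight, but magnification grows rapidly toward the field edge."""
    return f * math.tan(theta)

def stereographic_radius(theta, f):
    """Stereographic projection: a gentler radial stretch that supports
    very wide, 'mirror ball' style fields of view."""
    return 2.0 * f * math.tan(theta / 2.0)
```

Because the rectilinear radius diverges as the angle approaches 90 degrees while the stereographic radius stays finite, projections of the second kind are what make the very wide, immersive views described here practical.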
- Determining the desired field of view 104 from the wide field video stream 101 allows a portion of the wide field video stream with the applied projection 103 to be viewed on a display space 102.
- Selecting the desired field of view 104 can be defined by selecting a particular field of view angle of a virtual camera viewing the wide field video stream 101 in accordance with a particular projection 103.
- The desired field of view 104 may further be determined by selecting a particular orientation of the virtual camera viewing the desired wide field video stream 101 in accordance with the selected projection 103.
- The graphics processor unit 107 may be incorporated into system 100 to generate an output video stream 106 by performing display-related calculation functions, such as a first geometric function 109 and a second geometric function 105.
- The calculation functions 109, 105 may convert the pixels of the output video stream and the pixels of the wide field video stream, respectively, so that the appropriate video content is displayed on the determined display space 102.
- FIG. 3 illustrates an exemplary method for presenting an output video stream 106 utilizing a first geometric function calculation.
- the operations of method 300 and other methods included are intended to be illustrative and non-limiting examples. In certain implementations, method 300 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 300 are illustrated in FIG. 3 and described below is not intended to be limiting.
- method 300 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, and/or other mechanisms for electronically processing information).
- the one or more processing devices may include one or more devices executing some or all of the operations of method 300 in response to instructions stored electronically on an electronic storage medium.
- the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 300 .
- a wide field video stream may be selected that corresponds to an assembly of a plurality of video streams or cameras oriented in different directions to generate a wide view point of a particular environment.
- a specified projection of content from the wide field video stream may be obtained and displayed on a determined display space.
- the display space may include multiple display points to display content from the wide field video stream.
- a first geometric function may be executed in order to determine each point of the display space so that an output video stream may be viewed on the display space.
- The first geometric function takes as input parameters the desired projection and the desired field of view of the wide field video stream. Executing the first geometric function according to these parameters causes it to output intermediate points.
- Intermediate points correspond to the location of the content of the wide field video stream to be displayed on a particular display space. As such, each intermediate point corresponds to the content to be displayed at a display point of the display space. In further embodiments, each intermediate point may correspond to each pixel of content of the wide field video stream displayed on the display space.
- The execution of the first geometric function provides a streamlined calculation for displaying the selected projection of the wide field video stream onto the display space.
- the wide field video stream may be decomposed into a plurality of images where each image represents a distinct image within the wide field video stream.
- Each of the images may be used in combination with the intermediate points to form a plurality of corresponding images of the output video stream.
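The streamlined calculation described above can be pictured as a lookup table built once per projection and reused for every decomposed image. In this sketch, `mapping` stands in for the composition of the geometric functions (returning integer source coordinates), and all names are hypothetical:

```python
def build_lookup_table(width, height, mapping):
    """Run the per-point geometric mapping once for every display point.
    While the projection and field of view stay fixed, this table can be
    reused for each image of the decomposed wide field video stream."""
    return [[mapping(u, v) for u in range(width)] for v in range(height)]

def render_frame(src_frame, table):
    """Form one image of the output video stream by sampling the source
    image at each precomputed reference point."""
    return [[src_frame[py][px] for (px, py) in row] for row in table]
```

Amortizing the geometric calculation this way is one plausible reason the first geometric function "may considerably limit the calculation time" of the output video stream.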
- FIG. 4 illustrates an exemplary method 400 for generating an output video stream.
- method 400 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, and/or other mechanisms for electronically processing information).
- the one or more processing devices may include one or more devices executing some or all of the operations of method 400 in response to instructions stored electronically on an electronic storage medium.
- the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 400 .
- the output video stream may be generated by determining a reference point through the calculation of a second geometric function.
- the second geometric function may include parameters of the corresponding intermediate points generated from the first geometric function.
- The execution of the second geometric function converts the coordinates of the intermediate points, which are a function of the desired projection of the wide field video stream. Converting the coordinates of the intermediate points allows pixels of the image of the output video stream to correspond to pixels of the image of the wide field video stream.
- the determination of the reference point corresponds to a pixel of an image of the wide field video stream to be displayed on a display space.
- The reference point may not correspond exactly to a particular pixel of the image from the wide field video stream. Instead, the reference point may correspond to a zone of the wide field video stream. A zone may constitute more than two pixels corresponding to the areas closest to the reference point.
- A new pixel may be generated by extracting and weighting a plurality of pixels from the image of the wide field video stream. The new pixel will depend on the pixels extracted from the corresponding wide field video stream.
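One plausible reading of this zone-based weighting is bilinear interpolation over the four source pixels nearest the reference point; the sketch below assumes a single-channel image and is not necessarily the exact weighting contemplated:

```python
import math

def sample_bilinear(image, px, py):
    """Generate a new output pixel from a fractional reference point by
    extracting the four nearest source pixels and weighting each by its
    proximity to the reference point."""
    x0, y0 = int(math.floor(px)), int(math.floor(py))
    x1 = min(x0 + 1, len(image[0]) - 1)
    y1 = min(y0 + 1, len(image) - 1)
    fx, fy = px - x0, py - y0
    top = image[y0][x0] * (1.0 - fx) + image[y0][x1] * fx
    bottom = image[y1][x0] * (1.0 - fx) + image[y1][x1] * fx
    return top * (1.0 - fy) + bottom * fy
```

A reference point landing exactly between four pixels thus yields their average, while a reference point landing on a pixel reproduces that pixel unchanged.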
- the output video stream may be generated from the established reference points from the second geometric function from operation 404 of method 400 and displayed on a display space.
- the display space may include at least one screen to display the generated output video stream.
- In other embodiments, a display screen may include a plurality of display spaces, each displaying an output video stream.
- the output video stream may be formed from the same wide field video stream according to various field of views or different projections.
- The display screen may include two separate display spaces that are intended to be viewed by the left eye and the right eye of the observer, respectively.
- A display method may allow displaying a first output stream in a display space of a first screen and displaying a second output stream in a display space of a second screen.
- The display screens may be viewed by an observer with an immersive headset, which may be known as a “VR headset” to one of ordinary skill in the art.
- The immersive headset may be further configured to be worn by the observer.
- A multimedia portable device with a display screen may be mounted onto the immersive headset to create a more dynamic and immersive experience for the user. Examples of a multimedia portable device may include a multimedia tablet or telephone with a camera device.
- displaying the output video stream on a selected display space may include modifying the field of view or projection during the course of broadcasting the output video stream.
- a new first geometric function is generated by specifying a new projection for the output video stream.
- the new projection will result in a new image of the output video stream according to the new first geometric function.
- operation 406 of method 400 may further include modifying the desired field of view during the course of broadcasting the output video stream.
- Modifying the desired field of view may include updating the coordinates of the intermediate points by modifying the orientation of the virtual camera.
- the orientation of the virtual camera may be located in a spherical coordinate system where further modification of the field of view angle may include modifying the latitude angle and longitude angle of the virtual camera in the spherical coordinate system.
- the coordinates of the intermediate points may be normalized in the three dimensional coordinate system to determine and apply the new coordinates of the intermediate points to the output video stream in real time.
- Further changes to the output video stream may further include applying an effect filter to the images of the output video stream.
- the effect filter may include a contrast filter, luminosity filter, saturation filter, color filter, and an image deformation filter by way of example only.
- the effect filters may modify the visual display of the output video stream to create a more varied and immersive experience for the user and observer.
- Other forms of effects to the output video stream may include incorporating a soundtrack to the wide field video stream to further enhance the immersive experience.
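A minimal sketch of how contrast and luminosity filters might act on a single 8-bit value, assuming the common pivot-around-mid-gray formulation (the disclosure does not specify the filter mathematics):

```python
def apply_effects(value, contrast=1.0, luminosity=0.0):
    """Apply contrast and luminosity adjustments to one 8-bit channel value.
    Contrast scales the distance from mid-gray (128); luminosity adds a
    flat offset; the result is clamped to the valid 0-255 range."""
    adjusted = (value - 128.0) * contrast + 128.0 + luminosity
    return max(0, min(255, round(adjusted)))
```

In practice such a filter would run per channel over every pixel of each output image, after the geometric mapping and before display.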
- FIG. 5 illustrates a device for managing an output video stream.
- The device may include a computer system 501 that includes a computer processor unit, graphics processor unit, and a memory containing instructions for implementing and generating an output video stream.
- the device 500 may include an interface 502 including at least one display screen 503 to display the output video stream.
- The display screen 503 may include a touchscreen so that the operator can interact with the device 500 to select and choose a wide field video stream to generate an output video stream.
- The device 500 may further include a storage space, along with an element for selecting a wide field video stream 505 from the computer system 501 of the device.
- the storage space can be incorporated within the device.
- the storage space may include a remote storage space, such as a remote server accessible via a communication network.
- the wide field video stream may be downloaded or broadcast in real time from the remote storage space.
- The device 500 may include various elements to create a virtual reality experience for the user or observer.
- the device may include a gyroscope to allow the user or observer equipped with an immersive headset coupled to the device 500 to view the output video stream in a computer simulated immersive multimedia form.
- Other elements may include an accelerometer or magnetometer to further enhance the virtual reality experience for the user or observer while viewing the output video stream.
- Various elements to further generate an output video stream on the device 500 may include an element to select various portions of the wide field video stream 509 , element for defining the desired field of view 506 , an element for defining a desired projection 507 , and an element for selecting an effect to be applied 508 .
- Other embodiments may include various elements to incorporate extra effects to the output video stream, such as an element for controlling the soundtrack 510 .
- Computing module 600 may represent, for example, computing or processing capabilities found within desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDAs, smart phones, cell phones, palmtops, smart-watches, smart-glasses, etc.); mainframes, supercomputers, workstations or servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment.
- Computing module 600 might also represent computing capabilities embedded within or otherwise available to a given device.
- a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, portable computing devices, modems, routers, WAPs, terminals and other electronic devices that might include some form of processing capability.
- Computing module 600 might include, for example, one or more processors, controllers, control modules, or other processing devices, such as a processor 604 .
- Processor 604 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic.
- processor 604 is connected to a bus 602 , although any communication medium can be used to facilitate interaction with other components of computing module 600 or to communicate externally.
- Computing module 600 might also include one or more memory modules, simply referred to herein as main memory 608 .
- Main memory 608, preferably random access memory (RAM) or other dynamic memory, might be used for storing information and instructions to be executed by processor 604.
- Main memory 608 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 604 .
- Computing module 600 might likewise include a read only memory (“ROM”) or other static storage device coupled to bus 602 for storing static information and instructions for processor 604 .
- the computing module 600 might also include one or more various forms of information storage mechanism 610 , which might include, for example, a media drive 612 and a storage unit interface 620 .
- the media drive 612 might include a drive or other mechanism to support fixed or removable storage media 614 .
- a hard disk drive, a solid state drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive might be provided.
- storage media 614 might include, for example, a hard disk, a solid state drive, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to or accessed by media drive 612 .
- the storage media 614 can include a computer usable storage medium having stored therein computer software or data.
- information storage mechanism 610 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 600 .
- Such instrumentalities might include, for example, a fixed or removable storage unit 622 and a storage interface 620 .
- storage units 622 and storage interfaces 620 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 622 and storage interfaces 620 that allow software and data to be transferred from the storage unit 622 to computing module 600 .
- Computing module 600 might also include a communications interface 624 .
- Communications interface 624 might be used to allow software and data to be transferred between computing module 600 and external devices.
- Examples of communications interface 624 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as for example, a USB port, IR port, RS232 port Bluetooth® interface, or other port), or other communications interface.
- Software and data transferred via communications interface 624 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 624 . These signals might be provided to communications interface 624 via a channel 628 .
- This channel 628 might carry signals and might be implemented using a wired or wireless communication medium.
- Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels.
- The terms “computer program medium” and “computer usable medium” are used to generally refer to transitory or non-transitory media such as, for example, memory 608, storage unit 620, media 614, and channel 628.
- These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution.
- Such instructions embodied on the medium are generally referred to as “computer program code” or a “computer program product” (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 600 to perform features or functions of the present application as discussed herein.
- The term “module” does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
Abstract
Description
- An apparatus and methods described herein generally relate to generating an output video stream from a wide field video stream.
- Videos cameras can be assembled in numerous directions with multiple angles and viewpoints so as to simultaneously film a particular environment. When assembling the various complementary films, the assembly may results in a wide field video stream with a field of view exceeding the normal human eye.
- While the wide field video stream provides the advantage of streaming or creating a video of a particular environment with a 180-degree to 360-degree viewpoint, a portion of the wide field video stream may be selected for viewing or editing purposes. However, extracting and displaying a portion of the wide field video stream onto a display screen may result in lag time and significantly lower video stream pixel resolution resulting in poor visual display.
- The disclosure herein relates to a generated output video stream from a selected wide field video stream. The generated output video stream is displayed on a two-dimensional display space so that an image of the output video stream correlates to the corresponding image of the wide field video stream. The objective the output video stream may include utilizing a first geometric function to determine a particular portion of the wide field video stream to be displayed onto a particular display space. The application of the first geometric function may considerably limit the calculation time of the output video stream so that the stream is achieved in real time without loss of quality.
- A method and system configured to generate an output video stream from a selected wide field video stream may include one or more physical processors configured by a machine readable instruction to select a wide field video stream. The wide field video stream may correspond to an assembly of multiple video streams with various different field of views of the same environment.
- The wide field video stream may be selected with a portion of the video stream to be viewed according to a particular projection. The projection may include a modified view of a wide field video stream by distorting the angle view and orientation of the wide field video stream to create a new immersive experience. The determined projection of the wide field video stream may then be transmitted onto a display space. The display space may include at least one screen. In other embodiments, the display space may include a plurality of display screens that correspond to different field of view and projections of the wide field video stream.
- The physical processors may be configured to generate intermediate points by executing a first geometric function. In particular, the input parameters of the first geometric function include the specified projection and a specific field of view from a camera lens of the wide field video stream. The generated individual intermediate points may correspond to locations on the wide field video stream such that the intermediate points contain the location of each point on the image of the wide field video stream. In further embodiments, the intermediate points may correspond to at least one pixel of the wide field video stream to be displayed on the selected display space.
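As an illustration of the first geometric function, the sketch below maps each display point to an intermediate point on the unit sphere. The function name, the rectilinear camera model, and the horizontal field-of-view parameter are assumptions made for this example; the disclosure does not prescribe a particular formula.

```python
import math

def first_geometric_function(width, height, fov_deg):
    """For each display point (i, j), compute an intermediate point:
    a unit 3D direction, here under a rectilinear (pinhole) model."""
    # Focal length in pixels for the requested horizontal field of view.
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    points = {}
    for j in range(height):
        for i in range(width):
            x = i - width / 2.0   # offset from the optical axis
            y = j - height / 2.0
            z = f
            n = math.sqrt(x * x + y * y + z * z)
            points[(i, j)] = (x / n, y / n, z / n)
    return points
```

Each display point thus receives exactly one intermediate point, matching the one-to-one correspondence described above.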
- The physical processors may also be configured to generate reference points by executing a second geometric function. In particular, the second geometric function takes as input parameters the intermediate points from the first geometric function. The generated reference points may associate at least one pixel of the wide field video stream with at least one pixel of the output video stream.
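A companion sketch of the second geometric function, converting an intermediate point into a reference point in the wide field image. The equirectangular source layout and the function name are assumptions for illustration only.

```python
import math

def second_geometric_function(point, src_width, src_height):
    """Map an intermediate point (unit 3D direction) to a reference
    point: fractional pixel coordinates in the wide field image,
    assuming an equirectangular source layout."""
    x, y, z = point
    lon = math.atan2(x, z)                    # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))   # latitude in [-pi/2, pi/2]
    u = (lon / (2.0 * math.pi) + 0.5) * (src_width - 1)
    v = (lat / math.pi + 0.5) * (src_height - 1)
    return u, v
```

The returned coordinates are fractional, which is why a reference point may fall between pixels and require the zone-based weighting discussed later.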
- The generated output video stream may include visual effects. Some visual effects may include a contrast filter, luminosity filter, saturation filter, color filter, and an image deformation filter to further enhance the visual experience. In other embodiments, the output video stream may be displayed on a device with a touchscreen, accelerometer, gyroscope, magnetometer, pointing device, and a motion-detecting gestural interface.
- The physical processors may be configured to modify the field of view and orientation of a virtual video camera. In one embodiment, the field of view may be modified by updating the coordinates of the intermediate points. The field of view may be further modified by converting the coordinates of the intermediate points from a three-dimensional coordinate system into coordinates in a spherical coordinate system. Other modifications may include modifying the orientation of the virtual camera and normalizing the coordinates in the spherical coordinate system to create new coordinates in a three-dimensional coordinate system.
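The conversion between three-dimensional and spherical coordinates described above might look like the following sketch; the function names and angle conventions (latitude from the equator, longitude about the vertical axis) are illustrative assumptions.

```python
import math

def to_spherical(x, y, z):
    """3D intermediate point -> (radius, latitude, longitude)."""
    r = math.sqrt(x * x + y * y + z * z)
    return r, math.asin(y / r), math.atan2(x, z)

def to_cartesian(r, lat, lon):
    """Spherical -> 3D point normalized onto the unit sphere; the
    radius only scales the vector and is dropped by normalization."""
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))
```

A round trip through both functions returns the original unit direction, which is what lets the orientation be edited in spherical coordinates and then applied back to the intermediate points.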
- These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related components of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of any limits. As used in the specification and in the claims, the singular forms of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
-
FIG. 1 illustrates a system configured for generating an output video stream from a selected wide field video stream, in accordance with one or more implementations. -
FIG. 2A illustrates a representation of a desired projection, in accordance with one or more implementations. -
FIG. 2B illustrates a representation of a desired projection, in accordance with one or more implementations. -
FIG. 3 illustrates a method configured for generating an output video stream with a first geometric function, in accordance with one or more implementations. -
FIG. 4 illustrates a method configured for generating an output video stream with a second geometric function, in accordance with one or more implementations. -
FIG. 5 illustrates an example device for managing an output video stream, in accordance with one or more implementations. -
FIG. 6 illustrates an example computing module that may be used to implement various features of the technology as disclosed herein, in accordance with one or more implementations. -
FIG. 1 illustrates an example system 100 that is configured for generating an output video stream from a selected wide field video stream. In some implementations, as illustrated in FIG. 1, the system may include a central processing unit 107 so that specified instructions can be executed by a main processor, or the central processing unit 107. Specified instructions executed by the central processing unit 107 may include selecting a wide field video stream 101. The wide field video stream 101 may correspond to a plurality of video streams pertaining to different fields of view of one particular environment. In other instances, the plurality of video streams may be obtained by utilizing a plurality of video cameras oriented in different directions so as to generate a plurality of complementary films that are combined. As such, the plurality of video streams may capture a field of view exceeding the human field of view, resulting in a 180 to 360 degree viewpoint of the filmed environment. - The wide
field video stream 101 may exhibit a pixel resolution greater than the resolution of the output video stream. The wide field video stream 101 may constitute a video file filmed from a single lens system. The wide field video stream 101 may also be a high definition video file. - Other specified instructions executed by the
central processing unit 107 may include selecting a display space 102. A display space 102 may correspond to a display area of a two-dimensional screen. The display space 102 may include display points to display content from the wide field video stream 101. In other embodiments, the display points may be associated with a pixel on the screen of the display space 102. Each point on the display space 102 may correspond to a pixel of the image of the output video stream. - Other specified instructions that may be executed by the
central processing unit 107 may further include selecting a desired projection 103. The projection 103 may include selecting a portion of the wide field video stream 101 to be displayed in the determined display space 102. More specifically, a projection 103 of the wide field video stream 101 includes modifying the current representation of the wide field video stream 101. As illustrated in FIG. 2A, the wide field video stream is selected to depict a rectilinear projection. In FIG. 2B, the wide field video stream is further selected to depict a stereographic projection. Other projections that may be applied to the wide field video stream may include a flat projection, mirror ball projection, or map projection, by way of example only. Other various types of projections may be applied, as would be appreciated by one of ordinary skill in the art. Such projections may create an immersive experience for the viewer. - Determining the desired field of
view 104 from the wide field video stream 101 allows a portion of the wide field video stream with the applied projection 103 to be viewed on a display space 102. Selecting the desired field of view 104 can be defined by selecting a particular field of view angle of a virtual camera viewing the wide field video stream 101 in accordance with a particular projection 103. In other embodiments, the desired field of view 104 may further be determined by selecting a particular orientation of the virtual camera viewing the desired wide field video stream 101 in accordance with the selected projection 103. - The
graphics processor unit 107 may be incorporated into system 100 to generate an output video stream 106 by performing display-related calculation functions, such as a first geometric function 109 and a second geometric function 105. The calculated functions 109, 105 may convert the pixels of the output video stream and the pixels of the wide field video stream, respectively, so that the appropriate video content is displayed on the determined display space 102. -
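The rectilinear and stereographic projections of FIGS. 2A and 2B differ in where a ray at angle theta from the optical axis lands on the image plane. The sketch below uses the standard radial formulas for these two models; they are common textbook forms, not text taken from the disclosure.

```python
import math

def projected_radius(theta, focal, projection):
    """Distance from the image centre at which a ray, at angle theta
    (radians) from the optical axis, lands under two projection models."""
    if projection == "rectilinear":      # FIG. 2A style: straight lines preserved
        return focal * math.tan(theta)
    if projection == "stereographic":    # FIG. 2B style: wider usable angle
        return 2.0 * focal * math.tan(theta / 2.0)
    raise ValueError("unknown projection: " + projection)
```

The stereographic radius grows much more slowly near 90 degrees, which is why it can depict a wider portion of the wide field stream without the extreme stretching of the rectilinear model.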
FIG. 3 illustrates an exemplary method for presenting an output video stream 106 utilizing a first geometric function calculation. The operations of method 300 and other methods included are intended to be illustrative and non-limiting examples. In certain implementations, method 300 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 300 are illustrated in FIG. 3 and described below is not intended to be limiting. - In certain implementations,
method 300 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 300 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 300. - Regarding
method 300, at operation 302, a wide field video stream may be selected that corresponds to an assembly of a plurality of video streams or cameras oriented in different directions to generate a wide viewpoint of a particular environment. - At
operation 304, a specified projection of content from the wide field video stream may be obtained and displayed on a determined display space. The display space may include multiple display points to display content from the wide field video stream. - At
operation 306, a first geometric function may be executed in order to determine each point of the display space so that an output video stream may be viewed on the display space. The input parameters of the first geometric function include the desired projection and the desired field of view of the wide field video stream. Executing the first geometric function according to these parameters causes it to output intermediate points.
- Intermediate points correspond to the location of the content of the wide field video stream to be displayed on a particular display space. As such, each intermediate point corresponds to the content to be displayed on the display point of the display space. In further embodiments, each intermediate point may correspond to each pixel of content of the wide field video stream displayed on the display space.
- As such, the execution of the first geometric function provides a streamlined calculation for displaying the selected projection of the wide field video stream onto the display space. The wide field video stream may be decomposed into a plurality of images, where each image represents a distinct image within the wide field video stream. Each of the images may be used in combination with the intermediate points to form a plurality of corresponding images of the output video stream.
-
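The decomposition just described, one output image formed per wide field image via the intermediate points, can be sketched end to end. The rectilinear first function, equirectangular source layout, and nearest-pixel sampling below are illustrative assumptions, not requirements of the disclosure.

```python
import math

def render_frame(src, out_w, out_h, fov_deg):
    """Form one output image from one wide field image: display point
    -> intermediate point (first function, rectilinear model) ->
    reference point (second function, equirectangular layout) ->
    nearest source pixel."""
    src_h, src_w = len(src), len(src[0])
    f = (out_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    out = []
    for j in range(out_h):
        row = []
        for i in range(out_w):
            # First geometric function: display point to unit direction.
            x, y, z = i - out_w / 2.0, j - out_h / 2.0, f
            n = math.sqrt(x * x + y * y + z * z)
            # Second geometric function: direction to source coordinates.
            lon = math.atan2(x / n, z / n)
            lat = math.asin(y / n)
            u = int(round((lon / (2 * math.pi) + 0.5) * (src_w - 1)))
            v = int(round((lat / math.pi + 0.5) * (src_h - 1)))
            row.append(src[v][u])
        out.append(row)
    return out
```

Because the intermediate points depend only on the display size, projection, and field of view, they can be computed once and reused for every image of the stream, which is the source of the real-time behavior described above.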
FIG. 4 illustrates an exemplary method 400 for generating an output video stream. In certain implementations, method 400 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 400 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 400. - Regarding
method 400, at an operation 402, the output video stream may be generated by determining a reference point through the calculation of a second geometric function. In determining the reference point, the second geometric function may take as parameters the corresponding intermediate points generated by the first geometric function. In other words, the execution of the second geometric function converts the coordinates of the intermediate points, which are a function of the desired projection of the wide field video stream. The conversion of the coordinates of the intermediate points allows pixels of the image of the output video stream to correspond to pixels of the image of the wide field video stream. - At
operation 404 of method 400, the determined reference point corresponds to a pixel of an image of the wide field video stream to be displayed on a display space. In some instances, the reference point may not correspond exactly to a particular pixel of the image from the wide field video stream. Instead, the reference point may correspond to a zone of the wide field video stream. A zone may constitute more than two pixels that correspond to the areas closest to the reference point. In the instance that the reference point corresponds to a zone within the wide field video stream, a new pixel may be generated by extracting and weighting a plurality of pixels from the image of the wide field video stream. The new pixel will be dependent on the extraction of pixels originating from the corresponding wide field video stream. - At
operation 406 of method 400, the output video stream may be generated from the reference points established by the second geometric function at operation 404 of method 400 and displayed on a display space. The display space may include at least one screen to display the generated output video stream. In other instances, the display space may include a display screen where each screen includes a plurality of display spaces, each displaying an output video stream. The output video streams may be formed from the same wide field video stream according to various fields of view or different projections.
- In other embodiments, the display screen may include two separate display spaces that are intended to be viewed by the left eye and the right eye of the observer, respectively. Alternatively, such a display method may allow displaying a first output stream in a display space of a first screen and a second output stream in a display space of a second screen. In further embodiments, the display screens may be viewed by an observer through an immersive headset, which may be known as a “VR headset” to one of ordinary skill in the art. The immersive headset may be further configured to be worn by the observer. In other embodiments, a multimedia portable device with a display screen may be mounted onto the immersive headset to create a more dynamic and immersive experience for the user. Examples of a multimedia portable device may include a multimedia tablet or telephone with a camera device. - Further regarding
operation 406 of method 400, displaying the output video stream on a selected display space may include modifying the field of view or projection during the course of broadcasting the output video stream. In the particular instance that the projection is modified, a new first geometric function is generated by specifying a new projection for the output video stream. The new projection will result in a new image of the output video stream according to the new first geometric function. - In other embodiments,
operation 406 of method 400 may further include modifying the desired field of view during the course of broadcasting the output video stream. Modifying the desired field of view may include updating the coordinates of the intermediate points by modifying the orientation of the virtual camera. The orientation of the virtual camera may be located in a spherical coordinate system, where further modification of the field of view angle may include modifying the latitude angle and longitude angle of the virtual camera in the spherical coordinate system. The coordinates of the intermediate points may be normalized in the three-dimensional coordinate system to determine and apply the new coordinates of the intermediate points to the output video stream in real time.
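The orientation update just described can be sketched as follows. The function name and the simple latitude/longitude offset model are illustrative assumptions; a full implementation might use a rotation matrix instead.

```python
import math

def reorient(point, d_lat, d_lon):
    """Update an intermediate point for a new virtual camera orientation:
    convert to spherical coordinates, offset the latitude and longitude
    angles, and convert back to a normalized 3D point."""
    x, y, z = point
    lat = math.asin(y) + d_lat
    lon = math.atan2(x, z) + d_lon
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))
```

Because only the intermediate points are updated, the projection itself is untouched, so the reoriented view is produced without recomputing the first geometric function from scratch.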
-
FIG. 5 illustrates a device for managing an output video stream. According to one implementation, the device may include a computer system 501 that includes a computer processor unit, a graphics processor unit, and a memory containing instructions for implementing and generating an output video stream. The device 500 may include an interface 502 including at least one display screen 503 to display the output video stream. The display screen 503 may include a touchscreen so that the operator can interact with the device 500 to select and choose a wide field video stream from which to generate an output video stream. The device 500 may further include a storage space for a wide field video stream, along with an element for selecting a wide field video stream 505 from the computer system 501 of the device. The storage space can be incorporated within the device. In other aspects, by way of example only, the storage space may include a remote storage space, such as a remote server accessible via a communication network. The wide field video stream may be downloaded or broadcast in real time from the remote storage space. - The
device 500 may include various elements to create a virtual reality experience for the user or observer. The device may include a gyroscope to allow the user or observer equipped with an immersive headset coupled to the device 500 to view the output video stream in a computer-simulated immersive multimedia form. Other elements may include an accelerometer or magnetometer to further enhance the virtual reality experience for the user or observer while viewing the output video stream. - Various elements to further generate an output video stream on the
device 500 may include an element to select various portions of the wide field video stream 509, an element for defining the desired field of view 506, an element for defining a desired projection 507, and an element for selecting an effect to be applied 508. Other embodiments may include various elements to incorporate extra effects into the output video stream, such as an element for controlling the soundtrack 510. - Referring now to
FIG. 6, computing module 600 may represent, for example, computing or processing capabilities found within desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDAs, smart phones, cell phones, palmtops, smart watches, smart glasses, etc.); mainframes, supercomputers, workstations or servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment. Computing module 600 might also represent computing capabilities embedded within or otherwise available to a given device. For example, a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, portable computing devices, modems, routers, WAPs, terminals and other electronic devices that might include some form of processing capability. -
Computing module 600 might include, for example, one or more processors, controllers, control modules, or other processing devices, such as a processor 604. Processor 604 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic. In the illustrated example, processor 604 is connected to a bus 602, although any communication medium can be used to facilitate interaction with other components of computing module 600 or to communicate externally. -
Computing module 600 might also include one or more memory modules, simply referred to herein as main memory 608. For example, random access memory (RAM) or other dynamic memory might be used for storing information and instructions to be executed by processor 604. Main memory 608 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 604. Computing module 600 might likewise include a read only memory (“ROM”) or other static storage device coupled to bus 602 for storing static information and instructions for processor 604. - The
computing module 600 might also include one or more various forms of information storage mechanism 610, which might include, for example, a media drive 612 and a storage unit interface 620. The media drive 612 might include a drive or other mechanism to support fixed or removable storage media 614. For example, a hard disk drive, a solid state drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive might be provided. Accordingly, storage media 614 might include, for example, a hard disk, a solid state drive, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to or accessed by media drive 612. As these examples illustrate, the storage media 614 can include a computer usable storage medium having stored therein computer software or data. - In alternative embodiments,
information storage mechanism 610 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 600. Such instrumentalities might include, for example, a fixed or removable storage unit 622 and a storage interface 620. Examples of such storage units 622 and storage interfaces 620 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 622 and storage interfaces 620 that allow software and data to be transferred from the storage unit 622 to computing module 600. -
Computing module 600 might also include a communications interface 624. Communications interface 624 might be used to allow software and data to be transferred between computing module 600 and external devices. Examples of communications interface 624 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as, for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface. Software and data transferred via communications interface 624 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 624. These signals might be provided to communications interface 624 via a channel 628. This channel 628 might carry signals and might be implemented using a wired or wireless communication medium. Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels. - In this document, the terms “computer program medium” and “computer usable medium” are used to generally refer to transitory or non-transitory media such as, for example,
memory 608, storage unit 620, media 614, and channel 628. These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution. Such instructions embodied on the medium are generally referred to as “computer program code” or a “computer program product” (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 600 to perform features or functions of the present application as discussed herein. - The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term “module” does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
- Additionally, the various embodiments set forth herein are described in terms of exemplary block diagrams, flow charts and other illustrations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated examples. For example, block diagrams and their accompanying description should not be construed as mandating a particular architecture or configuration.
- While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not of limitation. Likewise, the various diagrams may depict an example architectural or other configuration for the disclosure, which is done to aid in understanding the features and functionality that can be included in the disclosure. The disclosure is not restricted to the illustrated example architectures or configurations, but the desired features can be implemented using a variety of alternative architectures and configurations. Indeed, it will be apparent to one of skill in the art how alternative functional, logical or physical partitioning and configurations can be implemented to implement the desired features of the present disclosure. Also, a multitude of different constituent module names other than those depicted herein can be applied to the various partitions. Additionally, with regard to flow diagrams, operational descriptions and method claims, the order in which the steps are presented herein shall not mandate that various embodiments be implemented to perform the recited functionality in the same order unless the context dictates otherwise.
- Although the disclosure is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the disclosure, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments.
Claims (18)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1353565A FR3004881B1 (en) | 2013-04-19 | 2013-04-19 | METHOD FOR GENERATING AN OUTPUT VIDEO STREAM FROM A WIDE FIELD VIDEO STREAM |
FR1353565 | 2013-04-19 | ||
PCT/EP2014/058008 WO2014170482A1 (en) | 2013-04-19 | 2014-04-18 | Method for generating an output video stream from a wide-field video stream |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2014/058008 Continuation WO2014170482A1 (en) | 2013-04-19 | 2014-04-18 | Method for generating an output video stream from a wide-field video stream |
Publications (3)
Publication Number | Publication Date |
---|---|
US20160112635A1 US20160112635A1 (en) | 2016-04-21 |
US20180255241A9 true US20180255241A9 (en) | 2018-09-06 |
US10129470B2 US10129470B2 (en) | 2018-11-13 |
Family
ID=48782410
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/887,122 Active 2035-06-06 US10129470B2 (en) | 2013-04-19 | 2015-10-19 | Apparatus and method for generating an output video stream from a wide field video stream |
Country Status (4)
Country | Link |
---|---|
US (1) | US10129470B2 (en) |
EP (1) | EP2987319A1 (en) |
FR (1) | FR3004881B1 (en) |
WO (1) | WO2014170482A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2578227B (en) * | 2016-05-23 | 2021-09-15 | Canon Kk | Method, device, and computer program for adaptive streaming of virtual reality media content |
EP3249928A1 (en) * | 2016-05-23 | 2017-11-29 | Thomson Licensing | Method, apparatus and stream of formatting an immersive video for legacy and immersive rendering devices |
US20180307352A1 (en) * | 2017-04-25 | 2018-10-25 | Gopro, Inc. | Systems and methods for generating custom views of videos |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5260779A (en) | 1992-02-21 | 1993-11-09 | Control Automation, Inc. | Method and apparatus for inspecting a printed circuit board |
DE69324224T2 (en) * | 1992-12-29 | 1999-10-28 | Koninkl Philips Electronics Nv | Image processing method and apparatus for generating an image from a plurality of adjacent images |
FR2700938B1 (en) | 1993-01-29 | 1995-04-28 | Centre Nat Rech Scient | Method and device for analyzing the movement of the eye. |
DE69411849T2 (en) * | 1993-10-20 | 1999-03-04 | Philips Electronics Nv | Process for processing luminance levels in a composite image and image processing system using this process |
FR2714503A1 (en) * | 1993-12-29 | 1995-06-30 | Philips Laboratoire Electroniq | Image processing method and device for constructing from a source image a target image with change of perspective. |
US6578962B1 (en) | 2001-04-27 | 2003-06-17 | International Business Machines Corporation | Calibration-free eye gaze tracking |
US7057663B1 (en) | 2001-05-17 | 2006-06-06 | Be Here Corporation | Audio synchronization pulse for multi-camera capture systems |
US7336299B2 (en) * | 2003-07-03 | 2008-02-26 | Physical Optics Corporation | Panoramic video system with real-time distortion-free imaging |
CN101833968B (en) | 2003-10-10 | 2012-06-27 | 夏普株式会社 | Content reproduction device and method |
US8723951B2 (en) * | 2005-11-23 | 2014-05-13 | Grandeye, Ltd. | Interactive wide-angle video server |
US8572382B2 (en) | 2006-05-15 | 2013-10-29 | Telecom Italia S.P.A. | Out-of band authentication method and system for communication over a data network |
US7661121B2 (en) | 2006-06-22 | 2010-02-09 | Tivo, Inc. | In-band data recognition and synchronization system |
KR101574339B1 (en) | 2008-04-28 | 2015-12-03 | 엘지전자 주식회사 | Method and apparatus for synchronizing a data between a mobile communication terminal and a TV |
JP5927795B2 (en) | 2011-07-22 | 2016-06-01 | ソニー株式会社 | Stereoscopic imaging system, recording control method, stereoscopic video playback system, and playback control method |
US20130127984A1 (en) * | 2011-11-11 | 2013-05-23 | Tudor Alexandru GRECU | System and Method for Fast Tracking and Visualisation of Video and Augmenting Content for Mobile Devices |
KR102141114B1 (en) | 2013-07-31 | 2020-08-04 | 삼성전자주식회사 | Method and appratus of time synchornization for device-to-device communications |
US20150142742A1 (en) | 2013-11-17 | 2015-05-21 | Zhen-Chao HONG | System and method for syncing local directories that enable file access across multiple devices |
US9473576B2 (en) | 2014-04-07 | 2016-10-18 | Palo Alto Research Center Incorporated | Service discovery using collection synchronization with exact names |
US9754002B2 (en) | 2014-10-07 | 2017-09-05 | Excalibur Ip, Llc | Method and system for providing a synchronization service |
- 2013-04-19 FR FR1353565A patent/FR3004881B1/en active Active
- 2014-04-18 WO PCT/EP2014/058008 patent/WO2014170482A1/en active Application Filing
- 2014-04-18 EP EP14719707.3A patent/EP2987319A1/en not_active Ceased
- 2015-10-19 US US14/887,122 patent/US10129470B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
FR3004881A1 (en) | 2014-10-24 |
FR3004881B1 (en) | 2015-04-17 |
US20160112635A1 (en) | 2016-04-21 |
US10129470B2 (en) | 2018-11-13 |
WO2014170482A1 (en) | 2014-10-23 |
EP2987319A1 (en) | 2016-02-24 |
Similar Documents
Publication | Title |
---|---|
US11854149B2 (en) | Techniques for capturing and displaying partial motion in virtual or augmented reality scenes |
CN107636534B (en) | Method and system for image processing |
US20200288105A1 (en) | Selective culling of multi-dimensional data sets |
US10110871B2 (en) | Recording high fidelity digital immersive experiences through off-device computation |
US10499035B2 (en) | Method and system of displaying a popping-screen |
US10129462B2 (en) | Camera augmented reality based activity history tracking |
US10986330B2 (en) | Method and system for 360 degree head-mounted display monitoring between software program modules using video or image texture sharing |
US10235795B2 (en) | Methods of compressing a texture image and image data processing system and methods of generating a 360 degree panoramic video thereof |
US10049490B2 (en) | Generating virtual shadows for displayable elements |
CN109478344A (en) | Method and apparatus for composograph |
US11044398B2 (en) | Panoramic light field capture, processing, and display |
US10129470B2 (en) | Apparatus and method for generating an output video stream from a wide field video stream |
CN114782648A (en) | Image processing method, image processing device, electronic equipment and storage medium |
CN111818265B (en) | Interaction method and device based on augmented reality model, electronic equipment and medium |
US11100617B2 (en) | Deep learning method and apparatus for automatic upright rectification of virtual reality content |
CN111833459A (en) | Image processing method and device, electronic equipment and storage medium |
US20230103814A1 (en) | Image Processing Systems and Methods |
US11770551B2 (en) | Object pose estimation and tracking using machine learning |
CN114900621A (en) | Special effect video determination method and device, electronic equipment and storage medium |
CN112825198B (en) | Mobile tag display method, device, terminal equipment and readable storage medium |
US11948257B2 (en) | Systems and methods for augmented reality video generation |
WO2021160071A1 (en) | Feature spatial distribution management for simultaneous localization and mapping |
CN116208725A (en) | Video processing method, electronic device and storage medium |
CN117459745A (en) | Information interaction method, device, electronic equipment and storage medium |
US20160125910A1 (en) | Networked Divided Electronic Video Messaging System and Method |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: GOPRO, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: JENNY, ALEXANDRE; GILQUIN, YANN RENAUD; REEL/FRAME: 039791/0084. Effective date: 20160824 |
AS | Assignment | Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK. Free format text: SECURITY INTEREST; ASSIGNOR: GOPRO, INC.; REEL/FRAME: 041777/0440. Effective date: 20170123 |
FEPP | Fee payment procedure | Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
AS | Assignment | Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK. Free format text: SECURITY INTEREST; ASSIGNOR: GOPRO, INC.; REEL/FRAME: 048508/0728. Effective date: 20190304 |
AS | Assignment | Owner name: GOPRO, INC., CALIFORNIA. Free format text: RELEASE OF PATENT SECURITY INTEREST; ASSIGNOR: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT; REEL/FRAME: 055106/0434. Effective date: 20210122 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |