EP3821611A2 - Method for viewing graphical elements arising from an encoded composite video stream - Google Patents
Method for viewing graphical elements arising from an encoded composite video stream
- Publication number
- EP3821611A2 (application EP19758432.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- images
- composite
- channel
- video stream
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/12—Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/182—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234345—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43074—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
Definitions
- the invention relates to the field of methods and devices for viewing graphic elements from an encoded composite video stream.
- the invention relates to a method for viewing graphic elements from an encoded composite video stream.
- the purpose of the first decoded composite stream is to present a control interface to a user.
- the purpose of the second decoded stream is to present the user with a video stream selected from the composite video streams.
- the selected video stream is selected by the user from the control interface.
- a disadvantage of this method is that it requires a step of duplicating the first encoded composite video stream and two distinct decoding steps, one for each of the encoded composite video streams.
- the first part of the image includes a relative image from one of the video streams.
- a second part of the image can be generated from another of the video streams.
- the additional image part can be a thumbnail resulting from processing carried out on a captured image, said captured image coming from another part of the current image and relating to another of the video streams.
- this method has the disadvantage of being costly in memory and in computation time, because it requires carrying out a capture, that is to say an image generation, from each of the video streams needed to generate the various thumbnails, as well as a scaling of the generated image.
- the publication therefore provides for repeating the generation of thumbnails at a relatively low frequency.
- An object of the invention is in particular to remedy all or part of the aforementioned drawbacks.
- An idea at the basis of the invention is to entrust the generation of any part of the image, whether a thumbnail or, more generally, an image coming from an encoded composite video stream, to a graphics card (GPU, graphics processing unit), which has the effect of speeding up the calculations.
- a method for viewing graphic elements originating from an encoded composite video stream constituted by a succession of composite images.
- Each of the composite images ICt is formed from a juxtaposition of elementary images IEt,i.
- By "juxtaposition", the present description designates both two images arranged side by side and two possibly spaced images.
- the encoded composite video stream complies, for example, with the H.264 standard.
- the composite video stream is associated with a data structure, which describes:
- the drawing parameters can include details of position, size, depth and opacity.
- the drawing parameters can be objects in the JSON data format (JavaScript Object Notation).
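- As an illustration only (the patent gives no concrete example), a drawing-parameters object in JSON might look like the following; all field names are hypothetical:

```javascript
// Hypothetical drawing parameters for one elementary image, as a JSON-style
// object. The field names are illustrative, not taken from the patent.
const drawingParams = {
  position: { x: 0, y: 0 },          // top-left corner in the composition zone
  size: { width: 640, height: 360 }, // displayed size in pixels
  depth: 0,                          // stacking order when parts overlap
  opacity: 1.0                       // 0 = fully transparent, 1 = opaque
};

// Such an object round-trips through the JSON data format:
const serialized = JSON.stringify(drawingParams);
const restored = JSON.parse(serialized);
```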
- the process includes:
- a decoding step, initiated by a web browser, of the encoded composite video stream, for recording in a memory of a graphics card in the form of a texture object corresponding to said original composite images,
- a step of constructing a final rendering image by processing by said graphics card, comprising the construction of an array of pixels as a function of said set of drawing parameters, of the pixels of said elementary images, of the coordinates of each of said elementary images and of said personalization parameters,
- a step of displaying the succession of final rendering images in a user interface of the interactive video player type.
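- The construction step can be pictured with the following CPU-side sketch (in the invention this processing runs on the graphics card; the function and parameter names here are illustrative): pixels of one elementary image are read out of the composite image at the coordinates given by the data structure, and an opacity drawing parameter is applied.

```javascript
// Sketch only: build a final rendering pixel array from a composite image.
// compositePixels is a flat (grayscale) pixel array of the composite image,
// coords locates one elementary image inside it, and params carries a
// drawing parameter (opacity). All names are illustrative; the patent
// delegates the real construction to the GPU.
function buildRendering(compositePixels, compositeWidth, coords, params) {
  const out = new Array(coords.width * coords.height);
  for (let y = 0; y < coords.height; y++) {
    for (let x = 0; x < coords.width; x++) {
      // Index of the source pixel inside the composite image.
      const src = (coords.y + y) * compositeWidth + (coords.x + x);
      out[y * coords.width + x] = compositePixels[src] * params.opacity;
    }
  }
  return out;
}

// A 4x2 composite image holding two 2x2 elementary images side by side:
const composite = [1, 2, 10, 20,
                   3, 4, 30, 40];
const right = buildRendering(composite, 4,
                             { x: 2, y: 0, width: 2, height: 2 },
                             { opacity: 0.5 });
// right is [5, 10, 15, 20]
```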
- a graphics card allows the generation of thumbnails to be repeated at a frequency equal to that of the main image displayed.
- a graphics processor generally has a highly parallel structure which makes it efficient for a wide range of graphics tasks such as 3D rendering, Direct3D, OpenGL, video memory management, video signal processing, Mpeg decompression, etc.
- the method may not include an additional image-capture step within the meaning of patent application WO2013150250. Also, the processing time and the memory space used increase less rapidly with the number of elementary images.
- By "texture object", this description designates in particular an object, within the meaning of the WebGL application programming interface, which contains an image.
- the final rendering image is an object of type Framebuffer.
- the personalization parameters can be modified during a selection made by a computer mouse on a part of the user interface, or even via a script, or by means of a pointer on the elementary image.
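- One way a mouse selection could update the personalization parameters (a sketch only; the patent does not specify this mapping, and all names are hypothetical) is to hit-test the click position against the coordinates of each elementary image from the data structure:

```javascript
// Sketch: find which elementary image a click falls on, using per-image
// coordinates such as those carried by the data structure. Names are
// illustrative, not taken from the patent.
function hitTest(click, elementaryCoords) {
  for (const [id, c] of Object.entries(elementaryCoords)) {
    if (click.x >= c.x && click.x < c.x + c.width &&
        click.y >= c.y && click.y < c.y + c.height) {
      return id;  // identifier of the selected elementary image
    }
  }
  return null;    // click landed outside every elementary image
}

const coords = {
  IE1: { x: 0,   y: 0, width: 320, height: 180 },
  IE2: { x: 320, y: 0, width: 320, height: 180 }
};
// hitTest({ x: 400, y: 50 }, coords) === "IE2"
```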
- a device for viewing graphic elements originating from an encoded composite video stream consisting of a succession of composite images, each of said composite images being formed of a juxtaposition of elementary images, said composite video stream being associated with a data structure describing:
- the device comprises:
- a means configured to decode said composite video stream and save it, in a memory of a graphics card, in the form of a texture object corresponding to said original composite images,
- a means of constructing a final rendering image by processing by said graphics card, consisting in constructing an array of pixels as a function of said set of drawing parameters, the pixels of said elementary images, the coordinates of each of said elementary images and said personalization parameters,
- a graphic display system comprising:
- an encoded composite video stream containing a plurality of channels, comprising at least a first channel and a second channel;
- a graphics processing unit and a memory coupled to the graphics processing unit, the memory storing instructions which, during execution, cause the graphics processing unit to:
- create, for the given plurality of times, a sequence of composite images based on, for a first composite image of the sequence of composite images, an arrangement of the first channel and an arrangement of the second channel; and
- display the sequence of composite images, the sequence of composite images comprising a first part where the first channel is displayed and a second part where the second channel is displayed, and in which the first part and the second part are adjustable.
- the graphics processing unit and memory can be included on a graphics card, and the graphics processing unit can be further configured to:
- the composite image sequence can be created using first channel and second channel images stored in the memory of the graphics card.
- the composite image sequence can be created according to a first set of drawing parameters corresponding to the first channel and a second set of drawing parameters corresponding to the second channel.
- Storing the first channel and second channel images may include storing the first channel and second channel images as a single texture object, the texture object possibly being associated with the composite video stream.
- the composite image sequence can be displayed on an output device configured to receive input to adjust the composite image sequence.
- the instructions, during execution, can cause the graphics processing unit to create a second composite video stream sequence, in which the second-sequence composite video stream is a set of thumbnails corresponding to the first channel or to the second channel.
- a user interface of the output device may include user-adjustable personalization parameters, and in response to the adjustment of user-adjustable personalization parameters, the first part and the second part of the composite image sequence may be adjusted accordingly.
- the first part and the second part can overlap.
- a computer program product downloadable from a communication network and/or stored on a computer-readable medium and/or executable by a microprocessor, and loadable into an internal memory of a calculation unit, comprising program code instructions which, when executed by the calculation unit, implement the steps of the method according to the first aspect of the invention, or one or more of its improvements.
- FIG. 1 schematically shows an embodiment implementing a method according to the invention.
- variants of the invention comprise only a selection of the characteristics described, subsequently isolated from the other characteristics described, if this selection of characteristics is sufficient to confer a technical advantage or to differentiate the invention from the state of the prior art.
- This selection includes at least one characteristic, preferably a functional one, without structural details, or with only part of the structural details if this part alone is sufficient to confer a technical advantage or to differentiate the invention from the state of the prior art.
- An embodiment of a system 1 implementing a method according to an embodiment of the invention is illustrated in FIG. 1.
- a device 100 according to the invention and more generally a graphic display system according to the invention are simultaneously described.
- the system 1 comprises four separate video channels, respectively referenced 10, 11, 12 and 13.
- each of the channels produces an elementary image, respectively referenced IEt,1, IEt,2, IEt,3, IEt,4.
- a composite image IC t is formed by a juxtaposition of elementary images.
- a sequence of composite images forms a composite video stream Fv.
- This composite video stream is encoded during an encoding step Eenc to generate an encoded composite video stream Fvc.
- the encoded composite video stream Fvc can be received by a device 100 in accordance with an embodiment of a device according to the invention, for viewing graphic elements originating from a composite video stream.
- the encoded composite video stream Fvc is associated with a data structure Dstruct, which describes: at least one coordinate of each of said elementary images IEt,1, IEt,2, IEt,3, IEt,4 with respect to a constant reference in the composite image, and at least one set of drawing parameters of each of said elementary images IEt,1, IEt,2, IEt,3, IEt,4 with respect to a constant reference in a composition zone to be displayed on a screen.
- the drawing parameters include details of position, size, depth, opacity.
- the drawing parameters are objects in the JSON data format (JavaScript Object Notation).
- the device 100 includes:
- a means 102 configured to decode the composite video stream Fvc and record it, in a memory 204 of a GPU graphics card, in the form of a texture object TCt corresponding to said original composite images ICt,
- a construction means 106 for a final rendering image IRt, configured to have the GPU graphics card carry out the construction of an array of pixels according to said set of drawing parameters, the pixels of said elementary images IEt,1, IEt,2, IEt,3, IEt,4, the coordinates of each of said elementary images and the personalization parameters,
- the means 102 can be produced in the form of a microprocessor.
- the texture object is a texture type object within the meaning of the WebGL application programming interface.
- the means 104 may be a microprocessor configured to occasionally acquire personalization parameters, such as selections of image parts to be displayed.
- the means 106 can be implemented in the form of a microprocessor configured to give instructions to the GPU graphics card.
- the three means 102, 104 and 106 can be produced in the form of a single processor.
- the display means 108 may be a video player displayed by a web browser.
- the processing implemented by the GPU graphics card on the instruction of the means 106 can be, in the case of the implementation of the WebGL application programming interface, that performed by the glDrawElements instruction.
- This instruction implements buffers which do not contain pixels, but vertices or colors.
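- Before such a draw call, the pixel coordinates of one elementary image inside the composite texture must be normalized to the [0, 1] texture-coordinate range WebGL expects; a sketch of that computation follows (the helper name and quad layout are illustrative, not taken from the patent):

```javascript
// Sketch: normalize the pixel coordinates of an elementary image inside
// the composite texture to WebGL's [0, 1] texture-coordinate space.
// These values would typically be written into a texture-coordinate
// buffer before the draw call (drawElements in WebGL).
function texCoords(coords, textureWidth, textureHeight) {
  const u0 = coords.x / textureWidth;
  const v0 = coords.y / textureHeight;
  const u1 = (coords.x + coords.width) / textureWidth;
  const v1 = (coords.y + coords.height) / textureHeight;
  // Four corners of the quad covering the elementary image:
  // (u0,v0) (u1,v0) (u0,v1) (u1,v1)
  return [u0, v0, u1, v0, u0, v1, u1, v1];
}

// Elementary image occupying the right half of a 1280x360 composite texture:
const tc = texCoords({ x: 640, y: 0, width: 640, height: 360 }, 1280, 360);
// tc === [0.5, 0, 1, 0, 0.5, 1, 1, 1]
```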
- each of the images comprising a main part and a strip part
- the main part comprising an image relating to one of the streams of the composite video stream
- the strip part comprising images from each of the other streams in the composite video stream
- the image refresh rate of the main part and of each of the images of the strip part being the same, for example 30 images per second.
- the composite video stream to be displayed in the main part image can be selected, for example, by clicking on one of the images of the strip part.
- the main part image can include several of the streams in the composite video stream.
- an image of at least one stream displayed in the main image may be partially obscured by an image of another stream displayed in the main image.
- the graphics display system can be implemented on the one hand by the video streams described above and on the other hand by the GPU graphics card and the memory 204.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1856410A FR3083950B1 (en) | 2018-07-12 | 2018-07-12 | PROCESS FOR VISUALIZING GRAPHIC ELEMENTS FROM AN ENCODED COMPOSITE VIDEO STREAM |
PCT/FR2019/051761 WO2020012139A2 (en) | 2018-07-12 | 2019-07-12 | Method for viewing graphical elements arising from an encoded composite video stream |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3821611A2 true EP3821611A2 (en) | 2021-05-19 |
Family
ID=65031416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19758432.9A Pending EP3821611A2 (en) | 2018-07-12 | 2019-07-12 | Method for viewing graphical elements arising from an encoded composite video stream |
Country Status (4)
Country | Link |
---|---|
US (1) | US20200020068A1 (en) |
EP (1) | EP3821611A2 (en) |
FR (1) | FR3083950B1 (en) |
WO (1) | WO2020012139A2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10979672B1 (en) * | 2020-10-20 | 2021-04-13 | Katmai Tech Holdings LLC | Web-based videoconference virtual environment with navigable avatars, and applications thereof |
US11689749B1 (en) * | 2021-11-22 | 2023-06-27 | Hopin Ltd | Centralized streaming video composition |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1518415A1 (en) * | 2002-06-18 | 2005-03-30 | Koninklijke Philips Electronics N.V. | Video encoding method and corresponding encoding and decoding devices |
WO2008151416A1 (en) | 2007-06-12 | 2008-12-18 | In Extenso Holdings Inc. | Distributed synchronized video viewing and editing |
US9300705B2 (en) * | 2011-05-11 | 2016-03-29 | Blue Jeans Network | Methods and systems for interfacing heterogeneous endpoints and web-based media sources in a video conference |
US9154813B2 (en) * | 2011-06-09 | 2015-10-06 | Comcast Cable Communications, Llc | Multiple video content in a composite video stream |
FR2989244B1 (en) | 2012-04-05 | 2014-04-25 | Current Productions | MULTI-SOURCE VIDEO INTERFACE AND NAVIGATION |
US20150262404A1 (en) * | 2014-03-13 | 2015-09-17 | Huawei Technologies Co., Ltd. | Screen Content And Mixed Content Coding |
KR102218908B1 (en) * | 2014-05-07 | 2021-02-23 | 엘지전자 주식회사 | Digital device and method of processing a service thereof |
US10412130B2 (en) * | 2016-04-04 | 2019-09-10 | Hanwha Techwin Co., Ltd. | Method and apparatus for playing media stream on web browser |
- 2018-07-12: FR application FR1856410A filed (patent FR3083950B1, active)
- 2019-01-31: US application US16/263,574 filed (publication US20200020068A1, abandoned)
- 2019-07-12: EP application EP19758432.9A filed (publication EP3821611A2, pending)
- 2019-07-12: WO application PCT/FR2019/051761 filed (publication WO2020012139A2, status unknown)
Also Published As
Publication number | Publication date |
---|---|
WO2020012139A2 (en) | 2020-01-16 |
FR3083950B1 (en) | 2021-04-30 |
WO2020012139A3 (en) | 2020-03-12 |
FR3083950A1 (en) | 2020-01-17 |
US20200020068A1 (en) | 2020-01-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210112 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
RAV | Requested validation state of the european patent: fee paid |
Extension state: MA Effective date: 20210112 |