GB2522468A - Methods and devices for distributing video data in a multi-display system using a collaborative video cutting scheme - Google Patents

Methods and devices for distributing video data in a multi-display system using a collaborative video cutting scheme

Info

Publication number
GB2522468A
GB2522468A (Application GB1401334.6A)
Authority
GB
United Kingdom
Prior art keywords
video
elementary
display
nodes
streams
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1401334.6A
Other versions
GB2522468B (en)
GB201401334D0 (en)
Inventor
Lionel Tocze
Julien Sevin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to GB1401334.6A priority Critical patent/GB2522468B/en
Publication of GB201401334D0 publication Critical patent/GB201401334D0/en
Publication of GB2522468A publication Critical patent/GB2522468A/en
Application granted granted Critical
Publication of GB2522468B publication Critical patent/GB2522468B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3188Scale or resolution adjustment
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/02Composition of display devices
    • G09G2300/023Display panel composed of stacked panels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/02Composition of display devices
    • G09G2300/026Video wall, i.e. juxtaposition of a plurality of screens to create a display screen of bigger dimensions
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2350/00Solving problems of bandwidth in display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/16Use of wireless transmission of display information

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention relates to the field of multi-display/multi-projector systems and covers a method of distributing video data between the nodes of a multi-display system. The method for distributing video data to the display devices involves determining routing paths for distributing elementary sub-streams to their respective destination display devices, and cutting the source video streams into sub-streams that group elementary sub-streams where possible, to limit the number of cutting operations. Working on elementary sub-streams makes it possible to reduce the bandwidth required over the communication links, by selecting appropriate routing paths, thereby overcoming network bandwidth limitations. In addition, grouping elementary spatial sub-areas to define the video sub-streams that are actually sent makes it possible to reduce the number of cutting operations, since larger sub-areas are defined. This helps overcome any limitation of cutting capabilities at the nodes.

Description

Intellectual Property Office Application No. GB1401334.6 RTM Date: 17 July 2014. The following terms are registered trade marks and should be read as such wherever they occur in this document: Wi-Fi. Intellectual Property Office is an operating name of the Patent Office. www.ipo.gov.uk
METHODS AND DEVICES FOR DISTRIBUTING VIDEO DATA IN A MULTI-DISPLAY
SYSTEM USING A COLLABORATIVE VIDEO CUTTING SCHEME
FIELD OF THE INVENTION
The present invention generally concerns the field of multi-display systems that include a plurality of nodes, among which at least one source device provides a video stream and a plurality of display devices collectively display video in a combined display area.
The present invention concerns more particularly a method and a display device for distributing video data in such a multi-display system, wherein the nodes including the source device and the display devices are interconnected through a communication network.
An appropriate use case regards a multi-projector display system made of a plurality of video projectors. For ease of explanation below, reference is mainly made to such a multi-projector display system, although the invention may apply to multi-monitor display systems.
BACKGROUND OF THE INVENTION
Multi-projector display systems are increasingly deployed to offer high-end video display of very large size and of very high quality, for example in the open air in a stadium for a sports event or a concert, or in a conference hall, or for teaching purposes or simulating purposes.
These are audio-visual projection systems made of several video projectors that tile the combined (or shared) display area and may be blended over the combined display area, i.e. may overlap partly or entirely. A variable number of video projectors may be instantiated.
The video projectors are fed with video data, for example live ultra-high definition (UHD) video. UHD video generally refers to a sequence of 60 Hz video frames of 3,840 x 2,160 pixels of 24 bits each. Such a video stream thus requires a considerable channel bandwidth, of several Gbps (gigabits per second), for transmission or processing.
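As a rough illustration (the frame dimensions and 60 Hz rate come from the text above; the arithmetic itself is only an illustrative sketch), the uncompressed data rate of such a stream can be computed as:

```python
# Raw (uncompressed) data rate of the UHD stream described above.
pixels_per_frame = 3840 * 2160          # UHD frame
bits_per_frame = pixels_per_frame * 24  # 24-bit pixels
bits_per_second = bits_per_frame * 60   # 60 Hz frame rate
print(bits_per_second / 1e9)            # about 11.9 Gbps
```

This is well above the capacity of a single wireless link of the kind described later, which is why distributing only the needed sub-streams matters.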
A variable number of sources of video data may be instantiated.
Due to the combined display area, the display devices of the multi-display system have to locally output the appropriate portion of the source video stream for display. This requires a video cutting function at the display devices to extract the relevant portion from the source video stream.
For example, publication US 8,123,360 discloses a multi-projector composite image display system in which an initial source video stream is distributed over the network using one of two proposed solutions: either a centralised image signal splitter that duplicates the initial full source video stream and distributes the duplicates to all the display devices, or a cascaded connection link between the display devices through which the initial source video stream is duplicated and transmitted. Each display device then cuts, from the received source video stream, the portion that it has to display locally, using its video cutting capabilities.
Such an approach cannot be used when the communication network has limited bandwidth. Indeed, the source video stream cannot be transmitted over communication links of the network that do not have enough bandwidth for the entire stream.
It is an object of the invention to overcome some or all of the above-mentioned drawbacks.
SUMMARY OF THE INVENTION
In this context, the present invention seeks to find a collaborative video cutting scheme which satisfies restricted cutting capacities at the network nodes (in particular at the source device and at the display devices), and enables each video sub-area of the source stream to be distributed to the multi-display system despite limited data rate capacity or bandwidth of the communication network.
In the invention, each display device is referred to as a destination display device for spatial part or parts of the video stream it displays in the combined display area.
A method of distributing video data in a multi-display system according to the invention comprises, at a given one of the nodes: determining (including obtaining, if obtained from another device) a candidate spatial split of the video stream into elementary spatial portions (in particular, the portions are spatial parts of the frames forming the video stream); determining (including obtaining, if obtained from another device) routing paths in the communication network to distribute respective elementary video sub-streams, defined by the elementary spatial portions, to corresponding destination display devices, which makes it possible to obtain a video distribution scheme that meets the bandwidth constraints of the communication network, since elementary video sub-streams are considered separately instead of the entire source video stream; actually cutting the video stream into video sub-streams, wherein a spatial portion of at least one video sub-stream groups elementary spatial portions whose corresponding elementary video sub-streams share the same portion of determined routing path from the given node, meaning that the video sub-stream encompasses two or more elementary spatial portions that have to be transmitted via the same portion of routing path, for example over the same communication link from the given device; and sending the video sub-streams to the destination display devices based on the determined routing paths.
The given device may be one of the source and display devices and other nodes in the communication network. For example, it may be the first device/node downstream of the source device which has two or more communication links to output video data to other devices/nodes. This is because the devices/nodes upstream have only one communication link, which has to convey the entire source video stream before a first cutting operation is performed. Upstream and downstream generally refer to the direction of the communication paths, meaning that a node in the network receives video data from a node upstream and outputs data to a node downstream.
The present invention makes it possible to comply with the data rate capacity of the communication network in which the video stream is distributed and with the limited cutting capability of the devices/nodes.
This is achieved by individually considering elementary video sub-streams to determine respective routing paths that meet bandwidth requirements of the network for their distribution, and by grouping the elementary spatial portions into larger video sub-streams to reduce the number of cutting operations required during the distribution of the video stream. The invention may thus take advantage of the video cutting capabilities of all the devices.
Correspondingly, a node device (e.g. a video projector acting as a display device) of a multi-display system according to the invention comprises: a candidate splitting module configured to determine a candidate spatial split of the video stream into elementary spatial portions; a routing path module configured to determine routing paths in the communication network to distribute respective elementary video sub-streams defined by the elementary spatial portions to corresponding destination display devices; a video cutting module configured to actually cut the video stream into video sub-streams, wherein a spatial portion of at least one video sub-stream groups elementary spatial portions whose corresponding elementary video sub-streams share the same portion of determined routing path from the node device; and a transmission module configured to send the video sub-streams to the destination display devices based on the determined routing paths.
The display device has the same advantages as the method as described above.
The invention also concerns a multi-display system comprising a plurality of nodes among which at least one source device provides a video stream and a plurality of display devices collectively display video in a combined display area, and comprising a communication network interconnecting the nodes, wherein one of the nodes is as defined above.
Other features of embodiments of the invention are further defined in the dependent appended claims. While these features are mostly described with reference to the above method of the invention, similar features are provided for the corresponding display device, video projector and multi-display system.
In embodiments, the elementary spatial portions grouped in the spatial portion of the video sub-stream are contiguous elementary spatial portions in frames of the video stream. In particular, the contiguous elementary spatial portions are contiguous in a horizontal or vertical direction. These configurations make it possible to use conventional cutting functions at the devices, since the resulting spatial portions to be cut remain rectangular.
In embodiments, the method further comprises determining a cutting scheme that defines operations of actual cutting of video sub-streams to be performed at nodes of the communication network. This makes it possible to centralize such determination at a single node. The method may then further include sending the cutting scheme to the nodes of the communication network.
In specific embodiments, the cutting scheme is determined to define a minimum number of operations of actual cutting to be performed at each node of the communication network.
In variants, the cutting scheme is determined to favor the use of the full cutting capability at the source device or at nodes close to the source device.
In specific embodiments, determining the cutting scheme includes: forming a distribution tree organizing the nodes based on the determined routing paths, ordering the nodes according to a tree-level order, and at the same tree level according to dependencies between the nodes in the distribution tree, and successively selecting the ordered nodes and for each selected node, performing the following steps: obtaining the elementary spatial portions grouped in video sub-streams the selected node has to receive from an upstream node according to the determined routing paths, grouping the obtained elementary spatial portions associated with (i.e. whose corresponding elementary video sub-streams have) determined routing paths that use the same communication link from the selected node, and for each group, finding at least one grouping solution that groups contiguous elementary spatial portions of the group that were grouped in the same received video sub-stream, to form a set of rectangular spatial portions.
Only elementary spatial portions belonging to the same received video sub-stream can be grouped together, in order to ensure that the iterative process only requires a cutting function at the nodes.
This configuration usually finds efficient grouping solutions thanks to the iterative process of successively selecting each node (for example each source or display device). The rectangular spatial portions are thus potential portions to be cut from the received video sub-stream.
According to embodiments, if the obtained elementary spatial portions have to be received by the selected node according to two or more candidate grouping solutions from a previously-selected node, the step of finding finds at least one grouping solution for each of the two or more candidate grouping solutions from the previously-selected node. This configuration makes it possible to iteratively consider all the possible grouping solutions as the distribution tree is scanned.
In particular, the steps for the selected node may further comprise discarding a candidate grouping solution from a previously-selected node if none of the grouping solutions for this candidate grouping solution meets a criterion relating to the number of cutting operations required to convert the groups of elementary spatial portions in the received video sub-streams into the set of rectangular spatial portions of the grouping solutions. This means that some grouping solutions may be avoided if they would require too many cutting operations to be performed at a node. This provision thus ensures that the video cutting capacities of the nodes are respected.
In embodiments, determining the candidate spatial split is based on the spatial parts of the video stream each destination display device displays in the combined display area. In other words, it is based on the composite display by the multi-display system. This makes it possible to discriminate the portions according to the display devices to which they are addressed, in order for example to efficiently distribute them to such destination display devices. For illustrative purposes, the boundaries between the overlapping displayed parts within the combined display area may be used to define the elementary spatial portions.
In specific embodiments, the display areas of the display devices overlap within the combined display area, and each of the elementary spatial portions corresponds to an area that is entirely displayed by each of the display devices that display it. In other words, the boundaries of the overlapping areas are used, at least partly, to define the elementary spatial portions. This contributes to efficient use of the network bandwidth, since all the video data corresponding to a given elementary spatial portion are used for display by their recipient devices.
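As a sketch of how overlap boundaries can induce such a split (the function name and coordinate convention are assumptions for illustration, not the patent's notation), one can cut the frame along every vertical and horizontal edge of the display areas; each resulting cell then lies entirely inside every display area that shows it:

```python
from itertools import product

def elementary_portions(display_areas):
    """display_areas: one (x0, y0, x1, y1) rectangle per destination
    display device, in source-frame coordinates.  Returns a list of
    ((x0, y0, x1, y1), displaying_devices) pairs in which each elementary
    portion is entirely displayed by every device that shows it."""
    xs = sorted({v for x0, _, x1, _ in display_areas for v in (x0, x1)})
    ys = sorted({v for _, y0, _, y1 in display_areas for v in (y0, y1)})
    portions = []
    for (x0, x1), (y0, y1) in product(zip(xs, xs[1:]), zip(ys, ys[1:])):
        # Devices whose display area fully contains this candidate cell.
        shown_by = {i for i, (ax0, ay0, ax1, ay1) in enumerate(display_areas)
                    if ax0 <= x0 and x1 <= ax1 and ay0 <= y0 and y1 <= ay1}
        if shown_by:
            portions.append(((x0, y0, x1, y1), shown_by))
    return portions
```

For two horizontally overlapping display areas, this yields three elementary portions: one exclusive to each device and one (the overlap) displayed by both.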
In embodiments, determining the routing paths is based on the data rates of the elementary video sub-streams and on the data rate capacities (i.e. available bandwidth) of communication links between nodes of the communication network (in particular between the devices of the multi-display system). This configuration ensures that an efficient distribution scheme for the elementary video sub-streams is obtained given the bandwidth restrictions of the network.
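One way to realise such bandwidth-aware routing (a sketch under our own assumptions, not the patent's specific algorithm) is a breadth-first search that only traverses links with enough spare capacity for the sub-stream's data rate, and reserves that rate once a path is found:

```python
from collections import deque

def route(links, capacity, src, dst, rate):
    """Find a path src -> dst every link of which still has `rate` Gbps of
    spare capacity, then reserve that rate along the path.  `links` maps a
    node to its neighbours; `capacity` maps (a, b) to the remaining Gbps
    of the link a -> b.  Returns the path as a node list, or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:                      # rebuild and reserve the path
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            path.reverse()
            for a, b in zip(path, path[1:]):
                capacity[(a, b)] -= rate
            return path
        for nxt in links.get(node, ()):
            if nxt not in prev and capacity.get((node, nxt), 0) >= rate:
                prev[nxt] = node
                queue.append(nxt)
    return None
```

Routing the elementary sub-streams one by one against the residual capacities then spreads them over different links where a single link could not carry them all.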
The invention also provides a non-transitory computer-readable medium storing a program which, when executed by a microprocessor or computer system in a communication device, causes the communication device to perform the steps of the distributing method as defined above.
The invention also provides a node device substantially as herein described with reference to, and as shown in, Figure 2 of the accompanying drawings.
The invention also provides a method of distributing video data substantially as herein described with reference to, and as shown in, Figure 3; Figures 3 and 4; Figures 3, 7 and 8; Figures 3, 4, 7 and 8 of the accompanying drawings.
At least parts of the methods according to the invention may be computer implemented. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects which may all generally be referred to herein as a "circuit", "module" or "system". Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
Since the present invention can be implemented in software, the present invention can be embodied as computer readable code for provision to a programmable apparatus on any suitable carrier medium, for example a tangible carrier medium or a transient carrier medium. A tangible carrier medium may comprise a storage medium such as a floppy disk, a CD-ROM, a hard disk drive, a magnetic tape device or a solid state memory device or the like. A transient carrier medium may include a signal such as an electrical signal, an electronic signal, an optical signal, an acoustic signal, a magnetic signal or an electromagnetic signal, e.g. a microwave or RF signal.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will now be described, by way of example only, and with reference to the following drawings in which:
Figure 1 schematically shows a multi-projector display system implementing embodiments of the invention;
Figure 2 illustrates functional blocks of a communication adapter implementing embodiments of the invention;
Figure 3 is a flowchart illustrating main steps of embodiments of the invention, involving the determination of a collaborative image cutting scheme;
Figure 4 is a block diagram illustrating steps to determine a list of elementary spatial portions of a source video stream and their characteristics according to embodiments of the invention;
Figure 5 illustrates elementary spatial portions in a source video stream;
Figure 6 illustrates a distribution scheme for the multi-display system of Figure 1;
Figure 7 is a block diagram illustrating steps to determine a video cutting scheme according to embodiments of the invention;
Figure 7a illustrates a distribution tree generated during the process of Figure 7 for the example of Figure 6;
Figure 8 is a block diagram illustrating steps to determine the cutting operations required at a device during the process of Figure 7;
Figure 9 illustrates an example of a process to obtain the maximum concatenation/grouping; and
Figure 10 illustrates, step by step, the determination of a video cutting scheme in the example of Figure 6.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
Figure 1 schematically shows a multi-projector display system 100 implementing the invention.
The exemplary multi-projector display system 100 of Figure 1 comprises four video projectors or display devices 105-1 to 105-4 for video rendering and at least one source device 110 that provides a source video stream to the display devices. The source device is for example a video server that supplies video data of a source video stream to at least a first video projector.
The present invention regards the distribution of the source video stream to the various display devices. In case a plurality of source devices is provided, the invention as described below is implemented for each source video stream output from each source device. Below, reference is mainly made to a single source video stream.
Embodiments of the invention may involve a different number of video projectors and may involve video projectors without any source unit or with several source devices connected thereto. Stream ids (identifiers) make it possible to discriminate between the various source video streams, even if received locally by the same video projector.
The video projectors 105-1 to 105-4 are arranged to collectively display video in a combined display area. Each video projector thus contributes roughly a quarter of the combined display. The aggregated display resolution is therefore roughly four times the display resolution of each individual video projector (provided they have the same individual display resolution). A user may interact with a multi-projector display controller (not shown) or with any of the video projectors, using conventional input means such as a keyboard, a mouse or a remote control, to design the layout of the combined display. For example, the user can add a source video stream to, or move a source video stream within, the combined display area. This results in the source video stream (or streams) being displayed by two or more video projectors: each display device is considered as the "destination display device" for the spatial part or parts of the source video stream it has to display in the combined display area.
In the example of Figure 1, the four display devices 105-1 to 105-4 are disposed as a matrix in order to enable display of a larger area than the one proposed by each video projector. For instance, with each video projector designed to display an area of 1,920 * 1,080 pixels and a maximum overlapping area of 20%, the multi-projector display system 100 as shown could display an image of 3,456 pixels in width and 1,944 pixels in height in the combined display area.
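The combined-resolution arithmetic can be checked with a small sketch (the helper name is an assumption; the 20% overlap and per-projector resolution come from the text):

```python
def combined_resolution(cols, rows, proj_w, proj_h, overlap=0.20):
    """Resolution of a cols x rows projector matrix in which each pair of
    adjacent projectors overlaps by `overlap` of one projector's extent."""
    width = cols * proj_w - (cols - 1) * overlap * proj_w
    height = rows * proj_h - (rows - 1) * overlap * proj_h
    return round(width), round(height)

print(combined_resolution(2, 2, 1920, 1080))  # (3456, 1944)
```

Each internal seam removes one overlap-width strip from the sum of the individual extents, which is why a 2*2 matrix loses one strip per axis.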
Overlapping areas, together with edge-blending areas, advantageously provide a smooth video transition for the viewer, creating the illusion of a seamless, unified projected image. The blending technique ensures that the difference in luminance between adjacent projectors is not visible to the viewer.
The source device 110 provides video data to the multi-projection system 100, with a resolution greater than the 1,920*1,080 pixel resolution (HD TV resolution) of each projector 105-1 to 105-4. For example, the source device 110 may provide a 4K * 2K (3,840 * 2,160 pixels) image, which will either be cut to match the maximum 3,456 * 1,944 pixel resolution of a 2*2 multi-projector system (as shown in the Figure), or be provided to a 2*3 multi-projector system (not shown) that could handle the video source resolution, or be homothetically resized to the 3,456 * 1,944 pixel resolution or less.
The source device 110 and the display devices 105-1 to 105-4 are interconnected through a communication network made of wired and/or wireless communication links (shown in plain or dashed lines respectively). Each of the devices is equipped with a communication adapter 120 (an external adapter or one integrated into the device) to connect to the network and to exchange data with the other devices, in particular the sub-part(s) of the source video stream that each projector has to display to obtain the full image projection. Wired links may be cable connections with a guaranteed bandwidth of 4 Gbps, and wireless links may be 60 GHz radio links able to sustain a maximum data rate of 3.5 Gbps, depending on radio environment conditions. For each connection between the devices, the communication adapter 120 is able to identify the nature (wired/wireless) of the interface and, in the case of a wireless communication link, to discover and characterize the wireless link (for instance in terms of the maximum data rate available).
Although the communication network as shown in the Figure is made of only the source device 110 and the display devices 105-1 to 105-4, embodiments may include one or more relay nodes (for example wireless nodes that may contribute to form an ad-hoc wireless network) participating in the distribution of the video data. For the description below, reference is mainly made to the source and display devices excluding relay nodes, but the invention may implement the same routing and video cutting functions as described below to any of such relay nodes.
Due to bandwidth limitations that may occur in the communication network (for example because of bad wireless communication conditions or because of under-dimensioned wired links), the entire source video stream cannot always be distributed to each display device 105-1 to 105-4. Also, since video cutting capacities may be limited at the nodes (including the source and display devices) of the network, it is worth limiting the number of cutting operations performed on the video stream at the nodes. In this context, the present invention seeks to find a collaborative video cutting scheme which satisfies restricted cutting capacities at the network nodes (in particular at the source device and at the display devices), and enables each video sub-area of the source stream to be distributed to the multi-display system despite the limited data rate capacity or bandwidth of the communication network.
As described below, the approach according to the invention involves determining routing paths for distributing elementary sub-streams (corresponding to elementary spatial sub-areas of the frames) to their respective destination display devices, and involves cutting the source video streams into sub-streams that group elementary sub-streams when possible, to limit the number of cutting operations.
Working on elementary sub-streams makes it possible to reduce the bandwidth required over the communication links, by selecting appropriate routing paths, thereby overcoming the network bandwidth limitation. In addition, grouping elementary spatial sub-areas to define the video sub-streams to actually send makes it possible to reduce the number of cutting operations, since larger sub-areas are defined. This overcomes the limitation of cutting capabilities at the nodes.
Although the actual determinations of the elementary sub-streams (sub-areas), of the routing paths and of the cutting operations are performed at the same node or device in the description below, the invention encompasses situations where these determinations are split over several nodes of the communication network, provided the nodes exchange the results of their respective determinations with each other so that a node can perform all the steps of the invention. It follows that such a node may "determine" the elementary sub-streams or routing paths or cutting operations merely by obtaining the corresponding result from another node.
Also, although the description below mainly locates the actual determinations at the source device 110, they may occur at the first device/node downstream of the source device 110 that has two or more downstream communication links, i.e. that has at least two outputs to other nodes of the network. This is because it is the first node that needs to cut the source video stream (the nodes between the source device 110 and such a "first device/node" must all receive the entire source video stream and must all send it on to the next node). In the example of the Figure, such a "first device/node" is the source device 110.
Figure 2 shows functional blocks of a communication adapter 120 connected to a video-projector device 105-1 to 105-4 or source device 110 (or integrated in these devices).
The communication adapter 120 comprises:
- a main controller 201,
- several physical layer units (denoted PHY A and PHY B) 211 and 212, able to provide wired or wireless transmission/reception communication means,
- a video output 239 acting as an interface with a display device such as video projector 105-1 to 105-4,
- a video input 240 acting as an interface with a video source device, for example source device 110.
The main controller 201 is itself made of:
- a Random Access Memory (denoted RAM) 233,
- an Electronically-Erasable Programmable Read-Only Memory (denoted EEPROM) 232, used to store information set by the user, such as the position of the video projector in the display matrix area,
- a micro-controller or Central Processing Unit (denoted CPU) 231,
- a user interface controller 234,
- a medium access controller (denoted MAC) 238,
- a video processing controller 235,
- a video interface controller 236,
- a video Random Access Memory (denoted video RAM) 237.
CPU 231, MAC 238, video processing controller 235, user interface controller 234 exchange control information via a communication bus 244, on which RAM 233 and EEPROM 232 are also connected. CPU 231 controls the overall operation of the communication adapter 120 as it is capable of executing, from RAM 233, instructions pertaining to a computer program, once these instructions have been loaded from the memory EEPROM 232.
Thanks to the user interface controller 234, the user can configure the system, for example setting the video-projector position in the display matrix area, or can control each device 105-1 to 105-4 to adjust the output display (luminance, blending area compensation and so on). This interface can be a wired interface (like Ethernet or Universal Serial Bus USB) or a wireless interface (infrared, WiFi).
Video RAM 237 temporarily stores all video data received from video interface 236 (i.e. from a source device 110) or from communication interfaces 211 or 212 (i.e. from other nodes of the communication network) before the video data are processed. Preferably the video data are stored as raw video data, i.e. without any compression that would make it more difficult to process each frame independently (because compression often involves temporal prediction).
The video processing controller 235 performs all necessary transformations of the video data stored in video RAM 237, such as cutting a frame into sub-frames as defined by the collaborative cutting scheme determined by the collaborative cut determination method according to the invention. The video cutting function of the communication adapter 120 is designed to extract one or more rectangular sub-areas from each frame of a received video (sub-)stream, the extracted rectangular sub-areas then being stored in video RAM before being transmitted to another communication adapter 120 (i.e. to another node in the communication network) or to the video output 239 for local display by the local video projector.
As memory is a limited resource (in terms of size and cost), each communication adapter 120 is configured to provide a limited number of cutting processes, for example a maximum of three video cutting processes for each frame.
Note that to generate the video output signal on interface 239, the video processing controller 235 reconstructs the local image to be displayed locally from the sub-areas received through communication interfaces 211/212 and/or video interface 236. An example of sub-areas is provided below with reference to Figure 5.
Communication through communication interfaces 211/212 requires MAC module 238, which is in charge of controlling the emission and reception of MAC frames conveying control data and/or video data.
Control data are used for the management of the communication protocol.
They are involved in the determinations of the link capacity (i.e. available bandwidth or data rate) and of the distribution scheme, and in sharing the results of these determinations with the other nodes of the network (e.g. the other devices of the multi-display system). As will be apparent from below, the MAC module 238 may also be used to share the results of the collaborative cut determination method of the invention, i.e. of the cutting scheme, in case such determination is processed by only one communication adapter 120. Of course, in case such determination is locally performed in the same way by each adapter 120, there is no need to share the results.
Video data are then sent or received through the MAC module 238 once the distribution (or transmission) scheme and the collaborative video cutting scheme have been determined. Indeed, these schemes define which video data have to be sent on which communication links between the devices of the multi-display system.
Communication interfaces 211/212 may be wired-type or wireless-type physical layer units.
A wireless communication interface embeds a modem, a radio module and antennas. The radio module is responsible for processing a signal output by the modem before it is sent out by means of the antenna. For example, the processing can comprise frequency transposition and power amplification. Conversely, the radio module is also responsible for processing a signal received by the antenna before it is provided to the modem. The modem is responsible for modulating and demodulating the digital data exchanged with the radio module. For instance, the modulation and demodulation scheme applied is of Orthogonal Frequency-Division Multiplexing (OFDM) type. Antennas, for both transmission and reception, can be set either with a quasi omni-directional radiation pattern (for control data sharing for example), or with a quasi omni-directional or directional radiation pattern for video data transmission, enabling either broadcast of common video data, or long range connection and better spatial reuse. The radio module also provides means to measure RSSI (Radio Signal Strength Indication) so as to identify the antenna positions that enable the best signal reception.
MAC 238 is able to manage reception on several PHYs simultaneously.
The MAC 238 is also able to manage simultaneously several PHYs, some for transmission and the other for reception of data, enabling data forwarding function as described below with reference to Figure 6.
Figure 3 is a flowchart illustrating main steps of embodiments of the invention, involving the determination of a collaborative image cutting scheme.
The process starts at step 300, where the nodes of the network, i.e. in particular the source and display devices, determine the available capacity of the communication links, e.g. the available bandwidth or data rate, and share this information with the other nodes.
Note that a communication link between two devices may be unidirectional or bidirectional.
For example, when the communication adapter 120 of a node detects the use of the wired physical interface 211, the capacity of the communication link is determined as a fixed value of 4 Gbps. In the example of Figure 1, the communication links between nodes (110, 105-3) and nodes (110, 105-4) are assumed to be wired (plain lines) and their respective capacities, denoted c(110,105-3) and c(110,105-4), are determined equal to 4 Gbps.
When a wireless physical interface is used, the determination of the link capacity can be performed during an initial discovery process. In Figure 1, these links (dashed lines) are characterized by their respective communication capacities c(i,j), where i and j refer to the nodes 105-i and 105-j and are in the range [1..4] with i != j.
To determine the capacity of each link, each node involved in the wireless communication (105-1 to 105-4) sends discovery frames on its radio interface using various antenna directions or antenna array configurations. During the discovery transmission time, the other, non-transmitting devices perform an RSSI measurement using several antenna positions if possible. Once the measurement is performed, each device returns the best RSSI measurement for the current transmitting device to all other devices of the system.
In this way, all the nodes of the multi-display system (110, 105-1 to 105-4) gather RSSI information for all wireless links. Based on these results, for each wireless link, one node of the system (called the master device, for example node 105-3 in Figure 1) can then select the best transmission mode, meaning the modulation scheme used and the useful code rate to apply to each wireless link, so that transmission on the link is the least subject to transmission errors given the provided Bit Error Rate (BER).
Indeed, for a low RSSI value, the modulation scheme and the code rate will be selected low in order to provide a more robust modulation and more powerful error code correction (reducing the code rate). On the contrary, when the RSSI value is good, the modulation scheme and the code rate will be selected high to provide more data transmission capability.
For a fixed modem Symbol Rate (Sr, in number of symbols per second), the gross bit rate (Gbr) provided by the modem increases with a modulation that provides more bits per symbol. For example, m (the number of bits coded by a symbol) is 1 for BPSK, 2 for QPSK and 4 for 16-QAM modulations. The capacity of the wireless link is then obtained as a function of the Code rate (Cr) applied, which takes values like 1/2 or 2/3 when error correction coding is applied. The capacity of the wireless link is c(i,j) = Sr * Symbol_Size (in bits) * m * Cr * fRSSI, where fRSSI is a correcting factor.
For example, if a modem generates a symbol every 220 ns, Sr = 1/(220x10^-9) = 4.54 x 10^6 symbols/s. For a symbol size of 42 bytes (336 bits), the maximum capacity at modem level with a 16-QAM modulation and a 2/3 code rate is c = (336 * 4.54 x 10^6) * 4 * 2/3 ≈ 4 Gbps. Considering the overhead used for transmission and the retransmission policy adopted for error concealment, the capacity at MAC level is reduced by a factor fRSSI that is a function of the RSSI value and may lie in the interval [0.5..0.85]. Therefore, for 16-QAM and Cr = 2/3, c(i,j) is in the interval [2..3.4] Gbps at MAC level.
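The capacity formula above can be sketched as follows (a minimal illustration in Python; the function name is hypothetical):

```python
def link_capacity_bps(symbol_duration_s, symbol_size_bits, bits_per_symbol,
                      code_rate, f_rssi):
    """Wireless link capacity c(i,j) = Sr * Symbol_Size * m * Cr * fRSSI."""
    sr = 1.0 / symbol_duration_s  # symbol rate Sr, in symbols per second
    return sr * symbol_size_bits * bits_per_symbol * code_rate * f_rssi

# Worked example from the text: 220 ns symbols, 42-byte (336-bit) symbols,
# 16-QAM (m = 4), code rate 2/3, and a MAC-level factor fRSSI in [0.5..0.85].
modem_level = link_capacity_bps(220e-9, 336, 4, 2/3, 1.0)
mac_low = link_capacity_bps(220e-9, 336, 4, 2/3, 0.5)
mac_high = link_capacity_bps(220e-9, 336, 4, 2/3, 0.85)
print(round(modem_level / 1e9, 2))  # 4.07 (about 4 Gbps at modem level)
print(round(mac_low / 1e9, 2), round(mac_high / 1e9, 2))  # 2.04 3.46
```

The printed values reproduce the 4 Gbps modem-level figure and the [2..3.4] Gbps MAC-level interval of the example.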
At the end of step 300 of Figure 3, one of the source and display devices (denoted "processing device") is aware of the directional capacities of all the communication links in the network.
Next at step 305, the processing device retrieves information on the combined display, meaning information on the matrix of display devices (how they are arranged in relation to each other) and information on the display of the source video stream on the combined display area. These items of information represent the geometry of the multi-display system.
This step requires that the number of video-projectors that form the combined display and their relative positions in the combined display be known. In fact, initially, a user may set up each communication adapter 120 using user interface 234. For example the user may provide first information about display capability (set to "active" for the adapters 120 of the display devices 105-1 to 105-4, and set to "inactive" for all the other nodes including the source device 110), and second information indicating their position in the matrix display area, in terms of row and column position, given their respective definition. Such second information may be implemented as follows in the case of four display devices having the same size of display area (see Figure 1):
- (1,1) for device 105-1, that displays the top-left part of the combined display;
- (1,2) for device 105-2, that displays the top-right part of the combined display;
- (2,1) for device 105-3, that displays the bottom-left part of the combined display;
- (2,2) for device 105-4, that displays the bottom-right part of the combined display.
All this information is stored in EEPROM 232 and shared between the adapters 120 during a network set-up process.
Next at step 310, the processing device determines a candidate spatial split of the source video stream (that is to be displayed by the multi-display system 100) into elementary spatial portions. Preferably, the determination is based on the spatial parts of the video stream each destination display device displays in the combined display area.
This step results in a list of elementary spatial portions which describes how the source video stream (or more precisely each frame thereof) is virtually cut into spatial portions which have to be sent to different display devices. This step 310 is a prerequisite to studying a distribution (or transmission) scheme of the source video stream to the display devices 105-1 to 105-4 within the multi-display system 100. This is because the split of the source video stream into elementary spatial portions will make it possible to estimate the transmission bandwidth required for the transmission of each portion in the multi-display system, depending on which destination device is concerned by each portion.
Based on the information retrieved at step 305, the communication adapter 120 of the processing device is able to determine the number of video-projectors in columns (denoted A, equal to the maximum column value J of the couple positions (I,J) in the matrix area) and in rows (denoted B, equal to the maximum row value I of the couple positions (I,J) in the matrix area). In the example of Figure 1, A = B = 2, corresponding to a 2*2 matrix of video-projectors.
The elementary spatial portions or candidate regions of the source video stream are then determined as now explained with reference to Figure 4. Note that the number of such elementary spatial portions is at least equal to (2A-1)*(2B-1) in case of overlapping between the display areas of the display devices 105-1 to 105-4. (2A-1)*(2B-1) corresponds to the minimal number of sub-regions to be transmitted to the display video-projectors 105-1 to 105-4. In that case each sub-region is defined by the boundaries of the different overlaps in the combined display area.
Here Figure 4 is described with respect to a single source video stream that is displayed with the same size as the whole combined display area. Of course, slight adaptations may be made to determine which subpart of the source video stream is displayed by which display device, in case the source video stream is displayed on a subpart of the combined display area. Such adaptations are based on the geometry of the multi-display system and on the size and position of the display of the source video stream in the combined display area.
Also, the process described below may be adapted to each source video stream in case a plurality of such streams is provided, by considering only the display areas corresponding individually to each of these source video streams.
At step 400, the size of the initial rectangular region displayed for the source video stream is determined. For example, it may correspond to the size of the source video stream, in terms of pixels width and height, when the combined display area is able to handle such size. For a 4K*2K source video stream provided by source device 110, maximum pixel width L equals 3,840 and pixel height H equals 2,160.
As explained for Figure 1, considering the overlap areas, the combined display area may be of reduced size compared to the formal addition of the display sizes of the display devices. For example, the reduced size of the combined display area may be 3,456 * 1,944 pixels, and in such a case, when the source video stream is displayed over the entire combined display area, an initial rectangular region of size 3,456 * 1,944 pixels is thus considered.
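The reduced combined size can be sketched as follows (Python; the per-projector resolution and the blending-band sizes are hypothetical values chosen so as to reproduce the 3,456 * 1,944-pixel figure of the text):

```python
def combined_display_size(cols, rows, proj_w, proj_h, overlap_x, overlap_y):
    """Size of the combined display area: overlapping (blending) bands
    between adjacent projectors are counted only once."""
    width = cols * proj_w - (cols - 1) * overlap_x
    height = rows * proj_h - (rows - 1) * overlap_y
    return width, height

# Hypothetical example: four 1920x1080 projectors in a 2x2 matrix with
# 384x216-pixel blending bands yield a 3456x1944-pixel combined area.
print(combined_display_size(2, 2, 1920, 1080, 384, 216))  # (3456, 1944)
```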
Of course any other configuration can be contemplated, provided that the initial rectangular region ultimately defines the region corresponding to the display of the source video stream within the combined display area.
At step 410, N elementary spatial portions or "candidate regions" are considered. This is achieved by dividing the initial rectangular region into R rows and C columns, where R >= (2*B-1) and C >= (2*A-1).
When R and C are strictly equal to (2*B-1) and (2*A-1), corresponding to the case of Figure 5 when A = B = 2, the nine candidate regions correspond to elementary spatial portions of the source video stream that are either displayed by one display device only (elementary portions 505-1, 505-3, 505-7 and 505-9), or displayed by a set of two display devices (elementary portions 505-2, 505-4, 505-6 and 505-8), or displayed by all the display devices (elementary portion 505-5). This set of candidate regions is the minimal set that should be transmitted by the source device to have a full display of the source video stream by the multi-display system 100.
Nevertheless, as the candidate regions result from a virtual split of the frames of the source video stream in order to determine a distribution scheme as explained below, the source video stream could be split into more elementary spatial portions, for example as shown in Figure 4 where 12 elementary spatial portions are obtained (reference 450).
The number of elementary spatial portions is denoted N and equals R*C.
Then, at step 420, the size of each elementary spatial portion is computed using the formula Si = Xcj * Yrk, where:
- Si is the size of the elementary portion 'i' located at the jth column and kth row of the virtual division, i.e. i = ((k-1) * C) + j, with j in [1..C] and k in [1..R];
- Xcj is the width of the jth column of the virtual division; and
- Yrk is the height of the kth row of the virtual division.
Example 450 of Figure 4 shows an R=4 rows, C=3 columns split of the initial rectangular region, thus resulting in N = 12 regions. The eighth elementary portion corresponds to the intersection of column j=2 and row k=3 and has the size S8 = Xc2 * Yr3.
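The indexing and sizing of step 420 can be sketched as follows (Python; the column widths and row heights are hypothetical values for a 3,840 x 2,160 stream, and the function names are illustrative):

```python
def portion_index(j, k, C):
    """1-based index i of the portion at column j, row k: i = (k-1)*C + j."""
    return (k - 1) * C + j

def portion_sizes(col_widths, row_heights):
    """Sizes Si = Xcj * Yrk for every portion of an R x C virtual split,
    listed in index order i = (k-1)*C + j."""
    return [h * w for h in row_heights for w in col_widths]

# Example 450: R=4 rows, C=3 columns -> N = 12 portions; the eighth portion
# lies at column j=2, row k=3, so S8 = Xc2 * Yr3.
C = 3
col_widths = [1280, 1280, 1280]     # hypothetical Xcj values (sum: 3840)
row_heights = [540, 540, 540, 540]  # hypothetical Yrk values (sum: 2160)
sizes = portion_sizes(col_widths, row_heights)
print(portion_index(2, 3, C))  # 8
print(sizes[7] == col_widths[1] * row_heights[2])  # True
```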
Back to Figure 3, at the end of step 310, a spatial split of the source video stream into elementary spatial portions has been obtained by the processing device.
Next, at step 320, the processing device determines routing paths in the communication network to distribute respective elementary video sub-streams defined by the elementary spatial portions to corresponding destination display devices. The set of routing paths defines a distribution scheme to distribute the source video stream inside the communication network 100. As explained below it is preferably determined based on data rates of the elementary video sub-streams and on data rate capacities (i.e. available bandwidth as determined at step 300) of the communication links between nodes of the communication network (in particular between the devices of the multi-display system 100 of Figure 1). This distribution scheme makes it possible for each display device 105-1 to 105-4 to receive the video data (i.e. the sub-streams) that it has to locally display (i.e. the set of elementary video sub-streams to be combined to be locally displayed by the display device considered).
Step 320 corresponds to the solution of a Multi-Commodity Flow (MCF) problem, expressed as follows: 1°) A graph G(V,E) is considered, representing the communication network, where the vertexes (set V) of the graph represent the devices of the multi-display system (devices 110, 105-1 to 105-4 with their corresponding adapters 120) and the edges (set E) represent the communication links, either wired or wireless. Each edge or communication link (u,v) has a capacity c(u,v) representing the corresponding available bandwidth as determined at step 300.
2°) M commodities T1, T2, ..., TM are defined by Ti = (si, di, Ki), where si is the source (source device 110), di is the destination (destination display device) and Ki the demand (here the throughput requirement corresponding to the size Si of an elementary spatial portion as determined at step 310) of the commodity i.
Note that the throughput constraints Ki are obtained from the size Si of the corresponding elementary spatial portion (in terms of number of pixels). The size Si is multiplied by:
- the size of the information used for coding each colour of a pixel, also known as colour depth. Generally, colour depth is a value of 8, 12 or 16 bits, depending on the video standard;
- the number of colours to code, generally three colours (R for Red, G for Green and B for Blue);
- the chroma sub-sampling used for the image. Typical sub-sampling schemes [4:4:4, 4:2:2, 4:2:0] result in sub-sampling ratios of [100%, 66.6%, 50%];
- the frequency of video refresh, also known as Frame_Rate. Its value equals 60, 50, 30 or 24 frames per second, depending on the video standard.
Therefore in this example, a Ki throughput constraint is equal to: Ki = 3 * Si * Colour_depth * Frame_rate * sub_sampling_ratio bps.
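The throughput computation above can be sketched as follows (Python; the portion size and video format values are hypothetical examples, not figures from the embodiment):

```python
def throughput_bps(pixels, colour_depth_bits, frame_rate, sub_sampling_ratio):
    """Ki = 3 colours * Si pixels * colour depth * frame rate * sub-sampling."""
    return 3 * pixels * colour_depth_bits * frame_rate * sub_sampling_ratio

# Hypothetical portion of 1280x540 pixels, 8-bit colour depth, 60 fps,
# 4:2:2 chroma sub-sampling (ratio 2/3):
si = 1280 * 540
print(round(throughput_bps(si, 8, 60, 2/3) / 1e9, 3))  # 0.664
```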
3°) A flow for commodity i is a nonnegative function fi: V x V -> R+. The flow of commodity i along edge (u,v) is fi(u,v).
The problem, once defined, is to find an assignment of the flows which satisfies the constraints:
- Σi fi(u,v) <= c(u,v), referred to as the capacity constraints; and
- ∀i, ∀v ≠ si, di: Σ(u∈V) fi(u,v) = Σ(w∈V) fi(v,w), referred to as the flow conservation constraints; and
- Σ(w∈V) fi(si,w) = Σ(u∈V) fi(u,di) = Ki, referred to as the demand satisfaction constraints.
In this problem, the variables fi(u,v) relative to the functions fi are referred to as the decision variables.
Finally, a last criterion to configure the MCF problem is the objective variable. This is because the MCF problem may have several solutions and one solution should be selected. This selection is based on the solution which optimizes (maximizes or minimizes) an objective variable. This objective variable depends directly on the scenario and may optimize different parameters: bandwidth, robustness, latency.
The objective variable considered here is the bandwidth and the objective is to minimize the total bandwidth used by the system. This can be expressed by minimizing the following objective variable: Σi Σ(u,w∈V) fi(u,w).
Solving the MCF problem as defined above, based on the values obtained at steps 300 and 310, results in obtaining the routing paths (i.e. a set of edges E and vertexes V) between the source device 110 and the destination device for each commodity Ti (i.e. for each elementary spatial portion with throughput Ki). This result optimizes (minimizes) the use of the bandwidth of the network given the link capacities.
The MCF problem can be solved using a "Simplex" algorithm, which determines how the transmissions of the elementary spatial portions Ki need to be performed to fulfil the transmission of each elementary spatial portion to its destination display device.
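Without implementing the Simplex algorithm itself, the three constraint families can be illustrated by a feasibility check on a candidate flow assignment (a minimal Python sketch under the assumption that flows are given as per-commodity edge maps; in practice the problem would be handed to a linear-programming solver):

```python
from collections import defaultdict

def check_mcf_solution(flows, capacities, commodities):
    """Check a flow assignment against the MCF constraints: capacity,
    flow conservation, and demand satisfaction.
    flows[i][(u, v)] is fi(u, v); commodities[i] = (si, di, Ki)."""
    # Capacity: the summed flows on each edge must fit the link capacity.
    used = defaultdict(float)
    for f in flows.values():
        for edge, amount in f.items():
            used[edge] += amount
    if any(used[e] > capacities.get(e, 0.0) for e in used):
        return False
    for i, (s, d, k) in commodities.items():
        inflow, outflow = defaultdict(float), defaultdict(float)
        for (u, v), amount in flows[i].items():
            outflow[u] += amount
            inflow[v] += amount
        # Demand satisfaction: the full demand leaves si and reaches di.
        if outflow[s] != k or inflow[d] != k:
            return False
        # Flow conservation at every intermediate node.
        if any(inflow[v] != outflow[v]
               for v in set(inflow) | set(outflow) if v not in (s, d)):
            return False
    return True

# Hypothetical relay example: a 2 Gbps commodity routed 110 -> 105-4 -> 105-1.
caps = {("110", "105-4"): 4.0, ("105-4", "105-1"): 4.0}
flows = {1: {("110", "105-4"): 2.0, ("105-4", "105-1"): 2.0}}
print(check_mcf_solution(flows, caps, {1: ("110", "105-1", 2.0)}))  # True
```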
Figures 5 and 6 illustrate a result of the MCF problem. As mentioned above, Figure 5 shows the minimal number of elementary spatial portions for the 2*2 video projector matrix configuration of Figure 1, namely nine portions 505-1 to 505-9. These elementary spatial portions have to be distributed to:
- display device 105-1 for elementary portions 505-1, 505-2, 505-4 and 505-5, corresponding to respective throughput requirements K1, K2, K4 and K5, as this display device needs to display the top-left part of the source video stream;
- display device 105-2 for elementary portions 505-2, 505-3, 505-5 and 505-6, corresponding to respective throughput requirements K2, K3, K5 and K6, as this display device needs to display the top-right part of the source video stream;
- display device 105-3 for elementary portions 505-4, 505-5, 505-7 and 505-8, corresponding to respective throughput requirements K4, K5, K7 and K8, as this display device needs to display the bottom-left part of the source video stream; and
- display device 105-4 for elementary portions 505-5, 505-6, 505-8 and 505-9, corresponding to respective throughput requirements K5, K6, K8 and K9, as this display device needs to display the bottom-right part of the source video stream.
Elementary spatial portions 505-2, 505-4, 505-5, 505-6 and 505-8 correspond to blending areas displayed by 2 (or 4) display devices.
Each communication adapter 120, and more particularly its video processing unit 235, knows, based on user settings (for example identification of the display area: top or bottom, left or right), how to process each received elementary video sub-stream (corresponding to an elementary spatial portion) to generate the video signal output 239 to be locally displayed.
For example, the video processing unit 235 of display device 105-3 (at bottom-left) determines that memory should store, in order, video data packets K4 (corresponding to 505-4) then K5 (corresponding to 505-5) to display the first lines of the video signal, reading the beginning of each line in packet K4 and the end of the line in packet K5. Once these packets have been processed for local display, the bottom part of the video signal is read, in order, from video data packets K7 (corresponding to 505-7) and then K8 (corresponding to 505-8), reading the beginning of each line in packet K7 and the end of the line in packet K8.
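This line-interleaved reassembly can be illustrated schematically (Python; the toy rows below stand in for video lines, and the function name is hypothetical):

```python
def assemble_rows(left_rows, right_rows):
    """Rebuild full display lines from two side-by-side sub-areas: each
    output line starts in the left packet and ends in the right one."""
    return [left + right for left, right in zip(left_rows, right_rows)]

# Hypothetical 2-line example for display device 105-3: the top of its local
# image starts in K4 (portion 505-4) and ends in K5 (portion 505-5).
k4_rows = [["a1", "a2"], ["b1", "b2"]]  # beginnings of lines, from K4
k5_rows = [["a3"], ["b3"]]              # ends of lines, from K5
print(assemble_rows(k4_rows, k5_rows))
# [['a1', 'a2', 'a3'], ['b1', 'b2', 'b3']]
```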
Given these elementary spatial portions 505-1 to 505-9 and based on the link capacities c(i,j) [numeral values not shown for clarity] and throughputs Ki [numeral values not shown] of the elementary spatial portions, the solving 320 of the MCF problem gives the distribution scheme of Figure 6, also summarized in the table below (Ki is also used to represent the video data packets 615 transmitted on a communication link and corresponding to elementary spatial portion 505-i).
Path portion n°     1             2             3             4
K1             (110,105-4)  (105-4,105-1)
K2             (110,105-4)  (105-4,105-1) (105-1,105-2)
K3             (110,105-4)  (105-4,105-2)
K4             (110,105-3)  (105-3,105-1)
K5             (110,105-3)  (105-3,105-4) (105-4,105-2) (105-2,105-1)
K6             (110,105-4)  (105-4,105-2)
K7             (110,105-3)
K8             (110,105-3)  (105-3,105-4)
K9             (110,105-4)
Each row of the table represents the routing path of the video data Ki in the communication network 600. The routing path is described by a succession of transmission path portions (path portions 1 to 4), corresponding to columns of the table, where each path portion is described by a couple (s, d), where s is the identifier of the transmitting device and d is the receiving device of the transmission.
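The routing table can be transcribed as a plain data structure; the sketch below (Python, node identifiers kept as strings) derives, for each communication link, which packets it carries. This is the view each relaying device needs in order to know what it must forward:

```python
# Transcription of the routing table: each packet maps to its succession of
# path portions (s, d).
routes = {
    "K1": [("110", "105-4"), ("105-4", "105-1")],
    "K2": [("110", "105-4"), ("105-4", "105-1"), ("105-1", "105-2")],
    "K3": [("110", "105-4"), ("105-4", "105-2")],
    "K4": [("110", "105-3"), ("105-3", "105-1")],
    "K5": [("110", "105-3"), ("105-3", "105-4"),
           ("105-4", "105-2"), ("105-2", "105-1")],
    "K6": [("110", "105-4"), ("105-4", "105-2")],
    "K7": [("110", "105-3")],
    "K8": [("110", "105-3"), ("105-3", "105-4")],
    "K9": [("110", "105-4")],
}

def packets_per_link(routes):
    """Which packets travel on each communication link (s, d)."""
    links = {}
    for packet, path in routes.items():
        for hop in path:
            links.setdefault(hop, set()).add(packet)
    return links

links = packets_per_link(routes)
print(sorted(links[("110", "105-4")]))   # ['K1', 'K2', 'K3', 'K6', 'K9']
print(sorted(links[("105-4", "105-2")])) # ['K3', 'K5', 'K6']
```

The second printed line matches the observation in the text that link 610 between display devices 105-4 and 105-2 carries only packets K3, K5 and K6.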
For example, video packet K1 is transmitted from video source device 110 to display device 105-4, which acts as a relay for transmission of the same packet K1 to destination display device 105-1. In fact, display device 105-1 is the only one that needs packet K1 for display. As no communication link exists between source device and destination display device 105-1, step 320 determines a routing path using one relay.
Another example is the routing of packet K2 that uses three path portions.
In fact, the algorithm of step 320 determines that the capacity c(2,4) of communication link 610 between display devices 105-4 and 105-2 is only capable of supporting the transmission of packets K3, K5 and K6. Therefore, it determines that display device 105-4, then display device 105-1, should act as relays for transmitting packet K2 to destination display device 105-2, using the link capacities c(1,4) and c(1,2) provided by the links 610 between the respective display devices 105-4, 105-1 and 105-1, 105-2.
A last example is the routing of packet K5 for which each display device is a destination display device (i.e. K5 is required by each display device to be locally displayed). The routing of packet K5 to all the devices uses a routing path having four path portions, starting from source device 110 and that successively reach display devices 105-3, 105-4, 105-2 and 105-1.
Back to Figure 3, at the end of step 320, the processing device is aware of a distribution scheme for the elementary video sub-streams Ki. Next, at step 330, the processing device determines how to actually cut the video stream into video sub-streams based on the distribution scheme and the elementary spatial portions. In particular, a spatial portion of at least one video sub-stream groups elementary spatial portions whose corresponding elementary video sub-streams share the same portion of determined routing path from the device where the operation of actual cutting is planned. The processing device may determine an optimal cutting scheme defining which cutting operations (if any) have to be performed at each node of the network, and then may send the cutting scheme to these nodes over the network.
This grouping of elementary spatial portions takes into account the cutting capacities of the devices, and thus reduces the number of cutting operations at the devices when possible. For that purpose, the processing device may decide that a device that has to provide several elementary video sub-streams on the same communication link should, having regard to the position of each elementary spatial portion in the frames, cut a single region that groups two or more elementary sub-streams.
Depending on the embodiments, the processing device could try to minimise the number of cutting operations on each device. Other embodiments may decide to use the full cutting capacities of the devices close to the source device, thus reducing the need for cutting operations at other devices downstream.
An example of grouping is to group elementary spatial portions that are contiguous within frames of the video stream, for example K1, K2 and K3, which are contiguous in the horizontal direction (see Figure 5) and share the same first path portion (110 to 105-4) as shown in Figure 6; or K6 and K9, which are contiguous in the vertical direction (see Figure 5) and share the same first path portion (110 to 105-4) as shown in Figure 6.
Referring now to Figures 7 and 8, an iterative process to determine the cutting scheme is described. This iterative process of determining the cutting scheme includes:
- forming a distribution tree organizing the nodes based on the determined routing paths (for example considering the minimum number of hops for each node from the source device),
- ordering the nodes according to a tree-level order and, at the same tree level, according to dependencies between the nodes in the distribution tree (for example according to an increasing number of input communication links from other nodes of the communication network) (this is the process of Figure 7), and
- successively selecting the ordered nodes and, for each selected node, performing the following steps (this is the process of Figure 8):
- obtaining the elementary spatial portions, grouped in video sub-streams, that the selected node has to receive from an upstream node according to the determined routing paths,
- grouping the obtained elementary spatial portions associated with (i.e. whose corresponding elementary video sub-streams have) determined routing paths that use the same communication link from the selected node, and
- for each group, finding at least one grouping solution that groups contiguous elementary spatial portions of the group that were grouped in the same received video sub-stream, to form a set of rectangular spatial portions.
In other words, the step of finding a grouping solution is to know whether or not a spatial portion received at the selected node needs to be cut into two or more spatial portions because the latter have to be sent over distinct communication links.
The iterative process of successively selecting each node makes it possible to retroactively discard grouping solutions which have been found at a previous iteration (node). For example, if the obtained elementary spatial portions have to be received by the selected node according to two or more candidate grouping solutions from a previously-selected node, the step of finding finds at least one grouping solution for each of the two or more candidate grouping solutions from the previously-selected node. Then by scanning through the nodes while considering the candidate grouping solutions from the preceding selected nodes, it is possible to decide to discard a candidate grouping solution from a previously-selected node if all the grouping solutions for this candidate grouping solution do not meet a criterion relating to a number of cutting operations required to convert the groups of elementary spatial portions in the received video sub-streams into the set of rectangular spatial portions of the grouping solutions. The grouping solutions kept (that form the cutting scheme at the end) are thus progressively refined.
This is now described with reference to Figures 7 and 8.
At step 700, a distribution tree is created from the distribution scheme as obtained at step 320.
This step identifies from the table above that all packets K are first issued from source device 110. The first "tree level" (level 0) is thus made of the sole source device 110. It corresponds to the source of path portion n°1 of the table above.
Then, from source device 110, only two other display devices can be reached, namely 105-3 and 105-4. These two display devices form "tree level" 1 of the distribution tree, because they correspond to the first hop in the transmission of the video data. Tree level 1 corresponds to the addressee devices of path portion n°1 of the table above.
From display device 105-4, communication is required to display devices 105-1 and 105-2 (see addressee device of path portion n°2 in the table above). In the same column, communication is also required from 105-3 to display devices 105-1 and 105-2. Therefore display devices 105-1 and 105-2 belong to the "tree level" 2 of the distribution tree.
All the devices of the multi-display system 600 are now included in the distribution tree.
Nevertheless, studying all the other paths may help retrieve level dependency information. For example, from display device 105-3, a communication is required to display device 105-4 (see row K5, path portion n°2 of the table), indicating a dependency of device 105-4 on device 105-3. Similarly, an inter-dependency exists at "tree level" 2 between display devices 105-1 and 105-2 (row K2, path portion n°3 and row K5, path portion n°4 of the table).
Figure 7a summarizes the distribution tree obtained at step 700, where the dependencies are shown by the arrows.
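The level construction of step 700 can be sketched as follows. This is a minimal illustration, not the patented implementation: the routing-table representation and the dictionary layout are assumptions based on the example of Figure 6.

```python
# Sketch of step 700: derive distribution-tree levels from the routing table.
# Each path-portion index maps to a list of (sender, set of addressee devices);
# the entries below are a simplified stand-in for the table of the Figure 6
# example.
path_portions = {
    1: [("110", {"105-3", "105-4"})],
    2: [("105-4", {"105-1", "105-2"}),
        ("105-3", {"105-1", "105-2", "105-4"})],
}

levels = [{"110"}]          # level 0 is the sole source device
placed = {"110"}            # devices already assigned to a level
for n in sorted(path_portions):
    new_level = set()
    for sender, addressees in path_portions[n]:
        # a device's level is fixed by its first appearance as an addressee
        new_level |= addressees - placed
    if new_level:
        levels.append(new_level)
        placed |= new_level

# levels now holds {110}, then {105-3, 105-4}, then {105-1, 105-2},
# matching the three tree levels of Figure 7a
```

Note that device 105-4 appears again as an addressee in path portion n°2 (the K5 dependency), but its level is fixed by its first appearance; the later occurrence only contributes dependency information.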
After the distribution tree determination, the cutting scheme is iteratively determined, starting at step 705 by initializing a "Start_Tree_Level" variable to 0 in order to successively consider each level of the distribution tree. The determination of the cutting scheme thus starts at level 0, i.e. at source device 110.
At step 710, the number Ni of devices belonging to the "Start_Tree_Level" level is determined. In the case of the multi-display system 600 of Figures 1 and 6, three levels have to be successively considered, with the following numbers Ni of devices: -one device for level 0: source device 110; -two devices for level 1: display devices 105-3 and 105-4; -two devices for level 2: display devices 105-1 and 105-2.
Next, at step 715, the Ni devices from the same tree level are ordered based on their dependencies.
For example, as far as level 1 is concerned, the classification is performed by checking the number of input paths received by each device. Display device 105-3 receives video data packets from one path, coming from source device 110 at level 0, according to the distribution scheme. On the other hand, display device 105-4 receives video data packets from two paths: one from source device 110 at level 0, and one from display device 105-3 at level 1.
As level 0 has already been processed, the cutting scheme determined for source device 110 is already known when processing level 1. Therefore, the cutting scheme for display device 105-3 can be determined with no ambiguity, which is not the case for display device 105-4, which depends on what is decided for display device 105-3. As a consequence, 105-3 is ordered before 105-4 for level 1.
As far as level 2 is concerned, using the same approach, display device 105-1 receives video data from three devices (one of which has not yet been considered: display device 105-2), and display device 105-2 receives video data from two devices (also one not yet considered: display device 105-1). A dependency in that case merely indicates that some video data that could not be received through another path will be received by the device, which in turn will not use it for retransmission. Therefore, this path is not significant for the cutting scheme determination. The order of the display devices of level 2 is decided based on another criterion, such as the device that has the most input paths (and if equal, the selection is a random choice). In the example, the order for level 2, based on the number of input paths, is 105-1 and then 105-2.
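The level-1 ordering described above can be sketched as follows. This is a hedged illustration with an assumed data layout: a node whose input paths all come from already-processed levels is ordered before a node that also depends on a same-level peer.

```python
# Sketch of step 715 for level 1: order the devices of one tree level by the
# number of their input paths that come from peers of the same (not yet
# processed) level.
inputs = {"105-3": {"110"},            # one input path, from level 0
          "105-4": {"110", "105-3"}}   # two input paths, one from peer 105-3
level = {"105-3", "105-4"}
ordered = sorted(level, key=lambda d: len(inputs[d] & level))
# ordered == ['105-3', '105-4']: 105-3 has no same-level dependency,
# so it is processed first
```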
Once the order of the devices has been determined, at step 720 the cutting scheme for the current device is determined. Thanks to the loop from step 735 below to step 720, the cutting scheme for each device of the current level will be determined.
An example of step 720 is illustrated in Figure 8. This step determines how many cutting operations (generally the minimum number) are required by the current device to transmit the video data packets K (that it receives) on their transmission paths according to the distribution scheme.
At step 800, all the elementary spatial portions (Ki in the table above) received by the current device are determined. This may be done by reading the table and selecting each Ki for which a path portion indicates the current device as addressee device.
Note that the source device 110 is considered to receive the nine (K1 to K9) elementary spatial portions as shown in Figure 5.
At step 805, the number of downstream transmission links for the current device (i.e. links on which the current device can send video data) is determined. In the example of Figures 1 and 6, two downstream transmission links are determined for source device 110: one to display device 105-4 and another one to display device 105-3.
Next, step 810 is performed to determine, for each transmission link (through the loop from step 820 to step 810), the elementary spatial portions of the source video stream that the current device has to transmit to other nodes. In other words, this step groups the obtained elementary spatial portions associated with determined routing paths that use the same communication link from the current device.
In the example of source device 110, the following groups of elementary spatial portions Ki are determined: -for the first downstream transmission link (to device 105-4), the set of elementary portions {K1, K2, K3, K6, K9}; -for the second downstream transmission link (to device 105-3), the set of elementary portions {K4, K5, K7, K8}.
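The per-link grouping of step 810 amounts to partitioning the portions held by the device according to the next hop of their routing paths. A minimal sketch follows; the next-hop table is an assumed stand-in for the routing paths of Figure 6.

```python
# Sketch of step 810 at source device 110: group the elementary portions by
# the outgoing communication link (next hop) their routing paths use.
next_hop = {"K1": "105-4", "K2": "105-4", "K3": "105-4",
            "K4": "105-3", "K5": "105-3", "K6": "105-4",
            "K7": "105-3", "K8": "105-3", "K9": "105-4"}

groups = {}
for portion, link in next_hop.items():
    groups.setdefault(link, set()).add(portion)

print(sorted(groups["105-4"]))   # ['K1', 'K2', 'K3', 'K6', 'K9']
print(sorted(groups["105-3"]))   # ['K4', 'K5', 'K7', 'K8']
```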
At step 815, it is determined whether or not concatenating or grouping two or more elementary portions for the current downstream transmission link can be performed. This is to reduce the number of cutting operations to provide the elementary portions to the current transmission link.
For example, this step consists in checking whether or not elementary portions of {K1, K2, K3, K6, K9} or {K4, K5, K7, K8} (in the case of source device 110) are contiguous or adjacent, meaning they are neighbours along the row (horizontal) or column (vertical) dimension, or along both dimensions.
In other words, for each group, at least one grouping solution is searched for that groups contiguous elementary spatial portions of the group that were grouped in the same received video sub-stream (the whole source video stream in the case of source device 110), to form a set of rectangular spatial portions.
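The contiguity test underlying this search can be sketched as follows, assuming the row-major K1..K9 numbering of the 3*3 grid of Figure 5 (a simple illustration, not the patented code):

```python
# Sketch of the contiguity test of step 815 on the 3*3 grid of Figure 5,
# numbered row-major: K1 K2 K3 / K4 K5 K6 / K7 K8 K9.
def cell(k):
    # map a K-number to its (row, column) coordinates: K1 -> (0, 0), K9 -> (2, 2)
    return divmod(k - 1, 3)

def contiguous(ka, kb):
    # two elementary portions are contiguous when they are horizontal or
    # vertical neighbours (Manhattan distance of exactly 1)
    (ra, ca), (rb, cb) = cell(ka), cell(kb)
    return abs(ra - rb) + abs(ca - cb) == 1

print(contiguous(1, 2))   # K1 and K2, horizontal neighbours: True
print(contiguous(6, 9))   # K6 and K9, vertical neighbours:   True
print(contiguous(1, 5))   # K1 and K5, diagonal only:         False
```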
Preferably, the maximum concatenation or grouping is conducted, leading to a minimum number of resulting rectangular spatial portions. This is because a minimum number of cutting operations will then be required at the nodes of the network.
Figure 9 illustrates an example of a process to obtain the maximum concatenation/grouping. This Figure is based on the above example of the first downstream transmission link from source device 110 (to display device 105-4). The elementary spatial portions of this link are K1, K2, K3, K6, K9.
Block 900 represents the received elementary spatial portions as determined at step 800, i.e. K1 to K9.
Block 910 identifies, for the transmission link considered, the elementary spatial portions, namely the set {K1, K2, K3, K6, K9} for the link to device 105-4. The regions corresponding to this set are hatched in matrix 910.
Next, as a cutting operation is only defined for a rectangular sub-area, block sets 915 represent all the cutting operations that a cutting function is able to provide for a 3*3 matrix organisation. The possible cutting operations include: -one 9-elementary-portion-size pattern with a 3*3 area cut (915-1); -two 6-elementary-portion-size patterns with a 3*2 or 2*3 area cut (915-2 or 915-3); -one 4-elementary-portion-size pattern with a 2*2 area cut (915-4); -two 3-elementary-portion-size patterns with a 3*1 or 1*3 area cut (915-5 or 915-6); -two 2-elementary-portion-size patterns with a 2*1 or 1*2 area cut (915-7 or 915-8); -one 1-elementary-portion-size pattern with a 1*1 area cut (915-9).
The pattern that theoretically groups the maximum number of elementary portions of block 910 would be a pattern made of five elementary portions.
However, among the block sets 915, no single pattern has a size equal to 5. Therefore combinations of two patterns having smaller sizes are searched to achieve a five-elementary-portion-size pattern. Several solutions exist with two patterns of block set 915: -{2*2 (915-4), 1*1 (915-9)} -{3*1 (915-5), 2*1 (915-7)} -{3*1 (915-5), 1*2 (915-8)} -{1*3 (915-6), 2*1 (915-7)} -{1*3 (915-6), 1*2 (915-8)}.
Then, it is checked whether or not one (or more) of these solutions matches block 910. In the present example, two solutions match: -the first one with the two patterns 915-5 and 915-8, corresponding to the concatenation of elementary spatial portions into the two groups ({K1, K2, K3}, {K6, K9}); -the second one with the two patterns 915-6 and 915-7, corresponding to the concatenation of elementary spatial portions into the two groups ({K3, K6, K9}, {K1, K2}). If no solution with two patterns of block set 915 matches block 910, then the combinations of three patterns of block set 915 whose total size equals 5 are considered, for example the combinations {2*1 (915-7), 2*1 (915-7), 1*1 (915-9)} or {2*1 (915-7), 1*2 (915-8), 1*1 (915-9)}. And so on: all combinations are checked until a solution is found.
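The pattern search of Figure 9 can be illustrated by exhaustively enumerating the rectangular sub-areas of the 3*3 matrix and testing which disjoint pairs exactly cover the hatched set. This is an illustrative sketch only; the patent does not prescribe this implementation.

```python
# Illustrative sketch of the Figure 9 search: find pairs of axis-aligned
# rectangles from block set 915 whose disjoint union exactly covers the
# hatched cells {K1, K2, K3, K6, K9}.
from itertools import combinations

def cells(k_set):
    # map K-numbers to (row, col) cells on the 3*3 grid (row-major numbering)
    return frozenset(divmod(k - 1, 3) for k in k_set)

# enumerate every rectangular sub-area of the 3*3 matrix (36 in total)
rects = []
for r0 in range(3):
    for c0 in range(3):
        for r1 in range(r0, 3):
            for c1 in range(c0, 3):
                rects.append(frozenset((r, c)
                                       for r in range(r0, r1 + 1)
                                       for c in range(c0, c1 + 1)))

target = cells({1, 2, 3, 6, 9})
solutions = [pair for pair in combinations(rects, 2)
             if pair[0] | pair[1] == target and not pair[0] & pair[1]]
print(len(solutions))   # 2 matching solutions, as found in the text
```

The two solutions found are the 3*1 top row plus the 1*2 column remainder ({K1, K2, K3}, {K6, K9}), and the 1*3 right column plus the 2*1 row remainder ({K3, K6, K9}, {K1, K2}), in agreement with patterns 915-5/915-8 and 915-6/915-7.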
In a variant to finding the minimum number of cutting operations, the iterative process may try to use the maximum of the cutting capacities of each node. In that case, more grouping solutions are possible, and thus more iterations to select the best cutting scheme for the whole distribution tree will be required. This approach advantageously anticipates the needs of cutting operations for downstream devices since a large number of cutting operations have already been processed by upstream devices.
Still referring to the above example at source device 110, step 815 determines the following grouping solutions: -for the first downstream transmission link (to device 105-4), two grouping solutions are found, each made of two resulting rectangular spatial portions: o the first grouping solution made of the set ({K1, K2, K3}, {K6, K9}); o the second grouping solution made of the set ({K3, K6, K9}, {K1, K2}); -for the second downstream transmission link (to device 105-3), only one grouping solution, made of a single spatial portion, is found: o the grouping solution made of the set ({K4, K5, K7, K8}). Step 820 checks whether or not all the downstream transmission links as determined at step 805 have been processed. If not, the next link is processed by looping back to step 810.
Otherwise, last step 825 calculates the number of cutting operations required by the current device for each combination of grouping solutions over the various transmission links: it simply adds the sizes of the sets corresponding to the grouping solutions of the combination.
In the example above for source device 110, whatever the combination is, this value is 3, corresponding to a size value of 2 (for either grouping solution of the first transmission link) added to a size value of 1 (for the second transmission link).
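Step 825 can thus be sketched as follows; the data layout and link names are assumptions, while the figures match the source-device example above.

```python
# Sketch of step 825: the number of cutting operations for a device is the
# sum of the set sizes over the grouping solutions chosen per transmission link.
combination = {"link_to_105-4": [{"K1", "K2", "K3"}, {"K6", "K9"}],  # size 2
               "link_to_105-3": [{"K4", "K5", "K7", "K8"}]}          # size 1
cuts = sum(len(solution) for solution in combination.values())
print(cuts)   # 3, as computed in the text for source device 110
```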
Back to Figure 7, at the end of step 720 one or more combinations of grouping solutions have been determined. Each of them is a candidate cutting scheme for the current node, and represents the cutting operations to be performed by the current node.
Next, to take into account limited video cutting capacity of the current device, step 725 checks whether or not the candidate cutting scheme(s) as determined at step 720 remain under the maximum cutting capacity of the node.
All the candidate cutting schemes that do not respect the cutting capacity are discarded (possibly also discarding candidate cutting schemes determined for previously-processed nodes, if all possible cutting schemes resulting from them are discarded). If no candidate cutting scheme fulfils this cutting capacity criterion, the process fails at step 730. The user may then be advised that some adaptation, such as adding a new device connection, is required to enable video distribution to each display device of the multi-display system.
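The capacity check of steps 725/730 can be sketched as follows. Names and data layout are assumptions; the cut counts are those of the device 105-4 example described with reference to Figure 10 below.

```python
# Sketch of steps 725/730: discard the candidate cutting schemes that exceed
# the node's maximum cutting capacity, and fail if none remains.
candidates = {"first_set": 4, "second_set": 2}   # cuts required at 105-4
capacity = 3                                     # maximum cutting capacity
kept = {name: c for name, c in candidates.items() if c <= capacity}
if not kept:
    # step 730: advise the user that an adaptation of the system is required
    raise RuntimeError("no cutting scheme fits the cutting capacity")
print(kept)   # {'second_set': 2}
```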
In case at least one candidate cutting scheme fulfils the cutting capacity criterion, step 735 checks whether or not all the nodes of the current level "Start_Tree_Level" have been considered. This step may be performed by using an internal counter, incremented when processing a new node of the current level, which can be compared to Ni.
If not, the process loops back to step 720 to consider the next node.
Otherwise, step 740 checks whether or not all the levels of the distribution tree have been processed. If not, the next level is considered at step 745 and the process loops back to step 710. Otherwise, the process ends at step 750 where the remaining candidate cutting scheme or schemes form the final result of step 330, i.e. the cutting scheme or schemes.
Back to Figure 3, once the cutting scheme (in case of several cutting schemes, one may be selected to be applied by all the nodes) has been determined, the transmission of the source video stream starts at step 340, involving video cutting at the nodes according to the determined cutting scheme. The video sub-streams corresponding to the rectangular spatial portions defined in the cutting scheme are thus sent to the destination display devices according to the routing paths determined at step 320.
Figure 10 illustrates step by step the determination of a cutting scheme in the example of Figure 6. This example shows how a grouping solution found for a given node (source device 110) can be discarded when processing a following node (device 105-4).
As already described above, at source node 110, one grouping solution ({K4, K5, K7, K8}) is found for the communication link to display device 105-3, and two grouping solutions ({K1, K2, K3}, {K6, K9}) and ({K3, K6, K9}, {K1, K2}) are found for the communication link to display device 105-4. This requires three cutting operations. This is shown in Figure 10a.
Next, 105-3 is processed before 105-4.
Among the received K4, K5, K7, K8, 105-3 locally uses K7 without a need to transmit it to another node. For the communication link to 105-4, one grouping solution is found to transmit {K5, K8}. For the communication link to 105-1, one grouping solution is found to transmit {K4}. This requires two cutting operations. This is shown in Figure 10b.
Next, 105-4 receives two rectangular portions, ({K1, K2, K3}, {K6, K9}) or ({K3, K6, K9}, {K1, K2}), from source device 110, and receives another rectangular portion ({K5, K8}) from 105-3.
In other words, at step 800, 105-4 receives: -either a first set made of ({K1, K2, K3}, {K6, K9}, {K5, K8}) (1) -or a second set made of ({K3, K6, K9}, {K1, K2}, {K5, K8}) (2). Step 805 determines two downstream transmission links (one to 105-1 and another one to 105-2).
At step 810, the groups of elementary spatial portions for the two links are as follows: -for the transmission link to 105-1, {K1, K2}; -for the transmission link to 105-2, {K3, K5, K6}. At step 815, the grouping or concatenation of the elementary portions is determined in order to minimize the number of cutting operations at device 105-4.
Note that the cutting operations have to be considered with respect to the rectangular portions as received by the node. Therefore, the two cases (1) and (2) have to be studied separately.
Using the first set (1), the grouping solutions found at step 815 are: -for the transmission link to 105-1, a grouping solution requiring a single cutting operation, to separate {K1, K2} from the received rectangular portion {K1, K2, K3}; and -for the transmission link to 105-2, a grouping solution requiring three cutting operations: to separate {K3} from the received rectangular portion {K1, K2, K3}; to separate {K6} from the received rectangular portion {K6, K9}; and to separate {K5} from the received rectangular portion {K5, K8}.
The grouping solutions for the first set (1) thus require four cutting operations by 105-4, which is higher than the maximum cutting capacity of the node (three cutting operations). Step 725 thus fails for this first set. Retroactively, this means that the first set (1), even though it is possible at source device 110, must be discarded because it does not lead to a viable cutting scheme for the full system.
Using the second set (2), the grouping solutions found at step 815 are: -for the transmission link to 105-1, the grouping solution that is identical to the received rectangular portion {K1, K2}. Therefore, no cut is required: device 105-4 can relay the video data received from the source device directly to device 105-1; -for the transmission link to 105-2, a grouping solution requiring two cutting operations: to separate {K3, K6} from the received rectangular portion {K3, K6, K9}; and to separate {K5} from the received rectangular portion {K5, K8}.
The grouping solutions for the second set (2) thus require only two cutting operations, thereby satisfying the check of step 725. This cutting scheme for 105-4 is thus selected for the process. This is shown in Figure 10c.
Next, 105-1 is considered. It receives {K1, K2} from 105-4 and {K4} from 105-3.
Among the received K1, K2, K4, 105-1 locally uses K1 and K4 without a need to transmit them to another node. 105-1 has only one downstream transmission link, to 105-2. For this communication link, one grouping solution requiring a single cutting operation is found, to separate {K2} from the received rectangular portion {K1, K2}. This is shown in Figure 10d.
Next, 105-2 is considered. It receives {K3, K6} and {K5} from 105-4 and {K2} from 105-1.
Among the received K2, K3, K5, K6, 105-2 locally uses K2, K3 and K6 without a need to transmit them to another node. 105-2 has only one downstream transmission link, to 105-1. For this communication link, only {K5}, received as such, has to be transmitted. Therefore, no cut is required: device 105-2 can relay {K5} to device 105-1. This is shown in Figure 10e, which ultimately defines the cutting scheme as determined at step 330, since no device remains.
Although the present invention has been described hereinabove with reference to specific embodiments, the present invention is not limited to the specific embodiments, and modifications which lie within the scope of the present invention will be apparent to a person skilled in the art. Many further modifications and variations will suggest themselves to those versed in the art upon making reference to the foregoing illustrative embodiments, which are given by way of example only and which are not intended to limit the scope of the invention as determined by the appended claims. In particular different features from different embodiments may be interchanged, where appropriate.

Claims (18)

  1. A method of distributing video data in a multi-display system including a plurality of nodes among which at least one source device provides a video stream and a plurality of display devices collectively display video in a combined display area, each display device being a destination display device for spatial part or parts of the video stream it displays in the combined display area, the nodes being interconnected through a communication network, the method comprising, at a given one of the nodes: determining a candidate spatial split of the video stream into elementary spatial portions; determining routing paths in the communication network to distribute respective elementary video sub-streams defined by the elementary spatial portions to corresponding destination display devices; actually cutting the video stream into video sub-streams, wherein a spatial portion of at least one video sub-stream groups elementary spatial portions whose corresponding elementary video sub-streams share the same portion of determined routing path from the given node; and sending the video sub-streams to the destination display devices based on the determined routing paths.
  2. The method of Claim 1, wherein the elementary spatial portions grouped in the spatial portion of the video sub-stream are contiguous elementary spatial portions in frames of the video stream.
  3. The method of Claim 2, wherein the contiguous elementary spatial portions are contiguous in a horizontal or vertical direction.
  4. The method of any of Claims 1 to 3, further comprising determining a cutting scheme that defines operations of actual cutting of video sub-streams to be performed at nodes of the communication network.
  5. The method of Claim 4, wherein the cutting scheme is determined to define a minimum number of operations of actual cutting to be performed at each node of the communication network.
  6. The method of Claim 4, wherein the cutting scheme is determined to favor the use of the full cutting capability at the source device or at nodes close to the source device.
  7. The method of any of Claims 4 to 6, further comprising sending the cutting scheme to the nodes of the communication network.
  8. The method of any of Claims 4 to 7, wherein determining the cutting scheme includes: forming a distribution tree organizing the nodes based on the determined routing paths, ordering the nodes according to a tree-level order, and at the same tree level according to dependencies between the nodes in the distribution tree, and successively selecting the ordered nodes and for each selected node, performing the following steps: obtaining the elementary spatial portions grouped in video sub-streams the selected node has to receive from an upstream node according to the determined routing paths, grouping the obtained elementary spatial portions associated with determined routing paths that use the same communication link from the selected node, and for each group, finding at least one grouping solution that groups contiguous elementary spatial portions of the group that were grouped in the same received video sub-stream, to form a set of rectangular spatial portions.
  9. The method of Claim 8, wherein if the obtained elementary spatial portions have to be received by the selected node according to two or more candidate grouping solutions from a previously-selected node, the step of finding finds at least one grouping solution for each of the two or more candidate grouping solutions from the previously-selected node.
  10. The method of Claim 9, wherein the steps for the selected node further comprise discarding a candidate grouping solution from a previously-selected node if all the grouping solutions for this candidate grouping solution do not meet a criterion relating to a number of cutting operations required to convert the groups of elementary spatial portions in the received video sub-streams into the set of rectangular spatial portions of the grouping solutions.
  11. The method of any of Claims 1 to 10, wherein determining the candidate spatial split is based on the spatial parts of the video stream each destination display device displays in the combined display area.
  12. The method of Claim 11, wherein the display areas of the display devices overlap within the combined display area, and each of the elementary spatial portions corresponds to an area that is entirely displayed by each of the display device or devices that display the elementary spatial portion.
  13. The method of any of Claims 1 to 12, wherein determining the routing paths is based on data rates of the elementary video sub-streams and on data rate capacities of communication links between nodes of the communication network.
  14. A node device of a multi-display system that includes a plurality of nodes among which at least one source device provides a video stream and a plurality of display devices collectively display video in a combined display area, each of the display devices being a destination display device for spatial part or parts of the video stream it displays in the combined display area, the nodes being interconnected through a communication network, the node device comprising: a candidate splitting module configured to determine a candidate spatial split of the video stream into elementary spatial portions; a routing path module configured to determine routing paths in the communication network to distribute respective elementary video sub-streams defined by the elementary spatial portions to corresponding destination display devices; a video cutting module configured to actually cut the video stream into video sub-streams, wherein a spatial portion of at least one video sub-stream groups elementary spatial portions whose corresponding elementary video sub-streams share the same portion of determined routing path from the node device; and a transmission module configured to send the video sub-streams to the destination display devices based on the determined routing paths.
  15. A multi-display system comprising a plurality of nodes among which at least one source device provides a video stream and a plurality of display devices collectively display video in a combined display area, and comprising a communication network interconnecting the nodes, wherein one of the nodes is as in Claim 14.
  16. A non-transitory computer-readable medium storing a program which, when executed by a microprocessor or computer system in a communication device, causes the communication device to perform the steps of the distributing method as in any of Claims 1 to 13.
  17. A node device substantially as herein described with reference to, and as shown in, Figure 2 of the accompanying drawings.
  18. A method of distributing video data substantially as herein described with reference to, and as shown in, Figure 3; Figures 3 and 4; Figures 3, 7 and 8; or Figures 3, 4, 7 and 8 of the accompanying drawings.
GB1401334.6A 2014-01-27 2014-01-27 Methods and devices for distributing video data in a multi-display system using a collaborative video cutting scheme Active GB2522468B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1401334.6A GB2522468B (en) 2014-01-27 2014-01-27 Methods and devices for distributing video data in a multi-display system using a collaborative video cutting scheme

Publications (3)

Publication Number Publication Date
GB201401334D0 GB201401334D0 (en) 2014-03-12
GB2522468A true GB2522468A (en) 2015-07-29
GB2522468B GB2522468B (en) 2016-04-27

Family

ID=50287608

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1401334.6A Active GB2522468B (en) 2014-01-27 2014-01-27 Methods and devices for distributing video data in a multi-display system using a collaborative video cutting scheme

Country Status (1)

Country Link
GB (1) GB2522468B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL2016098B1 (en) * 2016-01-14 2017-07-24 Opticon Sensors Europe B V Method and device for of providing a multitude of video streams.
US11222611B2 (en) 2016-06-14 2022-01-11 Razer (Asia-Pacific) Pte. Ltd. Image processing devices, methods for controlling an image processing device, and computer-readable media

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10367750B2 (en) * 2017-06-15 2019-07-30 Mellanox Technologies, Ltd. Transmission and reception of raw video using scalable frame rate

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2502330A (en) * 2012-05-24 2013-11-27 Canon Kk Controlling video projectors of a multi-projector system



Also Published As

Publication number Publication date
GB2522468B (en) 2016-04-27
GB201401334D0 (en) 2014-03-12

Similar Documents

Publication Publication Date Title
US11612016B2 (en) Fronthaul interface for advanced split-radio access network (RAN) systems
US8416776B2 (en) Communication channel building device and N-tree building method
CN111294281B (en) Communication method and device based on Service Function Chain (SFC)
KR101826701B1 (en) Method and system for multiplexing data streaming in audio/video networks
US20220078631A1 (en) Management plane functionality for switched network shared cell configuration of open radio access network (o-ran) system
CN110121155A (en) Broadcasting method, device, equipment and the system of virtual network group
KR101805628B1 (en) Method and system for isochronous communication in audio/video networks
US9660836B2 (en) Network topology discovery
GB2522468A (en) Methods and devices for distributing video data in a multi-display system using a collaborative video cutting scheme
CN114465920A (en) Method, device and system for determining corresponding relation
US20220131796A1 (en) Control apparatus, control method and program
JP5800553B2 (en) Distribution device, video distribution method, and program
CN106954004A (en) Screen sharing method and device
US9071768B2 (en) Method of transmitting video information over a wireless multi-path communication link and corresponding wireless station
CN106209671A (en) A kind of method and device determining that routing overhead is shared
US9686516B2 (en) Method and device for improving configuration of communication devices in a video projection system comprising multiple wireless video projectors
US20230208722A1 (en) Communication method and related apparatus
JP2020088720A (en) Electronic device
US9300979B2 (en) Methods for transmitting and receiving data contents, corresponding source and destination nodes and storage means
GB2526148A (en) Seamless display of a video sequence with increased frame rate
CN113169938B (en) Method for multi-channel discovery with partially disjoint paths
US11012347B2 (en) Communication apparatus, communication control method, and communication system for multilink communication simultaneously using a plurality of communication paths
GB2528275A (en) Method and apparatus for system network installation using hybrid wired wireless connections
US20220109961A1 (en) Method for the transmission of a frame by an access point of a wireless local area network
CN108696415A (en) A kind of method and apparatus sent and received information