GB2526618A - Method for generating a screenshot of an image to be displayed by a multi-display system - Google Patents

Method for generating a screenshot of an image to be displayed by a multi-display system

Info

Publication number
GB2526618A
Authority
GB
United Kingdom
Prior art keywords
display
screenshot
image data
display devices
display system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1409633.3A
Other versions
GB2526618B (en)
GB201409633D0 (en)
Inventor
Tristan Halna Du Fretay
Brice Le Houerou
Falk Tannhauser
Pascal Lagrange
Arnaud Closset
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to GB1409633.3A priority Critical patent/GB2526618B/en
Publication of GB201409633D0 publication Critical patent/GB201409633D0/en
Publication of GB2526618A publication Critical patent/GB2526618A/en
Application granted granted Critical
Publication of GB2526618B publication Critical patent/GB2526618B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • H04N5/93Regeneration of the television signal or of selected parts thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/02Composition of display devices
    • G09G2300/026Video wall, i.e. juxtaposition of a plurality of screens to create a display screen of bigger dimensions
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/02Handling of images in compressed format, e.g. JPEG, MPEG
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G2340/125Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/02Networking aspects
    • G09G2370/022Centralised management of display operation, e.g. in a server instead of locally
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/16Use of wireless transmission of display information
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players

Abstract

A screenshot of a composite image 107 to be collectively displayed by a plurality of display devices 100, 101, 102, 103 of a multi-display system is generated according to display layout information. The method comprises the following steps: obtaining, from the display devices of the multi-display system, image data of the composite image to be respectively displayed by the display devices, at a capture timestamp shared between said display devices; and generating said screenshot of the composite image from the obtained image data, using said display layout information. The display devices may be video projectors, televisions or other display screens. The screenshot may be generated by one of the display devices or by an external device 111 connected to the display system. The method may include the step of deleting duplicated data corresponding to overlapping areas displayed by two or more display devices so that a single occurrence is kept.

Description

METHOD FOR GENERATING A SCREENSHOT OF AN IMAGE TO BE DISPLAYED
BY A MULTI-DISPLAY SYSTEM
FIELD OF THE INVENTION
The present invention relates in general to multi-display systems for displaying composite images. In particular, it relates to methods and devices for generating a screenshot of such a composite image to be displayed by a multi-display system in a display area.
BACKGROUND OF THE INVENTION
Contexts exist where a very large display or projection area is required, for instance in a dome, a stadium or a concert hall, or for projection on buildings. Covering such large areas by merely increasing the distance between a display device and a screen yields a larger, but also darker, image since the brightness decreases with the square of the distance.
Consequently, multi-display systems comprising a plurality of display devices are increasingly used since they allow displaying images with a sufficient definition and brightness. The plurality of display devices is able to collectively display image or video data, thereby forming a composite image.
Each individual display device handles a sub-image of the composite image to be displayed, with a given definition and size. The different display devices cover adjacent and sometimes partially overlapping zones in order to ensure a smooth transition between different displayed sub-images and provide a tolerance against small displacements of the display devices which may be introduced, for example, by vibrations or by thermal expansion.
In the case where a user attending the display wishes to capture a screenshot of such a composite image, the user may for example take a picture of the display area using a camera. However, this widely used solution often provides poor results, as the screenshot may be blurry or underexposed due to low light in the room.
A solution is thus to obtain a screenshot from the multi-display system displaying the composite image. However, when requesting such a screenshot from one of the display devices of the multi-display system, it may happen that this display device has no knowledge of all the data corresponding to the composite image. Typically, the requested display device may only have knowledge of the sub-image that it will display in the display area.
Furthermore, several source devices like servers may be connected to different display devices, called master display devices, that each redistributes data from its connected source device to other display devices of the multi-display system through a communication network.
The multi-display system may for instance provide a picture-in-picture (PIP) service, where first data from a first source are displayed as background, while second data from a second source are displayed on top of the background. In addition, the first and second data may be displayed with different resolutions and sizes.
Since the displayed image is a composition of data from different sources partly redirected by different master display devices to each concerned display device, it may happen that no device in the multi-display system has knowledge of all the data (from all sources) that will be displayed by the multi-display system in the display area.
In particular, the displayed composite image made of data from several sources may have no centralized existence in the multi-display system.
An object of this invention is to provide a method allowing a user of an external device to obtain a good-quality screenshot of a composite image displayed at a display area (e.g. a screen).
SUMMARY OF THE INVENTION
The present invention has been devised to address one or more of the foregoing concerns, with a view to providing a good-quality screenshot of a displayed composite image.
In this context, according to a first aspect of the invention, there is provided a method for generating a screenshot of a composite image to be collectively displayed by a plurality of display devices of a multi-display system according to display layout information, the method comprising the following steps: obtaining, from the display devices of the multi-display system, image data of the composite image to be respectively displayed by the display devices, at a capture timestamp shared between said display devices; and generating said screenshot of the composite image from the obtained image data, using said display layout information.
Thus, the present invention allows generating a good-quality screenshot of the composite image displayed by the display devices of the multi-display system, independently of the number of sources from which the corresponding image data come. Even if the composite image has no centralized existence in the multi-display system, a coherent screenshot of it may be generated, thanks to image data corresponding to sub-images of the composite image that are obtained, at a same capture timestamp, from the display devices that respectively display these sub-images, and based on display layout information.
Optional features of the invention are further defined in the dependent appended claims.
In some embodiments, the screenshot is generated by one of the display devices of the multi-display system.
In other embodiments, the screenshot is generated by an external device connected to the multi-display system.
In some embodiments, the image data of the composite image are obtained, upon request from an external device connected to the multi-display system, by a display device of the multi-display system acting as a server receiving the request.
In some embodiments, the capture timestamp defines a time in the future relative to the time of the request from the external device.
Thus, all the display devices receive the capture order (capture request), from the external device or from the server device depending on the embodiment, before the date of the capture timestamp, at which they respectively have to capture the corresponding image data.
In some embodiments, the method further comprises obtaining display layout information defining how the display devices display respective subparts of the composite image at the capture timestamp, and generating the screenshot is based on said obtained display layout information.
In some embodiments, the method further comprises deleting, from the obtained image data, duplicated data corresponding to overlapping areas displayed by two or more display devices, so as to keep a single occurrence of image data per overlapping area.
In some embodiments, deleting is performed before any transmission of image data to an external device.
In some embodiments, said image data are obtained prior to any processing for adapting the image data to display.
In some embodiments, the method further comprises determining whether said external device connected to the multi-display system belongs to a list of authorized external devices, and, upon negative determination, providing an error message to the external device, thereby prohibiting it from obtaining a screenshot or any image data from the multi-display system.
Thanks to this provision, it is possible to control the access of external devices to a screenshot of the displayed image.
In some embodiments, said image data of the composite image are obtained block by block.
Thus, buffering memory is saved in the server and/or in all the display devices.
According to a second aspect of the invention, there is provided a device for generating a screenshot of a composite image to be collectively displayed by a plurality of display devices of a multi-display system according to display layout information, the device comprising: a communication module for obtaining, from the display devices of the multi-display system, image data of the composite image to be respectively displayed by the display devices, at a capture timestamp shared between said display devices; and a processing unit for generating said screenshot of the composite image from the obtained image data, using said display layout information.
In some embodiments, the device for generating a screenshot is a display device of a multi-display system.
In other embodiments, the device for generating a screenshot is an external device connected to said multi-display system, for instance by low-speed communication means.
At least parts of the method according to the invention may be computer implemented. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects which may all generally be referred to herein as a "device" or "system". Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
Since the present invention can be implemented in software, the present invention can be embodied as computer readable code for provision to a programmable apparatus on any suitable carrier medium, for example a tangible carrier medium or a transient carrier medium. A tangible carrier medium may comprise a storage medium such as a floppy disk, a CD-ROM, a hard disk drive, a magnetic tape device or a solid state memory device and the like. A transient carrier medium may include a signal such as an electrical signal, an electronic signal, an optical signal, an acoustic signal, a magnetic signal or an electromagnetic signal, e.g. a microwave or RF signal.
The invention also concerns a multi-display system as hereinbefore described, with reference to, and as shown in, Figure 1 of the accompanying drawings.
The invention also concerns a display device as hereinbefore described, with reference to, and as shown in, Figure 2 of the accompanying drawings.
The invention also concerns a method of generating a screenshot as hereinbefore described, with reference to, and as shown in, Figure 3a to Figure 4 of the accompanying drawings.
The display device, the external device and the multi-display system present the same features and advantages as the corresponding steps of the method that they may implement.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will now be described, by way of example only, and with reference to the following drawings in which:
- Figure 1 schematically shows an exemplary multi-display system according to some embodiments;
- Figure 2 represents a possible architecture for a display device of the multi-display system of Figure 1;
- Figures 3a, 3b and 4 show steps of a method for generating a screenshot of a composite image to be collectively displayed by the multi-display system according to some embodiments;
- Figure 5 shows an entry of exemplary display layout information.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
Figure 1 shows an example of multi-display system comprising a plurality of display devices, according to some embodiments.
A multi-display system according to some embodiments comprises at least one source device, for example a server (e.g. video server 105 or 106), a digital video camera, a hard-disk or solid-state drive, a digital video recorder, a personal computer, a set-top box, a video game console or the like.
In the general case, each source device is connected to a display device by means of, for instance, an HDMI or DisplayPort connection, or DVI or SDI cables. In other embodiments, the display device may be linked to the source device by wireless means, for instance using 60GHz wireless technology.
On the other side, these display devices are connected to other display devices to which data from the source device are distributed.
In the general case, the display devices communicate with one another through a wireless network, for instance a 60GHz wireless network, or a multi-gigabit Ethernet network.
The display devices may be video projectors (e.g. video projectors 102 and 103), any other kinds of projectors, televisions, or any other device with a screen that is able to display an image.
In the example shown in Figure 1, the multi-display system is a video projection system for projecting collectively a composite image (composed of image data 108 and 109 respectively from video servers 105 and 106) from a video data stream, using a plurality of display devices embodied in Video Projectors (VP) 100, 101, 102 and 103. In particular, the video projectors 102 and 103 are connected to other video projectors 100 and 101 to which data from the video servers 105 and 106 are respectively distributed, and the video projectors 100 to 103 are connected through a high speed network 104 able to provide several Gbps data throughput for video exchanges without need of compression technology.
Such a system may manage the display of high quality video (for example 4k2k video made of images of 3840x2160 pixels) on a large area with standard video projectors (e.g. supporting 1080p HD video of 1920x1080 pixels).
Back to the general case, at least one display device (here the VP 103) is equipped with network capabilities for communicating with an external device 111, typically handled by a user 112, through another network called the external network. This external network is preferably wireless. It may be for instance a low speed wireless network 110. This is because, in the described embodiments, it is planned to transmit image data, possibly compressed, and thus the network requirements are less demanding than for video streaming, for instance. However, the external network may also be a high speed network.
In other embodiments, the external device 111 is equipped with other means of communication (possibly wired) to communicate with at least one of the display devices.
The external network may be secured or not. For instance, only a list of authorized external devices may use this network to obtain data from the connected display device(s). The exchanges over the external network may be conditioned upon the validity of a certificate provided by the external device 111. This certificate may be verified by the display device connected to the external network.
The external device 111 is for instance a mobile device, like a smartphone or a tablet. It may also be a personal computer.
Although four display devices are shown in Figure 1, the present invention is not limited thereto. The person skilled in the art may consider other configurations with more (and different kinds of) display devices and apply to them the teachings described here for the case of four display devices.
The functioning of the different aforementioned devices is now described.
Each source device (e.g. video server 105 or 106) is configured to generate image or video data to be displayed in a display area, for example on a screen (not shown) by the multi-display system, as a sub-image of a composite image.
In the example of Figure 1, the composite image 107 displayed by the multi-display system is composed of a background 108 and a picture-in-picture (PIP) 109, respectively pertaining to video data streams from video servers 105 and 106.
Each source device (e.g. video server 105 or 106) is also configured to send the parts 108 and 109 of the composite image respectively to the display devices 102 and 103.
Thus, the display devices (102 and 103) are configured to receive image data 108 and 109 (in this example, from a video data stream) from the source device (e.g. video server 105 or 106) and to split the received data into sub-parts (correspondingly sub-streams of video data) that are distributed to the other display devices (here the video projectors 100 and 101) in charge of displaying image data from the source device (here video server 105 or 106).
For instance, the image data 108 sent by the video server 105 are split into four sub-parts, and each sub-part is sent by the VP 102 to one of the VP 100 to 103.
Correspondingly, the image data 109 sent by the video server 106 may be split into two sub-parts, one of which is sent by the VP 103 to the VP 102.
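As a purely illustrative sketch of this splitting step, the function below cuts a source frame into the sub-parts listed in a layout table and assigns each part to the display device that must render it. The field names and the use of NumPy arrays are assumptions for the example, not taken from the patent; the layout table itself is detailed later with reference to Figure 5.

```python
import numpy as np

def split_for_display_devices(frame, layout):
    """Cut a source frame (H x W x 3 array) into the sub-images listed in the
    layout table and key them by the display device that must show them."""
    parts = {}
    for entry in layout:
        x, y = entry["x_offset"], entry["y_offset"]
        w, h = entry["width"], entry["height"]
        parts[entry["display_device_id"]] = frame[y:y + h, x:x + w].copy()
    return parts

# Example: a 3840x2160 frame distributed over four 1080p projectors (2x2 grid).
layout = [{"display_device_id": i, "width": 1920, "height": 1080,
           "x_offset": 1920 * (i % 2), "y_offset": 1080 * (i // 2)}
          for i in range(4)]
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
sub_images = split_for_display_devices(frame, layout)
assert sub_images[3].shape == (1080, 1920, 3)
```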
All the display devices also share display layout information describing how the data from the different sources are distributed to and displayed by each of them.
Exemplary display layout information will be described hereafter with reference to Figure 5.
Each display device is configured to compute blending areas for overlapping areas in order to compensate chrominance and luminance differences between the display devices, so that the composite image 107 appears to be displayed by a single display device.
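The patent does not detail how the blending is computed; purely as an illustration of the general technique, a classic approach applies complementary intensity ramps across the overlap so that the contributions of two adjacent projectors always sum to one. The sketch below is such a generic edge-blending ramp, not the specific compensation described here.

```python
import numpy as np

def edge_blend_weights(width, overlap, side):
    """Per-column weights for one projector: 1.0 in its exclusive zone and a
    linear ramp down to 0.0 across the shared (overlapping) columns."""
    weights = np.ones(width)
    ramp = np.linspace(1.0, 0.0, overlap)
    if side == "right":        # the overlap lies on the right edge of this sub-image
        weights[-overlap:] = ramp
    else:                      # the overlap lies on the left edge
        weights[:overlap] = ramp[::-1]
    return weights

# Two 1920-pixel-wide projectors sharing a 128-pixel overlap: at every shared
# column the two weights sum to 1, so the seam keeps a uniform brightness.
left = edge_blend_weights(1920, 128, side="right")
right = edge_blend_weights(1920, 128, side="left")
assert np.allclose(left[-128:] + right[:128], 1.0)
```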
Also, at least one display device is configured to receive and execute requests (e.g. the screenshot request or capture request described below) from an external device 111, for obtaining a screenshot of the composite image or the image data necessary to generate such a screenshot. Correspondingly, the external device 111 is configured to generate and send such requests through the external network 110.
In some embodiments, only one of the display devices called server (e.g. 103) is connected to the external network and receives the request (screenshot request) from the external device. In alternative embodiments, all the display devices are connected to the external network and receive requests from the external device (capture requests).
Thus, as will be described in further detail hereafter, according to a first scenario, the external device 111 may request a screenshot from a particular display device, called server display device, configured to obtain the necessary data to generate the screenshot, and configured to generate the screenshot based on the obtained data. The screenshot thus generated is then sent to the external device 111 in response to the request.
According to a second scenario, the external device 111 may request the image data necessary to generate a screenshot from a particular display device, called server display device, configured to obtain the necessary data and to transmit it to the external device 111 in response to the request. The screenshot is then generated by the external device 111, based on the received data.
According to a third scenario, the external device 111 may request the image data necessary to generate a screenshot from all the display devices of the multi-display system, each of which sends the necessary data to the external device 111 in response to the request. The screenshot is then generated by the external device 111, based on the received data.
Other scenarios may be envisaged.
Figure 2 shows a possible architecture for a display device 208 (e.g. VP 100, 101, 102, 103 of Figure 1), according to some embodiments.
In this example, the architecture comprises an image distribution processing unit 202 configured to handle image data (for example from a video stream) received through, for instance, an HDMI or DisplayPort input 209.
According to a first aspect, typically present in the display devices, called master display devices, that are connected to the source devices, the image distribution unit 202 is configured to split the input image data into several parts (or video sub-streams), according to display layout information describing how the data from the different sources are distributed to and displayed by the display devices of the multi-display system.
For instance, considering that the composite image to be collectively displayed is partitioned into sub-images, the display layout information may be a table comprising an entry for each sub-image composing the composite image. Figure 5 shows an exemplary entry of this table. In this example, each entry is composed of several fields. The first field 500 comprises an identifier ID of the sub-image it concerns. The second and third fields 501 and 502 comprise the horizontal and vertical sizes (width and height), in pixels, of the sub-image. The fourth and fifth fields 503 and 504 comprise the horizontal and vertical offsets, in pixels, that define the position of the first (top left) pixel of the sub-image within the composite image to be collectively displayed in the display area. The field 505 comprises an identifier of the master display device connected to the source device from which the sub-image comes. The field 506 comprises an identifier of the display device displaying the sub-image.
Other fields may be present, and the embodiments are not limited to the fields mentioned above. For instance, the vertical and horizontal offsets of the first pixel of the sub-image by reference to the first pixel displayed by the display device may be specified in an additional field.
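Purely for illustration, one entry of such a table could be modelled as below; the class name, field names and example values are hypothetical (the values assume the four-projector layout of Figure 1).

```python
from dataclasses import dataclass

@dataclass
class LayoutEntry:
    """One entry of the display layout table (fields 500 to 506 of Figure 5)."""
    sub_image_id: int        # identifier of the sub-image (field 500)
    width: int               # horizontal size in pixels (field 501)
    height: int              # vertical size in pixels (field 502)
    x_offset: int            # horizontal offset of the top-left pixel (field 503)
    y_offset: int            # vertical offset of the top-left pixel (field 504)
    master_device_id: int    # master display device fed by the source (field 505)
    display_device_id: int   # display device rendering the sub-image (field 506)

# Hypothetical table for a 3840x2160 composite image split over four projectors,
# all fed through the master display device with identifier 102.
layout = [
    LayoutEntry(0, 1920, 1080,    0,    0, master_device_id=102, display_device_id=100),
    LayoutEntry(1, 1920, 1080, 1920,    0, master_device_id=102, display_device_id=101),
    LayoutEntry(2, 1920, 1080,    0, 1080, master_device_id=102, display_device_id=102),
    LayoutEntry(3, 1920, 1080, 1920, 1080, master_device_id=102, display_device_id=103),
]
```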
Back to Figure 2, the image data (possibly coming from several source devices) are received by a given display device through its high speed networking unit 203 and/or through its port 209 (in the case of a master display device), and are then forwarded to the image distribution processing unit 202. The image distribution processing unit 202 may also receive image data (or video sub-streams) provided by other display devices, through the high speed networking unit 203. The high speed networking unit 203 is thus configured to receive and transmit data through the high speed network 104 shown in Figure 1, between the display devices. It may send data coming from the image distribution processing unit 202 or received from other display devices, to other display devices. Data received from other display devices may be transmitted to the image distribution processing unit 202 for local display.
The image distribution processing unit 202 is also configured to merge together, in accordance with the above-mentioned display layout information, all the image data from several source devices (or all sub-streams) that the display device 208 has to display. The resulting image data are provided to the display processing unit 201.
The display processing unit 201 is configured to process the merged image data thus provided, in view of the collective display. This processing may include chrominance or luminance corrections, geometrical corrections like keystone correction. The processed image data are then forwarded to a display unit 200 that displays them. The display unit 200 comprises for instance an LCD matrix and lens.
According to the present invention, upon request from a display device acting as a server or from an external device (e.g. the external device 111), the display device 208 may capture, at the output of the image distribution processing unit 202 (i.e. between modules 202 and 201), the image data that it is about to display, so that all the image data to be displayed by the display device, possibly coming from several source devices, are captured, but are not yet impacted by display-specific processing (color and/or geometric corrections).
Now, several particular embodiments are described. Thus, the features now described may not be present in all the display devices, depending on the embodiments concerned.
According to first embodiments, some of the display devices, for example one particular display device, act as servers towards the external network (e.g. the low speed wireless network 110) to which at least one external device (e.g. the external device 111) is connected. For instance, in addition to the communications with the other display devices using the high speed networking unit 203, the server communicates with the external device over the low speed network 110 of Figure 1 using a low speed networking unit 205.
Also, the low speed networking unit 205 is configured to receive requests from the external device 111 connected to the corresponding low speed network. It is also configured to execute and respond to such requests. In particular, these servers are configured to obtain a copy (or a capture) of image data to be displayed by the display devices at the same timestamp, i.e. at the same date which is set in advance based on a common clock reference shared by all the display devices, in order to get image data from the same composite image, as will be described in detail below. The timestamp is generally determined by a given display device, for instance chosen by a user during the installation of the multi-display system.
In some embodiments, an example of which is shown in Figure 2, the server also comprises a frame reconstruction module 204 configured to merge image data corresponding to the same timestamp. Image data to be merged are received from all the display devices, for instance using the high speed networking unit 203.
Merging is done based on the display layout information, to generate a screenshot of the composite image to be displayed collectively by the display devices.
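An illustrative sketch of this merging step is given below: each captured sub-image is pasted at the position given by its layout entry to rebuild the composite image. The dictionary keys, field names and use of NumPy are assumptions for the example.

```python
import numpy as np

def reconstruct_screenshot(captures, layout, size):
    """Paste each captured sub-image at its layout position in a blank canvas
    of the composite image (size is given as (height, width))."""
    height, width = size
    screenshot = np.zeros((height, width, 3), dtype=np.uint8)
    for entry in layout:
        sub = captures[entry["display_device_id"]]
        x, y = entry["x_offset"], entry["y_offset"]
        h, w = sub.shape[:2]
        screenshot[y:y + h, x:x + w] = sub
    return screenshot

# Example: four 1920x1080 captures tiled into a 3840x2160 screenshot.
layout = [{"display_device_id": i,
           "x_offset": 1920 * (i % 2), "y_offset": 1080 * (i // 2)}
          for i in range(4)]
captures = {i: np.full((1080, 1920, 3), i * 60, dtype=np.uint8) for i in range(4)}
image = reconstruct_screenshot(captures, layout, size=(2160, 3840))
assert image.shape == (2160, 3840, 3)
```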
In these embodiments, the screenshot is generated by the frame reconstruction module 204 of the server and forwarded to the low speed networking unit 205 to be transmitted to the external device 111, in response to a request for a screenshot.
In alternative embodiments (not shown), the external device may request the image data instead of the screenshot itself, and the screenshot is generated by the external device which comprises a frame reconstruction module 204 as described above. In these alternative embodiments, the server executes the request, obtains a capture of the image data that all the display devices will display at the same timestamp, and transmits it with the display layout information, to the external device without generating the screenshot. The captured image data are received by the external device using its own low speed networking unit (similar to module 205) and transmitted to its own frame reconstruction module for generating a screenshot based on the received image data and on the display layout information.
In second embodiments (not shown), all the display devices of the multi-display system are equipped with a low speed networking unit as above described with reference to servers, and the external device sends requests to all the display devices in order to obtain the image data necessary to generate a screenshot on its own, using a frame reconstruction module 204 as above-described with reference to the first embodiments.
Optionally, at least one of the display devices (e.g. the server, if any, or all the display devices) may comprise compression means configured to reduce the amount of bits of the image data that will be sent over the low speed network. In the example of Figure 2, the compression means 207 are used before the generation of a screenshot. In a variant (not shown), the compression means 207 may be used after the generation of a screenshot (and before it is sent over the low speed network). In such a case, module 207 may be placed between modules 204 and 205. The compression means may be a software or hardware JPEG compression engine.
Optionally, the requested data (screenshot, or captured image data and display layout information) are sent block by block. Advantageously, the size of the transmission buffer of the display devices required for transmitting the captured image data is thus decreased. The required reception buffer of the server is also reduced.
Optionally, as mentioned before, the low speed network may be secured so that only a list of authorized external devices may obtain data (e.g. image data, or screenshots) from the connected display device(s). Thus, the display device(s) equipped with a low speed networking unit may also have such list in memory.
Optionally, the requested image data or screenshots may be provided in encrypted form. To that end, the low speed networking unit 205 sending them comprises cryptographic means.
According to some embodiments, it is also possible for a first external device to request a screenshot for at least one second external device. In that case, a copy of the screenshot is provided to the second external device(s). A typical use case is when a speaker wants to send a displayed slide to the spectators attending the displayed presentation.
Figures 3a and 3b represent steps of a method for generating a screenshot according to some embodiments. In this example, steps 300 to 304 are performed by the external device 111 (Figure 3a) and steps 310 to 319 are performed by the display device 103 acting as a server which is connected to the external device 111 (Figure 3b).
At step 300, the external device 111 connects to the low speed network 110 to which the server 103 is also connected. Depending on the low speed network protocol, this step may include network scanning, search for multi-display system services and/or authentication. For instance, the multi-display system may accept only external devices of a list of authorized external devices to request and obtain a screenshot (or data to generate a screenshot).
In this example, a graphical user interface (GUI) of the external device is initialized at step 301. The GUI provides a tool enabling a user of the external device to request a screenshot of a composite image displayed by the multi-display system.
According to some embodiments, an application providing the GUI is configured to generate the required screenshot based on captured image data received from the multi-display system. According to other embodiments, the required screenshot is generated by the server 103.
At step 302, it is checked if the user requests a screenshot. In the present example, a screenshot request is then built and transmitted (step 303) to the server 103 through the low speed network 110. The screenshot request is either for obtaining a screenshot or for obtaining the data (e.g. image data and display layout information) necessary to generate a screenshot. In other embodiments (not shown), the request is transmitted to each display device 100 to 103 when they are all connected to the low speed network 110.
The request may also specify the maximum amount of bits of the data to be sent back in response, the desired format (jpeg, bmp...), the size of the screenshot or of the captured sub-image, or other information.
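For illustration only, such a screenshot request could be modelled as the small structure below; every field name is hypothetical and the defaults are arbitrary.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScreenshotRequest:
    """Illustrative payload sent by the external device over the external network."""
    want_raw_data: bool = False           # False: ask for the screenshot itself;
                                          # True: ask for image data plus layout info
    max_size_bytes: Optional[int] = None  # upper bound on the size of the response
    image_format: str = "jpeg"            # desired format, e.g. "jpeg" or "bmp"
    width: Optional[int] = None           # requested screenshot width, if resizing
    height: Optional[int] = None          # requested screenshot height, if resizing
    certificate: Optional[bytes] = None   # credential checked by the server, if any

request = ScreenshotRequest(max_size_bytes=2_000_000, image_format="jpeg")
```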
Back to the example of Figure 3a, at step 304, the external device 111 receives the requested data in response to the request (i.e. the screenshot itself or the data necessary to generate it) and stores them (for instance for later use) or processes them (for instance for generating a screenshot or for displaying it in a way adapted to the screen of the external device).
Now, steps performed by the server 103 upon receiving (step 310) the screenshot request from the external device 111 are described with reference to Figure 3b.
Optionally, upon receiving the screenshot request from the external device 111 at step 310, the server 103 may check a certificate comprised in the request, or may check if the external device belongs to a list of authorized devices. If the external device 111 does not belong to this list, or if the request does not comprise a valid certificate, the server 103 may send an error message in response to the request and the process stops.
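A minimal sketch of this optional access check is shown below, with a placeholder certificate verification; none of the names come from the patent.

```python
from typing import Callable, Optional

def authorize(device_id: str,
              certificate: Optional[bytes],
              allowed_devices: set,
              verify_certificate: Callable[[Optional[bytes]], bool] = lambda c: c is not None):
    """Return (granted, message): unknown devices or invalid certificates receive
    an error message instead of a screenshot or image data."""
    if device_id not in allowed_devices:
        return False, "error: external device is not in the list of authorized devices"
    if not verify_certificate(certificate):
        return False, "error: invalid or missing certificate"
    return True, "ok"

granted, message = authorize("tablet-42", b"certificate", allowed_devices={"tablet-42"})
assert granted
```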
If no negative check is performed, the process goes to step 311, during which the server 103 retrieves display layout information describing how the data from the different source devices (e.g. video servers 105 and 106) are distributed to and displayed by the display devices of the multi-display system. In practice, a copy of this display layout information is stored in the configuration layer of each display device.
The display layout information allows defining, at step 312, the area that each display device has to capture. In particular, when the display devices have already applied the edge-blending process, defining the capture area comprises determining which pixels have to be discarded from image data so that image data corresponding to sub-images of the composite image are captured without any duplicated pixel. In the present embodiments, the capture area is defined by the server.
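A simplified sketch of such a capture-area computation is given below: each device's display rectangle is trimmed so that any column or row also displayed by a left or upper neighbour is discarded, keeping a single occurrence of each overlapping pixel. The convention of assigning overlaps to the left/top device, and the assumption of a grid-like layout, are illustrative choices rather than the patent's rule.

```python
def capture_rect(display_rect, neighbours):
    """Trim a (x, y, w, h) display rectangle, in composite-image coordinates, so
    that pixels also displayed by the listed neighbours are captured only once.
    Assumes overlaps occur along shared edges of a grid-like layout."""
    x, y, w, h = display_rect
    for nx, ny, nw, nh in neighbours:
        # overlap with a neighbour located to the left: shrink the left edge
        if ny < y + h and y < ny + nh and nx < x < nx + nw:
            shift = (nx + nw) - x
            x, w = x + shift, w - shift
        # overlap with a neighbour located above: shrink the top edge
        if nx < x + w and x < nx + nw and ny < y < ny + nh:
            shift = (ny + nh) - y
            y, h = y + shift, h - shift
    return x, y, w, h

# Two 1920-wide sub-images overlapping by 128 columns: the right-hand device
# discards the 128 shared columns, which stay with the left-hand device.
assert capture_rect((1792, 0, 1920, 1080), [(0, 0, 1920, 1080)]) == (1920, 0, 1792, 1080)
```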
In other embodiments (for instance, but not only, where the request is transmitted to each display device), each display device may define its own capture area (upon reception of the capture request, for instance), based on shared display layout information and shared capture area definition rules.
In any case, only one device determines the timestamp and shares it. In embodiments where the request is transmitted to each display device, the device determining the timestamp may for instance be designated by a user during the installation of the multi-display system.
In some embodiments, the data are captured before any duplication of pixels by the display devices, and so no cropping is required.
At step 313, a capture timestamp is defined, at which the display devices will have to perform the capture in order to ensure that all display devices will capture sub-images of the same composite image. This is particularly useful when video data, or an image from a succession of images, is displayed. Since all the display devices of the multi-display system share (e.g. through the high speed network 104) a same network clock reference, the capture timestamp is defined based on this clock reference. In addition, the timestamp is a date in the future, so that a request for capture (the capture request is described in detail below) sent by the server 103 at step 315 is received by all the display devices before this date.
In practice, the timestamp is defined so that the time needed by the system to reach step 315, the time needed to deliver the capture requests to other display devices over the network 104, and the time for each display device to process the capture request, are all taken into account. The timestamp is at least equal to the sum of the current date and those three durations.
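As a toy illustration of this computation, the sketch below simply adds the three durations (plus an arbitrary safety margin) to the current shared-clock time; the function name and all numeric values are invented for the example.

```python
def capture_timestamp(now_us,
                      local_processing_us=2_000,     # time to reach step 315
                      network_delivery_us=5_000,     # delivery of the capture requests
                      device_processing_us=3_000,    # per-device request processing
                      margin_us=10_000):             # extra safety margin
    """Pick a capture date (shared-clock microseconds) far enough in the future
    for every display device to receive and process the capture request in time."""
    return (now_us + local_processing_us + network_delivery_us
            + device_processing_us + margin_us)

timestamp = capture_timestamp(now_us=1_000_000)
assert timestamp > 1_000_000
```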
After having defined the timestamp, a capture request is built and transmitted to each display device (including the server itself, through inter-process messaging) by the server. The capture request comprises the capture timestamp. It may also include the area to be captured (capture area) as defined at step 312, so that each display device transmits only the image data corresponding to the targeted area to the server 103. Thus, bandwidth is saved.
Upon receiving all the data for the captured areas from the display devices, the corresponding data are stored in the server 103, at step 316.
In some embodiments, the server may generate a screenshot based on the display layout information and the received and stored captured data, by merging all the different areas into a single composite image (step 317). In other embodiments, the screenshot is not generated by the server 103 but by the external device 111 requesting the data necessary to generate the screenshot on its own (i.e. image data from the display devices and display layout information).
Optionally, the server may format (step 318) the generated screenshot or the stored image data captured by all the display devices in order to compress, crop or resize it or them, thus decreasing the amount of bits of the data before any transmission to the external device 111. Again optionally, the response to the request may be encrypted.
At step 319, the screenshot or the captured image data, formatted where applicable, are transmitted to the external device 111 through the low speed network.
The response to the request may be sent in one monolithic block or successively in several blocks (e.g. line by line). With the latter approach, the captured image data may be progressively provided by the display devices to the device generating the screenshot (i.e. the server or the external device). Thus, a first piece of the data captured by each display device is obtained and processed (and possibly transmitted to the requesting external device 111) before another piece is requested, and so on.
For instance, the server 103 may obtain image data from the display devices on a line-by-line basis, so that for each line of the composite image, only the concerned display devices are requested. This reduces the memory needed at the server device to store the received data before processing them (merged into a screenshot or transmitted). Advantageously, buffering memories are distributed among the display devices, thus avoiding the need for a large memory on the server (the more display devices, the larger a centralized server memory would have to be). Thus the multi-display system is scalable.
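A minimal sketch of such a line-by-line retrieval loop on the server side follows; `fetch_line` and `send_line` are placeholders for the network exchanges with the display devices and with the external device, and the layout field names are illustrative.

```python
def stream_screenshot_lines(layout, fetch_line, send_line, composite_height):
    """Request, assemble and forward the screenshot one composite line at a time,
    querying only the display devices whose sub-image spans the current line,
    so the server never has to buffer a full frame."""
    for line in range(composite_height):
        pieces = []
        for entry in sorted(layout, key=lambda e: e["x_offset"]):
            if entry["y_offset"] <= line < entry["y_offset"] + entry["height"]:
                pieces.append(fetch_line(entry["display_device_id"], line))
        send_line(line, b"".join(pieces))

# Usage sketch: two stacked 4-line sub-images, each returning 8 bytes per line.
layout = [{"display_device_id": 0, "x_offset": 0, "y_offset": 0, "height": 4},
          {"display_device_id": 1, "x_offset": 0, "y_offset": 4, "height": 4}]
collected = []
stream_screenshot_lines(layout,
                        fetch_line=lambda device, line: bytes(8),
                        send_line=lambda line, pixels: collected.append(pixels),
                        composite_height=8)
assert len(collected) == 8
```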
In other embodiments (not shown), all the display devices 100 to 103 are connected to the external device through the external network, and when a user requires a screenshot using the GUI (step 302), the external device 111 performs steps 311 to 315 previously described instead of the server 103, and obtains directly during step 316 the captured sub-images from the display devices in response to the capture requests sent at step 315, and generates the screenshot based on them at step 317 described before. In these embodiments, the display device determining the timestamp is designated by a user during the installation of the multi-display system.
Thus, in these other embodiments, steps 303, 304, 310, 318 and 319 are not performed at all. And the remaining steps of Figure 3b are all performed by the external device 111.
Figure 4 represents steps performed by each of the capturing display devices 100 to 103 upon receiving (step 400) the capture request sent at step 315 by the server 103 in the example of Figure 3b.
At step 401, the display device extracts the timestamp from the capture request, and waits for this timestamp to be reached by the network clock.
Once the current time is equal to the timestamp, the display device, in practice, waits for a "start of frame" event (e.g. vertical synchronization event) upon which the image data at the output of the image distribution processing unit 202 are duplicated (capture step 402) and stored into a temporary buffer at step 403. The duplicated data may be cropped if cropping information (or a specific capture area) is provided in the capture request. Optionally, the duplicated data may be compressed before being transmitted.
At step 404, the resulting image data are transmitted to the server 103 through the network 104, which stores them at step 316 of Figure 3b (optionally, after decompression).
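The sequence of Figure 4 could be sketched as the handler below; the request fields and the callables standing in for the clock, the start-of-frame event, the frame grab and the network transmission are all hypothetical placeholders.

```python
def handle_capture_request(request, clock, wait_for_start_of_frame,
                           grab_output_frame, crop, send_to_server):
    """Steps executed by a display device upon a capture request (Figure 4 sketch)."""
    # Step 401: wait until the shared network clock reaches the capture timestamp.
    while clock() < request["timestamp"]:
        pass
    # Step 402: capture at the next start of frame, i.e. the output of the image
    # distribution processing unit, before any display-specific processing.
    wait_for_start_of_frame()
    frame = grab_output_frame()
    # Step 403: keep only the requested capture area, if one was provided.
    if request.get("capture_area") is not None:
        frame = crop(frame, request["capture_area"])
    # Step 404: transmit the (optionally compressed) data back to the server.
    send_to_server(frame)

# Dry-run with trivial stand-ins for the hardware and network primitives.
handle_capture_request({"timestamp": 0, "capture_area": None},
                       clock=lambda: 1,
                       wait_for_start_of_frame=lambda: None,
                       grab_output_frame=lambda: b"pixels",
                       crop=lambda frame, area: frame,
                       send_to_server=lambda frame: None)
```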
In some embodiments, each display device may transmit the captured data to the server block by block so that the reception buffer of the server device does not need to be as large as the size of the whole captured data, but only of the size of a block. For instance, each block may correspond to a line of the captured sub-image.
In some embodiments, a sequence of composite images may be captured instead of a single composite image. The number of composite images to capture may be provided by the server in the capture request. A larger buffer may be required to store, in each display device, the image data successively captured. A counter may be used to monitor the number of captured composite images.
The screenshots corresponding to the successive composite images may be all or partly generated by the server and/or by the external device.
In other embodiments (not shown), all the display devices 100 to 103 are connected to the external device, and directly transmit (step 404) image data to the external device 111 upon direct requests from it at step 400.
In again other embodiments, rather than using a timestamp in the future, the timestamp may be defined so that it is equal to the time at which the request from the external device 111 is received by the server 103. These embodiments require that the display devices keep in memory several sub-images they have respectively displayed, together with time information. This is because, by the time the display devices receive and process the capture requests, the timestamp has already expired.
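Under this alternative, each display device could keep a small history of recently displayed sub-images indexed by display time, along the lines of the sketch below; the class, its capacity and the lookup rule are illustrative assumptions.

```python
from collections import OrderedDict

class DisplayedFrameHistory:
    """Bounded history of displayed sub-images keyed by display time, so that a
    capture request carrying an already-expired timestamp can still be served."""

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.frames = OrderedDict()   # display timestamp -> frame bytes

    def record(self, timestamp, frame):
        self.frames[timestamp] = frame
        if len(self.frames) > self.capacity:
            self.frames.popitem(last=False)   # drop the oldest entry

    def lookup(self, timestamp):
        # return the most recent frame displayed at or before the requested time
        return self.frames[max(t for t in self.frames if t <= timestamp)]

history = DisplayedFrameHistory()
for t in range(10):
    history.record(t, frame=bytes([t]))
assert history.lookup(7) == bytes([7])
```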
Although the present invention has been described hereinabove with reference to specific embodiments, the present invention is not limited to the specific embodiments, and modifications which lie within the scope of the present invention will be apparent to a person skilled in the art. Many further modifications and variations will suggest themselves to those versed in the art upon making reference to the foregoing illustrative embodiments, which are given by way of example only and which are not intended to limit the scope of the invention as determined by the appended claims. In particular different features from different embodiments may be interchanged, where appropriate.

Claims (20)

1. A method for generating a screenshot of a composite image to be collectively displayed by a plurality of display devices of a multi-display system according to display layout information, the method comprising the following steps: obtaining, from the display devices of the multi-display system, image data of the composite image to be respectively displayed by the display devices, at a capture timestamp shared between said display devices; and generating said screenshot of the composite image from the obtained image data, using said display layout information.
2. The method according to claim 1, wherein the screenshot is generated by one of the display devices of the multi-display system.
3. The method according to claim 1, wherein the screenshot is generated by an external device connected to the multi-display system.
4. The method according to any of claims 1 to 3, wherein said image data of the composite image are obtained, upon request from an external device connected to the multi-display system, by a display device of the multi-display system acting as a server receiving the request.
5. The method according to claim 4, wherein the capture timestamp defines a time in the future relatively to the time of the request from the external device.
6. The method according to any one of claims 1 to 5, comprising obtaining display layout information defining how the display devices display respective subparts of the composite image at capture timestamp, and wherein generating the screenshot is based on said obtained display layout information.
7. The method according to any one of claims 1 to 6, further comprising deleting, from the obtained image data, duplicated data corresponding to overlapping areas displayed by two or more display devices so that to keep a single occurrence of image data per overlapping area.
8. The method according to claim 7, wherein said deleting is performed before any transmission of image data to an external device.
9. The method according to any one of claims 1 to 8, wherein said image data are obtained prior any processing for adapting the image data to display.
10. A method according to any one of claims 3 to 5, further comprising, determining whether said external device connected to the multi-display system belongs to a list of authorized external devices, and upon negative determination, providing an error message to the external device thereby prohibiting it from obtaining a screenshot or any image data from the multi-display system.
11. The method according to any one of claims 1 to 10, wherein said image data of the composite image are obtained block by block.
12. A computer program product for a programmable apparatus, the computer program product comprising a sequence of instructions for implementing a method according to any one of claims 1 to 11, when loaded into and executed by the programmable apparatus.
13. A computer-readable storage medium storing instructions of a computer program for implementing a method according to any one of claims 1 to 11.
14. A device for generating a screenshot of a composite image to be collectively displayed by a plurality of display devices of a multi-display system according to display layout information, the device comprising: a communication module for obtaining, from the display devices of the multi-display system, image data of the composite image to be respectively displayed by the display devices, at a capture timestamp shared between said display devices; and a processing unit for generating said screenshot of the composite image from the obtained image data, using said display layout information.
15. A display device of a multi-display system comprising a screenshot generating device according to claim 14.
16. An external device comprising a screenshot generating device according to claim 14, said external device being connected to said multi-display system.
17. The external device according to claim 16, being connected to the multi-display system by low-speed communication means.
18. A multi-display system as hereinbefore described, with reference to, and as shown in, Figure 1 of the accompanying drawings.
19. A display device as hereinbefore described, with reference to, and as shown in, Figure 2 of the accompanying drawings.
20. A method of generating a screenshot as hereinbefore described, with reference to, and as shown in, Figure 3a to Figure 4 of the accompanying drawings.
GB1409633.3A 2014-05-30 2014-05-30 Method for generating a screenshot of an image to be displayed by a multi-display system Active GB2526618B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1409633.3A GB2526618B (en) 2014-05-30 2014-05-30 Method for generating a screenshot of an image to be displayed by a multi-display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1409633.3A GB2526618B (en) 2014-05-30 2014-05-30 Method for generating a screenshot of an image to be displayed by a multi-display system

Publications (3)

Publication Number Publication Date
GB201409633D0 GB201409633D0 (en) 2014-07-16
GB2526618A (en) 2015-12-02
GB2526618B GB2526618B (en) 2016-07-27

Family

ID=51214480

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1409633.3A Active GB2526618B (en) 2014-05-30 2014-05-30 Method for generating a screenshot of an image to be displayed by a multi-display system

Country Status (1)

Country Link
GB (1) GB2526618B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017148211A1 (en) * 2016-02-29 2017-09-08 努比亚技术有限公司 Mobile terminal and webpage screenshot capturing method
CN112788177A (en) * 2020-12-31 2021-05-11 读书郎教育科技有限公司 Screen capturing system and method for telephone watch
US11500605B2 (en) * 2019-09-17 2022-11-15 Aver Information Inc. Image transmission device, image display system capable of remote screenshot, and remote screenshot method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109656494A (en) * 2017-10-12 2019-04-19 阿里巴巴集团控股有限公司 The management method of content is shown in Mosaic screen

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
GB2526618B (en) 2016-07-27
GB201409633D0 (en) 2014-07-16

Similar Documents

Publication Publication Date Title
US9332160B1 (en) Method of synchronizing audio-visual assets
US10623609B1 (en) Virtual video environment display systems
RU2534951C2 (en) Device, method and system for sharing of plotted image at multiple workplaces, programme and recordable media
WO2013132828A1 (en) Communication system and relay apparatus
WO2016202887A1 (en) Providing low & high quality streams
WO2021147702A1 (en) Video processing method and apparatus
KR101446995B1 (en) Helmet for imaging multi angle video and method thereof
WO2017113534A1 (en) Method, device, and system for panoramic photography processing
GB2526618A (en) Method for generating a screenshot of an image to be displayed by a multi-display system
JP2010288230A (en) Moving image information relay system and moving image information relay program
WO2023279793A1 (en) Video playing method and apparatus
CN107027066B (en) High-resolution digital movie theater playing method and system
KR101987062B1 (en) System for distributing and combining multi-camera videos through ip and a method thereof
WO2010103963A1 (en) Information processing device and method, and information processing system
WO2017049597A1 (en) System and method for video broadcasting
KR101877034B1 (en) System and providing method for multimedia virtual system
US9794534B2 (en) Image processing methods, and image processing devices and system for a scalable multi-projection system
KR102268167B1 (en) System for Providing Images
KR20150051048A (en) Method and apparatus for providing user interface menu of multi-angle video capturing
KR20150030889A (en) Method and apparatus for providing multi angle video broadcasting service
KR102149004B1 (en) Method and apparatus for generating multi channel images using mobile terminal
WO2013060295A1 (en) Method and system for video processing
US20210409613A1 (en) Information processing device, information processing method, program, and information processing system
US20240015264A1 (en) System for broadcasting volumetric videoconferences in 3d animated virtual environment with audio information, and procedure for operating said device
WO2012146203A1 (en) Method, device and system for displaying data content