WO2008044881A1 - Image board and display method using dual codec - Google Patents

Image board and display method using dual codec Download PDF

Info

Publication number
WO2008044881A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
high quality
image
processing unit
low quality
Prior art date
Application number
PCT/KR2007/004965
Other languages
French (fr)
Inventor
Yonghoon Kim
Boohee Lee
Dongmin Kim
Seongtaek Song
Original Assignee
Mivision Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mivision Co., Ltd. filed Critical Mivision Co., Ltd.
Publication of WO2008044881A1 publication Critical patent/WO2008044881A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/262 Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/2625 Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists for delaying content or additional data distribution, e.g. because of an extended sport event
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234363 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234363 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • H04N21/234372 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution for performing aspect ratio conversion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234381 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/262 Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/26208 Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists the scheduling operation being performed under constraints
    • H04N21/26216 Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists the scheduling operation being performed under constraints involving the channel capacity, e.g. network bandwidth
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266 Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2662 Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

An image processing board using a dual codec is disclosed. A high quality processing unit processes one or more input images into high resolution images. A low quality processing unit processes one or more input images into low resolution images. When images are received from a plurality of image generating devices, the high quality processing unit generates high quality images and the low quality processing unit generates low quality images, simultaneously. When a plurality of screens is requested to be simultaneously transmitted onto a split screen, low quality images are transmitted from the low quality processing unit, thereby reducing the number of operations and the amount of network transmission. When a single screen is requested to be transmitted from among a plurality of screens, high quality images are transmitted from the high quality processing unit for the corresponding screen, thereby effectively utilizing the number of operations and the bandwidth of the network and maintaining image quality.

Description

Description
IMAGE BOARD AND DISPLAY METHOD USING DUAL CODEC
Technical Field
[1] The present invention relates, in general, to an image transmission system, an image display method, and an image processing board each using a dual codec, and, more particularly, to an image transmission system, an image display method, and an image processing board each using a dual codec, in which input from various types of image generating devices is received by the image processing board using a dual codec, so that the storage and transmission of the input and the restoration of stored images can be enabled by both a low quality processing unit and a high quality processing unit, and in which the low quality processing unit and the high quality processing unit, for receiving and processing one or more real-time images or files transmitted from various types of devices capable of transmitting one or more images to remote places, are provided, so that one or more real-time images or files transmitted from remote places can be stored, the images transmitted from remote places can be restored, and one or more images stored in remote places can be restored. Images are processed in both the low quality processing unit and the high quality processing unit, so that the bandwidth of a network can be used effectively and the number of operations can be reduced. Further, the present invention relates to an image transmission system, an image display method, and an image processing board each using a dual codec, which present an effective method of transmitting successive images to a user who wants to view real-time images, even though there is a time delay.
[2]
Background Art
[3] When images are compressed and transmitted, the size of image data is still larger than that of text, so that it is difficult to transmit image data over a network. In particular, when images from remote places are received and simultaneously viewed on a plurality of screens or when high quality images stored in remote places are searched for, high bandwidth and a large number of operations are necessary.
[4] Existing image transmission methods are performed using high quality compression or low quality compression.
[5] When images are stored at high quality, the size of the data is large. Therefore, when images are transmitted from a remote place over a network and viewed in real time, or when stored images are searched for, the search cannot be smoothly performed, and images are not displayed in succession due to the limit of network bandwidth.
[6] When images are stored at low quality, the images can be smoothly viewed or found in remote places, but the quality of the stored images is deteriorated. Therefore, when the stored images are viewed again, dissatisfaction with the image quality may occur.
[7]
Disclosure of Invention Technical Problem
[8] Accordingly, the present invention has been made keeping in mind the above problems occurring in the prior art, and an object of the present invention is to provide an image transmission system, an image display method, and an image processing board each using a dual codec, which make it easy to view or search for images using limited bandwidth and to acquire high quality images if necessary, thereby reducing costs related to increasing network bandwidth.
[9]
Technical Solution
[10] In order to accomplish the above object, the gist of the present invention is an image processing board using a dual codec, including both a high quality processing unit for processing one or more input images into high resolution images, and a low quality processing unit for processing one or more input images into low resolution images, wherein, when images are received from a plurality of image generating devices, the high quality processing unit generates high quality images and the low quality processing unit generates low quality images, simultaneously, thereby reducing the number of operations and the amount of network transmission by transmitting low quality images from the low quality processing unit when a plurality of screens is requested to be simultaneously transmitted onto a split screen, and thereby effectively utilizing the number of operations and the bandwidth of a network and maintaining image quality by transmitting high quality images from the high quality processing unit for a corresponding screen when a single screen is requested to be transmitted from among a plurality of screens.
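As a rough illustration of the selection rule described in paragraph [10], the minimal Python sketch below picks which of the two continuously generated streams to transmit for a given viewer request; the function and argument names are illustrative assumptions, not part of the patent.
```python
# Minimal sketch of the stream selection described above; names are assumptions.

def select_stream_to_transmit(request, high_quality_stream, low_quality_stream):
    """Pick the stream to send for a viewer request.

    request: "split"  - several camera images are shown together on a split screen
             "single" - one camera image is magnified to fill the screen
    Both streams are assumed to be generated continuously by the dual codec board.
    """
    if request == "split":
        # Many images at once: low quality keeps per-image bandwidth and
        # decoding work small, so the split screen stays smooth.
        return low_quality_stream
    if request == "single":
        # One image fills the screen: the high quality stream preserves
        # picture quality under magnification.
        return high_quality_stream
    raise ValueError(f"unknown request type: {request!r}")
```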
[11] Further, another gist of the present invention is a high quality image display method using an image processing board using a dual codec, the method including, when a split screen is viewed, receiving low quality images, transmitted from the image processing board using a dual codec, and displaying the images in real time, and, when a specific image corresponding to a specific time point is magnified and checked, receiving high quality images, generated and stored by the image processing board using a dual codec, within the limit of allowable bandwidth, and displaying the high quality images.
[12] According to a preferred embodiment of the present invention, when the high quality images are displayed, the low quality real-time images are displayed on a small screen using a Picture In Picture (PIP) function, thereby mitigating a disadvantage of time delay occurring in a case of a magnified screen, and, when a synchronization command is input, a time point of the high quality images is advanced to a corresponding time point of the low quality images, and the corresponding high quality images are displayed in synchronization with the low quality images. [13]
Advantageous Effects
[14] According to the above-described image transmission system and image processing method each using a dual codec of the present invention, there is an advantage in that one or more images, transmitted from a plurality of regions, can be displayed and found using limited bandwidth and an advantage in that a specific image can be received and checked at high quality if necessary. Further, images from remote places can be simultaneously viewed while stored images are being viewed. Therefore, a plurality of regions can be viewed using limited bandwidth, thereby acquiring an advantage in that costs for installation and operation, required when bandwidth is increased, are reduced. Further, a low quality image module is mainly used, so that a system can be realized using a Central Processing Unit (CPU), which performs a small number of operations, thereby realizing an advantage of reducing system costs. Further, after a file is transmitted, the image display method can be reliably operated for video-based education or advertisement over a network.
[15]
Brief Description of the Drawings
[16] FIG. 1 is a diagram showing the detailed functions of an image processing board using a dual codec;
[17] FIG. 2 is a diagram showing the concept of the image processing board using a dual codec;
[18] FIG. 3 is a schematic diagram showing the shape of a product using the image processing board using a dual codec;
[19] FIG. 4 is a diagram showing a method of restoring images stored in the image processing board using a dual codec;
[20] FIG. 5 is a diagram showing the efficient use of bandwidth over a high-speed network;
[21] FIG. 6 is a diagram showing a method of viewing a high quality image over a low-speed network;
[22] FIG. 7 is a diagram showing a method of displaying a screen when high quality images from a remote place are viewed over a low-speed network; and
[23] FIG. 8 is a diagram showing a method of playing images after a file is transmitted to a single user or to a plurality of users. [24]
Best Mode for Carrying Out the Invention
[25] For this purpose, for an image transmission system which processes one or more captured and stored images and transmits the resulting images via a network, the gist of the present invention is an image transmission system using a dual codec, including high quality processing units 120 and 140 for receiving and processing one or more images and generating high quality images; and low quality processing units 110 and 130 for generating one or more low quality images while processing images in synchronization with the high quality processing unit, as shown in FIG. 1.
[26] As shown in FIG. 2, the input to an image processing board using a dual codec is divided and input to an input image processing unit 11 for receiving images input from an image generating device 20 and processing the images, and a transmission image processing unit 12 for receiving and processing one or more real-time images or files from a device 30 capable of transmitting one or more images to remote places via a network.
[27] When an image transmission processing board 10 using a dual codec processes images input from the image generating device 20 of FIG. 2, the images are input to the input image processing unit 11, and are then transmitted to the low quality processing unit 110 and the high quality processing unit 120, thereby simultaneously generating low quality images and high quality images.
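The paragraph above describes one input being encoded by both processing units at once. The following sketch shows that flow for a single captured frame, under the assumption of simple encoder objects with an encode() method and a NumPy-style frame array; none of these interfaces come from the patent.
```python
# Illustrative sketch only: each frame from the image generating device is
# encoded twice, once at full resolution (high quality unit 120) and once at
# reduced resolution (low quality unit 110). Encoder objects are assumptions.

class DualCodecInputUnit:
    def __init__(self, high_encoder, low_encoder):
        self.high_encoder = high_encoder  # e.g. encodes 640 x 480 frames
        self.low_encoder = low_encoder    # e.g. encodes 320 x 240 frames

    def process_frame(self, frame):
        """Encode one captured frame with both codecs simultaneously."""
        high_bitstream = self.high_encoder.encode(frame)
        low_bitstream = self.low_encoder.encode(downscale_by_half(frame))
        return high_bitstream, low_bitstream

def downscale_by_half(frame):
    # Placeholder downscaling for a NumPy-style frame array:
    # 640 x 480 -> 320 x 240 by dropping every other row and column.
    return frame[::2, ::2]
```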
[28] The images, received from the image generating device 20, are generated as low quality images in the low quality processing unit 110, and then the generated images are stored in a recording medium (111) or transmitted to a network (112). If necessary, one or more images, recorded in the recording medium, are restored (113) and displayed on a display device 60, such as a Television (TV) or a Liquid Crystal Display (LCD) monitor.
[29] When one or more real-time images or files are received using a device capable of transmitting one or more images to remote places via a network, other than the image generating device 20, the real-time images or files are transmitted to the transmission image processing unit 12. Here, in the case of real-time low quality images, the low quality processing unit 130 stores them in a recording medium (131), restores them for display on a TV or an LCD monitor (132), or, if necessary, receives files stored in the remote place over a network, restores the images, and displays them on a TV or an LCD monitor, which is called "remote place image restore (133)".
[30] When a file other than real-time images is received, the file is received and then restored (132 and 142). Here, a command may be received from a remote place such that the file is played on a TV or an LCD monitor.
[31] As shown in FIG. 3, the image processing board 10 using a dual codec may be used as an independent device by wrapping it in a case 40 or by forming it in the form of a Personal Computer (PC) card 50.
[32] When four images are transmitted over a network and viewed, and it is assumed, for example, that the resolution of a monitor is 640 × 480, that the resolution of a high quality image is 640 × 480, that the resolution of a low quality image is 320 × 240, and that a user wants to use high quality, four 640 × 480 images should be received according to the existing method in order to simultaneously view four camera images. The received images should be restored again and reduced to 1/4 of their original size so as to match the resolution of the monitor, that is, 640 × 480.
[33] That is, four 640 × 480 images are received, converted into four 320 × 240 images so as to match the respective images to the resolution of the monitor, and then displayed on a screen.
[34] Although this case has an advantage in that high quality can be maintained even in the case in which magnification is performed to view a single camera image, there are disadvantages in that a large amount of network bandwidth must be allocated, and the number of operations is increased because the images must be reduced before being displayed.
[35] When a user wants four 320 × 240 images, that is, low quality resolution images, the received images are combined and displayed on the screen.
[36] Although this case has advantages in that a small amount of network bandwidth, corresponding to four 320 × 240 images, is consumed and a small number of operations are required, there is a disadvantage in that image quality is deteriorated when a desired single camera image is magnified and viewed on the screen.
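Treating raw pixel counts as a rough proxy for bandwidth (actual bitrates depend on the codec), the trade-off described in paragraphs [32] to [36], together with the dual codec behaviour discussed in the following paragraphs, can be tallied as follows; the numbers are only illustrative.
```python
# Back-of-the-envelope pixel budget for four cameras on a 640 x 480 monitor.
# Raw pixel counts stand in for bandwidth; the ratios, not the absolute
# numbers, are the point.

HIGH = 640 * 480   # pixels in one high quality image (307,200)
LOW = 320 * 240    # pixels in one low quality image  (76,800)
CAMERAS = 4

high_only_split = CAMERAS * HIGH   # existing high-quality-only method: 1,228,800
low_only_split = CAMERAS * LOW     # low-quality-only method:             307,200
dual_split = CAMERAS * LOW         # dual codec, four-split view:         307,200
dual_magnified = HIGH              # dual codec, single magnified view:   307,200

assert high_only_split == 4 * dual_split   # high-only needs four times the bandwidth
assert dual_split == dual_magnified        # dual codec load stays constant either way
```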
[37] When images, stored using the dual codec image processing board 10, are restored and displayed, images are input from the image generating device 20, the low quality processing unit 110 always stores low quality images (111), and the high quality processing unit 120 always stores high quality images (121), as shown in FIG. 4. Therefore, when images are viewed on a four-split screen, low quality images, stored using the store function (111) of the low quality processing unit 110, are restored using a stored image restoration function (113) and are then displayed, so that the number of operations can be reduced. When a magnified image is viewed on a single screen, rather than a split screen, high quality images, stored using the store function (121) of the high quality processing unit 120, are restored using a stored image restoration function (123) and are then displayed, so that images can be viewed at high quality.
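A compact sketch of the recording and playback behaviour in paragraph [37]; the recording-medium interface is a hypothetical placeholder, not something defined by the patent.
```python
# Both qualities are always recorded; playback chooses the recording that
# matches the current view. The recording_medium object is an assumption.

def record_frame(recording_medium, camera_id, high_bitstream, low_bitstream):
    # Store functions (121) and (111): keep both encodings side by side.
    recording_medium.write(camera_id, "high", high_bitstream)
    recording_medium.write(camera_id, "low", low_bitstream)

def restore_for_view(recording_medium, camera_id, view):
    # Restoration functions (113)/(123): a split screen reads the low quality
    # recording (fewer decode operations); a magnified single screen reads the
    # high quality recording.
    quality = "low" if view == "split" else "high"
    return recording_medium.read(camera_id, quality)
```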
[38] When images input from the image generating device 20 are viewed in a remote place via a high-speed network, the low quality processing unit 110 always generates low quality images and the high quality processing unit 120 always generates high quality images in the dual codec image processing board 10, as shown in FIG. 5. The case in which images are viewed on a four-split screen via a high-speed network and the case in which an image is magnified and viewed on a single screen via a high-speed network will be described. In the case in which images are viewed on a four-split screen, the image processing board 10 using a dual codec, which has received images from the image generating device 20, transmits low quality images, generated in the low quality processing unit 110 of the input image processing unit 11, to the high-speed network (112). The transmitted low quality images are received by another image processing board 10 using a dual codec connected to the high-speed network, and are restored by the low quality processing unit 130 of the transmission image processing unit 12, thereby displaying the real-time images transmitted at low quality on a four-split screen. In the case in which images are magnified and displayed on a single screen, for images input from the image generating device 20, high quality images, generated in the high quality processing unit 120 of the input image processing unit 11, are transmitted to the high-speed network (122). The transmitted images are received by another image processing board 10 using a dual codec, and are restored by the high quality processing unit 140 of the transmission image processing unit 12 (142), thereby displaying the real-time images transmitted at high quality on a single screen.
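On the receiving board, the incoming stream is routed to the matching restoration unit, as in paragraph [38]. The sketch below assumes a simple quality tag on each received stream and illustrative decoder and display interfaces, none of which are specified by the patent.
```python
# Receiver-side sketch for the high-speed network case: low quality streams go
# to the low quality processing unit (130) for the four-split screen, a high
# quality stream goes to the high quality processing unit (140) for the single
# screen.

def restore_received(stream, low_decoder, high_decoder, display):
    if stream.quality == "low":
        # Four low quality real-time images -> decode and tile on a 4-split screen.
        frames = [low_decoder.decode(substream) for substream in stream.substreams]
        display.show_split(frames)
    else:
        # One high quality real-time image -> decode and show full screen.
        display.show_full(high_decoder.decode(stream.data))
```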
[39] That is, in the case of the four-split screen, four 320 × 240 real-time images, transmitted from the low quality processing unit 110 of the dual codec image processing board 10 (112), are received. In the case of the magnified screen, images are not magnified, but a 640 × 480 real-time image transmitted from the high quality processing unit 120 of the dual codec image processing board 10 (122) is received in real time. Therefore, since four 320 × 240 images amount to the same number of pixels as one 640 × 480 image, the network bandwidth is stably maintained at a constant level. Compared to the case in which transmission is performed only at high quality, the above-described method avoids the disadvantages that four times the network bandwidth (four 640 × 480 images) is required and that the number of operations needed to reduce the images in size is increased. Further, compared to the case in which transmission is performed only at low quality, the disadvantage of low quality when an image is magnified to be displayed on a single screen can be overcome.
[40] Compared to the case in which transmission is performed only at high quality, the image processing board 10 using a dual codec has advantages in that network efficiency is increased fourfold and the number of operations at a remote place is decreased while image quality is maintained the same.
[41] When it is difficult to transmit high quality images in real time over a low-speed network, if the image processing board 10 using a dual codec is used, high quality can be realized as a result. When images input from the image generating device 20 are viewed at a remote place over a low-speed network, the low quality processing unit 110 always generates low quality images and the high quality processing unit 120 always generates high quality images in the image processing board 10 using a dual codec, as shown in FIG. 6. The case in which images are viewed on a four-split screen via a low-speed network and the case in which an image is magnified and viewed on a single screen via the low-speed network will be described. In the case in which images are viewed on a four-split screen, low quality images are transmitted and restored in the same manner as shown in FIG. 5. However, in the case in which an image is magnified and viewed on a single screen, high quality images are difficult to transmit in real time due to the limits of network bandwidth. In order to overcome this problem, for images input from the image generating device 20, high quality images, generated in the high quality processing unit 120 of the input image processing unit 11, are stored first, and the stored images are then transmitted. Transmission is performed in non-real time within the limits of allowable bandwidth. Images transmitted to a remote place are received by another image processing board 10 using a dual codec, restored using the remote place image restoration function (143) of the high quality processing unit 140 of the transmission image processing unit 12, and are magnified and displayed on a single screen. In this case, because of the low-speed network, not all of the high quality images to be stored can be transmitted, and the number of images stored is larger than the number of images that can be transmitted. Therefore, although a time delay phenomenon may occur, high quality images can be viewed within the limits of allowable bandwidth. For example, when a user, who is viewing camera images from remote places on a four-split screen, wants to magnify and clearly view a desired camera image, the user can issue a command such that high quality images are stored first and then transmitted, thereby receiving and viewing images within the limits of the allowable bandwidth of the network. Here, although a phenomenon occurs in which the images do not change naturally, that is, the images do not follow each other in succession, there is an advantage in that an image corresponding to an important time point can be viewed at high quality over a low-speed network.
[42] In this case, the necessity to simultaneously view real-time images may exist. As shown in FIG. 7, for high quality images stored in a remote place, images are displayed on a large screen 1431 using the remote place image restoration function (143) of the high quality processing unit 140 of the transmission image processing unit 12, and, for real-time images transmitted at low quality, images are displayed on a small screen 1321 using the restoration function (132) of the low quality processing unit 130 of the transmission image processing unit 12 with a Picture In Picture (PIP) function. In this case, a time delay phenomenon may occur on the large screen 1431, while real-time images are displayed on the small screen 1321. Here, when a synchronization command is issued, an image that had been subject to the time delay phenomenon is displayed on the large screen 1431 in synchronization with the real-time image on the small screen 1321. At this time point, images corresponding to the time period during which the time delay occurred are not displayed, but are skipped. Thereafter, the display of images corresponding to the time point of synchronization starts. In an example, it is assumed that stored high quality images are viewed from ten o'clock, that the synchronization time point is eleven o'clock, and that high quality images corresponding to the time period up to ten twenty have been received and viewed. In this case, images corresponding to the time period from ten twenty-one to eleven o'clock are skipped, and an image corresponding to the time point of synchronization, that is, eleven o'clock, starts to be displayed. Although this function is performed in non-real time, there is an advantage in that images corresponding to important time points can be viewed at high quality.
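The synchronization step in paragraph [42] amounts to jumping the delayed high quality playback forward to the live time point and discarding what lies in between. A minimal sketch, with times expressed in minutes and all names assumed for illustration:
```python
# When the user issues a synchronization command, the large (high quality)
# screen skips the backlog and resumes at the live time point shown on the
# small (low quality) screen.

def synchronize(high_quality_position_min, live_position_min):
    """Return (new playback position, minutes of video skipped)."""
    if live_position_min > high_quality_position_min:
        skipped = live_position_min - high_quality_position_min
        return live_position_min, skipped
    return high_quality_position_min, 0

# Worked example from the description: playback started at 10:00, has reached
# 10:20, and the live screen shows 11:00 -> frames from 10:21 to 11:00 are
# skipped and display resumes at 11:00.
new_position, skipped = synchronize(10 * 60 + 20, 11 * 60)
assert new_position == 11 * 60 and skipped == 40
```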
[43] This effect can be achieved using any plural number of cameras, and is not limited to four cameras. The high quality resolution of an image, that is, 640 × 480, and the low quality resolution of an image, that is, 320 × 240, have been adopted as one example, but the present invention is not limited thereto. The resolution can be appropriately adjusted based on the bandwidth of a system.
[44] When images must be transmitted to a plurality of users, smooth transmission is not realized due to the limits of the bandwidth of a network in most cases. The use of the image processing board 10 using a dual codec is effective when the necessity to view the images in real time is low. That is, as shown in FIG. 8, a file, in which input from the image generating device 20 is stored first (121) at high quality, or a file, stored in a device capable of transmitting one or more images to remote places, is transmitted over a network. The file is received and stored by other image processing boards 10 using a dual codec in remote places. Thereafter, images are displayed using the stored file restore function (123) thereof. This method can realize excellent results when it is utilized for video-based education or advertisements.
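As a sketch of the store-and-forward distribution in paragraph [44] (the board interfaces are assumed for illustration): the file is transferred to every remote board first, without any real-time deadline, and each board then plays it from its own storage.
```python
# Store-and-forward distribution: transfer once per remote board, then play
# locally from the stored copy, so playback is smooth even over a slow network.
# receive_and_store / play_stored_file are assumed board methods.

def distribute_and_play(file_path, remote_boards):
    for board in remote_boards:
        board.receive_and_store(file_path)  # transfer may be slow; no deadline
    for board in remote_boards:
        board.play_stored_file(file_path)   # stored file restore function
```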

Claims

Claims
[1] An image processing board using a dual codec, comprising both a high quality processing unit for processing one or more input images into high resolution images, and a low quality processing unit for processing one or more input images into low resolution images, wherein, when images are received from a plurality of image generating devices, the high quality processing unit generates high quality images and the low quality processing unit generates low quality images, simultaneously, thereby reducing a number of operations and an amount of network transmission by transmitting low quality images from the low quality processing unit when a plurality of screens is requested to be simultaneously transmitted onto a split screen, and thereby effectively utilizing a number of operations and bandwidth of a network and maintaining image quality by transmitting high quality images from the high quality processing unit for a corresponding screen when a single screen is requested to be transmitted from among a plurality of screens.
[2] A high quality image display method using an image processing board using a dual codec, the method comprising, when a split screen is viewed, receiving low quality images, transmitted from the image processing board using a dual codec according to claim 1, and displaying the images in real time, and, when a specific image corresponding to a specific time point is magnified and checked, receiving high quality images, generated and stored by the image processing board using a dual codec according to claim 1, within a limit of allowable bandwidth, and displaying the high quality images.
[3] The high quality image display method according to claim 2, wherein, when the high quality images are displayed, the low quality real-time images are displayed on a small screen using a Picture In Picture (PIP) function, thereby mitigating a disadvantage of time delay occurring in a case of a magnified screen, and, when a synchronization command is input, a time point of the high quality images is advanced to a corresponding time point of the low quality images, and the corresponding high quality images are displayed in synchronization with the low quality images.
PCT/KR2007/004965 2006-10-13 2007-10-11 Image board and display method using dual codec WO2008044881A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020060099979A KR100831704B1 (en) 2006-10-13 2006-10-13 image board and display method using dual codec
KR10-2006-0099979 2006-10-13

Publications (1)

Publication Number Publication Date
WO2008044881A1 true WO2008044881A1 (en) 2008-04-17

Family

ID=39283039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/004965 WO2008044881A1 (en) 2006-10-13 2007-10-11 Image board and display method using dual codec

Country Status (2)

Country Link
KR (1) KR100831704B1 (en)
WO (1) WO2008044881A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5369599B2 * 2008-10-20 2013-12-18 Fujitsu Ltd. Video encoding apparatus and video encoding method
KR101117271B1 * 2009-08-20 2012-03-20 Kyungpook National University Industry-Academic Cooperation Foundation Image data processing method and apparatus
KR101305356B1 * 2013-04-17 2013-09-06 Citring Co., Ltd. Method and apparatus for displaying double encoded images

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003037823A (en) 2001-07-24 2003-02-07 Matsushita Electric Ind Co Ltd Image display processing system
KR100459124B1 * 2001-11-02 2004-12-03 LG Electronics Inc. Apparatus for Displaying Twin Picture of Display and Method of The Same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020021260A1 (en) * 2000-06-26 2002-02-21 Takeya Meguro Multiscreen display apparatus and multiscreen display method
US20040184531A1 (en) * 2003-03-20 2004-09-23 Byeong-Jin Lim Dual video compression method for network camera and network digital video recorder
US20050169546A1 (en) * 2004-01-29 2005-08-04 Samsung Electronics Co., Ltd. Monitoring system and method for using the same
JP2006037823A (en) * 2004-07-27 2006-02-09 Komatsu Ltd Exhaust emission control device and exhaust emission control method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109963514A * 2016-11-17 2019-07-02 Koninklijke Philips N.V. Long-range ultrasound diagnosis with controlled image displaying quality
CN109963514B * 2016-11-17 2023-04-04 Koninklijke Philips N.V. Remote ultrasound diagnosis with controlled image display quality

Also Published As

Publication number Publication date
KR100831704B1 (en) 2008-05-26
KR20080033796A (en) 2008-04-17

Similar Documents

Publication Publication Date Title
WO2017193576A1 (en) Video resolution adaptation method and apparatus, and virtual reality terminal
US6502107B1 (en) Visual database system
CN106528025B (en) Multi-screen image projection method, terminal, server and system
JP7084450B2 (en) Systems and methods for distributed media interaction and real-time visualization
US20110229106A1 (en) System for playback of ultra high resolution video using multiple displays
US20200186887A1 (en) Real-time broadcast editing system and method
US7626637B2 (en) Method and apparatus for capturing full-screen frames
KR20040025073A (en) Method for displaying schedule information on television screen with thumbnail channel image on digital broadcasting
US7957603B2 (en) Digital image decoder with integrated concurrent image prescaler
CN105681720A (en) Video playing processing method and device
WO2008044881A1 (en) Image board and display method using dual codec
US7876996B1 (en) Method and system for time-shifting video
CN111147883A (en) Live broadcast method and device, head-mounted display equipment and readable storage medium
CN101594477B (en) Processing system of ultralong caption rendering
US20130170820A1 (en) Video image capture and playback for display devices
CN101483034B (en) Multiple image display method and apparatus
KR101506030B1 (en) Multi-vision system and picture visualizing method the same
CN114697690A (en) System and method for extracting specific stream from multiple streams transmitted in combination
WO2019087984A1 (en) Image processing device, display device, image processing method, control program, and recording medium
TWI812003B (en) Method and system for previewing the image
CN101465959A (en) Method for generating and rapidly browsing image of hand-hold equipment
CN112073801B (en) Image processing method, electronic equipment and connector
TW202002604A (en) Image processing method and electronic device
KR102392908B1 (en) Method, Apparatus and System for Providing of Free Viewpoint Video Service
CN111699672B (en) Video control device and video control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07833276

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07833276

Country of ref document: EP

Kind code of ref document: A1