WO2004004350A1 - 画像データ配信システムならびにその画像データ送信装置および画像データ受信装置 - Google Patents

画像データ配信システムならびにその画像データ送信装置および画像データ受信装置 Download PDF

Info

Publication number
WO2004004350A1
WO2004004350A1 (PCT/JP2003/008302)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
image
request information
information
viewpoint
Prior art date
Application number
PCT/JP2003/008302
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Toshio Nomura
Hiroyuki Katata
Norio Ito
Tadashi Uchiumi
Shuichi Watanabe
Original Assignee
Sharp Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Kabushiki Kaisha filed Critical Sharp Kabushiki Kaisha
Priority to JP2004517332A priority Critical patent/JP4346548B2/ja
Priority to KR1020047021448A priority patent/KR100742674B1/ko
Priority to EP03761837A priority patent/EP1519582A4/en
Priority to AU2003244156A priority patent/AU2003244156A1/en
Priority to US10/519,154 priority patent/US7734085B2/en
Publication of WO2004004350A1 publication Critical patent/WO2004004350A1/ja


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/167Synchronising or controlling image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/23439Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements for generating different versions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2393Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17318Direct or substantially direct transmission and handling of requests

Definitions

  • Image data distribution system, image data transmitting device thereof, and image data receiving device thereof
  • The present invention relates to an image data distribution system for distributing image data, and to an image data transmitting device and an image data receiving device for that system.
  • A device that displays a stereoscopic image viewed from an arbitrary viewpoint, using a multi-viewpoint image captured by a plurality of cameras, has been considered.
  • As such a display device there is, for example, the one shown in FIG. 16. It comprises an input device 95 for setting the left and right viewpoints, a computer 91 connected to the input device 95 that creates the two images viewed from the left and right viewpoints, and a stereoscopic display 98 connected to the computer 91 that receives the image data of the two images, combines them with an internal circuit, and displays them stereoscopically.
  • The computer 91 includes a memory 92 for storing multi-viewpoint image data, a CPU 93 that executes the process of creating the two images, an input/output interface 94 connected to the stereoscopic display 98 and the input device 95 to control the exchange of input values and image data, and a bus 96 interconnecting the CPU 93, the memory 92, and the input/output interface 94.
  • The input device 95 selects desired data from the multi-view image data stored in the memory 92 and sets the left and right viewpoints for stereoscopic display.
  • The CPU 93 determines whether image data viewed from the left viewpoint exists within the multi-view image data; if it exists, that image data is output via the input/output interface 94. If it does not exist, image data viewed from the left viewpoint is created by interpolation using the image data of a predetermined number of images captured from viewpoints near the left viewpoint, and is output via the input/output interface 94. The same applies to the right viewpoint.
  • The present invention has been made in view of such a situation, and its object is to provide an image data distribution system capable of displaying a stereoscopic image viewed from an arbitrary viewpoint even on a portable terminal or the like, together with an image data transmitting device and an image data receiving device for that system.
  • To this end, the image data distribution system, image data transmitting device, and image data receiving device of the present invention employ the following means. The image data distribution system of the present invention comprises:
  • request information receiving means for receiving client request information transmitted via a network;
  • request information analysis means for analyzing the received request information;
  • multi-view image supply means for supplying multi-view image data;
  • image generating means for generating, from the data supplied by the multi-view image supply means, image data of a predetermined viewpoint that matches the request information;
  • image synthesizing means for combining a plurality of image data generated by the image generating means, based on the display unit information in the request information;
  • encoding means for encoding the image data synthesized by the image synthesizing means;
  • transmitting means for transmitting the encoded image data to the network;
  • receiving means for receiving the encoded image data;
  • decoding means for decoding the received encoded image data;
  • image processing means for processing the decoded image data so that it can be displayed on the display means;
  • display means for displaying the processed image data;
  • request information input means for inputting the client's request information; and
  • request information transmitting means for transmitting the request information to the network.
  • The image data transmitting device of the present invention comprises: request information receiving means for receiving client request information transmitted via a network; request information analysis means for analyzing the received request information; multi-view image supply means for supplying multi-view image data; image generating means for generating image data of a predetermined viewpoint matching the request information from the supplied data; image synthesizing means for combining a plurality of image data generated by the image generating means; encoding means for encoding the synthesized image data; and transmitting means for transmitting the encoded image data to a network.
  • The image data receiving device of the present invention comprises: receiving means for receiving encoded image data via a network; decoding means for decoding the received encoded image data; image processing means for processing the decoded image data so that it can be displayed on the display means; display means for displaying the processed image data; request information input means for inputting the client's request information; and request information transmitting means for transmitting the request information to the network.
  • Further, the image data distribution system and the image data transmitting device of the present invention are characterized by having management information adding means for adding management information that enables access to each viewpoint's image data and random access within the multi-view image data. The image data distribution system and the image data receiving device of the present invention are characterized by having determination means for determining whether received image data is two-dimensional image data or stereoscopic image data.
  • The image data distribution system, image data transmitting device, and image data receiving device of the present invention are characterized by having identification information adding means for adding information that identifies whether transmitted or received image data is two-dimensional image data or stereoscopic image data.
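As a concrete illustration of such identification information, a flag in a small header could mark image data as 2D or stereoscopic. The header layout and names below are hypothetical, for illustration only, and are not taken from the patent:

```python
import struct

FLAG_2D = 0      # hypothetical flag values
FLAG_STEREO = 1

def pack_header(width: int, height: int, stereo: bool) -> bytes:
    """Pack an illustrative header: width, height, and a 2D/stereo flag."""
    return struct.pack(">HHB", width, height, FLAG_STEREO if stereo else FLAG_2D)

def is_stereo(header: bytes) -> bool:
    """Determination step: read the identification flag back out."""
    _w, _h, flag = struct.unpack(">HHB", header[:5])
    return flag == FLAG_STEREO

hdr = pack_header(320, 240, stereo=True)
print(is_stereo(hdr))  # → True
```

The receiving side would run such a check before deciding whether to rearrange the data for stereoscopic display.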
  • With the present invention, the client side needs neither a large-capacity memory nor a CPU with high processing capability, yet a stereoscopic image viewed from an arbitrary viewpoint can be observed.
  • FIG. 1 is a block diagram showing an embodiment of an image data distribution system according to the present invention.
  • FIG. 2 is a diagram showing an arrangement of a plurality of cameras for creating multi-viewpoint image data.
  • FIG. 3 is a diagram showing a left viewpoint L and a right viewpoint R of image data generated by interpolation.
  • FIG. 4 is a diagram showing an area cut out from the generated image data according to the resolution of the display unit.
  • Figure 5 is an image diagram of a portable terminal serving as a client.
  • FIG. 6 is a diagram illustrating an example of a storage image form of multi-view image data.
  • FIG. 7 is a diagram showing an example of a generated image data form.
  • FIG. 8 is a diagram showing accumulation and extraction of multi-viewpoint image data.
  • FIG. 9 is a flowchart illustrating a processing procedure in the server.
  • FIG. 10 is a flowchart showing a processing procedure in the client.
  • FIG. 11 is a diagram showing an example in which multi-view video data is encoded using MPEG-4.
  • FIG. 12 is a diagram showing multi-view image data to which management information has been added.
  • FIG. 13 is a diagram illustrating an example of management information.
  • FIG. 14 shows a connection form between a server and a client assumed in the present embodiment.
  • FIG. 15 is a flowchart showing details of the processing of the image generation unit.
  • FIG. 16 shows a conventional example.
  • FIG. 1 is a block diagram showing an embodiment of an image data distribution system according to the present invention.
  • Server 1 is an image data transmitting device, and client 11 is an image data receiving device.
  • the server 1 stores the multi-view image data 2, and the client 11 sends request information to the server 1, so that an image viewed from a desired viewpoint is displayed in a stereoscopic manner on the display unit 14.
  • the multi-view image data does not necessarily need to be stored (recorded) in the server 1 in advance, and may be input in real time from outside.
  • The server 1 analyzes the request information transmitted from the client 11 in the request analysis unit 4 (request information analysis means, which includes the request information receiving means), selects the required image data from the multi-view image data 2 (multi-view image supply means), and outputs it to the image generation unit 3 (image generation means).
  • The image generation unit 3 generates, by interpolation, the image data of the requested viewpoint (viewpoint information) and outputs it to the image synthesis unit 5.
  • the image synthesizing unit 5 synthesizes the input image data into a form suitable for encoding (a form based on the display unit information), and outputs it to the encoding unit 6 (encoding means).
  • the encoding unit 6 encodes the input image data at an appropriate bit rate and transmits the encoded image data to the network 7 (transmission means).
  • The client 11 receives the encoded image data (receiving means), decodes it in the decoding unit 12 (decoding means), and outputs the decoded image data to the image processing unit 13 (image processing means), which converts it into a form appropriate to the stereoscopic display format and displays it on the display unit 14 (display means).
  • The client 11 also has an input unit 16 (request information input means) for changing the viewpoint, and transmits viewpoint-change request information to the network 7 via the request output unit 15 (request information transmitting means).
  • the multi-viewpoint image data 2 is a set of image data shot using a plurality of cameras.
  • The plurality of cameras are arranged, for example, so that their optical axes converge at one point, as shown in FIG. 2(a).
  • Alternatively, the cameras may be arranged on a circumference so that their optical axes point toward the center of the circle. In either case, the cameras need not be arranged at equal intervals; the distribution may be sparse in places and dense in others. Information indicating how the cameras are arranged is recorded together with the image data.
  • This arrangement information is used when the image generation unit 3, which generates image data of a predetermined viewpoint by interpolation based on the request information from the client 11, determines which cameras' image data to use.
  • FIG. 3 shows which cameras' image data are required when generating an image of a requested viewpoint by interpolation.
  • The left viewpoint L and the right viewpoint R are set as illustrated relative to the cameras C1 to C4.
  • When the image data of the left viewpoint L is generated, the image data of C2 and C3 are used.
  • Although FIG. 3 shows an example using four cameras, the number of cameras is not limited to four.
  • The generation of intermediate-viewpoint images by interpolation is a known technique; see, for example, Tsunashima et al., "Creation of Intermediate Image Data from Two-View Stereo Image Data Considering Occlusion," 3D Image Conference '95, pp. 174-177 (1995), and Azuma et al., "Parallax Estimation Using Edge Information for Intermediate Image Generation," 3D Image Conference '95, pp. 190-194 (1995).
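To make the idea of intermediate-view generation concrete, the sketch below blends two neighboring camera images. This is a crude stand-in for the disparity-compensated interpolation described in the cited papers: a uniform per-pixel blend ignores parallax and occlusion, so it is illustrative only.

```python
def interpolate_view(left_img, right_img, t):
    """Blend two neighboring camera images for an in-between viewpoint.

    t = 0.0 yields the left camera image, t = 1.0 the right one.
    Real intermediate-view generation estimates per-pixel disparity
    and handles occlusion; a uniform blend is only a crude stand-in.
    """
    return [
        [(1.0 - t) * l + t * r for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left_img, right_img)
    ]

# Two toy 2x2 grayscale images from adjacent cameras
C2 = [[0.0, 10.0], [20.0, 30.0]]
C3 = [[10.0, 20.0], [30.0, 40.0]]
print(interpolate_view(C2, C3, 0.5))  # [[5.0, 15.0], [25.0, 35.0]]
```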
  • The image synthesizing unit 5 cuts out image data of the required resolution.
  • In Fig. 4(a), the size of the generated image data 21 is assumed to be equal to the size of the image captured by the camera, and the resolution required for display on the client 11 is indicated by the area 22.
  • That is, in FIG. 4(a) only a part of the generated image data is cut out.
  • Alternatively, the largest area 24 that can be cut out of the generated image data 23 (which has the same size as the image captured by the camera) while keeping the aspect ratio of the display may be cut out first, and then reduced to the required resolution.
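The aspect-preserving cut-out followed by reduction can be sketched as follows; the function name and integer arithmetic are illustrative choices, not the patent's:

```python
def largest_crop(src_w, src_h, dst_w, dst_h):
    """Largest region of the source image with the display's aspect ratio.

    Returns (crop_w, crop_h); the caller then scales that region down to
    (dst_w, dst_h). Cropping before scaling avoids distorting the image.
    """
    if src_w * dst_h >= src_h * dst_w:
        # source relatively wider than the display: keep full height
        return (src_h * dst_w) // dst_h, src_h
    # source relatively taller than the display: keep full width
    return src_w, (src_w * dst_h) // dst_w

print(largest_crop(640, 480, 160, 120))  # (640, 480): ratios already match
print(largest_crop(640, 480, 100, 100))  # (480, 480): square display
```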
  • FIG. 5 is an image diagram of a portable terminal 50 serving as the client 11. It has a stereoscopic display 54, a cross key 51 for moving the viewpoint up, down, left, and right, and keys 52 and 53 for moving the viewpoint forward and backward. Although not shown, communication means for communicating with the server is also provided.
  • FIG. 6 shows an example of a storage form for the multi-viewpoint image data 2. Since the moving image data captured by the multiple cameras must be synchronized in time, one form joins the image data C1 to C4 captured by the cameras side by side horizontally, as shown in Fig. 6(a), and stores them as a single image. This guarantees that the C1 to C4 contained in each single image were captured at the same time, which makes time management easy.
  • The joining method is not limited to a single horizontal row as in FIG. 6(a); the images may also be joined as shown in FIG. 6(b).
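Joining the same-instant frames into one composite image, as in Fig. 6(a), might look like the sketch below, where a frame is simply a list of pixel rows:

```python
def tile_horizontally(frames):
    """Join same-instant camera frames C1..Cn side by side into one image.

    Each frame is a list of pixel rows. Storing one composite image per
    time instant keeps the views synchronized, as described above.
    """
    heights = {len(f) for f in frames}
    assert len(heights) == 1, "all camera frames must share a height"
    h = heights.pop()
    # concatenate row y of every frame into one long row
    return [sum((f[y] for f in frames), []) for y in range(h)]

c1 = [[1, 1], [1, 1]]  # toy 2x2 frames
c2 = [[2, 2], [2, 2]]
print(tile_horizontally([c1, c2]))  # [[1, 1, 2, 2], [1, 1, 2, 2]]
```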
  • the multi-viewpoint image data 2 may be compressed and stored, or may be stored in an uncompressed state.
  • The case where the image data of each camera is compressed and stored will be described with reference to FIG. 8.
  • As shown in FIG. 8(a), the image data C1 to C4 captured by the cameras are input to the encoding unit 31.
  • The encoding unit 31 encodes each image data and outputs the information necessary for generating management information (such as the frame type and the number of generated bits) to the management information generation unit 32.
  • The recording unit 33 records the management information together with the encoded image data as accumulated data (management information adding means). Details of the management information and the storage form will be described later.
  • The selection unit 34 selects only the necessary image data from the accumulated data and outputs it to the decoding unit 35 for decoding.
  • The decoded image data become the original image data used in the image generation unit 3.
  • The management information recorded together with the image data is used to extract the necessary parts quickly.
  • FIG. 7 shows an example of the form of the image data generated by the image synthesizing unit 5.
  • When two-viewpoint image data are transmitted, it is preferable to arrange the left-viewpoint image data L and the right-viewpoint image data R side by side (or one above the other) as shown in Fig. 7(a), synthesizing and encoding them as a single image.
  • By using image data that form one image as in Fig. 7(a), synchronization between the left-viewpoint image data L and the right-viewpoint image data R is guaranteed, which simplifies time management.
  • The image data finally displayed on the display unit 14 may be in a form in which stripes of left-viewpoint image data L and right-viewpoint image data R alternate line by line, as shown in FIG. 7(c) (the lenticular method). Even in such a case, it is desirable that the image data to be encoded be in the form shown in FIG. 7(a). This is because, when block-based coding such as DCT is performed, image data in the form of FIG. 7(c) have weaker correlation between adjacent pixels and higher spatial frequencies, resulting in poor compression efficiency.
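The rearrangement from the side-by-side coded form of Fig. 7(a) to the line-alternating display form of Fig. 7(c) can be sketched as below. Which line parity carries the left view depends on the actual display, so the even/odd assignment here is an assumption:

```python
def side_by_side_to_interleaved(frame):
    """Rearrange a side-by-side stereo frame into line-alternating form.

    The left half of each row holds the left view, the right half the
    right view. Even output lines take the left view, odd lines the
    right view (which parity is which depends on the display).
    """
    half = len(frame[0]) // 2
    return [row[:half] if y % 2 == 0 else row[half:]
            for y, row in enumerate(frame)]

# 2-line toy frame: left view | right view
frame = [[1, 2, 9, 8],
         [3, 4, 7, 6]]
print(side_by_side_to_interleaved(frame))  # [[1, 2], [7, 6]]
```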
  • The same concept can be applied when the required number of viewpoints is larger than two.
  • When the image data is to be recorded on the client 11 side, recording means is provided either before decoding (before the decoding unit 12) or after image processing (after the image processing unit 13).
  • FIG. 9 is a flowchart showing a processing procedure in the server 1.
  • a request from the client is analyzed (step S1).
  • necessary image data is selected from the multi-view image data (step S2).
  • the image of the requested viewpoint is generated by using it (step S3).
  • Next, image data of the size necessary for display are cut out (and reduced if necessary) (step S4). Then the clipped left-viewpoint image data and right-viewpoint image data are combined (step S5).
  • the synthesized image data is encoded (step S6). Then, it is output as a bit stream (step S7).
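Steps S1-S7 above can be summarized as a pipeline. Every helper below is a trivial stand-in for the corresponding unit in Fig. 1, chosen for illustration; none of them is the patent's implementation:

```python
def analyze(request):                       # S1: request analysis
    return request["viewpoint"], request["display"]

def select_views(data, viewpoint):          # S2: pick needed camera data
    return data  # stand-in: a real server selects nearby camera streams

def generate_views(selected, viewpoint):    # S3: interpolate L/R views
    return selected["L"], selected["R"]     # stand-in for interpolation

def crop(img, fmt):                         # S4: cut out the display size
    return [row[:fmt["width"]] for row in img[:fmt["height"]]]

def synthesize(left, right):                # S5: side-by-side composite
    return [l + r for l, r in zip(left, right)]

def encode(composite):                      # S6: encode for transmission
    return repr(composite).encode()         # stand-in for a video encoder

def serve_request(request, data):
    """Run steps S1-S7 and return the bit stream (S7)."""
    viewpoint, fmt = analyze(request)
    selected = select_views(data, viewpoint)
    left, right = generate_views(selected, viewpoint)
    left, right = crop(left, fmt), crop(right, fmt)
    return encode(synthesize(left, right))

data = {"L": [[1, 2], [3, 4]], "R": [[5, 6], [7, 8]]}
req = {"viewpoint": 0.5, "display": {"width": 2, "height": 2}}
print(serve_request(req, data))  # b'[[1, 2, 5, 6], [3, 4, 7, 8]]'
```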
  • FIG. 10 is a flowchart showing a processing procedure in the client 11.
  • initialization is performed to set the viewpoint position (viewpoint information) and viewpoint-independent information (stereoscopic display format, resolution, etc. (display unit information)) in the initial state (step S11).
  • the information is transmitted as a request to the server 1 (step S12).
  • a bit stream (image data) satisfying the request is transmitted from the server via the network (step S13).
  • The received data is decoded (step S14). Since the decoded image data, being in the form of Fig. 7(a), is not in a format that can be displayed stereoscopically as it is, it is rearranged into a format matching the stereoscopic display format, as in Fig. 7(c) (step S15).
  • The rearranged image data is then displayed on the display unit 14 (step S16).
  • Next, it is determined whether or not to continue the display (step S17). If the display is to be continued, it is determined whether there is a request to change the viewpoint (step S18). If the viewpoint has changed, the request is output to the server 1 again and the process returns to step S12; if there is no viewpoint change, the process returns to step S13.
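The client-side flow (steps S11-S18) can likewise be sketched as a loop. The callables are stand-ins, and steps S13-S15 (receive, decode, rearrange) are collapsed into the `server` call for brevity:

```python
def client_loop(server, display, viewpoint_change, max_frames=3):
    """Client flow: init (S11), request/receive/show, check for changes.

    `server(request)` stands in for steps S12-S15 and returns a
    displayable frame; `display` shows it (S16); `viewpoint_change()`
    returns a new viewpoint or None (S18).
    """
    request = {"viewpoint": 0, "format": "line-interleaved"}  # S11
    shown = []
    for _ in range(max_frames):                # S17: continue displaying?
        frame = server(request)                # S12-S15 collapsed
        shown.append(display(frame))           # S16
        change = viewpoint_change()            # S18
        if change is not None:
            request = dict(request, viewpoint=change)  # re-request (S12)
    return shown

changes = iter([1, None, None])
frames = client_loop(lambda r: r["viewpoint"], lambda f: f,
                     lambda: next(changes))
print(frames)  # [0, 1, 1]
```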
  • FIG. 14 shows a connection form between a server and a client assumed in the present embodiment.
  • Terminals A to C are clients, and are connected to the server 41 via the network 7.
  • The server 41 transmits different image data to each terminal in response to each terminal's request, because the stereoscopic display format and display resolution differ from terminal to terminal, and the viewpoint position being viewed also differs. If all of the multi-viewpoint image data stored (recorded) by the server 41 were transmitted to the network and each terminal selected and displayed what it needed, only one kind of image data would have to be transmitted; in practice this is impossible, because multi-viewpoint image data carries a huge amount of information and the bandwidth of the network is limited. For this reason, a system in which the server sends appropriate image data according to requests from the client, as in the present invention, is essential in an environment where different types of terminals are connected to the network.
  • FIG. 15 is a flowchart showing details of the processing of the image generating unit 3.
  • the position of the viewpoint requested by the client is analyzed (step S21).
  • If the requested viewpoint coincides with the position of one of the cameras, that camera's image data can be used as it is; otherwise, an interpolated image is generated (step S23).
  • FIG. 11 shows an example in which a multi-view moving image is encoded using MPEG-4.
  • The frames to be encoded have discrete frame numbers, such as LTf0, LTf3, LTf5, and LTf10.
  • LTf0 is an intra-coded frame (I frame), LTf10 is a frame predictively encoded from the decoded frame of LTf0 (P frame), and LTf3 and LTf5 are frames bidirectionally predictively encoded from the decoded frames of LTf0 and LTf10 (B frames).
  • FIG. 13 shows an example of the management information added by the management information generation unit 32 in FIG. 8A.
  • As shown in Fig. 12, the encoded data of the camera images can be combined and stored together with the management information; the management information here is the information that enables access to each camera's encoded image data.
  • In addition to access to each camera's image data, the management information includes information enabling random access to the encoded data at a specified time within each camera's image data.
  • FIG. 13(a) shows an example of the management information for accessing the encoded data of each camera image. For example, it indicates that the encoded data of the camera image C2 starts at the B2-th byte from the beginning of the data in FIG. 12. Fig. 13(a) also describes a pointer to the information for accessing the encoded data at a specified time within each camera's image data: in the case of the encoded data of C2, the access table for the encoded data at a specified time is at address P2 within the management information.
  • FIG. 13(b) is an example of an access table to the encoded data at specified times. The times t1, t2, t3, ... may be set at equal intervals or at arbitrary intervals.
  • For example, the table shows that the encoded data corresponding to time t3 is at the Bt3-th byte from the beginning of that camera image's encoded data, and that the encoded data of the preceding I frame is located It3 bytes back from that position.
  • When the decoder wants to display from time t3, it first decodes the encoded data of the I frame located (Bt3 - It3) bytes from the beginning. It then decodes the P and B frames in sequence while counting the number of bytes decoded, and starts display once It3 bytes have been decoded; the image data at the specified time t3 is thus displayed.
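The seek procedure using Bt3 and It3 amounts to a small calculation. The sketch below assumes the access table of Fig. 13(b) is available as a mapping; the function name and return convention are illustrative:

```python
def seek_start(access_table, t):
    """Resolve a display start time via a Fig. 13(b)-style access table.

    `access_table` maps a time key to (Bt, It): Bt is the byte offset of
    the coded data at that time, It the distance back to the preceding
    I frame. Returns (decode_from, skip_bytes): start decoding at byte
    `decode_from`, and begin display after `skip_bytes` bytes decoded.
    """
    bt, it = access_table[t]
    return bt - it, it

# Example: I frame at byte 100; coded data for time t3 at byte 340 (It3 = 240)
table = {"t3": (340, 240)}
print(seek_start(table, "t3"))  # (100, 240)
```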
  • Alternatively, the encoded data may be packetized, with the header information of each packet indicating whether or not the packet contains the head of an I frame.
  • In that case, Figure 13(b) shows the specified times and the number of bytes up to the corresponding packet. After accessing the packet at the specified time t3, the decoder checks whether that packet contains the beginning of an I frame, and starts decoding and display from the packet containing the I frame (discarding any packets before it).
  • In the above, the management information and the encoded data are combined and stored, but the management information may also be separated and stored as a separate file.
  • The information for accessing a specified time may also be included not in the management information but in the header information of each camera image's encoded data. In this case, the pointer in the third column of Fig. 13(a) (for accessing a specified time within the camera image) is unnecessary.
  • When each camera image's encoded data is stored as a separate file, the file name of each camera image's encoded data is written in the second column of FIG. 13(a), and each camera image is accessed by its file name.
  • As described above, the image data distribution system, image data transmitting device, and image data receiving device according to the present invention do not require a large-capacity memory or a high-performance CPU on the client side, and can therefore be applied to portable terminals on which a stereoscopic image viewed from an arbitrary viewpoint can be observed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Processing Or Creating Images (AREA)
PCT/JP2003/008302 2002-06-28 2003-06-30 画像データ配信システムならびにその画像データ送信装置および画像データ受信装置 WO2004004350A1 (ja)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2004517332A JP4346548B2 (ja) 2002-06-28 2003-06-30 画像データ配信システムならびにその画像データ送信装置および画像データ受信装置
KR1020047021448A KR100742674B1 (ko) 2002-06-28 2003-06-30 화상데이터 전송시스템, 그의 화상데이터 송신장치, 및그의 화상데이터 수신장치
EP03761837A EP1519582A4 (en) 2002-06-28 2003-06-30 Image data delivery system, image data sending device therefor and image data receiving device therefor
AU2003244156A AU2003244156A1 (en) 2002-06-28 2003-06-30 Image data delivery system, image data transmitting device thereof, and image data receiving device thereof
US10/519,154 US7734085B2 (en) 2002-06-28 2003-06-30 Image data delivery system, image data transmitting device thereof, and image data receiving device thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2002189470 2002-06-28
JP2002-189470 2002-06-28

Publications (1)

Publication Number Publication Date
WO2004004350A1 true WO2004004350A1 (ja) 2004-01-08

Family

ID=29996848

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2003/008302 WO2004004350A1 (ja) 2002-06-28 2003-06-30 画像データ配信システムならびにその画像データ送信装置および画像データ受信装置

Country Status (7)

Country Link
US (1) US7734085B2 (zh)
EP (1) EP1519582A4 (zh)
JP (1) JP4346548B2 (zh)
KR (1) KR100742674B1 (zh)
CN (1) CN100342733C (zh)
AU (1) AU2003244156A1 (zh)
WO (1) WO2004004350A1 (zh)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006113807A (ja) * 2004-10-14 2006-04-27 Canon Inc 多視点画像の画像処理装置および画像処理プログラム
JP2006165878A (ja) * 2004-12-06 2006-06-22 Shigeru Handa コンテンツ配信システム、及びデータ構造
JP2008211417A (ja) * 2007-02-26 2008-09-11 Fujitsu Ltd 多視点動画像伝送システム
CN100426218C (zh) * 2005-04-08 2008-10-15 佳能株式会社 信息处理方法和设备
JP2010258848A (ja) * 2009-04-27 2010-11-11 Mitsubishi Electric Corp 立体映像配信システム、立体映像配信方法、立体映像配信装置、立体映像視聴システム、立体映像視聴方法、立体映像視聴装置
WO2010140864A2 (ko) * 2009-06-05 2010-12-09 삼성전자주식회사 스테레오 영상 처리 장치 및 방법
WO2012147596A1 (ja) * 2011-04-28 2012-11-01 ソニー株式会社 画像データ送信装置、画像データ送信方法、画像データ受信装置および画像データ受信方法
JP2016119513A (ja) * 2014-12-18 2016-06-30 ヤフー株式会社 画像処理装置、画像処理方法及び画像処理プログラム
JP2016220113A (ja) * 2015-05-22 2016-12-22 株式会社ソシオネクスト 映像配信装置、映像配信方法、及び映像配信システム
JP2019068130A (ja) * 2017-09-28 2019-04-25 Kddi株式会社 映像配信サーバ、映像配信方法および映像配信プログラムならびに映像再生装置

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4665167B2 (ja) * 2005-06-29 2011-04-06 Sony Corporation Stereo image processing apparatus, stereo image processing method, and stereo image processing program
JP4665166B2 (ja) * 2005-06-29 2011-04-06 Sony Corporation Stereo image processing apparatus, stereo image processing method, and stereo image processing program
JP4687279B2 (ja) * 2005-06-29 2011-05-25 Sony Corporation Image playback apparatus, image playback method, and image playback program
KR20070008289A (ko) * 2005-07-13 2007-01-17 Samsung Electronics Co., Ltd. Display device, information processing system including the same, and driving method thereof
KR100755457B1 (ko) * 2005-08-16 2007-09-05 Hong Gil-cheol Method for providing multiple or stereoscopic image services using the Internet
US20090201992A1 (en) * 2005-10-07 2009-08-13 Jeong-Il Seo Method and apparatus for encoding and decoding hopping default view for multiple cameras system
JPWO2007122907A1 (ja) * 2006-03-29 2009-09-03 Panasonic Corporation Image codec apparatus
US8385628B2 (en) * 2006-09-20 2013-02-26 Nippon Telegraph And Telephone Corporation Image encoding and decoding method, apparatuses therefor, programs therefor, and storage media for storing the programs
US8184692B2 (en) * 2006-09-25 2012-05-22 Framecaster, Inc. Distributed and automated video encoding and delivery system
WO2009027923A1 (en) * 2007-08-31 2009-03-05 Koninklijke Philips Electronics N.V. Conveying auxiliary information in a multiplexed stream
CN101472190B (zh) * 2007-12-28 2013-01-23 Huawei Device Co., Ltd. Multi-view imaging and image processing apparatus and system
JP2010169777A (ja) * 2009-01-21 2010-08-05 Sony Corporation Image processing apparatus, image processing method, and program
JP5460702B2 (ja) 2009-05-14 2014-04-02 Panasonic Corporation Packet transmission method for video data
KR101234495B1 (ko) * 2009-10-19 2013-02-18 Electronics and Telecommunications Research Institute Terminal, relay node, and stream processing method for a video conferencing system
KR101320350B1 (ko) * 2009-12-14 2013-10-23 Electronics and Telecommunications Research Institute Security control server and method for managing video data of the security control server
KR101329057B1 (ko) * 2010-03-29 2013-11-14 Electronics and Telecommunications Research Institute Apparatus and method for transmitting multi-viewpoint stereoscopic video
JP5515988B2 (ja) * 2010-04-05 2014-06-11 Sony Corporation Signal processing apparatus, signal processing method, display device, and program
EP2613531A4 (en) * 2010-09-03 2014-08-06 Sony Corp ENCODING DEVICE AND ENCODING METHOD, AND DECODING DEVICE AND DECODING METHOD
CN102402412B (zh) * 2010-09-19 2014-12-31 Lenovo (Beijing) Co., Ltd. Display function processing module, server, and display processing method
JP2012134893A (ja) * 2010-12-24 2012-07-12 Hitachi Consumer Electronics Co Ltd Receiving device
CN103179302B (zh) * 2011-12-22 2017-10-10 Tencent Technology (Shenzhen) Co., Ltd. Image processing method and system in an open platform
CN104041027A (zh) * 2012-01-06 2014-09-10 Ultra-D Cooperatief UA Display processor for three-dimensional display
JP2014007648A (ja) * 2012-06-26 2014-01-16 Sony Corporation Image processing apparatus, image processing method, and program
EP2893436B1 (en) * 2012-09-10 2020-08-05 UTC Fire & Security Americas Corporation, Inc. Systems and methods for security panel content management
EP2908519A1 (en) 2014-02-14 2015-08-19 Thomson Licensing Method for displaying a 3D content on a multi-view display device, corresponding multi-view display device and computer program product
KR101521890B1 (ko) * 2014-03-28 2015-05-22 Nexteon Co., Ltd. Multi-view video streaming system and providing method thereof
USD748196S1 (en) 2014-08-27 2016-01-26 Outerwall Inc. Consumer operated kiosk for sampling products
US10547825B2 (en) * 2014-09-22 2020-01-28 Samsung Electronics Company, Ltd. Transmission of three-dimensional video
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
DE102014226122A1 (de) * 2014-12-16 2016-06-16 Robert Bosch Gmbh Transcoder device and server-client arrangement with the transcoder device
US10129579B2 (en) 2015-10-15 2018-11-13 At&T Mobility Ii Llc Dynamic video image synthesis using multiple cameras and remote control
US11049218B2 (en) 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
CN108055324A (zh) * 2017-12-13 2018-05-18 Jinan Huitong Yuande Technology Co., Ltd. Method for realizing data interaction based on a smart wearable product
CN110913202B (zh) * 2019-11-26 2022-01-07 Shenzhen Yinglun Technology Co., Ltd. Three-dimensional display cloud rendering method and system

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01107247A (ja) 1987-10-21 1989-04-25 Matsushita Electric Ind Co Ltd Three-dimensional video display device
EP2271090B1 (en) * 1996-02-28 2013-12-11 Panasonic Corporation High-resolution optical disk for recording stereoscopic video, optical disk reproducing device and optical disk recording device
EP0931420A4 (en) * 1996-10-11 2002-06-26 Sarnoff Corp METHOD AND DEVICE FOR CODING AND DECODING STEREOSCOPIC VIDEO SIGNALS
US5864337A (en) * 1997-07-22 1999-01-26 Microsoft Corporation Method for automatically associating multimedia features with map views displayed by a computer-implemented atlas program
GB9800397D0 (en) * 1998-01-09 1998-03-04 Philips Electronics Nv Virtual environment viewpoint control
KR100345235B1 (ko) * 1998-11-08 2005-07-29 LG Electronics Inc. Digital data stream recording method and apparatus therefor
US6631205B1 (en) * 1999-01-13 2003-10-07 Canon Kabushiki Kaisha Stereoscopic imaging in a portable document format
CN1409925A (zh) * 1999-10-15 2003-04-09 Kewazinga Corp. Method and system for comparing multiple images employing a navigable camera array
US6525732B1 (en) * 2000-02-17 2003-02-25 Wisconsin Alumni Research Foundation Network-based viewing of images of three-dimensional objects
US20030132939A1 (en) * 2000-03-01 2003-07-17 Levin Moshe Interactive navigation through real-time live video space created in a given remote geographic location
KR20010100539 (ko) 2000-05-03 2001-11-14 Park Seung-jun Method for producing two-dimensional stereoscopic images using a multi-viewpoint stereoscopic image display
JP2002095018A (ja) 2000-09-12 2002-03-29 Canon Inc Image display control apparatus, image display system, and image data display method
US6573912B1 (en) * 2000-11-07 2003-06-03 Zaxel Systems, Inc. Internet system for virtual telepresence
US6803912B1 (en) * 2001-08-02 2004-10-12 Mark Resources, Llc Real time three-dimensional multiple display imaging system
US7190825B2 (en) * 2001-08-17 2007-03-13 Geo-Rae Co., Ltd. Portable communication device for stereoscopic image display and transmission
JP4148671B2 (ja) * 2001-11-06 2008-09-10 Sony Corporation Display image control processing apparatus, moving image information transmission/reception system, display image control processing method, moving image information transmission/reception method, and computer program
JP3922543B2 (ja) * 2002-06-05 2007-05-30 Sony Corporation Imaging apparatus and image display apparatus

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07274211A (ja) * 1994-03-30 1995-10-20 Sanyo Electric Co Ltd Method for converting a two-dimensional image into a three-dimensional image
JPH09200715A (ja) * 1996-01-19 1997-07-31 Canon Inc Communication apparatus, communication method, and communication system
JPH10178594A (ja) * 1996-12-19 1998-06-30 Sanyo Electric Co Ltd Three-dimensional video transmission method, digital broadcasting system, and user-side terminal in the digital broadcasting system
JPH11225160A (ja) * 1998-02-05 1999-08-17 Chokosoku Network Computer Gijutsu Kenkyusho:Kk Moving image transfer method and server
JP2000165831A (ja) * 1998-11-30 2000-06-16 Nec Corp Multipoint video conference system
JP2001008232A (ja) * 1999-06-25 2001-01-12 Matsushita Electric Ind Co Ltd Omnidirectional video output method and apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1519582A4 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7873207B2 (en) 2004-10-14 2011-01-18 Canon Kabushiki Kaisha Image processing apparatus and image processing program for multi-viewpoint image
JP2006113807A (ja) * 2004-10-14 2006-04-27 Canon Inc Image processing apparatus and image processing program for multi-viewpoint images
JP2006165878A (ja) * 2004-12-06 2006-06-22 Shigeru Handa Content distribution system and data structure
CN100426218C (zh) * 2005-04-08 2008-10-15 Canon Inc Information processing method and apparatus
JP2008211417A (ja) * 2007-02-26 2008-09-11 Fujitsu Ltd Multi-viewpoint video transmission system
JP2010258848A (ja) * 2009-04-27 2010-11-11 Mitsubishi Electric Corp Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus
US10356388B2 (en) 2009-04-27 2019-07-16 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus
US8677436B2 (en) 2009-04-27 2014-03-18 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus
US9118955B2 (en) 2009-06-05 2015-08-25 Samsung Electronics Co., Ltd. Stereo image handling device and method
WO2010140864A2 (ko) * 2009-06-05 2010-12-09 Samsung Electronics Co., Ltd. Stereo image processing apparatus and method
WO2010140864A3 (ko) * 2009-06-05 2011-03-03 Samsung Electronics Co., Ltd. Stereo image processing apparatus and method
WO2012147596A1 (ja) * 2011-04-28 2012-11-01 Sony Corporation Image data transmitting device, image data transmitting method, image data receiving device, and image data receiving method
CN103026725A (zh) * 2011-04-28 2013-04-03 Sony Corporation Image data transmitting device, image data transmitting method, image data receiving device, and image data receiving method
JP2016119513A (ja) * 2014-12-18 2016-06-30 Yahoo Japan Corporation Image processing apparatus, image processing method, and image processing program
JP2016220113A (ja) * 2015-05-22 2016-12-22 Socionext Inc. Video distribution device, video distribution method, and video distribution system
JP2019068130A (ja) * 2017-09-28 2019-04-25 KDDI Corporation Video distribution server, video distribution method, video distribution program, and video playback device

Also Published As

Publication number Publication date
KR20050014893A (ko) 2005-02-07
KR100742674B1 (ko) 2007-07-25
US7734085B2 (en) 2010-06-08
EP1519582A1 (en) 2005-03-30
JP4346548B2 (ja) 2009-10-21
CN1666525A (zh) 2005-09-07
JPWO2004004350A1 (ja) 2005-11-04
AU2003244156A1 (en) 2004-01-19
EP1519582A4 (en) 2007-01-31
US20050248802A1 (en) 2005-11-10
CN100342733C (zh) 2007-10-10

Similar Documents

Publication Publication Date Title
WO2004004350A1 (ja) Image data delivery system, image data transmitting device thereof, and image data receiving device thereof
JP4425635B2 (ja) Binocular/multi-viewpoint three-dimensional video processing system and method
KR100475060B1 (ko) Multiplexing apparatus and method reflecting user demands for multi-viewpoint three-dimensional video
KR100488804B1 (ko) MPEG-4-based binocular three-dimensional video data processing system and method
JP4877852B2 (ja) Image encoding apparatus and image transmitting apparatus
KR100703715B1 (ko) Multi-viewpoint three-dimensional video transmission/reception system
US20080310762A1 (en) System and method for generating and regenerating 3d image files based on 2d image media standards
JP2010508752A (ja) Method and apparatus for decoding metadata used for reproducing stereoscopic video content
JP2004240469A (ja) Image data generation apparatus and image data reproduction apparatus for reproducing the data
WO2012070364A1 (ja) Image data transmitting device, image data transmitting method, image data receiving device, and image data receiving method
KR100874226B1 (ko) Multi-viewpoint image and three-dimensional audio transmission/reception apparatus and transmission/reception method using the same
TW201138425A (en) Method and system for rendering 3D graphics based on 3D display capabilities
KR102178947B1 (ko) System and method for transmitting 360-degree multi-view video using stitching
EP2183924A2 (en) Method of generating contents information and apparatus for managing contents using the contents information
JP3129784B2 (ja) High-efficiency stereoscopic video encoding apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2003761837

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 10519154

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 20038153718

Country of ref document: CN

Ref document number: 1020047021448

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 1020047021448

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2004517332

Country of ref document: JP

WWP Wipo information: published in national office

Ref document number: 2003761837

Country of ref document: EP