WO2007142445A1 - Method of transmitting/playing multimedia data over wireless network and wireless device using the method - Google Patents

Method of transmitting/playing multimedia data over wireless network and wireless device using the method Download PDF

Info

Publication number
WO2007142445A1
WO2007142445A1 (PCT/KR2007/002706)
Authority
WO
WIPO (PCT)
Prior art keywords
stream
audio
video
video stream
audio stream
Prior art date
Application number
PCT/KR2007/002706
Other languages
French (fr)
Inventor
Chang-Yeul Kwon
Seong-Soo Kim
Ki-Bo Kim
Se-Young Shin
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to EP07807922.5A priority Critical patent/EP2025182B1/en
Priority to MX2008015594A priority patent/MX2008015594A/en
Priority to CN2007800164724A priority patent/CN101438615B/en
Priority to JP2009514198A priority patent/JP5065382B2/en
Publication of WO2007142445A1 publication Critical patent/WO2007142445A1/en

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W76/00 - Connection management
    • H04W76/10 - Connection setup
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/18 - Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W76/00 - Connection management
    • H04W76/10 - Connection setup
    • H04W76/14 - Direct-mode setup
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W8/00 - Network data management
    • H04W8/22 - Processing or transfer of terminal data, e.g. status or physical capabilities
    • H04W8/24 - Transfer of terminal data
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 - Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 - Terminal devices
    • H04W88/04 - Terminal devices adapted for relaying to or from another terminal or user


Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

A method and apparatus for efficiently transmitting large multimedia data over a wireless network are provided. A wireless device includes a device information reading unit which receives and reads information regarding available capabilities of devices included in a wireless network; a control unit which selects a first device which can process a video stream and a second device which can process an audio stream from the devices based on the information; a stream generation unit which generates a video stream and an audio stream which form the same content; and a wireless transmission/reception unit which transmits the generated video stream to the first device and transmits the generated audio stream to the second device.

Description

METHOD OF TRANSMITTING/PLAYING MULTIMEDIA DATA OVER WIRELESS NETWORK AND WIRELESS DEVICE
USING THE METHOD
Technical Field
[1] Methods and apparatuses consistent with the present invention relate to wireless communication technology, and more particularly, to efficiently transmitting large multimedia data over a wireless network.
Background Art
[2] As networks become wireless and the demand for large multimedia data transmission increases, there is a need for studies on an effective transmission method in a wireless network environment. In particular, various home devices are increasingly required to wirelessly transmit high-quality videos, such as digital versatile disk (DVD) videos or high definition television (HDTV) videos.
[3] An IEEE 802.15.3c task group is developing a technological standard for transmitting large-volume data over a wireless home network. The technological standard, which is called "millimeter wave (mmWave)," uses an electric wave having a physical wavelength on the order of a millimeter, i.e., an electric wave having a frequency band of 30-300 GHz, to transmit the large-volume data. This frequency band, which is an unlicensed band, has conventionally been used by communication service providers or used for limited purposes, such as observing electric waves or preventing vehicle collision.
[4] FIG. 1 is a diagram comparing frequency bands of the IEEE 802.11 series of standards and mmWave. Referring to FIG. 1, the IEEE 802.11b and IEEE 802.11g standards use a carrier frequency of 2.4 GHz and have a channel bandwidth of approximately 20 MHz. In addition, the IEEE 802.11a and IEEE 802.11n standards use a carrier frequency of 5 GHz and have a channel bandwidth of approximately 20 MHz. On the other hand, mmWave uses a carrier frequency of 60 GHz and has a channel bandwidth of approximately 0.5-2.5 GHz. Therefore, it can be understood that mmWave has a much higher carrier frequency and a much wider channel bandwidth than the conventional IEEE 802.11 series of standards.
[5] When a high-frequency signal (an mmWave signal) having a millimeter wavelength is used, a very high transmission rate of several Gbps can be achieved. Since the size of an antenna can also be reduced to less than 1.5 mm, a single chip including the antenna can be implemented. Furthermore, interference between devices can be reduced due to the very high attenuation of the high-frequency signal in the air. [6] However, the high-frequency signal has a short transmission range due to this very high attenuation. In addition, since the high-frequency signal is highly directional, it is difficult to achieve proper communication in a non-line-of-sight environment. In mmWave, an array antenna having a high gain is used to solve the former problem, and a beam steering method is used to solve the latter problem.
[7] Recently, a method of transmitting uncompressed data using mmWave in a high-frequency band of several tens of GHz has been introduced to home and office environments, alongside the related art method of transmitting compressed data in the several-GHz bands of the IEEE 802.11 series of standards.
[8] Since uncompressed audio and video (AV) data is large-volume data that is not compressed, it can be transmitted only in a high-frequency band of several tens of GHz. However, even when packets are lost, uncompressed AV data degrades the quality of the displayed video less than compressed data does. Therefore, there is no need for an automatic repeat request or a retry.
[9] FIG. 2 is a block diagram illustrating a method of transmitting a multimedia stream from a source device to a sink device in a related art wireless home network. Referring to FIG. 2, a source device 21 transmits a multiplexed AV stream to a sink device 22. The source device 21 is a device generating an AV stream from a multimedia source and wirelessly transmitting the AV stream, such as a set-top box, a DVD player, or a portable multimedia player (PMP). The sink device 22 is a device receiving the AV stream and outputting the AV stream in a form that can be audio-visually perceived by a user, such as a beam projector, a television (TV), a monitor, or an AV receiver. Each of the source device 21 and the sink device 22 includes a wireless network interface. From the perspective of a wireless network, the source device 21 operates as a transmitting device, and the sink device 22 operates as a receiving device.
Disclosure of Invention
Technical Problem
[10] A broad bandwidth is required to transmit large multimedia data such as uncompressed video data, e.g., data that represents the red (R), green (G) and blue (B) or luminance and chrominance (YUV) components that constitute a pixel as digital values. If the sink device 22 can process only one of a video stream and an audio stream, but the source device 21 nevertheless transmits an AV stream which contains both audio and video data to the sink device 22, bandwidth is wasted. Therefore, the source device 21 needs to adaptively transmit a video stream, an audio stream, or an AV stream according to the available capability of the sink device 22.
Technical Solution
[11] The present invention provides a method and apparatus for efficiently transmitting large multimedia data to various sink devices in a high-frequency band of several tens of GHz.
[12] According to an aspect of the present invention, there is provided a wireless device including a device information reading unit which receives and reads information regarding available capabilities of devices included in a wireless network; a control unit which selects a first device and a second device from the devices based on the information; a stream generation unit which generates a video stream and an audio stream which form the same content; and a wireless transmission/reception unit which transmits the video stream to the first device, and transmits the audio stream to the second device.
[13] According to another aspect of the present invention, there is provided a wireless device including a device information storage unit which provides information regarding an available capability of the wireless device to other devices included in a wireless network; a wireless transmission/reception unit which receives one of a video stream and an audio stream from a source device among the devices based on the information; a time information reading unit which reads time information included in the video stream or the audio stream; and a media play unit which plays video or audio included in the video stream or the audio stream at a time indicated by the time information.
[14] According to another aspect of the present invention, there is provided a wireless device including a device information collection unit which collects information regarding available capabilities of devices included in a wireless network; a wireless transmission/reception unit which receives an AV stream, into which a video stream and an audio stream are multiplexed, from a source device among the devices based on the information; a media play unit which demultiplexes the received AV stream into the video stream and the audio stream, and plays each of the video and audio streams; a video output unit which outputs the video stream which is played to a user; and an audio retransmission unit which retransmits the audio stream which is played through a cable.
[15] According to another aspect of the present invention, there is provided a method of transmitting multimedia data. The method includes receiving and reading information regarding available capabilities of devices included in a wireless network; selecting a first device and a second device from the devices based on the information; generating a video stream and an audio stream which form the same content; and transmitting the video stream to the first device and transmitting the audio stream to the second device.
[16] According to another aspect of the present invention, there is provided a method of playing multimedia data. The method includes providing information regarding an available capability of a wireless device to other devices included in a wireless network; receiving one of a video stream and an audio stream from a source device among the devices based on the information; reading time information included in the video stream or the audio stream; and playing video or audio included in the video stream or the audio stream at a time indicated by the time information. [17] According to another aspect of the present invention, there is provided a method of playing multimedia data. The method includes collecting information regarding available capabilities of devices included in a wireless network; receiving an AV stream, into which a video stream and an audio stream are multiplexed, from a source device among the devices based on the information; demultiplexing the AV stream into the video stream and the audio stream, and playing each of the video and audio streams; outputting the video stream which is played to a user; and retransmitting the audio stream which is played through a cable.
Brief Description of the Drawings
[18] The above and other aspects of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings, in which: [19] FIG. 1 is a diagram comparing frequency bands of IEEE 802.11 series of standards and mmWave; [20] FIG. 2 is a block diagram illustrating a method of transmitting a multimedia stream from a source device to a sink device in a related art wireless home network; [21] FIG. 3 is a conceptual diagram of a large multimedia data transmission system according to an exemplary embodiment of the present invention; [22] FIG. 4 illustrates a process of transmitting device information from a wireless device to another wireless device according to an exemplary embodiment of the present invention; [23] FIG. 5 is a conceptual diagram of a large multimedia data transmission system according to another exemplary embodiment of the present invention; [24] FIG. 6 illustrates information added to a stream for synchronization when a source device transmits the stream according to an exemplary embodiment of the present invention; [25] FIG. 7 is a block diagram of a source device according to an exemplary embodiment of the present invention; [26] FIG. 8 is a block diagram of a sink device according to an exemplary embodiment of the present invention; and [27] FIG. 9 is a block diagram of a network coordinator according to an exemplary embodiment of the present invention.
Mode for the Invention
[28] The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as being limited to the exemplary embodiments set forth herein; rather, these exemplary embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the invention to those skilled in the art.
[29] FIG. 3 is a conceptual diagram of a large multimedia data transmission system according to an exemplary embodiment of the present invention.
[30] Referring to FIG. 3, a source device 100 selectively transmits a video stream or an audio stream according to the available capability of a sink device. If a first sink device 200a is a device which can process only a video stream, such as a monitor, a beam projector or a plasma display panel (PDP), the source device 100 transmits a video stream to the first sink device 200a. In addition, if a second sink device 200b is a device which can process only an audio stream, such as an AV receiver or a Moving Picture Experts Group (MPEG) layer-3 (MP3) decoder, the source device 100 transmits an audio stream to the second sink device 200b.
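As an illustrative, non-normative sketch of this selection behavior, the logic could look as follows in Python; the class and field names (for example, DeviceCapability, can_decode_video) are assumptions made for the example and are not defined in the specification.

```python
from dataclasses import dataclass

@dataclass
class DeviceCapability:
    """Capability record a device announces when it associates (assumed fields)."""
    device_id: str
    can_decode_video: bool   # e.g., monitor, beam projector, PDP
    can_decode_audio: bool   # e.g., AV receiver, MP3 decoder

def select_stream_type(sink: DeviceCapability) -> str:
    """Choose which stream the source should send to a given receiving device."""
    if sink.can_decode_video and sink.can_decode_audio:
        return "AV"      # multiplexed stream, e.g., to a network coordinator such as a TV
    if sink.can_decode_video:
        return "VIDEO"   # first sink device 200a in FIG. 3
    if sink.can_decode_audio:
        return "AUDIO"   # second sink device 200b in FIG. 3
    raise ValueError(f"{sink.device_id} cannot play video or audio")

# Example: a display-only sink receives only the video stream.
monitor = DeviceCapability("sink-200a", can_decode_video=True, can_decode_audio=False)
assert select_stream_type(monitor) == "VIDEO"
```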
[31] In this case, the audio and video streams, which form the same content, must be synchronized in order to be played properly. Such synchronization will be described later with reference to FIG. 6.
[32] As described above, a source device needs to know the available capability of each sink device in order to select a type of a stream to transmit. Since the source device and the sink devices are wireless devices that form a wireless network, each device informs its available capability to other wireless devices when associating with the wireless network.
[33] FIG. 4 illustrates a process of transmitting device information from a wireless device to another wireless device according to an exemplary embodiment of the present invention.
[34] Referring to FIG. 4, a wireless network 40 is composed of a first device 41 and a second device 42. When a third device 43 associates with the wireless network 40, it notifies the first and second devices 41 and 42 of information regarding its available capability. This notification of the available-capability information is given regardless of whether the wireless network 40 has an infrastructure network structure based on an access point, e.g., the IEEE 802.11 series of standards, or an ad hoc network structure centered on a piconet coordinator (PNC). In either case, when the third device 43 joins the wireless network 40, it either sends the information regarding its available capability directly to the first and second devices 41 and 42 or sends it to them indirectly through an intermediary device.
[35] The information regarding the available capability of the third device 43 may include a model name of the third device 43, manufacturer information, a supported communication method such as a modulation method and a transmission rate, a processable video stream format, and a processable audio stream format. A source device can determine whether a sink device (here, the third device 43) can process a video stream or an audio stream based on the information regarding the available capability of the sink device.
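A minimal sketch of how the capability fields listed in paragraph [35] might be bundled into an association-time notification; JSON and the field names below are illustrative assumptions only, since the specification does not prescribe an encoding.

```python
import json

def build_capability_notification(model: str, manufacturer: str,
                                  modulation: str, rate_gbps: float,
                                  video_formats: list[str],
                                  audio_formats: list[str]) -> bytes:
    """Serialize the capability fields named in paragraph [35].

    JSON keeps the sketch concrete; an actual implementation would carry
    these fields in MAC-level information elements, not JSON.
    """
    record = {
        "model": model,
        "manufacturer": manufacturer,
        "communication": {"modulation": modulation, "rate_gbps": rate_gbps},
        "video_formats": video_formats,   # empty list: device cannot process video
        "audio_formats": audio_formats,   # empty list: device cannot process audio
    }
    return json.dumps(record).encode("utf-8")

# Example: an audio-only AV receiver joining the network (hypothetical values).
payload = build_capability_notification(
    "AVR-100", "ExampleCorp", "OFDM", 3.0,
    video_formats=[], audio_formats=["MP3", "AC-3", "WMA"])
```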
[36] FIG. 5 is a conceptual diagram of a large multimedia data transmission system according to another exemplary embodiment of the present invention.
[37] Referring to FIG. 5, a network coordinator 300, such as a TV, can process both audio and video streams. In addition, the network coordinator 300 initiates, maintains, terminates and manages a network, like an IEEE 802.15.3 PNC. A wired device 250 is connected to an external terminal of the network coordinator 300 by a cable and outputs sound provided by the network coordinator 300 to a user. The wired device 250 may be a speaker or a combination of an amplifier and a speaker. The sink device 200, which is an audio device, receives an audio stream from the source device 100. The user may select either the sink device 200 or the wired device 250 to output sound.
[38] In a process, such as an association process, of forming a network including one or more source devices and one or more sink devices, the network coordinator 300 receives information regarding available capability of each of the source and sink devices. Then, the network coordinator 300 transmits the received information to other source and sink devices. In multimedia data transmission, the source device 100 transmits a video stream or an audio stream if a receiving device is the sink device 200. If the receiving device is the network coordinator 300, the source device 100 transmits an AV stream.
[39] If the available capability of the receiving device is not sufficient to process data transmitted at a high rate, the source device 100 may transmit a compressed video stream or a compressed audio stream.
[40] The source device 100 according to an exemplary embodiment of the present invention selectively transmits a video stream, an audio stream, or an AV stream, according to the available capability of a receiving device as illustrated in FIGS. 3 and 5. If a video stream and an audio stream that belong to the same content are played in different receiving devices, a process of synchronizing the video and audio streams must be performed.
[41] FIG. 6 illustrates information added to a stream for synchronization when a source device transmits the stream according to an exemplary embodiment of the present invention.
[42] A video stream may be composed of intra-coded frames (I frames), predictive-coded frames (P frames), and bi-directional predictive-coded frames (B frames). An I frame can be decoded on its own, and a P frame can be decoded based on a preceding I or P frame. In addition, a B frame can be decoded based on the I or P frames preceding and following it.
[43] As described above, the video stream includes the above frames arranged in a predetermined order. When the frames are played, they are decoded according to their characteristics, and are presented in a predetermined order. Referring to FIG. 6, a decoding time stamp (DTS) indicating a time when each frame is decoded and a presentation time stamp (PTS) indicating a time when each frame is played are set for each frame included in a video stream.
[44] The times of the DTS and PTS correspond to reference times generated by a system time clock (STC) installed in a source device. The STC is actually a counter whose value wraps around approximately every 26 hours. If image data is recorded using the National Television Standards Committee (NTSC) method, the difference between the counter values of successive frames is 3,003 (3,600 in the case of the phase alternating line (PAL) method).
[45] Accordingly, referring to FIG. 6, if DTS(n), where n is a counter value of the STC, is set in the header of the first frame, the next DTS is n+3003, followed by n+6006, n+9009, n+12012, and n+15015, sequentially.
[46] The PTS is set based on the DTS and may be set variously according to the types of frames. For example, if the cycle of I and P frames is three frames, that is, if two B frames are successively inserted between I or P frames so that an I or P frame appears at intervals of three frames, the PTS of the I or P frame is the value obtained by adding three frame intervals to its DTS.
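A short numerical sketch of the NTSC timing described in paragraphs [44]-[46]; the helper names below are illustrative and assume a 90 kHz time base with 3,003 ticks per frame and two B frames between successive I/P frames.

```python
NTSC_TICKS_PER_FRAME = 3003   # STC ticks per frame at 29.97 fps (3,600 for PAL)
B_FRAMES_BETWEEN = 2          # two B frames between consecutive I/P frames, as in [46]

def dts(frame_index: int, first_dts: int) -> int:
    """DTS grows by one frame interval per frame: n, n+3003, n+6006, ..."""
    return first_dts + frame_index * NTSC_TICKS_PER_FRAME

def pts_for_reference_frame(frame_dts: int) -> int:
    """For an I or P frame, presentation lags decoding by (B_FRAMES_BETWEEN + 1) frames."""
    return frame_dts + (B_FRAMES_BETWEEN + 1) * NTSC_TICKS_PER_FRAME

n = 0
print([dts(i, n) for i in range(6)])        # [0, 3003, 6006, 9009, 12012, 15015]
print(pts_for_reference_frame(dts(0, n)))   # 9009: the I frame is presented three frames later
```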
[47] Hereinafter, a case where both the DTS and the PTS are used will be described.
However, since the DTS and the PTS are equal in the case of B frames, only the PTS may be recorded in the video stream. In addition, even if B frames do not exist in the video stream, since an I frame can be decoded on its own and an I or P frame required for a P frame to be decoded already exists, only the PTS can be recorded in the video stream.
[48] A program clock reference (PCR), which indicates the correspondence between the STC, i.e., the reference time, and the stream data, is also recorded in the video stream. In other words, since the DTS and the PTS are STC values of the source device, they do not directly match the STC values of a sink device. Therefore, the PCR is recorded in the video stream so that the sink device can non-periodically correct its STC to the STC values of the source device as it processes the frames of stream data.
[49] For example, referring to FIG. 6, PCR_x and PCR_x+1 are recorded in the second and fifth frames from the left. When each frame is processed, the sink device reads this information. Accordingly, the sink device perceives that the second frame from the left was processed at a time t_x and corrects its STC. In addition, based on PCR_x+1 included in the fifth frame from the left, the sink device perceives that the fifth frame was processed at a time t_x+1 and corrects its STC, thereby playing the video stream.
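A minimal sketch of the PCR-driven clock correction described in paragraphs [48]-[49]; the SinkClock class and the idea of overwriting the counter on every received PCR are assumptions made for illustration, not the patent's exact mechanism.

```python
import time

class SinkClock:
    """Local STC of a sink device, corrected whenever a frame carries a PCR."""
    TICK_HZ = 90_000  # assumed 90 kHz time base

    def __init__(self) -> None:
        self._offset_ticks = 0
        self._origin = time.monotonic()

    def now(self) -> int:
        """Current local STC counter value."""
        elapsed = time.monotonic() - self._origin
        return self._offset_ticks + int(elapsed * self.TICK_HZ)

    def correct(self, pcr_ticks: int) -> None:
        """Snap the local STC to the source's STC value carried in the PCR."""
        self._origin = time.monotonic()
        self._offset_ticks = pcr_ticks

def on_frame(clock: SinkClock, frame: dict) -> None:
    # Non-periodic correction: only frames that carry a PCR adjust the clock.
    if "pcr" in frame:
        clock.correct(frame["pcr"])

clock = SinkClock()
on_frame(clock, {"pts": 9009, "pcr": 6006})   # e.g., the second frame from the left in FIG. 6
```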
[50] While there may be some differences in encoding and decoding methods, an audio stream is similar to a video stream in that it is composed of a plurality of audio frames. The stream illustrated in FIG. 6 may denote a video stream or an audio stream, and the frame may denote a video frame or an audio frame.
[51] As described above, a source device records a DTS, a PTS and a PCR in each of video frames that form a video stream and in each of audio frames that form an audio stream, and transmits the video and audio streams accordingly. Therefore, even if a sink device or a network coordinator receives only either of the video stream and the audio stream, the video and audio streams can be properly synchronized when played.
[52] Hereinafter, a wireless device according to an exemplary embodiment of the present invention denotes a source device, a sink device, or a network coordinator. The wireless device includes a transmission/reception unit performing mmWave communication.
[53] FIG. 7 is a block diagram of a source device according to an exemplary embodiment of the present invention.
[54] Referring to FIG. 7, a source device 100 includes a device information reading unit
195 receiving and reading information regarding available capabilities of devices included in a wireless network, a control unit 110 selecting a first device which can process a video stream and a second device which can process an audio stream based on the information regarding the available capabilities of the devices, a stream generation unit 180 generating a video stream and an audio stream that form the same content, and a transmission/reception unit 150 transmitting the video stream to the first device and transmitting the audio stream to the second device. The source device 100 may further include an STC 135, a time information recording unit 160, a media storage unit 170, and a buffer 190.
[55] When a device associates with the wireless network, it transmits information regarding its available capability to other devices in the wireless network. Here, the information includes a processable video stream format, a processable audio stream format, and a supported communication method.
[56] The control unit 110 controls other components connected to a bus 130. In addition, the control unit 110 may select a network coordinator, which can process an AV stream obtained after a video stream and an audio stream are multiplexed, from the devices based on the information regarding the available capabilities of the devices in the wireless network. In this case, the transmission/reception unit 150 transmits the AV stream to the network coordinator. [57] The network coordinator is a wireless device initiating, maintaining, and terminating the wireless network. The network coordinator can receive and process an AV stream.
[58] The media storage unit 170 stores multimedia data received from an external source or from a recording medium and provides the multimedia data to the stream generation unit 180.
[59] The stream generation unit 180 generates a video stream and an audio stream from the received multimedia data. If the multimedia data is data into which video data and audio data are multiplexed, the stream generation unit 180 demultiplexes the multimedia data.
[60] The time information recording unit 160 inserts time information for synchronizing the generated video and audio streams into the video stream and the audio stream. The time information includes a PCR and at least one of a PTS and a DTS. The time information may be presented as a counter value provided by the STC 135.
[61] In this case, the PTS and the DTS may be recorded in the video stream or the audio stream in units of frames, and the PCR may be recorded in the video stream or the audio stream at irregular frame intervals.
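Paragraphs [60] and [61] can be sketched as follows; the dictionary frame representation and the rule for how often a PCR is attached are illustrative assumptions, since the patent only requires PTS/DTS per frame and a PCR at (possibly irregular) frame intervals.

```python
def stamp_frames(frames, first_dts, ticks_per_frame=3003, pcr_every=3):
    """Attach a DTS/PTS to every frame and a PCR to some frames, as in [60]-[61]."""
    for i, frame in enumerate(frames):
        stamped = dict(frame)
        dts = first_dts + i * ticks_per_frame
        stamped["dts"] = dts
        # B frames are presented when decoded; I/P frames are presented three frames later ([46]-[47]).
        stamped["pts"] = dts if frame["type"] == "B" else dts + 3 * ticks_per_frame
        if i % pcr_every == 1:   # a PCR on some frames only, e.g., the second and fifth in FIG. 6
            stamped["pcr"] = dts
        yield stamped

# Example: one group of pictures in decoding order.
video = list(stamp_frames(({"type": t} for t in "IBBPBB"), first_dts=0))
print([(f["type"], f["dts"], f["pts"], "pcr" in f) for f in video])
```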
[62] The buffer 190 temporarily stores the received video or audio stream and provides the stored video or audio stream to the transmission/reception unit 150. The buffer 190 may be a nonvolatile memory device such as a flash memory, a volatile memory device such as a random access memory (RAM), or a storage medium such as a hard disk or an optical disk, and may be implemented in different forms known to the art to which the present invention pertains.
[63] A medium access control (MAC) unit 140 adds an MAC header to a stream received from the buffer 190, generates an MAC protocol data unit (MPDU), and transmits the generated MPDU through a radio frequency (RF) unit 152. The RF unit 152 generates a wireless signal by processing a baseband signal and transmits the wireless signal over the air through an antenna 153.
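The framing step in paragraph [63] might look like the toy sketch below; the header layout (source address, destination address, payload length) is a simplification for illustration and is not the actual IEEE 802.15.3 MAC header format.

```python
import struct

def build_mpdu(payload: bytes, src_addr: int, dst_addr: int) -> bytes:
    """Prepend a toy MAC header (src, dst, payload length) to a stream fragment."""
    header = struct.pack("!HHI", src_addr, dst_addr, len(payload))
    return header + payload

def strip_mac_header(mpdu: bytes) -> bytes:
    """Reverse operation, performed by the sink's MAC unit 240 in paragraph [70]."""
    (_, _, length) = struct.unpack("!HHI", mpdu[:8])
    return mpdu[8:8 + length]

chunk = b"\x00" * 188                       # e.g., one stream fragment taken from the buffer
assert strip_mac_header(build_mpdu(chunk, 0x0001, 0x0002)) == chunk
```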
[64] FIG. 8 is a block diagram of a sink device according to an exemplary embodiment of the present invention.
[65] Referring to FIG. 8, a sink device 200 includes a device information storage unit 295 providing information regarding available capabilities of devices included in a wireless network, a transmission/reception unit 250 receiving one of a video stream and an audio stream from a source device among the devices based on the information, a time information reading unit 260 reading time information included in the received stream, and a media play unit 280 playing video or audio included in the received stream at a time indicated by the read time information.
[66] When the sink device 200 associates with the wireless network, it directly sends information regarding its available capability to other devices or indirectly sends the in- formation through a network coordinator. The information includes a processable video stream format, a processable audio stream format, and a supported communication method.
[67] The time information includes a PCR and at least one of a PTS and a DTS. The PTS and the DTS may be recorded in the received stream in units of frames, and the PCR may be recorded in the received stream at irregular frame intervals.
[68] The sink device 200 may further include a control unit 210, an STC 235, an output unit 285, and a buffer 290.
[69] The control unit 210 controls other components connected to a bus 230. The STC
235 displays a local time within the sink device 200 as a counter value. The counter value of the STC 235 may be different from a value of the STC 135 of the source device 100 illustrated in FIG. 7. Therefore, the media play unit 280 frequently corrects the STC 235 according to a PCR read by the time information reading unit 260. Accordingly, the STC 135 and the STC 235 may maintain equal counter values.
[70] The transmission/reception unit 250 is configured similarly to the transmission/reception unit 150 of the source device 100 illustrated in FIG. 7. In a receiving operation, the transmission/reception unit 250 receives a wireless signal via an antenna 253. Then, an RF unit 252 demodulates the wireless signal into digital data, and an MAC unit 240 removes an MAC header from the digital data. The digital data having the MAC header removed may be an audio stream or a video stream.
[71] The buffer 290 temporarily stores the video stream or the audio stream provided by the transmission/reception unit 250 and provides the stored video or audio stream to the media play unit 280.
[72] The media play unit 280 includes a video codec such as MPEG-1, MPEG-2, MPEG-4, H.264 or Windows Media Video (WMV), or an audio codec such as MP3, AC-3 or Windows Media Audio (WMA), in order to play the video or audio stream.
[73] The output unit 285 outputs the video or audio stream played by the media play unit
280 to a user. Accordingly, the output unit 285 may be a display device or a combination of an amplifier and a speaker.
[74] FIG. 9 is a block diagram of a network coordinator according to an exemplary embodiment of the present invention.
[75] Referring to FIG. 9, a network coordinator 300 is basically configured similarly to the sink device 200 illustrated in FIG. 8. However, they differ in that the network coordinator 300 further includes a network management unit 375 managing the initiation, maintenance and termination of a network, a device information collection unit 395 collecting device information provided by other devices, and an audio retransmission unit 370 retransmitting sound to a wired device, such as an amplifier or a speaker, through a cable. Therefore, the network coordinator 300 receives an AV stream, into which an audio stream and a video stream are multiplexed, from the source device 100 illustrated in FIG. 7, and retransmits the played sound to the wired device through the cable. The network coordinator 300 may be implemented as a digital TV having a wireless network interface, and the network management unit 375 may be implemented as a PNC module in IEEE 802.15.3.
[76] The device information collection unit 395 collects information regarding available capabilities of devices included in a wireless network when the devices associate with the wireless network. Then, the device information collection unit 395 provides the collected information to other devices in the wireless network periodically or whenever requested.
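One way to picture the device information collection unit 395 is as a registry keyed by device, refreshed at association time and served on request or by periodic broadcast; the sketch below uses hypothetical names.

    class DeviceInfoRegistry:
        """Hypothetical registry of capability records held by the coordinator."""

        def __init__(self):
            self._records = {}

        def on_associate(self, device_id, capability):
            # Store (or refresh) the record announced at association time.
            self._records[device_id] = capability

        def on_disassociate(self, device_id):
            self._records.pop(device_id, None)

        def snapshot(self):
            # Provided to the other devices periodically or whenever requested.
            return dict(self._records)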
[77] A transmission/reception unit 350 receives the AV stream from the source device 100, which has been provided with the information retained by the network coordinator 300.
[78] A buffer 390 temporarily stores the received AV stream and provides the stored AV stream to a media play unit 380.
[79] The media play unit 380 frequently corrects an STC 335 according to a PCR read by a time information reading unit 360. In addition, the media play unit 380 includes a video codec such as MPEG-1, MPEG-2, MPEG-4, H.264 or WMV, or an audio codec such as MP3, AC-3 or WMA, in order to play the video or audio stream.
[80] The media play unit 380 outputs a video signal, which is fed to a video output unit 381, and an audio signal, which is fed to the audio retransmission unit 370.
[81] The video output unit 381 is a visual display device and visually displays the video signal to a user. The audio retransmission unit 370 retransmits the audio signal through the cable. For example, the audio retransmission unit 370 is an external sound terminal, and the retransmitted audio signal may be an analog signal.
[82] Each component described above with reference to FIGS. 7 through 9 may be implemented as a software component, such as a task performed in a predetermined region of a memory, a class, a subroutine, a process, an object, an execution thread or a program, or as a hardware component, such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). In addition, the components may be composed of a combination of software and hardware components. The components may reside on a computer readable storage medium or may be distributed over a plurality of computers.
[83] While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims. The exemplary embodiments should be considered in a descriptive sense only and not for purposes of limitation.
Industrial Applicability
[84] As described above, according to the exemplary embodiments of the present invention, large multimedia data is transmitted according to a user's requirements. Therefore, the efficiency of network use can be enhanced.

Claims
[1] A wireless device comprising: a device information reading unit which receives and reads information regarding available capabilities of devices included in a wireless network; a control unit which selects a first device and a second device from the devices based on the information; a stream generation unit which generates a video stream and an audio stream which form the same content; and a wireless transmission and reception unit which transmits the video stream to the first device, and transmits the audio stream to the second device.
[2] The device of claim 1, wherein the first device is a device which can process the video stream, and the second device is a device which can process the audio stream.
[3] The device of claim 1, wherein the control unit selects a network coordinator, which can process an audio and video (AV) stream obtained after the video stream and the audio stream are multiplexed, from the devices based on the information regarding the available capabilities of the devices, and the wireless transmission and reception unit transmits the AV stream to the network coordinator.
[4] The device of claim 3, wherein the network coordinator initiates, maintains, and terminates the wireless network.
[5] The device of claim 1, wherein the information regarding the available capabilities of the devices is transmitted, when each of the devices associates with the wireless network.
[6] The device of claim 1, wherein the information regarding the available capabilities of the devices comprises at least one of a processable video stream format, a processable audio stream format, and a supported communication method.
[7] The device of claim 1, further comprising a time information recording unit which inserts time information for synchronizing the video stream and the audio stream into the video stream and the audio stream.
[8] The device of claim 7, wherein the time information comprises a program clock reference (PCR) and at least one of a presentation time stamp (PTS) and a decoding time stamp (DTS).
[9] The device of claim 8, wherein the PTS and the DTS are recorded in the video stream or the audio stream in units of frames, and the PCR is recorded in the video stream or the audio stream at irregular frame intervals.
[10] The device of claim 1, wherein the wireless transmission and reception unit performs a millimeter wave communication, and the video stream is uncompressed video data.
[11] A wireless device comprising: a device information storage unit which provides information regarding an available capability of the wireless device to other devices included in a wireless network; a wireless transmission and reception unit which receives one of a video stream and an audio stream from a source device among the devices based on the information; a time information reading unit which reads time information included in the video stream or the audio stream; and a media play unit which plays video or audio included in the video stream or the audio stream at a time indicated by the time information.
[12] The device of claim 11, further comprising an output unit which outputs the video or the audio which is played to a user.
[13] The device of claim 11, wherein the information regarding the available capability of the wireless device is transmitted, when the wireless device associates with the wireless network.
[14] The device of claim 11, wherein the information regarding the available capability of the wireless device comprises at least one of a processable video stream format, a processable audio stream format, and a supported communication method.
[15] The device of claim 11, wherein the time information comprises a program clock reference (PCR) and at least one of a presentation time stamp (PTS) and a decoding time stamp (DTS).
[16] The device of claim 15, wherein the PTS and the DTS are recorded in the video stream or the audio stream in units of frames, and the PCR is recorded in the video stream or the audio stream at irregular frame intervals.
[17] The device of claim 16, wherein the media play unit corrects a system time clock
(STC) according to the PCR.
[18] The device of claim 11, wherein the wireless transmission and reception unit performs a millimeter wave communication, and the video stream is uncompressed video data.
[19] A wireless device comprising: a device information collection unit which collects information regarding available capabilities of devices included in a wireless network; a wireless transmission and reception unit which receives an audio and video (AV) stream, into which a video stream and an audio stream are multiplexed, from a source device among the devices based on the information; a media play unit which demultiplexes the AV stream into the video stream and the audio stream, and plays each of the video and audio streams; a video output unit which outputs the video stream which is played to a user; and an audio retransmission unit which retransmits the audio stream which is played through a cable.
[20] The device of claim 19, further comprising a network management unit which initiates, maintains, and terminates the wireless network.
[21] The device of claim 19, wherein the audio retransmission unit is an external sound terminal, and the retransmitted audio stream is an analog signal.
[22] A method of transmitting multimedia data, the method comprising: receiving and reading information regarding available capabilities of devices included in a wireless network; selecting a first device and a second device from the devices based on the information; generating a video stream and an audio stream which form the same content; and transmitting the video stream to the first device and transmitting the audio stream to the second device.
[23] The method of claim 22, wherein the first device is a device which can process the video stream, and the second device is a device which can process the audio stream.
[24] A method of playing multimedia data, the method comprising: providing information regarding an available capability of a wireless device to other devices included in a wireless network; receiving one of a video stream and an audio stream from a source device among the devices based on the information; reading time information included in the video stream or the audio stream; and playing video or audio included in the video stream or the audio stream at a time indicated by the time information.
[25] A method of playing multimedia data, the method comprising: collecting information regarding available capabilities of devices included in a wireless network; receiving an audio and video (AV) stream, into which a video stream and an audio stream are multiplexed, from a source device among the devices based on the information; demultiplexing the AV stream into the video stream and the audio stream, and playing each of the video and audio streams; outputting the video stream which is played to a user; and retransmitting the audio stream which is played through a cable.
[26] A wireless device comprising: a device information reading unit which receives and reads information regarding available capabilities of devices connected to a network; a control unit which selects at least one sink device from the devices based on the information; and a transmission and reception unit which transmits a video stream or an audio stream to the at least one sink device.
[27] The device of claim 26, wherein the at least one sink device comprises a first sink device which receives the video stream and a second sink device which receives the audio stream.
[28] The device of claim 26, wherein the at least one sink device comprises a first sink device which receives the video stream and the audio stream and a second sink device which receives the audio stream.
[29] The device of claim 28, wherein the first sink device processes the audio stream and outputs the audio stream to a separate audio device through a wired cable.
[30] A method of transmitting multimedia data, the method comprising: receiving and reading information regarding available capabilities of devices connected to a network; selecting at least one sink device from the devices based on the information; and transmitting a video stream or an audio stream to the at least one sink device.
[31] A computer readable recording medium storing a computer program for performing a method of transmitting multimedia data, the method comprising: receiving and reading information regarding available capabilities of devices included in a wireless network; selecting a first device and a second device from the devices based on the information; generating a video stream and an audio stream which form the same content; and transmitting the video stream to the first device and transmitting the audio stream to the second device.
[32] A computer readable recording medium storing a computer program for performing a method of playing multimedia data, the method comprising: providing information regarding an available capability of a wireless device to other devices included in a wireless network; receiving one of a video stream and an audio stream from a source device among the devices based on the information; reading time information included in the video stream or the audio stream; and playing video or audio included in the video stream or the audio stream at a time indicated by the time information.
[33] A computer readable recording medium storing a computer program for performing a method of playing multimedia data, the method comprising: collecting information regarding available capabilities of devices included in a wireless network; receiving an audio and video (AV) stream, into which a video stream and an audio stream are multiplexed, from a source device among the devices based on the information; demultiplexing the AV stream into the video stream and the audio stream, and playing each of the video and audio streams; outputting the video stream which is played to a user; and retransmitting the audio stream which is played through a cable.
[34] A computer readable recording medium storing a computer program for performing a method of transmitting multimedia data, the method comprising: receiving and reading information regarding available capabilities of devices connected to a network; selecting at least one sink device from the devices based on the information; and transmitting a video stream or an audio stream to the at least one sink device.
PCT/KR2007/002706 2006-06-05 2007-06-04 Method of transmitting/playing multimedia data over wireless network and wireless device using the method WO2007142445A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP07807922.5A EP2025182B1 (en) 2006-06-05 2007-06-04 Method of transmitting/playing multimedia data over wireless network and wireless device using the method
MX2008015594A MX2008015594A (en) 2006-06-05 2007-06-04 Method of transmitting/playing multimedia data over wireless network and wireless device using the method.
CN2007800164724A CN101438615B (en) 2006-06-05 2007-06-04 Method of transmitting/playing multimedia data over wireless network and wireless device using the method
JP2009514198A JP5065382B2 (en) 2006-06-05 2007-06-04 Wireless device, multimedia data transmission method and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020060050504A KR100801002B1 (en) 2006-06-05 2006-06-05 Method for transferring/playing multimedia data on wireless network and wireless device thereof
KR10-2006-0050504 2006-06-05

Publications (1)

Publication Number Publication Date
WO2007142445A1 true WO2007142445A1 (en) 2007-12-13

Family

ID=38790148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/002706 WO2007142445A1 (en) 2006-06-05 2007-06-04 Method of transmitting/playing multimedia data over wireless network and wireless device using the method

Country Status (8)

Country Link
US (2) US8045665B2 (en)
EP (1) EP2025182B1 (en)
JP (1) JP5065382B2 (en)
KR (1) KR100801002B1 (en)
CN (1) CN101438615B (en)
MX (1) MX2008015594A (en)
TW (1) TW200746763A (en)
WO (1) WO2007142445A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012517636A (en) * 2009-02-09 2012-08-02 アップル インコーポレイテッド Portable electronic device using proximity-based content synchronization
US9491437B2 (en) 2010-12-07 2016-11-08 Samsung Electronics Co., Ltd. Transmitter for transmitting data for constituting content, receiver for receiving and processing data, and method therefor
US9628771B2 (en) 2010-12-07 2017-04-18 Samsung Electronics Co., Ltd. Transmitter and receiver for transmitting and receiving multimedia content, and reproduction method therefor

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2009125599A (en) * 2006-12-04 2011-01-20 Дзе Борд Оф Трастиз Оф Дзе Юниверсити Оф Иллинойс (Us) COMPOSITIONS AND METHODS FOR TREATING CANCER WITH CPG-RICH DNA AND CUPREDOXINS
JP5019597B2 (en) * 2007-06-29 2012-09-05 株式会社東芝 Wireless communication device, wireless communication system, and network control method
US7936790B2 (en) * 2007-08-30 2011-05-03 Silicon Image, Inc. Synchronizing related data streams in interconnection networks
US20090156128A1 (en) * 2007-12-12 2009-06-18 Motorola, Inc. Eyewear communications system
US20090215436A1 (en) * 2008-02-25 2009-08-27 Internet Connectivity Group, Inc. Integrated Wireless Mobile Media System
US20110149164A1 (en) * 2008-08-26 2011-06-23 Netanel Goldberg Method circuit and system for mitigating interference between wireless data and wireless video transceivers operating in proximity with one another
US8503377B2 (en) 2008-09-25 2013-08-06 Intel Corporation Methods for multi-band wireless communication and bandwidth management
JP5330039B2 (en) * 2009-03-16 2013-10-30 シャープ株式会社 Wireless transmission system, relay device, wireless sink device, and wireless source device
US8356113B2 (en) * 2009-03-25 2013-01-15 Cisco Technology, Inc. UPnP AV demux
JP5367814B2 (en) 2009-05-14 2013-12-11 パナソニック株式会社 Video data transmission method
KR20110008860A (en) * 2009-07-21 2011-01-27 엘지이노텍 주식회사 Wireless a/v system
US8755302B2 (en) * 2009-09-24 2014-06-17 Samsung Electronics Co., Ltd. Method and system for ad-hoc communications over millimeter wave wireless channels in wireless systems
JP5577789B2 (en) * 2010-03-25 2014-08-27 ソニー株式会社 Image data transmitting apparatus, image data transmitting method, and image data receiving apparatus
US20120155443A1 (en) * 2010-12-16 2012-06-21 Carlos Cordeiro Millimeter-wave communication station and methods for station and information discovery in a millimeter-wave basic service set
CN102547300B (en) 2010-12-17 2015-01-21 华为技术有限公司 Method for detecting frame types and device
KR101607425B1 (en) 2012-03-29 2016-03-29 닛본 덴끼 가부시끼가이샤 Wireless communication device, wireless communication system and wireless communication method
CN102638726B (en) * 2012-04-24 2016-06-15 惠州Tcl移动通信有限公司 A kind of multimedia streaming method based on Terahertz radio communication and system
KR102238399B1 (en) * 2012-12-07 2021-04-09 삼성전자주식회사 Method and system for streaming multimedia contents in a wi-fi network
US9826015B2 (en) * 2013-09-04 2017-11-21 Qualcomm Incorporated Dynamic and automatic control of latency buffering for audio/video streaming
US20150077635A1 (en) * 2013-09-18 2015-03-19 Htc Corporation Method for outputting multiple multimedia tracks along multiple processing paths from a portable electronic device
US9736806B2 (en) * 2014-02-28 2017-08-15 Qualcomm Incorporated Apparatuses and methods for wireless synchronization of multiple multimedia devices using a common timing framework
US11044386B1 (en) 2014-12-18 2021-06-22 The Directv Group, Inc. Method and system for synchronizing playback of independent audio and video streams through a network
US11006128B2 (en) * 2018-02-20 2021-05-11 Arlo Technologies, Inc. Camera communication channel selection

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001052553A1 (en) 2000-01-14 2001-07-19 Koninklijke Philips Electronics N.V. Interconnection of audio/video devices
US20020044200A1 (en) 2000-07-05 2002-04-18 Ulrich Leimkoetter Method and multimedia communication device for enabling communication between terminals having different multimedia capabilities
KR20030005092A (en) * 2002-11-19 2003-01-15 케이티링커스 주식회사 Multi-function public phone using of a wireless lan
US6671520B1 (en) * 1999-02-05 2003-12-30 Wooju Communications Co., Ltd. Remotely operated portable wireless video/audio monitoring system
WO2005020634A1 (en) * 2003-08-22 2005-03-03 Koninklijke Philips Electronics N.V. Audio/video system for wireless driving of loudspeakers
US20050108762A1 (en) * 2003-11-17 2005-05-19 Avermedia Technologies, Inc. Wireless audio-video transmission apparatus

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001160066A (en) * 1998-12-25 2001-06-12 Matsushita Electric Ind Co Ltd Device and method for processing data and recording medium, and program for allowing computer to execute data processing method
JP2000324163A (en) 1999-05-12 2000-11-24 Matsushita Electric Ind Co Ltd Data transmitter and data receiver
JP3708007B2 (en) * 1999-11-22 2005-10-19 株式会社東芝 Information exchange device
KR100336517B1 (en) 1999-12-09 2002-05-11 정경택 Digital settop box with printing function and operation method thereof
JP2001267946A (en) 2000-03-22 2001-09-28 Casio Comput Co Ltd Receiver
JP2001268542A (en) * 2000-03-21 2001-09-28 Communication Research Laboratory Distribution system, transmission system and receiving system
EP1148688A1 (en) * 2000-04-20 2001-10-24 Telefonaktiebolaget L M Ericsson (Publ) Proxy apparatus and method
JP2001326651A (en) * 2000-05-16 2001-11-22 Toshiba Corp Av data transfer control method, av data transfer system, av data receiver and av data transmitter
JP2002247484A (en) * 2001-02-22 2002-08-30 Olympus Optical Co Ltd Spectacles type video display device and its system
JP3591493B2 (en) * 2001-07-25 2004-11-17 ソニー株式会社 Network system and network system synchronization method
JP2003046949A (en) * 2001-07-30 2003-02-14 Hitachi Ltd Data multiplexing method, data recording medium, data recording apparatus, and data recording program
CN100379229C (en) * 2002-08-12 2008-04-02 华为技术有限公司 Method of receivel/send end ability selection
EP1401224A1 (en) 2002-09-17 2004-03-24 Motorola, Inc. Software download to software definable radio by intermediate communication unit
JP2004260454A (en) * 2003-02-25 2004-09-16 Sony Corp Transmitting/receiving system and transmitter
JP2004282667A (en) * 2003-03-19 2004-10-07 Matsushita Electric Ind Co Ltd Transmitter having correction function of regeneration desynchronization, receiver having the same, and transmission equipment having the transmitter and the receiver
KR100547849B1 (en) * 2003-12-05 2006-01-31 삼성전자주식회사 Frame Structure for Selecting Bridge Device in WPAN and Method for Selecting Bridge Device in WPAN
BRPI0418974A8 (en) * 2004-07-27 2017-12-26 Telecom Italia Spa METHOD FOR PERFORMING A COMMUNICATION BETWEEN A FIRST USER AND A SECOND USER AND TELEPHONE EQUIPMENT FOR PERFORMING A COMMUNICATION IN A COMMUNICATION NETWORK
US8719874B2 (en) * 2004-09-28 2014-05-06 Sony Corporation System and method of streaming audio from a common video device
JP4182437B2 (en) * 2004-10-04 2008-11-19 ソニー株式会社 Audio video synchronization system and monitor device
US20060140265A1 (en) * 2004-12-29 2006-06-29 Adimos Inc. System circuit and method for transmitting media related data
KR100599452B1 (en) 2005-05-06 2006-07-12 한국전자통신연구원 Voice/image process module for transferring audio signal to short distance, voice receiving/processing module for receiving audio signal at short distance, and method for transmitting audio signal from voice/image process module to voice receiving/processing module
US7573847B2 (en) * 2005-06-27 2009-08-11 Intel Corporation Media distribution system



Also Published As

Publication number Publication date
JP5065382B2 (en) 2012-10-31
EP2025182A1 (en) 2009-02-18
EP2025182A4 (en) 2015-04-08
US20090031365A1 (en) 2009-01-29
TW200746763A (en) 2007-12-16
KR100801002B1 (en) 2008-02-11
EP2025182B1 (en) 2020-01-22
KR20070116454A (en) 2007-12-10
US8059775B2 (en) 2011-11-15
JP2009540425A (en) 2009-11-19
US8045665B2 (en) 2011-10-25
CN101438615B (en) 2012-09-05
CN101438615A (en) 2009-05-20
US20070280361A1 (en) 2007-12-06
MX2008015594A (en) 2009-01-13

Similar Documents

Publication Publication Date Title
EP2025182B1 (en) Method of transmitting/playing multimedia data over wireless network and wireless device using the method
EP2186297B1 (en) Apparatus, systems and methods to synchronize communication of content to a presentation device and a mobile device
EP3113498A1 (en) Synchronized rendering of split multimedia content on network clients
US8654767B2 (en) Method and system for wireless communication of audio in wireless networks
US20080019398A1 (en) Clock recovery in wireless media streaming
US20140090007A1 (en) Broadcast receiving apparatus, playback apparatus, broadcast communication system, broadcast receiving method, playback method and program
US9032453B2 (en) Method and system for multiplexed transport interface between demodulators (DEMODs) and set-top box (STB) system-on-chips (SoCs)
KR20070073564A (en) Method of lip synchronizing for a wireless audio/video network and apparatus for the same
US8368811B2 (en) Reproducing apparatus
US20050147175A1 (en) Stream data communication system
CN102595162B (en) Image processing equipment, image processing method and program
JP2010258489A (en) Video display device, reception device, transmission and reception system, and video display method
US20130113991A1 (en) Apparatus for transceiving point to point moving signal reception high channel using horizontal blanking interval and method for executing the apparatus
JP7501317B2 (en) Video receiving device and video receiving method
JP5572541B2 (en) Video encoder system
US20120207207A1 (en) Method, system and associated modules for transmission of complimenting frames
US20110126243A1 (en) Device, method and system for transmitting data network based data over a wireless video link

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07807922

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 200780016472.4

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2007807922

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2009514198

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: MX/A/2008/015594

Country of ref document: MX