WO2008057407A2 - Wireless HD AV packet format - Google Patents


Info

Publication number
WO2008057407A2
Authority
WO
WIPO (PCT)
Prior art keywords
video
composite packet
phy
packets
sub
Prior art date
Application number
PCT/US2007/023126
Other languages
French (fr)
Other versions
WO2008057407A3 (en)
WO2008057407A8 (en)
Inventor
Ching-Hsiu Chiang
Jeffrey M. Gilbert
Victor Ramamoorthy
Original Assignee
Sibeam, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sibeam, Inc. filed Critical Sibeam, Inc.
Priority to DE602007011819T priority Critical patent/DE602007011819D1/en
Priority to EP07861651A priority patent/EP2098027B1/en
Priority to AT07861651T priority patent/ATE494711T1/en
Publication of WO2008057407A2 publication Critical patent/WO2008057407A2/en
Publication of WO2008057407A3 publication Critical patent/WO2008057407A3/en
Publication of WO2008057407A8 publication Critical patent/WO2008057407A8/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00: Data switching networks
    • H04L 12/64: Hybrid switching systems
    • H04L 12/6418: Hybrid transport
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60: Network streaming of media packets
    • H04L 65/70: Media network packetisation
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/04: Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G 2370/045: Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller using multiple communication channels, e.g. parallel and serial
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/10: Use of a protocol of communication by packets in interfaces along the display data pipeline
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/16: Use of wireless transmission of display information
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/003: Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G 5/006: Details of the interface to the display terminal
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00: Data switching networks
    • H04L 12/64: Hybrid switching systems
    • H04L 12/6418: Hybrid transport
    • H04L 2012/6445: Admission control
    • H04L 2012/6448: Medium Access Control [MAC]

Definitions

  • the wireless communication occurs using a wireless transceiver with or without an adaptive beamforming antenna.
  • the wireless communication could occur with a wireless receiver or transmitter.
  • a media access controller generates a composite packet having an optimized format for carrying audio, video, and data traffic.
  • a physical device interface (PHY) is coupled to the MAC. The PHY encodes and decodes between a digital signal and a modulated analog signal.
  • the PHY comprises a high rate physical layer circuit (HRP) and a low rate physical layer circuit (LRP).
  • RF transmitter is coupled to the PHY to transmit data.
  • the wireless communication includes an additional link, or channel, for transmitting information between a transmitter and a receiver.
  • the link may be uni-directional or bi-directional.
  • the channel is used to send antenna information back from a receiver to a transmitter to enable the transmitter to adapt its antenna array by steering the antenna elements to find a path in another direction. This may be done for obstacle avoidance.
  • the link is also used to transfer information corresponding to the content that is being transferred wirelessly (e.g., wireless video).
  • This information may be content protection information.
  • the link is used to transfer encryption keys and acknowledgements of encryption keys when the transceivers are transferring HDMI data.
  • the link transfers control information and content protection information.
  • FIG. 1 is a block diagram of one embodiment of a communication system.
  • the system comprises media receiver 100, a media receiver interface 102, a transmitting device 140, a receiving device 141, a media player interface 113, a media player 114 and a display 115.
  • Media receiver 100 receives content from a source (not shown).
  • media receiver 100 comprises a set top box.
  • the content may comprise baseband digital video, such as, for example, but not limited to, content adhering to the HDMl or DVI standards.
  • media receiver 100 may include a transmitter (e.g., an HDMI transmitter) to forward the received content.
  • Media receiver 100 sends content 101 to transmitter device 140 via media receiver interface 102.
  • media receiver interface 102 includes logic that converts content 101 into HDMI content.
  • media receiver interface 102 may comprise an HDMI plug and content 101 is sent via a wired connection; however, the transfer could occur through a wireless connection.
  • content 101 comprises DVI content.
  • the transfer of content 101 between media receiver interface 102 and transmitter device 140 occurs over a wired connection; however, the transfer could occur through a wireless connection.
  • Transmitter device 140 wirelessly transfers information to receiver device 141 using two wireless connections.
  • One of the wireless connections is through a phased array antenna with adaptive beamforming, also referred to as the High Rate (HR) channel.
  • the other wireless connection is via wireless communications channel 107, referred to herein as the Low Rate channel.
  • the HR and LR wireless communications are enabled through a MAC and a PHY (discussed in FIG.
  • Receiver device 141 transfers the content received from transmitter device 140 to media player 114 via media player interface 113.
  • the transfer of the content between receiver device 141 and media player interface 113 occurs through a wired connection; however, the transfer could occur through a wireless connection.
  • media player interface 113 comprises an HDMI plug. Similarly, the transfer of the content between media player interface 113 and media player 114 occurs through a wired connection; however, the transfer could occur through a wireless connection.
  • Media player 114 causes the content to be played on display 115.
  • the content is HDMI content and media player 114 transfers the media content to display 115 via a wired connection; however, the transfer could occur through a wireless connection.
  • Display 115 may comprise a plasma display, an LCD, a
  • the system in Figure 1 may be altered to include a DVD player/recorder to receive, play and/or record the content.
  • transmitter 140 and media receiver interface 102 are part of media receiver 100.
  • receiver device 141, media player interface 113, and media player 114 are all part of the same device.
  • receiver device 141, media player interface 113, media player 114, and display 115 are all part of the display. An example of such a device is shown in Figure 3.
  • transmitter device 140 comprises a processor 103, an optional baseband processing component 104, a phased array antenna 105, and a wireless communication channel interface 106.
  • Phased array antenna 105 comprises a radio frequency (RF) transmitter having a digitally controlled phased array antenna coupled to and controlled by processor 103 to transmit content to receiver device 141 using adaptive beamforming.
  • receiver device 141 comprises a processor 112, an optional baseband processing component 111, a phased array antenna 110, and a wireless communication channel interface 109.
  • Phased array antenna 110 comprises a radio frequency (RF) receiver having a digitally controlled phased array antenna coupled to and controlled by processor 112 to receive content from transmitter device 140 using adaptive beamforming.
  • processor 103 generates baseband signals that are processed by baseband signal processing 104 prior to being wirelessly transmitted by phased array antenna 105.
  • receiver device 141 includes baseband signal processing to convert analog signals received by phased array antenna 110 into baseband signals for processing by processor 112.
  • the baseband signals are orthogonal frequency division multiplex (OFDM) signals.
  • the baseband signals are single carrier phase, amplitude, or both phase and amplitude modulated signals.
  • transmitter device 140 and/or receiver device 141 are part of separate transceivers.
  • Transmitter device 140 and receiver device 141 perform wireless communication using phased array antenna with adaptive beamforming that allows beam steering. Beamforming is well known in the art.
  • processor 103 sends digital control information to phased array antenna 105 to indicate an amount to shift one or more phase shifters in phased array antenna 105 to steer a beam formed thereby in a manner well-known in the art.
  • Processor 112 uses digital control information as well to control phased array antenna 110. The digital control information is sent using control channel 121 in transmitter device 140 and control channel 122 in receiver device 141.
  • the digital control information comprises a set of coefficients.
  • each of processors 103 and 112 comprises a digital signal processor.
  • Wireless communication link interface 106 is coupled to processor 103 and provides an interface between wireless communication link 107 and processor 103 to communicate antenna information relating to the use of the phased array antenna and to communicate information to facilitate playing the content at another location.
  • the information transferred between transmitter device 140 and receiver device 141 to facilitate playing the content includes encryption keys sent from processor 103 to processor 112 of receiver device 141 and one or more acknowledgments from processor 112 of receiver device 141 to processor 103 of transmitter device 140.
  • Wireless communication link 107 also transfers antenna information between transmitter device 140 and receiver device 141. During initialization of the phased array antennas 105 and 110, wireless communication link 107 transfers information to enable processor 103 to select a direction for the phased array antenna 105.
  • the information includes, but is not limited to, antenna location information and performance information corresponding to the antenna location, such as one or more pairs of data that include the position of phased array antenna 110 and the signal strength of the channel for that antenna position.
  • the information includes, but is not limited to, information sent by processor 112 to processor 103 to enable processor 103 to determine which portions of phased array antenna 105 to use to transfer content.
  • wireless communication link 107 transfers an indication of the status of the communication path from the processor 112 of receiver device 141.
  • the indication of the status of communication comprises an indication from processor 112 that prompts processor 103 to steer the beam in another direction (e.g., to another channel). Such prompting may occur in response to interference with transmission of portions of the content.
  • the information may specify one or more alternative channels that processor 103 may use.
  • the antenna information comprises information sent by processor 112 to specify a location to which receiver device 141 is to direct phased array antenna 110.
  • transmitter device 140 is telling receiver device 141 where to position its antenna so that signal quality measurements can be made to identify the best channels.
  • the position specified may be an exact location or may be a relative location such as, for example, the next location in a predetermined location order being followed by transmitter device 140 and receiver device 141.
  • wireless communications link 107 transfers information from receiver device 141 to transmitter device 140 specifying antenna characteristics of phased array antenna 110, or vice versa.
  • FIG. 2 illustrates one embodiment of a communication device 200.
  • the communication device 200 includes data storage 202, an Audio/Video (AV) processor 204, a media access controller (MAC) 206, a physical device interface (PHY) 208, and a radio module 210.
  • Data storage 202 may store any types of data.
  • data storage 202 may store audio and video data as well as other types of data.
  • AV processor 204 receives and processes data from data storage 202.
  • MAC 206 handles generating and parsing physical frames.
  • PHY 208 handles how this data is actually moved to/from the radio module 210.
  • The WirelessHD specification supports two basic types of PHY: a high rate PHY (HRP) and a low rate PHY (LRP).
  • HRP supports multi-Gbps data rates.
  • HRP may operate in a directional mode (typically beam-formed mode).
  • HRP may be used to transmit audio, video, data, and control messages.
  • HRP occupies roughly 1.7 GHz of bandwidth.
  • LRP supports multi-Mbps data rates.
  • LRP may operate in directional, omni-directional, or beam-formed modes.
  • LRP may be used to transmit control messages, beacons, and acknowledgements.
  • LRP may further be used to transmit audio or compressed video.
  • LRP may further be used to transmit low-speed data.
  • LRP occupies one of three
  • Unequal Error Protection means using different PHY coding schemes to protect the most significant bits (msb) and least significant bits (lsb) of the data portion separately. Having separate msb/lsb CRCs for video pixel data allows the msb and lsb data portions to be independently checked and used by the receiver.
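The separate msb/lsb CRC idea above can be sketched as follows. This is an illustrative sketch only: the CRC-16/CCITT polynomial and the nibble-wise split are assumptions chosen for demonstration, not values taken from the WirelessHD specification.

```python
def crc16_ccitt(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
    """Bit-by-bit CRC-16/CCITT over a byte sequence (assumed polynomial)."""
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF
    return crc

def split_msb_lsb(data: bytes) -> tuple[bytes, bytes]:
    """Split each byte into its upper (msb) and lower (lsb) nibble streams."""
    msbs = bytes((b >> 4) & 0x0F for b in data)
    lsbs = bytes(b & 0x0F for b in data)
    return msbs, lsbs

def uep_crcs(data: bytes) -> tuple[int, int]:
    """Return (msb CRC, lsb CRC) so each half can be checked independently."""
    msbs, lsbs = split_msb_lsb(data)
    return crc16_ccitt(msbs), crc16_ccitt(lsbs)
```

With this split, a bit error confined to the low nibbles invalidates only the lsb CRC, so a receiver can still trust and display the msb portion of the pixels.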
  • FIG. 3 illustrates an example of a composite packet format with data regions separately encoded.
  • (H, V) denotes the horizontal/vertical coordinates of each pixel in a video frame, and it uniquely identifies any location in a particular video frame.
  • the main advantage of using (H, V) instead of a sequence number is that it allows more robust video data re-assembly on the receiver side, and it also saves the redundant descriptor field that a sequence number would require.
  • Sequence numbers are thus not required in the header for video subpackets since H, V, frame number, and partition are sufficient to determine where data should go.
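As an illustration of why (H, V) addressing is robust, a receiver can write each sub-packet's payload directly at its frame coordinates, so a lost sub-packet leaves a bounded hole instead of shifting the placement of later data. The one-byte-per-pixel toy frame and function name below are hypothetical.

```python
WIDTH, HEIGHT = 8, 4                      # toy frame, one byte per pixel

def place_subpacket(frame: bytearray, h: int, v: int, payload: bytes) -> None:
    """Write payload into the frame starting at column h of row v,
    continuing in raster order."""
    pos = v * WIDTH + h
    frame[pos:pos + len(payload)] = payload

frame = bytearray(WIDTH * HEIGHT)
place_subpacket(frame, 2, 1, b"\xAA\xBB\xCC")   # lands at (2,1)..(4,1)
# A lost sub-packet simply leaves zeros in its own region; nothing shifts.
```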
  • FIGS. 4 and 14 illustrate an example of Video Header Optimization.
  • Header fields for video sections are important for decoding video. Bit errors in video pixel data can be tolerated in some cases, but in others (i.e., the H and V location) they cannot. Thus, video headers are placed in a separate "video header" section. Audio, control, and data payloads, however, are as sensitive to bit errors as their headers, so separate protection and coding for the two is not needed and the header information can be placed with the data information for these types.
  • the sizes of the video sub-packets are fixed in length in bytes for the duration of a given video stream.
  • the duration in time of the video sub-packets is the same for the duration of a given video stream.
  • the number of video bytes in each sub-packet scales linearly with the data rate, so that a video sub-packet using a first MCS delivering twice the data rate of a second MCS contains twice as many video bytes as a sub-packet using the second MCS.
  • incoming video pixel data has to be "packetized" and sent out separately.
  • sub-packet sizes respect video pixel boundaries to avoid needing a color component offset, thus simplifying implementations. If the size of the video sub-packet did not obey pixel boundaries, additional information would be required to indicate how much data at the end of the sub-packet has been sent (a "pixel offset"). It would also create complexity for interoperable operation.
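A toy calculation of the linear-scaling rule above: with the sub-packet duration fixed in time, the byte count follows the MCS data rate. The rates and duration used here are made-up numbers, not values from the specification.

```python
def subpacket_bytes(data_rate_bps: int, duration_us: int) -> int:
    """Bytes per sub-packet for a fixed duration (microseconds) at a rate."""
    return data_rate_bps * duration_us // (8 * 1_000_000)

slow = subpacket_bytes(1_000_000_000, 10)   # hypothetical 1 Gbps MCS
fast = subpacket_bytes(2_000_000_000, 10)   # MCS at twice the data rate
# Doubling the rate doubles the video bytes in the sub-packet: 1250 -> 2500.
```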
  • VIDEO PIXEL PACKING FOR DEEP COLOR FOR UEP SUPPORT [0056]
  • Unequal Error Protection (UEP) divides the data bytes or other transfer units into groups of bits.
  • UEP divides bytes into two groups: most significant bits (msb) and least significant bits (lsb). This can allow direct mapping if the transfer units are the same size as the pixel components, as in the case of UEP over byte boundaries coding 24-bit-per-pixel RGB video.
  • video pixel data has several different variations (16/20/24/30/36-bits per pixel) and thus such direct alignment is not possible.
  • One embodiment cycles the pixel component bits in the same ratio as the msb / lsb groups to pack deep color (> 8 bits per component) pixels into UEP blocks.
  • One embodiment uses similar techniques to pack the pixels into two different CRCs covering the lsb and msb groups. This can be used with the UEP techniques or independently.
  • FIGS. 5 and 6 illustrate a first embodiment of a deep color format.
  • the UEP splits the data stream into 4 most significant bits (msb) and 4 least significant bits (lsb) and the 4 lsb and 4 msb are used to generate an lsb CRC and msb CRC.
  • Bit packing is required to pack non-byte-aligned pixel components into a byte stream.
  • MAC and PHY see standard byte streams.
  • Support of separate lsb/msb CRCs requires separate packing of the upper and lower pixel component halves across the upper and lower nibbles of bytes. For example, 30-bit mode packs 4-pixel groups into 5-byte chunks as illustrated in FIG. 5.
  • FIG. 6 illustrates how 36-bit mode packs 2-pixel groups into 3-byte chunks. Dividing of pixels into partitions occurs prior to packing each partition's pixels into bytes.
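The 30-bit packing of the first embodiment might be sketched as below: four 10-bit colour components (40 bits) fill exactly five bytes, with each component's upper five bits spread across the bytes' upper nibbles and its lower five bits across the lower nibbles, so that nibble-wise msb/lsb CRCs track pixel bit significance. The exact bit ordering here is an assumption; FIG. 5 defines the actual layout.

```python
def pack_30bit(components: list[int]) -> bytes:
    """Pack four 10-bit components into 5 bytes: msb halves across the
    upper nibbles, lsb halves across the lower nibbles (assumed order)."""
    assert len(components) == 4 and all(0 <= c < 1024 for c in components)
    msb_bits = 0          # 4 x 5 = 20 bits of upper halves
    lsb_bits = 0          # 4 x 5 = 20 bits of lower halves
    for c in components:
        msb_bits = (msb_bits << 5) | (c >> 5)
        lsb_bits = (lsb_bits << 5) | (c & 0x1F)
    out = bytearray(5)
    for i in range(5):    # 4 bits of each half land in each byte's nibbles
        shift = 4 * (4 - i)
        hi = (msb_bits >> shift) & 0xF
        lo = (lsb_bits >> shift) & 0xF
        out[i] = (hi << 4) | lo
    return bytes(out)
```

Because every upper nibble carries only msb-half bits, a CRC over upper nibbles protects exactly the perceptually important component halves.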
  • FIGS. 7 and 8 illustrate a second embodiment of a deep color format.
  • Bit packing is required to pack non-byte-aligned pixel components into a byte stream.
  • MAC and PHY see standard byte streams.
  • lsb/msb CRCs are defined over the byte stream between the MAC and PHY, which simplifies the design significantly because they are not tied to specific bit locations in the original pixel data. For example, 30-bit mode packs 4-pixel groups into 5-byte chunks as illustrated in FIG. 7.
  • FIG. 8 illustrates how 36-bit mode packs 2-pixel groups into 3-byte chunks. Dividing of pixels into partitions occurs prior to packing each partition's pixels into bytes.
  • the main concept of multiple partitions is to separate adjacent pixel data into different video sub-packets, so that in case of packet loss the missing pixels can be reconstructed.
  • when the video format is YCbCr and multiple partitions are in place, re-ordering the color elements of each pixel's data (due to the different data rates of the color elements) achieves optimal reconstruction on the receiver side.
  • FIG. 9 illustrates a first embodiment of a first pixel partition.
  • The first bit in the sub-packet is always Y, bit 0 from pixel 0.
  • Each color element switches on a byte boundary.
  • Cb and Cr alternate in order (Y0, Cb0, Y1, Cr0, Y2, Cr2, Y3, Cb2,
  • the ordering is from horizontal pixel 0 and independent of row. This is needed for proper operation with 2x2 partition mode.
  • FIG. 10 illustrates a second embodiment of a first pixel partition.
  • The first bit in the sub-packet is always Y, bit 0 from pixel 0.
  • Each color element switches on a byte boundary.
  • Cb and Cr alternate in order (Y0, Cb0, Y1, Cr0, Y2, Cr2, Y3,
  • FIG. 11 illustrates YCbCr 4:2:2, 4 partitions.
  • FIG. 12 illustrates YCbCr 4:2:2, 4 partitions, Deep Color mode in accordance with a first embodiment.
  • Partition 0 may include {Y0} {Cb0} {Y0/Y2} {Cb0/Cr2} {Y2/Y4} {Cr2/Cr4} {Y4/Y6} {Cr4/Cb6} {Y6}.
  • Partition 1 may include {Y1} {Cr0} {Y1/Y3} {Cr0/Cb2} {Y3/Y5} {Cb2/Cb4} {Y5/Y7} {Cb4/Cr6} {Y7}. This scheme balances Cb and Cr pixels across each partition.
  • FIG. 13 illustrates YCbCr 4:2:2, 4 partitions, Deep Color mode in accordance with a second embodiment.
  • Partition 0 may include {Y0} {Cb0} {Y2} {Cr2} {Y4} {Cr4} {Y6} {Cb6} {Y8} {Cb8} {Y10} {Cr10}.
  • Partition 1 may include {Y1} {Cr0} {Y3} {Cb2} {Y5} {Cb4} {Y7} {Cr6} {Y9} {Cr8} {Y11} {Cb10}. This scheme balances Cb and Cr pixels across each partition.
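A toy sketch of the multiple-partition idea above: pixels are dealt round-robin into partitions, so a lost partition leaves only isolated holes whose intact neighbours can be used for interpolation. The Cb/Cr re-ordering described above is omitted here, and the function name is hypothetical.

```python
def partition_pixels(pixels: list, n_partitions: int) -> list[list]:
    """Deal adjacent pixels round-robin into n_partitions sub-packets."""
    return [pixels[i::n_partitions] for i in range(n_partitions)]

row = list(range(8))                      # pixel indices of one toy row
parts = partition_pixels(row, 4)          # [[0, 4], [1, 5], [2, 6], [3, 7]]
# If the sub-packet carrying partition 0 is lost, pixels 0 and 4 are
# missing, but their neighbours 1, 3, 5 all arrived in other partitions.
```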
  • Synchronization between audio and video can be achieved by controlling when the audio and video streams play out. This additionally can be used for buffer and link management.
  • a "playback timestamp" can be used to specify when a particular sample or section of audio or video is played back.
  • the four video sub-packet headers could each include a "playback timestamp" to indicate when they should be played back.
  • however, timestamps can take many bits (30 in one embodiment), and header bits are scarce.
  • only one playback timestamp is specified in the video header and two additional "playback select" bits indicate which of the four video sub-packets the playback timestamp applies to.
  • the playback timestamp for any of the four video sub-packets can be communicated. Since the playback timestamp only needs to be communicated once per video frame, this is typically sufficient.
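A minimal sketch of the header fields described above: a single 30-bit playback timestamp plus a 2-bit "playback select" field naming which of the four video sub-packets the timestamp applies to. The 30-bit width comes from the text; the exact bit layout is an assumption.

```python
TS_BITS = 30                              # timestamp width per the text

def pack_playback(timestamp: int, select: int) -> int:
    """Combine a 30-bit timestamp and a 2-bit sub-packet selector."""
    assert 0 <= timestamp < (1 << TS_BITS) and 0 <= select < 4
    return (select << TS_BITS) | timestamp

def unpack_playback(field: int) -> tuple[int, int]:
    """Return (timestamp, select) from the packed 32-bit header field."""
    return field & ((1 << TS_BITS) - 1), (field >> TS_BITS) & 0x3
```

Since the timestamp only needs to be communicated once per video frame, rotating the select field across frames covers all four sub-packets over time.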
  • the present invention also relates to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer).
  • a machine-readable medium includes read only memory ("ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.


Abstract

An audio/video (AV) processor is coupled to a media access controller (MAC) to generate a composite packet having an optimized format for carrying audio, video, and data traffic with fields in a header of the composite packet specifying video-specific information. A physical device interface (PHY) is coupled to the MAC. The PHY encodes and decodes between a digital signal and a modulated analog signal. The PHY comprises a high rate physical layer circuit (HRP) and a low rate physical layer circuit (LRP). A radio frequency (RF) transmitter is coupled to the PHY to transmit data.

Description

WIRELESS HD AV PACKET FORMAT
RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application No.
60/856,104 filed 11/01/2006, U.S. Provisional Application No. 60/873,759 filed 12/08/2006, U.S. Provisional Application No. 60/901,388 filed 02/14/2007, U.S. Provisional Application No. 60/901,384 filed 02/14/2007, U.S. Provisional Application No. 60/920,338 filed 03/26/2007, U.S. Provisional Application No. 60/920,266 filed 03/26/2007, U.S. Provisional Application No. 60/920,357 filed 03/26/2007.
FIELD OF THE INVENTION
[0002] The present invention relates to the field of wireless communication; more particularly, the present invention relates to a wireless communication device.
BACKGROUND OF THE INVENTION
[0003] In 1998, the Digital Display Working Group (DDWG) was formed to create a universal interface standard between computers and displays to replace the analog VGA connection standard. The resulting standard was the Digital Visual Interface (DVI) specification, released in April 1999. There are a number of content protection schemes available. For example, HDCP and DTCP are well-known content protection schemes. HDCP was proposed as a security component for DVI and was designed for digital video monitor interfaces.
[0004] HDMI is a connection interface standard that was developed to meet the explosive demand for high-definition audio and video. HDMI is capable of carrying video and audio and is backward-compatible with DVI (which carries only video signals). The key advantage of DVI and HDMI is that both of them are capable of transmitting uncompressed high-definition digital streams via a single cable.
[0005] HDCP is a system for protecting content being transferred over DVI and HDMI from being copied. See HDCP 1.0 for details. HDCP provides authentication, encryption, and revocation. Specialized circuitry in the playback device and in the display monitor encrypts video data before it is sent over. With HDCP, content is encrypted immediately before (or inside) the DVI or HDMI transmitter chip and decrypted immediately after (or inside) the DVI or HDMI receiver chip.
[0006] In addition to the encryption and decryption functions, HDCP implements authentication to verify that the receiving device (e.g., a display, a television, etc.) is licensed to receive encrypted content. Re-authentication occurs approximately every two seconds to continuously confirm the security of the DVI or HDMI interface. If, at any time, re-authentication does not occur, for example by disconnecting a device and/or connecting an illegal recording device, the source device (e.g., a DVD player, a set-top box, etc.) ends transmission of encrypted content.
[0007] While discussions of HDMI and DVI are generally focused on wired communication, the use of wireless communication to transmit content has become more prevalent every day. While much of the current focus is on cellular technologies and wireless networks, there has been a growing interest in the unlicensed spectrum around 60 GHz for wireless video transmission or very high-speed networking. More specifically, seven GHz of contiguous bandwidth has been opened for unlicensed use at millimeter-wave frequencies around 60 GHz in the U.S. and Japan.
SUMMARY OF THE INVENTION
[0008] A media access controller (MAC) generates a composite packet having an optimized format for carrying audio, video, and data traffic. A physical device interface (PHY) is coupled to the MAC; the PHY encodes and decodes between a digital signal and a modulated analog signal. The PHY comprises a high rate physical layer circuit (HRP) and a low rate physical layer circuit (LRP). A radio frequency (RF) transmitter is coupled to the PHY to transmit data.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which, however, should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
[0010] FIG. 1 is a block diagram of one embodiment of a communication system.
[0011] FIG. 2 is a block diagram of one embodiment of a communication device.
[0012] FIG. 3 is a block diagram of one embodiment of a packet format of a PHY mode segmentation.
[0013] FIG. 4 is a block diagram of an example of a packet of a PHY mode segmentation.
[0014] FIG. 5 is a block diagram of a first embodiment of deep color pixel packing.
[0015] FIG. 6 is a block diagram of a first embodiment of deep color pixel packing.
[0016] FIG. 7 is a block diagram of a second embodiment of deep color pixel packing.
[0017] FIG. 8 is a block diagram of a second embodiment of deep color pixel packing.
[0018] FIG. 9 is a table of a first embodiment of a video sub-packet.
[0019] FIG. 10 is a table of a second embodiment of a video sub-packet.
[0020] FIG. 11 is a table illustrating an example of multiple-partitions.
[0021] FIG. 12 is a table illustrating an example of deep color multiple-partitions in accordance with a first embodiment.
[0022] FIG. 13 is a table illustrating an example of deep color multiple-partitions in accordance with a second embodiment.
[0023] FIG. 14 is a block diagram of one embodiment of a packet format of a PHY mode segmentation.
DETAILED DESCRIPTION OF THE PRESENT INVENTION
[0024] An apparatus and method for wireless communication is disclosed. In one embodiment, the wireless communication occurs using a wireless transceiver with or without an adaptive beamforming antenna. As would be apparent to one skilled in the art, the wireless communication could occur with a wireless receiver or transmitter.
[0025] A media access controller (MAC) generates a composite packet having an optimized format for carrying audio, video, and data traffic. A physical device interface (PHY) is coupled to the MAC; the PHY encodes and decodes between a digital signal and a modulated analog signal. The PHY comprises a high rate physical layer circuit (HRP) and a low rate physical layer circuit (LRP). A radio frequency (RF) transmitter is coupled to the PHY to transmit data.
[0026] In one embodiment, the wireless communication includes an additional link, or channel, for transmitting information between a transmitter and a receiver. The link may be uni-directional or bi-directional. In one embodiment, the channel is used to send antenna information back from a receiver to a transmitter to enable the transmitter to adapt its antenna array by steering the antenna elements to find a path in another direction, for example, to avoid an obstacle.
[0027] In one embodiment, the link is also used to transfer information corresponding to the content that is being transferred wirelessly (e.g., wireless video). This information may be content protection information. For example, in one embodiment, the link is used to transfer encryption keys and acknowledgements of encryption keys when the transceivers are transferring HDMI data. Thus, in one embodiment, the link transfers control information and content protection information.
[0028] FIG. 1 is a block diagram of one embodiment of a communication system. Referring to Figure 1, the system comprises media receiver 100, a media receiver interface 102, a transmitting device 140, a receiving device 141, a media player interface 113, a media player 114, and a display 115.
[0029] Media receiver 100 receives content from a source (not shown). In one embodiment, media receiver 100 comprises a set top box. The content may comprise baseband digital video, such as, for example, but not limited to, content adhering to the HDMI or DVI standards. In such a case, media receiver 100 may include a transmitter (e.g., an HDMI transmitter) to forward the received content.
[0030] Media receiver 100 sends content 101 to transmitter device 140 via media receiver interface 102. In one embodiment, media receiver interface 102 includes logic that converts content 101 into HDMI content. In such a case, media receiver interface 102 may comprise an HDMI plug. In another embodiment, content 101 comprises DVI content.
[0031] In one embodiment, the transfer of content 101 between media receiver interface 102 and transmitter device 140 occurs over a wired connection; however, the transfer could occur through a wireless connection.
[0032] Transmitter device 140 wirelessly transfers information to receiver device 141 using two wireless connections. One of the wireless connections is through a phased array antenna with adaptive beamforming, also referred to herein as the High Rate (HR) channel. The other wireless connection is via wireless communications channel 107, referred to herein as the Low Rate (LR) channel. In one embodiment, the HR and LR wireless communications are enabled through a MAC and a PHY (discussed in FIG. 2).
[0033] Receiver device 141 transfers the content received from transmitter device 140 to media player 114 via media player interface 113. In one embodiment, the transfer of the content between receiver device 141 and media player interface 113 occurs through a wired connection; however, the transfer could occur through a wireless connection. In one embodiment, media player interface 113 comprises an HDMI plug. Similarly, the transfer of the content between media player interface 113 and media player 114 occurs through a wired connection; however, the transfer could occur through a wireless connection.
[0034] Media player 114 causes the content to be played on display 115. In one embodiment, the content is HDMI content and media player 114 transfers the media content to display 115 via a wired connection; however, the transfer could occur through a wireless connection. Display 115 may comprise a plasma display, an LCD, a CRT, etc.
[0035] Note that the system in Figure 1 may be altered to include a DVD player/recorder in place of the media player to receive, and play and/or record the content.
[0036] In one embodiment, transmitter 140 and media receiver interface 102 are part of media receiver 100. Similarly, in one embodiment, receiver 141, media player interface 113, and media player 114 are all part of the same device. In an alternative embodiment, receiver 141, media player interface 113, media player 114, and display 115 are all part of the display. An example of such a device is shown in Figure 3.
[0037] In one embodiment, transmitter device 140 comprises a processor 103, an optional baseband processing component 104, a phased array antenna 105, and a wireless communication channel interface 106. Phased array antenna 105 comprises a radio frequency (RF) transmitter having a digitally controlled phased array antenna coupled to and controlled by processor 103 to transmit content to receiver device 141 using adaptive beamforming.
[0038] In one embodiment, receiver device 141 comprises a processor 112, an optional baseband processing component 111, a phased array antenna 110, and a wireless communication channel interface 109. Phased array antenna 110 comprises a radio frequency (RF) transmitter having a digitally controlled phased array antenna coupled to and controlled by processor 112 to receive content from transmitter device 140 using adaptive beamforming.
[0039] In one embodiment, processor 103 generates baseband signals that are processed by baseband signal processing 104 prior to being wirelessly transmitted by phased array antenna 105. In such a case, receiver device 141 includes baseband signal processing to convert analog signals received by phased array antenna 110 into baseband signals for processing by processor 112. In one embodiment, the baseband signals are orthogonal frequency division multiplex (OFDM) signals. In one embodiment, the baseband signals are single carrier phase, amplitude, or both phase and amplitude modulated signals.
[0040] In one embodiment, transmitter device 140 and/or receiver device 141 are part of separate transceivers.
[0041] Transmitter device 140 and receiver device 141 perform wireless communication using a phased array antenna with adaptive beamforming that allows beam steering. Beamforming is well known in the art. In one embodiment, processor 103 sends digital control information to phased array antenna 105 to indicate an amount to shift one or more phase shifters in phased array antenna 105 to steer a beam formed thereby in a manner well-known in the art. Processor 112 uses digital control information as well to control phased array antenna 110. The digital control information is sent using control channel 121 in transmitter device 140 and control channel 122 in receiver device 141. In one embodiment, the digital control information comprises a set of coefficients. In one embodiment, each of processors 103 and 112 comprises a digital signal processor.
[0042] Wireless communication link interface 106 is coupled to processor 103 and provides an interface between wireless communication link 107 and processor 103 to communicate antenna information relating to the use of the phased array antenna and to communicate information to facilitate playing the content at another location. In one embodiment, the information transferred between transmitter device 140 and receiver device 141 to facilitate playing the content includes encryption keys sent from processor 103 to processor 112 of receiver device 141 and one or more acknowledgments from processor 112 of receiver device 141 to processor 103 of transmitter device 140.
[0043] Wireless communication link 107 also transfers antenna information between transmitter device 140 and receiver device 141. During initialization of the phased array antennas 105 and 110, wireless communication link 107 transfers information to enable processor 103 to select a direction for the phased array antenna 105. In one embodiment, the information includes, but is not limited to, antenna location information and performance information corresponding to the antenna location, such as one or more pairs of data that include the position of phased array antenna 110 and the signal strength of the channel for that antenna position. In another embodiment, the information includes, but is not limited to, information sent by processor 112 to processor 103 to enable processor 103 to determine which portions of phased array antenna 105 to use to transfer content.
[0044] When the phased array antennas 105 and 110 are operating in a mode during which they may transfer content (e.g., HDMI content), wireless communication link 107 transfers an indication of the status of the communication path from the processor 112 of receiver device 141. The indication of the status of communication comprises an indication from processor 112 that prompts processor 103 to steer the beam in another direction (e.g., to another channel). Such prompting may occur in response to interference with transmission of portions of the content. The information may specify one or more alternative channels that processor 103 may use.
[0045] In one embodiment, the antenna information comprises information sent by processor 112 to specify a location to which receiver device 141 is to direct phased array antenna 110. This may be useful during initialization when transmitter device 140 is telling receiver device 141 where to position its antenna so that signal quality measurements can be made to identify the best channels.
The position specified may be an exact location or may be a relative location such as, for example, the next location in a predetermined location order being followed by transmitter device 140 and receiver device 141.
[0046] In one embodiment, wireless communications link 107 transfers information from receiver device 141 to transmitter device 140 specifying antenna characteristics of phased array antenna 110, or vice versa.
[0047] FIG. 2 illustrates one embodiment of a communication device 200.
The communication device 200 includes data storage 202, an Audio/Video (AV) processor 204, a media access controller (MAC) 206, a physical device interface (PHY) 208, and a radio module 210. Data storage 202 may store any type of data. For example, data storage 202 may store audio and video data as well as other types of data. AV processor 204 receives and processes data from data storage 202. MAC 206 handles generating and parsing physical frames. PHY 208 handles how this data is actually moved to/from the radio module 210. The Wireless HD specification supports two basic types of PHY: high rate PHY (HRP) and low rate PHY (LRP).
[0048] In accordance with one embodiment, HRP supports multi-Gbps data rates. HRP may operate in a directional mode (typically a beam-formed mode). HRP may be used to transmit audio, video, data, and control messages. In one embodiment, HRP occupies roughly 1.7 GHz of bandwidth.
[0049] In accordance with one embodiment, LRP supports multi-Mbps data rates. LRP may operate in directional, omni-directional, or beam-formed modes. In one embodiment, LRP may be used to transmit control messages, beacons, and acknowledgements. In an alternative embodiment, LRP may further be used to transmit audio or compressed video. In yet another embodiment, LRP may further be used to transmit low-speed data. In one embodiment, LRP occupies one of three 91 MHz sub-channels within the HRP channel, as discussed below.
SEPARATION OF VIDEO PIXEL DATA FOR UEP SUPPORT
[0050] Unequal Error Protection (UEP) means using different PHY coding schemes to protect the most significant bits (msb) and least significant bits (lsb) of the data portion separately. Having separate msb/lsb CRCs for video pixel data allows the msb and lsb data portions to be independently checked and used by the receiver. FIG. 3 illustrates an example of a composite packet format with data regions separately encoded.
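As a concrete illustration of the independent msb/lsb checking described in paragraph [0050], the following sketch splits each payload byte into its two nibble streams and checksums each stream separately. CRC-32 is used here only as a stand-in; the actual CRC polynomial and field layout are defined by the specification, not by this example.

```python
import zlib

def split_msb_lsb(payload: bytes):
    """Split every payload byte into its upper (msb) and lower (lsb)
    nibble streams and protect each stream with its own checksum, so a
    receiver can still accept the msb half even when only the lsb half
    arrives corrupted."""
    msb_stream = bytes((b >> 4) & 0x0F for b in payload)
    lsb_stream = bytes(b & 0x0F for b in payload)
    return (msb_stream, zlib.crc32(msb_stream)), (lsb_stream, zlib.crc32(lsb_stream))

# Example: 0xA5 splits into msb nibble 0x0A and lsb nibble 0x05.
(msb, msb_crc), (lsb, lsb_crc) = split_msb_lsb(bytes([0xA5, 0x3C]))
assert msb == bytes([0x0A, 0x03])
assert lsb == bytes([0x05, 0x0C])
```

A receiver that gets the msb half intact but a corrupted lsb half sees only the lsb CRC check fail and can still render the most significant bits of each pixel.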
STATEFULL COORDINATES INSTEAD OF SEQUENCE NUMBER
[0051] (H, V) means the horizontal/vertical coordinates of each pixel in a video frame; the pair uniquely identifies any location in a particular video frame. The main advantage of using (H, V) instead of a sequence number is that it allows for more robust video data re-assembly on the receiver side, and it also saves the redundant descriptor field that a sequence number would require.
[0052] Sequence numbers are thus not required in the header for video subpackets since H, V, frame number, and partition are sufficient to determine where data should go.
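A minimal sketch of coordinate-based re-assembly, assuming a flat single-partition frame buffer (the helper name and buffer layout are illustrative, not taken from the specification):

```python
def place_subpacket(frame, h, v, pixels, frame_width):
    """Write a received run of pixel values into the frame buffer at the
    absolute position named by the (H, V) header coordinates. Because the
    position is absolute, sub-packets can arrive out of order or be lost
    without shifting the placement of later sub-packets, which is the
    failure mode a lost sequence number would cause."""
    start = v * frame_width + h
    frame[start:start + len(pixels)] = pixels
    return frame

# Sub-packets applied out of order still land in the right place.
frame = [0] * 8                          # a toy 4x2 frame
place_subpacket(frame, 0, 1, [7, 7], 4)  # second row arrives first
place_subpacket(frame, 2, 0, [5, 5], 4)
assert frame == [0, 0, 5, 5, 7, 7, 0, 0]
```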
[0053] FIGS. 4 and 14 illustrate an example of video header optimization. Header fields for video sections are important for decoding video. Bit errors in the video data itself can be tolerated in some cases, but bit errors in other fields (e.g., the H and V location) cannot. Thus, video headers are placed in a separate "video header" section. Audio, control, and data payloads, however, are as sensitive to bit errors as their headers are; since the non-header information for these types is of similar sensitivity to the header information, separate protection and coding for the two is not needed and the header information can be placed with the data information for these types.
VIDEO SUB-PACKET SIZE SCALING WITH DIFFERENT PHY CODING RATE
[0054] While allowing video sub-packets of any size at any time results in the greatest flexibility, in one embodiment the sizes of the video sub-packets are fixed in length in bytes for the duration of a given video stream. One benefit of this is to reduce encoder and/or decoder implementation complexity. In another embodiment, the duration of the video sub-packets in time is the same for the duration of a given video stream. In yet another embodiment, the number of video bytes in each sub-packet scales linearly with the data rate, so that a video sub-packet using a first MCS that delivers twice the data rate of a second MCS carries twice as many video bytes as a video sub-packet using the second MCS.
VIDEO SUB-PACKET SIZE WITH CLEAN PIXEL BOUNDARY
[0055] Each video pixel is composed of different color elements (such as Red, Green, Blue or Y, Cb, Cr). For wireless video streaming, incoming video pixel data has to be "packetized" and sent out separately. In one embodiment, sub-packet sizes respect video pixel boundaries to avoid needing a color component offset, thus simplifying implementations. If the size of the video sub-packet did not obey pixel boundaries, additional information would be required to indicate how much data at the end of the sub-packet belongs to a partially transmitted pixel (a "pixel offset"). It also creates complexity for interoperable operation.
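A sketch of clean-boundary sub-packet sizing under these constraints. The helper name and the 2-pixel grouping for 4:2:2 (two horizontal pixels, Y0 Cb0 Y1 Cr0, share chroma samples) are assumptions for illustration:

```python
def subpacket_payload_bytes(target_bytes, bits_per_pixel, pixels_per_group=2):
    """Round a target sub-packet payload size down to a whole number of
    pixel groups so the packet always ends on a clean pixel boundary and
    no 'pixel offset' field is needed in the header."""
    group_bits = bits_per_pixel * pixels_per_group
    assert group_bits % 8 == 0, "a pixel group must also be a whole number of bytes"
    group_bytes = group_bits // 8
    return (target_bytes // group_bytes) * group_bytes

# 16 bits/pixel 4:2:2: a 2-pixel group is 4 bytes, so 1000 stays 1000.
assert subpacket_payload_bytes(1000, 16) == 1000
# 20 bits/pixel: a 2-pixel group is 5 bytes, so 1001 rounds down to 1000.
assert subpacket_payload_bytes(1001, 20) == 1000
```

The same helper also illustrates the MCS scaling rule of paragraph [0054]: doubling the target byte budget for a doubled data rate yields a payload with twice as many whole pixel groups.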
VIDEO PIXEL PACKING FOR DEEP COLOR FOR UEP SUPPORT
[0056] Unequal Error Protection (UEP) divides the data bytes or other transfer units into groups of bits. In one embodiment, UEP divides bytes into two groups - most significant bits (msb) and least significant bits (lsb). This allows direct mapping when the transfer units are the same size as the pixel components, as in the case of UEP over byte boundaries coding 24-bit-per-pixel RGB video. However, video pixel data has several different variations (16/20/24/30/36 bits per pixel), so such direct alignment is not always possible. One embodiment cycles the pixel component bits in the same ratio as the msb/lsb groups to pack deep color (> 8 bits per component) pixels into UEP blocks.
[0057] One embodiment uses similar techniques to pack the pixels into two different CRCs covering the lsb and msb groups. This can be used with the UEP techniques or independently.
[0058] FIGS. 5 and 6 illustrate a first embodiment of a deep color format. In this embodiment, UEP splits the data stream into the 4 most significant bits (msb) and 4 least significant bits (lsb) of each byte, and the 4 lsb and 4 msb are used to generate an lsb CRC and an msb CRC. Bit packing is required to pack non-byte pixel components into a byte stream. After packing, the MAC and PHY see standard byte streams. Support of separate lsb/msb CRCs requires separate packing of the upper and lower pixel component halves across the upper and lower nibbles of bytes. For example, 30-bit mode packs 4-pixel groups into 5-byte chunks as illustrated in FIG. 5, and FIG. 6 illustrates 36-bit mode packing 2-pixel groups into 3-byte chunks. Dividing of pixels into partitions occurs prior to packing each partition's pixels into bytes.
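One plausible reading of this nibble-aligned packing, treating a group of four 10-bit samples as the 5-byte unit: the upper 5 bits of each sample go only into upper nibbles and the lower 5 bits only into lower nibbles, so a nibble-based msb/lsb split (for UEP coding or CRCs) covers exactly the upper or lower component halves. The exact bit order within each nibble stream is an assumption; FIG. 5 defines the normative layout.

```python
def pack_four_10bit_samples(samples):
    """Pack four 10-bit samples into 5 bytes with upper component halves
    confined to upper nibbles and lower halves to lower nibbles."""
    assert len(samples) == 4 and all(0 <= s < 1024 for s in samples)
    hi_bits = 0  # 20 bits: concatenated upper halves of the four samples
    lo_bits = 0  # 20 bits: concatenated lower halves
    for s in samples:
        hi_bits = (hi_bits << 5) | (s >> 5)
        lo_bits = (lo_bits << 5) | (s & 0x1F)
    out = bytearray(5)
    for i in range(5):
        shift = 4 * (4 - i)
        out[i] = (((hi_bits >> shift) & 0xF) << 4) | ((lo_bits >> shift) & 0xF)
    return bytes(out)

# Samples with only lower-half bits set never touch an upper nibble.
assert pack_four_10bit_samples([0x1F] * 4) == bytes([0x0F] * 5)
```

The property checked by the assertion is the point of the scheme: corrupting only lower nibbles on the wire can never damage the msb halves of the reconstructed samples.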
[0059] FIGS. 7 and 8 illustrate a second embodiment of a deep color format. Bit packing is required to pack non-byte pixel components into a byte stream. After packing, the MAC and PHY see a standard byte stream. The lsb/msb CRCs are defined based on the byte stream between the MAC and PHY, which simplifies the design significantly: they are not tied to specific bit locations in the original pixel data bits. For example, 30-bit mode packs 4-pixel groups into 5-byte chunks as illustrated in FIG. 7, and FIG. 8 illustrates 36-bit mode packing 2-pixel groups into 3-byte chunks. Dividing of pixels into partitions occurs prior to packing each partition's pixels into bytes.
VIDEO PIXEL RE-ORDERING FOR MULTIPLE PARTITIONS
[0060] The main concept of multiple partitions is to separate adjacent pixel data into different video sub-packets, so that in the event of packet loss the missing pixels can be reconstructed from their neighbors. When the video format is YCbCr and multiple partitions are in place, because the color elements have different data rates, re-ordering the color elements of each pixel achieves optimal reconstruction on the receiver side.
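For the 2x2 partition arrangement discussed in this section, one way to picture the assignment of adjacent pixels to four partitions is the sketch below. The mapping function is illustrative; the specification's partition numbering may differ.

```python
def partition_of(h, v):
    """2x2 multiple-partition assignment: horizontally and vertically
    adjacent pixels always land in different partitions, so losing one
    partition's sub-packet leaves every missing pixel surrounded by
    neighbors from the other three partitions for interpolation."""
    return (v % 2) * 2 + (h % 2)

# Partitions alternate 0/1 on even rows and 2/3 on odd rows.
assert [partition_of(h, 0) for h in range(4)] == [0, 1, 0, 1]
assert [partition_of(h, 1) for h in range(4)] == [2, 3, 2, 3]
```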
[0061] FIG. 9 illustrates a first embodiment of a first pixel partition. The first bit in the sub-packet is always Y, bit 0 from pixel 0. Each color element switches on a one-byte boundary. Cb and Cr alternate in order (Y0, Cb0, Y1, Cr0, Y2, Cr2, Y3, Cb2, ....). The ordering starts from horizontal pixel 0 and is independent of row. This is needed for proper operation with the 2x2 partition mode.
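The Y0, Cb0, Y1, Cr0, Y2, Cr2, Y3, Cb2, ... component order can be sketched as follows: luma at every pixel, with the Cb/Cr order flipping on every 2-pixel pair. This is a hypothetical helper written to reproduce the ordering stated above, not code from the specification.

```python
def interleave_422(y, cb, cr):
    """Interleave 4:2:2 component samples with alternating Cb/Cr order:
    even pixel pairs emit Cb first, odd pairs emit Cr first, so chroma
    losses are spread evenly across the Cb and Cr planes."""
    out = []
    for pair in range(len(y) // 2):
        first, second = (cb, cr) if pair % 2 == 0 else (cr, cb)
        out += [y[2 * pair], first[pair], y[2 * pair + 1], second[pair]]
    return out

order = interleave_422(['Y0', 'Y1', 'Y2', 'Y3'], ['Cb0', 'Cb2'], ['Cr0', 'Cr2'])
assert order == ['Y0', 'Cb0', 'Y1', 'Cr0', 'Y2', 'Cr2', 'Y3', 'Cb2']
```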
[0062] Data packing for deep color: for example, at 20 bits/pixel it will be Y0[0..3, 5..8], Cb0[0..3, 5..8], Y0[4], Y1[0..2], Y0[9], Y1[5..7], Cr0[0..3, 5..8], ...., etc.
[0063] FIG. 10 illustrates a second embodiment of a first pixel partition. The first bit in the sub-packet is always Y, bit 0 from pixel 0. Each color element switches on a one-byte boundary. Cb and Cr alternate in order (Y0, Cb0, Y1, Cr0, Y2, Cr2, Y3, Cb2, ....). The ordering starts from horizontal pixel 0 and is independent of row. This is critical for proper operation with the 2x2 partition mode.
[0064] Data packing for deep color: for example, at 20 bits/pixel it will be Y0[0..9], Cb0[0..9], Y1[0..9], Cr0[0..9], Y2[0..9], Cr2[0..9], Y3[0..9], Cb2[0..9], ...., etc.
[0065] FIG. 11 illustrates YCbCr 4:2:2 with 4 partitions; for example, YCbCr 4:2:2 at 16 bits/pixel.
[0066] FIG. 12 illustrates YCbCr 4:2:2, 4 partitions, Deep Color mode in accordance with a first embodiment. Pixels are packed within the same partition group. For example, for YCbCr 4:2:2 at 20 bits/pixel (each bracket represents one byte): Partition 0 may include {Y0} {Cb0} {Y0/Y2} {Cb0/Cr2} {Y2/Y4} {Cr2/Cr4} {Y4/Y6} {Cr4/Cb6} {Y6}. Partition 1 may include {Y1} {Cr0} {Y1/Y3} {Cr0/Cb2} {Y3/Y5} {Cb2/Cb4} {Y5/Y7} {Cb4/Cr6} {Y7}. This scheme balances Cb and Cr pixels across each partition.
FIG. 13 illustrates YCbCr 4:2:2, 4 partitions, Deep Color mode in accordance with a second embodiment. Pixels are packed within the same partition group. For example, for YCbCr 4:2:2 at 20 bits/pixel (each bracket represents one 10-bit sample): Partition 0 may include {Y0} {Cb0} {Y2} {Cr2} {Y4} {Cr4} {Y6} {Cb6} {Y8} {Cb8} {Y10} {Cr10}. Partition 1 may include {Y1} {Cr0} {Y3} {Cb2} {Y5} {Cb4} {Y7} {Cr6} {Y9} {Cr8} {Y11} {Cb10}. This scheme balances Cb and Cr pixels across each partition.
PLAYBACK TIMESTAMP
[0067] Synchronization between audio and video can be achieved by controlling when the audio and video streams play out. This can additionally be used for buffer and link management. As illustrated in FIG. 14, a "playback timestamp" can be used to specify when a particular sample or section of audio or video is played back. In one embodiment, the four video sub-packet headers could each include a "playback timestamp" to indicate when they should be played back. However, timestamps can take many bits (30 in one embodiment), and header bits are scarce. Thus, in another embodiment, only one playback timestamp is specified in the video header and two additional "playback select" bits indicate which of the four video sub-packets the playback timestamp applies to. Thus the playback timestamp for any of the four video sub-packets can be communicated. Since the playback timestamp only needs to be communicated once per video frame, this is typically sufficient.
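The single-timestamp scheme of paragraph [0067] can be sketched as a 30-bit timestamp plus a 2-bit "playback select" field naming which of the four video sub-packets it applies to. The field widths come from the paragraph above, but the bit order in this 32-bit word is an assumption, not the normative wire layout.

```python
def encode_video_header(timestamp_30b, which_subpacket):
    """Pack a 30-bit playback timestamp and a 2-bit playback-select field
    into one 32-bit header word (illustrative layout: select in the top
    2 bits, timestamp in the low 30)."""
    assert 0 <= timestamp_30b < (1 << 30) and 0 <= which_subpacket < 4
    return (which_subpacket << 30) | timestamp_30b

def decode_video_header(word):
    """Recover (playback_select, timestamp) from the packed word."""
    return word >> 30, word & ((1 << 30) - 1)

sel, ts = decode_video_header(encode_video_header(123456, 2))
assert (sel, ts) == (2, 123456)
```

Two select bits are enough because a frame carries four video sub-packets and the timestamp need only be sent once per frame.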
[0068] In the description, numerous details are set forth to provide a more thorough explanation of the present invention. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
[0069] Some portions of the detailed descriptions which follow are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0070] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0071] The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
[0072] The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0073] A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium includes read only memory ("ROM"); random access memory ("RAM"); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
[0074] Whereas many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description, it is to be understood that any particular embodiment shown and described by way of illustration is in no way intended to be considered limiting. Therefore, references to details of various embodiments are not intended to limit the scope of the claims which in themselves recite only those features regarded as essential to the invention.

Claims

We claim:
1. An apparatus comprising: an audio/video (AV) processor; a media access controller (MAC) coupled to the AV processor for generating a composite packet having a format configured to carry audio, video, and data traffic with fields in a header of the composite packet specifying video-specific information; a physical device interface (PHY) coupled to the MAC, the PHY to encode and decode between a digital signal and a modulated analog signal, the PHY comprising a high rate physical layer circuit (HRP) and a low rate physical layer circuit (LRP); and a radio frequency (RF) transmitter coupled to the PHY to transmit data.
2. The apparatus of claim 1 wherein the composite packet comprises sub-packets sharing a common MAC and PHY header.
3. The apparatus of claim 1 wherein msb and lsb data portions of the composite packet are transmitted using different Modulation Coding Schemes (MCS).
4. The apparatus of claim 1 wherein msb and lsb data portions of the composite packet are protected using an msb CRC and an lsb CRC.
5. The apparatus of claim 1 wherein the composite packet comprises horizontal and vertical coordinates in a video sub-packet header of the composite packet to uniquely identify any location in a particular video frame.
6. The apparatus of claim 1 wherein a size of the video sub-packets of the composite packet is scaled with different PHY coding rates.
7. The apparatus of claim 1 wherein a size of the video sub-packets of the composite packet is fixed for the duration of a video stream.
8. The apparatus of claim 1 wherein the video sub-packets of the composite packet include clean pixel boundaries.
9. The apparatus of claim 1 wherein the deep color video sub-packets of the composite packet support UEP.
10. The apparatus of claim 1 wherein the MAC reorders a color element for each pixel data in multiple partitions for improved reconstruction.
11. The apparatus of claim 1 wherein at least one video playback time is included in the video header portion.
12. The apparatus of claim 11 wherein only one playback time field is included for a plurality of video sub-packets and a playback select is specified to indicate which of the video sub-packets the playback time describes.
13. A method comprising: generating a composite packet with a wireless HD AV format configured to carry audio, video, and data traffic, with fields in a header of the composite packet specifying video-specific information, wherein msb and lsb video portions of the transmitted composite packet are protected by an msb CRC and an lsb CRC.
14. The method of claim 13 wherein the composite packet comprises sub- packets sharing a common MAC and PHY header.
15. The method of claim 13 wherein the composite packet comprises horizontal and vertical coordinates in a video sub-packet header of the composite packet to uniquely identify any location in a particular video frame.
16. The method of claim 13 wherein a size of the video sub-packets of the composite packet is scaled with different PHY coding rates.
17. The method of claim 13 wherein the video sub-packets of the composite packet include clean pixel boundaries.
18. The method of claim 13 wherein the deep color video sub-packets of the composite packet support UEP.
19. The method of claim 13 further comprising: reordering a color element for each pixel data in multiple partitions for improved reconstruction.
20. The method of claim 13 wherein at least one video playback time is included in the video header portion.
PCT/US2007/023126 2006-11-01 2007-11-01 Wireless hd av packet format WO2008057407A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE602007011819T DE602007011819D1 (en) 2006-11-01 2007-11-01 WIRELESS HD AV PACKAGE FORMAT
EP07861651A EP2098027B1 (en) 2006-11-01 2007-11-01 Wireless hd av packet format
AT07861651T ATE494711T1 (en) 2006-11-01 2007-11-01 WIRELESS HD AV PACKAGE FORMAT

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
US85610406P 2006-11-01 2006-11-01
US60/856,104 2006-11-01
US87375906P 2006-12-08 2006-12-08
US60/873,759 2006-12-08
US90138807P 2007-02-14 2007-02-14
US90138407P 2007-02-14 2007-02-14
US60/901,384 2007-02-14
US60/901,388 2007-02-14
US92033807P 2007-03-26 2007-03-26
US92026607P 2007-03-26 2007-03-26
US92035707P 2007-03-26 2007-03-26
US60/920,338 2007-03-26
US60/920,357 2007-03-26
US60/920,266 2007-03-26
US11/982,209 US8279784B2 (en) 2006-11-01 2007-10-31 Wireless HD AV packet format
US11/982,209 2007-10-31

Publications (3)

Publication Number Publication Date
WO2008057407A2 true WO2008057407A2 (en) 2008-05-15
WO2008057407A3 WO2008057407A3 (en) 2008-10-30
WO2008057407A8 WO2008057407A8 (en) 2009-06-18

Family

ID=39475704

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/023126 WO2008057407A2 (en) 2006-11-01 2007-11-01 Wireless hd av packet format

Country Status (6)

Country Link
US (1) US8279784B2 (en)
EP (1) EP2098027B1 (en)
AT (1) ATE494711T1 (en)
DE (1) DE602007011819D1 (en)
TW (1) TWI431981B (en)
WO (1) WO2008057407A2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7782836B2 (en) * 2006-03-24 2010-08-24 Samsung Electronics Co., Ltd. Method and system for transmission of different types of information in wireless communication
US8259647B2 (en) * 2006-06-12 2012-09-04 Samsung Electronics Co., Ltd. System and method for wireless communication of uncompressed video having a link control and bandwidth reservation scheme for control/management message exchanges and asynchronous traffic
KR100917889B1 (en) * 2006-11-01 2009-09-16 삼성전자주식회사 Apparatus and method for wireless communication
US20090232116A1 (en) * 2008-03-11 2009-09-17 Li Guoqing C Mechanism to avoid interference and improve channel efficiency in mmwave wpans
US8520726B2 (en) * 2008-03-18 2013-08-27 Electronics And Telecommunications Research Institute Method and apparatus for unequal error protection in transmitting uncompressed video with various type over wideband high frequency wireless system
KR101509257B1 (en) * 2008-03-18 2015-04-08 한국전자통신연구원 Method and Appratus for Unequal Error Protection in transmitting uncompressed video with various type over wideband high frequency wireless system
JP4561893B2 (en) * 2008-07-11 2010-10-13 ソニー株式会社 Data transmitting apparatus, data receiving apparatus, data transmitting method and data receiving method
JP5460702B2 (en) 2009-05-14 2014-04-02 パナソニック株式会社 Video data packet transmission method
US8891610B2 (en) * 2010-05-05 2014-11-18 Samsung Electronics Co., Ltd. Method and system for chroma partitioning and rate adaptation for uncompressed video transmission in wireless networks
KR20140043372A (en) * 2011-07-05 2014-04-09 삼성전자주식회사 Trasmitting apparatus, receiving apparatus, image signal trasmitting method and image signal receiving method
GB2506349B (en) * 2012-09-21 2015-12-16 Canon Kk Method and device for transmitting uncompressed video streams
US9538138B2 (en) 2013-06-05 2017-01-03 Puddle Innovations System for providing access to shared multimedia content

Citations (1)

Publication number Priority date Publication date Assignee Title
EP1385292A2 (en) 2002-07-26 2004-01-28 Samsung Electronics Co., Ltd. Method of generating transmission control parameters and method of selective retransmission according to packet characteristics

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
US7483532B2 (en) 2003-07-03 2009-01-27 Microsoft Corporation RTP payload format
US20070202843A1 (en) * 2006-02-15 2007-08-30 Samsung Elctronics Co., Ltd. Method and system for data partitioning and encoding for transmission of uncompressed video over wireless communication channels
US8605797B2 (en) * 2006-02-15 2013-12-10 Samsung Electronics Co., Ltd. Method and system for partitioning and encoding of uncompressed video for transmission over wireless medium
US8665967B2 (en) * 2006-02-15 2014-03-04 Samsung Electronics Co., Ltd. Method and system for bit reorganization and packetization of uncompressed video for transmission over wireless communication channels
US7782836B2 (en) * 2006-03-24 2010-08-24 Samsung Electronics Co., Ltd. Method and system for transmission of different types of information in wireless communication
US20070230461A1 (en) * 2006-03-29 2007-10-04 Samsung Electronics Co., Ltd. Method and system for video data packetization for transmission over wireless channels
US20070286103A1 (en) * 2006-06-08 2007-12-13 Huaning Niu System and method for digital communication having puncture cycle based multiplexing scheme with unequal error protection (UEP)
US20080049707A1 (en) * 2006-07-12 2008-02-28 Samsung Electronics Co., Ltd. Transmission packet for wireless transmission in a high frequency band, and method and apparatus for transmission/receiving using the same
US8111654B2 (en) * 2006-08-09 2012-02-07 Samsung Electronics Co., Ltd. System and method for wireless communication of uncompressed video having acknowledgement (ACK) frames
US8230288B2 (en) * 2006-10-18 2012-07-24 Samsung Electronics Co., Ltd. Data transmission apparatus and method for applying an appropriate coding rate


Also Published As

Publication number Publication date
EP2098027A2 (en) 2009-09-09
TW200838229A (en) 2008-09-16
US20080130741A1 (en) 2008-06-05
ATE494711T1 (en) 2011-01-15
WO2008057407A3 (en) 2008-10-30
DE602007011819D1 (en) 2011-02-17
WO2008057407A8 (en) 2009-06-18
TWI431981B (en) 2014-03-21
EP2098027B1 (en) 2011-01-05
US8279784B2 (en) 2012-10-02

Similar Documents

Publication Publication Date Title
EP2098027B1 (en) Wireless hd av packet format
US9065682B2 (en) Wireless HD MAC frame format
US11863812B2 (en) Video processing system for demultiplexing received compressed and non-compressed video signals and transmitting demultiplexed signals
US11509953B2 (en) Information processing apparatus and information processing method
US7983304B2 (en) Communication system provided with transmitter for transmitting audio contents using packet frame of audio data
US7562379B2 (en) Method and system for wireless digital multimedia presentation
US8275732B2 (en) High definition multimedia interface transcoding system
EP2568713B1 (en) Electronic device and method of transmitting content item
US20110103472A1 (en) Methods, systems and devices for compression of data and transmission thereof using video transmission standards
US20080232588A1 (en) System and method for implementing content protection in a wireless digital system
US8331765B2 (en) Method and apparatus for protecting against copying contents by using WiHD device
US20070297612A1 (en) Method, device and system of encrypted wireless communication
US20130028416A1 (en) System and method for media transcoding and presentation
US8355504B2 (en) AV communication control circuit for realizing copyright protection with respect to radio LAN
US20120159146A1 (en) System and Method for Transcoding Content
US20060069965A1 (en) Data transfer device
US20100067693A1 (en) System and method of enabling content output on a digital device
US20090077605A1 (en) Transmission method, transmission apparatus, video equipment, and display apparatus
JP2012016053A (en) Digital signal processing device
JP2009118528A (en) Digital signal transmission method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07861651

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2007861651

Country of ref document: EP