WO2016196138A1 - Communication of sideband data for videos - Google Patents

Communication of sideband data for videos

Info

Publication number
WO2016196138A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sideband
frame
video
digital samples
Prior art date
Application number
PCT/US2016/034153
Other languages
French (fr)
Inventor
Jiong Huang
Jun Guo
Hongpeng WANG
Original Assignee
Lattice Semiconductor Corporation
Priority date
Filing date
Publication date
Application filed by Lattice Semiconductor Corporation
Publication of WO2016196138A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234363Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2365Multiplexing of several video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4344Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/025Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2350/00Solving problems of bandwidth in display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/10Use of a protocol of communication by packets in interfaces along the display data pipeline

Definitions

  • Embodiments of the present disclosure generally relate to the field of data communications, and more particularly, to communication of sideband data for videos.
  • data may be transmitted over a data interconnect or link between audio/video (A/V) devices.
  • a stream of data may be sent from a first A/V device to a second A/V device, where the second device may either utilize the data or retransmit such data to another device.
  • the transmitted data stream may be converted to a frame in a certain digital A/V format.
  • the receiving device is required to interpret and handle the data stream in that format.
  • a blanking period refers to a period during which an electron beam for scanning the video data on a screen moves back to an initial position to scan the next line.
  • the video data for display on the screen is not transmitted during the blanking periods.
  • non-video data such as audio data, control data, and/or sideband data may be embedded in one or more of the blanking periods.
  • Non-video data embedded in the frame, for example the sideband data, is not supported when converting such a frame to digital A/V formats compatible with various interface specifications.
  • the sideband data is included in blanking periods of a frame, converting the frame into a format compatible with one of those digital A/V formats may result in the non-video data being stripped out. It is desired to retain the sideband data during the transmission of the data stream to the receiving device.
  • Example embodiments of the present disclosure propose a mechanism for transmitting sideband data for videos.
  • example embodiments of the present disclosure provide a method.
  • the method includes downsampling digital samples representing raw sideband data for a video.
  • the method also includes generating first sideband data based on the downsampled digital samples, and generating a first frame for the video, the first frame at least including first video data for the video and the first sideband data.
  • the method further includes transmitting to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.
  • example embodiments of the present disclosure provide a method.
  • the method includes receiving from a further device first video data and first sideband data included in a first frame for a video during a first active data period for the first frame, the first sideband data being generated based on downsampled digital samples representing raw sideband data for the video, and the first active data period being different from a first blanking period for the first frame.
  • the method also includes generating second video data for display based on the first video data and the first sideband data.
  • example embodiments of the present disclosure provide a device.
  • the device includes a processor configured to downsample digital samples representing raw sideband data for a video, generate first sideband data based on the downsampled digital samples, and generate a first frame for the video, the first frame at least including first video data for the video and the first sideband data.
  • the device also includes a transmitter configured to transmit to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.
  • example embodiments of the present disclosure provide a device.
  • the device includes a receiver configured to receive from a further device first video data and first sideband data included in a first frame for a video during a first active data period for the first frame, the first sideband data being generated based on downsampled digital samples representing raw sideband data for the video, and the first active data period being different from a first blanking period for the first frame.
  • the device also includes a processor configured to generate second video data for display based on the first video data and the first sideband data.
  • Fig. 1 is a block diagram of an example system for exchanging A/V information according to an embodiment of the present disclosure
  • Fig. 2 is a schematic diagram showing frames to be processed according to an embodiment of the present disclosure
  • Fig. 3 is a schematic diagram showing frames to be generated according to an embodiment of the present disclosure
  • Fig. 4 is a block diagram of a device for generating a frame according to an embodiment of the present disclosure
  • Fig. 5 is a block diagram of a sideband processor of the device of Fig. 4 according to an embodiment of the present disclosure
  • Fig. 6 is a schematic diagram of converting an analog signal for the sideband data into the downsampled digital samples according to an embodiment of the present disclosure
  • Figs. 7A-7C are schematic diagrams of packets generated by different operations of the device of Fig. 4 according to an embodiment of the present disclosure
  • Fig. 8 is a schematic diagram of a portion of a frame according to an embodiment of the present disclosure
  • Fig. 9 is a block diagram of a device for receiving a frame according to an embodiment of the present disclosure.
  • Fig. 10 is a block diagram of a sideband processor of the device of Fig. 9 according to an embodiment of the present disclosure
  • Fig. 11 is a flowchart of a method for generating a frame according to an embodiment of the present disclosure.
  • Fig. 12 is a flowchart of a method for receiving a frame according to an embodiment of the present disclosure.
  • audio/video or “A/V” refers to one or more characteristics relating to audio data or video data, or relating to both audio data and video data.
  • A/V sink or simply “sink” refers to a device receiving audio/video data from some other devices.
  • A/V source refers to a device providing A/V information to some other (sink) devices.
  • A/V information may include some or all of audio data and/or control information and video data and/or control information.
  • A/V device refers to either an A/V source or an A/V sink, or both an A/V source and an A/V sink.
  • an A/V device may, in addition to exchanging A/V information with another device, be operable to render audio data and/or video data for a user.
  • frames carrying video data may also include audio data, for example, although certain embodiments are not limited in this regard.
  • an A/V device may be operable to exchange A/V information according to some interface standard in one or more respects.
  • the A/V device may exchange A/V information via a data link, connector, and/or interconnect which is compatible with a video interface specification.
  • Examples of the video interface specification may include, but are not limited to, a High-Definition Multimedia Interface (HDMI) specification, a Mobile High-Definition Link (MHL) specification, a Digital Visual Interface (DVI) specification, and a DisplayPort specification.
  • the video interface specifications can transfer non-compressed video data, but certain embodiments are not limited to processing communications which are exchanged according to a non-compressed video interface specification. For example, such embodiments may be applied to communications which are exchanged according to other video interface specifications that include frame formats having characteristics as those discussed herein.
  • the A/V device may implement communications which, at different times, may be compatible with different interface standards via the same connector and/or interconnect.
  • the A/V device may include first communication logic to detect the presence of and communicate with an HDMI interconnector and second communication logic to detect the presence of and communicate with an MHL device.
  • the respective detection and communication functionalities of the first and second communication logics may not conflict with or otherwise impede one another.
  • Various embodiments are discussed herein in the context of exchanging A/V information according to an HDMI interface specification. However, the discussion may be extended to apply to any of a variety of additional or alternative interface specifications for exchanging A/V information in some other embodiments.
  • a frame refers to a logical grouping of data where a beginning and an end of the frame may be each indicated by or otherwise associated with a respective control signal, a time interval, and/or the like.
  • a frame includes video data and non-video data for transmission.
  • the non-video data may be used to facilitate processing or displaying of such video data. Examples of such non-video data may include, but are not limited to, audio data, control data, auxiliary information, and sideband data.
  • sideband data refers to various types of text and/or image data that will be displayed separately or in combination with the video data. Examples of sideband data may include, but are not limited to, teletext data, closed captioning data, Macrovision data, and/or other non-video data for display.
  • data of a frame may also be divided into “blanking data” and “active data.”
  • the blanking data refers to data that is communicated during one or more blanking intervals (or “blanking periods") for the frame.
  • the blanking periods may include, for example, one or more horizontal blanking periods or a vertical blanking period.
  • the active data is distinguished from the blanking data and communicated during an active data period (or "active data interval") for the frame.
  • the length or duration of a horizontal or vertical blanking period or an active data period may vary from system to system depending upon the type of interface specifications used and the number of pixels per line (i.e. the size or dimensions of the display at the video sink side).
  • line refers to a logical grouping of data included in a frame where one or more control signals distinguish one set of data as belonging to a first line and another set of data as belonging to a second line.
  • with respect to lines of data, the terms “horizontal” and “vertical” are used according to conventional use to distinguish different types of logical grouping of data in a frame in the horizontal and vertical directions.
  • the sideband data is processed in advance by a video source.
  • the video source is required to process and blend the sideband data with corresponding video data.
  • the video source then generates a frame including the blended video data and other non-video data in a digital A/V format.
  • the blended video data in the resulting frame is sent to a video sink during an active data period for the frame while other non-video data in the frame is sent during the blanking periods. Consequently, the video sink such as a digital-only television receives the sideband data in an active data period of the frame instead of in a blanking period.
  • the processing and blending of sideband data requires additional complexity and resource load for the video source.
  • Fig. 1 illustrates an example system 100 for exchanging A/V information according to an embodiment of the present disclosure.
  • the system 100 may include a video source 110, a video sink 160, and a converter device 130.
  • the converter device 130 may be used to facilitate A/V communications between the video source 110 and the video sink 160.
  • the video source 110 and the converter device 130 may be respective components of a single larger device (not shown) of the system 100.
  • Another embodiment may be implemented entirely by a single device of which the video source 110 and the converter device 130 are each a component.
  • One embodiment may be implemented entirely by the video source 110 or the converter device 130, for example.
  • Still another embodiment may be implemented by the system 100 as a whole. Any of a variety of other embodiments may be alternatively implemented according to techniques discussed herein.
  • the video source 110 may include functionality of one or more A/V source devices.
  • the video source 110 may include functionality including, but not limited to, that of a personal computer (e.g. tablet, notebook, laptop, desktop and/or the like), camcorder, smart phone, video game console, television, monitor, display, set-top box, home theater receiver, and/or the like.
  • the video source 110 may include a component (for example, a hard disk drive, a solid state drive, a bus, an input port and/or the like) of such an A/V source device.
  • the video sink 160 may include functionality of one or more conventional A/V sink devices including, but not limited to, a television, monitor, display and/or the like.
  • the video source 110 is further capable of providing functionality of one or more A/V sink devices and/or the video sink 160 is further capable of providing functionality of one or more A/V source devices.
  • the video source 110 may send to the converter device 130 a communication 120 including a first frame.
  • the first frame may be generated according to a frame format which supports one or more types of sideband data.
  • the first frame in the communication 120 may be generated according to an analog television format, a standard-definition A/V format such as 480i or 576i, or even a high definition (HD) format which supports sideband communication in a blanking period or supports the communication of the sideband data with the video data in an active data period according to some embodiments of the present disclosure.
  • a blanking period of the first frame includes first sideband data and an active data period of the first frame includes first video data.
  • Control data included in the communication 120 may be used to distinguish a blanking period of the first frame from one or more periods for communicating the active data.
  • a blanking period and an active data period may be distinguished from one another by a vertical synchronization (VSYNC) control signal, a horizontal synchronization (HSYNC) control signal, and/or one or more data enable (DE) signals including a horizontal DE signal, a vertical DE signal, and/or a logical combination thereof.
  • the converter device 130 may generate a communication 150 which includes a second frame generated based at least in part on the first frame of the communication 120.
  • the converter device 130 may be a hardware interconnect device directly coupled to one or both of the video source 110 and the video sink 160.
  • the converter device 130 may include a cable and a connector housing (not shown) at one end of such cable, the connector housing including logic to provide frame conversion functionality disclosed herein.
  • the communication 150 is transmitted to the video sink 160 via a hardware interconnect which is compatible with a video interface specification for communicating video data.
  • Examples of the video interface specification may include, but are not limited to, an HDMI specification, an MHL specification, a DVI specification, and a DisplayPort specification.
  • the hardware interconnect may include, for example, a multimedia communication link (HDMI cable, MHL cable) with several video channels and one or more control channels.
  • the hardware interconnect may be compatible with the physical layer requirements of the video interface specification including, but not limited to, connector, cable and/or other hardware requirements identified in one or more of HDMI, MHL or other such specifications.
  • the video interface specification may allow only non-compressed video data, or in other embodiments may allow compressed video data.
  • the video interface specification may identify (specify or reference) a frame format for the second frame which, for example, requires a respective number of bits-per-pixel, pixels-per-line, lines-per-frame, a respective number of periods for a horizontal blanking interval, a vertical blanking interval, or the active data period of the second frame.
  • the second frame of the communication 150 may include a frame format identified in the video interface specification. As mentioned above, the second frame may be generated by the converter device 130 based at least in part on the first frame of the communication 120. For example, second video data of the second frame may be generated based on the video data of the first frame, and second sideband data of the second frame may be generated based on the first sideband data of the first frame.
  • the first frame of the communication 120 may be in an analog A/V format while the second frame of the communication 150 may be in a digital A/V format according to the used video interface specification.
  • the first sideband data of the first frame may be transmitted during a blanking period for the first frame such as the vertical blanking period.
  • the converter device 130 may generate the second sideband data based on the first sideband data and transmit the second sideband data with the second video data during an active data period instead of a blanking period.
  • the video source 110 is able to generate and transmit the first frame in a conventional way without increasing any complexity while the converter device 130 can still retransmit the sideband data to the video sink 160.
  • the format of the first frame may include a total of X horizontal lines of non-video data in a vertical blanking period, where X is an integer.
  • a frame format may include an active data period which comprises a total of Y horizontal lines of video data, where Y is another integer.
  • Audio data, if present, may be distributed across the X lines of the vertical blanking interval and the horizontal blanking intervals of the Y lines of active data.
  • the active data of the second frame generated by the converter device 130 may include a total of Y horizontal lines of the second video data and S horizontal lines of the second sideband data, where S is an integer. That is, the active data is extended from Y horizontal lines to (Y+S) horizontal lines in the second frame.
  • the vertical blanking interval of the second frame may then be reduced to (X-S) horizontal lines for other blanking data than the second sideband data.
  • the second frame, after being modified, still includes a total of Y horizontal lines of video data and a total of X horizontal lines of non-video data.
  • the format of the second frame may be arranged in any other structures as long as the second sideband data and video data can be transmitted in the active data period.
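To make the line budget above concrete, the following minimal sketch (not part of the patent) reworks the arrangement using the 576i-like numbers given later in this description (X = 24 blanking lines, Y = 288 video lines) and an assumed S = 2 sideband lines:

```python
def second_frame_line_budget(x_blanking: int, y_video: int, s_sideband: int):
    """Move S horizontal lines of sideband data from the vertical blanking
    interval into the active data period, keeping the total line count."""
    active_lines = y_video + s_sideband     # (Y + S) lines of active data
    vblank_lines = x_blanking - s_sideband  # (X - S) lines of other blanking data
    assert active_lines + vblank_lines == x_blanking + y_video  # total unchanged
    return active_lines, vblank_lines

# X = 24, Y = 288 (576i-like numbers from this description); S = 2 is assumed.
print(second_frame_line_budget(24, 288, 2))  # -> (290, 22)
```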
  • the second frame in a digital A/V format compatible with the video interface specification is converted by the converter device 130 based on the first frame.
  • the video source 110 may generate the first frame according to the digital A/V format compatible with requirements of the video interface specification.
  • the first frame may include an active data period with a total of Y horizontal lines of video data and S horizontal lines of sideband data. Other blanking data other than the sideband data may be included in (X-S) horizontal lines and transmitted in the vertical blanking interval.
  • the converter device 130 may directly forward the first frame received in the communication 120 from the video source 110 to the video sink 160.
  • the converter device 130 may generate the second frame according to requirements of a different video interface specification.
  • Fig. 2 illustrates a schematic diagram showing frames 200, one or more of which are to be processed according to an embodiment of the present disclosure.
  • Frames 200 may be sent in the communication 120 and processed by the converter device 130, for example.
  • one of the frames 200 may be processed to generate a second frame (not shown), where an active data period of the second frame carries both video data and sideband data.
  • the frames 200 may be formatted according to an A/V specification which supports communication of sideband data in a blanking period and video data in an active data period which is distinct from that vertical blanking period.
  • frames 200 may be arranged according to a format compatible with 576i, 480i, or the like.
  • transmission of a frame according to such an A/V specification includes sequentially sending horizontal lines of data.
  • a frame 202 includes a vertical blanking interval 220 and horizontal blanking intervals 250 and 252 for blanking data.
  • the frame 202 further includes video data in the active data period 230.
  • the frame 204 may include a vertical blanking (VB) interval 225 and horizontal blanking (HB) intervals 254 and 256 for blanking data, and an active data period 235 for video data.
  • the video data 230 of the frame 202 may span Y horizontal lines where portions of such horizontal lines include horizontal blanking data outside of the video data 230.
  • the VB interval 220 may include X horizontal lines of blanking data including sideband data.
  • the video data 230 may span 288 horizontal lines where portions of such 288 horizontal lines include horizontal blanking data outside of the video data 230.
  • the VB interval 220 may include 24 or 25 horizontal lines. However, according to different embodiments, additional or fewer horizontal lines may be used for communicating either or each of the video data 230 and blanking data in the VB interval 220.
  • the VB intervals 220, 225 include respective sideband (SB) data 240, 245.
  • either or each of sideband data 240 and 245 may include respective sideband data (for example, teletext or closed caption).
  • Sideband data, if available, is included in horizontal lines of a vertical blanking period.
  • the frames 200 including the sideband data 240 and 245 are typically transmitted in an analog form.
  • where the sideband data 240, 245 is teletext data, there may be up to 17 lines of teletext data in a single frame 202 or 204.
  • the types of data in the vertical and horizontal lines may be indicated by one or more control signals transmitted in or with the frame.
  • Blanking data transmitted in a blanking period and video data transmitted in an active data period may be distinguished from one another by a horizontal synchronization (HSYNC) control signal 212, a vertical synchronization (VSYNC) control signal 214, and/or one or more data enable (DE) signals including a horizontal DE (HDE) signal 210, a vertical DE signal (VDE) 216, and/or a logical combination thereof.
  • the VDE 216 may indicate an active data period which is used to identify the video data 230 as being active data of the frame 202.
  • Each of the respective polarities (for example, active low or active high) of the HDE 210, VDE 216, HSYNC 212, and VSYNC 214 is merely illustrative and may not be limiting on certain embodiments.
  • the VDE 216 may enable the vertical blanking data 220 and 225 each having a respective span according to the specified frame format of an interface specification. Control signals HDE 210, HSYNC 212, and VSYNC 214 may otherwise support communication of the frames 202 and 204 according to the frame format.
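As a small illustration of the “logical combination” of data enable signals mentioned above, active data can be identified as the region where both the horizontal and vertical DE signals are asserted. The helper below is only a sketch; active-high polarity is assumed:

```python
def classify_period(hde: bool, vde: bool) -> str:
    """Classify a pixel clock position from the data enable signals
    (active-high polarity assumed; polarities may differ per system)."""
    if hde and vde:
        return "active data"        # video (and, in Fig. 3, sideband) lines
    if not vde:
        return "vertical blanking"  # e.g., the VB intervals 220 and 225
    return "horizontal blanking"    # e.g., the HB intervals 250 and 252
```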
  • Fig. 3 illustrates a schematic diagram showing frames 300, one or more of which are to be generated by a converter device or a video source according to an embodiment.
  • Frames 300 may be sent in the communication 120 or the communication 150, for example.
  • one or more of the frames 300 may be generated by the source device 110 based on the original sideband data, video data, and other data to be transmitted to the video sink 160.
  • one or more of the frames 300 may be generated by the converter device 130 based on the first frame received from the source device 110.
  • a frame 302 includes a vertical blanking (VB) interval 320 which occupies a total of only (X-S) consecutive horizontal lines of blanking data.
  • the blanking data in the interval 320 may be transmitted during a vertical blanking period for the frame 302.
  • the frame 302 may further include additional (Y+S) consecutive horizontal lines of active data.
  • the active data includes S consecutive horizontal lines of sideband data (SB) 340 and Y consecutive horizontal lines of video data 330.
  • the S horizontal lines for the sideband data 340 and the Y horizontal lines for video data 330 may be arranged in a different way, for example, may be transmitted in a different order, interleaved with one another, and/or the like.
  • the frame 302 may also include horizontal blanking (HB) intervals 350 and 352 which are located in the remaining portions of the frame 302 other than the active data interval and the vertical blanking interval. Some non-video data may be included in the blanking intervals 320, 350, and 352. In some embodiments, the packetized non-video data such as audio data and/or the unpacketized non-video data such as control signals may be included in the blanking intervals.
  • the structure of the frame 304 may be similar to that of the frame 302.
  • the frame 304 may include a vertical blanking interval 325, horizontal blanking intervals 354 and 356, sideband (SB) data 345, and video data 335.
  • the respective numbers of lines (horizontal and/or vertical lines) of different areas in the frame 304 may be the same or different from those of the frame 302.
  • Blanking data transmitted in a blanking period and active data transmitted in an active data period may be distinguished from one another by a HSYNC control signal 312, a VSYNC control signal 314, and/or one or more DE signals including a HDE signal 310, a VDE 316, and/or a logical combination thereof.
  • the VDE 316 may indicate an active data period which is used to identify both the sideband data 340 and video data 330 as being active data of the frame 302.
  • Each of the respective polarities (for example, active low or active high) of HDE 310, VDE 316, HSYNC 312, and VSYNC 314 is merely illustrative and may not be limiting on certain embodiments.
  • the VDE 316 may enable the vertical blanking data in the intervals 320 and 325, each having a respective span that deviates from the specified frame format of an interface specification. Control signals HDE 310, HSYNC 312, and VSYNC 314 may otherwise support communication of the frames 302 and 304 according to the frame format.
  • the sideband data used to generate the frames 300 is in an analog format. If the raw sideband data is simply converted into digital samples and then included in the frames 300, more horizontal lines will be used to carry the sideband data 340, 345. That is, S will be large. Since the number of horizontal lines for blanking data, X, is specified by the video interface specification, fewer horizontal lines will be left for other blanking data such as audio data or control data.
  • the digital samples are downsampled so as to reduce the size of the digital data for carrying the sideband data.
  • the downsampled digital samples may be used to generate the sideband data 340, 345 included in the frames 300 to be transmitted to the video sink 160.
  • the processing of the raw sideband data will be discussed in more detail below.
  • Fig. 4 illustrates a block diagram of a device 400 for generating a frame including sideband data in the active data period according to an embodiment of the present disclosure.
  • the device 400 may have functionality of generating a frame in a format compatible with a video interface specification.
  • the device 400 may represent the converter device 130 or the video source 110 as shown in Fig. 1.
  • the device 400 may include an analog to digital converter (ADC) 410, a frame splitter 415, a sideband processor 420, a video processor 422, a frame generator 424, and a transmitter (TX) 440.
  • an input 405 of the device 400 may be in an analog format and may be converted by the ADC 410 into digital samples 412.
  • the input 405 may include the first frame in the communication 120.
  • the input 405 may include a frame having sideband data such as teletext data or closed captioning data and other non-video data in a blanking period (for example, a vertical blanking period) for this frame.
  • the frame may further include video data in the active data period which is distinct from the blanking period.
  • the blanking period, sideband data, and video data may include the VB interval 220, sideband data 240, and video data 230, respectively.
  • the input 405 may include sideband data and video data generated or otherwise obtained by the device 400 (for example, the video source 110 in Fig. 1).
  • the sideband data and video data in the analog format without being processed may be referred to as “raw” sideband data and “raw” video data, respectively.
  • the input 405 may be converted into multi-bit digital samples 412.
  • the digital samples may be eight bit samples.
  • the sampling rate of the ADC 410 may be the same as the rate of a video clock for the video data. In some cases, the sampling rate of the ADC 410 may be the same for all data within the input 405.
  • the device 400 may directly obtain the input 405 in a digital A/V format, for example, the digital samples representing the raw sideband data, video data, and/or other blanking data. In these cases, the ADC 410 may be optional to the device 400.
  • the frame splitter 415 may identify one or more portions of the digital samples 412 as being sideband data and identify other portions of the input 405 as being video data. For example, the frame splitter 415 may identify that a format of the frame includes a particular time interval dedicated to communication of the sideband data. The frame splitter 415 may identify the sideband data from the digital samples 412 of the input 405 based on such a time interval. The frame splitter 415 may then perform de-multiplexing on the digital samples 412 to separate the digital samples 430 representing the raw sideband data to output to the sideband processor 420.
  • the frame splitter 415 may further separate video data samples 432 from the digital samples 412 to output to the video processor 422. In the cases where the input 405 is not in a frame format but is directly generated or otherwise obtained by the video source, the frame splitter 415 may be optional to the device 400.
  • the sideband processor 420 may perform one or more operations on the sideband data samples 430 to generate sideband data 434 to be included in the resulting frame.
  • the operations may include downsampling the sideband data samples 430 to reduce the size of the samples.
  • the downsampling may be performed by comparing the respective multi-bit sideband data samples 430 with a decision threshold to reduce the number of bits used to represent the respective samples.
  • the downsampling may be performed by simply reducing the sampling rate by selecting bits of the sideband data samples with a predetermined ratio.
  • the operations may further include de-serializing the downsampled sideband data samples, packetizing the de-serialized sideband data, adding error correcting code (ECC) to the sideband data packets, and interleaving the sideband data packets.
  • the resulting sideband data 434 may be inserted into an active data period for output 450.
  • the operations of the sideband processor 420 will be discussed in more detail below with reference to Fig. 5.
  • the sideband data 434 is thus a processed version of the sideband data samples 430 that consumes less bandwidth in a frame than the original sideband data.
  • the sideband data 434 may also be transmitted more reliably due to the ECC bits and the operation of interleaving.
  • the complexity and cost of the device receiving the frame may also be reduced since the device can directly recover information from the sideband data in the frame.
  • Otherwise, the sink device would necessarily include additional components to recover the net information from the raw digital samples.
  • the video processor 422 may perform one or more operations to generate video data 436 based on video data samples 432.
  • the video processor 422 may perform operations to convert, synchronize, order, condition, or otherwise generate video data 436 including one or more characteristics of a frame format according to a video interface specification.
  • Generation of the video data 436 may include one or more operations according to conventional techniques for converting video data into a format compatible with a video interface specification.
  • the device 400 may include one or more additional processors (not shown) to perform one or more operations according to conventional techniques for converting blanking data such as audio data and/or control data into a format compatible with a video interface specification.
  • the video data 436, sideband data 434, and other blanking data may be provided to the frame generator 424.
  • the frame generator 424 may generate a frame which includes both the video data 436 and the sideband data 434 in the active data period.
  • the frame generator 424 may perform one or more multiplexing operations to interleave portions of the sideband data 434 with portions of the frame data 436.
  • the frame generator 424 may also add blanking data such as audio data and/or control data to the frame.
  • the frame generator 424 may provide the resulting frame to the transmitter (TX) 440.
  • the transmitter 440 may transmit an output 450 including the resulting frame to a receiving device such as the video sink 160.
  • the output 450 may be transmitted across a multimedia communication link (for example, HDMI, MHL, or the like).
  • the resulting frame may be compatible with physical requirements of the used video interface specification and may be formed in a format as shown in Fig. 3.
  • the frame generator 424 may receive a support indication 460 from the receiving device, for example, the video sink 160.
  • the indication 460 indicates whether the video sink 160 supports the sideband data.
  • the frame generator 424 may only generate a frame to include the sideband data 434 if the indication 460 indicates that the video sink 160 supports it. Otherwise, the frame generator 424 may exclude the sideband data from the frame.
  • the indication 460 may also indicate whether the video sink 160 supports transmission of the sideband data within the active data period of the frame.
  • the support indication may be an extension flag in a vendor specific data block (VSDB) of extended display identification data (EDID).
  • EDID information may be provided to the device 400 in a VSDB from the sink device so that the device 400 can determine whether and/or how a frame may be sent across an interconnect.
  • the frame splitter 415, the sideband processor 420, the video processor 422, and the frame generator 424 may be implemented by a single processor 490, a set of processors, one or more microprocessors, controllers, central processing units (CPUs), and/or the like. It would also be appreciated that the components of the device 400 are given for the purpose of illustration. In some use cases, as mentioned above, some components may be omitted from the device 400. In some other embodiments, additional components may be included.
  • Fig. 5 shows a block diagram of a sideband processor 420 according to an embodiment of the present disclosure.
  • the sideband processor 420 may include a downsampler 510, a deserializer 515, a packetizer 520, an error correcting code (ECC) unit 530, and a packet interleaver 540.
  • the sideband processor 420 may process the multi-bit digital samples 430 representing the raw sideband data to reduce the size of the sideband data to be included in the frame and potentially increase the reliability of the transmission of the sideband data.
  • the downsampler 510 may receive multi-bit digital samples 430 representing the raw sideband data and downsample the digital samples 430 into downsampled digital samples 512.
  • the downsampled digital samples 512 may represent net sideband data recovered from the data samples 430.
  • the downsampler 510 may compare the respective digital samples 430 to a decision threshold.
  • a multi-bit digital sample 430 greater than the threshold is converted into a binary value of "1,” and a multi-bit digital sample 430 lower than the threshold may be converted into a binary value of "0.”
  • the downsampler 510 may compare a multi-bit digital sample with more than one decision threshold to convert the digital sample into a digital sample represented with fewer bits.
  • the downsampler 510 may select bits of data from the digital samples 430, or from the digital samples obtained after the threshold comparison, at a predetermined rate.
  • the sampling rate of the ADC 410 may be appropriate for the video data but higher than necessary for the sideband data, thereby producing too many digital samples 412 and thus too many digital samples 430. Therefore, the downsampler 510 may eliminate 1/2 or 3/4 of the digital samples.
  • the down-sampling rate may be determined by a ratio between a data rate of the video data and a data rate of the sideband data from the incoming frame of the device 400.
  • the deserializer 515 may deserialize the downsampled digital samples such that the deserialized digital samples 514 are output in 8-bit (1-byte) chunks.
  • the deserializing may further save bandwidth since fewer control signals (clock signals) need to be transmitted with the frame for reception of the digital samples.
  • Fig. 6 shows a schematic diagram of converting an analog signal for the sideband data into the downsampled digital samples according to an embodiment.
  • the analog waveform 602 represents a part of the sideband data received by the ADC 410.
  • the ADC 410 generates eight multi-bit digital samples from the analog waveform 602. Each of the eight multi-bit samples has an 8-bit value shown using hexadecimal encoding.
  • the downsampler 510 compares each of the eight sample values against a decision threshold 620 to convert 612 the multi-bit digital samples into digital samples 625. It can be seen that the number of bits of the digital samples 625 is much smaller than that of the multi-bit digital samples, which reduces the amount of sideband data to be transmitted.
  • the digital samples 625 may then be further down-sampled with a down-sampling ratio of 4:1 to produce down-sampled digital samples 630, which may further reduce the amount of sideband data to be transmitted.
  • eight-bit chunks of the digital samples 630 may then be output as the deserialized digital samples 514 for the sideband data.
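A minimal sketch of the Fig. 6 flow: 8-bit ADC samples are sliced against a decision threshold into single bits, decimated with the 4:1 ratio, and packed into 8-bit chunks for the deserializer. The threshold value, the decimation phase, and the bit order within a byte are assumptions for illustration:

```python
def downsample(samples, threshold=0x80, ratio=4):
    """Convert multi-bit ADC samples to single bits against a decision
    threshold, then keep every `ratio`-th bit (4:1 decimation; keeping
    phase 0 is an assumption)."""
    bits = [1 if s > threshold else 0 for s in samples]  # multi-bit -> 1-bit
    return bits[::ratio]                                 # rate reduction

def deserialize(bits):
    """Pack the downsampled bit stream into 8-bit (1-byte) chunks,
    MSB first (the bit order is an assumption)."""
    chunks = []
    for i in range(0, len(bits) - len(bits) % 8, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        chunks.append(byte)
    return chunks
```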
  • the packetizer 520 may generate one or more sideband data packets 522 from the deserialized digital samples 514.
  • the sideband data packets 522 may be transmitted during the active data period for the frame in some embodiments.
  • An example structure of a packet 522 is shown in Fig. 7A.
  • the packet 522 includes a header 710 and a payload 720.
  • the header 710 may have a length of 4 bytes and the payload 720 may have a length of M bytes in the example of Fig. 7A. It would be appreciated that the header 710 and the payload 720 may be designed to contain any length of data.
  • each line of the sideband data in the first frame may be converted into one sideband data packet.
  • the first byte 712 of the header 710 LineNumLow may indicate the low byte of the line location of the payload 720, with reference to the leading edge of the VSYNC control signal.
  • the second byte 714 of the header 710 LineNumHigh may indicate the high byte of the line location of the payload 720.
  • the third byte 716 of the header 710 LengthLow may indicate the low byte of the length of the payload 720, and the fourth byte 718 of the header 710 LengthHigh may indicate the high byte of the length of the payload 720.
  • the header 710 may be used to indicate in which line the payload 720 will be located (assuming each line of the original sideband data is converted into one packet) and the length of the payload 720. For example, if Line 10 of the converted frame carries 40 bytes, the 4-byte header 710 may be 0x0A, 0x00, 0x28, 0x00.
  • the first byte 712 LineNumLow and the second byte 714 LineNumHigh may not both be 0, and the third byte 716 and the fourth byte 718 may be in a range of 1 to 0xFFF. It would be appreciated that the bytes in the header 710 are only shown and discussed for the purpose of illustration and additional or fewer bytes may be included in the header 710.
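A sketch of the packetizer's header construction as described above, with the low/high byte split of the line number and payload length; the function name is illustrative:

```python
def build_sideband_packet(line_number: int, payload: bytes) -> bytes:
    """Prepend the 4-byte header (LineNumLow, LineNumHigh, LengthLow,
    LengthHigh) to the sideband payload."""
    length = len(payload)
    header = bytes([
        line_number & 0xFF,         # LineNumLow
        (line_number >> 8) & 0xFF,  # LineNumHigh
        length & 0xFF,              # LengthLow
        (length >> 8) & 0xFF,       # LengthHigh
    ])
    return header + payload

# The example above: Line 10 carrying 40 bytes gives 0x0A, 0x00, 0x28, 0x00.
assert build_sideband_packet(10, bytes(40))[:4] == bytes([0x0A, 0x00, 0x28, 0x00])
```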
  • the ECC unit 530 may add one or more ECC bits to the data packets 522, thereby generating ECC protected data packets 532.
  • the ECC unit 530 may generate ECC bits for the packets using BCH (Bose, Ray-Chaudhuri, Hocquenghem) coding, for example.
  • An ECC encoded sideband data packet 532 is shown in Fig. 7B.
  • a sideband data packet 522 is divided into three-byte chunks. Starting from the first byte 712 of the header 710, every three bytes are protected by a byte of ECC bits (parity bits, for example). For example, for the three bytes 712 to 716 of the header 710, an ECC byte 722 may be added. Other ECC bytes may be added for the remaining bytes of the packet 522 to obtain an ECC protected data packet 532.
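The description fixes the grouping (one ECC byte after every three packet bytes) but not the BCH code parameters, so the sketch below uses a simple XOR parity byte purely as a placeholder for the real BCH parity:

```python
def add_ecc_bytes(packet: bytes) -> bytes:
    """Append one ECC byte after every three bytes of the packet.
    XOR parity is a stand-in here; the patent names BCH coding but
    does not give its parameters."""
    protected = bytearray()
    for i in range(0, len(packet), 3):
        group = packet[i:i + 3]
        protected += group
        parity = 0
        for byte in group:
            parity ^= byte          # placeholder for a BCH parity byte
        protected.append(parity)
    return bytes(protected)
```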
  • the ECC protected data packets 532 may be interleaved by the packet interleaver 540 of the sideband processor 420 to generate interleaved sideband data packets 542 which can be used as the sideband data 434 in Fig. 4.
  • the interleaving is shown in Fig. 7C where bits of a byte of the packets are interleaved with bits of another byte of the packets.
  • An un-interleaved ECC packet 532 may be divided into pairs of bytes. For example, the first two bytes 712 and 714 of the packet 532 may form a byte pair to be interleaved.
  • the bits of the byte 712 may be interleaved with the bits of the byte 714 to form a pair of interleaved bytes of 732 and 734 of the interleaved packet 542.
  • An example interleaving pattern is shown in Fig. 7C, where odd bits of the byte 712 are retained while even bits of the byte 712 are replaced with even bits of the byte 714 to form the byte 732. Odd bits of the byte 714 are retained while even bits of the byte 714 are replaced with even bits of the byte 712 to form the byte 734. Bits of the bytes 716 and 722 of the packet 532 may also be interleaved to form interleaved bytes 736 and 738 of the interleaved packet 542. The remaining bytes of the packet 532 may be interleaved in a similar way to obtain the interleaved packet 542.
  • the interleaving shown in Fig. 7C is merely for the purpose of illustration and any other interleaving pattern may be applied.
  • bits in every three or more bytes of the packet 532 may be interleaved and those bits may be interleaved in other manners.
  • the interleaving of the packet may maximize the error correction capability of ECC bits for errors such as bursty errors occurring on the data interconnect.
  • the bit errors may be corrected after de-interleaving at the video sink side.
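A sketch of the Fig. 7C byte-pair interleaving described above. Bit positions 1, 3, 5, 7 are taken as the “odd” bits (mask 0xAA) and positions 0, 2, 4, 6 as the “even” bits (mask 0x55); which positions the patent counts as odd is an assumption. Usefully, the swap is its own inverse, so a sink can de-interleave by applying the same operation:

```python
ODD_MASK, EVEN_MASK = 0xAA, 0x55  # bit positions 1,3,5,7 / 0,2,4,6 (assumed)

def interleave_pair(a: int, b: int):
    """Keep each byte's odd bits and swap the even bits between the pair."""
    return (a & ODD_MASK) | (b & EVEN_MASK), (b & ODD_MASK) | (a & EVEN_MASK)

def interleave_packet(packet: bytes) -> bytes:
    """Interleave consecutive byte pairs of an ECC-protected packet.
    Applying this function twice restores the original bytes."""
    out = bytearray(packet)
    for i in range(0, len(packet) - 1, 2):
        out[i], out[i + 1] = interleave_pair(packet[i], packet[i + 1])
    return bytes(out)
```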
  • the deserializer 515, packetizer 520, ECC unit 530, and interleaver 540 may be optional to the sideband processor 420. When some of those components are omitted from the sideband processor 420, the remaining components may still perform corresponding operations on the received sideband data.
  • the packetizer 520 may packetize the downsampled digital samples 512.
  • the ECC unit 530 may add ECC bits into the bit stream from the downsampler 510 or the deserializer 515.
  • Fig. 8 illustrates a portion of the frame 302 generated by the device 400 in more detail according to an embodiment of the present disclosure.
  • the sideband data is packetized as N sideband data packets.
  • the N sideband data packets may be placed in the S horizontal lines 340 of the frame 302 (two lines in the example of Fig. 8). It is noted that although the S horizontal lines are used to carry sideband data which is not video data, they are still treated as corresponding to the active data period. Due to the additional horizontal lines for sideband data, the original VDE signal 318 may be extended to the VDE signal 316 to indicate the extension of the active data period.
  • the video sink receiving the frame 302 may extract the S horizontal lines as sideband data from the frame 302.
  • Fig. 9 illustrates a block diagram of a device 900 for receiving and processing a frame generated by the device 400 according to an embodiment of the present disclosure.
  • the device 900 may include some or all features of the video sink 160.
  • the device 900 includes a receiver (RX) 910, a sideband processor 930, a video processor 932, a display engine 950, and a capability report unit 960.
  • the receiver 910 receives an input 905 which includes a frame such as the frame 302 or 304 generated by the device 400.
  • the frame may include both video data and sideband data received in the active data period.
  • the frame may be received via a hardware interconnect compatible with physical layer requirements of a video interface specification.
  • the video interface specification may specify a frame format.
  • the frame format may include a total of X consecutive horizontal lines of vertical blanking data and a total of Y consecutive horizontal lines of video data, where X is a first integer and Y is a second integer.
  • the receiver 910 may receive vertical blanking data in (X-S) consecutive horizontal lines of the frame during a vertical blanking period, where S is a third integer. During an active data period, the receiver 910 may receive sideband data in S horizontal lines of the frame and video data in Y horizontal lines of the frame. In some embodiments, the receiver 910 may also receive horizontal blanking data in the portions of the (Y+S) horizontal lines outside of the active data. The receiver 910 may identify one or more portions of the input 905 as being sideband data, blanking data, or video data according to control signals received in the input 905.
  • the receiver 910 may provide to the sideband processor 930 the downsampled sideband data 912 received from an active data period of the frame.
  • the receiver 910 may perform de-multiplexing to separate the sideband data and pass it to the sideband processor 930 for processing.
  • the receiver 910 may further provide video data 924 to video processor 932.
  • the receiver 910 may also provide other blanking data in the received frame such as audio data, control data, other packetized or unpacketized data to one or more other processors (not shown) for processing.
  • the sideband processor 930 may perform one or more operations to generate sideband data 922 based on the received sideband data 912. The operations may depend on how the sideband data 912 was generated (for example, as the sideband data 434). By way of illustration only, the sideband processor 930 may perform operations for de-interleaving the sideband data 912, error checking the sideband data 912, and/or de-packetizing the sideband data packets. The processing of the sideband processor 930 will be discussed in more detail below with reference to Fig. 10. The resulting sideband data may then be converted into information for use in replacing, masking, or otherwise modifying the video data in the frame to be displayed.
  • the resulting sideband data may be pixel data.
  • the video processor 932 may perform one or more operations to provide video data 926 based on the video data included in the received frame. By way of illustration only, the video processor 932 may perform operations to isolate, synchronize, order, condition, or otherwise prepare the video data for a digital display.
  • the video data 926 and pixel data 922 may be provided to the display engine 950.
  • the display engine 950 may generate video data 940 for display.
  • the video data 940 may be provided to a HD video display (not shown).
  • the display engine 950 may perform one or more multiplexing operations to interleave portions of the pixel data 922 with portions of the video data 926, thereby blending the pixel data 922 with the video data 926.
  • the display engine 950 may perform operations to calculate pixel color values based on values of the pixel data 922 and pixel color values of the video data 926.
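The patent leaves the exact composition rule open, so the following sketch assumes conventional per-pixel alpha blending as one way the display engine could calculate output colors from the OSD pixel data and the video pixel colors; the alpha channel and its semantics are assumptions:

```python
def blend_pixel(video_rgb, osd_rgb, osd_alpha):
    """Blend one OSD pixel over one video pixel (alpha blending is an
    assumed rule; the patent only says pixel color values are calculated
    from both inputs)."""
    a = osd_alpha / 255.0
    return tuple(round(a * o + (1.0 - a) * v) for v, o in zip(video_rgb, osd_rgb))

# Example: white caption pixel at ~50% opacity over a dark video pixel.
print(blend_pixel((16, 16, 16), (255, 255, 255), 128))  # -> (136, 136, 136)
```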
  • the capability report unit 960 may output a support indication 460 indicating whether the device 900 supports the sideband data transmitted within a frame.
  • the indication 460 may also indicate whether the device 900 supports transmission of the sideband data within the active data period of the frame.
  • the support indication 460 may be transmitted to the device transmitting the frame, for example, the device 400.
  • the sideband processor 930, the video processor 932, and the capability report unit 960 may be implemented by a single processor 990, a set of processors, one or more microprocessors, controllers, central processing units (CPUs), and/or the like. It would also be appreciated that the components of the device 900 are given for the purpose of illustration. In some use cases, as mentioned above, some components may be omitted from the device 900. In some other embodiments, additional components may be included.
  • Fig. 10 is a block diagram of a sideband processor 930 according to an embodiment of the present disclosure.
  • the sideband processor 930 may include a de-interleaver 1010, an ECC unit 1020, a de-packetizer 1030, and a pixel data generator 1040.
  • the de-interleaver 1010 may receive the sideband data 912.
  • the de-interleaver 1010 may de-interleave the data packets into de-interleaved data packets 1012.
  • the de-interleaving may involve reversing the interleaving of Fig. 7C to reconstruct the data packets 1012.
  • the ECC unit 1020 may receive the de-interleaved data packets 1012 and perform error checking on the data packets 1012. The ECC unit 1020 may then generate error corrected data packets 1022.
  • the de-packetizer 1030 may receive the error corrected data packets 1022 and de-packetize the data packets 1022 by extracting the header and payload from the data packets 1022, for example. The de-packetizer 1030 may output the resulting sideband data 1032, which may be extracted from the payload of the data packets 1022. The de-packetizer 1030 may also output the header information in some examples.
  • the pixel data generator 1040 may receive the sideband data 1032 and then generate pixel data 922 from the sideband data 1032.
  • the pixel data generator 1040 may convert the sideband data 1032 into information for use in replacing, masking, or otherwise modifying pixel color values of the frame. Such information may, for example, include alternate pixel colors or data for determining such alternate pixel colors.
  • the pixel data generator 1040 may be a sideband data translator, such as an on-screen display (OSD) controller.
  • the sideband data may contain a text message to be displayed and metadata describing the text message, for example, metadata describing the location, character size, and other features of the message.
  • the sideband data translator may use the sideband data to generate pixel data 922 that is passed on to the display engine 950.
  • the de-interleaver 1010, ECC unit 1020, de-packetizer 1030, and pixel data generator 1040 may be optional to the sideband processor 930, depending on how the sideband data in the received frame was generated.
  • for example, where the sideband data is received as an unpacketized bit stream, the pixel data generator 1040 may generate the pixel data 922 directly from that bit stream.
  • the ECC unit 1020 and the de-interleaver 1010 may be omitted from the device 900. It is noted that since the device 400 has already processed the raw sideband data and transmitted the net sideband data to the device 900, the device 900 is not required to extract the information from the raw sideband data, and thus its complexity and cost are reduced.
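  • as a hedged sketch of this receive-side chain: the de-interleaver below assumes a simple row-by-column block interleaver was used on the transmit side (the actual scheme of Fig. 7C may differ), and the de-packetizer follows the 4-byte header layout (LineNumLow, LineNumHigh, LengthLow, LengthHigh) described with Fig. 7A later in this document:

    def deinterleave(data: bytes, rows: int) -> bytes:
        """Undo an assumed row-by-column block interleave: the transmitter
        is taken to have filled a rows-by-cols matrix row-major and read it
        out column-major."""
        cols = len(data) // rows
        return bytes(data[c * rows + r]
                     for r in range(rows) for c in range(cols))

    def depacketize(packet: bytes):
        """Extract (line number, payload) from an error-corrected packet
        1022 using the assumed Fig. 7A header layout."""
        line_number = packet[0] | (packet[1] << 8)
        length = packet[2] | (packet[3] << 8)
        return line_number, packet[4:4 + length]

    # Round trip: a packet for line 10 carrying a two-byte payload.
    assert depacketize(bytes([0x0A, 0x00, 0x02, 0x00, 0x55, 0xAA])) \
        == (10, b"\x55\xaa")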
  • Fig. 11 illustrates a flowchart of a method 1100 for generating a frame including sideband data in the active data period according to an embodiment of the present disclosure.
  • the method 1100 may be performed by a device which provides functionality of generating a frame in a format compatible with a video interface specification.
  • the device may be the converter device 130 or the video source 110 as shown in Fig. 1, or may be the device 400 as shown in Fig. 4.
  • in step 1110, digital samples representing raw sideband data for a video are downsampled.
  • in step 1120, first sideband data is generated based on the downsampled digital samples, and in step 1130, a first frame for the video is generated, the first frame at least including first video data for the video and the first sideband data.
  • in step 1140, the first video data and the first sideband data are transmitted to a further device during a first active data period for the first frame. The first active data period is different from a first blanking period for the first frame. (A minimal sketch of steps 1110 to 1140 is given after the elaborations below.)
  • the downsampled digital samples may be deserialized; and the first sideband data may be generated based on the deserialized digital samples.
  • the first sideband data may be generated by adding an ECC bit into the downsampled digital samples.
  • a sideband data packet may be generated based on the downsampled digital samples. Bits of the sideband data packet may be interleaved and the first sideband data may be generated based on the interleaved sideband data packet.
  • the digital samples may be obtained by receiving a second frame that at least includes an analog signal for the raw sideband data and converting the analog signal into the digital samples.
  • the analog signal for the raw sideband data may be received during a second blanking period for the second frame.
  • the first frame may be transmitted to the further device via an interconnect compatible with an interface specification, the interface specification specifying a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
  • the first frame further includes first blanking data.
  • the first video data may be transmitted in the first number of horizontal lines, and the first sideband data may be transmitted in a third number of horizontal lines as a part of the non-video data.
  • the first blanking data may be transmitted in a fourth number of horizontal lines as a further part of the non-video data. The fourth number plus the third number may be equal to the second number.
  • the first sideband data may be generated in response to receiving from the further device an indication that the further device supports the first sideband data.
  • the raw sideband data includes teletext data or closed captioning data.
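  • a minimal end-to-end sketch of steps 1110 to 1140 follows, under two stated assumptions: the ECC is a single even-parity bit per byte (the text above says only that "an ECC bit" may be added), and frame assembly and transmission are reduced to returning a simple structure:

    def downsample(samples, threshold=0x80, ratio=4):
        """Step 1110: slice each multi-bit sample against a decision
        threshold, then keep every ratio-th bit (threshold and ratio are
        illustrative values)."""
        bits = [1 if s >= threshold else 0 for s in samples]
        return bits[::ratio]

    def add_parity(byte):
        """Assumed ECC: append one even-parity bit to an 8-bit value."""
        return (byte << 1) | (bin(byte).count("1") & 1)

    def method_1100(raw_samples, first_video_data):
        bits = downsample(raw_samples)                       # step 1110
        bits += [0] * (-len(bits) % 8)                       # pad to bytes
        data = [int("".join(map(str, bits[i:i + 8])), 2)
                for i in range(0, len(bits), 8)]
        first_sideband = [add_parity(b) for b in data]       # step 1120
        # Step 1130: the first frame carries both video and sideband data;
        # step 1140 would transmit both during the active data period.
        return {"video": first_video_data, "sideband": first_sideband}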
  • Fig. 12 illustrates a flowchart of a method 1200 for receiving a frame including sideband data in the active data period according to an embodiment of the present disclosure.
  • the method 1200 may be performed by a device which provides functionality of receiving and processing a frame in a format compatible with a video interface specification.
  • the device may be the video sink 160 as shown in Fig. 1, or may be the device 900 as shown in Fig. 9.
  • first video data and first sideband data included in a first frame for a video may be received from a further device during a first active data period for the first frame.
  • the first sideband data may be generated based on downsampled digital samples representing raw sideband data for the video.
  • the first active data period may be different from a first blanking period for the first frame.
  • second video data for display may be generated based on the first video data and the first sideband data.
  • the first sideband data may include deserialized digital samples generated by deserializing the downsampled digital samples.
  • the first sideband data may include the downsampled digital samples and an ECC bit.
  • ECC-protected digital samples may be generated by performing error checking on the downsampled digital samples based on the ECC bit.
  • the second video data may be generated based on the first video data and the ECC protected digital samples.
  • the first sideband data may include an interleaved sideband data packet generated based on the downsampled digital samples.
  • the interleaved sideband data packet may be de-interleaved.
  • the de-interleaved sideband data packet may be de-packetized to obtain the downsampled digital samples.
  • the second video data may be generated based on the first video data and the downsampled digital samples.
  • the first frame may be received from the further device via an interconnect compatible with an interface specification.
  • the interface specification may specify a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
  • the first frame may further include first blanking data.
  • the first video data may be received in the first number of horizontal lines, and the first sideband data may be received in a third number of horizontal lines as a part of the non-video data.
  • the first blanking data may be received in a fourth number of horizontal lines as a further part of the non-video data. The fourth number plus the third number may be equal to the second number.
  • an indication may be transmitted to the further device.
  • the indication may indicate that the device supports the first sideband data.
  • the raw sideband data includes teletext data or closed captioning data.
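  • a receive-side counterpart to the parity-bit assumption used in the transmit sketch above: checking the assumed even-parity ECC bit and recovering the downsampled digital samples (a real ECC unit might correct rather than merely detect errors):

    def check_parity(word):
        """Verify the assumed even-parity bit on a 9-bit word and return
        the recovered 8-bit sample; a mismatch marks a corrupted sample."""
        data, parity = word >> 1, word & 1
        if (bin(data).count("1") & 1) != parity:
            raise ValueError("parity mismatch: sample corrupted in transit")
        return data

    # Round trip with the transmit-side add_parity() sketched earlier:
    # check_parity(add_parity(0xA5)) == 0xA5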
  • the components of the devices 400 and 900 may each be implemented as a hardware module or a software unit module.
  • the system may be implemented partially or completely as software and/or in firmware, for example, implemented as a computer program product embodied in a computer readable medium.
  • the system may be implemented partially or completely based on hardware, for example, as an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on chip (SOC), a field programmable gate array (FPGA), and so forth.
  • various example embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof.
  • Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of the example embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it will be appreciated that the blocks, apparatus, systems, techniques or methods of the present disclosure may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • example embodiments of the present disclosure include a computer program product including a computer program tangibly embodied on a machine readable medium, the computer program containing program codes configured to carry out the methods as described above.
  • a machine readable medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine readable medium may be a machine readable signal medium or a machine readable storage medium.
  • a machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • more specific examples of the machine readable storage medium include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • Computer program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These computer program codes may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor of the computer or other programmable data processing apparatus, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on a computer, partly on the computer, as a stand-alone software package, partly on the computer and partly on a remote computer or entirely on the remote computer or server.
  • the program code may be distributed on specially-programmed devices which may be generally referred to herein as "modules".
  • modules may be written in any computer language and may be a portion of a monolithic code base, or may be developed in more discrete code portions, such as is typical in object-oriented computer languages.
  • the modules may be distributed across a plurality of computer platforms, servers, terminals, mobile devices and the like. A given module may even be implemented such that the described functions are performed by separate processors and/or computing hardware platforms.
  • the term “includes” and its variants are to be read as open-ended terms that mean “includes, but is not limited to.”
  • the term “or” is to be read as “and/or” unless the context clearly indicates otherwise.
  • the term “based on” is to be read as “based at least in part on.”
  • the terms “one embodiment” and “an embodiment” are to be read as “at least one embodiment.”
  • the term “another embodiment” is to be read as “at least one other embodiment”.
  • the terms “first,” “second,” and the like may refer to different or same objects. Other definitions, either explicit or implicit, may be included below.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Television Systems (AREA)

Abstract

Example embodiments of the present disclosure relate to transmitting sideband data with video data. In some example embodiments, a method is proposed. The method includes downsampling digital samples representing raw sideband data for a video. The method also includes generating first sideband data based on the downsampled digital samples, and generating a first frame for the video, the first frame at least including first video data for the video and the first sideband data. The method further includes transmitting to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.

Description

COMMUNICATION OF SIDEBAND DATA FOR VIDEOS
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority of U.S. Provisional Patent Application No. 62/168,169, filed on May 29, 2015, which is incorporated herein by reference in its entirety.
TECHNOLOGY
[0002] Embodiments of the present disclosure generally relate to the field of data communications, and more particularly, to communication of sideband data for videos.
BACKGROUND
[0003] In some video systems, data may be transmitted over a data interconnect or link between audio/video (A/V) devices. For example, a stream of data may be sent from a first A/V device to a second A/V device, where the second device may either utilize the data or retransmit such data to another device. In the data transfer operation, the transmitted data stream may be converted to a frame in a certain digital A/V format. The receiving device is required to interpret and handle the data stream in that format.
[0004] In a frame for video data there are certain time periods referred to as vertical and horizontal blanking periods. A blanking period refers to a period during which an electron beam for scanning the video data on a screen moves back to an initial position to scan the next line. Generally, the video data for display on the screen is not transmitted during the blanking periods. Instead, non-video data such as audio data, control data, and/or sideband data may be embedded in one or more of the blanking periods.
[0005] Some types of non-video data embedded in the frame (for example, the sideband data) are not supported when converting such a frame to digital A/V formats compatible with various interface specifications. When the sideband data is included in blanking periods of a frame, converting the frame into a format compatible with one of those digital A/V formats may result in the non-video data being stripped out. It is desired to retain the sideband data during the transmission of the data stream to the receiving device.
SUMMARY
[0006] Example embodiments of the present disclosure propose a mechanism for transmitting sideband data for videos.
[0007] In one aspect, example embodiments of the present disclosure provide a method. The method includes downsampling digital samples representing raw sideband data for a video. The method also includes generating first sideband data based on the downsampled digital samples, and generating a first frame for the video, the first frame at least including first video data for the video and the first sideband data. The method further includes transmitting to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.
[0008] In a second aspect, example embodiments of the present disclosure provide a method. The method includes receiving from a further device first video data and first sideband data included in a first frame for a video during a first active data period for the first frame, the first sideband data being generated based on downsampled digital samples representing raw sideband data for the video, and the first active data period being different from a first blanking period for the first frame. The method also includes generating second video data for display based on the first video data and the first sideband data.
[0009] In a third aspect, example embodiments of the present disclosure provide a device. The device includes a processor configured to downsample digital samples representing raw sideband data for a video, generate first sideband data based on the downsampled digital samples, and generate a first frame for the video, the first frame at least including first video data for the video and the first sideband data. The device also includes a transmitter configured to transmit to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.
[0010] In a fourth aspect, example embodiments of the present disclosure provide a device. The device includes a receiver configured to receive from a further device first video data and first sideband data included in a first frame for a video during a first active data period for the first frame, the first sideband data being generated based on downsampled digital samples representing raw sideband data for the video, and the first active data period being different from a first blanking period for the first frame. The device also includes a processor configured to generate second video data for display based on the first video data and the first sideband data.
[0011] Other advantages achieved by example embodiments of the present disclosure will become apparent through the following descriptions.
DESCRIPTION OF DRAWINGS
[0012] Through the following detailed description with reference to the accompanying drawings, the above and other objectives, features and advantages of example embodiments of the present disclosure will become more comprehensible. In the drawings, several example embodiments of the present disclosure will be illustrated in an example and non-limiting manner, wherein:
[0013] Fig. 1 is a block diagram of an example system for exchanging A/V information according to an embodiment of the present disclosure;
[0014] Fig. 2 is a schematic diagram showing frames to be processed according to an embodiment of the present disclosure;
[0015] Fig. 3 is a schematic diagram showing frames to be generated according to an embodiment of the present disclosure;
[0016] Fig. 4 is a block diagram of a device for generating a frame according to an embodiment of the present disclosure;
[0017] Fig. 5 is a block diagram of a sideband processor of the device of Fig. 4 according to an embodiment of the present disclosure;
[0018] Fig. 6 is a schematic diagram of converting an analog signal for the sideband data into the downsampled digital samples according to an embodiment of the present disclosure;
[0019] Figs. 7A-7C are schematic diagrams of packets generated by different operations of the device of Fig. 4 according to an embodiment of the present disclosure;
[0020] Fig. 8 is a schematic diagram of a portion of a frame according to an embodiment of the present disclosure;
[0021] Fig. 9 is a block diagram of a device for receiving a frame according to an embodiment of the present disclosure;
[0022] Fig. 10 is a block diagram of a sideband processor of the device of Fig. 9 according to an embodiment of the present disclosure;
[0023] Fig. 11 is a flowchart of a method for generating a frame according to an embodiment of the present disclosure; and
[0024] Fig. 12 is a flowchart of a method for receiving a frame according to an embodiment of the present disclosure.
[0025] Throughout the drawings, the same or corresponding reference symbols refer to the same or corresponding parts.
DESCRIPTION OF EXAMPLE EMBODIMENTS
[0026] Principles of example embodiments of the present disclosure will now be described with reference to various example embodiments illustrated in the drawings. It should be appreciated that depiction of those embodiments is only to enable those skilled in the art to better understand and further implement example embodiments of the present disclosure and is not intended for limiting the scope of the present disclosure in any manner.
[0027] As used herein, the term "audio/video" or "A/V" refers to one or more characteristics relating to audio data or video data, or relating to both audio data and video data. As used herein, the term "A/V sink," "sink device," or simply "sink" refers to a device receiving audio/video data from some other devices. Correspondingly, the term "A/V source," "source device," or simply "source" refers to a device providing A/V information to some other (sink) devices. For example, A/V information may include some or all of audio data and/or control information and video data and/or control information. The term "A/V device" refers to either an A/V source or an A/V sink, or both an A/V source and an A/V sink. In an embodiment, an A/V device may, in addition to exchanging A/V information with another device, be operable to render audio data and/or video data for a user. Various embodiments are discussed herein with respect to video data of frames. Such frames may also include audio data, for example, although certain embodiments are not limited in this regard.
[0028] In some embodiments, an A/V device may be operable to exchange A/V information according to some interface standard in one or more respects. By way of illustration and not limitation, the A/V device may exchange A/V information via a data link, connector, and/or interconnect which is compatible with a video interface specification. Examples of the video interface specification may include, but are not limited to, a High-Definition Multimedia Interface (HDMI) specification, a Mobile High- Definition Link (MHL) specification, a Digital Visual Interface (DVI) specification, and a DisplayPort specification. The video interface specifications can transfer non-compressed video data, but certain embodiments are not limited to processing communications which are exchanged according to a non-compressed video interface specification. For example, such embodiments may be applied to communications which are exchanged according to other video interface specifications that include frame formats having characteristics as those discussed herein.
[0029] In an embodiment, the A/V device may implement communications which, at different times, may be compatible with different interface standards via the same connector and/or interconnect. For example, the A/V device may include first communication logic to detect the presence of and communicate with a HDMI interconnector and second communication logic to detect the presence of and communicate with a MHL device. The respective detection and communication functionalities of the first and second communication logics may not conflict with or otherwise impede the other. Various embodiments are discussed herein in the context of exchanging A/V information according to an HDMI interface specification. However, the discussion may be extended to apply to any of a variety of additional or alternative interface specifications for exchanging A/V information in some other embodiments.
[0030] As used herein, the term "frame" refers to a logical grouping of data where a beginning and an end of the frame may be each indicated by or otherwise associated with a respective control signal, a time interval, and/or the like. In an embodiment, a frame includes video data and non-video data for transmission. The non-video data may be used to facilitate processing or displaying of such video data. Examples of such non-video data may include, but are not limited to, audio data, control data, auxiliary information, and sideband data. The term "sideband data" refers to various types of text and/or image data that will be displayed separately or in combination with the video data. Examples of sideband data may include, but are not limited to, teletext data, closed captioning data, Macrovision data, and/or other non-video data for display.
[0031] In another embodiment, depending on the periods for transmission, data of a frame may also be divided as "blanking data" and "active data." The blanking data refers to data that is communicated during one or more blanking intervals (or "blanking periods") for the frame. The blanking periods may include, for example, one or more horizontal blanking periods or a vertical blanking period. The active data is distinguished from the blanking data and communicated during an active data period (or "active data interval") for the frame. The length or duration of a horizontal or vertical blanking period or an active data period may vary from system to system depending upon the type of interface specifications used and the number of pixels per line (i.e. the size or dimensions of the display at the video sink side).
[0032] The term "line" refers to a logic grouping of data included in a frame where one or more control signals distinguish one set of data as belonging to a first line and another set of data as belonging to a second line. With reference to lines of data, the terms "horizontal" and "vertical" are used according to conventional use to distinguish different types of logical grouping of data in a frame in the horizontal and vertical directions.
[0033] In order to transmit the sideband data even when a data stream carrying the sideband data is converted into a digital A/V format, traditionally the sideband data is processed in advance by a video source. The video source is required to process and blend the sideband data with corresponding video data. The video source then generates a frame including the blended video data and other non-video data in a digital A/V format. The blended video data in the resulting frame is sent to a video sink during an active data period for the frame while other non-video data in the frame is sent during the blanking periods. Consequently, the video sink such as a digital-only television receives the sideband data in an active data period of the frame instead of in a blanking period. However, the processing and blending of sideband data requires additional complexity and resource load for the video source.
[0034] Fig. 1 illustrates an example system 100 for exchanging A/V information according to an embodiment of the present disclosure. The system 100 may include a video source 110, a video sink 160, and a converter device 130. The converter device 130 may be used to facilitate A/V communications between the video source 110 and the video sink 160.
[0035] In an embodiment, the video source 110 and the converter device 130 may be respective components of a single larger device (not shown) of the system 100. Another embodiment may be implemented entirely by a single device of which the video source 110 and the converter device 130 are each a component. One embodiment may be implemented entirely by the video source 110 or the converter device 130, for example. Still another embodiment may be implemented by the system 100 as a whole. Any of a variety of other embodiments may be alternatively implemented according to techniques discussed herein.
[0036] In an embodiment, the video source 110 may include functionality of one or more A/V source devices. By way of illustration and not limitation, the video source 110 may include functionality including, but not limited to, that of a personal computer (e.g. tablet, notebook, laptop, desktop and/or the like), camcorder, smart phone, video game console, television, monitor, display, set-top box, home theater receiver, and/or the like. In an embodiment, the video source 110 may include a component (for example, a hard disk drive, a solid state drive, a bus, an input port and/or the like) of such an A/V source device. Alternatively, or in addition, the video sink 160 may include functionality of one or more conventional A/V sink devices including, but not limited to, a television, monitor, display and/or the like. In an embodiment, the video source 110 is further capable of providing functionality of one or more A/V sink devices and/or the video sink 160 is further capable of providing functionality of one or more A/V source devices.
[0037] The video source 110 may send to the converter device 130 a communication 120 including a first frame. In an embodiment, the first frame may be generated according to a frame format which supports one or more types of sideband data. In some embodiments, the first frame in the communication 120 may be generated according to an analog television format, a standard-definition A/V format such as 480i or 576i, or even a high definition (HD) format which supports sideband communication in a blanking period or supports the communication of the sideband data with the video data in an active data period according to some embodiments of the present disclosure.
[0038] In an embodiment, a blanking period of the first frame includes first sideband data and an active data period of the first frame includes first video data. Control data included in the communication 120 may be used to distinguish a blanking period of the first frame from one or more periods for communicating the active data. In some examples, a blanking period and an active data period may be distinguished from one another by a vertical synchronization (VSYNC) control signal, a horizontal synchronization (HSYNC) control signal, and/or one or more data enable (DE) signals including a horizontal DE signal, a vertical DE signal, and/or a logical combination thereof.
[0039] The converter device 130 may generate a communication 150 which includes a second frame generated based at least in part on the first frame of the communication 120. In an embodiment, the converter device 130 may be a hardware interconnect device directly coupled to one or both of the video source 110 and the video sink 160. In such embodiment, the converter device 130 may include a cable and a connector housing (not shown) at one end of such cable, the connector housing including logic to provide frame conversion functionality disclosed herein.
[0040] In an embodiment, at least part of the communication 150 is transmitted to the video sink 160 via a hardware interconnect which is compatible with a video interface specification for communicating video data. Examples of the video interface specification may include, but are not limited to, a HDMI specification, a MHL specification, a DVI specification, and a DisplayPort specification. The hardware interconnect may include, for example, a multimedia communication link (HDMI cable, MHL cable) with several video channels and one or more control channels. By way of illustration and not limitation, the hardware interconnect may be compatible with the physical layer requirements of the video interface specification including, but not limited to, connector, cable and/or other hardware requirements identified in one or more of HDMI, MHL or other such specifications.
[0041] The video interface specification may allow only non-compressed video data, or in other embodiments may allow compressed video data. The video interface specification may identify (specify or reference) a frame format for the second frame which, for example, requires a respective number of bits-per-pixel, pixels-per-line, lines-per-frame, a respective number of periods for a horizontal blanking interval, a vertical blanking interval, or the active data period of the second frame.
[0042] The second frame of the communication 150 may include a frame format identified in the video interface specification. As mentioned above, the second frame may be generated by the converter device 130 based at least in part on the first frame of the communication 120. For example, second video data of the second frame may be generated based on the video data of the first frame, and second sideband data of the second frame may be generated based on the first sideband data of the first frame.
[0043] In some embodiments, the first frame of the communication 120 may be in an analog A/V format while the second frame of the communication 150 may be in a digital A/V format according to the used video interface specification. In some cases, the first sideband data of the first frame may be transmitted during a blanking period for the first frame such as the vertical blanking period. In order to transfer the sideband data to the video sink 160, the converter device 130 may generate the second sideband data based on the first sideband data and transmit the second sideband data with the second video data during an active data period instead of a blanking period. In these cases, the video source 110 is able to generate and transmit the first frame in a conventional way without increasing any complexity while the converter device 130 can still retransmit the sideband data to the video sink 160.
[0044] For example, the format of the first frame may include a total of X horizontal lines of non-video data in a vertical blanking period, where X is an integer. Alternatively, or in addition, such a frame format may include an active data period which comprises a total of Y horizontal lines of video data, where Y is another integer. Audio data, if present, may be distributed across X lines of vertical blanking interval and horizontal blanking intervals of Y lines of active data.
[0045] To communicate the second sideband data with the second video data, the active data of the second frame generated by the converter device 130 may include a total of Y horizontal lines of the second video data and S horizontal lines of the second sideband data, where S is an integer. That is, the active data is extended from Y horizontal lines to (Y+S) horizontal lines in the second frame. The vertical blanking interval of the second frame may then be reduced to (X-S) horizontal lines for other blanking data than the second sideband data. As a result, the second frame, after being modified, still includes a total of Y horizontal lines of video data and a total of X horizontal lines of non-video data. It would be appreciated that the format of the second frame may be arranged in any other structures as long as the second sideband data and video data can be transmitted in the active data period.
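Concretely, and only as a hedged illustration (S is a hypothetical count, while X and Y echo the 576i-like numbers mentioned below), the line budget works out as follows:

    X, Y, S = 25, 288, 5       # X, Y per the 576i example below; S hypothetical
    active_lines = Y + S       # 293 lines transmitted in the active data period
    vblank_lines = X - S       # 20 lines left for other vertical blanking data
    assert active_lines + vblank_lines == X + Y   # total line count unchanged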
[0046] In embodiments discussed above, the second frame in a digital A/V format compatible with the video interface specification is converted by the converter device 130 based on the first frame. In some other embodiments, the video source 110 may generate the first frame according to the digital A/V format compatible with requirements of the video interface specification. The first frame may include an active data period with a total of Y horizontal lines of video data and S horizontal lines of sideband data. Blanking data other than the sideband data may be included in (X-S) horizontal lines and transmitted in the vertical blanking interval. In these cases, the converter device 130 may directly forward the first frame received in the communication 120 from the video source 110 to the video sink 160. Alternatively, the converter device 130 may generate the second frame according to requirements of a different video interface specification.
[0047] Fig. 2 illustrates a schematic diagram showing frames 200, one or more of which are to be processed according to an embodiment of the present disclosure. Frames 200 may be sent in the communication 120 and processed by the converter device 130, for example. In an embodiment, one of the frames 200 is processed to generate a second frame (not shown), where an active data period of the second frame carries both video data and sideband data.
[0048] The frames 200 may be formatted according to an A/V specification which supports communication of sideband data in a blanking period and video data in an active data period which is distinct from that vertical blanking period. For example, frames 200 may be arranged according to a format compatible with 576i, 480i, or the like. In an embodiment, transmission of a frame according to such an A/V specification includes sequentially sending horizontal lines of data.
[0049] In the example represented by the frames 200, a frame 202 includes a vertical blanking interval 220 and horizontal blanking intervals 250 and 252 for blanking data. The frame 202 further includes video data in the active data period 230. Similarly, the frame 204 may include a vertical blanking (VB) interval 225 and horizontal blanking (HB) intervals 254 and 256 for blanking data, and an active data period 235 for video data.
[0050] The video data 230 of the frame 202 may span Y horizontal lines where portions of such horizontal lines include horizontal blanking data outside of the video data 230. The VB interval 220 may include X horizontal lines of blanking data including sideband data. For example, in a case where the frames 200 are in a format of 576i, the video data 230 may span 288 horizontal lines where portions of such 288 horizontal lines include horizontal blanking data outside of the video data 230. The VB interval 220 may include 24 or 25 horizontal lines. However, according to different embodiments, additional or fewer horizontal lines may be used for communicating either or each of the video data 230 and blanking data in the VB interval 220. The numbers of the lines for the video data and blanking data in the frame 204 may be the same or different from those in the frame 202.
[0051] In an embodiment, the VB intervals 220, 225 include respective sideband (SB) data 240, 245. For example, either or each of sideband data 240 and 245 may include respective sideband data (for example, teletext or closed caption). Typically, such sideband data, if available, is included in horizontal lines of a vertical blanking period. The frames 200 including the sideband data 240 and 245 are typically transmitted in an analog form. When the sideband data 240, 245 is teletext data, there may be up to 17 lines of teletext data in a single frame 202 or 204.
[0052] In some embodiments, the types of data in the vertical and horizontal lines (and/or of portions of such lines) may be indicated by one or more control signals transmitted in or with the frame. Blanking data transmitted in a blanking period and video data transmitted in an active data period may be distinguished from one another by a horizontal synchronization (HSYNC) control signal 212, a vertical synchronization (VSYNC) control signal 214, and/or one or more data enable (DE) signals including a horizontal DE (HDE) signal 210, a vertical DE signal (VDE) 216, and/or a logical combination thereof.
[0053] In an embodiment, the VDE 216 may indicate an active data period which is used to identify the video data 230 as being active data of the frame 202. Each of the respective polarities (for example, active low or active high) of the HDE 210, VDE 216, HSYNC 212, and VSYNC 214 is merely illustrative and may not be limiting on certain embodiments. The VDE 216 may enable the vertical blanking data in the intervals 220 and 225, each having a respective span according to the specified frame format of an interface specification. Control signals HDE 210, HSYNC 212, and VSYNC 214 may otherwise support communication of the frames 202 and 204 according to the frame format.
[0054] Fig. 3 illustrates a schematic diagram showing frames 300, one or more of which are to be generated by a converter device or a video source according to an embodiment. Frames 300 may be sent in the communication 120 or the communication 150, for example. In an embodiment, one or more of the frames 300 may be generated by the source device 110 based on the original sideband data, video data, and other data to be transmitted to the video sink 160. In another embodiment, one or more of the frames 300 may be generated by the converter device 130 based on the first frame received from the source device 110.
[0055] In the example represented by the frames 300, a frame 302 includes a vertical blanking (VB) interval 320 which occupies a total of only (X-S) consecutive horizontal lines of blanking data. The blanking data in the interval 320 may be transmitted during a vertical blanking period for the frame 302. The frame 302 may further include (Y+S) consecutive horizontal lines of active data. The active data includes S consecutive horizontal lines of sideband data (SB) 340 and Y consecutive horizontal lines of video data 330. In some embodiments, the S horizontal lines for the sideband data 340 and the Y horizontal lines for the video data 330 may be arranged in a different way, for example, may be transmitted in a different order, interleaved with one another, and/or the like.
[0056] The frame 302 may also include horizontal blanking (HB) intervals 350 and 352 which are located in the remaining portions of the frame 302 other than the active data interval and the vertical blanking interval. Some non-video data may be included in the blanking intervals 320, 350, and 352. In some embodiments, the packetized non-video data such as audio data and/or the unpacketized non-video data such as control signals may be included in the blanking intervals.
[0057] The structure of the frame 304 may be similar to that of the frame 302. For example, the frame 304 may include a vertical blanking interval 325, horizontal blanking intervals 354 and 356, sideband (SB) data 345, and video data 335. In some examples, the respective numbers of lines (horizontal and/or vertical lines) of different areas in the frame 304 may be the same or different from those of the frame 302.
[0058] Similar to the frames 200, the types of data in the vertical and horizontal lines (and/or of portions of such lines) may be indicated by one or more control signals transmitted in or with the frame. Blanking data transmitted in a blanking period and active data transmitted in an active data period may be distinguished from one another by a HSYNC control signal 312, a VSYNC control signal 314, and/or one or more DE signals including a HDE signal 310, a VDE 316, and/or a logical combination thereof.
[0059] In an embodiment, the VDE 316 may indicate an active data period which is used to identify both the sideband data 340 and the video data 330 as being active data of the frame 302. Each of the respective polarities (for example, active low or active high) of HDE 310, VDE 316, HSYNC 312, and VSYNC 314 is merely illustrative and may not be limiting on certain embodiments. The VDE 316 may enable the vertical blanking data in the intervals 320 and 325, each having a respective span that deviates from the specified frame format of an interface specification. Control signals HDE 310, HSYNC 312, and VSYNC 314 may otherwise support communication of the frames 302 and 304 according to the frame format.
[0060] As mentioned above, it is desired to transmit the sideband data with the video data during the active data period as illustrated by the frames 300 in Fig. 3. Generally, the sideband data used to generate the frames 300 is in an analog format. If the raw sideband data is simply converted into digital samples and then included in the frames 300, more horizontal lines will be used to carry the sideband data 340, 345; that is, S will be large. Since the number of horizontal lines for blanking data, X, is specified by the video interface specification, fewer horizontal lines will be left for other blanking data such as audio data or control data.
[0061] In accordance with embodiments of the present disclosure, instead of carrying the digital samples obtained after analog-to-digital conversion of the raw sideband data, the digital samples are downsampled so as to reduce the size of the digital data carrying the sideband data. The downsampled digital samples may be used to generate the sideband data 340, 345 included in the frames 300 to be transmitted to the video sink 160. The processing of the raw sideband data will be discussed in more detail below.
[0062] Fig. 4 illustrates a block diagram of a device 400 for generating a frame including sideband data in the active data period according to an embodiment of the present disclosure. The device 400 may have functionality of generating a frame in a format compatible with a video interface specification. The device 400 may represent the converter device 130 or the video source 110 as shown in Fig. 1. As shown, the device 400 may include an analog to digital converter (ADC) 410, a frame splitter 415, a sideband processor 420, a video processor 422, a frame generator 424, and a transmitter (TX) 440.
[0063] In some embodiments, an input 405 of the device 400 may be in an analog format and may be converted by the ADC 410 into digital samples 412. In some embodiments, the input 405 may include the first frame in the communication 120. For example, the input 405 may include a frame having sideband data such as teletext data or closed captioning data and other non-video data in a blanking period (for example, a vertical blanking period) for this frame. The frame may further include video data in the active data period which is distinct from the blanking period. By way of illustration and not limitation, the blanking period, sideband data, and video data may include the VB interval 220, sideband data 240, and video data 230, respectively. Alternatively, the input 405 may include sideband data and video data generated or otherwise obtained by the device 400 (for example, the video source 110 in Fig. 1). The sideband data and video data in the analog format without being processed may be referred to as "raw" sideband data and "raw" video data, respectively.
[0064] In some embodiments, depending on the ADC 410, the input 405 may be converted into multi-bit digital samples 412. For example, the digital samples may be eight-bit samples. In some embodiments, the sampling rate of the ADC 410 may be the same as the rate of a video clock for the video data. In some cases, the sampling rate of the ADC 410 may be the same for all data within the input 405. In some embodiments, the device 400 may directly obtain the input 405 in a digital A/V format, for example, the digital samples representing the raw sideband data, video data, and/or other blanking data. In these cases, the ADC 410 may be optional to the device 400.
[0065] In embodiments where the input 405 is received from a device in a frame format such as the frame 202 or 204, the frame splitter 415 may identify one or more portions of the digital samples 412 as being sideband data and identify other portions of the input 405 as being video data. For example, the frame splitter 415 may identify that a format of the frame includes a particular time interval dedicated to communication of the sideband data. The frame splitter 415 may identify the sideband data from the digital samples 412 of the input 405 based on such a time interval. The frame splitter 415 may then perform de-multiplexing on the digital samples 412 to separate out the digital samples 430 representing the raw sideband data and output them to the sideband processor 420. The frame splitter 415 may further separate the video data samples 432 from the digital samples 412 and output them to the video processor 422. In the cases where the input 405 is not in a frame format but is directly generated or otherwise obtained by the video source, the frame splitter 415 may be optional to the device 400.
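A hedged sketch of this de-multiplexing follows; it assumes each digital sample arrives tagged with its line number and that the sideband interval is a known range of lines (the range used here is hypothetical):

    SIDEBAND_LINES = range(7, 23)   # hypothetical sideband interval

    def split_frame(samples):
        """De-multiplex digital samples 412 into sideband data samples 430
        and video data samples 432 by the line each sample belongs to."""
        sideband, video = [], []
        for line_number, value in samples:
            (sideband if line_number in SIDEBAND_LINES else video).append(value)
        return sideband, video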
[0066] In some embodiments, the sideband processor 420 may perform one or more operations on the sideband data samples 430 to generate sideband data 434 to be included in the resulting frame. The operations may include downsampling the sideband data samples 430 to reduce the size of the samples. The downsampling may be performed by comparing the respective multi-bit sideband data samples 430 with a decision threshold to reduce the number of bits used to represent the respective samples. Alternatively, or in addition, the downsampling may be performed by simply reducing the sampling rate by selecting bits of the sideband data samples with a predetermined ratio. In some other embodiments, the operations may further include de-serializing the downsampled sideband data samples, packetizing the de-serialized sideband data, adding error correcting code (ECC) to the sideband data packets, and interleaving the sideband data packets. The resulting sideband data 434 may be inserted into an active data period for output 450. The operations of the sideband processor 420 will be discussed in more detail below with reference to Fig. 5.
[0067] The sideband data 434 is thus a processed version of the sideband data samples 430 that consumes less bandwidth in a frame than the original sideband data. The sideband data 434 may also be transmitted more reliably due to the ECC bits and the interleaving operation. The complexity and cost of the device receiving the frame (for example, the sink device 160) may also be reduced, since that device can directly recover information from the sideband data in the frame. By contrast, if the digital samples representing the raw sideband data were transmitted, the sink device would need additional components to recover the net information from the raw digital samples.
[0068] In some embodiments, the video processor 422 may perform one or more operations to generate video data 436 based on the video data samples 432. By way of illustration and not limitation, the video processor 422 may perform operations to convert, synchronize, order, condition, or otherwise generate video data 436 including one or more characteristics of a frame format according to a video interface specification. Generation of the video data 436 may include one or more operations according to conventional techniques for converting video data into a format compatible with a video interface specification. In some other embodiments, the device 400 may include one or more additional processors (not shown) to perform one or more operations according to conventional techniques for converting blanking data such as audio data and/or control data into a format compatible with a video interface specification.
[0069] The video data 436, sideband data 434, and other blanking data (if available) may be provided to the frame generator 424. In some embodiments, the frame generator 424 may generate a frame which includes both the video data 436 and the sideband data 434 in the active data period. In an embodiment, the frame generator 424 may perform one or more multiplexing operations to interleave portions of the sideband data 434 with portions of the frame data 436. In other embodiments, the frame generator 424 may also add blanking data such as audio data and/or control data to the frame. The frame generator 424 may provide the resulting frame to the transmitter (TX) 440. In some embodiments, the transmitter 440 may transmit an output 450 including the resulting frame to a receiving device such as the video sink 160. The output 450 may be transmitted across a multimedia communication link (for example, HDMI, MHL, or the like). The resulting frame may be compatible with physical requirements of the used video interface specification and may be formed in a format as shown in Fig. 3.
[0070] In some embodiments, the frame generator 424 may receive a support indication 460 from the receiving device, for example, the video sink 160. The indication 460 indicates whether the video sink 160 supports the sideband data. The frame generator 424 may only generate a frame to include the sideband data 434 if the indication 460 indicates that the video sink 160 supports it. Otherwise, the frame generator 424 may exclude the sideband data from the frame. In addition, the indication 460 may also indicate whether the video sink 160 supports transmission of the sideband data within the active data period of the frame. In one example, the support indication may be an extension flag in a vendor specific data block (VSDB) of extended display identification data (EDID). In one embodiment, upon request, EDID information may be provided to the device 400 in a VSDB from the sink device so that the device 400 can determine whether and/or how a frame may be sent across an interconnect.
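By way of a heavily hedged illustration only: if the extension flag were, hypothetically, bit 0 of the first payload byte of the vendor specific data block, the check could look like the following (real VSDB layouts are vendor-defined, and the byte offset here is invented for the sketch):

    def sink_supports_sideband(vsdb_payload: bytes) -> bool:
        # Hypothetical flag position; not defined by any cited specification.
        return bool(vsdb_payload[0] & 0x01)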
[0071] It would be appreciated that the frame splitter 415, the sideband processor 420, the video processor 422, and the frame generator 424 may be implemented by a single processor 490, a set of processors, one or more microprocessors, controllers, central processing units (CPUs), and/or the like. It would also be appreciated that the components of the device 400 are given for the purpose of illustration. In some use cases, as mentioned above, some components may be omitted from the device 400. In some other embodiments, additional components may be included.
[0072] Fig. 5 shows a block diagram of a sideband processor 420 according to an embodiment of the present disclosure. As shown, the sideband processor 420 may include a downsampler 510, a deserializer 515, a packetizer 520, an error correcting code (ECC) unit 530, and a packet interleaver 540. The sideband processor 420, as discussed above, may process the multi-bit digital samples 430 representing the raw sideband data to reduce the size of the sideband data to be included in the frame and potentially increase the reliability of the transmission of the sideband data.
[0073] In some embodiments, the downsampler 510 may receive the multi-bit digital samples 430 representing the raw sideband data and downsample the digital samples 430 into downsampled digital samples 512. The downsampled digital samples 512 may represent net sideband data recovered from the data samples 430. In one embodiment, the downsampler 510 may compare the respective digital samples 430 to a decision threshold. In the example where the downsampler 510 is a 1-bit slicer, a multi-bit digital sample 430 greater than the threshold is converted into a binary value of "1," and a multi-bit digital sample 430 lower than the threshold is converted into a binary value of "0." The downsampler 510 may also compare a multi-bit digital sample with more than one decision threshold to convert the digital sample into a digital sample represented with fewer bits.
[0074] Alternatively, or in addition, the downsampler 510 may select bits of data from the digital samples 430, or from the digital samples obtained after the comparing, based on a predetermined rate. Generally, the sampling rate of the ADC 410 may be appropriate for the video data but higher than necessary for the sideband data, thereby producing more digital samples 412 (and thus more digital samples 430) than needed. Therefore, the downsampler 510 may eliminate 1/2 or 3/4 of the digital samples. In one embodiment, the down-sampling rate may be determined by a ratio between a data rate of the video data and a data rate of the sideband data in the incoming frame of the device 400.
[0075] In some embodiments, after the down-sampling, the deserializer 515 may deserialize the downsampled digital samples such that the deserialized digital samples 514 are output in 8-bit (1-byte) chunks. The deserializing may further save bandwidth since fewer control signals (clock signals) may be required to be transmitted with the frame for reception of the digital samples.
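By way of illustration only, the down-sampling and deserializing described above may be sketched in Python as follows. The helper names (slice_samples, decimate, pack_bytes), the mid-scale decision threshold, and the MSB-first bit order are assumptions made for this sketch; only the 1-bit slicing, the elimination of a fixed fraction of the samples (a 4:1 ratio, as in Fig. 6), and the 8-bit chunking come from the description above.

```python
THRESHOLD = 0x80  # assumed mid-scale decision threshold for 8-bit ADC samples
RATIO = 4         # assumed down-sampling ratio (4:1, as in the example of Fig. 6)

def slice_samples(samples):
    """1-bit slicing: map each multi-bit sample to 1 or 0 by threshold comparison."""
    return [1 if s > THRESHOLD else 0 for s in samples]

def decimate(bits, ratio=RATIO):
    """Keep one bit out of every `ratio` bits (e.g. eliminate 3/4 of the samples)."""
    return bits[::ratio]

def pack_bytes(bits):
    """Deserialize the bit stream into 8-bit (1-byte) chunks, MSB first."""
    out = []
    for i in range(0, len(bits) - len(bits) % 8, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return out

# Hypothetical ADC output: each value is one multi-bit digital sample 430.
samples = [0x10, 0x15, 0xE0, 0xF2, 0x11, 0x0C, 0xD8, 0xEE]
sliced = slice_samples(samples)  # -> [0, 0, 1, 1, 0, 0, 1, 1]
```

The final pack_bytes step would be applied once enough down-sampled bits have accumulated to fill whole bytes.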
[0076] Fig. 6 shows a schematic diagram of converting an analog signal for the sideband data into the downsampled digital samples according to an embodiment. The analog waveform 602 represents a part of the sideband data received by the ADC 410. The ADC 410 generates eight multi-bit digital samples from the analog waveform 602. Each of the eight multi-bit samples has an 8-bit value shown using hexadecimal encoding. The downsampler 510 compares each of the eight sample values against a decision threshold 620 to convert 612 the multi-bit digital samples into digital samples 625. It can be seen that the number of bits of the digital samples 625 is much lower than that of the multi-bit digital samples, which reduces the amount of sideband data to be transmitted. The digital samples 625 may then be further down-sampled with a down-sampling ratio of 4:1 to produce down-sampled digital samples 630, which may further reduce the amount of sideband data to be transmitted. In some embodiments, eight-bit chunks of the digital samples 630 may then be output as the deserialized digital samples 514 for the sideband data.
[0077] Referring back to Fig. 5, in some embodiments, the packetizer 520 may generate one or more sideband data packets 522 from the deserialized digital samples 514. The sideband data packets 522 may be transmitted during the active data period for the frame in some embodiments. An example structure of a packet 522 is shown in Fig. 7A. As shown, the packet 522 includes a header 710 and a payload 720. The header 710 may have a length of 4 bytes and the payload 720 a length of M bytes in the example of Fig. 7A. It would be appreciated that the header 710 and the payload 720 may be designed to contain any length of data. In some embodiments where the device 400 converts the first frame received from the video source 110 in the analog format into the second frame to be transmitted in the digital A/V format, each line of the sideband data in the first frame may be converted into one sideband data packet.
[0078] In some embodiments, the first byte 712 of the header 710, LineNumLow, may indicate the low byte of the line location of the payload 720, with reference to the leading edge of the VSYNC control signal. The second byte 714 of the header 710, LineNumHigh, may indicate the high byte of the line location of the payload 720. The third byte 716 of the header 710, LengthLow, may indicate the low byte of the length of the payload 720, and the fourth byte 718 of the header 710, LengthHigh, may indicate the high byte of the length of the payload 720. The header 710 may thus be used to indicate the line in which the payload 720 will be located (supposing each line of the original sideband data is converted into one packet) and the length of the payload 720. For example, if Line 10 of the converted frame carries 40 bytes, the 4-byte header 710 may be 0x0A, 0x00, 0x28, 0x00.
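By way of illustration only, the header layout above may be sketched in Python as follows. The helper name make_header is hypothetical, but the byte order and the Line 10 / 40-byte example follow the text directly.

```python
def make_header(line_num, payload_len):
    """Build the 4-byte packet header of Fig. 7A:
    LineNumLow, LineNumHigh, LengthLow, LengthHigh (low bytes first)."""
    return bytes([
        line_num & 0xFF,            # LineNumLow  (byte 712)
        (line_num >> 8) & 0xFF,     # LineNumHigh (byte 714)
        payload_len & 0xFF,         # LengthLow   (byte 716)
        (payload_len >> 8) & 0xFF,  # LengthHigh  (byte 718)
    ])

# The example from the text: Line 10 of the converted frame carrying 40 bytes.
assert make_header(10, 40) == bytes([0x0A, 0x00, 0x28, 0x00])
```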
[0079] In some embodiments, the first byte 712 LineNumLow and the second byte 714 LineNumHigh may not both be 0, and the third byte 716 and the fourth byte 718 may encode a length in a range of 1 to 0xFFFF. It would be appreciated that the bytes in the header 710 are only shown and discussed for the purpose of illustration, and more or fewer bytes may be included in the header 710.
[0080] Referring back to Fig. 5, the ECC unit 530 may add one or more ECC bits to the data packets 522, thereby generating ECC protected data packets 532. The ECC unit 530 may generate ECC bits for the packets using BCH (Bose, Ray-Chaudhuri, Hocquenghem) coding, for example. An ECC encoded sideband data packet 532 is shown in Fig. 7B. In Fig. 7B, a sideband data packet 522 is divided into three-byte chunks. Starting from the first byte 712 of the header 710, every three bytes are protected by a byte of ECC bits (parity bits, for example). For example, for the three bytes 712 to 716 of the header 710, an ECC byte 722 may be added. Other ECC bytes may be added for the remaining bytes of the packet 522 to obtain an ECC protected data packet 532.
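By way of illustration only, the three-bytes-plus-parity framing of Fig. 7B may be sketched in Python as follows. A full BCH encoder is beyond the scope of this sketch, so xor_parity below is a deliberately simplified stand-in for the BCH parity byte named above, not the actual code.

```python
def xor_parity(chunk):
    """Placeholder ECC byte: XOR of the data bytes (a stand-in for BCH parity)."""
    p = 0
    for b in chunk:
        p ^= b
    return p

def add_ecc(packet, ecc_byte=xor_parity):
    """Append one ECC byte after every three bytes of the packet (Fig. 7B layout)."""
    out = bytearray()
    for i in range(0, len(packet), 3):
        chunk = packet[i:i + 3]
        out += chunk
        out.append(ecc_byte(chunk))
    return bytes(out)

# e.g. a packet starting with the 4-byte header from the earlier example:
protected = add_ecc(bytes([0x0A, 0x00, 0x28, 0x00]))
# protected == bytes([0x0A, 0x00, 0x28, 0x22, 0x00, 0x00])
```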
[0081] In some embodiments, the ECC protected data packets 532 may be interleaved by the packet interleaver 540 of the sideband processor 420 to generate interleaved sideband data packets 542 which can be used as the sideband data 434 in Fig. 4. The interleaving is shown in Fig. 7C, where bits of one byte of the packets are interleaved with bits of another byte of the packets. An un-interleaved ECC packet 532 may be divided into pairs of bytes. For example, the first two bytes 712 and 714 of the packet 532 may form a pair of bytes to be interleaved. The bits of the byte 712 may be interleaved with the bits of the byte 714 to form a pair of interleaved bytes 732 and 734 of the interleaved packet 542.
[0082] An example interleaving pattern is shown in Fig. 7C, where odd bits of the byte 712 are retained while even bits of the byte 712 are replaced with even bits of the byte 714 to form the byte 732. Odd bits of the byte 714 are retained while even bits of the byte 714 are replaced with even bits of the byte 712 to form the byte 734. Bits of the bytes 716 and 722 of the packet 532 may also be interleaved to form interleaved bytes 736 and 738 of the interleaved packet 542. The remaining bytes of the packet 532 may be interleaved in a similar way to obtain the interleaved packet 542.
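By way of illustration only, this pairwise bit swap may be sketched in Python as follows. Whether "odd" refers to bit positions 1, 3, 5, 7 or 0, 2, 4, 6 is not fixed by the text, so the masks below are one possible assignment.

```python
ODD_MASK = 0b10101010   # assumed "odd" bit positions, kept by each byte
EVEN_MASK = 0b01010101  # assumed "even" bit positions, swapped between the pair

def interleave_pair(a, b):
    """Each byte keeps its own odd bits; its even bits are swapped with its partner's."""
    a2 = (a & ODD_MASK) | (b & EVEN_MASK)
    b2 = (b & ODD_MASK) | (a & EVEN_MASK)
    return a2, b2

def interleave_packet(pkt):
    """Interleave an ECC protected packet two bytes at a time; a trailing
    unpaired byte, if any, is passed through unchanged."""
    out = bytearray()
    for i in range(0, len(pkt) - 1, 2):
        a2, b2 = interleave_pair(pkt[i], pkt[i + 1])
        out += bytes([a2, b2])
    if len(pkt) % 2:
        out.append(pkt[-1])
    return bytes(out)
```

Notably, swapping the even bits of a pair twice restores the original bytes, so the same routine can serve as the de-interleaver 1010 at the video sink (Fig. 10).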
[0083] It would be appreciated that the interleaving shown in Fig. 7C is merely for the purpose of illustration and any other interleaving pattern may be applied. For example, bits in every three or more bytes of the packet 532 may be interleaved, and those bits may be interleaved in other manners. The interleaving of the packet may maximize the error correction capability of the ECC bits against errors such as burst errors occurring on the data interconnect. In many cases, the bit errors may be corrected after de-interleaving at the video sink side.
[0084] It would also be appreciated that some or all of the deserializer 515, packetizer 520, ECC unit 530, and interleaver 540 may be optional to the sideband processor 420. When some of those components are omitted from the sideband processor 420, the remaining components may still perform corresponding operations on the received sideband data. For example, without the deserializer 515, the packetizer 520 may packetize the downsampled digital samples 512. In another example, without the packetizer 520, the ECC unit 530 may add ECC bits to the bit stream from the downsampler 510 or the deserializer 515.
[0085] Fig. 8 illustrates a portion of the frame 502 generated by the device 400 in more detail according to an embodiment of the present disclosure. In Fig. 8, after various operations by the device 400, the sideband data is packetized as N sideband data packets. The N sideband data packets may be placed in S horizontal lines 340 of the frame 302 (2 lines in the example of Fig. 8). It is noted that although the S horizontal lines are used to carry sideband data which is not video data, they are still treated as corresponding to the active data period. Due to the additional horizontal lines for sideband data, the original VDE signal 318 may be extended to the VDE signal 316 to indicate the extension of the active data period. The video sink receiving the frame 302 may extract the S horizontal lines as sideband data from the frame 302.
[0086] Fig. 9 illustrates a block diagram of a device 900 for receiving and processing a frame generated by the device 400 according to an embodiment of the present disclosure. The device 900 may include some or all features of the video sink 160. As shown, the device 900 includes a receiver (RX) 910, a sideband processor 930, a video processor 932, a display engine 950, and a capability report unit 960.
[0087] In some embodiments, the receiver 910 receives an input 905 which includes a frame such as the frame 502 or 504 generated by the device 400. The frame may include both video data and sideband data received in the active data period. The frame may be received via a hardware interconnect compatible with physical layer requirements of a video interface specification. In an embodiment, the video interface specification may specify a frame format. The frame format may include a total of X consecutive horizontal lines of vertical blanking data and a total of Y consecutive horizontal lines of video data, where X is a first integer and Y is a second integer.
[0088] In some embodiments, the receiver 910 may receive vertical blanking data in (X-S) consecutive horizontal lines of the frame during a vertical blanking period, where S is a third integer. During an active data period, the receiver 910 may receive sideband data in S horizontal lines of the frame and video data in Y horizontal lines of the frame. In some embodiments, the receiver 910 may also receive horizontal blanking data in the (Y+S) horizontal lines outside of the active data period. The receiver 910 may identify one or more portions of the input 905 as being sideband data, blanking data, or video data according to control signals received in the input 905.
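By way of illustration only, the line accounting above may be sketched in Python as follows. The function name and the ordering of the regions within the frame are assumptions; the text fixes only the counts: (X-S) lines of vertical blanking data, S lines of sideband data, and Y lines of video data.

```python
def classify_line(n, X, Y, S):
    """Classify horizontal line n (0-based) of a frame with X lines of
    non-video data in total, S of which carry sideband data, followed by
    Y lines of video data (one possible ordering of the regions)."""
    if n < X - S:
        return "vertical blanking period"
    if n < X:
        return "sideband data (active data period)"
    if n < X + Y:
        return "video data (active data period)"
    raise ValueError("line index outside the frame")
```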
[0089] In some embodiments, the receiver 910 may provide to the sideband processor 930 the downsampled sideband data 912 received from an active data period of the frame. By way of illustration and not limitation, the receiver 910 may perform de-multiplexing to separate out the sideband data and provide it to the sideband processor 930 for processing. The receiver 910 may further provide video data 924 to the video processor 932. In some other embodiments, the receiver 910 may also provide other blanking data in the received frame, such as audio data, control data, and other packetized or unpacketized data, to one or more other processors (not shown) for processing.
[0090] In some embodiments, the sideband processor 930 may perform one or more operations to generate sideband data 922 based on the received sideband data 912. The operations may depend on how the sideband data 912 in the frame was generated (for example, as the sideband data 434). By way of illustration only, the sideband processor 930 may perform operations for de-interleaving the sideband data 912, error checking the sideband data 912, and/or de-packetizing the sideband data packets. The processing of the sideband processor 930 will be discussed in more detail below with reference to Fig. 10. The resulting sideband data may then be converted into information for use in replacing, masking, or otherwise modifying the video data in the frame to be displayed. In some examples, the resulting sideband data may be pixel data.
[0091] In some embodiments, the video processor 932 may perform one or more operations to provide video data 926 based on the video data included in the received frame. By way of illustration only, the video processor 932 may perform operations to isolate, synchronize, order, condition, or otherwise prepare the video data for a digital display.
[0092] In some embodiments, the video data 926 and pixel data 922 may be provided to the display engine 950. The display engine 950 may generate video data 940 for display. By way of example, the video data 940 may be provided to an HD video display (not shown). In an embodiment, the display engine 950 may perform one or more multiplexing operations to interleave portions of the pixel data 922 with portions of the video data 926, thereby blending the pixel data 922 with the video data 926. Alternatively, the display engine 950 may perform operations to calculate pixel color values based on values of the pixel data 922 and pixel color values of the video data 926.
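By way of illustration only, the second option, calculating output pixel colors from both inputs, may be sketched in Python as follows. The disclosure does not specify a particular blending equation, so the alpha-weighted mix below is merely one plausible realization.

```python
def blend_pixel(video_rgb, osd_rgb, alpha):
    """Mix a sideband-derived (e.g. on-screen display) pixel over a video
    pixel with weight alpha (0.0 = video only, 1.0 = OSD only)."""
    return tuple(
        int(alpha * o + (1 - alpha) * v)
        for v, o in zip(video_rgb, osd_rgb)
    )

# e.g. overlay a white teletext pixel at 75% opacity on a video pixel
blend_pixel((30, 60, 90), (255, 255, 255), 0.75)  # -> (198, 206, 213)
```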
[0093] In some embodiments, the capability report unit 960 may output a support indication 460 indicating whether the device 900 supports the sideband data transmitted within a frame. In addition, the indication 460 may also indicate whether the device 900 supports transmission of the sideband data within the active data period of the frame. The support indication 460 may be transmitted to the device transmitting the frame, for example, the device 400.
[0094] It would be appreciated that the sideband processor 930, the video processor 932, and the capability report unit 960 may be implemented by a single processor 990, a set of processors, one or more microprocessors, controllers, central processing units (CPUs), and/or the like. It would also be appreciated that the components of the device 900 are given for the purpose of illustration. In some use cases, as mentioned above, some components may be omitted from the device 900. In some other embodiments, additional components may be included.
[0095] Fig. 10 is a block diagram of a sideband processor 930 according to an embodiment of the present disclosure. As shown, the sideband processor 930 may include a de-interleaver 1010, an ECC unit 1020, a de-packetizer 1030, and a pixel data generator 1040. In some embodiments, the de-interleaver 1010 may receive the sideband data 912. In the embodiments where the sideband data 912 includes interleaved sideband data packets, the de-interleaver 1010 may de-interleave the data packets into de-interleaved data packets 1012. The de-interleaving may involve reversing the interleaving of Fig. 7C to reconstruct the data packets 1012.
[0096] In the embodiments where the sideband data 912 includes ECC bits, the ECC unit 1020 may receive the de-interleaved data packets 1012 and perform error checking on the data packets 1012. The ECC unit 1020 may then generate error corrected data packets 1022. In some embodiments, the de-packetizer 1030 may receive the error corrected data packets 1022 and de-packetize the data packets 1022, for example, by extracting the header and the payload. The de-packetizer 1030 may output the resulting sideband data 1032, which may be extracted from the payload of the data packets 1022. The de-packetizer 1030 may also output the header information in some examples.
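By way of illustration only, the de-packetizing step may be sketched in Python as follows. The helper name parse_packet is hypothetical, and the sketch assumes the ECC parity bytes have already been checked and stripped upstream (the text does not state explicitly whether that stripping happens in the ECC unit 1020 or the de-packetizer 1030).

```python
def parse_packet(pkt):
    """Split an error-corrected sideband packet (Fig. 7A layout) into
    its line number, payload length, and payload bytes."""
    line_num = pkt[0] | (pkt[1] << 8)  # LineNumLow, LineNumHigh
    length = pkt[2] | (pkt[3] << 8)    # LengthLow, LengthHigh
    payload = pkt[4:4 + length]
    return line_num, length, payload

# Mirrors the transmit-side example: a packet for Line 10 with 40 payload bytes.
line_num, length, payload = parse_packet(bytes([0x0A, 0x00, 0x28, 0x00]) + bytes(40))
assert (line_num, length, len(payload)) == (10, 40, 40)
```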
[0097] In some embodiments, the pixel data generator 1040 may receive the sideband data 1032 and then generate pixel data 922 from the sideband data 1032. The pixel data generator 1040 may convert the sideband data 1032 into information for use in replacing, masking, or otherwise modifying pixel color values of the frame. Such information may, for example, include alternate pixel colors or data for determining such alternate pixel colors. In one embodiment where the sideband data 1032 is teletext data, the pixel data generator 1040 may be a sideband data translator, such as an on-screen display (OSD) controller. The sideband data may contain a text message to be displayed and metadata describing the text message, for example, metadata describing the location, character size, and other features of the message. The sideband data translator may use the sideband data to generate pixel data 922 that is passed on to the display engine 950.
[0098] It would also be appreciated that some or all of the de-interleaver 1010, ECC unit 1020, de-packetizer 1030, and pixel data generator 1040 may be optional to the sideband processor 930, depending on how the sideband data in the received frame was generated. In the embodiments where the sideband data is an unpacketized bit stream, the pixel data generator 1040 may directly generate pixel data 922 based on the bit stream. In the embodiments where the sideband data is not interleaved or includes no ECC bits, the de-interleaver 1010 and the ECC unit 1020 may be omitted from the device 900. It is noted that since the device 400 has processed the raw sideband data and transmitted the net sideband data to the device 900, the device 900 is not required to extract the information from the raw sideband data, and thus its complexity and cost are reduced.
[0099] Fig. 11 illustrates a flowchart of a method 1100 for generating a frame including sideband data in the active data period according to an embodiment of the present disclosure. The method 1100 may be performed by a device which provides functionality of generating a frame in a format compatible with a video interface specification. In some examples, the device may be the converter device 130 or the video source 110 as shown in Fig. 1, or may be the device 400 as shown in Fig. 4.
[00100] In step 1110, digital samples representing raw sideband data for a video are downsampled. In step 1120, first sideband data is generated based on the downsampled digital samples, and in step 1130, a first frame for the video is generated, the first frame at least including first video data for the video and the first sideband data. In step 1140, the first video data and the first sideband data are transmitted to a further device during a first active data period for the first frame. The first active data period is different from a first blanking period for the first frame.
[00101] In some embodiments, the downsampled digital samples may be deserialized, and the first sideband data may be generated based on the deserialized digital samples. Alternatively, or in addition, the first sideband data may be generated by adding an ECC bit into the downsampled digital samples. In some other embodiments, a sideband data packet may be generated based on the downsampled digital samples. Bits of the sideband data packet may be interleaved, and the first sideband data may be generated based on the interleaved sideband data packet.
[00102] In some embodiments, the digital samples may be obtained by receiving a second frame at least including an analog signal for the raw sideband data and converting the analog signal into the digital samples. The analog signal for the raw sideband data may be received during a second blanking period for the second frame.
[00103] In some embodiments, the first frame may be transmitted to the further device via an interconnect compatible with an interface specification, the interface specification specifying a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
[00104] In some embodiments, the first frame further includes first blanking data. During the first active data period, the first video data may be transmitted in the first number of horizontal lines, and the first sideband data may be transmitted in a third number of horizontal lines as a part of the non-video data. During the first blanking period, the first blanking data may be transmitted in a fourth number of horizontal lines as a further part of the non-video data. The fourth number plus the third number may be equal to the second number.
[00105] In some embodiments, the first sideband data may be generated in response to receiving from the further device an indication that the further device supports the first sideband data.
[00106] In some embodiments, the raw sideband data includes teletext data or closed captioning data.
[00107] Fig. 12 illustrates a flowchart of a method 1200 for receiving a frame including sideband data in the active data period according to an embodiment of the present disclosure. The method 1200 may be performed by a device which provides functionality of receiving and processing a frame in a format compatible with a video interface specification. In some examples, the device may be the video sink 120 as shown in Fig. 1, or may be the device 900 as shown in Fig. 9.
[00108] In step 1210, first video data and first sideband data included in a first frame for a video may be received from a further device during a first active data period for the first frame. The first sideband data may be generated based on downsampled digital samples representing raw sideband data for the video. The first active data period may be different from a first blanking period for the first frame. In step 1220, second video data for display may be generated based on the first video data and the first sideband data.
[00109] In some embodiments, the first sideband data may include deserialized digital samples generated by deserializing the downsampled digital samples.
[00110] In some embodiments, the first sideband data may include the downsampled digital samples and an ECC bit. In some embodiments, ECC protected digital samples may be generated by performing error checking on the downsampled digital samples based on the ECC bit. The second video data may be generated based on the first video data and the ECC protected digital samples.
[00111] In some embodiments, the first sideband data may include an interleaved sideband data packet generated based on the downsampled digital samples. In some embodiments, the interleaved sideband data packet may be de-interleaved. The de-interleaved sideband data packet may be de-packetized to obtain the downsampled digital samples. The second video data may be generated based on the first video data and the downsampled digital samples.
[00112] In some embodiments, the first frame may be received from the further device via an interconnect compatible with an interface specification. The interface specification may specify a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
[00113] In some embodiments, the first frame may further include first blanking data. During the first active data period, the first video data may be received in the first number of horizontal lines, and the first sideband data may be received in a third number of horizontal lines as a part of the non-video data. During the first blanking period, the first blanking data may be received in a fourth number of horizontal lines as a further part of the non-video data. The fourth number plus the third number may be equal to the second number.
[00114] In some embodiments, an indication may be transmitted to the further device. The indication may indicate that the device supports the first sideband data.
[00115] In some embodiments, the raw sideband data includes teletext data or closed captioning data.
[00116] It is to be understood that the components of the devices 400 and 900 may be hardware modules or software unit modules. For example, in some embodiments, the system may be implemented partially or completely as software and/or in firmware, for example, implemented as a computer program product embodied in a computer readable medium. Alternatively, or in addition, the system may be implemented partially or completely based on hardware, for example, as an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on chip (SOC), a field programmable gate array (FPGA), and so forth. The scope of the present disclosure is not limited in this regard.
[00117] Generally speaking, various example embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of the example embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it will be appreciated that the blocks, apparatus, systems, techniques or methods of the present disclosure may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
[00118] Additionally, various blocks shown in the flowcharts may be viewed as method steps, and/or as operations that result from operation of computer program code, and/or as a plurality of coupled logic circuit elements constructed to carry out the associated function(s). For example, example embodiments of the present disclosure include a computer program product including a computer program tangibly embodied on a machine readable medium, the computer program containing program codes configured to carry out the methods as described above.
[00119] In the context of the disclosure, a machine readable medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine readable medium may be a machine readable signal medium or a machine readable storage medium. A machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the machine readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
[00120] Computer program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These computer program codes may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor of the computer or other programmable data processing apparatus, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a computer, partly on the computer, as a stand-alone software package, partly on the computer and partly on a remote computer or entirely on the remote computer or server. The program code may be distributed on specially-programmed devices which may be generally referred to herein as "modules". Software component portions of the modules may be written in any computer language and may be a portion of a monolithic code base, or may be developed in more discrete code portions, such as is typical in object-oriented computer languages. In addition, the modules may be distributed across a plurality of computer platforms, servers, terminals, mobile devices and the like. A given module may even be implemented such that the described functions are performed by separate processors and/or computing hardware platforms.
[00121] As used herein, the term "includes" and its variants are to be read as open-ended terms that mean "includes, but is not limited to." The term "or" is to be read as "and/or" unless the context clearly indicates otherwise. The term "based on" is to be read as "based at least in part on." The term "one embodiment" and "an embodiment" are to be read as "at least one embodiment." The term "another embodiment" is to be read as "at least one other embodiment". The terms "first," "second," and the like may refer to different or same objects. Other definitions, either explicit or implicit, may be included below.
[00122] While operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the present disclosure or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
[00123] Various modifications and adaptations to the foregoing example embodiments of the present disclosure may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings. Any and all modifications will still fall within the scope of the non-limiting and example embodiments of the present disclosure. Furthermore, other embodiments of the present disclosure will come to mind to one skilled in the art to which those embodiments pertain having the benefit of the teachings presented in the foregoing descriptions and the drawings.
[00124] It will be appreciated that the embodiments of the present disclosure are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are used herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
downsampling digital samples representing raw sideband data for a video;
generating first sideband data based on the downsampled digital samples;
generating a first frame for the video, the first frame at least including first video data for the video and the first sideband data; and
transmitting to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.
2. The method of claim 1, wherein the generating first sideband data comprises:
deserializing the downsampled digital samples; and
generating the first sideband data based on the deserialized digital samples.
3. The method of claim 1, wherein the generating first sideband data comprises: generating the first sideband data by adding an error correcting code (ECC) bit into the downsampled digital samples.
4. The method of claim 1, wherein the generating first sideband data comprises:
generating a sideband data packet based on the downsampled digital samples;
interleaving bits of the sideband data packet; and
generating the first sideband data based on the interleaved sideband data packet.
5. The method of claim 1, further comprising obtaining the digital samples by:
receiving a second frame at least including an analog signal for the raw sideband data, the analog signal for the raw sideband data being received during a second blanking period for the second frame; and
converting the analog signal into the digital samples.
6. The method of claim 1, wherein the transmitting comprises: transmitting the first frame to the further device via an interconnect compatible with an interface specification, the interface specification specifying a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
7. The method of claim 6, wherein the first frame further includes first blanking data, and the transmitting comprises:
during the first active data period,
transmitting the first video data in the first number of horizontal lines, and transmitting the first sideband data in a third number of horizontal lines as a part of the non-video data; and
during the first blanking period,
transmitting the first blanking data in a fourth number of horizontal lines as a further part of the non-video data, the fourth number plus the third number being equal to the second number.
8. The method of claim 1, wherein the generating first sideband data comprises: in response to receiving from the further device an indication that the further device supports the first sideband data, generating the first sideband data.
9. The method of claim 1, wherein the raw sideband data includes teletext data or closed captioning data.
10. A method comprising:
receiving from a further device first video data and first sideband data included in a first frame for a video during a first active data period for the first frame, the first sideband data being generated based on downsampled digital samples representing raw sideband data for the video, and the first active data period being different from a first blanking period for the first frame; and
generating second video data for display based on the first video data and the first sideband data.
11. The method of claim 10, wherein the first sideband data includes deserialized digital samples generated by deserializing the downsampled digital samples.
12. The method of claim 10, wherein the first sideband data includes the downsampled digital samples and an error correcting code (ECC) bit; and
wherein the generating second video data comprises:
generating ECC protected digital samples by performing error checking on the downsampled digital samples based on the ECC bit; and
generating the second video data based on the first video data and the ECC protected digital samples.
13. The method of claim 10, wherein the first sideband data includes an interleaved sideband data packet generated based on the downsampled digital samples; and
wherein the generating second video data comprises:
de-interleaving the interleaved sideband data packet;
de-packetizing the de-interleaved sideband data packet to obtain the downsampled digital samples; and
generating the second video data based on the first video data and the downsampled digital samples.
14. The method of claim 10, wherein the receiving comprises receiving the first frame from the further device via an interconnect compatible with an interface specification, the interface specification specifying a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non- video data.
15. The method of claim 14, wherein the first frame further includes first blanking data, and the receiving comprises:
during the first active data period,
receiving the first video data in the first number of horizontal lines, and receiving the first sideband data in a third number of horizontal lines as a part of the non-video data; and
during the first blanking period, receiving the first blanking data in a fourth number of horizontal lines as a further part of the non-video data, the fourth number plus the third number being equal to the second number.
16. The method of claim 10, further comprising:
transmitting to the further device an indication that the device supports the first sideband data.
17. The method of claim 10, wherein the raw sideband data includes teletext data or closed captioning data.
18. A device comprising:
a processor configured to
downsample digital samples representing raw sideband data for a video,
generate first sideband data based on the downsampled digital samples, and
generate a first frame for the video, the first frame at least including first video data for the video and the first sideband data; and
a transmitter configured to transmit to a further device the first video data and the first sideband data during a first active data period for the first frame, the first active data period being different from a first blanking period for the first frame.
19. The device of claim 18, wherein the processor is configured to
deserialize the downsampled digital samples; and
generate the first sideband data based on the deserialized digital samples.
20. The device of claim 18, wherein the processor is further configured to generate the first sideband data by adding an error correcting code (ECC) bit into the downsampled digital samples.
21. The device of claim 18, wherein the processor is further configured to
generate a sideband data packet based on the downsampled digital samples;
interleave bits of the sideband data packet; and
generate the first sideband data based on the interleaved sideband data packet.
22. The device of claim 18, further comprising:
a receiver configured to receive a second frame at least including an analog signal for the raw sideband data, the analog signal for the raw sideband data being received during a second blanking period of the second frame; and
an analog-to-digital converter configured to convert the analog signal into the digital samples.
23. The device of claim 18, wherein the transmitter is configured to transmit the first frame to the further device via an interconnect compatible with an interface specification, the interface specification specifying a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
24. The device of claim 23, wherein the first frame further includes first blanking data, and the transmitter is configured to
during the first active data period,
transmit the first video data in the first number of horizontal lines, and transmit the first sideband data in a third number of horizontal lines as a part of the non-video data, and
during the first blanking period,
transmit the first blanking data in a fourth number of horizontal lines as a further part of the non-video data, the fourth number plus the third number being equal to the second number.
25. The device of claim 18, further comprising:
a receiver configured to receive from the further device an indication that the further device supports the first sideband data; and
wherein the processor is configured to, in response to reception of the indication, generate the first sideband data.
26. The device of claim 18, wherein the raw sideband data includes teletext data or closed captioning data.
27. A device comprising:
a receiver configured to receive from a further device first video data and first sideband data included in a first frame for a video during a first active data period for the first frame, the first sideband data being generated based on downsampled digital samples representing raw sideband data for the video, and the first active data period being different from a first blanking period for the first frame; and
a processor configured to generate second video data for display based on the first video data and the first sideband data.
28. The device of claim 27, wherein the first sideband data includes deserialized digital samples generated by deserializing the downsampled digital samples.
29. The device of claim 27, wherein the first sideband data includes the downsampled digital samples and an error correcting code (ECC) bit; and
wherein the processor is further configured to
generate ECC protected digital samples by performing error checking on the downsampled digital samples based on the ECC bit; and
generate the second video data based on the first video data and the ECC protected digital samples.
30. The device of claim 27, wherein the first sideband data includes an interleaved sideband data packet generated based on the downsampled digital samples; and
wherein the processor is configured to:
de-interleave the interleaved sideband data packet;
de-packetize the de-interleaved sideband data packet to obtain the downsampled digital samples; and
generate the second video data based on the first video data and the downsampled digital samples.
31. The device of claim 27, wherein the receiver is configured to receive the first frame from the further device via an interconnect compatible with an interface specification, the interface specification specifying a frame format including a first number of horizontal lines of video data and a second number of horizontal lines of non-video data.
32. The device of claim 31, wherein the first frame further includes first blanking data, and the receiver is configured to
during the first active data period,
receive the first video data in the first number of horizontal lines, and receive the first sideband data in a third number of horizontal lines as a part of the non-video data; and
during the first blanking period,
receive the first blanking data in a fourth number of horizontal lines as a further part of the non-video data, the fourth number plus the third number being equal to the second number.
33. The device of claim 27, further comprising:
a transmitter configured to transmit to the further device an indication that the device supports the first sideband data.
34. The device of claim 27, wherein the raw sideband data includes teletext data or closed captioning data.
PCT/US2016/034153 2015-05-29 2016-05-25 Communication of sideband data for videos WO2016196138A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562168169P 2015-05-29 2015-05-29
US62/168,169 2015-05-29

Publications (1)

Publication Number Publication Date
WO2016196138A1 true WO2016196138A1 (en) 2016-12-08

Family

ID=57441660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/034153 WO2016196138A1 (en) 2015-05-29 2016-05-25 Communication of sideband data for videos

Country Status (1)

Country Link
WO (1) WO2016196138A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110858865A (en) * 2018-08-24 2020-03-03 上海富瀚微电子股份有限公司 Data transmission method and device for simulating high-definition video
WO2022193914A1 (en) * 2021-03-17 2022-09-22 上海哔哩哔哩科技有限公司 Method and apparatus for sample adaptive offset, device, and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120314128A1 (en) * 2008-06-23 2012-12-13 Kuan-Chou Chen Apparatus and method of transmitting/receiving multimedia playback enhancement information, vbi data, or auxiliary data through digital transmission means specified for multimedia data transmission
WO2014182717A1 (en) * 2013-05-10 2014-11-13 Silicon Image, Inc. Method, apparatus and system for communicating sideband data with non-compressed video
WO2014192568A1 (en) * 2013-05-30 2014-12-04 ソニー株式会社 Signal processing apparatus, signal processing method, program, and signal transmission system
US20150009408A1 (en) * 2012-04-03 2015-01-08 Panasonic Corporation Video signal transmitter apparatus and receiver apparatus using uncompressed transmission system of video signal
US20150138317A1 (en) * 2013-11-18 2015-05-21 Electronics And Telecommunications Research Institute System and method for providing three-dimensional (3d) broadcast service based on retransmission networks


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16804037

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16804037

Country of ref document: EP

Kind code of ref document: A1