WO2011090663A2 - Method, apparatus, and system for simultaneously previewing contents from multiple protected sources - Google Patents
Method, apparatus, and system for simultaneously previewing contents from multiple protected sources
- Publication number
- WO2011090663A2 (PCT/US2010/061572)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data stream
- primary
- pixels
- port
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video stream to a specific local network, e.g. a Bluetooth® network
- H04N21/43632—Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wired protocol, e.g. IEEE 1394
- H04N21/43635—HDMI
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/02—Handling of images in compressed format, e.g. JPEG, MPEG
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2352/00—Parallel handling of streams of display data
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2358/00—Arrangements for display data security
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/12—Use of DVI or HDMI protocol in interfaces along the display data pipeline
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/20—Details of the management of multiple sources of image data
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/005—Adapting incoming signals to the display format of the display terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4347—Demultiplexing of several video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4405—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video stream decryption
- H04N21/44055—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video stream decryption by partially decrypting, e.g. decrypting a video stream that has been partially encrypted
Definitions
- Embodiments of the invention generally relate to the field of electronic networks and, more particularly, to simultaneously previewing contents from multiple protected sources.
- The data may include data protected by High-bandwidth Digital Content Protection (HDCP), which is referred to herein as HDCP data.
- Communicating multiple media data streams may include a flow of content between a transmitting authority (e.g., cable television (TV) or satellite companies) and a receiving device (e.g., a TV) via a transmission device (e.g., cable/satellite signal transmission device) through a High-Definition Multimedia Interface (HDMI).
- In certain receiving devices (e.g., televisions), this conventional technology has been used mainly for legacy analog inputs because of their low resolutions and lower demand for hardware resources.
- Some conventional techniques have begun to cover digital inputs; nevertheless, they are still based on a
- a method includes generating a primary data stream associated with a primary port, the primary data stream having a primary image to be displayed on a display screen, generating a secondary data stream associated with a plurality of secondary ports coupled with the primary port, the secondary data stream having a plurality of secondary images received from the plurality of secondary ports, merging the secondary data stream with the primary data stream into a display data stream, the display data stream having the primary image and further having the plurality of secondary images as a plurality of preview images, and displaying the primary image and the plurality of preview images on the display screen, wherein each of the plurality of preview images is displayed through an inset screen on the display screen.
- a system includes a data processing device having a storage medium and a processor coupled with the storage medium, the processor to generate a primary data stream associated with a primary port, the primary data stream having a primary image to be displayed on a display screen, generate a secondary data stream associated with a plurality of secondary ports coupled with the primary port, the secondary data stream having a plurality of secondary images received from the plurality of secondary ports, merge the secondary data stream with the primary data stream into a display data stream, the display data stream having the primary image and further having the plurality of secondary images as a plurality of preview images.
- the apparatus further includes a display device coupled with the data processing device, the display device to display the primary image and the plurality of preview images on the display screen, wherein each of the plurality of preview images is displayed through an inset screen on the display screen.
- an apparatus includes a data processing device having a storage medium and a processor coupled with the storage medium, the processor to generate a primary data stream associated with a primary port, the primary data stream having a primary image to be displayed on a display screen, generate a secondary data stream associated with a plurality of secondary ports coupled with the primary port, the secondary data stream having a plurality of secondary images received from the plurality of secondary ports, and merge the secondary data stream with the primary data stream into a display data stream, the display data stream having the primary image and further having the plurality of secondary images as a plurality of preview images.
- Figure 1 illustrates a logical block diagram of an HDCP pre-authentication system;
- Figure 2 illustrates an embodiment of an HDCP engine-to-port system employing a one-on-one ratio between the HDCP engines and the corresponding ports;
- Figure 3 illustrates an embodiment of a technique for displaying multiple data streams from multiple sources;
- Figure 4A illustrates an embodiment of a preview system;
- Figure 4B illustrates an embodiment of a stream mixer;
- Figure 5 illustrates an embodiment of a process for displaying multiple data streams from multiple sources; and
- Figure 6 is an illustration of embodiments of components of a network computer device employing an embodiment of the present invention.
- Embodiments of the invention are generally directed to previewing contents from multiple protected sources.
- A receiving device (e.g., a TV) displays multiple contents (e.g., video images with audio) received from multiple protected sources or ports (e.g., HDMI or non-HDMI input ports).
- One of the multiple images being displayed serves as the primary image (being received via a main HDMI or non-HDMI port) encompassing most of the display screen, while other images are displayed as secondary images (being received via corresponding roving HDMI or non-HDMI ports) occupying small sections or insets of the display screen.
- A port may include an HDMI or a non-HDMI port; HDMI ports are used in this document merely as an example, for brevity and clarity.
- The terms "network" or "communication network" mean an interconnection network to deliver digital media content (including music, audio/video, gaming, photos, and others) between devices using any number of technologies, such as Serial Advanced Technology Attachment (SATA), Frame Information Structure (FIS), etc.
- An entertainment network may include a personal entertainment network, such as a network in a household, a network in a business setting, or any other network of devices and/or components.
- a network includes a Local Area Network (LAN), Wide Area Network (WAN), Metropolitan Area Network (MAN), intranet, the Internet, etc.
- Certain network devices may be a source of media content, such as a digital television tuner, cable set-top box, handheld device (e.g., personal digital assistant (PDA)), video storage server, or other source device.
- Other devices may display or use media content, such as a digital television, home theater system, audio system, gaming system, and other devices.
- certain devices may be intended to store or transfer media content, such as video and audio storage servers.
- Certain devices may perform multiple media functions; for example, a cable set-top box can serve as a receiver device (receiving information from a cable headend) as well as a transmitter device (transmitting information to a TV), and vice versa.
- Network devices may be co-located on a single local area network or span over multiple network segments, such as through tunneling between local area networks.
- a network may also include multiple data encoding and encryption processes as well as identify verification processes, such as unique signature verification and unique identification (ID) comparison.
- When a transmitting device (e.g., a DVD player) provides protected media content to a receiving device (e.g., a TV), the receiving device authenticates the transmitting device prior to accepting the protected media content from it. To avoid the waiting time of such authentication processes, pre-authentication of devices is performed.
- Pre-authentication is a term used here to indicate a feature of devices, including HDMI switch products, that allows them to switch more quickly between inputs. The term describes the performing of necessary HDCP
- Because HDCP receivers are considered slave devices, an HDCP receiver is not expected to explicitly signal a transmitter with any request or status. Even a "broken" link is typically signaled implicitly (and rather crudely) by intentionally "breaking" the Ri sequence (the response from the receiver (Rx) to the transmitter (Tx) when the Tx checks whether the link remains securely synchronized).
- Much of the delay that pre-authentication addresses is caused by these transmitter quirks, and not by the receiver. While, ideally, the transmitters would be modified to avoid these performance issues, realistically, this cannot be expected, and thus pre-authentication can provide significant value in data stream operations.
- an HDCP receiver needs two things to stay synchronized with the transmitter: (1) the receiver knows where the frame boundaries are; and (2) the receiver knows which of these frames contains a signal that indicates that a frame is encrypted (e.g., CTL3).
- CTL3 is used here as an example of an encryption indicator, without limitation, for ease of explanation, brevity, and clarity.
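By way of illustration only (not part of the disclosed embodiments), the following Python sketch models these two pieces of information. The class and method names, and the `advance_cipher` placeholder, are hypothetical; a real HDCP engine performs actual re-keying at this point.

```python
class OpenLoopHdcpTracker:
    """Toy model of an open-loop HDCP engine: to stay synchronized it only
    needs (1) the frame boundaries and (2) the encryption indicator (CTL3)."""

    def __init__(self):
        self.authenticated = False  # set once the first CTL3 is observed
        self.frame_count = 0        # frames counted from VSYNC alone

    def on_vsync(self):
        """A frame boundary: keep counting frames even with no pixel data."""
        self.frame_count += 1
        if self.authenticated:
            self.advance_cipher()

    def on_ctl3(self):
        """The first CTL3 marks that frames are now encrypted."""
        self.authenticated = True

    def advance_cipher(self):
        """Placeholder for the per-frame re-keying a real HDCP engine runs."""
        pass
```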
- FIG. 1 illustrates an embodiment of an HDCP pre-authentication system 100.
- The illustrated HDCP pre-authentication system 100 includes an HDCP (pre-authenticated) device 101 that includes a dedicated HDCP engine block 104-108, 120 per input port.
- the normal HDCP logic is used in every case, even when the open-loop ciphers do not do any decryption. This is because the re-keying functions use the HDCP logic to maximize dispersion.
- An open-loop HDCP engine 104-108 uses a Phase Lock Loop (PLL) 110-114 or PLL-like circuit to lock onto the frame rate and provide ongoing information about where the frame boundaries are while running in the open-loop mode.
- A roving receiver may be used to sequentially provide the essential information to the open-loop logic.
- This roving receiver 116 cycles through the currently unused inputs, finds the frame boundaries (so that the corresponding PLL 110-114 can lock on), and also finds the first CTL3 signal when an authentication occurs. In some cases, this could be a stripped-down version of a TMDS receiver 116 because in essence, it merely needs the VSYNC and CTL3 indicators.
- a main/normal TV data path 132 may work in the same manner as conventional switch products.
- One of the input ports can be selected for the main/normal data path 132, while the data stream is decoded and decrypted (e.g., deciphered to extract the original audio/video (A/V) data from the incoming encrypted data) as necessary, and then routed through the remainder of the appliance.
- The roving receiver 116 samples the currently idle ports (i.e., all ports except the one selected by the user to watch), one at a time. This necessitates a state machine or (more likely) a microcontroller of some kind to control the process.
- The initial operational sequence typically follows: (1) the roving receiver 116 is connected to an unused input port (i.e., a port that is not selected by the user to watch) and monitors it for video; (2) the HDCP engine 104-108 is connected to the port as well, which means that the I²C bus is connected (e.g., I²C is regarded as an additional communication channel between Tx and Rx for link synchronization checks).
- (3) The roving receiver 116 provides information to align the PLL with the frame boundaries; (4) the state machine or microcontroller waits a time period for the HDCP authentication to begin; if it does, it continues to wait until the authentication completes and the first CTL3 signal is received; (5) the HDCP block continues to cycle in an open-loop function, counting "frames" using information only from the PLL, while the EDID (Extended Display Identification Data) remains available, the I²C port stays connected, and the hotplug signal continues to indicate that a receiver is connected; (6) the roving receiver 116 then continues on to the next port and performs the same operations. In some embodiments, once the roving receiver 116 has started all ports, it then goes into a service loop, checking each port in sequence.
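A rough software sketch of such a service loop follows (illustration only; the port and receiver objects, and method names such as `lock_pll` or `wait_for_ctl3`, are hypothetical stand-ins for the hardware blocks described above):

```python
import itertools
import time

def roving_service_loop(roving_rx, idle_ports, dwell_s=0.1):
    """Cycle one roving receiver over the currently idle ports so that each
    port's open-loop HDCP engine stays pre-authenticated in the background.
    The receiver and port objects here are hypothetical stand-ins."""
    for port in itertools.cycle(idle_ports):
        roving_rx.connect(port)                   # I2C and hotplug stay active
        if roving_rx.has_video():
            roving_rx.lock_pll()                  # align PLL to frame boundaries
            if roving_rx.wait_for_authentication(timeout_s=dwell_s):
                roving_rx.wait_for_ctl3()         # first encrypted frame seen
            port.hdcp_engine.run_open_loop()      # keep counting frames via PLL
        time.sleep(dwell_s)                       # then move to the next port
```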
- The illustrated system 100 may contain m ports, with each port 124-130 selected one by one in the background through a Time Division Multiplexing (TDM) technique.
- HDMI signals from the selected port 124-130 are used for pre-authentication.
- Each roving port 124-128 having its own HDCP Engine 104-108 is synchronized with the main port 130 such that each roving port 124-128 is ready for a change to be selected to replace the main port 130.
- the roving pipe gets HDMI signals from all background ports 124-128 one by one and keeps them pre-authenticated and ready.
- FIG. 2 illustrates an embodiment of an HDCP engine-to-port system 200 employing a one-on-one ratio between the HDCP engines 202-208 and the corresponding ports 210-216.
- The illustrated system 200 includes four HDCP engines 202-208 that correspond to ports 210-216 in a one-on-one ratio, e.g., each HDCP engine 202-208 corresponds to a single port 210-216.
- The system 200 further illustrates port 1 210 as being in the main pipe or path 218 and associated with HDCP engine 1 202.
- The other ports 212-216 are in the roving pipe or path 220 and are associated with HDCP engines 2-4 204-208. It is to be noted that the terms pipe and path are used interchangeably throughout this document.
- HDCP engine 202 of the main path 218 works on each pixel (to decrypt and obtain the video and audio data) and on synchronization (e.g., re-keying, which refers to the process in which, at every frame boundary, Tx and Rx change the shared key used to cipher and decipher the contents; this prevents any one key from being used for too much data. Tx and Rx also exchange the residue of the key and check the synchronization of the link, which is called Ri checking in HDCP).
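For illustration, the Ri check can be pictured as both ends comparing a short residue of their cipher state after each re-keying; the sketch below is a toy model only (the real HDCP cipher and its 16-bit Ri derivation are not shown):

```python
def toy_residue(cipher_state: int) -> int:
    """Hypothetical stand-in for deriving a 16-bit Ri value; real HDCP
    derives Ri from its cipher registers, not with a simple mask."""
    return cipher_state & 0xFFFF

def ri_link_in_sync(tx_state: int, rx_state: int) -> bool:
    """After re-keying at a frame boundary, Tx compares the Rx residue with
    its own; a mismatch means the link has lost synchronization."""
    return toy_residue(tx_state) == toy_residue(rx_state)

# Example: identical residues stay in sync, a drifted receiver does not.
assert ri_link_in_sync(0x1234ABCD, 0x9999ABCD)      # same low 16 bits
assert not ri_link_in_sync(0x1234ABCD, 0x1234ABCE)  # residues differ
```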
- HDCP engines 204-208 of the roving path 220 work only for synchronization (e.g., re-keying) and otherwise remain idle.
- HDCP engines 204-208 of the roving path 220 work for a short period of time (e.g., performing the re-keying process) merely to synchronize the Ri values that are used to make a transmitter (Tx) trust that a receiver (Rx) is synchronized.
- HDCP engines 204-208 are therefore only needed, and only function, during the synchronization period; for the remainder of the time they sit idle while HDCP engine 202 continues to work.
- Figure 3 illustrates an embodiment of a technique for displaying multiple data streams 312-320 from multiple sources 302-310.
- preview system 324 employs the pre-authentication and roving techniques of Figures 1-2 to display multiple data streams 312-320 on a receiving device (e.g., television) 322.
- Each data stream (e.g., video data/content/program) being displayed through the multiple screens is received from a separate HDMI input source/port 302-310.
- Data streams 312-320, having the pre-authentication and roving functionalities, include not only main data from the main HDMI port (assuming that HDMI input port 302 serves as the corresponding main port) but also roving data extracted from one or more roving HDMI ports (assuming that HDMI input ports 304-310 serve as the corresponding roving ports) that is then downsized into roving snapshots.
- These roving snapshots from the roving ports 304-310 are then merged with the main data image from the main port 302 such that viewers see the main port-based data stream 312 as a full main image on the video display screen of the receiving device 322 and the roving port-based data streams 314-320 as roving snapshots through a corresponding number of inset video display screens, as illustrated here.
- Pre-authentication is performed on all ports, i.e., including the main HDMI port 302 as well as the roving HDMI ports 304-310.
- pre-authentication of the roving ports 304-310 may be performed in the background such that each roving port 304-310 remains authenticated and available whenever it is needed to serve as the main port (to replace the currently serving main port 302) and while the data/content is being extracted from all ports 302-310.
- each sub-image of each roving data stream 314-320 coming from a roving port 304-310 is stored into a frame buffer.
- The image of the main port-based data stream (main data stream/image) 312 may not be put into a frame buffer due to its relatively large size (e.g., about 6 MB for 1080p/24bpp); instead, the main image pixels are merged with those of the roving sub-images (e.g., the snapshots previously described) on the fly, without using a frame buffer for the main image.
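The "about 6 MB" figure quoted above follows directly from the frame geometry; a quick check, assuming a 1920x1080 frame at 24 bits per pixel:

```python
# Assumes a 1080p frame at 24 bits per pixel (3 bytes per pixel).
width, height, bits_per_pixel = 1920, 1080, 24
frame_bytes = width * height * bits_per_pixel // 8
print(frame_bytes, frame_bytes / 2**20)  # 6220800 bytes, about 5.9 MiB per frame
```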
- a roving sub-image 314-320 is converted such that it is in compliance with the main image 312 and put into the main image 312 at a correct position; this way, a user can see all video frames including the main image 312 and the roving sub-images 314-320 from the main port 302 and the roving ports 304-310, respectively, in one screen (including screen insets) as illustrated here.
- Figure 4A illustrates an embodiment of a preview system
- the illustrated preview system 324 includes four major parts including: a stream extractor 402, a sub-frame handler 404, a stream mixer 406, and a Tx interface 408.
- the stream extractor 402 receives multiple HDMI inputs (such as HDMI ports 302-310 of Figure 3) which are then generated into two data streams: a main port (MP) data stream 410 relating to a main port (e.g., main HDMI port 302) and a number of roving port (RP) data streams 412 relating to a corresponding number of roving ports (e.g., roving HDMI ports 304-310).
- The MP data stream 410 is used to provide the MP image on a display screen associated with a receiver device, and this MP image further contains previews of the sub-images (e.g., snapshots) extracted from the roving data streams received from the corresponding roving ports.
- the MP data stream 410 also contains audio and other control/information packets associated with the main image and the sub-images.
- any relevant MP information 414 is also generated and associated with the MP data stream 410.
- The RP data stream 412 carries multiple streams having snapshots of the roving images received from the roving ports in a time-multiplexed manner, while the roving HDCP ports are simultaneously kept pre-authenticated in the background. Any control/information packets of the RP data stream 412 may be used internally, but are not forwarded downstream to the TV.
- a relevant RP information stream 416 is also generated and associated with the RP data stream 412.
- These MP and RP information streams 414, 416 may include relevant video information (e.g., color depth, resolution, etc.) as well as audio information relating to the MP, RP data streams 410, 412.
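For concreteness, the kind of metadata carried by the MP/RP information streams 414, 416 can be sketched as a small descriptor; the field names below are hypothetical and only mirror the items listed above (video and audio information):

```python
from dataclasses import dataclass

@dataclass
class StreamInfo:
    """Hypothetical descriptor for an MP or RP information stream."""
    port_id: int        # which HDMI input the stream came from
    width: int          # horizontal resolution in pixels
    height: int         # vertical resolution in pixels
    color_depth: int    # bits per color component
    interlaced: bool    # video format
    audio_format: str   # e.g. "PCM, 2ch, 48 kHz"

mp_info = StreamInfo(port_id=0, width=1920, height=1080,
                     color_depth=8, interlaced=False,
                     audio_format="PCM, 2ch, 48 kHz")
```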
- The stream extractor 402 includes HDCP deciphers 428 and 436 and control/information packet (e.g., Data Island (DI) Packet) analyzers 430 and 438 to generate an audio/video (AV) data stream and its relevant information stream (such as resolution, color depth (e.g., how many bits are used to represent a color), etc.), and also to detect a possible bad HDCP situation and reinitiate HDCP authentication 426 or pre-authentication in the background as needed.
- The stream extractor 402 further includes an analog core 418 and a multiplexer 420, as well as an HDCP re-initiator 426, a port change control component 440, and m HDCP engines 442 to support authentication of m ports. Any HDMI signals from each selected port are then used for pre-authentication.
- the illustrated components of the stream extractor 402 and their functionalities have been further described in Figure 1.
- The sub-frame handler 404 captures the image of a background roving port through the RP streams 412, 416.
- The RP streams 412, 416 are received at a deep color handling component 446, which extracts pixels per the color depth.
- Color conversion of the pixels is then performed using a color conversion component 448, followed by down-sampling per each resolution via a sub-sampling/down-scaling logic 450; compression is then performed (using a Discrete Cosine Transform (DCT)/Run Length Coding (RLC) logic 454), and the result is stored in frame memory via an input buffer 462.
- The compressed image is later taken out of a frame buffer 460, decompressed via Inverse Discrete Cosine Transform (IDCT) and Run Length Decoding (RLD), put into an output buffer 456, and provided to the stream mixer 406 at the proper time.
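This compress-then-decompress path can be sketched in software as a blockwise transform over the already down-scaled snapshot (illustration only, not the disclosed hardware: the run-length coding of the coefficients is omitted, and the 4x4 block size simply follows the example used elsewhere in this description):

```python
import numpy as np

def dct_matrix(n: int = 4) -> np.ndarray:
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0, :] /= np.sqrt(2)
    return m * np.sqrt(2.0 / n)

def compress_snapshot(pixels: np.ndarray, block: int = 4) -> list:
    """Transform a down-scaled snapshot block by block (DCT step only;
    the RLC step that would follow in hardware is left out)."""
    t = dct_matrix(block)
    h, w = pixels.shape
    return [t @ pixels[y:y + block, x:x + block] @ t.T
            for y in range(0, h - h % block, block)
            for x in range(0, w - w % block, block)]

def decompress_block(coeffs: np.ndarray) -> np.ndarray:
    """Inverse transform of one stored block, as the IDCT/RLD logic would
    do on demand before handing pixels to the stream mixer."""
    t = dct_matrix(coeffs.shape[0])
    return t.T @ coeffs @ t
```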
- The deep color handling component 446 detects the pixel boundary using color depth information (i.e., how many bits are used to represent a color).
- the logic 450 performs sub-sampling/down-scaling (i.e., reducing the picture size).
- a sub-sampling/down-scaling ratio is determined by the resolution, video format (such as interlacing), and pixel replication of the main port and those of the roving ports.
- Because each port may have a video source of a different size, its downsizing ratio can also be different; for example, the number of pixels in a 1080p image is larger than in a 480p image, so different ratios are needed to preserve the same size of inset displays (called PVs, or PreViews) regardless of the main image resolution.
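A minimal sketch of choosing a per-source downsizing ratio so that every PV inset comes out the same size; the 320x180 inset is an assumed example value, not one taken from the disclosure:

```python
from fractions import Fraction

def downscale_ratio(src_w: int, src_h: int,
                    inset_w: int = 320, inset_h: int = 180) -> Fraction:
    """Return the single scale factor that fits the source into the inset
    (inset_w x inset_h is an assumed example size)."""
    return min(Fraction(inset_w, src_w), Fraction(inset_h, src_h))

print(downscale_ratio(1920, 1080))  # 1/6 for a 1080p roving source
print(downscale_ratio(720, 480))    # 3/8 for a 480p roving source
```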
- the sub-sampled/down-scaled pixels are put into one of the line buffers 452, while the contents of the other line buffers 452 are used by the following block (e.g., dual buffering).
- Each line buffer 452 may contain several lines (e.g., 4 lines) of pixels for the following operation (e.g., 4x4 DCT).
- The DCT/RLC (Run Length Coding) logic 454 gets pixel data (e.g., 4x4 pixel blocks) from whichever line buffer 452 is not currently receiving new data and performs the compression.
- The output coefficients, which result from applying RLC to the DCT output at the DCT/RLC logic 454, are put into the input buffer 462.
- The contents of the input buffer 462 are copied to one of several (e.g., four) segments of the frame buffer 460 that is assigned to the current RP. This copying is performed during a Vertical Sync (VS) period of the main image to prevent any tearing effect, and only if the sampling of the RP data completed successfully.
- An IDCT/RLD (Run Length Decoding) logic 458 monitors the "empty" status of the output line buffers 456; if they become empty, the IDCT/RLD logic 458 gets one block of coefficients from the frame buffer 460 and performs decompression.
- The output of this decompression (e.g., YCbCr in a 4x4 block) is put into the output line buffer 456.
- This output line buffer 456 then sends out one pixel of data per request from the stream mixer 406.
- the assignment of any segments of the frame buffer 460 and the output line buffer 456 to each port can change dynamically per the MP selection to support m-1 PVs (e.g., PreViews, inset displays) among m ports with merely m-1 segments.
- The stream mixer 406 receives the MP data and information streams 410, 414. Once the MP data stream 410, along with its associated MP information stream 414, is received, its pixel boundary is detected by the boundary detection logic 468. The boundary detection logic 468 then receives pixels from the output buffer 456 of the sub-frame handler 404, which is followed by color conversion per the main color using the color conversion component 472, and then by mixing or replacing the pixels of the MP data stream 410 with the color-converted pixels of any sub-images on the fly. In one embodiment, using this novel technique of mixing or replacing an MP pixel with that of the RP, images with inset displays are generated without using a frame buffer for the MP data stream 410.
- The boundary detection logic 468 detects the pixel boundary using any deep color (e.g., color depth, representing the number of bits per color in a pixel) information obtained from the MP information stream 414 and generates the pixel coordinates (e.g., X, Y) and any relevant pixel boundary information (e.g., Pos, Amt).
- An RP pixel fetch block 480 evaluates and determines whether one pixel from an RP image is needed and, if it is needed, sends out a pixel data read request to the output line buffer 456.
- To do so, it checks whether the pixel coordinates (X, Y) are in any PV (inset display) area (which determines whether pixel data from the RP is needed) and whether there is enough remaining RP pixel data previously read out and not yet used (if not, a new RP pixel is needed).
- The pixel data from the output line buffers 456 is, for example, 2 bytes per pixel (e.g., YCbCr 4:2:2); it goes into the color conversion component 472 and is converted to the color format of the MP image.
- The output of the color conversion component 472 enters the RP pixel cut & paste block 478, which extracts the needed number of bits from the input; the result then enters a new pixel calculation block 476, where it is merged with the pixel obtained from the MP information stream 414 to become the final merged pixel.
- the final pixel replaces the pixel provided by the MP information stream 414 in a new pixel insertion block 474.
- the new pixel insertion block 474 generates and provides a new MP stream 482.
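Reading blocks 468-480 together, the per-pixel decision can be modeled roughly as below (a simplified software view; the inset geometry, the `fetch_rp_pixel` callback, and the YCbCr-to-RGB conversion are illustrative assumptions rather than the disclosed hardware):

```python
def convert_to_mp_color(ycbcr):
    """Rough YCbCr -> RGB conversion standing in for the color converter 472."""
    y, cb, cr = ycbcr
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return tuple(max(0, min(255, int(round(c)))) for c in (r, g, b))

def mix_pixel(x, y, mp_pixel, insets, fetch_rp_pixel):
    """Replace the main-port pixel with a preview pixel when (x, y) falls
    inside an inset (PV) rectangle; otherwise pass the MP pixel through.
    `insets` is a hypothetical {port: (x0, y0, w, h)} geometry map."""
    for port, (x0, y0, w, h) in insets.items():
        if x0 <= x < x0 + w and y0 <= y < y0 + h:
            rp_pixel = fetch_rp_pixel(port, x - x0, y - y0)  # from output buffer
            return convert_to_mp_color(rp_pixel)             # match MP format
    return mp_pixel
```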
- Any sub-images are converted to be compliant with the main image and put into the main image at the appropriate position. For example, color depth, different color spaces (such as YCbCr vs. RGB (red, green, blue)), interlacing vs. progressive scanning, and the different resolutions and video formats of both the main image and the roving images are considered.
- The new MP stream 482 serves as the output that passes through the Tx interface 408, which provides TMDS encoding of the stream using a TMDS encoder 464, while a First-In-First-Out (FIFO) block 466 places the MP stream 482 in a FIFO for the interface with the Tx analog block.
- the new MP stream 482 may then be sent to a TX analog core 484.
- The MP stream 482 contains the main image as well as the roving sub-images, and these images (having video and/or audio) are displayed by the display/final receiving device (e.g., TV) such that the main image occupies most of the screen while the roving sub-images are shown in small inset screens.
- FIG. 5 illustrates an embodiment of a process for displaying multiple data streams from multiple sources.
- a stream extractor is coupled with a number of input ports (e.g., including HDMI main port and one or more HDMI roving ports).
- the stream extractor is used to generate two data streams: an MP data stream (MP_STRM) relating to the main port and a RP data stream (RP_STRM) relating to a roving port at processing block 502.
- the stream extractor repeatedly performs this function for each one of a number of roving ports one roving port at a time.
- a sub-frame handler in communication with the stream extractor, scales down the RP data stream associated with a roving port.
- the sub-frame handler performs compression of the scaled roving port data stream and then stores it in an internal buffer.
- A stream mixer, in communication with the stream extractor, receives the MP data stream and calculates its pixel coordinates (e.g., X, Y).
- The stream mixer compares the (X, Y) coordinates with the area of the preview images provided by users to determine whether the (X, Y) coordinates are in that preview image area. If the (X, Y) coordinates are in the preview image area, the stream mixer requests one pixel of data from the sub-frame handler at processing block 512. If not, the process continues with processing block 508. If the sub-frame handler gets a request from the stream mixer, it takes out the one of several preview images that corresponds with the current (X, Y) coordinates from its internal buffer at processing block 514.
- the sub-frame handler further decompresses the RP data stream that was previously compressed and sends a pixel to the stream mixer per its request.
- the stream mixer is then used to convert pixel formats (e.g., color conversion using its color conversion logic) of the pixel received from the sub-frame handler in accordance with those of the MP data stream.
- the stream mixer puts the received pixel into the MP data stream (e.g., replacing the pixel of the MP data stream with that of the preview images using its pixel merger).
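Strung together, the processing blocks above amount to one per-frame loop; the sketch below uses hypothetical component objects purely to show the order of operations:

```python
def process_frame(extractor, sub_frame_handler, stream_mixer, tx):
    """One pass of the Figure 5 flow with hypothetical component objects:
    extract the streams, cache the compressed roving snapshot, then mix
    preview pixels into the MP stream."""
    mp_stream, rp_stream = extractor.extract()        # MP_STRM and RP_STRM
    sub_frame_handler.store(rp_stream)                # scale down, compress, buffer
    for x, y, mp_pixel in mp_stream.pixels():
        if stream_mixer.in_preview_area(x, y):        # is (X, Y) inside a PV?
            rp_pixel = sub_frame_handler.fetch(x, y)  # decompress one pixel
            mp_pixel = stream_mixer.convert_and_merge(mp_pixel, rp_pixel)
        tx.send(x, y, mp_pixel)                       # new MP stream out
```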
- It is to be noted that HDMI ports are described merely as an example, for brevity and clarity, and that it is contemplated that other non-HDMI ports may also be used and employed.
- Video sources such as old legacy analog inputs are converted into RGB and control streams in the TV for internal processing, which can be easily converted to and included in an HDMI stream.
- The compression and storing mechanism described in this document is used as an example and provided for brevity and clarity. It is contemplated that various other compression/decompression and storage schemes can be used in the framework according to one or more embodiments of the present invention.
- FIG. 6 is an illustration of embodiments of components of a network computer device 605 employing an embodiment of the present invention.
- a network device 605 may be any device in a network, including, but not limited to, a television, a cable set-top box, a radio, a DVD player, a CD player, a smart phone, a storage unit, a game console, or other media device.
- the network device 605 includes a network unit 610 to provide network functions.
- the network functions include, but are not limited to, the generation, transfer, storage, and reception of media content streams.
- the network unit 610 may be implemented as a single system on a chip (SoC) or as multiple components.
- the network unit 610 includes a processor for the processing of data.
- the processing of data may include the generation of media data streams, the manipulation of media data streams in transfer or storage, and the decrypting and decoding of media data streams for usage.
- the network device may also include memory to support network operations, such as DRAM (dynamic random access memory) 620 or other similar memory and flash memory 625 or other nonvolatile memory.
- the network device 605 may also include a transmitter 630 and/or a receiver 640 for transmission of data on the network or the reception of data from the network, respectively, via one or more network interfaces 655.
- the transmitter 630 or receiver 640 may be connected to a wired transmission cable, including, for example, an Ethernet cable 650, a coaxial cable, or to a wireless unit.
- the transmitter 630 or receiver 640 may be coupled with one or more lines, such as lines 635 for data transmission and lines 645 for data reception, to the network unit 610 for data transfer and control signals. Additional connections may also be present.
- the network device 605 also may include numerous components for media operation of the device, which are not illustrated here.
- modules, components, or elements described throughout this document may include hardware, software, and/or a combination thereof.
- Where a module includes software, the software data, instructions, and/or configuration may be provided via an article of manufacture by a machine/electronic device/hardware.
- An article of manufacture may include a machine accessible/readable medium having content to provide instructions, data, etc. The content may result in an electronic device, for example, a filer, a disk, or a disk controller as described herein, performing various operations or executions described.
- Portions of various embodiments of the present invention may be provided as a computer program product, which may include a machine-readable medium.
- The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, compact disk read-only memory (CD-ROM), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing electronic instructions.
- the present invention may also be downloaded as a computer program product, wherein the program may be transferred from a remote computer to a requesting computer.
- If element A is said to be coupled to element B, element A may be directly coupled to element B or be indirectly coupled through, for example, element C.
- a component, feature, structure, process, or characteristic A “causes” a component, feature, structure, process, or characteristic B, it means that "A” is at least a partial cause of "B” but that there may also be at least one other component, feature, structure, process, or characteristic that assists in causing "B.” If the specification indicates that a component, feature, structure, process, or characteristic “may”, “might”, or “could” be included, that particular component, feature, structure, process, or characteristic is not required to be included. If the specification or claim refers to "a” or “an” element, this does not mean there is only one of the described elements.
- An embodiment is an implementation or example of the present invention.
- Reference in the specification to "an embodiment,” “one embodiment,” “some embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments.
- the various appearances of "an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. It should be appreciated that in the foregoing description of exemplary embodiments of the present invention, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Television Systems (AREA)
- Studio Circuits (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201080060056.6A CN102714759B (zh) | 2009-12-30 | 2010-12-21 | 对来自多个受保护源的内容进行同时预览的方法、装置及系统 |
| KR1020127019924A KR101724484B1 (ko) | 2009-12-30 | 2010-12-21 | 다수의 보호 소스로부터의 컨텐츠를 동시에 미리 보기 위한 방법, 장치, 및 시스템 |
| EP10844229.4A EP2520098A4 (en) | 2009-12-30 | 2010-12-21 | METHOD, APPARATUS AND SYSTEM FOR PREVIEWING SIMULTANEOUS CONTENT FROM MULTIPLE PROTECTED SOURCES |
| JP2012547145A JP5784631B2 (ja) | 2009-12-30 | 2010-12-21 | 複数の保護されたソースから同時にコンテンツをプレビューする方法、装置、及びシステム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/650,357 | 2009-12-30 | ||
| US12/650,357 US20110157473A1 (en) | 2009-12-30 | 2009-12-30 | Method, apparatus, and system for simultaneously previewing contents from multiple protected sources |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| WO2011090663A2 (en) | 2011-07-28 |
| WO2011090663A3 (en) | 2011-11-17 |
| WO2011090663A8 (en) | 2012-09-13 |
Family
ID=44187112
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2010/061572 Ceased WO2011090663A2 (en) | 2009-12-30 | 2010-12-21 | Method, apparatus, and system for simultaneously previewing contents from multiple protected sources |
Country Status (7)
| Country | Link |
|---|---|
| US (1) | US20110157473A1 (en) |
| EP (1) | EP2520098A4 (en) |
| JP (1) | JP5784631B2 (ja) |
| KR (1) | KR101724484B1 (ko) |
| CN (1) | CN102714759B (zh) |
| TW (1) | TWI527457B (zh) |
| WO (1) | WO2011090663A2 (en) |
Families Citing this family (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8692937B2 (en) * | 2010-02-25 | 2014-04-08 | Silicon Image, Inc. | Video frame synchronization |
| US9516372B2 (en) * | 2010-12-10 | 2016-12-06 | Lattice Semiconductor Corporation | Multimedia I/O system architecture for advanced digital television |
| US9413985B2 (en) * | 2012-09-12 | 2016-08-09 | Lattice Semiconductor Corporation | Combining video and audio streams utilizing pixel repetition bandwidth |
| FR2996091B1 (fr) * | 2012-09-21 | 2015-07-17 | Thales Sa | Noeud fonctionnel pour un reseau de transmission d'informations et reseau correspondant |
| CN105100865B (zh) * | 2014-05-12 | 2018-04-13 | 深圳Tcl新技术有限公司 | 多画面显示的控制方法及装置 |
| AU2017222421B2 (en) | 2016-02-23 | 2022-09-01 | nChain Holdings Limited | Personal device security using elliptic curve cryptography for secret sharing |
| EP3550825A4 (en) * | 2016-12-01 | 2020-05-13 | LG Electronics Inc. -1- | IMAGE DISPLAY DEVICE AND THIS IMAGE DISPLAY SYSTEM |
| JP6838148B2 (ja) * | 2017-05-30 | 2021-03-03 | シャープNecディスプレイソリューションズ株式会社 | 表示装置、表示方法、及びプログラム |
| CN111295659B (zh) * | 2017-11-02 | 2024-09-17 | 区块链控股有限公司 | 用于将区块链技术与数字双胞胎结合的计算机实现的系统和方法 |
| CN111787377B (zh) * | 2020-08-19 | 2022-06-28 | 青岛海信传媒网络技术有限公司 | 显示设备及投屏方法 |
| CN113507638B (zh) * | 2021-07-07 | 2023-05-05 | Vidaa(荷兰)国际控股有限公司 | 显示设备及投屏方法 |
| US12306007B2 (en) | 2021-11-12 | 2025-05-20 | Rockwell Collins, Inc. | System and method for chart thumbnail image generation |
| US12002369B2 (en) | 2021-11-12 | 2024-06-04 | Rockwell Collins, Inc. | Graphical user interface (GUI) for selection and display of enroute charts in an avionics chart display system |
| US11842429B2 (en) | 2021-11-12 | 2023-12-12 | Rockwell Collins, Inc. | System and method for machine code subroutine creation and execution with indeterminate addresses |
| US12254282B2 (en) | 2021-11-12 | 2025-03-18 | Rockwell Collins, Inc. | Method for automatically matching chart names |
| US12304648B2 (en) | 2021-11-12 | 2025-05-20 | Rockwell Collins, Inc. | System and method for separating avionics charts into a plurality of display panels |
| US11954770B2 (en) | 2021-11-12 | 2024-04-09 | Rockwell Collins, Inc. | System and method for recreating graphical image using character recognition to reduce storage space |
| US11887222B2 (en) | 2021-11-12 | 2024-01-30 | Rockwell Collins, Inc. | Conversion of filled areas to run length encoded vectors |
| US11915389B2 (en) | 2021-11-12 | 2024-02-27 | Rockwell Collins, Inc. | System and method for recreating image with repeating patterns of graphical image file to reduce storage space |
Family Cites Families (31)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0219079A (ja) * | 1988-07-06 | 1990-01-23 | Pioneer Electron Corp | 映像信号処理装置 |
| US5161019A (en) * | 1990-06-29 | 1992-11-03 | Rca Thomson Licensing Corporation | "channel guide" automatically activated by the absence of program information |
| JPH08237563A (ja) * | 1995-02-28 | 1996-09-13 | Toshiba Corp | テレビジョン受像機 |
| US5737035A (en) * | 1995-04-21 | 1998-04-07 | Microtune, Inc. | Highly integrated television tuner on a single microcircuit |
| US6311161B1 (en) * | 1999-03-22 | 2001-10-30 | International Business Machines Corporation | System and method for merging multiple audio streams |
| US6784945B2 (en) * | 1999-10-01 | 2004-08-31 | Microtune (Texas), L.P. | System and method for providing fast acquire time tuning of multiple signals to present multiple simultaneous images |
| US7373650B1 (en) * | 2000-02-01 | 2008-05-13 | Scientific-Atlanta, Inc. | Apparatuses and methods to enable the simultaneous viewing of multiple television channels and electronic program guide content |
| US6473135B1 (en) * | 2000-02-16 | 2002-10-29 | Sony Corporation | Signal input selector for television set and method of implementing same |
| JP2002354367A (ja) * | 2001-05-25 | 2002-12-06 | Canon Inc | マルチ画面表示装置、マルチ画面表示方法、記録媒体、及びプログラム |
| JP2003116073A (ja) * | 2001-10-04 | 2003-04-18 | Mitsubishi Electric Corp | テレビジョン放送受信装置 |
| KR100441504B1 (ko) * | 2002-01-15 | 2004-07-23 | 삼성전자주식회사 | 주화면 및 부화면의 콤포지트신호 및 콤포넌트신호를 각각디지털신호로 변환할 수 있는 영상신호 복원장치 |
| JP4229816B2 (ja) * | 2003-11-25 | 2009-02-25 | シャープ株式会社 | 受信装置 |
| TW200704183A (en) * | 2005-01-27 | 2007-01-16 | Matrix Tv | Dynamic mosaic extended electronic programming guide for television program selection and display |
| KR100557146B1 (ko) * | 2005-07-09 | 2006-03-03 | 삼성전자주식회사 | 디지털 멀티미디어 방송 수신 장치 |
| KR100765317B1 (ko) * | 2005-07-12 | 2007-10-09 | 삼성전자주식회사 | 디지털 방송 시스템에서의 채널 전환 장치 및 방법 |
| US7532253B1 (en) * | 2005-07-26 | 2009-05-12 | Pixelworks, Inc. | Television channel change picture-in-picture circuit and method |
| KR100761140B1 (ko) * | 2005-12-01 | 2007-09-21 | 엘지전자 주식회사 | 입력 신호의 검출 방법 및 이를 구현한 방송 수신기 |
| US20070186015A1 (en) * | 2006-02-08 | 2007-08-09 | Taft Frederick D | Custom edid content generation system and method |
| JP4822972B2 (ja) * | 2006-07-28 | 2011-11-24 | シャープ株式会社 | 表示装置 |
| EP2048882A4 (en) * | 2006-07-28 | 2010-05-05 | Sharp Kk | DISPLAY DEVICE |
| JP2008054300A (ja) * | 2006-07-28 | 2008-03-06 | Sharp Corp | 表示装置及び表示システム |
| JP4289397B2 (ja) * | 2007-01-04 | 2009-07-01 | 船井電機株式会社 | 受信装置 |
| TWI397899B (zh) * | 2007-04-30 | 2013-06-01 | Mstar Semiconductor Inc | 多視窗顯示控制器及相關方法 |
| KR101386882B1 (ko) * | 2007-06-08 | 2014-04-18 | 삼성전자주식회사 | 디지털 방송 채널 정보 표시 방법 및 장치 |
| JP4530033B2 (ja) * | 2007-12-06 | 2010-08-25 | ソニー株式会社 | 受信装置および受信装置における入力切換制御方法 |
| US8644504B2 (en) * | 2008-02-28 | 2014-02-04 | Silicon Image, Inc. | Method, apparatus, and system for deciphering media content stream |
| KR101442611B1 (ko) * | 2008-03-06 | 2014-09-23 | 삼성전자주식회사 | 복수의 레이어들을 중복하여 디스플레이하는 디스플레이장치 및 이의 제어 방법 |
| JP2009253468A (ja) * | 2008-04-02 | 2009-10-29 | Canon Inc | 映像制御装置およびその制御方法 |
| JP4821824B2 (ja) * | 2008-09-19 | 2011-11-24 | ソニー株式会社 | 画像表示装置、コネクタ表示方法、伝送路状態検出装置、伝送路状態検出方法および半導体集積回路 |
| US8374346B2 (en) * | 2009-01-09 | 2013-02-12 | Silicon Image, Inc. | Method, apparatus, and system for pre-authentication and keep-authentication of content protected ports |
| US8237868B2 (en) * | 2009-03-30 | 2012-08-07 | Sharp Laboratories Of America, Inc. | Systems and methods for adaptive spatio-temporal filtering for image and video upscaling, denoising and sharpening |
-
2009
- 2009-12-30 US US12/650,357 patent/US20110157473A1/en not_active Abandoned
-
2010
- 2010-12-21 KR KR1020127019924A patent/KR101724484B1/ko active Active
- 2010-12-21 JP JP2012547145A patent/JP5784631B2/ja active Active
- 2010-12-21 WO PCT/US2010/061572 patent/WO2011090663A2/en not_active Ceased
- 2010-12-21 EP EP10844229.4A patent/EP2520098A4/en not_active Withdrawn
- 2010-12-21 CN CN201080060056.6A patent/CN102714759B/zh active Active
- 2010-12-23 TW TW099145595A patent/TWI527457B/zh active
Non-Patent Citations (1)
| Title |
|---|
| See references of EP2520098A4 * |
Also Published As
| Publication number | Publication date |
|---|---|
| US20110157473A1 (en) | 2011-06-30 |
| EP2520098A4 (en) | 2014-11-19 |
| CN102714759B (zh) | 2016-10-12 |
| TW201134214A (en) | 2011-10-01 |
| TWI527457B (zh) | 2016-03-21 |
| WO2011090663A3 (en) | 2011-11-17 |
| KR20120096944A (ko) | 2012-08-31 |
| WO2011090663A8 (en) | 2012-09-13 |
| CN102714759A (zh) | 2012-10-03 |
| JP5784631B2 (ja) | 2015-09-24 |
| KR101724484B1 (ko) | 2017-04-07 |
| JP2013516840A (ja) | 2013-05-13 |
| EP2520098A2 (en) | 2012-11-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20110157473A1 (en) | Method, apparatus, and system for simultaneously previewing contents from multiple protected sources | |
| US8644504B2 (en) | Method, apparatus, and system for deciphering media content stream | |
| EP2386166B1 (en) | Method, apparatus and system for pre-authentication and keep-authentication of content protected ports | |
| US8832844B2 (en) | Fast switching for multimedia interface system having content protection | |
| EP3051801B1 (en) | Video switch and switching method thereof | |
| US8166499B2 (en) | Method, apparatus and set-top device for transmitting content to a receiver | |
| JP2021007266A (ja) | 映像送信装置 | |
| JP6171065B2 (ja) | 表示装置及び表示方法 | |
| EP2384579B1 (en) | Method and system for detecting successful authentication of multiple ports in a time-based roving architecture | |
| JP7037598B2 (ja) | 映像送信装置 | |
| JP6775635B2 (ja) | 表示装置 | |
| JP6249311B2 (ja) | 出力装置 | |
| JP2022033966A (ja) | 映像信号処理装置 | |
| JP2019140702A (ja) | 表示装置 | |
| JP2017130959A (ja) | 表示装置 | |
| JP2018046572A (ja) | 表示装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 201080060056.6 Country of ref document: CN |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10844229 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2012547145 Country of ref document: JP |
|
| REEP | Request for entry into the european phase |
Ref document number: 2010844229 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2010844229 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 20127019924 Country of ref document: KR Kind code of ref document: A |