US20190098251A1 - Anti-Piracy Video Transmission and Display - Google Patents
- Publication number
- US20190098251A1 (application Ser. No. 15/714,575)
- Authority
- US
- United States
- Prior art keywords
- color
- content
- cycled
- frame
- video content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/913—Television signal processing therefor for scrambling ; for copy protection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
- H04N19/467—Embedding additional information in the video signal during the compression process characterised by the embedded information being invisible, e.g. watermarking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4627—Rights management associated to the content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/167—Systems rendering the television signal unintelligible and subsequently intelligible
- H04N7/169—Systems operating in the time domain of the television signal
- H04N7/1696—Systems operating in the time domain of the television signal by changing or reversing the order of active picture signal portions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7847—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
-
- G06F17/30799—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/186—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/913—Television signal processing therefor for scrambling ; for copy protection
- H04N2005/91357—Television signal processing therefor for scrambling ; for copy protection by modifying the video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/913—Television signal processing therefor for scrambling ; for copy protection
- H04N2005/91392—Television signal processing therefor for scrambling ; for copy protection using means for preventing making copies of projected video images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/835—Generation of protective data, e.g. certificates
- H04N21/8352—Generation of protective data, e.g. certificates involving content or source identification data, e.g. Unique Material Identifier [UMID]
Definitions
- Video content rights holders commonly use digital rights management software, secure devices, and other strategies to prevent unauthorized copying and distribution of video content.
- However, unauthorized copies of video content can still be made by using video recording devices to capture content being output by video displays.
- Some aspects of this disclosure relate to methods, systems and computing devices that prevent or interfere with video capture of content displayed on a video display, such as a television, monitor, projection screen, or other video display.
- Video capture devices, such as camcorders, commonly have fairly low capture rates (e.g., 20 frames per second), whereas many video displays can output content at much higher refresh rates (e.g., 120-240 frames per second). This refresh-rate disparity can be used to display content that appears normal to a human viewer but is difficult or impossible to accurately capture with common video recording devices.
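- As a rough illustration of this disparity (using the example rates above; all figures are illustrative only, not values from the disclosure):

```python
# Illustrative arithmetic for the refresh-rate disparity described above.

display_hz = 120   # display refresh rate (frames shown per second)
capture_fps = 20   # camcorder capture rate (frames recorded per second)
cycle_len = 3      # length of an example color cycle: red -> green -> blue

# How many display frames pass between consecutive captured frames.
frames_per_capture = display_hz // capture_fps   # 6

# How far the captured color advances per captured frame.
color_step = frames_per_capture % cycle_len      # 0

# With color_step == 0, every captured frame lands on the same primary
# color, so the recording exhibits severe color artifacts.
print(frames_per_capture, color_step)
```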
- A video display may output a first frame colored in shades of a first primary color (e.g., a frame with only red image data), followed by a second frame colored in shades of a second primary color (e.g., a frame with only green image data), followed by a third frame colored in shades of a third primary color (e.g., a frame with only blue image data).
- The video display may repeatedly cycle between individual color frames, thus providing video content that appears normal to a human viewer.
- A camcorder or other video recording device, however, may capture only some of the color frames. The video recorded by such devices may therefore include undesirable color flicker or other color artifacts, which may deter unauthorized copying.
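- A minimal sketch of the color-cycling idea described above (the function name and the frame representation are illustrative assumptions, not part of the disclosure):

```python
# Sketch: each output frame keeps only one primary-color channel of the
# corresponding input frame, cycling red -> green -> blue.

def color_cycle(frames, cycle=("r", "g", "b")):
    """Zero out all but one channel per frame, stepping through `cycle`.

    `frames` is a list of frames; each frame is a list of (r, g, b) pixels.
    """
    channel_index = {"r": 0, "g": 1, "b": 2}
    cycled = []
    for i, frame in enumerate(frames):
        keep = channel_index[cycle[i % len(cycle)]]
        cycled.append([
            tuple(value if c == keep else 0 for c, value in enumerate(pixel))
            for pixel in frame
        ])
    return cycled

# Three identical two-pixel frames for illustration.
frames = [[(10, 20, 30), (40, 50, 60)]] * 3
out = color_cycle(frames)
# Frame 0 keeps only red, frame 1 only green, frame 2 only blue.
```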
- Transmitted video content may omit certain color information for any given frame and thereby use less bandwidth on the transmission medium.
- The lower bandwidth on the transmission medium may in turn enable more robust transmission properties, which can be useful when signal loss becomes an issue. The techniques described herein may therefore be used to combat piracy and/or to lower the bandwidth of a transmitted video signal, among other benefits.
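- A back-of-the-envelope comparison of the bandwidth saving (the resolution, frame rate, and bit depths are illustrative figures, ignoring chroma subsampling and compression):

```python
# Uncompressed bandwidth: full-color frames vs. one-channel-per-frame
# color-cycled transmission (illustrative figures only).

width, height, fps = 1920, 1080, 120

full_color_bps = width * height * 24 * fps   # 24 bits/pixel: 8 per channel
cycled_bps = width * height * 8 * fps        # 8 bits/pixel: one channel per frame

ratio = cycled_bps / full_color_bps          # one third of the full-color rate
```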
- The methods, systems, and computing devices described herein may be included as part of a network, such as a cable television distribution network.
- FIG. 1 illustrates an example network according to one or more aspects described herein.
- FIG. 2 illustrates an example computing device on which the various elements described herein may be implemented according to one or more aspects described herein.
- FIG. 3 illustrates an example process for causing content to be displayed according to a secure display mode.
- FIG. 4 illustrates an example of selected video content and corresponding color-cycled content.
- FIGS. 5A, 5B, 5C, 5D are example transmission diagrams illustrating selected video content converted into transmitted video content and corresponding color-cycled content.
- FIG. 6 illustrates an example of selected video content and corresponding color-cycled content.
- FIGS. 7A and 7B are example transmission diagrams illustrating selected video content converted into transmitted video content and corresponding color-cycled content.
- A computing device with access to video content may implement techniques described herein to combat piracy by displaying video in repeating color cycles.
- The computing device may be part of a video distribution network, such as a cable television network, and/or may be connected to other sources of video content, such as the Internet, or video players such as DVD and BLU-RAY players.
- Such computing systems may be part of a home entertainment system, a projection system (e.g., for a movie theater), or any other system for displaying video content.
- The computing device may cause a display device to quickly display sequential frames of video content, with each individual frame presented in just one color, such as a primary color, instead of as a full-color frame.
- A cycle of frames may include a first frame of a first primary color (e.g., showing one frame of video content in red), a second frame of a second primary color (e.g., showing the next frame in green), and a third frame of a third primary color (e.g., showing the next frame in blue).
- The next cycle may similarly include three frames of three different primary colors.
- Various colors may be used as primary colors for a color cycle.
- A color cycle may, for example, include a cyan frame, followed by a magenta frame, followed by a yellow frame.
- A human viewer's eyes tend to blend the colors of the sequential frames, so the true full-color video content is still perceived over time, even though only components of a particular color are displayed at any one time.
- A color cycle may also include a black-and-white or grayscale frame, which may beneficially increase the perceived brightness of the displayed video content.
- For example, a four-color cycle including a red frame, a green frame, a blue frame, and a white frame may appear brighter to a viewer than a three-color cycle including only red, green, and blue frames.
- Two primary colors may be used in some frames.
- For example, a two-color cycle may include a red-green frame followed by a blue frame.
- A frame may also be half one color (e.g., a top half of the frame may be red) and half another color (e.g., a bottom half of the frame may be green).
- In general, each frame of a color cycle may include reduced color information, while the plurality of frames forming the color cycle collectively includes a full color gamut.
- Content may be modified before transmission over a local display link, such as a DISPLAYPORT link, HDMI link, DVI-D link, or some other link for transmitting video data.
- Such links may include multiple physical channels for transmitting different video color information simultaneously.
- For example, HDMI includes three transition-minimized differential signaling (TMDS) channels for transmitting three different primary colors according to a selected color space (e.g., one channel for red data, one for green data, and one for blue data when transmitting video in the RGB color space).
- Content may also be transmitted across other types of links.
- FIG. 1 illustrates an example network 100 on which many of the various features described herein may be implemented.
- Network 100 may be any type of information distribution network, such as satellite, telephone, cellular, wireless, optical fiber network, coaxial cable network, and/or a hybrid fiber/coax (HFC) distribution network. Additionally, network 100 may be a combination of networks.
- Network 100 may use a series of interconnected communication links 101 (e.g., coaxial cables, optical fibers, wireless, etc.) and/or some other network (e.g., the Internet, a PSTN, etc.) to connect an end-point to a local office or headend 103 .
- Example end-points are illustrated in FIG. 1 as premises 102 (e.g., businesses, homes, consumer dwellings, etc.) and the local office 103 (e.g., a data processing and/or distribution facility).
- Each premises 102 may have a receiver used to receive and process those signals.
- The links 101 may include components not illustrated, such as splitters, filters, amplifiers, etc., to help convey the signal clearly, but in general each split introduces a bit of signal degradation. Portions of the links 101 may be implemented with fiber-optic cable, while other portions may use coaxial cable, other links, or wireless communication paths.
- The local office 103 may include a termination system (TS) 104, such as a cable modem termination system (CMTS) in an HFC network, a DSLAM in a DSL network, a cellular base station in a cellular network, or some other computing device configured to manage communications between devices on the network of links 101 and backend devices such as servers 105-107 (which may be physical servers and/or virtual servers, for example, in a cloud environment).
- The TS may be as specified in a standard, such as the Data Over Cable Service Interface Specification (DOCSIS) standard published by Cable Television Laboratories, Inc. (a.k.a. CableLabs), or it may be a similar or modified device.
- The TS may be configured to place data on one or more downstream frequencies to be received by modems or other user devices at the various premises 102, and to receive upstream communications from those modems on one or more upstream frequencies.
- The local office 103 may also include one or more network interfaces 108, which can permit the local office 103 to communicate with various other external networks 109.
- These networks 109 may include, for example, networks of Internet devices, telephone networks, cellular telephone networks, fiber optic networks, local wireless networks (e.g., WiMAX), satellite networks, and any other desired network, and the interface 108 may include the corresponding circuitry needed to communicate on the network 109 , and to other devices on the network such as a cellular telephone network and its corresponding cell phones.
- The local office 103 may include a variety of servers 105-107 that may be configured to perform various functions.
- The servers may be physical servers and/or virtual servers.
- The local office 103 may include a push notification server 105.
- The push notification server 105 may generate push notifications to deliver data and/or commands to the various premises 102 in the network (or, more specifically, to the devices in the premises 102 that are configured to detect such notifications).
- The local office 103 may also include a content server 106.
- The content server 106 may be one or more computing devices configured to provide content to users in the premises. This content may be, for example, video on demand movies, television programs, songs, text listings, etc.
- The content server 106 may include software to validate user identities and entitlements, locate and retrieve requested content, encrypt the content, and initiate delivery (e.g., streaming) of the content to the requesting user and/or device.
- The local office 103 may also include one or more application servers 107.
- An application server 107 may be a computing device configured to offer any desired service, and may run various languages and operating systems (e.g., servlets and JSP pages running on Tomcat/MySQL, OSX, BSD, Ubuntu, Redhat, HTML5, JavaScript, AJAX, and COMET).
- For example, an application server may be responsible for collecting television program listings information and generating a data download for electronic program guide listings.
- Another application server may be responsible for monitoring user viewing habits and collecting that information for use in selecting advertisements.
- Another application server may be responsible for formatting and inserting advertisements in a video stream being transmitted to the premises 102 .
- Another application server may be responsible for formatting and providing data for an interactive service being transmitted to the premises 102 (e.g., chat messaging service, etc.).
- An example premises 102 a may include an interface 120 .
- The interface 120 may comprise a modem 110, which may include transmitters and receivers used to communicate on the links 101 and with the local office 103.
- The modem 110 may be, for example, a coaxial cable modem (for coaxial cable links 101), a fiber interface node (for fiber optic links 101), or any other desired device offering similar functionality.
- The interface 120 may also comprise a gateway interface device 111, or gateway.
- The modem 110 may be connected to, or be a part of, the gateway interface device 111.
- The gateway interface device 111 may be a computing device that communicates with the modem 110 to allow one or more other devices in the premises to communicate with the local office 103 and other devices beyond the local office.
- The gateway 111 may comprise a set-top box (STB), digital video recorder (DVR), computer server, or any other desired computing device.
- The gateway 111 may also include (not shown) local network interfaces to provide communication signals to devices in the premises, such as display devices 112 (e.g., televisions), additional STBs 113, personal computers 114, laptop computers 115, wireless devices 116 (wireless laptops and netbooks, mobile phones, mobile televisions, personal digital assistants (PDAs), etc.), a landline phone 117, and any other desired devices.
- Premises 102 a may further include one or more listening devices 119 , the operation of which will be further described below.
- FIG. 2 illustrates an example computing device on which various elements described herein can be implemented.
- The computing device 200 may include one or more processors 201, which may execute instructions of a computer program to perform any of the features described herein.
- The instructions may be stored in any type of computer-readable medium or memory to configure the operation of the processor 201, such as a read-only memory (ROM) 203, random access memory (RAM), removable media 204 (e.g., a Universal Serial Bus (USB) drive, compact disk (CD), digital versatile disk (DVD), or floppy disk drive), or any other desired electronic storage medium.
- Instructions may also be stored in an attached (or internal) hard drive 205.
- The computing device 200 may include one or more output devices, such as a display 206 (or an external television), and may include one or more output device controllers 207, such as a video controller.
- The device controller 207 may implement an HDMI standard and/or a modified HDMI standard.
- The computing device 200 may also include one or more network interfaces, such as input/output circuits 209 (e.g., a network card), to communicate with an external network 210.
- The network interface may be a wired interface, a wireless interface, or a combination of the two.
- The interface 209 may include a modem (e.g., a cable modem), and network 210 may include the communication links and/or networks illustrated in FIG. 1, or any other desired network.
- The FIG. 2 example is an illustrative hardware configuration. Modifications may be made to add, remove, combine, divide, etc., components as desired. Additionally, the components illustrated may be implemented using basic computing devices and components, and the same components (e.g., processor 201, storage 202, user interface, etc.) may be used to implement any of the other computing devices and components described herein.
- FIG. 3 illustrates a process that may be implemented by the computing device 200 , which may be a gateway 111 , set top box 113 , personal computer 114 , or some other device.
- The process may begin at step 301, when the computing device 200 is turned on or otherwise begins operation of a mode for transmitting video content to a display device 206.
- A user and/or the computing device 200 may have selected video content.
- For example, the user may select a television program on a broadcast or on-demand channel, and/or a default or previously-selected video content may be displayed when the computing device 200 is turned on.
- The computing device 200 may retrieve metadata that can be used to determine how the selected video content should be transmitted and displayed (e.g., using a color cycle technique) at the display device 206.
- The video metadata may be electronic program guide (EPG) data and/or other metadata.
- The video metadata may be retrieved from storage, for example, one or more of ROM 203, removable media 204, or hard drive 205.
- The video metadata may alternatively be received via a network, such as external network 210.
- For example, the video metadata received from an external network 210 may be EPG data received via a television distribution network and/or via the Internet.
- The video metadata may also be received from a local device, such as a video game console, tablet, or other device that may communicate with computing device 200 to provide video content to display 206.
- Such received video metadata may be stored in storage for later retrieval.
- The computing device 200 may receive video metadata from another device (e.g., in response to a request transmitted to the other device) and/or retrieve the video metadata from storage.
- The video metadata may contain information about video content that may be transmitted to the display device 206.
- The video content may include video channels (e.g., television channels), video sources (e.g., a connected device such as a video game console), or other video content assets (e.g., television episodes, movies, etc.).
- The video metadata may include descriptive information such as a title, genre, and summary of a particular video content asset, channel, or source.
- The video metadata may also include schedule information, such as a time and channel of broadcast for a particular video content asset.
- The video metadata may also include a security setting for each item of video content.
- For example, each television program on NBC may have its own security setting.
- The security setting may indicate that particular video content should or must be transmitted to and/or displayed by a display device 206 using a secure display mode.
- The security setting may be a binary setting specifying that a secure display mode is either required or not required for particular content, or it may specify one of a plurality of security levels.
- The security setting may specify one or more parameters of a secure display mode, such as a set of allowed color cycles, a minimum display refresh rate, a maximum display refresh rate, and/or a range of allowed display refresh rates.
- The security setting may include an indication of whether the computing device 200 must transmit the video in a secure display mode, or whether the display device 206 may handle conversion to a secure display mode.
- The security setting may reference a pre-defined set of parameters for a secure display mode. For example, a "high security" setting may map to a first set of parameters for a secure display mode, and a "medium security" setting may map to a second, more permissive set of parameters.
- The video metadata may also include one or more time periods associated with a particular security setting, and may include multiple security settings for a given video content. Each of the multiple security settings may be associated with one or more time periods, or with no time period.
- During an associated time period, the computing device 200 may enforce the associated security setting. Security settings with no associated time period may be enforced at any time. For example, a "high security" setting may be associated with a newly-released movie or television episode during the first two weeks after its initial air date, a "medium security" setting may be associated with the same content during the following two weeks, and a "low security" setting may be associated with the same content at any other time. If no security settings are associated with video content, the computing device 200 may cause the video content to be displayed in a normal (e.g., non-secure) manner.
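- The time-windowed selection described above can be sketched as follows (the data layout and function name are hypothetical, chosen only for illustration):

```python
from datetime import datetime, timedelta

# Sketch: resolve a security setting from metadata entries of the form
# (name, start, end). Entries with no window (start/end of None) apply at
# any time; no matching entry means normal (non-secure) display.

def resolve_security_setting(settings, now):
    """Return the first setting whose time window covers `now`, if any."""
    for name, start, end in settings:
        if start is None and end is None:
            return name
        if start <= now < end:
            return name
    return None  # no applicable setting -> display normally

air_date = datetime(2019, 3, 1)
settings = [
    ("high", air_date, air_date + timedelta(weeks=2)),
    ("medium", air_date + timedelta(weeks=2), air_date + timedelta(weeks=4)),
    ("low", None, None),  # fallback for any other time
]
```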
- The video metadata may be generated by and/or received from a device at the local office or headend 103.
- For example, one of servers 105, 106, 107 may generate the video metadata, including the security setting(s) and/or associated time periods.
- The server may determine that a particular video content should be associated with security setting(s) and/or time periods based on the type of video content and/or an indication from a rights owner associated with the video content. For example, the server may determine that a broadcast of a new episode of a hit show should be associated with a high security setting for the first two weeks after its first broadcast, and a medium security setting for the following two weeks. Accordingly, the server may generate video metadata including the appropriate security settings and time periods and transmit the video metadata to computing device 200.
- the computing device 200 may determine one or more supported display modes, which may indicate supported display refresh rates (which may affect which color cycle modes, frame rates, or other security settings are compatible with the connected display).
- the computing device 200 may communicate with the display 206 (e.g., via an HDMI connection) to determine one or more display modes supported by display 206 .
- the computing device 200 may receive extended display identification data (EDID) via a display data channel (DDC) of the HDMI channel.
- EDID may contain information such as a maximum resolution supported by the display, a manufacturer and/or model of the display, what resolutions and corresponding refresh rates are supported by the display, and the like.
- the EDID may further include information about specific secure display modes that are supported by the display (e.g., a “high security” mode, a “medium security” mode, and the like), and/or information about secure display mode parameters, such as supported color cycles, whether the display supports conversion into a secure display mode, and the like. Additionally or alternatively, information about supported secure display modes and/or supported parameters may be received via a Consumer Electronics Control (CEC) line, or via some other communication channel of the HDMI or other link.
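A simplified sketch of how the computing device might summarize display capabilities from EDID-derived data. Real EDID is a binary block read over the DDC; this example assumes it has already been decoded into a dictionary, and the field names ("secure_modes", "supports_conversion") are hypothetical extensions, not actual EDID fields.

```python
def display_capabilities(edid):
    """Summarize decoded EDID-like data into the facts the computing
    device needs: the fastest supported refresh rate, any advertised
    secure display modes, and whether the display can itself convert
    content into color-cycled form."""
    modes = edid.get("modes", [])  # (width, height, refresh_hz) tuples
    return {
        "max_refresh_hz": max((m[2] for m in modes), default=0),
        "secure_modes": edid.get("secure_modes", []),
        "supports_conversion": edid.get("supports_conversion", False),
    }

edid = {
    "manufacturer": "ACME",  # hypothetical display
    "modes": [(1920, 1080, 60), (1920, 1080, 120), (1280, 720, 240)],
    "secure_modes": ["high", "medium"],
    "supports_conversion": True,
}
caps = display_capabilities(edid)
```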
- such information may be retrieved from another device (e.g., from a remote server in communication with computing device 200 ) in response to a query identifying the display (e.g., by make and/or model, serial number, and/or some other identifier of the display). Additionally or alternatively, such secure display mode information may be retrieved from local storage of the computing device 200 (e.g., from a stored database comprising secure display mode information keyed to one or more display identifiers). Such secure display mode information, once received (e.g., from display device 206 and/or a remote server), may be stored by computing device 200 and reused in the future.
- the computing device 200 may determine whether a selected video content (e.g., content asset, channel, and/or content source, etc.) is associated with a security setting.
- the selected video content asset may be broadcast video, on-demand video, recorded video, or any other video content.
- Computing device 200 may determine whether the video metadata contains one or more security settings associated with content for display at display 206 .
- Computing device 200 may further determine whether the one or more security settings are currently enforced (e.g., based on determining that the current time falls within a particular time period associated with a particular security setting). If multiple security settings are currently enforced for selected content, computing device 200 may determine which security setting takes precedence.
- the computing device 200 may select one of the first security setting or the second security setting.
- the computing device 200 may determine that the more restrictive security setting (e.g., the security setting that permits fewer security modes) takes precedence.
- the computing device 200 may select the most specific security setting (e.g., a security setting for a content asset over a security setting for a channel, or a security setting for a channel over a security setting for a content source).
- the computing device 200 may transmit the selected content to display 206 for display in a normal (e.g., non-secure) mode. If the computing device 200 receives a selection of new content at step 306 (e.g., because the user selects different content, the previously-selected content ends and new content begins, etc.), the computing device returns to step 304 .
- the computing device 200 may select a secure display mode to use for transmitting the content to display 206 .
- a secure display mode may specify multiple parameters for displayed content, including a color cycle (e.g., a three-color cycle of a red, green, and blue, a four-color cycle of red, green, blue, and white, etc.), a frame rate of the displayed content, and a transmission mode.
- Selection of the parameters of the secure display mode may depend on one or more factors including a frame rate of the selected video content, supported refresh rates of display 206 , security settings of the video metadata, and/or video conversion capabilities of the display 206 , as well as other factors.
- FIGS. 4, 5A-5D, 6, and 7A-7B illustrate various combinations of secure display mode parameters, and the selection of such parameters based on various factors.
- the supported refresh rates of display 206 may be used to select a frame rate of transmitted or displayed content.
- the computing device 200 may receive a list of supported refresh rates, which may indicate a particular refresh rate (e.g., 240 Hz).
- the computing device may then determine a supported transmission or display frame rate for transmitted or displayed content by dividing the refresh rate by the number of colors in a color cycle (e.g., a 240 Hz refresh rate divided by 3 colors allows 80 frames per second of a first color, 80 frames per second of a second color, and 80 frames per second of a third color to be transmitted and/or displayed). This process may be repeated for each supported refresh rate of display 206 .
- the computing device 200 may thus determine which transmission rates and/or display rates are supported based on the list of supported refresh rates received from display 206 .
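The refresh-rate arithmetic above can be expressed directly: for each supported display refresh rate, the effective full-color content frame rate under a given cycle length is the refresh rate divided by the number of colors in the cycle.

```python
def supported_cycle_rates(refresh_rates_hz, cycle_lengths=(3, 4)):
    """For each supported refresh rate, map each color-cycle length to the
    effective content frame rate it allows (e.g., 240 Hz / 3 colors = 80
    full-color frames per second). Cycle lengths that do not divide the
    refresh rate evenly are omitted for simplicity."""
    return {
        hz: {n: hz // n for n in cycle_lengths if hz % n == 0}
        for hz in refresh_rates_hz
    }

rates = supported_cycle_rates([120, 240])
# a 240 Hz display supports 80 fps with a 3-color cycle, 60 fps with a 4-color cycle
```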
- the computing device 200 may further determine at least some secure display mode parameters based on a maximum cycle duration, which may indicate the maximum time for completing a color cycle that will be invisible (or at least acceptable) to a human eye.
- a maximum cycle duration 401 is illustrated with respect to selected video content 403 of a particular frame rate.
- the maximum cycle duration 401 may specify a maximum time during which the display 206 must cycle through all the colors of the color cycle while displaying in a secure display mode.
- the display must display each frame of a three-color cycle for no more than 20 ms (e.g., a red frame for 20 ms, a green frame for 20 ms, and a blue frame for 20 ms), or each frame of a four-color cycle for no more than 15 ms (e.g., a red frame for 15 ms, a green frame for 15 ms, a blue frame for 15 ms, and a white frame for 15 ms).
- Exceeding the maximum cycle duration 401 may cause undesirable effects for viewers, such as excessive visible flickering of content.
- the frame rate of the selected video content 403 may be compared to the maximum cycle duration 401 to determine which color cycles can be completed within the maximum cycle duration 401 .
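This comparison can be sketched as a simple check, using the 60 ms cycle figure implied by the example above (20 ms per frame in a three-color cycle, 15 ms per frame in a four-color cycle). The assumption here is the FIG. 4 case, where each source frame becomes exactly one single-color frame, so the cycle time is the cycle length divided by the content frame rate.

```python
def cycles_within_limit(content_fps, max_cycle_s=0.060, cycle_lengths=(3, 4)):
    """Return the color-cycle lengths that complete within the maximum
    cycle duration when each source frame is shown as one single-color
    frame. If none fit, an increased-frame-rate mode would be needed."""
    return [n for n in cycle_lengths if n / content_fps <= max_cycle_s]
```

For example, 90 fps content fits both three- and four-color cycles, 60 fps content fits only a three-color cycle, and 30 fps content fits neither without the frame-rate increase described for FIG. 6.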
- selected video content 403 may have a relatively high frame rate in comparison to the maximum cycle duration 401 .
- selected video content 403 may be converted into, for example, color-cycled content using a three-color cycle ( 405 A) or a four-color cycle ( 405 B).
- the selected video content 403 includes a sequence of frames for sequential display (with the first six frames illustrated).
- FIG. 4 shows how each frame of selected video content 403 , which has full color information (e.g., red, green, and blue primary colors according to an RGB color gamut, although other color gamuts may be used), may be converted into a frame having only single color information according to a three-color cycle mode (as illustrated by color-cycled content 405 A) or a four-color cycle mode (as illustrated by color-cycled content 405 B).
- a first frame of the selected video content 403 may be converted to a first frame of color-cycled content 405 A having only the red component of the first frame of selected video content 403 .
- Such conversion of the selected video content 403 into color-cycled content 405 may be performed by the computing device 200 and/or by the display device 206 .
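The FIG. 4 three-color-cycle conversion can be sketched as follows, modeling each frame as a list of (r, g, b) pixel tuples: frame 1 keeps only its red component, frame 2 only green, frame 3 only blue, and so on. This is an illustrative sketch, not the patent's implementation.

```python
def to_color_cycled(frames, cycle=("r", "g", "b")):
    """Convert full-color frames into color-cycled frames: frame i keeps
    only the color component at position i in the repeating cycle, with
    the other components zeroed out."""
    idx = {"r": 0, "g": 1, "b": 2}
    out = []
    for i, frame in enumerate(frames):
        keep = idx[cycle[i % len(cycle)]]
        out.append([
            tuple(c if j == keep else 0 for j, c in enumerate(px))
            for px in frame
        ])
    return out

frames = [[(10, 20, 30)], [(10, 20, 30)], [(10, 20, 30)]]  # three 1-pixel frames
cycled = to_color_cycled(frames)
# frame 0 keeps red only, frame 1 green only, frame 2 blue only
```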
- FIG. 5A illustrates transmission and display of content according to a first example secure display mode.
- selected video content 503 is converted to color-cycled content 505 using a four-color cycle.
- the selected video content 503 may be transmitted to the display device 206 using a standard transmission mode, and the display device 206 may convert the received video content to color-cycled content 505 for display.
- three TMDS channels may be used to carry three color components (e.g., red, green and blue) of the original frame.
- TMDS Channel 0 may carry the red component information
- TMDS Channel 1 may carry the green component information
- TMDS Channel 2 may carry the blue component information.
- the transmission proceeds similarly for a second frame, third frame, and so on.
- the display device 206 , which converts the received transmitted video content 504 to color-cycled content 505 , may simply discard unnecessary color information from transmitted video content 504 .
- the display device 206 may simply discard the other color data of transmitted video content 504 (e.g., the green and blue frame 1 data of transmitted video content 504 ). In some cases, the display device 206 may process the received transmitted video content 504 in order to generate one or more frames of the color-cycled content 505 .
- the display device 206 may generate the illustrated white frame 4 of color-cycled content 505 by combining the red frame data, green frame data, and blue frame data of transmitted video content 504 , then converting the combined data to black-and-white data (e.g., grayscale data). So, while Frames 1 - 4 might not each be displayed in their original full color spectrum, if those 4 color-cycled sequential frames are displayed quickly enough, the human eye might not notice the difference, but a camcorder or smartphone camera might capture the individual colors in some or all of the frames.
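Generating the white (grayscale) frame of a four-color cycle by combining the red, green, and blue data can be sketched as below. The text only says the combined data is converted to grayscale; the BT.601 luma weights used here are a common choice and an assumption of this sketch.

```python
def to_grayscale_frame(frame):
    """Combine each pixel's R, G, B components into one luma value and
    replicate it across all three channels, yielding the 'white' frame
    of a four-color cycle."""
    out = []
    for r, g, b in frame:
        y = round(0.299 * r + 0.587 * g + 0.114 * b)  # BT.601 luma (assumed)
        out.append((y, y, y))
    return out

white = to_grayscale_frame([(255, 0, 0), (0, 0, 0), (255, 255, 255)])
```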
- FIG. 5B illustrates transmission and display of video content according to a second example secure display mode, in which those unnecessary color frames are simply omitted from the TMDS Channels.
- the computing device 200 may avoid transmitting unnecessary data (e.g., data that would be discarded by the display device 206 ).
- the computing device 200 may instead transmit blank data indicating that a frame has no green color data or blue color data.
- the computing device may transmit nothing at all on the transmission channels that would normally carry the green color data and blue color data.
- the computing device 200 may transmit blank data in transmitted video content 506 in order to maintain compatibility with display devices 206 that do not support conversion of transmitted video content 506 into color-cycled content 505 at the display device 206 .
- the computing device 200 may simply transmit nothing instead of blank data and/or may transmit other data during time periods that would otherwise contain blank data of transmitted video content 506 .
- computing device 200 may process the selected video content 503 before transmission.
- the illustrated frame 4 of selected video content 503 may be converted to grayscale by computing device 200 before separation into red, green, and blue components for transmitted video content 506 , in order to yield white (grayscale) frame 4 of color-cycled content 505 .
- FIG. 5C illustrates transmission and display of content according to a third example secure display mode in which a slower transmission frame rate is used.
- the frame rate of the transmitted video content 508 may be reduced in comparison to the frame rate of selected video content 503 .
- the frame rate may be reduced by a factor of three, and three frames may be transmitted on three transmission channels in a given time period (e.g., frames 1 - 3 may be transmitted during a first time period, and frames 4 - 6 may be transmitted during a second time period, as illustrated).
- 90 frames-per-second content may be transmitted at 30 frames-per-second by transmitting three sequential frames for display as color-cycled content 507 during the same time period.
- the FIG. 5C example also shows three frames (e.g., red Frame 1 , green Frame 2 , and blue Frame 3 ) being transmitted on the TMDS Channel at the same time (e.g., time period T 1 ).
- the display device 206 may receive these multiple frames of transmitted video content 508 at the same time, store some or all of them in a buffer, and cause them to be displayed sequentially as frames of color-cycled content 507 in order to present the video content in secure display mode.
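The FIG. 5C scheme can be sketched as packing three consecutive single-color frames into one transmission slot (one per channel), so that a 90 frames-per-second color cycle travels at 30 slots per second, with the display buffering each slot and replaying its frames in order. Frame labels here are illustrative.

```python
def pack_slots(cycled_frames, per_slot=3):
    """Group consecutive color-cycled frames into transmission slots,
    one frame per channel, reducing the transmission rate by per_slot."""
    return [cycled_frames[i:i + per_slot]
            for i in range(0, len(cycled_frames), per_slot)]

def display_order(slots):
    """Model the display: buffer each received slot, then show its
    frames sequentially, reconstructing the original cycle order."""
    shown = []
    for slot in slots:
        shown.extend(slot)
    return shown

frames = ["R1", "G2", "B3", "R4", "G5", "B6"]  # red frame 1, green frame 2, ...
slots = pack_slots(frames)
# two slots are transmitted; the display replays six frames in order
```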
- a first frame of the selected video content 503 may correspond to a first frame of color-cycled content 507 displayed in a first color
- a second frame of the selected video content 503 may correspond to a second frame of color-cycled content 507 displayed in a second color
- and so on, with colors repeating in a three-color cycle in the illustrated example.
- a secure display mode that uses a reduced transmission frame rate may beneficially provide more robust transmission characteristics, due to the lower data rate. Accordingly, such a reduced transmission frame rate mode may be beneficially useful with longer transmission cables, damaged transmission cables, or to address other causes of transmission signal degradation and/or signal loss, as further discussed for steps 311 and 312 of FIG. 3 .
- FIG. 5D illustrates transmission and display of content according to a fourth example secure display mode that staggers the frames.
- the frame rate of transmitted video content 510 may be reduced in comparison to selected video content 503 .
- the computing device 200 may transmit multiple frames in overlapping, staggered time periods. Such a transmission configuration may beneficially reduce buffering at the computing device 200 and/or at the display device 206 .
- the display device 206 may receive the multiple frames transmitted at overlapping, staggered times and cause them to be displayed sequentially to present the video content in secure display mode.
- a first frame of the selected video content 503 may correspond to a first frame of color-cycled content 507 of a first color
- a second frame of the selected video content 503 may correspond to a second frame of color-cycled content 507 of a second color
- so on with colors repeating in a three-color cycle in the illustrated example.
- FIG. 6 illustrates another case of a maximum cycle duration 601 with respect to a selected video content 603 of a particular frame rate.
- the relatively low frame rate of the selected video content 603 in comparison to the maximum cycle duration 601 could cause excessive visible flickering for human viewers if the frame rate is not increased for color-cycled content.
- the maximum cycle duration 601 thus provides a frame rate criterion (e.g., a minimum frame rate). If the frame rate criterion is not satisfied (e.g., the video content 603 has a frame rate below a minimum frame rate), the computing device 200 may select an increased frame rate security mode. In other words, compared to the selected video content 403 of FIG. 4, the selected video content 603 of FIG. 6 may comprise fewer frames per second, such that cycling colors by causing a first frame of selected video content 603 to be displayed in a first color, a second frame of selected video content 603 to be displayed in a second color, and so on, may cause the maximum cycle duration 601 to be exceeded, which may cause undesirable effects for viewers, such as visible flickering of content.
- a first frame of the selected video content 603 may be converted into multiple frames of color-cycled content 605 , thereby increasing the frame rate of the color-cycled content.
- frame 1 of selected video content 603 , which may have three color components, may be converted into three sequential frames of color-cycled content 605 A, each having a different color.
- frame 1 of selected video content 603 may be converted into four frames of color-cycled content 605 B, each having a different color. Accordingly, the color-cycled content 605 may be increased in frame rate in comparison to the selected video content 603 .
- Such conversion may be performed by the computing device 200 and/or by the display device 206 .
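The FIG. 6 frame-rate increase can be sketched as expanding each low-frame-rate source frame into one frame per cycle color, tripling the displayed frame rate for a three-color cycle (a four-color cycle would additionally append a grayscale frame). This is an illustrative sketch using (r, g, b) pixel tuples.

```python
def expand_frame(frame, cycle=("r", "g", "b")):
    """Expand one full-color frame into len(cycle) sequential single-color
    frames, each keeping only one of the frame's color components."""
    idx = {"r": 0, "g": 1, "b": 2}
    out = []
    for color in cycle:
        keep = idx[color]
        out.append([tuple(c if j == keep else 0 for j, c in enumerate(px))
                    for px in frame])
    return out

source = [[(10, 20, 30)]] * 2  # two 1-pixel source frames
cycled = [f for frame in source for f in expand_frame(frame)]
# 2 source frames become 6 color-cycled frames: a 3x frame rate increase
```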
- FIG. 7A illustrates transmission and display of content according to a fifth example secure display mode that generates the increased frame rate color-cycled content 705 at the display device 206 .
- the computing device 200 may transmit the selected video content 703 to the display 206 as transmitted video content 704 according to a normal transmission mode (e.g., at a same frame rate as the selected video content).
- first color data, second color data, and third color data for a first frame of selected video content 703 may be transmitted together on three corresponding transmission channels as transmitted video content 704 .
- the display device 206 may sequentially display the first color data as a first frame of color-cycled content 705 , followed by the second color data as a second frame of color-cycled content 705 , followed by the third color data as a third frame of color-cycled content 705 , and so on.
- the display device may generate additional color-cycled content 705 frames based on the color data.
- the display device 206 may generate a fourth color-cycled content frame in grayscale using the received first, second, and third color data (e.g., to generate a four-color cycle color-cycled content, such as color-cycled content 605 B of FIG. 6 ).
- the computing device 200 may increase a frame rate of transmitted video content 706 .
- the computing device 200 may optionally transmit blank data (e.g., during time periods when a TMDS channel does not need to carry color information for display).
- FIG. 7B illustrates transmission and display of content according to a sixth example secure display mode.
- the computing device 200 may increase a frame rate of the transmitted video content 706 as compared to the selected video content 703 .
- selected video content 703 at 30 frames per second may be transmitted as transmitted video content 706 at 90 frames per second, with each frame of selected video content 703 split into three frames of transmitted video content 706 in different colors.
- selected video content 703 at 30 frames per second may be transmitted as transmitted video content at 120 frames per second, with each frame of selected video content 703 split into four transmitted frames in different colors.
- transmitted video content 706 may include a frame for display on only a single transmission channel of multiple transmission channels at a time, and the other transmission channels may contain blank data, no data at all, or other data.
- TMDS Channel 0 may carry red color information for frame 1 of transmitted video content 706
- second and third transmission channels may carry blank data (as illustrated), no data at all, or other data.
- the computing device 200 may transmit blank data in order to maintain compatibility with display devices 206 that do not support conversion of content at the display device 206 (e.g., for a legacy display device 206 that implements standard HDMI transmissions).
- each transmitted video content 706 frame at the higher frame rate may be transmitted with additional error correction information as compared to a normal transmission (e.g., one or more additional error correction bits per byte as compared to an HDMI normal transmission).
- computing device 200 may process the selected video content 703 before transmission. For example, a grayscale frame may be generated by computing device 200 , then separated into red, green, and blue components for transmission.
- the computing device 200 may convert the selected video content before transmission to the display device 206 in accordance with the selected secure display mode.
- the computing device 200 may transmit the selected video content using conventional transmission methods (e.g., using standard HDMI transmissions), and the display device 206 may convert the transmitted content into color-cycled content.
- the computing device 200 may replace certain color data of the selected video content with blank color data, replace certain color data with other data, and/or transmit nothing instead of transmitting certain color data.
- the computing device 200 may lower the frame rate of the transmitted video content in comparison to the selected video content.
- the computing device 200 may transmit multiple color-cycled content frames simultaneously and/or in overlapping time periods.
- the computing device 200 may increase a frame rate of the transmitted video content in comparison to the selected video content.
- the computing device 200 may transmit the converted video content to the display device 206 .
- the video content may be transmitted across a local video connection, such as an HDMI connection.
- the various color data may be separated and transmitted across different transmission channels of the link.
- one transmission channel may be used to carry red data
- one transmission channel may be used to carry green data
- one transmission channel may be used to carry blue data.
- other colors (e.g., cyan, yellow, and magenta) and/or other color spaces (e.g., a YCC color space including one luma channel and two chroma channels) may additionally or alternatively be carried on the transmission channels.
- step 310 if the computing device 200 receives a selection of new content (e.g., because the user selects different content, the previously-selected content ends and new content begins, etc.), the computing device returns to step 304 . Otherwise, the computing device 200 may continue transmitting the selected video content.
- the computing device 200 may adjust the selected secure display mode to reduce a bandwidth of the transmission at step 312 .
- the computing device 200 may receive an indication from the display device 206 that the display device 206 did not receive one or more frames, detected one or more errors, or otherwise received a degraded signal. Responsive to receiving the indication, at step 312 , the computing device 200 may adjust one or more parameters of the secure display mode to lower a bandwidth of the transmission. For example, the computing device 200 may reduce a frame rate of the transmitted video content (e.g., as in FIG. 5C or 5D ), or switch from a four-color cycle to a three-color cycle. The lower bandwidth transmission may be more robust to signal degradation.
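A hypothetical fallback policy for step 312 is sketched below: on a degraded-signal report, first drop a four-color cycle to three colors, then fall back to a reduced transmission frame rate. The ordering of these adjustments is an assumption of the example; the text only states that parameters are adjusted to lower the transmission bandwidth.

```python
def adjust_mode(mode, signal_degraded):
    """Adjust secure display mode parameters in response to a degraded
    signal: reduce the cycle length first, then the transmission rate."""
    if not signal_degraded:
        return mode
    if mode["cycle_colors"] == 4:
        return {**mode, "cycle_colors": 3}   # e.g., drop the white frame
    if not mode["reduced_rate"]:
        return {**mode, "reduced_rate": True}  # e.g., the FIG. 5C/5D modes
    return mode  # already at the most robust settings in this sketch

mode = {"cycle_colors": 4, "reduced_rate": False}
mode = adjust_mode(mode, True)  # first report: switch to a 3-color cycle
mode = adjust_mode(mode, True)  # second report: reduce the transmission rate
```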
- One or more aspects of the disclosure may be embodied in computer-usable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices.
- program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other data processing device.
- the computer executable instructions may be stored on one or more computer readable media such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc.
- the functionality of the program modules may be combined or distributed as desired in various embodiments.
- the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), and the like.
- Particular data structures may be used to more effectively implement one or more aspects of the invention, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
Abstract
Description
- Video content rights holders commonly use digital rights management software, secure devices, and other strategies to prevent unauthorized copying and distribution of video content. However, unauthorized copies of video content can still be made by using video recording devices to capture content being output by video displays. There remains an ever-present need for protecting the rights of content holders, and for countering the efforts of those who would seek to make unauthorized copies of video content.
- The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.
- Some aspects of this disclosure relate to methods, systems and computing devices that prevent or interfere with video capture of content displayed on a video display, such as a television, monitor, projection screen, or other video display. Video capturing devices, such as camcorders, commonly have fairly low capture rates (e.g., 20 frames per second), whereas many video displays are capable of outputting content at much higher refresh rates (e.g., 120-240 frames per second). This refresh rate disparity can be used to display content that appears normal to a human viewer, but that is difficult or impossible to accurately capture with common video recording devices.
- In one aspect, techniques described herein generate content for display that quickly cycles between different-colored frames at a rapid rate that may be imperceptible to human viewers. For example, a video display may output a first frame colored in shades of a first primary color (e.g., a frame with only red image data), followed by a second frame colored in shades of a second primary color (e.g., a frame with only green image data), followed by a third frame colored in shades of a third primary color (e.g., a frame with only blue image data). The human eye tends to blend sequential frames together if they are presented in quick succession, so that a human viewer will see a full color spectrum even though only a single color is displayed at any given time. The video display may repeatedly cycle between individual color frames, thus providing video content that appears normal to a human viewer. However, a camcorder or video recording device may capture only some of the color frames. Therefore, the video recorded by video recording devices may include undesirable color flicker or other color artifacts, which may deter unauthorized copying.
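The refresh-rate disparity can be illustrated with a toy simulation: a 240 Hz display cycling red, green, and blue frames, sampled by a 20 frames-per-second recorder. With this exact integer ratio every camera sample lands on the same cycle phase, so the recording consists of single-color frames rather than the blended full color a human viewer perceives. The numbers are the example figures from the text; the model is a deliberate simplification (ideal instantaneous sampling, no shutter integration).

```python
def captured_colors(display_hz, camera_fps, cycle=("r", "g", "b"), seconds=1):
    """Return the cycle color visible at each camera sample instant,
    assuming the camera grabs whichever single-color frame is currently
    displayed. `step` is the number of display frames per camera frame."""
    step = display_hz // camera_fps
    return [cycle[(i * step) % len(cycle)]
            for i in range(camera_fps * seconds)]

samples = captured_colors(240, 20)
# all 20 samples land on the same cycle phase: an all-red recording
```

A non-integer ratio would instead produce a slow color flicker in the recording; either way the captured video carries the color artifacts that deter copying.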
- In some aspects, by displaying (and potentially transmitting) a reduced amount of color information for each frame, transmitted video content may omit certain color information for any given frame and thereby use a lower bandwidth on a transmission medium. The lower bandwidth on the transmission medium may beneficially enable more robust transmission properties that may be useful when signal loss becomes an issue. Therefore, techniques described herein may be useful to combat piracy and/or to lower the bandwidth of a transmitted video signal, among other benefits.
- The methods, systems and computing devices described herein may be included as part of a network, such as a cable television distribution network.
- The details of these and other embodiments of the present disclosure are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
- The present disclosure is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
-
FIG. 1 illustrates an example network according to one or more aspects described herein. -
FIG. 2 illustrates an example computing device on which the various elements described herein may be implemented according to one or more aspects described herein. -
FIG. 3 illustrates an example process for causing content to be displayed according to a secure display mode. -
FIG. 4 illustrates an example of selected video content and corresponding color-cycled content. -
FIGS. 5A, 5B, 5C, 5D are example transmission diagrams illustrating selected video content converted into transmitted video content and corresponding color-cycled content. -
FIG. 6 illustrates an example of selected video content and corresponding color-cycled content. -
FIGS. 7A and 7B are example transmission diagrams illustrating selected video content converted into transmitted video content and corresponding color-cycled content. - In the following description of various illustrative embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown, by way of illustration, various embodiments in which aspects of the disclosure may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional modifications may be made, without departing from the scope of the present disclosure.
- A computing device with access to video content may implement techniques described herein to combat piracy by displaying video in repeating color cycles. The computing device may be part of a video distribution network, such as a cable television network, and/or may be connected to other sources of video content, such as the Internet, video players such as DVD and BLU-RAY players, and the like. Such computing systems may be part of a home entertainment system, a projection system (e.g., for a movie theater), or any other system for displaying video content.
- The computing device may cause a display device to quickly display sequential frames of a video content, but to have each individual frame presented in just one color, such as a primary color, instead of as a full-color frame. In one embodiment, a cycle of frames may include a first frame of a first primary color (e.g., showing one frame of video content in red), a second frame of a second primary color (e.g., showing a next frame of video content in green), and a third frame of a third primary color (e.g., showing a next frame of video content in blue). The next cycle may similarly include three frames of three different primary colors. Various colors may be used as primary colors for a color cycle. For example, a color cycle may include a cyan frame, followed by a magenta frame, followed by a yellow frame. By quickly including individual colors in sequential frames, a human viewer's eyes may tend to blend the colors of the sequential frames, so the true full-color video content frame is still perceived over time, even though only components of a particular color are displayed in any one time.
- In some embodiments, a color cycle may include a black-and-white or grayscale frame, which may beneficially increase the perceived brightness of the displayed video content. For example, a four-color cycle including a red frame, a green frame, a blue frame, and a white frame may appear brighter to a viewer than a three-color cycle including a red frame, a blue frame, and a green frame. In some embodiments, two primary colors may be used in some frames. For example, a two-color cycle may include a red-green frame, followed by a blue frame. As another example, a frame may be half one color (e.g., a top half of the frame may be red) and half another color (e.g., a bottom half of the frame may be green). In general, each frame of a color cycle may include reduced color information, and the plurality of frames forming a color cycle may collectively include a full color gamut.
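The frame-to-color assignment described above can be sketched as follows. This is a minimal illustration, not taken from the specification; the cycle definitions mirror the examples in the text, and the function name is hypothetical.

```python
# Each frame of the video is shown in only one color of a repeating
# cycle; the full color gamut is covered across one complete cycle.

THREE_COLOR_CYCLE = ["red", "green", "blue"]
FOUR_COLOR_CYCLE = ["red", "green", "blue", "white"]  # white frame brightens output

def cycle_color_for_frame(frame_index, cycle):
    """Return the single color in which the given frame is displayed."""
    return cycle[frame_index % len(cycle)]
```

For example, with the three-color cycle, frames 0, 1, 2 are shown in red, green, and blue, and frame 3 begins the next cycle in red again.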
- In some embodiments, content may be modified before transmission over a local display link, such as a DISPLAYPORT link, HDMI link, DVI-D link, or some other link for transmitting video data. Such links may include multiple physical channels for transmitting different video color information simultaneously. For example, HDMI includes three physical transition-minimized differential signaling (TMDS) channels for transmitting three different primary colors according to a selected color space (e.g., one channel for red data, one channel for green data, and one channel for blue data when transmitting video in the RGB color space). Additionally or alternatively, content may be transmitted across other types of links.
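The per-channel split of an RGB frame over the three link channels can be sketched as below. The red/green/blue channel ordering follows the example in the text (the physical HDMI channel-to-color assignment may differ); pixels are modeled as (r, g, b) tuples, and the function name is an illustration only.

```python
def split_channels(frame):
    """Split (r, g, b) pixel tuples into three per-channel color planes,
    one plane per TMDS channel, using the text's 0=R, 1=G, 2=B ordering."""
    return {
        0: [px[0] for px in frame],  # red plane
        1: [px[1] for px in frame],  # green plane
        2: [px[2] for px in frame],  # blue plane
    }
```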
-
FIG. 1 illustrates an example network 100 on which many of the various features described herein may be implemented. Network 100 may be any type of information distribution network, such as satellite, telephone, cellular, wireless, optical fiber network, coaxial cable network, and/or a hybrid fiber/coax (HFC) distribution network. Additionally, network 100 may be a combination of networks. Network 100 may use a series of interconnected communication links 101 (e.g., coaxial cables, optical fibers, wireless, etc.) and/or some other network (e.g., the Internet, a PSTN, etc.) to connect an end-point to a local office or headend 103. Example end-points are illustrated in FIG. 1 as premises 102 (e.g., businesses, homes, consumer dwellings, etc.). The local office 103 (e.g., a data processing and/or distribution facility) may transmit information signals onto the links 101, and each premises 102 may have a receiver used to receive and process those signals. - There may be one
link 101 originating from the local office 103, and it may be split a number of times to distribute the signal to various homes 102 in the vicinity (which may be many miles) of the local office 103. The links 101 may include components not illustrated, such as splitters, filters, amplifiers, etc., to help convey the signal clearly, but in general each split introduces a bit of signal degradation. Portions of the links 101 may also be implemented with fiber-optic cable, while other portions may be implemented with coaxial cable, other links, or wireless communication paths. - The
local office 103 may include a termination system (TS) 104, such as a cable modem termination system (CMTS) in an HFC network, a DSLAM in a DSL network, a cellular base station in a cellular network, or some other computing device configured to manage communications between devices on the network of links 101 and backend devices such as servers 105-107 (which may be physical servers and/or virtual servers, for example, in a cloud environment). The TS may be as specified in a standard, such as the Data Over Cable Service Interface Specification (DOCSIS) standard, published by Cable Television Laboratories, Inc. (a.k.a. CableLabs), or it may be a similar or modified device instead. The TS may be configured to place data on one or more downstream frequencies to be received by modems or other user devices at the various premises 102, and to receive upstream communications from those modems on one or more upstream frequencies. The local office 103 may also include one or more network interfaces 108, which can permit the local office 103 to communicate with various other external networks 109. These networks 109 may include, for example, networks of Internet devices, telephone networks, cellular telephone networks, fiber optic networks, local wireless networks (e.g., WiMAX), satellite networks, and any other desired network, and the interface 108 may include the corresponding circuitry needed to communicate on the network 109, and to other devices on the network such as a cellular telephone network and its corresponding cell phones. - As noted above, the
local office 103 may include a variety of servers 105-107 that may be configured to perform various functions. The servers may be physical servers and/or virtual servers. For example, the local office 103 may include a push notification server 105. The push notification server 105 may generate push notifications to deliver data and/or commands to the various homes 102 in the network (or more specifically, to the devices in the homes 102 that are configured to detect such notifications). The local office 103 may also include a content server 106. The content server 106 may be one or more computing devices that are configured to provide content to users in the homes. This content may be, for example, video on demand movies, television programs, songs, text listings, etc. The content server 106 may include software to validate user identities and entitlements, locate and retrieve requested content, encrypt the content, and initiate delivery (e.g., streaming) of the content to the requesting user and/or device. - The
local office 103 may also include one or more application servers 107. An application server 107 may be a computing device configured to offer any desired service, and may run various languages and operating systems (e.g., servlets and JSP pages running on Tomcat/MySQL, OSX, BSD, Ubuntu, Redhat, HTML5, JavaScript, AJAX and COMET). For example, an application server may be responsible for collecting television program listings information and generating a data download for electronic program guide listings. Another application server may be responsible for monitoring user viewing habits and collecting that information for use in selecting advertisements. Another application server may be responsible for formatting and inserting advertisements in a video stream being transmitted to the premises 102. Another application server may be responsible for formatting and providing data for an interactive service being transmitted to the premises 102 (e.g., chat messaging service, etc.). - An
example premises 102 a may include an interface 120. The interface 120 may comprise a modem 110, which may include transmitters and receivers used to communicate on the links 101 and with the local office 103. The modem 110 may be, for example, a coaxial cable modem (for coaxial cable links 101), a fiber interface node (for fiber optic links 101), or any other desired device offering similar functionality. The interface 120 may also comprise a gateway interface device 111 or gateway. The modem 110 may be connected to, or be a part of, the gateway interface device 111. The gateway interface device 111 may be a computing device that communicates with the modem 110 to allow one or more other devices in the premises to communicate with the local office 103 and other devices beyond the local office. The gateway 111 may comprise a set-top box (STB), digital video recorder (DVR), computer server, or any other desired computing device. The gateway 111 may also include (not shown) local network interfaces to provide communication signals to devices in the premises, such as display devices 112 (e.g., televisions), additional STBs 113, personal computers 114, laptop computers 115, wireless devices 116 (wireless laptops and netbooks, mobile phones, mobile televisions, personal digital assistants (PDA), etc.), a landline phone 117, and any other desired devices. Examples of the local network interfaces include Multimedia Over Coax Alliance (MoCA) interfaces, Ethernet interfaces, universal serial bus (USB) interfaces, wireless interfaces (e.g., IEEE 802.11), BLUETOOTH® interfaces (including, for example, BLUETOOTH® LE), ZIGBEE®, and others. Premises 102 a may further include one or more listening devices 119, the operation of which will be further described below. -
FIG. 2 illustrates an example computing device on which various elements described herein can be implemented. The computing device 200 may include one or more processors 201, which may execute instructions of a computer program to perform any of the features described herein. The instructions may be stored in any type of computer-readable medium or memory, to configure the operation of the processor 201. For example, instructions may be stored in a read-only memory (ROM) 202, random access memory (RAM) 203, removable media 204, such as a Universal Serial Bus (USB) drive, compact disk (CD) or digital versatile disk (DVD), floppy disk drive, or any other desired electronic storage medium. Instructions may also be stored in an attached (or internal) hard drive 205. The computing device 200 may include one or more output devices, such as a display 206 (or an external television), and may include one or more output device controllers 207, such as a video controller. In some embodiments, the device controller 207 may implement an HDMI standard and/or a modified HDMI standard. There may also be one or more user input devices 208, such as a remote control, keyboard, mouse, touch screen, microphone, etc. The computing device 200 may also include one or more network interfaces, such as input/output circuits 209 (such as a network card) to communicate with an external network 210. The network interface may be a wired interface, wireless interface, or a combination of the two. In some embodiments, the interface 209 may include a modem (e.g., a cable modem), and network 210 may include the communication links and/or networks illustrated in FIG. 1, or any other desired network. - The
FIG. 2 example is an illustrative hardware configuration. Modifications may be made to add, remove, combine, divide, etc. components as desired. Additionally, the components illustrated may be implemented using basic computing devices and components, and the same components (e.g., processor 201, storage 202, user interface, etc.) may be used to implement any of the other computing devices and components described herein. -
FIG. 3 illustrates a process that may be implemented by the computing device 200, which may be a gateway 111, set-top box 113, personal computer 114, or some other device. The process may begin at step 301, when the computing device 200 is turned on or otherwise begins operation of a mode for transmitting video content to a display device 206. - At
step 302, a user and/or the computing device 200 may have selected video content. For example, the user may select a television program on a broadcast or on-demand channel, and/or a default or previously-selected video content may be displayed when the computing device 200 is turned on. Then, the computing device 200 may retrieve metadata that can be used to determine how the selected video content should be transmitted and displayed (e.g., using a color cycle technique) at the display device 206. The video metadata may be electronic program guide (EPG) data and/or other metadata. The video metadata may be retrieved from storage, for example, one or more of ROM 202, RAM 203, removable media 204, or hard drive 205. Additionally or alternatively, the video metadata may be received via a network, such as external network 210. The video metadata received from an external network 210 may be EPG data received via a television distribution network and/or via the Internet. The video metadata may also be received from a local device, such as a video game console, tablet, or other device that may communicate with computing device 200 to provide video content to display 206. Such received video metadata may be stored in the storage for later retrieval. Accordingly, at step 302, the computing device 200 may receive video metadata from another device (e.g., in response to a request transmitted to the other device) and/or retrieve the video metadata from storage. - The video metadata may contain information about video content that may be transmitted to the
display device 206. The video content may include video channels (e.g., television channels), video sources (e.g., a connected device such as a video game console), or other video content assets (e.g., television episodes, movies, etc.). The video metadata may include descriptive information such as a title, genre, and summary of a particular video content asset, channel, or source. The video metadata may also include schedule information such as a time and channel of broadcast for a particular video content asset. - The video metadata may also include a security setting for each item of video content. For example, each television program on NBC may have its own security setting. The security setting may indicate that particular video content should or must be transmitted to and/or displayed by a
display device 206 using a secure display mode. The security setting may be a binary setting specifying that a secure display mode is either required or not required for particular content, or it may be a setting that specifies one or more of a plurality of security settings. For example, the security setting may specify one or more parameters of a secure display mode, such as a set of allowed color cycles, a minimum display refresh rate, a maximum display refresh rate, and/or a range of allowed display refresh rates. Additionally or alternatively, the security setting may include an indication of whether the computing device 200 must transmit the video in a secure display mode, or whether the display device 206 may handle conversion to a secure display mode. In some cases, the security setting may reference a pre-defined set of parameters for a secure display mode. For example, a “high security” setting may map to a first set of parameters for a secure display mode, and a “medium security” setting may map to a second set of parameters (e.g., a more permissive set of parameters). - The video metadata may also include one or more time periods associated with a particular security setting, and may include multiple security settings for a given video content. Each of the multiple security settings may be associated with one or more time periods, or no time period. During a time period, the
computing device 200 may enforce the associated security setting. Security settings with no associated time period may be enforced at any time. For example, a “high security” setting may be associated with a newly-released movie or television episode during the first two weeks of its initial air date, and a “medium security” setting may be associated with the same movie or episode during the following two weeks. Finally, a “low security” setting may be associated with the same movie for any other time. If no security settings are associated with video content, the computing device 200 may cause the video content to be displayed in a normal (e.g., non-secure) manner. - The video metadata may be generated by and/or received from a device at the local office or
headend 103. For example, one of the servers 105-107 may generate the video metadata and transmit it to the computing device 200. - At
step 303, the computing device 200 may determine one or more supported display modes, which may indicate supported display refresh rates (which may affect which color cycle modes, frame rates, or other security settings are compatible with the connected display). The computing device 200 may communicate with the display 206 (e.g., via an HDMI connection) to determine one or more display modes supported by display 206. For example, the computing device 200 may receive extended display identification data (EDID) via a display data channel (DDC) of the HDMI channel. The EDID may contain information such as a maximum resolution supported by the display, a manufacturer and/or model of the display, what resolutions and corresponding refresh rates are supported by the display, and the like. In some embodiments, the EDID may further include information about specific secure display modes that are supported by the display (e.g., a “high security” mode, a “medium security” mode, and the like), and/or information about secure display mode parameters, such as supported color cycles, whether the display supports conversion into a secure display mode, and the like. Additionally or alternatively, information about supported secure display modes and/or supported parameters may be received via a Consumer Electronics Control (CEC) line, or via some other communication channel of the HDMI or other link. In cases where the display does not transmit certain information about the supported secure display modes and/or secure display mode parameters, such information may be retrieved from another device (e.g., from a remote server in communication with computing device 200) in response to a query identifying the display (e.g., by make and/or model, serial number, and/or some other identifier of the display).
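A later example in this section computes the color-cycled frame rate a display can carry by dividing each advertised refresh rate by the number of colors in the cycle (e.g., 240 Hz / 3 colors = 80 frames per second). A minimal sketch of that per-refresh-rate computation, with an illustrative function name:

```python
def supported_frame_rates(refresh_rates_hz, colors_in_cycle):
    """Map each supported display refresh rate to the video frame rate it
    can carry when every source frame becomes `colors_in_cycle`
    single-color frames."""
    return {hz: hz // colors_in_cycle for hz in refresh_rates_hz}
```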
Additionally or alternatively, such secure display mode information may be retrieved from local storage of the computing device 200 (e.g., from a stored database comprising secure display mode information keyed to one or more display identifiers). Such secure display mode information, once received (e.g., from display device 206 and/or a remote server), may be stored by computing device 200 and reused in the future. - At
step 304, the computing device 200 may determine whether a selected video content (e.g., content asset, channel, and/or content source, etc.) is associated with a security setting. The selected video content asset may be broadcast video, on-demand video, recorded video, or any other video content. Computing device 200 may determine whether the video metadata contains one or more security settings associated with content for display at display 206. Computing device 200 may further determine whether the one or more security settings are currently enforced (e.g., based on determining that the current time falls within a particular time period associated with a particular security setting). If multiple security settings are currently enforced for selected content, computing device 200 may determine which security setting takes precedence. For example, if a user selects a movie associated with a first security setting, and the movie is transmitting on a channel associated with a second security setting, the computing device 200 may select one of the first security setting or the second security setting. In some cases, the computing device 200 may determine that the more restrictive security setting (e.g., the security setting that permits fewer security modes) takes precedence. In some cases, the computing device 200 may select the most specific security setting (e.g., a security setting for a content asset over a security setting for a channel, or a security setting for a channel over a security setting for a content source). - Responsive to determining that the selected content is not secure content (e.g., based on determining that no security settings are associated with the selected content or that any security settings associated with the selected content are not currently enforced), at
step 305 the computing device 200 may transmit the selected content to display 206 for display in a normal (e.g., non-secure) mode. If the computing device 200 receives a selection of new content at step 306 (e.g., because the user selects different content, the previously-selected content ends and new content begins, etc.), the computing device returns to step 304. - At
step 307, responsive to determining that the selected content is secure content (e.g., based on determining at least one security setting is currently enforced), the computing device 200 may select a secure display mode to use for transmitting the content to display 206. A secure display mode may specify multiple parameters for displayed content, including a color cycle (e.g., a three-color cycle of red, green, and blue, a four-color cycle of red, green, blue, and white, etc.), a frame rate of the displayed content, and a transmission mode. Selection of the parameters of the secure display mode may depend on one or more factors including a frame rate of the selected video content, supported refresh rates of display 206, security settings of the video metadata, and/or video conversion capabilities of the display 206, as well as other factors. FIGS. 4, 5A-5D, 6, and 7A-7B illustrate various combinations of secure display mode parameters, and the selection of such parameters based on various factors. - In some examples, the supported refresh rates of
display 206 may be used to select a frame rate of transmitted or displayed content. The computing device 200 may receive a list of supported refresh rates, which may indicate a particular refresh rate (e.g., 240 Hz). The computing device may then determine a supported transmission or display frame rate for transmitted or displayed content by dividing the refresh rate by the number of colors in a color cycle (e.g., a 240 Hz refresh rate divided by 3 colors allows 80 frames per second of a first color, 80 frames per second of a second color, and 80 frames per second of a third color to be transmitted and/or displayed). This process may be repeated for each supported refresh rate of display 206. The computing device 200 may thus determine which transmission rates and/or display rates are supported based on the list of supported refresh rates received from display 206. - The
computing device 200 may further determine at least some secure display mode parameters based on a maximum cycle duration, which may indicate the maximum time for completing a color cycle that will be invisible (or at least acceptable) to a human eye. Turning to FIG. 4, a maximum cycle duration 401 is illustrated with respect to selected video content 403 of a particular frame rate. The maximum cycle duration 401 may specify a maximum time during which the display 206 must cycle through all the colors of the color cycle while displaying in a secure display mode. For example, for a maximum cycle duration 401 of 60 ms, the display must display each frame of a three-color cycle for no more than 20 ms (e.g., a red frame for 20 ms, a green frame for 20 ms, and a blue frame for 20 ms), or each frame of a four-color cycle for no more than 15 ms (e.g., a red frame for 15 ms, a green frame for 15 ms, a blue frame for 15 ms, and a white frame for 15 ms). Exceeding the maximum cycle duration 401 may cause undesirable effects for viewers, such as excessive visible flickering of content. - The frame rate of the selected
video content 403 may be compared to the maximum cycle duration 401 to determine which color cycles can be completed within the maximum cycle duration 401. In the illustrated example of FIG. 4, selected video content 403 may have a relatively high frame rate in comparison to the maximum cycle duration 401. As illustrated by FIG. 4, because at least four frames of selected video content fit within the maximum cycle duration 401, selected video content 403 may be converted into, for example, color-cycled content using a three-color cycle (405A) or a four-color cycle (405B). - The selected
video content 403 includes a sequence of frames for sequential display (with the first six frames illustrated). FIG. 4 shows how each frame of selected video content 403, which has full color information (e.g., red, green, and blue primary colors according to an RGB color gamut, although other color gamuts may be used), may be converted into a frame having only single color information according to a three-color cycle mode (as illustrated by color-cycled content 405A) or a four-color cycle mode (as illustrated by color-cycled content 405B). For example, according to a three-color cycle secure display mode (405A), a first frame of the selected video content 403 may be converted to a first frame of color-cycled content 405A having only the red component of the first frame of selected video content 403. Such conversion of the selected video content 403 into color-cycled content 405 may be performed by the computing device 200 and/or by the display device 206. -
FIG. 5A illustrates transmission and display of content according to a first example secure display mode. In the illustrated secure display mode, selected video content 503 is converted to color-cycled content 505 using a four-color cycle. The selected video content 503 may be transmitted to the display device 206 using a standard transmission mode, and the display device 206 may convert the received video content to color-cycled content 505 for display. As illustrated, according to a normal transmission mode, three TMDS channels may be used to carry three color components (e.g., red, green and blue) of the original frame. For example, when frame 1 is transmitted during a first time period T1, TMDS Channel 0 may carry the red component information, TMDS Channel 1 may carry the green component information, and TMDS Channel 2 may carry the blue component information. The transmission proceeds similarly for a second frame, third frame, and so on. The display device 206, which converts the received transmitted video content 504 to color-cycled content 505, may simply discard unnecessary color information from transmitted video content 504. For example, to convert the first frame data of transmitted video content 504 to a first frame of color-cycled content 505 (e.g., to generate red frame 1 of color-cycled content 505), the display device 206 may simply discard the other color data of transmitted video content 504 (e.g., the green and blue frame 1 data of transmitted video content 504). In some cases, the display device 206 may process the received transmitted video content 504 in order to generate one or more frames of the color-cycled content 505. For example, the display device 206 may generate the illustrated white frame 4 of color-cycled content 505 by combining the red frame data, green frame data, and blue frame data of transmitted video content 504, then converting the combined data to black-and-white data (e.g., grayscale data).
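The display-side conversion just described (keeping one color component per frame, and building the white frame from grayscale-converted RGB data) might look like the following sketch. Pixels are modeled as (r, g, b) tuples; the equal-weight grayscale average is one possible choice and is an assumption, as the text does not specify the conversion formula.

```python
def keep_component(frame, component):
    """Zero out all but one color component (the 'discard' step above)."""
    idx = {"red": 0, "green": 1, "blue": 2}[component]
    return [tuple(v if i == idx else 0 for i, v in enumerate(px)) for px in frame]

def white_frame(frame):
    """Combine R, G, and B into grayscale for the white frame of a
    four-color cycle (equal-weight average; an assumed formula)."""
    return [(sum(px) // 3,) * 3 for px in frame]
```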
So, while Frames 1-4 might not each be displayed in their original full color spectrum, if those four color-cycled sequential frames are displayed quickly enough, the human eye might not notice the difference. A camcorder or smartphone camera, however, might capture the individual colors in some or all of the frames. - In the above example, some of the frame data transmitted on the TMDS Channels is not actually used. For example, for
Frame 1, which is displayed in red in the color-cycled content 505, the transmission of the blue and green components of Frame 1 might be unnecessary. FIG. 5B illustrates transmission and display of video content according to a second example secure display mode, in which those unnecessary color frames are simply omitted from the TMDS Channels. In the illustrated second example secure display mode, the computing device 200 may avoid transmitting unnecessary data (e.g., data that would be discarded by the display device 206). For example, instead of transmitting green color data and blue color data for frame 1 of selected video content 503, the computing device 200 may instead transmit blank data indicating that a frame has no green color data or blue color data. As another example, the computing device may transmit nothing at all on the transmission channels that would normally carry the green color data and blue color data. In some cases, the computing device 200 may transmit blank data in transmitted video content 506 in order to maintain compatibility with display devices 206 that do not support conversion of transmitted video content 506 into color-cycled content 505 at the display device 206. However, in some cases (e.g., if the display device 206 supports such a setting), the computing device 200 may simply transmit nothing instead of blank data and/or may transmit other data during time periods that would otherwise contain blank data of transmitted video content 506. In some cases, computing device 200 may process the selected video content 503 before transmission. For example, the illustrated frame 4 of selected video content 503 may be converted to grayscale by computing device 200 before separation into red, green, and blue components for transmitted video content 506, in order to yield white (grayscale) frame 4 of color-cycled content 505. - In the above example, which shows a four-color cycle, each TMDS Channel only transmits information about half of the time.
A three-color cycle, which omits the white frame, might transmit information in the TMDS Channels only a third of the time. In alternative embodiments, this fact may allow the TMDS Channels to transmit at a slower data rate than normal.
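The channel-utilization figures above (half of the time for a four-color cycle, a third of the time for a three-color cycle) can be checked with a small sketch. It assumes, per the text, that a white frame carries grayscale data on all three channels while a single-color frame uses only its own channel; the function name is illustrative.

```python
def channel_duty_cycle(cycle):
    """Fraction of frames in which each TMDS channel carries data: a
    channel is active for its own color's frame and for any white
    (grayscale) frame, which uses all three channels."""
    active = {c: 0 for c in ("red", "green", "blue")}
    for color in cycle:
        if color == "white":
            for c in active:
                active[c] += 1
        else:
            active[color] += 1
    return {c: n / len(cycle) for c, n in active.items()}
```

With the four-color cycle each channel is active for two of every four frames (its own color plus the white frame), matching the "half of the time" observation.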
FIG. 5C illustrates transmission and display of content according to a third example secure display mode in which a slower transmission frame rate is used. In the third example secure display mode, the frame rate of the transmitted video content 508 may be reduced in comparison to the frame rate of selected video content 503. In the illustrated example, because only one third of the color data for each frame of selected video content 503 is being transmitted as transmitted video content 508 (e.g., because in the three-color cycle, data for the other two colors is being discarded for each frame), the frame rate may be reduced by a factor of three, and three frames may be transmitted on three transmission channels in a given time period (e.g., frames 1-3 may be transmitted during a first time period, and frames 4-6 may be transmitted during a second time period, as illustrated). Thus, for example, 90 frames-per-second content may be transmitted at 30 frames-per-second by transmitting three sequential frames for display as color-cycled content 507 during the same time period. - The
FIG. 5C example also shows three frames (e.g., red Frame 1, green Frame 2, and blue Frame 3) being transmitted on the TMDS Channels at the same time (e.g., time period T1). The display device 206 may receive these multiple frames of transmitted video content 508 at the same time, store some or all of them in a buffer, and cause them to be displayed sequentially as frames of color-cycled content 507 in order to present the video content in secure display mode. As illustrated by FIG. 5C (with buffering and transmission delays not shown), a first frame of the selected video content 503 may correspond to a first frame of color-cycled content 507 displayed in a first color, a second frame of the selected video content 503 may correspond to a second frame of color-cycled content 507 displayed in a second color, and so on, with colors repeating in a three-color cycle in the illustrated example. - A secure display mode that uses a reduced transmission frame rate may beneficially provide more robust transmission characteristics, due to the lower data rate. Accordingly, such a reduced transmission frame rate mode may be beneficially useful with longer transmission cables, damaged transmission cables, or to address other causes of transmission signal degradation and/or signal loss, as further discussed for
steps of FIG. 3. - In the above example, the TMDS Channels were aligned, such that three frames were all received at the same time (T1), and the display device would buffer some of the frames until they were needed for display. In alternative embodiments, the frames may be staggered in the TMDS Channels to reduce the buffering requirement.
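The aligned and staggered schedules can be contrasted with a small sketch of per-frame transmission start times, where frames rotate across three channels. The stagger offset granularity is an assumption, as the text does not specify it; the function name is illustrative.

```python
def start_times(num_frames, period, stagger=0):
    """Start time of each frame's transmission. With stagger=0, each
    group of three frames starts together on the three channels (the
    aligned case, which requires buffering at the display); a nonzero
    stagger offsets each channel's start so transmissions overlap
    rather than align."""
    return [(i // 3) * period + (i % 3) * stagger for i in range(num_frames)]
```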
FIG. 5D illustrates transmission and display of content according to a fourth example secure display mode that staggers the frames. Similarly as for the third example secure display mode, in the fourth example secure display mode, the frame rate of transmitted video content 510 may be reduced in comparison to selected video content 503. However, rather than transmitting multiple frames aligned in a single time period, the computing device 200 may transmit multiple frames in overlapping, staggered time periods. Such a transmission configuration may beneficially reduce buffering at the computing device 200 and/or at the display device 206. - The
display device 206 may receive the multiple frames transmitted at overlapping, staggered times and cause them to be displayed sequentially to present the video content in secure display mode. As illustrated by FIG. 5D (with buffering and transmission delays not shown), a first frame of the selected video content 503 may correspond to a first frame of color-cycled content 507 of a first color, a second frame of the selected video content 503 may correspond to a second frame of color-cycled content 507 of a second color, and so on, with colors repeating in a three-color cycle in the illustrated example. -
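The buffering benefit of staggering can be illustrated with a toy timing model. The model below (one transmit period split evenly across the frames it carries) is an assumption for illustration, not the patent's exact timing: aligned transmission starts all frames together, so the display must hold the later ones, while staggered transmission offsets each start.

```python
# Illustrative (assumed) timing model comparing aligned vs. staggered
# transmission of several frames within one transmit period.

def start_times(num_frames: int, period: float, staggered: bool):
    """Return the transmit start time of each frame within one period."""
    if staggered:
        # Offset each frame by an equal fraction of the period.
        return [i * period / num_frames for i in range(num_frames)]
    return [0.0] * num_frames  # aligned: all frames begin together

aligned = start_times(3, 3.0, staggered=False)
stag = start_times(3, 3.0, staggered=True)
```

With aligned starts, two of the three frames must sit in a buffer awaiting display; with staggered starts, each frame can arrive closer to its display time.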
FIG. 6 illustrates another case of a maximum cycle duration 601 with respect to a selected video content 603 of a particular frame rate. In this case, the relatively low frame rate of the selected video content 603 in comparison to the maximum cycle duration 601 could cause excessive visible flickering for human viewers if the frame rate is not increased for color-cycled content. The maximum cycle duration 601 thus provides a frame rate criterion (e.g., a minimum frame rate). If the frame rate criterion is not satisfied (e.g., the video content 603 has a frame rate below a minimum frame rate), the computing device 200 may select an increased frame rate security mode. In other words, compared to the selected video content 403 of FIG. 4, the selected video content 603 of FIG. 6 may comprise fewer frames per second, such that cycling colors by causing a first frame of selected video content 603 to be displayed in a first color, a second frame of selected video content 603 to be displayed in a second color, and so on, may cause the maximum cycle duration 601 to be exceeded, which may cause undesirable effects for viewers, such as visible flickering of content. - In order to convert such selected
video content 603 to a secure display mode, a first frame of the selected video content 603 may be converted into multiple frames of color-cycled content 605, thereby increasing the frame rate of the color-cycled content. For example, frame 1 of selected video content 603, which may have three color components, may be converted into three sequential frames of color-cycled content 605A, each having a different color. As another example, frame 1 of selected video content 603 may be converted into four frames of color-cycled content 605B, each having a different color. Accordingly, the color-cycled content 605 may be increased in frame rate in comparison to the selected video content 603. Such conversion may be performed by the computing device 200 and/or by the display device 206. -
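The conversion of one source frame into three or four single-color frames, as in color-cycled content 605A and 605B, can be sketched as below. The pixel representation (tuples of RGB components) and the Rec. 601 luma weights used for the optional grayscale fourth frame are assumptions for illustration, not details specified by the patent.

```python
# Hypothetical sketch: split one RGB frame into three single-color frames
# (a 605A-style three-color cycle), optionally adding a grayscale fourth
# frame for a 605B-style four-color cycle.

def to_color_cycled_frames(frame, four_color=False):
    """frame: list of (r, g, b) pixels -> list of single-color frames."""
    red = [(r, 0, 0) for r, g, b in frame]
    green = [(0, g, 0) for r, g, b in frame]
    blue = [(0, 0, b) for r, g, b in frame]
    frames = [red, green, blue]
    if four_color:
        # Grayscale frame derived from all three components
        # (Rec. 601 luma weights assumed for illustration).
        gray = [(y, y, y) for y in
                (round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in frame)]
        frames.append(gray)
    return frames

three = to_color_cycled_frames([(10, 20, 30)])
four = to_color_cycled_frames([(100, 100, 100)], four_color=True)
```

Either the computing device 200 or the display device 206 could run such a conversion, consistent with the text above.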
FIG. 7A illustrates transmission and display of content according to a fifth example secure display mode that generates the increased frame rate color-cycled content 705 at the display device 206. In the fifth example secure display mode, the computing device 200 may transmit the selected video content 703 to the display device 206 as transmitted video content 704 according to a normal transmission mode (e.g., at the same frame rate as the selected video content). For example, first color data, second color data, and third color data for a first frame of selected video content 703 may be transmitted together on three corresponding transmission channels as transmitted video content 704. Then, the display device 206 may sequentially display the first color data as a first frame of color-cycled content 705, followed by the second color data as a second frame of color-cycled content 705, followed by the third color data as a third frame of color-cycled content 705, and so on. In some modes, the display device 206 may generate additional color-cycled content 705 frames based on the color data. For example, the display device 206 may generate a fourth color-cycled content frame in grayscale using the received first, second, and third color data (e.g., to generate four-color-cycle color-cycled content, such as color-cycled content 605B of FIG. 6). - In some cases (e.g., when a
display device 206 lacks the capability of generating the increased frame rate color-cycled content 705), the computing device 200 may increase a frame rate of transmitted video content 706. The computing device 200 may optionally transmit blank data (e.g., during time periods when a TMDS channel does not need to carry color information for display). FIG. 7B illustrates transmission and display of content according to a sixth example secure display mode. In the sixth example secure display mode, the computing device 200 may increase a frame rate of the transmitted video content 706 as compared to the selected video content 703. For example, selected video content 703 at 30 frames per second may be transmitted as transmitted video content 706 at 90 frames per second, with each frame of selected video content 703 split into three frames of transmitted video content 706 in different colors. In another example (not illustrated), selected video content 703 at 30 frames per second may be transmitted as transmitted video content at 120 frames per second, with each frame of selected video content 703 split into four transmitted frames in different colors. As illustrated, transmitted video content 706 may include a frame for display on only a single transmission channel of multiple transmission channels at a time, and the other transmission channels may contain blank data, no data at all, or other data. For example, during time period T1, TMDS Channel 0 may carry red color information for frame 1 of transmitted video content 706, and the second and third transmission channels may carry blank data (as illustrated), no data at all, or other data. The computing device 200 may transmit blank data in order to maintain compatibility with display devices 206 that do not support conversion of content at the display device 206 (e.g., for a legacy display device 206 that implements standard HDMI transmissions). 
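The sixth secure display mode's channel layout, in which each source frame becomes three higher-rate transmitted frames with only one active TMDS channel each, can be modeled minimally. The data representation, function name, and blank-channel placeholder below are illustrative assumptions.

```python
# Illustrative (assumed) model of the sixth secure display mode: one 30 fps
# source frame becomes three 90 fps transmitted frames, each carrying real
# data on only one TMDS channel while the other channels carry blank data.

BLANK = None  # placeholder for blank channel data

def to_high_rate_frames(r_data, g_data, b_data):
    """Return three transmitted frames as (ch0, ch1, ch2) channel tuples."""
    return [
        (r_data, BLANK, BLANK),  # T1: only channel 0 active (red)
        (BLANK, g_data, BLANK),  # T2: only channel 1 active (green)
        (BLANK, BLANK, b_data),  # T3: only channel 2 active (blue)
    ]

tx = to_high_rate_frames("R1", "G1", "B1")
```

A four-color variant would emit four transmitted frames per source frame (120 fps for 30 fps source), per the "not illustrated" example above.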
In some embodiments, each transmitted video content 706 frame at the higher frame rate may be transmitted with additional error correction information as compared to a normal transmission (e.g., one or more additional error correction bits per byte as compared to an HDMI normal transmission). In some cases, computing device 200 may process the selected video content 703 before transmission. For example, a grayscale frame may be generated by computing device 200, then separated into red, green, and blue components for transmission. - Turning back to
FIG. 3, at step 308, after selecting a secure display mode, the computing device 200 may convert the selected video content before transmission to the display device 206 in accordance with the selected secure display mode. According to some secure display modes, the computing device 200 may transmit the selected video content using conventional transmission methods (e.g., using standard HDMI transmissions), and the display device 206 may convert the transmitted content into color-cycled content. According to some secure display modes, the computing device 200 may replace certain color data of the selected video content with blank color data, replace certain color data with other data, and/or transmit nothing instead of transmitting certain color data. According to some secure display modes, the computing device 200 may lower the frame rate of the transmitted video content in comparison to the selected video content. According to some secure display modes, the computing device 200 may transmit multiple color-cycled content frames simultaneously and/or in overlapping time periods. According to some secure display modes, the computing device 200 may increase a frame rate of the transmitted video content in comparison to the selected video content. - At
step 309, the computing device 200 may transmit the converted video content to the display device 206. The video content may be transmitted across a local video connection, such as an HDMI connection. Accordingly, the various color data may be separated and transmitted across different transmission channels of the link. For example, as illustrated in FIGS. 5A-5D, 7A, and 7B, one transmission channel may be used to carry red data, one transmission channel may be used to carry green data, and one transmission channel may be used to carry blue data. However, other colors (e.g., cyan, yellow, and magenta) or color spaces (e.g., a YCC color space including one luma channel and two chroma channels) may be used for transmission. - At
step 310, if the computing device 200 receives a selection of new content (e.g., because the user selects different content, the previously-selected content ends and new content begins, etc.), the computing device 200 returns to step 304. Otherwise, the computing device 200 may continue transmitting the selected video content. - In some cases, in response to detecting a transmission error at
step 311, the computing device 200 may adjust the selected secure display mode to reduce a bandwidth of the transmission at step 312. The computing device 200 may receive an indication from the display device 206 that the display device 206 did not receive one or more frames, detected one or more errors, or otherwise received a degraded signal. Responsive to receiving the indication, at step 312, the computing device 200 may adjust one or more parameters of the secure display mode to lower a bandwidth of the transmission. For example, the computing device 200 may reduce a frame rate of the transmitted video content (e.g., as in FIG. 5C or 5D), or switch from a four-color cycle to a three-color cycle. The lower-bandwidth transmission may be more robust to signal degradation. - One or more aspects of the disclosure may be embodied in computer-usable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other data processing device. The computer-executable instructions may be stored on one or more computer-readable media such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. The functionality of the program modules may be combined or distributed as desired in various embodiments. In addition, the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), and the like. 
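The error-driven fallback of steps 311 and 312 above can be sketched as a small adjustment function. The mode representation and the specific fallback order (drop from a four-color to a three-color cycle first, then halve the frame rate) are assumptions for illustration; the patent names both adjustments but does not mandate an order.

```python
# Hypothetical sketch of the step 311/312 fallback: on a reported
# transmission error, lower the bandwidth of the secure display mode.

def reduce_bandwidth(mode):
    """mode: dict with 'cycle_colors' and 'frame_rate'; returns a new mode."""
    mode = dict(mode)  # avoid mutating the caller's mode
    if mode["cycle_colors"] > 3:
        mode["cycle_colors"] = 3  # e.g. four-color -> three-color cycle
    else:
        mode["frame_rate"] = mode["frame_rate"] // 2  # assumed halving step
    return mode

m1 = reduce_bandwidth({"cycle_colors": 4, "frame_rate": 120})
m2 = reduce_bandwidth(m1)  # a second error reduces the frame rate
```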
Particular data structures may be used to more effectively implement one or more aspects of the invention, and such data structures are contemplated within the scope of computer-executable instructions and computer-usable data described herein.
- Aspects of the disclosure have been described in terms of illustrative embodiments thereof. While illustrative systems and methods embodying various aspects of the present disclosure are described herein, it will be understood by those skilled in the art that the disclosure is not limited to these embodiments. Modifications may be made by those skilled in the art, particularly in light of the foregoing teachings. For example, each of the features of the aforementioned illustrative examples may be utilized alone or in combination or sub-combination with elements of the other examples. For example, any of the above-described systems and methods or parts thereof may be combined with the other methods and systems or parts thereof described above. For example, the steps illustrated in the illustrative figures may be performed in other than the recited order, and one or more steps illustrated may be optional in accordance with aspects of the disclosure. It will also be appreciated and understood that modifications may be made without departing from the true spirit and scope of the present disclosure. The description is thus to be regarded as illustrative instead of restrictive of the present disclosure.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/714,575 US11533450B2 (en) | 2017-09-25 | 2017-09-25 | Anti-piracy video transmission and display |
US17/980,426 US11930294B2 (en) | 2017-09-25 | 2022-11-03 | Anti-piracy video transmission and display |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/714,575 US11533450B2 (en) | 2017-09-25 | 2017-09-25 | Anti-piracy video transmission and display |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/980,426 Continuation US11930294B2 (en) | 2017-09-25 | 2022-11-03 | Anti-piracy video transmission and display |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190098251A1 true US20190098251A1 (en) | 2019-03-28 |
US11533450B2 US11533450B2 (en) | 2022-12-20 |
Family
ID=65808177
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/714,575 Active 2037-12-14 US11533450B2 (en) | 2017-09-25 | 2017-09-25 | Anti-piracy video transmission and display |
US17/980,426 Active US11930294B2 (en) | 2017-09-25 | 2022-11-03 | Anti-piracy video transmission and display |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/980,426 Active US11930294B2 (en) | 2017-09-25 | 2022-11-03 | Anti-piracy video transmission and display |
Country Status (1)
Country | Link |
---|---|
US (2) | US11533450B2 (en) |
Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050128348A1 (en) * | 2003-12-15 | 2005-06-16 | Eastman Kodak Company | Display apparatus and method for enabling artifact-free rapid image format changes |
US20060012598A1 (en) * | 2004-06-21 | 2006-01-19 | Che-Chih Tsao | Data writing methods for volumetric 3D displays |
US20100191541A1 (en) * | 2007-04-17 | 2010-07-29 | Prokoski Francine J | System and method for using three dimensional infrared imaging for libraries of standardized medical imagery |
US20100259474A1 (en) * | 2009-04-08 | 2010-10-14 | Gesturetek, Inc. | Enhanced handheld screen-sensing pointer |
US8272016B2 (en) * | 2008-02-07 | 2012-09-18 | Fujitsu Limited | Broadcast receiving system |
US20120287144A1 (en) * | 2011-05-13 | 2012-11-15 | Pixtronix, Inc. | Display devices and methods for generating images thereon |
US20130106923A1 (en) * | 2010-05-14 | 2013-05-02 | Dolby Laboratories Licensing Corporation | Systems and Methods for Accurately Representing High Contrast Imagery on High Dynamic Range Display Systems |
US20130300948A1 (en) * | 2012-04-13 | 2013-11-14 | Red.Com, Inc. | Video projector system |
US8727016B2 (en) * | 2010-12-07 | 2014-05-20 | Saudi Arabian Oil Company | Apparatus and methods for enhanced well control in slim completions |
US8855375B2 (en) * | 2012-01-12 | 2014-10-07 | Kofax, Inc. | Systems and methods for mobile image capture and processing |
US20150287354A1 (en) * | 2014-04-03 | 2015-10-08 | Qualcomm Mems Technologies, Inc. | Error-diffusion based temporal dithering for color display devices |
US20160037122A1 (en) * | 2013-03-15 | 2016-02-04 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US20160163356A1 (en) * | 2013-07-19 | 2016-06-09 | Koninklijke Philips N.V. | Hdr metadata transport |
US20160248989A1 (en) * | 2015-02-24 | 2016-08-25 | Newtek, Inc. | Method and Apparatus for Adaptively Mixing Video Source Signals |
US20160313866A1 (en) * | 2015-04-21 | 2016-10-27 | Dell Products L.P. | Managing Inputs at an Information Handling System by Adaptive Infrared Illumination and Detection |
US20160313842A1 (en) * | 2015-04-21 | 2016-10-27 | Dell Products L.P. | Disambiguation of False Touch Inputs at an Information Handling System Projected User Interface |
US20160313821A1 (en) * | 2015-04-21 | 2016-10-27 | Dell Products L.P. | Capacitive Mat Information Handling System Display and Totem Interactions |
US20160366444A1 (en) * | 2015-06-09 | 2016-12-15 | Microsoft Technology Licensing, Llc | Metadata describing nominal lighting conditions of a reference viewing environment for video playback |
US9551789B2 (en) * | 2013-01-15 | 2017-01-24 | Helmholtz Zentrum Munchen Deutsches Forschungszentrum Fur Gesundheit Und Umwelt (Gmbh) | System and method for quality-enhanced high-rate optoacoustic imaging of an object |
US20170257414A1 (en) * | 2012-01-26 | 2017-09-07 | Michael Edward Zaletel | Method of creating a media composition and apparatus therefore |
US20170339418A1 (en) * | 2016-05-17 | 2017-11-23 | Qualcomm Incorporated | Methods and systems for generating and processing content color volume messages for video |
US20180270467A1 (en) * | 2017-03-15 | 2018-09-20 | Sony Corporation | Display System and Method for Display Control of a Video Based on Different View Positions |
US20180359420A1 (en) * | 2017-02-24 | 2018-12-13 | Synaptive Medical (Barbados) Inc. | Video stabilization system and method |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4553176A (en) | 1981-12-31 | 1985-11-12 | Mendrala James A | Video recording and film printing system quality-compatible with widescreen cinema |
GB8825429D0 (en) | 1988-10-31 | 1988-11-30 | Raychem Ltd | Frame-sequential colour display system |
WO2000074366A2 (en) | 1999-05-27 | 2000-12-07 | Digital Electronic Cinema, Inc. | Systems and methods for preventing camcorder piracy of motion picture images |
KR100612831B1 (en) * | 2002-04-25 | 2006-08-18 | 삼성전자주식회사 | A method for color temperature conversion in image displaying device using contents description metadata of visual contents, and system using thereof |
US20070200920A1 (en) * | 2006-02-14 | 2007-08-30 | Walker Mark R | Digital communications adaptor |
JP4680166B2 (en) * | 2006-10-30 | 2011-05-11 | ソニー株式会社 | Imaging apparatus and imaging method |
US7978239B2 (en) * | 2007-03-01 | 2011-07-12 | Eastman Kodak Company | Digital camera using multiple image sensors to provide improved temporal sampling |
KR100866278B1 (en) * | 2007-04-26 | 2008-10-31 | 주식회사 코아로직 | Apparatus and method for making a panorama image and Computer readable medium stored thereon computer executable instruction for performing the method |
US8593476B2 (en) * | 2008-02-13 | 2013-11-26 | Gary Demos | System for accurately and precisely representing image color information |
US8817120B2 (en) * | 2012-05-31 | 2014-08-26 | Apple Inc. | Systems and methods for collecting fixed pattern noise statistics of image data |
US9979960B2 (en) * | 2012-10-01 | 2018-05-22 | Microsoft Technology Licensing, Llc | Frame packing and unpacking between frames of chroma sampling formats with different chroma resolutions |
KR101804925B1 (en) * | 2014-01-06 | 2017-12-05 | 엘지전자 주식회사 | Method and device for transmitting and receiving broadcast signal on basis of color gamut resampling |
CN106233706B (en) * | 2014-02-25 | 2020-01-03 | 苹果公司 | Apparatus and method for providing backward compatibility of video with both standard and high dynamic range |
US20150326846A1 (en) * | 2014-05-12 | 2015-11-12 | DDD IP Ventures, Ltd. | Systems and methods for processing video frames |
US9524092B2 (en) * | 2014-05-30 | 2016-12-20 | Snaptrack, Inc. | Display mode selection according to a user profile or a hierarchy of criteria |
US9736335B2 (en) * | 2015-04-15 | 2017-08-15 | Apple Inc. | Techniques for advanced chroma processing |
US10848795B2 (en) * | 2015-05-12 | 2020-11-24 | Lg Electronics Inc. | Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal and method for receiving broadcast signal |
US10171849B1 (en) * | 2015-07-08 | 2019-01-01 | Lg Electronics Inc. | Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method |
US10203064B2 (en) * | 2016-04-29 | 2019-02-12 | GCX Corporation | Locking release mechanism for an articulated support arm |
IL287380B2 (en) * | 2016-08-22 | 2024-03-01 | Magic Leap Inc | Virtual, augmented, and mixed reality systems and methods |
Also Published As
Publication number | Publication date |
---|---|
US20230139203A1 (en) | 2023-05-04 |
US11930294B2 (en) | 2024-03-12 |
US11533450B2 (en) | 2022-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230328305A1 (en) | Method and device for adapting the video content decoded from elementary streams to the characteristics of a display | |
US8732353B2 (en) | Transmitter device, receiver device, transmission method, reception method, and transmitter/receiver device | |
US10999554B2 (en) | Communication device and communication method | |
WO2016027423A1 (en) | Transmission method, reproduction method and reproduction device | |
JP4743196B2 (en) | Electronic device and loop determination method in electronic device | |
US9357192B2 (en) | Video sender and video receiver | |
US20120054664A1 (en) | Method and systems for delivering multimedia content optimized in accordance with presentation device capabilities | |
KR20080058740A (en) | Digital broadcasting receive apparatus and method for syncronization thereof | |
US20240259542A1 (en) | Dynamic distribution of three-dimensional content | |
US11122245B2 (en) | Display apparatus, method for controlling the same and image providing apparatus | |
US20110176496A1 (en) | On-the-fly video quality switching for video distribution networks and methods therefor | |
US7176980B2 (en) | Method and apparatus for verifying a video format supported by a display device | |
CN110121887A (en) | Branch equipment Bandwidth Management for video flowing | |
US11930294B2 (en) | Anti-piracy video transmission and display | |
JP2017220690A (en) | Reproducer, display device and transmission method | |
JPWO2018003643A1 (en) | Video display device, television receiver, transmission device, control program, and recording medium | |
KR101116071B1 (en) | Broadcasting system for integrated controlling of mixed audio video signal using serial digital interface and method thereof | |
US20110176495A1 (en) | Video distribution network and methods therefor | |
Supa’at et al. | HIGH DEFINITION TELEVISION SYSTEM | |
Povozniucov | HDMI propojovací systém | |
EP1816859A1 (en) | Method of transmitting multimedia programs and devices involved in this transmission |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: COMCAST CABLE COMMUNICATIONS, LLC, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HERBERT, MICHAEL;REEL/FRAME:043822/0189 Effective date: 20170922 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |