WO2001056279A2 - Cinema anti-piracy measures - Google Patents

Cinema anti-piracy measures

Info

Publication number
WO2001056279A2
Authority
WO
WIPO (PCT)
Prior art keywords
entities
modulated
source material
modulation
video
Prior art date
Application number
PCT/US2001/002721
Other languages
French (fr)
Other versions
WO2001056279A3 (en)
Inventor
Herschel C. Burstyn
Original Assignee
Sarnoff Corporation
Priority date
Filing date
Publication date
Priority claimed from US09/592,472 external-priority patent/US7324646B1/en
Priority claimed from US09/679,320 external-priority patent/US7634089B1/en
Application filed by Sarnoff Corporation filed Critical Sarnoff Corporation
Priority to AU2001234600A priority Critical patent/AU2001234600A1/en
Publication of WO2001056279A2 publication Critical patent/WO2001056279A2/en
Publication of WO2001056279A3 publication Critical patent/WO2001056279A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N5/913 Television signal processing therefor for scrambling; for copy protection
    • H04N2005/91357 Television signal processing therefor for scrambling; for copy protection by modifying the video signal
    • H04N2005/91392 Television signal processing therefor for scrambling; for copy protection using means for preventing making copies of projected video images

Abstract

An anti-piracy system introduces distortion (204) into a recorded image, while maintaining a high quality projected image. The video source material includes modulated entities for providing artifacts incompatible with the content of the video source material and selectively deliverable information that the projection system uses to demodulate the entities. A projector receives the information about the modulated entity and corrects, or demodulates, the entity (210). The projection system (208) also imposes a recording device dependent interference on the projected images.

Description

CINEMA ANTI-PIRACY MEASURES
CROSS REFERENCE TO RELATED APPLICATIONS
This application relates to, and claims priority from, Provisional Application Serial Number 60/178,618, entitled "ANTI-CINEMA PIRACY," filed on January 28, 2000, Provisional Application Serial Number 60/188,897, entitled "ANTI-PIRACY USING CHROMATIC AND INTENSITY FLICKER MEASURES," filed on March 13, 2000, Provisional Application Serial Number 60/195,612 entitled "ANTI-PIRACY AND WATERMARK TECHNOLOGY," filed on April 6, 2000, Provisional Application Serial Number 60/199,065 entitled "EXTENDING ANTI-PIRACY TECHNIQUES TO FILM BASED PLATFORMS," filed on April 20, 2000, Non-provisional Application Serial Number 09/592,472 entitled "METHOD AND APPARATUS FOR FILM ANTI-PIRACY," filed on June 9, 2000 and Non-provisional Application Serial No. 09/679,320 entitled "CINEMA ANTI-PIRACY MEASURES," filed on October 4, 2000, which are assigned to the same assignee and are incorporated by reference herein.
FIELD OF THE INVENTION
The invention is related to the field of film and video projection, and more particularly to preventing the illegal recording of film and video.
BACKGROUND OF THE INVENTION
Each year the film industry loses millions of dollars in revenue due to the illegal copying and reselling of movies. Movie pirates illegally copy movies by capturing the projected image with a video-recording device, such as a camcorder. Camcorders can be used in a dark movie theater to illegally record both the projected image and the corresponding audio. The illegal copy of the image, recorded on videotape, can be repeatedly recopied and sold to the public. Movie pirates often package these illegal copies in a manner to cause a purchaser to believe that a legitimate copy of the movie has been purchased.
In response to widespread cinema piracy, there have been various methods attempted to distort the projected image such that an illegal copy is unpleasant to view. No acceptable methods exist, however, for adding distortion without unacceptably degrading the projected image as it plays to the legitimate viewers. In addition, no methods have been suggested for altering the video source material in such a way that it is unpleasant to view in its raw form. There is a need, therefore, for a system and method for distorting an illegally recorded image, while still maintaining a high quality image for the legitimate viewing audience.
SUMMARY OF THE INVENTION
An anti-piracy system according to the principles of the invention introduces distortion into video source material and into an illegally copied image, while maintaining a high quality image for viewing by the legitimate audience. In an exemplary system for distorting a recording of projected images, the video source material includes modulated entities for providing artifacts incompatible with the content of the video source material and selectively deliverable information that the projection system uses to demodulate the entities. The modulated entities can be, for example, shapes on the video content having an emphasized or de-emphasized color. The projection system receives the information specifying the modulated entity (modulation information) and corrects, or demodulates, the entity. The projection system can also impose a recording device dependent interference on the projected images. In this manner, both the originating video source material and recorded copies contain artifacts that degrade content of the material while maintaining high quality for legitimately viewed renditions.
BRIEF DESCRIPTION OF THE DRAWINGS
The features of the invention will appear more fully upon consideration of the embodiments to be described in detail in connection with the accompanying drawings, in which: FIGs. 1A and 1B are timing diagrams for a projected image and a video recorder;
FIG. 2 is a functional block diagram for an exemplary system according to the principles of the invention;
FIGs. 3A to 3C illustrate exemplary video source material in accordance with the principles of the invention;
FIG. 4 illustrates exemplary modulation according to the principles of the invention; FIG. 5 illustrates exemplary demodulation according to the principles of the invention;
FIG. 6 illustrates an exemplary system for correcting video content and introducing alterations in recorded images according to the principles of the invention; and
FIG. 7 shows another exemplary system for introducing alterations in recorded images according to the principles of the invention.
DETAILED DESCRIPTION
This detailed description sets forth exemplary methods and systems for distorting an image recorded from projected film or video without appreciably degrading the projected information. An exemplary system introduces a distorting signal to the projected image that is substantially imperceptible to a viewer. When a recording camera records the projected image, a viewable distortion appears during playback of the recorded image. The video source material serves as the platform for a distorting signal, which decreases the availability of inexpensive countermeasures, makes the video source material unpleasant to view in its raw form, and permits the purveyor to retain control of the video source material content.
A. Projection and Recording
Projecting a series of slightly different images that are changed at a rate faster than is perceived by the eye creates motion pictures. The eye ignores the disruption in the projected image by integrating a previous image with a subsequent image. In FIG. 1A, a film projector timing diagram 100 illustrates the projection of individual images within five frames 110, 120, 130, 140 and 150. In this example, a first image is projected for a finite time, as represented by frame 110. The image changes at the conclusion of the frame time. To make this change, the projected image is interrupted for a short time, as illustrated by interval 115. During the interval 115, a second image is prepared for projection, such as by placing image data before the projector light source (not shown). The second image is projected for the duration of the second frame, as at 120. At the conclusion of this frame 120, the projected image is again interrupted to change to a third image. The interruption time is represented by interval 125. For each image contained on the film there is a projection period and a corresponding interruption period. The interruptions are represented in the timing diagram 100 at 125, 135, 145 and 155.
The projection can be characterized by various parameters, including the frame rate and the duty cycle. For the projection represented in diagram 100, the effective frame rate is 24 frames per second. (The actual frame rate is 48 frames per second, but the same image is projected twice.) The projection to interruption ratio in each frame is known as the duty cycle, which impacts perceived brightness and the strength of the interference in the recorded image. Viewers cannot discern distinct frames when the frame rate exceeds a certain frequency. The eye retains the previous image and integrates it with the next image. Because the images are slightly different, the audience sees motion.
A video-recording device operates similarly to a film or video projector (images are recorded during finite periods). FIG. 1B illustrates a video recording timing diagram 170, where the recording device records image data during finite periods. The field exposures occur at a regular rate, such as 60 Hz for NTSC (National Television Standards Committee) and 50 Hz for PAL (Phase Alternating Line). In NTSC, video recording occurs at 30 frames per second, as two interlaced (odd line and even line) scans of an image are recorded in each cycle. This rate can be controlled by, for example, a mechanical or electronic shutter, which is closed for only a fraction of each field. In the diagram 170, image data is recorded in the frames shown at 172, 174, 176, 178, 180 and 182, with non-recording periods shown at 185, 187, 189, 191 and 193. During the playback of the recorded image, the interruptions of the recorded image remain undetected, because the eye integrates the image from one field to the next. The integrating over a finite period typical of a taking camera provides a basis for introducing recording device dependent interference in a projected image.
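As a rough numerical illustration of this integration effect, the sketch below integrates a 48 Hz double-shuttered projection over successive 60 Hz camera fields (both figures taken from the text above; the 0.5 duty cycle is an assumed value, not one given in the patent). The per-field exposure drifts with a beat period of 1/|60 - 48| s, i.e. five fields, even though a human viewer perceives steady light.
```python
import numpy as np

SHUTTER_RATE = 48.0   # projector shutter openings per second (24 fps, each frame shown twice)
FIELD_RATE = 60.0     # NTSC field exposures per second
DUTY_CYCLE = 0.5      # assumed fraction of each shutter cycle during which light reaches the screen

def light_on(t):
    """1.0 while the projector shutter is open, 0.0 during the interruption."""
    phase = (t * SHUTTER_RATE) % 1.0
    return (phase < DUTY_CYCLE).astype(float)

def field_exposures(n_fields=30, samples_per_field=4000):
    """Integrate screen light over each camera field exposure."""
    exposures = []
    for n in range(n_fields):
        t = np.linspace(n / FIELD_RATE, (n + 1) / FIELD_RATE,
                        samples_per_field, endpoint=False)
        exposures.append(light_on(t).mean())
    return np.array(exposures)

if __name__ == "__main__":
    e = field_exposures()
    # Per-field exposure drifts with a beat period of 1 / |60 - 48| s, i.e. five fields
    # (a 12 Hz variation): invisible in the theater, but present in the recording.
    print(np.round(e, 3))
```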
B. Functional Block Diagram of An Exemplary System
A functional block diagram for an exemplary system 200 for distorting a recorded image is illustrated in FIG. 2. In this system 200, a modulator 204 modulates selected parameters of video content 202 to render the video content unpleasant to view. The video content is the information that is intended for projection, such as the scene information in a motion picture for display in a cinema. The modulator 204, as will be more fully explained hereafter, modulates parameters of selected entities in the video content. The modulated entities, also called multi-frame entities, are realizable images that can be viewable during projection of the modulated video content. The modulator 204 provides the video source material 206 for the system, which contains the video content, the modulated entities, and the modulation information. The modulation information specifies the modulated entities and is used to demodulate the video source material 206 during projection.
A dashed line 207 indicates that the remaining functions of the system 200 are typically carried out physically remote from the functions described above. A projection system 208, consisting of a modulator/demodulator 210 and a projector 212, processes the video source material 206 to provide projection of the video content to a viewing audience. The modulator/demodulator 210 processes the video source material and performs two functions. The demodulator 210 removes (or demodulates) the modulated entities using the modulation information provided in the video source material, and the modulator 210 imparts a recording device dependent interference on the video source material. A recording device dependent interference is an interference that is imperceptible to a human viewer but which appears on a recording of the projected video content. The projector 212 projects the video content with the recording device dependent interference 214.
In this exemplary projection system 208, the modulator/demodulator 210 is shown as a single functional block, because the same processing that demodulates the modulated entities imparts the recording device dependent interference. Separate modulators and demodulators can be used, however, without departing from the principles of the invention. Likewise, the modulator/demodulator is shown acting upon the video source material prior to projection. The projection functions, however, can be implemented prior to modulation and demodulation functions without departing from the principles of the invention.
The video source material 206 can be film, a digital video signal, or any video information that can be used by projection apparatus to create projected or displayed images. When the video source material 206 is film, the modulation information can be encoded on the film, and the projection system 208 includes a decoder (not shown) for decoding the modulation information. The modulation information and the video content need not comprise a single information entity, however. Where the modulation information and video content do not constitute a single entity, the dashed line 207 is a link for transferring the information to the projection system 208.
In one embodiment, the modulation information is downloadable from a remote source over a communications link. This permits the film purveyor to provide the modulation keys separately from the video content, rendering successful countermeasures unlikely. Of course the provision of the modulation information need not occur statically, but can be provided dynamically while the video content is being shown. The system could include a transmitter and receiver for establishing the link, so that the purveyor could provide the modulation keys without providing a permanent copy of the information (as would be the case if the information were part of the video content).
C. Exemplary Video Source Material
Video source material according to the principles of the invention can include the video content, entities that are incompatible with the video content, and the keys the projection system requires to render the entities imperceptible to a human during projection. The entities can be any realizable image, including a unique watermark. The keys are selectively deliverable to those who are approved to show the video content. Selectively deliverable means that the content provider can control who receives the information required to remove the entities. Without the keys, entities that can be seen by a human will appear on the video content.
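The separation described above can be pictured with a small data-model sketch. The class and field names below are illustrative only; the patent does not define a data format. The point is simply that the keys travel separately from the content and are required to remove the entities.
```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ModulatedEntity:
    """A realizable image (shape, watermark, ...) imposed on the video content."""
    shape: str        # e.g. "triangle"
    color: str        # emphasized or de-emphasized color
    centroid: tuple   # position on the video content
    depth: float      # modulation depth above the perceivable threshold

@dataclass
class VideoSourceMaterial:
    content: object                   # film scan or digital video frames
    entities: List[ModulatedEntity]   # visible artifacts unless demodulated
    keys: Optional[dict] = None       # selectively delivered modulation information

def project(material: VideoSourceMaterial, delivered_keys: Optional[dict]) -> str:
    """Only an approved projection system holding the keys can demodulate the entities."""
    if delivered_keys is None:
        return "artifacts visible"    # raw material is unpleasant to view
    return "entities demodulated, recording device dependent interference added"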
In one embodiment, entities are created by emphasizing or de-emphasizing selected colors on the video source material, upsetting the correct color balance for the rendition. This color modulation can be used to create colored shapes on the video content. FIG. 3A illustrates a color modulation diagram 300 according to the principles of the invention. The X-axis plots a mapping to position on the video content. The Y-axis plots a modulation parameter, which, in this instance, is color intensity (luminance or irradiance), although the parameter can be any of several, such as duty cycle or frequency. The resulting curve 302 defines a modulation envelope as a function of position on the video content. The value BL is the intensity base line for the color represented in the diagram 300. The base line is the average value for the rendition of the color that makes the color compatible with the video content. If the color intensity magnitude is emphasized or de-emphasized above a perceivable threshold, an artifact appears on the video content. Video material having a color with this exemplary modulation diagram will have the intensity of the color vary as a function of position on the video content. The mapping is selected to create the desired shapes.
FIG. 3B illustrates exemplary video content 350 having a modulated entity 352 (video source material) imposed according to the modulation function 302 of FIG. 3A. The video source material is shown as if projected without post-processing to remove the artifact and without the remaining screen information. The shape of the artifact 352 is a triangle, which is created by mapping color gain to positions on the video content. For example, with reference to FIG. 3A, the flat portion 303 of the modulation curve 302 can represent color gain that is mapped to positions on the video content that cause a colored triangle to appear on the rendition, as in FIG. 3B.
The crosshatched area 356 defines a proximity of decreasing intensity corresponding to the tapering sections 308a and b of the modulation curve 302 of FIG. 3A. This proximity represents an intensity gain boost outside of the visible threshold. Decreasing the gain boost near the edges of the shape provides an error bar for possible misalignment with compensating demodulation that occurs during projection. The demodulation process will impose a color de-emphasis mapped to the same or nearly same position as the color emphasis.
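A minimal sketch of such a position-dependent envelope, assuming a one-dimensional cross-section through the shape: the gain sits at the baseline BL outside the entity, rises to a flat boost inside it, and tapers near the edges to tolerate small misalignment with the compensating demodulation. The specific numbers (BOOST, start, end, taper) are illustrative, not values from the patent.
```python
import numpy as np

BL = 1.0        # color-intensity baseline (gain compatible with the content)
BOOST = 0.15    # assumed flat emphasis inside the shape, above the perceivable threshold

def modulation_envelope(x, start=0.30, end=0.70, taper=0.05):
    """Color gain as a function of normalized position x across the video content."""
    gain = np.full_like(x, BL)
    inside = (x >= start) & (x <= end)
    gain[inside] = BL + BOOST
    # taper the boost near the edges: the "error bar" for demodulation misalignment
    rising = (x >= start - taper) & (x < start)
    falling = (x > end) & (x <= end + taper)
    gain[rising] = BL + BOOST * (x[rising] - (start - taper)) / taper
    gain[falling] = BL + BOOST * ((end + taper) - x[falling]) / taper
    return gain

x = np.linspace(0.0, 1.0, 11)
print(np.round(modulation_envelope(x), 3))   # baseline outside, boosted inside the shape
```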
The table 370 of FIG. 3C represents exemplary modulation information for the video source material. The information can include shape, center position on the video content, orientation, color(s), modulation taper, size, vertices, parameter information, mapping information, synchronization information, or any other information required to remove the artifact upon projection. The modulation information further includes the keys to demodulate the video source material, and will depend upon the specific modulation in use.
Information concerning gain changes (or other modulation) is synchronized with the frame information. Where the video source material is film, the information can be encoded on the film and synchronized with the frame, similar to the encoding of audio information. Much of the pattern information can be stored in the modulator/ demodulator, and the information stored on the film can provide selected information for carrying out the demodulation. For example, information such as shapes, modulation area and modulation style can be downloaded to the theater independent of the video content, and the content provider can limit its availability to the duration of the film showing. The track on the film then calls out the shape and provides a scale factor and an orientation for the shape, a location for the centroid, the modulation style (temporal or spatial, for example), the modulation depth, the taper (if any), and the modulation rate or scale factor.
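A sketch of what one frame-synchronized record of that track might carry, using the fields named above. The dictionary keys and example values are hypothetical; the actual encoding on the film is not specified here.
```python
# One frame-synchronized modulation-information record (illustrative values only).
modulation_record = {
    "frame": 1234,             # frame the record is synchronized with
    "shape": "triangle",       # called out from patterns stored in the modulator/demodulator
    "scale_factor": 0.4,       # size of the shape relative to the frame
    "orientation_deg": 15.0,
    "centroid": (0.55, 0.40),  # normalized position on the video content
    "style": "spatial",        # temporal or spatial modulation
    "depth": 0.03,             # modulation depth
    "taper": 0.05,             # edge taper, if any
    "rate": 3.0,               # modulation rate or scale factor
    "colors": ["red", "green"],
}
```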
Apparatus for creating the video source material can include color separators, amplifiers, filters and integrators. For example, for a film format, video content can be color separated and selected colors emphasized or de-emphasized spatially in each color field for each frame before the distribution copy is printed. Other methods and hardware should be apparent for imposing artifacts on video content according to the principles of the invention.
D. Exemplary Modulation/Demodulation—Color
Color modulation of a multi-frame entity is illustrated in the diagram 400 of FIG. 4. An entity need not be a physical image in the frame. In pyramid processing, for example, an entity can be a level in the resolution decimation. A spatial entity, for another example, is an area in the image defined by some algorithm. Generally, a multi-frame entity has a lifetime of a plurality of frames.
This multi-frame entity 401 is an orange patch in the image field formed by displaying red 402 and green 404 signals nominally 180 degrees out of phase. The signals are shown as square waves, although other signal shapes, such as sinusoids, are appropriate. The frame period 406 is greater than the modulation period 408 for the green 404 and red 402 signals. The modulation is sufficiently fast that a viewer perceives the effective color without flicker. An explanation of flicker and modulation thresholds is given in Wyszecki and Stiles, Color Science: Concepts and Methods, Quantitative Data and Formulae, John Wiley & Sons, 2d ed., at 557 to 567. In addition, FIG. 4 illustrates color modulation at the effective frame rate, as at 420. This effectively parses the multi-frame entity into a frame dependent entity and a frame independent entity. A recording camera sweeps the time-integrated signal at the end of every frame. For the first frame on the left 410, the area 412 under the green signal curve 404 represents the time-integrated value of the green signal 404 in the first frame 410. This area exceeds the area 414 under the red signal curve 402 for the same period 410. This frame, therefore, will be recorded with excess green. Similarly, the sixth frame 416 exhibits an excess of the red signal 418, which will be recorded by a taking camera integrating over the frame period. The recorded image will therefore exhibit a modulation period of about 10 frames, or based upon currently accepted video standards, a frequency of 3 Hz. The modulation threshold at 3 Hz is about 3%, and a recorded color modulation exceeding this threshold will appear as flicker in a playback of the recorded images.
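The frame-by-frame integration can be reproduced numerically. In the sketch below, a 30 fps taking camera (from the NTSC discussion above) integrates a red/green drive; the 63 Hz drive frequency is an assumed value chosen so the per-frame residue drifts at about 3 Hz with a roughly ten-frame period, matching the behavior described above. The displayed patch averages to a steady mixture for the eye, but the recorded frames alternate between green-heavy and red-heavy.
```python
import numpy as np

FRAME_RATE = 30.0   # taking-camera frames per second (NTSC, as above)
F_MOD = 63.0        # assumed red/green drive frequency; |63 - 2 * 30| = 3 Hz residue

def square(t, freq, phase=0.0):
    """A +/-1 square wave standing in for the on/off color drive."""
    return np.sign(np.sin(2 * np.pi * freq * t + phase))

def per_frame_color_excess(n_frames=40, samples_per_frame=2000):
    """Integrate red and green over each camera frame; return green minus red."""
    excess = []
    for n in range(n_frames):
        t = np.linspace(n / FRAME_RATE, (n + 1) / FRAME_RATE,
                        samples_per_frame, endpoint=False)
        green = 0.5 * (1 + square(t, F_MOD))              # 0/1 drive
        red = 0.5 * (1 + square(t, F_MOD, phase=np.pi))   # nominally 180 degrees out of phase
        excess.append(green.mean() - red.mean())          # normalized per-frame integral
    return np.array(excess)

if __name__ == "__main__":
    e = per_frame_color_excess()
    # The excess swings between green-heavy and red-heavy frames with a period of
    # roughly ten frames (about 3 Hz), which plays back as visible color flicker.
    print(np.round(e[:20], 3))
```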
The color modulation induced flicker described with reference to FIG. 4 can also be used to demodulate the modulated entities described with reference to FIG. 3. Referring to FIG. 5, a modulation diagram 500 illustrates color modulation that demodulates entities imposed on the video source material. The modulation signal 502 has properties as described with reference to FIG. 4, such as having a period different from the frame period. The signal 502 is further modulated by the modulation envelope of the modulated entity, such as the envelope 302 of FIG. 3. The signal 502 is modulated such that it returns the modulated entity to the color base line. For example, if the multi-frame entity is a shape with a color emphasis, the signal 502 is modulated to de-emphasize the color and return it to the color baseline. The modulating signal 502 is chosen to implement the complement of the modulated entity.
Various other modulation/demodulation schemes also can be used without departing from the principles of the invention. For example, here the modulation signal 502 is a sinusoid, although it could be some other signal shape, such as a square wave. The modulation signal 502 is also illustrated with amplitude modulation, although other parameters can be modulated, such as duty cycle, frequency, or phase. Also, the signal 502 can represent a single color, or multiple color signals can be used, such as red and green to display an orange as in FIG. 4.
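Continuing the position-dependent envelope sketch from above, demodulation can be pictured as driving the projector with the complement of the imposed envelope so that emphasis and de-emphasis sum back to the baseline. This reuses the hypothetical BL, x and modulation_envelope from the earlier sketch and uses a simplified additive gain model, which is an assumption rather than the patent's stated arithmetic.
```python
def demodulation_envelope(x):
    """Complement of the imposed envelope: de-emphasize exactly where the source emphasizes."""
    return 2 * BL - modulation_envelope(x)

# Emphasis applied in the source plus de-emphasis applied at the projector returns the
# color to its baseline (to within the edge taper allowed for alignment error).
restored = modulation_envelope(x) + demodulation_envelope(x) - BL
print(np.round(restored, 3))   # approximately BL everywhere
```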
Spatial modulation can also be used to create artifacts. In spatial modulation, an image exhibiting periodicity (such as irradiance periodicity) with respect to space is imposed on the video content. The periodicity is designed to beat with the periodicity of a taking camera, introducing moiré in the recorded image. For example, if a spatial frequency on a CCD for a taking camera is f1, a multi-frame entity image with a spatial frequency f2 will cause moiré due to frequency components at f1+f2 and f1-f2. The periodicity can be introduced in the orange triangle of FIG. 4 by constructing the triangle with a checkerboard or line pattern. The color emphasis or de-emphasis in the video source material and in the demodulation is controlled by on-areas and off-areas instead of on-off times as in the time multiplexed color modulation discussed with respect to FIG. 4.
The appropriate spatial frequencies for use can be obtained from the parameters of the taking camera. Taking cameras with 1024 pixels in the horizontal and vertical directions and a field of view of 50 degrees are currently available. Where every two pixels is a cycle, 1024 pixels is 512 cycles. Such a camera dictates a desired modulation of approximately 10 cycles per degree to achieve noticeable artifacts. Eye sensitivity at this spatial frequency is low; therefore, the periodicity of the imposed image should vary between 10 and 40 cycles per degree.
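The spatial-frequency arithmetic above can be written out directly. This sketch uses the figures from the text (1024 pixels across a 50 degree field of view, two pixels per cycle) and the f1 plus/minus f2 beat components; the 12 cycles-per-degree imposed pattern is an assumed example within the 10 to 40 cycles-per-degree band.
```python
PIXELS = 1024
FOV_DEG = 50.0

camera_cpd = (PIXELS / 2) / FOV_DEG          # about 10.2 cycles/degree for the taking camera
print(round(camera_cpd, 2))

def moire_components(f1, f2):
    """Beat (moire) spatial frequencies produced when an f2 pattern is sampled against f1."""
    return f1 + f2, abs(f1 - f2)

# An imposed pattern in the 10-40 cycles/degree band, where eye sensitivity is low,
# beats against the camera's roughly 10 cycles/degree sampling structure.
print(moire_components(camera_cpd, 12.0))
```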
E. Exemplary Modulation/demodulation Systems
In FIG. 6, a system for altering a projected image using chromatic modulation of the image on a time basis is shown. In this system, an image splitter 1100 separates the frame information 900 provided by the video source material into frame-dependent (frame-linked) and frame-independent entities, as described with reference to FIG. 4. A separator 1140 separates the colors in the frame-independent and frame-linked entities. For the frame-independent entities, the separate signals are redefined with time-multiplexed values by a processor 1115. An order of presentation of coarse and fine bits is defined for each color 1130. For instance, in one color channel a first frame of a frame pair can have coarse (wide time interval) intensity data presented at the end of the frame, while the subsequent frame has coarse data presented at the beginning of the frame. The frame pair for the second color channel can have its coarse data presented in reverse order. Bright data and dim data can be effectively clustered while maintaining average intensity values. A processor 1130 then combines this presentation data with the color-separated frame-linked entities. Also in this exemplary system, a white light source 940 provides white light to a separator 960, which splits the white light 940 into component colors. A red modulator 1010, green modulator 1020 and blue modulator 1030 are responsive to these component colors and to the frame-linked and frame-independent entities and modulate the separated color image data for the entities. The color image data is chosen to be the complement of the data used to create the modulated entity. The modulated color image data is combined by the combiner 970, resulting in a displayed image with color modulation 1150 and with the modulated entities removed (as to a human viewer).
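The coarse/fine presentation ordering can be pictured as a small lookup: within a frame pair, one channel presents its coarse (wide time interval) data at the end of the first frame and the start of the second, while the other channel reverses that order. A minimal sketch; the channel names and the two-way coarse/fine split are illustrative, not a specification from the patent.
```python
def presentation_order(channel):
    """(first frame, second frame) placement of coarse intensity data within one frame pair."""
    orders = {
        "red":   ("coarse at end of frame", "coarse at start of frame"),
        "green": ("coarse at start of frame", "coarse at end of frame"),  # reversed order
    }
    return orders[channel]

# Bright (coarse) data cluster near the frame boundary in one channel and away from it
# in the other, while each channel's average intensity over the pair is unchanged.
print(presentation_order("red"))
print(presentation_order("green"))
```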
An exemplary modulator 700 for spatial-based intensity modulation is shown in FIG. 7. As in time-based modulation, the frame information 702 is input to the modulator from the video source material, and the modulated entities are identified 704. The entities to be spatially modulated are color-separated 706 and color and intensity are redefined into spatially defined values 708. These values will be the complement of the values of the modulated entities, resulting in demodulation of the entities. The pattern, intensity, order and duty cycle for each pattern are defined for each color channel, as at 709. This information can be stored on the modulator and called by the modulation information. A laser drive and deflection system 710 causes red 712, green 714 and blue 716 lasers to write the video content with the desired complementary values and periodicity. Use of the laser allows sufficient modulation depth (difference between bright and dim areas) for a noticeable beating effect.
To accomplish the rendition of the video content, the spatial information unrelated to the modulated entities is also color separated, as at 718. This information is modulated in the red 720, green 722 and blue 724 channels and combined as at 726. A white light source 728 is color separated 730 into red, green and blue and provides the excitation for the modulators 720, 722 and 724. The laser rendered information and the information rendered by the modulators combine to form the image on the screen 736.
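One way to picture the spatially defined values is a checkerboard on-area/off-area mask at the chosen spatial frequency, with the laser-written demodulation values as its complement. A rough sketch only; the 64 x 64 grid and the period of 8 samples are arbitrary choices, not parameters from the patent.
```python
import numpy as np

def checkerboard(size=64, period=8):
    """On-area/off-area mask: 1 where the color is emphasized, 0 where it is not."""
    y, x = np.indices((size, size))
    return (((x // (period // 2)) + (y // (period // 2))) % 2).astype(float)

entity_mask = checkerboard()           # spatial emphasis imposed in the source material
complement_mask = 1.0 - entity_mask    # complementary values written by the lasers

# Source emphasis and projected complement average out for the eye, while the fine
# periodicity remains available to beat against the taking camera's pixel grid.
print(entity_mask[:4, :4])
print(complement_mask[:4, :4])
```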
F. Additional Modulation Techniques
The color modulation and modulation/demodulation techniques described above can be combined with other techniques to further defeat possible piracy countermeasures. The pattern also can be made to move on the screen at a rate that is not perceivable to a viewer but that will cause a beat frequency with the taking rate of the recording camera (a recording device dependent interference). The spatial frequency of the screen also can be changed to alter the beating effects with the spatially modulated entities and the taking camera.
As would be understood, the principles of the invention disclosed are related to introducing alterations or distortions in film and video content as a method of rendering illegally obtained copies of the materials unpleasant to view. The principles of the invention may also be applied to other forms of content on other media, such as DVD and DVDX. The examples given herein are presented to enable those skilled in the art to more clearly understand and practice the invention. The examples should not be considered as limitations upon the scope of the invention, but as merely illustrative. Numerous modifications and alternative embodiments of the invention will be apparent to those skilled in the art in view of the foregoing description.

Claims

WHAT IS CLAIMED IS:
1. A method for distorting a recording of projected images, comprising the steps of:
imposing modulated entities on video content of video source material, the modulated entities including artifacts incompatible with the video content;
demodulating the modulated entities; and
projecting the video content to provide the projected images.
2. The method of claim 1 wherein the step of imposing modulated entities includes the steps of: separating the video content into selected colors; and varying at least one of a plurality of parameters of at least one of the selected colors.
3. The method of claim 1 wherein the projecting step includes the further step of imposing a recording device dependent interference on the projected video content.
4. The method of claim 1 further comprising the step of encoding modulation information corresponding to the modulated entities, wherein the projecting step further includes the step of decoding the modulation information.
5. The method of claim 4 wherein imposing the modulated entities further includes the step of modulating the video in a selected space.
6. The method of claim 2 wherein the at least one parameter comprises intensity, the varying step including the step of determining the intensity as a function of position on the video content.
7. The method of claim 2 wherein the varying step includes the step of determining a value of the at least one parameter as a function of position on the video content, the function describing a modulation envelope, the modulation envelope decreasing a magnitude of the at least one parameter to correct an alignment error.
8. The method of claim 1 wherein the video source material comprises film.
9. Video source material for a projection system, comprising: modulated entities for providing artifacts incompatible with a video content of the video source material; and selectively deliverable modulation information, wherein the projection system demodulates the modulated entities according to the modulation information and introduces a recording device dependent interference.
10. A system for distorting a recording of projected images, comprising:
video source material having modulated entities for providing artifacts incompatible with a content of the video source material and selectively deliverable modulation information; and
a projector system responsive to the video source material to provide the projected images, the projector system including:
a modulator responsive to the video source material, the modulator imposing a recording device dependent interference on the projected images; and
a demodulator responsive to the video source material for demodulating the modulated entities according to the selectively deliverable modulation information.
PCT/US2001/002721 2000-01-28 2001-01-26 Cinema anti-piracy measures WO2001056279A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001234600A AU2001234600A1 (en) 2000-01-28 2001-01-26 Cinema anti-piracy measures

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US17861800P 2000-01-28 2000-01-28
US60/178,618 2000-01-28
US18889700P 2000-03-13 2000-03-13
US60/188,897 2000-03-13
US19561200P 2000-04-06 2000-04-06
US60/195,612 2000-04-06
US19906500P 2000-04-20 2000-04-20
US60/199,065 2000-04-20
US09/592,472 2000-06-09
US09/592,472 US7324646B1 (en) 1999-10-29 2000-06-09 Method and apparatus for film anti-piracy
US09/679,320 US7634089B1 (en) 1999-10-29 2000-10-04 Cinema anti-piracy measures
US09/679,320 2000-10-04

Publications (2)

Publication Number Publication Date
WO2001056279A2 (en)
WO2001056279A3 WO2001056279A3 (en) 2001-12-13

Family

ID=27558682

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/002721 WO2001056279A2 (en) 2000-01-28 2001-01-26 Cinema anti-piracy measures

Country Status (2)

Country Link
AU (1) AU2001234600A1 (en)
WO (1) WO2001056279A2 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0519320A2 (en) * 1991-06-18 1992-12-23 Matsushita Electric Industrial Co., Ltd. Video theater system and copy preventive method
EP0851678A1 (en) * 1995-08-04 1998-07-01 HE HOLDINGS, INC. dba HUGHES ELECTRONICS System and method for antipiracy using frame rate dithering
WO2000074366A2 (en) * 1999-05-27 2000-12-07 Digital Electronic Cinema, Inc. Systems and methods for preventing camcorder piracy of motion picture images

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1237369A3 (en) * 2001-02-28 2004-08-25 Eastman Kodak Company Copy protection for digital motion picture image data
US7043019B2 (en) 2001-02-28 2006-05-09 Eastman Kodak Company Copy protection for digital motion picture image data
EP1237369A2 (en) * 2001-02-28 2002-09-04 Eastman Kodak Company Copy protection for digital motion picture image data
EP1416318A4 (en) * 2001-08-10 2007-12-05 Sony Corp Imaging disturbing method and system
EP1416318A1 (en) * 2001-08-10 2004-05-06 Sony Corporation Imaging disturbing method and system
EP1301034A3 (en) * 2001-10-02 2004-11-10 Sony Corporation Optical state modulation
EP1345428A2 (en) * 2002-03-11 2003-09-17 Sony Corporation Optical intensity modulation method and system, and optical state modulation apparatus
EP1345428A3 (en) * 2002-03-11 2005-03-16 Sony Corporation Optical intensity modulation method and system, and optical state modulation apparatus
US7030956B2 (en) 2002-03-11 2006-04-18 Sony Corporation Optical intensity modulation method and system, and optical state modulation apparatus
EP1372342A1 (en) * 2002-06-11 2003-12-17 Sony Corporation Optical state modulation apparatus, display system and optical state modulation method
US7050076B2 (en) 2002-06-11 2006-05-23 Sony Corporation Optical state modulation apparatus, display system and optical state modulation method
US7302162B2 (en) 2002-08-14 2007-11-27 Qdesign Corporation Modulation of a video signal with an impairment signal to increase the video signal masked threshold
US7018045B2 (en) 2002-10-23 2006-03-28 Sony Corporation Image display apparatus and image display method
EP1414250A2 (en) * 2002-10-23 2004-04-28 Sony Corporation Image display apparatus and image display method
EP1414250A3 (en) * 2002-10-23 2005-01-26 Sony Corporation Image display apparatus and image display method
US7386125B2 (en) * 2002-10-28 2008-06-10 Qdesign Usa, Inc. Techniques of imperceptibly altering the spectrum of a displayed image in a manner that discourages copying
US7747015B2 (en) 2002-10-28 2010-06-29 Qdesign Usa, Inc. Techniques of imperceptibly altering the spectrum of a displayed image in a manner that discourages copying
DE102004023800A1 (en) * 2004-05-05 2005-12-01 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Marker system, to give detection or interference of illegal filming at public events, uses a light spectrum invisible to the human eye which triggers an electronic response
EP1926314A1 (en) * 2006-11-23 2008-05-28 Thomson Holding Germany GmbH & Co. OHG Method and apparatus for displaying video signal data with protection against digital copying
WO2008061889A1 (en) * 2006-11-23 2008-05-29 Thomson Licensing Method and apparatus for displaying video signal data with protection against digital copying
US8116610B2 (en) 2006-11-23 2012-02-14 Thomson Licensing Method and apparatus for displaying video signal data with protection against digital copying
WO2008073077A1 (en) * 2006-12-11 2008-06-19 Thomson Licensing Text-based anti-piracy system and method for digital cinema
EP1936975A1 (en) * 2006-12-20 2008-06-25 Thomson Licensing Method and device for processing source pictures to generate aliasing
WO2008074754A1 (en) * 2006-12-20 2008-06-26 Thomson Licensing Method and device for processing source pictures to generate aliasing

Also Published As

Publication number Publication date
WO2001056279A3 (en) 2001-12-13
AU2001234600A1 (en) 2001-08-07

Similar Documents

Publication Publication Date Title
US7324646B1 (en) Method and apparatus for film anti-piracy
US7865034B2 (en) Image display methods and systems with sub-frame intensity compensation
US7043019B2 (en) Copy protection for digital motion picture image data
TW440819B (en) Copy protection schemes for copy protected digital material
US7634134B1 (en) Anti-piracy image display methods and systems
EP1557031B1 (en) Techniques of imperceptibly altering the spectrum of a displayed image in a manner that discourages copying
KR100491774B1 (en) Video copy protection devices and methods and recording media including them
US7899242B2 (en) Methods of processing and displaying images and display device using the methods
JP3349704B2 (en) Method and apparatus for scrambling a video signal with sufficient network transmission and recording capabilities
JP2004295098A (en) Projector with enhanced security for interrupting illegal action due to camcorder
WO2001056279A2 (en) Cinema anti-piracy measures
JP2004234007A (en) Projector having means for obstructing illicit duplication by camcorder
US20100027968A1 (en) Method and device for processing source pictures to generate aliasing
US6137952A (en) Apparatus and method for degrading the quality of unauthorized copies of color images and video sequences
JP2007219512A (en) Method and device for processing sequence of video image
US7634089B1 (en) Cinema anti-piracy measures
JP2003302960A (en) Device and method for image display and computer program
US6041160A (en) Method and apparatus for picture encoding and decoding
JP2001211433A (en) Copy protection system
WO2001054345A1 (en) Security systems for motion picture, video, program, and datafile transmission, viewing, access control and usage control
JPH1127695A (en) Copy preventing method and video signal output device
JP2007510319A (en) System and method for copy suppression due to repetitive quantization loss
JPH04368083A (en) Video signal processing unit
FR2728421A1 (en) Image signal jamming device to prevent recording of television programme
CZ407599A3 (en) Apparatus, method and record carrier as well as signal with protection against reproduction

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: PCT application non-entry in European phase
NENP Non-entry into the national phase in:

Ref country code: JP