CA1290052C - Program identification method and apparatus - Google Patents

Program identification method and apparatus

Info

Publication number
CA1290052C
CA1290052C
Authority
CA
Canada
Prior art keywords
signal
predetermined
video
comparing
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
CA000615693A
Other languages
French (fr)
Inventor
David A. Kiewit
Daozheng Lu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TNC US Holdings Inc
Original Assignee
AC Nielsen Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US06/604,367 external-priority patent/US4697209A/en
Application filed by AC Nielsen Co filed Critical AC Nielsen Co
Application granted granted Critical
Publication of CA1290052C publication Critical patent/CA1290052C/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Abstract

PROGRAM IDENTIFICATION METHOD AND APPARATUS
Abstract of the Disclosure

A method and apparatus for identifying programs such as television programs received from various sources detects the occurrence of predetermined events such as scene changes in a video signal and extracts a signature from the video signal.
The signatures and the times of occurrence of the signatures are stored and subsequently compared with reference signatures to identify the program. The signatures may be compared in pairs to increase resolution, and the time intervals between events or signatures may also be used to identify the program, either by themselves or in conjunction with the signatures.

Description

Field of the Invention

This invention relates generally to methods and apparatus for identifying programs and the viewing habits of the public, and more particularly to a method and apparatus for identifying programs, such as television programs obtained from various sources, including video recorders.
Background of the Invention
Systems for identifying programs such as television programs that are broadcast or viewed are known. Such systems fall into various categories. These categories include manual systems wherein the viewer keeps a diary indicating all of the programs viewed; mechanical, electromechanical and electronic systems that sense the channel to which a television receiver is tuned; systems that detect identifying signals present in the television broadcast signal; and systems that monitor and analyze the program content of the broadcast signal utilizing image processing and correlation techniques to identify the program.
While all of these systems do provide a way to identify programs and to monitor the viewing habits of the public, the manual methods are slow in acquiring data and are prone to inaccuracies resulting from the entry of erroneous data that may be intentionally or unintentionally entered. Systems that monitor the channel to which a receiver is tuned overcome the drawbacks of the manual systems, but require access to the tuning mechanism of the receiver. Such access is becoming increasingly difficult with the advent of cable and pay television systems that utilize various converters and decoders, many of which are difficult to access. Moreover, such systems cannot identify programs played from a source such as a home video recorder. Systems that detect identification signals encoded on the broadcast signal overcome many of the accuracy and access disadvantages described above, but require the cooperation of the broadcasters, who must encode the programs prior to broadcast for such systems to be effective. Consequently, such systems are useable only on a limited number of broadcasts, such as network broadcasts. Systems that monitor and analyze the content of the program itself theoretically eliminate the disadvantages of the other systems, but heretofore the information processing requirements of such systems rendered them impractical or uneconomical because of the computing capacity required to analyze the vast quantity of information present in a broadcast signal. Moreover, previous attempts to reduce the quantity of information processed tended to introduce inaccuracies.
An example of a system that utilizes the program content of a broadcast signal to identify the program is described in United States Patent No. 3,919,479 to Moon et al. The Moon et al. system utilizes a non-linear analog transform to produce a low frequency envelope waveform, and the information in the low frequency envelope of a predetermined time interval is digitized to generate a signature. The signatures thus generated are compared with reference signatures to identify the program. However, the Moon et al. system generates the signatures continuously, and consequently, requires a large computer to process the data thus generated.
Another system that monitors the program content of a broadcast signal is disclosed in United States Patent No. 4,230,990 to Lert Jr. et al. The system disclosed in the Lert Jr. et al. patent reduces the amount of data that must be processed, as compared to the amount of data processed by the Moon et al. patent, by utilizing cues that are either externally generated or present in the program material to initiate the signature generation and correlation process only after a cue has been detected. However, the system disclosed in the Lert Jr. et al. patent is designed to monitor the programs broadcast by broadcasting stations, such as network affiliated stations, in order to determine whether they are broadcasting the programs required by the networks to be broadcast, rather than as a system for determining the viewing habits of the public. Thus, many of the problems that occur when the habits of a television viewer are being monitored, such as, for example, frequent channel changing, and the problems that occur in identifying programs that are recorded and subsequently played back by a home video recorder, are not addressed.

It is an object of the present invention to provide a method and apparatus for identifying programs that overcome many of the disadvantages of the prior art systems.
It is another object of the present invention to provide an improved method and apparatus for determining the viewing habits of the public that utilizes the content of the program being viewed.
It is another object of the present invention to provide an improved method and apparatus for monitoring broadcasts and determining whether broadcasting stations are broadcasting the programs and commercial messages that they are required to broadcast.
It is yet another object of the present invention to provide a method and apparatus for identifying previously recorded programs being viewed by a viewer.
It is still another object of the present invention to provide a system that accurately identifies programs with a minimal amount of computation utilizing data reduction and correlation techniques on the program material.
It is yet another object of the present invention to provide a program identification system that extracts program identifying signatures only upon the occurrence of certain predetermined events.
It is another object of the present invention to provide a program identifying system that extracts program identifying signatures from the program material and compares the extracted signatures with reference signatures in pairs to provide more accurate identification.
It is yet another object of the present invention to provide a program identifying system that monitors the program material for the occurrence of predetermined events, and monitors the time interval between such events to identify the program, utilizing the events either by themselves or in conjunction with the extracted signatures.
It is yet another object of the present invention to provide a system that alters the identification criteria depending on whether the program being viewed is a broadcast program or a program previously recorded by the viewer.
Therefore, in accordance with a preferred embodiment of the invention, there is provided a system having a home unit that monitors the viewing conditions in the home, for example, whether the television set is on or off, whether any home video recorder (e.g., a video cassette recorder or VCR) is on or off, and the video signal being received. By monitoring the aforementioned signals and functions, the system determines whether no viewing is occurring, whether a television broadcast is being viewed, whether a television broadcast is being recorded, whether viewing and recording is occurring simultaneously, or whether a previously recorded program is being viewed. Once the viewing mode has been established, the program being viewed is identified by extracting a characteristic signature from the video signal and storing it for subsequent comparison with reference signatures stored in a central office.
Once the mode of viewing has been determined, a signature of the program material being viewed or recorded is extracted, provided certain predetermined events occur first. These events shall be referred to as Event 1 and Event 2. Several events can serve as Event 1. These events may be a turn-on of the television set, a channel change, a predetermined time interval, for example 5 to 10 minutes, since the occurrence of the previous Event 1, or a sudden change to a black scene. Events that can serve as an Event 2 include a scene change to a scene other than a black scene, a color change, which may be an overall color change or a color change of a predetermined line, a still picture as evidenced by several consecutive substantially identical frames, an audio level change, and others.
Upon the occurrence of an Event 1, the home unit monitors the broadcast for the occurrence of an Event 2, and extracts a pair of signatures from the video signal immediately following the next two Event 2s following an Event 1. Or, if a previously recorded program is being viewed, the system extracts three or more signatures from the video signal immediately following the next three or more Event 2s following an Event 1. The signatures are extracted by detecting and digitizing the envelope of the video signal immediately following the Event 2s. The digitized signatures, as well as the times that the signatures were extracted, are stored for future comparison with reference signals stored in a central location. The time intervals between Event 2s are also stored and used to identify the program either by themselves or in conjunction with extracted signatures. The time intervals between Event 2s are particularly useful for identifying the playback of previously recorded video tapes for reasons that will be discussed in a subsequent portion of the specification.
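A minimal Python sketch of the Event 1 / Event 2 gating just described, assuming event detection and signature extraction are provided elsewhere (see the later sketches); the class name, the callable arguments, and the storage layout are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the home-unit gating: after an Event 1, signatures are taken at the
# next two Event 2s (or at three or more Event 2s when playback is being matched).
class HomeUnitGate:
    def __init__(self, signatures_needed: int = 2):
        self.signatures_needed = signatures_needed   # 2 for broadcast viewing, 3+ for playback
        self.count = signatures_needed               # start idle until an Event 1 arms the unit

    def on_event1(self) -> None:
        self.count = 0                               # arm the unit for the next Event 2s

    def on_event2(self, extract_signature, time_now: float, store: list) -> None:
        """Called whenever an Event 2 is detected; appends (time, signature)
        pairs until the required number has been collected."""
        if self.count < self.signatures_needed:
            store.append((time_now, extract_signature()))
            self.count += 1
```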
The unit used to acquire the reference signatures that are compared with the signatures received from the home units is similar to the home unit except that the reference signature acquisition unit monitors each broadcast continuously, and does not require an Event 1 to initiate a signature extraction.
Therefore, each broadcast is continuously monitored and a signature is extracted each time an Event 2 is detected. The signatures, as well as the times of occurrence of the signatures, are stored to form a library of reference signatures. The stored signatures are compared with signatures obtained from the home unit that occurred at approximately the same time. The time intervals between Event 2s are also stored for comparison with corresponding time intervals received from the home units.
The invention may be summarized, according to a first broad aspect, as a method for identifying signals for determining audience ratings comprising the steps of operating on the signal to be identified in order to extract a feature string, comparing the feature string with various feature strings corresponding to known signals and identifying the signal when the feature string of the signal to be identified correlates with one of the feature strings corresponding to a known signal within a predetermined tolerance, wherein the step of extracting the feature string from the signal to be identified comprises the steps of: monitoring predetermined changes in a predetermined parameter of the signal to be identified; determining the time intervals between the predetermined changes in the predetermined parameter; and utilizing the time intervals between successive predetermined changes to generate the feature string, the feature string being a digital representation of a predefined number of said time intervals.
According to a second broad aspect, the invention provides, in a system for identifying signals for determining audience ratings having means for operating on the signal to be identified in order to extract a feature string, means for comparing the feature string with various feature strings corresponding to known signals and for identifying the signal when the feature string of the signal to be identified correlates with one of the feature strings corresponding to a known signal within a predetermined tolerance, the improvement wherein the feature string extracting means comprises: means for monitoring predetermined changes in a predetermined parameter of the signal to be identified; means for determining the time intervals between the predetermined changes in the predetermined parameter; and means for utilizing the time intervals between successive predetermined changes to generate the feature string, the feature string being a digital representation of a predefined number of said time intervals.

These and other objects and advantages of the present invention will become readily apparent upon consideration of the following detailed description and attached drawing wherein:
FIG. 1 is a block diagram of the home unit of the program identification system according to the present invention;
FIG. 2 is a block diagram of the central office and reference signature extraction portions of the system according to the invention;
FIG. 3 is a chart illustrating the various modes of viewing and recording that can occur in a typical household;
FIGS. 4 and 5 are flow charts illustrating the logical steps performed by the home and central office units, respectively;
FIG. 6 is a flow chart illustrating the logical steps performed by the central office unit in identifying signatures; and
FIG. 7 illustrates how signatures and scene changes may be extracted from the video envelope.
Referring now to the drawing, with particular attention to FIG. 1, there is illustrated a block diagram of the home unit of the system according to the invention, generally designated by the reference numeral 10. The system 10 receives signals to be identified from a television receiver 12, and in many instances from a home video recorder, such as a video cassette recorder or VCR 14. In a typical home installation, the receiver 12 and the VCR 14 receive signals from a receiving antenna 16, and in many instances from a cable television system 18. These signals may be applied directly to the receiver 12 and to the VCR 14, but it is convenient to employ a switching network 20 that selectively switches the receiver 12 and VCR 14 to the antenna 16 and to the cable 18 in order to permit the viewing and recording of broadcasts received from the antenna 16 or from the cable 18. In addition, the switching network 20 may connect the VCR 14 to the receiver 12 to permit prerecorded or previously recorded tapes to be played back by the VCR and viewed by the receiver 12.
Video signals representative of the program applied to the receiver 12, as well as video signals representative of the signals being recorded by the VCR 14, are applied to a video processing circuit 22, for example, by a video switching circuit 24 which permits either the receiver 12 or the VCR 14, or both, to be monitored. The video processing circuit 22 includes a detector and a low pass filter (discussed subsequently in greater detail), and provides a signal representative of the envelope of the video signal to an analog-to-digital converter 25 which generates a digital representation of the video envelope. The digitized video envelope signal is applied to an events detector 26 which detects predetermined events that occur in the digitized video signal, and causes the signature of the digitized video signal to be extracted by signature extraction circuitry 28 upon the occurrence of a predetermined sequence of events. The signature thus generated is stored in a data storage system 30 along with the time that the signature was extracted. The extraction times are provided by a clock 32 which provides a digital representation of time to the data storage system 30 via, for example, a combining circuit 34. The clock 32 may be a real time clock or a relative time clock that is periodically reset from a central location. The function of the events detector 26, the signature extractor circuitry 28, the data storage 30 and the clock 32 can be performed by individual circuits as shown or by a microprocessor based system.
The home unit 10 is interrogated at periodic intervals, for example, once a day, by a central office unit, generally designated by the reference numeral 40 (FIG. 2), which compares the signatures from the various home units with reference signatures in order to identify the signatures from the home unit. The home units may be periodically interrogated by a data communications circuit 42 that accesses the various home units via a suitable communications system, for example, a plurality of telephone lines 44. The signatures thus collected are compared by a central computer system 48 with signatures stored in a data base 46. Central computer system 48 controls the collection and classification of the signatures received from the home units as well as the generation of reference signatures to be stored in the data base 46. The latter function is performed in conjunction with a plurality of reference signal extraction circuits 50 that are located in the cities being monitored. The reference signature extraction circuits monitor the various networks, cables and other signal sources in those cities and extract reference signals, for example, whenever an Event 2 occurs in any of the signals being monitored. The reference signals thus extracted are transmitted to the central office unit 40 and stored in the data base 46 along with the times that such signatures are extracted, as provided by a clock 52. The clock 52 of FIG. 2 is similar to the clock 32 of the home unit and serves to indicate the time of occurrence of the reference signatures. The clock 52 may also be a real time clock or a relative time clock that is periodically set by the central office 40. A central reference signature extraction circuit 54 and associated clock 56 serve to extract signatures and times of extraction of cable originated programs and signatures of prerecorded programs. Alternatively, extraction circuits may be placed at the head ends of cable systems to extract reference signatures of cable programs. The signature extraction performed by the various extraction circuits and data storage is controlled by the central computer system 48.
The home unit of the system according to the invention monitors the mode of viewing by monitoring the on and off and other functions of the television receiver and any video recorder that may be used in the home. The various modes of viewing or recording are illustrated in FIG. 3. Mode 1 occurs when the VCR and television receiver are both off, and indicates that no viewing is taking place. Mode 2 represents television viewing and occurs when the television receiver is on and the VCR is off. Mode 3 represents recording by the home VCR and occurs when the television receiver is off and the VCR is on.
Mode 4 occurs when the VCR and the television receiver are both on, and can represent one of three conditions. The first condition, designated as Mode 4.0, represents television viewing through the tuner of the VCR without recording. Mode 4.1 represents VCR recording and television viewing, while the third condition, designated as Mode 4.2, represents VCR playback. Other modes of operation of the VCR, such as fast forward and rewind, are designated as Modes 4.3 and 4.4, respectively. Most of these modes can be easily detected by monitoring power line voltage or voltages elsewhere in the television set and the video recorder. Modes 4.0, 4.3 and 4.4 can be detected by logic circuitry.

As previously stated, the home unit 10 monitors the viewing mode and the video signal, and extracts signatures from the video signal only if certain events occur. The logic necessary to provide this function is illustrated in FIG. 4. As is apparent from FIG. 4, the mode of operation is recorded whenever there is a change in the mode, such as, for example, a change from viewing to recording, etc.
Each time the mode changes, a determination is made in order to determine what the new mode is. This is accomplished by determining whether the television is on or off, whether the VCR is on or off, and whether the VCR is in a record or playback mode. Once it has been determined whether the television and VCR are on or off, and whether the VCR is in a record or playback mode, the mode can readily be determined by using a look-up table containing the information in FIG. 3, and the mode is recorded.
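A minimal sketch of such a FIG. 3 look-up, assuming the on/off states and the VCR function have already been sensed as described above; the function name, the dictionary representation, and the VCR state labels are illustrative assumptions rather than the patent's circuitry.

```python
# Sketch of the FIG. 3 mode look-up: map the sensed TV/VCR states to a mode number.
def viewing_mode(tv_on: bool, vcr_on: bool, vcr_state: str = "idle") -> float:
    """Return the viewing/recording mode of FIG. 3.

    vcr_state is one of "idle", "record", "playback", "fast_forward", "rewind"
    (hypothetical labels for the sensed VCR function).
    """
    if not tv_on and not vcr_on:
        return 1        # Mode 1: no viewing
    if tv_on and not vcr_on:
        return 2        # Mode 2: television viewing
    if not tv_on and vcr_on:
        return 3        # Mode 3: VCR recording only
    # Both on: Mode 4 and its sub-modes
    return {
        "idle": 4.0,          # viewing through the VCR tuner, no recording
        "record": 4.1,        # recording while viewing
        "playback": 4.2,      # playback of a previously recorded tape
        "fast_forward": 4.3,
        "rewind": 4.4,
    }[vcr_state]
```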
If the mode is determined to be Mode 1, indicating no viewing, no further action is taken until the mode changes again. If it is determined that the mode of operation is either Mode 2 or Mode 4, the system monitors the television receiver for the occurrence of an Event 1, which may, for example, be any one of the following:
1. television turn-on
2. a channel change
3. a predetermined time interval (e.g., 5-10 minutes) since the previous Event 1
4. a sudden scene change to a black scene.
5. loss of synchronization for an appreciable period of time.
Such Event 1s are relatively easy to detect. For example, television turn-on can be detected simply by monitoring an appropriate voltage or current. Television turn-on is not used as an Event 1 in the system illustrated in FIG. 4, but may be used in other systems, particularly systems that do not monitor a VCR and consequently need not employ mode logic. A channel change can be detected by monitoring the position of the tuner mechanism when mechanical tuners are used, by monitoring the tuning voltage in electrically tuned tuners, by monitoring the local oscillator frequency, or even more simply by monitoring the video synchronization pulses, preferably the vertical synchronization pulses, and indicating a channel change in the event of a loss or a change in the synchronization pulses for a short time interval. The predetermined time interval can readily be determined from a clock, and a sudden scene change to black can be readily detected by monitoring the average value of the video signal and indicating a change to a black scene when the average value becomes that representative of an all black scene. A loss of synchronization for an appreciable length of time (longer than that caused by a channel change) indicates the fast forward or rewind mode of operation of a VCR (Mode 4.3 or 4.4) or the loss of the video signal (e.g., station off the air).
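A minimal sketch of the black-scene test just described, assuming the video envelope has already been digitized into per-frame samples by the analog-to-digital converter 25; the normalization and threshold value are illustrative assumptions.

```python
# Sketch of Event 1 detection for a sudden scene change to black: the average
# envelope value of a frame falls to a level representative of an all-black scene.
import statistics

BLACK_LEVEL_THRESHOLD = 0.05   # hypothetical normalized level for an all-black picture

def is_black_scene(frame_samples: list[float]) -> bool:
    """Indicate a change to a black scene when the frame's average envelope
    value drops below the black-level threshold."""
    return statistics.fmean(frame_samples) < BLACK_LEVEL_THRESHOLD
```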
As long as no Event 1 occurs, the home unit records no new data; however, upon the occurrence of an Event 1, the system is conditioned to extract a signature upon the occurrence of the next two consecutive Event 2s, with an Event 2 being defined as:
1. a scene change to a scene other than a black scene
2. a color change in successive frames or portions thereof
3. a still picture
4. an audio level change.


Ways of detecting the various Event 2s described above, as well as ways of extracting signatures, will be discussed in a subsequent portion of the specification.
Upon the occurrence of an Event 1, a counter is set to zero and the video signal is monitored to determine if the television synchronization or sync signal is stable. This is accomplished, for example, by monitoring the vertical synchronization pulses and waiting until a predetermined number, for example, 16, consecutive stable vertical sync pulses are detected before the unit is permitted to detect an Event 2. If the synchronization has not been stable for a significant period of time, more than five seconds, this period will be marked either as the television station being off the air or as VCR fast forward or rewind.
Once the sync has been stable for the required number of pulses, the system monitors the video signal for the occurrence of an Event 2. Whenever such an Event 2 change is detected, the time of the occurrence of the Event 2 is recorded, as is the time interval since the previous Event 2, if the time intervals between Event 2s are to be used as a means of program identification. The signature of the first frame following the last detected Event 2 change is then extracted. Since a standard broadcast television frame consists of two interlaced fields, and since the two fields that form a frame are quite similar, it is not necessary to analyze both fields of the frame to obtain a signature. Typically, a satisfactory signature is obtained from only a single field, for example, the first field of a frame. The method of extracting the signature will be discussed in greater detail in a subsequent portion of the detailed description, as will the method of detecting an Event 2.
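A minimal sketch of the sync-stability gate just described, assuming vertical sync pulse timestamps (in seconds) are available from the video front end; the nominal field period and jitter tolerance are illustrative assumptions.

```python
# Sketch of the "16 consecutive stable vertical sync pulses" gate.
NOMINAL_FIELD_PERIOD = 1.0 / 59.94   # NTSC vertical sync interval, assumed here
TOLERANCE = 0.001                    # hypothetical allowed jitter, in seconds
REQUIRED_STABLE_PULSES = 16          # consecutive stable pulses required by the text

def sync_is_stable(pulse_times: list[float]) -> bool:
    """True once 16 consecutive vertical sync intervals fall within tolerance."""
    stable = 0
    for t0, t1 in zip(pulse_times, pulse_times[1:]):
        if abs((t1 - t0) - NOMINAL_FIELD_PERIOD) <= TOLERANCE:
            stable += 1
            if stable >= REQUIRED_STABLE_PULSES:
                return True
        else:
            stable = 0
    return False
```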


Once the signature has been extracted, the counter is incremented by 1. If the count in the counter is less than two, indicating that more signatures are needed, the system is conditioned to respond to another Event 2 to extract and record another signature. If the count in the counter is two (or the required number of signatures) or greater, indicating that the signatures required have been extracted, the system is conditioned to require an Event 1 to occur before it responds to any further Event 2s.
If it is determined that the mode of operation is Mode 3, that is, VCR recording, it is not necessary for an Event 1 to occur before the signal is monitored for Event 2s. The only determination that is made is whether the synchronization is stable, and if so, the system looks for Event 2s continuously and extracts the signature and records the time of occurrence of the signature each time an Event 2 is detected. Thus, all signatures following every Event 2 are extracted and stored so that upon subsequent playback, the program may be identified even though only portions of the tape may be played back.
The logic for extracting reference signatures is illustrated in FIG. 5. Each broadcast and cable channel is monitored continuously by the reference signature extraction circuits 50 and 54, and a signature is extracted each time an Event 2 occurs to provide a reference library of reference signals. Consequently, since it is desired to extract a signature that occurs after every Event 2, Event 1s are not monitored. All that is required is that the sync be stable and that an Event 2 occur. Thus, whenever an Event 2 occurs, the time of occurrence of the Event 2 as well as the signature of the scene following the scene change is recorded for future comparison with signatures received from the various home units. The time intervals between Event 2s may be recorded or obtained from the times of occurrence of the Event 2s.
The logic employed by the central computer system 48 of the central unit 40 in order to identify a program is illustrated in FIG. 6. As is illustrated in FIG. 6, two consecutive signatures from a home unit are selected. If there is not enough data to define two consecutive signatures, the next home unit is monitored. After two consecutive signatures are selected, a check is made to determine whether the signatures really are a pair, for example, that there is no Event 1 between the two signatures.
If the two signatures are a pair, the time associated with the first home unit signature is read. All reference unit signatures in the data base 46 that occurred within a predetermined time interval of the occurrence of the first home unit signature, for example, plus or minus eight seconds, are selected for comparison, and the correlation coefficients between the first home unit signature and the selected reference unit signatures are computed. If one of the correlation coefficients exceeds a predetermined threshold, the time associated with the second home unit signature is read. The next several, for example, six, reference unit signatures that occur, for example, within a predetermined time interval, for example, plus or minus eight seconds, of the time associated with the second home unit signature are selected, and the correlation coefficients between the second home unit signature and the reference unit signatures are computed. If one of the correlation coefficients exceeds the predetermined threshold, an identification is made and stored, and printed out, if necessary.
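A minimal sketch of this pair-matching flow; the data layout (lists of time/signature tuples tagged with a program identifier), the use of the Pearson correlation coefficient from the standard library (Python 3.10+), and the threshold value are illustrative assumptions, while the plus or minus eight second window comes from the text above.

```python
# Sketch of the FIG. 6 logic: match both signatures of a home-unit pair against
# reference signatures that occurred at approximately the same time.
import statistics

WINDOW = 8.0        # seconds around the home-unit time stamp (from the text)
THRESHOLD = 0.9     # hypothetical minimum correlation coefficient

def correlation(a: list[float], b: list[float]) -> float:
    return statistics.correlation(a, b)   # Pearson correlation coefficient

def identify_pair(home_pair, reference):
    """home_pair: two (time, signature) tuples from the home unit.
    reference: list of (time, signature, program_id) tuples from data base 46.
    Returns a program_id if both signatures of the pair match, else None."""
    (t1, s1), (t2, s2) = home_pair
    for rt1, rs1, prog in reference:
        if abs(rt1 - t1) <= WINDOW and correlation(s1, rs1) > THRESHOLD:
            # First signature matched; require the second one to match as well.
            for rt2, rs2, prog2 in reference:
                if (prog2 == prog and abs(rt2 - t2) <= WINDOW
                        and correlation(s2, rs2) > THRESHOLD):
                    return prog
    return None
```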
As discussed above, the system according to the invention utilizes a time reference to access reference signatures that occurred at approximately the same time as the home unit signature being evaluated in order to eliminate the need for comparing the home unit signature with all of the reference signatures in the data base 46. However, when a previously recorded program is being played back, the time that the signature occurs during playback cannot be used to locate the reference signal. Thus, in accordance with another important aspect of the present invention, the times of occurrence of signatures that occur when programs are being recorded are also stored. The times of occurrence of signatures that occur in Mode 3 (recording) are kept in a separate recording time file in the data base 46. Similarly, in Mode 4.1 (viewing and recording) the times of signatures are also kept in the recording time file.
Thus, when the broadcast is played back (Mode 4.2), even though the recorded signatures cannot be matched to signatures stored in the reference unit during playback, the signature can still be identified by utilizing the recording time instead of the playback time to retrieve the corresponding reference signatures from the reference signature data base 46. Thus, by keeping signatures relating to programs that were broadcast several days or weeks ago within the data base 46, such previously recorded and subsequently played back broadcasts can be readily identified.
Alternatively, in order to identify the playback of previously recorded programs, instead of comparing the signature obtained during playback with reference signatures that represent the programs broadcast several days or weeks ago stored in the recording time file of the reference signature data base 46 (FIG. 2) of the central office unit, the played back signatures may be compared with signatures obtained by the home unit previously during VCR recording (Mode 3) and stored in the data storage 30 (FIG. 1) of the home unit. If desired, the signatures may also be sent to the central office, as are the signatures obtained during normal television viewing, but these signatures would be labeled as VCR recording, and may be stored in the recording time file. However, when playback occurs, the played back signatures would be compared with the signatures stored in the home unit data storage. Only if a match occurred would the signatures be sent to the central office and compared with the signatures stored in the central office data base 46 to identify the program. If no match occurred, the signatures would not be sent. This is because whatever is being played back cannot be identified, and may be something like, for example, a home movie. Thus, the system eliminates the need to search the entire recording file of the data base 46 in an attempt to identify something that cannot be identified.
In the playback mode, when attempting to match the signatures obtained during playback with the signatures stored in the data storage 30, it is convenient to use the time intervals between the occurrences of the signatures (or Event 2s) to generate a time interval signature, TIS (discussed in greater detail in a subsequent portion of the specification), to obtain the match. When using a time interval signature for matching, three or more consecutive signatures are extracted, and the time intervals between the signatures are compared with the time intervals between signatures stored in the data storage 30. When a corresponding pair of consecutive time intervals is found, a match is indicated. If greater accuracy is desired, more than three consecutive signatures may be extracted to provide more than two consecutive time intervals.
Event 2s may be detected in a variety of ways. For example, the video signal may be low pass filtered, for example, through a low pass filter having a cut-off frequency on the order of approximately 2 to 6 kilohertz, and its average amplitude detected. The signal may be digitized by assigning a number representative of the average amplitude of the video signal for each frame. The numbers thus obtained may be compared in various ways. For example, the numbers obtained from different frames may be compared, and if the numbers are substantially different, a scene change will be indicated. The numbers compared may be obtained from consecutive frames or from non-consecutive frames, for example, frames spaced by a 20 to 40 frame interval, depending on the type of event that has been defined as the Event 2. Alternatively, a moving average of the numbers representative of the video amplitudes of the last several frames may be calculated, and the number representative of the average amplitude of the last detected frame compared with the moving average to indicate a scene change if the number associated with the last frame substantially differs from the moving average.
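A minimal sketch of the two frame-average comparisons just described, assuming the per-frame numbers produced by the low pass filter and the analog-to-digital converter are available; the thresholds and window handling are illustrative assumptions.

```python
# Sketch of Event 2 detection from per-frame average amplitudes.
from collections import deque
from statistics import fmean

INSTANT_DELTA = 0.2      # hypothetical change required between the compared frames
MOVING_AVG_DELTA = 0.15  # hypothetical change required versus the moving average

def frame_difference_scene_change(current: float, previous: float) -> bool:
    """Compare the number for the current frame with that of an earlier frame
    (the previous frame, or one 20 to 40 frames back, depending on the event)."""
    return abs(current - previous) > INSTANT_DELTA

def moving_average_scene_change(current: float, history: deque) -> bool:
    """Compare the current frame's number with the moving average of the last
    several frames, as used for gradual (fade-type) scene changes."""
    return bool(history) and abs(current - fmean(history)) > MOVING_AVG_DELTA
```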
The various comparing schemes are useful for identifying various different types of Event 2s. For example, comparing the number associated with a particular frame with the number associated with a previous frame is useful for detecting an Event 2 that may be termed an instant scene change. A moving average comparison or a comparison with frames that occurred several frames ago is useful for detecting an Event 2 that may be termed a fade type scene change wherein one scene is gradually faded in and the previous scene is gradually faded out. Because in a fade type scene change the change occurs gradually, there is no drastic change in the number associated with consecutive frames, and consequently, such a scene change cannot readily be detected by a system that compares immediately successive frames. Numbers that are substantially identical for several consecutive frames may be used to indicate a still picture.
Other methods of detecting an Event 2 include detecting a color change, for example, by detecting the average amplitude of a particular color component, such as, for example, a red component, of each frame or of a particular line or lines of each frame, and making comparisons of the types described above. Alternatively, different color components or an average of the three color signals can be monitored for each frame and comparisons made. In addition, rather than monitoring an entire frame, only portions of a frame may be monitored. These portions may be selected to be those portions of a frame that normally do not change unless a scene change occurs. Such portions may occur near the top or bottom of the screen because such areas are less likely to be affected by action or movement occurring within a scene. The Event 2s may be detected electronically by monitoring the video envelope, or by monitoring the cathode ray tube, for example, by placing a light sensitive photoelectric device 36 (FIG. 1) in proximity with the cathode ray tube, preferably in one corner thereof. Alternatively, the audio signal may be monitored by an audio processing circuit 38 and applied to the analog-to-digital converter 25 to generate an Event 2 when the audio signal departs from a predetermined level.
In accordance with an alternative embodiment of the present invention that is particularly suitable for identifying prerecorded programs, the time intervals between events, or between changes in one or more predetermined parameters, can be used to define a feature string. Such a system can be used either alone, or in combination with a signature extracting system, to identify the program. When used alone, the amount of data generated by the system is less than that generated by a signature generating system, and therefore, does not require much computation time to obtain a correlation. Consequently, it is not necessary to store the time of occurrence of each change (unless it is desired for other reasons), but rather, only the elapsed time between such changes need be stored. Such a system is particularly useful in identifying the playback of prerecorded programs, or delayed broadcast programs, because real time data is not necessary to make the identification.
In a system wherein the intervals between predetermined changes in predetermined events are used to generate a time interval signature (TIS), the time interval signature, which can be represented as a digital code sequence, can be constructed by measuring the time intervals between the sequential occurrences of regularly occurring events in a broadcast. Preferably, the chosen events should have the following properties:
1. Be readily measurable automatically.
2. Occur at a sufficiently high rate that enough intervals are available to allow unique identification of the signal.
3. Always be present.
4. Cannot be controlled by program originators to render identification vulnerable to changes in programming methods.
Typical events that can be used as the events to generate a time interval string include scene changes to black and Event 2s, including shifts in the average amplitude of the video signal, departures in the amplitude of the instantaneous video signal from the average, shifts in the video color balance, changes in the average color of a predetermined line, for example, a line near the top of the screen, a predetermined audio power level, for example, silence, etc. Thus, to define a time interval string, the home unit would generate a time interval string (TIS) by detecting one of the predetermined events, for example, an Event 2 such as a scene change, and storing the time intervals between consecutive changes.
The monitoring of events can be done continuously to generate a long TIS without requiring an excessive memory capacity because such changes occur relatively infrequently, and the time intervals therebetween can be defined by only a few bits. The TISs thus obtained from the home unit are compared with reference TISs stored in the central unit.
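A minimal sketch of how a TIS could be built from event times and matched against a continuously generated reference TIS; the quantization resolution, the number of intervals retained, and the substring-style matching are illustrative assumptions that follow from the "few bits per interval" and short feature string described in the text.

```python
# Sketch of time interval string (TIS) generation and matching.
def time_interval_string(event_times: list[float],
                         resolution: float = 0.5,
                         length: int = 3) -> list[int]:
    """Build a TIS from the times (in seconds) of the monitored events
    (e.g., Event 2 scene changes), quantizing each interval to a small integer
    so that it can be represented by only a few bits."""
    intervals = [t1 - t0 for t0, t1 in zip(event_times, event_times[1:])]
    return [round(dt / resolution) for dt in intervals[:length]]

def tis_matches(home_tis: list[int], reference_tis: list[int]) -> bool:
    """A home-unit TIS matches when the same run of quantized intervals appears
    in the continuously generated reference TIS."""
    n = len(home_tis)
    return any(reference_tis[i:i + n] == home_tis
               for i in range(len(reference_tis) - n + 1))
```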
However, in order further to reduce the amount of data that must be stored, the time intervals between the events being monitored may be monitored only after the occurrence of another predetermined event, such as an Event 1, and it has been found that satisfactory identification can be obtained by generating a feature string consisting of three time intervals between changes in the monitored variable, such as a scene change or other Event 2. The time interval strings thus generated are matched with reference time interval strings that are obtained in much the same way as those generated by the home units except that the reference time interval strings are continuously generated for each broadcast or other program source being monitored.
Referring now to Fig. 7, there is shown a video envelope 100 as provided from the video processing circuit 22 (Fig. 1) to the analog-to-digital converter 25. The analog-to-digital converter 25 takes a plurality of samples 102 (Fig. 7), for example, 80 samples during a single frame interval, and generates a digital value representative of the amplitude of each of the samples 102. The digital values thus generated are applied to the events detector 26 (Fig. 1) and to the signature extraction circuit 28 and used to detect the various events and to extract signatures.
In order to detect an Event 2, for example, a scene change, the events detector 26 compares each of the digitized samples 102 of a current frame with a corresponding digitized sample 102 of a previous frame, either the immediately preceding frame or another previous frame, depending on the type of scene change to be detected. The counterpart samples are then compared to determine whether a scene change has occurred. More specifically, the detection of a scene change occurs as follows:
1. Take, for example, 80 samples of the video envelope of each frame.
2. Compare counterpart samples of a current frame and a previous frame.
3. Take the absolute value of the difference of each of the counterpart samples.
4. Take the average of the absolute values.
5. Indicate a scene change if the average of the absolute values exceeds a predetermined threshold.
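A minimal sketch of the five steps listed above, assuming the 80 per-frame envelope samples from the analog-to-digital converter 25 are available as sequences of numbers; the threshold value is an illustrative assumption.

```python
# Sketch of the scene-change test of steps 1-5.
SCENE_CHANGE_THRESHOLD = 10.0   # hypothetical average-difference threshold

def scene_change(current_frame: list[float], previous_frame: list[float],
                 threshold: float = SCENE_CHANGE_THRESHOLD) -> bool:
    """Compare counterpart samples, take the absolute differences, average them,
    and indicate a scene change if the average exceeds the threshold."""
    diffs = [abs(c - p) for c, p in zip(current_frame, previous_frame)]
    return sum(diffs) / len(diffs) > threshold
```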
The indication of a scene change is applied to the signature extraction circuit 28 (Fig. 1) to cause the signature extraction circuit 28 to operate on the digitized samples 102 to extract the signature. The signature extraction circuit 28 operates on the digitized samples 102 by taking the average value of consecutive samples 102, for example, eight consecutive samples, to generate a total of, for example, ten features 104 that comprise an image signature. Typically, four bits define a feature, and consequently, an image signature can be defined by forty bits of data for the example discussed above; however, in some instances more features and more data may be required.
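A minimal sketch of this signature extraction: the 80 envelope samples of a field are reduced to ten features, each the average of eight consecutive samples, and each feature is quantized to four bits. The full-scale value used for the four-bit quantization is an illustrative assumption.

```python
# Sketch of image signature extraction: 80 samples -> 10 four-bit features (40 bits).
def extract_signature(samples: list[float], samples_per_feature: int = 8,
                      full_scale: float = 255.0) -> list[int]:
    """Return ten 4-bit features forming a 40-bit image signature."""
    features = []
    for i in range(0, len(samples), samples_per_feature):
        group = samples[i:i + samples_per_feature]
        mean = sum(group) / len(group)
        features.append(min(15, int(mean / full_scale * 16)))  # quantize to 4 bits
    return features
```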
Both the home unit 10 (Fig. 1) and the reference signature extraction circuits 50 and 54 (Fig. 2) operate in the same manner to extract signatures, the only difference in operation being that the signature extraction circuit 28 (Fig. 1) requires an Event 2 to be preceded by an Event 1 before signature extraction occurs, whereas the reference signature extraction circuits 50 and 54 (Fig. 2) do not require an Event 1 to precede an Event 2, but rather extract a reference signature each time an Event 2 occurs.
Once the home unit signatures and the reference signatures are obtained, the home unit signatures will be compared with the reference unit signatures, normally with those signatures occurring in corresponding time intervals. The correlation coefficients between the home unit signatures and the reference unit signatures will be compared, and the reference signature that has the best correlation to the home unit signature will be used to identify the program being viewed, provided that the correlation coefficient exceeds a minimum threshold. If it does not, no match will be indicated.
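A minimal sketch of the best-correlation selection just described, using the same Pearson correlation as in the earlier pair-matching sketch (Python 3.10+); the candidate data structure and the minimum threshold value are illustrative assumptions.

```python
# Sketch of selecting the best-correlating reference signature above a minimum threshold.
from statistics import correlation

MIN_CORRELATION = 0.9   # hypothetical minimum acceptable correlation coefficient

def best_match(home_signature: list[float], candidates: dict[str, list[float]]):
    """candidates maps a program identifier to a reference signature occurring in
    the corresponding time interval. Returns the best program id, or None."""
    best_id, best_r = None, MIN_CORRELATION
    for program_id, ref_signature in candidates.items():
        r = correlation(home_signature, ref_signature)
        if r > best_r:
            best_id, best_r = program_id, r
    return best_id
```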
Obviously, modifications and variations of the present invention are possible in light of the above teachings. Thus, it is to be understood that, within the scope of the appended claims, the invention may be practiced otherwise than as specifically described above.

Claims (44)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE
PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. A method for identifying signals for determining audience ratings comprising the steps of operating on the signal to be identified in order to extract a feature string, comparing the feature string with various feature strings corresponding to known signals and identifying the signal when the feature string of the signal to be identified correlates with one of the feature strings corresponding to a known signal within a predetermined tolerance, wherein the step of extracting the feature string from the signal to be identified comprises the steps of: monitoring predetermined changes in a predetermined parameter of the signal to be identified; determining the time intervals between the predetermined changes in the predetermined parameter; and utilizing the time intervals between successive predetermined changes to generate the feature string, the feature string being a digital representation of a predefined number of said time intervals.
2. The method recited in claim 1 wherein the signal to be identified includes an audio signal, and wherein the step of extracting the feature string includes the step of monitoring the amplitude of the audio signal and determining the time intervals that the amplitude of the audio signal departs from a predetermined level of amplitude.
3. The method recited in claim 2 wherein the predetermined level amplitude is a level that corresponds substantially to silence, and said time interval corresponds to the time interval between successive periods of silence in the audio signal.
4. The method recited in claim 1 wherein the signal to be identified includes a video signal, and wherein the step of extracting the feature string includes the steps of detecting predetermined events in the video signal and determining the time intervals between detected predetermined events to generate the feature string.
5. The method recited in claim 4 wherein the step of detecting predetermined events includes the step of monitoring the light emanating from a display on which the video signal is displayed, and indicating a detected predetermined event when the amount of light emanating from the display changes by a predetermined amount.
6. The method recited in claim 5 wherein the step of monitoring the light emanating from the display includes the step of monitoring the light with a photoelectric device, and indicating a detected predetermined event when the output of the photoelectric device changes by a predetermined amount.
7. The method recited in claim 4 wherein the step of detecting predetermined events includes the step of determining the average amplitude of the video signal and indicating a detected predetermined event when the average amplitude of the video signal changes by a predetermined amount.
8. The method recited in claim 4 wherein the step of detecting predetermined events includes the step of monitoring the video envelopes of at least one field of two frames.
9. The method recited in claim 8 wherein the step of monitoring the video envelope of two frames includes the step of comparing the envelopes of at least one field of two consecutive frames.
10. The method recited in claim 8 wherein the step of monitoring the video envelope of two frames includes the step of comparing the envelopes of at least one field of two non-consecutive frames.
11. The method recited in claim 10 wherein the step of comparing the video envelope of two non-consecutive frames includes the step of comparing the video envelopes of two frames spaced at approximately a twenty to forty frame interval.
12. The method recited in claim 11 wherein the step of comparing the video envelope of two frames includes the step of sampling said video envelope and comparing counterpart samples.
13. The method recited in claim 4 wherein the step of detecting predetermined events includes the steps of monitoring the video envelopes of two frames, obtaining a predetermined number of samples from each envelope and comparing the magnitudes of counterpart samples to obtain a difference in counterpart samples, taking the average of the differences, and indicating a detected predetermined event whenever the average exceeds a predetermined level.
14. The method recited in claim 13 wherein the step of comparing and taking the difference of counterpart samples includes the step of taking the difference of the absolute values of the counterpart samples.
15. The method recited in claim 13 wherein said predetermined number of samples is eighty.
16. The method recited in claim 13 further including the step of obtaining a signature including the step of monitoring the video envelope of a predetermined frame, obtaining a predetermined number of samples from said envelope, obtaining the average of a predetermined number of consecutive samples to define said signature.
17. The method recited in claim 4 wherein the step of detecting predetermined events includes the steps of determining the average level of the video signal, determining the amplitudes of predetermined portions of the video signal, comparing the amplitudes of the predetermined portions of the video signal with the average amplitude of the video signal and assigning one of a first and a second designation to each portion depending on whether the amplitude of the predetermined portion exceeds or is less than the average amplitude, utilizing the designations so obtained to obtain a frame code, comparing the frame code with the frame code of a previous frame, and indicating a change if the current frame code differs from the previous frame code by a predetermined amount.
18. The method recited in claim 1 wherein said signal to be identified includes a video signal and said video signal includes a color signal, wherein the step of extracting the feature string includes the step of determining the time interval between predetermined changes in the color signals.
19. The method recited in claim 18 wherein the step of determining changes in the color signal includes the step of comparing a function of the color signal with the average of the color signal and determining the time intervals when the function of the color signal exceeds the average of the color signal and the time intervals when the function of the color signal is less than the average of the color signal to extract the feature string.
20. The method recited in claim 19 wherein the function of the color signal and the average of the color signal is determined for only a single line of the frame.
21. The method recited in claim 20 wherein the single line is near the top of the image.
22. In a system for identifying signals for determining audience ratings having means for operating on the signal to be identified in order to extract a feature string, means for comparing the feature string with various feature strings corresponding to known signals and for identifying the signal when the feature string of the signal to be identified correlates with one of the feature strings corresponding to a known signal within a predetermined tolerance, the improvement wherein the feature string extracting means comprises: means for monitoring predetermined changes in a predetermined parameter of the signal to be identified; means for determining the time intervals between the predetermined changes in the predetermined parameter; and means for utilizing the time intervals between successive predetermined changes to generate the feature string, the feature string being a digital representation of a predefined number of said time intervals.
23. The improvement recited in claim 22 wherein the signal to be identified includes an audio signal, and wherein said feature string extracting means includes means for monitoring the amplitude of the audio signal and determining the time intervals that the amplitude of the audio signal departs from a predetermined level of amplitude.
24. The improvement recited in claim 23 wherein the predetermined level amplitude is a level that corresponds substantially to silence, and said time interval corresponds to the time interval between successive periods of silence in the audio signal.
25. The improvement recited in claim 22 wherein the signal to be identified includes a video signal, and wherein said feature string extracting means includes means for detecting predetermined events in the video signal and determining the time intervals between detected predetermined events to generate the feature string.
26. The improvement recited in claim 25 wherein predetermined events detecting means includes means for monitoring the light emanating from the display on which the video signal is displayed and indicating a scene change predetermined event when the amount of light emanating from the display changes by a predetermined amount.
27. The improvement recited in claim 26 wherein the display light monitoring means includes means for monitoring the light with a photoelectric device, and indicating a detected predetermined event when the output of the photoelectric device changes by a predetermined amount.
28. The improvement recited in claim 25 wherein the predetermined event detecting means includes means for determining the average amplitude of the video signal and indicating a detected predetermined event when the average amplitude of the video signal changes by a predetermined amount.
29. The improvement recited in claim 25 wherein the predetermined event detecting means includes means for comparing the video envelopes of two frames.
30. The improvement recited in claim 29 wherein the video envelope monitoring means includes means for comparing the envelopes of at least one field of two consecutive frames.
31. The improvement recited in claim 29 wherein the video envelope monitoring means includes means for comparing the envelopes of at least one field of two non-consecutive frames.
32. The improvement recited in claim 31 wherein the video envelope comparing means includes means for comparing the video envelopes of two frames spaced at approximately twenty to forty frame intervals.
33. The improvement recited in claim 29 wherein the video envelope comparing means includes means for sampling said video envelope and comparing counterpart samples.
34. The improvement recited in claim 25 wherein said predetermined event detecting means includes means for monitoring the video envelopes of two frames, means for obtaining a predetermined number of samples from each envelope, means for comparing the magnitudes of counterpart samples to obtain a difference in counterpart samples, and means for taking the average of the differences and for indicating a detected predetermined event whenever the average exceeds a predetermined level.
35. The improvement recited in claim 34 wherein the comparing and counterpart samples difference taking means includes means for taking the differences of the absolute values of the counterpart samples.
36. The improvement recited in claim 34 wherein said predetermined number of samples is eighty.
37. The improvement recited in claim 34 further including means for obtaining a signature including means for monitoring the video envelope of a predetermined frame and obtaining a predetermined number of samples from said envelope, and means for obtaining the average of a predetermined number of consecutive samples to define said signature.
38. The improvement recited in claim 25 wherein the means for detecting predetermined events includes means for determining the average level of the video signal, means for determining the amplitudes of predetermined portions of the video signal, means for comparing the amplitudes of the predetermined portions of the video signal with the average amplitude of the video signal and assigning one of a first and a second designation to each portion depending on whether the amplitude of the predetermined portion exceeds or is less than the average amplitude, means for utilizing the designations so obtained to obtain a frame code, means for comparing the frame code with the frame code of a previous frame and indicating a change if the current frame code differs from the previous frame code by a predetermined amount.
39. The improvement recited in claim 23 wherein said signal to be identified includes a video signal and said video signal includes a color signal, wherein the feature string extracting means includes means for determining the time interval between predetermined changes in the color signals.
40. The improvement recited in claim 39 wherein the color signal change determining means includes means for comparing a function of the color signal with the average of the color signal and determining the time intervals when the function of the color signal exceeds the average of the color signal and the time intervals when the function of the color signal is less than the average of the color signal to extract the feature string.
41. The improvement recited in claim 40 wherein the function of the color signal and the average of the color signal is determined for only a single line of the frame.
42. The improvement recited in claim 41 wherein the single line is near the top of the image.
43. A system for identifying signals for determining audience ratings comprising: means for detecting the occurrence of predetermined events in the signal; means responsive to said detecting means for determining the time intervals between successive occurrences of said predetermined events; and means responsive to said time intervals for generating a signal identifying feature string in response to said time intervals, said signal identifying feature string being a digital representation of a predefined number of said time intervals.
44. The system recited in claim 43 further including means for comparing the feature string with reference feature strings representative of known signals and for identifying said signal when said feature string correlates with one of said reference feature strings within a predetermined coefficient of correlation.
CA000615693A 1984-04-26 1990-04-04 Program identification method and apparatus Expired - Lifetime CA1290052C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US06/604,367 1984-04-26
US06/604,367 US4697209A (en) 1984-04-26 1984-04-26 Methods and apparatus for automatically identifying programs viewed or recorded
CA000480035A CA1279124C (en) 1984-04-26 1985-04-25 Program identification method and apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CA000480035A Division CA1279124C (en) 1984-04-26 1985-04-25 Program identification method and apparatus

Publications (1)

Publication Number Publication Date
CA1290052C true CA1290052C (en) 1991-10-01

Family

ID=25670656

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000615693A Expired - Lifetime CA1290052C (en) 1984-04-26 1990-04-04 Program identification method and apparatus

Country Status (1)

Country Link
CA (1) CA1290052C (en)

Similar Documents

Publication Publication Date Title
EP0161512B1 (en) Program identification system
JP3512419B2 (en) Audience measurement system
EP0450631B1 (en) Automatic commercial message recognition device
US4792864A (en) Apparatus for detecting recorded data in a video tape recorder for audience rating purposes
CN1135756C (en) Coded/non-coded program audience measurement system
US6513161B2 (en) Monitoring system for recording device
EP0745308B1 (en) Television receiver
US6856758B2 (en) Method and apparatus for insuring complete recording of a television program
US20040181799A1 (en) Apparatus and method for measuring tuning of a digital broadcast receiver
JP2003523692A (en) Audience measurement system and method for digital broadcasting
US5034902A (en) Method and system for ascertaining the consumption habits of a test population
CA1290052C (en) Program identification method and apparatus
EP0382996B1 (en) Method and system for ascertaining the consumption habits of a test population
HU228533B1 (en) Television signal processing device for automatically selecting and indicating the beginning of television programs of interest to the user
EP1199898B1 (en) Method for automatically identifying or recognising a recorded video sequence
JPH06508738A (en) Channel detection method

Legal Events

Date Code Title Description
MKLA Lapsed