CA2192439A1 - Subtitling video data - Google Patents

Subtitling video data

Info

Publication number
CA2192439A1
Authority
CA
Canada
Prior art keywords
data
video
subtitling
look
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA 2192439
Other languages
French (fr)
Inventor
David John Atkins
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SCREEN SUBTITLING SYSTEMS Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2192439A1 publication Critical patent/CA2192439A1/en
Abandoned legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/41Bandwidth or redundancy reduction
    • H04N1/411Bandwidth or redundancy reduction for the transmission or storage or reproduction of two-tone pictures, e.g. black and white pictures
    • H04N1/413Systems or arrangements allowing the picture to be reproduced without loss or modification of picture-information
    • H04N1/419Systems or arrangements allowing the picture to be reproduced without loss or modification of picture-information in which encoding of the length of a succession of picture-elements of the same value along a scanning line is the only encoding step
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • G06T9/005Statistical coding, e.g. Huffman, run length coding
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/46Conversion to or from run-length codes, i.e. by representing the number of consecutive digits, or groups of digits, of the same kind by a code word and a digit indicative of that kind
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • H04N7/0884Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital for the transmission of additional display-information, e.g. menu for programme or channel selection
    • H04N7/0885Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital for the transmission of additional display-information, e.g. menu for programme or channel selection for the transmission of subtitles

Abstract

Subtitling data is generated for association with video signals. The subtitling data may be combined with the video signals, by placing it in vertical blanking periods for example, or it may be supplied over an associated data channel, particularly when the video signal has undergone digital compression. A representation of displayable characters is generated in the form of a pixel array and output codes are generated representing compressed lines of said array. Run-length encoding is performed by recursively addressing a look-up table and shifting new input data to provide addresses to said look-up table as required. The output codes are synchronised to the video frames for transmission, via terrestrial broadcast, cable or satellite, or for storage.

Description

WO 95/34165    PCT/GB95/01353

SUBTITLING VIDEO DATA

The present invention relates to adding subtitling data to video signals and to processing video signals having subtitling data added thereto.

Processes for the addition of subtitles to video and cinematographic film are known. Traditionally, the video material was processed by a translating and subtitling department, resulting in new video material being produced with subtitles forming a permanent part of the video image. In the art, the subtitling data is referred to as being "burnt" into the video pictures.

It is also known to include subtitling data as a transmitted teletext page. Under the teletext system, character data is supplied during vertical blanking intervals of the video transmission. Each video frame includes blanking intervals at the top and bottom of each image, that may be considered as additional lines which do not contain video information. This in turn may be considered as additionally available bandwidth, allowing data to be transmitted therein, such as the aforementioned teletext data or Nicam stereo audio data etc.

In conventional teletext systems, character codes are transmitted during the blanking periods and these codes are translated into displayable characters at the receiving station. Known teletext systems therefore have restricted character sets and the definition of these character sets is severely restricted. Thus, it is possible using these character sets to generate characters in most Roman based languages, including English and most European languages, but this character set does not facilitate the transmission of characters in many other languages. Thus, for example, it is extremely difficult to convey Chinese, Korean or Japanese characters using teletext techniques; therefore subtitles for these languages, using conventional methods, are burnt into the video images.

A procedure for adding subtitles to video images is described in European Patent Publication No. 0,400,990. The system provides an environment in which a user may generate titles (not necessarily translated subtitles) whereupon this data may be added to video frames processed by a video tape recorder. A more sophisticated system for adding subtitles is disclosed in United Kingdom Patent Publication No. 2,170,371. In this system, non-Roman characters may be combined with video frames by storing the character image data on an independent data carrying medium, such as an optical disc. In this way, only time codes are stored on the actual video data and in operation these time codes are used to access full bandwidth character information stored independently on the optical disc medium. Clearly, a severe disadvantage of this system is that sophisticated additional hardware is required at a receiving station.

The above techniques represent subtitles as an array of pixels and within this array substantially any configuration of lettering is possible. However, the severe disadvantage of such an approach is that the subtitling pixels require a substantial amount of bandwidth for storage or transmission. In particular, insufficient bandwidth is provided in video blanking periods for subtitling of this type to be transmitted in a form which would allow it to be selectively combined with the video images at the receiver.

According to a first aspect of the present invention, there is provided apparatus for generating subtitling data for association with video signals, comprising character pixel generating means arranged to generate a representation of displayable characters as a pixel array; compression means arranged to produce output codes representing lines of said pixel array in compressed form; and synchronising means arranged to synchronise said output codes with said output video frames.

Thus, an advantage of the present invention is that characters are generated as an array of pixels, thereby allowing ideographic characters to be used as subtitles without reference to a small set of character codes. The compression means is then arranged to produce output codes representing lines of the pixel array in compressed form, thereby reducing the bandwidth requirement for the subtitling data. In this way, the subtitling data in compressed form may be associated with the video signals (for example by placing said compressed data within the video blanking periods) without requiring unrealistic levels of bandwidth.

In a preferred embodiment, the subtitling data is associated with the video signals by being added to vertical blanking periods of a conventional television broadcast signal. Alternatively, the subtitling data is associated with the video signals by being added to an associated data transmission channel and this data transmission channel may be associated with the video frames conveyed in compressed form, such as in accord with MPEG 2 recommendations.

According to a second aspect of the present invention, there is provided apparatus encoding image data representing picture elements, comprising a look-up table configured to produce table data in response to input address data; means for supplying contiguous picture element data as input address data to said look-up table; and analysing means for analysing said table data read from said look-up table, wherein said analysing means analyses first table data and generates run-length output data or, in response to said analysis, said analysing means requests new address data, from said input data, so as to produce new table data, on detecting table data representing input data having runs of similar data extending beyond the input address.

Preferably, the analysing means determines the number of input pixels encoded by an output code, such that elements previously used as addressing elements are not encoded by said code and additional contiguous picture elements are read to provide a new address to said look-up table.

According to a third aspect of the present invention, there is provided apparatus for adding subtitling data to a video image, wherein said subtitling data is received with associated video signals, comprising means for decompressing associated subtitling data at video rate; means for assembling said subtitling data as pixel values; and means for combining said pixel values with associated video frames at video rate.
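The receiver-side steps of this third aspect (decompress the run-length data, then assemble it as pixel values ready for keying) can be sketched as a small routine. The (value, length) pair format below is an assumption for illustration only, not the patent's wire format:

```python
def decode_runs(runs, expected_width=None):
    """Expand (value, length) run pairs back into a line of pixel values.

    `expected_width` optionally checks the reassembled line length, since
    a corrupted run list would otherwise key garbage onto the picture.
    """
    pixels = []
    for value, length in runs:
        pixels.extend([value] * length)
    if expected_width is not None:
        assert len(pixels) == expected_width, "corrupt line"
    return pixels

print(decode_runs([(1, 4), (0, 8), (1, 4)]))
# [1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1]
```

Each decoded line would then be combined (keyed) with the corresponding video line at video rate.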

According to a fourth aspect of the present invention, there is provided a method of generating subtitling data for association with video signals, comprising generating a representation of displayable characters as a pixel array; producing output codes representing lines of said pixel array in compressed form; and synchronising said output codes with video frames.

Preferably, compression is effected by a process of run-length encoding upon lines of the pixel array. In a preferred embodiment, the run-length encoding is performed by addressing a look-up table and, preferably, a plurality of look-up table addresses are accumulated before an output code is generated.

The invention will now be described by way of example only, with reference to the accompanying drawings, in which:

Figure 1 shows a video monitor, displaying a video image with subtitles added thereto;

Figure 2 shows a system for generating subtitles, including an off-line assembly station, a subtitle synchroniser and a subtitle encoder;

Figure 3 details the assembly station shown in Figure 2;

Figure 4 details the subtitle synchroniser shown in Figure 2;

Figure 5 details the subtitle encoder shown in Figure 2;

Figure 6 and Figure 7 illustrate the operation of the subtitle encoder shown in Figure 5;

Figure 8 shows a receiving station for receiving subtitles generated by the system shown in Figure 2;

Figure 9 illustrates the operation of the receiving system shown in Figure 8.

A video monitor 15 is shown in Figure 1, that may form part of a conventional television receiver. The video monitor displays a conventional television picture 16, over which subtitles 17 have been overlaid, by a process known as keying.


The subtitles are placed within a notional horizontal band towards the bottom half of the picture, in response to signals received by the receiver itself or by a separate decoding apparatus. Thus, at the receiving station, it is possible to selectively decide whether the subtitles are to be combined with the displayed video image. Furthermore, in a more sophisticated embodiment, it is possible to select from a plurality of available languages of subtitles, each of which may be keyed onto the video image in synchronism with appropriate frames of displayed video.

Video data keyed within region 17 is derived from pixel data representing a region 50 lines high and 720 pixel locations across. Subject to bandwidth availability, it is possible to transmit any image within this region, such that the generation of keyable subtitling characters is not restricted to characters available from a character generator provided at the reception end. Thus, with a region of this size, it is possible to display characters from any desired character set, such as Chinese, Japanese or Korean etc.

Subtitling characters tend to be displayed in solid colours; therefore, with a definition of 720 pixel positions over 50 lines, it is possible to supply single bit data for each pixel location. Thus, subtitling data of this type is usually referred to as a bit map. In alternative embodiments, subject to available bandwidth, it is possible to transmit keyable image data having multi-bit values, for example to indicate image data having a plurality of colours or brightnesses. Filtering techniques are also employed to "soften" the edges of the key.
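The stated region dimensions fix the size of the uncompressed single-bit subtitle bit map, which can be checked with simple arithmetic:

```python
# Uncompressed single-bit subtitle bit map described above:
# 50 lines of 720 one-bit pixel locations.
width, height = 720, 50
total_bits = width * height
total_bytes = total_bits // 8
print(total_bits)   # 36000 bits per subtitle image
print(total_bytes)  # 4500 bytes

# The description later notes that a compression ratio of at least
# three to one is needed to fit such an image into vertical blanking
# periods; at 3:1 the payload falls to:
print(total_bits // 3)  # 12000 bits
```

This is the bandwidth figure that motivates the run-length compression scheme developed in the remainder of the description.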

An overview of a system for generating subtitles is shown in Figure 2. At an assembly station 201 an operator reviews video sequences and manually enters subtitling character codes. The assembly station 201 is of conventional configuration and is arranged to produce files of data consisting of video time codes with associated strings of subtitling text. For each video sequence a plurality of such files may be produced when subtitling is required in a plurality of languages. Thus, each file would consist of time code listings with an associated text string of the appropriate language. These files are written to a removable data carrying medium, such as a 3 1/2" floppy disc 202. Thus, the assembling operation is effectively performed "off-line" to produce files which relate subtitling strings to specific frames within video sequences, identified by time codes.

The system shown in Figure 2 also includes a subtitle synchroniser 203, a subtitle encoder 204, a video modulation device 205 for conventional broadcasts, a video encryption device 206 for digital distribution and a video recorder 207 for recording subtitles as video pictures. The synchroniser 203, encoder 204 and modulation devices 205 and 206 are arranged to operate in real time, allowing subtitling characters to be associated with video information while said video information is being transmitted. In this way, decisions relating to the subtitling process do not need to be made until the actual time for transmission occurs, thereby enhancing system flexibility and eliminating the need for full bandwidth storage of video information with its associated subtitling data.

The assembly station 201 is detailed in Figure 3 and, as previously stated, represents a station of substantially conventional form. Input video source material from a video tape recorder 301 or similar device supplies video signals to a processor 302. In addition, the video tape recorder 301 also supplies time code to the processor 302, such that said processor may identify specific frames of the video sequence with reference to said time code.

A manually operable keyboard 303 allows an operator to manually select characters for inclusion in video sequences as subtitles. As character subtitles are being generated, they may be displayed via a visual display unit 304 and video sequences, with or without subtitles, may be displayed on a video monitor 305.

The association of specific subtitling strings with video frames is recorded by relating these strings to identifications of time code. Thus, data of this type, mapping time code to character strings, is written to files on a removable floppy disc 306, using disc drive 307.

The subtitle synchroniser 203 and the subtitling encoder 204 operate in real time, allowing the subtitling data to be associated with the video data as the video data is being transmitted or recorded. The synchroniser 203 is detailed in Figure 4, shown connected to an automated video feed device 401. The video feed device 401 is loaded with a plurality of source video tapes, allowing video signals to be produced (video out) for broadcast purposes from a collection of pre-recorded tapes. Thus, it is possible for several hours, possibly several days, of video material to be generated by the video feed device 401 with substantially little manual intervention.

The automated feed, such as a Sony LMS, supplies an identification of the current video sequence selection to the synchroniser 203. The video source material to be played is selected and, while playing, video time code is supplied to the synchroniser 203.


The synchroniser 203 receives a disc 202 of subtitling files and these files are read by floppy disc drive 402 under the control of a microprocessor based system 403. The microprocessor based system uses a conventional central processing unit, such as an Intel 80486 DX2 device, with associated program memory. Data read via the floppy disc drive 402 is assembled by the microprocessor system 403 and written to a local hard storage device 404. As this process occurs, individual files, related to the same video sequence, are combined such that, for each time code entry, each of the available subtitles, of differing languages, are combined into the same file. Thus, in this way, the number of files present on the hard disc 404 is reduced and selection of a required subtitling language is made by an appropriate index into the data read from the disc 404.

The synchroniser 203 also includes a Graphics Signal Processor (GSP) based processing environment 405, configured around a Texas 34010 device. On start up, instruction codes for the GSP are downloaded, from the hard storage device 404, via the microprocessor subsystem 403 and its associated system bus 406. The GSP 405 is configured to synchronise the subtitling files to the incoming time code. The incoming time code, read by a time code reader 407, is supplied to the GSP subsystem 405 via the microprocessor subsystem 403 and its system bus 406. Time code is supplied from the reader 407 to the microprocessor subsystem 403 under interrupt control, whereafter said processor 403 directs the time code information to the GSP subsystem 405. The time code received by the time code reader 407 is synchronised to the video output signal and the GSP subsystem 405 is arranged to maintain this synchronism such that, in response to time code coming in, its associated subtitling text is supplied as a synchronised output.
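The synchronisation step just described, releasing the stored subtitle text whose time code matches the incoming time code, can be sketched as follows. The dictionary-based store, the `HH:MM:SS:FF` string format and the 25 fps rate are assumptions for illustration, not the patent's actual file format:

```python
def timecode_to_frames(tc, fps=25):
    """Convert an 'HH:MM:SS:FF' time code to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Hypothetical subtitle file: time code -> subtitle text string.
subtitle_file = {
    "00:00:10:00": "HELLO",
    "00:00:14:12": "GOODBYE",
}

# Index by frame number for constant-time lookup as time code arrives.
index = {timecode_to_frames(tc): text for tc, text in subtitle_file.items()}

def on_timecode(tc):
    """Called for each incoming time code; returns any synchronised text."""
    return index.get(timecode_to_frames(tc))

print(on_timecode("00:00:10:00"))  # HELLO
print(on_timecode("00:00:10:01"))  # None (no subtitle starts this frame)
```

In the described system this lookup runs on the GSP at video rate, with the multi-language files indexed per language rather than per file.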

The synchronised subtitling text, assembled by the GSP subsystem 405, is supplied to a parallel to serial conversion circuit 408 which, given the processing power of the GSP subsystem 405, is capable of generating eight serial outputs, possibly conveying subtitles in different languages at video rate. In the present embodiment, one of these serial outputs of synchronised subtitling characters is supplied to the subtitling encoder 204 via an output port 409.

The subtitle encoder 204 is detailed in Figure 5. The encoder is based upon a Texas 34010 GSP subsystem, in which the GSP 501 communicates with a hard disc storage device 502, a two megabyte frame buffer 503, an eight megabyte random access memory device 504 and a serial communications device 505. The eight megabyte random access memory device 504 is used for storing working copies of fonts and program-specific information. Images are constructed within the two megabyte frame buffer 503, which is configured from RAM devices providing a serial access port, allowing the image to be scanned out in real time in synchronism with a video source.

The encoder 204 receives character information at video rate in response to time codes received by the synchroniser 203. These character strings are supplied to an input serial port 506, which in turn directs them to the working memory 504 over an internal system bus 507.

Once buffered in the working memory 504, the GSP 501 processes the information to determine the character data structure. In addition to particular characters, this will contain other information such as font type and font size. In response to this information, the processor 501 produces a list of blitting operations required to generate pixel information, derived from the information available on hard disc 502, from the input character strings.

Having produced a blit list of this type, the blit list is executed to produce image pixels that are written to the two megabyte frame buffer 503. Having written pixels to the frame buffer 503, an assessment is made as to the area of coverage within the buffer, resulting in information being generated identifying a bounding rectangle (603) and the position of said rectangle within the displayable area of the final image screen.

Having supplied a complete subtitle to the frame buffer 503, the processor 501 scans the image retained within the frame buffer 503 line by line, to produce compressed run-length codes that are supplied to an output port 508.

As shown in Figure 6, a subtitle image has been written to the frame buffer 503 in which the first word, positioned at the top left corner, starts with the letters "HE". The processor 501 has determined that the whole of the subtitle starts from the top left corner of the letter "H", therefore it is necessary to initiate scanning from this position. Arrow 601 represents the starting position for the first scan line. Scan line 601 will result in information being produced to the effect that four white pixels are required, followed by eight black, followed by four white, followed by six black, followed by eleven white, and so on.

Similarly, at scan line position eight, the starting point of which is identified by arrow 602, the relevant information consists of sixteen white pixels, followed by six black, followed by seven white and so on. Thus, it can be anticipated that for the majority of subtitling characters the run of pixel data will consist of a predetermined number of white or coloured pixels, representing the location of characters, followed by runs of black pixels representing the spaces.
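The run structure described for scan line 601 can be expressed as (value, length) pairs, as a sketch of the information the encoder produces per line. The pixel values are an assumption: 1 for white or coloured pixels, 0 for black:

```python
def runs(pixels):
    """Collapse a line of pixel values into (value, run_length) pairs."""
    out = []
    for p in pixels:
        if out and out[-1][0] == p:
            out[-1][1] += 1          # extend the current run
        else:
            out.append([p, 1])       # a transition starts a new run
    return [tuple(r) for r in out]

# "four white, eight black, four white, six black, eleven white"
scan_601 = [1]*4 + [0]*8 + [1]*4 + [0]*6 + [1]*11
print(runs(scan_601))  # [(1, 4), (0, 8), (1, 4), (0, 6), (1, 11)]
```

Note that this naive version inspects every pixel serially; the discussion below explains why exactly this per-pixel approach is too slow for the real-time requirement.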

In the majority of situations in which subtitling data is to be associated with video data and possibly combined with said video data for transmission over a common carrier, bandwidth is limited; therefore it would not be possible to provide pixel information for each pixel within the subtitling region. The data is transmitted under such circumstances by effecting a level of data compression and, given the nature of the information involved, run-length encoding has been identified as an attractive option. However, under normal schemes for performing run-length encoding, it would be necessary to examine each pixel individually in series, to make a decision as to whether that pixel value is the same as the previous value, or different from the previous value, thereby initiating the start of a new run for the scan line being examined.

In the present environment, all of the processing performed by the subtitle synchroniser 203 and the subtitle encoder 204 is effected in real time, such that time code received from video source material, being transmitted in real time, may be used to generate and associate subtitles with said video for immediate real time transmission. Under these circumstances, the processing and encoding of the subtitling pixels must also be performed in real time and this is extremely difficult, given realisable platforms, if it is necessary to serially consider the nature of each pixel in order to detect the start and ends of individual runs.

It is known to encode data strings by examining a plurality of bits, making up a word, in parallel. Output codes may be generated for each input code by using the input code as an address or index to a look-up table. Thus, for example, an eight bit word could be supplied to a look-up table having a total of two hundred and fifty six addressable locations. Thereafter, assuming that the input data words have a predictable level of redundancy, some words occurring more often than others, it is possible to produce smaller output codes for the regularly occurring words, with the longer codes being used for the less frequently occurring words.
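The word-indexed coding just described can be sketched with a tiny fixed table for 8-bit input words. The table contents and the escape scheme for rare words are invented for illustration; the point is only that frequent words receive shorter prefix-free bit strings:

```python
# Hypothetical code table: common 8-pixel words get short codes.
code_table = {
    0b00000000: "0",     # all-black word: assumed most frequent
    0b11111111: "10",    # all-white word
    0b11110000: "110",   # an example transition word
}
ESCAPE = "111"  # rare words follow this prefix verbatim (an assumption)

def encode_words(words):
    """Map each 8-bit input word to its variable-length output code."""
    bits = []
    for w in words:
        if w in code_table:
            bits.append(code_table[w])
        else:
            bits.append(ESCAPE + format(w, "08b"))
    return "".join(bits)

print(encode_words([0, 0, 255, 240]))  # "0" "0" "10" "110" -> "0010110"
```

Because "0", "10", "110" and "111…" form a prefix-free set, a decoder holding the inverse table can split the bit stream unambiguously.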

In accordance with known systems, it is possible for the look-up table to be updated over time, so as to perform an adaptive process in terms of the mapping of input words to output words. Thus, at the transmitter, a process may continually examine the input data words to determine which are occurring more frequently. On the basis of this assessment, it is then possible for the transmitter to issue an instruction to the receiver to the effect that, at an appropriate moment, a modification is to be made to the look-up table coding. Thus, in this way, it is possible for the transmitter to identify a new optimisation table, issue codes to the receiver to the effect that a modification is required and thereafter make use of the new optimisation table, so as to enhance overall performance and to take full advantage of the inherent redundancy in the input data stream.

A first problem with using look-up tables in a subtitling environment is that transmission paths are often susceptible to relatively large levels of noise. When noise is present on the line, data transmissions are corrupted and it is known to introduce levels of redundancy in order to facilitate error identification and error correction. However, the whole point of using look-up tables is to reduce data rates and minimise redundancy, therefore it would be highly undesirable to start introducing new redundancy in order to provide a level of error correction.

Noise immunity is substantially improved if the code conversion tables remain fixed and do not attempt to perform adaptive optimisation during transmission. However, a problem with this approach is that the coding may not make the best use of the inherent redundancy, resulting in a notionally higher bandwidth requirement.

A further problem with the look-up table approach is that the coding process is limited to input strings of a length equal to the look-up table index address bus. Thus, for an eight bit look-up table, the input string would be considered in units of eight bits. Clearly, when compressing data derived from images of the type shown in Figure 6, runs of substantially more than eight bits may be present, and these large runs could be compressed very efficiently using run-length encoding techniques. However, as previously stated, conventional run-length encoding techniques serially examine each incoming pixel so as to identify transition points. This requires a substantial processing overhead and cannot be implemented practically for the present application.

Thus, it can be seen that a first constraint is placed on the transmission of pixelated subtitles in that, in many environments, the available bandwidth for transmitting subtitles is severely restricted. As used herein, the subtitles are referred to as being associated with the video, meaning that the subtitles are synchronised to video frames, allowing the receiver to overlay or key the subtitling data at the appropriate points in the transmission. In some situations, such as traditional broadcast environments, the association involves combining the subtitling data with the transmitted signal. This may be achieved by placing the subtitling data in vertical frame blanking periods, in a similar fashion to that employed by teletext systems in which character codes are conveyed in vertical blanking periods. The use of character codes in teletext systems is very bandwidth-efficient, but the amount of bandwidth for transmitting pixel data is severely restricted and it is not possible to transmit an image of the type shown in Figure 6 without performing a level of data compression. Experience has shown that it is necessary to provide a compression ratio of at least three to one in order to transmit pixelated subtitles within vertical blanking periods.
Associated pixelated subtitles may be associated with the video frames (i.e. syll-,lllulliO.,d) for other forms of lln~ ll For example, using MPEG 2, separate channels are provided for the ~ ., of additional 10 data and these channels may be employed for the 1l~ ... of cu~ U~io ~,d pixelated subtitle data. Other digital 1l~ lll systems are becoming h~ oill~ly available and again data channels will be provided for the tr2-ncmiceion of associated data, such as subtitles, along with other types of non-~y...lllu.liO~d data. In ~..h~ ly all of these cases, the amount of 15 bandwidth for the L~ ;.lll of associated data is restricted and data UUIII~UI~ oiU~I tPrhniqllrc are required.

If the pixelated subtitling data were being associated to video frames in non real-time, that is, as an off-line process, the only constraint would be that of bandwidth and processing power would no longer become a problem.
20 Situations in which encoding l h" ~ require ~ ;nlly more processing power at the tr~n~micci~n end compared to the amount of plu~,e;>Oillg power at the receiving end are well known. For example, during MPEG encoding ~ search ~ r~rithm~ are required in order to encrypt ;~ r frarnes in order to calculate ,~ r, .,- .I vectors. It is the r.,~ ;"" of the vectors 25 that requires a ~i~lbo~ ..u~,.,;"i..g overhead, while at the receiver it is asimple process of merely making use of the vectors calculated at the In the present ~ bodil-lc.lL, the subtitling data is associated with the video stream in real-time. Therefore, in addition to bandwidth c~".~ a further constraint is made upon the system in terms of IJIU~ aillg power.
Thus, known pixel-by-pixel run-length encoding techniques would provide a good solution to optimising bandwidth, whereas word-by-word look-up table techniques, examining a plurality of pixels in parallel, would provide a good solution to reducing the processor overhead.
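The pixel-by-pixel run-length encoding referred to here can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name and output representation are invented:

```python
def run_length_encode(pixels):
    """Encode a line of two-level pixels as (value, run_length) pairs.

    Every pixel is examined individually: the output is compact, but
    the per-pixel scanning is exactly the processing cost the text
    attributes to this class of technique.
    """
    runs = []
    for pixel in pixels:
        if runs and runs[-1][0] == pixel:
            runs[-1][1] += 1          # run continues
        else:
            runs.append([pixel, 1])   # new run starts
    return [tuple(run) for run in runs]
```

For the Figure 6 example, four white pixels, eight black pixels and four white pixels (white as 1, black as 0) would encode as three runs.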

As previously stated, in the present embodiment both bandwidth and processing speed are constrained, therefore neither of the above known techniques provides a suitable solution for compressing the pixel data for transmission in an associated form in real time.

Operation of processor 501, in order to encode images of the type shown in Figure 6, may be considered with reference to the functional illustration shown in Figure 7. A shift register 701, a look-up table 702 and an output buffer 703 are physically provided as addressable locations within random access memory device 504. Data is read from frame buffer 503 and processor 501 effects the functions of an input modifier 704, an output modifier 705, and analysing logic 706.

The look-up table 702 is established within random access memory 504 on system initialisation. The look-up table values remain constant during operation, and the inverse table, to perform the decompression process, is retained permanently at decoding stations. The look-up table 702 receives a nine bit address and is therefore capable of holding a total of five hundred and twelve table entries. Character strings, represented as picture elements, will have been assembled in the frame buffer 503. Contiguous data elements are read from the frame buffer 503 line-by-line and supplied to the shift register 701. The shift register 701 provides means for supplying contiguous picture element data, as input addresses, to the look-up table 702.

The output from the look-up table, in the form of table data, is supplied to analysing means in the form of the analysing logic 706, implemented by processor 501. The analysing logic 706 is arranged to analyse table data (which may be considered as first table data) that may then be supplied to the output buffer 703 via the output modifier 705.
Alternatively, the analysing logic 706 may request new address data, although at this stage no output data will have been produced. New address data is supplied to the shift register 701, but the input address is modified, by means of the input modifying circuit 704, in response to address modify data received from the output modifier 705. Thus, although the addressing of the look-up table 702, for this new data, is performed in a substantially similar way to that performed for the initial look-up, the input address has been modified by input modifying circuit 704; therefore the addressing of the look-up table 702 represents an identification of output codes representing an input string larger than the input address. This situation occurs when a long run of similar characters has been identified, thereby facilitating highly optimised run-length encoding. On the second iteration it is possible that the run has continued, therefore again it will be possible for the output modifying circuit to modify the addressing input and for the analysing logic 706 to request new data from the frame buffer 503. Thus, new table data will be produced, possibly suitable for supplying to the output buffer 703, on detecting initial table data representing input data having runs of similar data extending beyond the input address.
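The recursive word-by-word scheme described above can be sketched as follows. This is a simplified model under stated assumptions, not the patented implementation: the table stores run descriptions directly rather than the embodiment's output codes, and all names are invented. The final run of each nine-pixel word is treated as "open" (it may continue into the next word), which is the condition that triggers a further look-up, mirroring the recursive addressing of look-up table 702:

```python
WORD = 9  # window width, matching the nine bit table address in the text

def make_table(word=WORD):
    # One entry per possible input word: 2**9 = 512 entries, as in the
    # text. Each entry lists [value, length] runs for that bit pattern.
    table = {}
    for code in range(2 ** word):
        bits = tuple((code >> (word - 1 - i)) & 1 for i in range(word))
        runs = []
        for b in bits:
            if runs and runs[-1][0] == b:
                runs[-1][1] += 1
            else:
                runs.append([b, 1])
        table[bits] = runs
    return table

def encode_line(pixels, table, word=WORD):
    # Word-by-word encoding: while the trailing run of a window may
    # continue, fetch the next window and merge - a software analogue
    # of the recursive addressing described for look-up table 702.
    runs, open_run, pos = [], None, 0
    while pos < len(pixels):
        chunk = pixels[pos:pos + word]
        if len(chunk) == word:
            window_runs = [list(r) for r in table[tuple(chunk)]]
        else:  # short final window: scan per pixel
            window_runs = []
            for b in chunk:
                if window_runs and window_runs[-1][0] == b:
                    window_runs[-1][1] += 1
                else:
                    window_runs.append([b, 1])
        if open_run is not None and open_run[0] == window_runs[0][0]:
            window_runs[0][1] += open_run[1]   # run carried over: merge
        elif open_run is not None:
            runs.append(tuple(open_run))       # previous run is closed
        open_run = window_runs.pop()           # last run may continue
        runs.extend(tuple(r) for r in window_runs)
        pos += word
    if open_run is not None:
        runs.append(tuple(open_run))
    return runs
```

Only completed runs are emitted, so a run spanning several nine-pixel words produces a single composite code, as in the embodiment.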

The operation of the system in Figure 7 may be considered in further detail with reference to the illustration shown in Figure 6. Coding is initiated from the top left corner of the first character, therefore no coding is performed until a transition occurs from the notionally black background to the notionally white character edge. As shown in Figure 6, this represents the coding of four white characters followed by eight black characters. Nine characters will be written to the shift register 701, consisting of the first four white characters of line 601 followed by five black characters. The look-up table is therefore presented with an address consisting of four white characters followed by five black characters, although optimised run-length encoding would not produce an output code for the white characters until the full length of the run had been identified; the run consisting of a total of eight characters in this example.
In a non-recursive, open loop situation, using conventional look-up tables, an output code could be produced identifying the situation in which four white characters have been received followed by five black characters.
Groupings of this type are common in character strings, therefore this configuration could be allocated a relatively short transmission code.
However, from a run-length encoding point of view, full optimisation would not have been achieved. Thus, the look-up table 702, in accordance with the present embodiment, does not produce an output code representing four white characters followed by five black characters. Under these circumstances it produces a recursive code to the analysing logic 706, informing said analysing means that the input string consists of four white characters followed by a run of black characters. With this information known to the analysing logic 706, the frame buffer 503 is addressed, shown functionally as line 707, resulting in the frame buffer 503 supplying new contiguous data to the shift register 701.

The previous data, consisting of four white pixels followed by five black pixels, is replaced, and the new input data consists of three black pixels, followed by four white pixels, followed by two black pixels. However, the input address to the look-up table 702 will be modified by input modifier 704, effectively providing information to the effect that the three black pixels are related to the previously considered input and that the previously considered input has not, as yet, been supplied to the output buffer 703.

The output code from the look-up table 702, supplied to the analysing circuit 706, informs said analysing circuit that the previously identified group of five black pixels is completed by a further three black pixels before a transition from black to white occurs. Furthermore, after this transition occurs, four white pixels are received before a transition back to black pixels occurs. Thus, the analysing logic 706 is aware of the transition from white to black but, at this stage, it is unable to determine how many black pixels are present. Consequently, an output code is produced identifying a run of four white pixels, followed by a run of eight black pixels, followed by a run of four white pixels. Thus, sixteen pixels have been coded, nine considered as a first table address with the remaining seven being considered as part of a second table address. The table has been addressed recursively, so as to produce a composite code for a total of sixteen input pixels. The composite code for the sixteen pixels is assembled within the output modifier 705, which, on this occasion, allows the output code to be directed to the output buffer 703. The input modifier 704 is subsequently re-set, such that the next contiguous input pixels will be considered as the start of a new run.
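The bookkeeping in this worked example can be checked with a short sketch. This is illustrative only: runs are written out as 'white'/'black' labels rather than as the output codes of the embodiment:

```python
# First look-up (nine pixels): four white then five black; the black
# run reaches the end of the word and is therefore "open".
first = [("white", 4), ("black", 5)]
# Second look-up: three black, four white, two black; the leading
# black run merges with the open run from the first look-up.
second = [("black", 3), ("white", 4), ("black", 2)]

merged = first[:-1] + [("black", first[-1][1] + second[0][1])] + second[1:-1]
held_over = second[-1]              # start of the run at transition 611
coded = sum(length for _, length in merged)

print(merged)     # [('white', 4), ('black', 8), ('white', 4)]
print(coded)      # 16 pixels coded out of 18 read
print(held_over)  # ('black', 2) remains for the next look-up
```

Sixteen of the eighteen pixels read are covered by the composite code, leaving two pixels held over, matching the shift-register adjustment described below.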

Sixteen input pixels have been considered, whereas a total of eighteen pixels have been read from the frame buffer 503. The shift register 701 is therefore incremented by seven positions, so that the two remaining pixels, representing the start of the run of black pixels at transition 611, may be considered. As the seven previously processed input pixels are clocked out of the shift register 701, seven new input pixels are read from the frame buffer 503, thereby making up a new grouping of nine input pixels.

The nine input pixels provide an input address to the look-up table 702, representing six black pixels followed by three white pixels. Again, the three white pixels extend to the end of the input address, therefore coding is performed recursively in order to determine the run-length of the white pixels.
Nine new pixels are read from the frame buffer 503, representing a run of eight white pixels followed by one black pixel. Thus, having received two input words from the frame buffer 503, it is possible for the analysing logic 706 to determine that, from transition 611, six black pixels have been received followed by a total of eleven white pixels. A code to this effect is generated and, given that a total of nine plus eight input pixels have been coded, a further eight pixels are read from the frame buffer 503 and coding continues from position 613 shown in Figure 6.

This process continues until the end of the scan line. It is not necessary to consider each pixel individually in order to determine the position of pixel transitions. The look-up table 702 receives nine pixel values in parallel and produces codes identifying pixel transitions within the word.
Runs of pixels are encoded, resulting in codes being supplied to the output buffer 703. The analysing logic 706 is aware of how many pixels have been coded from the input stream, therefore this logic allows an appropriate number of contiguous input pixels to be supplied to the shift register 701. In this way, run-length encoding is optimised so as to reduce the requirement on transmission bandwidth. Thus, by using the look-up table 702 in a recursive loop, as illustrated in Figure 7, it is possible to optimise run-length encoding and thereby optimise transmission and storage bandwidth, while at the same time substantially increasing processing speed by considering a plurality of input pixel values in parallel. Furthermore, given this level of optimisation, it is not necessary to adapt values stored within the look-up table 702, therefore it is not necessary to transmit details of new look-up table values to receivers, thereby substantially improving noise immunity.

In alternative embodiments, a plurality of look-up tables, similar to look-up table 702, may be provided. An input to a first look-up table produces an output which may be used as an output code. Alternatively, this output may itself be used as an index to a second level look-up table. Further intermediate look-up tables may be included, and a chain would then be terminated by a final look-up table which, under all input conditions, is arranged to produce an output. The provision of a plurality of look-up tables should further optimise compression. However, this will also increase the hardware overhead, therefore a compromise must be made in terms of transmission efficiency and hardware requirements. Similarly, modifications could be made to the word length, with fewer than nine bits being used or more than nine bits being used. Experience has shown that, with a plurality of sample fonts, a nine bit word provides a good compromise between hardware demands and compression efficiency. The size of subtitles is controlled by many considerations, given that it must be possible to display an intelligible number of words on the final screen, while at the same time making each word large enough so as to be legible to viewers at the relevant display definition.
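The trade-off behind the nine bit word length can be made concrete: table size doubles with each extra address bit, while the number of look-ups per scan line falls only linearly. The line width used below is illustrative and is not taken from the patent:

```python
import math

LINE_WIDTH = 720  # illustrative pixel line width; an assumption, not from the text

# For each candidate word length: look-up table size (2**word entries,
# 512 for the nine bit word of the embodiment) versus the minimum
# number of table reads needed to cover one scan line.
for word in (7, 9, 11):
    entries = 2 ** word
    lookups = math.ceil(LINE_WIDTH / word)
    print(f"word={word}: {entries} table entries, >= {lookups} look-ups per line")
```

The nine bit choice sits in the middle: 512 entries is a modest memory cost, while wider words reduce look-ups only marginally for an exponential growth in table size.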

Output broadcast signals produced by device 205 of Figure 2 are processed by receiving stations, of the type shown in Figure 8, allowing the transmitted subtitle data to be displayed in combination with television pictures, as shown in Figure 1.

Transmitted signals are received by an antenna 801 and supplied to a tuner/demodulator 802. The tuner/demodulator 802 separates the video blanking data from the transmitted television signal and supplies the video-related information to a video processing device 803. The video processing device 803 cleans up the video information and reintroduces new blanking intervals. Additional video processing may be performed so as to supply appropriately encoded video signals to a television monitor 804.

A combining device 805 is positioned between the video processing device 803 and the television monitor 804. The combining device 805 is arranged to key subtitling data over the video picture, as shown in Figure 1.

The run-length encoded data derived from the video blanking intervals is supplied to a subtitle decoder 806, arranged to decode the compressed subtitling data and to provide a subtitling video signal to the combiner 805 via a switch 807. Switch 807 may be operated manually, so as to selectively add received subtitling information to the picture. In addition, switch 807 is activated to its off position if no new subtitling data is received for a predetermined time-out period. At the combiner 805, the decoded subtitling data is keyed over the television picture and a composite image is supplied to the TV monitor 804, whereafter conventional processing is performed so as to display the television pictures.

The subtitle decoder 806 is detailed in Figure 9. In the present embodiment, codes transmitted to represent the subtitles have a maximum bit length of nine bits. This bit length is determined by the particular coding adopted in order to effect the transmission and is not related to the nine bit address at the transmitter.


The transmitted words are conveyed as a serial stream and are therefore coded such that the decoder can identify the start of new words. The decoder 806 includes a shift register 901, a look-up table 902, an address writing device 903, a frame buffer 904 and a serial reading device 905. Incoming bits are supplied to the shift register 901, which in turn supplies a parallel address to the look-up table 902 when a complete word has been received. After a transmitted word has been supplied to the look-up table 902 as an index, the shift register 901 is cleared and the next word is shifted through.

Codes supplied to the look-up table 902 represent runs of white pixels and black spaces. These runs are written to appropriate locations in the frame buffer 904 under the control of the address writing circuit 903. In this way, the original array of pixels, representing the subtitle, is built up in the frame buffer, until a complete frame has been assembled. Once a complete frame of subtitling data has been assembled in the frame buffer 904, this information is read serially under the control of serial reading device 905, so as to provide a serial data stream of video information, synchronised to the data supplied to combiner 805 from the video processing circuit 803.

The output from the frame buffer 904 is in digital form and the system may include digital-to-analogue converting devices so as to effect the combining of video signals in the analogue domain. Alternatively, the video output from the video processing circuit 803 may be in digital form, allowing the combiner 805 to operate in the digital domain. In either event, assuming switch 807 is in the closed position, the two video signals are combined, resulting in the composite signal being supplied to the TV monitor 804.
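The decompression step described above, expanding run codes back into pixel rows for the frame buffer, can be sketched as follows. This is a simplified illustration; the function name and error handling are invented:

```python
def decode_runs(runs, line_width):
    """Expand (value, length) run codes into a line of pixel values,
    as the address writing circuit fills the frame buffer.

    Decoding is trivially cheap compared with encoding, matching the
    asymmetry between transmitter and receiver noted in the text.
    """
    line = []
    for value, length in runs:
        line.extend([value] * length)
    if len(line) != line_width:
        raise ValueError("runs do not cover the full scan line")
    return line
```

For example, the composite code from the Figure 6 worked example, plus the held-over run, reconstructs the original eighteen-pixel segment exactly.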

Claims (20)

1. Apparatus for generating subtitling data for association with video signals, comprising character pixel generating means arranged to generate a representation of displayable characters as a pixel array; and compression means arranged to produce output codes representing lines of said pixel array in compressed form; and synchronising means for associating said output codes with video frames.
2. Apparatus according to claim 1, wherein said subtitling data is associated with the video signals by being added to vertical blanking periods of a conventional television broadcast signal.
3. Apparatus according to claim 1, wherein said subtitling data is associated with video signals by being added to an associated data transmission channel.
4. Apparatus according to claim 3, wherein said associated data transmission channel is associated with digitally compressed video frames.
5. Apparatus according to claim 1, wherein said compression means is arranged to perform a process of run-length encoding upon lines of said pixel array.
6. Apparatus according to claim 5, wherein said run-length encoding is performed by addressing a look-up table.
7. Apparatus according to claim 6, including means for generating an output code and means for addressing a plurality of look-up table addresses before generating said output code.
8. Apparatus according to any of claims 1 to 7, wherein said subtitling data is stored as character codes, said character codes are read at video rate in response to video time code, and said read character codes are supplied to an encoder at video rate.
9. Apparatus according to claim 8, wherein said encoder converts character codes to run-length codes at video rate, to produce run-length codes synchronised to said time code.
10. Apparatus for encoding image data representing picture elements, comprising a look-up table configured to produce table data in response to input address data;
means for supplying contiguous picture element data as input address data to said look-up table; and analysing means for analysing said table data read from said look-up table, wherein said analysing means analyses a first table data and generates run length output data or, in response to said analysis, on detecting table data representing input data having runs of similar data extending beyond the input address, said analysing means requests new address data from said input data, and said look-up table produces new table data in response to said new address data, said new table data comprising a code representing a string of input data larger than a said input address.
11. Apparatus according to claim 10 wherein said analysing means determines the number of input pixels encoded by an output code, such that elements previously used as addressing elements are not encoded by said code and additional contiguous picture elements are read to provide a new address to said look-up table.
12. Apparatus for adding subtitling data to a video image, wherein said subtitling data is received with associated video signals, said subtitling data comprising output codes representing lines of said pixel array in compressed form, said apparatus comprising:
means for decompressing associated subtitling data at video rate;
means for assembling said subtitling data as pixel values; and means for combining said pixel values with associated video frames at video rate.
13. Apparatus according to claim 12, wherein said subtitling data is received with a broadcast television signal in the vertical blanking periods of said signal.
14. Apparatus according to claim 12, wherein said subtitling data is received with a digitally encrypted signal in an associated data channel.
15. Apparatus according to claim 14, wherein said digitally encrypted video signal is compressed to reduce video image bandwidth.
16. Apparatus according to claim 15, wherein said video data is read from a local storage medium with said associated subtitling.
17. Apparatus according to claim 15, wherein said video data is received from a cable television system or a satellite system with said associated subtitling.
18. A method of generating subtitling data for association with video signals, comprising generating a representation of displayable characters as a pixel array;
producing output codes representing lines of said pixel array in compressed form; and synchronising said output codes with video frames.
19. A method of encoding character strings represented as picture elements, comprising steps of supplying contiguous picture element data as an input address to a look-up table; and analysing said table data read from said look-up table, wherein first table data may be supplied to an output or, in response to said analysis, on detecting table data representing input data having runs of similar data extending beyond the input address, new address data is selected to produce a new table data comprising an output code representing a plurality of input picture elements larger than a said input address.
20. A method of adding subtitling data to a video image, wherein said subtitling data is received with associated video signals, said subtitling data comprising output codes representing lines of said pixel array in compressed form, the method comprising:
decompressing associated subtitling data at video rate;
assembling said subtitling data as pixel values; and combining said pixel values with associated video frames at video rate.
CA 2192439 1994-06-09 1995-06-09 Subtitling video data Abandoned CA2192439A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9411615.9 1994-06-09
GB9411615A GB2290430B (en) 1994-06-09 1994-06-09 Subtitling video data

Publications (1)

Publication Number Publication Date
CA2192439A1 true CA2192439A1 (en) 1995-12-14

Family

ID=10756500

Family Applications (1)

Application Number Title Priority Date Filing Date
CA 2192439 Abandoned CA2192439A1 (en) 1994-06-09 1995-06-09 Subtitling video data

Country Status (4)

Country Link
AU (1) AU2678595A (en)
CA (1) CA2192439A1 (en)
GB (2) GB2290430B (en)
WO (1) WO1995034165A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2296148A (en) * 1994-11-28 1996-06-19 Snell & Wilcox Ltd Retaining sub-titles in a wide screen television display
US8599064B2 (en) * 2008-02-29 2013-12-03 Honeywell International Inc. Systems and methods for radar data communication

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4264933A (en) * 1977-07-27 1981-04-28 Canon Kabushiki Kaisha Method and apparatus for facsimile recording
JPS5951653A (en) * 1982-09-17 1984-03-26 Fujitsu Ltd Data compression coding processing system
US4610027A (en) * 1983-12-30 1986-09-02 International Business Machines Corporation Method for converting a bit map of an image to a run length or run end representation
US4646356A (en) * 1984-06-29 1987-02-24 International Business Machines Corporation Method for converting a bit map of an image to a run length or run end representation
JPS61147677A (en) * 1984-12-21 1986-07-05 Mitsumi Electric Co Ltd Superimposing device
JP2637821B2 (en) * 1989-05-30 1997-08-06 シャープ株式会社 Superimpose device

Also Published As

Publication number Publication date
GB2290430B (en) 1998-08-05
GB2304016A (en) 1997-03-05
GB2290430A (en) 1995-12-20
GB9411615D0 (en) 1994-08-03
WO1995034165A1 (en) 1995-12-14
AU2678595A (en) 1996-01-04
GB9625754D0 (en) 1997-01-29
GB2290430A8 (en) 1998-05-21

Similar Documents

Publication Publication Date Title
US6493036B1 (en) System and method for scaling real time video
US4393376A (en) Teletext interface for digital storage medium having synthetic video generator
EP1079610B1 (en) Moving-picture processing method, and apparatus therefor
KR100378538B1 (en) Method and apparatus for selectively changing program guide format by a viewer
KR930001679B1 (en) Televison receiver with teletext receiving function and method of superimposing teletext picture on television picture
US5969770A (en) Animated &#34;on-screen&#34; display provisions for an MPEG video signal processing system
USRE39003E1 (en) Closed caption support with timewarp
US5781687A (en) Script-based, real-time, video editor
KR100375800B1 (en) Animated &#34;on-screen&#34; display provisions for an video signal processing system
MY120196A (en) On screen display arrangement for a digital video signal processing system
JP3472667B2 (en) Video data processing device and video data display device
JPH07298223A (en) Caption information receiver
JP2002511998A (en) Encoding and decoding of pixel color values
CA2192439A1 (en) Subtitling video data
EP0987656A2 (en) Method of graphics data compression
US7526186B2 (en) Method of scaling subpicture data and related apparatus
JP2988584B2 (en) Character generator that displays characters with shading on the display screen
US8265461B2 (en) Method of scaling subpicture data and related apparatus
US6711305B2 (en) Image processing apparatus and method
KR100188273B1 (en) Pop-on scroll method of caption broadcasting televiewer option
KR19980047446A (en) Video signal converter from high definition television (HDTV) system to analog television broadcasting system
JPH0159794B2 (en)
JPH08251558A (en) Display control circuit
JPS60240290A (en) Teletext picture file device of teletext receiver
JPS611186A (en) Television receiver

Legal Events

Date Code Title Description
FZDE Dead