MXPA96003752A - Apparatus using memory control tables related to video graphics processing for television receivers - Google Patents

Apparatus using memory control tables related to video graphics processing for television receivers

Info

Publication number
MXPA96003752A
MXPA96003752A MXPA/A/1996/003752A MX9603752A
Authority
MX
Mexico
Prior art keywords
graphics
list
moving image
data
previously determined
Prior art date
Application number
MXPA/A/1996/003752A
Other languages
Spanish (es)
Other versions
MX9603752A (en)
Inventor
Donald S. Butler
Richard S. Amano
Original Assignee
General Instrument Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/523,394 (US5835103A)
Application filed by General Instrument Corporation
Publication of MXPA96003752A
Publication of MX9603752A

Abstract

The present invention relates to an apparatus for processing mixed video and graphics signals for display on a standard television receiver, comprising: a graphics memory comprising: a moving image list table that lists a plurality of graphics, in a predetermined sequence, to be displayed on a television receiver and that stores general information related to said graphics in control words in each listing; a moving image data table that stores pixel data for the horizontal lines of each of said graphics, wherein the horizontal lines in the moving image data table for each of said graphics are accessed by a control word in the listing in the moving image list table for that graphic; and a line control table comprising control words that are accessed by a control word in the listing of predetermined ones of said graphics in the moving image list table, to provide independent controls for selectively relocating the pixel data in each of the horizontal lines obtained from the moving image data table, so as to produce a predetermined special effect for each of the predetermined ones of said graphics; and a memory controller and moving image state machine for accessing the graphics memory tables in a predetermined sequence, in order to assemble and superimpose each of said graphics, at predetermined locations, on the horizontal lines of a received video signal being displayed on the television receiver's screen.
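The three tables named in the abstract can be modeled as simple data structures. The sketch below is illustrative only; the field names, widths, and the interpretation of the control words are assumptions made for the example, not details taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SpriteListEntry:
    """One listing in the moving image list table (fields are hypothetical)."""
    x: int                            # horizontal start position on screen
    y: int                            # first horizontal line occupied
    height: int                       # number of horizontal lines
    data_addr: int                    # control word: start of pixel data in the data table
    line_ctrl_addr: Optional[int]     # control word: entry in the line control table, if any

@dataclass
class GraphicsMemory:
    sprite_list: list = field(default_factory=list)      # moving image list table
    sprite_data: dict = field(default_factory=dict)      # addr -> one line of pixel data
    line_control: dict = field(default_factory=dict)     # addr -> per-line pixel offsets

def assemble_line(mem, screen_line, width=704):
    """Assemble one horizontal line by overlaying each active graphic in list order."""
    line = [0] * width                                   # background (e.g. live video placeholder)
    for s in mem.sprite_list:
        if not (s.y <= screen_line < s.y + s.height):
            continue
        row = screen_line - s.y
        pixels = mem.sprite_data[s.data_addr + row]      # accessed via the listing's control word
        shift = 0
        if s.line_ctrl_addr is not None:                 # line control table: per-line relocation
            shift = mem.line_control[s.line_ctrl_addr][row]
        for i, p in enumerate(pixels):
            x = s.x + shift + i
            if 0 <= x < width:
                line[x] = p
    return line
```

Shifting each fetched line by a per-line offset from the line control table is what produces effects such as slanting or warping a graphic, since each horizontal line can be relocated independently.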

Description

APPARATUS USING MEMORY CONTROL TABLES RELATED TO VIDEO GRAPHICS PROCESSING FOR TELEVISION RECEIVERS Cross-Reference to Related Applications This invention relates to the following applications, all of which are assigned to the assignee of the present invention, have common inventors, and are being filed concurrently: U.S. Patent Application Serial No. (GID872), entitled "Method and Apparatus for Performing Two Dimensional Video Convolving"; U.S. Patent Application Serial No. (GID906), entitled "Apparatus For Processing Mixed YUV and Color Palettized Video Signals"; and U.S. Patent Application Serial No. (GID907), entitled "Video Magnification Apparatus". Field of the Invention The present invention relates to an apparatus that uses memory control tables in a graphics memory to process mixed YUV and palettized color video signals to produce desired special effects. A memory controller and moving image state machine is used with the graphics memory to selectively display the graphics video signals alone, or to selectively overlay these mixed video signals on live television signals received from a remote source. BACKGROUND OF THE INVENTION Some commercially available computers, particularly personal computers, provide circuits that allow a composite video signal, such as a National Television System Committee (NTSC) signal, to be combined with computer-generated graphics display signals, typically red, green, and blue (RGB). More particularly, modern video graphics equipment has the ability to produce backgrounds, characters, symbols, and other representations of images and configurations in sizes, shapes, and colors selected by the operator. U.S. Patent No. 4,737,772 (Nishi et al.), issued April 12, 1988, discloses a video display controller comprising a video display processor (VDP), a central processing unit (CPU), a memory, and a video random access memory (VRAM).
The memory stores the programs to be executed by the central processing unit and several kinds of image data. The video random access memory stores image data which the video display processor can modify and then transfer out for display on a Cathode Ray Tube (CRT) display screen. In the video display processor, a synchronization signal generator generates synchronization signals to correctly trace the elements of the image to be displayed; these signals are used by horizontal and vertical counters and by the cathode ray tube display to synchronize the data processing in an image data processing circuit (IDPC) and to correctly display the processed data on the cathode ray tube display screen. A video digitizer samples an externally supplied analog video signal and converts the signal levels or amplitudes of the analog video signal into digital data, each consisting of 2 or 4 bits. The amplitude-digitized output data of the video digitizer represents a still image, and this data is supplied to the image data processing circuit. The image data processing circuit selectively stores both the video digitizer output data and the color codes supplied from the central processing unit in the video random access memory via an interface circuit. Each color code from the central processing unit represents a respective color of the display elements (e.g., pixels) constituting a still image on the screen. During operation, in response to a display command from the central processing unit, the image data processing circuit sequentially reads dot data from the video random access memory in synchronization with the tracing position on the cathode ray tube display, and outputs the dot data to a color palette circuit.
At the same time, the image data processing circuit calculates and reads from the video random access memory the data necessary to display an animated image and supplies color codes to the color palette circuit. When an animated image and a still image are located at the same display position on the cathode ray tube screen, the animated image is displayed preferentially. The color palette circuit converts each color code into three color data for red (R), green (G), and blue (B), each composed of three bits. A Digital-to-Analog Converter (DAC) converts the color data from the color palette circuit into red, green, and blue signals that are provided to the cathode ray tube display. U.S. Patent No. 5,355,175 (Okada et al.), issued October 11, 1994, discloses a video mixing apparatus that mixes a graphics video image and a background video image in a plurality of mixing ratios within an image plane. Fading data indicative of the mixing ratio of at least one line of the background video signal and the graphics video signal is sequentially generated according to a predetermined order. The fading data is held in a holding element and output from the holding element synchronously with a horizontal synchronization signal. The levels of the background video signal and the graphics video signal are adjusted individually according to the fading data output from the holding element, and the adjusted signals are added together. The background video signal and the graphics video signal are thus mixed in the mixing ratio set for each line in an image plane to generate a video output signal from the apparatus. U.S. Patent No. 4,420,770 (Rahman), issued December 13, 1983, describes a video background generation system for generating rectangular video patterns having operator-selected video attributes.
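The per-line mixing described in the Okada patent amounts to a weighted sum of the two video signals. A minimal numeric sketch, assuming 8-bit samples and a fade value normalized to the range 0-255 (both are assumptions for illustration, not details from the patent):

```python
def mix_line(background, graphics, fade):
    """Mix one line of background and graphics video in the ratio given by
    the line's fade value (0 = all background, 255 = all graphics).

    Each signal's level is adjusted individually, then the two are added,
    matching the adjust-then-sum structure described for the mixer."""
    mixed = []
    for b, g in zip(background, graphics):
        m = (b * (255 - fade) + g * fade) // 255
        mixed.append(m)
    return mixed
```

Because a fresh fade value can be latched for each horizontal line, the blend ratio can vary line by line within a single image plane, which is what produces vertical fade and wipe effects.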
The system comprises a horizontal bit memory and a vertical bit memory, each memory being a 16-entity memory storing information for 16 background entities. The memory for each background entity defines opposite corners of that entity's background area on the screen. As shown in Figure 2 of the patent, a first entity defines a first rectangular area and a second, higher-priority entity defines a second, partially overlapping rectangular area. An attribute lookup table stores, for each entity, information related to the color video output (red, green, blue) for that entity. During the line tracing of an image, the first entity is produced in its defined area and the second entity is produced in its defined area. However, the second entity has the higher priority, so the overlapping region of the two entities is displayed with the stored attributes of the second entity. U.S. Patent No. 4,754,270 (Murauchi), issued June 28, 1988, discloses a digitized display apparatus that is capable of enlarging or reducing the size of an image displayed on the screen of a dot-scanning type display such as a CRT display. The apparatus comprises an addressable memory element, a data entry element, and a variable address data generation element. The addressable memory element stores display data that is extracted in a predetermined synchronization relationship with the dot-by-dot scanning of the display to produce a display image. The data entry element provides numerical data that determines the size of the image. The variable address data generation element uses variable addressing increments to generate address data that correlates with the display data addresses stored in the memory element to produce the display data.
The variable address data generation element comprises an arithmetic calculation element for digitally calculating the addressing increments in response to the numerical data supplied by the data entry element. Still further, in response to timing signals related to the tracing of the display, the variable address data generation element responds to the arithmetic calculation element by incrementing addresses to address the memory element according to the numerical data that determines the size of the image. More particularly, when an original-sized image is displayed, a horizontal address of the memory element is incremented by "1" every 200 nanoseconds. In other words, at the original size, one dot in the horizontal direction of the display has a display time of 200 nanoseconds. The dot size in the horizontal direction can be enlarged or reduced by changing the display time of a dot in the horizontal direction through appropriate selection of the addend data that is provided to the memory element. By properly setting the addend data, the size of the characters and associated images on the display screen can be enlarged or reduced with respect to a nominal size. Currently, a need is emerging for interactive video graphics that will enable a whole new class of services to be provided in the home through a cable television network. These new services will enhance the viewing experience for many traditional television programs while providing enhanced services for others. Nevertheless, NTSC and Phase Alternating Line (PAL) television receivers, unlike computer monitors, have a fairly low video bandwidth and employ an interlaced display, not a progressive scan. These limitations place severe restrictions on the generation of an artifact-free, high-resolution synthetic video signal.
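The variable addressing increments in the Murauchi patent can be illustrated with a short sketch. The fixed-point width and the scale-factor encoding below are assumptions made for the example, not details from the patent:

```python
def scaled_addresses(num_dots, scale):
    """Generate memory addresses for one scanned line when the image is
    scaled horizontally. scale < 1 enlarges (each stored dot is held on
    screen for more dot times); scale > 1 reduces (stored dots are skipped).

    A 16.16 fixed-point accumulator stands in for the arithmetic
    calculation element that computes the addressing increments."""
    step = int(scale * (1 << 16))   # addressing increment per displayed dot
    acc = 0
    addrs = []
    for _ in range(num_dots):
        addrs.append(acc >> 16)     # integer part selects the stored dot
        acc += step
    return addrs
```

At `scale=1.0` the address advances by one per 200-nanosecond dot time, reproducing the original-size case described above; other scale values stretch or compress the image by repeating or skipping stored dots.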
Traditionally, consumer products such as video games avoid these problems by generating low-resolution, non-interlaced video signals. This approach results in images that are of low quality, have a "blocky" appearance, are limited in their choice of colors, and have a cartoon-like look. Generating synthetic video of broadcast quality requires that the synthesized signals emulate those of a video camera scanning a scene, together with the subsequent analog signal processing applied to those video camera signals. Therefore, it is desirable to provide a relatively inexpensive configuration that uses memory tables to store and process many graphics (moving images) in various configurations and special effects, while allowing good synthetic video graphics to be superimposed one on top of another, or on top of live television programming, for viewing on a standard NTSC or PAL interlaced television receiver. SUMMARY OF THE INVENTION The present invention is directed to an inexpensive apparatus that uses memory control tables to process mixed YUV and color-palettized graphics (moving image) video signals, to produce desired special effects, and to selectively display these graphics video signals alone or selectively overlay them on live television signals received from a remote source. Viewed from one aspect, the present invention is directed to an apparatus for processing mixed video and graphics signals for display on a standard television receiver, comprising a graphics memory and a memory controller and moving image state machine. The graphics memory comprises a moving image list table, a moving image data table, and a line control table. The moving image list table lists one or more graphics in a predetermined sequence for display on the television receiver, and stores general information related to the one or more graphics in control words in each listing.
The moving image data table stores pixel data for the horizontal lines of each of the graphic or graphics, wherein the horizontal lines in the moving image data table for each of the graphics are accessed through a control word in the listing in the moving image list table for each of the graphics. The line control table comprises control words that are accessed by a control word in the listing of predetermined ones of the graphics in the moving image list table. The control words of the line control table for a graphic provide independent controls for selectively relocating the pixel data on each of the horizontal lines obtained from the moving image data table, to produce a predetermined special effect for each of the predetermined graphics. The memory controller and moving image state machine accesses the graphics memory tables in a predetermined sequence to assemble and display each of the graphic or graphics at predetermined locations on the horizontal lines of a screen of the television receiver. Viewed from another aspect, the present invention is directed to an apparatus for processing mixed video and graphics signals for display on a standard television receiver, which comprises a graphics memory and a memory controller and moving image state machine. The graphics memory comprises a moving image list table, a moving image data table, and an extension list table. The moving image list table lists one or more graphics in a predetermined sequence to be displayed on the television receiver and stores general information related to the graphic or graphics within control words in each listing. The moving image data table stores pixel data for the horizontal lines of each of the graphics. The horizontal lines in the moving image data table for each of the graphic or graphics are accessed by means of a control word in the listing in the moving image list table for each of the graphics.
The extension list table comprises at least one extension list control word for a predetermined number of horizontal lines forming each one of a plurality of predetermined separate sections of the television receiver screen. The extension list control word or words define which of a plurality of N graphic listings in the moving image list table are active and appear in the associated predetermined section. The memory controller and moving image state machine first accesses the at least one extension list control word in the extension list table when assembling a predetermined section of the television receiver screen. Then, the memory controller and moving image state machine accesses only those graphic listings in the moving image list table and the moving image data table that are indicated as active and appearing by the at least one extension list control word. Viewed from still another aspect, the present invention is directed to an apparatus for processing mixed video and graphics signals for display on a standard television receiver, comprising a graphics memory and a memory controller and moving image state machine. The graphics memory comprises a moving image list table and a moving image data table. The moving image list table lists one or more graphics in a predetermined sequence to be displayed on the television receiver. The moving image list table also stores general information related to the graphic or graphics in control words in each listing. The moving image data table stores pixel data for the horizontal lines of each of the graphics. The horizontal lines in the moving image data table for each of the graphic or graphics are accessed by means of a control word in the listing in the moving image list table for each of the graphics.
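The extension list table described above acts as a per-section filter over the full graphic list, so that when assembling a section of the screen the state machine only touches listings that can actually appear there. A minimal sketch, assuming each extension list control word is a simple N-bit active mask (an assumption made for illustration):

```python
def active_listings(extension_word, sprite_list):
    """Return only the moving image list entries whose bit is set in the
    extension list control word for the current screen section."""
    return [s for i, s in enumerate(sprite_list) if extension_word & (1 << i)]
```

For example, with three listings and the hypothetical mask `0b101`, only the first and third listings would be fetched while the section is assembled; the second listing's data table entries are never read, which is the memory-bandwidth saving the extension list provides.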
The memory controller and moving image state machine responds to field-enable signals from the moving image list table, indicating which field of a two-field frame of a video image is being displayed on a television receiver screen, by accessing and assembling a predetermined first graphic listed in the moving image list table for the horizontal lines of only one of the two fields, and a predetermined second graphic listed in the moving image list table, or a live television signal, for the horizontal lines of the other of the two fields. The invention will be better understood from the following more detailed description taken together with the accompanying drawings. BRIEF DESCRIPTION OF THE DRAWINGS Figure 1 is a block diagram of a subscriber cable box unit in accordance with the present invention. Figure 2 is a block diagram of a first portion of the video processing circuit found in the subscriber cable box unit of Figure 1 according to the present invention. Figure 3 is a block diagram of a second portion of the video processing circuit found in the subscriber cable box unit of Figure 1 according to the present invention. Figures 4, 5, and 6 illustrate the operation of a Pixel Assembly Buffer that is part of the first portion of the video processing circuit of Figure 2 according to the present invention. Figure 7 is a block diagram of an exemplary multiplexer/fader that is part of the second portion of the video processing circuit shown in Figure 3. Figure 8 is a block diagram of an exemplary configuration of a convolver that is part of the second portion of the video processing circuit shown in Figure 3. Figure 9 is a block diagram of a Graphics Memory comprising tables, and the first portion of the video processing circuit shown in Figure 2, according to the present invention.
Figures 10, 11, 12, 13, 14, and 15 show various configurations that can be achieved using the tables of the Graphics Memory shown in Figure 9 according to a first embodiment of the present invention. Figure 16 is an exemplary view of a display screen using an extension list table found in the Graphics Memory shown in Figure 9 according to a second embodiment of the present invention. And Figure 17 is an exemplary section of an interlaced television receiver screen wherein first and second moving images are interleaved in a portion of the screen according to a third embodiment of the present invention. Detailed Description It will be understood that corresponding elements that perform the same function in each of the figures have been given the same designation number. Referring now to Figure 1, there is shown a block diagram of a subscriber cable box unit 10 that can be found at a subscriber's location and provides interactive video processing in accordance with the present invention. The subscriber cable box unit 10 comprises a first module (MODULE 1) 12 (shown within a first dotted-line rectangle) and a second module (MODULE 2) 14 (shown within a second dotted-line rectangle). The first module 12 is a conventional arrangement comprising a Radio Frequency (RF)-to-baseband converter 20 and a converter control system 22, which are known in the art. The RF-to-baseband converter 20 receives multiplexed radio frequency television channel signals in the standard NTSC or PAL format propagating on a cable 27 from a remote cable company head end (not shown) and selectively converts the multiplexed radio frequency television channel signals from their multiplexed channel frequencies to baseband frequencies. The RF-to-baseband converter 20 transmits the resulting baseband composite video output signal from the conversion process on a bus 24 to the second module 14.
The converter control system 22 is typically controlled by the user (subscriber), either by an infrared remote control device or by a keypad on the cable box, as is well known in the art. The converter control system 22 functions to receive and/or transmit authorization and access control signals via the cable 27 to or from the remote cable company head end, to activate baseband video scrambling or descrambling, and to produce On-Screen Display (OSD) messages. The converter control system 22 outputs control signals via a bus 29 to the RF-to-baseband converter 20 to select the desired channel programming, and various decoded control and data signals (e.g., upstream data and control output, infrared receive and transmit signals, and demodulated T1 quadrature phase shift keying data signals) via conductors 31 and 33 to the second module 14. The second module 14 comprises a Serial Interface Processor (SIP) 30, input/output (I/O) devices 32, a Read Only Memory (ROM) 34, a Random Access Memory (RAM) 35, a Central Processing Unit (CPU) 36, a Graphics Memory 38, and a memory and video control integrated circuit (VIDEO & MEM. CONT.) 40 (shown within a dotted-line rectangle). The serial interface processor 30, the input/output devices 32, the read only memory 34, the random access memory 35, the central processing unit 36, and the memory controller and moving image state machine (MACH.) 42 of the memory and video control integrated circuit 40 are interconnected by a data bus 48. The central processing unit 36 can comprise any convenient processing unit and, in accordance with the present invention, is a 386-type central processing unit, which is relatively inexpensive. The read only memory 34 may comprise any convenient memory such as, for example, an Erasable Programmable Read Only Memory (EPROM) for initialization purposes and for the programming of the central processing unit 36.
The random access memory 35 can comprise any convenient memory, such as two 256-kilobyte-by-16-bit Dynamic Random Access Memories (DRAMs) connected in series to provide a 512-kilobyte-by-16-bit random access memory configuration for use as scratch memory for the central processing unit 36. The graphics memory 38 may comprise any convenient memory such as, for example, a 32-bit-wide random access memory or, preferably, two 256-kilobyte-by-16-bit dynamic random access memories arranged in parallel for use with a 32-bit-wide bus. The graphics memory 38 is used to store moving image data related to graphics and video images. The use of a 32-bit-wide bus 39 allows the use of fast page mode memory addressing for both the memory controller and moving image state machine 42 and a block memory mover (not shown) that are part of the memory and video controller 40. Through significant use of block-mode memory addressing, an average data transfer rate of approximately 52 nanoseconds can be achieved, which corresponds to processing approximately 77 million bytes of data per second. The serial interface processor 30 operates to handle data communications between the first module 12 and the second module 14. More particularly, the serial interface processor 30 handles all data transfer signals between the second module 14 and the converter control system 22 of the first module 12. These data transfer signals may have formats such as, for example, a T1-like data stream at 1.5 Mbits/second that carries the bulk of the communication transfers, and raw data from an infrared receiver (not shown) in the converter control system 22. The serial interface processor 30 may also include a full-duplex synchronous serial port (not shown) for future expansion.
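The two quoted transfer figures are mutually consistent: one 32-bit (4-byte) transfer every 52 nanoseconds works out to roughly 77 million bytes per second. A quick arithmetic check:

```python
bytes_per_transfer = 4      # 32-bit-wide graphics memory bus
cycle_s = 52e-9             # average transfer time of ~52 nanoseconds
rate = bytes_per_transfer / cycle_s
print(round(rate / 1e6))    # ~77 million bytes per second
```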
These data transfer signal formats are used for communication between the converter control system 22 in the first module 12 and the central processing unit 36 in the second module 14 to activate the desired actions in the second module 14. The memory and video control integrated circuit 40 comprises the memory controller and moving image state machine 42, the composite-to-YUV circuit 44, and the video processing circuit (PROC.) 46. The memory controller and moving image state machine 42 is coupled to the graphics memory 38 via a data bus 39, and to the video processing circuit 46 via a data bus 45. The composite-to-YUV circuit 44 receives the baseband composite video signal from the bus 24 and outputs the resulting YUV video signals to the memory controller and moving image state machine 42 over a bus 43. The video processing circuit 46 receives the video signals from the memory controller and moving image state machine 42 on the data bus 45, and outputs standard NTSC or PAL video signals on a bus 47 to a remote television receiver (not shown) or to additional processing circuits (not shown). It will be understood that the present invention lies within the area of the memory and video control integrated circuit 40 and the graphics memory 38. The elements of the first module 12 and the second module 14 have been presented and discussed hereinabove for a better understanding of how the present invention fits into the interactive subscriber cable box unit 10. Referring now to Figures 2 and 3, there are shown block diagrams of the first and second portions, respectively, of the video processing circuit 46 of the memory and video controller 40 (shown in a dotted-line area) found in the second module 14 of the subscriber cable box unit 10 of Figure 1 according to the present invention.
As shown in Figure 2, the graphics memory 38, which forms an element of the second module 14 of Figure 1, is coupled by a data bus 39 to the memory controller and moving image state machine 42, which is part of the memory and video controller 40 (shown within a dotted-line area) of the second module 14 of Figure 1. A first portion of the video processing circuit 46 comprises a data pipe 50 and a pixel assembly buffer 52 (shown within a dotted-line rectangle). The data pipe 50 receives, on a bus 45, data that was obtained by the memory controller and moving image state machine 42 from the graphics memory 38 for the transmission of a particular moving image to the pixel assembly buffer 52. More particularly, the data pipe 50 receives data for a moving image from the graphics memory 38 via the memory controller and moving image state machine 42 and provides separate outputs as luminance data (ydata) and chrominance data (cdata) for transmission to the pixel assembly buffer 52. The pixel assembly buffer 52 comprises first, second, and third double line buffers 53, 54, and 55, respectively, and a Y/G Line 0 buffer 58. The first double line buffer 53 is used to store the line 1a Y/G (luminance) data and the line 1b Y/G data for the first and second lines of a first field of a moving image received via a bus 49 from the data pipe 50. The line luminance data comprises 10 bits (bits 9-0) of data and control for each pixel of a line. The second double line buffer 54 is used to store the line 1a C (chroma) data and the line 1b C data of the first and second lines of the first field of the moving image received via a bus 51 from the data pipe 50.
The line chrominance data comprises 8 bits (bits 7-0) of data for each pixel of a line. The third double line buffer 55 is used to store the line 2a Y/G (luminance) data and the line 2b Y/G data of the first and second lines of the second field of the moving image received via the bus 49 from the data pipe 50. The line luminance data comprises 10 bits (bits 9-0) of data and control for each pixel of a line. It will be understood that the lines 1a and 2a of the first and third double line buffers 53 and 55 store the first and second horizontal lines of pixel data, respectively, where the first and second horizontal lines are adjacent lines within separate fields of the moving image in an interlaced display format. Similarly, the lines 1b and 2b of the first and third double line buffers 53 and 55 store the third and fourth horizontal lines of pixel data, respectively, where the third and fourth horizontal lines are adjacent lines in separate fields of the moving image in an interlaced display format. In other words, the first and third double line buffers 53 and 55 sequentially store luminance data and control for, for example, the pixels of a pair of odd and even lines, respectively, of the respective first and second fields, or vice versa, of the moving image during a scan of an interlaced display format. The second double line buffer 54 stores the chroma data for the line data stored in the double line buffer 53. A chroma double line buffer (not shown) similar to the double line buffer 54 could be provided for the double line buffer 55, but it is not necessary, for reasons of economy and its lack of importance in a convolver that will be explained later herein.
The output data of the first double line buffer 53 comprises ten bits of luminance and control data for each pixel of the lines stored therein, which is output in parallel for each pixel on a bus designated LINE 1 to the circuit of Figure 3. The output data of the second double line buffer 54 comprises eight bits of chrominance data and control for each pixel of the lines stored therein, which is output in parallel for each pixel on a bus designated LINEA to the circuit of Figure 3. The output data of the third double line buffer 55 comprises ten bits of luminance data and control for each pixel of the lines stored therein, which is output in parallel for each pixel on a bus designated LINE 2 to the circuit of Figure 3 and to the Y/G Line 0 buffer 58. The Y/G Line 0 buffer 58 functions to delay the line data output by the third double line buffer 55 by one horizontal line period to provide a delayed line output, comprising ten bits of luminance data and control for each pixel of the line stored therein, which is output in parallel on a bus designated LINE 0 to the circuit of Figure 3. It will be understood that at a sampling rate of 13.5 MHz for the standard NTSC television display, there are 858 pixels per image line, of which only around 704 pixels are actually displayed, and that there are 525 horizontal lines of pixels in the two fields of an image, of which approximately 440-500 lines are normally seen, depending on the television receiver used. Turning now to Figures 4, 5, and 6, there is shown an exemplary sequence of operation for the first and third double line buffers 53 and 55, respectively, and the Y/G Line 0 buffer 58 of the pixel assembly buffer 52 of Figure 2 according to the present invention.
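The figure of 858 pixels per line follows directly from the 13.5 MHz sampling rate and the NTSC horizontal line rate of approximately 15,734 lines per second (the line rate is standard background knowledge, not stated in the text). A quick check:

```python
sample_rate = 13.5e6             # pixel samples per second
line_rate = 525 * 30000 / 1001   # NTSC: 525 lines at ~29.97 frames/second
pixels_per_line = sample_rate / line_rate
print(round(pixels_per_line))    # -> 858
```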
It will be understood that in normal operation of the double-line buffer zones 53 and 55, a horizontal line of pixel data from a first field is input into one half of the first double-line buffer zone 53 at the same time that a horizontal line of pixel data from a second field is input into one half of the third double-line buffer zone 55. Concurrently with the input of the horizontal lines of pixel data into the first halves of the first and third double-line buffer zones 53 and 55, the horizontal lines of pixel data previously stored in the other halves of the first and third double-line buffer zones 53 and 55 are output on LINES 1 and 2, respectively. In other words, for the first double-line buffer zone 53, a first horizontal line of pixel data from a first field is input into, for example, the line 1a Y/G portion of the first double-line buffer zone 53, and during a subsequent horizontal line period, a second horizontal line of pixel data from the first field of a frame is input into the line 1b Y/G portion of the first double-line buffer zone 53 while the first horizontal line of pixel data is output from the line 1a Y/G portion on output LINE 1. During a following horizontal line period, a third horizontal line of pixel data from the first field is input into the line 1a Y/G portion of the first double-line buffer zone 53 while the second horizontal line of pixel data is output from the line 1b Y/G portion on output LINE 1. At the same time, the first, second, and third horizontal lines of pixel data of a second field of the frame are similarly input into the third double-line buffer zone 55 and are output therefrom on output LINE 2. Figure 4 shows an end point of an initialization stage of the pixel assembly buffer 52 after the subscriber cable box unit 10 of Figure 1 is first turned on.
More particularly, at power-up, the pixel data for a horizontal line 0 of a first field and the pixel data for a horizontal line 1 of a second field are input into the line 1a Y/G portion of the first double-line buffer zone 53 and into the line 2a Y/G portion of the third double-line buffer zone 55, respectively, during a first horizontal line period. During a second horizontal line period, the pixel data for a horizontal line 2 of the first field and the pixel data for a horizontal line 3 of the second field are input into the line 1b Y/G portion of the first double-line buffer zone 53 and into the line 2b Y/G portion of the third double-line buffer zone 55, respectively, while the pixel data for horizontal lines 0 and 1 are output from the line 1a Y/G portion of the first double-line buffer zone 53 and from the line 2a Y/G portion of the third double-line buffer zone 55, respectively, on the respective output LINES 1 and 2. At the same time, the pixel data for horizontal line 1 from the line 2a Y/G portion of the third double-line buffer zone 55 is input into the Y/G line 0 buffer zone 58. Since the Y/G line 0 buffer zone 58 functions to delay the horizontal line data stored therein by one horizontal line period, and the buffer zone 58 had no data stored therein at initialization, its output on output LINE 0 does not include valid data. Figure 5 continues the loading and output process after the initialization steps shown in Figure 4. More particularly, the pixel data for a horizontal line 4 of the first field and the pixel data for a horizontal line 5 of the second field are input into the line 1a Y/G portion of the first double-line buffer zone 53 and into the line 2a Y/G portion of the third double-line buffer zone 55, respectively, during a third horizontal line period.
At the same time, the pixel data for horizontal lines 2 and 3 are output from the line 1b Y/G portion of the first double-line buffer zone 53 and from the line 2b Y/G portion of the third double-line buffer zone 55, respectively, on their respective output LINES 1 and 2. Also at this time, the pixel data for horizontal line 3 from the line 2b Y/G portion of the third double-line buffer zone 55 is input into the Y/G line 0 buffer zone 58 while the previously stored pixel data for horizontal line 1 is output on output LINE 0. Therefore, the buffer zones 58, 53, and 55 are producing pixel data for horizontal lines 1, 2, and 3 of a moving image, respectively, on the respective output LINES 0, 1, and 2 during the third horizontal line period, where the pixel data for horizontal lines 1 and 3 are part of the second field and the pixel data for horizontal line 2 is part of the first field of a moving image that was stored in the graphics memory 38 (shown in Figures 1 and 2). Figure 6 continues the loading and output process from the step shown in Figure 5. More particularly, the pixel data for a horizontal line 6 of the first field and the pixel data for a horizontal line 7 of the second field of the moving image are input into the line 1b Y/G portion of the first double-line buffer zone 53 and the line 2b Y/G portion of the third double-line buffer zone 55, respectively, during a fourth horizontal line period. At the same time, the pixel data for horizontal lines 4 and 5 of the moving image are output from the line 1a Y/G portion of the first double-line buffer zone 53 and the line 2a Y/G portion of the third double-line buffer zone 55, respectively, on the respective output LINES 1 and 2.
At the same time, the pixel data for horizontal line 5 from the line 2a Y/G portion of the third double-line buffer zone 55 is input into the Y/G line 0 buffer zone 58 while the previously stored pixel data for horizontal line 3 is output on LINE 0. Therefore, the buffer zones 58, 53, and 55 are outputting pixel data for horizontal lines 3, 4, and 5, respectively, on output LINES 0, 1, and 2 during the fourth horizontal line period, where the data for horizontal lines 3 and 5 are part of the second field of the moving image while the data for horizontal line 4 are obtained from the first field of the moving image that was stored in the graphics memory 38 (shown in Figures 1 and 2). From Figures 5 and 6, it can be seen that after initialization (Figure 4), the pixel data on output LINE 1 represents data for sequential horizontal lines (for example, the even-numbered horizontal lines 0-254 of a standard NTSC image) of a first field of the two fields of a frame for an NTSC interlaced display. After the exemplary even-numbered horizontal lines of the first field have been sequentially output on output LINE 1 during sequential horizontal line periods, the output continues with the sequentially numbered horizontal lines (for example, the odd-numbered horizontal lines 1-255 of a standard NTSC image) of the second field of the frame in the manner found in scanning an interlaced display. Although not shown in Figures 4-6, it will be understood that the chrominance data is output on the LINE C output of the second double-line buffer zone 54 shown in Figure 2 at the same time as the associated luminance pixel data for each horizontal line output on LINE 1. Turning now to Figure 3, there is shown a block diagram of a second portion of the video processing circuit 46 found in the subscriber cable box unit 10 of Figure 1, in accordance with the present invention.
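The three-line output sequence described above can be modeled in software. The following sketch (an illustrative model only; the patent describes hardware buffer zones, not code) shows how the LINE 0/1/2 outputs present three vertically adjacent lines per line period, with LINE 0 being LINE 2 delayed by one period as done by the buffer zone 58:

```python
def three_line_windows(field1, field2):
    """Model the LINE 0/1/2 outputs of the pixel assembly buffer.

    field1: even-numbered lines (first field), field2: odd-numbered
    lines (second field), as delivered by the double-line buffer
    zones 53 and 55. LINE 0 is LINE 2 delayed by one line period
    (the Y/G line 0 buffer zone 58), so the first period after
    initialization carries no valid LINE 0 data and is skipped.
    """
    prev_f2 = None  # contents of the line 0 delay buffer 58
    for f1_line, f2_line in zip(field1, field2):
        if prev_f2 is not None:
            # (LINE 0, LINE 1, LINE 2): three vertically adjacent lines
            yield prev_f2, f1_line, f2_line
        prev_f2 = f2_line
```

Feeding in lines 0, 2, 4, 6 (first field) and 1, 3, 5, 7 (second field) yields the triples (1, 2, 3), (3, 4, 5), (5, 6, 7), matching the sequence of Figures 5 and 6.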
The second portion of the video processing circuit 46 comprises a color palette circuit 60, a YC-to-YUV demultiplexer 62, a Multiplexer/Fader (MUX./FADER) 64, a 3:1 multiplexer and control
(3:1 MUX. & CONT.) 66, and a convolver 68. The 10-bit pixel data (bits 9-0) propagated on each of LINES 0, 1, and 2 from the output of the pixel assembly buffer 52 of Figure 2, for corresponding pixels in three adjacent horizontal lines of a moving image, is received at separate inputs of each of the color palette circuit 60, the YC-to-YUV demultiplexer 62, and the 3:1 multiplexer and control 66. More particularly, bits 7-0 of the 10-bit/pixel parallel output of the pixel assembly buffer 52 for each of output LINES 0, 1, and 2 are received at the inputs of the color palette circuit 60 and the YC-to-YUV demultiplexer 62, while bits 9 and 8 of the 10-bit/pixel parallel output of the pixel assembly buffer 52 for each of output LINES 0, 1, and 2 are received at the inputs of the 3:1 multiplexer and control 66. Additionally, the YC-to-YUV demultiplexer 62 receives the chrominance data bits 7-0 output in parallel on the LINE C output of the pixel assembly buffer 52, since the chrominance data is only used when the pixel data of the moving image relates to a true-color moving image signal. More particularly, when the moving image is encoded as a color palette signal, the code itself defines the color, and the chrominance data is not required as it is with a true-color video signal. The color palette circuit 60 functions to detect when the 8 bits (bits 7-0) of pixel data received in parallel on each of output LINES 0, 1, and 2 represent separate codes for particular colors of a color palette, and to convert those color palette codes into an output signal on a bus 61 representing a 24-bit multiplexed YUV color palette signal for the three 8-bit pixel data values received for those three lines. The color palette circuit 60 is a well-known device, and any suitable circuit can be used for it.
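The palette conversion is an indexed-color lookup. A minimal sketch follows; the patent does not specify the palette contents or the packing order of the 24-bit YUV word, so both are assumptions here:

```python
def palette_to_yuv(code, palette):
    """Convert an 8-bit color palette code to a 24-bit YUV value.

    `palette` is a 256-entry table of (Y, U, V) tuples. The actual
    palette contents and the bit packing are hypothetical: Y is
    assumed in bits 23-16, U in bits 15-8, V in bits 7-0.
    """
    y, u, v = palette[code & 0xFF]
    return (y << 16) | (u << 8) | v

# A toy 256-entry palette: entry i maps to (i, 128, 128)
palette = [(i, 128, 128) for i in range(256)]
```

In the described circuit, one such lookup is performed per pixel period for each of LINES 0, 1, and 2, producing the multiplexed 24-bit signal on bus 61.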
The YC-to-YUV demultiplexer 62 detects when the 8 bits (bits 7-0) of data received in parallel for pixels on each of output LINES 0, 1, and 2 of the pixel assembly buffer 52 represent true-color data (for example, a moving image obtained directly from a television image), and also uses the 8-bit chrominance data obtained via the LINE C output of the pixel assembly buffer 52 to generate a true-color 24-bit YUV output signal for the pixels of the three lines for transmission on the bus 63. The Multiplexer/Fader 64 receives, at separate inputs thereof, each of the 24-bit YUV color palette data signals propagating over the bus 61 from the color palette circuit 60, the true-color 24-bit YUV data signals propagating on the bus 63 from the YC-to-YUV demultiplexer 62, and the 24-bit YUV live video signals on a bus 59. The Multiplexer/Fader 64 responds to control signals on a conductor 67 from the 3:1 multiplexer and control 66 to output one of the three input signals (24-bit color palette YUV, true-color 24-bit YUV, or 24-bit live video YUV) received at the Multiplexer/Fader 64 during each pixel period as the digitized mixed YUV output signals on a bus 65. More particularly, the 3:1 multiplexer and control 66 determines from bits 9 and 8 received on output LINES 0, 1, and 2 of the pixel assembly buffer 52 whether the pixel data of the pixel assembly buffer 52 on output LINES 0, 1, and 2 represents color palette data, true-color data, or data (invalid data) for a pixel that is not part of a moving image to be superimposed on a live video signal, in which case the live video signal must be used for that pixel instead of the color palette or true-color data received from the pixel assembly buffer 52.
As a result of this control information obtained from bits 9 and 8 of output LINES 0, 1, and 2 of the pixel assembly buffer 52, the 3:1 multiplexer and control 66 sends control signals over the conductor 67 to the Multiplexer/Fader 64 to select the correct input data for each pixel of an image to be displayed on a remote NTSC or PAL television receiver (not shown). The convolver 68 uses sequential sets of three pixel data values received in the signal from the Multiplexer/Fader 64 on the bus 65 to provide an 8-bit weighted output signal for the pixel data of a central pixel in a 3-by-3 matrix of corresponding pixels in three adjacent lines of a television image, or provides the signal from the Multiplexer/Fader 64 on the bus 65 unchanged as a YUV output signal on a bus 47, depending on the control signals from the 3:1 multiplexer and control 66 on a conductor 69. Referring now to Figure 7, there is shown a block diagram of an exemplary Multiplexer/Fader circuit 64 (shown within a dashed-line rectangle) that comprises a 2:1 multiplexer (MUX.) 72 and a Fader 74 (shown within a dashed-line rectangle). The Fader 74 comprises an A-B adder 75, a signed multiplier (SIGN. MULT.) 77, and an A+B adder 78. The 2:1 multiplexer 72 receives each of the graphics data signals from the color palette circuit 60 on the bus 61 at a first input terminal (A), and the graphics data signals from the YC-to-YUV demultiplexer 62 on the bus 63 at a second input terminal (B). A control signal on the conductor 67 from the 3:1 multiplexer and control 66 selects which of the two graphics input signals (from input terminal A or B) will be output from the 2:1 multiplexer 72 at the output terminal (O). The pixel graphics output signals (Y, U, or V) from the output terminal (O) of the 2:1 multiplexer 72 (designated G) on a bus 70 are received at a first input terminal (A) of the A-B adder 75 of the Fader 74.
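The per-pixel selection logic can be sketched as follows. This is an illustrative model only: the actual encodings of control bits 9 and 8 are not given in this excerpt, so the constant values below are hypothetical:

```python
# Hypothetical encodings for control bits 9-8; the patent excerpt
# does not give the actual bit values used by the circuit.
INVALID, PALETTE, TRUE_COLOR = 0b00, 0b01, 0b10

def select_pixel(ctrl, palette_yuv, true_color_yuv, live_yuv):
    """Model the 3:1 selection made for each pixel period: graphics
    data when the control bits mark it valid, otherwise live video."""
    if ctrl == PALETTE:
        return palette_yuv
    if ctrl == TRUE_COLOR:
        return true_color_yuv
    # Invalid pixel: not part of a moving image, so the live video
    # signal shows through at this location.
    return live_yuv
```

This models the decision that the 3:1 multiplexer and control 66 signals to the Multiplexer/Fader 64 over the conductor 67.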
A live video YUV signal (Y, U, or V) (designated L) is received from the bus 59 at a second input terminal (B) of the A-B adder 75. The pixel data values of the terminal A input from the 2:1 multiplexer 72, minus the live video YUV pixel data values received at input terminal B, are provided as an output at an output terminal (O) of the A-B adder 75. The signed multiplier 77 receives, for example from a register (not shown), a changeable ratio control value (R) on a bus 71 at a first input terminal (A), and the output of the A-B adder 75 on a bus 76 at a second input terminal (B). The value resulting from multiplying the ratio control value (R) on the bus 71 by the graphics signal output data of the A-B adder 75 on the bus 76 is output at an output terminal (O) onto a bus 79 to a first input terminal (A) of the A+B adder 78. The live video signal (Y, U, or V) on the bus 59 is received at a second input terminal (B) of the A+B adder 78, and the sum of the two input signal values is provided as an output signal (designated Q) on the bus 65 to the convolver 68 (shown in Figure 3). The Fader 74 functions to fade a graphics signal for a moving image in and out so that the graphic does not instantly appear over or disappear from the live video signal. In other words, for fading a graphic in, the Fader 74 causes the graphic to appear with increasing intensity on a television receiver while the video signal decreases in intensity in the area of the graphic over a short period of time until the graphic is fully visible. Similarly, for fading a graphic out, the Fader 74 causes the graphic to appear with decreasing intensity on a television receiver while the live video signal increases in intensity in the area of the graphic over a short period of time until the graphic disappears. The operation of the Fader 74 can be explained according to the following algorithm.
In the following, an exemplary 9-bit fading multiplier (R) on the bus 71 is assumed, where R is the fading control value and varies from 0 to 256. The output is then:

Q = [(R/256) * G] + [(1 - R/256) * L] = L + [(G - L) * R]/256,    Equation 1

where "L" is a pixel value of the live video, "G" is a pixel value of the moving image overlay, and the symbol "*" represents a multiplication function. From Equation 1 above, as the ratio control value R changes, the intensities of the live video and graphics signals change in opposite directions. Referring now to Figure 8, a block diagram of the convolver 68 shown in Figure 3 is shown. The convolver 68 (shown within a dashed-line rectangle) comprises a deflection circuit 80, a convolution circuit 82, and a multiplexer (MUX.) 84. The deflection circuit 80 receives pixel data in sequence from the Multiplexer/Fader 64 (shown in Figures 3 and 7) on the bus 65, and concurrently generates from that data, on buses 81, the data for three pixels in a vertical column of a moving image to be displayed on a television receiver. More particularly, the three pixels are obtained from corresponding pixels in three adjacent lines of both fields of a frame forming a moving image. The three pixel data values are obtained by any suitable arrangement, such as a plurality of delay circuits operating from a pixel clock or three times a pixel clock. The three pixel data values are received via the buses 81 by the convolution circuit 82. The sequential pixel data of the Multiplexer/Fader 64 is received by the deflection circuit 80 on the bus 65, passes through the deflection circuit 80, and is provided to a first input (A) of the multiplexer 84 via a bus 85. Furthermore, the deflection circuit 80 transmits sequential sets of three pixel data values from separate outputs thereof to separate inputs of the convolution circuit 82 on the buses 81.
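Equation 1 can be verified with a short numeric sketch. This models the A-B adder 75, signed multiplier 77, and A+B adder 78 datapath; integer division stands in for the hardware's divide-by-256 (a right shift by 8):

```python
def fade(live, graphic, r):
    """Fader output per Equation 1: Q = L + (G - L) * R / 256.

    r = 0 gives pure live video, r = 256 gives pure graphic, and
    intermediate values mix the two linearly. The // 256 models a
    hardware right shift by 8 bits.
    """
    return live + ((graphic - live) * r) // 256
```

Sweeping r from 0 to 256 over successive frames fades the graphic in; sweeping back down fades it out, with the live video intensity changing in the opposite direction, as the text describes.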
The convolution circuit 82 provides an 8-bit weighted output signal for the pixel data of a central pixel in a 3-by-3 matrix of corresponding pixels in three adjacent lines of a television image to a second input (B) of the multiplexer 84 via a bus 86. The multiplexer 84 selects the signals at the first (A) or second (B) input for transmission to the output terminal (O) and the bus 47 depending on the control signals from the 3:1 multiplexer and control 66 on a conductor 69. As described in the related copending patent application filed on the same date as the present application by the present inventors and incorporated herein by reference, the convolution circuit 82 effectively multiplies (using only adders and delays) the three vertical pixels received on the buses 81, within a 3-by-3 pixel matrix, by predetermined weight values, and provides an averaged output signal for the central pixel of the 3-by-3 matrix to the second input (B) of the multiplexer 84. This process continues for each pixel of a central row (output LINE 1 of the pixel assembly buffer 52 of Figure 2) as the pixel data for the corresponding pixels of the three adjacent lines advances horizontally across the moving image. Referring now to Figure 9, there are shown the graphics memory 38 and a portion of the video processing circuit 46 (shown within a dashed-line rectangle), as shown in Figure 2, forming part of a memory and video controller 40 of the second module 14 of Figure 1.
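The weighted averaging can be sketched as a 3-by-3 convolution. The patent's actual weight values are in the incorporated copending application and are not given here, so the kernel below is a hypothetical center-weighted example whose weights sum to 16 (a power of two, consistent with a shift-and-add hardware implementation):

```python
def convolve_center(window, weights):
    """Weighted average for the central pixel of a 3-by-3 window.

    `window` and `weights` are 3x3 nested lists. The kernel used in
    the example below is hypothetical; it sums to 16 so the divide
    is a cheap right shift by 4 in hardware.
    """
    total = sum(window[r][c] * weights[r][c]
                for r in range(3) for c in range(3))
    return total // 16

# Hypothetical center-weighted kernel summing to 16
KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]
```

One such computation is performed per pixel of the central row (output LINE 1), with the window sliding horizontally across the three adjacent lines.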
The graphics memory 38 is coupled via a data bus 39 to a memory controller and moving image state machine 42, which is part of the video processing circuit portion 46, for bidirectional communication. The memory controller and moving image state machine 42 is coupled to a central processing unit 36 (shown in Figure 1) via a bus 48 for writing into the graphics memory 38 via the bus 39, and receives field signals <1:0> from the composite-to-Y,U,V circuit 44 of Figure 1 via the conductor 56. This portion of the video processing circuit 46 further comprises a data pipe 50 (shown within a dashed-line rectangle) and the pixel assembly buffer 52. The data pipe 50 comprises a pixel buffer address generator 97 and a pixel buffer data pipe 98, each of which receives an output signal from the memory controller and moving image state machine 42 via a bus 45. The pixel buffer address generator 97 and the pixel buffer data pipe 98 transmit separate address and pixel data output signals, respectively, to the pixel assembly buffer 52 via respective buses 49 and 51. The pixel buffer data pipe 98 also receives data on the bus 39 directly from the graphics memory 38. The pixel buffer address generator 97 and the pixel buffer data pipe 98 use pixel data obtained from a moving image control word in a moving image entry, a moving image data table 92, and any other information from a line control table 94 in the graphics memory 38 to place the data for each pixel at the appropriate address location of the double-line buffer zones 53, 54, and 55 (shown only in Figure 2) of the pixel assembly buffer 52. As will be explained in more detail later herein, special effects such as amplification, warp, etc., are obtained
for the horizontal lines of a moving image entry from the moving image control words in the associated moving image entry and from the line control table 94 of the graphics memory 38. The pixel buffer address generator 97 of the data pipe 50 uses this information to suitably alter the address of each pixel in a moving image data line obtained from the moving image data table 92 of the graphics memory 38 to achieve the designated special effect. This altered address is sent to the pixel assembly buffer 52 and used to place the associated pixel data at the pixel location designated by the altered address in the double-line buffer zone 53, 54, or 55 of the pixel assembly buffer 52, in order to subsequently produce the designated special effect on the television screen. The pixel buffer data pipe 98 concurrently receives the pixel data for the pixel address, and transmits the pixel data to the pixel assembly buffer 52 for storage at the address of the double-line buffer zone 53, 54, or 55 generated by the pixel buffer address generator 97. The pixel assembly buffer 52 outputs luminance pixel data for three adjacent horizontal lines of a moving image on the buses designated LINE 0, LINE 1, and LINE 2, and outputs chrominance pixel data associated with the LINE 1 luminance output data on a LINE C bus, as explained hereinabove for the pixel assembly buffer of Figure 2. The graphics memory 38 comprises multiple tables: the moving image list table 90, the moving image data table 92, the line control table 94, and an extension list table 96. The moving image list table 90 comprises a separate memory section for each of a plurality of N moving images (only the entries for moving images #1, #2, and #N are shown).
As shown for the moving image #1 entry, the memory section thereof comprises a moving image data pointer portion, a moving image controls portion, a line table pointer portion, an optional other-controls portion, and a field enable control portion. The moving image data pointer portion is used to access the moving image data table 92 at a predetermined position associated with the moving image entry. The moving image controls portion comprises data relating to, for example, the size of the moving image, its X and Y location or position on the television receiver screen, and information about the amplification, warp, etc., to be performed on the moving image. More particularly, if the moving image controls portion indicates that a moving image has an amplification of 2, then each line of the moving image is amplified by 2. Similarly, if the moving image controls portion indicates that a moving image has an offset of 2, then all lines of the moving image are offset by two. The moving image controls portion affects each horizontal line of pixel data of the moving image in the same way. The line table pointer portion of each moving image entry is used to access a previously determined portion of the line control table 94 for the control words associated with that moving image entry, in order to produce more advanced special effects than those produced by the moving image controls portion mentioned above. Finally, the field enable control portion is used to produce "smoked glass" (transparent overlay) effects on the television receiver screen for the moving image entry accessed in the moving image list table 90. Similar portions are shown for each of the other moving image entries #2 through #N.
Furthermore, the moving images in the moving image list table 90 are preferably listed in ascending priority order where, for example, the moving image #1 entry has the lowest priority and the moving image #N entry has the highest priority. As a result of that priority ordering, the pixels of a moving image with a higher priority override, replace, or take precedence over the pixels of a moving image with a lower priority where two moving images, or a moving image and live video, overlap at a position on the screen of a television receiver. The moving image data table 92 comprises data words that include raw data for each of the pixels of each horizontal line of each moving image entry of the moving image list table 90. In other words, when a moving image entry in the moving image list table 90 is accessed, the moving image data pointer portion directs access to the moving image data table 92, where the moving image data words (for example, moving image data word A through moving image data word C) for that moving image entry are stored in the moving image data table 92 of the graphics memory 38. It should be understood that these moving image data words do not include the number of lines in the moving image, since the size of the moving image, its location on the television receiver screen, etc., are found in the moving image controls portion of the moving image list entry. The moving image list table 90 and the moving image data table 92 work together: the moving image list table 90 is accessed first, and then, under the control of the moving image data pointer portion, the moving image data table 92 is accessed to retrieve the data that tells the video processing circuit 46 to draw the moving image in the manner described by the moving image data words.
The line control table 94 of the graphics memory 38 is an optionally used table containing sub-tables of separate predetermined lengths (only one sub-table including N control words is shown), wherein each sub-table comprises a separate line control word for each line of a moving image. The line control words in the line control table 94 provide independent controls for the horizontal lines of that moving image. More particularly, as mentioned above, the moving image controls portion of each moving image entry in the moving image list table 90 affects each line of a moving image in the same manner. In contrast, the line control words in a sub-table of the line control table 94 for a moving image entry, as indicated by the line table pointer portion of the moving image list table 90, are used to provide independent controls for each of the horizontal lines of that moving image. For example, assume that the moving image controls portion for the moving image #1 entry indicates that moving image #1 includes ten lines at an X and Y position on the television receiver screen without any special effects such as constant offsets. The pixel data for each of the ten lines of moving image #1 is provided in the moving image data table 92 starting at the address indicated in the table by the moving image data pointer portion of the moving image #1 entry. The special effects to be applied to any one or more of the ten lines of moving image #1 are found in the line control words of the line control table 94, starting at the address indicated by the line table pointer portion of the moving image #1 entry. Without them, the edges of moving image #1 would be aligned in a straight line on the screen of the television receiver.
However, with the line control words of the line control table 94, each line of the moving image #1 entry may have, for example, a different offset to warp the image in a predetermined manner. For example, a moving image can be warped, using the associated line control words of the line control table 94, to appear on the outer surface of a cylinder in three dimensions. Referring now to Figures 10, 11, 12, 13, 14, and 15, examples of what can be done with the line control words of the line control table 94 according to a first embodiment of the present invention are shown. More particularly, Figure 10 shows a moving image as defined by both the moving image controls of the moving image list table 90 and the moving image data associated with the moving image entry in the moving image data table 92, with no advanced special effect such as can be introduced by a sub-table of the line control table 94. Figures 11 and 12 show how the moving image of Figure 10 can be changed to produce pseudo-three-dimensional effects by varying the horizontal offsets of each line. For example, in Figures 11 and 12 each line of the moving image of Figure 10 is offset by a separate amount as defined in the line control words of a first and second sub-table, respectively, of the line control table 94. Figure 13 shows an example of performing an advanced warp effect with horizontal amplification line controls on the moving image of Figure 10. More particularly, the line control words of a sub-table of the line control table 94, as indicated by the line table pointer of a moving image entry of the moving image list table 90, define both the amount of warp offset for each edge of the moving image for each line of the moving image and the amount of amplification to be used for each line of the moving image.
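The per-line offset mechanism can be sketched as follows. This is an illustrative software model of what the pixel buffer address generator 97 does when it alters pixel addresses; the sine-based offset table at the end is a hypothetical sub-table producing a wave-like warp:

```python
import math

def apply_line_offsets(image, offsets, width):
    """Apply an independent horizontal offset to each line of a
    moving image, as one sub-table of line control words allows.

    `image` is a list of rows (lists of pixel values); `offsets`
    plays the role of a sub-table with one control word per line.
    None marks screen locations outside the moving image, where
    the live video shows through.
    """
    out = []
    for row, off in zip(image, offsets):
        line = [None] * width
        for x, px in enumerate(row):
            if 0 <= x + off < width:
                line[x + off] = px  # altered address for this pixel
        out.append(line)
    return out

# A hypothetical sine-based offset sub-table: gives a wave warp
wave_offsets = [round(3 * math.sin(y / 2)) for y in range(8)]
```

A constant offset list reproduces what the moving image controls portion alone can do; varying the entries per line produces the Figure 11/12 style pseudo-three-dimensional effects.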
Figures 14 and 15 show an example of an advanced special effect of varying horizontal line offsets to mirror lines of a moving image. More particularly, Figure 14 shows a moving image as it can be defined by an associated moving image entry of the moving image list table 90 and the moving image data table 92. Figure 15 shows how a sub-table of the line control table 94 can change the moving image of Figure 14 by varying the horizontal line offsets in only the upper half of the moving image to produce a mirror image of the lower half of the moving image shown in Figure 14. Other advanced special effects that can be performed with sub-tables of the line control table 94 are, for example, (a) varying a color palette bank of 256 colors in a 4-bit moving image, (b) varying a visibility control to make selected lines of a moving image disappear, and (c) varying horizontal cropping limits to selectively trim around a shape of a moving image. An advantage gained by the use of the line control table 94 is that an advanced special effect created by a particular sub-table can be used by many of the moving image entries in the moving image list table 90. This saves memory space, as opposed to duplicating the sub-table in each of the moving image entries as can be found in the prior art. Furthermore, the data for the same moving image can be used in multiple moving image entries of the moving image list table 90, where each of the multiple moving image entries uses its line table pointer portion to access a different sub-table of the line control table 94. These multiple moving image entries associated with different line control sub-tables are used when displaying the same moving image with different advanced special effects at different places on the television receiver screen.
Returning now to Figure 9, the extension list table 96 is an optional table that is used to save time when processing many moving image entries in the moving image list table 90. In prior art systems, each of the moving image entries in the moving image list table 90 is accessed in sequence to determine whether that moving image exists in a pixel being assembled for a horizontal line, using the size and the X and Y position on the visual display screen designated for that moving image. As a result, prior art systems, for example for playing games, were limited to a small number of moving images (e.g., N = 8 or 16 moving images) in order to assemble the pixels for each line within the period of time necessary to display the horizontal line on the visual display screen. In the present subscriber cable box unit 10, without the optional extension list table 96 being present, the memory controller and moving image state machine 42 normally accesses each of a plurality of N moving image entries listed in the moving image list table 90 of the graphics memory 38 to determine which of the N moving image entries exists in each pixel of the horizontal line being assembled. By accessing each of the N moving image entries of the moving image list table 90, the memory controller and moving image state machine 42 obtains the data from the moving image data table 92 and the optional line control table 94 and performs what is necessary for each moving image to produce the pixel data for each horizontal line assembled in the pixel assembly buffer 52. However, if the memory controller and moving image state machine 42 has to access and process, for example, 96 different moving image entries, the time needed to process the 96 moving images would exceed the period of time allowed to assemble each horizontal line of pixel data in the pixel assembly buffer 52. 
The use of the extension list table 96 solves this problem. When the optional extension list table 96 is used, at least one register (not shown) in the memory controller and moving image state machine 42 indicates that the extension list table 96 exists, and provides all the data necessary for the memory controller and moving image state machine 42 to appropriately use the extension list table 96, including a portion designated "moving image entry words" containing a number (integer value) of words (NW) per moving image list entry that is a constant for each of the moving image entries when the extension list table 96 exists. More particularly, each moving image entry in the moving image list table 90 can include words for (1) a moving image data pointer, (2) moving image controls, (3) an optional line table pointer word, (4) an optional word for optional moving image controls, and (5) an optional field enable control word. Therefore, each moving image entry of the moving image list table 90 nominally contains 2-5 words. When an extension list table 96 is used, each of the moving image entries of the moving image list table 90 includes the same number of words (e.g., 5 words) regardless of which optional words are actually required for each moving image entry. The purpose of the register indicating the number of moving image entry words is to simplify accessing only some of the entries in the moving image list table 90 when the pixels of a horizontal line are constructed. The extension list table 96 comprises an extension list control word, or a group of extension list control words, that describes which of the N moving images found in the moving image list table 90 exists on each line. 
It should be understood that the extension list table is mainly used where there are many moving images (e.g., N = 96 moving images), in order to reduce the processing time in assembling the pixel data for each of the horizontal lines in the pixel assembly buffer 52. The number of words in the extension list table 96 is given by the equation: No. of extension list words = (NS / 32) * (NH / NL), Equation 2, where NS is the number of moving images to appear on the visual display screen, NH is the number of horizontal lines in the display area, NL is the number of television receiver screen lines covered by each extension list word, and 32 represents the exemplary number of bits available in each word of the extension list table 96. The values for NH and NL are programmable numbers, and NL can have a value of, for example, 2, 4, 8, 16, 32 or 128. More particularly, although electrically there are 525 horizontal lines of video in the two fields of a standard NTSC television display, only about 440-500 lines are usually visible, depending on the television receiver used. The display area of the screen where the 96 moving images are to be displayed may vary from 0-500 lines, divided into any predetermined number of sections, where each section has an equal number (NL) of lines. Referring now to Figure 16, there is shown a video display screen portion that is divided by dashed lines into four equal sections 110, 111, 112, and 113, each section having an exemplary number of 32 lines per extension list word (NL), according to a second embodiment of the present invention. Therefore, the total area of the video display screen that is used to display the exemplary 96 moving images covers 128 horizontal lines (4 sections at 32 lines/section). 
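Equation 2 can be checked with a short calculation. The sketch below is illustrative only; it computes the number of extension list words for the running example of the next paragraph: 96 moving images, a 128-line display area, and 32 lines per section.

```python
def extension_list_words(ns, nh, nl, bits_per_word=32):
    """Equation 2: (NS / 32) * (NH / NL) extension list words.

    ns - number of moving images to appear on the screen (NS)
    nh - number of horizontal lines in the display area (NH)
    nl - screen lines covered by each extension list word (NL)
    """
    words_per_section = ns // bits_per_word   # one bit per moving image
    sections = nh // nl
    return words_per_section * sections

print(extension_list_words(96, 128, 32))  # 3 words/section * 4 sections = 12
```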
Still further, a plurality of moving images are shown, where the various predetermined moving images are designated 101, 102, 104, 106, 108, and 109, which for purposes of the description hereinafter represent moving image entries 1, 2, 4, 6, 8, and 9, respectively, in the moving image list table 90. According to Equation 2, the number of extension list words is equal to (96 moving images / 32) times (128 display screen lines (NH) divided by 32 lines per extension list word (NL)), which gives a total of 3 * 4 = 12 extension list words. More particularly, the first three extension list words are associated with section 110 of the video display screen area, the next three extension list words are associated with section 111, the next three with section 112, and the last three with section 113, for a total of twelve extension list words. As shown in section 110 of the visual display screen area, only moving image entries numbered 1, 2, 4, and 6 of the moving image list table are found on any of the 32 lines of the visual display screen that are assembled in the pixel assembly buffer 52 shown in Figure 9. Therefore, the first 32-bit word in the extension list table 96 associated with section 110 appears as 00000000000000000000000000101011, where the rightmost bit is associated with moving image entry #1 and the leftmost bit is associated with moving image entry #32 in the moving image list table 90 of the graphics memory 38. Still further, the "1"s in the extension list word indicate that moving images 1, 2, 4, and 6 are active in section 110. 
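The 32-bit extension list word of section 110 can be modeled as a bit mask, with the rightmost bit standing for moving image entry #1. The following sketch (hypothetical function names, not the patented hardware) encodes and decodes such a word:

```python
def encode_extension_word(active_entries, base=1):
    """Set one bit per active moving image entry; entry `base` is bit 0."""
    word = 0
    for entry in active_entries:
        word |= 1 << (entry - base)
    return word

def decode_extension_word(word, base=1):
    """Return the moving image entries whose bits are set in the word."""
    return [bit + base for bit in range(32) if word & (1 << bit)]

# Section 110: entries 1, 2, 4, and 6 are active.
word = encode_extension_word([1, 2, 4, 6])
print(format(word, "032b"))  # 00000000000000000000000000101011
```

Decoding the word back yields exactly the active entries, which is how the state machine knows which listings to visit for that section.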
The remaining second and third words in the extension list table 96 associated with section 110, covering moving images 33-96, each contain 32 zeros, since none of those moving image entries in the moving image list table 90 is active or appears in section 110. The other nine extension list words, associated with sections 111-113, are encoded in the same way for the moving images that are active or appear in each of those sections. During operation, the memory controller and moving image state machine 42 of Figure 9 determines from one or more registers therein that an extension list table 96 exists, obtains the data (such as NS, NH and NL, the start line of the visual display area, and the number of moving images) stored therein that are needed to use the extension list table 96, and determines the number of extension list words needed for each section of the visual display area according to Equation 2. When assembling the pixel data for the 525 lines of the video display, upon reaching the start line of the extension list display area, the memory controller and moving image state machine 42 first accesses the extension list words (e.g., the first three words) associated with the upper section of the extension list display area (e.g., section 110). From these first three words in the extension list table 96, the memory controller and moving image state machine 42 determines that only moving image entries 1, 2, 4, and 6 are active in section 110. The memory controller and moving image state machine 42 then first accesses moving image entry #1 in the moving image list table 90 to assemble each pixel of a first horizontal line of section 110, and then accesses moving image entries #2, #4, and #6 in sequence. 
The portion designated "moving image entry words" indicates how many words (NW) the memory controller and moving image state machine 42 uses to calculate where the next active moving image entry is located in the moving image list table 90. More particularly, if each moving image entry has five (5) words in it, then NW = 5. This indicates that the starts of moving image entries 1, 2, 4, and 6 are at storage locations 1, 6, 16, and 26, respectively, in the moving image list table 90, because each entry has five words occupying five sequential memory locations. Therefore, the memory controller and moving image state machine 42 sequentially jumps to locations 1, 6, 16, and 26 to obtain the five words associated with moving image entries #1, #2, #4, and #6, respectively. This avoids the time needed to step through all 96 moving image entries to see how many words are included in each entry, and allows the memory controller and moving image state machine 42 to easily jump to the information necessary for the active moving images and to skip over inactive moving images for each of the sections 110-113.
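With a constant NW words per entry, the storage location of any entry follows directly from its entry number. A brief sketch, assuming the 1-based locations used in the text:

```python
NW = 5  # constant words per moving image entry when the extension list exists

def entry_location(entry_number, nw=NW, first_location=1):
    """Start location of a moving image entry in the list table (1-based)."""
    return first_location + (entry_number - 1) * nw

# Active entries 1, 2, 4, 6 of section 110 start at these locations:
locations = [entry_location(n) for n in [1, 2, 4, 6]]
print(locations)  # [1, 6, 16, 26], matching the text
```

This is why NW must be constant when the extension list exists: the jump is a single multiply-and-add, with no need to inspect the skipped entries.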
It should be understood that the memory controller and moving image state machine 42 uses the same one or more extension list words for each of the lines of a section (e.g., section 110), since the same moving images are active on each of the lines of that section. The memory controller and moving image state machine 42 operates in the same manner for each of the other sections of the visual display area covered by the extension list words (e.g., sections 111-113). Moreover, a single large moving image can be included in more than one section. For example, the moving images 102 and 108 in Figure 16, for moving image entries #2 and #8 respectively, are included in the respective sections 110-111 and 112-113. As a result, the first extension list word for each of sections 110 and 111 includes a "1" in the position designated for moving image entry #2, and the first extension list word for each of sections 112 and 113 includes a "1" in the position designated for moving image entry #8. As further shown in Figure 16, the moving images 108 and 109 partially overlap, and since the moving image 109 has a higher priority than the moving image 108, the pixels associated with the moving image 109 will overwrite the pixels of the moving image 108 in the overlap area. It was found that with the present subscriber cable box unit 10 of Figure 1, approximately 100 small moving images can be accommodated in the moving image list table 90 without the use of the extension list table 96. Using the data stored in the extension list table 96, it was found that many more small moving images (e.g., up to about 3,000) can be accommodated in the moving image list table 90 for display on the screen of a television receiver. 
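The priority rule for overlapping moving images (109 overwriting 108) amounts to compositing entries in priority order, so that a later, higher-priority pixel replaces whatever lies beneath it. A sketch with hypothetical names, not the hardware implementation:

```python
def assemble_line(width, sprites, transparent=0):
    """sprites: list of (x_start, pixels) pairs, ordered from lowest to
    highest priority; a higher-priority sprite overwrites pixels beneath it."""
    line = [transparent] * width
    for x_start, pixels in sprites:
        for i, p in enumerate(pixels):
            x = x_start + i
            if 0 <= x < width and p != transparent:
                line[x] = p
    return line

# Moving image 108 (pixel value 8) at x=2, overlapped by the
# higher-priority moving image 109 (pixel value 9) at x=4:
line = assemble_line(10, [(2, [8, 8, 8, 8]), (4, [9, 9, 9])])
print(line)  # [0, 0, 8, 8, 9, 9, 9, 0, 0, 0]
```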
Moreover, the information in each of the portions of each of the N entries of the moving image list table 90, and of the tables 92, 94, and 96, is entered into the graphics memory 38 from the remote central processing unit 36 (shown only in Figure 1) via the memory controller and moving image state machine 42 forming part of the first portion of the video processing circuit 46. This information can be updated at any time by the central processing unit 36. The field enable control portion of each moving image entry in the moving image list table 90 refers to the controls used to form a "smoked glass" effect (transparent overlay) with two moving images, or with a moving image over live video. More particularly, a "smoked glass" effect is defined as an area of overlap of two moving images, or of a moving image over live video, where a first moving image is displayed on the television receiver screen on the lines (e.g., the even lines) of a first field of a picture, and a second moving image or live video is displayed on the television receiver screen on the lines (e.g., the odd lines) of a second field of the picture. Such an effect allows an image of the first moving image to be viewed while, behind it, an image of the second moving image is also seen at the same time, which can be, for example, a snapshot captured from a frame of a live television signal that is stored as a moving image entry in the graphics memory 38, or actual live video. The two-dimensional convolver 68 then processes the assembled picture to produce the "smoked glass" effect between the two moving images. Prior art systems mainly use software to combine the two images by computer. In the present invention, the field enable control portion of the moving image entry indicates that the moving image is to be displayed only on the even lines or only on the odd lines of the visual display screen area designated for that moving image. 
When the pixel data is assembled on each horizontal line of a visual display for the moving image entries of the moving image list table 90, the field enable control indicates whether or not such a moving image exists on a line within its designated area of the visual display screen. This is a simple and inexpensive method of allowing a graphic or moving image to be inserted or turned on in only one of the two fields. Referring now to Figure 17, an exemplary section of lines 1-13 of an interlaced television receiver screen 120 is shown where a first (moving image #1) and a second (moving image #2) moving image are interleaved in a pixel area 121 of the screen (shown within a dotted-line rectangle) according to a third embodiment of the present invention. In particular, moving image entry #2 is defined by its moving image controls in the moving image list table 90 of the graphics memory 38 as lying within the pixel area 121 formed by lines 2-7, and moving image entry #2 is to be inserted or turned on only in the even-numbered lines 2, 4, and 6 forming part of the first field of a frame in the pixel area 121. Moreover, moving image entry #1 in the moving image list table 90 is defined as occupying the entire area covered by lines 1-13 of the screen 120. As moving image entry #1 has a lower priority than moving image entry #2, moving image entry #1 occupies the odd-numbered lines 3, 5, and 7 in the area 121, along with all the remaining area of lines 1-13. Returning now to Figure 9, in order for the memory controller and moving image state machine 42 to produce the "smoked glass" effect, it needs to know which field is currently being displayed on the television receiver screen. 
This information indicating the current video field is provided to the memory controller and moving image state machine 42 by a 2-bit field signal (FIELD<1:0>, indicating bits 1 and 0) transmitted by a remote video synchronization circuit (not shown) generally located in the compositing circuit for Y, U, V 44 (shown in Figure 1) of the subscriber cable box unit 10 and derived from a stream of received live video signals. This 2-bit field signal is basically a continuously running clock signal. The memory controller and moving image state machine 42 also reads a 4-bit field enable signal from the associated moving image list entry indicating which of the fields of two stored frames to enable for a moving image. It should be understood that the entire color information of a color image is transmitted within the four fields of two frames, which explains why four bits are needed for the field enable signal, where each frame has two fields. Moreover, the use of two frames is not a matter of displaying the color image, but rather a matter of the artifacts (e.g., flicker, etc.) produced on the screen of an interlaced television receiver. More particularly, in an NTSC color video signal there are (a) 227.5 color subcarrier cycles on each horizontal line of the picture, (b) 262.5 lines for each of the two fields of a frame, and (c) 525 lines in a frame that includes the two fields. Because there are 227.5 subcarrier cycles per line, if the color subcarrier on line 0 of field 0 goes in a positive direction at a certain point, then on the next line (line 2) of field 0 the color subcarrier goes in a negative direction at that point, because each line contains a whole number of subcarrier cycles plus one half cycle instead of only whole cycles. 
Moreover, because there is an odd number of lines (525) in a frame, the color subcarrier on the first line (line 0) of field 0 of the next (second) frame will go in a negative direction, opposite to that of line 0 of field 0 of the immediately preceding frame. Thus, to obtain a subcarrier that goes positive on line 0 of field 0 of a frame, the repeated pattern occurs only every second frame. It should be understood that all the content of a color image is presented after the first frame, but the repeated patterns of the artifacts (e.g., flicker, etc.) are byproducts of a cycle of four fields over two frames. This is the result of a compromise originally made in forming the NTSC standards so that transmitted color TV signals would be compatible with black-and-white television signals. When a snapshot of a live television picture is placed in the graphics memory 38 as a moving image entry in the moving image list table 90, only one frame comprising two fields needs to be stored in order to re-display the image on the television receiver screen. The 4-bit field enable control is used by the memory controller and moving image state machine to indicate when a moving image is to be accessed in a certain frame or field, depending on the code of the four bits. For example, a "1" in bit three of the field enable signal indicates that the associated moving image is to be enabled in frame 1, and a "1" in bit two indicates that the associated moving image is to be enabled in frame 0. Similarly, a "1" in bit one of the field enable signal indicates that the associated moving image is to be enabled in field 1, and a "1" in bit zero indicates that the associated moving image is to be enabled in field 0. 
Therefore, bits 3 and 2 are used for double-frame buffered moving images, while bits 1 and 0 are used to produce a "smoked glass" effect when the moving image is seen in only one field, or for double-frame buffered moving images when a moving image is seen in both fields. The memory controller and moving image state machine compares the field signals and the field enable signals to determine which of the four exclusive fields in the two frames is currently turned on, in order to display the image at its desired horizontal line positions, and for the remodulation of the NTSC picture to be displayed on the television receiver screen via remote processing circuits (not shown) to provide correctly phased color bursts. More particularly, the information for the four fields, determined from the field signals and the field enable signals, is used in loading the double-line pixel assembly buffer 52 to determine which moving image data will be placed in each pixel position of the double-line buffer zones 53, 54, and 55 (shown in Figure 2). When the comparison of the field signals and the field enable signals indicates a field match, the moving image data is read from the moving image list table 90, the moving image data table 92, and the line control table 94 in the graphics memory 38, and the double-line buffer zones 53-55 in the pixel assembly buffer 52 are appropriately loaded during a given field or frame. A register (not shown) in the memory controller and moving image state machine 42 is updated from the central processing unit 36 via the bus 48 when it is necessary to indicate in which field and/or frame the data of a moving image entry in the moving image list table 90 is to be loaded into the pixel assembly buffer 52. 
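The field/frame matching described above can be modeled as a comparison between the 2-bit running field counter and the 4-bit field enable word. The sketch below is only one interpretation of the text (frame selection in bits 3-2, field selection in bits 1-0, counter bit 1 as the frame bit); the actual hardware encoding may differ.

```python
def field_matches(field_counter, field_enable):
    """field_counter: 2-bit value; bit 1 taken as the current frame (0/1)
    and bit 0 as the current field (0/1) -- an assumed encoding.
    field_enable: 4-bit word; bit 3 enables frame 1, bit 2 frame 0,
    bit 1 field 1, bit 0 field 0. A moving image is loaded only on a match."""
    frame = (field_counter >> 1) & 1
    field = field_counter & 1
    frame_ok = bool(field_enable & (1 << (3 if frame else 2)))
    field_ok = bool(field_enable & (1 << (1 if field else 0)))
    return frame_ok and field_ok

# "Smoked glass": enable both frames but only field 0 (binary 1101):
print(field_matches(0b00, 0b1101))  # frame 0, field 0 -> drawn
print(field_matches(0b01, 0b1101))  # frame 0, field 1 -> skipped
```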
In accordance with the present invention, the use of a simple control word or group of bits, and a comparator to compare the field and field enable signals to determine the repeated patterns on the horizontal lines of NTSC video signals, allows a "smoked glass" effect to be formed on an interlaced visual display. This is in contrast to performing the same functions entirely in software, which requires a powerful and typically relatively expensive processor with a large amount of programming, as found in some prior art systems. In those prior art systems the processor (e.g., a central processing unit 36 in Figure 1) takes part in building the image, which requires a relatively expensive central processing unit 36, and if the processor shuts down, the construction of the image stops. An advantage of the present subscriber cable box unit 10 is that if the central processing unit is turned off, any animation of the image being displayed stops, because the central processing unit 36 is no longer providing information about what moves where; however, the image stands on its own. In particular, as long as the graphics memory is not corrupted, the video graphics portion of the memory and video controller 40 shown in Figures 2, 3, and 9 knows how to build the image from the data in the graphics memory 38. It will be appreciated and understood that the specific embodiments of the invention described hereinabove are merely illustrative of the general principles of the invention. Those skilled in the art can make various modifications that are consistent with the principles set forth. For example, although the present invention has been described hereinabove for use in a subscriber cable box unit 10, it will be understood that the present invention can be used in, for example, an editing station prior to the television signal being broadcast. 
In other words, the present invention can be used in television production to create initial products before transmission, instead of manipulating the television signal later at the subscriber's remote location. This is possible because the quality and resolution of the image displayed on the television receiver do not change regardless of whether the editing takes place during the initial production or after production at the subscriber's premises when the present apparatus is used. Therefore, even if the quality or resolution could be better in an unedited television production, it does not matter whether the editing is done before the production is viewed on the subscriber's interlaced television set or at the subscriber's premises.

Claims (22)

CLAIMS 1. An apparatus for processing mixed video and graphics signals to be displayed on a standard television receiver, comprising: a graphics memory comprising: a moving image list table for listing one or more graphics in a previously determined sequence for visual display on the television receiver, and for storing general information related to the one or more graphics in control words in each listing; a moving image data table for storing pixel data for the horizontal lines of each of the one or more graphics, wherein the horizontal lines in the moving image data table for each of the one or more graphics are accessed by means of a control word in the listing in the moving image list table for each of the one or more graphics; and a line control table comprising control words that are accessed by a control word in the listing of previously determined ones of the graphics in the moving image list table to provide independent controls for selectively relocating pixel data on each of the horizontal lines obtained from the moving image data table to produce a previously determined special effect for each of the previously determined ones of the graphics; and a memory controller and moving image state machine to access the graphics memory tables in a previously determined sequence to assemble and display each of the graphics at previously determined locations on the horizontal lines on a screen of the television receiver.
2. The apparatus of claim 1 wherein: the graphics memory further comprises an extension list table comprising at least one extension list control word for a previously determined number of horizontal lines, each forming one of a plurality of previously determined separate sections of the television receiver screen, the control word or words of the extension list defining which of a plurality of N graphics listings in the moving image list table are active and appear in the associated previously determined section; and the memory controller and moving image state machine first accesses the at least one extension list control word in the extension list table when assembling a previously determined section of the television receiver screen, and then accesses only the graphics listings in the moving image list table that are active and appear in the at least one extension list word.
3. The apparatus of claim 2 wherein the total number of extension list words in the extension list table is defined as (NS / X) * (NH / NL), where NS is the total number of graphics that appear on the television receiver screen, X is the number of bits available in each extension list control word, NH is the number of horizontal lines in all sections of the plurality of previously determined separate sections of the television receiver screen, and NL is the number of horizontal lines per extension list control word found in each previously determined section of the television receiver screen.
4. The apparatus of claim 2 wherein a register defines the total number of control words in each graphics listing, and each of the graphics listings contains the same total number of control words.
5. The apparatus of claim 1 wherein the memory controller and moving image state machine responds to field enable signals from the moving image list table, indicating which field of a two-field frame of a video image is being displayed on a television receiver screen, to access and assemble a first previously determined graphic listed in the moving image list table for the horizontal lines of only one of the two fields, and a second previously determined graphic listed in the moving image list table, or a live television signal, for the horizontal lines of the other of the two fields.
6. The apparatus of claim 1 further comprising: a data pipe that responds to the pixel data accessed by the memory controller and moving image state machine for each of the horizontal lines from the moving image data table and the line control table, for each of the graphics listings in the moving image list table, to generate a selective address for each of the pixel data for each horizontal line in accordance with previously determined control words in the moving image list table and the line control table; and a pixel assembly buffer responsive to the selective address for each pixel data from the data pipe for each horizontal line, for assembling and buffering each horizontal line of pixel data according to each previously determined selective address generated by the data pipe for that horizontal line of pixel data.
7. The apparatus of claim 1 wherein the graphics in the moving image list table are listed in sequence with a previously determined priority, and the pixel data of a first graphic with a higher priority overwrites the pixel data of a second graphic with a lower priority at a pixel location, when assembling a horizontal line where the first and second graphics overlap on a television receiver screen, as the memory controller and moving image state machine accesses the graphics in sequence to assemble a horizontal line of pixel data.
8. An apparatus for processing mixed video and graphics signals to be displayed on a standard television receiver, comprising: a graphics memory comprising: a moving image list table for listing one or more graphics in a previously determined sequence for visual display on the television receiver, and for storing general information related to the one or more graphics in control words in each listing; a moving image data table for storing pixel data for the horizontal lines of each of the one or more graphics, wherein the horizontal lines in the moving image data table for each of the one or more graphics are accessed by means of a control word in the listing in the moving image list table for each of the one or more graphics; and an extension list table comprising at least one extension list control word for a previously determined number of horizontal lines, each forming one of a plurality of previously determined separate sections of the television receiver screen, the control word or words of the extension list defining which of a plurality of N graphics listings in the moving image list table are active and appear in the associated previously determined section; and a memory controller and moving image state machine that first accesses the at least one extension list control word in the extension list table when assembling a previously determined section of the television receiver screen, and then accesses only the graphics listings in the moving image list table that are active and appear in the at least one extension list word.
9. The apparatus of claim 8 wherein the total number of extension list words in the extension list table is defined as (NS / X) * (NH / NL), where NS is the total number of graphics that appear on the television receiver screen, X is the number of bits available in each extension list control word, NH is the number of horizontal lines in all sections of the plurality of previously determined sections of the television receiver screen, and NL is the number of horizontal lines per extension list control word found in each previously determined section of the television receiver screen.
10. The apparatus of claim 8 wherein each of the graphics listings in the moving image list table comprises a moving image entry control word that defines the total number of control words in that graphics listing, and each of the graphics listings contains the same total number of control words, to allow the memory controller and moving image state machine to jump to the graphics listings in the moving image list table that are active and appear in the at least one extension list word.
11. The apparatus of claim 8 wherein: the graphics memory further comprises a line control table comprising control words that are accessed by a control word in the listing of predetermined ones of the graphics in the moving image list table to provide independent controls for selectively relocating pixel data in each of the horizontal lines obtained from the moving image data table to produce a special effect previously determined for each of the previously determined graphics; and the memory controller and moving image state machine accesses the moving image list table, the moving image data table, and the line control table in a previously determined sequence to assemble and display each of the one or more graphics in previously determined places on the horizontal lines of the television receiver's screen.
12. The apparatus of claim 11 further comprising: a data pipe, responsive to the pixel data accessed by the memory controller and moving image state machine for each of the horizontal lines from the moving image data table and to the line control table for each of the graphics listings in the moving image list table, for generating a selective address for each of the pixel data of each horizontal line in accordance with previously determined control words in the moving image list table and the line control table; and a pixel assembly buffer, responsive to the selective address for each of the pixel data from the data pipe for each horizontal line, for assembling and buffering each horizontal line of pixel data in accordance with each previously determined selective address generated by the data pipe for that horizontal line of pixel data.
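The selective addressing of claims 11 and 12 can be illustrated with a minimal sketch. The `x_offset` and `mirror` fields below are hypothetical stand-ins for fields of a line control word (mirroring a line is one plausible "special effect"); none of these names come from the patent itself.

```python
def selective_addresses(n_pixels, x_offset=0, mirror=False):
    """Compute a destination address for each pixel of one horizontal
    line, as a data pipe might, from an assumed per-line control word.

    x_offset shifts the line; mirror reverses it (a flip effect).
    """
    if mirror:
        return [x_offset + (n_pixels - 1 - i) for i in range(n_pixels)]
    return [x_offset + i for i in range(n_pixels)]

def place_line(buffer_width, pixels, addrs):
    """Pixel assembly buffer: store each pixel at its selective address."""
    buf = [0] * buffer_width
    for p, a in zip(pixels, addrs):
        if 0 <= a < buffer_width:
            buf[a] = p
    return buf

# A 4-pixel line, shifted by 2 and mirrored, landing in an 8-pixel buffer.
pixels = [1, 2, 3, 4]
print(place_line(8, pixels, selective_addresses(4, x_offset=2, mirror=True)))
# → [0, 0, 4, 3, 2, 1, 0, 0]
```

Because the addresses are computed per line from independent control words, each horizontal line of a graphic can be relocated or flipped on its own, which is what makes per-line special effects possible.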
13. The apparatus of claim 8 wherein the memory controller and moving image state machine responds to field enable signals from the moving image list table, indicating which field of a two-field frame of a video image is being displayed on the television receiver's screen, to access and assemble a first previously determined graphic listed in the moving image list table for the horizontal lines of only one of the two fields, and a second previously determined graphic listed in the moving image list table, or a live television signal, for the horizontal lines of the other of the two fields.
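The field-enable mechanism of claim 13 can be sketched as a simple selection per interlaced field. The listing dictionaries and the `enable_field0`/`enable_field1` flag names below are hypothetical; the patent only says that each listing carries field enable signals.

```python
def graphic_for_field(field, listing_a, listing_b):
    """Pick which graphics listing supplies a horizontal line, based on
    assumed per-listing field-enable flags (field is 0 or 1, the two
    fields of an interlaced frame).  Returning None stands for letting
    the live television signal pass through for that field.
    """
    key = f"enable_field{field}"
    if listing_a.get(key):
        return listing_a
    if listing_b.get(key):
        return listing_b
    return None  # live television signal for this field

# One graphic enabled on each field of the frame.
menu  = {"name": "menu",  "enable_field0": True,  "enable_field1": False}
clock = {"name": "clock", "enable_field0": False, "enable_field1": True}
print(graphic_for_field(0, menu, clock)["name"])  # → menu
print(graphic_for_field(1, menu, clock)["name"])  # → clock
```

This is how one graphic can occupy the odd field while a different graphic, or live video, occupies the even field of the same frame.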
14. The apparatus of claim 8 wherein the one or more graphics in the moving image list table are listed in sequence with a previously determined priority, and the pixel data of a first graphic of a higher priority overwrites the pixel data of a second graphic of a lower priority at a pixel location, when assembling a horizontal line where the first and second graphics overlap on the television receiver's screen, as the memory controller and moving image state machine accesses the graphics listings in sequence to assemble a horizontal line of pixel data.
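The priority rule of claim 14 can be sketched for one horizontal line. This is an assumed model, not the patent's implementation: listing order is taken as highest priority first, and a lower-priority graphic fills only pixel positions that higher-priority graphics left transparent (0 is an assumed transparency sentinel).

```python
TRANSPARENT = 0  # assumed sentinel for "no pixel written yet"

def assemble_line(line_width, sprites):
    """Assemble one horizontal line of pixel data.

    `sprites` is a sequence of (x_offset, pixels) tuples in listing
    order, assumed highest priority first.  A later (lower-priority)
    sprite writes only to positions still transparent, so where two
    graphics overlap the higher-priority pixels survive.
    """
    line = [TRANSPARENT] * line_width
    for x_offset, pixels in sprites:
        for i, p in enumerate(pixels):
            pos = x_offset + i
            if 0 <= pos < line_width and line[pos] == TRANSPARENT and p != TRANSPARENT:
                line[pos] = p
    return line

# Two overlapping 4-pixel sprites on a 12-pixel line: the first
# (higher-priority) sprite wins at positions 6 and 7 where they overlap.
high = (4, [7, 7, 7, 7])
low  = (6, [5, 5, 5, 5])
print(assemble_line(12, [high, low]))  # → [0, 0, 0, 0, 7, 7, 7, 7, 5, 5, 0, 0]
```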
15. An apparatus for processing mixed video and graphics signals for display on a standard television receiver, comprising: a graphics memory comprising: a moving image list table for listing one or more graphics in a previously determined sequence for display on the television receiver, and for storing general information related to the one or more graphics in control words in each listing; and a moving image data table for storing pixel data for the horizontal lines of each of the one or more graphics, wherein the horizontal lines in the moving image data table for each of the one or more graphics are accessed by a control word in the listing in the moving image list table for each of the one or more graphics; and a memory controller and moving image state machine that responds to field enable signals from the moving image list table, indicating which field of a two-field frame of a video image is being displayed on the television receiver's screen, to access and assemble a first previously determined graphic listed in the moving image list table for the horizontal lines of only one of the two fields, and a second previously determined graphic listed in the moving image list table, or a live television signal, for the horizontal lines of the other of the two fields.
16. The apparatus of claim 15 wherein the graphics memory further comprises a line control table comprising control words that are accessed by a control word in the listing of predetermined ones of the graphics in the moving image list table to provide independent controls for selectively relocating pixel data in each of the horizontal lines obtained from the moving image data table to produce a special effect previously determined for each of the previously determined graphics.
17. The apparatus of claim 16 further comprising: a data pipe, responsive to the pixel data accessed by the memory controller and moving image state machine for each of the horizontal lines from the moving image data table and to the line control table for each of the graphics listings in the moving image list table, for generating a selective address for each of the pixel data of each horizontal line in accordance with previously determined control words in the moving image list table and the line control table; and a pixel assembly buffer, responsive to the selective address for each of the pixel data from the data pipe for each horizontal line, for assembling and buffering each horizontal line of pixel data in accordance with each previously determined selective address generated by the data pipe for that horizontal line of pixel data.
18. The apparatus of claim 15 wherein: the graphics memory further comprises an extension list table comprising at least one extension list control word for a previously determined number of horizontal lines, each forming one of a plurality of previously determined separate sections of the television receiver's screen, the extension list control word or words defining which of a plurality of N graphics listings in the moving image list table are active and appear in the associated previously determined section; and the memory controller and moving image state machine first accesses the at least one extension list control word in the extension list table when assembling a previously determined section of the television receiver's screen, and then accesses only those graphics listings in the moving image list table that are active and appear in the at least one extension list control word.
19. The apparatus of claim 18 wherein the total number of extension list control words in the extension list table is defined as (NS/X)*(NH/NL), where NS is the total number of graphics which can appear on the television receiver's screen, X is the number of bits available in each extension list control word, NH is the number of horizontal lines in all of the plurality of previously determined separate sections of the television receiver's screen, and NL is the number of horizontal lines per extension list control word in each previously determined section of the television receiver's screen.
20. The apparatus of claim 18 wherein each of the graphics listings in the moving image list table comprises a moving image list entry control word that defines the total number of control words in that graphics listing, and each of the graphics listings contains the same total number of control words.
21. The apparatus of claim 15 wherein the memory controller and moving image state machine responds to field enable signals from the moving image list table, indicating which field of a two-field frame of a video image is being displayed on the television receiver's screen, to access and assemble a first previously determined graphic listed in the moving image list table for the horizontal lines of only one of the two fields, and a second previously determined graphic listed in the moving image list table, or a live television signal, for the horizontal lines of the other of the two fields.
22. The apparatus of claim 15 wherein the graphics in the moving image list table are listed in sequence with a previously determined priority, and the pixel data of a first graphic of a higher priority overwrites the pixel data of a second graphic of a lower priority at a pixel location, when assembling a horizontal line where the first and second graphics overlap on the television receiver's screen, as the memory controller and moving image state machine accesses the graphics listings in sequence to assemble a horizontal line of pixel data.

SUMMARY

An apparatus for processing mixed video and graphics signals for display on a standard television receiver includes a Graphics Memory and a Memory Controller and Moving Image State Machine. The Graphics Memory comprises a Moving Image List table, a Moving Image Data table, and other optional tables. The Moving Image List table lists one or more graphics in a previously determined sequence for display on the television receiver, and stores general information related to the graphics in control words in each listing. The Moving Image Data table stores pixel data for the horizontal lines of each of the graphics, which is accessed by means of a control word in each of the graphics listings in the Moving Image List table. An optional Line Control table comprises control words that are accessed through the graphics listing to provide independent controls for each of the horizontal lines obtained from the Moving Image Data table to produce previously determined special effects. An optional Extension List table is used to determine which of the graphics listings are present on a horizontal line, so that only the graphics listings that are present are accessed to assemble pixel data for that horizontal line.
The Memory Controller and Moving Image State Machine accesses the Graphics Memory tables in a previously determined sequence to assemble and display each of the graphics and their special effects in previously determined places on the horizontal lines of a television receiver screen. The Memory Controller and Moving Image State Machine also uses field enable controls from the Moving Image List table to assemble one graphic on the lines of a first field and a second graphic on the lines of a second field.
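The two-stage access described in claims 8 and 18 can be sketched with a bitmask model of the extension list. Treating each extension list control word as a bitmask, with bit i flagging graphics listing i, is an assumption for illustration; the sample listing names are likewise invented.

```python
def active_listings(ext_word, n_listings):
    """Decode one extension list control word as an assumed bitmask:
    bit i set means graphics listing i is active in the associated
    section of the screen."""
    return [i for i in range(n_listings) if (ext_word >> i) & 1]

def assemble_section(ext_word, listings):
    """First read the extension list word for a screen section, then
    visit only the graphics listings it flags as active, in listing
    order, as the state machine would."""
    return [listings[i] for i in active_listings(ext_word, len(listings))]

# Four listings; the section's extension list word flags only bits 0 and 2,
# so the other two listings are skipped entirely for this section.
listings = ["logo", "score", "cursor", "banner"]
print(assemble_section(0b0101, listings))  # → ['logo', 'cursor']
```

The point of the design is visible in the sketch: inactive listings are never touched while a section is assembled, which keeps memory traffic proportional to the graphics actually present on those lines.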
MX9603752A 1995-08-31 1996-08-29 Apparatus using memory control tables related to video graphics processing for tv receivers. MX9603752A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/523,394 US5835103A (en) 1995-08-31 1995-08-31 Apparatus using memory control tables related to video graphics processing for TV receivers
US08523394 1995-08-31

Publications (2)

Publication Number Publication Date
MXPA96003752A true MXPA96003752A (en) 1997-06-01
MX9603752A MX9603752A (en) 1997-06-28

Family

ID=24084816

Family Applications (1)

Application Number Title Priority Date Filing Date
MX9603752A MX9603752A (en) 1995-08-31 1996-08-29 Apparatus using memory control tables related to video graphics processing for tv receivers.

Country Status (9)

Country Link
US (1) US5835103A (en)
EP (1) EP0762331B1 (en)
JP (2) JP3542690B2 (en)
KR (1) KR100215131B1 (en)
CA (1) CA2179790C (en)
DE (1) DE69630264T2 (en)
ES (1) ES2208718T3 (en)
MX (1) MX9603752A (en)
NO (1) NO963578L (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2784532B1 (en) * 1998-10-09 2000-12-22 St Microelectronics Sa METHOD FOR CORRECTING THE SHAKING AND SCINTILLATION EFFECTS OF IMAGE ELEMENTS EMBEDDED ON A VIDEO IMAGE
FR2784534B1 (en) * 1998-10-09 2000-12-22 St Microelectronics Sa METHOD AND CIRCUIT FOR DISPLAYING IMAGE ELEMENTS EMBEDDED ON A VIDEO IMAGE
US6246803B1 (en) 1998-12-27 2001-06-12 The University Of Kansas Real-time feature-based video stream validation and distortion analysis system using color moments
KR100348422B1 (en) * 2000-06-28 2002-08-10 주식회사 아리랑테크 The apparatus for controlling the graphic display using the line control table
US7489320B2 (en) * 2005-05-13 2009-02-10 Seiko Epson Corporation System and method for conserving memory bandwidth while supporting multiple sprites
US8325282B2 (en) * 2008-04-08 2012-12-04 Mitsubishi Electric Visual Solutions America, Inc. Television automatic geometry adjustment system
USD871240S1 (en) * 2018-08-20 2019-12-31 Amazon Technologies, Inc. Motion sensor

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4420770A (en) * 1982-04-05 1983-12-13 Thomson-Csf Broadcast, Inc. Video background generation system
US4700181A (en) * 1983-09-30 1987-10-13 Computer Graphics Laboratories, Inc. Graphics display system
US4754270A (en) * 1984-02-16 1988-06-28 Nintendo Co., Ltd. Apparatus for varying the size and shape of an image in a raster scanning type display
US4580165A (en) * 1984-04-12 1986-04-01 General Electric Company Graphic video overlay system providing stable computer graphics overlayed with video image
US5089811A (en) * 1984-04-16 1992-02-18 Texas Instruments Incorporated Advanced video processor having a color palette
JPS60254190A (en) * 1984-05-31 1985-12-14 株式会社 アスキ− Display controller
CA1243779A (en) * 1985-03-20 1988-10-25 Tetsu Taguchi Speech processing system
IL79822A (en) * 1985-12-19 1990-03-19 Gen Electric Method of comprehensive distortion correction for a computer image generation system
US5285193A (en) * 1987-01-16 1994-02-08 Sharp Kabushiki Kaisha Data base system
DE3702220A1 (en) * 1987-01-26 1988-08-04 Pietzsch Ibp Gmbh METHOD AND DEVICE FOR DISPLAYING A TOTAL IMAGE ON A SCREEN OF A DISPLAY DEVICE
US4951038A (en) * 1987-05-15 1990-08-21 Hudson Soft Co., Ltd. Apparatus for displaying a sprite on a screen
US5030946A (en) * 1987-05-20 1991-07-09 Hudson Soft Co., Ltd. Apparatus for the control of an access to a video memory
US5258843A (en) * 1987-09-04 1993-11-02 Texas Instruments Incorporated Method and apparatus for overlaying displayable information
US5179642A (en) * 1987-12-14 1993-01-12 Hitachi, Ltd. Image synthesizing apparatus for superposing a second image on a first image
US5185597A (en) * 1988-06-29 1993-02-09 Digital Equipment Corporation Sprite cursor with edge extension and clipping
US5065231A (en) * 1988-09-26 1991-11-12 Apple Computer, Inc. Apparatus and method for merging input RGB and composite video signals to provide both RGB and composite merged video outputs
GB2226471A (en) * 1988-12-23 1990-06-27 Philips Electronic Associated Displaying a stored image in expanded format
US5235677A (en) * 1989-06-02 1993-08-10 Atari Corporation Raster graphics color palette architecture for multiple display objects
US4965670A (en) * 1989-08-15 1990-10-23 Research, Incorporated Adjustable overlay display controller
US5168363A (en) * 1989-10-16 1992-12-01 Sony Corporation Video signal processing apparatus with memory and crossfader
US5389947A (en) * 1991-05-06 1995-02-14 Compaq Computer Corporation Circuitry and method for high visibility cursor generation in a graphics display
KR940001439B1 (en) * 1991-08-30 1994-02-23 삼성전자 주식회사 Tv screen title superimposing circuit
US5258826A (en) * 1991-10-02 1993-11-02 Tandy Corporation Multiple extended mode supportable multimedia palette and multimedia system incorporating same
US5313231A (en) * 1992-03-24 1994-05-17 Texas Instruments Incorporated Color palette device having big/little endian interfacing, systems and methods
DE69309780T2 (en) * 1992-05-19 1997-10-23 Canon Kk Method and device for controlling a display
JP3059302B2 (en) * 1992-06-03 2000-07-04 株式会社ハドソン Video mixing device
CN1125029A (en) * 1993-06-07 1996-06-19 亚特兰大科技公司 Display system for a subscriber terminal
EP0633693B1 (en) * 1993-07-01 2001-04-11 Matsushita Electric Industrial Co., Ltd. Flagged video signal recording apparatus and reproducing apparatus
US5519825A (en) * 1993-11-16 1996-05-21 Sun Microsystems, Inc. Method and apparatus for NTSC display of full range animation

Similar Documents

Publication Publication Date Title
KR100249403B1 (en) Video magnification apparatus
US6061094A (en) Method and apparatus for scaling and reducing flicker with dynamic coefficient weighting
US6166772A (en) Method and apparatus for display of interlaced images on non-interlaced display
US5587928A (en) Computer teleconferencing method and apparatus
KR100261638B1 (en) Video System and Method of Using Same
US6028589A (en) Method and apparatus for video scaling and convolution for displaying computer graphics on a conventional television monitor
WO1998020670A2 (en) System for converting computer graphics to television format with scaling requiring no frame buffers
US4771275A (en) Method and apparatus for assigning color values to bit map memory display locations
US6023262A (en) Method and apparatus in a computer system to generate a downscaled video image for display on a television system
US5739868A (en) Apparatus for processing mixed YUV and color palettized video signals
US5835103A (en) Apparatus using memory control tables related to video graphics processing for TV receivers
MXPA96003752A (en) Apparatus using memory control tables related to video graphics processing for television receivers
MXPA96003750A (en) Apparatus for processing mixed YUV and color palettized video signals
JP3500991B2 (en) Colorimetry converter
JPH07162816A (en) Teletext receiver
JPH0380293A (en) Display device
JPH07255017A (en) Wide aspect ratio television equipment
JPH07274088A (en) Wide aspect television device
JPH05328246A (en) Two-pattern display television receiver