WO1999008450A1 - Universal visual image communication system - Google Patents

Universal visual image communication system

Info

Publication number
WO1999008450A1
Authority
WO
WIPO (PCT)
Prior art keywords
data stream
duration
pixel
width
storage medium
Prior art date
Application number
PCT/US1998/016694
Other languages
French (fr)
Inventor
Mitchell A. Cotter
Original Assignee
Cotter Mitchell A
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cotter Mitchell A filed Critical Cotter Mitchell A
Priority to EP98940852A priority Critical patent/EP0983689A1/en
Priority to AU89035/98A priority patent/AU8903598A/en
Publication of WO1999008450A1 publication Critical patent/WO1999008450A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/40Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream

Definitions

  • Data Handle (401): The data handle is created by the data loader routine when the data are requested.
  • The handle is the source of the input stream and can be a descriptor to a file, a network connection, or some other data source. In our implementation, the data are read from a file on a local disk.
  • Mapping Routine (402): The mapping routine (line 16-49) may be defined in the control data and is called by the IDSP whenever it encounters a refresh event. It is responsible for reading the pixel data from the data handle (line 42), formatting it as needed and, finally, loading it into the pixel plane memory in the LPPA (line 44-45). Light Producing Pixel Array (Fig. 5/Code Section 3) I. Initialization Routine (501): The initialization consists of a set of instructions that prepare the LPPA. It can also be used as a way of querying the LPPA for system resources (display size, for example).
  • Pixel Plane Memory (502): The PPM is the randomly addressable memory that is used to store individual pixel data after it has been manipulated by the Data Addresser Function so that the LPPA can display it. Ideally, the Pixel Plane Memory will be a block of RAM large enough to address pixel data proportional to the number of pixels in the LPPA. For the sake of our emulation of the LPPA, we use a mapping interface (line 50-75) to the window created by the initialization routine.
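The mapping step described above amounts to writing decoded RGB values into randomly addressable memory. A minimal sketch in C of such a pixel plane memory write, where the layout and all names are illustrative assumptions rather than the appendix code:

```c
#include <stddef.h>

/* Sketch of the Data Addresser Function writing one pixel into the Pixel
   Plane Memory (PPM): a randomly addressable block holding one RGB triplet
   per pixel. Structure and names are assumptions for illustration. */
typedef struct {
    unsigned char *rgb;   /* width * height * 3 bytes, row-major */
    size_t width, height;
} PixelPlaneMemory;

/* Store one pixel at (x, y); returns 0 on success, -1 if out of range. */
static int ppm_put(PixelPlaneMemory *ppm, size_t x, size_t y,
                   unsigned char r, unsigned char g, unsigned char b) {
    if (x >= ppm->width || y >= ppm->height) return -1;
    size_t i = (y * ppm->width + x) * 3;  /* row-major RGB offset */
    ppm->rgb[i] = r;
    ppm->rgb[i + 1] = g;
    ppm->rgb[i + 2] = b;
    return 0;
}
```

A hardware LPPA would back this memory directly; the emulation instead maps the same writes onto an X window.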

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

This invention relates to the field of representation of visual images by electronic means, for example in a television system, to flexibly represent, transmit, store and recover such images, including dynamic full-motion image representations.

Description

UNIVERSAL VISUAL IMAGE COMMUNICATION SYSTEM

RELATED APPLICATION: This application claims the benefit of U.S. Provisional Patent Application No. 60/054,918, filed August 8, 1997, and of a non-provisional U.S. Patent Application of the same title as this application, which was filed August 7, 1998, thereby claiming priority of the foregoing Provisional Patent Application.

NOTICE UNDER 37 CFR 1.71: A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD OF INVENTION

This invention relates to the field of representation of visual images by electronic means, including also what are commonly called full-motion images, as for example in a television system, which is a non-limiting case of the general application of the invention to flexibly represent, transmit, store and recover such images, including dynamic full-motion image representations.
BACKGROUND OF THE INVENTION

Present television and graphical image systems have been constrained by the historical use of raster-scanned representations of such images. The older systems were analog light-intensity scans of an image field that had superimposed timing markers to synchronize the restoring scanner with the source scanner. These older systems, and indeed any image mapping system, can be considered to have some resolution limit to the image, which has come to be expressed as a "pixel", equal to the smallest element of the image which the system can distinguish. Thus the image represented by any such system can be considered to be composed of such discrete elements, readily allowing the use of an electronic digital signal to represent the pixels. For transmission, or even for many recording purposes, such discrete values can be strung together, forming a serial data stream. Indeed, most transmission and recording systems basically employ such a signal that has a single value at any point in time, that is, a continuous single-valued function of time. The digital information is encoded as some state difference on such a signal.

At the present time there is a wide variety of timing and format organizations employed by television and similar imaging systems. These all appear to have limited format and signal representation possibilities because their systems generally relate to some form of hardware-implemented scanning system. There do exist memory systems that work in conjunction with pixel element displays, such as LCD computer screens and some forms of microchip mechanical light modulators. Some examples of such prior art are: US Patents 5,612,714 Souviron; 5,612,73 Bhuva et al.; 5,680,156 Gove et al. To date these systems have been used for television and some forms of computer displays, which still employ the pixel scan and update procedures derived from the raster- and hardware-oriented older CRT display systems. Present systems for higher-resolution television and for different aspect-ratio displays have been hardware-bound by such system protocols.
SUMMARY OF THE INVENTION
This invention has as an object to free the formation of the image representation from any constraints of the scanning and hardware characteristics of past analog and digital systems, which have maintained format and display concepts derived from old limitations no longer properly a part of the definition. It is possible to see the problems in a new light by considering that any image, no matter how created or derived, may ultimately be considered to be composed of an array of pixels. The ordering of the data stream representing such an array may be any arbitrary format, and such an arbitrary format will generally be of substantially smaller information content than that representing any substantial length of time of full-motion image. It therefore becomes possible, within the teaching of this disclosure, to create an ordering protocol for such an image array in a first data stream series of image pixel sets, which may also carry the visual impression of continuous motion, and which ordering protocol may easily be accomplished as only a small second data stream series to be utilized to instruct the production of the pixel light output representing the intended image or full-motion visual images.

The primary data stream might be transmitted or carried on a prerecorded medium. The secondary data stream protocol can be provided in a variety of ways: it might precede or accompany a primary data stream transmission; it could be derived from a storage technique, for example any digital medium such as compact discs, tapes or floppy discs; or it might be transmitted by any convenient method, either at the time of viewing or before. Such decoupling of the secondary data stream, or recovery protocol, from the transmission or recorded information source allows pay-per-view functions as well as other forms of control of the use of an image transmission or image record source.
Present-day television does not so represent images, but currently various schemes are under consideration to make the transmission and recording of television information utilize some form of digital signaling in order to gain the performance advantages of digital signals. All of the systems presently being proposed appear to this inventor to have constrained the transmission format, even though digital, to limitations derived from the particular scan-based prior art of television picture representation. Another fundamental difference between the prior art and the subject invention is that in the prior art the meaning and values of the picture elements (or pixels) are determined by a fixed data interpretation system built into the transmission, recording and receiving devices employed. The present invention escapes these severe limitations by means of several processes that are represented by the block diagram of the invention shown in Figure 1. The purpose of the invention which the system of Figure 1 embodies is to allow the flexible creation, transmission, and recording of image digital data to be carried by a variety of means, including any purely digital signaling path able to provide character-less digital data, and to allow the instruction data stream program for controlling the way in which the image digital data is utilized to be transmitted over the same digital signaling path. Since the volume of data that will be required to represent any motion picture sequence of images will be very much greater than the volume of data for such an image control instruction program, it becomes possible to transmit such an instructional data stream in a very brief interval. Even if a very large instruction program were desired - say 50 Megabytes, which is much greater than what might be required - it could be transmitted in less than one second by a channel such as a SONET-based 622 Megabit/sec data path.
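The figure given above is easy to verify: 50 Megabytes is 400 Megabits, which a 622 Megabit/sec path carries in roughly 0.64 seconds. A one-line illustrative helper makes the arithmetic explicit:

```c
/* Back-of-the-envelope check of the transmission-time claim: seconds needed
   to carry a payload (in bytes) over a link (in bits per second).
   Purely illustrative; this function appears nowhere in the patent. */
static double transmit_seconds(double payload_bytes, double link_bits_per_sec) {
    return (payload_bytes * 8.0) / link_bits_per_sec;
}
```

For 50e6 bytes over 622e6 bits/sec the result is about 0.64 seconds, comfortably under the one-second figure claimed.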
This form of system architecture is based upon having a very wide bandwidth digital communication path, capable of hundreds of Megabits per second data rate. Such data paths are not only possible but are becoming commonplace resources. Each of the functions described in the foregoing review of the system block diagram is implemented by standard digital techniques familiar to those skilled in the art. It may be achieved either in a hardware form or by software in a more general system. The hardware form can be readily derived from the disclosure and Figure 1. An example of a software realization is given in software example 1. The architecture of the devices disclosed herein differs basically from the prior art in the separation of the instruction data stream function from the pixel digital data stream. This decoupling provides an opportunity to develop systems that are appropriate to the creative and performance objectives without restriction from any format that is based on some one particular limited concept of how the image might be represented. Transmission systems using this new concept are free to develop their own "software" to produce whatever objective is possible (see Figure 1) within the constraints of the resolution of the selected LPPA and the capacity of the programming potential of the IDSP and the associated PMC and DAF. These elements can also compress resolution, color, and image size to fit within the capacity of a user's imaging system LPPA device. Different formats of the same image program might be provided by image sources to suit a wide range of possible LPPA devices. The principle of separating the system architecture from the display devices allows the full potential of character-less digital transmission to be utilized while giving free rein to the possibilities for different systems (PMC data) to vary the form and character of images delivered.
BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1: Representation of the major components and data flow of a generic UVICS system.
Figure 2: Representation of the major components and data flow as implemented in software example embodiment 1.
Figure 3: Detailed description of the Input Data Stream Processor (IDSP) and the interaction between its sub-components.
Figure 4: Detailed description of the Data Addresser Function (DAF) and the interaction between its sub-components.
Figure 5: Detailed description of the Light Producing Pixel Array (LPPA) and the interaction between its sub-components.

DETAILED DESCRIPTION OF SOME PREFERRED EMBODIMENTS

A first example embodiment of a receiving portion of the system may be implemented by those of ordinary skill in the art by the use of digital circuits to achieve the functions and signal flow paths from a pre-organized image pixel data stream and associated, separate program information, all shown within Figure 1.

Reference number 101 shows a data path input to the first section of the novel system. This signal might be a parallel digital word data path; however, for purposes of the following description the non-limiting case of a serial data stream, such as the Synchronous Digital Hierarchy at 622 Megabits/sec, may be used for this example. This signal, coming from some source that is producing a digital signal representing some form of picture sequence (video-like information), flows into the block labeled Input Data Stream Processor. This block performs the first digital processing of the data input, which performs the first function of interpreting the input 101. There is within block 102 a stored program (shown dotted as the IDSP Operating System Program) for processing the data input: firstly, to recognize the program information and relay it in correct form to block 103 (the DAF Program Memory and Controller (PMC)); and secondly, to channel and organize the pixel data to the Pixel Array Data Addresser Function block labeled 104. That block distributes the pixel data to the correct position represented by the Pixel Plane Memory, which is shown as sub-block 105 within the Light Producing Pixel Array (LPPA) block 106. That block is the system that produces the light pixels that become the visual representation of the transmitted image, indicated as Image Output 107.
Several other functions are shown as a part of the IDSP 102. The specific LPPA employed may be a separate object from the rest of the system; for example, it might be a flat array hanging on a wall. Such panels may have any density of pixels, and so the capability and shape of a particular LPPA is communicated to the IDSP via the path 110, shown going from the LPPA to the IDSP via an internal interface (LPPA Subset Info block 111). This information allows the IDSP to correctly interpret the image data in accordance with the data entered by the user via the Cursors Controller 108. This allows users, among other possibilities, to arrange the display in any size or portion of the display that suits the needs of the moment. This flexibility is a part of the IDSP and may be greater or lesser as the design objectives and cost concerns permit. For a variety of computer and other uses of the results of cursor manipulations, the IDSP provides outputs 109 to interface with other systems that may be utilizing the Universal Visual Image Communication System (UVICS). That path may also function to permit interactive operations with the image source data by operating a transmission of data via the available send path.
A second example embodiment of the system is realized in the form of a software embodiment, shown as software example 1, performing a simulation of a versatile receiving system for displaying image data in some varied formats. UNIX software is given in Appendix A for achieving this simulation. The approach is discussed in the following section and references Figures 2 through 5.
Major Components and Data Flow (Fig. 2)

Figure 2 represents the major components and data flow as implemented in software example embodiment 1. There are three major components that interact with each other and with external sources in this basic embodiment of the UVICS system.
The Input Data Stream Processor (201), or IDSP, is responsible for initializing the system, requesting the program control data and stream data, and coordinating the order of execution.
Initially, the Data Addresser Function (203), or DAF, is an arbitrary-size block of memory. When the UVICS system is fully initialized, it will serve as an area for the program control data to reside and execute; this execution arena is an indirect embodiment of the program memory and controller (103). The DAF also has a handle to the incoming data stream from which it will read the data to display.
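The DAF described above can be pictured as a structure holding just those two things: an execution arena and a stream handle. A hypothetical C sketch, with all names invented here rather than taken from the appendix:

```c
#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch of the Data Addresser Function (DAF): initially an
   arbitrary-size block of memory, later holding the loaded control data and
   a handle to the incoming pixel data stream. Names are assumptions. */
typedef struct {
    unsigned char *arena;  /* execution arena for the program control data */
    size_t arena_size;
    FILE *stream;          /* handle to the incoming data stream */
} DAF;

/* Allocate the arena; returns 1 on success, 0 on allocation failure. */
static int daf_init(DAF *daf, size_t arena_size) {
    daf->arena = malloc(arena_size);
    daf->arena_size = daf->arena ? arena_size : 0;
    daf->stream = NULL;  /* attached later, when the data stream is opened */
    return daf->arena != NULL;
}

static void daf_free(DAF *daf) {
    free(daf->arena);
    daf->arena = NULL;
    daf->arena_size = 0;
}
```

The stream member stays NULL until the data loader routine attaches a source, mirroring the order of initialization described in the text.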
The Light Producing Pixel Array (204), or LPPA, is emulated in software for the puψoses of this embodiment. It consists of an 'n' by 'm', randomly accessible array mapped to an X window in order to represent a hardware LPPA display.
The majority of the interaction with external sources is handled by the IDSP. The IDSP makes a request (205) to a data source (202) for the program control data. The data source is located locally in this example, but may reside on a server across a network or on any transportable media. The data source returns (206) the program control data, in this case as a handle to a shared object. The control data consists of a series of routines that will be used by UVICS to retrieve, process and display the stream data. The IDSP also makes the request (208) for the data stream. As with the request for the control data, the data stream request is for a local file, but may be implemented to retrieve data using many different protocols. The other interaction with external sources occurs when the DAF receives (209) a stream of data from the data source.
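The "series of routines" that the control data supplies can be sketched as a table of function pointers; any routine the control data fails to provide falls back to a built-in default, as the Verify Program step (302/307) describes. The appendix does this with a shared object; this hypothetical sketch uses a plain table instead, and every name in it is an assumption:

```c
#include <stddef.h>

/* Illustrative control-data table: the IDSP knows only routine names and
   calling conventions, not how the routines work. Missing entries are
   filled from built-in defaults (cf. step 307). */
typedef void (*uvics_fn)(void);

static void default_loader(void)  { /* built-in data-loading fallback */ }
static void default_mapper(void)  { /* built-in pixel-mapping fallback */ }

typedef struct {
    uvics_fn load_data;   /* retrieves the pixel data stream */
    uvics_fn map_pixels;  /* maps stream data into the pixel plane memory */
} ControlData;

/* Replace any routine the control data did not provide with the default. */
static void verify_program(ControlData *cd) {
    if (cd->load_data == NULL)  cd->load_data = default_loader;
    if (cd->map_pixels == NULL) cd->map_pixels = default_mapper;
}
```

In the real appendix code the table entries would come from `dlsym` lookups into the loaded shared object; the fallback logic is the same either way.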
The rest of the interaction occurs between the IDSP, the DAF and the LPPA. The IDSP loads (210) the control data from the data source into the DAF module. The IDSP eventually makes one or more calls (211) to the control data in the DAF. The IDSP interacts with the LPPA once, in order to initialize the windowing system, and the DAF communicates (212) with the LPPA each time it maps a pixel.
Input Data Stream Processor (Fig. 3/Code Section 1, see Appendix A)
I. Request Program (301): The Input Data Stream Processor (IDSP) requests the program control data. The exact implementation of the request protocol may vary and is unimportant to the workings of the UVICS system; however, we envision support for several commonly used methods, such as requests over a network or on a local disk. Our example retrieves the control data from a file on the local disk (line 77-85).
II. Verify Program (302): The IDSP verifies that it received valid program control data and loads each routine into memory (line 87-98). If one or more of the routines are missing from the control data, the IDSP will load the missing routines from its set of default functions (307). The stream processor is only aware of the names of the routines, the calling convention and when these functions should be called; it is unimportant to the IDSP how the functions actually work. The format of the program control data and the list of functions that they define depend on the implementation of the IDSP; our simple example uses a shared object (Dynamically Linked Library) and defines routines for retrieving and manipulating the data.

III. Request Data (303): The IDSP sends the request for the data by calling the data loader (line 211-249) function (304) from the control data (line 100-106). This function is responsible for setting up a data stream and verifying the correctness of the data format, such as reading and decoding the preamble to the pixel data stream. The information is stored in a data structure that is implementation dependent (see code section 5 for an example). Our implementation uses data that consist of a short header, which describes the dimensions of the data, followed by RGB data for each pixel.
IV. Setup LPPA (305): This step is required for initializing the Light Producing Pixel Array (LPPA). For example, some of the actions taken may be to determine the size of the LPPA and to initialize the pixel plane memory. We emulate the functionality of the LPPA in software, so we use this step to set up the windowing system by calling an initialization function in the LPPA module (line 108-114).
V. Wait For And Handle Events (306): Now the IDSP enters a loop and waits on events (line 116-142). These events may come in the form of refresh events, which cause the DAF to map the current frame, or as user interaction. Each event will either be ignored, handled by a default routine, or handled by a routine loaded from the control data (see code section 4 for an example), as is appropriate for the event type. Our implementation handles the refresh event in order to map the data to the LPPA (line 136-138).
Data Addresser Function (Fig. 4/Code Section 2)

I. Data Handle (401): The data handle is created by the data loader routine when the data are requested. The handle is the source of the input stream and can be a descriptor to a file, a network connection, or some other data source. In our implementation, the data are read from a file on a local disk.
II. Mapping Routine (402): The mapping routine (line 16-49) may be defined in the control data and is called by the IDSP whenever it encounters a refresh event. It is responsible for reading the pixel data from the data handle (line 42), formatting it as needed and, finally, loading it into the pixel plane memory in the LPPA (line 44-45).

Light Producing Pixel Array (Fig. 5/Code Section 3)

I. Initialization Routine (501): The initialization consists of a set of instructions that prepare the LPPA. It can also be used as a way of querying the LPPA for system resources (display size, for example). We use this routine to initialize the X window system (line 25-32), create a window (line 35-45), and save the information in a structure (see code section 6) for future use.

II. Pixel Plane Memory (502): The PPM is the randomly addressable memory that is used to store individual pixel data after it has been manipulated by the Data Addresser Function so that the LPPA can display it. Ideally, Pixel Plane Memory will be a block of RAM large enough to address pixel data proportional to the number of pixels in the LPPA. For the sake of our emulation of the LPPA, we use a mapping interface (line 50-75) to the window created by the initialization routine.
Although there have been disclosed in this document particular processes, structures and methods, and a particular example of a software embodiment of system function according to the invention, it is not intended that such specific references be considered limitations upon the scope of this invention except insofar as set forth in the following claims. Furthermore, having described the invention in connection with certain specific embodiments thereof, it is to be understood that further modifications may now suggest themselves to those skilled in the arts; it is intended to cover all such modifications as fall within the scope of the appended claims.

Claims

What is claimed is:

1. An improved process for producing a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected full-motion visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for producing said first data stream.

2. A process as in claim 1 wherein said second data stream is shorter in duration than said first data stream.

3. A process as in claim 1 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.

4. A process as in claim 1 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.

5. A process as in claim 1 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.

6. An improved process for transmitting a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected full-motion visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second transmitted data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for producing said first transmitted data stream.
7. A process as in claim 6 wherein said second transmitted data stream is shorter in duration than said first data stream.
8. A process as in claim 6 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.
9. A process as in claim 6 wherein said second data stream is caused to exist by a time prior to the transmission of said first data stream.
10. A process as in claim 6 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
11. An improved process for producing a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected full-motion visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for reproducing a visual representation of said selected full-motion visual image source.
12. A process as in claim 11 wherein said second data stream is shorter in duration than said first data stream.

13. A process as in claim 11 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.

14. A process as in claim 11 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.

15. A process as in claim 11 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.

16. A process as in claim 11 wherein said produced second data stream is derived at least in part by reaction to data representing the configuration and light producing properties of the light producing pixel array of the image displaying device so as to perform the function of translating the height, width and pixel density of the first data stream into a format and pixel density selected to accommodate the display device properties used for reproducing the visual representation and, possibly having differently displayed such height, width and pixel density than said first data stream.
17. An improved process for transmitting a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected full-motion visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second transmitted data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for reproducing a visual representation of said selected full-motion visual image source.
18. A process as in claim 17 wherein said second transmitted data stream is shorter in duration than said first data stream.
19. A process as in claim 17 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.
20. A process as in claim 17 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
21. A process as in claim 17 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
22. A process as in claim 17 wherein said produced second data stream is derived at least in part by reaction to data representing the configuration and light producing properties of the light producing pixel array of the image displaying device so as to perform the function of translating the height, width and pixel density of the first data stream into a format and pixel density selected to accommodate the display device properties used for reproducing the visual representation and, possibly having differently displayed such height, width and pixel density than said first data stream.
23. An improved process for recording a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected full-motion visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second transmitted data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for producing said first recorded data stream.
24. A process as in claim 23 wherein said second data stream is shorter in duration than said first data stream.
25. A process as in claim 23 wherein said second data stream is shorter in duration than said first data stream and is resident in a separate storage medium and is loaded or obtained from said separate storage medium.

26. A process as in claim 23 wherein said second data stream is shorter in duration than said first data stream and is resident in the same storage medium and is loaded or obtained from said storage medium.
27. A process as in claim 23 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
28. A process as in claim 23 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
29. An improved process for recording a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected full-motion visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second recorded data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for reproducing a visual representation of said selected full-motion visual image source.
30. A process as in claim 29 wherein said second recorded data stream is shorter in duration than said first data stream.
31. A process as in claim 29 wherein said second data stream is shorter in duration than said first data stream and is resident in a separate storage medium and is loaded or obtained from said separate storage medium.
32. A process as in claim 29 wherein said second data stream is shorter in duration than said first data stream and is resident in the same storage medium and is loaded or obtained from said storage medium.
33. A process as in claim 29 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
34. A process as in claim 29 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
35. A process as in claim 29 wherein said produced second data stream is derived at least in part by reaction to data representing the configuration and light producing properties of the light producing pixel array of the image displaying device so as to perform the function of translating the height, width and pixel density of the first data stream into a format and pixel density selected to accommodate the display device properties used for reproducing the visual representation and, possibly having differently displayed such height, width and pixel density than said first data stream.
36. An improved process for producing a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for producing said first data stream.
37. A process as in claim 36 wherein said second data stream is shorter in duration than said first data stream.
38. A process as in claim 36 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.
39. A process as in claim 36 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
40. A process as in claim 36 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
41. An improved process for transmitting a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second transmitted data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for producing said first transmitted data stream.
42. A process as in claim 41 wherein said second transmitted data stream is shorter in duration than said first data stream.
43. A process as in claim 41 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.
44. A process as in claim 41 wherein said second data stream is caused to exist by a time prior to the transmission of said first data stream.
45. A process as in claim 41 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
46. An improved process for producing a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for reproducing a visual representation of said selected visual image source.
47. A process as in claim 46 wherein said second data stream is shorter in duration than said first data stream.
48. A process as in claim 46 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.
49. A process as in claim 46 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
50. A process as in claim 46 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
51. A process as in claim 46 wherein said produced second data stream is derived at least in part by reaction to data representing the configuration and light producing properties of the light producing pixel array of the image displaying device so as to perform the function of translating the height, width and pixel density of the first data stream into a format and pixel density selected to accommodate the display device properties used for reproducing the visual representation and, possibly having differently displayed such height, width and pixel density than said first data stream.
52. An improved process for transmitting a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second transmitted data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for reproducing a visual representation of said selected visual image source.
53. A process as in claim 52 wherein said second transmitted data stream is shorter in duration than said first data stream.
54. A process as in claim 52 wherein said second data stream is shorter in duration than said first data stream and is resident in a storage medium and is loaded or obtained from said storage medium.
55. A process as in claim 52 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
56. A process as in claim 52 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
57. A process as in claim 52 wherein said produced second data stream is derived at least in part by reaction to data representing the configuration and light producing properties of the light producing pixel array of the image displaying device so as to perform the function of translating the height, width and pixel density of the first data stream into a format and pixel density selected to accommodate the display device properties used for reproducing the visual representation and, possibly having differently displayed such height, width and pixel density than said first data stream.
58. An improved process for recording a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second transmitted data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for producing said first recorded data stream.
59. A process as in claim 58 wherein said second data stream is shorter in duration than said first data stream.
60. A process as in claim 58 wherein said second data stream is shorter in duration than said first data stream and is resident in a separate storage medium and is loaded or obtained from said separate storage medium.
61. A process as in claim 58 wherein said second data stream is shorter in duration than said first data stream and is resident in the same storage medium and is loaded or obtained from said storage medium.
62. A process as in claim 58 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
63. A process as in claim 58 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
64. An improved process for recording a first data stream of arbitrary duration capable of representing all or any subset of the pixels of a selected visual image source, at a pre-selected precision and, having an arbitrary time duration, a) said images having a pre-selected image height and width and, b) said images having a pre-selected effective pixel density and, c) each of said pixel representations in said first data stream generally capable of defining for each such pixel of an image a magnitude of light intensity and a color value and, d) each of said pixel representations having a defined time in said data stream and a defined location with respect to said pre-selected image height and width and, a second recorded data stream separable from said first data stream and, of arbitrary duration and capable of instructing said process for reproducing a visual representation of said selected visual image source.
65. A process as in claim 64 wherein said second recorded data stream is shorter in duration than said first data stream.
66. A process as in claim 64 wherein said second data stream is shorter in duration than said first data stream and is resident in a separate storage medium and is loaded or obtained from said separate storage medium.
67. A process as in claim 64 wherein said second data stream is shorter in duration than said first data stream and is resident in the same storage medium and is loaded or obtained from said storage medium.
68. A process as in claim 64 wherein said second data stream is caused to exist by a time prior to the production of said first data stream.
69. A process as in claim 64 wherein said second data stream is derived at least in part by reaction to at least some portion of said first data stream.
70. A process as in claim 64 wherein said produced second data stream is derived at least in part by reaction to data representing the configuration and light producing properties of the light producing pixel array of the image displaying device so as to perform the function of translating the height, width and pixel density of the first data stream into a format and pixel density selected to accommodate the display device properties used for reproducing the visual representation and, possibly having differently displayed such height, width and pixel density than said first data stream.
PCT/US1998/016694 1997-08-08 1998-08-10 Universal visual image communication system WO1999008450A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP98940852A EP0983689A1 (en) 1997-08-08 1998-08-10 Universal visual image communication system
AU89035/98A AU8903598A (en) 1997-08-08 1998-08-10 Universal visual image communication system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US5491897P 1997-08-08 1997-08-08
US60/054,918 1997-08-08

Publications (1)

Publication Number Publication Date
WO1999008450A1 (en) 1999-02-18

Family

ID=21994376

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/016694 WO1999008450A1 (en) 1997-08-08 1998-08-10 Universal visual image communication system

Country Status (3)

Country Link
EP (1) EP0983689A1 (en)
AU (1) AU8903598A (en)
WO (1) WO1999008450A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0546770A2 (en) * 1991-12-10 1993-06-16 Sony United Kingdom Limited Apparatus and methods for designing, analyzing or simulating signal processing functions
WO1995035628A1 (en) * 1994-06-17 1995-12-28 Snell & Wilcox Limited Video compression
WO1996024222A1 (en) * 1995-01-30 1996-08-08 Snell & Wilcox Limited Video signal processing
EP0762777A2 (en) * 1995-09-04 1997-03-12 Sharp Kabushiki Kaisha Picture reproducing apparatus
EP0782068A1 (en) * 1995-12-29 1997-07-02 Deluxe Corporation Remote printing system


Also Published As

Publication number Publication date
AU8903598A (en) 1999-03-01
EP0983689A1 (en) 2000-03-08


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1998940852

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 1999512491

Format of ref document f/p: F

WWP Wipo information: published in national office

Ref document number: 1998940852

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1998940852

Country of ref document: EP