WO2006046203A1 - System and method for processing data, program element and computer-readable medium - Google Patents

System and method for processing data, program element and computer-readable medium

Info

Publication number
WO2006046203A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
format
replay
recording
processing
Application number
PCT/IB2005/053490
Other languages
English (en)
Inventor
Henricus Van Der Heijden
Original Assignee
Koninklijke Philips Electronics N.V.
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2007538577A priority Critical patent/JP2008519481A/ja
Publication of WO2006046203A1 publication Critical patent/WO2006046203A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/7904 Processing of colour television signals in connection with recording using intermediate digital signal processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/804 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N 9/8042 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • H04N 9/8045 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction using predictive coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/804 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N 9/806 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal

Definitions

  • the invention relates to a system of processing data.
  • the invention further relates to a method of processing data.
  • the invention relates to a program element.
  • the invention relates to a computer-readable medium.
  • Video is the technology of processing electronic signals representing moving pictures.
  • a major application of video technique is television, but it is also widely used in engineering, scientific, manufacturing and security applications.
  • a personal video recorder (PVR) is an electronic device that records television shows to a hard disk in digital format.
  • a PVR makes a "time shifting" feature more convenient, wherein time shifting is the recording of television shows to a storage medium to be viewed at a time convenient to a user.
  • a digital PVR brings new freedom for time shifting, as it is possible to start watching the recorded show from the beginning even if the recording is not yet complete.
  • PVR technology also allows for trick modes, such as pausing live TV, instant replay of interesting scenes, skipping advertising, and the like.
  • Many PVR recorders use the MPEG format for encoding analog video signals.
  • US 5,287,420 discloses a method of image compression for personal computer applications, which compresses and stores data in two steps. An image is captured in real-time and compressed and stored to a hard disk. At some later time, the data is further compressed in non-real-time.
  • The system of processing data comprises a first processing unit adapted to convert data, before replay, from a recording format (in which data are recordable) to a partially processed intermediate format, in such a manner that the data in the intermediate format contain additional data compared to the data in the recording format. A second processing unit then converts the data from the intermediate format to a fully processed replay format (in which data are replayable) during replay.
  • Correspondingly, a method of processing data is provided, comprising the steps of converting data from a recording format, in which data are recordable, to a partially processed intermediate format before replay, in such a manner that the data in the partially processed intermediate format contain additional data compared to the data in the recording format, and converting data from the intermediate format to a fully processed replay format, in which data are replayable, during replay.
  • "Replayable” data particularly refer to data which are ready to be sent to a non-processing ("dumb”) digital or analog display or recording device.
  • a program element which, when being executed by a processor, is adapted to carry out the steps according to the above-mentioned method of processing data.
  • a computer-readable medium in which a computer program is stored which, when being executed by a processor, is adapted to carry out the steps of the above-mentioned method of processing data.
  • the processing of data of the invention can be realized by a computer program, i.e. by software, or by using one or more special electronic optimization circuits, i.e. in hardware, or in hybrid form, i.e. by means of software components and hardware components.
  • the characteristic features according to the invention particularly have the advantage that recorded video (or audio) data are partially processed to be converted from a recording format to an intermediate format before being played back.
  • an idle time of the system can be used efficiently to start the processing of data before playing back these data.
  • Such pre-processing or partial processing of the input data allows particularly expensive numerical algorithms to be carried out before playback. These algorithms could not be run in real time, since the time needed for such a quality-improving algorithm is in many cases larger than the processing time available per frame in a real-time processing and playback scheme.
  • The partially processed data may then be stored in the intermediate format. Starting from such an intermediate format, only the remaining part of the processing has to be completed during real-time playback to obtain data in the replay format, allowing reproduction of the data in real time and with improved quality.
  • The pre-processing of the data according to the invention slightly increases the amount of data to be stored, since data resulting from the pre-processing are stored in addition to the recorded data; this allows the quality of the playback data to be improved compared to the recorded data.
  • The pre-processing may include the estimation of motion vectors, which, in combination with the recorded data, allow a reproduction with improved quality by enabling sophisticated features such as motion compensated temporal up-conversion or de-interlacing.
  • Motion vectors are calculated by a relatively expensive and time-consuming algorithm carried out between data recording and data reproduction, wherein the resulting motion vectors require only a very small amount of additional memory space but allow the quality of data playback to be improved significantly.
  • An ahead-of-playback calculation and storage of intermediate video processing data is thereby enabled, particularly for PC-TV and PVR applications.
  • The invention teaches leveraging storage capacity to overcome a shortage of video processing power by recording a video, doing part of the video processing, such as motion vector estimation, during and after recording, and storing the intermediate results, not as fully processed video data but only as the minimally needed data. Then, during a subsequent playback of the video, the final steps of the algorithms may be completed, preferably in real time. Thus, preferably the more expensive parts of the processing algorithm, and in particular everything which cannot be done in real time, may be performed during an idle time of the system, so that a very efficient usage of the system resources is combined with only a very small additional amount of memory required for storing the partially processed intermediate-format data. Completing the final steps of the algorithm during playback may include calculating the proper de-interlaced lines based on previously estimated motion vector data, or calculating missing frames for temporal up-conversion.
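  • The overall scheme can be summarised in a short sketch. The following Python fragment is illustrative only and not part of the patent; the callables estimate_vectors, finish_high_quality, finish_fallback and show are hypothetical placeholders for the concrete algorithms (motion estimation, up-conversion or de-interlacing, frame repetition, display) discussed in this description.

```python
# Illustrative sketch of the two-phase scheme (not taken from the patent text).
# Phase 1 runs during recording or in idle time afterwards; phase 2 runs during playback.

def preprocess_offline(recorded_frames, estimate_vectors, vector_store):
    """Partial processing: compute and store intermediate data (e.g. motion vectors).

    estimate_vectors is any motion estimator taking two frames; vector_store is a
    dict-like object persisted next to the recording.
    """
    for index in range(len(recorded_frames) - 1):
        vector_store[index] = estimate_vectors(recorded_frames[index],
                                               recorded_frames[index + 1])

def play_back(recorded_frames, vector_store, finish_high_quality, finish_fallback, show):
    """Final processing step, completed in real time while replaying."""
    for index, frame in enumerate(recorded_frames):
        if index in vector_store:
            # High quality: complete the algorithm using the pre-computed intermediate data.
            outputs = finish_high_quality(frame, vector_store[index])
        else:
            # "Medium" quality fallback that is feasible in real time (e.g. frame repetition).
            outputs = finish_fallback(frame)
        for output_frame in outputs:
            show(output_frame)
```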
  • The invention is based on the recognition that future home entertainment TV recording and viewing applications (particularly video applications) such as personal video recorders (PVRs) and hybrid PC-TV systems combine two aspects.
  • First, owing to MPEG-2 encoding (or any other video compression method), an entire recorded program is available on storage, whereas the video processing facilities in traditional TVs typically only have access to the past two or three frames.
  • Second, general-purpose processors, as opposed to dedicated hardware, are often unsuitable for video processing applications involving many operations per pixel.
  • The first aspect makes it possible to overcome the disadvantages arising from the lack of dedicated hardware (the second aspect), provided that part of the video processing can be done during and after recording the program, i.e. not in real time.
  • Since the video processing can then essentially be done in software and is not restricted to a specific hardware implementation (for example, without real-time requirements there is no critical threshold for processing speed), upgrading the video processing algorithms in devices is possible, in contrast to conventional TVs and VCRs.
  • the video application according to a preferred embodiment of the invention will start the preliminary video processing work needed for high quality playback, using whatever processing power is not devoted to the recording task. Typically, the video processing task will not be finished by the time the program is ended. Two scenarios are possible:
  • If the user watches during the recording of the program, or so soon thereafter that the processing is not yet completed, the system will use "medium" quality processing that can be done in real time. In this case it is determined by a determining unit that it is not possible or suitable to carry out the pre-processing step.
  • a user flexibly has a choice which is easy to understand: to view the recording immediately with "medium” quality video processing, or to give the system time to calculate the needed data and then view the recording with high quality video processing.
  • the calculation of motion vectors is performed during idle time, for example using the 3DRS ("three-dimensional recursive search") algorithm.
  • An advantageous application in the context of implementing motion vectors is de-interlacing (or "interlaced to progressive scan format conversion"), which can be done in a cheap and efficient way (for example, using line doubling) or in a more sophisticated motion-compensated way.
  • the latter option yields better quality, but requires motion vectors. It is possible to estimate the motion vectors offline, and use these motion vectors to de-interlace the video during playback of the video.
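  • The description names the 3DRS algorithm for the offline motion estimation; purely to illustrate what "estimating motion vectors offline" involves, the following sketch uses a much simpler exhaustive block-matching search over 8x8 luminance blocks (the block size mentioned later in this description). It is an assumed stand-in, not the 3DRS algorithm itself.

```python
import numpy as np

def estimate_motion_vectors(prev, curr, block=8, search=4):
    """Exhaustive block-matching motion estimation (a simple stand-in for 3DRS).

    prev, curr: 2-D numpy arrays of luminance values with dimensions that are
    multiples of the block size. Returns an array of shape (H//block, W//block, 2)
    holding one (vy, vx) vector per block.
    """
    h, w = curr.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=np.int8)
    for by in range(h // block):
        for bx in range(w // block):
            y0, x0 = by * block, bx * block
            target = curr[y0:y0 + block, x0:x0 + block].astype(np.int32)
            best, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = y0 + dy, x0 + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    candidate = prev[y:y + block, x:x + block].astype(np.int32)
                    sad = np.abs(target - candidate).sum()  # sum of absolute differences
                    if best is None or sad < best:
                        best, best_v = sad, (dy, dx)
            vectors[by, bx] = best_v
    return vectors
```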
  • De-interlacing is the process of converting interlaced video images into non-interlaced form. Interlaced video draws only half of the lines on the screen for each frame (alternately drawing the odd and the even lines), taking advantage of the time it takes for an image to fade on a CRT ("cathode ray tube") to give the impression of double the actual refresh rate and thus help prevent flicker.
  • Basic methods of de-interlacing include "combination", where the even and odd frames are combined into one image and then displayed, and "extension", wherein each frame (with only half the lines) is extended to the entire screen. Another application of calculated motion vectors is so-called up-conversion.
  • Using up-conversion, a video or audio playback device can be manufactured to output enhanced or high-definition signals from standard-definition signals.
  • Such devices may include an integrated scaler to up-convert the standard-definition video to high-definition video. This up-conversion process may improve the perceived picture quality of standard-definition video.
  • Preferred applications of the invention are personal video recorders (PVRs), especially with progressive output, which requires de-interlacing.
  • Newly developed algorithms that have not yet been implemented in hardware can be deployed (and therefore tested in the market) earlier and cheaper in an offline scenario as outlined in this description.
  • Another preferred application of the invention is television sets with a built-in hard disk.
  • A further application is a PC-TV application. While US 5,287,420 merely discloses compression of data, so that the compressed image data use less memory than the primary data, the invention goes against this teaching by describing a system that stores more data than initially recorded. According to the invention, the initially recorded data are stored together with additional data that are generated offline. Thus, the invention improves the reproduction quality at the cost of slightly increased storage space.
  • US 5,287,420 improves the compression ratio at the cost of the picture quality.
  • The recorded signal is treated as a "compressed" signal from which the invention intends to "decompress" a high-resolution signal using video or audio processing techniques.
  • By carrying out part of this "decompression" outside real time (i.e. not during playback), somewhat more storage space is used than the original recording of a standard-definition interlaced signal requires.
  • the invention provides a system that stores a video data stream and then generates and stores some more data.
  • the invention introduces ahead-of-time non-real-time calculation and storage of intermediate video processing data.
  • An important idea is that a part of the data that is needed for high quality playback is calculated (and temporarily stored) ahead of time, rather than at the time of playback.
  • the invention improves the playback quality, and not the compression properties.
  • the invention involves optional video processing that is done during system idle time, not in real-time.
  • the invention does a part of the video processing already during recording or directly after recording, and performs the final processing step during playback of the video.
  • the system of the invention may be applied to process video data or audio data.
  • The invention may be applied to any kind of data which may be processed before being replayed and whose quality can be improved by such processing.
  • video data may be processed by calculating motion vectors.
  • audio data may be processed by calculating reverberation to be added to the primary audio data to improve the subjective quality of audio replay as perceived by a human listener.
  • The system of the invention may comprise a recording unit which may be adapted to record data in the recording format and to provide the recorded data to the first processing unit.
  • The system may further comprise a replay unit adapted to replay data in the replay format and to be provided, by the second processing unit, with the data to be replayed.
  • a replay unit may comprise a personal computer, an LCD, an audio player, or the like.
  • a storage unit may be provided which may be adapted to store recorded data in the recording format and to store partially processed intermediate data in the partially processed intermediate format.
  • Such a storage unit may be, for instance, a hard disk, a RAM memory, a flash memory or an optical storage medium like a DVD.
  • Data in the recording format may have a first quality level, and data in the replay format may have a second quality level, the second quality level indicating a higher quality than the first quality level.
  • the quality of replay data may be improved.
  • incoming video data may have a replay rate of 24 Hz.
  • modern LCD panels are able to replay visual data with a frequency of 60 Hz or even with 75 Hz.
  • motion vectors may be calculated.
  • temporal up-conversion may be performed to generate intermediate pictures.
  • incoming video may have a frequency of 60 Hz and be interlaced.
  • motion vector information can be used to perform motion compensated de-interlacing for displaying the video on a progressive scan display (such as an LCD).
  • Temporal up-conversion and de-interlacing may be performed alternatively or additionally.
  • The system of the invention may comprise a determining unit adapted to determine, based on a user-defined time interval between recording and replaying data, on the currently available system resources, or on the system resources expected in the future, whether the first processing unit is controlled to convert recorded data from the recording format to the intermediate format before replaying the data in the replay format, or whether the first processing unit is controlled to convert the recorded data from the recording format directly to the replay format.
  • the system can be flexibly controlled according to results from a check whether the time between recording and playing back is sufficient to carry out the pre-processing of the invention. If yes, pre-processing is carried out and the quality of the reproduced data may be increased compared to the recorded data.
  • the system flexibly decides whether a video signal quality improvement is possible or not.
  • The decision whether intermediate data should be generated may also be taken on the basis of whether sufficient system resources (e.g. CPU capacity, memory space) are presently available or are expected to become available in the near future.
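  • A determining unit of this kind could be realised as a simple rule. The sketch below is a hypothetical example; the thresholds and the estimated inputs are invented for illustration and are not prescribed by the patent.

```python
def should_preprocess(seconds_until_playback, cpu_idle_fraction, free_storage_bytes,
                      estimated_processing_seconds, estimated_extra_bytes):
    """Decide whether to generate the intermediate format before playback.

    The patent only requires that the decision consider the time interval and the
    available (or expected) resources; all concrete thresholds here are assumptions.
    """
    enough_time = seconds_until_playback >= estimated_processing_seconds
    enough_cpu = cpu_idle_fraction >= 0.5            # assumed threshold
    enough_space = free_storage_bytes >= estimated_extra_bytes
    return enough_time and enough_cpu and enough_space

# Example: two hours until playback, half-idle CPU, 10 GB free, 1 GB of vectors expected.
print(should_preprocess(7200, 0.5, 10e9, 3600, 1e9))   # True
```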
  • a determining unit may be implemented which is adapted to determine, at a time of replaying data, whether data in the intermediate format have already been generated by the first processing unit and are thus available so that the second processing unit is controlled to convert data from the intermediate format to the replay format, or whether data in the intermediate format are not available so that the first processing unit is controlled to convert recorded data from the recording format directly to the replay format.
  • the first processing unit starts to calculate intermediate data in any case during and directly after recording.
  • The determining unit checks whether the first processing unit has already finished the calculation of intermediate data, i.e. whether intermediate data (such as motion vectors) are already available.
  • If yes, the determining unit controls the second processing unit to generate replay data taking the previously estimated intermediate data into account. If no, the determining unit controls the first processing unit to generate replay data directly from the recorded data. The decision is thus based on whether intermediate data are available or not.
  • When the first processing unit is controlled to convert recorded data from the recording format directly to the replay format, data in the replay format may be replayed with any mismatch between the frame rates of the recording format and the replay format overcome by frame repetition. For example, if a 24 Hz input signal shall be reproduced on a 60 Hz display device, particular frames are simply repeated several times.
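  • For the 24 Hz to 60 Hz case, such frame repetition is classically done with a 3:2 cadence. The following sketch is an illustration of such a repetition schedule; the patent itself only speaks of repeating frames several times.

```python
def repeat_frames_24_to_60(frames):
    """Map a 24 Hz frame sequence onto 60 Hz output by 3:2 frame repetition.

    Alternating source frames are shown three times and two times, so that
    24 input frames produce 24 * 2.5 = 60 output frames per second.
    """
    output = []
    for index, frame in enumerate(frames):
        repeats = 3 if index % 2 == 0 else 2
        output.extend([frame] * repeats)
    return output

# Example: 24 source frames become 60 display frames.
assert len(repeat_frames_24_to_60(list(range(24)))) == 60
```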
  • Alternatively, when the first processing unit is controlled to convert recorded data directly to the replay format, the data may be replayed using a line repetition method, wherein an interlaced recording format is converted to a progressive replay format using line doubling.
  • A line repetition method simply displays each captured line twice to convert interlaced video to a progressive scan format.
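  • As a minimal sketch, assuming 8-bit luminance fields stored as numpy arrays, line doubling can be expressed as follows.

```python
import numpy as np

def line_double(field):
    """Convert one interlaced field (with half the lines) to a progressive frame
    by repeating every captured line once (line doubling, also known as "bob")."""
    return np.repeat(field, 2, axis=0)

# A 240-line field becomes a 480-line progressive frame.
field = np.zeros((240, 704), dtype=np.uint8)
assert line_double(field).shape == (480, 704)
```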
  • Converting data to a partially processed intermediate format may include calculating motion vector data.
  • the calculation of motion vector data is in many cases computationally expensive and thus needs a considerable time to be carried out.
  • the system of the invention may use an idle time, i.e. a time in which resources of the system are free, to calculate the motion vector data.
  • Motion vector data can be stored with a very small amount of memory space, yet allow the quality of the displayed data to be improved significantly. If motion vector data have been stored beforehand, it is possible to replay the video data, based on the recorded data and the additionally calculated motion vectors, in a significantly improved manner without having to store huge amounts of additional data.
  • Converting data to a fully processed replay format may include a temporal up-conversion.
  • Such a temporal up-conversion may include using previously estimated motion vector data to improve the picture quality.
  • converting data to a fully processed replay format may include a motion-compensated de-interlacing.
  • a de-interlacing may be based on a previously performed motion vector analysis and allows a real-time playback of video data in a significantly improved quality.
  • Converting data to a partially processed intermediate format may include a color analysis, in particular the creation and analysis of a color histogram.
  • Converting data to a fully processed replay format may include an enhancement using a modification of a color histogram, using previously created and analyzed color histogram information.
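  • As an illustration of how a previously analysed histogram can be used during playback, the sketch below stretches the luminance range of a frame to the percentiles of a pre-computed 256-bin scene histogram. The particular stretch and its parameters are assumed for the example and are not the enhancement prescribed by the patent.

```python
import numpy as np

def stretch_with_histogram(luma, scene_histogram, low_clip=0.01, high_clip=0.99):
    """Contrast-stretch a luminance image using a pre-computed 256-bin histogram.

    The histogram comes from the offline analysis; it is used here to locate the
    1st and 99th percentiles of the scene and map them to the full 0..255 range.
    """
    cdf = np.cumsum(scene_histogram).astype(np.float64)
    cdf /= cdf[-1]
    lo = int(np.searchsorted(cdf, low_clip))
    hi = int(np.searchsorted(cdf, high_clip))
    hi = max(hi, lo + 1)                      # avoid division by zero
    stretched = (luma.astype(np.float64) - lo) * 255.0 / (hi - lo)
    return np.clip(stretched, 0, 255).astype(np.uint8)
```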
  • Data in the recording format may be data in a compressed data format.
  • a preferred compression format is the MPEG-2 format used for encoding audio and video data.
  • alternative compression schemes may be applied, for example MPEG-4.
  • the system of the invention may be realized as an integrated circuit, particularly as a semiconductor integrated circuit.
  • the system can be realized as a monolithic IC which may be fabricated in silicon technology.
  • the system of the invention may be realized as a personal video recorder (PVR) or as a personal computer television system (PC-TV), or as a portable audio player or as a DVD player or as an MP3 player.
  • The first processing unit and the second processing unit may be combined into one common processing unit.
  • In other words, both processing units may be combined into or integrated in a single common processing unit, e.g. one CPU (central processing unit).
  • Fig. 1 shows a system of processing data according to a preferred embodiment of the invention.
  • Fig. 2 to Fig. 12 show flow charts according to methods of processing data according to preferred embodiments of the invention.
  • a video processing system 100 according to a preferred embodiment of the invention will be described.
  • the video processing system 100 comprises a recording unit 102 for recording video data.
  • An output of the recording unit 102 is coupled to an input of a determining unit 105 for determining if a provided time interval between recording video data and reproducing video data is sufficient for pre-processing data before replaying data.
  • the determining unit 105 is adapted to determine, based on a user-defined time interval between recording data and replaying data, whether a first processing unit 101a is controlled to convert the recorded data from the recording format (denoted as A) to an intermediate format (denoted as B) before replaying data in the replay format (denoted as C), or whether the first processing unit 101a (or alternatively the second processing unit 101b) is controlled to convert the recorded data from the recording format directly to the replay format.
  • If the time interval is not sufficient, the determining unit 105 controls the first processing unit 101a to directly convert the recorded data into replayable data and to forward the replayable data to a replay unit 103.
  • If the time interval is sufficient, the determining unit 105 provides the first processing unit 101a with the recorded data for pre-processing.
  • the first processing unit 101a pre-processes the recorded data and may use a storage unit 104 for storing and accessing data.
  • Fig. 1 shows the video processing system 100 with the first processing unit 101a being adapted to convert data from a recording format, in which the data are recorded by the recording unit 102, to a partially processed intermediate format before the data are replayed by the replay unit 103.
  • the data are converted, by the first processing unit 101a, in such a manner that the data in the partially processed intermediate format contain additional data compared to the recorded data, namely the original data plus data appropriate to improve the playback quality.
  • the second processing unit 101b converts data from the intermediate format to a fully processed replay format in which the data are replayed by the replay unit 103.
  • the latter processing step is carried out during replaying the data, i.e. in a real-time manner.
  • the storage unit 104 is adapted to store recorded data in the recording format and to store partially processed intermediate data in the partially processed intermediate format.
  • the data recorded by the recording unit 102 in the recording format have a first quality level (namely data with a framerate of 24 Hz), and the data in the replay format to be replayed by the replay unit 103 have a second quality level (namely data with a framerate of 60 Hz), so that the second quality level indicates a higher quality than the first quality level.
  • the components of the video processing system 100 are realized as an integrated circuit in silicon technology.
  • the video processing system 100 is a personal video recorder.
  • the determining unit 105 may be adapted to determine, at a time when a replay of the data is requested, whether data in the intermediate format (i.e. motion vectors) have already been generated by the first processing unit 101a and are thus available. If such a calculation is already finished at the time at which data shall be replayed, the second processing unit 101b is controlled to convert data from the intermediate format to the replay format. In contrast to this, if such a calculation is not yet finished at the time at which data shall be replayed so that data in the intermediate format are not available, the first processing unit 101a is controlled to convert recorded data from the recording format directly to the replay format.
  • the first processing unit 101a starts to calculate intermediate data in any case during and directly after recording.
  • The determining unit 105 checks whether the first processing unit 101a has already finished the calculation of intermediate data, i.e. whether intermediate data (like motion vectors) are already available. If yes, the determining unit 105 controls the second processing unit 101b to generate replay data taking the previously estimated intermediate data into account. If no, the determining unit 105 controls the first processing unit 101a (or alternatively the second processing unit 101b) to generate replay data directly from the recorded data. The decision is thus based on whether intermediate data are available or not.
  • the processing unit 101 is adapted to carry out the corresponding method steps.
  • Referring to Fig. 2, Fig. 3 and Fig. 4, a first embodiment related to motion-compensated up-conversion will be described.
  • Fig. 2 shows a flow chart 200 which illustrates receiving and storing data.
  • a TV signal is received and is converted into video data by receiving means 201.
  • These video data are provided to a compression means 202 to generate compressed video data which are then stored in a data storage means 203.
  • the data storage means 203 may be a hard disk, a RAM memory, a flash memory or an optical storage medium like a DVD.
  • Fig. 2 shows receiving and storing a TV signal as compressed data (usually an MPEG-2 or MPEG-4 variant, or another format like DV, Digital Video).
  • An alternative embodiment is to bypass this compression step, and use a DVD or other digital carrier signal directly as the data storage means 203.
  • the flow chart 300 illustrates an offline processing scheme, i.e. a method of generating motion vectors, which method can be carried out in the first processing unit 101a.
  • a decompression unit 301 is provided with compressed video data stored in the data storage means 203.
  • the decompression means 301 partially decompress the compressed video data to generate video data.
  • the decompression means 301 may be adapted, for instance, to only partially, not fully, decompress data. For instance, in case that the video data are in the MPEG format, the decompression unit 301 may not consider color data, but only luminance data for the sake of decompressing.
  • a motion vector estimation unit 302 estimates motion vector data from the video data.
  • a compression means 303 generates compressed motion vector data from the motion vector data using a lossless compression algorithm. In an additional data storage means 304, the compressed motion vector data are stored.
  • Fig. 3 illustrates an embodiment for an offline processing job.
  • the motion vector estimation can be a numerically expensive process.
  • the estimation process can be started as soon as the data storage means 203 is (partially) filled and the system has spare computing resources.
  • The resulting data, typically one (vx, vy) data pair for each 8x8 pixel block in each video frame, can be stored in the additional data storage means 304.
  • an additional lossless compression step is carried out in the lossless compression unit 303.
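  • A rough, assumption-based estimate shows why the intermediate data stay small compared to storing fully processed video: with one vector pair per 8x8 block of a 720x480, 30 Hz recording, the raw vector data amount to roughly 1.2 GB per hour before the lossless compression of unit 303, whereas storing fully processed 60 Hz progressive frames instead would take on the order of 112 GB per hour. The numbers below follow from these assumed parameters only.

```python
# Rough storage estimate for the intermediate motion vector data (assumed numbers).
width, height = 720, 480              # standard-definition frame size, assumption
block = 8                             # one (vx, vy) pair per 8x8 block
bytes_per_vector_pair = 2             # one signed byte per component, before lossless compression
fps = 30

blocks_per_frame = (width // block) * (height // block)            # 90 * 60 = 5400
vector_bytes_per_hour = blocks_per_frame * bytes_per_vector_pair * fps * 3600

# For comparison: storing fully processed 60 Hz progressive video instead
# (uncompressed YUV 4:2:0, 1.5 bytes per pixel) would need orders of magnitude more space.
processed_bytes_per_hour = int(width * height * 1.5 * 60 * 3600)

print(f"motion vectors : {vector_bytes_per_hour / 1e9:.2f} GB per hour (before compression)")
print(f"processed video: {processed_bytes_per_hour / 1e9:.0f} GB per hour (uncompressed)")
```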
  • Fig. 4 shows a flow chart 400 which illustrates the playback of film material (provided with a frame rate of 24 Hz) on a 60 Hz display, i.e. a replay with a frame rate of 60 Hz.
  • the compressed video data stored in the data storage means 203 are provided to a decompression means 401a which may be an MPEG-2 decoder.
  • the decompression means 401a generate 24 Hz video data from the compressed video data and provide these 24 Hz video data to a decision means 402.
  • In the decision means 402 it is decided whether motion vector data are available, i.e. whether they have been previously calculated in the motion vector estimation unit 302.
  • If no motion vector data are available, a frame repetition means 403 generates 60 Hz video data from the 24 Hz video data using a simple repetition of frames.
  • the frame repetition means 403 generates video data with a frequency of 60 Hz from video data having a frequency of 24 Hz by simply repeating different frames a plurality of times.
  • The video data with the frequency of 60 Hz are provided to a 60 Hz display device 405, for instance a liquid crystal display (LCD), for displaying the processed video data at a rate of 60 Hz.
  • If motion vector data are available, the video data with a frequency of 24 Hz are provided to a temporal up-conversion means 404 to generate video data with a frequency of 60 Hz whose quality is improved with respect to the video data having a frequency of 24 Hz.
  • These pre-processed video data with a frequency of 60 Hz are provided to the display device 405 to be displayed.
  • Decompression means 401b (which may be related to a Huffman-like lossless compression) are provided with compressed motion vector data which are delivered from the additional data storage unit 304. These motion vector data are provided to the temporal up-conversion unit 404, which uses them to provide the video data with a frequency of 60 Hz having an improved quality compared to the video data having a frequency of 24 Hz.
  • Fig. 4 shows an example of how the computed motion vectors can be used in a film rate up-conversion.
  • In this example a 24 Hz film sequence (usually detected from an analysis of the 60 Hz interlaced TV signal by the "film detector") shall be displayed on the 60 Hz display 405.
  • With these motion vectors, the video can be up-converted in a simple way to 60 Hz during playback. If such motion vectors are not available (for instance because the time interval between recording the data and replaying the data was too small for calculating them), frame repetition is carried out.
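  • A simplified sketch of such a motion-compensated temporal up-conversion is shown below: an intermediate frame at temporal position alpha between two originals is assembled by fetching each 8x8 block of the previous frame displaced along a fraction of its motion vector (using the vector convention of the block-matching sketch above). Real up-converters additionally blend with the next frame and handle occlusions, which is omitted here; this is an illustration, not the patented method.

```python
import numpy as np

def interpolate_frame(prev, vectors, alpha, block=8):
    """Build an intermediate frame at temporal position alpha (0..1) between two
    originals by shifting each 8x8 block along a fraction of its motion vector.

    prev: 2-D luminance array whose dimensions are multiples of the block size.
    vectors: (H//block, W//block, 2) array of (dy, dx) vectors, as produced by the
    block-matching sketch above. Simplified: no blending, no occlusion handling.
    """
    h, w = prev.shape
    out = np.empty_like(prev)
    for by in range(h // block):
        for bx in range(w // block):
            dy, dx = vectors[by, bx]
            sy = int(round(by * block + alpha * dy))
            sx = int(round(bx * block + alpha * dx))
            sy = min(max(sy, 0), h - block)          # clamp to the frame
            sx = min(max(sx, 0), w - block)
            out[by * block:(by + 1) * block, bx * block:(bx + 1) * block] = \
                prev[sy:sy + block, sx:sx + block]
    return out
```

  • For a 24 Hz to 60 Hz conversion, intermediate frames would be generated at fractional temporal positions (alpha values between 0 and 1) in between successive film frames.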
  • the compressed video data which had been generated previously are provided to a decompression means 401a to generate 480i video data (as in NTSC (National Television System Committee) TV signals; similar examples exist for PAL and HDTV interlaced modes).
  • The decision means 402 decides whether motion vector data are available, i.e. whether motion vectors have been generated by the motion vector estimation means 302. If no, a simple line repetition is carried out by a line repetition means 501, converting the 480i video data to 480p video data for display on a progressive display 503. If motion vector data are available, the 480i video data are provided to a motion compensated de-interlacing means 502 for generating 480p video data using the previously estimated motion vector data. These 480p video data are then displayed on the progressive display device 503.
  • Fig. 5 shows a second example of using pre-calculated motion vectors to improve the quality of displaying video data.
  • In this example a 480i 60 Hz (NTSC, National Television System Committee) signal shall be de-interlaced for display on the 480p (progressive) display 503.
  • A motion compensated de-interlacing algorithm, for example "majority select", is used. If no motion vectors are available, the system falls back on a very basic de-interlacing algorithm that just repeats each line once.
  • Fig. 6 shows another embodiment of offline processing, introducing de-interlacing parameters, i.e. another way of de-interlacing which is still cheaper than motion-compensated de-interlacing, namely motion adaptive de-interlacing.
  • A flow chart 700 shown in Fig. 7 illustrates details of the functionality of the de-interlacing analysis means 601. Image pixels are provided to a moving decision means 701 for deciding whether a particular image area is (locally) moving.
  • If yes, a parameter "use_previous" is set to a value of "0" in a set means 702, and a determination means 703 determines an optimum interpolation direction L. If the moving decision means 701 has decided that the image is not locally moving, the parameter "use_previous" is set, by a set means 704, to a value of "1", and L is set to "0".
  • A storing means 705 is connected with the outputs of the units 703, 704 and stores the values of "use_previous" and "L".
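  • The following sketch illustrates the kind of per-pixel parameters the Fig. 7 analysis produces: a use_previous flag for static areas (reuse the line from the previous field) and an interpolation direction L for moving areas. The motion test and the edge-direction search shown here are simplified placeholders, not the patented analysis.

```python
import numpy as np

def deinterlace_params(prev_field, curr_field, motion_threshold=8, directions=(-1, 0, 1)):
    """Compute per-pixel de-interlacing parameters in the spirit of the Fig. 7 analysis:
    use_previous (1 = take the line from the previous field, 0 = interpolate) and an
    interpolation direction L. Simplified illustration with assumed thresholds."""
    diff = np.abs(curr_field.astype(np.int16) - prev_field.astype(np.int16))
    use_previous = (diff < motion_threshold).astype(np.uint8)
    # For moving pixels, pick the direction L with the smallest difference between
    # the line above and the (shifted) line below, i.e. an edge-oriented choice.
    L = np.zeros_like(use_previous, dtype=np.int8)
    h, w = curr_field.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if use_previous[y, x]:
                continue
            costs = [abs(int(curr_field[y - 1, x - d]) - int(curr_field[y + 1, x + d]))
                     for d in directions]
            L[y, x] = directions[int(np.argmin(costs))]
    return use_previous, L
```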
  • Fig. 8 shows a flow chart 800 illustrating a playback diagram belonging to the procedure described in Fig. 6.
  • A decompression means 401a is provided with data stored in the data storage means 203. After decompressing these data, the thus generated 480i video data are provided to a decision means 801 for deciding if de-interlacing parameters (as determined according to the procedure shown in Fig. 7) are available. If not, the 480i video data are provided to a line repetition means 501 to generate 480p video data to be displayed on a progressive display device 503. If the decision means 801 decides that de-interlacing parameters are available, the 480i video data are converted to 480p video data using motion adaptive de-interlacing as performed by a motion adaptive de-interlacing means 802. The motion adaptive de-interlacing means 802 is supplied with de-interlacing parameters from a decompression unit 401b which receives data stored in the additional data storage means 304, see "c".
  • the flow chart 900 shown in Fig. 9 is similar to the flow chart 300 shown in Fig. 3 and the flow chart 600 shown in Fig. 6 and illustrates how the compressed video data stored in the data storage means 203 (see Fig. 2) are processed to generate color histogram data.
  • The compressed video data are partially decompressed in a decompression unit 301, and the decompressed data are provided as video data to a color histogram creation and analysis means 901.
  • the color histogram creation and analysis means 901 is adapted to generate color histogram data by applying a corresponding algorithm.
  • the results of this calculation step are provided as color histogram data to a lossless compression unit 303 which generates compressed color histogram data which are stored in the additional data storage means 304.
  • Fig. 9 shows an embodiment which implements image enhancement using histogram modification.
  • a well-known technique for increasing color contrast where not all available colors are used at the same time is implemented. This method is also used in TV systems, but there it suffers from the fact that the algorithm needs to "look into the future". Color use must be known well in advance to prevent temporal inconsistencies.
  • Fig. 10 shows a flow chart 1000 illustrating details of the method for determining and storing the color usage per scene, i.e. a histogram analysis. Firstly, image color pixels are provided to a measuring means 1001 for measuring color histograms. In a decision means 1002, it is detected whether a scene change is present. If yes, a storing means 1003 stores the frame number as a scene boundary and allocates a new, empty scene-average histogram. The result of the storing means 1003 is provided to an adding means 1004 for adding the histogram data to the scene average. If no scene change is detected by the decision means 1002, the histogram data are provided directly to the adding means 1004.
  • Fig. 11 shows a flow chart 1100 illustrating, for different frame references, corresponding histogram blocks.
  • FIG. 11 gives an overview of storage of color data. Per scene, one histogram is stored. The frame numbers that signify boundaries between scenes are also stored, so the whole provides a mapping of frame number to average histogram data.
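  • The mapping of frame numbers to scene-average histograms described for Fig. 11 can be captured in a small data structure. The sketch below is an illustrative layout only; the class name and the 256-bin histograms are assumptions, not taken from the patent.

```python
import bisect

class SceneHistogramStore:
    """Store one average colour histogram per scene, keyed by scene boundaries
    (frame numbers), as outlined for Fig. 11."""

    def __init__(self):
        self.boundaries = []      # first frame number of each scene, ascending
        self.histograms = []      # one histogram (e.g. list of 256 counts) per scene

    def add_scene(self, first_frame, average_histogram):
        self.boundaries.append(first_frame)
        self.histograms.append(average_histogram)

    def lookup(self, frame_number):
        """Return the average histogram of the scene containing frame_number."""
        index = bisect.bisect_right(self.boundaries, frame_number) - 1
        return self.histograms[max(index, 0)]

# Usage: the playback path (Fig. 12) asks for the histogram of the current scene.
store = SceneHistogramStore()
store.add_scene(0, [0] * 256)
store.add_scene(300, [1] * 256)
assert store.lookup(450) == [1] * 256
```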
  • Referring to Fig. 12, a flow chart 1200 will be described illustrating playback with color and brightness enhancement using histogram modification. Similarly to Fig. 3, Fig. 4 and Fig. 8, Fig. 12 shows that a decompression unit 401a is provided with data stored in the data storage means 203 shown in Fig. 2. Further, data stored in the additional data storage means 304, shown in Fig. 9, are provided to the decompression unit 401b, as shown in Fig. 12.
  • Data provided by the data storage means 203 are decompressed in the decompression unit 401a and are provided to a decision means 1201 checking whether histogram data are available (i.e. whether histogram data have been produced by the color histogram creation and analysis means 901). If no, the video data are displayed directly on a display device 1202 (a liquid crystal display, LCD). If yes, the video data are treated in an enhancement means 1204 using histogram modification, which is supplied with current scene histogram data from a look-up means 1203 that looks up the histogram for the current scene. The look-up means 1203 obtains the color histogram data from the decompression unit 401b.
  • The enhancement means 1204 enhances the provided video data and displays the video data on the display device 1202. Thus, color and brightness enhancement using histogram modification is carried out.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)

Abstract

The invention relates to a system (100) for processing data. The system comprises a first processing unit (101a) adapted to convert data, before replay, from a recording format in which data are recordable to a partially processed intermediate format, in such a manner that the data in the partially processed intermediate format contain additional data compared to the data in the recording format, and a second processing unit (101b) adapted to convert data from the intermediate format to a fully processed replay format, in which data are replayable, during replay of the data.
PCT/IB2005/053490 2004-10-29 2005-10-25 Systeme et procede pour traiter des donnees, element de programme et support lisible par ordinateur WO2006046203A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007538577A JP2008519481A (ja) 2004-10-29 2005-10-25 データ処理システム及び方法、プログラム要素並びにコンピュータ可読媒体

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP04105410.7 2004-10-29
EP04105410 2004-10-29

Publications (1)

Publication Number Publication Date
WO2006046203A1 true WO2006046203A1 (fr) 2006-05-04

Family

ID=35789104

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2005/053490 WO2006046203A1 (fr) 2004-10-29 2005-10-25 Systeme et procede pour traiter des donnees, element de programme et support lisible par ordinateur

Country Status (3)

Country Link
JP (1) JP2008519481A (fr)
CN (1) CN101053261A (fr)
WO (1) WO2006046203A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169068B (zh) * 2017-05-06 2021-05-04 广东伟邦科技股份有限公司 轿厢媒体机换图方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5287420A (en) * 1992-04-08 1994-02-15 Supermac Technology Method for image compression on a personal computer
EP0735748A2 (fr) * 1995-03-27 1996-10-02 AT&T Corp. Procédé et dispositif pour convertir une séquence d'images vidéo entrelacées en une séquence d'images à balayage progressif
WO2001013625A1 (fr) * 1999-08-17 2001-02-22 General Instrument Corporation Transcodage pour applications de memorisation de decodeur de consommateurs
US20040252232A1 (en) * 2001-11-01 2004-12-16 Rogier Lodder Edge oriented interpolation of video data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAGHIRI M ET AL: "A LOW BIT RATE CODING ALGORITHM FOR FULL MOTION VIDEO SIGNAL", SIGNAL PROCESSING. IMAGE COMMUNICATION, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 2, no. 2, 1 August 1990 (1990-08-01), pages 187 - 199, XP000243477, ISSN: 0923-5965 *

Also Published As

Publication number Publication date
JP2008519481A (ja) 2008-06-05
CN101053261A (zh) 2007-10-10

Similar Documents

Publication Publication Date Title
US20200252582A1 (en) Conversion method and conversion apparatus
JP5075195B2 (ja) 映像送信装置、映像受信装置、映像記録装置、映像再生装置及び映像表示装置
US5754248A (en) Universal video disc record and playback employing motion signals for high quality playback of non-film sources
US7242436B2 (en) Selection methodology of de-interlacing algorithm of dynamic image
JP4686594B2 (ja) 画像処理装置及び画像処理方法
US6323914B1 (en) Compressed video recording device with integrated effects processing
US5557298A (en) Method for specifying a video window's boundary coordinates to partition a video signal and compress its components
JPH06153069A (ja) 画像の変換装置、複製装置、再生装置、および表示装置
JP2002290876A (ja) 動画シーケンスの表示方法
US7593463B2 (en) Video signal coding method and video signal encoder
US8189102B2 (en) Image processing apparatus using interpolation direction
US8224120B2 (en) Image signal processing apparatus and image signal processing method
WO2006046203A1 (fr) Systeme et procede pour traiter des donnees, element de programme et support lisible par ordinateur
JP2005026885A (ja) テレビジョン受信装置及びその制御方法
JP4184223B2 (ja) トランスコーダ
JP3312456B2 (ja) 映像信号処理装置
US6680747B1 (en) Image processing apparatus and image processing method
JP2002199405A (ja) 動画像再生装置、及び動画像再生方法
JP3811668B2 (ja) 映像撮像装置および映像変換装置
JPH11215403A (ja) 映像信号処理装置
US20080279527A1 (en) Method of high speed video playback and video playback apparatus using the same
JP2000188718A (ja) 映像信号再生装置
JP2000333131A (ja) 画像処理装置、および画像処理方法
KR20030067158A (ko) 부분 인터레이스 동영상 재생 방법 및 그 장치
WO2007043227A1 (fr) Caméra, vidéo-enregistreur et système de caméra

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV LY MD MG MK MN MW MX MZ NA NG NO NZ OM PG PH PL PT RO RU SC SD SG SK SL SM SY TJ TM TN TR TT TZ UG US UZ VC VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IS IT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005796287

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2007538577

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 200580037464.9

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 1842/CHENP/2007

Country of ref document: IN

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: 2005796287

Country of ref document: EP