US8174544B2 - Image display apparatus, image displaying method, plasma display panel apparatus, program, integrated circuit, and recording medium - Google Patents


Info

Publication number
US8174544B2
Authority
US
United States
Prior art keywords
motion
signal
region
image
motion information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US12/301,054
Other languages
English (en)
Other versions
US20090184894A1 (en)
Inventor
Daisuke Sato
Yusuke Monobe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Corp
Original Assignee
Panasonic Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corp filed Critical Panasonic Corp
Assigned to PANASONIC CORPORATION reassignment PANASONIC CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MONOBE, YUSUKE, SATO, DAISUKE
Publication of US20090184894A1 publication Critical patent/US20090184894A1/en
Application granted granted Critical
Publication of US8174544B2 publication Critical patent/US8174544B2/en

Classifications

    • G09G3/288: Control of luminous gas-discharge (plasma) display panels using AC panels
    • G09G3/2803: Display of gradations
    • G09G3/20: Matrix-type display control arrangements for visual indicators other than cathode-ray tubes
    • G09G5/02: Control arrangements characterised by the way in which colour is displayed
    • G09G5/36: Control arrangements characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G2320/0242: Compensation of deficiencies in the appearance of colours
    • G09G2320/0257: Reduction of after-image effects
    • G09G2320/0261: Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • G09G2320/103: Detection of image changes, e.g. determination of an index representative of the image change
    • G09G2320/106: Determination of movement vectors or equivalent parameters within the image
    • G09G2360/16: Calculation or use of calculated indices related to luminance levels in display data

Definitions

  • the present invention relates to an image display apparatus that displays an image using phosphors each having a persistence time and to an image displaying method of the same.
  • Image display apparatuses such as plasma display panels (hereinafter referred to as PDPs) use phosphors of three colors (red, green, and blue), each having a different persistence time. While blue phosphors have a very short persistence time of several microseconds, red and green phosphors have a long persistence time of several tens of milliseconds before the emitted light decays to 10% or less of its initial amount.
  • Accordingly, a blur of a motion in an image (hereinafter referred to as motion blur) occurs due to the persistence of the phosphors and the movement of the line of sight.
  • In addition, a shift in color due to the motion blur occurs (hereinafter referred to as color shift).
  • A human perceives light entering the eyes by integrating the amount of light incident on the retina, and senses brightness and color based on this integrated value through the sense of sight (hereinafter referred to as integration on the retina).
  • The PDP exploits this integration on the retina to generate tones by changing the light-emission time without changing the intensity of the emitted light.
  • FIG. 1 explanatorily shows integration on the retina for each color when an image signal of a white dot on a pixel is stationary.
  • FIG. 1 shows that no motion blur occurs when there is no change in the time distribution of light emitted from the PDP, in the integration on the retina, or in the line of sight.
  • Light emitted during one field of the PDP is basically composed of signal components of, for example, 10 to 12 sub-fields each having a different gray-level weight, and of persistence components that continue into the subsequent fields.
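  • As a simplified illustration of this sub-field tone coding (a hedged sketch using hypothetical binary sub-field weights, not values taken from the patent), the perceived tone corresponds to the time integral of the light from the lit sub-fields:

```python
# Illustrative sketch (not code from the patent): how a PDP-like display can express
# a gray level as a set of weighted sub-fields, so that the time-integrated light
# ("integration on the retina") reproduces the tone. The binary weights below are an
# assumption used only for this example.

SUBFIELD_WEIGHTS = [1, 2, 4, 8, 16, 32, 64, 128]  # 8 hypothetical sub-field weights

def subfields_for_level(level: int) -> list:
    """Return 1/0 flags telling which sub-fields are lit for a 0-255 gray level."""
    assert 0 <= level <= 255
    return [1 if level & w else 0 for w in SUBFIELD_WEIGHTS]  # works because weights are powers of two

def perceived_brightness(flags: list) -> int:
    """Time integral of the emitted light over one field = sum of the lit sub-field weights."""
    return sum(f * w for f, w in zip(flags, SUBFIELD_WEIGHTS))

if __name__ == "__main__":
    for level in (0, 37, 128, 255):
        flags = subfields_for_level(level)
        print(level, flags, perceived_brightness(flags))  # the integral equals the coded level
```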
  • Since blue phosphors have an extremely short persistence time, the following description assumes that only the blue phosphors include no persistence component.
  • FIG. 1 shows a time distribution of light emission during one field period of one white pixel including stationary red, green, and blue image signals each having 255 as an image value (hereinafter represented as red: 255, green: 255, and blue: 255).
  • For the red phosphor, a red signal component 201 is followed by a red persistence component 204; for the green phosphor, a green signal component 202 is followed by a green persistence component 205; and for the blue phosphor, only a blue signal component 203 emits light.
  • the integration on the retina is performed on the emitted light of red, green, and blue phosphors as shown in (b) of FIG. 1 .
  • the integration on the retina is performed on the red signal component 201 and the red persistence component 204 along a line of sight 206 that is fixed to obtain a red-signal-component integral quantity 207 and a red-persistence-component integral quantity 210 on the retina. Consequently, a human perceives the sum of these integral quantities as a red color through the sense of sight.
  • the integration on the retina is performed on the green signal component 202 and the green persistence component 205 to obtain a green-signal-component integral quantity 208 and a green-persistence-component integral quantity 211 on the retina.
  • Since the obtained integral quantities for red, green, and blue are equal, a human perceives them as white. This is because the blue-signal-component integral quantity 209 is made greater than the red-signal-component integral quantity 207 and the green-signal-component integral quantity 208 by amounts corresponding to the red-persistence-component integral quantity 210 and the green-persistence-component integral quantity 211.
  • In other words, the blue signal component on the PDP has a higher light-emission intensity than the red and green signal components.
  • FIG. 2 explanatorily shows integration on the retina for each color when a line of sight traces a white image signal in a pixel. This integration on the retina will be explained using FIG. 2 .
  • FIG. 2 shows a time distribution of light of 2 field periods when a white dot (red: 255, green: 255, and blue: 255) in a pixel is horizontally displaced to the right in a black background (red: 0, green: 0, and blue: 0) at a predetermined velocity.
  • For the red phosphor, red signal components 301 and 306 are followed by red persistence components 304 and 309; for the green phosphor, green signal components 302 and 307 are followed by green persistence components 305 and 310; and for the blue phosphor, only blue signal components 303 and 308 emit light.
  • the integration on the retina is performed on the red persistence component 304 and the green persistence component 305 respectively in positions of integral quantities 312 and 313 .
  • the integration on the retina is performed on the red signal component 306 and the red persistence component 309 in an identical position to obtain integral quantities 314 and 317 , respectively.
  • the integration on the retina is performed on the green signal component 307 and the green persistence component 310 in an identical position to obtain integral quantities 315 and 318 , respectively.
  • the integration on the retina is performed on the blue signal component 308 to obtain an integral quantity 316 .
  • The integral quantities 312 and 313 cause color shift, and a human perceives them as yellow.
  • Since this color shift occurs only within the very short period of one field, it poses almost no problem.
  • As a whole, a human perceives the image as shown in (d) of FIG. 2.
  • the signal components 320 , 321 , and 322 of each color on the retina are perceived as somewhat blue as shown by the integral quantity 325 .
  • the persistence components 323 and 324 on the retina are perceived as a yellow tailing shown by the integral quantity 326 .
  • color shift occurs in a moving direction when a line of sight traces a moving object.
  • the color shift causes image components to be perceived as somewhat blue and a persistence component to be perceived as yellow.
  • The motion blur and the color shift of individual pixels overlap with each other when there is a plurality of pixels, in other words, in an image including a plurality of pixels.
  • FIG. 3 explanatorily shows integration on the retina for each signal component and each persistence component when a line of sight traces a white rectangle object in a gray background.
  • (a) in FIG. 3 shows a state where the white rectangle object (red: 255, green: 255, and blue: 255) is horizontally displaced to the right at a predetermined velocity in the gray background (red: 128, green: 128, and blue: 128) using an image signal viewed on a PDP.
  • (b) in FIG. 3 shows a time distribution, over one field period, of light emitted from one horizontal line extracted from the image signal shown in (a) of FIG. 3.
  • A signal component 401 emits light, and a persistence component 402 then emits light; this persistence continues into the next field.
  • A line of sight 403 moves to the right with the passage of time, since the line of sight continuously traces the movement of the white rectangle object.
  • the integration on the retina is performed along the line of sight. More specifically, the integration is performed on a component S 1 included in the signal component 401 in a position P 1 to calculate an integral quantity I 1 .
  • integration is performed on: a component S 2 included in the signal component 401 in a position P 2 to calculate an integral quantity I 2 ; a component S 3 included in the signal component 401 in a position P 3 to calculate an integral quantity I 3 ; a component S 4 included in the signal component 401 in a position P 4 to calculate an integral quantity I 4 ; a component S 5 included in the signal component 401 in a position P 5 to calculate an integral quantity I 5 ; a component S 6 included in the signal component 401 in a position P 6 to calculate an integral quantity I 6 ; a component S 7 included in the signal component 401 in a position P 7 to calculate an integral quantity I 7 ; and a component S 8 included in the signal component 401 in a position P 8 to calculate an integral quantity I 8 .
  • an integral quantity 404 of the signal component as shown in (c) of FIG. 3 is obtained from the signal component 401 .
  • integration is performed on: a component S 11 included in the persistence component 402 in the position P 1 to calculate an integral quantity I 11 ; a component S 12 included in the persistence component 402 in the position P 2 to calculate an integral quantity I 12 ; a component S 13 included in the persistence component 402 in the position P 3 to calculate an integral quantity I 13 ; a component S 14 included in the persistence component 402 in the position P 4 to calculate an integral quantity I 14 ; a component S 15 included in the persistence component 402 in the position P 5 to calculate an integral quantity I 15 ; a component S 16 included in the persistence component 402 in the position P 6 to calculate an integral quantity I 16 ; a component S 17 included in the persistence component 402 in the position P 7 to calculate an integral quantity I 17 ; and a component S 18 included in the persistence component 402 in the position P 8 to calculate an integral quantity I 18 .
  • For the motion blur not to be perceived, the integral quantity 404 of the signal components needs to be proportional to the integral quantity 405 of the persistence components at each coordinate position.
  • In practice, however, the integral quantity 405 of the persistence components has an excess or a deficiency relative to that proportion (hereinafter referred to as a motion blur component).
  • More specifically, a persistence excess amount 408 occurs in the vicinity of a region 406 where the value of the red or green image signal decreases from the previous field to the current field (hereinafter referred to as a reduced intensity region), so that the region is perceived as yellow.
  • Conversely, a persistence deficiency amount 409 occurs in the vicinity of a region 407 where the value of the red or green image signal increases from the previous field to the current field (hereinafter referred to as an increased intensity region), so that the region is perceived as blue.
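  • The following short numerical sketch (constructed for this description, not code from the patent) reproduces this behavior for one horizontal line: a bright bar tracked by the line of sight yields a persistence excess near the reduced intensity region and a deficiency near the increased intensity region. The decay constant, bar size, and emission timing are assumptions.

```python
# Minimal model of integration on the retina along a moving line of sight:
# the signal component is emitted early in the field, the persistence component
# decays afterwards, and the eye moves V pixels per field while tracking the bar.
import math

WIDTH, V, STEPS = 24, 4, 100          # pixels per line, pixels/field, time samples per field
TAU = 0.5                             # hypothetical persistence decay constant (in fields)

def field_image(offset):              # white bar (255) on a gray (128) background, 8 pixels wide
    return [255 if offset <= x < offset + 8 else 128 for x in range(WIDTH)]

prev_img, cur_img = field_image(4), field_image(4 + V)

signal_int = [0.0] * WIDTH            # integral of the signal component on the retina
persist_int = [0.0] * WIDTH           # integral of the persistence component on the retina
for k in range(STEPS):
    t = k / STEPS                     # time within the current field, 0..1
    eye = V * t                       # tracked line of sight moves V pixels per field
    for p in range(WIDTH):            # p = retinal coordinate relative to the tracked object
        x = int(round(p + eye)) % WIDTH
        if t < 0.3:                   # signal components are emitted early in the field
            signal_int[p] += cur_img[x] / STEPS
        persist_int[p] += prev_img[x] * math.exp(-t / TAU) / STEPS  # decaying persistence

scale = sum(persist_int) / sum(signal_int)        # ideal proportionality factor
blur = [persist_int[p] - scale * signal_int[p] for p in range(WIDTH)]
print("excess (>0) near the trailing edge, deficiency (<0) near the leading edge:")
print([round(b, 1) for b in blur])
```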
  • Patent Reference 1 suggests a method for reducing color shift caused by the persistence excess in a vicinity of the reduced intensity region by generating a pseudo-persistence signal from a current field and adding the generated pseudo-persistence signal to the current field.
  • The pseudo-persistence signal is generated for the blue image signal and has a broken-line characteristic identical to the persistence characteristics of the red and green phosphors.
  • adding the blue pseudo-persistence signal to a current field in the method suggested in Patent Reference 1 corresponds to adding the blue pseudo-persistence signal to a region where the persistence excess amount 408 appears as exemplified in FIG. 3 .
  • In other words, the color shift is eliminated by adding an integral quantity of a blue pseudo-persistence signal that matches the integral quantities of the red persistence component and the green persistence component.
  • However, adding a blue pseudo-persistence signal to a current field is, in fact, the same as deliberately adding a motion blur to the blue image signal.
  • Furthermore, Patent Reference 1 does not take into account a region having the persistence deficiency amount 409.
  • the present invention relates to an image display apparatus using phosphors each having a persistence time, and has an object of providing the image display apparatus and an image displaying method that are capable of reducing a motion blur caused by movement of an object.
  • the image display apparatus is an image display apparatus that displays an image using phosphors each having a persistence time, and includes: a motion detecting unit configured to detect motion information from an inputted image signal; a correction signal calculating unit configured to calculate a correction signal for correcting image degradation using the motion information, the image degradation being caused by persistence and a motion of the image signal; and a correcting unit configured to correct the image signal using the calculated correction signal.
  • Since a motion blur is corrected in the image signals corresponding to the phosphors having a persistence time, in other words, generally only the red and green image signals, a motion blur caused by movement of a line of sight can be corrected with higher precision. As a result, the problem of color shift caused by the motion blur can be fundamentally solved, so that no color shift occurs.
  • Here, a persistence time is the time period necessary for the amount of light emitted by a phosphor to attenuate to 10% or less of the amount of light at the moment of emission.
  • motion information includes a motion region, a motion direction, and a matching difference when a motion is detected.
  • the motion region is a region, for example, where an object in an inputted image moves from a previous field to a current field.
  • image degradation corresponds to a motion blur of an object displayed with emission of phosphors including persistence components.
  • image degradation also includes color shift caused by the motion blur.
  • a correction signal corresponds to a motion blur component.
  • a motion region may be specified by a pixel unit or a region unit including plural pixels.
  • the motion detecting unit may detect a motion region of the image signal as the motion information, and the correction signal calculating unit may calculate a correction signal for attenuating the image signal in a region where a value of the image signal is smaller than a value of a previous field and in a vicinity of the region, the region being included in the motion region and a vicinity of the motion region.
  • the previous field in the present invention refers to fields prior to the current field, and thus the previous field is not limited to an immediate previous field.
  • the motion detecting unit may detect a motion region of the image signal as the motion information
  • the correction signal calculating unit may calculate a correction signal for amplifying the image signal in a region where a value of the image signal is larger than a value of a previous field and in a vicinity of the region, the region being included in the motion region and a vicinity of the motion region.
  • the motion detecting unit may calculate a velocity of a motion in the motion region
  • the correction signal calculating unit may correct an amount of change between a value of the image signal in a current field and a value of the image signal in a previous field, in the motion region and in a vicinity of the motion region according to the velocity of the motion, and calculate the corrected amount of change as the correction signal.
  • the previous field refers to, for example, an immediate previous field.
  • the correction signal calculating unit may correct the amount of change by performing low-pass filter processing with the number of taps associated with the velocity of the motion.
  • the motion detecting unit may calculate a motion direction of the motion region, and the correction signal calculating unit may asymmetrically correct the amount of change according to the velocity of the motion and the motion direction, and may calculate the corrected amount of change as the correction signal.
  • Here, asymmetric correction in a motion direction refers to correction that assigns greater weight along the motion direction so that the region forward of the motion is corrected to a higher degree.
  • Persistence is attenuated due to the exponential function characteristic, and integration on the retina is performed on the persistence component according to the movement of a line of sight.
  • a human strongly perceives, forward of the moving line of sight, a portion having a larger amount of light including a persistence component that temporally appears earlier.
  • Thus, asymmetrical correction needs to be performed on the correction signal such that the region forward of the motion direction is corrected to a higher degree.
  • the persistence component can be corrected more precisely.
  • the correction signal calculating unit may correct the amount of change by (i) performing low-pass filter processing with the number of taps associated with the velocity of the motion, and (ii) multiplying a low-pass filter passing signal on which the low-pass filter processing has been performed, by an asymmetrical signal generated by using two straight lines and a quadratic function according to the motion direction.
  • any methods may be used as long as a correction signal value forward of a motion direction becomes larger.
  • the motion detecting unit may calculate the motion information regarding the motion region and motion information reliability indicating reliability of the motion information, and the correction signal calculating unit may attenuate the correction signal as the motion information reliability is lower.
  • The motion information includes, for example, a velocity, a motion direction, and a motion vector in a moving image, and a difference calculated in detecting the motion vector (hereinafter referred to as difference). Furthermore, the difference represents, for example, a sum of absolute differences (SAD) used in two-dimensional block matching between each pixel of a two-dimensional block in a reference field and each pixel of a two-dimensional block in a current field.
  • the motion detecting unit is a unit that outputs motion information, for example, a unit that may perform two-dimensional block matching.
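  • As an illustration of such two-dimensional block matching (a generic sketch, not the reduced-scale detector described later; block size and search range are assumptions), the SAD is computed for each candidate displacement and the best displacement is returned together with its SAD as the matching difference:

```python
# Generic SAD-based two-dimensional block matching (a common technique the text
# refers to). Field layout (lists of rows) and the search range are assumptions.

def sad(cur, ref, bx, by, dx, dy, bs):
    """Sum of absolute differences between a bs x bs block of the current field at
    (bx, by) and the block of the reference (previous) field displaced by (dx, dy)."""
    total = 0
    for y in range(bs):
        for x in range(bs):
            total += abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
    return total

def block_match(cur, ref, bx, by, bs=8, search=4):
    """Return ((dx, dy), min_sad): the displacement into the reference field with the
    smallest SAD (under a previous-to-current convention, the motion vector is the
    negation of this displacement), searching +/- `search` pixels in each direction."""
    h, w = len(cur), len(cur[0])
    best, best_sad = (0, 0), sad(cur, ref, bx, by, 0, 0, bs)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if 0 <= bx + dx and bx + dx + bs <= w and 0 <= by + dy and by + dy + bs <= h:
                s = sad(cur, ref, bx, by, dx, dy, bs)
                if s < best_sad:
                    best, best_sad = (dx, dy), s
    return best, best_sad
```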
  • motion information reliability is a value that decreases when reliability of motion detection is lower or when correlation between motion information and a tendency of tracing an object by a human's line of sight is lower.
  • Motion detection cannot detect all actual motions, and not every motion is traced by a human's line of sight even when the motions are correctly detected. Thus, in a case where a motion is likely to have been erroneously detected, unnecessary correction (hereinafter referred to as an unfavorable consequence) can be suppressed by attenuating the correction signal.
  • the motion detecting unit may calculate the velocity of the motion in the motion region as the motion information, and calculate the motion information reliability so that the motion information reliability becomes lower in inverse proportion to the velocity of the motion.
  • correction is weakened when a motion is too fast.
  • the human tends not to trace a motion that is too fast through the sense of sight.
  • Furthermore, since the correction for a fast motion covers a wide area, an erroneous correction would spread its unfavorable consequence widely. In such a case, the unfavorable consequence can be suppressed by weakening the correction effect.
  • the motion detecting unit may calculate a difference in a corresponding region between a current field and a previous field as the motion information, and calculate the motion information reliability so that the motion information reliability becomes lower in inverse proportion to the difference.
  • In other words, the correction is weakened when the difference is too large, because a large difference indicates that the motion detection is likely to have failed. In such a case, the unfavorable consequence can be suppressed by weakening the correction effect.
  • the motion detecting unit may calculate, as the motion information, a difference in a corresponding region between a current field and a previous field and a difference of a vicinity of the corresponding region between the current field and the previous field, and calculate the motion information reliability so that the motion information reliability becomes lower in inverse proportion to a difference between the calculated differences.
  • In other words, the correction is weakened when the motion direction may have been erroneously detected. When the difference of the detected motion information and the difference of the motion information in its vicinity (for example, the motion information on the opposite side) are close to each other, the reliability of the motion direction is low, because the motion detection may have failed. In such a case, the unfavorable consequence can be suppressed by weakening the correction effect.
  • the motion detecting unit may calculate, as the motion information, a velocity and a motion direction of a motion in the motion region, and calculate the motion information reliability so that the motion information reliability becomes lower in inverse proportion to a difference between (i) the velocity and the motion direction of the motion and (ii) a velocity and a motion direction of a motion in a vicinity of the motion region.
  • Here, the difference between (i) the velocity and motion direction of the motion and (ii) the velocity and motion direction of a motion in the vicinity of the motion region represents, for example, the difference between the motion vector of an object block and the average vector of the motion vectors of the blocks above, upper left of, and left of the calculated block.
  • the difference may be obtained by calculating a dot product between an object motion vector and an average motion vector in a vicinity of the object motion vector.
  • correction is weakened when a difference between an object motion and an average motion in a vicinity of the object motion is larger.
  • a human perceives peripheral average motions through the sense of sight when small objects move in various directions. In such a case, the unfavorable consequence can be suppressed by weakening the correction effect.
  • The motion detecting unit may calculate, as the motion information, a velocity and a motion direction of a motion in the motion region, and calculate the motion information reliability so that the motion information reliability becomes lower in inverse proportion to a difference between (i) the velocity and the motion direction of the motion and (ii) a velocity and a motion direction of a motion in a corresponding region of the previous field.
  • Here, for example, a difference between the object motion vector of a two-dimensional block and the motion vector of the corresponding two-dimensional block in the previous field pointed to by the object motion vector is used.
  • the difference may be obtained by calculating a dot product between such motion vectors.
  • the correction signal calculating unit attenuates a correction signal when a motion in a region largely varies in two field periods.
  • a human tends to trace a motion that continues for periods that are consecutive to some extent, and tends not to trace a motion that does not continue for consecutive periods through the sense of sight. In such a case, the unfavorable consequence can be suppressed by weakening the correction effect.
  • not only change in a motion for 2 field periods but also change in a motion for much longer field periods may be used, and furthermore, temporal change between motion vectors may be calculated to take an acceleration vector of a motion into account.
  • the present invention may be realized not only as such an image display apparatus but also as an image display method having the characteristic units of the image display apparatus as steps and as a program that causes a computer to execute such steps.
  • Such a program can obviously be distributed via a recording medium such as a CD-ROM or via a transmission medium such as the Internet.
  • According to the present invention, the motion blur can be reduced. Accordingly, color shift caused by the motion blur of a moving object can also be reduced, where the object is displayed with emission from emitters having different persistence times.
  • FIG. 1 explanatorily shows integration on the retina for each color when an image signal of a white dot in a pixel is stationary, and respectively shows: (a) a distribution of light emission in a temporal direction for one field period, and (b) integral quantities on the retina.
  • FIG. 3 explanatorily shows integration on the retina for each signal component and each persistence component when a line of sight traces a white rectangle object in a gray background, and respectively shows: (a) a display pattern on the PDP; (b) a distribution of light emission from one horizontal line of an image signal in a temporal direction for 1 field period; (c) an integral quantity of a signal component on the retina when the line of sight traces the white rectangle object; and (d) an integral quantity of a persistence component on the retina when the line of sight traces the white rectangle object.
  • FIG. 4 is a block diagram illustrating a configuration of an image display apparatus as a base configuration of the present invention.
  • FIG. 5 illustrates a more specific application of the image display apparatus of the present invention.
  • FIG. 6 is a block diagram illustrating the configuration of the image display apparatus of the first embodiment.
  • FIG. 7 shows a flow of processing in the image display apparatus according to the first embodiment, and respectively shows: (a) a previous field; (b) a current field; (c) a subtraction signal (previous field-current field); (d) an LPF-passing subtraction signal; (e) an asymmetric gain; (f) a correction signal; and (g) a corrected current field.
  • FIG. 8 illustrates a block diagram of the configuration of the motion information reliability calculating unit.
  • FIG. 9 illustrates a block diagram of the configuration of the image display apparatus according to the second embodiment.
  • FIG. 10 illustrates a block diagram of the configuration of the image display apparatus according to the third embodiment.
  • FIG. 11 shows a flow of processing in the image display apparatus according to the third embodiment, and respectively shows: (a) a previous field; (b) a current field; (c) a subtraction signal (previous field-current field); (d) a motion region; (e) an LPF-passing signal in a current field; (f) an absolute value signal of a subtraction signal obtained by subtracting the LPF-passing signal from the current field; (g) an LPF-passing signal of an absolute value signal; and (h) a corrected current field.
  • FIG. 12 illustrates a block diagram of the configuration of the image display apparatus according to the fourth embodiment.
  • Hereinafter, a base configuration of the present invention and four embodiments, each of which limits the constituent elements of the base configuration, will be described.
  • FIG. 4 illustrates a block diagram of a configuration of an image display apparatus as the base configuration
  • FIG. 5 illustrates a more specific application of the image display apparatus.
  • An image display apparatus 1 displays an image using red and green phosphors each having a persistence time and a blue phosphor having almost no persistence time.
  • the image display apparatus 1 includes: a motion detecting unit 2 that detects, from an inputted image signal, motion information of a motion, such as a region, a velocity, a direction, and a matching difference; a correction signal calculating unit 3 that calculates a correction signal for a red image signal and a green image signal, using the inputted image signal and the motion information; and a correcting unit 4 that corrects the inputted image signal using the calculated correction signal. More specifically, this image display apparatus 1 can be applied to, for example, a plasma display panel as illustrated in FIG. 5 . This base configuration makes it possible to reduce a motion blur.
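  • The base configuration can be pictured as the following skeleton (a structural sketch with assumed function and channel names, not the patent's implementation), in which only the red and green channels are corrected because only their phosphors have long persistence times:

```python
# Structural sketch of the base configuration of FIG. 4: a motion detector produces
# motion information, a correction-signal calculator turns it into a correction
# signal, and a corrector applies it to the red and green channels only.
from typing import Callable, Dict, List

Line = List[int]                  # one horizontal line of one color channel
MotionInfo = Dict[str, float]     # e.g. region start/width, velocity, direction, difference (assumed keys)

def display_pipeline(current: Dict[str, Line],
                     previous: Dict[str, Line],
                     detect_motion: Callable[[Dict[str, Line], Dict[str, Line]], MotionInfo],
                     calc_correction: Callable[[Line, Line, MotionInfo], Line],
                     apply_correction: Callable[[Line, Line], Line]) -> Dict[str, Line]:
    """Correct one line of the current field; the channel keys 'r', 'g', 'b' are assumed names."""
    info = detect_motion(current, previous)                      # motion detecting unit 2
    out = dict(current)
    for ch in ("r", "g"):                                        # long-persistence phosphors only
        corr = calc_correction(current[ch], previous[ch], info)  # correction signal calculating unit 3
        out[ch] = apply_correction(current[ch], corr)            # correcting unit 4
    return out                                                   # blue passes through unchanged
```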
  • The following describes each of the four embodiments, each of which includes the motion detecting unit 2, the correction signal calculating unit 3, and the correcting unit 4 of the base configuration in a more limited form.
  • Each of the four embodiments uses a correction signal of a different geometry in the vicinity of a reduced intensity region or an increased intensity region, and uses either a method for correcting an image with higher precision using a motion direction or a method for correcting an image with a smaller hardware scale without detecting a motion direction (each of the four embodiments combines a different correction method with a correction signal of a different geometry).
  • a first embodiment corrects the vicinity of a reduced intensity region using a motion direction
  • a second embodiment corrects the vicinity of an increased intensity region using a motion direction
  • a third embodiment corrects the vicinity of a reduced intensity region without using a motion direction
  • a fourth embodiment corrects the vicinity of an increased intensity region without using a motion direction.
  • the image display apparatus of the first embodiment will be described with reference to FIGS. 6 and 7 .
  • An object of the first embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of a reduced intensity region for each image signal and subtracting a correction signal from current fields of a red image signal and a green image signal. Furthermore, another object of the first embodiment is to reduce color shift simultaneously by reducing the motion blur.
  • processing is performed for each horizontal line to reduce a hardware scale in all of the first to fourth embodiments.
  • FIG. 6 illustrates a block diagram of the configuration of the image display apparatus of the first embodiment.
  • An image display apparatus 600 of the first embodiment includes a one-field delay device 601 , a motion detecting unit 603 , subtractors 602 and 608 , a low-pass filter (hereinafter referred to as LPF) 604 , an asymmetric gain calculating unit 605 , a motion information reliability calculating unit 606 , a multiplier 607 , and a motion information memory 609 .
  • Each of the constituent elements of the image display apparatus 600 performs input and output per horizontal line of the red, green, and blue image signals.
  • the one-field delay device 601 delays an inputted current field by one field period, and outputs a previous field that is one field prior to the current field.
  • the subtractor 602 subtracts the current field from the previous field, and outputs a subtraction signal including only positive components.
  • the motion detecting unit 603 detects a motion using the inputted current field, the previous field, and the subtraction signal, and outputs motion information (a motion region, a direction, a velocity, and a difference).
  • the LPF 604 applies the number of taps calculated according to the velocity of the motion to the inputted subtraction signal, and outputs an LPF-passing subtraction signal.
  • the asymmetric gain calculating unit 605 outputs an asymmetric gain for shaping the LPF-passing subtraction signal using the inputted motion information.
  • the motion information reliability calculating unit 606 calculates motion information reliability using: the object motion information outputted from the motion detecting unit 603 ; motion information of 3 lines that are adjacent to an upper side of a line that is currently being processed and is outputted from the motion information memory 609 ; and motion information of a region that is present in a previous field and that corresponds to the object motion information.
  • The multiplier 607 multiplies the LPF-passing subtraction signal outputted from the LPF 604 by the asymmetric gain outputted from the asymmetric gain calculating unit 605 and by the motion information reliability gain outputted from the motion information reliability calculating unit 606.
  • The subtractor 608 subtracts the correction signal from the current fields of the red image signal and the green image signal, and outputs the current fields in which the motion blur has been corrected.
  • the motion information memory 609 stores motion information that has been detected.
  • FIG. 7 explanatorily shows a flow of processing in the image display apparatus according to the first embodiment.
  • (a) to (g) in FIG. 7 show each signal for generating a correction signal for the red or green image signal per horizontal line, and changes in each signal.
  • the image display apparatus 600 of the first embodiment receives one horizontal line of a current field, and outputs the horizontal line in which a motion blur has been corrected.
  • the one-field delay device 601 delays an inputted current field by one field period, and outputs a previous field that is one field prior to the current field. (a) in FIG. 7 shows the previous field, and (b) in FIG. 7 shows the current field.
  • a subtraction signal is calculated using the inputted previous field and the current field.
  • the subtractor 602 subtracts the current field from the previous field, and outputs the calculated subtraction signal including only positive components. (c) in FIG. 7 shows this subtraction signal.
  • Although the subtraction signal is used herein, the signal used for this calculation is not limited to the subtraction signal; a motion blur component can also be approximately calculated by deforming another signal, such as the current field or a field prior to the current field.
  • motion information is detected using the previous field, the current field, and the subtraction signal.
  • the motion detecting unit 603 detects a motion using the inputted current field, the previous field, and the subtraction signal, and outputs motion information (a motion region, a direction, a velocity, and a difference).
  • the motion detecting unit 603 detects a motion region, and calculates a velocity of the motion region. In other words, the motion detecting unit 603 determines a region that exceeds a predetermined threshold value of one of or both of a red subtraction signal and a green subtraction signal to be a motion region, and a width of the motion region to be a velocity of the motion. Thereby, a reduced intensity region may be defined as the motion region. Furthermore, since motion search, for example, two-dimensional block matching is not performed, a motion region and a velocity can be detected with a reduced circuit scale.
  • the motion detecting unit 603 calculates a difference, and detects a direction from the calculated difference.
  • the motion detecting unit 603 calculates sums of absolute difference (hereinafter referred to as SAD) for regions present in a previous field and in a current field.
  • the regions are present in regions left and right of the current field, and the left and right regions have an identical width.
  • the obtained sums of absolute difference are referred to as a left SAD and a right SAD, respectively.
  • For example, a total sum of the differences of the red, green, and blue image signals is used to obtain a SAD.
  • the motion detecting unit 603 determines a motion direction as a left direction when the left SAD is smaller than the right SAD, determines a motion direction as a right direction when the right SAD is smaller than the left SAD, and determines a state as motionless when the right SAD is equal to the left SAD. In the case of the motionless state, no correction is performed on an image signal.
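  • Put together, the reduced-scale motion detection described above can be sketched as follows (a single-channel simplification with an assumed threshold; the actual unit works on the red and green subtraction signals of whole horizontal lines and sums the color channels when computing the SADs):

```python
# Hedged sketch of the first embodiment's motion detection: the positive-clipped
# subtraction signal defines the (reduced intensity) motion region, its width is
# taken as the velocity, and a left SAD / right SAD comparison gives the direction.

def detect_motion(prev_line, cur_line, threshold=16):
    # subtraction signal with only positive components (subtractor 602)
    sub = [max(p - c, 0) for p, c in zip(prev_line, cur_line)]

    # motion region = samples exceeding the threshold; its width is used as the velocity
    idx = [i for i, v in enumerate(sub) if v > threshold]
    if not idx:
        return None                                   # motionless: no correction
    start, end = idx[0], idx[-1] + 1
    velocity = end - start

    def sad(a, b):                                    # SAD between previous and current field samples
        return sum(abs(x - y) for x, y in zip(a, b))
    left_sad = sad(prev_line[max(start - velocity, 0):start],
                   cur_line[max(start - velocity, 0):start])
    right_sad = sad(prev_line[end:end + velocity],
                    cur_line[end:end + velocity])

    if left_sad < right_sad:
        direction = "left"
    elif right_sad < left_sad:
        direction = "right"
    else:
        return None                                   # equal SADs: treated as motionless
    return {"sub": sub, "start": start, "velocity": velocity,
            "direction": direction, "left_sad": left_sad, "right_sad": right_sad}
```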
  • As long as the motion detecting unit 603 detects at least a motion direction and a velocity, any motion detecting method, for example, two-dimensional block matching, may be used.
  • an LPF-passing subtraction signal is calculated by applying an LPF to a subtraction signal.
  • a subtraction signal and motion information are inputted to the LPF 604 .
  • the LPF 604 applies an LPF having the number of taps calculated according to a velocity of a motion to the inputted subtraction signal, and outputs an LPF-passing subtraction signal.
  • (d) in FIG. 7 shows the LPF-passing subtraction signal.
  • Here, the number of taps equals the velocity of the motion (in pixels per field).
  • Although the LPF calculates an average of peripheral pixel values, the number of taps and the type of LPF are not limited to these.
  • In principle, the motion blur component results from the integration on the retina along the moving line of sight, and the LPF is used to perform processing corresponding to this integration. As long as the processing spatially spreads the subtraction signal, it is not limited to LPF processing.
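  • A minimal sketch of such an LPF (a centered moving average; the window placement is an assumption, since the patent only ties the number of taps to the velocity) is:

```python
# Velocity-dependent low-pass filtering of the subtraction signal: the number of
# taps equals the motion velocity in pixels per field.

def lpf_subtraction_signal(sub, velocity):
    taps = max(int(velocity), 1)
    half = taps // 2
    out = []
    for i in range(len(sub)):
        lo, hi = max(i - half, 0), min(i - half + taps, len(sub))
        window = sub[lo:hi]
        out.append(sum(window) / len(window))   # average of the peripheral pixel values
    return out

# e.g. lpf_subtraction_signal([0, 0, 127, 127, 0, 0], velocity=4) spreads the pulse
```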
  • an asymmetric gain is calculated using motion information.
  • the asymmetric gain calculating unit 605 outputs an asymmetric gain for shaping an LPF-passing subtraction signal using the inputted motion information.
  • the asymmetric gain calculating unit 605 generates an asymmetric gain using two straight lines and a quadratic function, as shown in (e) in FIG. 7 .
  • the asymmetric gain calculating unit 605 generates an asymmetric gain using combinations of a straight part 701 in a forward region (in this case, an adjacent right region) with respect to a motion region, a quadratic function part 702 in the motion region, and a straight line 703 .
  • The values of each of the straight part 701, the quadratic function part 702, and the straight line 703 range from 0.0 to 1.0 inclusive. Since the forward region must be determined with respect to the motion region, a motion direction is always necessary for generating an asymmetric gain.
  • The asymmetric gain is used for correcting the forward region to a higher degree. A correction signal corresponding to the persistence excess amount 408 in the vicinity of the reduced intensity region, for example, as shown in (d) of FIG. 3, is then generated by multiplying the LPF-passing subtraction signal by the asymmetric gain.
  • Although the geometry of the asymmetric gain in (e) of FIG. 7 is obtained under the conditions of FIGS. 3 and 6, the motion blur component varies depending on the inputted current field, and thus the geometry is not limited to the one in (e) of FIG. 7.
  • For example, the geometry of the asymmetric gain may be extended more laterally: as the velocity of the motion becomes higher, the region where image quality is degraded becomes larger, and consequently the region that needs to be corrected also becomes larger.
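  • One possible realization of such an asymmetric gain (a hedged sketch; the exact curve of (e) in FIG. 7 is not reproduced and the widths are assumptions) combines a flat part in the forward region, a quadratic ramp inside the motion region, and zero elsewhere:

```python
# Asymmetric gain built from two straight lines and a quadratic function: gain 1.0
# over the region adjacent to the motion region on its forward side, a quadratic
# ramp rising toward the forward edge inside the motion region, and 0.0 elsewhere.

def asymmetric_gain(length, start, width, velocity, direction):
    gain = [0.0] * length                              # straight line 703: no correction outside
    if direction == "right":                           # forward region lies to the right
        forward = range(start + width, min(start + width + velocity, length))
    else:                                              # forward region lies to the left
        forward = range(max(start - velocity, 0), start)
    for i in forward:                                  # straight part 701: full correction forward
        gain[i] = 1.0
    for k in range(min(width, length - start)):        # quadratic part 702 inside the motion region
        t = (k + 1) / width if direction == "right" else (width - k) / width
        gain[start + k] = t * t                        # rises toward the forward edge
    return gain
```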
  • the motion information reliability calculating unit 606 calculates motion information reliability using: the object motion information outputted from the motion detecting unit 603 ; motion information of 3 lines that are adjacent to an upper side of a line that is currently being processed and that is outputted from the motion information memory 609 ; and motion information of a region that is present in a previous field and that corresponds to the object motion information.
  • Note that the motion information reliability is assumed to be 1.0 in FIG. 7.
  • FIG. 8 illustrates a block diagram of a detailed configuration of the motion information reliability calculating unit 606 .
  • the motion information reliability calculating unit 606 outputs a product of five gains (hereinafter referred to as first to fifth gains), and includes a first gain calculating unit 801 , average coordinate calculating units 802 a and 802 b , a lowest value selecting unit 803 , a second gain calculating unit 804 , an absolute difference calculating unit 805 , a third gain calculating unit 806 , a motion vector generating unit 807 , a peripheral vector calculating unit 808 , a fourth gain calculating unit 809 , and a fifth gain calculating unit 810 .
  • the first gain related to a velocity of a motion will be described first.
  • the first gain calculating unit 801 is a gain function having a broken-line characteristic, and outputs: 1.0 when a velocity of an inputted motion is lower than a first threshold; a variable that linearly ranges from 1.0 to 0.0 when the velocity is equal to or higher than the first threshold and lower than a second threshold; and 0.0 when the velocity is equal to or higher than the second threshold.
  • Thereby, when the velocity of a motion is high, the image display apparatus 600 can weaken the correction effect or disable the correction.
  • the second gain related to a difference in motion detection will be described.
  • The average coordinate calculating units 802a and 802b respectively obtain an average left SAD and an average right SAD by dividing the left SAD and the right SAD by the width of the motion region. Then, the lowest value selecting unit 803 selects the lower of the average left SAD and the average right SAD.
  • the second gain calculating unit 804 is a gain function having a broken-line characteristic, and outputs: 1.0 when the inputted lowest value is smaller than a first threshold; a variable that linearly ranges from 1.0 to 0.0 when the inputted lowest value is equal to or larger than the first threshold and smaller than the second threshold; and 0.0 when the inputted lowest value is equal to or larger than the second threshold.
  • Thereby, when the match between the previous field and the current field is poor, the image display apparatus 600 can weaken the correction effect or disable the correction.
  • the third gain related to a direction of a motion will be described.
  • the absolute difference calculating unit 805 calculates an absolute difference between an average left SAD calculated by the average coordinate calculating unit 802 a and an average right SAD calculated by the average coordinate calculating unit 802 b .
  • the third gain calculating unit 806 is a gain function having a broken-line characteristic, and outputs: 0.0 when the inputted absolute difference is smaller than a first threshold; a variable that linearly ranges from 0.0 to 1.0 when the absolute difference is equal to or larger than the first threshold and smaller than a second threshold; and 1.0 when the absolute difference is equal to or larger than the second threshold.
  • Thereby, when the left SAD and the right SAD are close to each other and the motion direction is thus unreliable, the image display apparatus 600 can weaken the correction effect or disable the correction.
  • Although the first to third gains are all generated using gain functions having a broken-line characteristic, a step function using only one threshold or a gain function having a curve characteristic may be used instead.
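  • The broken-line characteristic of the first to third gains can be sketched with a single helper (the threshold values below are placeholders, not values from the patent):

```python
# Broken-line gain functions: a linear ramp between two thresholds, used in a
# decreasing form for the first and second gains (fast motion, poor match) and in
# an increasing form for the third gain (clear left/right SAD separation means a
# reliable direction).

def broken_line(x, th1, th2, rising=False):
    """Piecewise-linear gain: flat, then a linear ramp between th1 and th2, then flat."""
    if x <= th1:
        g = 1.0
    elif x >= th2:
        g = 0.0
    else:
        g = (th2 - x) / (th2 - th1)
    return g if not rising else 1.0 - g

first_gain  = lambda velocity:     broken_line(velocity, 8, 24)                    # weaken for fast motion
second_gain = lambda min_avg_sad:  broken_line(min_avg_sad, 4, 32)                 # weaken for poor matches
third_gain  = lambda sad_abs_diff: broken_line(sad_abs_diff, 2, 16, rising=True)   # weaken for ambiguous direction
```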
  • the fourth gain related to isolation of object motion information from a vicinity of the object motion information will be described.
  • The motion vector generating unit 807 generates a motion vector using a motion direction and a velocity. More specifically, the motion vector generating unit 807 generates signed values, such as "+5" in the case of a motion at a velocity of 5 in the right direction and "-10" in the case of a motion at a velocity of 10 in the left direction. This operation is necessary when a motion direction and a velocity are calculated separately. However, when a motion is obtained as a vector from the start, for example, as in two-dimensional block matching, such an operation is not necessary.
  • The motion vectors in the regions 1 line, 2 lines, and 3 lines spatially above the line currently being processed are outputted from the motion information memory 609 (each generated by a method identical to that used by the motion vector generating unit 807).
  • the motion vectors are inputted to the peripheral vector calculating unit 808 .
  • the peripheral vector calculating unit 808 outputs an average vector of the inputted 3 motion vectors as a peripheral vector.
  • An average vector of motion vectors in adjacent blocks that are above, upper left, and left of a calculated block may be used as a peripheral motion vector, for example, when a motion vector is detected using two-dimensional block matching.
  • a peripheral motion vector may be anything as long as peripheral motion information is spatially used.
  • The fourth gain calculating unit 809 calculates the cosine of the angle between the motion vector outputted from the motion vector generating unit 807 and the peripheral vector outputted from the peripheral vector calculating unit 808, for example, by calculating a dot product. Then, 1 is added to the calculated cosine, and the resulting value is divided by 2 to obtain a value ranging from 0.0 to 1.0 inclusive.
  • the fourth gain calculating unit 809 outputs the obtained value as the fourth gain.
  • the image display apparatus 600 makes it possible to weaken the correction effect or disable the correction in the case where a difference between an object motion vector and a motion vector in a vicinity of the object motion vector is larger, in other words, in the case where the object motion vector is isolated from motion vectors in a vicinity of the object motion vector.
  • a motion vector that is included in a current field and that is generated by the motion vector generating unit 807 (hereinafter referred to as current motion vector) is inputted to the motion information memory 609 , and a motion vector that is in a region of a previous field and that corresponds to the current motion vector (hereinafter referred to as previous motion vector) is outputted.
  • The fifth gain calculating unit 810 calculates the cosine of the angle between the inputted current motion vector and the previous motion vector, for example, by calculating a dot product. Then, 1 is added to the calculated cosine, and the resulting value is divided by 2 to obtain a value ranging from 0.0 to 1.0 inclusive. Finally, the obtained value is outputted as the fifth gain.
  • the image display apparatus 600 makes it possible to weaken the correction effect or disable the correction in the case where a difference between the inputted current motion vector and the previous motion vector is larger, in other words, in the case where there is no continuity in the motion.
  • the multiplier 811 outputs a product of the first to fifth gains as motion information reliability.
  • The arithmetic computation may be performed using bit shift operations for all of the first to fifth gains. Furthermore, not all of the first to fifth gains have to be used; for example, the fourth and fifth gains may be omitted when a motion information memory is not available, because they require one.
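  • The fourth and fifth gains and the final reliability can be sketched as follows (treating the line-based motion vectors as signed scalars is a simplification, and the 0.5 fallback for a zero vector is an assumption):

```python
# Fourth and fifth gains from the normalized cosine between two motion vectors,
# and the motion information reliability as the product of all five gains.

def cosine_gain(v_a, v_b):
    """(cos(angle) + 1) / 2 for two signed 1-D motion vectors; 0.5 if either is zero."""
    if v_a == 0 or v_b == 0:
        return 0.5                                  # assumption: no directional evidence either way
    cos = (v_a * v_b) / (abs(v_a) * abs(v_b))       # +1 if same direction, -1 if opposite
    return (cos + 1.0) / 2.0

def motion_reliability(g1, g2, g3, current_vec, peripheral_vec, previous_vec):
    g4 = cosine_gain(current_vec, peripheral_vec)   # isolation from the neighbouring lines
    g5 = cosine_gain(current_vec, previous_vec)     # temporal continuity with the previous field
    return g1 * g2 * g3 * g4 * g5                   # product of the five gains (multiplier 811)
```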
  • an LPF-passing subtraction signal is multiplied by an asymmetric gain and a motion information reliability gain to calculate a correction signal.
  • the multiplier 607 multiplies the LPF-passing subtraction signal outputted from the LPF 604 by the asymmetric gain outputted from the asymmetric gain calculating unit 605 by the motion information reliability gain outputted from the motion information reliability calculating unit 606 , and outputs a correction signal.
  • (f) in FIG. 7 shows the obtained correction signal.
  • Although not illustrated in FIG. 6, since processing is performed independently on each line in the first to fourth embodiments, variations in the vertical direction may occur depending on whether or not the processing is executed on each line.
  • In that case, an IIR filter may be used that spatially blends the correction signal of the line currently being processed with the correction signal held for a neighboring line.
  • a corrected current field is outputted using a current field and a correction signal.
  • (g) in FIG. 7 shows the corrected current field.
  • the subtractor 608 subtracts the correction signal from the current fields of the red image signal and the green image signal, and outputs the current field in which motion blur has been corrected.
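  • A compact sketch of this final combination (clamping to the 0 to 255 range is an assumption for 8-bit signals) is:

```python
# Final combination for the first embodiment: the correction signal is the
# LPF-passing subtraction signal multiplied by the asymmetric gain and by the
# motion information reliability, and it is subtracted from the current red and
# green lines.

def correct_line(cur_line, lpf_sub, asym_gain, reliability):
    correction = [s * g * reliability for s, g in zip(lpf_sub, asym_gain)]            # multiplier 607
    return [max(min(c - corr, 255), 0) for c, corr in zip(cur_line, correction)]      # subtractor 608
```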
  • the object of the first embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of a reduced intensity region for each image signal and subtracting a correction signal from current fields of a red image signal and a green image signal. Simultaneously, color shift can be reduced by reducing the motion blur.
  • FIG. 9 illustrates a block diagram of a detailed configuration of an image display apparatus according to the second embodiment.
  • the image display apparatus according to the second embodiment is partially changed from that of the first embodiment. The differences will only be described hereinafter.
  • An object of the second embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of an increased intensity region for each image signal and adding a correction signal to current fields of a red image signal and a green image signal that have long persistence times. Furthermore, another object of the second embodiment is to reduce color shift simultaneously by reducing the motion blur.
  • An image display apparatus 610 of the second embodiment includes a subtractor 611 , a motion detecting unit 612 , and an adder 613 that are respectively changed from the subtractor 602 , the motion detecting unit 603 , and the subtractor 608 of the image display apparatus 600 according to the first embodiment. The following describes the details.
  • the subtractor 611 subtracts a previous field from a current field, and outputs a subtraction signal including only positive components.
  • an increased intensity region may be defined as a motion region.
  • the field to be referred to when the difference is calculated and the motion direction to be detected are reversed from those in the first embodiment.
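  • As an illustration only, the following sketch shows the positive-component field differences used in the first and second embodiments; the array names and sample values are assumptions.

```python
import numpy as np

previous_field = np.array([200, 200, 200, 50, 50, 50], dtype=np.int32)
current_field  = np.array([200, 200,  50, 50, 50, 200], dtype=np.int32)

# First embodiment: previous field minus current field -> reduced intensity region.
reduced_intensity_signal = np.clip(previous_field - current_field, 0, None)
# Second embodiment: current field minus previous field -> increased intensity region.
increased_intensity_signal = np.clip(current_field - previous_field, 0, None)
```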
  • the motion detecting unit 612 calculates SADs (sums of absolute differences) between regions in the previous field and corresponding regions in the current field. The regions are located on the left side and the right side in the current field, and the left and right regions have an identical width. The obtained SADs are referred to as the left SAD and the right SAD, respectively.
  • the motion detecting unit 612 determines that the motion direction is rightward when the left SAD is smaller than the right SAD, determines that the motion direction is leftward when the right SAD is smaller than the left SAD, and determines a motionless state when the right SAD is equal to the left SAD. In the case of a motionless state, no correction is performed on the image signal.
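  • A minimal sketch of this decision is given below, assuming that each region is supplied as a one-dimensional numpy array of pixel values; the function name is an assumption.

```python
import numpy as np

def motion_direction(prev_left, cur_left, prev_right, cur_right):
    """Decide the motion direction from the left SAD and the right SAD."""
    left_sad = np.abs(cur_left.astype(np.int64) - prev_left.astype(np.int64)).sum()
    right_sad = np.abs(cur_right.astype(np.int64) - prev_right.astype(np.int64)).sum()
    if left_sad < right_sad:
        return "right"        # left SAD smaller -> motion toward the right
    if right_sad < left_sad:
        return "left"         # right SAD smaller -> motion toward the left
    return "motionless"       # equal SADs -> no correction is performed
```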
  • the operation is changed from subtraction to addition.
  • the adder 613 adds a correction signal to a current field and outputs the resulting signal.
  • when the value of the current field exceeds 255 after the addition, the value is outputted as 255, for example.
  • the object of the second embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of an increased intensity region for each image signal and adding a correction signal to current fields of a red image signal and a green image signal that have long persistence times. Simultaneously, color shift can be reduced by reducing the motion blur.
  • An image display apparatus according to the third embodiment of the present invention will be described with reference to FIGS. 10 and 11.
  • An object of the third embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of a reduced intensity region for each image signal and subtracting a correction signal from current fields of a red image signal and a green image signal. Furthermore, another object of the third embodiment is to reduce color shift simultaneously by reducing the motion blur.
  • FIG. 10 illustrates a block diagram of a detailed configuration of the image display apparatus according to the third embodiment.
  • An image display apparatus 900 of the third embodiment includes a one-field delay device 901 , subtractors 902 , 905 , and 909 , a motion detecting unit 903 , low-pass filters 904 and 907 , an absolute value calculating unit 906 , and a correction signal region limiting unit 908 as illustrated in FIG. 10 .
  • each of the constituent elements of the image display apparatus 900 performs input and output per horizontal line of the red, green, and blue image signals.
  • the one-field delay device 901 delays an inputted current field by one field period, and outputs a previous field that is one field prior to the current field.
  • the subtractor 902 subtracts the current field from the previous field, and outputs a subtraction signal including only positive components.
  • the motion detecting unit 903 determines a width of a motion region that exceeds a threshold in the inputted subtraction signal, and outputs the width as a velocity of the motion.
  • the LPF 904 applies an LPF to the inputted current field to output the resulting signal.
  • the subtractor 905 subtracts the LPF-passing signal of the current field from the current field.
  • the absolute value calculating unit 906 calculates the absolute value of the difference between the current field and the LPF-passing signal of the current field.
  • the LPF 907 applies an LPF to an absolute value signal outputted from the absolute value calculating unit 906 to output the resulting signal.
  • the correction signal region limiting unit 908 limits a correction signal value in a region other than the peripheral motion region to 0.
  • the subtractor 909 subtracts, from the current field, the correction signal outputted from the correction signal region limiting unit 908 .
  • FIG. 11 explanatorily shows a flow of processing in the image display apparatus according to the third embodiment.
  • (a) to (h) in FIG. 11 show each signal for generating a correction signal for the red or green image signal per horizontal line, and changes in each of the signals. The following describes the processing in the third embodiment in detail.
  • the image display apparatus 900 of the third embodiment receives a horizontal line of a current field, and outputs the horizontal line in which a motion blur has been corrected.
  • the one-field delay device 901 delays an inputted current field by one field period, and outputs a previous field that is one field prior to the current field. (a) in FIG. 11 shows the previous field, and (b) in FIG. 11 shows the current field.
  • a subtraction signal is calculated using the previous field and the current field.
  • the subtractor 902 subtracts the current field from the previous field, and outputs a subtraction signal including only positive components.
  • (c) in FIG. 11 shows this subtraction signal.
  • a motion region is detected from the subtraction signal.
  • the motion detecting unit 903 determines the width of a motion region that exceeds a threshold in the inputted subtraction signal, and outputs the width as the velocity of the motion. (d) in FIG. 11 shows the motion region. Thereby, a reduced intensity region may be defined as the motion region. Furthermore, since a motion search such as two-dimensional block matching is not performed, a motion region and a velocity can be detected with a reduced circuit scale.
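  • As an illustration only, the width measurement can be sketched as follows for a one-dimensional subtraction signal; the threshold value and the function name are assumed examples.

```python
import numpy as np

def detect_motion_region(subtraction_signal, threshold=16):
    """Return (start, end, width) of the span of samples exceeding the threshold;
    the width is also used as the velocity of the motion."""
    above = np.flatnonzero(subtraction_signal > threshold)
    if above.size == 0:
        return None, None, 0                 # no motion region on this line
    start, end = int(above[0]), int(above[-1])
    return start, end, end - start + 1       # width doubles as the velocity
```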
  • a region including a motion region, a region in the left vicinity of the motion region, and a region in the right vicinity of the motion region is referred to as a peripheral motion region to be used by the correction signal region limiting unit 908 .
  • the left vicinity of the motion region, the right vicinity of the motion region, and the motion region have an identical width.
  • an LPF is applied to a current field.
  • the LPF 904 applies the LPF to the inputted current field to output the resulting signal.
  • although the LPF calculates an average of pixels and the number of taps corresponds to the velocity outputted from the motion detecting unit 903 in this embodiment, the calculation and the definition of the number of taps are not limited to these.
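  • A simple realization of such an averaging LPF, with the number of taps tied to the detected velocity, can be sketched as follows; this is an illustration under those assumptions, not the circuit itself.

```python
import numpy as np

def average_lpf(line, velocity):
    """Moving-average LPF whose number of taps equals the detected velocity."""
    taps = max(int(velocity), 1)
    kernel = np.full(taps, 1.0 / taps)
    # mode="same" keeps the filtered output aligned with the input line.
    return np.convolve(np.asarray(line, dtype=np.float64), kernel, mode="same")
```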
  • (e) in FIG. 11 shows the LPF-passing signal of the current field.
  • the subtractor 905 subtracts the LPF-passing signal from the current field.
  • the absolute value of the difference between the current field and the LPF-passing signal is calculated.
  • the absolute value calculating unit 906 calculates the absolute value of the difference between the current field and the LPF-passing signal.
  • (f) in FIG. 11 shows the absolute value signal of the difference between the current field and the LPF-passing signal.
  • an LPF is applied to the absolute value signal outputted from the absolute value calculating unit 906 .
  • the LPF 907 applies the LPF to the absolute value signal outputted from the absolute value calculating unit 906 to output the resulting signal.
  • although the LPF calculates an average of pixels and the number of taps corresponds to the velocity outputted from the motion detecting unit 903 in this embodiment, the calculation and the definition of the number of taps are not limited to these.
  • (g) of FIG. 11 shows an LPF-passing signal of an absolute value signal. This is used as a correction signal.
  • the correction signal region limiting unit 908 limits a correction signal value in a region other than the peripheral motion region to 0.
  • An end of the peripheral motion region may be blurred using an LPF or other means so as to prevent the correction signal from becoming discontinuous. Thereby, only a region where a motion blur is noticeable and the intensity of light is greatly reduced can be corrected.
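  • A sketch of limiting the correction signal to the peripheral motion region, with a short average used to soften the boundary, is given below; the function name and the window length are assumptions.

```python
import numpy as np

def limit_to_peripheral_region(correction_signal, region_start, region_end, blur_taps=3):
    """Zero the correction signal outside the peripheral motion region and lightly
    blur the boundary of the region so the correction signal stays continuous."""
    mask = np.zeros_like(correction_signal, dtype=np.float64)
    mask[region_start:region_end + 1] = 1.0
    if blur_taps > 1:
        kernel = np.full(blur_taps, 1.0 / blur_taps)
        mask = np.convolve(mask, kernel, mode="same")   # soften the edges of the mask
    return correction_signal * mask
```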
  • the correction signal is subtracted from a current field.
  • the subtractor 909 subtracts, from a current field, a correction signal outputted from the correction signal region limiting unit 908 .
  • (h) in FIG. 11 shows the corrected current field.
  • the object of the third embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of a reduced intensity region for each image signal without using a motion direction, and subtracting a correction signal from current fields of a red image signal and a green image signal. Simultaneously, color shift can be reduced by reducing the motion blur.
  • FIG. 12 illustrates a block diagram of a detailed configuration of an image display apparatus according to the fourth embodiment.
  • the image display apparatus according to the fourth embodiment is partially changed from that of the third embodiment. The differences will only be described hereinafter.
  • An object of the fourth embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of an increased intensity region for each image signal and adding a correction signal to current fields of a red image signal and a green image signal that have long persistence times. Furthermore, another object of the fourth embodiment is to reduce color shift simultaneously by reducing the motion blur.
  • the subtractor 902 is changed to a subtractor 911, and the subtractor 909 is changed to an adder 912.
  • the change from the subtractor 909 to the adder 912 will be described.
  • a correction signal can be added to an increased intensity region where persistence is insufficient.
  • a blur caused by the persistence can be reduced and color shift can also be reduced.
  • the object of the fourth embodiment is to reduce a motion blur by calculating a motion blur component in a vicinity of an increased intensity region for each image signal and adding a correction signal to current fields of a red image signal and a green image signal that have long persistence times. Simultaneously, color shift can be reduced by reducing the motion blur.
  • the motion detecting unit, an asymmetric gain, and an LPF may be extended two-dimensionally to perform two-dimensional correction.
  • after the final correction, namely the subtraction or the addition (the processing performed in the correcting unit 4 in FIG. 4 as the base configuration), the red and green image signals may have values beyond the range that the signals can take, and the correction may be insufficient. In other words, there are cases where a motion blur cannot be removed completely. In the case of 8 bits, there are cases where an image signal that has been corrected has a negative value or a value equal to or more than 255.
  • the red and green image signals may be simply clipped to a value in a range from 0 to 255.
  • a negative value of the image signal may be replaced with 0, and a value equal to or larger than 255 of the image signal may be replaced with 255 for the output.
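  • The clipping described above can be sketched as follows for an 8-bit image signal; the function name is an assumption.

```python
import numpy as np

def apply_correction_8bit(field, correction_signal, subtract=True):
    """Apply the correction to an 8-bit field and clip the result to the range 0 to 255."""
    signed = field.astype(np.int32)
    corrected = signed - correction_signal if subtract else signed + correction_signal
    return np.clip(corrected, 0, 255).astype(np.uint8)
```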
  • color shift may be improved by adding an absolute value representing a correction-deficient component (of whichever of the red signal and the green signal has the larger absolute value) to the blue image signal having no motion blur, and by subtracting the absolute value from the blue image signal in a vicinity of a reduced intensity region.
  • a correction signal is calculated even for a blue image signal to limit the correction, thus preventing correction beyond the value of the calculated correction signal from being performed on the blue image signal.
  • this function can be used.
  • a reduced intensity region is corrected in the first and third embodiments, and an increased intensity region is corrected in the second and fourth embodiments.
  • red and green image signals are corrected in the first to fourth embodiments
  • a signal to be corrected is not limited to these signals.
  • a blue image signal may be corrected.
  • the motion blur cannot be improved but color shift can be improved.
  • a blue signal can be corrected more precisely than the correction in Patent Reference 1 by using a motion direction.
  • An object of this image display apparatus is to reduce a motion blur by calculating a motion blur component in a vicinity of a reduced intensity region for each image signal and adding a correction signal to a current field of a blue image signal having a short persistence time.
  • the LPF 604, the asymmetric gain calculating unit 605, and the subtractor 608 are changed. The following describes the details.
  • the LPF 604 is not used. This is because processing for spatially amplifying a subtraction signal is not necessary when a blue image signal is used for the correction. For such correction, a motion region has only to be corrected as a region 412 in FIG. 3 .
  • An asymmetric gain has a geometry that can be corrected, for example, as the region 412 in FIG. 3 .
  • a correction signal needs to have a geometry like the region 412 in FIG. 3 .
  • the geometry is different from a correction signal geometry 410 for use in correction by red and green image signals.
  • an asymmetric gain having a geometry different from the correction signal geometry 410 needs to be used.
  • the subtractor 608 is changed to an adder. This is because a blue correction signal is added.
  • a motion blur can be reduced by calculating a motion blur component in a vicinity of a reduced intensity region for each image signal and adding a correction signal to a current field of a blue image signal having a short persistence time.
  • the following describes an image display apparatus, partially changed from the first embodiment, in which an increased intensity region is corrected with respect to a blue image signal using a motion direction.
  • the changes will only be described hereinafter.
  • the object of this image display apparatus is to reduce a motion blur by calculating a motion blur component in a vicinity of an increased intensity region for each image signal and adding a correction signal to a current field of a blue image signal having a short persistence time.
  • the subtractor 602 and the motion detecting unit 603 are changed in the same manner as those of the second embodiment.
  • the LPF 604 is not used. This is because processing for spatially amplifying a subtraction signal is not necessary when a blue image signal is used for the correction. For such correction, a motion region has only to be corrected as the region 413 in FIG. 3 .
  • An asymmetric gain has a geometry that can be corrected, for example, as the region 413 in FIG. 3 .
  • a correction signal needs to have a geometry like the region 413 in FIG. 3 .
  • the geometry is different from a correction signal geometry 411 for use in correction by red and green image signals.
  • an asymmetric gain having a geometry different from the correction signal geometry 411 needs to be used.
  • a motion blur can be reduced by calculating a motion blur component in a vicinity of an increased intensity region for each image signal and subtracting a correction signal from a current field of a blue image signal having a short persistence time.
  • Each of the above apparatuses is specifically a computer system including a micro processing unit, a ROM, a RAM, and the like.
  • the computer program is stored in the RAM.
  • the micro processing unit operates according to the computer program, so that each of the apparatuses fulfills a function.
  • the computer program is programmed by combining plural instruction codes each of which indicates an instruction for a computer.
  • the system LSI is a super-multifunctional LSI manufactured by integrating components on one chip and is, specifically, a computer system including a micro processing unit, a ROM, a RAM, and the like.
  • the computer program is stored in the RAM.
  • the micro processing unit operates according to the computer program, so that the system LSI fulfills its function.
  • the IC card or the module is a computer system including a micro processing unit, a ROM, a RAM, and the like.
  • the IC card or the module may include the above super-multifunctional LSI.
  • the micro processing unit operates according to the computer program, so that the IC card or the module fulfills its function.
  • the IC card or the module may have tamper-resistance.
  • the present invention may be any of the above methods.
  • the present invention may be a computer program which causes a computer to execute these methods, and a digital signal which is composed of the computer program.
  • the computer program or the digital signal may be recorded on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a Blu-ray Disc (BD), and a semiconductor memory.
  • the digital signal may be recorded on these recording media.
  • the computer program or the digital signal may be transmitted via an electronic communication line, a wireless or wired communication line, a network represented by the Internet, data broadcasting, and the like.
  • the present invention may be a computer system including a micro processing unit and a memory.
  • the memory may store the above computer program, and the micro processing unit may operate according to the computer program.
  • the present invention may execute the computer program or the digital signal in another independent computer system by recording the computer program or the digital signal on the recording medium and transmitting the recorded computer program or digital signal or by transmitting the computer program or the digital signal via the network and the like.
  • the image display apparatus and the image displaying method according to the present invention can reduce, in an image, a motion blur occurring due to a persistence component in a phosphor. Accordingly, the color shift can be improved.
  • the present invention is applicable to an image display apparatus using phosphors each having a persistence time, such as a plasma display panel.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Power Engineering (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Control Of Gas Discharge Display Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)
US12/301,054 2006-05-23 2007-05-23 Image display apparatus, image displaying method, plasma display panel apparatus, program, integrated circuit, and recording medium Expired - Fee Related US8174544B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2006-142490 2006-05-23
JP2006142490 2006-05-23
PCT/JP2007/060553 WO2007136099A1 (ja) 2006-05-23 2007-05-23 画像表示装置、画像表示方法、プラズマディスプレイパネル装置、プログラム、集積回路、及び、記録媒体

Publications (2)

Publication Number Publication Date
US20090184894A1 US20090184894A1 (en) 2009-07-23
US8174544B2 true US8174544B2 (en) 2012-05-08

Family

ID=38723409

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/301,054 Expired - Fee Related US8174544B2 (en) 2006-05-23 2007-05-23 Image display apparatus, image displaying method, plasma display panel apparatus, program, integrated circuit, and recording medium

Country Status (6)

Country Link
US (1) US8174544B2 (zh)
EP (1) EP2028638A4 (zh)
JP (1) JP5341509B2 (zh)
KR (1) KR101359139B1 (zh)
CN (1) CN101449312B (zh)
WO (1) WO2007136099A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120081419A1 (en) * 2010-10-01 2012-04-05 Canon Kabushiki Kaisha Image display apparatus and control method thereof
US9247245B2 (en) 2010-06-07 2016-01-26 Stmicroelectronics International N.V. Adaptive filter for video signal processing for decoder that selects rate of switching between 2D and 3D filters for separation of chroma and luma signals
US10283031B2 (en) 2015-04-02 2019-05-07 Apple Inc. Electronic device with image processor to reduce color motion blur
US20190287469A1 (en) * 2018-03-13 2019-09-19 Samsung Display Co, Ltd, Display device and a method for driving the same

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101543065B (zh) * 2007-02-20 2012-03-14 索尼株式会社 图像显示装置、视频信号处理器以及视频信号处理方法
TWI401944B (zh) * 2007-06-13 2013-07-11 Novatek Microelectronics Corp 用於視訊處理系統之雜訊消除裝置
KR20090008621A (ko) * 2007-07-18 2009-01-22 삼성전자주식회사 중요 움직임 검출 방법 및 장치
JP2010015061A (ja) * 2008-07-04 2010-01-21 Panasonic Corp 画像表示装置、集積回路及びコンピュータプログラム
EP2242035A1 (en) 2009-04-17 2010-10-20 Thomson Licensing Reduction of phosphor lag artifacts on display devices
US20120133835A1 (en) * 2009-08-11 2012-05-31 Koninklijke Philips Electronics N.V. Selective compensation for age-related non uniformities in display
JP4947668B2 (ja) * 2009-11-20 2012-06-06 シャープ株式会社 電子機器、表示制御方法、およびプログラム
US8588474B2 (en) * 2010-07-12 2013-11-19 Texas Instruments Incorporated Motion detection in video with large areas of detail
CN102543006A (zh) * 2010-12-13 2012-07-04 康耀仁 补偿显示装置、补偿方法及电路
JP2015041367A (ja) * 2013-08-23 2015-03-02 株式会社東芝 画像解析装置、画像解析方法、画像解析プログラム
KR102294633B1 (ko) * 2015-04-06 2021-08-30 삼성디스플레이 주식회사 표시 장치 및 표시 장치의 구동 방법
CN105957468B (zh) * 2016-05-03 2019-11-29 苏州佳世达光电有限公司 一种色彩校正方法及应用其的显示装置

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10254403A (ja) 1997-03-06 1998-09-25 Fujitsu General Ltd ディスプレイ装置の動画補正回路
JPH10282930A (ja) 1997-04-10 1998-10-23 Fujitsu General Ltd ディスプレイ装置の動画補正方法及び動画補正回路
EP0896317A2 (en) 1997-08-07 1999-02-10 Hitachi, Ltd. Color image display apparatus and method
US5872604A (en) * 1995-12-05 1999-02-16 Sony Corporation Methods and apparatus for detection of motion vectors
JPH11109916A (ja) 1997-08-07 1999-04-23 Hitachi Ltd カラー画像表示装置
EP0924684A1 (en) 1997-12-15 1999-06-23 THOMSON multimedia Method of compensating for the differences in persistence of the phosphors of a plasma display panel
JP2001255863A (ja) 2000-03-14 2001-09-21 Nippon Hoso Kyokai <Nhk> 表示画像の画質劣化を低減させる方法および装置
JP2002014647A (ja) 2000-06-28 2002-01-18 Fujitsu Hitachi Plasma Display Ltd 表示パネルの駆動方法および駆動装置
EP1175089A2 (en) * 2000-07-17 2002-01-23 SANYO ELECTRIC Co., Ltd. A method for suppressing noise in image signals and an image signal processing device adopting the same
JP2004004782A (ja) 2002-04-24 2004-01-08 Matsushita Electric Ind Co Ltd 画像表示装置
US6687387B1 (en) * 1999-12-27 2004-02-03 Internet Pictures Corporation Velocity-dependent dewarping of images
EP1426915A1 (en) 2002-04-24 2004-06-09 Matsushita Electric Industrial Co., Ltd. Image display device
JP2004191728A (ja) 2002-12-12 2004-07-08 Matsushita Electric Ind Co Ltd 残光による画質劣化を補正する表示方法及び表示装置
US20040169732A1 (en) * 2001-06-23 2004-09-02 Sebastien Weitbruch Colour defects in a display panel due to different time response of phosphors
EP1460611A1 (en) 2003-03-17 2004-09-22 Deutsche Thomson-Brandt Gmbh Method and device for compensating the phosphor lag of display devices
US20050001935A1 (en) * 2003-07-03 2005-01-06 Shinya Kiuchi Image processing device, image display device, and image processing method
US20050030302A1 (en) * 2003-07-04 2005-02-10 Toru Nishi Video processing apparatus, video processing method, and computer program
JP2005141204A (ja) 2003-10-14 2005-06-02 Matsushita Electric Ind Co Ltd 画像信号処理方法および画像信号処理装置
EP1605689A1 (en) 2004-01-30 2005-12-14 Matsushita Electric Industrial Co., Ltd. Frame circulating type noise reduction method and frame circulating type noise reduction device
JP2005351949A (ja) 2004-06-08 2005-12-22 Mitsubishi Electric Corp 画像表示装置
US20060192786A1 (en) * 2003-10-14 2006-08-31 Kazuhiro Yamada Image signal processing method and image signal processing apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2824947B1 (fr) * 2001-05-17 2003-08-08 Thomson Licensing Sa Procede d'affichage d'une sequence d'image video sur un panneau d'affichage au plasma

Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5872604A (en) * 1995-12-05 1999-02-16 Sony Corporation Methods and apparatus for detection of motion vectors
JPH10254403A (ja) 1997-03-06 1998-09-25 Fujitsu General Ltd ディスプレイ装置の動画補正回路
JPH10282930A (ja) 1997-04-10 1998-10-23 Fujitsu General Ltd ディスプレイ装置の動画補正方法及び動画補正回路
US6014258A (en) 1997-08-07 2000-01-11 Hitachi, Ltd. Color image display apparatus and method
EP0896317A2 (en) 1997-08-07 1999-02-10 Hitachi, Ltd. Color image display apparatus and method
JPH11109916A (ja) 1997-08-07 1999-04-23 Hitachi Ltd カラー画像表示装置
JPH11259044A (ja) 1997-12-15 1999-09-24 Thomson Multimedia Sa 画像表示スクリーン中の蛍光体の残光性の差を補償する方法及び装置
US6377232B1 (en) 1997-12-15 2002-04-23 Thomson Licensing S.A. Method of compensating for the differences in persistence of the phosphors in an image display screen
EP0924684A1 (en) 1997-12-15 1999-06-23 THOMSON multimedia Method of compensating for the differences in persistence of the phosphors of a plasma display panel
US6687387B1 (en) * 1999-12-27 2004-02-03 Internet Pictures Corporation Velocity-dependent dewarping of images
JP2001255863A (ja) 2000-03-14 2001-09-21 Nippon Hoso Kyokai <Nhk> 表示画像の画質劣化を低減させる方法および装置
JP2002014647A (ja) 2000-06-28 2002-01-18 Fujitsu Hitachi Plasma Display Ltd 表示パネルの駆動方法および駆動装置
EP1175089A2 (en) * 2000-07-17 2002-01-23 SANYO ELECTRIC Co., Ltd. A method for suppressing noise in image signals and an image signal processing device adopting the same
US20040169732A1 (en) * 2001-06-23 2004-09-02 Sebastien Weitbruch Colour defects in a display panel due to different time response of phosphors
JP2004532433A (ja) 2001-06-23 2004-10-21 トムソン ライセンシング ソシエテ アノニム 発光素子の時間的応答が異なることに起因するカラートレールのディスカラレーション
JP2004004782A (ja) 2002-04-24 2004-01-08 Matsushita Electric Ind Co Ltd 画像表示装置
US20040183820A1 (en) * 2002-04-24 2004-09-23 Isao Kawahara Image display device
EP1426915A1 (en) 2002-04-24 2004-06-09 Matsushita Electric Industrial Co., Ltd. Image display device
JP2004191728A (ja) 2002-12-12 2004-07-08 Matsushita Electric Ind Co Ltd 残光による画質劣化を補正する表示方法及び表示装置
EP1460611A1 (en) 2003-03-17 2004-09-22 Deutsche Thomson-Brandt Gmbh Method and device for compensating the phosphor lag of display devices
US20050001935A1 (en) * 2003-07-03 2005-01-06 Shinya Kiuchi Image processing device, image display device, and image processing method
US20050030302A1 (en) * 2003-07-04 2005-02-10 Toru Nishi Video processing apparatus, video processing method, and computer program
JP2005141204A (ja) 2003-10-14 2005-06-02 Matsushita Electric Ind Co Ltd 画像信号処理方法および画像信号処理装置
US20060192786A1 (en) * 2003-10-14 2006-08-31 Kazuhiro Yamada Image signal processing method and image signal processing apparatus
EP1605689A1 (en) 2004-01-30 2005-12-14 Matsushita Electric Industrial Co., Ltd. Frame circulating type noise reduction method and frame circulating type noise reduction device
US20060192896A1 (en) 2004-01-30 2006-08-31 Kazuki Sawa Frame circulating type noise reduction method and frame circulating type noise reduction device
JP2005351949A (ja) 2004-06-08 2005-12-22 Mitsubishi Electric Corp 画像表示装置

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
International Search Report issued Aug. 7, 2007 in the International (PCT) Application of which the present application is the U.S. National Stage.
Supplementary European Search Report (in English language) issued Aug. 6, 2010 in corresponding European Patent Application No. 07 74 3987.

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9247245B2 (en) 2010-06-07 2016-01-26 Stmicroelectronics International N.V. Adaptive filter for video signal processing for decoder that selects rate of switching between 2D and 3D filters for separation of chroma and luma signals
US20120081419A1 (en) * 2010-10-01 2012-04-05 Canon Kabushiki Kaisha Image display apparatus and control method thereof
US10283031B2 (en) 2015-04-02 2019-05-07 Apple Inc. Electronic device with image processor to reduce color motion blur
US20190287469A1 (en) * 2018-03-13 2019-09-19 Samsung Display Co, Ltd, Display device and a method for driving the same

Also Published As

Publication number Publication date
KR20090010990A (ko) 2009-01-30
US20090184894A1 (en) 2009-07-23
EP2028638A4 (en) 2010-09-08
WO2007136099A1 (ja) 2007-11-29
JP5341509B2 (ja) 2013-11-13
JPWO2007136099A1 (ja) 2009-10-01
CN101449312A (zh) 2009-06-03
CN101449312B (zh) 2012-06-20
KR101359139B1 (ko) 2014-02-05
EP2028638A1 (en) 2009-02-25

Similar Documents

Publication Publication Date Title
US8174544B2 (en) Image display apparatus, image displaying method, plasma display panel apparatus, program, integrated circuit, and recording medium
CA2417169C (en) Image processing apparatus and image processing method
JPWO2004002135A1 (ja) 動き検出装置及びそれを用いたノイズリダクション装置
JP2008261984A (ja) 画像処理方法及びこれを用いた画像表示装置
KR100714723B1 (ko) 디스플레이 패널에서의 잔광 보상 방법과 잔광 보상 기기,그리고 상기 잔광 보상 기기를 포함하는 디스플레이 장치
JP5149725B2 (ja) 画像処理装置及びその制御方法
WO2011155258A1 (ja) 画像処理装置および方法、画像表示装置および方法
US20100002005A1 (en) Image display apparatus, integrated circuit, and computer program
US20120314969A1 (en) Image processing apparatus and display device including the same, and image processing method
KR20100064808A (ko) 픽셀의 휘도 정보에 기초한 영상 향상 장치 및 방법
JP6180135B2 (ja) 画像表示装置及びその制御方法
US20120162528A1 (en) Video processing device and video display device
JP2008033592A (ja) 画像処理装置および画像処理方法、並びにプログラム
JP3801179B2 (ja) フレーム巡回型ノイズ低減方法
KR100700405B1 (ko) 계조 표시 장치
US7602357B2 (en) Method and apparatus of image signal processing
KR20100115310A (ko) 디스플레이 장치 상의 발광체 지연 아티팩트 결함 저감 장치 및 방법
JP4412042B2 (ja) フレーム巡回型ノイズ低減方法
JP5111310B2 (ja) 画像処理方法及び画像処理装置
JP2010139947A (ja) 画像信号処理方法及び画像信号処理装置
KR101577703B1 (ko) 흐림과 이중 윤곽의 효과를 줄이는 비디오 화상 디스플레이방법과 이러한 방법을 구현하는 디바이스
KR20040068970A (ko) 디지털 이미지 처리 시스템들에서 움직임 벡터들의 조정
KR100462596B1 (ko) 의사 윤곽 보정 장치 및 방법
JP2011232586A (ja) 三次元誤差拡散装置および三次元誤差拡散方法
JP2006217057A (ja) 画質変換処理方法および画質変換処理装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: PANASONIC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SATO, DAISUKE;MONOBE, YUSUKE;REEL/FRAME:022060/0762

Effective date: 20081018

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20200508