US9270897B2 - Apparatus, method, and program for processing image - Google Patents

Apparatus, method, and program for processing image

Info

Publication number
US9270897B2
Authority
US
United States
Prior art keywords
motion blur
motion
image
shutter speed
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US12/487,922
Other languages
English (en)
Other versions
US20090316009A1 (en
Inventor
Atsushi Ito
Seiji Kobayashi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION. Assignment of assignors interest (see document for details). Assignors: KOBAYASHI, SEIJI; ITO, ATSUSHI
Publication of US20090316009A1 publication Critical patent/US20090316009A1/en
Application granted granted Critical
Publication of US9270897B2 publication Critical patent/US9270897B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • H04N5/243
    • G06T5/002
    • G06T5/003
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/76Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N5/235
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20201Motion blur correction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • H04N5/145Movement estimation

Definitions

  • The present invention relates to an apparatus, method, and program for processing an image and, in particular, to an image processing technique for obtaining a high-quality image by taking into consideration motion blur in the image.
  • When a moving image taken at a high shutter speed, or an animation, is displayed on a display device such as a projector or a display, the motion of a moving object contained in the image can be displayed in a discontinuous fashion. This frequently leads to image degradation in which a viewer sees multiple images of the moving object.
  • The degradation of a moving image due to such motion unnaturalness is generally referred to as jerkiness.
  • Conversely, when a moving image taken at a low shutter speed, such as with an open shutter, is displayed, the image of an object may lack detail, or an edge of the image may become blurred, because of the effect of motion blur. This phenomenon is referred to as blur, and is also one of the image degradations.
  • FIGS. 26-28 diagrammatically illustrate the way an object is viewed by a viewer in accordance with the vision characteristics.
  • FIGS. 26A and 26B illustrate how a still object and a moving object look in the real world.
  • FIG. 26A illustrates the chronological movement of a still object 71 and a moving object 72 , with the abscissa representing position (x) and the ordinate representing time (t).
  • FIG. 26B diagrammatically illustrates the vision of a viewer who views the still object 71 and the moving object 72 .
  • The viewer views the objects in two vision conditions, i.e., a tracking vision tracking the moving object 72 and a fixed vision not tracking the moving object 72 , respectively illustrated as (a) tracking vision and (b) fixed vision.
  • In (a) tracking vision, the moving object 72 looks like moving object vision information a 72 .
  • This is identical to fixed object vision information b 71 , i.e., how the still object 71 looks in (b) fixed vision in FIG. 26B .
  • In tracking vision, the moving object 72 thus looks the same way as the still object 71 looks in the fixed vision.
  • In (b) fixed vision, the moving object 72 looks like moving object vision information b 72 in FIG. 26B . The viewer visually recognizes the moving object as a continuously moving object, and is free from any viewing discomfort.
  • FIGS. 27A and 27B illustrate the principle of the generation of jerkiness viewed by the viewer when a moving image taken at a high shutter speed, or an animation, is displayed on a display device such as a projector or a display.
  • Jerkiness is a phenomenon in which the motion of a moving object contained in an image is displayed in a discontinuous manner, causing the viewer to visually recognize multiple images of the moving object.
  • Like FIGS. 26A and 26B , FIGS. 27A and 27B illustrate the way the viewer visually recognizes the moving object.
  • FIG. 27A illustrates a change in the display positions of a display still object 81 and a display moving object 82 on a display device.
  • the ordinate represents time (t) and is graduated in refresh periods of the display device (each period being 1/60 second), and the abscissa represents display position (x).
  • FIG. 27B diagrammatically illustrates the vision status of the viewer who views the display still object 81 and the display moving object 82 displayed on the display device.
  • the vision status of the viewer includes (a) tracking vision in which the viewer views the image with the display moving object 82 being tracked, and (b) fixed vision in which the viewer views the image with the display moving object 82 not tracked but with the vision of the viewer fixed.
  • In (a) tracking vision in FIG. 27B , an image a 82 looks the same way as the image a 72 looks in (a) tracking vision in FIG. 26B .
  • the viewer visually recognizes the image in the same way as the viewer views a still object in the fixed vision.
  • When the display moving object 82 displayed on the display device is viewed by the viewer in (b) fixed vision as illustrated in FIG. 27B , the display moving object 82 looks like images b 82 , moving not continuously but discontinuously, in a manner different from the real world. As a result, the viewer visually recognizes the moving object displayed on the display device as multiple images, based on human vision characteristics: humans visually recognize a light ray incident on the eyes as a value that results from integrating the light ray over a predetermined period of time.
  • In principle, an object moving at a high speed suffers more from jerkiness. The lower the frame rate of the display device, the more jerkiness takes place, and the higher the frame rate, the less jerkiness takes place. Furthermore, jerkiness takes place more in a portion of an image where the change in spatial luminance is large, i.e., where the spatial contrast is high.
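The frame-rate dependence described above can be illustrated with a brief numeric sketch (not part of the patent disclosure; the object speed and frame rates below are arbitrary example values). The jump of an object between consecutive refreshes, which fixed vision perceives as discrete multiple images, shrinks as the frame rate rises:

```python
# Per-frame displacement of a moving object at several display frame rates.
# Larger jumps between refreshes make jerkiness more visible in fixed vision.
def per_frame_jump(speed_px_per_s: float, frame_rate_hz: float) -> float:
    """Distance (in pixels) the object moves between consecutive frames."""
    return speed_px_per_s / frame_rate_hz

speed = 600.0  # example object speed in pixels per second
for rate in (24.0, 60.0, 120.0):
    print(f"{rate:5.0f} Hz -> {per_frame_jump(speed, rate):5.1f} px/frame")
```

At the example speed, doubling the refresh rate from 60 Hz to 120 Hz halves the per-frame jump, which is why higher frame rates reduce jerkiness.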
  • FIGS. 28A and 28B illustrate how a blur viewed by the viewer is generated when a moving image taken at a low-speed shutter such as with an open shutter or an animation is displayed on a display device such as a projector or a display.
  • Blur is a phenomenon in which the image of an object may lack detail or an edge of the image becomes blurred because of the effect of motion blur.
  • FIGS. 28A and 28B diagrammatically illustrate how the moving object in the real world illustrated in FIGS. 26A and 26B looks to the viewer when the moving object is imaged at a low-speed shutter and then displayed on the display device at a refresh rate of 60 Hz.
  • FIG. 28A illustrates a change in the display positions of a display still object 91 and a display moving object 92 on the display device.
  • the ordinate represents time (t) and is graduated in refresh periods of the display device (each period being 1/60 second), and the abscissa represents display position (x).
  • FIG. 28B diagrammatically illustrates the vision status of the viewer who views the display still object 91 and the display moving object 92 displayed on the display device.
  • the vision status of the viewer includes (a) tracking vision in which the viewer views the image with the display moving object 92 being tracked, and (b) fixed vision in which the viewer views the image with the display moving object 92 not tracked but with the vision of the viewer fixed.
  • In (b) fixed vision, an image b 92 looks the same way as the image b 72 looks in (b) fixed vision in FIG. 26B .
  • The viewer visually recognizes the moving object as a continuously moving object, and is free from any viewing discomfort.
  • In (a) tracking vision, however, an image a 92 looks to the viewer like a blurred image as illustrated in FIG. 28B , in a manner different from the case in which the still object is viewed in fixed vision.
  • When the display moving object 92 in FIG. 28A is imaged, the motion of the moving object during the long exposure of a low-speed shutter is recorded within one frame, and the moving object is thus displayed as a band in one frame.
  • Such a phenomenon is referred to as blur.
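The band-like smear described above can be modeled as an averaging of the scene along the motion path during the exposure. The following one-dimensional sketch assumes a constant velocity and a box-shaped point spread function; it is an illustration, not the patent's processing:

```python
import numpy as np

def motion_blur_1d(row: np.ndarray, blur_len: int) -> np.ndarray:
    """Approximate an open-shutter exposure by averaging `blur_len`
    horizontally shifted copies of one scan line (box point-spread
    function for constant-velocity motion)."""
    acc = np.zeros_like(row, dtype=float)
    for s in range(blur_len):
        acc += np.roll(row, s)
    return acc / blur_len

# A bright 4-pixel object smears into a band of length 4 + blur_len - 1.
row = np.zeros(16)
row[4:8] = 1.0
blurred = motion_blur_1d(row, blur_len=5)
print(np.count_nonzero(blurred))  # band is wider than the object
```

The total brightness is preserved by the averaging, but it is spread over a wider band, which is exactly the loss of edge sharpness the text describes.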
  • Mere shutter control causes either the jerkiness degradation or the blur degradation to be pronounced. More specifically, if an image taken at a relatively high shutter speed with respect to the frame rate of the moving image is displayed as a still image, high sharpness is provided; if the image is displayed as a moving image, the motion of a moving area within the image, in particular an area moving at a high speed, is not smooth, and such an image looks unnatural to human vision. Conversely, if an image taken at a relatively low shutter speed with respect to the frame rate of the moving image is displayed as a moving image, the motion of a high-speed moving area within the image is smooth, but the entire image lacks sharpness.
  • Japanese Unexamined Patent Application Publication No. 2007-274299 discloses a jerkiness reducing technique intended to be used on an image taken at a high shutter speed.
  • a motion blur is added through image processing.
  • The added amount of motion blur is controlled through image analysis so that excessive addition of the motion blur does not cause blur degradation.
  • A technical approach of reducing the motion blur through image processing, performed mainly on an input image taken at a low shutter speed, has also been widely studied.
  • Image processing techniques for correcting the blur of an image mainly include an inverse convolution technique based on a blur model, and techniques not based on a blur model, such as a peaking technique or a shock filter technique.
  • The technique disclosed in the paper entitled “Extension of Coupled Nonlinear Diffusion to Motion De-blurring—Introduction of Anisotropic Peaking,” Takahiro SAITO, Hiroyuki HARADA, and Takashi KOMATSU, The Institute of Image Information and Television Engineers, Vol. 58, No. 12, pp. 1839-1844 (2004), is related to the technique not based on a blur model as motion blur reduction means.
  • The technique disclosed in the paper entitled “Motion De-blurring Using a Blur Model,” Takahiro SAITO, Hiroyuki HARADA, Taishi SANO, and Takashi KOMATSU, The Institute of Image Information and Television Engineers, Vol. 59, No. 11, pp. 1714-1721 (2005), is related to the blur-model-based inverse convolution technique as motion blur reduction means.
  • an image processing apparatus includes correction parameter calculation means for calculating a motion blur correction parameter for motion blur correction on the basis of motion information indicating a motion of an image between unit images, the unit images forming image data, and shutter speed information obtained at the image capturing of the image data, and motion blur correction processing means for correcting a motion blur quantity contained in the image data by performing at least a process of reducing a motion blur in accordance with the motion blur correction parameter.
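One concrete quantity that such a correction parameter can be derived from is the blur length in pixels, i.e., the distance an object travels on the sensor while the shutter is open, obtained from the per-area motion information and the shutter speed information. The sketch below is an illustrative assumption; the function name and the 60 Hz frame period are not specified by the patent:

```python
def blur_length_px(motion_px_per_frame: float,
                   shutter_s: float,
                   frame_period_s: float = 1.0 / 60.0) -> float:
    """Pixels an object travels while the shutter is open: the motion
    vector magnitude (pixels per frame) scaled by the fraction of the
    frame period during which the sensor integrates light."""
    return motion_px_per_frame * (shutter_s / frame_period_s)

# An object moving 12 px/frame, captured at 1/120 s on 60 Hz video,
# smears over about 6 pixels.
print(blur_length_px(12.0, 1.0 / 120.0))
```

A short exposure relative to the frame period yields a small blur length (jerkiness risk), while an open shutter yields a blur length close to the full per-frame motion (blur risk).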
  • the motion blur correction processing means may perform a process of adding a motion blur on the image data and the process of reducing a motion blur in accordance with the motion blur correction parameter.
  • the image processing apparatus may further include shutter speed estimation processing means for estimating shutter speed information by analyzing the image data, wherein the correction parameter calculation means uses the shutter speed information estimated by the shutter speed estimation processing means in order to calculate the motion blur correction parameter.
  • the motion blur correction processing means may adaptively select in response to each partition area of the image data one of the process of adding the motion blur and the process of reducing the motion blur on the image data, in accordance with the motion blur correction parameter.
  • the motion blur correction processing means may perform on the image separately the process of adding the motion blur and the process of reducing the motion blur, and select between data resulting from the process of adding the motion blur and data resulting from the process of reducing the motion blur in accordance with the motion blur correction parameter as data to be adaptively output for each partition area of the image data.
  • the image processing apparatus may further include motion vector generating means for generating from the image data a motion vector as the motion information.
  • the shutter speed estimation processing means may include a motion blur characteristic analyzer for extracting a shutter speed calculation parameter by analyzing motion blur characteristics contained in a target area of the image data, and an imaging shutter speed calculator for calculating the shutter speed at the image capturing of the image data.
  • the shutter speed estimation processing means may further include a process target area selector for extracting and identifying from the unit image forming the image data the target area of the analysis process of the motion blur characteristic analyzer.
  • the image processing apparatus may further include motion vector generating means for generating from the image data a motion vector as the motion information.
  • the process target area selector in the shutter speed estimation processing means identifies the target area using edge information of the image data, and the motion vector generated by the motion vector generating means.
  • the shutter speed estimation processing means may further include an imaging shutter speed accuracy enhancement processor.
  • the motion blur characteristic analyzer extracts the shutter speed calculation parameters of a plurality of target areas.
  • the imaging shutter speed calculator calculates a plurality of shutter speeds using the shutter speed calculation parameters of the plurality of target areas and the motion information of the respective target areas.
  • the imaging shutter speed accuracy enhancement processor estimates an imaging shutter speed using calculation results of the plurality of shutter speeds.
  • the shutter speed estimation processing means may estimate the shutter speed once within a period from the detection of a scene change to the detection of a next scene change in the input image data, and hold the estimated shutter speed within the period.
  • the shutter speed estimation processing means may include an imaging shutter speed accuracy enhancement processor.
  • The imaging shutter speed accuracy enhancement processor estimates the shutter speed a plurality of times within a period from the detection of a scene change to the detection of a next scene change in the input image data, and estimates an imaging shutter speed on the basis of the calculation results of the plurality of shutter speeds estimated.
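As an illustration of such accuracy enhancement, several per-area (or per-time) shutter-speed estimates can be fused into one robust value for the scene; the median used below is an assumed combining rule, since the summary above does not specify one:

```python
import statistics

def combine_shutter_estimates(estimates_s: list[float]) -> float:
    """Fuse noisy shutter-speed estimates into a single value; the
    median discards outliers from poorly textured target areas.
    (The median is an illustrative assumption, not the patent's rule.)"""
    return statistics.median(estimates_s)

# Estimates from five target areas within one scene, in seconds.
print(combine_shutter_estimates([1/60, 1/50, 1/60, 1/250, 1/60]))
```

Holding the fused value until the next scene change, as the text describes, avoids frame-to-frame flicker in the subsequent correction.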
  • the correction parameter calculation means may acquire an optimum shutter speed corresponding to a speed of an object from each partition area of the image data by referencing mapping information mapping the object speed to an imaging shutter speed at which image quality degradation of an output image is reduced, and calculate the motion blur correction parameter as selection control information for selecting between a process of adding a motion blur and a process of reducing a motion blur on the image data by comparing information regarding an input imaging shutter speed with the optimum shutter speed.
  • the motion blur correction processing means may selectively perform on the image data the process of adding the motion blur and the process of reducing the motion blur in accordance with the motion blur correction parameter.
  • the correction parameter calculation means may acquire an optimum shutter speed corresponding to a speed of an object from each partition area of the image data by referencing mapping information mapping the object speed to an imaging shutter speed at which image quality degradation of an output image is reduced, and calculate the motion blur correction parameter as selection control information for selecting between a process of adding a motion blur and a process of reducing a motion blur on the image data by comparing information regarding an input imaging shutter speed with the optimum shutter speed.
  • the motion blur correction processing means may perform on the image separately the process of adding the motion blur and the process of reducing the motion blur, and select between data resulting from the process of adding the motion blur and data resulting from the process of reducing the motion blur in accordance with the motion blur correction parameter as data to be adaptively output for each partition area of the image data.
  • the correction parameter calculation means may calculate the motion blur correction parameter indicating one of the degree of addition of the motion blur and the degree of reduction of the motion blur, respectively used by the motion blur correction processing means in the process of adding the motion blur and the process of reducing the motion blur on the image data.
  • the motion blur correction parameter indicating one of the degree of addition of the motion blur and the degree of reduction of the motion blur may include one of an imaging shutter speed and a difference between the imaging shutter speed and an optimum shutter speed.
  • the motion blur correction parameter indicating one of the degree of addition of the motion blur and the degree of reduction of the motion blur may include movement speed information of a partition area.
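The selection logic described above can be sketched as a comparison of the imaging shutter speed with the optimum shutter speed for each partition area. The function below is a hypothetical illustration: an exposure shorter than the optimum under-blurs the area (jerkiness risk, so blur is added), and a longer exposure over-blurs it (so blur is reduced):

```python
def select_correction(imaging_shutter_s: float,
                      optimum_shutter_s: float,
                      tolerance: float = 0.0) -> str:
    """Choose the correction for one partition area by comparing the
    imaging exposure time with the optimum exposure time looked up for
    the area's object speed. (Names and the tolerance are illustrative.)"""
    if imaging_shutter_s < optimum_shutter_s - tolerance:
        return "add_blur"      # ad-blur process: exposure too short
    if imaging_shutter_s > optimum_shutter_s + tolerance:
        return "reduce_blur"   # de-blur process: exposure too long
    return "no_correction"

print(select_correction(1/500, 1/100))  # fast shutter vs. optimum
print(select_correction(1/30, 1/100))   # slow shutter vs. optimum
```

In the patent's terms, the returned label would play the role of the selection control information supplied to the sorting unit for each partition area.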
  • an image processing method includes the steps of calculating a motion blur correction parameter for motion blur correction on the basis of motion information indicating a motion of an image between unit images, the unit images forming image data, and shutter speed information obtained at the image capturing of the image data, and correcting a motion blur quantity contained in the image data by performing at least a process of reducing a motion blur in accordance with the motion blur correction parameter.
  • a program causes a computer to perform an image processing method.
  • the image processing method includes the steps of calculating a motion blur correction parameter for motion blur correction on the basis of motion information indicating a motion of an image between unit images, the unit images forming image data, and shutter speed information obtained at the image capturing of the image data, and correcting a motion blur quantity contained in the image data by performing at least a process of reducing a motion blur in accordance with the motion blur correction parameter.
  • FIG. 1 is a block diagram of a first basic structure of an image processing apparatus in accordance with one embodiment of the present invention.
  • FIG. 2 illustrates a partition area in accordance with one embodiment of the present invention.
  • FIG. 3 is a block diagram of a second basic structure of the image processing apparatus in accordance with one embodiment of the present invention.
  • FIG. 4 is a block diagram of a third basic structure of the image processing apparatus in accordance with one embodiment of the present invention.
  • FIG. 5 is a block diagram of an image reproducing apparatus in accordance with one embodiment of the present invention.
  • FIG. 6 is a block diagram of a motion vector generation processor in accordance with one embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating operation of the motion vector generation processor of one embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating a shutter speed estimation processor in accordance with one embodiment of the present invention.
  • FIG. 9 illustrates a motion blur length in accordance with one embodiment of the present invention.
  • FIGS. 10 A 1 through 10 C 2 illustrate the motion blur length in accordance with one embodiment of the present invention.
  • FIGS. 11A-11C illustrate a calculation process of the motion blur length in accordance with one embodiment of the present invention.
  • FIG. 12 is a detailed block diagram illustrating a shutter speed estimation processor in accordance with one embodiment of the present invention.
  • FIG. 13 illustrates a process of a motion blur characteristic analyzer in accordance with one embodiment of the present invention.
  • FIG. 14 is a motion blur sample frequency table in accordance with one embodiment of the present invention.
  • FIG. 15 is a block diagram of a motion blur correction parameter calculator and a motion blur correction processor in accordance with one embodiment of the present invention.
  • FIG. 16 illustrates an optimum shutter speed in accordance with one embodiment of the present invention.
  • FIG. 17 is a flowchart of a process of a process selection controller in accordance with one embodiment of the present invention.
  • FIG. 18 illustrates a process performed in response to a speed of an object and an imaging shutter speed in accordance with one embodiment of the present invention.
  • FIG. 19 is a block diagram of a motion blur reduction processor in accordance with one embodiment of the present invention.
  • FIGS. 20A and 20B illustrate a smoothing filter in accordance with one embodiment of the present invention.
  • FIG. 21 is a block diagram of a motion blur addition processor in accordance with one embodiment of the present invention.
  • FIG. 22 is a flowchart illustrating a motion vector masking process in accordance with one embodiment of the present invention.
  • FIG. 23 illustrates a filter parameter calculation process in accordance with one embodiment of the present invention.
  • FIG. 24 is a block diagram of another motion blur addition processor in accordance with one embodiment of the present invention.
  • FIG. 25 is a block diagram illustrating another motion blur correction processor in accordance with one embodiment of the present invention.
  • FIGS. 26A and 26B illustrate the generation principle of jerkiness and blur relating to how a still object and a moving object look.
  • FIGS. 27A and 27B illustrate the generation principle of jerkiness.
  • FIGS. 28A and 28B illustrate the generation principle of blur.
  • An image processing apparatus of one embodiment of the present invention is intended to generate an image with the jerkiness and blur thereof reduced through image processing. If a moving image taken under simple shutter control is displayed on a display device, that moving image may look unnatural due to human vision characteristics, with either jerkiness or blur pronounced.
  • the generation of jerkiness is reduced by adding a motion blur to a high-speed shutter captured image in accordance with information regarding a shutter speed used at the capturing of an input image. If a low-speed shutter captured image is input, a process of reducing a motion blur is performed in order to reduce a blur degradation.
  • The two processes are adaptively performed in response to the conditions of the input image signal and to the relationship between the shutter speed used at the image capturing and the movement speed of an object. Both the jerkiness degradation and the blur degradation are thus controlled.
  • High-quality image processing is performed by generating an image signal with less image degradation and outputting the resulting high-quality image.
  • FIG. 1 illustrates an image processing apparatus 1 having the first basic structure.
  • the image processing apparatus 1 includes an image acquisition unit 11 , a motion blur correction parameter calculator 12 , and a motion blur correction processor 13 .
  • the image acquisition unit 11 acquires image data into the image processing apparatus 1 .
  • the motion blur correction parameter calculator 12 sets a parameter for correcting a motion blur on the image acquired by the image acquisition unit 11 .
  • the motion blur correction processor 13 performs a motion blur correction process on the image data acquired by the image acquisition unit 11 .
  • the motion blur correction parameter calculator 12 receives motion information of the image data acquired by the image acquisition unit 11 , and shutter speed information indicating an exposure time of each frame when the image data is captured.
  • the motion blur correction parameter calculator 12 calculates from these pieces of input information an optimum parameter for correcting the motion blur of the acquired image data on a per partition area basis in each frame of the image data, and then supplies the calculated optimum parameter to the motion blur correction processor 13 .
  • the motion blur correction parameter calculator 12 sets a plurality of partition areas (pixel blocks) # 1 -#m within one frame, calculates a motion blur correction parameter for each of the partition areas # 1 -#m, and then supplies the motion blur correction parameter to the motion blur correction processor 13 .
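The division of a frame into partition areas # 1 -#m can be sketched as follows; the 16×16 block size is an illustrative choice, not a value specified in the patent:

```python
import numpy as np

def partition_blocks(frame: np.ndarray, block: int = 16):
    """Yield (top, left, block_view) for non-overlapping pixel blocks,
    the per-area unit on which a motion blur correction parameter is
    computed. The 16x16 block size is an illustrative assumption."""
    h, w = frame.shape[:2]
    for top in range(0, h, block):
        for left in range(0, w, block):
            yield top, left, frame[top:top + block, left:left + block]

frame = np.zeros((48, 64))
blocks = list(partition_blocks(frame))
print(len(blocks))  # 3 rows x 4 cols of 16x16 blocks
```

Each yielded block would then receive its own correction parameter, computed from the motion information and shutter speed information for that area.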
  • the motion information is not limited to information indicating a motion of an image between frames.
  • the motion information may be information representing a motion of an image between unit images forming a moving image, such as information representing a motion of an image between fields.
  • the motion blur correction processor 13 corrects a quantity of motion blur of the image data, using the motion blur correction parameter calculated by the motion blur correction parameter calculator 12 and then externally outputs the motion blur corrected image data from within the image processing apparatus 1 .
  • the motion blur correction processor 13 includes a sorting unit 31 , a motion blur reduction processor 32 , a motion blur addition processor 33 , and a synthesizer 34 .
  • the sorting unit 31 outputs the image data of each of the partition areas # 1 -#m of the input image data to one of the motion blur reduction processor 32 and the motion blur addition processor 33 , depending on whether the motion blur quantity of the partition area is to be reduced or increased.
  • the motion blur reduction processor 32 performs a process of reducing the motion blur quantity of the corresponding area of the input image data.
  • The motion blur addition processor 33 performs a process of increasing the motion blur quantity of the corresponding area of the input image data. If motion blur correction is not used in a given partition area, the sorting unit 31 outputs the image data of that partition area to the synthesizer 34 .
  • Alternatively, the motion blur reduction processor 32 and the motion blur addition processor 33 may perform the processes thereof with the correction motion blur quantity set to zero.
  • the synthesizer 34 performs a synthesis process, synthesizing, as a frame image, the image data of each partition area, corrected by one of the motion blur reduction processor 32 and the motion blur addition processor 33 , and the image data of a partition area having undergone no correction.
  • the sorting unit 31 in the motion blur correction processor 13 receives the motion blur correction parameter calculated by the motion blur correction parameter calculator 12 .
  • the motion blur correction parameter is calculated for each of the partition areas in each frame of the image data, and contains information regarding the motion blur correction process to be performed on a partition area being currently set as a process target of the image data.
  • the motion blur correction process of the motion blur correction processor 13 includes performing one of a process of reducing the motion blur (de-blur process) in the area where the blur degradation is likely, and a process of adding the motion blur (ad-blur process) in the area where an insufficient motion blur, i.e., a jerkiness degradation is likely.
  • The sorting unit 31 sorts the image data of each partition area in response to the motion blur correction parameter. More specifically, the sorting unit 31 outputs the image data of a partition area that is to be de-blurred to the motion blur reduction processor 32 , and the image data of a partition area that is to be ad-blurred to the motion blur addition processor 33 .
  • the image data output to one of the motion blur reduction processor 32 and the motion blur addition processor 33 thus undergoes the optimum motion blur correction process in order to reduce both the jerkiness degradation and the blur degradation.
  • the image data having undergone the optimum motion blur correction process is supplied to the synthesizer 34 .
  • the areas of the image data having undergone the motion blur correction process are synthesized by the synthesizer 34 and the resulting image data is thus output.
  • the motion blur correction processor 13 outputs a moving image signal with the jerkiness degradation and blur degradation thereof reduced.
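The overall flow of the sorting unit 31 , the two correction processors, and the synthesizer 34 can be sketched as follows. The two per-block operations are crude placeholders (a mean-based sharpening and a two-tap smoothing), not the patent's de-blur and ad-blur filters:

```python
import numpy as np

def correct_frame(frame: np.ndarray, params: dict, block: int = 16) -> np.ndarray:
    """Route each pixel block to a blur-reducing or blur-adding operation
    according to its per-area parameter, then write the result back into
    the output frame (the synthesis step). Blocks with no parameter pass
    through unchanged."""
    out = frame.astype(float).copy()
    h, w = frame.shape
    for top in range(0, h, block):
        for left in range(0, w, block):
            area = out[top:top + block, left:left + block]  # view into out
            p = params.get((top, left), "none")
            if p == "reduce_blur":
                area += 0.1 * (area - area.mean())                # placeholder de-blur
            elif p == "add_blur":
                area[:] = (area + np.roll(area, 1, axis=1)) / 2   # placeholder ad-blur
    return out

frame = np.arange(32 * 32, dtype=float).reshape(32, 32)
params = {(0, 0): "add_blur", (16, 16): "reduce_blur"}
out = correct_frame(frame, params)
print(out.shape)
```

Because the corrected blocks are written back in place, the final array is the synthesized frame containing corrected and uncorrected areas side by side, mirroring the role of the synthesizer 34 .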
  • the image processing apparatus may perform only the correction process of the motion blur reduction processor 32 with the motion blur addition processor 33 eliminated from the motion blur correction processor 13 .
  • Alternatively, only the motion blur addition processor 33 may perform the correction process, with the motion blur reduction processor 32 eliminated from the motion blur correction processor 13 .
  • In such cases, however, one of the jerkiness degradation and the blur degradation might persist. For example, if only the de-blur process is performed, the jerkiness degradation may not be reduced in image data that has been captured at a high shutter speed. Conversely, if only the ad-blur process is performed, the blur degradation occurring in an area where an object is moving may not be reduced in image data that has been captured at a low shutter speed.
  • a combination of the de-blur process and the ad-blur process in the structure of FIG. 1 reduces both the jerkiness degradation and the blur degradation regardless of conditions such as an imaging shutter speed of the image data.
  • the same is true of the structures illustrated in FIGS. 3 and 4 .
  • the image processing apparatus 1 thus constructed reduces the jerkiness degradation and the blur degradation by correcting adaptively the motion blur of the image data in response to the motion information of the image data and the information of the shutter speed at the image capturing.
  • FIG. 3 illustrates as the second structure an image processing apparatus 2 in accordance with one embodiment of the present invention.
  • the image processing apparatus 1 having the first structure is based on the premise that the image acquisition unit 11 obtains the shutter speed information of the image data.
  • the shutter speed information at the image capturing is referred to in the selection of the motion blur correction processes (ad-blur process and de-blur process).
  • the image processing apparatus 1 including the image acquisition unit 11 having an image capturing function executes an image capturing operation, thereby obtaining the image data.
  • the image processing apparatus 1 can easily extract a shutter speed value used in the actual image capturing operation.
  • if the shutter speed information is contained as metadata or the like of the image data, the value of the shutter speed is acquired from the metadata.
  • if the image processing apparatus is part of an apparatus that displays an image signal by receiving the image signal or by reproducing the image signal from a recording medium, however, the shutter speed at the image capturing of the image data typically remains unknown.
  • the image processing apparatus 2 having the second structure analyzes an input image signal through image processing, thereby estimating a shutter speed at the image capturing of the image signal.
  • FIG. 3 illustrates the image processing apparatus 2 having the second structure, in which the image acquisition unit 11 does not acquire the shutter speed information indicating an exposure time of each frame of the image data.
  • the image processing apparatus 2 is different from the image processing apparatus 1 of FIG. 1 in that a shutter speed estimation processor 14 is included.
  • the shutter speed estimation processor 14 receives the image data, which is also acquired by the image acquisition unit 11 , and the motion information of the image data.
  • the shutter speed estimation processor 14 performs image processing in order to analyze the input image data, and thus estimates the shutter speed information indicating the exposure time of each frame at the image capturing of the image data.
  • the estimated shutter speed information is output to the motion blur correction parameter calculator 12 .
  • the process performed by the motion blur correction parameter calculator 12 and the motion blur correction processor 13 is identical to the process of the counterparts in the image processing apparatus 1 illustrated in FIG. 1 .
  • the jerkiness degradation and the blur degradation typically take place in the displayed image, causing the image of the image data to look unnatural to the eyes of humans.
  • the image processing apparatus 2 thus constructed analyzes the image data using the motion information of the image data affecting the jerkiness degradation and the blur degradation.
  • the image processing apparatus 2 estimates the information of the shutter speed at the image capturing of the image data, and corrects the motion blur of the image data adaptively in response to the estimated shutter speed information.
  • the image processing apparatus 2 thus reduces both the jerkiness degradation and the blur degradation.
  • FIG. 4 illustrates an image processing apparatus 3 having the third structure.
  • the image processing apparatus 3 includes a motion blur correction processor 13 A instead of the motion blur correction processor 13 included in the image processing apparatus 2 illustrated in FIG. 3 .
  • the rest of the image processing apparatus 3 is identical in structure to the image processing apparatus 2 .
  • the motion blur correction processor 13 A includes the motion blur reduction processor 32 , the motion blur addition processor 33 , and a selector and synthesizer 35 .
  • the motion blur reduction processor 32 performs the motion de-blur process on all the partition areas of the input image data.
  • the motion blur addition processor 33 performs the motion ad-blur process on all the partition areas of the input image data.
  • the selector and synthesizer 35 receives from the motion blur reduction processor 32 the image data at all the partition areas that have undergone the motion de-blur process.
  • the selector and synthesizer 35 receives from the motion blur addition processor 33 the image data at all the partition areas that have undergone the motion ad-blur process.
  • the selector and synthesizer 35 also receives the input image data (image data not motion blur corrected).
  • the selector and synthesizer 35 selects the motion blur reduced data, the motion blur added data, or the uncorrected data.
  • the selector and synthesizer 35 synthesizes the selected data at each partition area, thereby generating and outputting the image data of one frame.
  • the sorting unit 31 in each of the first and second basic structures selects in advance the correction process to be performed.
  • in the third structure, by contrast, the motion de-blur process and the motion ad-blur process are performed on all the partition areas, and then the image data at an appropriate correction state is selected and output as output image data.
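The third-structure flow described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function names, the toy per-area data, and the `choose` callback that stands in for the motion blur correction parameter are all assumptions introduced for illustration.

```python
def select_and_synthesize(areas, deblur, adblur, choose):
    """Run BOTH corrections on every partition area, then let the
    selector-and-synthesizer pick, per area, the de-blurred data,
    the ad-blurred data, or the uncorrected input data."""
    deblurred = [deblur(a) for a in areas]   # motion de-blur on all areas
    adblurred = [adblur(a) for a in areas]   # motion ad-blur on all areas
    frame = []
    for i, area in enumerate(areas):
        choice = choose(area)                # 'deblur', 'adblur', or 'none'
        if choice == 'deblur':
            frame.append(deblurred[i])
        elif choice == 'adblur':
            frame.append(adblurred[i])
        else:
            frame.append(area)               # keep the uncorrected data
    return frame                             # synthesized one-frame output
```

Note the design trade-off relative to the first and second structures: both corrections run on every area, which costs more computation, but the selection can be made after the corrected results are available.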
  • the image processing apparatus 3 illustrated in FIG. 4 may have the same structure as the image processing apparatus 2 illustrated in FIG. 3 except the motion blur correction processor 13 A. In another option, the image processing apparatus 3 may have the same structure as the image processing apparatus 1 illustrated in FIG. 1 except the motion blur correction processor 13 A.
  • the image processing apparatus 2 having the second basic structure in accordance with one embodiment of the present invention is described further in detail.
  • the image processing apparatus 1 having the first basic structure can be considered to be a particular version of the image processing apparatus 2 in which the imaging shutter speed information is available.
  • the discussion that follows focuses on the image processing apparatus 2 .
  • the image processing apparatus 1 is also considered to be a particular version of the image processing apparatus, discussed with reference to FIG. 5 and subsequent drawings, without the elements for the shutter speed estimation process.
  • the third basic structure will be further described later.
  • FIG. 5 illustrates an image reproducing apparatus 100 to which the image processing apparatus 2 having the second basic structure is applied.
  • the image reproducing apparatus 100 receives and reproduces image data transmitted via a transmission line, or reproduces image data recorded on a recording medium 200 , such as digital versatile disc (DVD) or Blu-ray Disc (Registered Trademark of Sony Corporation).
  • the image reproducing apparatus 100 includes a receiving processor 110 receiving encoded image data transmitted via a transmission line, and a reading processor 120 reading encoded image data from the recording medium 200 .
  • the image reproducing apparatus 100 also includes a decoding processor 130 decoding the encoded data into image data DD, and a motion vector generation processor 140 generating a motion vector VD from the decoded image data DD.
  • the image reproducing apparatus 100 also includes a shutter speed estimation processor 150 estimating a shutter speed SSD of the image data at the image capturing, using the decoded image data DD and the motion vector VD, and a motion blur correction parameter calculator 170 .
  • the image reproducing apparatus 100 further includes a motion blur correction processor 160 correcting a motion blur quantity of the decoded image data DD in accordance with the motion vector VD and the shutter speed SSD.
  • the image reproducing apparatus 100 also includes a moving image display output unit 190 that causes a display device to display a moving image that has the jerkiness degradation thereof reduced with a motion blur added.
  • the image reproducing apparatus 100 further includes a still image display output unit 180 that causes the display device to display a decoded image as a still image.
  • the receiving processor 110 , the reading processor 120 , the decoding processor 130 , and the motion vector generation processor 140 , enclosed in a dot-and-dash chain line box, correspond to the image acquisition unit 11 in the second basic structure illustrated in FIG. 3 .
  • the shutter speed estimation processor 150 corresponds to the shutter speed estimation processor 14 illustrated in FIG. 3 .
  • the motion blur correction parameter calculator 170 corresponds to the motion blur correction parameter calculator 12 illustrated in FIG. 3 .
  • the motion blur correction processor 160 corresponds to the motion blur correction processor 13 illustrated in FIG. 3 .
  • the receiving processor 110 and the reading processor 120 retrieve image data that has been predictive-coded using image motion information in accordance with a standard such as Moving Picture Experts Group (MPEG), and supply the image data to the decoding processor 130 .
  • the image data retrieved as a moving image by the receiving processor 110 and the reading processor 120 contains 60 frames of images per second. More specifically, the image data is a progressive image with a frame rate of 60 frames per second (fps).
  • the image data is not limited to the progressive image.
  • the image data may be an interlace image that is processed on a field image unit basis.
  • the frame rate is not limited to 60 fps.
  • the image reproducing apparatus 100 may have at least one of the receiving processor 110 and the reading processor 120 to perform an image data retrieval function for retrieving an image from the outside. In addition to the image data retrieval function, the image reproducing apparatus 100 may acquire the shutter speed information contained as metadata of the image data. In such a case, the image reproducing apparatus 100 becomes similar to the image processing apparatus 1 having the first basic structure, and does not use the shutter speed estimation processor 150 for estimating the shutter speed SSD at the image capturing.
  • the decoding processor 130 decodes the image data retrieved from one of the receiving processor 110 and the reading processor 120 .
  • the decoding processor 130 then supplies the decoded image data DD to each of the motion vector generation processor 140 , the shutter speed estimation processor 150 , and the motion blur correction processor 160 .
  • when the image data is not handled as a moving image, the decoding processor 130 supplies the decoded image data DD to the still image display output unit 180 only.
  • the motion vector generation processor 140 generates the motion vector VD as the motion information of the decoded image data DD from the decoded image data DD supplied from the decoding processor 130 .
  • the motion vector herein is information representing a position of a moving image between frames and a movement direction of the moving image.
  • the motion vector can be generated on a per-pixel basis to acquire the motion information of a moving object at a high accuracy level.
  • the motion vector generation processor 140 of one embodiment of the present invention instead generates the motion vector on a per-pixel-block basis to reduce the calculation load of the process.
  • the frame image is divided here into a plurality of pixel blocks.
  • the image data encoded in accordance with the MPEG standard or the like contains a motion vector as encoding information.
  • the motion vector for encoding serves as information to encode primarily a moving image.
  • the encoding process is performed in combination with residual information or the like in addition to the motion vector, and the motion vector does not necessarily faithfully represent a value responsive to a motion of an actual moving object over the entire image.
  • the motion vector generation processor 140 detects accurately a motion vector responsive to a motion of an actual moving object in a decoded image through a process step to be discussed later. The motion vector generation processor 140 thus adds a motion blur faithful to the motion of the actual moving object.
  • the shutter speed estimation processor 150 estimates the shutter speed SSD at the image capturing of the image data from the decoded image data DD supplied from the decoding processor 130 .
  • the shutter speed estimation process is performed as a process in which the motion vector VD supplied from the motion vector generation processor 140 is used as will be described later.
  • the shutter speed information here is information related to a shutter speed that affects a motion blur to be added to the captured image of the image data. More specifically, the shutter speed information represents an exposure time of a unit image taken when an imaging apparatus having a shutter function captures the image data.
  • the shutter function may be performed by one of an electronic shutter controlling a drive time of an imaging element, a mechanical shutter that allows light to pass through a lens to the imaging element by opening an opening/closing mechanism for an exposure time, and a liquid-crystal shutter that allows light to pass through a lens to an imaging element by controlling the transmittance ratio of a liquid-crystal element for an exposure time.
  • the motion blur correction parameter calculator 170 calculates the motion blur correction parameter in accordance with the shutter speed SSD supplied from the shutter speed estimation processor 150 , and the motion vector VD supplied from the motion vector generation processor 140 , and then supplies the calculated motion blur correction parameter to the motion blur correction processor 160 .
  • the motion blur correction processor 160 performs the motion blur correction process based on the decoded image data DD supplied from the decoding processor 130 and the motion blur correction parameter supplied from the motion blur correction parameter calculator 170 .
  • the motion blur correction process may be interpreted as a process to convert the partition areas of the decoded image data DD into a pseudo image that is captured at an optimum shutter speed.
  • the optimum shutter speed is intended to reduce the generation of jerkiness and blur in response to a movement speed of each partition area contained in the motion vector VD.
  • the motion blur correction processor 160 references the shutter speed SSD of an input image signal prior to the conversion operation.
  • if the optimum shutter speed in each partition area is lower than the shutter speed SSD, the motion blur correction processor 160 performs the motion ad-blur process. Conversely, if the optimum shutter speed in each partition area is higher than the shutter speed SSD, the motion blur correction processor 160 performs the motion de-blur process.
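The per-area decision rule can be sketched as below. One assumption made explicit here: shutter speeds are represented as exposure times in seconds, so a "lower" shutter speed corresponds to a longer exposure. The function name and the epsilon tolerance are illustrative, not from the patent.

```python
def choose_correction(optimum_exposure, actual_exposure, eps=1e-9):
    """Per-area decision rule: compare the optimum exposure time for the
    area's movement speed against the (estimated) imaging exposure SSD.
    A lower shutter speed means a longer exposure time."""
    if optimum_exposure > actual_exposure + eps:
        return 'ad-blur'   # optimum shutter speed is lower: add motion blur
    if optimum_exposure < actual_exposure - eps:
        return 'de-blur'   # optimum shutter speed is higher: reduce motion blur
    return 'none'          # already at the optimum shutter speed
```

For example, an image captured at 1/240 s whose optimum exposure for a slow-moving area is 1/60 s would be ad-blurred, suppressing jerkiness in that area.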
  • the motion blur correction processor 160 synthesizes images having respectively converted partition areas into one frame, thereby generating an output image signal OD.
  • the output image signal OD is output to the moving image display output unit 190 .
  • the moving image display output unit 190 outputs to a display device such as a liquid-crystal display (LCD) a moving image that has been motion blur corrected with the jerkiness degradation and the blur degradation reduced by the motion blur correction processor 160 .
  • the still image display output unit 180 outputs to the display device such as the LCD the decoded image data DD received from the decoding processor 130 as a still image.
  • the elements illustrated in FIG. 5 are described in detail. A structure and operation of the motion vector generation processor 140 are described first.
  • the motion vector generation processor 140 generates accurately the motion vector on a per pixel block basis.
  • the motion vector generation processor 140 includes a motion vector detector 141 , a pixel block identification processor 142 , a motion vector estimation processor 143 , a motion vector smoothing processor 144 , and delay units 141 a and 142 a.
  • the motion vector detector 141 detects a motion vector from a process target frame and an immediately preceding frame.
  • the pixel block identification processor 142 identifies a pixel block having a high correlation by comparing the motion vector of the process target frame with the motion vector of the immediately preceding frame on a per pixel block basis.
  • the motion vector estimation processor 143 estimates the motion vector of a pixel block other than the pixel block identified by the pixel block identification processor 142 , based on the motion vector of the pixel block identified by the pixel block identification processor 142 .
  • the motion vector smoothing processor 144 performs a smoothing process on the motion vector.
  • the decoded image data DD supplied from the decoding processor 130 is supplied to the motion vector detector 141 , and the delay unit 141 a delaying the decoded image data DD by one frame.
  • the motion vector detector 141 sets the decoded image data DD supplied from the decoding processor 130 as a process target frame.
  • the motion vector detector 141 detects the motion vector of each process target frame on a per pixel block basis based on the process target frame and the immediately preceding frame that is delayed by one frame by the delay unit 141 a . If the process of the motion vector detector 141 is implemented in software, the motion vector may be detected on a pixel block basis using a typically available block matching method.
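A typically available block matching method, as mentioned above, can be sketched as follows. This is a minimal exhaustive-search matcher using the sum of absolute differences (SAD) on grayscale NumPy frames; the block size, search range, and SAD criterion are illustrative assumptions rather than the patent's specification. The detected vector points from the current block to its best match in the preceding frame.

```python
import numpy as np

def block_matching(prev, cur, block=8, search=4):
    """For each pixel block of the process target frame `cur`, find the
    displacement (dx, dy) into the preceding frame `prev` minimizing SAD."""
    h, w = cur.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            tgt = cur[by:by + block, bx:bx + block].astype(np.int64)
            best_sad, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate block falls outside the frame
                    ref = prev[y:y + block, x:x + block].astype(np.int64)
                    sad = int(np.abs(tgt - ref).sum())
                    if best_sad is None or sad < best_sad:
                        best_sad, best_mv = sad, (dx, dy)
            vectors[(bx, by)] = best_mv
    return vectors
```

Negating the returned displacement gives the motion of the object from the preceding frame to the current frame.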
  • the motion vector detected by the motion vector detector 141 is supplied to the pixel block identification processor 142 and the delay unit 142 a .
  • the delay unit 142 a delays the input motion vector by one frame.
  • the pixel block identification processor 142 compares the motion vector of the process target frame supplied from the motion vector detector 141 with the motion vector of the immediately preceding frame delayed by the delay unit 142 a as described below. From the comparison results, the pixel block identification processor 142 identifies a pixel block having a high correlation.
  • the pixel block identification processor 142 calculates a vector correlation coefficient β of that pixel block in accordance with the following equation (1).
  • the correlation determination coefficient α has a range of 0 < α < 1. The larger the correlation determination coefficient α, the more likely the calculated vector correlation coefficient β is to be 1.
  • the pixel block identification processor 142 calculates the vector correlation coefficient β of each pixel block in accordance with equation (1), and identifies a pixel block having 1 for the vector correlation coefficient β as having a motion vector with a high correlation.
  • the motion vector estimation processor 143 estimates, from the motion vectors of the pixel blocks determined as having a vector correlation coefficient β of 1 by the pixel block identification processor 142 , a motion vector of a pixel block having a vector correlation coefficient β of 0. On the premise that a pixel block determined as having a vector correlation coefficient β of 1 by the pixel block identification processor 142 has an effective motion vector, the motion vector estimation processor 143 updates the motion vector of another pixel block, i.e., a pixel block having a vector correlation coefficient β of zero and thus determined as having an ineffective motion vector.
  • in step S1, the motion vector estimation processor 143 determines whether the vector correlation coefficient β of the pixel block currently serving as the target pixel block in the process target frame is 1 or 0. More specifically, the motion vector estimation processor 143 determines whether the motion vector of the pixel block is effective or not. If it is determined in step S1 that the motion vector of the pixel block is effective, the motion vector estimation processor 143 ends the process without updating the value of the motion vector. If it is determined in step S1 that the motion vector of the pixel block is not effective, the motion vector estimation processor 143 proceeds to step S2.
  • in step S2, the motion vector estimation processor 143 determines whether a surrounding pixel block having an effective motion vector is present around the target pixel block. More specifically, the motion vector estimation processor 143 determines whether the eight pixel blocks adjacent to the target pixel block, serving as the surrounding pixel blocks, contain an effective motion vector. If an effective motion vector is present, the motion vector estimation processor 143 proceeds to step S3. If there is no effective motion vector, the motion vector estimation processor 143 does not update the motion vector of the target pixel block, and ends the process.
  • in this embodiment, the estimation process is not performed using surrounding blocks within a larger area around a target pixel block having no effective motion vector.
  • the estimation process may nevertheless be performed using surrounding blocks within a larger area.
  • in that case, the storage area temporarily storing the image data handled as the surrounding blocks is increased in capacity so that the process can be completed within a fixed time.
  • ineffective motion vectors may also be corrected by performing a smoothing process on the motion vector of the target pixel block using surrounding pixel blocks covering a larger area than the eight adjacent pixel blocks.
  • in step S3, the motion vector estimation processor 143 estimates and updates the motion vector of the target pixel block based on the motion vectors of the surrounding pixel blocks having the effective motion vectors.
  • the motion vector estimation processor 143 thus ends the process.
  • the motion vector estimation processor 143 includes a median filter. The median filter receives the motion vectors of the surrounding pixel blocks having the effective motion vectors, and outputs a smoothed motion vector of the surrounding pixel blocks.
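Steps S1 to S3 and the median filter can be sketched as below. The grid representation (a dictionary of per-block vectors plus a set of effective blocks) and the component-wise median are assumptions introduced for illustration; the patent only specifies that a median filter over the effective surrounding vectors is used.

```python
import statistics

def estimate_vectors(mv, effective):
    """S1: keep each pixel block whose motion vector is effective.
    S2: for an ineffective block, collect effective vectors among its
    eight adjacent neighbours.  S3: if any exist, update the block with
    the component-wise median of those vectors; otherwise leave it as-is.
    mv: dict (bx, by) -> (vx, vy); effective: set of (bx, by)."""
    out = dict(mv)
    for (bx, by) in mv:
        if (bx, by) in effective:
            continue                              # S1: effective, keep as-is
        neigh = [mv[(bx + i, by + j)]
                 for j in (-1, 0, 1) for i in (-1, 0, 1)
                 if (i, j) != (0, 0) and (bx + i, by + j) in effective]
        if not neigh:
            continue                              # S2: nothing to estimate from
        out[(bx, by)] = (statistics.median(v[0] for v in neigh),  # S3: median
                         statistics.median(v[1] for v in neigh))
    return out
```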
  • the motion vector estimation processor 143 thus estimates the motion vector of the process target frame on a pixel block basis.
  • the motion vector estimation processor 143 thus supplies the motion vectors including the motion vector identified by the pixel block identification processor 142 to the motion vector smoothing processor 144 .
  • the motion vector smoothing processor 144 performs a smoothing process on the motion vectors of the pixel blocks forming a process target image. More specifically, the motion vector smoothing processor 144 receives as an input I(x+i,y+j) the motion vectors of the target pixel block prior to the smoothing process and of surrounding pixel blocks covering a larger area than the above-described adjacent pixel blocks, and outputs a motion vector J(x,y) of the target pixel block that has been smoothed through a Gaussian function described in the following equation (2):
  • J(x,y) = ( Σ I(x+i,y+j) · e^( −r²/2σ² − (I(x+i,y+j) − I(x,y))²/t² ) ) / ( Σ e^( −r²/2σ² − (I(x+i,y+j) − I(x,y))²/t² ) )  (2)
  • where r represents a distance in a two-dimensional space between the target pixel block and each surrounding pixel block,
  • σ² represents a variance of the distance r,
  • and t² represents a variance of the motion vector. More specifically, σ² and t² are parameters that may be set to any values to control the degree of smoothing.
  • the motion vector smoothing processor 144 performs the above-described smoothing process on each pixel block forming the process target frame, and outputs the resulting motion vector VD to the motion blur correction parameter calculator 170 .
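The equation-(2) smoothing is, in effect, a bilateral-style weighted average: spatially near blocks (small r) and blocks with similar vector values both receive larger weights. A sketch for one scalar component of the motion vector follows; the dictionary representation, neighbourhood radius, and default variances are illustrative assumptions.

```python
import math

def smooth_component(I, x, y, radius=2, sigma2=2.0, t2=4.0):
    """Weighted mean per equation (2): each surrounding block's weight is
    exp(-r^2/(2*sigma^2) - (I(x+i,y+j) - I(x,y))^2 / t^2), so the output
    follows nearby, similar vectors while discounting dissimilar ones."""
    num = den = 0.0
    for j in range(-radius, radius + 1):
        for i in range(-radius, radius + 1):
            p = (x + i, y + j)
            if p not in I:
                continue                      # outside the frame
            r2 = i * i + j * j                # squared spatial distance r^2
            d = I[p] - I[(x, y)]              # difference of vector values
            w = math.exp(-r2 / (2.0 * sigma2) - (d * d) / t2)
            num += I[p] * w
            den += w
    return num / den
```

Because the weight collapses for large value differences, a single outlier vector barely shifts the smoothed result, which is the point of the value term in equation (2).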
  • the motion vector generation processor 140 thus identifies pixel blocks having effective motion vectors among the pixel blocks forming the process target frame, estimates the other motion vectors from the effective motion vectors, and smooths the result.
  • the motion vector generation processor 140 can therefore generate a motion vector responsive to the motion of an actual moving object.
  • the motion vector detected by the motion vector detector 141 may be supplied to the motion vector smoothing processor 144 in the smoothing process with the pixel block identification processor 142 and the motion vector estimation processor 143 skipped. Even in such a process, the motion vector generation processor 140 can provide a more accurate motion vector responsive to the motion of the moving object than when the above-described encoding information is used as the motion vector.
  • FIG. 8 is a block diagram illustrating the shutter speed estimation processor 150 .
  • the shutter speed estimation processor 150 includes a process target area selector 151 , a motion blur characteristic analyzer 152 , an imaging shutter speed calculator 153 , and an imaging shutter speed accuracy enhancement processor 154 .
  • the shutter speed estimation processor 150 receives the decoded image data DD and the motion vector VD.
  • the shutter speed estimation processor 150 analyzes these pieces of input information through image processing, thereby estimating and outputting the shutter speed SSD at which the image data was captured.
  • the decoded image data DD and the motion vector VD, input to the shutter speed estimation processor 150 , are first received by the process target area selector 151 .
  • the process target area selector 151 selects a process target frame on which image analysis is to be performed in order to calculate the shutter speed.
  • the process target area selector 151 also selects a target area within the selected frame.
  • the process target area selector 151 then outputs the image data as a selected target area DDT and a motion vector VDT responsive to the target area DDT to subsequent stages.
  • the target area DDT refers to the image data at an area that is extracted as a target of the shutter speed estimation process within one frame.
  • the process target area selector 151 detects a scene change from the decoded image data DD input as a moving image, and then outputs the scene change detection signal SCD to the imaging shutter speed accuracy enhancement processor 154 .
  • the process target area DDT is input to the motion blur characteristic analyzer 152 .
  • the motion blur characteristic analyzer 152 performs an image analysis process on the image data as the process target area DDT (i.e., image data that is within a pixel area serving as a process target area within one frame).
  • the motion blur characteristic analyzer 152 calculates a “motion blur length L” generated in the process target area.
  • the motion blur length L will be described later.
  • the calculated motion blur length L is output to the imaging shutter speed calculator 153 .
  • the imaging shutter speed calculator 153 calculates an estimation imaging shutter speed SSDT based on the value of the motion blur length L generated in the process target area DDT and the motion vector VDT responsive to the process target area.
  • the estimation imaging shutter speed SSDT is an estimated value of the shutter speed at the image capturing.
  • the calculated estimation imaging shutter speed SSDT is output to the imaging shutter speed accuracy enhancement processor 154 .
  • the imaging shutter speed accuracy enhancement processor 154 receives the estimation imaging shutter speeds SSDT estimated from a plurality of process target areas. In response to the values of these pieces of information, the imaging shutter speed accuracy enhancement processor 154 calculates a highly accurate estimation imaging shutter speed SSD, and outputs the calculated estimate to the motion blur correction parameter calculator 170 .
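The text does not spell out how the accuracy enhancement pools the per-area estimates. One plausible sketch, exploiting the fact that the imaging shutter speed is normally uniform within a scene, keeps a running history of SSDT values, resets it on a scene change (the SCD signal), and outputs a robust statistic such as the median; every name and the choice of median here are assumptions.

```python
import statistics

def refine_shutter_estimate(history, new_estimates, scene_change):
    """history: mutable list of per-area estimates SSDT (seconds);
    scene_change: the SCD flag from the process target area selector.
    Returns a pooled, outlier-resistant estimate SSD."""
    if scene_change:
        history.clear()                 # old-scene estimates no longer apply
    history.extend(new_estimates)
    return statistics.median(history)   # robust pooled estimate SSD
```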
  • Motion blur characteristics serving as a basis of the process to be performed by the shutter speed estimation processor 150 are described before describing a process to be performed in each process block by the shutter speed estimation processor 150 of FIG. 8 .
  • the process of the shutter speed estimation processor 150 is a process to estimate a shutter speed from an image having an unknown shutter speed at the image capturing.
  • the relationship of the generation of motion blur, a movement speed, and an imaging shutter speed is described first in order to describe the basic motion blur characteristic.
  • the estimation method of the shutter speed taking into consideration the characteristics of a generated motion blur is then described.
  • FIG. 9 illustrates the motion blur characteristics generated when an image is captured.
  • the upper portion of FIG. 9 focuses on the relationship between spatial position and luminance at a target area within the real space. As shown, the spatial position is plotted in the horizontal direction and luminance is plotted in the vertical direction. The foreground is moving from right to left at a constant speed, with the bright foreground passing in front of the dark background.
  • the lower portion of FIG. 9 simulates an image signal, which has been obtained by image capturing the target area in the real space illustrated in the upper portion using an imaging apparatus illustrated in FIG.
  • the imaging apparatus here having a shutter function controls a shutter speed that is an exposure time throughout which an image is acquired.
  • An image signal labeled (i) in FIG. 9 is captured when an ideal high-speed shutter of the shutter function (with the exposure time being infinitesimal) is performed.
  • An image signal labeled (ii) in FIG. 9 is captured when a low-speed shutter of the shutter function (with a predetermined exposure time) is performed.
  • the image signal (i) is a step function signal, while the image signal (ii) is captured with light being integrated over a longer exposure time.
  • a motion blur thus takes place in the image signal (ii) that is absent from the image signal (i).
  • FIG. 9 illustrates that the motion blur in the vicinity of the outline of the moving object has low-pass filter characteristics.
  • the area having a luminance slope between the area where the foreground luminance Bf is recorded in a stable fashion and the area where the background luminance Bb is recorded in a stable fashion is defined as a motion blur area.
  • the motion blur length L is defined as a distance of the area in the horizontal direction.
  • FIGS. 10A1, 10B1 and 10C1 illustrate the relationship between the movement speed of the object and the motion blur length L.
  • FIGS. 10A1, 10B1, and 10C1 illustrate the motion blur characteristics that are generated when the movement speed of the foreground in the upper portion of FIG. 9 is changed.
  • the shutter speed at the image capturing remains constant in FIGS. 10A1, 10B1, and 10C1.
  • the motion blur length L generated in the vicinity of a pixel having a movement speed is thus proportional to the magnitude of the movement speed of the object.
  • FIGS. 10A2, 10B2, and 10C2 illustrate the relationship between the imaging shutter speed and the motion blur length L.
  • FIGS. 10A2, 10B2, and 10C2 show the motion blur characteristics taking place when the shutter speed of the imaging apparatus capturing the scene in the upper portion of FIG. 9 is changed.
  • the motion blur length L is proportional to the movement speed of the object, and is also proportional to the imaging shutter speed.
  • L the motion blur length (pixels)
  • V the movement speed of the object within the image signal
  • S the imaging shutter speed (seconds)
  • F the frame rate of the moving image (frames/second)
  • L = V × S × F (3)
  • The product V × S is multiplied by the frame rate F because the movement speed V is expressed as a distance traveled within one frame period.
  • the determined motion blur lengths are thus equal to the motion blur lengths L in FIGS. 10 A 2 , 10 B 2 , and 10 C 2 .
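The relationship above can be sketched numerically. The following is an illustrative sketch, not the patent's implementation; the function names and the sample values (8 pixels/frame, 60 frames/second, 1/120 s) are assumptions:

```python
# Sketch of equation (3), L = V * S * F, and its trivial inversion.
# V is in pixels/frame, S in seconds, F in frames/second.

def motion_blur_length(v_pixels_per_frame, shutter_s, frame_rate):
    """Motion blur length L (pixels) for a given exposure."""
    return v_pixels_per_frame * shutter_s * frame_rate

def shutter_from_blur(blur_len, v_pixels_per_frame, frame_rate):
    """Invert the relation to estimate the imaging shutter speed S."""
    return blur_len / (v_pixels_per_frame * frame_rate)

# An object moving 8 pixels/frame at 60 fps, exposed for 1/120 s,
# blurs over about 8 * (1/120) * 60 = 4 pixels.
L = motion_blur_length(8, 1 / 120, 60)
```

The inversion is exactly what the shutter speed estimation performs once L has been measured from the image.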
  • the process of each element in one example of the shutter speed estimation processor 150 illustrated in FIG. 8 is clarified, and the procedure of the shutter speed estimation is described.
  • the estimation method of an imaging shutter speed from an image is not limited to this method.
  • the shutter speed at the image capturing is calculated by identifying the motion blur length L as defined above.
  • the decoded image data DD and the motion vector VD serving as the inputs to the shutter speed estimation processor 150 are first supplied to the process target area selector 151 .
  • the process target area selector 151 extracts an area (hereinafter referred to as a “target area”) as a target of the image analysis in the shutter speed estimation, and outputs the process target area DDT and the motion vector VDT responsive to the process target area DDT to the subsequent stage.
  • the process target area selector 151 performs the extraction on the premise that the analysis need not cover every area of the frame; it suffices that the input moving image signal contains an area where the motion blur takes place. Any method may be used to select the target area serving as the target of the analysis process.
  • the reasons why no problem arises with the estimation of the imaging shutter speed performed in only the limited area are described below.
  • the shutter speed at the image capturing is typically uniform within one frame image. Furthermore, a smaller number of target areas to be processed is advantageous in terms of process costs.
  • the shutter speed estimation process is performed on one target area within the frame, no shutter speed estimation is necessary on the other areas. As long as process costs permit, performing the shutter speed estimation process within a plurality of extracted target areas is greatly useful from the standpoint of accuracy enhancement of the shutter speed estimation.
  • a process to be described later is performed in a plurality of target areas from within one frame, and the imaging shutter speed SSD is estimated from a plurality of obtained results. If a plurality of shutter speeds are obtained, a process of increasing reliability is carried out by the subsequent stage, i.e., the imaging shutter speed accuracy enhancement processor 154 . Such a process will be described later.
  • a method of selecting a target area of the analysis process from a given frame of the decoded image data DD is not limited to any one method.
  • the target area is preferably in the vicinity of a border outline edge of an object illustrated in FIGS. 9 and 10 A 1 through 10 C 2 in order to perform the analysis process of the motion blur characteristics, to be discussed in detail later, effectively. If the movement speed of a given area is zero, no motion blur takes place in that area.
  • movement speed information may therefore be used so that an area having a predetermined movement speed is selected as a target area. If the direction of an edge is approximately perpendicular to the direction of the movement speed, the analysis process of the generated motion blur is easily performed.
  • the area to be selected thus has a certain degree of movement speed and lies close to an edge that is as perpendicular as possible to the direction of the movement speed.
  • pixels are preferably picked up in a scan line direction in view of the process costs.
  • the target area is thus conveniently extracted from a region close to a vertical edge having a horizontal movement speed.
  • a motion blur characteristic analysis process to be discussed in detail later is performed along one line rather than across a plurality of lines. Focusing on the region close to the vertical edge having a horizontal movement speed, the use of only a sufficient number of pixels in a horizontal direction with respect to the movement speed serves the purpose of the motion blur characteristic analysis process to be discussed later.
  • FIGS. 11A-11C illustrate how the target area is selected.
  • FIG. 11A illustrates one frame of the decoded image data DD.
  • an edge extraction process is performed on the decoded image data DD, for example, using the Sobel filter, and edge data ED illustrated in FIG. 11B is obtained.
  • one horizontal line in the vicinity of a vertical edge having a horizontal movement speed is selected.
  • areas AR 1 -AR 5 are set to be target areas.
  • the target areas AR 1 -AR 5 may be a portion of each horizontal line.
  • Luminance information of each target area is obtained as illustrated in FIG. 11C .
  • the abscissa represents coordinates of each pixel in the target area and the ordinate represents luminance.
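As a rough illustration of the edge extraction step, a horizontal Sobel kernel (which responds to vertical edges, the ones of interest here) can be applied to a tiny synthetic frame. This is a minimal sketch under assumed names and data, not the apparatus's actual filter:

```python
# Horizontal-gradient Sobel kernel: large responses mark vertical edges.
SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def sobel_vertical_edges(img):
    """Return |horizontal gradient| for each interior pixel of a 2-D list."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            g = sum(SOBEL_X[j][i] * img[y - 1 + j][x - 1 + i]
                    for j in range(3) for i in range(3))
            out[y][x] = abs(g)
    return out

# A frame with a sharp vertical edge between columns 1 and 2.
frame = [[0, 0, 9, 9] for _ in range(4)]
edges = sobel_vertical_edges(frame)   # strong responses along the edge
```

Horizontal lines crossing the strongest responses would then be candidates for the target areas AR 1 -AR 5 of FIG. 11B.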
  • the discussion heretofore is related to the selection process in one frame. No problem arises even if the selection process is not performed on the entire region of one frame. Likewise, it is not necessary to select all the target areas in the frame. This is because the moving image forming a plurality of frames typically has an imaging shutter speed remaining unchanged at least until a frame in which a scene change takes place. The imaging shutter speed estimated through analysis of one frame can be held at the value thereof until a next scene change is detected.
  • the shutter speed estimation process is performed at least in one given frame within a period from the detection of a scene change to the detection of a next scene change.
  • a plurality of target areas are detected from within one frame, and that the shutter speed estimation process is performed in each of the target areas.
  • performing the shutter speed estimation process in a plurality of frames is particularly useful as long as process costs permit. This enhances the accuracy level of shutter speed estimation.
  • the imaging shutter speed accuracy enhancement processor 154 performs a reliability enhancing process if a plurality of different shutter speed values are estimated.
  • FIG. 12 illustrates, for operation description purposes, an internal functional structure of the process target area selector 151 and the imaging shutter speed accuracy enhancement processor 154 illustrated in FIG. 8 .
  • the process target area selector 151 includes a vertical direction edge detector 1511 , a horizontal direction movement speed threshold value processor 1512 , a target area determiner 1513 , and a scene change detector 1514 .
  • an area having a horizontal speed equal to or higher than a constant value in the vicinity of a vertical edge is extracted as a target area from within one frame, and the shutter speed estimation process is performed on only the extracted target area.
  • the vertical direction edge detector 1511 performs an edge detection process on each area within the frame of input decoded image data DD. In this case, only a vertical edge may be extracted using a direction selective mask process of the Sobel filter or the like.
  • the input image data at an area determined as a vertical edge is output, as is, to the target area determiner 1513 .
  • An area not determined as a vertical edge is output to the target area determiner 1513 with all the pixel signals within that area set to “0.”
  • a motion vector VD is input to the horizontal direction movement speed threshold value processor 1512 . To select an area having a horizontal speed equal to or higher than the constant value, the horizontal component VDx of the motion vector of each area is subjected to a threshold value process.
  • If the horizontal component VDx exceeds the threshold value TH (VDx > TH), the motion vector signal of the input area is output as is to the target area determiner 1513 . If the horizontal component VDx is equal to or lower than the threshold value TH (VDx ≤ TH), all the motion vectors of the area are set to “0,” and then output to the target area determiner 1513 .
  • the target area determiner 1513 determines that the area is a target of the shutter speed estimation process only if both the image data and the motion vector of the input area are not zero. Only in this case, the area is determined as the process target area DDT. As previously described, the process target area DDT is the image data of the area that is determined as a target. The target area determiner 1513 outputs the process target area DDT to the motion blur characteristic analyzer 152 , and the motion vector VDT of the area to the imaging shutter speed calculator 153 .
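The two gates described above (a vertical edge present, and a horizontal speed above the threshold) can be sketched as follows; the threshold value and all names are illustrative assumptions, not the patent's implementation:

```python
TH = 2.0  # assumed horizontal-speed threshold (pixels/frame)

def threshold_horizontal_motion(vdx, th=TH):
    """Zero out motion at or below the threshold, as the processor 1512 does."""
    return vdx if abs(vdx) > th else 0.0

def is_process_target(edge_strength, vdx, th=TH):
    """An area qualifies only if both the edge signal and the motion survive."""
    return edge_strength != 0 and threshold_horizontal_motion(vdx, th) != 0.0

areas = [
    {"edge": 36, "vdx": 5.0},   # vertical edge, fast horizontal motion -> kept
    {"edge": 36, "vdx": 0.5},   # edge but nearly static -> rejected
    {"edge": 0,  "vdx": 5.0},   # motion but no edge -> rejected
]
targets = [a for a in areas if is_process_target(a["edge"], a["vdx"])]
```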
  • Each frame of the decoded image data DD input to the process target area selector 151 is supplied to the scene change detector 1514 .
  • the scene change detector 1514 performs a scene change detection process. Any scene change detection technique may be used. For example, the scene change detection technique disclosed in Japanese Unexamined Patent Application Publication No. 2004-282318 may be used.
  • the scene change detector 1514 outputs a scene change detection signal SCD to the imaging shutter speed accuracy enhancement processor 154 .
  • the process target area selector 151 identifies the process target area DDT, and then outputs the process target area DDT to the motion blur characteristic analyzer 152 .
  • the process target area selector 151 also extracts the motion vector VDT responsive to the position of the process target area DDT and outputs the motion vector VDT to the imaging shutter speed calculator 153 .
  • the process of the motion blur characteristic analyzer 152 is described below.
  • the analysis of the motion blur characteristics is a process of estimating the motion blur length L of the motion blur (see (ii) low speed shutter image in FIG. 9 and FIG. 11C ).
  • the motion blur length L is estimated by defining a mathematical model of motion blur and finding a parameter in the mathematical model minimizing an error function to a motion blur occurring in an actual image signal.
  • the motion blur length L is estimated by matching a motion blur sample pattern prepared beforehand to a motion blur actually taking place in an image signal.
  • the motion blur length L is estimated by expressing in a mathematical model a luminance value close to an edge affected by a motion blur as disclosed in the paper entitled “Photometric Registration Based on Defocus and Motion Blur Estimation for Augmented Reality,” Bunyo OKUMURA, Masayuki KANBARA, and Naokazu YOKOYA, The Institute of Electronics, Information and Communication Engineers, D Vol. J90-D No. 8 pp. 2126-2136.
  • A function g(t) simulating the motion blur characteristics in the vicinity of the edge is defined as equation (6):
  • Pixel coordinates p in the vicinity of the edge are substituted in equation (5), and parameters L, p 0 , Bf, and Bb minimizing a distance function to the value of the actual motion blur are found from equations (5), (6) and (7).
  • the motion blur length L is thus estimated.
  • a numerical analysis method such as the quasi-Newton method may be used.
  • the distance function is typically the sum of squared differences, each between a pixel value of the actual image and the function f, or the linear sum of the absolute values of those differences.
  • a function simulating a dominant motion blur characteristic is defined.
  • the function may be a simple discontinuous function such as the one of (ii) low-speed shutter image signal in FIG. 9 .
  • the motion blur length L is estimated from a spatial frequency component in a target area in the vicinity of an edge selected as a target in the image signal as described above.
  • the motion blur generated in the vicinity of a pixel having a movement speed can be expressed as a low-pass filter. Taking advantage of the low-pass filter, the motion blur length L is estimated through matching of frequency analysis results.
  • the characteristics of the motion blur are determined by the exposure time. If the movement speed is constant, the motion blurs having the same motion blur length should have the same frequency characteristics.
  • motion blur sample patterns having a variety of given motion blur lengths L are prepared, like the image signal (ii) in FIG. 9 suffering from the motion blur during image capturing; a predetermined frequency analysis is performed on these image signals, and the frequency component of each sample pattern is stored.
  • the frequency analysis method may be one of typical methods including Fourier transform and wavelet analysis.
  • the target area is input as a target of analysis, the same frequency analysis as the one performed on the sample pattern is performed on the target area, and the sample pattern having the frequency component closest to the frequency component of the target area is determined using an error function or the like.
  • the motion blur length L of the sample pattern having the frequency component closest to the target area becomes the motion blur length L of an edge in the analysis target area.
  • FIG. 13 illustrates a specific process flow of the estimation of the motion blur length L through matching frequency analysis results.
  • the target area in the vicinity of the edge determined to be an analysis target is input to a fast Fourier transform (FFT) unit 1521 .
  • a fast Fourier transform (FFT) process is performed on the target area, and a dominant frequency component of the target area is output to a frequency component matching unit 1522 .
  • the frequency power spectrum is calculated as a result of the fast Fourier transform, and the frequencies having the top three power values are sent to the frequency component matching unit 1522 .
  • the frequency component matching unit 1522 searches a motion blur sample frequency table 1523 for a motion blur sample having a frequency pattern most similar to the dominant frequency component of the input target area, and outputs the motion blur length L of the hit sample.
  • FIG. 14 illustrates an example of the motion blur sample frequency table 1523 .
  • the motion blur sample frequency table 1523 lists frequency components for each of the motion blur lengths L (La, . . . , Lmax).
  • the frequency component matching unit 1522 searches the lookup table of FIG. 14 for a sample having the frequency component most similar to the top three frequency components of the target area.
  • a function evaluating an error is prepared. For example, evaluation of the error may be performed using a typical distance function that linearly sums squared differences.
  • the motion blur length of the sample motion blur pattern determined is the sought motion blur length L.
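The frequency-matching flow of FIGS. 13 and 14 might be sketched as below. For simplicity this sketch compares full power spectra with the squared-error distance mentioned above (rather than only the top three frequencies), uses a naive DFT in place of an FFT, and builds the sample table from box-blurred step edges; every name and value is an assumption, not the patent's implementation:

```python
import math

def power_spectrum(x):
    """Naive DFT power spectrum for k = 1..N//2 (DC term excluded)."""
    n = len(x)
    out = []
    for k in range(1, n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(x))
        out.append(re * re + im * im)
    return out

def box_blur_edge(n, L):
    """A step edge smoothed by an L-pixel moving average: one blur sample."""
    step = [0.0] * (n // 2) + [1.0] * (n - n // 2)
    return [sum(step[max(0, i - L + 1):i + 1]) / (i - max(0, i - L + 1) + 1)
            for i in range(n)]

N = 32
# Lookup table playing the role of the motion blur sample frequency table 1523.
SAMPLE_TABLE = {L: power_spectrum(box_blur_edge(N, L)) for L in (2, 4, 8)}

def match_blur_length(target_area):
    """Pick the sample whose spectrum minimizes the summed squared error."""
    feats = power_spectrum(target_area)
    def err(L):
        return sum((a - b) ** 2 for a, b in zip(feats, SAMPLE_TABLE[L]))
    return min(SAMPLE_TABLE, key=err)

estimated = match_blur_length(box_blur_edge(N, 4))
```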
  • the above-described estimation method of the motion blur length L, which matches the motion blur sample pattern to the motion blur actually taking place in the image signal, focuses on the spatial frequency component. It is also contemplated that the sample pattern and the area in the vicinity of the edge determined to be the analysis target are compared to each other in the real space. In other words, the sample motion blur pattern is stored as an image signal, and the sample motion blur pattern minimizing an error function with respect to the actual image signal is searched for.
  • the motion blur characteristic analyzer 152 in the shutter speed estimation processor 150 estimates the motion blur length L using one of the above-described techniques, and outputs the resulting motion blur length L.
  • the output motion blur length L is input to the imaging shutter speed calculator 153 .
  • the imaging shutter speed calculator 153 determines the estimation imaging shutter speed SSDT based on the motion blur length L of the motion blur in the process target area DDT, and the motion vector VDT responsive to the process target area DDT.
  • the process performed by the imaging shutter speed calculator 153 is merely solving equation (4).
  • the frame rate F in equation (4) is known.
  • the movement speed V is a horizontal component of the motion vector VDT responsive to the process target area DDT, and is also known.
  • the motion blur length L is estimated by the motion blur characteristic analyzer 152 .
  • the shutter speed S is easily determined by solving equation (4).
  • the shutter speed S becomes the estimation imaging shutter speed SSDT to be output by the imaging shutter speed calculator 153 .
  • the imaging shutter speed accuracy enhancement processor 154 generates and outputs the imaging shutter speed SSD.
  • the imaging shutter speed accuracy enhancement processor 154 receives the estimation imaging shutter speed SSDT estimated by the imaging shutter speed calculator 153 .
  • estimation imaging shutter speeds estimated from a plurality of target areas are input.
  • the estimation process may be theoretically performed in one area selected from one frame within a period from the detection of one scene change to the detection of a next scene change.
  • Performing the estimation process in a plurality of target areas within one frame as well as within a plurality of frames is useful to enhance estimation accuracy.
  • a plurality of estimation imaging shutter speeds SSDT may be used to generate the shutter speed SSD. If a plurality of different estimation imaging shutter speeds SSDT are estimated, a weighted average or a median value of the speeds may be determined as the shutter speed SSD to be finally output. Process reliability is thus enhanced.
  • the imaging shutter speed accuracy enhancement processor 154 illustrated in FIG. 12 includes an imaging shutter speed accumulator 1541 , an imaging shutter speed filtering processor 1542 , and a scene change detection signal receiver 1543 .
  • the imaging shutter speed accumulator 1541 accumulates the value of the estimation imaging shutter speeds SSDT.
  • the imaging shutter speed filtering processor 1542 performs a predetermined filtering process using at least one of the values of the estimation imaging shutter speeds SSDT accumulated on the imaging shutter speed accumulator 1541 .
  • the filtering process is intended to enhance process reliability when a plurality of different estimation imaging shutter speeds SSDT are input.
  • the filtering process may include an averaging operation, a weighted averaging operation, and a median value detection operation.
  • An infinite impulse response (IIR) filter may also be used.
  • the scene change detection signal receiver 1543 receives the scene change detection signal SCD input from the scene change detector 1514 in the process target area selector 151 . Upon receiving the scene change detection signal SCD, the scene change detection signal receiver 1543 determines that a scene change has taken place in the target frame, and thus determines that the imaging shutter speed has changed. The scene change detection signal receiver 1543 outputs a reset signal to the imaging shutter speed accumulator 1541 , and deletes the estimation values of the imaging shutter speeds heretofore stored. The imaging shutter speed filtering processor 1542 calculates a highly reliable imaging shutter speed using the estimation value of the imaging shutter speed newly input to the imaging shutter speed accumulator 1541 .
  • the process results of the imaging shutter speed filtering processor 1542 are output as the shutter speed SSD of the current frame or of the current scene (lasting from the detection of an immediately preceding scene change to the detection of a next scene change).
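A minimal sketch of the accumulate/filter/reset behavior described above, assuming a median filter as the filtering process; the class and method names are not the patent's:

```python
import statistics

class ShutterSpeedAccumulator:
    """Accumulate per-area shutter speed estimates, output a filtered value,
    and reset when a scene change is detected (sketch only)."""

    def __init__(self):
        self._estimates = []

    def add_estimate(self, ssdt):
        """Store one estimation imaging shutter speed SSDT (seconds)."""
        self._estimates.append(ssdt)

    def filtered_speed(self):
        """Median filtering suppresses outlier estimates."""
        return statistics.median(self._estimates)

    def on_scene_change(self):
        """A scene change invalidates all previously stored estimates."""
        self._estimates.clear()

acc = ShutterSpeedAccumulator()
for s in (1 / 60, 1 / 60, 1 / 30):    # one outlier among three estimates
    acc.add_estimate(s)
ssd = acc.filtered_speed()            # the median rejects the outlier
acc.on_scene_change()                 # new scene: start accumulating afresh
```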
  • the serial operations of the shutter speed estimation method of the shutter speed estimation processor 150 have been discussed.
  • the shutter speed SSD estimated through the above-described process is output to the motion blur correction parameter calculator 170 .
  • the process of the motion blur correction parameter calculator 170 and the motion blur correction processor 160 illustrated in FIG. 5 is described below.
  • the motion blur correction parameter calculator 170 and the motion blur correction processor 160 perform a filtering process on each partition area (see FIG. 2 ) of the decoded image data DD in response to the value of the motion vector VD input from the motion vector generation processor 140 while referencing the shutter speed SSD input from the shutter speed estimation processor 150 .
  • the motion blur correction parameter calculator 170 and the motion blur correction processor 160 output to the moving image display output unit 190 an output image with both jerkiness and blur reduced.
  • the motion blur correction process performed by the motion blur correction parameter calculator 170 and the motion blur correction processor 160 includes the filtering process of reducing or adding the motion blur on a per partition area basis. The selection of the filtering process is adaptively performed in accordance with a technique to be discussed later.
  • FIG. 15 illustrates a specific structure of the motion blur correction parameter calculator 170 and the motion blur correction processor 160 .
  • the motion blur correction parameter calculator 170 includes a process selection controller 171 and an optimum shutter speed information memory 172 .
  • the motion blur correction processor 160 includes a sorting unit 163 , a motion blur reduction processor 164 , a motion blur addition processor 165 , and a synthesizer 166 .
  • the motion vector VD output from the motion vector generation processor 140 and the shutter speed SSD output from the shutter speed estimation processor 150 are first received by the process selection controller 171 .
  • Using the motion vector value responsive to the partition area of the input motion vector VD, the process selection controller 171 references the optimum shutter speed information stored on the optimum shutter speed information memory 172 , and determines the shutter speed SSD 0 of the partition area.
  • the process selection controller 171 also compares the shutter speed SSD supplied from the shutter speed estimation processor 150 with the shutter speed SSD 0 for evaluation.
  • the process selection controller 171 thus determines whether the filtering process to be executed on the partition area is the motion de-blur process, the motion ad-blur process, or no motion blur correction at all.
  • the process selection controller 171 then transfers the determination to the sorting unit 163 .
  • the process selection controller 171 also outputs a filter parameter PD to the selected motion blur correction filtering blocks, i.e., the motion blur reduction processor 164 and the motion blur addition processor 165 .
  • the motion blur correction may be performed by either the motion de-blur process or the motion ad-blur process.
  • the filter parameter PD is used to adjust the degree of such a process (in terms of amount and intensity).
  • the sorting unit 163 has already received the decoded image data DD, and outputs the decoded image data DD to one of the motion blur reduction processor 164 and the motion blur addition processor 165 on a per partition area basis in response to process selection control information SCS.
  • the sorting unit 163 may output the partition area where performance of a motion blur correction process is not necessary to the synthesizer 166 instead of supplying the partition area to one of the motion blur reduction processor 164 and the motion blur addition processor 165 .
  • Alternatively, the partition area where no motion blur correction is necessary may be processed by one of the motion blur reduction processor 164 and the motion blur addition processor 165 with the correction amount set to zero.
  • the motion blur reduction processor 164 performs on the image data of the partition area supplied from the sorting unit 163 the filtering process to reduce the motion blur quantity in a method to be discussed later, and outputs the resulting image data to the synthesizer 166 .
  • the motion blur addition processor 165 performs on the image data of the partition area supplied from the sorting unit 163 the filtering process to add the motion blur quantity in a method to be discussed later, and outputs the resulting image data to the synthesizer 166 .
  • the image data at the partition area having undergone the filtering process is output to the synthesizer 166 .
  • the synthesizer 166 then reconstructs the received image data into a frame image, and then outputs the frame image as the output image signal OD.
  • the process selection controller 171 first references the optimum shutter speed information stored beforehand on the optimum shutter speed information memory 172 using the value of the vector responsive to the partition area of the input motion vector VD, and determines the shutter speed SSD 0 for the partition area. Before describing the process selection controller 171 , the optimum shutter speed is described.
  • FIG. 16 illustrates an object speed indicating a movement speed of an object detected as a motion vector, and an optimum shutter speed curve responsive to the object speed.
  • the optimum shutter speed responsive to a given movement speed is a value at which both the jerkiness degradation and the blur degradation are reduced if image capturing is performed at that shutter speed. More specifically, the optimum shutter speed is a shutter speed at which the jerkiness degradation is less visible in accordance with the vision characteristics in response to the movement speed of the object, and at which the blur degradation is also less visible.
  • the object lacks detail or sharpness when an excessive motion blur is added. The longer the exposure time at the image capturing, i.e., the lower the shutter speed, the larger the motion blur quantity becomes. If the object is photographed at a shutter speed higher than the optimum shutter speed, the captured image may suffer from jerkiness. If the object is photographed at a shutter speed lower than the optimum shutter speed, the captured image may suffer from motion blur.
  • the optimum shutter speed information memory 172 pre-stores the optimum shutter speed information represented in FIG. 16 .
  • the optimum shutter speed information memory 172 uses the optimum shutter speed as a motion blur adjustment reference, thereby determining the content of the filtering process to adjust the motion blur quantity at subsequent stages.
  • the filtering process performed by the subsequent stages i.e., the motion blur reduction processor 164 and the motion blur addition processor 165 , is interpreted as a process converting each area of the image into an image having a motion blur responsive to a “motion blur quantity captured at the optimum shutter speed.”
  • the optimum shutter speed curve SS 0 in FIG. 16 illustrates the relationship between any object speed and the optimum shutter speed; more specifically, it is a curve connecting the results of a psychological experiment.
  • a motion blur region A 1 illustrated in FIG. 16 is determined as having an excess degree of motion blur caused by a motion of the object in accordance with the optimum shutter speed curve SS 0 .
  • a jerkiness region A 2 is determined as having no motion blur caused by the motion of the object in accordance with the optimum shutter speed curve SS 0 and having a jerkiness degradation in vision characteristics.
  • the optimum shutter speed information responsive to the motion vector in steps of any value is pre-stored on the optimum shutter speed information memory 172 as a table and then referenced.
  • the optimum shutter speed responsive to the motion vector may be calculated using a function similar to the optimum shutter speed curve denoted by the solid line in FIG. 16 .
  • the process selection controller 171 calculates a shutter speed SSD′ in accordance with an approximate function of the optimum shutter speed curve of the following equation (8):
  • FIG. 16 illustrates, as specific examples, curves SS 1 -SS 3 with the parameters A and B of equation (8) set to fixed values and the remaining parameter of equation (8) varied in three steps.
  • SS 0 represents the optimum shutter speed curve produced on the basis of the value obtained from an objective evaluation experiment
  • SS 1 -SS 3 are optimum shutter speed curves approximating the optimum shutter speed curve SS 0 in accordance with equation (8).
  • SS 1 -SS 3 may be used to adjust the optimum shutter speed curve in accordance with a preference of a user of the apparatus.
  • a format of the data stored on the optimum shutter speed information memory 172 is not only a graph in FIG. 16 and a mathematical expression such as equation (8) but also a table listing quantized values of the optimum shutter speed curve. The optimum shutter speed minimizing the jerkiness degradation and the blur degradation has been described.
  • the process selection controller 171 performs the determination method on the premise that the optimum shutter speed information memory 172 stores the optimum shutter speed information in the form of graph illustrated in FIG. 16 .
  • a movement speed AVT of the partition area represents the absolute value of the vector value responsive to the partition area of the motion vector VD.
  • the value of the movement speed AVT is plotted as a point in the abscissa of the graph in FIG. 16 .
  • a point corresponding to that point in the abscissa is then found on the curve of interest.
  • a value in the ordinate corresponding to the point in the curve is read.
  • the read value is the optimum shutter speed SSD 0 in the partition area.
  • FIG. 17 illustrates a process flow of the process selection controller 171 .
  • the process selection controller 171 determines the optimum shutter speed SSD 0 of the partition area in the manner described above.
  • the process selection controller 171 compares the determined optimum shutter speed SSD 0 with the imaging shutter speed SSD input to the process selection controller 171 . The comparison results determine whether the filtering process to be performed on the partition area is the motion de-blur process or the motion ad-blur process.
  • the criterion of the selection process is that if the optimum shutter speed in the partition area is lower than the imaging shutter speed SSD, the motion ad-blur process is to be performed (step S 12 ), and that if the optimum shutter speed is higher than the imaging shutter speed SSD, the motion de-blur process is to be performed (step S 13 ).
  • the process selection controller 171 outputs the filter parameter PD to the selected motion blur correction filtering blocks, namely, the motion blur reduction processor 164 and the motion blur addition processor 165 .
  • the filter parameter PD is used to adjust the degree of selected one of the motion de-blur process and the motion ad-blur process (in terms of amount and intensity).
  • a shutter speed difference SSDD between the imaging shutter speed of each partition area and the optimum shutter speed, and the motion vector VD are supplied as the filter parameter PD to each of the motion blur reduction processor 164 and the motion blur addition processor 165 .
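The selection criterion of FIG. 17 can be sketched as below. The optimum shutter speed curve of FIG. 16 is stood in for by an assumed lookup table (shutter speeds written here as 1/exposure denominators, so a larger value means a faster shutter); the table values are illustrative, not the experimentally derived curve:

```python
# Assumed samples of the optimum shutter speed curve:
# (object speed in pixels/frame, optimum shutter speed as 1/exposure).
OPTIMUM_TABLE = [(0, 30), (8, 60), (16, 120), (32, 240)]

def optimum_shutter(speed_px):
    """Piecewise-linear interpolation of the stored curve (SSD0)."""
    for (v0, s0), (v1, s1) in zip(OPTIMUM_TABLE, OPTIMUM_TABLE[1:]):
        if v0 <= speed_px <= v1:
            return s0 + (s1 - s0) * (speed_px - v0) / (v1 - v0)
    return OPTIMUM_TABLE[-1][1]

def select_process(area_speed_px, imaging_shutter_ssd):
    """Choose the filtering branch by comparing SSD0 with the imaging SSD."""
    ssd0 = optimum_shutter(area_speed_px)
    if ssd0 < imaging_shutter_ssd:
        return "ad-blur"   # shot faster than optimum: too little blur
    if ssd0 > imaging_shutter_ssd:
        return "de-blur"   # shot slower than optimum: too much blur
    return "none"
```

For example, a partition moving at 8 pixels/frame in footage captured at 1/240 s would be routed to the motion ad-blur process under this table.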
  • FIG. 18 illustrates the imaging shutter speed and the object speed (the speed of the object) illustrated in FIG. 16 with a specific example added thereto.
  • the selection as to whether the motion de-blur process or the motion ad-blur process is to be performed as the filter process is specifically described. Also, the operation of each of the two filtering processes is described below.
  • FIG. 18 illustrates imaging shutter speeds Sa-Sc as examples of the shutter speed at the image capturing and the object speeds Va-Vc as the movement speed of the object.
  • SS 0 is selected as the optimum shutter speed curve from the curves illustrated in FIG. 16 ; any of the curves SS 0 -SS 3 may be selected.
  • a motion blur quantity adjustment process is described for each of the three imaging shutter speed values Sa, Sb, and Sc with reference to FIG. 18 .
  • the imaging shutter speed Sa corresponds to an open shutter, i.e., the shutter is open for the entire frame period.
  • the optimum shutter speed is higher than the actual shutter speed Sa at all of the object speeds Va, Vb, and Vc. For this reason, the de-blur process, as the motion blur reduction process, is performed in order to generate an image having the degree of motion blur that would result at the optimum shutter speed.
  • the process selection controller 171 thus outputs to the sorting unit 163 the process selection control information SCS indicating that the motion de-blur process has been selected.
  • the sorting unit 163 outputs the signal of each partition area in the input decoded image data DD to the motion blur reduction processor 164 .
  • FIG. 19 illustrates a structure of the motion blur reduction processor 164 .
  • the motion blur reduction processor 164 includes a smoothing filter characteristic converter 1641 , a smoothing filter 1642 , a subtractor 1643 , and an adder 1644 .
  • the motion blur reduction processor 164 reduces the motion blur quantity of each partition area of the input decoded image data DD.
  • the smoothing filter 1642 is one of the simplest types of low-pass filters: each time the process target pixel moves by one pixel, it calculates and outputs the average value of the process target pixel and its surrounding pixels. For example, as illustrated in FIG. 20A , n sample values including a current sample value (four sample values in FIG. 20A ) are averaged at a given time point. As illustrated in FIG. 20B , n sample values including a current sample value (four sample values in FIG. 20B ) are averaged at the next time point. A sample value here refers to a pixel value.
  • the smoothing filter characteristic converter 1641 receives the filter parameter PD.
  • the smoothing filter characteristic converter 1641 extracts from the input filter parameters PD a filter parameter positionally corresponding to the partition area in the decoded image data DD, and determines, based on the extracted filter parameter, filter characteristics of the process to be performed by the smoothing filter 1642 .
  • smoothing filters are prepared for the respective filter parameters PD, and the filter to be used for a given target pixel is selected from among them. This process is specifically described below.
  • the smoothing filter characteristic is interpreted here as the number of surrounding pixels averaged with the target pixel, and the shutter speed difference SSDD and the motion vector VD are used as examples of the filter parameters.
  • one table is prepared which determines the number of pixels to be used for the smoothing filter with respect to a combination of a shutter speed difference SSDD and a motion vector VD. Each time the shutter speed difference SSDD and the motion vector VD are input, the number of pixels to be used for the smoothing filter is output. The determined number of pixels to be used for the smoothing filter is output to the smoothing filter 1642 .
  • the smoothing filter 1642 (low-pass filter) performs a filtering process on a predetermined block containing the target pixel within the process target frame in accordance with the filter characteristics determined by the smoothing filter characteristic converter 1641 , thereby converting the pixel value of the target pixel.
  • the pixel value of the target pixel converted by the smoothing filter 1642 is output to the subtractor 1643 . More specifically, the subtractor 1643 receives the converted pixel value of the target pixel with its polarity inverted.
  • the subtractor 1643 also receives the target pixel of the process target frame of the input decoded image data DD.
  • the subtractor 1643 calculates a difference value between the pixel value of the pixel in the input image data DD and the pixel value of the target pixel converted by the smoothing filter 1642 , and then outputs the difference value to the adder 1644 .
  • the adder 1644 receives the difference value between the values before and after the operation of the smoothing filter.
  • the adder 1644 also receives the target pixel of the process target frame in the decoded image data DD.
  • the adder 1644 adds to the uncorrected pixel value of the target pixel the difference value between the values before and after the operation of the smoothing filter, and outputs the addition results as a portion of an output image.
  • the process of the motion blur reduction processor 164 illustrated in FIG. 19 has been described.
  • the process of the motion blur reduction processor 164 is easy to understand if it is considered in terms of frequency domain.
  • the difference value between the values before and after the operation of the smoothing filter, as the output signal of the subtractor 1643 is now considered in terms of frequency domain.
  • a difference between the gain of the input image signal and the gain of the image signal that has been filtered by the smoothing filter becomes the gain of the output signal of the subtractor 1643 .
  • the gain of the output image signal of the adder 1644 is the sum of the gain of the input image signal and the gain difference between before and after the operation of the smoothing filter.
  • the gain of the output image signal is the gain of the input image signal raised by the gain difference between before and after the operation of the smoothing filter. Since the smoothing filter 1642 is a low-pass filter, the entire process of the motion blur reduction processor 164 illustrated in FIG. 19 is basically equivalent to a high-pass filtering operation.
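The smoothing filter (1642), subtractor (1643), and adder (1644) path of FIG. 19 can be sketched on a one-dimensional row of pixel values. This is a minimal illustration, assuming a clamped-edge moving average for the smoothing filter and a window size n standing in for the characteristic chosen by the converter 1641; overall the chain boosts high frequencies, consistent with the high-pass interpretation above.

```python
def smooth(row, n):
    """Smoothing filter 1642: average each sample with its n-sample
    neighborhood (window clamped at the row edges)."""
    half = n // 2
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - half), min(len(row), i - half + n)
        window = row[lo:hi]
        out.append(sum(window) / len(window))
    return out

def reduce_motion_blur(row, n=3):
    smoothed = smooth(row, n)                      # smoothing filter 1642
    diff = [a - b for a, b in zip(row, smoothed)]  # subtractor 1643
    return [a + d for a, d in zip(row, diff)]      # adder 1644
```

Flat regions pass through unchanged (the difference term is zero), while transitions are steepened, which is exactly the behavior expected of a high-pass-like motion de-blur step.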
  • Japanese Unexamined Patent Application Publication No. 2006-81150 discloses a technique of performing a motion blur reduction process by directly high-pass filtering a partition area.
  • the high-pass filter is based on the premise that an inverse function of a transfer function of a smoothing filter is used.
  • the frequency characteristics of the smoothing filter contain frequencies at which the gain is zero.
  • a complete inverse of the smoothing filter may therefore not be realizable.
  • the use of a low-pass filter such as the smoothing filter 1642 in the motion blur reduction processor 164 illustrated in FIG. 19 is more appropriate.
  • the motion blur reduction process is performed in the manner discussed above.
  • the method of reducing the motion blur is not limited to the method described above.
  • the imaging shutter speed Sb as illustrated in FIG. 18 is considered.
  • the imaging shutter speed Sb is high enough, and the optimum shutter speed is lower than the imaging shutter speed Sb at all the object speeds Va, Vb, and Vc.
  • the motion ad-blur process is performed.
  • the process selection controller 171 outputs to the sorting unit 163 the process selection control information SCS indicating that the motion ad-blur process has been selected.
  • the sorting unit 163 outputs the signal of each partition area in the decoded image data DD to the motion blur addition processor 165 .
  • the motion ad-blur process to be performed by the motion blur addition processor 165 is specifically described.
  • An output image is here generated through spatial filtering.
  • the motion blur addition processor 165 includes a motion vector masking processor 1651 generating motion vector mask information identifying an image area to which a motion blur is added, and a motion vector corrector 1652 correcting a motion vector.
  • the motion blur addition processor 165 further includes a filter parameter calculator 1653 calculating a filter parameter for adding the motion blur responsive to a pixel in the process target frame, and a motion blur addition filter 1654 performing a motion blur filtering process on the pixel value of each pixel in the process target frame.
  • the motion blur addition processor 165 performs the process of the motion vector masking processor 1651 and the motion vector corrector 1652 on a per partition area basis with the partition area being a pixel block.
  • the filtering process of the filter parameter calculator 1653 and the motion blur addition filter 1654 to add the motion blur to the decoded image data DD is performed on a per pixel basis rather than on a per pixel block basis.
  • the motion vector masking processor 1651 performs a mask process illustrated in FIG. 22 on the motion vector VD of the supplied partition area.
  • the motion vector masking processor 1651 then supplies to the motion vector corrector 1652 the mask processed motion vector of the partition area.
  • the image area which is susceptible to jerkiness degradation and for which motion blur addition is necessary is concentrated in the vicinity of an edge of a moving image on the screen.
  • the motion vector masking processor 1651 outputs, as an effective value, the motion vector of only a pixel block, having a high spatial contrast and susceptible to jerkiness, in the vicinity of the edge.
  • in step S 21 , the motion vector masking processor 1651 detects an edge of an image in the decoded image data DD supplied from the sorting unit 163 .
  • the edge is detected on a per pixel block basis in order to identify an area having a high spatial contrast within the process target frame.
  • in step S 22 , the motion vector masking processor 1651 detects a moving image area by calculating a difference between frames on a per pixel block basis in order to identify the moving image area in the process target frame.
  • in step S 23 , the motion vector masking processor 1651 determines on a per pixel block basis whether an area subject to jerkiness has been detected in step S 21 and/or step S 22 .
  • the motion vector masking processor 1651 sets a mask processing flag “ 1 ” to the pixel block that has been determined as being subject to jerkiness.
  • the motion vector masking processor 1651 sets a masking process flag “ 0 ” to the pixel block that has not been determined as being subject to jerkiness.
  • in step S 24 , the motion vector masking processor 1651 determines whether the motion vector VD supplied from the process selection controller 171 is the motion vector VD of a pixel block having the above-described flag “ 1 .”
  • the motion vector masking processor 1651 outputs the motion vector of a pixel block having the flag “ 1 ” to the motion vector corrector 1652 without any change to the value of the motion vector (processing proceeds from step S 24 directly to step S 26 ).
  • in step S 25 , the motion vector masking processor 1651 performs the mask process on the motion vector of a pixel block having the above-described flag “ 0 ,” thereby setting the value of the motion vector to “0,” that is, invalidating it.
  • the motion vector masking processor 1651 then outputs the mask processed motion vector to the motion vector corrector 1652 (processing thus proceeds from step S 24 to step S 25 to step S 26 ).
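Steps S21-S26 above can be sketched as a per-block mask. The edge and motion tests below are placeholder threshold checks standing in for the edge detection of step S21 and the inter-frame difference of step S22; the thresholds and data layout are assumptions, not the patent's detectors.

```python
def mask_motion_vectors(vectors, contrasts, frame_diffs,
                        contrast_th=10.0, motion_th=1.0):
    """vectors: per-block motion vectors (vx, vy); contrasts: per-block
    spatial contrast measures (step S21); frame_diffs: per-block inter-frame
    difference measures (step S22).  Returns vectors with flag-'0' blocks
    masked to zero (step S25) and flag-'1' blocks passed through (step S26)."""
    masked = []
    for vec, contrast, diff in zip(vectors, contrasts, frame_diffs):
        # Step S23: a block is subject to jerkiness only if it is both
        # high-contrast (near an edge) and moving.
        flag = 1 if (contrast > contrast_th and diff > motion_th) else 0
        masked.append(vec if flag == 1 else (0.0, 0.0))
    return masked
```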
  • the motion vector corrector 1652 corrects the motion vector VD using the shutter speed difference SSDD of the input partition area.
  • the motion vector is corrected by the motion vector corrector 1652 if the shutter speed difference SSDD is negative.
  • let F (frames/second) represent the frame rate of the decoded image data DD as a moving image.
  • the shutter speed difference SSDD is not smaller than −1/F (seconds).
  • the shutter speed difference SSDD is a difference value between the imaging shutter speed SSD that is not smaller than zero and the optimum shutter speed SSD 0 that is not greater than 1/F (seconds).
  • the smaller the shutter speed difference SSDD (the larger the absolute value of the shutter speed difference SSDD), the greater the motion blur quantity to be added becomes.
  • the smaller the value of the shutter speed difference SSDD, the larger the value of the motion vector VD, which serves as an indicator of the motion blur quantity to be added, should be.
  • the motion vector corrector 1652 thus multiplies the vector value by a function fs (SSDD).
  • the function fs (SSDD) converges to 1 as the shutter speed difference SSDD becomes closer to “−1/F” and converges to “0” as the shutter speed difference SSDD becomes closer to “0”.
  • a value A closer to −1/F and a value B closer to 0 are set, wherein the magnitude relationship −1/F &lt; A &lt; B &lt; 0 holds.
  • the output value of the function fs is set to 1 for a value equal to or smaller than the value A and the output value of the function fs is set to be 0 for a value equal to or larger than the value B.
  • This method is referred to as a clipping method.
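The clipping method can be written as a small function. A and B are the tuning constants introduced above (with −1/F < A < B < 0); the linear ramp between them is one simple assumed choice, since the text does not fix the in-between shape of fs.

```python
def fs(ssdd, a, b):
    """Clipping method for the correction gain fs(SSDD): 1 at or below A,
    0 at or above B, linear ramp in between (assumed shape)."""
    if ssdd <= a:
        return 1.0
    if ssdd >= b:
        return 0.0
    return (b - ssdd) / (b - a)  # ramps from 1 at A down to 0 at B
```

The corrected vector is then the input vector multiplied by fs(SSDD), so blur addition fades out as the shutter speed difference approaches zero.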
  • the motion vector corrector 1652 may instead perform the multiplication using fs (VD), a function of the motion vector VD, or fs (SSDD, VD), a function of both the shutter speed difference SSDD and the motion vector VD.
  • when the data with the motion blur added at the later stage is displayed as a moving image, a more natural-looking image quality in terms of vision characteristics results.
  • the filter parameter calculator 1653 calculates the filter parameter described below on a per pixel basis in order to add the motion blur to each pixel forming the process target frame.
  • the filter parameter calculator 1653 identifies a pixel positioned on a motion vector of each target pixel (hereinafter referred to as a parameter calculation target pixel).
  • the target pixel is a pixel having effective motion vector information.
  • the filter parameter calculator 1653 calculates a filter parameter responsive to a relative position of the parameter calculation target pixel identified with respect to the target pixel in the manner described below.
  • the filter parameter calculator 1653 identifies as the parameter calculation target pixels all the pixels present on a motion vector having a target pixel P 0 at the midpoint between a start point S and an end point E.
  • here, v denotes the absolute value of the motion vector of the target pixel.
  • the filter parameter calculator 1653 calculates a strength σ of motion blur addition in accordance with the following equation (9), based on the absolute value v of the motion vector and a distance d between the pixel position of the target pixel P 0 and the pixel position of a parameter calculation target pixel P 1 identified in the process described above:
  • equation (9) is derived so that the square of the strength σ becomes the variance of the Gaussian function of the subsequent-stage motion blur addition filter 1654 .
  • the filter parameter calculator 1653 calculates an angular direction θ of the motion blur addition in accordance with the following equation (10):
  • the filter parameter calculator 1653 identifies the parameter calculation target pixels from the motion vector of the target pixel, sets parameter information (σ, θ) for each identified parameter calculation target pixel, and then supplies the parameter information (σ, θ) to the motion blur addition filter 1654 on a per process target frame basis.
  • a plurality of parameter calculation target pixels can be identified with respect to a given pixel.
  • among the plurality of pieces of parameter information, the one having the largest σ is set as the parameter information of that pixel.
  • the filter parameter calculator 1653 may perform a smoothing process, such as a Gaussian function filtering process or a median filtering process, on the parameter information (σ, θ) of each parameter calculation target pixel so that the image quality of the moving image output from the subsequent motion blur addition filter 1654 is increased.
  • the motion blur addition filter 1654 performs a spatial filtering process within the process target frame described below on the pixel value of each pixel within the process target frame of the decoded image data DD.
  • the motion blur addition filter 1654 outputs an image with a motion blur added thereto by a first filtering process and/or a second filtering process.
  • the motion blur addition filter 1654 receives, in the form of an input I(x+i,y+j), a pixel value of a motion blur addition target pixel prior to the addition of the motion blur and a pixel value of a pixel surrounding the target pixel.
  • the motion blur addition filter 1654 then performs a Gaussian function filtering process on the input I(x+i,y+j) in accordance with a Gaussian function expressed as equation (11), thereby outputting a filter processed pixel value J(x,y):
  • J(x, y) = ( Σ I(x+i, y+j) · e^(−r²/(2σ²)) ) / ( Σ e^(−r²/(2σ²)) )   (11)
  • the surrounding pixels providing the inputs I(x+i,y+j) are set in accordance with the angular direction θ of the motion blur addition, and r represents the distance between the motion blur addition target pixel and the surrounding pixel.
  • the motion blur addition filter 1654 performs the above-described filtering process on each pixel having the parameter information (σ, θ) set therefor from among all the pixels forming the process target frame, thereby updating the pixel value of the pixel.
  • the motion blur addition filter 1654 thus supplies a moving image with jerkiness reduced therefrom to the moving image display output unit 190 .
  • the motion blur addition filter 1654 calculates the pixel value J(x,y) of the target pixel in accordance with equation (11) with the pixel value I(x,y) of the target pixel replaced with a pixel value I(x+i 0 ,y+j 0 ) of a surrounding pixel if the surrounding pixel has a motion vector with zero value or an invalid motion vector.
  • the motion blur addition filter 1654 outputs an image with jerkiness reduced in a more natural look than with the first filtering process.
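The normalized Gaussian weighting of equation (11) can be sketched as follows. The selection of surrounding pixel offsets along the angular direction θ is omitted here; instead, each sample is assumed to arrive pre-paired with its signed distance r from the target pixel, which is the only quantity the weighting itself needs.

```python
import math

def gaussian_blur_sample(samples, sigma):
    """samples: list of (r, value) pairs, where r is the distance of a
    surrounding pixel from the target pixel and value is I(x+i, y+j).
    Returns J(x, y) per equation (11): a Gaussian-weighted average with
    weights exp(-r^2 / (2*sigma^2)), normalized by the weight sum."""
    num = sum(v * math.exp(-r * r / (2.0 * sigma * sigma)) for r, v in samples)
    den = sum(math.exp(-r * r / (2.0 * sigma * sigma)) for r, _ in samples)
    return num / den
```

Because the weights are normalized, a uniform neighborhood is left unchanged, and larger σ (set from the motion vector via equation (9)) spreads the weight further along the blur direction.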
  • the motion blur addition processor 165 performs the spatial filtering process in the above discussion.
  • the motion blur addition processor 165 may generate an output image by performing a time filtering process.
  • a method of adding a motion blur with an appropriate number of intermediate frames generated is useful.
  • the structure of such a method is illustrated in FIG. 24 .
  • the motion blur addition processor 165 illustrated in FIG. 24 includes an intermediate frame generator 1655 , an image accumulator 1656 , a filter controller 1657 , and a motion blur addition filter 1658 .
  • the input decoded image data DD is supplied to the intermediate frame generator 1655 and the image accumulator 1656 .
  • the intermediate frame generator 1655 generates a predetermined number of new frames interpolating between existing prior and subsequent frames in time direction in accordance with a predetermined intermediate frame generation technique.
  • the intermediate frame generator 1655 then supplies the new frames to the image accumulator 1656 .
  • a variety of available methods may be applied as the intermediate frame generation technique. For example, in one method, existing prior and subsequent frames may be blended with weight incorporated. In another method, the motion vector is weighted using information regarding the motion vector at each area input to the intermediate frame generator 1655 and image blending is performed. More accurate intermediate frames can thus be generated.
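The first blending method mentioned above can be sketched as a weighted average of the existing prior and subsequent frames. Frames are modeled as flat lists of pixel values, and the linearly increasing weight is an assumption; the motion-vector-guided variant is not shown.

```python
def blend_intermediate_frames(prev_frame, next_frame, k):
    """Generate k intermediate frames between prev_frame and next_frame by
    weighted blending; the weight of the subsequent frame grows linearly."""
    frames = []
    for n in range(1, k + 1):
        w = n / (k + 1)  # blend weight toward the subsequent frame
        frames.append([(1 - w) * p + w * q
                       for p, q in zip(prev_frame, next_frame)])
    return frames
```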
  • the filter controller 1657 receives the imaging shutter speed SSD and the motion vector VD, and calculates a filter parameter FN of each partition area for use by the motion blur addition filter 1658 in accordance with the received imaging shutter speed SSD and motion vector VD.
  • the process performed by the motion blur addition filter 1658 is described before the discussion of the filter parameter FN.
  • the process of the motion blur addition filter 1658 is a frame averaging operation, and the number of frames used in the frame averaging operation is adaptively determined on a per partition area basis.
  • the frame averaging operation is performed on the number of frames different from partition area to partition area using one frame in the input decoded image data DD and a plurality of frames generated by the intermediate frame generator 1655 .
  • the larger the number of frames to be used in the averaging operation the larger the motion blur quantity to be added becomes, and the smaller the number of frames to be used in the averaging operation, the smaller the motion blur quantity to be added becomes.
  • the number of frames to be used in the averaging operation is determined in response to each partition area.
  • the filter controller 1657 then outputs the number of frames to the motion blur addition filter 1658 .
  • the number of frames to be used in the averaging operation is the filter parameter FN.
  • the number of frames to be used in the averaging operation (the filter parameter FN) is at least one; when the number of frames is one, the corresponding frame in the decoded image data DD is output as is without being processed.
  • let K represent the number of frames generated by the intermediate frame generator 1655 between the existing prior and subsequent frames; the maximum number of frames is then K+1, with the input frame added to the K generated frames.
  • the filter parameter FN is determined from within a range of from 1 to K+1. As described above, the larger the number of frames, the larger the motion blur quantity to be added becomes.
  • the determination method of the filter parameter FN is not limited to any one method.
  • the filter parameter FN is determined using the shutter speed difference SSDD input to the filter controller 1657 as described below.
  • the shutter speed difference SSDD, if negative, is used by the motion blur addition processor 165 .
  • let F (frames/second) represent the frame rate of the decoded image data DD as a moving image.
  • the shutter speed difference SSDD is not smaller than −1/F (seconds).
  • the shutter speed difference SSDD is a difference value between the imaging shutter speed SSD that is not smaller than zero and the optimum shutter speed SSD 0 that is not greater than 1/F (seconds).
  • the smaller the shutter speed difference SSDD (the larger the absolute value of the shutter speed difference SSDD), the greater the motion blur quantity becomes.
  • the smaller the value of the shutter speed difference SSDD, the larger the value of the motion vector VD, which serves as an indicator of the motion blur quantity to be added, should be.
  • the filter controller 1657 thus performs a process in accordance with a function gs (SSDD).
  • the function gs (SSDD) increases and converges to K+1 as the shutter speed difference SSDD becomes closer to “−1/F” and converges to “1” as the shutter speed difference SSDD becomes closer to “0”.
  • the filter parameter FN obtained as a result is output to the motion blur addition filter 1658 .
  • a value A closer to −1/F and a value B closer to 0 are set, and the magnitude relationship −1/F &lt; A &lt; B &lt; 0 holds.
  • the output value of the function gs is set to be K+1 for a value equal to or smaller than the value A and the output value of the function gs is set to be 1 for a value equal to or larger than the value B.
  • This method is referred to as a clipping method.
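The clipping variant for the time-filtering path can be sketched as below: gs maps the shutter speed difference to a frame count FN in [1, K+1], returning K+1 at or below A and 1 at or above B. The linear ramp and rounding in between are assumptions, since the text does not fix the in-between shape of gs.

```python
def gs(ssdd, a, b, k):
    """Clipping method for the filter parameter FN: the number of frames
    to average, from 1 (no added blur) up to K+1 (maximum added blur)."""
    if ssdd <= a:
        return k + 1
    if ssdd >= b:
        return 1
    # Linear ramp from K+1 at A down to 1 at B, rounded to a whole frame
    # count and clamped to the valid range.
    return max(1, min(k + 1, round(k + 1 - k * (ssdd - a) / (b - a))))
```

The motion blur addition filter 1658 then averages FN frames (the input frame plus intermediate frames), so a more negative shutter speed difference yields a longer effective exposure.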
  • the filter controller 1657 may instead calculate the filter parameter FN using gs(VD), a function of the motion vector VD, or gs(SSDD, VD), a function of both the shutter speed difference SSDD and the motion vector VD.
  • the motion blur addition filter 1658 receives from the image accumulator 1656 the image data of the number of frames for the addition of the motion blur to each partition area, and performs the above-described filter averaging process.
  • when the filter averaging process ends, the image data is reconstructed as a frame image and output as the output image signal OD.
  • the motion blur addition processor 165 performs the motion blur addition process using one of the spatial filtering process and the time filtering process.
  • the motion blur addition process is not limited to the process described above.
  • the imaging shutter speed Sc is between Sa and Sb as illustrated in FIG. 18 .
  • the optimum shutter speed may be higher or lower than the imaging shutter speed Sc.
  • if the optimum shutter speed is higher than the imaging shutter speed Sc, the motion blur reduction process is performed in order to generate an image containing a motion blur at a level corresponding to the optimum shutter speed.
  • the process selection controller 171 outputs to the sorting unit 163 process selection control information SCS indicating that the motion blur reduction process has been selected.
  • the sorting unit 163 outputs to the motion blur reduction processor 164 the signal of a partition area in the decoded image data DD.
  • the motion blur reduction processor 164 then performs the filtering process in the same manner as described with reference to the imaging shutter speed Sa.
  • if the optimum shutter speed is lower than the imaging shutter speed Sc, the motion blur addition process is performed to generate an image containing a motion blur at a level corresponding to the optimum shutter speed.
  • the process selection controller 171 outputs to the sorting unit 163 process selection control information SCS indicating that the motion blur addition process has been selected.
  • the sorting unit 163 outputs to the motion blur addition processor 165 the signal of a partition area in the decoded image data DD.
  • the motion blur addition processor 165 performs the filtering process in the same manner as described with reference to the imaging shutter speed Sb.
  • the structure and operation of the image reproducing apparatus 100 corresponding to the image processing apparatus 2 having the second structure discussed with reference to FIG. 3 have been discussed.
  • the image reproducing apparatus 100 performs the correction process in response to jerkiness and blur, thereby controlling the jerkiness degradation and the blur degradation.
  • the image quality of the output image output from the moving image display output unit 190 is thus improved. Even if the shutter speed at the image capturing of the decoded image data DD is unknown, the imaging shutter speed is estimated.
  • the appropriate motion blur correction process is performed based on the estimated shutter speed.
  • the image reproducing apparatus 100 having the third structure is generally identical to the image reproducing apparatus 100 illustrated in FIG. 5 .
  • the image reproducing apparatus 100 having the third structure is different in that a motion blur correction processor 160 A is substituted for the motion blur correction processor 160 .
  • FIG. 25 illustrates the motion blur correction parameter calculator 170 and a motion blur correction processor 160 A in a diagram form similar to that of FIG. 15 .
  • the motion blur correction processor 160 A includes the motion blur reduction processor 164 , the motion blur addition processor 165 , and the selector and synthesizer 167 .
  • the process selection controller 171 and the optimum shutter speed information memory 172 in the motion blur correction parameter calculator 170 are identical to the counterparts discussed with reference to FIG. 15 and subsequent drawings. However, it is noted that process selection control information SCS output from the process selection controller 171 is supplied to the selector and synthesizer 167 .
  • the correction processes of the motion blur reduction processor 164 and the motion blur addition processor 165 remain unchanged.
  • the input decoded image data DD is directly supplied to both the motion blur reduction processor 164 and the motion blur addition processor 165 .
  • the motion blur reduction processor 164 performs the motion blur reduction process on all the partition areas of the input decoded image data DD in accordance with the filter parameter PD of each partition area supplied from the process selection controller 171 .
  • the resulting image data is output to the selector and synthesizer 167 .
  • the motion blur addition processor 165 performs the motion blur addition process on all the partition areas of the input decoded image data DD in accordance with the filter parameter PD of each partition area supplied from the process selection controller 171 .
  • the resulting image data is output to the selector and synthesizer 167 .
  • the selector and synthesizer 167 receives the image data having undergone the motion blur addition process at all the partition areas of the decoded image data DD (including a partition area having zero correction quantity) and the image data having undergone the motion blur reduction process at all the partition areas of the decoded image data DD (including a partition area having zero correction quantity).
  • the selector and synthesizer 167 selects, for each partition area, between the image data from the motion blur reduction processor 164 and the image data from the motion blur addition processor 165 , in response to the process selection control information SCS supplied from the process selection controller 171 .
  • the selector and synthesizer 167 then synthesizes the selected image data of the partition areas, and then outputs the synthesized image data as the output image signal OD of one frame.
  • the motion blur correction processor 160 A also has the same advantages as the motion blur correction processor 160 illustrated in FIG. 15 .
  • each of the image processing apparatuses 1 , 2 , and 3 is the image reproducing apparatus 100 .
  • the present invention is applicable to a variety of apparatuses.
  • the present invention may be applicable not only to the image reproducing apparatus but also to an imaging apparatus, a communication apparatus, an image recording apparatus, a game playing machine, a video editing apparatus or the like.
  • the information processing apparatus such as a general-purpose personal computer may implement each of the image processing apparatuses 1 , 2 , and 3 .
  • each of the image processing apparatuses 1 , 2 , and 3 may be provided to the computer as an image processing application software program.
  • the computer can thus perform an appropriate image processing.
  • the computer program to be executed by the central processing unit (CPU) includes: calculating the motion blur correction parameter for the motion blur correction process on the basis of motion information indicating the motion of the image between unit images forming the image data and shutter speed information at the image capturing of the image data; and correcting the motion blur contained in the image data by performing at least the motion blur reduction process using the motion blur correction parameter.
  • the computer program further includes estimating the shutter speed by analyzing the image data. When the motion blur correction parameter is calculated, the estimated shutter speed is used. The operation of the shutter speed estimation processor 14 is thus performed by the CPU.
  • The computer program allows the same image processing to be performed by a personal computer, a cellular phone, a personal digital assistant (PDA), and a variety of other image processing apparatuses that use image data.
  • The computer program causing the CPU to operate as the motion blur correction parameter calculator 12 and the motion blur correction processor 13 (and the shutter speed estimation processor 14) may be pre-stored on a recording medium built into an apparatus such as a computer, for example a hard disk drive, or in a read-only memory (ROM) or flash memory of a microcomputer having a CPU.
  • The computer program may also be stored temporarily or permanently on a removable recording medium such as a flexible disk, a compact disc read-only memory (CD-ROM), a magneto-optical (MO) disc, a digital versatile disc (DVD), a Blu-ray Disc (registered trademark of Sony), a magnetic disc, a semiconductor memory, or a memory card.
  • Such a removable recording medium may be supplied as package software.
  • The computer program may be installed onto a personal computer from such a removable recording medium.
  • The computer program may also be downloaded from a download site via a network such as a local area network (LAN) or the Internet.
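The partition-wise flow described above (calculate a correction parameter from motion and shutter speed, select blur reduction or blur addition per partition area, then synthesize one output frame) can be sketched as follows. This is an illustrative toy only: the function names, the unsharp-mask "reduction" filter, the horizontal box-filter "addition", and the 2-pixel target blur are assumptions made for the sketch, not the patented implementation.

```python
import numpy as np

def box_blur_1d(img: np.ndarray, length: int) -> np.ndarray:
    """Horizontal box filter: a simple stand-in for motion blur of `length` px."""
    img = img.astype(float)
    if length <= 1:
        return img.copy()
    kernel = np.ones(length) / length
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, img)

def add_motion_blur(img: np.ndarray, extra_px: float) -> np.ndarray:
    """Stand-in for the motion blur addition processor (cf. 165)."""
    return box_blur_1d(img, max(1, int(round(extra_px))))

def reduce_motion_blur(img: np.ndarray, excess_px: float,
                       gain: float = 0.7) -> np.ndarray:
    """Stand-in for the motion blur reduction processor (cf. 164):
    unsharp masking in place of the actual deblurring filter."""
    blurred = box_blur_1d(img, max(2, int(round(excess_px))))
    return img.astype(float) + gain * (img.astype(float) - blurred)

def correct_partition(partition: np.ndarray, motion_px: float,
                      shutter_fraction: float) -> np.ndarray:
    """Choose reduction or addition for one partition area from the
    motion information and the shutter speed information."""
    actual_blur = motion_px * shutter_fraction  # blur extent baked into the frame
    target_blur = 2.0                           # assumed visually pleasant extent
    if actual_blur > target_blur:
        return reduce_motion_blur(partition, actual_blur - target_blur)
    if actual_blur < target_blur:
        return add_motion_blur(partition, target_blur - actual_blur)
    return partition.astype(float)

def synthesize_frame(partitions, motions, shutter_fraction):
    """Selector-and-synthesizer step (cf. 167): process each partition
    area, then stitch the areas back into one output frame."""
    return np.vstack([correct_partition(p, m, shutter_fraction)
                      for p, m in zip(partitions, motions)])
```

A fast-moving partition (large motion times exposure) is sharpened, while a slow-moving one receives additional blur, so perceived motion blur is equalized across the synthesized frame.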
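The shutter speed estimation step can be illustrated with a simple geometric model, stated here as an assumption rather than the method claimed in the patent: if an object moves M pixels between consecutive frames but its smear within a single frame spans only B pixels, the shutter was open for roughly B/M of the frame period.

```python
def estimate_shutter_fraction(blur_extent_px: float,
                              motion_px_per_frame: float) -> float:
    """Toy estimate of exposure time as a fraction of the frame period.

    An object moving `motion_px_per_frame` pixels between consecutive
    frames smears only while the shutter is open, so the ratio
    blur / motion approximates the open fraction. (Illustrative model;
    the patent estimates shutter speed by analyzing the image data.)
    """
    if motion_px_per_frame <= 0:
        raise ValueError("motion must be positive to estimate shutter speed")
    return min(1.0, blur_extent_px / motion_px_per_frame)
```

For example, 4 px of smear on an object moving 16 px per frame interval suggests the shutter was open for about a quarter of the frame period, i.e. roughly 1/120 s at 30 frames per second.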

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Picture Signal Circuits (AREA)
  • Adjustment Of Camera Lenses (AREA)
US12/487,922 2008-06-20 2009-06-19 Apparatus, method, and program for processing image Expired - Fee Related US9270897B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008161581A JP4666012B2 (ja) 2008-06-20 2008-06-20 画像処理装置、画像処理方法、プログラム
JPP2008-161581 2008-06-20

Publications (2)

Publication Number Publication Date
US20090316009A1 US20090316009A1 (en) 2009-12-24
US9270897B2 true US9270897B2 (en) 2016-02-23

Family

ID=41057624

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/487,922 Expired - Fee Related US9270897B2 (en) 2008-06-20 2009-06-19 Apparatus, method, and program for processing image

Country Status (4)

Country Link
US (1) US9270897B2 (de)
EP (1) EP2136554A3 (de)
JP (1) JP4666012B2 (de)
CN (1) CN101610349B (de)


Families Citing this family (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009253675A (ja) * 2008-04-07 2009-10-29 Canon Inc 再生装置および方法、プログラム
JP4674620B2 (ja) * 2008-07-29 2011-04-20 ソニー株式会社 画像処理装置、画像処理方法、及びプログラム
JP5490404B2 (ja) * 2008-12-25 2014-05-14 シャープ株式会社 画像復号装置
US10757308B2 (en) 2009-03-02 2020-08-25 Flir Systems, Inc. Techniques for device attachment with dual band imaging sensor
US9948872B2 (en) 2009-03-02 2018-04-17 Flir Systems, Inc. Monitor and control systems and methods for occupant safety and energy efficiency of structures
US9843742B2 (en) 2009-03-02 2017-12-12 Flir Systems, Inc. Thermal image frame capture using de-aligned sensor array
US9756264B2 (en) 2009-03-02 2017-09-05 Flir Systems, Inc. Anomalous pixel detection
US9843743B2 (en) 2009-06-03 2017-12-12 Flir Systems, Inc. Infant monitoring systems and methods using thermal imaging
US9716843B2 (en) 2009-06-03 2017-07-25 Flir Systems, Inc. Measurement device for electrical installations and related methods
JP2011114407A (ja) * 2009-11-24 2011-06-09 Sony Corp 画像処理装置、画像処理方法、プログラム及び記録媒体
US9706138B2 (en) 2010-04-23 2017-07-11 Flir Systems, Inc. Hybrid infrared sensor array having heterogeneous infrared sensors
US9948878B2 (en) 2010-04-23 2018-04-17 Flir Systems, Inc. Abnormal clock rate detection in imaging sensor arrays
JP2012032739A (ja) 2010-08-03 2012-02-16 Mitsubishi Electric Corp 画像処理装置及び方法、並びに画像表示装置
JP5506623B2 (ja) * 2010-09-27 2014-05-28 日立コンシューマエレクトロニクス株式会社 映像処理装置及び映像処理方法
FR2968878A1 (fr) * 2010-12-14 2012-06-15 Thomson Licensing Procede et dispositif pour generer des images comportant du flou cinetique
US8823745B2 (en) * 2011-06-02 2014-09-02 Yoostar Entertainment Group, Inc. Image processing based on depth information and color data of a scene
CN103875235B (zh) * 2011-06-10 2018-10-12 菲力尔系统公司 用于红外成像装置的非均匀性校正技术
US9058653B1 (en) 2011-06-10 2015-06-16 Flir Systems, Inc. Alignment of visible light sources based on thermal images
US9961277B2 (en) 2011-06-10 2018-05-01 Flir Systems, Inc. Infrared focal plane array heat spreaders
US9509924B2 (en) 2011-06-10 2016-11-29 Flir Systems, Inc. Wearable apparatus with integrated infrared imaging module
US10051210B2 (en) 2011-06-10 2018-08-14 Flir Systems, Inc. Infrared detector array with selectable pixel binning systems and methods
US9706137B2 (en) 2011-06-10 2017-07-11 Flir Systems, Inc. Electrical cabinet infrared monitor
US9143703B2 (en) 2011-06-10 2015-09-22 Flir Systems, Inc. Infrared camera calibration techniques
US9900526B2 (en) 2011-06-10 2018-02-20 Flir Systems, Inc. Techniques to compensate for calibration drifts in infrared imaging devices
US9235023B2 (en) 2011-06-10 2016-01-12 Flir Systems, Inc. Variable lens sleeve spacer
US10841508B2 (en) 2011-06-10 2020-11-17 Flir Systems, Inc. Electrical cabinet infrared monitor systems and methods
US10389953B2 (en) 2011-06-10 2019-08-20 Flir Systems, Inc. Infrared imaging device having a shutter
JP2013033330A (ja) * 2011-08-01 2013-02-14 Sony Corp 情報処理装置、情報処理方法およびプログラム
JP6089373B2 (ja) * 2012-02-03 2017-03-08 パナソニックIpマネジメント株式会社 評価方法、評価装置、コンピュータ・プログラムおよび記録媒体
WO2014024916A1 (ja) * 2012-08-07 2014-02-13 シャープ株式会社 画像処理装置、画像処理方法、画像処理プログラム及び画像表示装置
US10996542B2 (en) 2012-12-31 2021-05-04 Flir Systems, Inc. Infrared imaging system shutter assembly with integrated thermister
WO2014160297A2 (en) * 2013-03-14 2014-10-02 Drs Rsta, Inc. Method and system for adaptive pixel replacement
US9165345B2 (en) 2013-03-14 2015-10-20 Drs Network & Imaging Systems, Llc Method and system for noise reduction in video systems
US9355435B2 (en) 2013-03-14 2016-05-31 Drs Network & Imaging Systems, Llc Method and system for adaptive pixel replacement
US9973692B2 (en) 2013-10-03 2018-05-15 Flir Systems, Inc. Situational awareness by compressed display of panoramic views
FR3013487B1 (fr) 2013-11-18 2017-04-21 Univ De Nice (Uns) Procede d'estimation de la vitesse de deplacement d'une camera
FR3013488B1 (fr) 2013-11-18 2017-04-21 Univ De Nice (Uns) Procede d'estimation de la vitesse de deplacement d'une camera
KR20150078275A (ko) * 2013-12-30 2015-07-08 삼성전자주식회사 움직이는 피사체 촬영 장치 및 방법
US11297264B2 (en) 2014-01-05 2022-04-05 Teledyne Fur, Llc Device attachment with dual band imaging sensor
US9237277B1 (en) * 2014-06-06 2016-01-12 Google Inc. Accurate simulation of shallow depth of field using contrast detection
CN104363380B (zh) * 2014-10-15 2017-10-27 北京智谷睿拓技术服务有限公司 图像采集控制方法和装置
US9684970B2 (en) * 2015-02-27 2017-06-20 Qualcomm Incorporated Fast adaptive estimation of motion blur for coherent rendering
US20160295108A1 (en) * 2015-04-01 2016-10-06 Cheng Cao System and method for panoramic imaging
EP3573020B1 (de) 2015-04-30 2020-11-11 FotoNation Limited Verfahren und vorrichtung zur erzeugung eines video-streams
WO2016185947A1 (ja) * 2015-05-19 2016-11-24 ソニー株式会社 画像処理装置、画像処理方法、受信装置および送信装置
JP6389801B2 (ja) 2015-05-27 2018-09-12 富士フイルム株式会社 画像処理装置、画像処理方法、プログラムおよび記録媒体
FR3041136A1 (fr) * 2015-09-14 2017-03-17 Parrot Procede de determination d'une duree d'exposition d'une camera embarque sur un drone, et drone associe.
US10230912B2 (en) * 2016-06-28 2019-03-12 Seek Thermal, Inc. Fixed pattern noise mitigation for a thermal imaging system
EP3512323B1 (de) * 2016-09-07 2021-06-16 Fuji Corporation Erkennungsvorrichtung
JP7159866B2 (ja) * 2016-09-30 2022-10-25 株式会社ニコン 撮像装置およびプログラム
WO2018194040A1 (ja) * 2017-04-17 2018-10-25 ソニー株式会社 送信装置、送信方法、受信装置、受信方法、記録装置および記録方法
CN109672818B (zh) * 2017-10-16 2020-12-22 华为技术有限公司 一种调整图像质量的方法及装置
CN108648463B (zh) * 2018-05-14 2020-10-27 三峡大学 一种路口交通视频中车辆检测方法及系统
JP7327917B2 (ja) * 2018-09-14 2023-08-16 キヤノン株式会社 画像処理装置及び画像処理方法
JP7292033B2 (ja) * 2018-12-28 2023-06-16 キヤノン株式会社 情報処理装置、情報処理方法、撮像装置、及びプログラム
WO2020159988A1 (en) * 2019-01-28 2020-08-06 Op Solutions, Llc Inter prediction in exponential partitioning
CN110166649A (zh) * 2019-06-18 2019-08-23 北京控制与电子技术研究所 一种基于fpga的图像消旋器
CN112330544B (zh) * 2019-08-05 2024-02-09 浙江宇视科技有限公司 图像拖影的处理方法、装置、设备及介质
CN111479035B (zh) * 2020-04-13 2022-10-18 Oppo广东移动通信有限公司 图像处理方法、电子装置及计算机可读存储介质
CN116249936A (zh) * 2020-09-30 2023-06-09 富士胶片株式会社 摄像装置、摄像方法及摄像程序
JP7762723B2 (ja) 2021-03-08 2025-10-30 グーグル エルエルシー シンチレーションを小さくし、かつ、ディスプレイの領域を分離している境界の出現を少なくするための運動誘導ぼかし
CN116982075A (zh) * 2021-07-07 2023-10-31 三星电子株式会社 用于增强图像质量的方法和系统
CN114529555A (zh) * 2021-12-31 2022-05-24 红云红河烟草(集团)有限责任公司 一种基于图像识别的烟箱高效进出库检测方法
WO2024035223A1 (en) * 2022-08-11 2024-02-15 Samsung Electronics Co., Ltd. System and method for enhancing the quality of a video
CN116156079B (zh) * 2023-02-22 2025-12-09 维沃移动通信有限公司 视频处理方法、装置、设备和存储介质

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4106554B2 (ja) * 2003-09-08 2008-06-25 ソニー株式会社 撮影環境判定方法および撮像装置

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0638098A (ja) 1992-07-20 1994-02-10 Sony Corp ビデオカメラの信号処理回路
US5701163A (en) 1995-01-18 1997-12-23 Sony Corporation Video processing method and apparatus
US20040052425A1 (en) 2001-05-30 2004-03-18 Tetsujiro Kondo Image processing apparatus
JP2003006648A (ja) 2001-06-26 2003-01-10 Sony Corp 画像処理装置および方法、記録媒体、並びにプログラム
US20040066460A1 (en) 2001-06-26 2004-04-08 Tetsujiro Kondo Image processing apparatus and method, and image pickup apparatus
JP2004282318A (ja) 2003-03-14 2004-10-07 Sony Corp シーンチェンジ検出方法および装置
US20050093982A1 (en) 2003-10-31 2005-05-05 Sony Corporation Image pickup apparatus and method, image processing apparatus and method, image display system, recording medium and program
JP2005260928A (ja) 2004-02-13 2005-09-22 Sony Corp 画像処理装置と画像処理方法およびプログラム
JP2006081150A (ja) 2004-08-11 2006-03-23 Sony Corp 画像処理装置および方法、記録媒体、並びにプログラム
US20070070221A1 (en) 2004-08-11 2007-03-29 Sony Corporation Image processing apparatus and method, recording medium, and program
JP2007020140A (ja) 2004-12-07 2007-01-25 Sony Corp 画像処理装置および方法、記録媒体、並びにプログラム
WO2006068293A1 (ja) 2004-12-21 2006-06-29 Sony Corporation 画像処理装置と画像処理方法および画像処理プログラム
US7538794B2 (en) * 2005-01-31 2009-05-26 Hewlett-Packard Development Company, L.P. Method and apparatus for motion estimation in a digital imaging device
US7990429B2 (en) * 2005-10-28 2011-08-02 Nikon Corporation Imaging device with blur enhancement
WO2007114220A1 (ja) 2006-03-31 2007-10-11 Sony Corporation 画像処理装置、および画像処理方法、並びにコンピュータ・プログラム
JP2007274299A (ja) 2006-03-31 2007-10-18 Sony Corp 画像処理装置、および画像処理方法、並びにコンピュータ・プログラム
US20090102935A1 (en) * 2007-10-19 2009-04-23 Qualcomm Incorporated Motion assisted image sensor configuration

Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
B. Okumura et al., "Augmented Reality Based on Estimation of Defocusing and Motion Blurring from Captured Images", Proc. IEEE and ACM Int. Sympo. on Mixed Augmented Reality (ISMAR 06) (2006).
B. Okumura et al., "Photometric Registration Based on Defocus and Motion Blur Estimation for Augmented Reality", Transactions of the Institute of Electronics, Information and Communication Engineers D, vol. J90-D, No. 8, pp. 2126-2136.
Communication pursuant to Article 94(3) EPC, issued Mar. 3, 2011, in EP 09 251 344.9.
EPO Communication pursuant to Article 94(3), dated Mar. 12, 2015, for EP Application No. 09251344.9 (6 pages).
European Search Report in EP 09 25 1344, dated May 28, 2010.
Partial European Search Report in EP 09 25 1344, dated Nov. 6, 2009.
T. Saito et al., "Extension of Coupled Nonlinear Diffusion to Motion De-blurring-Introduction of Anisotropic Peaking", The Institute of Image Information and Television Engineers, vol. 58, No. 12, pp. 1839-1844 (2004).
T. Saito et al., "Model-Based Robust Variational Method for Motion De-Blurring", Proc. 14th European Signal Processes. Conf. (EUSIPCO 2006).
T. Saito et al., "Motion De-blurring Using a Blur Model", The Institute of Image Information and Television Engineers, vol. 59, No. 11, pp. 1714-1721 (2005).
Y. Kuroki, et al., "3.4: Improvement of Motion Image Quality by High Frame Rate", SID 06 Digest, 2006 SID International Symposium, Society for Information Display, vol. 37, pp. 14-17, XP007012613 (2005).

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220417434A1 (en) * 2021-06-29 2022-12-29 Canon Kabushiki Kaisha Correction control apparatus, image capturing apparatus, control method, and recording medium
US11956539B2 (en) * 2021-06-29 2024-04-09 Canon Kabushiki Kaisha Correction control apparatus, image capturing apparatus, control method, and recording medium

Also Published As

Publication number Publication date
CN101610349B (zh) 2012-08-08
EP2136554A3 (de) 2010-06-30
JP2010004329A (ja) 2010-01-07
CN101610349A (zh) 2009-12-23
EP2136554A2 (de) 2009-12-23
JP4666012B2 (ja) 2011-04-06
US20090316009A1 (en) 2009-12-24

Similar Documents

Publication Publication Date Title
US9270897B2 (en) Apparatus, method, and program for processing image
US8488009B2 (en) Image processing apparatus, image processing method, and program
RU2419243C1 (ru) Устройство и способ обработки изображений и устройство и способ отображения изображений
KR101442153B1 (ko) 저조도 영상 처리 방법 및 시스템
US8369649B2 (en) Image processing apparatus, image processing method, and computer program for performing super-resolution process
US7903148B2 (en) Apparatus, method, and computer program for processing image, and recording medium storing the computer program
US8233062B2 (en) Image processing apparatus, image processing method, and imaging apparatus
US7768551B2 (en) Method to stabilize digital video motion
US12125173B2 (en) Video denoising method and device, and computer readable storage medium
US9262811B2 (en) System and method for spatio temporal video image enhancement
US10672134B2 (en) Image processing apparatus, imaging apparatus, image processing method, and storage medium storing image processing program
US20110188583A1 (en) Picture signal conversion system
EP1984893A2 (de) Verfahren und vorrichtung zum modifizieren einer beweglichen bildsequenz
KR20090107911A (ko) 화상 처리 장치, 촬상 장치, 화상 처리 방법 및 프로그램
KR102106537B1 (ko) 하이 다이나믹 레인지 영상 생성 방법 및, 그에 따른 장치, 그에 따른 시스템
KR20140072386A (ko) 영상 처리 장치 및 방법
US8194141B2 (en) Method and apparatus for producing sharp frames with less blur
US20090086174A1 (en) Image recording apparatus, image correcting apparatus, and image sensing apparatus
JP2005150903A (ja) 画像処理装置、ノイズ除去方法及びノイズ除去プログラム
JP2012085205A (ja) 画像処理装置、撮像装置、画像処理方法および画像処理プログラム
JP5219771B2 (ja) 映像処理装置および映像処理装置の制御方法
JP2010015483A (ja) 画像処理装置、画像処理方法、及びプログラム
CN100407766C (zh) 用于处理图像的方法和设备
JP5506623B2 (ja) 映像処理装置及び映像処理方法
KR101415874B1 (ko) 동영상 획득 시스템 및 방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ITO, ATSUSHI;KOBAYASHI, SEIJI;REEL/FRAME:022853/0112;SIGNING DATES FROM 20090511 TO 20090514

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ITO, ATSUSHI;KOBAYASHI, SEIJI;SIGNING DATES FROM 20090511 TO 20090514;REEL/FRAME:022853/0112

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20200223