WO2003001453A1 - Image processing apparatus and method, and imaging apparatus - Google Patents
Image processing apparatus and method, and imaging apparatus
- Publication number
- WO2003001453A1 (PCT/JP2002/006178)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frame
- interest
- background
- foreground
- pixel
- Prior art date
Links
- 238000012545 processing Methods 0.000 title claims abstract description 191
- 238000000034 method Methods 0.000 title claims description 108
- 239000000203 mixture Substances 0.000 claims abstract description 337
- 238000000926 separation method Methods 0.000 claims abstract description 65
- 230000008569 process Effects 0.000 claims description 75
- 238000002156 mixing Methods 0.000 claims description 67
- 238000001514 detection method Methods 0.000 claims description 26
- 238000003384 imaging method Methods 0.000 claims description 12
- 230000000694 effects Effects 0.000 claims description 8
- 230000010354 integration Effects 0.000 claims description 7
- 238000003672 processing method Methods 0.000 claims description 4
- 238000010586 diagram Methods 0.000 description 117
- 230000003068 static effect Effects 0.000 description 38
- 230000002194 synthesizing effect Effects 0.000 description 30
- 230000014509 gene expression Effects 0.000 description 28
- 230000008859 change Effects 0.000 description 22
- 230000015572 biosynthetic process Effects 0.000 description 14
- 238000003786 synthesis reaction Methods 0.000 description 14
- 239000000284 extract Substances 0.000 description 13
- 230000006870 function Effects 0.000 description 10
- 238000000605 extraction Methods 0.000 description 7
- 238000012937 correction Methods 0.000 description 6
- 239000011159 matrix material Substances 0.000 description 5
- 230000004044 response Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 239000002131 composite material Substances 0.000 description 2
- 230000008030 elimination Effects 0.000 description 2
- 238000003379 elimination reaction Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 238000010408 sweeping Methods 0.000 description 2
- 238000009825 accumulation Methods 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 230000001934 delay Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 239000004615 ingredient Substances 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
Definitions
- Image processing apparatus and method and imaging apparatus
- The present invention relates to an image processing apparatus and method and to an imaging apparatus, and more particularly to an image processing apparatus, method, and imaging apparatus that take into account the difference between a signal detected by a sensor and the real world.
- Motion blur occurs when the moving speed of the object being imaged is relatively high.
- The present invention has been made in view of such a situation, and its object is to make it possible to separate a background image from an object image in accordance with their state of mixture.
- At a first time corresponding to one screen, an uncovered background area is specified in which a foreground object component constituting a foreground object of a frame of interest of the image data and a background object component constituting a background object are mixed.
- At a second time, a foreground area consisting only of foreground object components and a background area consisting only of background object components are specified.
- At a third time, a covered background area is specified in which the foreground object component and the background object component are mixed, formed on the front end side in the direction of motion of the foreground object.
- The apparatus includes: area specifying means for specifying these areas; mixing ratio detecting means for detecting a mixing ratio indicating the proportions in which the foreground object component and the background object component are mixed in the uncovered background area and the covered background area; and foreground/background separating means for separating, based on the mixing ratio, the pixel data of the pixels belonging to the uncovered background area and the covered background area into a foreground object component and a background object component, and generating a foreground component image consisting only of foreground object components and a background component image consisting only of background object components in the frame of interest.
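The separation performed by these means rests on a linear mixing model: an observed mixed pixel C in the covered or uncovered background area can be written as C = α·B + (1 − α)·F, where α is the mixing ratio, B the background object value, and F the foreground object value. The following Python sketch illustrates the per-pixel separation step only; the function name and scalar interface are illustrative assumptions, not the patent's actual means, which operate on whole images.

```python
def separate_mixed_pixel(c, b, alpha):
    """Split a mixed pixel value into its background and foreground
    object components, given the detected mixing ratio alpha.

    Model: c = alpha * b + (1 - alpha) * f,
    where b is the background value seen through the foreground.
    """
    background_component = alpha * b                  # portion contributed by the background
    foreground_component = c - background_component   # the remainder belongs to the foreground
    return foreground_component, background_component

# A pixel that is 30% background (b = 100) over a foreground of value 200:
c = 0.3 * 100 + 0.7 * 200   # observed mixed value, approximately 170
fg, bg = separate_mixed_pixel(c, 100, 0.3)
```

Reassembling `fg + bg` recovers the observed pixel value, which is what lets the foreground component image and background component image be generated without loss.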
- The area specifying means temporarily stores three consecutively input frames and, at the first time, identifies as an uncovered background area corresponding to the frame of interest an area that is determined to be in motion between the frame two frames before the frame of interest and the frame immediately before it, and determined to be still between the frame immediately before the frame of interest and the frame of interest.
- At the second time, an area determined to be in motion between the frame immediately before the frame of interest and the frame of interest, and in motion between the frame of interest and the frame immediately after it, is identified as a foreground area corresponding to the frame of interest.
- An area determined to be still between the frame immediately before the frame of interest and the frame of interest, and still between the frame of interest and the frame immediately after it, is identified as a background area corresponding to the frame of interest.
- At the third time, an area determined to be in motion between the frame of interest and the frame immediately after it, and still between the frame immediately after the frame of interest and the frame two frames after it, can be specified as a covered background area corresponding to the frame of interest.
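The still/motion determinations above can be illustrated with simple frame differencing. This is a hedged reconstruction: the `THRESHOLD` value and the per-pixel `is_still` test are assumptions for illustration, and a full implementation would apply the four masks in the determination order the patent describes.

```python
import numpy as np

THRESHOLD = 10  # assumed threshold on the absolute inter-frame difference

def is_still(frame_a, frame_b):
    """Per-pixel still/motion determination between two frames."""
    return np.abs(frame_a.astype(np.int64) - frame_b.astype(np.int64)) < THRESHOLD

def classify_regions(f_m2, f_m1, f0, f_p1, f_p2):
    """Classify pixels of the frame of interest f0.

    The unit itself holds a sliding window of three consecutive frames;
    for clarity, the five frames involved across the three determination
    times are passed at once (f_m2 = two frames before, ..., f_p2 = two
    frames after). Returns boolean masks for the four region types.
    """
    return {
        # motion before, then still into the frame of interest: background just uncovered
        "uncovered": ~is_still(f_m2, f_m1) & is_still(f_m1, f0),
        # motion on both sides of the frame of interest: pure foreground
        "foreground": ~is_still(f_m1, f0) & ~is_still(f0, f_p1),
        # still on both sides of the frame of interest: pure background
        "background": is_still(f_m1, f0) & is_still(f0, f_p1),
        # motion out of the frame of interest, then still: background being covered
        "covered": ~is_still(f0, f_p1) & is_still(f_p1, f_p2),
    }
```

For example, a pixel that is unchanged up to the frame of interest and then jumps in value is classified as covered background, matching the third-time determination above.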
- At the first time, an uncovered background area formed at the rear end of the foreground object in its movement direction is specified; after the first time, at the second time corresponding to one screen, the foreground area consisting only of foreground object components and the background area consisting only of background object components are specified; and after the second time, at the third time corresponding to one screen, the covered background area in which the foreground object component and the background object component are mixed is specified.
- A mixing ratio detection step detects a mixing ratio indicating the proportions in which the foreground object component and the background object component are mixed in the uncovered background area and the covered background area.
- Based on the mixing ratio, a foreground/background separation step separates the pixel data of the pixels belonging to the uncovered background area and the covered background area into a foreground object component and a background object component, and generates a foreground component image consisting only of foreground object components and a background component image consisting only of background object components in the frame of interest.
- The area identification step temporarily stores three consecutively input frames and, at the first time, identifies as an uncovered background area corresponding to the frame of interest an area determined to be in motion between the frame two frames before the frame of interest and the frame immediately before it, and still between the frame immediately before the frame of interest and the frame of interest. At the second time, an area determined to be in motion between the frame immediately before the frame of interest and the frame of interest, and in motion between the frame of interest and the frame immediately after it, is identified as a foreground area corresponding to the frame of interest.
- At the third time, an area determined to be in motion between the frame of interest and the frame immediately after it, and still between the frame immediately after the frame of interest and the frame two frames after it, can be specified as a covered background area corresponding to the frame of interest.
- At the first time corresponding to one screen, an area identification step specifies an uncovered background area, formed on the rear end side in the moving direction of the foreground object, in which a foreground object component constituting a foreground object of the target frame of the image data and a background object component constituting a background object are mixed.
- After the first time, at the second time corresponding to one screen, the foreground area consisting only of foreground object components and the background area consisting only of background object components are identified; and after the second time, at the third time corresponding to one screen, the covered background area, formed on the front end side in the direction of movement of the foreground object, in which the foreground object component and the background object component are mixed, is identified.
- A mixture ratio detection step detects a mixture ratio indicating the proportions in which the foreground object component and the background object component are mixed in the uncovered background area and the covered background area; and a foreground/background separation step separates, based on the mixture ratio, the pixel data of the pixels belonging to the uncovered background area and the covered background area into a foreground object component and a background object component, and generates a foreground component image consisting only of foreground object components and a background component image consisting only of background object components in the frame of interest.
- The area identification step temporarily stores three consecutively input frames and, at the first time, identifies as an uncovered background area corresponding to the frame of interest an area determined to be in motion between the frame two frames before the frame of interest and the frame immediately before it, and still between the frame immediately before the frame of interest and the frame of interest.
- At the second time, an area determined to be in motion between the frame immediately before the frame of interest and the frame of interest, and in motion between the frame of interest and the frame immediately after it, is identified as a foreground area corresponding to the frame of interest.
- An area determined to be still between the frame immediately before the frame of interest and the frame of interest, and still between the frame of interest and the frame immediately after it, is identified as a background area corresponding to the frame of interest.
- At the third time, an area determined to be in motion between the frame of interest and the frame immediately after it, and still between the frame immediately after the frame of interest and the frame two frames after it, can be specified as a covered background area corresponding to the frame of interest.
- The program according to the present invention causes a computer to specify, at the first time corresponding to one screen, an uncovered background area, formed at the rear end of the foreground object in its movement direction, in which the foreground object component constituting the foreground object of the frame of interest of the image data and the background object component constituting the background object are mixed.
- After the first time, at the second time corresponding to one screen, the foreground area consisting only of foreground object components and the background area consisting only of background object components are specified; and after the second time, at the third time corresponding to one screen, a covered background area, formed at the front end in the direction of movement of the foreground object, in which the foreground object component and the background object component are mixed, is specified.
- Based on the detected mixing ratio, the pixel data of the pixels belonging to the uncovered background area and the covered background area are separated into a foreground object component and a background object component, and a foreground component image consisting only of foreground object components and a background component image consisting only of background object components in the frame of interest are generated.
- The area identification step temporarily stores three consecutively input frames and, at the first time, identifies as an uncovered background area corresponding to the target frame an area determined to be in motion between the frame two frames before the target frame and the frame immediately before it, and still between the frame immediately before the target frame and the target frame.
- At the second time, an area determined to be in motion between the frame immediately before the frame of interest and the frame of interest, and in motion between the frame of interest and the frame immediately after it, is identified as a foreground area corresponding to the frame of interest.
- An area determined to be still between the frame immediately before the frame of interest and the frame of interest, and still between the frame of interest and the frame immediately after it, is identified as a background area corresponding to the frame of interest.
- At the third time, an area determined to be in motion between the frame of interest and the frame immediately after it, and still between the frame immediately after the frame of interest and the frame two frames after it, can be specified as a covered background area corresponding to the frame of interest.
- An imaging apparatus according to the present invention includes: imaging means for outputting a subject image captured by an imaging device having a predetermined number of pixels with a time integration effect, as image data comprising a predetermined number of pixel data; and area specifying means for specifying, at the first time, an uncovered background area, formed at the rear end of the foreground object in the movement direction, in which the foreground object component forming the foreground object of the frame of interest in the image data and the background object component forming the background object are mixed, and for specifying, after the first time, at the second time corresponding to one screen, a foreground area consisting only of foreground object components and a background area consisting only of background object components.
- The area specifying means also specifies a covered background area, formed on the front end side in the movement direction of the foreground object, in which the foreground object component and the background object component are mixed.
- The apparatus further includes mixture ratio detecting means for detecting a mixture ratio indicating the proportions in which the foreground object component and the background object component are mixed in the uncovered background area and the covered background area, and foreground/background separating means for separating, based on the mixture ratio, the pixel data of the pixels belonging to the uncovered background area and the covered background area into a foreground object component and a background object component.
- The area specifying means temporarily stores three consecutively input frames and, at the first time, identifies as an uncovered background area corresponding to the frame of interest an area determined to be in motion between the frame two frames before the frame of interest and the frame immediately before it, and still between the frame immediately before the frame of interest and the frame of interest.
- At the second time, an area determined to be in motion between the frame immediately before the frame of interest and the frame of interest, and in motion between the frame of interest and the frame immediately after it, is identified as a foreground area corresponding to the frame of interest.
- An area determined to be still between the frame immediately before the frame of interest and the frame of interest, and still between the frame of interest and the frame immediately after it, is identified as a background area corresponding to the frame of interest.
- At the third time, an area determined to be in motion between the frame of interest and the frame immediately after it, and still between the frame immediately after the frame of interest and the frame two frames after it, can be specified as a covered background area corresponding to the frame of interest.
- At the first time, an uncovered background area, formed on the rear end side in the movement direction of the foreground object, in which the foreground object component constituting the foreground object of the frame of interest of the image data and the background object component constituting the background object are mixed, is identified.
- After the first time, at the second time corresponding to one screen, the foreground area consisting only of foreground object components and the background area consisting only of background object components are identified; and after the second time, at the third time corresponding to one screen, the covered background area, formed on the front end side in the movement direction of the foreground object, in which the foreground object component and the background object component are mixed, is identified.
- In the uncovered background area and the covered background area, a mixing ratio indicating the proportions in which the foreground object component and the background object component are mixed is detected.
- Based on the mixing ratio, the pixel data of the pixels belonging to those areas are separated into a foreground object component and a background object component, and a foreground component image consisting only of foreground object components and a background component image consisting only of background object components in the frame of interest are generated.
- FIG. 1 is a diagram showing an embodiment of an image processing apparatus according to the present invention.
- FIG. 2 is a block diagram illustrating the image processing apparatus.
- FIG. 3 is a diagram illustrating imaging by a sensor.
- FIG. 4 is a diagram illustrating the arrangement of pixels.
- FIG. 5 is a diagram illustrating the operation of the detection element.
- FIG. 6A is a diagram illustrating an image obtained by capturing an object corresponding to a moving foreground and an object corresponding to a stationary background.
- FIG. 6B is a diagram illustrating a model corresponding to an image obtained by capturing an object corresponding to a moving foreground and an object corresponding to a stationary background.
- FIG. 7 is a diagram illustrating a background area, a foreground area, a mixed area, a covered background area, and an uncovered background area.
- FIG. 8 is a model diagram in which the pixel values of pixels arranged adjacently in one line of an image of an object corresponding to a stationary foreground and an object corresponding to a stationary background are expanded in the time direction.
- FIG. 9 is a model diagram in which pixel values are expanded in the time direction and the period corresponding to the shutter time is divided.
- FIG. 10 is a model diagram in which pixel values are developed in the time direction and the period corresponding to the shutter time is divided.
- FIG. 11 is a model diagram in which pixel values are developed in the time direction and the period corresponding to the shutter time is divided.
- FIG. 12 is a diagram illustrating an example in which pixels in a foreground area, a background area, and a mixed area are extracted.
- FIG. 13 is a diagram showing the correspondence between pixels and a model in which pixel values are expanded in the time direction.
- FIG. 14 is a model diagram in which pixel values are expanded in the time direction and the period corresponding to the shutter time is divided.
- FIG. 15 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 16 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 17 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 18 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 19 is a flowchart illustrating the process of adjusting the amount of motion blur.
- FIG. 20 is a block diagram showing the configuration of the area specifying unit 103.
- FIG. 21 is a diagram illustrating an image when an object corresponding to the foreground is moving.
- FIG. 22 is a model diagram in which pixel values are developed in the time direction and a period corresponding to a shutter time is divided.
- FIG. 23 is a model diagram in which pixel values are developed in the time direction and a period corresponding to a shutter time is divided.
- FIG. 24 is a model diagram in which pixel values are developed in the time direction and the period corresponding to the shutter time is divided.
- FIG. 25 is a diagram for explaining conditions for region determination.
- FIG. 26 is a diagram illustrating an example of a result of specifying an area by the area specifying unit 103.
- FIG. 27 is a diagram illustrating an example of a result of specifying an area by the area specifying unit 103.
- FIG. 28 is a flowchart illustrating the area identification processing.
- FIG. 29 is a flowchart illustrating the area specifying process.
- FIG. 30 is a block diagram showing an example of the configuration of the mixture ratio calculating section 104.
- FIG. 31 is a diagram illustrating an example of an ideal mixture ratio.
- FIG. 32 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 33 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 34 is a diagram for explaining the approximation using the correlation of the foreground components.
- FIG. 35 is a diagram illustrating the relationship between C, N, and P.
- FIG. 36 is a block diagram illustrating a configuration of the estimated mixture ratio processing unit 401.
- FIG. 37 is a diagram illustrating an example of the estimated mixture ratio.
- FIG. 38 is a block diagram showing another configuration of the mixture ratio calculation unit 104.
- FIG. 39 is a flowchart for explaining the process of calculating the mixture ratio.
- FIG. 40 is a flowchart illustrating a process of calculating the estimated mixture ratio.
- FIG. 41 is a diagram illustrating a straight line that approximates the mixture ratio.
- FIG. 42 is a diagram illustrating a plane approximating the mixture ratio.
- FIG. 43 is a diagram for explaining the correspondence of pixels in a plurality of frames when calculating the mixture ratio.
- FIG. 44 is a block diagram illustrating another configuration of the mixture ratio estimation processing unit 401.
- FIG. 45 is a diagram illustrating an example of the estimated mixture ratio.
- FIG. 46 is a flowchart illustrating a process of estimating a mixture ratio using a model corresponding to a covered background area.
- FIG. 47 is a block diagram illustrating an example of the configuration of the foreground / background separation unit 105.
- FIG. 48A is a diagram showing an input image, a foreground component image, and a background component image.
- FIG. 48B is a diagram illustrating a model of the input image, the foreground component image, and the background component image.
- FIG. 49 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 50 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 51 is a model diagram in which pixel values are developed in the time direction and a period corresponding to the shutter time is divided.
- FIG. 52 is a block diagram illustrating an example of the configuration of the separation unit 600.
- FIG. 53A is a diagram illustrating an example of a separated foreground component image.
- FIG. 53B is a diagram illustrating an example of the separated background component image.
- FIG. 54 is a flowchart illustrating the process of separating the foreground and the background.
- FIG. 55 is a block diagram showing an example of the configuration of the motion-blur adjusting unit 106.
- FIG. 56 is a diagram for explaining a processing unit.
- FIG. 57 is a model diagram in which the pixel values of the foreground component image are developed in the time direction and the period corresponding to the shutter time is divided.
- FIG. 58 is a model diagram in which the pixel values of the foreground component image are developed in the time direction, and the period corresponding to the shutter time is divided.
- FIG. 59 is a model diagram in which the pixel values of the foreground component image are developed in the time direction, and the period corresponding to the shutter time is divided.
- FIG. 60 is a model diagram in which the pixel values of the foreground component image are developed in the time direction, and the period corresponding to the shutter time is divided.
- FIG. 61 is a diagram illustrating another configuration of the motion-blur adjusting unit 106.
- FIG. 62 is a flowchart illustrating a process of adjusting the amount of motion blur included in the foreground component image by the motion blur adjustment unit 106.
- FIG. 63 is a block diagram illustrating another example of the configuration of the motion-blur adjusting unit 106.
- FIG. 64 is a diagram illustrating an example of a model that specifies a correspondence between a pixel value and a foreground component.
- FIG. 65 is a diagram for explaining the calculation of the foreground component.
- FIG. 66 is a diagram for explaining calculation of a foreground component.
- FIG. 67 is a flowchart for explaining the processing for removing motion blur in the foreground.
- FIG. 68 is a block diagram illustrating another configuration of the functions of the image processing apparatus.
- FIG. 69 is a diagram illustrating a configuration of the combining unit 1001.
- FIG. 70 is a block diagram showing still another configuration of the functions of the image processing apparatus.
- FIG. 71 is a block diagram showing the configuration of the mixture ratio calculation unit 111.
- FIG. 72 is a block diagram illustrating a configuration of the foreground / background separation unit 1102.
- FIG. 73 is a block diagram illustrating still another configuration of the functions of the image processing apparatus.
- FIG. 74 is a diagram showing a configuration of the synthesizing unit 1221. BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 is a diagram showing an embodiment of an image processing apparatus according to the present invention.
- the CPU (Central Processing Unit) 21 executes various processes according to a program stored in a ROM (Read Only Memory) 22 or a storage unit 28.
- A RAM (Random Access Memory) 23 stores, as appropriate, programs executed by the CPU 21 and data.
- ROM 22 and RAM 23 are interconnected by a bus 24.
- the CPU 21 is also connected with an input / output interface 25 via a bus 24.
- the input / output interface 25 is connected to an input unit 26 including a keyboard, a mouse, and a microphone, and an output unit 27 including a display, a speaker, and the like.
- the CPU 21 executes various processes in response to a command input from the input unit 26. Then, the CPU 21 outputs an image, a sound, or the like obtained as a result of the processing to the output unit 27.
- The storage unit 28 connected to the input/output interface 25 is composed of, for example, a hard disk, and stores programs executed by the CPU 21 and various data. The communication unit 29 communicates with external devices via the Internet and other networks.
- The communication unit 29 also functions as an acquisition unit that captures the output of the sensor.
- Programs may be acquired via the communication unit 29 and stored in the storage unit 28.
- the drive 30 connected to the input/output interface 25 drives the magnetic disk 51, the optical disk 52, the magneto-optical disk 53, or the semiconductor memory 54 when one of them is mounted, and acquires the programs, data, and the like recorded there.
- the acquired programs and data are transferred to and stored in the storage unit 28 as necessary.
- FIG. 2 is a block diagram showing the image processing apparatus.
- it does not matter whether each function of the image processing apparatus is implemented by hardware or by software. That is, each block diagram in this specification may be regarded either as a hardware block diagram or as a software functional block diagram.
- an image to be imaged which corresponds to an object in the real world, is called an image object.
- the input image supplied to the image processing apparatus is supplied to an object extraction unit 101, a region identification unit 103, a mixture ratio calculation unit 104, and a foreground/background separation unit 105.
- the object extraction unit 101 roughly extracts an image object corresponding to the foreground object included in the input image, and supplies the extracted image object to the motion detection unit 102.
- for example, the object extracting unit 101 roughly extracts the image object corresponding to the foreground object by detecting the contour of the image object corresponding to the foreground object included in the input image.
- the object extraction unit 101 also roughly extracts an image object corresponding to a background object included in the input image, and supplies the extracted image object to the motion detection unit 102. For example, the object extraction unit 101 roughly extracts the image object corresponding to the background object from the difference between the input image and the extracted image object corresponding to the foreground object.
- further, for example, the object extraction unit 101 may roughly extract the image object corresponding to the foreground object and the image object corresponding to the background object from the difference between a background image stored in an internally provided background memory and the input image.
- the motion detection unit 102 calculates the motion vector of the image object corresponding to the coarsely extracted foreground object, for example, by a method such as a block matching method, a gradient method, a phase correlation method, and a pel recursive method.
- the calculated motion vector and the position information of the motion vector are supplied to the motion-blur adjusting unit 106.
- the motion vector output by the motion detection unit 102 includes information corresponding to the motion amount V.
- the motion detection unit 102 may output a motion vector for each image object to the motion blur adjustment unit 106 together with pixel position information for specifying a pixel in the image object.
- the motion amount V is a value that represents the change in position of an image corresponding to a moving object, in units of the pixel interval. For example, when the image of the object corresponding to the foreground moves so as to be displayed four pixels away in the next frame relative to a certain frame, the motion amount V of the image of that object is set to 4.
- the object extraction unit 101 and the motion detection unit 102 are necessary when adjusting the amount of motion blur corresponding to a moving object.
- the area specifying unit 103 specifies each pixel of the input image as belonging to one of a foreground area, a background area, and a mixed area, and supplies information indicating, for each pixel, which of these areas it belongs to (hereinafter referred to as area information) to the mixture-ratio calculator 104, the foreground/background separation unit 105, and the motion-blur adjusting unit 106.
- based on the input image and the area information supplied from the area identification unit 103, the mixture-ratio calculator 104 calculates a mixture ratio corresponding to the pixels contained in the mixed area (hereinafter simply referred to as the mixture ratio), and supplies the calculated mixture ratio to the foreground/background separation unit 105.
- the mixture ratio is a value indicating a ratio of an image component corresponding to a background object (hereinafter, also referred to as a background component) in a pixel value, as shown in Expression (3) described below.
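To make this definition concrete, the following sketch (illustrative Python; the linear model, function names, and numeric values are assumptions of this example, not text from the specification) treats a mixed-area pixel value M as a background portion weighted by the mixture ratio plus the sum of the foreground components:

```python
# Illustrative model of a mixed-area pixel (an assumption of this sketch):
#   M = alpha * B + f
# where alpha is the mixture ratio (the fraction of background components
# in the pixel value), B is the background pixel value, and f is the sum
# of the foreground components.

def mix(alpha, background, foreground_sum):
    """Form a mixed pixel value from its background and foreground parts."""
    return alpha * background + foreground_sum

def recover_foreground(mixed, alpha, background):
    """Recover the foreground contribution once alpha is known."""
    return mixed - alpha * background

M = mix(0.25, 100.0, 60.0)                   # mixed pixel: 1/4 background
print(M)                                     # 85.0
print(recover_foreground(M, 0.25, 100.0))    # 60.0
```

Once the mixture ratio of a pixel is known, the foreground contribution follows by subtracting the weighted background, which is the role the mixture ratio plays in the separation described next.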
- based on the area information supplied from the area identification unit 103 and the mixture ratio supplied from the mixture ratio calculation unit 104, the foreground/background separation unit 105 separates the input image into a foreground component image consisting only of the image components corresponding to the foreground object (the foreground components) and a background component image consisting only of the background components, and supplies the foreground component image to the motion-blur adjusting unit 106 and the selecting unit 107.
- it is also conceivable to use the separated foreground component image as the final output. Compared with the conventional method of specifying only a foreground and a background without considering the mixed area, a more accurate foreground and background can be obtained.
- the motion-blur adjusting unit 106 determines a processing unit indicating one or more pixels included in the foreground component image, based on the motion amount V known from the motion vector and on the area information.
- the processing unit is data that specifies a group of pixels to be processed for adjusting the amount of motion blur.
- based on the motion-blur adjustment amount input to the image processing apparatus, the foreground component image supplied from the foreground/background separation unit 105, the motion vector and its position information supplied from the motion detection unit 102, and the processing unit, the motion blur adjustment unit 106 adjusts the amount of motion blur included in the foreground component image, for example by removing the motion blur, reducing its amount, or increasing its amount.
- the foreground component image in which the amount of motion blur has been adjusted is output to the selecting unit 107.
- the motion vector and its position information may not be used.
- the motion blur refers to a distortion included in an image corresponding to a moving object, which is caused by the movement of the object to be imaged in the real world and by the imaging characteristics of the sensor.
- based on a selection signal corresponding to the user's selection, the selection unit 107 selects either the foreground component image supplied from the foreground/background separation unit 105 or the foreground component image whose motion-blur amount has been adjusted, supplied from the motion blur adjustment unit 106, and outputs the selected foreground component image.
- FIG. 3 is a diagram illustrating imaging by a sensor.
- the sensor is constituted by, for example, a CCD video camera equipped with a CCD (Charge-Coupled Device) area sensor which is a solid-state image sensor.
- the object 111 corresponding to the foreground in the real world moves horizontally, for example from the left side to the right side in the figure, between the object 112 corresponding to the background and the sensor.
- the sensor images the object 111 corresponding to the foreground together with the object 112 corresponding to the background.
- the sensor outputs the captured image in units of one frame.
- for example, the sensor outputs an image composed of 30 frames per second.
- the exposure time of the sensor can be 1/30 second.
- the exposure time is the period from when the sensor starts converting the input light into electric charge until the sensor finishes converting the input light into electric charge.
- the exposure time is also referred to as the shutter time.
- FIG. 4 is a diagram illustrating the arrangement of pixels.
- A through I indicate individual pixels.
- the pixels are arranged on a plane corresponding to the image.
- One detection element corresponding to one pixel is arranged on the sensor.
- when the sensor captures an image, each detection element outputs a pixel value corresponding to one pixel forming the image.
- the position of the detection element in the X direction corresponds to the position in the horizontal direction on the image, and the position of the detection element in the Y direction corresponds to the position in the vertical direction on the image.
- a detection element such as a CCD converts input light into electric charges and accumulates the converted electric charges for a period corresponding to the shutter time.
- the amount of charge is almost proportional to the intensity of the input light and the time the light is input.
- during the period corresponding to the shutter time, the detection element adds the charge converted from the input light to the charge already accumulated. That is, the detection element integrates the input light over the period corresponding to the shutter time, and accumulates an amount of charge corresponding to the integrated light. It can be said that the detection element has an integration effect with respect to time.
- the electric charge accumulated in the detection element is converted into a voltage value by a circuit (not shown), and the voltage value is further converted into a pixel value such as digital data and output. Therefore, each pixel value output from the sensor is a value obtained by projecting a spatially extended portion of the object corresponding to the foreground or the background onto a one-dimensional space, as the result of integration with respect to the shutter time.
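The integration effect can be illustrated with a small simulation (a hypothetical Python sketch with made-up values; the specification describes the sensor itself, not this code). A one-dimensional foreground block moves v = 4 pixels per frame over a static background; splitting the shutter time into v phases and averaging what each pixel sees during them reproduces pure-foreground, pure-background, and mixed pixels on one line:

```python
v = 4                                        # motion amount: pixels per frame
background = [10.0] * 8                      # static background line
foreground = {0: 50.0, 1: 50.0, 2: 50.0, 3: 50.0}   # position -> foreground value

def render(background, foreground, v):
    """Integrate one line over the shutter time, split into v phases.

    In each phase the foreground has shifted one more pixel to the right,
    so a pixel accumulates foreground light in some phases and background
    light in the others; mixed pixels arise at the object's edges.
    """
    line = []
    for x in range(len(background)):
        acc = 0.0
        for phase in range(v):
            fg_pos = x - phase               # foreground coordinate seen at x
            acc += foreground.get(fg_pos, background[x]) / v
        line.append(acc)
    return line

line = render(background, foreground, v)
print(line)   # [20.0, 30.0, 40.0, 50.0, 40.0, 30.0, 20.0, 10.0]
```

Pixel 3 is pure foreground (50.0), pixel 7 is pure background (10.0), and the pixels in between mix the two in proportions fixed by how many of the v phases the foreground covered them, which is exactly the mixing the following sections model.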
- the image processing device extracts significant information buried in the output signal by the accumulation operation of the sensor, for example, the mixture ratio.
- the image processing apparatus adjusts the amount of distortion caused by the mixing of the foreground image object itself, for example, the amount of motion blur. Further, the image processing apparatus adjusts the amount of distortion caused by the mixing of the foreground image object and the background image object.
- FIG. 6A is a diagram illustrating an image obtained by capturing an object corresponding to a moving foreground and an object corresponding to a stationary background.
- FIG. 6B is a diagram illustrating a model corresponding to an image obtained by capturing an object corresponding to a moving foreground and an object corresponding to a stationary background.
- FIG. 6A shows an image obtained by capturing an object corresponding to a moving foreground and an object corresponding to a stationary background.
- the object corresponding to the foreground is moving horizontally from left to right with respect to the screen.
- FIG. 6B is a model diagram in which pixel values corresponding to one line of the image shown in FIG. 6A are expanded in the time direction.
- the horizontal direction in FIG. 6B corresponds to the spatial direction X in FIG. 6A.
- the pixels in the background area are composed only of background components, that is, of the image components corresponding to the background object.
- the pixels in the foreground area are composed of only the components of the foreground, that is, the components of the image corresponding to the foreground object.
- the pixel value of a pixel in the mixed area is composed of a background component and a foreground component. Since the pixel value is composed of the background component and the foreground component, the mixed region can be said to be a distorted region.
- the mixed area is further classified into a covered background area and an uncovered background area.
- the covered background area is the mixed area at the position corresponding to the front end of the foreground object in its traveling direction, and is an area where the background components are covered by the foreground with the passage of time.
- the uncovered background area is the mixed area at the position corresponding to the rear end of the foreground object in its traveling direction, and refers to an area where the background components appear with the passage of time.
- FIG. 7 is a diagram illustrating the background area, foreground area, mixed area, covered background area, and uncovered background area described above.
- the background area is a stationary part
- the foreground area is a moving part
- the covered background area of the mixed area is a part that changes from the background to the foreground
- the uncovered background area of the mixed area is the part that changes from the foreground to the background.
- Figure 8 is a model diagram in which the pixel values of adjacent pixels arranged in a row in an image of an object corresponding to a stationary foreground and an object corresponding to a stationary background are expanded in the time direction. For example, pixels arranged on one line of the screen can be selected as the pixels adjacently arranged in one column.
- the pixel values of F01 to F04 shown in FIG. 8 are the pixel values of the pixels corresponding to the stationary foreground object.
- the pixel values B01 to B04 shown in FIG. 8 are the pixel values of the pixels corresponding to the stationary background object.
- the vertical direction in FIG. 8 corresponds to time, and time elapses from top to bottom in the figure.
- the position of the upper side of the rectangles in FIG. 8 corresponds to the time at which the sensor starts converting the input light into electric charge, and the position of the lower side of the rectangles in FIG. 8 corresponds to the time at which the sensor finishes converting the input light into electric charge.
- the horizontal direction in FIG. 8 corresponds to the spatial direction X described in FIG. 6A. More specifically, in the example shown in FIG. 8, the distance from the left side of the rectangle indicated by “F01” in FIG. 8 to the right side of the rectangle indicated by “B04” is eight times the pixel pitch. That is, it corresponds to the interval between eight consecutive pixels.
- the light input to the sensor does not change during the period corresponding to the shutter time.
- the period corresponding to the shutter time is divided into two or more periods of the same length.
- for example, when the number of virtual divisions is 4, the model diagram shown in FIG. 8 can be represented as shown in FIG. 9.
- the number of virtual divisions is set in accordance with the motion amount V of the object corresponding to the foreground within the shutter time.
- the number of virtual divisions is set to 4 corresponding to the motion amount V of 4, and the period corresponding to the shutter time is divided into four.
- the top row in the figure corresponds to the first divided period after the shutter opens.
- the second row from the top corresponds to the second divided period after the shutter opens.
- the third row from the top corresponds to the third divided period after the shutter opens.
- the fourth row from the top corresponds to the fourth divided period after the shutter opens.
- the shutter time divided according to the motion amount V is hereinafter also referred to as shutter time /v.
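As a numeric illustration of this division (hypothetical values; Python is used only for the arithmetic), a static pixel value F simply splits into v equal shutter-time/v components, which is the relation the component values listed next express:

```python
def split_components(pixel_value, v):
    """Return the v equal shutter-time/v components of a static pixel."""
    return [pixel_value / v] * v

# With motion amount v = 4, a pixel value of 80 yields four components of 20.
parts = split_components(80.0, 4)
print(parts)        # [20.0, 20.0, 20.0, 20.0]
print(sum(parts))   # 80.0 -- the components sum back to the pixel value
```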
- the foreground component F01/v is equal to the value obtained by dividing the pixel value F01 by the number of virtual divisions.
- similarly, the foreground component F02/v is equal to the value obtained by dividing the pixel value F02 by the number of virtual divisions,
- the foreground component F03/v is equal to the value obtained by dividing the pixel value F03 by the number of virtual divisions,
- and the foreground component F04/v is equal to the value obtained by dividing the pixel value F04 by the number of virtual divisions.
- the background component B01/v is equal to the value obtained by dividing the pixel value B01 by the number of virtual divisions.
- similarly, the background component B02/v is equal to the value obtained by dividing the pixel value B02 by the number of virtual divisions,
- the background component B03/v is equal to the value obtained by dividing the pixel value B03 by the number of virtual divisions,
- and the background component B04/v is equal to the value obtained by dividing the pixel value B04 by the number of virtual divisions.
- FIG. 10 is a model diagram in which the pixel values of the pixels on one line, including the covered background area, are expanded in the time direction when the object corresponding to the foreground moves toward the right side in the figure.
- the foreground motion amount V is 4. Since one frame is a short time, it can be assumed that the object corresponding to the foreground is rigid and moves at a constant speed.
- the image of the object corresponding to the foreground moves so as to be displayed four pixels to the right in the next frame with respect to a certain frame.
- the leftmost pixel to the fourth pixel from the left belong to the foreground area.
- the fifth through seventh pixels from the left belong to the mixed area that is the covered background area.
- the rightmost pixel belongs to the background area.
- the components included in the pixel values of the pixels belonging to the covered background area change from background components to foreground components at some point in the period corresponding to the shutter time.
- a pixel value M indicated by a thick line frame in FIG. 10 is represented by Expression (1).
- the fifth pixel from the left contains the background component corresponding to one shutter time /v and the foreground components corresponding to three shutter times /v, so the mixture ratio of the fifth pixel from the left is 1/4.
- the sixth pixel from the left contains the background components corresponding to two shutter times /v and the foreground components corresponding to two shutter times /v, so the mixture ratio of the sixth pixel from the left is 1/2.
- the seventh pixel from the left contains the background components corresponding to three shutter times /v and the foreground component corresponding to one shutter time /v, so the mixture ratio of the seventh pixel from the left is 3/4.
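The three ratios above follow a single pattern, sketched here as a hedged illustration (the counting rule is an inference from the model, not wording from the specification): with motion amount v, the i-th covered-background pixel counted from the foreground side contains i background phases out of v:

```python
v = 4   # motion amount and number of virtual divisions

# mixture ratio (background fraction) of each mixed pixel, counted from
# the foreground side of the covered background area
covered_ratios = [i / v for i in range(1, v)]
print(covered_ratios)   # [0.25, 0.5, 0.75] -- i.e. 1/4, 1/2, 3/4
```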
- since it can be assumed that the object corresponding to the foreground is rigid and moves at a constant speed so that its image is displayed four pixels to the right in the next frame, for example, the foreground component F07/v of the fourth pixel from the left in Fig. 10 for the first shutter time /v after the shutter opens is equal to the foreground component of the fifth pixel from the left in Fig. 10 corresponding to the second shutter time /v after the shutter opens.
- similarly, the foreground component F07/v is equal to the foreground component of the sixth pixel from the left in Fig. 10 corresponding to the third shutter time /v after the shutter opens, and to the foreground component of the seventh pixel from the left in Fig. 10 corresponding to the fourth shutter time /v after the shutter opens.
- the foreground component F06/v of the third pixel from the left in Fig. 10 for the first shutter time /v after the shutter opens is equal to the foreground component of the fourth pixel from the left in Fig. 10 corresponding to the second shutter time /v after the shutter opens.
- the foreground component F06/v is likewise equal to the foreground component of the fifth pixel from the left in Fig. 10 corresponding to the third shutter time /v after the shutter opens, and to the foreground component of the sixth pixel from the left corresponding to the fourth shutter time /v after the shutter opens.
- the foreground component F05/v of the second pixel from the left in Fig. 10 for the first shutter time /v after the shutter opens is equal to the foreground component of the third pixel from the left corresponding to the second shutter time /v after the shutter opens.
- the foreground component F05/v is likewise equal to the foreground component of the fourth pixel from the left in Fig. 10 corresponding to the third shutter time /v after the shutter opens, and to the foreground component of the fifth pixel from the left corresponding to the fourth shutter time /v after the shutter opens.
- since it can be assumed that the object corresponding to the foreground is rigid and its image moves at a constant speed so as to be displayed four pixels to the right in the next frame, the foreground component F04/v of the leftmost pixel in Fig. 10 for the first shutter time /v after the shutter opens is equal to the foreground component of the second pixel from the left corresponding to the second shutter time /v after the shutter opens.
- the foreground component F04/v is likewise equal to the foreground component of the third pixel from the left in Fig. 10 corresponding to the third shutter time /v after the shutter opens, and to the foreground component of the fourth pixel from the left corresponding to the fourth shutter time /v after the shutter opens.
- since the foreground area corresponding to the moving object includes motion blur in this way, it can also be said to be a distorted area.
- FIG. 11 is a model diagram in which pixel values of pixels on one line, including the uncovered background area, are developed in the time direction when the foreground moves toward the right side in the figure.
- the motion amount V of the foreground is 4. Since one frame is short, it can be assumed that the object corresponding to the foreground is rigid and moves at a constant speed.
- the image of the object corresponding to the foreground moves to the right by four pixels in the next frame with respect to a certain frame.
- the leftmost pixel through the fourth pixel from the left belong to the background area.
- the fifth through seventh pixels from the left belong to the mixed area that is the uncovered background area.
- the rightmost pixel belongs to the foreground area.
- at some point during the period corresponding to the shutter time, the components contained in the pixel values of the pixels belonging to the uncovered background area change from foreground components to background components.
- a pixel value M ′ with a thick frame in FIG. 11 is represented by Expression (2).
- the fifth pixel from the left contains the background components corresponding to three shutter times /v and the foreground component corresponding to one shutter time /v, so the mixture ratio of the fifth pixel from the left is 3/4.
- the sixth pixel from the left contains the background components corresponding to two shutter times /v and the foreground components corresponding to two shutter times /v, so the mixture ratio of the sixth pixel from the left is 1/2.
- the seventh pixel from the left contains the background component corresponding to one shutter time /v and the foreground components corresponding to three shutter times /v, so the mixture ratio of the seventh pixel from the left is 1/4.
- in Expression (2), B denotes a pixel value of the background, and Fi/v denotes a foreground component.
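The uncovered-background ratios mirror the covered case. In this hedged sketch (the counting rule is an inference from the model, not wording from the specification), the i-th mixed pixel counted from the background side of the area contains v - i background phases out of v:

```python
v = 4   # motion amount and number of virtual divisions

# mixture ratio (background fraction) of each mixed pixel, counted from
# the background side of the uncovered background area
uncovered_ratios = [(v - i) / v for i in range(1, v)]
print(uncovered_ratios)   # [0.75, 0.5, 0.25] -- i.e. 3/4, 1/2, 1/4
```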
- since it can be assumed that the object corresponding to the foreground is a rigid body moving at a constant speed with a motion amount V of 4, for example, the foreground component F01/v of the fifth pixel from the left in Fig. 11 for the first shutter time /v after the shutter opens is equal to the foreground component of the sixth pixel from the left in Fig. 11 corresponding to the second shutter time /v after the shutter opens. Similarly, F01/v is equal to the foreground component of the seventh pixel from the left in Fig. 11 corresponding to the third shutter time /v after the shutter opens, and to the foreground component of the eighth pixel from the left in Fig. 11 corresponding to the fourth shutter time /v after the shutter opens.
- likewise, the foreground component F02/v of the sixth pixel from the left in Fig. 11 for the first shutter time /v after the shutter opens is equal to the foreground component of the seventh pixel from the left in Fig. 11 corresponding to the second shutter time /v after the shutter opens.
- the foreground component F02/v is equal to the foreground component of the eighth pixel from the left in Fig. 11 corresponding to the third shutter time /v after the shutter opens.
- since the object corresponding to the foreground is a rigid body moving at a constant speed and the motion amount V is 4, for example, the foreground component F03/v of the seventh pixel from the left in Fig. 11 for the first shutter time /v after the shutter opens is equal to the foreground component of the eighth pixel from the left in Fig. 11 corresponding to the second shutter time /v after the shutter opens.
- the number of virtual divisions has been described as four, but the number of virtual divisions corresponds to the amount of motion V.
- the motion amount V generally corresponds to the moving speed of the object corresponding to the foreground. For example, when the object corresponding to the foreground moves so as to be displayed four pixels to the right in the next frame relative to a certain frame, the motion amount V is set to 4, and correspondingly the number of virtual divisions is 4. Similarly, when the object corresponding to the foreground moves so as to be displayed six pixels to the left in the next frame relative to a certain frame, the motion amount V is set to 6, and the number of virtual divisions is 6.
- Figures 12 and 13 show the relationship between the foreground area, the background area, and the mixed area consisting of the covered background area or the uncovered background area described above, and the foreground components and background components corresponding to the divided shutter times.
- Figure 12 shows an example of extracting pixels in the foreground, background, and mixed regions from an image containing the foreground corresponding to an object moving in front of a stationary background.
- the object corresponding to the foreground, indicated by A, is moving horizontally with respect to the screen.
- frame #n+1 is the frame following frame #n, and frame #n+2 is the frame following frame #n+1.
- FIG. 13 shows a model in which the pixels of the foreground area, the background area, and the mixed area extracted from the image shown in FIG. 12 are expanded in the time direction.
- since the object corresponding to the foreground moves, the pixel value of a pixel in the foreground area is composed of four different foreground components corresponding to the shutter time /v periods.
- for example, the leftmost pixel of the pixels in the foreground area shown in FIG. 13 consists of F01/v, F02/v, F03/v, and F04/v. That is, the pixels in the foreground area include motion blur.
- the light corresponding to the background input to the sensor does not change during the period corresponding to the shutter time.
- the pixel value in the background area does not include motion blur.
- the pixel values of the pixels belonging to the mixed area consisting of the covered background area or the uncovered background area are composed of foreground components and background components.
- next, a model is described in which the pixel values of pixels that are adjacent to one another, arranged in one row, and located at the same position across a plurality of frames are expanded in the time direction.
- for example, when the image corresponding to the object is moving horizontally with respect to the screen, the pixels arranged on one line of the screen can be selected as the pixels arranged adjacently in one row.
- Fig. 14 is a model diagram in which the pixel values of pixels that are adjacent to each other, arranged in one row in three frames of an image obtained by capturing an object corresponding to a stationary background, and located at the same position on the frames are expanded in the time direction.
- frame #n is the frame following frame #n-1, and frame #n+1 is the frame following frame #n.
- Other frames are similarly referred to.
- the pixel values B01 through B12 shown in FIG. 14 are the pixel values of the pixels corresponding to the stationary background object. Since the object corresponding to the background is stationary, the pixel values of the corresponding pixels do not change in frames #n-1 through #n+1. For example, the pixel in frame #n and the pixel in frame #n+1 located at the position corresponding to the pixel having the pixel value B05 in frame #n-1 each have the pixel value B05.
- FIG. 15 is a model diagram in which the pixel values of pixels that are adjacent to each other, arranged in one row in three frames of an image obtained by capturing an object corresponding to the foreground moving to the right in the figure together with an object corresponding to the stationary background, and located at the same position on the frames are expanded in the time direction.
- the model shown in FIG. 15 includes a covered background region.
- in Fig. 15, it can be assumed that the object corresponding to the foreground is a rigid body moving at a constant speed, and the foreground image moves so as to be displayed four pixels to the right in the next frame; therefore, the motion amount V is 4 and the number of virtual divisions is 4.
- the foreground component of the leftmost pixel of frame #n-1 in Fig. 15 for the first shutter time /v after the shutter opens is F12/v, and the foreground component of the second pixel from the left in Fig. 15 for the second shutter time /v after the shutter opens is also F12/v.
- the foreground component of the third pixel from the left in Fig. 15 for the third shutter time /v after the shutter opens, and the foreground component of the fourth pixel from the left for the fourth shutter time /v after the shutter opens, are F12/v.
- the foreground component of the leftmost pixel of frame #n-1 in Fig. 15 for the second shutter time /v after the shutter opens is F11/v, and the foreground component of the second pixel from the left in Fig. 15 for the third shutter time /v after the shutter opens is also F11/v.
- the foreground component of the third pixel from the left in Fig. 15 for the fourth shutter time /v after the shutter opens is F11/v.
- the foreground component of the leftmost pixel of frame #n-1 in Fig. 15 for the third shutter time /v after the shutter opens is F10/v, and the foreground component of the second pixel from the left for the fourth shutter time /v after the shutter opens is also F10/v.
- the foreground component of the leftmost pixel of frame #n-1 in Fig. 15 for the fourth shutter time /v after the shutter opens is F09/v.
- since the object corresponding to the background is stationary, the background component of the second pixel from the left of frame #n-1 in Fig. 15 for the first shutter time /v after the shutter opens is B01/v.
- the background components of the third pixel from the left of frame #n-1 for the first and second shutter times /v after the shutter opens are B02/v, and the background components of the fourth pixel from the left for the first through third shutter times /v after the shutter opens are B03/v.
- in frame #n-1, the leftmost pixel belongs to the foreground area, and the second through fourth pixels from the left belong to the mixed area that is the covered background area.
- the fifth through 12th pixels from the left of frame #n-1 in FIG. 15 belong to the background area, and their pixel values are B04 through B11, respectively.
- the first through fifth pixels from the left of frame #n in FIG. 15 belong to the foreground area.
- in the foreground area of frame #n, the foreground component of each shutter time /v is one of F05/v through F12/v.
- since it can be assumed that the object corresponding to the foreground is a rigid body whose image moves at a constant speed so as to be displayed four pixels to the right in the next frame, the foreground component of the fifth pixel from the left of frame #n in Fig. 15 for the first shutter time /v after the shutter opens is F12/v, and the foreground component of the sixth pixel from the left for the second shutter time /v after the shutter opens is also F12/v.
- the foreground component of the seventh pixel from the left in Fig. 15 for the third shutter time /v after the shutter opens, and the foreground component of the eighth pixel from the left for the fourth shutter time /v after the shutter opens, are F12/v.
- the foreground component of the fifth pixel from the left of frame #n in Fig. 15 for the second shutter time /v after the shutter opens is F11/v, and the foreground component of the sixth pixel from the left in Fig. 15 for the third shutter time /v after the shutter opens is also F11/v.
- the foreground component of the seventh pixel from the left in Fig. 15 for the fourth shutter time /v after the shutter opens is F11/v.
- the foreground component of the fifth pixel from the left of frame #n in Fig. 15 for the third shutter time /v after the shutter opens is F10/v, and the foreground component of the sixth pixel from the left for the fourth shutter time /v after the shutter opens is also F10/v.
- the foreground component of the fifth pixel from the left of frame #n in Fig. 15 at the fourth shutter time / V after the shutter is opened is F09 / v. Since the object corresponding to the background is stationary, the background component of the sixth pixel from the left of frame #n in Fig. 15 at the first shutter time / V after the shutter has opened is B05 / v. .
- the background components of the seventh pixel from the left of frame #n in Fig. 15, corresponding to the first and second shutter times /v from when the shutter has opened, are B06/v.
- the background components of the eighth pixel from the left, corresponding to the first through third shutter times /v from when the shutter has opened, are B07/v.
- in frame #n, the sixth through eighth pixels from the left belong to the mixed area, which is the covered background area.
- the ninth through 12th pixels from the left of frame #n in Fig. 15 belong to the background area, and their pixel values are B08 through B11, respectively.
- the first through ninth pixels from the left of frame #n+1 in Fig. 15 belong to the foreground area.
- the foreground component of the shutter time /v in the foreground area of frame #n+1 is one of F01/v through F12/v.
- since the object corresponding to the foreground is a rigid body moving at a constant speed and the image of the foreground moves so as to be displayed four pixels to the right in the next frame, the foreground component of the ninth pixel from the left of frame #n+1 in Fig. 15, corresponding to the first shutter time /v from when the shutter has opened, is F12/v
- the foreground component of the 10th pixel from the left in Fig. 15, corresponding to the second shutter time /v from when the shutter has opened, is also F12/v.
- the foreground components of the 11th and 12th pixels from the left, corresponding to the third and fourth shutter times /v, are likewise F12/v.
- the foreground component of the ninth pixel from the left of frame #n+1 in Fig. 15, corresponding to the second shutter time /v from when the shutter has opened, is F11/v
- the foreground component of the 10th pixel from the left in Fig. 15, corresponding to the third shutter time /v from when the shutter has opened, also becomes F11/v.
- the foreground component of the 11th pixel from the left, corresponding to the fourth shutter time /v, is F11/v.
- the foreground component of the ninth pixel from the left of frame #n+1 in Fig. 15, corresponding to the third shutter time /v, is F10/v
- the foreground component of the 10th pixel from the left in Fig. 15, corresponding to the fourth shutter time /v, is also F10/v.
- since the object corresponding to the background is stationary, the background component of the 10th pixel from the left of frame #n+1 in Fig. 15, corresponding to the first shutter time /v from when the shutter has opened, is B09/v.
- the background components of the 11th pixel from the left of frame #n+1 in Fig. 15, corresponding to the first and second shutter times /v from when the shutter has opened, are B10/v.
- the background components of the 12th pixel from the left, corresponding to the first through third shutter times /v, are B11/v.
- FIG. 16 is a model diagram of an image in which the foreground components are extracted from the pixel values shown in Fig. 15.
- FIG. 17 is a model diagram in which the pixel values of pixels at the same position in three adjacent frames of an image obtained by capturing the foreground corresponding to the object moving to the right in the figure together with the stationary background are developed in the time direction. In Fig. 17, an uncovered background area is included. In Fig. 17, it can be assumed that the object corresponding to the foreground is a rigid body moving at a constant speed.
- the amount of motion v is 4. For example, the foreground component of the leftmost pixel of frame #n-1 in Fig. 17, corresponding to the first shutter time /v from when the shutter has opened, is F13/v
- the foreground component of the second pixel from the left in Fig. 17, corresponding to the second shutter time /v from when the shutter has opened, is also F13/v.
- the foreground components of the third pixel from the left in Fig. 17, corresponding to the third shutter time /v, and of the fourth pixel from the left, corresponding to the fourth shutter time /v, are F13/v.
- the foreground component of the second pixel from the left of frame #n-1 in Fig. 17, corresponding to the first shutter time /v from when the shutter has opened, is F14/v
- the foreground component of the third pixel from the left in Fig. 17, corresponding to the second shutter time /v, is also F14/v.
- the foreground component of the third pixel from the left, corresponding to the first shutter time /v from when the shutter has opened, is F15/v.
- since the object corresponding to the background is stationary, the background components of the leftmost pixel of frame #n-1, corresponding to the second through fourth shutter times /v from when the shutter has opened, are B25/v, and the background components of the second pixel from the left, corresponding to the third and fourth shutter times /v, are B26/v.
- the background component of the third pixel from the left of frame #n-1 in Fig. 17, corresponding to the fourth shutter time /v from when the shutter has opened, is B27/v.
- in frame #n-1, the leftmost pixel through the third pixel belong to the mixed area, which is the uncovered background area.
- the fourth through 12th pixels from the left of frame #n-1 in Fig. 17 belong to the foreground area.
- the foreground component of the shutter time /v in the foreground area of frame #n-1 is one of F13/v through F24/v.
- the leftmost pixel through the fourth pixel from the left of frame #n in Fig. 17 belong to the background area, and their pixel values are B25 through B28, respectively.
- the object corresponding to the foreground is a rigid body and moves at a constant speed, and the image of the foreground moves so as to be displayed four pixels to the right in the next frame.
- the foreground component of the fifth pixel from the left of frame #n in Fig. 17, corresponding to the first shutter time /v from when the shutter has opened, is F13/v, and the foreground component of the sixth pixel from the left, corresponding to the second shutter time /v, is also F13/v.
- the foreground components of the seventh pixel from the left in Fig. 17, corresponding to the third shutter time /v, and of the eighth pixel from the left, corresponding to the fourth shutter time /v, are F13/v.
- the foreground component of the sixth pixel from the left of frame #n in Fig. 17, corresponding to the first shutter time /v from when the shutter has opened, is F14/v
- the foreground component of the seventh pixel from the left in Fig. 17, corresponding to the second shutter time /v from when the shutter has opened, is also F14/v
- the foreground component of the eighth pixel from the left in Fig. 17, corresponding to the third shutter time /v, is also F14/v, and the foreground component of the seventh pixel from the left, corresponding to the first shutter time /v, is F15/v.
- since the object corresponding to the background is stationary, the background components of the fifth pixel from the left of frame #n in Fig. 17, corresponding to the second through fourth shutter times /v from when the shutter has opened, are B29/v.
- the background components of the sixth pixel from the left of frame #n in Fig. 17, corresponding to the third and fourth shutter times /v from when the shutter has opened, are B30/v.
- the background component of the seventh pixel from the left of frame #n in Fig. 17, corresponding to the fourth shutter time /v from when the shutter has opened, is B31/v.
- the eighth through 12th pixels from the left of frame #n in Fig. 17 belong to the foreground area.
- the value corresponding to the period of the shutter time /v in the foreground area of frame #n is one of F13/v through F20/v.
- the leftmost pixel through the eighth pixel from the left of frame #n+1 in Fig. 17 belong to the background area, and their pixel values are B25 through B32, respectively.
- the object corresponding to the foreground is a rigid body and moves at a constant speed, and the image of the foreground moves so as to be displayed four pixels to the right in the next frame.
- the foreground component of the ninth pixel from the left of frame #n+1 in Fig. 17, corresponding to the first shutter time /v from when the shutter has opened, is F13/v
- the foreground component of the 10th pixel from the left in Fig. 17, corresponding to the second shutter time /v from when the shutter has opened, is also F13/v.
- the foreground components of the 11th and 12th pixels from the left, corresponding to the third and fourth shutter times /v, are F13/v.
- the foreground component of the 10th pixel from the left of frame #n+1 in Fig. 17, corresponding to the first shutter time /v from when the shutter has opened, is F14/v, and the foreground component of the 11th pixel from the left in Fig. 17, corresponding to the second shutter time /v, is also F14/v. The foreground component of the 11th pixel from the left in Fig. 17, corresponding to the first shutter time /v, is F15/v.
- since the object corresponding to the background is stationary, the background components of the ninth pixel from the left of frame #n+1 in Fig. 17, corresponding to the second through fourth shutter times /v from when the shutter has opened, are B33/v.
- the background components of the 10th pixel from the left of frame #n+1 in Fig. 17, corresponding to the third and fourth shutter times /v from when the shutter has opened, are B34/v.
- the background component of the 11th pixel from the left of frame #n+1 in Fig. 17, corresponding to the fourth shutter time /v from when the shutter has opened, is B35/v.
- in frame #n+1, the ninth through 11th pixels from the left belong to the mixed area, which is the uncovered background area.
- the 12th pixel from the left of frame #n+1 in Fig. 17 belongs to the foreground area.
- the foreground component of the shutter time /v in the foreground area of frame #n+1 is one of F13/v through F16/v.
- FIG. 18 is a model diagram of an image in which the foreground components are extracted from the pixel values shown in Fig. 17.
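- the per-pixel accounting walked through for Figs. 15 through 18 can be summarized numerically. The sketch below is illustrative only and not part of the patent disclosure: with the amount of motion v = 4, a pixel value is modeled as the sum of v equal shutter-time/v components, so a mixed-area pixel blends foreground components F.../v with background components B.../v (the variable names F, B, and pixel_value are assumptions for this sketch).

```python
# Illustrative sketch of the shutter-time/v mixing model: each pixel
# integrates v equal sub-interval components over the shutter time.
V = 4  # amount of motion: the foreground shifts 4 pixels per frame

def pixel_value(components):
    """Sum of the v per-shutter-time/v components of one pixel."""
    assert len(components) == V
    return sum(c / V for c in components)

# Hypothetical component values (not from the patent's figures):
B = {5: 60.0}
F = {10: 200.0, 11: 180.0, 12: 160.0}

# Covered-background mixed pixel, like the sixth pixel of frame #n in
# Fig. 15: one background component (B05 at the first shutter time /v)
# and three foreground components (F12, F11, F10).
mixed = pixel_value([B[5], F[12], F[11], F[10]])
# Background-area pixel: all four components come from B05.
pure_bg = pixel_value([B[5]] * V)
print(mixed, pure_bg)  # 150.0 60.0
```

With one background component out of four, the background contributes 1/v of the pixel value, which is what the mixture ratio introduced below measures.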
- the area specifying unit 103 uses the pixel values of a plurality of frames to set, for each pixel, a flag indicating that the pixel belongs to the foreground area, the background area, the covered background area, or the uncovered background area, and supplies the flags, as area information, to the mixture ratio calculation unit 104 and the motion blur adjustment unit 106 in association with the pixels.
- the mixture ratio calculation unit 104 calculates a mixture ratio for each pixel included in the mixed area based on the pixel values of the plurality of frames and the area information, and supplies the calculated mixture ratio to the foreground/background separation unit 105.
- the foreground/background separation unit 105 extracts a foreground component image consisting only of the foreground components based on the pixel values of the plurality of frames, the area information, and the mixture ratio, and supplies the extracted foreground component image to the motion blur adjustment unit 106.
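- the separation idea can be sketched as follows. This is a hedged illustration, not the patent's actual implementation: a mixed-area pixel value C is modeled as C = α·B + f, where α is the mixture ratio, B the background pixel value, and f the sum of the foreground components, so the foreground contribution is recovered as f = C − α·B (the function and variable names are assumptions for this sketch).

```python
# Minimal sketch of foreground/background separation for one mixed pixel,
# assuming the linear mixing model C = alpha * B + f.
def separate(C, B, alpha):
    """Split a mixed pixel value C into (foreground sum, background part)."""
    background_part = alpha * B
    foreground_part = C - background_part
    return foreground_part, background_part

# Example: mixture ratio 1/4 (one of v = 4 components is background).
f, b = separate(C=150.0, B=60.0, alpha=0.25)
print(f, b)  # 135.0 15.0
```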
- the motion blur adjustment unit 106 adjusts the amount of motion blur included in the foreground component image based on the foreground component image supplied from the foreground/background separation unit 105, the motion vector supplied from the motion detection unit 102, and the area information supplied from the area specifying unit 103, and outputs a foreground component image in which the amount of motion blur has been adjusted.
- in step S11, the area specifying unit 103 executes area specifying processing for generating, based on the input image, area information indicating whether each pixel of the input image belongs to the foreground area, the background area, the covered background area, or the uncovered background area. The details of the area specifying processing will be described later.
- the region specifying unit 103 supplies the generated region information to the mixture ratio calculating unit 104.
- the area specifying unit 103 may instead generate, for each pixel of the input image, area information indicating whether the pixel belongs to the foreground area, the background area, or the mixed area (without distinguishing the covered background area from the uncovered background area), based on the input image.
- in this case, the foreground/background separation unit 105 and the motion blur adjustment unit 106 determine whether the mixed area is a covered background area or an uncovered background area based on the direction of the motion vector. For example, when the foreground area, the mixed area, and the background area are arranged in this order in the direction of the motion vector, the mixed area is determined to be a covered background area; when the background area, the mixed area, and the foreground area are arranged in this order in the direction of the motion vector, the mixed area is determined to be an uncovered background area.
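- the ordering rule just stated can be sketched as a small classifier. This is an illustrative sketch, not the patent's implementation; the function name and region labels are assumptions.

```python
# Classify a mixed area as covered or uncovered from the regions adjacent
# to it, taken in the direction of the motion vector:
#   foreground -> mixed -> background : covered background area
#   background -> mixed -> foreground : uncovered background area
def classify_mixed(region_before, region_after):
    if region_before == "foreground" and region_after == "background":
        return "covered"
    if region_before == "background" and region_after == "foreground":
        return "uncovered"
    return "unknown"

print(classify_mixed("foreground", "background"))  # covered
print(classify_mixed("background", "foreground"))  # uncovered
```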
- in step S12, the mixture ratio calculation unit 104 calculates the mixture ratio for each pixel included in the mixed area based on the input image and the area information. The details of the mixture ratio calculation processing will be described later.
- the mixture ratio calculation unit 104 supplies the calculated mixture ratio to the foreground/background separation unit 105.
- in step S13, the foreground/background separation unit 105 extracts the foreground components from the input image based on the area information and the mixture ratio α, and supplies them as a foreground component image to the motion blur adjustment unit 106.
- in step S14, the motion blur adjustment unit 106 generates, based on the motion vector and the area information, a processing unit indicating the positions in the image of pixels that are continuous in the motion direction and belong to any of the uncovered background area, the foreground area, and the covered background area, and adjusts the amount of motion blur included in the foreground components corresponding to the processing unit. The details of the processing for adjusting the amount of motion blur will be described later.
- in step S15, the image processing apparatus determines whether the processing has been completed for the entire screen. If it is determined that the processing has not been completed for the entire screen, the procedure returns to step S14, and the processing of adjusting the amount of motion blur for the foreground components corresponding to the processing unit is repeated.
- if it is determined in step S15 that the processing has been completed for the entire screen, the processing ends.
- in this manner, the image processing apparatus can separate the foreground and the background and adjust the amount of motion blur included in the foreground. That is, the image processing apparatus can adjust the amount of motion blur included in the sample data that are the pixel values of the foreground pixels.
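- the flow of steps S11 through S14 can be condensed into a short pipeline sketch. This is illustrative only: the four callables stand in for the area specifying unit 103, the mixture ratio calculation unit 104, the foreground/background separation unit 105, and the motion blur adjustment unit 106, and none of the names are the patent's actual interfaces.

```python
# Data flow of the processing described in steps S11 through S14.
def process_frame(image, specify_area, calc_mixture, separate_fg, adjust_blur):
    region_info = specify_area(image)                            # step S11
    mixture_ratio = calc_mixture(image, region_info)             # step S12
    foreground = separate_fg(image, region_info, mixture_ratio)  # step S13
    return adjust_blur(foreground, region_info)                  # step S14

# Trivial stand-in callables, just to show how the results are threaded:
out = process_frame(
    [10, 20, 30],
    specify_area=lambda img: ["foreground"] * len(img),
    calc_mixture=lambda img, info: [0.0] * len(img),
    separate_fg=lambda img, info, alpha: img,
    adjust_blur=lambda fg, info: fg,
)
print(out)  # [10, 20, 30]
```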
- FIG. 20 is a block diagram showing the configuration of the area specifying unit 103.
- the frame memory 201 stores the input image in frame units.
- the frame memory 201 stores frame #n-1, which is the frame immediately before frame #n, frame #n, and frame #n+1, which is the frame immediately after frame #n.
- the still/motion determination unit 202-1 reads, from the frame memory 201, the pixel value of the pixel of frame #n that is the target of area specification and the pixel value of the pixel of frame #n-1 at the same position on the image, and calculates the absolute value of the difference between the pixel values.
- the still/motion determination unit 202-1 determines whether the absolute value of the difference between the pixel value of frame #n and the pixel value of frame #n-1 is greater than a preset threshold Th. If it is determined that the absolute value of the difference is greater than the threshold Th, a still/motion determination indicating motion is supplied to the area determination unit 203. If it is determined that the absolute value of the difference between the pixel value of the pixel of frame #n and the pixel value of the pixel of frame #n-1 is equal to or smaller than the threshold Th, the still/motion determination unit 202-1 supplies a still/motion determination indicating stillness to the area determination unit 203.
- the still/motion determination unit 202-2 reads, from the frame memory 201, the pixel value of the pixel of frame #n that is the target of area specification and the pixel value of the pixel of frame #n+1 at the same position on the image, and calculates the absolute value of the difference between the pixel values.
- the still/motion determination unit 202-2 determines whether the absolute value of the difference between the pixel value of frame #n+1 and the pixel value of frame #n is greater than the preset threshold Th. If it is determined that the absolute value of the difference between the pixel values is greater than the threshold Th, a still/motion determination indicating motion is supplied to the area determination unit 203. If it is determined that the absolute value of the difference between the pixel value of the pixel of frame #n+1 and the pixel value of the pixel of frame #n is equal to or smaller than the threshold Th, the still/motion determination unit 202-2 supplies a still/motion determination indicating stillness to the area determination unit 203.
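- the threshold comparison performed by the still/motion determination units 202-1 and 202-2 can be sketched as follows. The sketch is illustrative; the value chosen for Th is an arbitrary assumption, as the patent only states that Th is a preset threshold.

```python
# Still/motion determination: compare the absolute difference of
# co-located pixel values in two frames with a preset threshold Th.
TH = 10  # preset threshold (illustrative value, an assumption)

def still_or_moving(pixel_a, pixel_b, th=TH):
    """Return 'moving' if |difference| > Th, otherwise 'still'."""
    return "moving" if abs(pixel_b - pixel_a) > th else "still"

print(still_or_moving(100, 103))  # still  (|3|  <= Th)
print(still_or_moving(100, 140))  # moving (|40| >  Th)
```

Note that a difference exactly equal to Th is judged still, matching the "equal to or smaller than the threshold Th" wording above.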
- when the still/motion determination supplied from the still/motion determination unit 202-1 indicates stillness and the still/motion determination supplied from the still/motion determination unit 202-2 indicates stillness, the area determination unit 203 determines that the pixel of frame #n that is the target of area specification belongs to the still area, and sets a flag indicating that the pixel belongs to the still area at the address of the determination flag storage frame memory 204 corresponding to the pixel of frame #n whose area is determined.
- when the still/motion determination supplied from the still/motion determination unit 202-1 indicates motion and the still/motion determination supplied from the still/motion determination unit 202-2 indicates motion, the area determination unit 203 determines that the pixel of frame #n that is the target of area specification belongs to the motion area, and sets a flag indicating that the pixel belongs to the motion area at the address of the determination flag storage frame memory 204 corresponding to the pixel of frame #n whose area is determined.
- when the still/motion determination supplied from the still/motion determination unit 202-1 indicates stillness and the still/motion determination supplied from the still/motion determination unit 202-2 indicates motion, the area determination unit 203 determines that the pixel that is the target of area specification belongs to the covered background area, and sets a flag indicating that the pixel belongs to the covered background area at the address of frame #n+1 corresponding to the pixel whose area is determined.
- when the still/motion determination supplied from the still/motion determination unit 202-1 indicates motion and the still/motion determination supplied from the still/motion determination unit 202-2 indicates stillness, the area determination unit 203 determines that the pixel that is the target of area specification in frame #n-1 belongs to the uncovered background area, and sets a flag indicating that the pixel belongs to the uncovered background area at the address of frame #n-1 of the determination flag storage frame memory 204 corresponding to the pixel.
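- the four cases handled by the area determination unit 203 can be summarized in a small decision table. This sketch is illustrative; it uses the 2-bit flag values stated in the text ("00" still, "01" uncovered, "10" covered, "11" motion), and the function and dictionary names are assumptions.

```python
# Combine the two still/motion determinations into a region label:
#   first argument:  202-1's result (frame #n-1 vs frame #n)
#   second argument: 202-2's result (frame #n vs frame #n+1)
FLAGS = {"still": "00", "uncovered": "01", "covered": "10", "motion": "11"}

def determine_area(prev_vs_cur, cur_vs_next):
    if prev_vs_cur == "still" and cur_vs_next == "still":
        return "still"
    if prev_vs_cur == "moving" and cur_vs_next == "moving":
        return "motion"
    if prev_vs_cur == "still" and cur_vs_next == "moving":
        return "covered"    # flag set at the address of frame #n+1
    return "uncovered"      # moving then still: flag set for frame #n-1

label = determine_area("still", "moving")
print(label, FLAGS[label])  # covered 10
```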
- the flag set by the area determination unit 203 in the determination flag storage frame memory 204 is, for example, a 2-bit flag: "00" indicates the still area, "01" indicates the uncovered background area, "10" indicates the covered background area, and "11" indicates the motion area.
- the determination flag storage frame memory 204 individually stores, as 2-bit flags corresponding to each pixel, the flags of the areas determined for frame #n-1, the flags of the areas determined for frame #n, and the flags of the areas determined for frame #n+1.
- when the area specifying unit 103 finishes the area determination for one frame, the determination flag storage frame memory 204 outputs, as area information, the flags of frame #n+1, in which the flags indicating the uncovered background area, the motion area, the still area, and the covered background area are set.
- when the area specifying unit 103 finishes the area determination for one frame, the determination flag storage frame memory 204 moves the flags of frame #n, in which the flags indicating the uncovered background area, the motion area, and the still area are set, to frame #n+1.
- when the area specifying unit 103 finishes the area determination for one frame, the determination flag storage frame memory 204 moves the flags of frame #n-1, in which the flag indicating the uncovered background area is set, to frame #n.
- when the area specifying unit 103 finishes the area determination for one frame, the determination flag storage frame memory 204 initializes frame #n-1.
- in frame #n+1 stored in the determination flag storage frame memory 204, the flag indicating the uncovered background area, the flag indicating the still area, and the flag indicating the motion area have already been set.
- the area determination unit 203 sets the flag indicating the still area or the motion area in frame #n, in which the flag indicating the uncovered background area stored in the determination flag storage frame memory 204 has been set.
- however, for a pixel for which the flag indicating the uncovered background area has already been set, the area determination unit 203 does not set the flag indicating the motion area.
- the area determination unit 203 sets the flag indicating the covered background area in frame #n+1, in which the flags indicating the uncovered background area, the still area, and the motion area stored in the determination flag storage frame memory 204 have been set.
- FIG. 22 shows a model diagram in which the pixel values of pixels arranged in one row, adjacent to each other in the motion direction of the image corresponding to the foreground object, are developed in the time direction. For example, when the motion direction of the image corresponding to the foreground object is horizontal with respect to the screen, the model diagram in Fig. 22 shows a model in which the pixel values of adjacent pixels on one line are developed in the time direction.
- in Fig. 22, the line in frame #n is the same as the line in frame #n+1.
- the foreground components corresponding to the object included in the second through 13th pixels from the left in frame #n are included in the sixth through 17th pixels from the left in frame #n+1.
- in frame #n, the pixels belonging to the covered background area are the 11th through 13th pixels from the left, and the pixels belonging to the uncovered background area are the second through fourth pixels from the left.
- in frame #n+1, the pixels belonging to the covered background area are the 15th through 17th pixels from the left, and the pixels belonging to the uncovered background area are the sixth through eighth pixels from the left.
- since the object corresponding to the background is stationary, the pixel value of the 16th pixel from the left of frame #n does not change from the pixel value of the 16th pixel from the left of frame #n-1, and the pixel value of the 17th pixel from the left of frame #n does not change from the pixel value of the 17th pixel from the left of frame #n-1.
- that is, the pixels of frame #n and frame #n-1 corresponding to the pixels belonging to the covered background area in frame #n+1 consist only of the background components and their pixel values do not change, so the absolute value of the difference between them is almost zero. Therefore, the still/motion determination for the pixels of frame #n and frame #n-1 corresponding to the pixels belonging to the mixed area in frame #n+1 is judged to be still by the still/motion determination unit 202-1.
- since the pixels belonging to the mixed area in frame #n+1 include the foreground components, the still/motion determination for those pixels and the corresponding pixels in frame #n is judged to be motion by the still/motion determination unit 202-2.
- when the area determination unit 203 is supplied in this way with the result of the still/motion determination indicating motion from the still/motion determination unit 202-2 and with the result of the still/motion determination indicating stillness from the still/motion determination unit 202-1, it determines that the corresponding pixel of frame #n+1 belongs to the covered background area.
- since the object corresponding to the background is stationary, the pixel value of the third pixel from the left of frame #n+1 does not change from the pixel value of the third pixel from the left of frame #n, and the pixel value of the fourth pixel from the left of frame #n+1 does not change from the pixel value of the fourth pixel from the left of frame #n.
- that is, the pixels of frame #n and frame #n+1 corresponding to the pixels belonging to the uncovered background area in frame #n-1 consist only of the background components and their pixel values do not change, so the absolute value of the difference between them is almost zero. Therefore, the still/motion determination for the pixels of frame #n and frame #n+1 corresponding to the pixels belonging to the uncovered background area in frame #n-1 is judged to be still by the still/motion determination unit 202-2.
- since the pixels belonging to the mixed area in frame #n-1 include the foreground components, the still/motion determination for those pixels and the corresponding pixels in frame #n is judged to be motion by the still/motion determination unit 202-1.
- when the area determination unit 203 is supplied in this way with the result of the still/motion determination indicating motion from the still/motion determination unit 202-1 and with the result of the still/motion determination indicating stillness from the still/motion determination unit 202-2, it determines that the corresponding pixel of frame #n-1 belongs to the uncovered background area.
- FIG. 25 is a diagram showing the determination conditions of the area specifying unit 103.
- when the pixel of frame #n-1 and the pixel of frame #n at the same position on the image as the position of the pixel to be determined in frame #n+1 are judged to be still, and the pixel of frame #n and the pixel of frame #n+1 at the same position on the image as the position of the pixel to be determined in frame #n+1 are judged to be in motion, the area specifying unit 103 determines that the pixel to be determined in frame #n+1 belongs to the covered background area.
- when the pixel of frame #n-1 and the pixel of frame #n at the same position on the image as the position of the pixel to be determined in frame #n are judged to be still, and the pixel of frame #n and the pixel of frame #n+1 are judged to be still, the area specifying unit 103 determines that the pixel to be determined in frame #n belongs to the still area.
- when the pixel of frame #n-1 and the pixel of frame #n at the same position on the image as the position of the pixel to be determined in frame #n are judged to be in motion, and the pixel of frame #n and the pixel of frame #n+1 are judged to be in motion, the area specifying unit 103 determines that the pixel to be determined in frame #n belongs to the motion area.
- when the pixel of frame #n-1 and the pixel of frame #n at the same position on the image as the position of the pixel to be determined in frame #n-1 are judged to be in motion, and the pixel of frame #n and the pixel of frame #n+1 are judged to be still, the area specifying unit 103 determines that the pixel to be determined in frame #n-1 belongs to the uncovered background area.
- in this manner, the area specifying unit 103 specifies the foreground area, the background area, the covered background area, and the uncovered background area.
- the area specifying unit 103 needs a time corresponding to three frames to specify the foreground area, the background area, the covered background area, and the uncovered background area for one frame.
- FIG. 26 is a diagram illustrating an example of a result of specifying an area by the area specifying unit 103.
- the foreground object is moving from left to right in the drawing.
- FIG. 27 is an enlarged view of a part corresponding to the upper left side of the foreground object in the determination result of FIG. 26.
- in step S201, the frame memory 201 acquires the images of frames #n-1 through #n+1, including frame #n to be determined.
- in step S202, the still/motion determination unit 202-1 determines whether the pixel of frame #n-1 and the pixel at the same position of frame #n are still. If they are determined to be still, the procedure proceeds to step S203, and the still/motion determination unit 202-2 determines whether the pixel of frame #n and the pixel at the same position of frame #n+1 are still.
- if it is determined in step S203 that the pixel of frame #n and the pixel at the same position of frame #n+1 are still, the procedure proceeds to step S204, and the area determination unit 203 sets "00", indicating that the pixel belongs to the still area, at the corresponding address of frame #n stored in the determination flag storage frame memory 204, and the procedure proceeds to step S205.
- if it is determined in step S202 that the pixel of frame #n-1 and the pixel at the same position of frame #n are in motion, or if it is determined in step S203 that the pixel of frame #n and the pixel at the same position of frame #n+1 are in motion, the pixel of frame #n does not belong to the still area, so the processing of step S204 is skipped, and the procedure proceeds to step S205.
- in step S205, the still/motion determination unit 202-1 determines whether the pixel of frame #n-1 and the pixel at the same position of frame #n are in motion. If they are determined to be in motion, the procedure proceeds to step S206, and the still/motion determination unit 202-2 determines whether the pixel of frame #n and the pixel at the same position of frame #n+1 are in motion.
- if it is determined in step S206 that the pixel of frame #n and the pixel at the same position of frame #n+1 are in motion, the procedure proceeds to step S207, where the area determination unit 203 determines, based on the flag of frame #n stored in the determination flag storage frame memory 204, whether the corresponding pixel has already been determined to belong to the uncovered background area. If it is determined that the corresponding pixel has not been determined to belong to the uncovered background area, the procedure proceeds to step S208.
- in step S208, the area determination unit 203 sets "11", indicating that the pixel belongs to the motion area, at the corresponding address of frame #n stored in the determination flag storage frame memory 204, and the procedure proceeds to step S209.
- if it is determined in step S205 that the pixel of frame #n-1 and the pixel at the same position of frame #n are still, if it is determined in step S206 that the pixel of frame #n and the pixel at the same position of frame #n+1 are still, or if it is determined in step S207 that the corresponding pixel has already been determined to belong to the uncovered background area, the pixel of frame #n does not belong to the motion area, so the processing of step S208 is skipped, and the procedure proceeds to step S209.
- in step S209, the still/motion determination unit 202-1 determines whether the pixel of frame #n-1 and the pixel at the same position of frame #n are still. If they are determined to be still, the procedure proceeds to step S210, and the still/motion determination unit 202-2 determines whether the pixel of frame #n and the pixel at the same position of frame #n+1 are in motion.
- if it is determined in step S210 that the pixel of frame #n and the pixel at the same position of frame #n+1 are in motion, the procedure proceeds to step S211, and the area determination unit 203 sets "10", indicating that the pixel belongs to the covered background area, at the corresponding address of frame #n+1 stored in the determination flag storage frame memory 204, and the procedure proceeds to step S212.
- if it is determined in step S209 that the pixel of frame #n-1 and the pixel at the same position of frame #n are in motion, or if it is determined in step S210 that the pixel of frame #n and the pixel at the same position of frame #n+1 are still, the pixel of frame #n+1 does not belong to the covered background area, so the processing of step S211 is skipped, and the procedure proceeds to step S212.
- In step S212, the still/motion determination unit 202-1 determines whether or not the pixel of frame #n-1 and the pixel at the same position of frame #n are moving. If it is determined that they are moving, the process proceeds to step S213, and the still/motion determination unit 202-2 determines whether or not the pixel of frame #n and the pixel at the same position of frame #n+1 are stationary.
- If it is determined in step S213 that the pixel of frame #n and the pixel at the same position of frame #n+1 are still, the process proceeds to step S214, and the area determination unit 203 sets "01", indicating that the pixel belongs to the uncovered background area, at the address corresponding to frame #n-1 stored in the determination flag storage frame memory 204, and the procedure proceeds to step S215.
- If it is determined in step S212 that the pixel of frame #n-1 and the pixel at the same position of frame #n are stationary, or if it is determined in step S213 that the pixel of frame #n and the pixel at the same position of frame #n+1 are moving, the pixel of frame #n-1 does not belong to the uncovered background area, so the processing of step S214 is skipped and the procedure proceeds to step S215. In step S215, the area specifying unit 103 determines whether or not areas have been specified for all the pixels of the frame. If it is determined that areas have not been specified for all the pixels, the procedure returns to step S202, and the area specifying processing is repeated for the remaining pixels.
- If it is determined in step S215 that areas have been specified for all the pixels, the process proceeds to step S216, in which the determination flag storage frame memory 204 outputs, as area information, the flags indicating the foreground area, the background area, the covered background area, and the uncovered background area.
- step S217 the determination flag storage frame memory 204 moves the flag of frame #n to frame # n + 1.
- In step S218, the determination flag storage frame memory 204 moves the flag of frame #n-1 to frame #n.
- In step S219, the determination flag storage frame memory 204 initializes frame #n-1, and the process ends.
- the region specifying unit 103 obtains, for each of the pixels included in the frame, region information indicating that the pixel belongs to the moving region, the still region, the uncovered background region, or the covered background region.
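The per-pixel decision logic of steps S205 through S214 above can be sketched as follows. This is an illustrative sketch, not the patented implementation: the still/motion threshold `thr` is a hypothetical parameter standing in for the still/motion determination units, and the string labels are illustrative (the flag values "11", "10", and "01" are the ones named in the steps above).

```python
def classify_pixel(prev, cur, nxt, thr=10):
    """Classify one pixel position using frames #n-1 (prev), #n (cur), #n+1 (nxt).

    `thr` is an assumed still/motion threshold (hypothetical value).
    """
    still_before = abs(cur - prev) <= thr   # frame #n-1 vs frame #n
    still_after = abs(nxt - cur) <= thr     # frame #n vs frame #n+1
    if still_before and still_after:
        return "still"                      # still in both intervals
    if not still_before and not still_after:
        return "moving"                     # flag "11": motion area
    if still_before:
        return "covered background"         # still then moving: flag "10"
    return "uncovered background"           # moving then still: flag "01"
```

For example, a pixel that is unchanged between frames #n-1 and #n but changes sharply toward frame #n+1 is classified as covered background, matching the decision in steps S209 through S211.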
- The area identification unit 103 may generate the area information corresponding to the mixed area by applying a logical OR to the area information corresponding to the covered background area and the area information corresponding to the uncovered background area, and may generate, for each of the pixels included in the frame, area information including flags indicating that the pixel belongs to the moving area, the still area, or the mixed area.
- the region specifying unit 103 can more accurately specify the moving region.
- the area specifying unit 103 can output the area information indicating the moving area as the area information indicating the foreground area, and output the area information indicating the stationary area as the area information indicating the background area.
- As described above, the region specifying unit 103 can generate, for each of the pixels included in the frame, region information indicating that the pixel belongs to the moving region, the still region, the uncovered background region, or the covered background region.
- the area specifying unit 103 can generate the area information in a relatively small memory space.
- FIG. 30 is a block diagram showing an example of the configuration of the mixture ratio calculation unit 104. The estimated mixture ratio processing unit 401 calculates an estimated mixture ratio for each pixel by an operation corresponding to the model of the covered background area based on the input image, and supplies the calculated estimated mixture ratio to the mixture ratio determination unit 403.
- The estimated mixture ratio processing unit 402 calculates an estimated mixture ratio for each pixel by an operation corresponding to the model of the uncovered background area based on the input image, and supplies the calculated estimated mixture ratio to the mixture ratio determination unit 403.
- The mixture ratio α of the pixels belonging to the mixed region has the following property: the mixture ratio α changes linearly in accordance with the change in the position of the pixel. If the change in the pixel position is one-dimensional, the change in the mixture ratio α can be represented by a straight line; if the change in the pixel position is two-dimensional, the change in the mixture ratio α can be represented by a plane.
- The slope of the mixture ratio α is the inverse of the foreground motion amount v within the shutter time.
- FIG. 31 shows an example of the ideal mixture ratio α.
- The slope l of the ideal mixture ratio α in the mixed region can be expressed as the reciprocal of the motion amount v.
- The ideal mixture ratio α has a value of 1 in the background region, a value of 0 in the foreground region, and a value greater than 0 and less than 1 in the mixed region.
- The pixel value C06 of the seventh pixel from the left of frame #n can be expressed by equation (4) using the pixel value P06 of the seventh pixel from the left of frame #n-1.
- the pixel value C06 is expressed as the pixel value M of the pixel in the mixed area
- the pixel value P06 is expressed as the pixel value B of the pixel in the background area. That is, the pixel value M of the pixel in the mixed area and the pixel value B of the pixel in the background area can be expressed as Expressions (5) and (6), respectively.
- The coefficient of P06 in equation (4) corresponds to the mixture ratio α. Since the motion amount v is 4, the mixture ratio α of the seventh pixel from the left of frame #n is 0.5.
- Equation (3), indicating the mixture ratio α, can be rewritten as equation (7).
- The f in equation (7) is the sum ΣiFi/v of the foreground components included in the pixel of interest.
- The variables included in equation (7) are two: the mixture ratio α and the sum f of the foreground components.
- FIG. 33 shows a model, in the uncovered background area, in which the pixel values are expanded in the time direction, with the motion amount v set to 4 and the number of virtual divisions in the time direction set to 4.
- In the uncovered background area, equation (3), indicating the mixture ratio α, can be expressed as equation (8).
- Although the background object has been described as being stationary, equations (4) to (8) can also be applied when the background object is moving, by using the pixel values of the pixels at the positions corresponding to the background motion amount v.
- For example, when the motion amount v of the object corresponding to the background is 2 and the object corresponding to the background is moving, the pixel value B of the pixel in the background area in equation (6) is set to the pixel value P04.
- Since equations (7) and (8) each contain two variables, the mixture ratio α cannot be determined as is.
- Since an image generally has strong spatial correlation, adjacent pixels have substantially the same pixel values.
- Since the foreground components have strong spatial correlation, the equation is modified so that the sum f of the foreground components can be derived from the previous or subsequent frame, and the mixture ratio α is thereby obtained.
- the pixel value Mc of the seventh pixel from the left of frame #n in FIG. 34 can be expressed by equation (9).
- Equation (11) is established using the spatial correlation of the foreground components.
- Equation (10) can be replaced with equation (12) using equation (11).
- From the relationship of the internal division ratio in equation (11), it is assumed that equation (14) holds for all the pixels in the mixed region.
- If equation (14) holds, equation (7) can be expanded as shown in equation (15).
- C = α·P + (1-α)·N (15)
- Similarly, if equation (14) holds, equation (8) can be expanded as shown in equation (16): C = α·N + (1-α)·P (16)
- In equations (15) and (16), C, N, and P are known pixel values, so the only variable included in equations (15) and (16) is the mixture ratio α.
- the relationship between C, N, and P in equations (15) and (16) is shown in Fig. 35.
- C is the pixel value of the pixel of interest in frame #n for which the mixture ratio α is calculated.
- N is the pixel value of the pixel in frame #n+1 whose spatial position corresponds to that of the pixel of interest.
- P is the pixel value of the pixel in frame #n-1 whose spatial position corresponds to that of the pixel of interest.
- Therefore, the mixture ratio α can be calculated using the pixel values of the pixels of the three frames.
- The condition for calculating the correct mixture ratio α is that the foreground components related to the mixed region are equal; that is, in the foreground image object captured when the foreground object is stationary, the pixel values of the consecutive pixels, twice the motion amount v in number, located at the boundary of the image object and corresponding to the direction of motion of the foreground object, must be constant.
- In this way, the mixture ratio α of the pixels belonging to the covered background area is calculated by equation (17), and the mixture ratio α of the pixels belonging to the uncovered background area is calculated by equation (18).
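Equations (15) and (16) solve directly for α. A minimal sketch of equations (17) and (18), assuming C, P, and N are the three-frame pixel values defined above (the guard against a zero denominator is an added assumption, since the equations require P and N to differ):

```python
def alpha_covered(C, P, N):
    """Equation (17): alpha = (C - N) / (P - N), covered background area."""
    d = P - N
    if d == 0:
        raise ZeroDivisionError("P and N must differ in the mixed area")
    return (C - N) / d

def alpha_uncovered(C, P, N):
    """Equation (18): alpha = (C - P) / (N - P), uncovered background area."""
    d = N - P
    if d == 0:
        raise ZeroDivisionError("N and P must differ in the mixed area")
    return (C - P) / d
```

For example, a covered-background pixel with background value P = 80 in the previous frame, foreground value N = 200 in the next frame, and α = 0.25 has C = 0.25·80 + 0.75·200 = 170, and equation (17) recovers α = 0.25.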
- FIG. 36 is a block diagram illustrating a configuration of the estimated mixture ratio processing unit 401.
- The frame memory 421 stores the input images in frame units, and supplies the frame one frame before the frame being input as the input image to the frame memory 422 and the mixture ratio calculation unit 423.
- The frame memory 422 stores the input images in frame units, and supplies the frame one frame before the frame supplied from the frame memory 421 to the mixture ratio calculation unit 423.
- Therefore, when frame #n+1 is being input as the input image, the frame memory 421 supplies frame #n to the mixture ratio calculation unit 423, and the frame memory 422 supplies frame #n-1 to the mixture ratio calculation unit 423.
- The mixture ratio calculation unit 423 calculates the estimated mixture ratio of the pixel of interest based on the pixel value C of the pixel of interest in frame #n, the pixel value N of the pixel in frame #n+1 whose spatial position corresponds to that of the pixel of interest, and the pixel value P of the pixel in frame #n-1 whose spatial position corresponds to that of the pixel of interest, and outputs the calculated estimated mixture ratio.
- For example, when the background is stationary, the mixture ratio calculation unit 423 calculates the estimated mixture ratio of the pixel of interest based on the pixel value C of the pixel of interest in frame #n, the pixel value N of the pixel of frame #n+1 at the same position in the frame as the pixel of interest, and the pixel value P of the pixel of frame #n-1 at the same position in the frame as the pixel of interest, and outputs the calculated estimated mixture ratio.
- the estimated mixture ratio processing unit 401 can calculate the estimated mixture ratio based on the input image and supply it to the mixture ratio determination unit 403.
- The estimated mixture ratio processing unit 402 is the same as the estimated mixture ratio processing unit 401 except that it calculates the estimated mixture ratio of the pixel of interest by the operation shown in equation (18) instead of equation (17), so its description is omitted.
- FIG. 37 is a diagram illustrating an example of the estimated mixture ratio calculated by the estimated mixture ratio processing unit 401.
- The estimated mixture ratio shown in FIG. 37 shows the result, for one line, when the foreground motion amount v corresponding to an object moving at a constant speed is 11.
- It can be seen that the estimated mixture ratio changes almost linearly in the mixed region, as shown in FIG. 31.
- The mixture ratio determination unit 403 sets the mixture ratio α based on the area information supplied from the area specifying unit 103, which indicates whether the pixel for which the mixture ratio α is to be calculated belongs to the foreground area, the background area, the covered background area, or the uncovered background area. The mixture ratio determination unit 403 sets the mixture ratio α to 0 if the target pixel belongs to the foreground area, and to 1 if the target pixel belongs to the background area.
- If the target pixel belongs to the covered background area, the mixture ratio determination unit 403 sets the estimated mixture ratio supplied from the estimated mixture ratio processing unit 401 as the mixture ratio α; if the target pixel belongs to the uncovered background area, it sets the estimated mixture ratio supplied from the estimated mixture ratio processing unit 402 as the mixture ratio α.
- The mixture ratio determination unit 403 outputs the mixture ratio α set based on the area information.
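The selection performed by the mixture ratio determination unit 403 can be sketched as a simple switch over the area information. This is an illustrative sketch; the string labels are assumed names for the four areas:

```python
def determine_mixture_ratio(area, est_covered, est_uncovered):
    """Sketch of the mixture ratio determination unit 403.

    est_covered / est_uncovered: estimates supplied by the estimated
    mixture ratio processing units 401 and 402, respectively.
    """
    if area == "foreground":
        return 0.0                # foreground area: alpha = 0
    if area == "background":
        return 1.0                # background area: alpha = 1
    if area == "covered background":
        return est_covered        # estimate from unit 401
    if area == "uncovered background":
        return est_uncovered      # estimate from unit 402
    raise ValueError("unknown area label: %r" % (area,))
```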
- FIG. 38 is a block diagram showing another configuration of the mixture ratio calculation unit 104.
- Based on the area information supplied from the area specifying unit 103, the selection unit 441 supplies the pixels belonging to the covered background area and the corresponding pixels of the previous and subsequent frames to the estimated mixture ratio processing unit 442.
- Based on the area information supplied from the area specifying unit 103, the selection unit 441 supplies the pixels belonging to the uncovered background area and the corresponding pixels of the previous and subsequent frames to the estimated mixture ratio processing unit 443.
- Based on the pixel values input from the selection unit 441, the estimated mixture ratio processing unit 442 calculates the estimated mixture ratio of the pixel of interest belonging to the covered background area, and supplies the calculated estimated mixture ratio to the selection unit 444.
- Based on the pixel values input from the selection unit 441, the estimated mixture ratio processing unit 443 calculates the estimated mixture ratio of the pixel of interest belonging to the uncovered background area, and supplies the calculated estimated mixture ratio to the selection unit 444.
- Based on the area information supplied from the area specifying unit 103, the selection unit 444 selects an estimated mixture ratio of 0 and sets it as the mixture ratio α when the target pixel belongs to the foreground area, and selects an estimated mixture ratio of 1 and sets it as the mixture ratio α when the target pixel belongs to the background area. When the target pixel belongs to the covered background area, the selection unit 444 selects the estimated mixture ratio supplied from the estimated mixture ratio processing unit 442 and sets it as the mixture ratio α; when the target pixel belongs to the uncovered background area, it selects the estimated mixture ratio supplied from the estimated mixture ratio processing unit 443 and sets it as the mixture ratio α. The selection unit 444 outputs the mixture ratio α selected and set based on the area information.
- In this way, the mixture ratio calculation unit 104 having the other configuration shown in FIG. 38 can calculate the mixture ratio α for each pixel included in the image, and output the calculated mixture ratio α.
- step S401 the mixture ratio calculation unit 104 acquires the area information supplied from the area identification unit 103.
- In step S402, the estimated mixture ratio processing unit 401 executes the process of calculating the estimated mixture ratio using the model corresponding to the covered background area, and supplies the calculated estimated mixture ratio to the mixture ratio determination unit 403. The details of the process of calculating the estimated mixture ratio will be described later with reference to the flowchart of FIG. 40.
- In step S403, the estimated mixture ratio processing unit 402 executes the process of calculating the estimated mixture ratio using the model corresponding to the uncovered background area, and supplies the calculated estimated mixture ratio to the mixture ratio determination unit 403.
- In step S404, the mixture ratio calculation unit 104 determines whether or not the mixture ratio has been estimated for the entire frame. If it is determined that the mixture ratio has not been estimated for the entire frame, the procedure returns to step S402, and the process of estimating the mixture ratio for the next pixel is executed.
- If it is determined in step S404 that the mixture ratio has been estimated for the entire frame, the mixture ratio determination unit 403 sets the mixture ratio α based on the area information supplied from the area specifying unit 103, which indicates whether the target pixel belongs to the foreground area, the background area, the covered background area, or the uncovered background area. The mixture ratio determination unit 403 sets the mixture ratio α to 0 if the target pixel belongs to the foreground area, and to 1 if the target pixel belongs to the background area; if the target pixel belongs to the covered background area, it sets the estimated mixture ratio supplied from the estimated mixture ratio processing unit 401 as the mixture ratio α, and if the target pixel belongs to the uncovered background area, it sets the estimated mixture ratio supplied from the estimated mixture ratio processing unit 402 as the mixture ratio α, and the process ends.
- As described above, the mixture ratio calculation unit 104 can calculate the mixture ratio α, which is a feature amount corresponding to each pixel, based on the area information supplied from the area specifying unit 103 and the input image.
- In step S421, the mixture ratio calculation unit 423 acquires the pixel value C of the pixel of interest in frame #n from the frame memory 421.
- In step S422, the mixture ratio calculation unit 423 acquires, from the frame memory 422, the pixel value P of the pixel of frame #n-1 corresponding to the pixel of interest.
- In step S423, the mixture ratio calculation unit 423 acquires the pixel value N of the pixel of frame #n+1 corresponding to the pixel of interest, included in the input image.
- In step S424, the mixture ratio calculation unit 423 calculates the estimated mixture ratio based on the pixel value C of the pixel of interest in frame #n, the pixel value P of the pixel of frame #n-1, and the pixel value N of the pixel of frame #n+1.
- In step S425, the mixture ratio calculation unit 423 determines whether or not the process of calculating the estimated mixture ratio has been completed for the entire frame. If it is determined that the process has not been completed for the entire frame, the procedure returns to step S421, and the process of calculating the estimated mixture ratio for the next pixel is repeated.
- step S425 If it is determined in step S425 that the process of calculating the estimated mixture ratio has been completed for the entire frame, the process ends.
- the estimated mixture ratio processing unit 401 can calculate the estimated mixture ratio based on the input image.
- The process of estimating the mixture ratio by the model corresponding to the uncovered background area in step S403 of FIG. 39 is the same as the process shown in the flowchart of FIG. 40, using the equation corresponding to the model of the uncovered background area, so its description is omitted.
- The estimated mixture ratio processing unit 442 and the estimated mixture ratio processing unit 443 shown in FIG. 38 calculate the estimated mixture ratio by executing processing similar to that of the flowchart shown in FIG. 40, so their description is omitted.
- The above-described processing for calculating the mixture ratio α can be applied even when the image corresponding to the background area includes motion.
- When the image corresponding to the background area is moving uniformly, the estimated mixture ratio processing unit 401 shifts the entire image in accordance with the movement of the background, and performs the processing in the same manner as when the object corresponding to the background is stationary.
- When the image corresponding to the background area includes a different motion for each local region, the estimated mixture ratio processing unit 401 selects the pixel corresponding to the background motion as the pixel corresponding to the pixel belonging to the mixed region, and performs the above-described processing.
- The mixture ratio calculation unit 104 may execute only the mixture ratio estimation process using the model corresponding to the covered background area for all the pixels, and output the calculated estimated mixture ratio as the mixture ratio α.
- In this case, the mixture ratio α indicates the ratio of the background components for the pixels belonging to the covered background area, and indicates the ratio of the foreground components for the pixels belonging to the uncovered background area.
- If the absolute value of the difference between the mixture ratio α calculated in this way and 1 is calculated and set as the mixture ratio α, the image processing apparatus can obtain, for a pixel belonging to the uncovered background area, the mixture ratio α indicating the ratio of the background components.
- Similarly, the mixture ratio calculation unit 104 may execute only the mixture ratio estimation process using the model corresponding to the uncovered background area for all the pixels, and output the calculated estimated mixture ratio as the mixture ratio α.
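When only the covered-background model is run for every pixel, the estimate obtained for an uncovered-background pixel is the ratio of the foreground components; taking its complement yields the ratio of the background components. A hedged sketch (the conversion via the absolute difference from 1 is an inference from the property stated above, not a quoted implementation):

```python
def to_background_ratio(alpha_est, area):
    """alpha_est: estimate produced by the covered-background model only.

    For uncovered-background pixels that estimate is the foreground
    ratio, so |alpha_est - 1| gives the background ratio (inference
    from the property described in the text). Other areas pass through.
    """
    if area == "uncovered background":
        return abs(alpha_est - 1.0)
    return alpha_est
```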
- Utilizing the property that the mixture ratio α changes linearly in response to the change of the pixel position, because the object corresponding to the foreground moves at a constant speed within the shutter time, an equation that approximates the mixture ratio α and the sum f of the foreground components in the spatial direction is established.
- Using a plurality of pairs of the pixel values of the pixels belonging to the mixed area and the pixel values of the pixels belonging to the background area, the equation approximating the mixture ratio α and the sum f of the foreground components is solved.
- In equation (19), i is the index in the spatial direction with the position of the pixel of interest set to 0, l is the slope of the straight line of the mixture ratio α, and p is the intercept of the straight line of the mixture ratio α, which is also the mixture ratio α of the pixel of interest. In equation (19), the index i is known, while the slope l and the intercept p are unknown.
- FIG. 41 shows the relationship between the index i, the slope l, and the intercept p.
- In FIG. 41, a white circle indicates the pixel of interest, and black circles indicate pixels in its vicinity.
- When equation (19) is extended to a plane, the mixture ratio α is expressed by equation (20).
- a white circle indicates a target pixel.
- In equation (20), j is the index in the horizontal direction and k is the index in the vertical direction, with the position of the pixel of interest set to 0.
- m is the horizontal slope of the plane of the mixture ratio α.
- q is the vertical slope of the plane of the mixture ratio α.
- p is the intercept of the plane of the mixture ratio α.
- Expressions (21) to (23) hold for C05 to C07, respectively.
- In equations (21) to (23), x represents the position in the spatial direction.
- equation (24) can be expressed as equation (25).
- In equation (25), j is the horizontal index with the position of the pixel of interest set to 0, and k is the vertical index.
- The sum of the foreground components is given by equation (25).
- If the mixture ratio α and the sum of the foreground components in equation (9) are replaced using equations (20) and (25), the pixel value M is expressed by equation (30).
- In equation (30), the unknown variables are six: the horizontal slope m of the plane of the mixture ratio α, the vertical slope q of the plane of the mixture ratio α, and the intercepts p, s, t, and u.
- With the horizontal index j of the pixel of interest set to 0 and the vertical index k set to 0, when the pixel value M or the pixel value B is set in the normal equation of equation (30) for the 3 × 3 pixels in the vicinity of the pixel of interest, equations (31) to (39) are obtained.
- M(-1,+1) = (-1) · B(-1,+1) · m + (+1) · B(-1,+1) · q + B(-1,+1) · p + (-1) · s + (+1) · t + u (37)
- M(0,+1) = (0) · B(0,+1) · m + (+1) · B(0,+1) · q + B(0,+1) · p + (0) · s + (+1) · t + u (38)
- The least squares method is used to calculate the values of the horizontal slope m, the vertical slope q, and the intercepts p, s, t, and u, and the intercept p is output as the mixture ratio α.
- To apply the least squares method, the index j and the index k are represented by a single index x. The relationship among the index j, the index k, and the index x is expressed by equation (40).
- The horizontal slope m, the vertical slope q, and the intercepts p, s, t, and u are expressed as the variables w0, w1, w2, w3, w4, and w5, respectively, and jB, kB, B, j, k, and 1 are expressed as a0, a1, a2, a3, a4, and a5, respectively. Equations (31) to (39) can then be rewritten as equation (41).
- In equation (41), x is one of the integers from 0 to 8.
- From equation (41), equation (42) can be derived.
- To obtain the least squares solution, the partial derivative of the sum of squares of the errors with respect to each variable should be 0, as shown in equation (44).
- In equation (44), v is one of the integers from 0 to 5. Therefore, wv is calculated so as to satisfy equation (44).
- equation (45) is obtained.
- For example, the pixel values C04 to C08 of frame #n and the pixel values P04 to P08 of frame #n-1 are set in the normal equations.
- Mc1 = (-1) · Bc1 · m + (-1) · Bc1 · q + Bc1 · p + (-1) · s + (-1) · t + u (46)
- Mc2 = (0) · Bc2 · m + (-1) · Bc2 · q + Bc2 · p + (0) · s + (-1) · t + u (47)
- Mc3 = (+1) · Bc3 · m + (-1) · Bc3 · q + Bc3 · p + (+1) · s + (-1) · t + u (48)
- Mc4 = (-1) · Bc4 · m + (0) · Bc4 · q + Bc4 · p + (-1) · s + (0) · t + u (49)
- Mc5 = (0) · Bc5 · m + (0) · Bc5 · q + Bc5 · p + (0) · s + (0) · t + u (50)
- Mc6 = (+1) · Bc6 · m + (0) · Bc6 · q + Bc6 · p + (+1) · s + (0) · t + u (51)
- Mc7 = (-1) · Bc7 · m + (+1) · Bc7 · q + Bc7 · p + (-1) · s + (+1) · t + u (52)
- Mc8 = (0) · Bc8 · m + (+1) · Bc8 · q + Bc8 · p + (0) · s + (+1) · t + u (53)
- Mc9 = (+1) · Bc9 · m + (+1) · Bc9 · q + Bc9 · p + (+1) · s + (+1) · t + u (54)
- In these equations, the pixel values Bc1 to Bc9 of the pixels in the background area of frame #n-1 corresponding to the pixels of frame #n are used.
- For a pixel included in the uncovered background area, the following equations (55) to (63) are similarly established.
- The pixel value of the pixel for which the mixture ratio α is calculated is Mu5.
- Mu1 = (-1) · Bu1 · m + (-1) · Bu1 · q + Bu1 · p + (-1) · s + (-1) · t + u (55)
- Mu2 = (0) · Bu2 · m + (-1) · Bu2 · q + Bu2 · p + (0) · s + (-1) · t + u (56)
- Mu3 = (+1) · Bu3 · m + (-1) · Bu3 · q + Bu3 · p + (+1) · s + (-1) · t + u (57)
- Mu4 = (-1) · Bu4 · m + (0) · Bu4 · q + Bu4 · p + (-1) · s + (0) · t + u (58)
- Mu5 = (0) · Bu5 · m + (0) · Bu5 · q + Bu5 · p + (0) · s + (0) · t + u (59)
- Mu6 = (+1) · Bu6 · m + (0) · Bu6 · q + Bu6 · p + (+1) · s + (0) · t + u (60)
- Mu7 = (-1) · Bu7 · m + (+1) · Bu7 · q + Bu7 · p + (-1) · s + (+1) · t + u (61)
- Mu8 = (0) · Bu8 · m + (+1) · Bu8 · q + Bu8 · p + (0) · s + (+1) · t + u (62)
- Mu9 = (+1) · Bu9 · m + (+1) · Bu9 · q + Bu9 · p + (+1) · s + (+1) · t + u (63)
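The normal equations above are nine linear equations in the six unknowns (m, q, p, s, t, u) over a 3 × 3 neighborhood, and the intercept p is taken as the mixture ratio of the center pixel. A sketch using NumPy's least squares solver in place of the sweep-out method described later; the array layout (arrays indexed [k+1, j+1] for j, k in {-1, 0, +1}) and variable names are illustrative assumptions:

```python
import numpy as np

def estimate_alpha_plane(M, B):
    """M: 3x3 array of mixed-area pixel values (Mc1..Mc9 or Mu1..Mu9).
    B: 3x3 array of corresponding background pixel values (Bc/Bu).
    Returns the intercept p, i.e. the estimated mixture ratio of the
    center pixel."""
    rows, rhs = [], []
    for k in (-1, 0, 1):
        for j in (-1, 0, 1):
            b = float(B[k + 1, j + 1])
            # One row of the normal equations: jB*m + kB*q + B*p + j*s + k*t + u
            rows.append([j * b, k * b, b, j, k, 1.0])
            rhs.append(float(M[k + 1, j + 1]))
    w, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    m_, q_, p_, s_, t_, u_ = w
    return p_
```

Feeding in pixel values generated from known planes α(j, k) = jm + kq + p and f(j, k) = js + kt + u recovers p exactly, since the nine equations are then consistent.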
- FIG. 44 is a block diagram showing the configuration of the estimated mixture ratio processing unit 401.
- the image input to the estimated mixture ratio processing unit 401 is supplied to the delay circuit 501 and the adding unit 502.
- The delay circuit 501 delays the input image by one frame and supplies it to the adding unit 502.
- The delay circuit 501 supplies frame #n-1 to the adding unit 502.
- The adding unit 502 sets, in the normal equations, the pixel values of the pixels in the vicinity of the pixel for which the mixture ratio α is calculated, and the pixel values of frame #n-1. For example, the adding unit 502 sets the pixel values Mc1 to Mc9 and the pixel values Bc1 to Bc9 in the normal equations based on equations (46) to (54). The adding unit 502 supplies the normal equations in which the pixel values have been set to the calculation unit 503.
- The calculation unit 503 solves the normal equations supplied from the adding unit 502 by a method such as the sweep-out method (Gauss-Jordan elimination), obtains the estimated mixture ratio, and outputs the obtained estimated mixture ratio.
- the estimated mixture ratio processing unit 401 can calculate the estimated mixture ratio based on the input image and supply it to the mixture ratio determination unit 403.
- the estimated mixture ratio processing unit 402 has the same configuration as the estimated mixture ratio processing unit 401, and a description thereof will be omitted.
- FIG. 45 is a diagram illustrating an example of the estimated mixture ratio calculated by the estimated mixture ratio processing unit 401.
- The estimated mixture ratio shown in FIG. 45 shows, for one line, the result calculated by generating equations in units of 7 × 7 pixels, where the foreground motion amount v corresponding to an object moving at a constant speed is 11.
- In step S521, the adding unit 502 sets the pixel values included in the input image and the pixel values included in the image supplied from the delay circuit 501 in the normal equations corresponding to the model of the covered background area.
- In step S522, the estimated mixture ratio processing unit 401 determines whether or not the setting of the pixel values for the target pixels has been completed. If it is determined that the setting has not been completed, the procedure returns to step S521, and the process of setting the pixel values in the normal equations is repeated.
- If it is determined in step S522 that the setting of the pixel values for the target pixels has been completed, the process proceeds to step S523, and the calculation unit 503 calculates the estimated mixture ratio based on the normal equations in which the pixel values have been set, and outputs the obtained estimated mixture ratio.
- the estimated mixture ratio processing unit 401 shown in FIG. 44 can calculate the estimated mixture ratio based on the input image.
- The process of estimating the mixture ratio by the model corresponding to the uncovered background area is the same as the process shown in the flowchart of FIG. 46, using the normal equations corresponding to the model of the uncovered background area, so its description is omitted.
- the above-described processing for obtaining the mixture ratio can be applied even when the image corresponding to the background area includes motion.
- When the image corresponding to the background is moving uniformly, the estimated mixture ratio processing unit 401 shifts the entire image in accordance with this movement, and performs the processing in the same manner as when the object corresponding to the background is stationary.
- When the image corresponding to the background area includes a different motion for each local region, the estimated mixture ratio processing unit 401 selects the pixel corresponding to the motion as the pixel corresponding to the pixel belonging to the mixed region, and executes the above-described processing.
- As described above, the mixture ratio calculation unit 102 can calculate the mixture ratio α, which is a feature amount corresponding to each pixel, based on the area information supplied from the area specifying unit 101 and the input image.
- By using the mixture ratio α, it becomes possible to separate the foreground components and the background components included in the pixel values while retaining the information on the motion blur included in the image corresponding to the moving object.
- FIG. 47 is a block diagram illustrating an example of the configuration of the foreground / background separation unit 105.
- the input image supplied to the foreground / background separation unit 105 is supplied to the separation unit 601, switch 602, and switch 604.
- The area information supplied from the area specifying unit 103, indicating the covered background area and the uncovered background area, is supplied to the separation unit 601.
- the area information indicating the foreground area is supplied to the switch 602.
- Area information indicating the background area is supplied to the switch 604.
- The mixture ratio α supplied from the mixture ratio calculation unit 104 is supplied to the separation unit 601. The separation unit 601 separates the foreground components from the input image based on the area information indicating the covered background area, the area information indicating the uncovered background area, and the mixture ratio α, and supplies the separated foreground components to the synthesizing unit 603; it also separates the background components from the input image and supplies the separated background components to the synthesizing unit 605.
- The switch 602 is closed when a pixel corresponding to the foreground is input, based on the area information indicating the foreground area, and supplies only the pixels corresponding to the foreground included in the input image to the synthesizing unit 603.
- the switch 604 is closed when a pixel corresponding to the background is input, based on the area information indicating the background area, and supplies only the pixels corresponding to the background included in the input image to the synthesizing unit 605.
- the synthesizing unit 603 synthesizes a foreground component image based on the foreground components supplied from the separation unit 601 and the pixels corresponding to the foreground supplied from the switch 602, and outputs the synthesized foreground component image. Since the foreground area and the mixed area do not overlap, the synthesizing unit 603 synthesizes the foreground component image, for example, by applying a logical OR operation to the foreground components and the pixels corresponding to the foreground.
- in an initialization process executed at the beginning of the foreground component image synthesizing process, the synthesizing unit 603 stores an image in which all pixel values are 0 in a built-in frame memory, and in the foreground component image synthesizing process it stores (overwrites) the foreground component image. Therefore, in the foreground component image output from the synthesizing unit 603, 0 is stored as the pixel value of each pixel corresponding to the background area.
- the synthesizing unit 605 synthesizes a background component image based on the background components supplied from the separation unit 601 and the pixels corresponding to the background supplied from the switch 604, and outputs the synthesized background component image. Since the background area and the mixed area do not overlap, the synthesizing unit 605 synthesizes the background component image, for example, by applying a logical OR operation to the background components and the pixels corresponding to the background.
- in an initialization process executed at the beginning of the background component image synthesizing process, the synthesizing unit 605 stores an image in which all pixel values are 0 in a built-in frame memory, and in the background component image synthesizing process it stores (overwrites) the background component image. Accordingly, in the background component image output from the synthesizing unit 605, 0 is stored as the pixel value of each pixel corresponding to the foreground area.
- FIG. 48A and FIG. 48B are diagrams illustrating an input image input to the foreground/background separation unit 105, and the foreground component image and background component image output from the foreground/background separation unit 105.
- FIG. 48A is a schematic diagram of the displayed image, and FIG. 48B is a model diagram in which the pixels of one line, including pixels belonging to the foreground area, the background area, and the mixed area corresponding to FIG. 48A, are developed in the time direction.
- the background component image output from the foreground/background separation unit 105 is composed of the pixels belonging to the background area and the background components included in the pixels of the mixed area.
- the foreground component image output from the foreground/background separation unit 105 is composed of the pixels belonging to the foreground area and the foreground components included in the pixels of the mixed area.
- the pixel values of the pixels in the mixed area are separated into the background components and the foreground components by the foreground/background separation unit 105.
- the separated background component forms a background component image together with the pixels belonging to the background region.
- the separated foreground component forms a foreground component image together with the pixels belonging to the foreground area.
- in the foreground component image, the pixel values of the pixels corresponding to the background area are set to 0, and meaningful pixel values are set in the pixels corresponding to the foreground area and the pixels corresponding to the mixed area.
- similarly, in the background component image, the pixel values of the pixels corresponding to the foreground area are set to 0, and meaningful pixel values are set in the pixels corresponding to the background area and the pixels corresponding to the mixed area.
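A minimal sketch of how the two component images described above can be assembled (the 1-D pixel values, region labels, and helper names are hypothetical): pixels outside each image's own regions stay at 0.

```python
# Sketch: building foreground and background component images from region
# labels (hypothetical 1-D example). "fg", "bg", "mix" mark foreground,
# background, and mixed regions; mixed pixels are split with the mixture
# ratio alpha as in the text: C = alpha*B + f.

def build_component_images(pixels, labels, alphas, background_ref):
    fg_image = [0.0] * len(pixels)  # background-region pixels stay 0
    bg_image = [0.0] * len(pixels)  # foreground-region pixels stay 0
    for i, (c, lab) in enumerate(zip(pixels, labels)):
        if lab == "fg":
            fg_image[i] = c
        elif lab == "bg":
            bg_image[i] = c
        else:  # mixed area: background part is alpha*B, foreground part is the rest
            bg_image[i] = alphas[i] * background_ref[i]
            fg_image[i] = c - bg_image[i]
    return fg_image, bg_image

pixels = [100.0, 70.0, 80.0, 100.0]
labels = ["bg", "mix", "fg", "bg"]
alphas = [None, 0.5, None, None]
background_ref = [100.0, 100.0, 100.0, 100.0]  # corresponding background values (assumed)
fg, bg = build_component_images(pixels, labels, alphas, background_ref)
print(fg)  # [0.0, 20.0, 80.0, 0.0]
print(bg)  # [100.0, 50.0, 0.0, 100.0]
```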
- FIG. 49 is a model of an image showing foreground components and background components of two frames including a foreground corresponding to an object moving from left to right in the figure.
- the motion amount V of the foreground is 4, and the number of virtual divisions is 4.
- in frame #n, the leftmost pixel and the 14th to 18th pixels from the left consist only of background components and belong to the background area.
- in frame #n, the second through fourth pixels from the left include background components and foreground components, and belong to the uncovered background area.
- in frame #n, the 11th to 13th pixels from the left include background components and foreground components, and belong to the covered background area.
- the fifth through tenth pixels from the left consist only of foreground components and belong to the foreground area.
- in frame #n+1, the first through fifth pixels from the left and the eighteenth pixel from the left consist only of background components and belong to the background area.
- the sixth through eighth pixels from the left include the background component and the foreground component and belong to the uncovered background area.
- the fifteenth through seventeenth pixels from the left include the background component and the foreground component, and belong to the covered background area.
- the ninth through 14th pixels from the left consist only of foreground components and belong to the foreground area.
- FIG. 50 is a diagram illustrating a process of separating a foreground component from pixels belonging to a covered background area.
- α1 to α18 are the mixture ratios corresponding to the respective pixels in frame #n.
- the fifteenth through seventeenth pixels from the left belong to the covered background area.
- α15 is the mixture ratio of the 15th pixel from the left of frame #n.
- P15 is the pixel value of the 15th pixel from the left of frame #n-1.
- the sum f15 of the foreground components of the 15th pixel from the left of frame #n is expressed by equation (65).
- the foreground component fc included in the pixel value C of the pixel belonging to the covered background area is calculated by Expression (68).
- P is the pixel value of the corresponding pixel in the previous frame.
- FIG. 51 is a diagram illustrating a process of separating a foreground component from pixels belonging to an uncovered background area.
- α1 to α18 are the mixture ratios corresponding to the respective pixels in frame #n.
- the second to fourth pixels from the left belong to the uncovered background area.
- the pixel value C02 of the second pixel from the left of frame #n is represented by Expression (69).
- α2 is the mixture ratio of the second pixel from the left of frame #n.
- N02 is the pixel value of the second pixel from the left of frame # n + l.
- the sum f02 of the foreground components of the second pixel from the left of frame #n is expressed by equation (70).
- the foreground component fu included in the pixel value C of the pixel belonging to the uncovered background area is calculated by Expression (73).
- N is the pixel value of the corresponding pixel in the next frame.
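The two separation rules can be summarized in one short sketch (the pixel values are hypothetical): for a covered background pixel the background is estimated from the previous frame, f = C − α·P (equation (68)); for an uncovered background pixel, from the next frame, f = C − α·N (equation (73)).

```python
# Sketch: foreground-component extraction for mixed-area pixels, following
# equations (68) and (73) as described in the text. P is the co-located pixel
# value in the previous frame, N in the next frame (example values assumed).

def foreground_covered(c, alpha, p):
    """f = C - alpha*P for a covered background pixel."""
    return c - alpha * p

def foreground_uncovered(c, alpha, n):
    """f = C - alpha*N for an uncovered background pixel."""
    return c - alpha * n

# Covered: the background is still visible in the previous frame.
print(foreground_covered(c=110.0, alpha=0.5, p=100.0))    # 60.0
# Uncovered: the background becomes visible in the next frame.
print(foreground_uncovered(c=95.0, alpha=0.25, n=100.0))  # 70.0
```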
- as described above, the separation unit 601 can separate the foreground components and the background components from the pixels belonging to the mixed area, based on the information indicating the covered background area and the information indicating the uncovered background area included in the area information, and on the mixture ratio α for each pixel.
- FIG. 52 is a block diagram illustrating an example of the configuration of the separation unit 601 that performs the processing described above.
- the image input to the separation unit 601 is supplied to the frame memory 621, and the area information indicating the covered background area and the uncovered background area, as well as the mixture ratio α, are input to the separation processing block 622.
- the frame memory 621 stores the input image in frame units.
- when frame #n is the target of processing, the frame memory 621 stores frame #n-1, which is the frame immediately before frame #n, frame #n, and frame #n+1, which is the frame following frame #n.
- the frame memory 621 supplies the corresponding pixels of frame #n-1, frame #n, and frame #n+1 to the separation processing block 622.
- based on the area information indicating the covered background area and the uncovered background area and on the mixture ratio α, the separation processing block 622 applies the operations described with reference to FIGS. 50 and 51 to the pixel values of the corresponding pixels of frame #n-1, frame #n, and frame #n+1 supplied from the frame memory 621, thereby separating the foreground components and the background components of the pixels belonging to the mixed area of frame #n, and supplies them to the frame memory 623.
- the separation processing block 622 is composed of an uncovered area processing unit 631, a covered area processing unit 632, a synthesizing unit 633, and a synthesizing unit 634.
- the multiplier 641 of the uncovered area processing unit 631 multiplies the pixel value of the pixel of frame #n+1 supplied from the frame memory 621 by the mixture ratio α, and outputs the result to the switch 642.
- the switch 642 is closed when the pixel of frame #n supplied from the frame memory 621 (corresponding to the pixel of frame #n+1) belongs to the uncovered background area, and supplies the pixel value multiplied by the mixture ratio α supplied from the multiplier 641 to the computing unit 643 and the synthesizing unit 634.
- the value obtained by multiplying the pixel value of the pixel of frame #n+1 output from the switch 642 by the mixture ratio α is equal to the background component of the pixel value of the corresponding pixel of frame #n.
- Arithmetic unit 643 subtracts the background component supplied from switch 642 from the pixel value of the pixel of frame #n supplied from frame memory 621, and obtains the foreground component.
- the arithmetic unit 643 supplies the foreground components of the pixels of frame #n belonging to the uncovered background area to the synthesizing unit 633.
- the multiplier 651 of the covered area processing unit 632 multiplies the pixel value of the pixel of frame #n-1 supplied from the frame memory 621 by the mixture ratio α, and outputs the result to the switch 652.
- the switch 652 is closed when the pixel of frame #n supplied from the frame memory 621 (corresponding to the pixel of frame #n-1) belongs to the covered background area, and supplies the pixel value multiplied by the mixture ratio α supplied from the multiplier 651 to the computing unit 653 and the synthesizing unit 634.
- the value obtained by multiplying the pixel value of the pixel of frame # n-1 output from the switch 652 by the mixture ratio is equal to the background component of the pixel value of the corresponding pixel of frame #n.
- Arithmetic unit 653 subtracts the background component supplied from switch 652 from the pixel value of the pixel of frame #n supplied from frame memory 621, to obtain the foreground component.
- the arithmetic unit 653 supplies the foreground component of the pixel of frame #n belonging to the covered background area to the combining unit 633.
- for frame #n, the synthesizing unit 633 combines the foreground components of the pixels belonging to the uncovered background area supplied from the computing unit 643 with the foreground components of the pixels belonging to the covered background area supplied from the computing unit 653, and supplies the result to the frame memory 623.
- for frame #n, the synthesizing unit 634 combines the background components of the pixels belonging to the uncovered background area supplied from the switch 642 with the background components of the pixels belonging to the covered background area supplied from the switch 652, and supplies the result to the frame memory 623.
- the frame memory 623 stores the foreground component and the background component of the pixel in the mixed area of the frame #n supplied from the separation processing block 622, respectively.
- the frame memory 623 outputs the stored foreground components of the pixels in the mixed region of the frame #n and the stored background components of the pixels in the mixed region of the frame #n.
- in this way, the foreground components and the background components contained in the pixel values of the pixels in the mixed area can be completely separated by using the mixture ratio α, which is a feature amount.
- the synthesizing unit 603 combines the foreground components of the pixels in the mixed area of frame #n output from the separation unit 601 with the pixels belonging to the foreground area to generate a foreground component image.
- the synthesizing unit 605 combines the background components of the pixels in the mixed area of frame #n output from the separation unit 601 with the pixels belonging to the background area to generate a background component image.
- FIG. 53A shows an example of a foreground component image corresponding to frame #n of FIG. 49. Since the leftmost pixel and the 14th through 18th pixels from the left consisted only of background components before the foreground and background were separated, their pixel values are set to 0.
- the second to fourth pixels from the left belonged to the uncovered background area before the foreground and background were separated, so their background components are set to 0 and their foreground components are left as they are.
- the 11th to 13th pixels from the left belonged to the covered background area before the foreground and background were separated, so their background components are set to 0 and their foreground components are left as they are.
- the fifth to tenth pixels from the left are left as they are because they consist only of foreground components.
- FIG. 53B shows an example of a background component image corresponding to the frame of FIG. 49.
- the leftmost pixel and the 14th through 18th pixels from the left are left as they are, because they consisted only of background components before the foreground and background were separated.
- the second to fourth pixels from the left belonged to the uncovered background area before the foreground and background were separated, so their foreground components are set to 0 and their background components are left as they are.
- the 11th to 13th pixels from the left belonged to the covered background area before the foreground and background were separated, so their foreground components are set to 0 and their background components are left as they are. Since the fifth through tenth pixels from the left consisted only of foreground components before the foreground and background were separated, their pixel values are set to 0.
- in step S601, the frame memory 621 of the separation unit 601 acquires the input image and stores frame #n, which is the target of foreground/background separation, together with the preceding frame #n-1 and the following frame #n+1.
- in step S602, the separation processing block 622 of the separation unit 601 acquires the area information supplied from the area specifying unit 103.
- in step S603, the separation processing block 622 of the separation unit 601 acquires the mixture ratio α supplied from the mixture ratio calculation unit 104.
- in step S604, the uncovered area processing unit 631 extracts the background components from the pixel values of the pixels belonging to the uncovered background area supplied from the frame memory 621, based on the area information and the mixture ratio α.
- in step S605, the uncovered area processing unit 631 extracts the foreground components from the pixel values of the pixels belonging to the uncovered background area supplied from the frame memory 621, based on the area information and the mixture ratio α.
- in step S606, the covered area processing unit 632 extracts the background components from the pixel values of the pixels belonging to the covered background area supplied from the frame memory 621, based on the area information and the mixture ratio α.
- in step S607, the covered area processing unit 632 extracts the foreground components from the pixel values of the pixels belonging to the covered background area supplied from the frame memory 621, based on the area information and the mixture ratio α.
- in step S608, the synthesizing unit 633 combines the foreground components of the pixels belonging to the uncovered background area extracted in step S605 with the foreground components of the pixels belonging to the covered background area extracted in step S607.
- the synthesized foreground components are supplied to the synthesizing unit 603. Further, the synthesizing unit 603 combines the pixels belonging to the foreground area supplied via the switch 602 with the foreground components supplied from the separation unit 601 to generate a foreground component image.
- in step S609, the synthesizing unit 634 combines the background components of the pixels belonging to the uncovered background area extracted in step S604 with the background components of the pixels belonging to the covered background area extracted in step S606.
- the synthesized background components are supplied to the synthesizing unit 605. Further, the synthesizing unit 605 combines the pixels belonging to the background area supplied via the switch 604 with the background components supplied from the separation unit 601 to generate a background component image.
- in step S610, the synthesizing unit 603 outputs the foreground component image.
- in step S611, the synthesizing unit 605 outputs the background component image, and the process ends.
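The flow of steps S601 through S611 can be sketched procedurally for a single scanline (a simplified model with hypothetical data; the frame memory, switches, and synthesizing units are folded into one plain function):

```python
# Sketch of the separation flow S601-S611 on one scanline (hypothetical data).
# Mixed pixels are split against the previous frame (covered background) or
# the next frame (uncovered background); both synthesized images are returned.

def separate_foreground_background(prev, cur, nxt, labels, alphas):
    fg_img = [0.0] * len(cur)
    bg_img = [0.0] * len(cur)
    for i, lab in enumerate(labels):
        if lab == "fg":                       # foreground area: copy the pixel
            fg_img[i] = cur[i]
        elif lab == "bg":                     # background area: copy the pixel
            bg_img[i] = cur[i]
        elif lab == "covered":                # S606/S607: use the previous frame
            bg_img[i] = alphas[i] * prev[i]
            fg_img[i] = cur[i] - bg_img[i]
        elif lab == "uncovered":              # S604/S605: use the next frame
            bg_img[i] = alphas[i] * nxt[i]
            fg_img[i] = cur[i] - bg_img[i]
    return fg_img, bg_img                     # S610/S611: output both images

prev = [100.0, 100.0, 100.0, 100.0]
cur  = [100.0,  75.0,  60.0,  90.0]
nxt  = [100.0, 100.0, 100.0, 100.0]
labels = ["bg", "covered", "fg", "uncovered"]
alphas = [None, 0.5, None, 0.2]
fg, bg = separate_foreground_background(prev, cur, nxt, labels, alphas)
print(fg)  # [0.0, 25.0, 60.0, 70.0]
print(bg)  # [100.0, 50.0, 0.0, 20.0]
```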
- as described above, the foreground/background separation unit 105 can separate the foreground components and the background components from the input image based on the area information and the mixture ratio α, and output a foreground component image consisting only of the foreground components and a background component image consisting only of the background components. Next, adjustment of the amount of motion blur of the foreground component image will be described.
- FIG. 55 is a block diagram illustrating an example of the configuration of the motion blur adjustment unit 106.
- the motion vector and its position information supplied from the motion detecting unit 102 are supplied to the processing unit determining unit 801, the modeling unit 802, and the arithmetic unit 805.
- the area information supplied from the area specifying unit 103 is supplied to the processing unit determining unit 801.
- the foreground component image supplied from the foreground / background separation unit 105 is supplied to the adding unit 804.
- the processing unit determination unit 801 generates a processing unit based on the motion vector, its position information, and the area information, and supplies the generated processing unit to the modeling unit 802 and the adding unit 804.
- as shown by example A in FIG. 56, the processing unit generated by the processing unit determination unit 801 indicates consecutive pixels arranged in the motion direction, starting from the pixel corresponding to the covered background area of the foreground component image up to the pixel corresponding to the uncovered background area, or starting from the pixel corresponding to the uncovered background area up to the pixel corresponding to the covered background area.
- the processing unit is composed of, for example, two pieces of data: the upper-left point (the position of the leftmost or uppermost pixel on the image among the pixels specified by the processing unit) and the lower-right point.
- the modeling unit 802 performs modeling based on the motion vector and the input processing unit. More specifically, for example, the modeling unit 802 stores in advance a plurality of models corresponding to the number of pixels included in a processing unit, the number of virtual divisions of the pixel value in the time direction, and the number of foreground components for each pixel, and selects a model that specifies the correspondence between pixel values and foreground components, as shown in FIG. 57, based on the processing unit and the number of virtual divisions of the pixel value in the time direction.
- for example, when the number of pixels corresponding to the processing unit is 12 and the motion amount v is 5, the modeling unit 802 sets the number of virtual divisions to 5 and selects a model composed of eight foreground components in total, in which the leftmost pixel contains one foreground component, the second pixel from the left contains two foreground components, the third pixel from the left contains three foreground components, the fourth pixel from the left contains four foreground components, the fifth pixel from the left contains five foreground components, the sixth pixel from the left contains five foreground components, the seventh pixel from the left contains five foreground components, the eighth pixel from the left contains five foreground components, the ninth pixel from the left contains four foreground components, the tenth pixel from the left contains three foreground components, the 11th pixel from the left contains two foreground components, and the 12th pixel from the left contains one foreground component.
- instead of selecting a model from those stored in advance, the modeling unit 802 may generate a model based on the motion vector and the processing unit when the motion vector and the processing unit are supplied.
- the modeling unit 802 supplies the selected model to the equation generation unit 803.
- the equation generation unit 803 generates equations based on the model supplied from the modeling unit 802. Referring to the model of the foreground component image shown in FIG. 57, the equations generated by the equation generation unit 803 when the number of foreground components is 8, the number of pixels corresponding to the processing unit is 12, the motion amount v is 5, and the number of virtual divisions is 5 will now be described.
- when the foreground components corresponding to the shutter time /v included in the foreground component image are F01/v to F08/v, the relationships between F01/v to F08/v and the pixel values C01 to C12 are expressed by equations (74) through (85).
- C06 = F06/v + F05/v + F04/v + F03/v + F02/v (79)
- C07 = F07/v + F06/v + F05/v + F04/v + F03/v (80)
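The pixel-value model of equations (74) through (85), in which each pixel value Cj is the sum of the motion-shifted foreground components divided by the number of virtual divisions, can be sketched as a forward model (8 components, 12 pixels, motion amount v = 5 as in the text; the component values themselves are hypothetical):

```python
# Sketch: generating motion-blurred pixel values C01..C12 from foreground
# components F01..F08 with motion amount v = 5, mirroring equations (74)-(85).
# Component i covers pixels i..i+v-1, and each covered pixel accumulates Fi/v.

def blur_pixels(F, v, n_pixels):
    C = [0.0] * n_pixels
    for i, Fi in enumerate(F):
        for j in range(i, i + v):
            if j < n_pixels:
                C[j] += Fi / v
    return C

F = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0]  # assumed values
C = blur_pixels(F, v=5, n_pixels=12)
# C06 (index 5) = (F02+F03+F04+F05+F06)/v, matching equation (79).
print(C[5])  # 40.0
```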
- Equations (86) to (97) show the equations generated by the equation generator 803.
- C05 = 1·F01/v + 1·F02/v + 1·F03/v + 1·F04/v + 1·F05/v + 0·F06/v + 0·F07/v + 0·F08/v
- C06 = 0·F01/v + 1·F02/v + 1·F03/v + 1·F04/v + 1·F05/v + 1·F06/v + 0·F07/v + 0·F08/v
- Equations (86) to (97) can also be expressed as Equation (98).
- j indicates the position of the pixel.
- j has a value of any one of 1 to 12.
- i indicates the position of the foreground component.
- i has a value of any one of 1 to 8.
- aij has a value of 0 or 1 corresponding to the values of i and j.
- equation (98) can be expressed as equation (99).
- ej is an error included in the target pixel Cj.
- Equation (99) can be rewritten as equation (100).
- in equation (102), since the motion amount v is a fixed value, equation (103) can be derived.
- in this way, the errors contained in the pixel values C can be distributed.
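The normal-equation step amounts to a least-squares solve of C = A·F/v with 0/1 coefficients aij. The following sketch uses numpy's least-squares routine in place of the sweep-out method (the observed pixel values are generated from hypothetical component values so the recovery can be checked):

```python
import numpy as np

# Sketch: recovering foreground components Fi from blurred pixel values Cj by
# least squares, i.e. solving C_j = sum_i a_ij * F_i / v with a_ij in {0, 1}.
# np.linalg.lstsq stands in for the sweep-out (Gauss-Jordan) method.

v, n_comp, n_pix = 5, 8, 12
A = np.zeros((n_pix, n_comp))
for i in range(n_comp):            # a_ij = 1/v where component i covers pixel j
    A[i:i + v, i] = 1.0 / v

F_true = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])  # assumed
C = A @ F_true                     # motion-blurred observations

F_est, *_ = np.linalg.lstsq(A, C, rcond=None)
print(np.round(F_est, 6))          # recovers F_true
```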
- the equation generating section 803 supplies the normal equation generated in this way to the adding section 804.
- the adding unit 804 sets the pixel values C included in the foreground component image in the matrix equation supplied from the equation generation unit 803, based on the processing unit supplied from the processing unit determination unit 801.
- the adding unit 804 supplies the matrix in which the pixel values C are set to the arithmetic unit 805.
- the arithmetic unit 805 calculates the foreground components Fi/v from which motion blur has been removed by processing based on a solution method such as the sweep-out method (Gauss-Jordan elimination), calculates Fi, the pixel values of the foreground from which motion blur has been removed (i being any one of the integers 1 through 8 indicating the position of the foreground component), and outputs the foreground component image consisting of the pixel values Fi free of motion blur, such as that shown in FIG. 58, to the motion blur adding unit 806 and the selecting unit 807.
- note that, in the foreground component image from which motion blur has been removed shown in FIG. 58, each of F01 to F08 is set for each of C03 to C10 so as not to change the position of the foreground component image with respect to the screen; they can correspond to an arbitrary position.
- the motion blur adding unit 806 can adjust the amount of motion blur with a motion blur adjustment amount v′ whose value differs from the motion amount v, for example, a motion blur adjustment amount v′ that is half the value of the motion amount v, or a motion blur adjustment amount v′ unrelated to the motion amount v.
- for example, as shown in FIG. 59, the motion blur adding unit 806 divides the foreground pixel values Fi from which motion blur has been removed by the motion blur adjustment amount v′ to calculate the foreground components Fi/v′, and computes the sums of the foreground components Fi/v′, thereby generating pixel values in which the amount of motion blur is adjusted.
- for example, when the motion blur adjustment amount v′ is 3, the pixel value C02 is set to (F01)/v′, the pixel value C03 is set to (F01+F02)/v′, the pixel value C04 is set to (F01+F02+F03)/v′, and the pixel value C05 is set to (F02+F03+F04)/v′.
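The re-blurring computation can be sketched directly (component values hypothetical; v′ = 3, matching the pattern of the example): each output pixel is the sum of up to v′ consecutive foreground components divided by v′.

```python
# Sketch: re-adding motion blur with an adjustment amount v' = 3, following
# the example in the text: C02 = F01/v', C03 = (F01+F02)/v',
# C04 = (F01+F02+F03)/v', C05 = (F02+F03+F04)/v'.

def add_motion_blur(F, v_adj, n_pixels, offset=1):
    C = [0.0] * n_pixels
    for i, Fi in enumerate(F):                 # component i covers v' pixels
        for j in range(offset + i, offset + i + v_adj):
            if j < n_pixels:
                C[j] += Fi / v_adj
    return C

F = [30.0, 60.0, 90.0, 120.0]  # assumed motion-blur-free foreground values
C = add_motion_blur(F, v_adj=3, n_pixels=8)
print(C[1])  # C02 = F01/v' = 10.0
print(C[4])  # C05 = (F02+F03+F04)/v' = 90.0
```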
- the motion blur adding unit 806 supplies the foreground component image in which the amount of motion blur has been adjusted to the selecting unit 807.
- based on a selection signal corresponding to the user's choice, the selection unit 807 selects one of the foreground component image from which motion blur has been removed, supplied from the calculation unit 805, and the foreground component image in which the amount of motion blur has been adjusted, supplied from the motion blur adding unit 806, and outputs the selected foreground component image.
- in this way, the motion blur adjustment unit 106 can adjust the amount of motion blur based on the selection signal and the motion blur adjustment amount v′.
- further, for example, the motion blur adjustment unit 106 generates the matrix equation shown in equation (106).
- the motion-blur adjusting unit 106 sets an expression corresponding to the length of the processing unit in this way, and calculates Fi, which is a pixel value in which the amount of motion blur has been adjusted. Similarly, for example, when the number of pixels included in the processing unit is 100, an equation corresponding to 100 pixels is generated, and Fi is calculated.
- FIG. 61 is a diagram illustrating another configuration of the motion-blur adjusting unit 106.
- the same parts as those in the case shown in FIG. 55 are denoted by the same reference numerals, and description thereof will be omitted.
- the selection unit 821 supplies the input motion vector and its position signal as they are to the processing unit determination unit 801 and the modeling unit 802, or replaces the magnitude of the motion vector with the motion blur adjustment amount v′ and supplies the motion vector whose magnitude has been replaced by the motion blur adjustment amount v′, together with its position signal, to the processing unit determination unit 801 and the modeling unit 802.
- with this configuration, the processing unit determination unit 801 through the calculation unit 805 of the motion blur adjustment unit 106 in FIG. 61 can adjust the amount of motion blur in accordance with the values of the motion amount v and the motion blur adjustment amount v′.
- note, however, that the meaning of the result output from the motion blur adding unit 806 differs depending on the relationship between the motion amount v and the motion blur adjustment amount v′.
- as described above, the motion blur adjustment unit 106 generates equations corresponding to the motion amount v and the processing unit, sets the pixel values of the foreground component image in the generated equations, and calculates a foreground component image in which the amount of motion blur is adjusted.
- in step S801, the processing unit determination unit 801 of the motion blur adjustment unit 106 generates a processing unit based on the motion vector and the area information, and supplies the generated processing unit to the modeling unit 802.
- in step S802, the modeling unit 802 of the motion blur adjustment unit 106 selects or generates a model in accordance with the motion amount v and the processing unit.
- in step S803, the equation generation unit 803 creates a normal equation based on the selected or generated model.
- in step S804, the adding unit 804 sets the pixel values of the foreground component image in the created normal equation.
- in step S805, the adding unit 804 determines whether the pixel values of all the pixels corresponding to the processing unit have been set; if it determines that they have not all been set, the process returns to step S804, and the process of setting pixel values in the normal equation is repeated.
- if it is determined in step S805 that the pixel values of all the pixels corresponding to the processing unit have been set, the process proceeds to step S806, where the arithmetic unit 805 calculates the pixel values of the foreground in which the amount of motion blur has been adjusted, based on the normal equation supplied from the adding unit 804 in which the pixel values are set, and the process ends.
- the motion blur adjustment unit 106 can adjust the amount of motion blur from the foreground image including the motion blur based on the motion vector and the area information.
- FIG. 63 is a block diagram illustrating another example of the configuration of the motion-blur adjusting unit 106.
- the motion vector and its position information supplied from the motion detection unit 102 are supplied to the processing unit determination unit 901 and the correction unit 905, and the region information supplied from the region identification unit 103 is The processing unit is supplied to the processing unit determining unit 901.
- the foreground component image supplied from the foreground / background separator 105 is supplied to the calculator 904.
- the processing unit determination unit 901 generates a processing unit based on the motion vector, its position information, and area information, and supplies the generated processing unit to the modeling unit 902 together with the motion vector.
- the modeling unit 902 performs modeling based on the motion vector and the input processing unit. More specifically, for example, the modeling unit 902 stores in advance a plurality of models corresponding to the number of pixels included in a processing unit, the number of virtual divisions of the pixel value in the time direction, and the number of foreground components for each pixel, and selects a model that specifies the correspondence between pixel values and foreground components, as shown in FIG. 64, based on the processing unit and the number of virtual divisions of the pixel value in the time direction.
- for example, when the number of pixels corresponding to the processing unit is 12 and the motion amount v is 5, the modeling unit 902 sets the number of virtual divisions to 5 and selects a model composed of eight foreground components in total, in which the leftmost pixel contains one foreground component, the second pixel from the left contains two foreground components, the third pixel from the left contains three foreground components, the fourth pixel from the left contains four foreground components, the fifth through eighth pixels from the left each contain five foreground components, the ninth pixel from the left contains four foreground components, the tenth pixel from the left contains three foreground components, the 11th pixel from the left contains two foreground components, and the 12th pixel from the left contains one foreground component.
- Alternatively, instead of selecting a model from the models stored in advance, the modeling unit 902 may generate a model based on the motion vector and the processing unit when they are supplied.
- the equation generator 903 generates equations based on the model supplied from the modeling unit 902.
- As shown in equation (107), the pixel value C12 includes only the foreground component F08/v, and the pixel value C11 consists of the sum of the foreground component F08/v and the foreground component F07/v. Therefore, the foreground component F07/v can be obtained by equation (108).
- Similarly, the foreground components F06/v to F01/v can be obtained by equations (109) to (114).
- the equation generation unit 903 generates equations for calculating the foreground components from the differences between pixel values, as shown in equations (107) to (114).
- the equation generator 903 supplies the generated equation to the calculator 904.
- the calculation unit 904 sets the pixel values of the foreground component image into the equations supplied from the equation generation unit 903, and calculates the foreground components based on the equations in which the pixel values have been set. For example, when equations (107) to (114) are supplied from the equation generator 903, the calculation unit 904 sets the pixel values C05 to C12 into equations (107) to (114).
- the calculation unit 904 calculates the foreground components based on the equations in which the pixel values have been set. For example, by performing calculations based on equations (107) to (114) in which the pixel values C05 to C12 have been set, the calculation unit 904 calculates the foreground components F01/v to F08/v, as shown in FIG. 65. The calculation unit 904 supplies the foreground components F01/v to F08/v to the correction unit 905.
- the correction unit 905 removes motion blur by multiplying each foreground component supplied from the calculation unit 904 by the motion amount v included in the motion vector supplied from the processing unit determination unit 901, thereby calculating the foreground pixel values. For example, when the foreground components F01/v to F08/v are supplied from the calculation unit 904, the correction unit 905 multiplies each of the foreground components F01/v to F08/v by the motion amount v of 5, thereby calculating the foreground pixel values F01 to F08 from which motion blur has been removed, as shown in FIG.
- the correcting unit 905 supplies the foreground component image composed of the pixel values of the foreground, from which the motion blur has been removed, calculated as described above, to the motion blur adding unit 906 and the selecting unit 907.
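The blur-removal computation described above can be sketched as follows. This is a minimal Python illustration of the 12-pixel, motion-amount-5 model, not the patent's implementation; the function names are hypothetical, and the pairwise differences of equations (107) to (114) are realized here as a general back-substitution starting from the rightmost pixel.

```python
def blur(F, v):
    # Model of Fig. 64: pixel p (1-indexed, n + v - 1 pixels total)
    # accumulates foreground components F[p - s] for shutter phases
    # s = 0 .. v - 1, each weighted 1/v.
    n = len(F)
    return [sum(F[p - s - 1] for s in range(v) if 1 <= p - s <= n) / v
            for p in range(1, n + v)]

def remove_motion_blur(C, n, v):
    # Recover F01 .. F0n from the blurred row C. The rightmost pixel
    # contains a single foreground component (equation (107)); each
    # remaining component follows by subtracting already-known
    # components, then multiplying by v (the correction-unit step).
    Fv = [0.0] * (n + 1)                  # Fv[k] holds Fk / v, 1-indexed
    for k in range(n, 0, -1):
        p = k + v - 1                     # pixel whose leftmost component is Fk/v
        known = sum(Fv[j] for j in range(k + 1, min(n, p) + 1))
        Fv[k] = C[p - 1] - known          # difference of pixel values
    return [Fv[k] * v for k in range(1, n + 1)]

F = [10, 20, 30, 40, 50, 60, 70, 80]      # true foreground values F01..F08
C = blur(F, 5)                            # blurred pixels C01..C12
print(remove_motion_blur(C, 8, 5))        # → [10.0, 20.0, ..., 80.0]
```

Because the rightmost pixel holds a single foreground component, each further component costs only one subtraction, which is why the patent can describe equations (107) to (114) as simple differences of pixel values.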
- the motion-blur adding unit 906 can adjust the amount of motion blur with a motion-blur adjustment amount v' having a value different from the motion amount v, for example, a value half the motion amount v, or a value unrelated to the motion amount v. For example, as shown in FIG., the motion-blur adding unit 906 calculates the foreground component Fi/v' by dividing the foreground pixel value Fi, from which motion blur has been removed, by the motion-blur adjustment amount v', calculates the sum of the foreground components Fi/v', and thereby generates a pixel value in which the amount of motion blur has been adjusted. For example, when the motion-blur adjustment amount v' is 3, the pixel value C02 is (F01)/v', the pixel value C03 is (F01+F02)/v', the pixel value C04 is (F01+F02+F03)/v', and the pixel value C05 is (F02+F03+F04)/v'.
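The adjustment by v' can likewise be sketched. This is a hypothetical Python illustration of the quoted example (v' = 3, with one pixel of offset chosen so that C02 = (F01)/v' as in the text), not the patent's implementation:

```python
def add_motion_blur(F, v_adj):
    # Divide each deblurred foreground value Fi by the adjustment amount v'
    # and sum v' consecutive components per pixel; the one-pixel offset
    # reproduces the text's example, where C02 contains only F01/v'.
    n = len(F)
    out = []
    for p in range(1, n + v_adj + 1):
        comps = [F[p - s - 2] / v_adj
                 for s in range(v_adj) if 1 <= p - s - 1 <= n]
        out.append(sum(comps))
    return out

F = [30, 60, 90, 120, 150, 180, 210, 240]   # F01 .. F08, blur removed
C = add_motion_blur(F, 3)
# C[1] = F01/3, C[2] = (F01+F02)/3, C[3] = (F01+F02+F03)/3,
# C[4] = (F02+F03+F04)/3, matching the v' = 3 example in the text.
```

Choosing v' smaller than the original motion amount v yields an image with less, but non-zero, motion blur; v' = 1 simply reproduces the deblurred values.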
- the motion blur adding unit 906 supplies the foreground component image in which the amount of motion blur has been adjusted to the selecting unit 907.
- Based on a selection signal corresponding to, for example, the user's selection, the selection unit 907 selects either the foreground component image from which motion blur has been removed, supplied from the correction unit 905, or the foreground component image in which the amount of motion blur has been adjusted, supplied from the motion-blur adding unit 906, and outputs the selected foreground component image.
- In this way, the motion-blur adjusting unit 106 can adjust the amount of motion blur based on the selection signal and the motion-blur adjustment amount v'.
- In step S901, the processing unit determination unit 901 of the motion-blur adjusting unit 106 generates a processing unit based on the motion vector and the area information, and supplies the generated processing unit to the modeling unit 902 and the correction unit 905.
- In step S902, the modeling unit 902 of the motion-blur adjusting unit 106 selects or generates a model according to the motion amount v and the processing unit.
- In step S903, the equation generation unit 903 generates equations for calculating the foreground components from the differences between the pixel values of the foreground component image, based on the selected or generated model.
- In step S904, the calculation unit 904 sets the pixel values of the foreground component image into the generated equations, and extracts the foreground components from the differences between the pixel values based on the equations in which the pixel values have been set.
- In step S905, the calculation unit 904 determines whether all the foreground components corresponding to the processing unit have been extracted. If it is determined that not all the foreground components corresponding to the processing unit have been extracted, the process returns to step S904, and the process of extracting foreground components is repeated.
- If it is determined in step S905 that all the foreground components corresponding to the processing unit have been extracted, the process proceeds to step S906, in which the correction unit 905 corrects each of the foreground components F01/v to F08/v supplied from the calculation unit 904 based on the motion amount v, thereby calculating the foreground pixel values F01 to F08 from which motion blur has been removed.
- In step S907, the motion-blur adding unit 906 calculates the foreground pixel values in which the amount of motion blur has been adjusted, and the selection unit 907 selects either the image from which motion blur has been removed or the image in which the amount of motion blur has been adjusted, outputs the selected image, and the processing ends.
- As described above, the motion-blur adjusting unit 106 having the configuration shown in FIG. 63 can adjust motion blur in a foreground image containing motion blur more quickly and with simpler operations.
- The motion-blur adjusting unit 106 shown in Fig. 63 is also sufficiently effective on real images, which are quantized and contain noise, and enables accurate removal of motion blur.
- the image processing apparatus having the configuration shown in FIG. 2 can adjust the amount of motion blur included in the input image.
- FIG. 68 is a block diagram illustrating another configuration of the functions of the image processing apparatus.
- the region specifying unit 103 supplies the region information to the mixture ratio calculating unit 104 and the combining unit 1001.
- the mixture ratio calculation unit 104 supplies the mixture ratio to the foreground / background separation unit 105 and the synthesis unit 1001.
- the foreground / background separation unit 105 supplies the foreground component image to the synthesis unit 1001.
- the synthesizing unit 1001 synthesizes an arbitrary background image and the foreground component image supplied from the foreground/background separation unit 105 based on the mixture ratio supplied from the mixture ratio calculation unit 104 and the region information supplied from the region identification unit 103, and outputs a synthesized image in which the arbitrary background image and the foreground component image are combined.
- FIG. 69 is a diagram illustrating the configuration of the synthesizing unit 1001.
- the background component generation unit 1021 generates a background component image based on the mixture ratio α and an arbitrary background image, and supplies the background component image to the mixed region image synthesizing unit 1022.
- the mixed region image synthesizing unit 1022 generates a mixed region synthesized image by combining the background component image supplied from the background component generation unit 1021 with the foreground component image, and supplies the generated mixed region synthesized image to the image synthesizing unit 1023.
- the image synthesizing unit 1023 synthesizes the foreground component image, the mixed region synthesized image supplied from the mixed region image synthesizing unit 1022, and an arbitrary background image based on the region information, and generates and outputs a synthesized image.
- the synthesizing unit 1001 can synthesize the foreground component image with an arbitrary background image.
- An image obtained by synthesizing a foreground component image with an arbitrary background image based on the mixture ratio, which is a feature quantity, is more natural than an image obtained by simply synthesizing pixels.
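The synthesis just described can be sketched per pixel as follows, assuming (as this document states elsewhere) that the mixture ratio α is the proportion of the background component in a mixed pixel; the function name and region labels are hypothetical:

```python
def synthesize(foreground_comp, background, alpha, region):
    # Per-pixel composition: in the mixed region the background component
    # generator contributes alpha * background and the mixed-region
    # synthesizer adds the extracted foreground component; foreground and
    # background regions pass through unchanged.
    out = []
    for f, b, a, r in zip(foreground_comp, background, alpha, region):
        if r == "foreground":
            out.append(f)
        elif r == "mixed":
            out.append(a * b + f)
        else:  # background region: the arbitrary background shows through
            out.append(b)
    return out

print(synthesize([100, 40, 0], [10, 20, 30], [0.0, 0.5, 1.0],
                 ["foreground", "mixed", "background"]))
# [100, 50.0, 30]
```

Mixed-region pixels thus blend the new background with exactly the foreground contribution the separation stage extracted, which is why the result looks more natural than simply pasting pixels.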
- FIG. 70 is a block diagram showing still another configuration of the function of the image processing apparatus for adjusting the amount of motion blur. While the image processing apparatus shown in FIG. 2 sequentially performs the area specification and the calculation of the mixture ratio, the image processing apparatus shown in FIG. 70 performs the area specification and the calculation of the mixture ratio in parallel.
- the input image is supplied to a mixture ratio calculation unit 1101, a foreground / background separation unit 1102, an area identification unit 103, and an object extraction unit 101.
- the mixture ratio calculation unit 1101 calculates, for each of the pixels contained in the input image, the estimated mixture ratio on the assumption that the pixel belongs to the covered background area and the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background area, and supplies the calculated estimated mixture ratios to the foreground/background separation unit 1102.
- FIG. 71 is a block diagram illustrating an example of the configuration of the mixture ratio calculation unit 1101.
- the estimated mixture ratio processing unit 401 shown in FIG. 71 is the same as the estimated mixture ratio processing unit 401 shown in FIG., and the estimated mixture ratio processing unit 402 shown in FIG. 71 is the same as the estimated mixture ratio processing unit 402 shown in FIG.
- the estimated mixture ratio processing unit 401 calculates the estimated mixture ratio for each pixel by an operation corresponding to the model of the covered background area based on the input image, and outputs the calculated estimated mixture ratio.
- the estimated mixture ratio processing unit 402 calculates an estimated mixture ratio for each pixel by an operation corresponding to the model of the uncovered background area based on the input image, and outputs the calculated estimated mixture ratio.
- the foreground/background separation unit 1102 generates a foreground component image from the input image based on the estimated mixture ratio supplied from the mixture ratio calculation unit 1101 on the assumption that the pixel belongs to the covered background area, the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background area, and the area information supplied from the area specifying unit 103, and supplies the generated foreground component image to the motion-blur adjusting unit 106 and the selection unit 107.
- FIG. 72 is a block diagram illustrating an example of the configuration of the foreground / background separation unit 1102.
- Parts similar to those of the foreground / background separation unit 105 shown in FIG. 47 are denoted by the same reference numerals, and description thereof will be omitted.
- Based on the region information supplied from the region identification unit 103, the selection unit 1121 selects either the estimated mixture ratio supplied from the mixture ratio calculation unit 1101 on the assumption that the pixel belongs to the covered background region, or the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background region, and supplies the selected estimated mixture ratio to the separation unit 601 as the mixture ratio α.
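This per-pixel choice between the two estimates computed in parallel can be sketched as follows (hypothetical names and region labels):

```python
def select_mixture_ratio(region, est_covered, est_uncovered):
    # Keep, for each pixel, the estimate whose model assumption
    # (covered vs. uncovered background) matches the region label.
    return [c if r == "covered" else u
            for r, c, u in zip(region, est_covered, est_uncovered)]

print(select_mixture_ratio(["covered", "uncovered"], [0.2, 0.9], [0.8, 0.4]))
# [0.2, 0.4]
```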
- the separation unit 601 extracts the foreground components and the background components from the pixel values of the pixels belonging to the mixed region based on the mixture ratio α and the region information supplied from the selection unit 1121, supplies the extracted foreground components to the synthesizing unit 603, and supplies the background components to the synthesizing unit 605.
- The separation unit 601 can have the same configuration as that shown in FIG.
- the synthesizing unit 603 synthesizes the foreground component image and outputs it, and the synthesizing unit 605 synthesizes the background component image and outputs it.
- the motion-blur adjusting unit 106 shown in FIG. 70 can have the same configuration as that shown in FIG. 2; based on the area information and the motion vector, it adjusts the amount of motion blur included in the foreground component image supplied from the foreground/background separation unit 1102, and outputs the foreground component image in which the amount of motion blur has been adjusted.
- Based on a selection signal corresponding to, for example, the user's selection, the selection unit 107 shown in FIG. 70 selects either the foreground component image supplied from the foreground/background separation unit 1102 or the foreground component image with the adjusted amount of motion blur supplied from the motion-blur adjusting unit 106, and outputs the selected foreground component image.
- As described above, the image processing device having the configuration shown in FIG. 70 can adjust the amount of motion blur included in the image corresponding to the foreground object included in the input image, and output the result.
- The image processing device having the configuration shown in FIG. 70 can also calculate the mixture ratio as embedded information and output the calculated mixture ratio, as in the first embodiment.
- FIG. 73 is a block diagram illustrating another configuration of the function of the image processing apparatus that combines the foreground component image with an arbitrary background image.
- The image processing apparatus shown in FIG. 68 performs the area specification and the calculation of the mixture ratio α serially, whereas the image processing apparatus shown in FIG. 73 performs the area specification and the calculation of the mixture ratio in parallel.
- Based on the input image, the mixture ratio calculator 1101 shown in FIG. 73 calculates, for each of the pixels included in the input image, the estimated mixture ratio on the assumption that the pixel belongs to the covered background area and the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background area, and supplies the calculated estimated mixture ratios to the foreground/background separation unit 1102 and the synthesizing unit 1201.
- the foreground/background separation unit 1102 shown in FIG. 73 generates a foreground component image from the input image based on the estimated mixture ratio supplied from the mixture ratio calculation unit 1101 on the assumption that the pixel belongs to the covered background area, the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background area, and the area information supplied from the area specifying unit 103, and supplies the generated foreground component image to the synthesizing unit 1201.
- the synthesizing unit 1201 synthesizes an arbitrary background image and the foreground component image supplied from the foreground/background separation unit 1102 based on the estimated mixture ratio supplied from the mixture ratio calculation unit 1101 on the assumption that the pixel belongs to the covered background area, the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background area, and the region information supplied from the region identification unit 103, and outputs a synthesized image in which the arbitrary background image and the foreground component image are combined.
- FIG. 74 is a diagram showing the configuration of the synthesizing unit 1201. Parts similar to those shown in the block diagram of FIG. 69 are denoted by the same reference numerals, and description thereof will be omitted.
- Based on the region information, the selection unit 1221 selects either the estimated mixture ratio supplied from the mixture ratio calculation unit 1101 on the assumption that the pixel belongs to the covered background region, or the estimated mixture ratio on the assumption that the pixel belongs to the uncovered background region, and supplies the selected estimated mixture ratio to the background component generation unit 1021 as the mixture ratio α.
- the background component generation unit 1021 shown in FIG. 74 generates a background component image based on the mixture ratio α supplied from the selection unit 1221 and an arbitrary background image, and supplies the background component image to the mixed region image synthesizing unit 1022.
- the mixed region image synthesizing unit 1022 shown in FIG. 74 generates a mixed region synthesized image by combining the background component image supplied from the background component generation unit 1021 with the foreground component image, and supplies the generated mixed region synthesized image to the image synthesizing unit 1023.
- the image synthesizing unit 1023 synthesizes the foreground component image, the mixed region synthesized image supplied from the mixed region image synthesizing unit 1022, and an arbitrary background image based on the region information, and generates and outputs a synthesized image.
- In this way, the synthesizing unit 1201 can synthesize the foreground component image with an arbitrary background image.
- Although the mixture ratio α has been described as the ratio of the background component included in the pixel value, it may instead be the ratio of the foreground component included in the pixel value.
- In the above description, an example was given in which an image of real space, having three-dimensional space and time-axis information, is projected with a video camera onto two-dimensional space and time having time-axis information. The present invention, however, is not limited to this example, and can be applied to correcting the distortion caused by projecting first information of a larger first dimension onto second information of a smaller second dimension, to extracting significant information, or to synthesizing images more naturally.
- The sensor is not limited to a CCD and may be another solid-state image sensor. Further, the sensor is not limited to one in which the detection elements are arranged in a matrix; it may be a sensor in which the detection elements are arranged in a line.
- The recording medium on which the program for performing the signal processing of the present invention is recorded may be a package medium distributed separately from the computer in order to provide the program to the user, such as a magnetic disk 51 on which the program is recorded, an optical disk 52 (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk 53 (including an MD (Mini-Disc) (trademark)), or a semiconductor memory 54; alternatively, it may be the ROM 22 in which the program is recorded, provided to the user in a state pre-installed in the computer, the hard disk included in the storage unit 28, or the like.
- In this specification, the steps describing the program recorded on the recording medium include not only processes performed in chronological order according to the described sequence, but also processes executed in parallel or individually, not necessarily in chronological order.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Studio Circuits (AREA)
- Image Analysis (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Image Processing (AREA)
- Processing Of Color Television Signals (AREA)
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02743667A EP1339021B1 (en) | 2001-06-25 | 2002-06-20 | Image processing apparatus and method, and image pickup apparatus |
CA2420719A CA2420719C (en) | 2001-06-25 | 2002-06-20 | Image processing apparatus and method, and image pickup apparatus |
DE60239255T DE60239255D1 (de) | 2001-06-25 | 2002-06-20 | Bildverarbeitungsvorrichtung und -verfahren und bilderfassungsvorrichtung |
KR1020037002731A KR100835443B1 (ko) | 2001-06-25 | 2002-06-20 | 화상 처리 장치, 방법 및 그를 위한 기록 매체와 촬상 장치 |
US10/362,354 US7477761B2 (en) | 2001-06-25 | 2002-06-20 | Image processing apparatus and method, and image-capturing apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2001-191004 | 2001-06-25 | ||
JP2001191004A JP4596219B2 (ja) | 2001-06-25 | 2001-06-25 | 画像処理装置および方法、記録媒体、並びにプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003001453A1 true WO2003001453A1 (fr) | 2003-01-03 |
Family
ID=19029694
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2002/006178 WO2003001453A1 (fr) | 2001-06-25 | 2002-06-20 | Procede et dispositif de traitement d'images et dispositif de prise de vues |
Country Status (8)
Country | Link |
---|---|
US (1) | US7477761B2 (ja) |
EP (1) | EP1339021B1 (ja) |
JP (1) | JP4596219B2 (ja) |
KR (1) | KR100835443B1 (ja) |
CN (1) | CN1267857C (ja) |
CA (1) | CA2420719C (ja) |
DE (1) | DE60239255D1 (ja) |
WO (1) | WO2003001453A1 (ja) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4432274B2 (ja) * | 2001-04-12 | 2010-03-17 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596216B2 (ja) * | 2001-06-20 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596225B2 (ja) * | 2001-06-27 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596220B2 (ja) * | 2001-06-26 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
US7440634B2 (en) * | 2003-06-17 | 2008-10-21 | The Trustees Of Columbia University In The City Of New York | Method for de-blurring images of moving objects |
JP4148041B2 (ja) * | 2003-06-27 | 2008-09-10 | ソニー株式会社 | 信号処理装置および信号処理方法、並びにプログラムおよび記録媒体 |
US20050249429A1 (en) * | 2004-04-22 | 2005-11-10 | Fuji Photo Film Co., Ltd. | Method, apparatus, and program for image processing |
EP1605402A2 (en) * | 2004-06-10 | 2005-12-14 | Sony Corporation | Image processing device and method, recording medium, and program for blur correction |
TWI353778B (en) * | 2007-12-21 | 2011-12-01 | Ind Tech Res Inst | Moving object detection apparatus and method |
JP2012215852A (ja) | 2011-03-25 | 2012-11-08 | Semiconductor Energy Lab Co Ltd | 画像処理方法、表示装置 |
CN102254322A (zh) * | 2011-06-09 | 2011-11-23 | 上海智翔信息科技股份有限公司 | 一种图像提取方法及装置 |
CN103440612B (zh) * | 2013-08-27 | 2016-12-28 | 华为技术有限公司 | 一种gpu虚拟化中图像处理方法和装置 |
US8917354B2 (en) * | 2013-09-30 | 2014-12-23 | Amlogic Co., Ltd. | Motion detection in video fields |
JP2019207635A (ja) * | 2018-05-30 | 2019-12-05 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 移動体、画像生成方法、プログラム、及び記録媒体 |
CN113129227A (zh) * | 2021-03-29 | 2021-07-16 | 影石创新科技股份有限公司 | 图像处理方法、装置、计算机设备和存储介质 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2279531A (en) | 1993-06-24 | 1995-01-04 | Sony Uk Ltd | Motion compensated image interpolation |
JPH07336688A (ja) * | 1994-06-06 | 1995-12-22 | Nippon Hoso Kyokai <Nhk> | アンカバー領域の検出方法 |
JPH10164436A (ja) * | 1996-12-04 | 1998-06-19 | Sony Corp | 輪郭抽出装置、輪郭抽出方法、キー信号生成装置及びキー信号生成方法 |
EP0933727A2 (en) | 1998-01-29 | 1999-08-04 | Canon Kabushiki Kaisha | Image information processing apparatus and its method |
JP2001250119A (ja) * | 1999-12-28 | 2001-09-14 | Sony Corp | 信号処理装置および方法、並びに記録媒体 |
EP1164545A1 (en) | 1999-12-28 | 2001-12-19 | Sony Corporation | Signal processing device and method, and recording medium |
JP2002190028A (ja) * | 2000-12-21 | 2002-07-05 | Sony Corp | 信号処理装置および方法、並びに記録媒体 |
JP2002190015A (ja) * | 2000-12-21 | 2002-07-05 | Sony Corp | 画像処理装置および方法、並びに記録媒体 |
JP2002190016A (ja) * | 2000-12-21 | 2002-07-05 | Sony Corp | 信号処理装置および方法、並びに記録媒体 |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2231752B (en) * | 1989-04-27 | 1993-08-04 | Sony Corp | Motion dependent video signal processing |
EP0549681B2 (en) * | 1990-09-20 | 2000-03-01 | British Broadcasting Corporation | Video image processing |
FR2675002B1 (fr) * | 1991-04-05 | 1993-06-18 | Thomson Csf | Procede de classification des pixels d'une image appartenant a une sequence d'images animees et procede d'interpolation temporelle d'images utilisant ladite classification. |
JP3258122B2 (ja) * | 1993-03-31 | 2002-02-18 | 株式会社東芝 | 画像処理装置 |
JPH08154172A (ja) * | 1994-11-29 | 1996-06-11 | Hitachi Ltd | 画像処理方法、画像ファイル及び画像処理用ファイル |
US5920655A (en) * | 1995-02-10 | 1999-07-06 | Canon Kabushiki Kaisha | Binarization image processing for multi-level image data |
JPH08221567A (ja) * | 1995-02-10 | 1996-08-30 | Fuji Photo Film Co Ltd | 色領域分離方法 |
US6008865A (en) * | 1997-02-14 | 1999-12-28 | Eastman Kodak Company | Segmentation-based method for motion-compensated frame interpolation |
JP2952226B2 (ja) * | 1997-02-14 | 1999-09-20 | 日本電信電話株式会社 | 動画像の予測符号化方法および復号方法、動画像予測符号化または復号プログラムを記録した記録媒体、および、動画像予測符号化データを記録した記録媒体 |
WO1999022520A2 (en) * | 1997-10-29 | 1999-05-06 | Koninklijke Philips Electronics N.V. | Motion vector estimation and detection of covered/uncovered image parts |
JP2000030040A (ja) * | 1998-07-14 | 2000-01-28 | Canon Inc | 画像処理装置及びコンピュータ読み取り可能な記憶媒体 |
US6839463B1 (en) * | 2000-12-22 | 2005-01-04 | Microsoft Corporation | System and method providing subpixel-edge-offset-based determination of opacity |
US6741755B1 (en) * | 2000-12-22 | 2004-05-25 | Microsoft Corporation | System and method providing mixture-based determination of opacity |
JP4596215B2 (ja) * | 2001-06-19 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596217B2 (ja) * | 2001-06-22 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596218B2 (ja) * | 2001-06-22 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596209B2 (ja) * | 2001-06-05 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596214B2 (ja) * | 2001-06-15 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4660980B2 (ja) * | 2001-06-15 | 2011-03-30 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4660979B2 (ja) * | 2001-06-15 | 2011-03-30 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596213B2 (ja) * | 2001-06-15 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596211B2 (ja) * | 2001-06-15 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596212B2 (ja) * | 2001-06-15 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
JP4596216B2 (ja) * | 2001-06-20 | 2010-12-08 | ソニー株式会社 | 画像処理装置および方法、記録媒体、並びにプログラム |
-
2001
- 2001-06-25 JP JP2001191004A patent/JP4596219B2/ja not_active Expired - Fee Related
-
2002
- 2002-06-20 EP EP02743667A patent/EP1339021B1/en not_active Expired - Fee Related
- 2002-06-20 CA CA2420719A patent/CA2420719C/en not_active Expired - Fee Related
- 2002-06-20 CN CNB028027574A patent/CN1267857C/zh not_active Expired - Fee Related
- 2002-06-20 DE DE60239255T patent/DE60239255D1/de not_active Expired - Lifetime
- 2002-06-20 KR KR1020037002731A patent/KR100835443B1/ko not_active IP Right Cessation
- 2002-06-20 US US10/362,354 patent/US7477761B2/en not_active Expired - Fee Related
- 2002-06-20 WO PCT/JP2002/006178 patent/WO2003001453A1/ja active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2279531A (en) | 1993-06-24 | 1995-01-04 | Sony Uk Ltd | Motion compensated image interpolation |
JPH07336688A (ja) * | 1994-06-06 | 1995-12-22 | Nippon Hoso Kyokai <Nhk> | アンカバー領域の検出方法 |
JPH10164436A (ja) * | 1996-12-04 | 1998-06-19 | Sony Corp | 輪郭抽出装置、輪郭抽出方法、キー信号生成装置及びキー信号生成方法 |
EP0933727A2 (en) | 1998-01-29 | 1999-08-04 | Canon Kabushiki Kaisha | Image information processing apparatus and its method |
JP2001250119A (ja) * | 1999-12-28 | 2001-09-14 | Sony Corp | 信号処理装置および方法、並びに記録媒体 |
EP1164545A1 (en) | 1999-12-28 | 2001-12-19 | Sony Corporation | Signal processing device and method, and recording medium |
JP2002190028A (ja) * | 2000-12-21 | 2002-07-05 | Sony Corp | 信号処理装置および方法、並びに記録媒体 |
JP2002190015A (ja) * | 2000-12-21 | 2002-07-05 | Sony Corp | 画像処理装置および方法、並びに記録媒体 |
JP2002190016A (ja) * | 2000-12-21 | 2002-07-05 | Sony Corp | 信号処理装置および方法、並びに記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1339021A4 |
Also Published As
Publication number | Publication date |
---|---|
DE60239255D1 (de) | 2011-04-07 |
US20040057602A1 (en) | 2004-03-25 |
CA2420719A1 (en) | 2003-02-25 |
JP4596219B2 (ja) | 2010-12-08 |
CA2420719C (en) | 2010-05-25 |
KR20030036731A (ko) | 2003-05-09 |
EP1339021A4 (en) | 2009-01-07 |
KR100835443B1 (ko) | 2008-06-04 |
US7477761B2 (en) | 2009-01-13 |
JP2003006652A (ja) | 2003-01-10 |
CN1471693A (zh) | 2004-01-28 |
EP1339021A1 (en) | 2003-08-27 |
CN1267857C (zh) | 2006-08-02 |
EP1339021B1 (en) | 2011-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4596202B2 (ja) | 画像処理装置および方法、並びに記録媒体 | |
JP4596216B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596222B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596220B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596221B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596226B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4106874B2 (ja) | 画像処理装置および方法、並びに記録媒体 | |
JP4596203B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
WO2003001453A1 (fr) | Procede et dispositif de traitement d'images et dispositif de prise de vues | |
JP4596223B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4674408B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596214B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4840630B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596215B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596217B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4150949B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596209B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596218B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596225B2 (ja) | 画像処理装置および方法、記録媒体、並びにプログラム | |
JP4596205B2 (ja) | 画像処理装置および方法、並びにプログラム | |
WO2003001456A1 (fr) | Appareil et procede de traitement de l'image, et appareil de prise de vue |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CA CN KR SG US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002743667 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2420719 Country of ref document: CA Ref document number: 1020037002731 Country of ref document: KR |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 028027574 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 1020037002731 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10362354 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2002743667 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2004107997 Country of ref document: RU Kind code of ref document: A |