WO2010035682A1 - Foreground region extraction program, foreground region extraction apparatus, and foreground region extraction method - Google Patents
- Publication number
- WO2010035682A1 (PCT/JP2009/066245)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- foreground
- color
- adjacent
- cost
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/187—Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20101—Interactive definition of point of interest, landmark or seed
Definitions
- the present invention relates to the field of technology for clipping an object region from an image.
- extraction of object areas (foreground areas) from images is used in many situations, such as photo and video editing.
- as an object region extraction technique, a technique such as Lazy Snapping (see Non-Patent Document 1) is known, in which an object region is extracted from a clue line drawn by the user on the object region and a clue line drawn by the user on the background region.
- a feature of Lazy Snapping is that object region extraction is achieved simply by drawing a few clue lines in each region, and each time a line is added the object region is re-extracted. The user can therefore add and delete lines while viewing the extraction result.
- with this technique the user can eventually cut out the desired object region, but it may take time until the object region is cut out accurately.
- the present invention has been made in view of such problems, and an example of its object is to provide a foreground area extraction program, a foreground area extraction apparatus, and a foreground area extraction method that can cut out an object area (foreground area) at high speed.
- the invention of the foreground region extraction program according to claim 1 causes a computer to function as: display control means for displaying an image; receiving means for accepting from a user designation of at least one foreground pixel on a foreground region included in the displayed image and at least one background pixel on a background region included in the image; divided color space specifying means for performing a divided color space specifying process of specifying, with the designated foreground pixel and the designated background pixel each as reference pixels, the divided color space to which each reference pixel belongs, from among divided color spaces obtained by dividing a three-dimensional color space into a plurality of parts, as a reference divided color space; color distance calculating means for performing a color distance calculation process of calculating the color distance in the color space between each reference pixel and an adjacent pixel adjacent thereto; affiliation determining means for performing an affiliation determination process of determining whether each adjacent pixel belongs to each reference divided color space; cost calculating means for performing a cost calculation process of calculating a cost for each adjacent pixel based on the calculated color distance and a weighting based on the determined affiliation to the reference divided color space; and confirming means for performing a confirmation process of confirming the adjacent pixel with the smallest calculated cost as a foreground pixel or a background pixel; and causes the computer to extract the foreground region from the image by repeatedly performing the color distance calculation process, the affiliation determination process, the cost calculation process, and the confirmation process using the confirmed adjacent pixel as a reference pixel.
- the object area (foreground area) desired by the user can be cut out at high speed.
- the invention according to claim 2 is the foreground region extraction program according to claim 1, characterized in that the weighting is made smaller when an adjacent pixel adjacent to the reference pixel belongs to the reference divided color space.
- the invention according to claim 3 is the foreground region extraction program according to claim 1 or 2, characterized in that the foreground pixels specified by the user are foreground pixels corresponding to each of at least two mutually different colors constituting a texture pattern appearing on the foreground region.
- the foreground and the background can be accurately separated even if the texture pattern is on the image.
- the invention according to claim 4 is the foreground region extraction program according to any one of claims 1 to 3, characterized in that, when an adjacent pixel adjacent to the reference pixel does not belong to the reference divided color space, the cost is accumulated, and the foreground and the background are inverted when the accumulated cost exceeds a threshold value.
- the foreground and the background can be accurately reversed in the enclave even if there is anti-aliasing in the enclave outline.
- the invention according to claim 5 is a foreground region extraction apparatus comprising: display control means for displaying an image; receiving means for accepting from a user designation of at least one foreground pixel on a foreground region included in the displayed image and at least one background pixel on a background region included in the image; divided color space specifying means for performing a divided color space specifying process of specifying, with the designated foreground pixel and the designated background pixel each as reference pixels, the divided color space to which each reference pixel belongs, from among divided color spaces obtained by dividing a three-dimensional color space into a plurality of parts, as a reference divided color space; color distance calculating means for performing a color distance calculation process of calculating the color distance in the color space between each reference pixel and an adjacent pixel adjacent thereto; affiliation determining means for performing an affiliation determination process of determining whether each adjacent pixel belongs to each reference divided color space; cost calculating means for performing a cost calculation process of calculating a cost for each adjacent pixel based on the calculated color distance and a weighting based on the determined affiliation to the reference divided color space; and confirming means for performing a confirmation process of confirming the adjacent pixel with the smallest calculated cost as a foreground pixel or a background pixel; wherein the foreground region is extracted from the image by repeatedly performing the color distance calculation process, the affiliation determination process, the cost calculation process, and the confirmation process using the confirmed adjacent pixel as a reference pixel.
- the invention according to claim 6 is a foreground region extraction method performed by a computer, comprising: a display control step of displaying an image; an accepting step of accepting from a user designation of at least one foreground pixel on a foreground region included in the displayed image and at least one background pixel on a background region included in the image; a divided color space specifying step of specifying, with the designated foreground pixel and the designated background pixel each as reference pixels, the divided color space to which each reference pixel belongs, from among divided color spaces obtained by dividing a three-dimensional color space into a plurality of parts, as a reference divided color space; a color distance calculation step of calculating the color distance in the color space between each reference pixel and an adjacent pixel adjacent thereto; an affiliation determination step of determining whether each adjacent pixel belongs to each reference divided color space; a cost calculation step of calculating a cost for each adjacent pixel based on the calculated color distance and a weighting based on the determined affiliation to the reference divided color space; and a confirmation step of confirming the adjacent pixel with the smallest calculated cost as a foreground pixel or a background pixel; wherein a foreground region is extracted from the image by repeatedly performing the color distance calculation process, the affiliation determination process, the cost calculation process, and the confirmation process using the confirmed adjacent pixel as a reference pixel.
- FIG. 2 is a flowchart illustrating an example of foreground area extraction processing in the system control unit 6; FIG. 3 is a diagram showing an example of the clue lines drawn on the foreground area and the background area.
- in this embodiment, the present invention is applied to an image editing apparatus that cuts out an object region from an image by separating the foreground and the background using clue lines (handwritten lines) drawn by the user as hints.
- “foreground” means the image region showing the extraction target, such as a person or an article, and “background” means the image region excluding that target.
- FIG. 1 is a diagram illustrating a schematic configuration example of the image editing apparatus S according to the present embodiment.
- the image editing apparatus S includes an operation unit 1, a display unit 2, a drive unit 3, a storage unit 4, an input/output interface unit 5, a system control unit 6, and the like; the system control unit 6 and the input/output interface unit 5 are connected via a system bus 7.
- a personal computer can be applied as the image editing apparatus S.
- the operation unit 1 includes, for example, a keyboard and a mouse, receives an operation instruction from the user, and outputs the instruction content to the system control unit 6 as an instruction signal.
- the display unit 2 includes, for example, a CRT (Cathode Ray Tube) display, a liquid crystal display, and the like, and displays information such as characters and images.
- the drive unit 3 reads data from a disk DK (recording medium) such as a flexible disk, CD (Compact Disc), or DVD (Digital Versatile Disc), and records data onto the disk DK.
- the input / output interface unit 5 performs interface processing between the operation unit 1 to the storage unit 4 and the system control unit 6.
- the storage unit 4 includes, for example, a hard disk drive and stores an operating system (O / S), various programs, data, and the like.
- the programs stored in the storage unit 4 include a moving image editing application program (which includes the foreground area extraction program of the present invention) and the like.
- the moving image editing application program is provided recorded on a disk DK such as a CD-ROM, or is provided by being downloaded from a server connected to the network NW, and is then installed and used.
- the system control unit 6 includes a CPU (Central Processing Unit) 6a, a ROM (Read Only Memory) 6b, and a RAM (Random Access Memory) 6c used as main memory and image memory. By executing the moving image editing application program, the system control unit 6 functions as the display control means, receiving means, divided color space specifying means, color distance calculating means, affiliation determining means, cost calculating means, confirming means, and the like, and performs the foreground area extraction processing.
- FIG. 2 is a flowchart showing an example of foreground region extraction processing in the system control unit 6.
- the process shown in FIG. 2 is started, for example, when a moving image editing application program is started.
- first, the system control unit 6 displays, on the display unit 2, a still image from the moving image designated by the user via the operation unit 1 (step S1).
- next, the system control unit 6 receives from the user designation of at least one foreground pixel on the foreground area included in the displayed image and at least one background pixel on the background area included in the image (step S2), and registers them as foreground pixels and background pixels (stores them in a registration area of the RAM 6c).
- the foreground pixels can be specified by the user drawing a clue line on the desired foreground area (object area) with the mouse, and the background pixels can likewise be specified by drawing a clue line on the desired background area.
- FIG. 3 is a diagram showing an example of clue lines drawn on the foreground area and the background area in a photographic image.
- a plurality of pixels overlapping the clue line 51 are designated as foreground pixels
- a plurality of pixels overlapping the clue line 52 are designated as background pixels.
- in this example, the dog object region is the foreground region desired by the user. If the object region of the tree were desired instead, the tree would be the foreground region and the remaining area, including the dog, would be the background region.
- the system control unit 6 performs a divided color space specifying process (step S3).
- in this process, using the specified at least one foreground pixel and at least one background pixel each as reference pixels, the divided color spaces (hereinafter referred to as "buckets") obtained by dividing the three-dimensional RGB color space into a plurality of parts are considered, and the reference bucket to which each reference pixel belongs is specified.
- FIG. 4 is a diagram showing a bucket obtained by dividing the RGB color space by a grid having a division number s.
- in the example of FIG. 4, the RGB color space is divided into 64 buckets, but it may be divided into fewer or more buckets than this.
- the reference bucket 62 to which the foreground pixel 61 belongs and the reference bucket 64 to which the background pixel 63 belongs are specified; the reference bucket 62 is registered as a foreground color cluster (in other words, a foreground color group), and the reference bucket 64 is registered as a background color cluster (in other words, a background color group). Such clustering is referred to below as the clustering method using buckets.
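As a concrete illustration, the bucket lookup underlying the divided color space specifying process can be sketched as follows. This is only a sketch: it assumes 8-bit RGB channels and a division number s = 4 (giving the 64 buckets of FIG. 4), and is not the patent's actual equation (1).

```python
def bucket_index(pixel, s=4, depth=256):
    """Map an (r, g, b) pixel to the index of the bucket (divided
    color space) it falls into, on an s x s x s grid over RGB space."""
    step = depth // s          # width of one bucket along each axis
    r, g, b = pixel
    return (r // step) * s * s + (g // step) * s + (b // step)
```

With s = 4 this yields 64 buckets indexed 0 to 63; registering a cluster then amounts to recording the bucket indices of the designated reference pixels.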
- a texture pattern (continuous pattern) as shown in FIG. 5 may appear on the foreground region.
- in such a case, the distance in color space between the at least two colors constituting the texture pattern is large, which makes it difficult to separate the foreground and the background. Therefore, in the present embodiment, when a texture pattern appears on the foreground region, the user draws a clue line 73 on the foreground area so that foreground pixels corresponding to at least two of the colors constituting the texture pattern (colors whose distance in color space is at least a certain amount apart; in the example of FIG. 5, the color in area 71 and the color in area 72) are designated, whereby at least two foreground color clusters are registered.
- FIG. 6 is a diagram showing registered foreground color clusters and background color clusters when a texture pattern appears in the foreground area.
- in this example, two foreground color clusters are registered, and the foreground pixels corresponding to the respective colors constituting the texture pattern belong to each foreground color cluster.
- a texture pattern may also appear in the background area; in that case, at least two background color clusters are registered in the same way as for the foreground area.
- next, the system control unit 6 determines whether or not there is a pixel for which the foreground or the background has not been determined (step S4). If there is (step S4: YES), the process proceeds to step S5; if not (step S4: NO), the foreground/background determination is complete, so the foreground area is cut out from the image as the object area desired by the user (step S10), and the process ends.
- in step S5, the system control unit 6 specifies, among the adjacent pixels adjacent (above, below, left, or right) to a reference pixel (a foreground pixel or a background pixel), one or more pixels that have not yet been specified as grow candidates, and specifies them as grow candidates.
- FIG. 7 is a diagram showing how pixels on an image are determined as foreground or background.
- in the example of FIG. 7, the pixels 83 and 84 adjacent to the foreground pixel 81 specified by the user, and the pixels 85 and 86 adjacent to the background pixel 82 specified by the user, are specified as grow candidates.
- the system control unit 6 performs a color distance calculation process for each identified grow candidate (step S6).
- in this color distance calculation process, the color distance in the RGB color space between the reference pixel and the adjacent grow candidate is calculated.
- the color distance D (p i , p j ) can be calculated by the following equation (2).
- here, p_i = (p_i,r, p_i,g, p_i,b) is the grow candidate pixel.
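Since equation (2) is not reproduced in this text, the sketch below assumes a plain Euclidean distance in RGB space, which is a common choice for this kind of color distance.

```python
import math

def color_distance(p_i, p_j):
    """Euclidean color distance between two RGB pixels.  Assumed
    stand-in for the patent's equation (2), which this text omits."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p_i, p_j)))
```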
- next, the system control unit 6 performs an affiliation determination process for determining whether the grow candidate whose color distance has been calculated belongs to the foreground color cluster and whether it belongs to the background color cluster (step S7).
- in this process, the bucket b(p_i) to which the grow candidate p_i belongs is calculated by equation (1).
- compared with the conventional k-means method (a clustering method that gives a prototype (mean value) representative of each of K clusters and assigns each individual to the nearest prototype), the clustering method using buckets can determine whether a grow candidate belongs to the foreground color cluster or the background color cluster at higher speed (with a smaller amount of calculation).
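The speed advantage comes from the fact that cluster membership reduces to a constant-time set lookup on the bucket index, with no distance-to-prototype computation. A minimal sketch, assuming the same 8-bit RGB channels and s = 4 grid as above:

```python
def belongs(pixel, cluster_buckets, s=4, depth=256):
    """O(1) affiliation test: a pixel belongs to a cluster iff the
    index of its bucket is in the cluster's registered bucket set."""
    step = depth // s
    r, g, b = pixel
    idx = (r // step) * s * s + (g // step) * s + (b // step)
    return idx in cluster_buckets
```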
- the system control unit 6 then performs a cost calculation process that calculates a grow cost for each grow candidate, based on the color distance D(p_i, p_j) calculated for the grow candidate and a weighting based on the candidate's determined affiliation to the foreground color cluster and the background color cluster (step S8).
- This grow cost cost i can be calculated by the following equation (3).
- here, when the reference pixel is a foreground pixel, K1 is the foreground color cluster and K2 is the background color cluster; when the reference pixel is a background pixel, K1 is the background color cluster and K2 is the foreground color cluster.
- FIG. 8 is a diagram showing the magnitude of the grow cost when there is a texture pattern on the image.
- in this example, C_1(p_i) takes one of the values 1, 2, and 4. What matters is the magnitude relationship, so values such as 0.5, 1, and 2 would also work. However, setting C_1(p_i) to 1, 2, 4 allows the grow cost to be computed entirely with integer arithmetic, and considering truncation error it is desirable to use powers of 2.
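Since equation (3) is likewise not reproduced here, the sketch below assumes the grow cost is simply the color distance scaled by the weight C_1, and infers the mapping of the 1/2/4 values from claim 2 (the weight is made smaller when the candidate belongs to the reference pixel's own cluster); both are assumptions, not the patent's verbatim formula.

```python
def grow_cost(dist, in_own_cluster, in_opposite_cluster):
    """Weighted grow cost (assumed form: distance * C1).  The weight
    assignment is an assumption: 1 for the reference pixel's own
    cluster, 4 for the opposite cluster, 2 for neither."""
    if in_own_cluster:
        weight = 1
    elif in_opposite_cluster:
        weight = 4
    else:
        weight = 2
    return dist * weight
```

Keeping the weights at powers of 2 lets the whole computation stay in integer arithmetic, as the text notes.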
- the enclave means a background area surrounded by the foreground area (or a foreground area surrounded by the background area), and corresponds to, for example, the central portion of the donut.
- the system control unit 6 cannot recognize an enclave surrounded by the foreground area as background unless a clue line is added inside the enclave. Therefore, in order for the system control unit 6 to recognize the enclave, the foreground and the background are reversed when the grow cost exceeds the threshold value T.
- the threshold T can be calculated by the following equation (6).
- here, F_b indicates the foreground color with the maximum brightness, F_d the foreground color with the minimum brightness, B_b the background color with the maximum brightness, and B_d the background color with the minimum brightness. That is, the threshold T is the larger (max) of the distance D(F_b, B_d) between the maximum-brightness foreground color F_b and the minimum-brightness background color B_d, and the distance D(F_d, B_b) between the minimum-brightness foreground color F_d and the maximum-brightness background color B_b.
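The max-of-two-distances rule of equation (6), as spelled out above, can be sketched directly; the Euclidean distance is again an assumption.

```python
import math

def threshold_T(F_b, F_d, B_b, B_d):
    """Threshold T as described in the text: the larger of
    D(F_b, B_d) and D(F_d, B_b), where F_b/F_d are the brightest and
    darkest foreground colors and B_b/B_d the brightest and darkest
    background colors."""
    return max(math.dist(F_b, B_d), math.dist(F_d, B_b))
```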
- FIG. 9 is a diagram showing a comparative example between the case where the grow cost is not accumulated (A) and the case where the grow cost is accumulated (B) when there is anti-aliasing on the image.
- in case (A), where the grow cost is not accumulated, the grow cost between adjacent pixels cannot exceed the threshold T. In case (B), the grow cost is accumulated, so the accumulated grow cost when moving from the pixel 91 to the pixel 92 can exceed the threshold T.
- in step S9, the grow candidate with the smallest grow cost is confirmed as the same kind of pixel (foreground or background) as the reference pixel against which its color distance was calculated in step S6 (for example, if the adjacent reference pixel is a foreground pixel, the grow candidate is confirmed as a foreground pixel). The grow candidate thus confirmed is then treated as a new reference pixel.
- in the example of FIG. 7, the pixel 85 having the smallest grow cost is confirmed as a background pixel, and the pixels adjacent to the pixel 85 are then specified as new grow candidates.
- the system control unit 6 then returns to step S4, determines whether or not there is a pixel for which the foreground or the background has not been determined, and if there is (step S4: YES), proceeds to step S5.
- the system control unit 6 then specifies, among the adjacent pixels adjacent to the new reference pixel, pixels not yet specified as grow candidates, and performs for those grow candidates the color distance calculation process (step S6), the affiliation determination process (step S7), the cost calculation process (step S8), and the foreground/background confirmation process (step S9) in the same manner as described above. By repeating this until there are no more grow candidates, the pixels on the image are gradually confirmed as foreground or background, as shown in FIG. 7. When all pixels have been confirmed as foreground or background, the foreground area is cut out from the image in step S10 as the object area desired by the user.
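The loop from steps S4 through S10 can be sketched end to end as below. Everything concrete in this sketch is an assumption standing in for the patent's equations (1) to (3): the bucket indexing, the Euclidean distance, the 1/2/4 weights, and the priority-queue bookkeeping; cost accumulation and foreground/background inversion for enclaves are omitted for brevity.

```python
import heapq
import math

def extract_foreground(image, fg_seeds, bg_seeds, s=4):
    """Region-growing sketch: seeds are (x, y) positions into
    image[y][x]; at each step the grow candidate with the smallest
    grow cost is confirmed with the label of its reference pixel."""
    h, w = len(image), len(image[0])
    step = 256 // s

    def bucket(p):
        r, g, b = p
        return (r // step) * s * s + (g // step) * s + (b // step)

    fg_buckets = {bucket(image[y][x]) for x, y in fg_seeds}
    bg_buckets = {bucket(image[y][x]) for x, y in bg_seeds}
    label = {}   # (x, y) -> 'F' or 'B' once confirmed
    heap = []    # (grow cost, x, y, label inherited from reference)

    def push_neighbors(x, y, lab):
        own = fg_buckets if lab == 'F' else bg_buckets
        other = bg_buckets if lab == 'F' else fg_buckets
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in label:
                q = image[ny][nx]
                d = math.dist(image[y][x], q)
                b = bucket(q)
                weight = 1 if b in own else 4 if b in other else 2
                heapq.heappush(heap, (d * weight, nx, ny, lab))

    for x, y in fg_seeds:
        label[(x, y)] = 'F'
    for x, y in bg_seeds:
        label[(x, y)] = 'B'
    for x, y in fg_seeds:
        push_neighbors(x, y, 'F')
    for x, y in bg_seeds:
        push_neighbors(x, y, 'B')
    while heap:
        cost, x, y, lab = heapq.heappop(heap)
        if (x, y) in label:
            continue             # already confirmed via a cheaper path
        label[(x, y)] = lab      # step S9: confirm cheapest candidate
        push_neighbors(x, y, lab)
    return {pos for pos, lab in label.items() if lab == 'F'}
```

Because every pixel is pushed per labeled neighbor and confirmed only once, the loop always fixes the globally cheapest pending candidate first, matching the "smallest grow cost" rule of step S9.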
- as in the state shown in FIG. 7(C), there may be a plurality of reference pixels adjacent to a certain grow candidate.
- in this case, the color distance and the grow cost are calculated with respect to each of the reference pixels, and the grow candidate is confirmed as the same foreground or background as the reference pixel giving the smaller grow cost.
- as described above, according to the present embodiment, the foreground color cluster and the background color cluster are registered in advance by specifying the reference buckets to which the foreground pixels and background pixels specified by the user belong; adjacent pixels adjacent to a foreground pixel or a background pixel are specified as grow candidates; the color distance calculation process is performed for each grow candidate; and the affiliation determination process is performed by the clustering method using buckets.
- the object area (foreground area) desired by the user can be cut out at high speed.
- furthermore, when a texture pattern appears, each cluster (foreground color cluster or background color cluster) is registered by specifying pixels corresponding to at least two different colors constituting the texture pattern, and the cost is calculated as described above; therefore, the foreground and the background can be accurately separated even if there is a texture pattern on the image.
- the foreground and the background can be accurately reversed even if there is anti-aliasing in the outline portion of the enclave.
- the image editing apparatus S can be used not only stand-alone in a personal computer but also, for example, by incorporating the configuration and functions of the apparatus S into a web server that provides various information providing services to clients over the Internet.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
By executing the application program, the system control unit functions as the display control means, receiving means, divided color space specifying means, color distance calculating means, affiliation determining means, cost calculating means, confirming means, and the like of the present invention, and performs the foreground area extraction processing.
2 display unit
3 drive unit
4 storage unit
5 input/output interface unit
6 system control unit
7 system bus
S image editing apparatus
Claims (7)
- 1. A foreground region extraction program causing a computer to function as: display control means for displaying an image; receiving means for receiving, from a user, designation of at least one foreground pixel on a foreground region included in the displayed image and at least one background pixel on a background region included in the image; divided color space specifying means for performing a divided color space specifying process of specifying, with the designated foreground pixel and the designated background pixel each as reference pixels, the divided color space to which each reference pixel belongs, from among divided color spaces obtained by dividing a three-dimensional color space into a plurality of parts, as a reference divided color space; color distance calculating means for performing a color distance calculation process of calculating a color distance in the color space between each reference pixel and an adjacent pixel adjacent thereto; affiliation determining means for performing an affiliation determination process of determining whether each adjacent pixel belongs to each reference divided color space; cost calculating means for performing a cost calculation process of calculating a cost for each adjacent pixel based on the color distance calculated for the adjacent pixel and a weighting based on the determined affiliation of the adjacent pixel to the reference divided color space; and confirming means for performing a confirmation process of confirming the adjacent pixel with the smallest calculated cost as a foreground pixel or a background pixel; the program causing the computer to extract a foreground region from the image by repeatedly performing the color distance calculation process, the affiliation determination process, the cost calculation process, and the confirmation process using the confirmed adjacent pixel as a reference pixel.
- 2. The foreground region extraction program according to claim 1, wherein the weighting is made smaller when an adjacent pixel adjacent to the reference pixel belongs to the reference divided color space.
- 3. The foreground region extraction program according to claim 1 or 2, wherein the foreground pixels designated by the user are foreground pixels corresponding to each of at least two mutually different colors constituting a texture pattern appearing on the foreground region.
- 4. The foreground region extraction program according to any one of claims 1 to 3, wherein, when an adjacent pixel adjacent to the reference pixel does not belong to the reference divided color space, the cost is accumulated, and the foreground and the background are inverted when the accumulated cost exceeds a threshold value.
- 5. A foreground region extraction apparatus comprising: display control means for displaying an image; receiving means for receiving, from a user, designation of at least one foreground pixel on a foreground region included in the displayed image and at least one background pixel on a background region included in the image; divided color space specifying means for performing a divided color space specifying process of specifying, with the designated foreground pixel and the designated background pixel each as reference pixels, the divided color space to which each reference pixel belongs, from among divided color spaces obtained by dividing a three-dimensional color space into a plurality of parts, as a reference divided color space; color distance calculating means for performing a color distance calculation process of calculating a color distance in the color space between each reference pixel and an adjacent pixel adjacent thereto; affiliation determining means for performing an affiliation determination process of determining whether each adjacent pixel belongs to each reference divided color space; cost calculating means for performing a cost calculation process of calculating a cost for each adjacent pixel based on the calculated color distance and a weighting based on the determined affiliation to the reference divided color space; and confirming means for performing a confirmation process of confirming the adjacent pixel with the smallest calculated cost as a foreground pixel or a background pixel; wherein a foreground region is extracted from the image by repeatedly performing the color distance calculation process, the affiliation determination process, the cost calculation process, and the confirmation process using the confirmed adjacent pixel as a reference pixel.
- 6. A foreground region extraction method performed by a computer, comprising: a display control step of displaying an image; a receiving step of receiving, from a user, designation of at least one foreground pixel on a foreground region included in the displayed image and at least one background pixel on a background region included in the image; a divided color space specifying step of performing a divided color space specifying process of specifying, with the designated foreground pixel and the designated background pixel each as reference pixels, the divided color space to which each reference pixel belongs, from among divided color spaces obtained by dividing a three-dimensional color space into a plurality of parts, as a reference divided color space; a color distance calculating step of performing a color distance calculation process of calculating a color distance in the color space between each reference pixel and an adjacent pixel adjacent thereto; an affiliation determining step of performing an affiliation determination process of determining whether each adjacent pixel belongs to each reference divided color space; a cost calculating step of performing a cost calculation process of calculating a cost for each adjacent pixel based on the calculated color distance and a weighting based on the determined affiliation to the reference divided color space; and a confirming step of performing a confirmation process of confirming the adjacent pixel with the smallest calculated cost as a foreground pixel or a background pixel; wherein a foreground region is extracted from the image by repeatedly performing the color distance calculation process, the affiliation determination process, the cost calculation process, and the confirmation process using the confirmed adjacent pixel as a reference pixel.
- 7. A recording medium on which the foreground region extraction program according to any one of claims 1 to 4 is recorded in a computer-readable manner.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2737115A CA2737115C (en) | 2008-09-25 | 2009-09-17 | Foreground region extraction program, foreground region extraction apparatus and foreground region extraction method |
ES09816097.1T ES2464345T3 (es) | 2008-09-25 | 2009-09-17 | Programa de extracción de área de primer plano, aparato de extracción de área de primer plano y procedimiento de extracción de área de primer plano |
CN200980137340.6A CN102165487B (zh) | 2008-09-25 | 2009-09-17 | 前景区提取程序、前景区提取装置以及前景区提取方法 |
KR1020117009217A KR101180570B1 (ko) | 2008-09-25 | 2009-09-17 | 전경 영역 추출 프로그램, 전경 영역 추출 장치, 및 전경 영역 추출 방법 |
US13/063,334 US8611648B2 (en) | 2008-09-25 | 2009-09-17 | Foreground region extraction program, foreground region extraction apparatus and foreground region extraction method |
EP09816097.1A EP2328127B1 (en) | 2008-09-25 | 2009-09-17 | Foreground area extracting program, foreground area extracting apparatus and foreground area extracting method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008245350A JP4963306B2 (ja) | 2008-09-25 | 2008-09-25 | 前景領域抽出プログラム、前景領域抽出装置、及び前景領域抽出方法 |
JP2008-245350 | 2008-09-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010035682A1 true WO2010035682A1 (ja) | 2010-04-01 |
Family
ID=42059683
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/066245 WO2010035682A1 (ja) | 2008-09-25 | 2009-09-17 | 前景領域抽出プログラム、前景領域抽出装置、及び前景領域抽出方法 |
Country Status (9)
Country | Link |
---|---|
US (1) | US8611648B2 (ja) |
EP (1) | EP2328127B1 (ja) |
JP (1) | JP4963306B2 (ja) |
KR (1) | KR101180570B1 (ja) |
CN (1) | CN102165487B (ja) |
CA (1) | CA2737115C (ja) |
ES (1) | ES2464345T3 (ja) |
TW (1) | TWI415030B (ja) |
WO (1) | WO2010035682A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011066738A (ja) * | 2009-09-18 | 2011-03-31 | Sanyo Electric Co Ltd | 投写型映像表示装置 |
JP5840940B2 (ja) * | 2011-12-16 | 2016-01-06 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 画像領域抽出装置、画像領域抽出方法、および画像領域抽出プログラム |
JP2013196308A (ja) * | 2012-03-19 | 2013-09-30 | Ricoh Co Ltd | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP5968098B2 (ja) * | 2012-06-14 | 2016-08-10 | キヤノン株式会社 | 画像処理装置、画像処理方法、プログラム、及び記憶媒体 |
CN103577475B (zh) * | 2012-08-03 | 2018-01-30 | 阿里巴巴集团控股有限公司 | 一种图片自动化分类方法、图片处理方法及其装置 |
JP6089886B2 (ja) * | 2013-03-29 | 2017-03-08 | オムロン株式会社 | 領域分割方法および検査装置 |
CN110827287B (zh) * | 2018-08-14 | 2023-06-23 | 阿里巴巴(上海)有限公司 | 确定背景色置信度和图像处理的方法、装置及设备 |
CN113408403A (zh) * | 2018-09-10 | 2021-09-17 | 创新先进技术有限公司 | 活体检测方法、装置和计算机可读存储介质 |
CN112532882B (zh) * | 2020-11-26 | 2022-09-16 | 维沃移动通信有限公司 | 图像显示方法和装置 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0528263A (ja) * | 1991-07-17 | 1993-02-05 | Photo Composing Mach Mfg Co Ltd | カラー画像処理装置 |
JPH10340334A (ja) * | 1997-06-09 | 1998-12-22 | Fukiage Hiroshi | デジタルカラ−静止画像の中の切り出す対象の輪郭を含めて太い線で明示的に指定した領域に基づいて対象の画像を切り出す方法 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU668043B2 (en) * | 1993-05-21 | 1996-04-18 | Sega Enterprises, Ltd. | Image processing device and method |
JPH0721346A (ja) * | 1993-06-18 | 1995-01-24 | Fujitsu Ltd | 画像処理における限定色表示処理方法 |
US6134345A (en) * | 1998-08-28 | 2000-10-17 | Ultimatte Corporation | Comprehensive method for removing from an image the background surrounding a selected subject |
JP4224748B2 (ja) * | 1999-09-13 | 2009-02-18 | ソニー株式会社 | 画像符号化装置および画像符号化方法、画像復号装置および画像復号方法、記録媒体、並びに画像処理装置 |
US20040130546A1 (en) * | 2003-01-06 | 2004-07-08 | Porikli Fatih M. | Region growing with adaptive thresholds and distance function parameters |
KR100816607B1 (ko) | 2003-10-21 | 2008-03-24 | 닛본 덴끼 가부시끼가이샤 | 화상 대조 시스템, 화상 대조 방법 및 컴퓨터로 판독가능한 기록 매체 |
US7508455B2 (en) * | 2004-03-26 | 2009-03-24 | Ross Video/Live Production Technology | Method, system, and device for automatic determination of nominal backing color and a range thereof |
JP2006039689A (ja) * | 2004-07-22 | 2006-02-09 | Nara Institute Of Science & Technology | Image processing device, image processing method, image processing program, and recording medium storing the program |
US20060210159A1 (en) * | 2005-03-15 | 2006-09-21 | Yea-Shuan Huang | Foreground extraction approach by using color and local structure information |
US7720283B2 (en) * | 2005-12-09 | 2010-05-18 | Microsoft Corporation | Background removal in a live video |
US8094943B2 (en) * | 2007-09-27 | 2012-01-10 | Behavioral Recognition Systems, Inc. | Background-foreground module for video analysis system |
JP4497236B2 (ja) | 2008-08-11 | 2010-07-07 | Omron Corp | Detection information registration device, electronic device, control method for detection information registration device, control method for electronic device, detection information registration device control program, and electronic device control program |
2008
- 2008-09-25 JP JP2008245350A patent/JP4963306B2/ja active Active
2009
- 2009-09-17 CN CN200980137340.6A patent/CN102165487B/zh active Active
- 2009-09-17 WO PCT/JP2009/066245 patent/WO2010035682A1/ja active Application Filing
- 2009-09-17 EP EP09816097.1A patent/EP2328127B1/en active Active
- 2009-09-17 ES ES09816097.1T patent/ES2464345T3/es active Active
- 2009-09-17 KR KR1020117009217A patent/KR101180570B1/ko active IP Right Grant
- 2009-09-17 US US13/063,334 patent/US8611648B2/en active Active
- 2009-09-17 CA CA2737115A patent/CA2737115C/en active Active
- 2009-09-23 TW TW098132075A patent/TWI415030B/zh active
Non-Patent Citations (3)
Title |
---|
See also references of EP2328127A4 |
Y. LI; J. SUN; C. K. TANG; H. Y. SHUM: "Lazy Snapping", ACM TRANSACTIONS ON GRAPHICS (TOG), vol. 23, no. 3, August 2004 (2004-08-01), pages 303 - 308 |
YIN LI ET AL.: "Lazy Snapping", ACM TRANSACTIONS ON GRAPHICS (TOG), vol. 23, no. 3, 2004, pages 303 - 308, XP009099127 * |
Also Published As
Publication number | Publication date |
---|---|
TWI415030B (zh) | 2013-11-11 |
KR20110084405A (ko) | 2011-07-22 |
US20110164814A1 (en) | 2011-07-07 |
TW201020973A (en) | 2010-06-01 |
KR101180570B1 (ko) | 2012-09-06 |
US8611648B2 (en) | 2013-12-17 |
EP2328127B1 (en) | 2014-02-26 |
CA2737115C (en) | 2013-01-22 |
CA2737115A1 (en) | 2010-04-01 |
JP2010079477A (ja) | 2010-04-08 |
EP2328127A4 (en) | 2012-12-05 |
EP2328127A1 (en) | 2011-06-01 |
CN102165487A (zh) | 2011-08-24 |
JP4963306B2 (ja) | 2012-06-27 |
ES2464345T3 (es) | 2014-06-02 |
CN102165487B (zh) | 2014-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4963306B2 (ja) | Foreground region extraction program, foreground region extraction device, and foreground region extraction method | |
KR100450793B1 (ko) | Object extraction device based on matching of region feature values of a region-segmented image, and method thereof | |
WO2011013579A1 (ja) | Image processing device, image processing method, and program | |
US8842911B2 (en) | Luma-based color matching | |
JP2006338313A (ja) | Similar image retrieval method, similar image retrieval system, similar image retrieval program, and recording medium | |
US20130222696A1 (en) | Selecting between clustering techniques for displaying images | |
JP2012199652A (ja) | Image processing device, method, and program | |
JP2008262424A (ja) | Image processing device, control method thereof, and computer program | |
CN105844609B (zh) | Partitioning an image | |
JP2011039944A (ja) | Image processing device, image processing method, and image processing program | |
JP6948787B2 (ja) | Information processing device, method, and program | |
US8423881B2 (en) | Systems and methods for placing visual links to digital media on physical media | |
JP5158974B2 (ja) | Region-of-interest extraction method, program, and image evaluation device | |
JP4870721B2 (ja) | Image feature extraction device, image feature extraction method, program, and recording medium | |
JP4986934B2 (ja) | Graphic region extraction device, graphic region extraction method, program, and recording medium | |
JP5232107B2 (ja) | Image display method, program, image display device, and imaging device | |
JP7362568B2 (ja) | Image retrieval device, image retrieval method, and program | |
US20230156177A1 (en) | Information processing apparatus, information processing method, and storage medium | |
KR101194098B1 (ko) | Method and system for searching similar shaders using texture characteristics | |
JP2013098790A (ja) | Video editing device and control method thereof | |
CN115511909A (zh) | Image segmentation method and device | |
JPH11345308A (ja) | Mosaic image construction method and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980137340.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09816097 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13063334 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2737115 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009816097 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20117009217 Country of ref document: KR Kind code of ref document: A |