JPS5983280A - Pattern recognizing method - Google Patents

Pattern recognizing method

Info

Publication number
JPS5983280A
Authority
JP
Japan
Prior art keywords
pattern
frame
patterns
degree
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP57192825A
Other languages
Japanese (ja)
Inventor
Takaaki Terashita
寺下 隆章
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Holdings Corp
Original Assignee
Fuji Photo Film Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Photo Film Co Ltd filed Critical Fuji Photo Film Co Ltd
Priority to JP57192825A priority Critical patent/JPS5983280A/en
Publication of JPS5983280A publication Critical patent/JPS5983280A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/28Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries


Abstract

PURPOSE: To improve the degree of resemblance between frames separated from each other, by defining either an average of the n-th and (n+1)-th patterns or the (n+1)-th pattern itself as a standard pattern, according to the result of comparing a prescribed value with the degree of resemblance obtained by comparing the two patterns.

CONSTITUTION: The first frame A of the patterns A-D of a negative film 1 is defined as a standard pattern, and the degree of resemblance between frame A and the next frame A' is calculated, outputting a discrimination signal S1-S5. If this degree is at or above a prescribed value S3, the two patterns are judged to resemble or equal each other, and the averaged feature-quantity data of frames A and A' is defined as a new standard pattern 2. Frame B is then compared with pattern 2 to obtain its degree of resemblance, and the pattern of frame B is stored as a standard pattern 3 if the discrimination signal is the prescribed value S2 or lower. Frame A'' is compared with patterns 2 and 3 to calculate its degree of resemblance; since frame A'' resembles pattern 2, frame A'' and pattern 2 are averaged to generate and store a standard pattern 4. Thereafter, the patterns of frames C, A''', and D are recognized similarly to generate standard patterns 5-7.

Description

Detailed Description of the Invention

The present invention relates to a pattern recognition method.

A single negative film, for example, often contains frames photographing the same scene or similar scenes, and frames of the same scene should be printed at the same density. In most cases, frames of the same scene differ in detail according to the position and angle of the camera, the size and position of the main subject, the background, and so on. Large differences therefore arise in the maximum density Dmax and the minimum density Dmin; and even when the difference between Dmax and Dmin is small, a difference appears when the computed output is converted into stepped density keys. As a result, even frames of the same scene end up printed at different densities.

Frames of the same scene are, however, usually photographed consecutively. A known determination method (for example, Japanese Patent Application Laid-Open No. 49-40942) therefore compares the full-area average transmission density (Large Area Transmittance Density; hereinafter simply LATD) and the detected shooting-light quality of several frames before and after the frame under measurement with the detected values of that frame, and judges the frames to be the same scene when the difference lies within a certain value Y. When frames are judged in this way to be the same scene, they are printed at the same density using an intermediate value of the exposure amounts of those frames.

Here, as is clear from Evans' principle, the LATD of most frames takes a nearly constant value. A method that judges from the LATD value alone therefore often determines frames of different scenes, with different image compositions, to be the same scene. Depending on the exposure-control method, frames with the same LATD may still be printed at different densities under the influence of other characteristic values; conversely, when frames of different scenes are judged to be the same scene because their LATD is equal, they are printed at the same density as the preceding and following frames, which is a problem.

For this reason, adjacent frames of a negative film 1 have conventionally been compared as shown in FIG. 1, and similar patterns recognized from their degree of similarity. That is, frame A is compared with the adjacent frame A'; if they are similar, frame A' is treated as the same or a similar pattern; next, frame A' is compared with its immediate neighbor A''; and if these are similar, frame A'' is judged to be the same or a similar scene. Because the same-scene judgment is made only from the similarity of adjacent frames, it is accurate for neighboring frames, but it has the drawback that frames two apart, such as A and A''', may be judged similar even though they are dissimilar.

The object of the present invention is therefore to provide a pattern recognition method free of the drawbacks described above. The invention is explained below.

The present invention concerns a method of recognizing N consecutively arranged patterns by sequential similarity. The n-th pattern (1 ≦ n ≦ N) is compared with the (n+1)-th pattern to obtain their degree of similarity; when the similarity is smaller than (or not greater than) a predetermined value, the (n+1)-th pattern is stored as a standard pattern; when the similarity is equal to or greater than (or greater than) the predetermined value, a standard pattern obtained by averaging the n-th and (n+1)-th patterns is stored. The stored standard patterns are then compared in turn with the remaining patterns, and new standard patterns are formed successively on the basis of the similarity.

Taking as an example the sequential recognition of the patterns A-D of the negative film 1 shown in FIG. 2: the first frame A is taken as a standard pattern, and its degree of similarity to the next frame A' is judged by the similarity calculation described later, which outputs an identification signal in, for example, five grades S1-S5. If the identification signal is at or above a predetermined value (for example S3), the two frames are judged similar or identical, and the averaged feature-quantity data (described later) of frames A and A' becomes a new standard pattern 2. This averaging may be the arithmetic mean of the two frame patterns, or a mean weighted according to the number of frames. Next, the degree of similarity of frame B to the newly set standard pattern 2 is obtained; if the identification signal is S2 or lower, the pattern of frame B is stored as it is as another standard pattern 3. The next frame A'' is then compared with the stored standard patterns 2 and 3 and its degree of similarity to each is calculated; in this case frame A'' is similar to standard pattern 2, so frame A'' and standard pattern 2 are averaged to form and store a new standard pattern 4. Further, the next frame C is compared with the standard patterns 2-4 obtained so far and its degree of similarity to each is calculated; in this example the identification signal for frame C is S2 or lower against every one of the standard patterns 2-4.
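The sequential formation of standard patterns walked through above (frames A, A', B, A'', ...) can be sketched in code. This is an illustrative sketch only, not the patent's implementation: the feature representation, the placeholder similarity measure, and the threshold are assumptions, and the patent's actual measure is the weighted formula (1) given later.

```python
# Illustrative sketch of the sequential standard-pattern procedure described
# above. The feature representation, similarity placeholder, and threshold
# are assumptions, not taken from the patent text.
from dataclasses import dataclass

@dataclass
class StandardPattern:
    features: list[float]   # averaged feature-quantity data of merged frames
    n_frames: int = 1       # how many frames have been merged into this pattern

def similarity(a: list[float], b: list[float]) -> float:
    # Placeholder: negated sum of absolute feature differences, so that a
    # larger value means "more similar" (the patent quantizes similarity into
    # identification signals S1-S5 instead).
    return -sum(abs(x - y) for x, y in zip(a, b))

def recognize(frames: list[list[float]], threshold: float) -> list[StandardPattern]:
    patterns: list[StandardPattern] = []
    for f in frames:
        if not patterns:
            # The first frame itself seeds the first standard pattern.
            patterns.append(StandardPattern(list(f)))
            continue
        best = max(patterns, key=lambda p: similarity(p.features, f))
        if similarity(best.features, f) >= threshold:
            # Similar: merge by a frame-count-weighted average, one of the
            # averaging options the text allows.
            n = best.n_frames
            best.features = [(n * p + x) / (n + 1)
                             for p, x in zip(best.features, f)]
            best.n_frames = n + 1
        else:
            # Dissimilar to every stored pattern: store the frame itself
            # as a new standard pattern.
            patterns.append(StandardPattern(list(f)))
    return patterns
```

Run over four one-dimensional "frames", this groups the three near-identical frames into one averaged standard pattern and leaves the outlier as its own pattern, mirroring how frames A, A', A'' merge while frame B starts a new pattern.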

Since the identification signal for frame C is S2 or lower against every one of the standard patterns 2-4, the pattern of frame C is stored as a new standard pattern 5. Thereafter, frames A''' and D are recognized in the same manner, forming new standard patterns 6 and 7 in this example. In this similarity judgment, when a plurality of standard patterns are judged to be similar, the standard pattern with the highest degree of similarity is taken as the similar pattern, and the new standard pattern is formed from it.

Feature quantities for calculating the pattern similarity include the maximum image density Dmax, the difference between red and green density (DR - DG), the difference between green and blue density (DG - DB), the average densities Di of several subdivisions of the image, the average density D of the whole frame, histograms of density or hue, the spatial-frequency distribution, the shape of a binarized image, and so on. Each of these feature quantities can be obtained by known techniques, and together they constitute the pattern of a film image (frame).

The degree of similarity is calculated by the following formula:

X = K1|ΔN| + K2(|ΔRG| + |ΔGB|) + K3(|ΔUL| + |ΔRL|) + K4|ΔCP| + K5|ΔDB|   ...(1)

where K1 to K5 are constants, |ΔN| is the difference in neutral-gray LATD between the two frames, and

|ΔRG| = |(LATD'(R) - LATD'(G)) - (LATD(R) - LATD(G))|   ...(2)
|ΔGB| = |(LATD'(G) - LATD'(B)) - (LATD(G) - LATD(B))|   ...(3)
|ΔUL| = |(DL' - DU') - (DL - DU)|   ...(4)
|ΔRL| = |(DRP' - DLP') - (DRP - DLP)|   ...(5)
|ΔCP| = |(DC' - DF') - (DC - DF)|   ...(6)
|ΔDB| = |DB' - DB|   ...(7)

(DB is the average of the absolute density differences between adjacent measurement points, and represents the average contrast of the whole frame.)

Here LATD(N) is the full-area average transmission density of neutral gray; LATD(R), LATD(G) and LATD(B) are the full-area average transmission densities of red, green and blue; Dmin and Dmax are the minimum and maximum neutral-gray densities; DC is the average density of the center of the frame, DF that of its periphery, DU that of its upper part, DL that of its lower part, DRP that of its right part, and DLP that of its left part. Characteristic values marked with a prime (') belong to the preceding frame (the standard pattern); unprimed values belong to the current frame whose sameness of scene is being judged.

In determination formula (1), when the identification signal S is at or above a fixed value α (the identification signal S3 in the example above), the preceding frame and the current frame are judged to be the same scene; that is, the same scene is judged when S ≧ α. The fixed value α can be set arbitrarily, and the similarity grades of the identification signal can also be set arbitrarily.

An apparatus for realizing the pattern recognition method of this invention may be configured as shown in FIG. 3. The feature quantities detected by a pattern detection unit 10 are input as pattern data PD to a similarity calculation unit 11, which compares them with the standard patterns stored beforehand in a standard pattern storage unit 14 and calculates the degree of similarity according to the formula above. The identification signal S expressing the calculated degree of similarity of the two patterns is input to a comparison judgment unit 12, where it is compared with the predetermined value α and a comparison result RS is output. When the identification signal S is larger than the predetermined value α, a correction signal CB is sent to a standard pattern correction unit 13, which corrects the standard pattern in question; when the identification signal S is smaller than the fixed value α, the patterns are dissimilar, and the current pattern is stored as a new standard pattern in the standard pattern storage unit 14.

As described above, according to the pattern recognition method of this invention, the similarity judgment is not restricted to adjacent frames, so even if a dissimilar frame occurs in the middle of a group of similar frames, the similar frames can be collected while excluding it. Since the method neither records all patterns first nor groups them against one another afterwards, and since the standard pattern is corrected and stored whenever similarity is judged, the number of standard patterns that must be stored can be reduced. Moreover, because the standard patterns are corrected, the degree of similarity to frames far apart can be improved.

In the case of a long film, all standard patterns may be erased when a film splice is detected, and the first pattern of the film may always be stored as the standard pattern. In short, it suffices to store a standard pattern in advance, prior to pattern recognition of the film.
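The weighted-sum similarity of equation (1) can be sketched as a function over per-frame feature quantities. The dictionary key names, and the reading of the first term as the neutral-gray LATD difference, are assumptions for illustration; larger X means less similar, before any quantization into the identification signals S1-S5.

```python
# A sketch of the similarity score X of equation (1). Key names and the
# first-term reading (neutral-gray LATD difference) are assumptions.
def score(prev: dict, cur: dict, k: tuple[float, ...]) -> float:
    """prev/cur hold per-frame feature quantities: latd_n/r/g/b (full-area
    average transmission densities), d_up/d_low/d_right/d_left/d_center/d_edge
    (regional average densities), d_contrast (mean adjacent-point difference)."""
    d_n  = abs(prev["latd_n"] - cur["latd_n"])                                   # |ΔN|
    d_rg = abs((prev["latd_r"] - prev["latd_g"]) - (cur["latd_r"] - cur["latd_g"]))  # (2)
    d_gb = abs((prev["latd_g"] - prev["latd_b"]) - (cur["latd_g"] - cur["latd_b"]))  # (3)
    d_ul = abs((prev["d_low"] - prev["d_up"]) - (cur["d_low"] - cur["d_up"]))        # (4)
    d_rl = abs((prev["d_right"] - prev["d_left"]) - (cur["d_right"] - cur["d_left"]))  # (5)
    d_cp = abs((prev["d_center"] - prev["d_edge"]) - (cur["d_center"] - cur["d_edge"]))  # (6)
    d_db = abs(prev["d_contrast"] - cur["d_contrast"])                           # (7)
    # Weighted sum of the feature deltas, as in equation (1).
    return (k[0] * d_n + k[1] * (d_rg + d_gb) + k[2] * (d_ul + d_rl)
            + k[3] * d_cp + k[4] * d_db)
```

Two identical frames score X = 0; shifting one color density raises X by the corresponding weighted delta, so a threshold on X (or on its quantized grade S) separates "same scene" from "different scene".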

Brief Description of the Drawings

FIG. 1 is a diagram for explaining a conventional pattern recognition method, FIG. 2 is a diagram for explaining an example of the pattern recognition method according to this invention, and FIG. 3 is a block diagram showing an example of an apparatus for carrying out this invention.

1 ... negative film; 2-7 ... standard patterns; 10 ... pattern detection unit; 11 ... similarity calculation unit; 12 ... comparison judgment unit; 13 ... standard pattern correction unit; 14 ... standard pattern storage unit.

Claims (1)

[Claims] A pattern recognition method for sequentially recognizing N consecutively arranged patterns by similarity, characterized in that the n-th pattern (1 ≦ n < N) is compared with the (n+1)-th pattern to obtain their degree of similarity; when the similarity is smaller than (or not greater than) a predetermined value, the (n+1)-th pattern is stored as a standard pattern; when the similarity is equal to or greater than (or greater than) the predetermined value, a standard pattern obtained by averaging the n-th and (n+1)-th patterns is stored; and the stored standard patterns are sequentially compared with the other patterns, new standard patterns being successively formed on the basis of the similarity.
JP57192825A 1982-11-02 1982-11-02 Pattern recognizing method Pending JPS5983280A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP57192825A JPS5983280A (en) 1982-11-02 1982-11-02 Pattern recognizing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP57192825A JPS5983280A (en) 1982-11-02 1982-11-02 Pattern recognizing method

Publications (1)

Publication Number Publication Date
JPS5983280A true JPS5983280A (en) 1984-05-14

Family

ID=16297598

Family Applications (1)

Application Number Title Priority Date Filing Date
JP57192825A Pending JPS5983280A (en) 1982-11-02 1982-11-02 Pattern recognizing method

Country Status (1)

Country Link
JP (1) JPS5983280A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6069685A (en) * 1994-02-16 2000-05-30 Agfa-Gevaert Ag Method and apparatus for printing high quality prints from photographic negatives
WO2010137267A1 (en) * 2009-05-29 2010-12-02 株式会社 日立ハイテクノロジーズ Method of manufacturing a template matching template, as well as a device for manufacturing a template
JP2010276487A (en) * 2009-05-29 2010-12-09 Hitachi High-Technologies Corp Template creating method for template matching, and template creating apparatus
US8929665B2 (en) 2009-05-29 2015-01-06 Hitachi High-Technologies Corporation Method of manufacturing a template matching template, as well as a device for manufacturing a template
