WO1998031136A1 - Verfahren zur Bestimmung der Geometriedaten von Abtastvorlagen (Method for determining the geometry data of scan originals) - Google Patents
- Publication number: WO1998031136A1 (application PCT/DE1997/002952)
- Authority
- WO
- WIPO (PCT)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3872—Repositioning or masking
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- The invention relates to the field of electronic reproduction technology and concerns a method for automatically determining the position and rotation angle of images to be scanned on a scanner flatbed or a scanner drum.
- Print templates are created for print pages that contain all elements to be printed, such as texts, graphics and images.
- These elements exist in the form of digital data.
- The data is generated, for example, by scanning the image point by point and line by line in a scanner, dividing each pixel into color components and digitizing the color values of these components.
- Images are usually broken down into the color components red, green and blue (R, G, B) in a scanner.
- These components are then further transformed into the printing inks cyan, magenta, yellow and black (C, M, Y, K).
- For black-and-white reproduction, the scanner either generates just one component with gray values, or the initially scanned RGB components are later converted into the printing ink black.
- The scanner can be a flatbed device in which the image originals to be scanned are mounted on a scanner tray.
- The image templates can be transparent (slides or color negatives) or reflective originals.
- The scanner tray is illuminated, and the transmitted or reflected light from a scan line is broken down into color components by color filters. The light of the color components is then further broken down into discrete pixels, for example using a CCD line, and converted into electrical signals, which are then digitized.
- Alternatively, a drum scanner can be used, in which the image originals are mounted on a transparent scanner drum.
- The scanner drum is illuminated point by point from the inside or outside, and the transmitted or reflected light of the color components is focused onto light sensors and converted into electrical signals.
- The scanner drum rotates while the illumination device and the scanning head are moved along the axis of the drum, so that the surface of the scanner drum is scanned point by point and line by line.
- Some flatbed scanners have a device with which the scanner tray can be rotated by an arbitrary predetermined angle. This makes it possible to compensate for skewed mounting of the originals on the scanning surface during scanning. If such a rotating device is not available, the scanned image data can later be rotated in a computing process to correct the skewed mounting.
- Fig. 8 shows the search for a fitted line using the Hough transform, and Fig. 9 shows an example of the result of the processing.
- Fig. 1 shows a scanning surface (1) with some mounted image templates (2).
- The image templates are generally color or black-and-white slides, negatives or reflective originals. In FIG. 1 they are indicated as binary images with only black and white pixels for ease of reproduction.
- The scanning surface is the surface of the scanner tray in a flatbed scanner or the surface of the scanner drum in a drum scanner.
- An overview scan of the scanning surface (1) is first carried out at a coarse resolution, e.g. 30 pixels/cm.
- From the stored RGB data of this scan, an image signal is calculated that reproduces the outlines of the mounted image templates as clearly as possible.
- A brightness component can be obtained, for example, by weighted addition of the RGB data.
- Alternatively, a single color component, e.g. the green portion of the RGB data, can be used as the brightness component.
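The weighted addition can be sketched as follows; the weights used here are standard luminance weights and are an assumption, since the patent does not specify any (and, as noted, a single component such as green may be used instead):

```python
import numpy as np

def brightness_component(rgb, weights=(0.299, 0.587, 0.114)):
    """Brightness image L from an RGB overview scan by weighted
    addition of the components.  The weights are illustrative; as the
    text notes, a single component (e.g. green) could be used instead."""
    rgb = np.asarray(rgb, dtype=float)
    return (weights[0] * rgb[..., 0]
            + weights[1] * rgb[..., 1]
            + weights[2] * rgb[..., 2])
```

Passing `weights=(0.0, 1.0, 0.0)` reproduces the green-only variant mentioned in the text.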
- A white point Lw and a black point Ls are determined from the values of the brightness component.
- For this purpose, the frequencies of all values in the brightness image are preferably determined and plotted in a cumulative histogram.
- The white point Lw is then defined, for example, as the brightness value at which 5% of all brightness values are reached in the histogram. Accordingly, the black point Ls is defined as the brightness value at which 95% of all brightness values are reached in the histogram.
- These percentage values give white and black points that are representative of the image.
- It is not essential for the present invention at which percentage values the white point and the black point are determined in the histogram; any percentage values close to 0% or 100% can be selected. In principle, even the brightness values at 0% and 100%, i.e. the absolutely brightest and darkest values in the brightness image, could be selected as white point and black point. However, the white point and black point may then not be representative of the image if the extreme brightness values at 0% and 100% occur only very rarely in the image.
- If there are empty areas on the scanning surface outside the image originals, the histogram has a very large value at 0%, which reflects these empty areas and is not representative of the white values within the image originals. This influence can be corrected by reducing extremely high values at 0% in the histogram by a certain factor before the histogram is analyzed and the white and black points are determined.
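A minimal sketch of this histogram analysis, assuming 8-bit brightness values with 255 as the brightest value (so that the empty scanner surface appears at the 0% end of the cumulative histogram); the 5%/95% percentages and the damping factor are illustrative choices:

```python
import numpy as np

def white_black_point(L, white_pct=0.05, black_pct=0.95, bare_factor=0.1):
    """Estimate white point Lw and black point Ls from a cumulative
    histogram of the brightness image L.  Assumes 8-bit values with
    255 = brightest.  The brightest bin is damped by `bare_factor`
    so that the bare scanner surface outside the originals does not
    dominate the 0% end of the histogram."""
    hist, _ = np.histogram(L, bins=256, range=(0, 256))
    hist = hist.astype(float)
    hist[255] *= bare_factor            # damp the bare-surface peak at 0%
    cum = np.cumsum(hist[::-1])         # accumulate from brightest to darkest
    total = cum[-1]
    Lw = 255 - np.searchsorted(cum, white_pct * total)
    Ls = 255 - np.searchsorted(cum, black_pct * total)
    return Lw, Ls
```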
- The brightness component is then subjected to digital edge filtering.
- Filters are preferably used which produce high output values on approximately horizontal and vertical edges and thereby emphasize such edges.
- Fig. 3 shows an example of a simple filter for horizontal edges (3) and for vertical edges (4).
- The horizontal filter extends over 2 x 5 pixels.
- The circled point P denotes the position of the current pixel.
- The h_ij values at each position of the filter window are the filter coefficients.
- The filtering is carried out by placing the point P of the filter window over each pixel of the brightness image, multiplying the pixel values L_ij lying under the respective window positions by the coefficients h_ij, and summing the products.
- The result is then normalized to the dynamic range D of the brightness image by multiplying it by 1 / (k1 x D), where k1 is a constant.
- The filter value F_h of each pixel is therefore: F_h = 1 / (k1 x D) x Σ h_ij x L_ij
- The filter values F_h and F_v of the horizontal and vertical edge filtering are then combined according to the invention into a resulting filter value F.
- The absolute values of F_h and F_v are preferably compared for each pixel, and the respectively larger value is taken as the resulting filter value F. This gives: F = Vz_max x max(|F_h|, |F_v|) (4)
- Here Vz_max is the sign of the selected maximum value.
- The shape and coefficients of the edge filters shown in FIG. 3 are not essential for the present invention. Filter windows with more or fewer than 2 x 5 pixels and with other coefficients can also be used. It is only important that the filtering mainly highlights horizontal and vertical edges. Combination functions other than that of equation (4) can also be used, for example the sum of the absolute values F = |F_h| + |F_v|.
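The filtering and the combination according to equation (4) can be sketched as follows; the window coefficients are illustrative stand-ins, since the Fig. 3 coefficients are not reproduced in the text:

```python
import numpy as np

def edge_filter(L, k1=1.0):
    """Horizontal and vertical edge filtering with 2 x 5 windows,
    normalization by 1/(k1*D), and combination by the signed maximum
    of the two magnitudes (equation (4)).  The coefficients below are
    illustrative; the patent's Fig. 3 values are not reproduced here."""
    L = np.asarray(L, dtype=float)
    D = L.max() - L.min()               # dynamic range of the brightness image
    if D == 0:
        D = 1.0
    h = np.array([[-1, -1, -1, -1, -1],
                  [ 1,  1,  1,  1,  1]], dtype=float)  # horizontal edges
    v = h.T                                            # vertical edges

    def filt(win):
        m, n = win.shape
        pm, pn = m // 2, n // 2
        Lp = np.pad(L, ((pm, m - 1 - pm), (pn, n - 1 - pn)), mode='edge')
        out = np.zeros_like(L)
        for i in range(m):              # correlation with the filter window
            for j in range(n):
                out += win[i, j] * Lp[i:i + L.shape[0], j:j + L.shape[1]]
        return out / (k1 * D)

    Fh, Fv = filt(h), filt(v)
    # the value with the larger magnitude wins, its sign is preserved
    return np.where(np.abs(Fh) >= np.abs(Fv), Fh, Fv)
```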
- The filtered brightness image F is converted into a binary image B with only the two values 0 and 1 by comparing the filter values F with threshold values.
- An upper threshold value S1 and a lower threshold value S2 are formed.
- Filter values F which lie above S1 or below S2 are converted into the binary value 1, and filter values which lie between S1 and S2 into the binary value 0.
- The threshold value decision and the generation of the binary image B are illustrated for a section of a filtered image F.
- The goal of the threshold value decision is to reproduce in the binary image only the highest filter values, which represent the horizontal and vertical edges, and to suppress the remaining filter values.
- FIG. 5 shows the binary image generated for the example from FIG. 1, with binary value 0 shown as white pixels and binary value 1 as black pixels. It is not essential for the present invention that the threshold decision is carried out exactly according to equations (5) and (6). It is only important that the threshold values are selected so that the binary image B predominantly reproduces only those filter values F which correspond to the horizontal and vertical edges in the brightness image L. Nor do two threshold values S1 and S2 need to be used; a single threshold value with which, e.g., the absolute value of the filter values F is compared is sufficient.
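A minimal sketch of the single-threshold variant mentioned above; the choice S = k2 x max|F| is an assumption, since equations (5) and (6) are not reproduced in the text:

```python
import numpy as np

def binarize(F, k2=0.5):
    """Threshold decision producing the binary image B.  As the text
    notes, a single threshold on the absolute filter values suffices;
    S = k2 * max|F| is an illustrative choice (the patent's equations
    (5) and (6) define an upper and a lower threshold instead)."""
    F = np.asarray(F, dtype=float)
    S = k2 * np.abs(F).max()
    return (np.abs(F) >= S).astype(np.uint8)
```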
- The contours in the binary image B are then analyzed.
- A first contour point, i.e. a pixel with the binary value 1, is searched for line by line and pixel by pixel. From this starting point, the contour is followed pixel by pixel until the starting point is reached again.
- Various known methods can be used for contour tracking.
- FIG. 6 shows an example of a mask over 3 x 3 pixels for a preferred method of contour tracking.
- The central point P of the mask is placed on the starting point of the contour, and the eight neighboring pixels are examined clockwise to determine whether they have the binary value 1.
- If a neighboring pixel with the binary value 1 is found, the examination mask is shifted to that pixel and the examination of the eight neighboring pixels starts again. This continues until the starting point is reached again.
- the order in which the neighboring pixels are examined is shown in FIG. 6 by the entered numbers 1... 8.
- Various criteria are used to check whether a found contour is the outline of an image template or something else, e.g. a scratch or residue from an adhesive tape.
- A preferred criterion is that the contour must have a minimum length, e.g. 150 mm, to be interpreted as the outline of an image.
- In addition, the area enclosed by the contour must have a minimum width and height, e.g. 20 mm.
- A contour that is not an image template outline according to these criteria is deleted from the binary image B.
- The inside of a found image outline is also deleted, since the contours contained therein are not relevant for the further investigation. Then a new starting point is sought and the next contour is analyzed, until all contours in the binary image have been processed.
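The contour tracking and the plausibility criteria can be sketched as follows; this is a simplified Moore-neighbour trace, and the clockwise neighbour order used here is an assumption (the exact numbering 1...8 of Fig. 6 is not reproduced in the text):

```python
import numpy as np

# Clockwise order of the eight neighbours around P (assumed ordering)
NEIGHBOURS = [(-1, 0), (-1, 1), (0, 1), (1, 1),
              (1, 0), (1, -1), (0, -1), (-1, -1)]

def trace_contour(B, start):
    """Follow a contour in binary image B pixel by pixel from `start`
    (found by line-by-line scan, so its west neighbour is background)
    until the starting point is reached again."""
    h, w = B.shape
    contour = [start]
    prev, cur = (start[0], start[1] - 1), start
    while True:
        d = NEIGHBOURS.index((prev[0] - cur[0], prev[1] - cur[1]))
        nxt = None
        for k in range(1, 9):            # scan clockwise, starting after prev
            dy, dx = NEIGHBOURS[(d + k) % 8]
            y, x = cur[0] + dy, cur[1] + dx
            if 0 <= y < h and 0 <= x < w and B[y, x] == 1:
                nxt = (y, x)
                dyp, dxp = NEIGHBOURS[(d + k - 1) % 8]
                prev = (cur[0] + dyp, cur[1] + dxp)   # new backtrack point
                break
        if nxt is None or nxt == start:
            break
        contour.append(nxt)
        cur = nxt
    return contour

def is_image_outline(contour, px_per_mm=3.0, min_len_mm=150, min_side_mm=20):
    """Criteria from the text: minimum contour length and minimum width
    and height of the enclosed area (3 px/mm matches the 30 px/cm
    overview scan; the point count approximates the contour length)."""
    ys = [p[0] for p in contour]
    xs = [p[1] for p in contour]
    length_mm = len(contour) / px_per_mm
    w_mm = (max(xs) - min(xs)) / px_per_mm
    h_mm = (max(ys) - min(ys)) / px_per_mm
    return length_mm >= min_len_mm and w_mm >= min_side_mm and h_mm >= min_side_mm
```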
- FIG. 7 shows the result of the contour analysis for the example from FIG. 1. In comparison to the binary image B in FIG. 5, only the contours remain which are the outlines of image templates.
- FIG. 8 shows the next processing step of the invention, in which an optimally fitted straight line is determined for each of the four sides of a found image template outline.
- According to the invention, a method is used which is known in image processing as the Hough transformation (H. Bässmann, P. W. Besslich: Bildverarbeitung Ad Oculos, pp. 101-121, Springer Verlag 1993).
- First, the circumscribing rectangle (5) of the outline with the corner points A, B, C, D is formed, whose sides are parallel to the main and secondary scanning directions.
- The straight line on which most of the outline points lie is selected as the optimally fitted straight line for this side of the outline.
- Figure 8 shows the search area for the left side of the outline.
- A point G is defined at a distance s from the corner point A along a horizontal line.
- Through G, straight lines (6) are laid at different angles φ.
- For each of these lines it is checked how many points of the outline lie on the line. This number is entered in a (φ, s) matrix (7) in the column and row defined by φ and s.
- Each cell of the matrix corresponds to one of the straight lines tested.
- By varying s and φ, a large number of straight lines are examined in this way. Since an approximately vertical straight line is sought in this case, the parameter s can be restricted to a narrow strip and φ to a small angular range in order to reduce the processing time required.
- After all candidate lines have been tested, the cell of the matrix with the highest count is determined; its values of s and φ define the straight line that most accurately represents the corresponding side of the image outline.
- The search and determination of the optimally fitted straight line for the remaining three sides of the image outline takes place in the same way as described for FIG. 8.
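The restricted Hough search can be sketched as follows; the exact parameterisation of the candidate lines through G and the hit tolerance `tol` are assumptions based on the description of Fig. 8:

```python
import numpy as np

def best_fit_line(points, s_range, phi_range_deg, x0, y0, tol=0.5):
    """Restricted Hough search for the near-vertical straight line on
    which the most outline points lie.  A candidate line passes through
    G = (x0 + s, y0) at angle phi from the vertical; s is restricted to
    a narrow strip and phi to a small angular range, as in the text."""
    best = (0, None, None)
    for s in s_range:                      # strip of horizontal offsets
        for phi in phi_range_deg:          # small angular range
            t = np.tan(np.radians(phi))
            # line: x = x0 + s + (y - y0) * tan(phi)
            hits = sum(1 for (y, x) in points
                       if abs(x - (x0 + s + (y - y0) * t)) <= tol)
            if hits > best[0]:             # cell with the highest count
                best = (hits, s, phi)
    return best                            # (votes, s, phi)
```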
- A scanning rectangle is then formed from the fitted straight lines. This can be done in various ways.
- A preferred method is:
  a) Averaging the angles of all four straight lines (with 90° being added or subtracted for two of them). The angles are weighted with their values from the Hough transformation, since an angle is more reliable the more contour points were found for the corresponding straight line.
  b) Checking whether an angle deviates from the mean by more than a certain amount. If so, the mean is formed from the remaining three straight lines only.
  c) Determining the scanning rectangle with the four straight lines using the mean angle (modified by 90° for two of the straight lines).
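Steps a) and b) can be sketched as follows; the outlier tolerance `max_dev` and the folding of the two near-90° sides onto the near-0° orientation are illustrative choices:

```python
import numpy as np

def rectangle_angle(angles_deg, votes, max_dev=2.0):
    """Weighted mean of the four fitted-line angles, with 90 deg
    subtracted from the two near-vertical sides so all four measure
    the same orientation.  Weights are the Hough votes; an angle
    deviating from the mean by more than `max_dev` degrees (an
    illustrative tolerance) is dropped and the mean recomputed."""
    a = np.asarray(angles_deg, dtype=float)
    a = np.where(a > 45.0, a - 90.0, a)    # fold ~90 deg sides onto ~0 deg
    w = np.asarray(votes, dtype=float)
    mean = np.average(a, weights=w)
    dev = np.abs(a - mean)
    if dev.max() > max_dev:
        keep = dev < dev.max()             # drop the worst outlier(s)
        if keep.any():
            mean = np.average(a[keep], weights=w[keep])
    return mean
```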
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10530446A JP2000508461A (ja) | 1997-01-08 | 1997-12-18 | スキャン原画の幾何学的データの決定方法 |
EP97952733A EP0950310A1 (de) | 1997-01-08 | 1997-12-18 | Verfahren zur bestimmung der geometriedaten von abtastvorlagen |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE19700318.4 | 1997-01-08 | ||
DE19700318A DE19700318A1 (de) | 1997-01-08 | 1997-01-08 | Verfahren zur Bestimmung der Geometriedaten von Abtastvorlagen |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1998031136A1 (de) | 1998-07-16 |
Family
ID=7816924
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE1997/002952 WO1998031136A1 (de) | 1997-01-08 | 1997-12-18 | Verfahren zur bestimmung der geometriedaten von abtastvorlagen |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP0950310A1 (de) |
JP (1) | JP2000508461A (de) |
DE (1) | DE19700318A1 (de) |
WO (1) | WO1998031136A1 (de) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000222386A (ja) * | 1998-11-25 | 2000-08-11 | Advantest Corp | 積和回路及び傾き検出装置 |
FR2803157B1 (fr) * | 1999-12-22 | 2002-12-13 | Sagem | Procede d'analyse par scanner et appareil a scanner a determination automatique de la zone a analyser |
DE60218916T2 (de) * | 2001-01-17 | 2007-11-29 | Fujifilm Corp. | Verfahren, Vorrichtung und Programmspeichermedium zur Bestimmung des Umrisses eines gescannten Bildes |
JP4238749B2 (ja) * | 2004-03-10 | 2009-03-18 | カシオ計算機株式会社 | 画像処理装置、画像投影装置、画像処理方法及びプログラム |
JP2005267457A (ja) * | 2004-03-19 | 2005-09-29 | Casio Comput Co Ltd | 画像処理装置、撮影装置、画像処理方法及びプログラム |
FR2945649A1 (fr) * | 2009-05-18 | 2010-11-19 | St Ericsson Sa St Ericsson Ltd | Procede et dispositif de traitement d'une image numerique. |
CN113191272A (zh) * | 2021-04-30 | 2021-07-30 | 杭州品茗安控信息技术股份有限公司 | 一种工程图像的识别方法、识别系统及相关装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4774569A (en) * | 1987-07-24 | 1988-09-27 | Eastman Kodak Company | Method for adaptively masking off a video window in an overscanned image |
US5054098A (en) * | 1990-05-21 | 1991-10-01 | Eastman Kodak Company | Method of detecting the skew angle of a printed business form |
EP0509549A2 (de) * | 1991-04-19 | 1992-10-21 | Fuji Photo Film Co., Ltd. | Abtastlesemethode |
WO1995012271A1 (en) * | 1993-10-25 | 1995-05-04 | Visioneer, Inc. | Method and apparatus for document skew and size/shape detection |
US5568571A (en) * | 1992-12-14 | 1996-10-22 | University Microfilms, Inc. | Image enhancement system |
US5629989A (en) * | 1993-04-27 | 1997-05-13 | Honda Giken Kogyo Kabushiki Kaisha | Image line-segment extracting apparatus |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS50100931A (de) * | 1973-12-31 | 1975-08-11 | ||
US4325086A (en) * | 1979-04-20 | 1982-04-13 | Canon Kabushiki Kaisha | Recording device |
GB2130837B (en) * | 1982-10-01 | 1987-04-23 | Canon Kk | Facsimile processing control |
JPS61282978A (ja) * | 1985-06-07 | 1986-12-13 | Toyota Motor Corp | 形状判定装置 |
JPH07113969B2 (ja) * | 1986-03-05 | 1995-12-06 | キヤノン株式会社 | 画像処理方法 |
JPH0820367B2 (ja) * | 1991-04-19 | 1996-03-04 | 株式会社イナックス | タイルユニットの検査方法 |
JPH05258146A (ja) * | 1992-03-13 | 1993-10-08 | Glory Ltd | 紙葉類の斜行データ補正装置 |
US5452374A (en) * | 1992-04-06 | 1995-09-19 | Ricoh Corporation | Skew detection and correction of a document image representation |
JPH05344318A (ja) * | 1992-06-10 | 1993-12-24 | Canon Inc | 画像入力装置 |
JPH07220066A (ja) * | 1994-01-28 | 1995-08-18 | Matsushita Electric Ind Co Ltd | 画像処理装置 |
US5528387A (en) * | 1994-11-23 | 1996-06-18 | Xerox Corporation | Electronic image registration for a scanner |
JPH08294007A (ja) * | 1995-04-20 | 1996-11-05 | Mita Ind Co Ltd | 画像処理装置 |
1997
- 1997-01-08 DE DE19700318A patent/DE19700318A1/de not_active Withdrawn
- 1997-12-18 EP EP97952733A patent/EP0950310A1/de not_active Ceased
- 1997-12-18 JP JP10530446A patent/JP2000508461A/ja active Pending
- 1997-12-18 WO PCT/DE1997/002952 patent/WO1998031136A1/de not_active Application Discontinuation
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1909215A2 (de) | 2006-09-27 | 2008-04-09 | Fujitsu Limited | Verfahren zum Erkennen eines Bildbereichs, Aufzeichnungsmedium und Vorrichtung dafür |
EP1909215A3 (de) * | 2006-09-27 | 2008-07-02 | Fujitsu Limited | Verfahren zum Erkennen eines Bildbereichs, Aufzeichnungsmedium und Vorrichtung dafür |
US7813553B2 (en) | 2006-09-27 | 2010-10-12 | Fujitsu Limited | Image region detection method, recording medium, and device therefor |
Also Published As
Publication number | Publication date |
---|---|
DE19700318A1 (de) | 1998-07-09 |
EP0950310A1 (de) | 1999-10-20 |
JP2000508461A (ja) | 2000-07-04 |
Legal Events

- AK (Designated states): Kind code of ref document: A1; Designated state(s): JP US
- AL (Designated countries for regional patents): Kind code of ref document: A1; Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE
- DFPE: Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
- 121: Ep: the epo has been informed by wipo that ep was designated in this application
- WWE (Wipo information: entry into national phase): Ref document number: 1997952733; Country of ref document: EP
- ENP (Entry into the national phase): Ref country code: JP; Ref document number: 1998 530446; Kind code of ref document: A; Format of ref document f/p: F
- WWE (Wipo information: entry into national phase): Ref document number: 09341157; Country of ref document: US
- WWP (Wipo information: published in national office): Ref document number: 1997952733; Country of ref document: EP
- WWR (Wipo information: refused in national office): Ref document number: 1997952733; Country of ref document: EP
- WWW (Wipo information: withdrawn in national office): Ref document number: 1997952733; Country of ref document: EP