WO2019107141A1 - Image processing device and image processing method - Google Patents

Image processing device and image processing method

Info

Publication number
WO2019107141A1
Authority
WO
WIPO (PCT)
Prior art keywords
edge
projection
medium
image
axis
Prior art date
Application number
PCT/JP2018/041980
Other languages
French (fr)
Japanese (ja)
Inventor
中村 宏
Original Assignee
日本電産サンキョー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電産サンキョー株式会社
Publication of WO2019107141A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras


Abstract

Provided is a technique that can reduce the computational load of image processing for detecting the rotation angle of a rectangular medium on an image. In order to detect the rotation angle of a card 100 on an image, this image processing device is provided with: a projection generation unit 630 that generates projections of pixel values through luminance projection onto the X axis and the Y axis of the image data of the card; an end point detection unit 640 that determines both end points of the projection pattern of the projection waveform for each of the projections onto the X axis and the Y axis; a processing-target-area-outside removal unit 650 that extracts the rectangular region formed by the end points as the area to be processed; a medium edge position deviation calculation unit 660 that draws two lines parallel to the X axis at positions passing through the area, obtains the edge positions on the parallel lines, and obtains the edge interval; and an angle calculation unit 670 that calculates an inclination angle from the edge interval and the distance between the two parallel lines.

Description

Image processing apparatus and image processing method
The present invention relates to an image processing apparatus and an image processing method for processing information on a rectangular medium.
The Hough transform is well known as a method of detecting and recognizing, in a digital image, a straight line or a specific geometric figure composed of straight lines on a rectangular medium to be processed, and it is widely used in various industrial fields.
In this method, the image containing the target figure is generally subjected to preprocessing such as noise removal and edge enhancement, the extracted image pattern is then converted into accumulation points by the Hough transform, and an inverse Hough transform is finally applied to the accumulation point with the largest accumulated frequency to obtain a straight line in the image space (see, for example, Patent Documents 1 and 2).
In the technique described in Patent Document 1, position coordinates are detected by the Hough transform for positioning a workpiece in a semiconductor device manufacturing process and are used for position correction. After the maximum accumulation point is obtained and unnecessary patterns are removed, an inverse Hough transform is applied to obtain a reference straight line, from which an angle correction for the workpiece is derived.
In the technique described in Patent Document 2, the Hough transform is applied to estimate line segments that form part of the periphery of an object when calculating shape features of a polygon. The number of points mapped to each polar coordinate is counted, polar coordinates with high frequencies are selected, a straight-line equation is calculated by inverse Hough transform for each selected polar coordinate, and the rotation angle is detected.
Patent Document 1: Japanese Patent Application Laid-Open No. H11-97512
Patent Document 2: Japanese Patent Application Laid-Open No. 2010-79643
However, the techniques of Patent Document 1 and Patent Document 2 require finding the maximum accumulation point in the Hough space. Finding the maximum accumulation point in the Hough space amounts to finding a local maximum in a two-dimensional space, which is generally not easy because noise caused by unnecessary patterns must be removed, and the computational load is heavy. High-speed processing therefore requires a high-performance processor, which leads to increased cost.
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an image processing technique for detecting the rotation angle of a rectangular medium on an image that can reduce the computational load.
The present invention is an image processing apparatus that detects the rotation angle of a rectangular medium on an image using a digital image, and comprises: a projection generation unit that generates projections of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the rectangular medium; an end point detection unit that determines, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of the projection pattern of the projection waveform; a processing-target-area-outside removal unit that takes the rectangular portion defined by the four end points (left, right, top, bottom) as the processing target and excludes the remaining region as outside the target area; a medium edge point deviation calculation unit that draws two parallel lines through the processing target area, at least parallel to the horizontal axis or the vertical axis, obtains the edge position of the rectangular medium on each parallel line, and obtains the edge interval between the two edge positions; and an angle calculation unit that calculates an inclination angle from the edge interval and the separation distance between the two parallel lines.
With this configuration, the image processing apparatus of the present invention can detect the rotation angle of a rectangular medium on an image using a digital image, and can therefore reduce the computational load without using the Hough transform.
In the present invention, the medium edge point deviation calculation unit preferably obtains the edge interval at one location on each of two opposing sides of the rectangular medium with respect to the two parallel lines, and the angle calculation unit preferably calculates two inclination angles from the edge intervals at the two locations and the separation distance between the two parallel lines, and takes their average value as the rotation angle.
Because the two parallel lines yield a pair of edge positions on each of the two corresponding opposing sides of the rectangular medium, inclination angles can be obtained at the two opposing locations. Taking the average of these inclination angles as the rotation angle of the rectangular medium further improves detection accuracy.
Furthermore, in the present invention, the medium edge point deviation calculation unit preferably draws at least one set of the two parallel lines parallel to each of the horizontal axis and the vertical axis and obtains the edge interval and the separation distance for each set of two parallel lines, and the angle calculation unit preferably calculates the inclination angle for each set of the obtained edge interval and separation distance and takes their average value as the rotation angle.
Since one or more sets of parallel lines are drawn parallel to each of the horizontal axis and the vertical axis, one or more inclination angles are obtained on each of the long side and the short side of the rectangular medium. Taking the average of these inclination angles as the rotation angle of the rectangular medium further improves detection accuracy. If inclination angles are obtained on all four sides of the rectangular medium and their average is taken as the rotation angle, detection accuracy can be improved even further.
The present invention is also an image processing method for detecting the rotation angle of a rectangular medium on an image using a digital image, comprising: a projection generation step of generating projections of pixel values by luminance projection onto each of the horizontal axis and the vertical axis of the image data of the rectangular medium; an end point detection step of determining, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of the projection pattern of the projection waveform; a processing-target-area-outside removal step of taking the rectangular portion defined by the four end points as the processing target and excluding the remaining region as outside the target area; a medium edge point deviation calculation step of drawing two parallel lines through the processing target area, at least parallel to the horizontal axis or the vertical axis, obtaining the edge position of the rectangular medium on each parallel line, and obtaining the edge interval between the two edge positions; and an angle calculation step of calculating an inclination angle from the edge interval and the separation distance between the two parallel lines.
With this method, the rotation angle of a rectangular medium on an image can be detected using a digital image, and the computational load can therefore be reduced without using the Hough transform.
According to the present invention, the rotation angle of a rectangular medium on an image can be detected using a digital image. The computational load can therefore be reduced without using the Hough transform.
FIG. 1 is a diagram showing a configuration example of the main parts of an image processing apparatus according to an embodiment.
FIG. 2 is a diagram schematically showing the appearance of a card, which is an example of a rectangular medium, according to the embodiment.
FIG. 3 is a block diagram showing a configuration example of a data processing unit in the image processing apparatus according to the embodiment.
FIG. 4 is a flowchart showing medium rotation angle detection processing by the data processing unit according to the embodiment.
FIG. 5 is a diagram showing an example of an image of the captured card and of luminance projections according to the embodiment.
FIG. 6 is a diagram showing the relationship between the extracted rectangular area and the first and second horizontal lines according to the embodiment.
FIG. 7 is a diagram showing the change in luminance near the medium edge position on the parallel lines (first horizontal line, second horizontal line) according to the embodiment.
FIG. 8 is a diagram showing the relationship between the extracted rectangular area and the first and second vertical lines according to the embodiment.
Hereinafter, a mode for carrying out the invention (hereinafter referred to as the "embodiment") will be described with reference to the drawings.
FIG. 1 is a diagram showing a configuration example of the main parts of an image processing apparatus 10 according to the embodiment. FIG. 2 is a diagram schematically showing the appearance of a card 100, which is an example of a rectangular medium.
The image processing apparatus 10 is an apparatus that detects the rotation angle of the rectangular medium 100 on an image using a digital image.
The rectangular medium 100 is a general card conforming to JIS (hereinafter referred to as the "card"). The card 100 has a rectangular shape and is, for example, a plastic card with a width of 86 mm, a height of 54 mm, and a thickness of 0.76 mm. The rectangular medium is not limited to the card 100 described above and may be, for example, a passport book with a width of 125 mm and a height of 88 mm. It is also not limited to a plastic card or a passport book; an ID card or a driver's license may be used.
In the present embodiment, in FIG. 2, the longitudinal direction of the card 100 is the X-axis direction and the transverse direction is the Y-axis direction. To simplify the description, as shown in FIG. 2, the direction of the information 110 such as a character string (for example, OCR characters) formed in the recording area 120, that is, the direction in which the characters are arranged, is the X-axis direction. The direction orthogonal to the X-axis direction is the Y-axis direction; in other words, the direction orthogonal to the direction in which the characters are arranged is the Y-axis direction.
As shown in FIG. 1, the image processing apparatus 10 includes a table 20 on which the card 100 is placed, an image reading unit 30 serving as an image data input unit, an analog-to-digital converter (A/D converter) 40, an image memory 50, and a data processing unit 60.
The image reading unit 30 has a CCD (Charge Coupled Device) image sensor as a solid-state imaging device (image sensor) using photoelectric conversion elements that detect light and generate electric charge, and an optical system (lens and the like) that guides incident light to the pixel area of the image sensor to form a subject image. It images a predetermined area including the entire card 100 placed on the table 20 and illuminated by an illumination light source 31. A CMOS (Complementary Metal Oxide Semiconductor) image sensor may be used instead as the solid-state imaging device (image sensor).
The A/D converter 40 converts the image including the card 100 captured by the image reading unit 30 into digital image data and stores the digital image data in the image memory 50. The function of the A/D converter 40 may also be incorporated into the image reading unit 30.
The image memory 50 stores the digitized image data of the card 100 including the information 110, such as an OCR character string, captured by the image reading unit 30. The original image stored in the image memory 50 is formed by a plurality of pixels arranged in a matrix; specifically, although not shown, M rows of pixels in the X-axis direction and N columns of pixels in the Y-axis direction are arranged. Each pixel has a pixel value (luminance value). In the present embodiment, each pixel value expressed in 8 bits takes a value between 0 and 255; the closer to black, the smaller the pixel value, and the closer to white, the larger the value. The image memory 50 may be anything capable of storing image data, such as RAM, SDRAM, DDR SDRAM, or RDRAM.
The data processing unit 60 has a function of detecting the rotation angle of the card 100 on the image using the digital image and of recognizing the information 110, such as characters or barcodes, recorded on the card 100 with reference to the detected rotation angle. The data processing unit 60 is configured as part of a CPU or the like that performs overall control of the image processing apparatus 10.
[Configuration and Function of Each Unit of the Data Processing Unit 60]
Next, the basic configuration and function of each unit of the data processing unit 60 will be described.
The data processing unit 60 reads multi-valued image data (a multi-gradation gray-scale image, for example, 256 gradations) from the image memory 50.
FIG. 3 is a block diagram showing a configuration example of the data processing unit 60 in the image processing apparatus 10. FIG. 4 is a flowchart showing the medium rotation angle detection processing performed by the data processing unit 60. FIG. 5 is a diagram showing an example of an image of the captured card 100 and of luminance projections (hereinafter simply referred to as "projections"). FIG. 5(a) is an example of the image data IMG obtained by imaging the card 100, FIG. 5(b) is an example of the projection onto the horizontal axis (X axis), and FIG. 5(c) is an example of the projection onto the vertical axis (Y axis). In FIG. 5(b), the horizontal axis indicates the X-axis position and the vertical axis indicates the pixel value P. In FIG. 5(c), the horizontal axis indicates the pixel value P and the vertical axis indicates the Y-axis position.
The data processing unit 60 includes a projection generation unit 630, an end point detection unit 640, a processing-target-area-outside removal unit 650, a medium edge position deviation calculation unit 660, and an angle calculation unit 670, and performs the medium rotation angle detection processing.
The projection generation unit 630 executes the projection generation process S10. Specifically, the projection generation unit 630 obtains the image data IMG from the image memory 50 and generates, for each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image IMG shown in FIG. 5(a), a first projection prjX (FIG. 5(b)) and a second projection prjY (FIG. 5(c)) of pixel values by luminance projection.
Here, the first projection prjX is obtained by averaging the pixel values (luminance values) of each line in the direction perpendicular to the X axis. The second projection prjY is obtained by averaging the pixel values (luminance values) of each line in the direction perpendicular to the Y axis.
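For illustration only (this code is not part of the original disclosure), a minimal Python/NumPy sketch of the projection generation step S10 is given below; the array layout (rows indexed along the Y axis, columns along the X axis), the function name, and the use of a mean rather than a sum are assumptions based on the description above.

```python
import numpy as np

def luminance_projections(img):
    """Compute the first and second projections of a 2-D grayscale image.

    prjX[x] is the average pixel value of the line perpendicular to the X axis
    at position x (one image column); prjY[y] is the average pixel value of the
    line perpendicular to the Y axis at position y (one image row).
    """
    prj_x = img.mean(axis=0)   # one value per X position
    prj_y = img.mean(axis=1)   # one value per Y position
    return prj_x, prj_y

# Toy example: a bright card-like region on a darker background.
img = np.full((480, 640), 30, dtype=np.uint8)
img[140:340, 170:470] = 220
prjX, prjY = luminance_projections(img.astype(float))
```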
The end point detection unit 640 executes the end point detection process S12. Specifically, based on the first projection prjX and the second projection prjY, the end point detection unit 640 detects the four end points of the area containing the card 100 in order to extract that area. More specifically, the end point detection unit 640 first finds, in the first projection prjX, the points at which the output value becomes minimal at the left end and at the right end, and sets these points as the left end point XL and the right end point XR. Similarly, in the second projection prjY, the end point detection unit 640 finds the points at which the output value becomes minimal at the upper end and at the lower end, and sets these points as the upper end point YU and the lower end point YL.
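The patent does not define the size of the "left end portion" and "right end portion" in which the minima are searched, so the sketch below simply splits each projection in half; that split, like the function name, is an assumption made for illustration.

```python
import numpy as np

def detect_end_points(prj_x, prj_y):
    """Return (XL, XR, YU, YL) from the two luminance projections.

    Each end point is taken, literally following the description, as the index
    of the minimum output value in the corresponding end portion of the
    projection (here: the left/right halves of prjX and the upper/lower halves
    of prjY).
    """
    mid_x = len(prj_x) // 2
    XL = int(np.argmin(prj_x[:mid_x]))           # minimum in the left end portion
    XR = mid_x + int(np.argmin(prj_x[mid_x:]))   # minimum in the right end portion
    mid_y = len(prj_y) // 2
    YU = int(np.argmin(prj_y[:mid_y]))           # minimum in the upper end portion
    YL = mid_y + int(np.argmin(prj_y[mid_y:]))   # minimum in the lower end portion
    return XL, XR, YU, YL
```

In practice the extent of each "end portion" would have to be tuned to the capture setup, since a perfectly uniform background makes the minimum position ambiguous.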
The processing-target-area-outside removal unit 650 executes the processing-target-area-outside removal process S14. Specifically, the processing-target-area-outside removal unit 650 cuts out, as the processing target area, the rectangular area 150 whose four vertices are given by the positions of the two end points on the horizontal axis X (left end point XL, right end point XR) and the two end points on the vertical axis Y (upper end point YU, lower end point YL), and removes the other areas as outside the processing target area.
That is, the processing-target-area-outside removal unit 650 cuts out the rectangle ABCD (rectangular area 150) formed by point A (XL, YU), point B (XL, YL), point C (XR, YL), and point D (XR, YU). The parts unnecessary for the processing are thereby separated.
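Given the four end points, cutting out rectangle ABCD reduces to array slicing; the inclusive indexing convention below is an implementation choice, not something stated in the patent.

```python
import numpy as np

def cut_out_target_area(img, XL, XR, YU, YL):
    """Return the rectangular area 150 (rectangle ABCD) as a new array and
    discard everything outside it."""
    return img[YU:YL + 1, XL:XR + 1].copy()
```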
The medium edge position deviation calculation unit 660 executes the medium edge position deviation calculation process S16. Specifically, as shown in FIG. 6, the medium edge position deviation calculation unit 660 draws two parallel lines (first horizontal line L1, second horizontal line L2) at positions passing through the rectangular area 150, which is the processing target area. Here, as illustrated, the lower line is the first horizontal line L1 and the upper line is the second horizontal line L2. In the present embodiment, the first horizontal line L1 consists of the pixels (XL to XR) of row Y1 in the X-axis direction, and the second horizontal line L2 similarly consists of the pixels (XL to XR) of row Y2 in the X-axis direction.
The medium edge position deviation calculation unit 660 obtains the medium edge positions (first edge position X1, second edge position X2) on the respective parallel lines (first horizontal line L1, second horizontal line L2), and obtains the distance W between the two horizontal edge positions by the following equation 1.
W = |X1 - X2|   (Equation 1)
The two parallel lines (first horizontal line L1, second horizontal line L2) are drawn at positions where the medium edge positions (first edge position X1, second edge position X2) are found on the same side of the card 100. For example, they are placed at the same predetermined offset above and below the vertical center of the rectangular area 150. The predetermined offset can be about 1/4 of the vertical length.
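As a sketch of this placement rule (the description only says "about 1/4", so the integer arithmetic and the assumption that row indices increase downward are illustrative):

```python
def horizontal_line_rows(YU, YL):
    """Pick the rows of the two parallel lines inside the rectangular area:
    Y1 (lower line L1) and Y2 (upper line L2), offset by about a quarter of
    the area's height from its vertical centre."""
    centre = (YU + YL) // 2
    offset = (YL - YU) // 4
    return centre + offset, centre - offset   # (Y1, Y2)
```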
FIG. 7 shows the change in luminance near the medium edge position on the parallel lines (first horizontal line L1, second horizontal line L2). FIG. 7(a) corresponds to the area Q1 on the left side of the rectangular area 150, and FIG. 7(b) corresponds to the area Q2 on the right side of the rectangular area 150.
Here, an example in which the medium edge positions (first edge position X1, second edge position X2) are determined in the area Q1 on the left side of the rectangular area 150 is described.
Let the positions of the parallel lines (first horizontal line L1, second horizontal line L2) be the first horizontal line position Y1 and the second horizontal line position Y2; the vertical distance H is then obtained by the following equation 2.
H = |Y1 - Y2|   (Equation 2)
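FIG. 7 only illustrates the luminance change near the edge and the patent does not spell out the exact detection rule, so the sketch below substitutes a simple threshold crossing at the midpoint of the scan line's luminance range; this rule, the left-to-right scan direction, and the function name are assumptions.

```python
import numpy as np

def edge_position_on_row(img, row, XL, XR):
    """Scan one horizontal line (pixels XL..XR of the given row) from the left
    and return the first X position whose luminance reaches the midpoint
    between the line's minimum and maximum, as a stand-in for the edge
    detection illustrated in FIG. 7."""
    line = img[row, XL:XR + 1].astype(float)
    threshold = (line.min() + line.max()) / 2.0
    first = int(np.argmax(line >= threshold))   # index of the first value meeting the threshold
    return XL + first

# Equations 1 and 2 then follow directly:
# X1 = edge_position_on_row(img, Y1, XL, XR)
# X2 = edge_position_on_row(img, Y2, XL, XR)
# W, H = abs(X1 - X2), abs(Y1 - Y2)
```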
The angle calculation unit 670 executes the angle calculation process S18. Specifically, the angle calculation unit 670 calculates the inclination angle θ by the following equation 3 from the distance W between the horizontal edge positions and the vertical distance H between the two parallel lines (first horizontal line L1, second horizontal line L2).
θ = arctan(W / H)   (Equation 3)
The calculated inclination angle θ may be adopted as the final inclination angle, but from the viewpoint of improving accuracy, the same operation may be performed on the area Q2 on the right side of the rectangular area 150 and the angle may be calculated together with the result on the left side; for example, the average value can be used as the final inclination angle.
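A sketch of equation 3 and of the optional averaging over the left area Q1 and the right area Q2 follows; expressing the result in degrees is an assumption, since the patent does not state a unit.

```python
import math

def inclination_angle(W, H):
    """Equation 3: inclination angle, in degrees, of a nearly vertical card
    edge whose positions on two horizontal lines differ by W while the lines
    are H apart."""
    return math.degrees(math.atan2(W, H))

# Averaging the left-side and right-side results, as suggested above:
# theta = (inclination_angle(W_q1, H) + inclination_angle(W_q2, H)) / 2.0
```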
Furthermore, from the viewpoint of further improving accuracy, as shown in FIG. 8, two vertical lines (first vertical line V1, second vertical line V2) are drawn in the image of the same rectangular area 150. Taking the intersection of the first vertical line V1 with the upper edge of the medium in the upper area R1 as the first edge position YY1, and the intersection of the second vertical line V2 with the upper edge of the medium as the second edge position YY2, the distance HH between the two vertical edge positions is obtained by the following equation 4. In the present embodiment, the first vertical line V1 consists of the pixels (YU to YL) of column XX1 in the Y-axis direction, and the second vertical line V2 similarly consists of the pixels (YU to YL) of column XX2 in the Y-axis direction.
HH = |YY1 - YY2|   (Equation 4)
Let the positions of the first vertical line V1 and the second vertical line V2 be the first vertical line position XX1 and the second vertical line position XX2; the horizontal distance WW is then obtained by the following equation 5.
WW = |XX1 - XX2|   (Equation 5)
The inclination angle θθ is then calculated by the following equation 6.
θθ = arctan(HH / WW)   (Equation 6)
The same operation may then be performed on the area R2 at the lower end of the medium, and the angle may be calculated together with the result for the area R1 at the upper end; for example, the average value is taken as the final inclination angle.
Furthermore, inclination angles may be calculated at a total of four locations (left, right, top, and bottom) and their overall average taken as the inclination angle. Alternatively, any three of the left, right, top, and bottom locations may be used.
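Combining measurements from up to four locations might look like the following sketch; the arithmetic mean follows the description above, while the data layout and names are assumed for illustration.

```python
import math

def average_rotation_angle(measurements):
    """measurements: (offset, separation) pairs, one per measured location -
    (W, H) for the left/right sides and (HH, WW) for the top/bottom sides.
    Returns the mean inclination angle in degrees."""
    angles = [math.degrees(math.atan2(offset, separation))
              for offset, separation in measurements]
    return sum(angles) / len(angles)

# For example, with all four locations measured:
# theta_final = average_rotation_angle([(W_q1, H), (W_q2, H), (HH_r1, WW), (HH_r2, WW)])
```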
As described above, the inclination angle θ (or the inclination angle θθ) determined by the angle calculation unit 670 is output to a character recognition processing unit (not shown) in the data processing unit 60. For example, the character recognition processing unit corrects the inclination of the recording area 120 of the card 100 in the image according to the inclination angle θ (and/or the inclination angle θθ), and performs character recognition processing on the information 110, such as a character string, formed in the recording area 120 of the corrected image.
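The correction step itself is not detailed in the passages above; as a hedged illustration, the detected angle could be applied with an off-the-shelf rotation such as scipy.ndimage.rotate (the library choice, the interpolation settings, and the sign convention are all assumptions).

```python
import numpy as np
from scipy.ndimage import rotate

def deskew(img, theta_deg):
    """Rotate the card image by -theta so that the recording area 120 becomes
    axis-aligned before character recognition."""
    corrected = rotate(img.astype(float), -theta_deg,
                       reshape=False, order=1, mode='nearest')
    return np.clip(corrected, 0, 255).astype(np.uint8)
```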
 以上、本実施形態の特徴をまとめると次のとおりである。
 (1)画像処理装置10は、デジタル画像を用いて、カード100の画像上の回転角度を検出する。そして、画像処理装置10は、カード100の画像データの水平軸および垂直軸のそれぞれに対して輝度投影による画素値の射影を生成する射影形成部630と、水平軸(X軸)への射影および垂直軸(Y軸)への射影のそれぞれについて、その射影波形の射影パターンの両端点を決定する端点検出部640と、左右上下の4端点からなる長方形部分(長方形領域150)を処理対象として、それ以外の領域を対象区域外として除外する処理対象区域外除去部650と、処理対象の区域について、その区域を通過する位置に2本の平行線(第1水平線L1と第2水平線L2の組、または第1垂直線V1と第2垂直線V2の組)を少なくとも水平軸(X軸)または垂直軸(Y軸)に平行に引き、各平行線上におけるカード100のエッジ位置(エッジ点)を求め、二つのエッジ位置のエッジ間隔を求める媒体エッジ点偏差算出部660と、エッジ間隔と2本の平行線の離間距離とから傾斜角を算出する角度算出部670と、を備える。これによって、算出された傾斜角をカード100の回転角度とするので、ハフ変換を用いることなく、演算負荷を軽減することが可能になる。
The features of the present embodiment are summarized as follows.
(1) The image processing apparatus 10 detects the rotation angle on the image of the card 100 using a digital image. Then, the image processing apparatus 10 generates a projection of the pixel value by luminance projection on each of the horizontal axis and the vertical axis of the image data of the card 100, projection on the horizontal axis (X axis), For each projection onto the vertical axis (Y axis), an end point detection unit 640 that determines the end points of the projection pattern of the projection waveform and a rectangular portion (rectangular area 150) consisting of four end points on the left, right, upper, and lower are processed. A process target area outside removal unit 650 for excluding the other areas as the target area outside, and for the area to be processed, a pair of parallel lines (a set of the first horizontal line L1 and the second horizontal line L2) at a position passing the area Or draw the first vertical line V1 and the second vertical line V2 at least parallel to the horizontal axis (X axis) or the vertical axis (Y axis), and position the edge of the card 100 on each parallel line A medium edge point deviation calculating unit 660 for determining the edge distance between the two edge positions, and an angle calculating unit 670 for calculating an inclination angle from the edge distance and the separation distance between the two parallel lines. Prepare. As a result, since the calculated inclination angle is used as the rotation angle of the card 100, it is possible to reduce the calculation load without using the Hough transform.
 (2)媒体エッジ位置偏差算出部660は、エッジ間隔を、2本の平行線に対してカード100の対向する辺でそれぞれ1箇所ずつで求め、角度算出部670は、2箇所におけるエッジ間隔と、2本の平行線の離間距離とから、二つの傾斜角を算出し、それらの平均値を回転角度とすることができる。これによって、2本の平行線で、カード100の対応する対向する2辺(長辺同士または短辺同士)において、それぞれエッジ位置の組を得られるため、対向する2箇所で傾斜角を求めることができる。これらの傾斜角の平均値をカード100の回転角度とすることで、検出精度をより向上させることができる。 (2) The medium edge position deviation calculation unit 660 obtains edge intervals at each of two opposing sides of the card 100 on opposite sides of one parallel line, and the angle calculation unit 670 calculates edge intervals at two locations. From the distance between two parallel lines, two inclination angles can be calculated, and the average value of them can be used as the rotation angle. As a result, since a pair of edge positions can be obtained on the two opposing sides (long sides or short sides) of the card 100 corresponding to two parallel lines, the inclination angles should be determined at the two opposing sides. Can. By setting the average value of these tilt angles as the rotation angle of the card 100, detection accuracy can be further improved.
 (3) The medium edge point deviation calculation unit 660 may draw at least one pair of the two parallel lines parallel to each of the horizontal axis (X axis) and the vertical axis (Y axis), and obtain the edge interval and the separation distance for each pair of parallel lines; the angle calculation unit 670 may then calculate the inclination angle for each obtained set of the edge interval and the separation distance, and take their average value as the rotation angle. Because one or more pairs of parallel lines are drawn parallel to each of the horizontal axis and the vertical axis, one or more inclination angles are obtained on each of the long sides and the short sides of the card 100. Using the average of these inclination angles as the rotation angle of the card 100 further improves the detection accuracy. Furthermore, if inclination angles are obtained on all four sides of the card 100 and their average value is used as the rotation angle, the detection accuracy can be improved even more.
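 As a further sketch under the same assumptions, the vertical-line pair can be handled with the very same per-pair computation applied to image columns instead of rows, with a sign flip that is tied to the coordinate convention chosen here; the average of the per-pair estimates then serves as the rotation angle.

```python
import numpy as np


def pair_tilt(profile_a: np.ndarray, profile_b: np.ndarray,
              separation: int, threshold: float = 50.0) -> float:
    """Tilt (radians) from one pair of parallel scan lines: the offset of the
    first above-threshold position between the two lines, divided by their
    separation distance."""
    def edge(profile: np.ndarray) -> int:
        return int(np.flatnonzero(profile > threshold)[0])
    return float(np.arctan2(edge(profile_b) - edge(profile_a), separation))


def tilt_both_axes(roi: np.ndarray, threshold: float = 50.0) -> float:
    """Average (degrees) of one horizontal-pair estimate (left edge) and one
    vertical-pair estimate (top edge); the sign flip for the vertical pair
    is an assumption tied to this sketch's coordinate convention."""
    h, w = roi.shape
    y1, y2 = h // 4, 3 * h // 4        # two lines parallel to the X axis
    x1, x2 = w // 4, 3 * w // 4        # two lines parallel to the Y axis
    angle_h = pair_tilt(roi[y1], roi[y2], y2 - y1, threshold)
    angle_v = -pair_tilt(roi[:, x1], roi[:, x2], x2 - x1, threshold)
    return float(np.degrees((angle_h + angle_v) / 2.0))
```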
 (4) The present embodiment also provides an image processing method for detecting the rotation angle of the card 100 on the image using a digital image, the method including: a projection generation step S10 of generating, for each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image data of the card 100, a projection of pixel values by luminance projection; an end point detection step S12 of determining, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of the projection pattern of the projection waveform; an out-of-target-area removal step S14 of setting the rectangular portion defined by the four end points on the left, right, top, and bottom (rectangular area 150) as the processing target and excluding the remaining region as being outside the target area; a medium edge point deviation calculation step S16 of drawing, for the target area (rectangular area 150), two parallel lines passing through the area (the pair of the first horizontal line L1 and the second horizontal line L2, or the pair of the first vertical line V1 and the second vertical line V2) parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis), obtaining the edge position of the card 100 on each parallel line, and obtaining the edge interval between the two edge positions; and an angle calculation step S18 of calculating an inclination angle from the edge interval and the separation distance between the two parallel lines. Because the calculated inclination angle is used as the rotation angle of the card 100, the computational load can be reduced without using the Hough transform.
 Although the present invention has been described on the basis of the embodiment, the embodiment is merely an example; those skilled in the art will understand that various modifications are possible in the combination of the constituent elements and the like, and that such modifications are also within the scope of the present invention.
DESCRIPTION OF SYMBOLS
10 Image processing apparatus
20 Table
30 Image reading apparatus
31 Light source
40 Analog-to-digital converter (A/D converter)
50 Image memory
60 Data processing unit
100 Card (medium)
110 Information
120 Information recording area
630 Projection generation unit (projection forming unit)
640 End point detection unit
650 Out-of-target-area removal unit
660 Medium edge position deviation calculation unit (medium edge point deviation calculation unit)
670 Angle calculation unit

Claims (4)

  1.  An image processing apparatus that detects, using a digital image, a rotation angle of a rectangular medium on the image, the apparatus comprising:
     a projection generation unit that generates, for each of a horizontal axis and a vertical axis of image data of the rectangular medium, a projection of pixel values by luminance projection;
     an end point detection unit that determines, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of a projection pattern of the projection waveform;
     an out-of-target-area removal unit that sets, as a processing target, a rectangular portion defined by the four end points on the left, right, top, and bottom, and excludes the remaining region as being outside the target area;
     a medium edge point deviation calculation unit that, for the target area, draws two parallel lines passing through the area, parallel to at least the horizontal axis or the vertical axis, obtains an edge position of the rectangular medium on each parallel line, and obtains an edge interval between the two edge positions; and
     an angle calculation unit that calculates an inclination angle from the edge interval and a separation distance between the two parallel lines.
  2.  The image processing apparatus according to claim 1, wherein
     the medium edge point deviation calculation unit obtains the edge interval once on each of the opposing sides of the rectangular medium with respect to the two parallel lines, and
     the angle calculation unit calculates two inclination angles from the edge intervals at the two locations and the separation distance between the two parallel lines, and takes an average value thereof as the rotation angle.
  3.  The image processing apparatus according to claim 1 or 2, wherein
     the medium edge point deviation calculation unit draws at least one pair of the two parallel lines parallel to each of the horizontal axis and the vertical axis, and obtains the edge interval and the separation distance for each pair of the two parallel lines, and
     the angle calculation unit calculates the inclination angle for each obtained set of the edge interval and the separation distance, and takes an average value thereof as the rotation angle.
  4.  An image processing method for detecting, using a digital image, a rotation angle of a rectangular medium on the image, the method comprising:
     a projection generation step of generating, for each of a horizontal axis and a vertical axis of image data of the rectangular medium, a projection of pixel values by luminance projection;
     an end point detection step of determining, for each of the projection onto the horizontal axis and the projection onto the vertical axis, both end points of a projection pattern of the projection waveform;
     an out-of-target-area removal step of setting, as a processing target, a rectangular portion defined by the four end points on the left, right, top, and bottom, and excluding the remaining region as being outside the target area;
     a medium edge point deviation calculation step of drawing, for the target area, two parallel lines passing through the area, parallel to at least the horizontal axis or the vertical axis, obtaining an edge position of the rectangular medium on each parallel line, and obtaining an edge interval between the two edge positions; and
     an angle calculation step of calculating an inclination angle from the edge interval and a separation distance between the two parallel lines.
PCT/JP2018/041980 2017-11-30 2018-11-13 Image processing device and image processing method WO2019107141A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017230325 2017-11-30
JP2017-230325 2017-11-30

Publications (1)

Publication Number Publication Date
WO2019107141A1 (en) 2019-06-06

Family

ID=66665604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041980 WO2019107141A1 (en) 2017-11-30 2018-11-13 Image processing device and image processing method

Country Status (1)

Country Link
WO (1) WO2019107141A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117495858A (en) * 2023-12-29 2024-02-02 合肥金星智控科技股份有限公司 Belt offset detection method, system, equipment and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07229726A (en) * 1994-02-16 1995-08-29 Kubota Corp Method for detecting angle of object
JP2006065407A (en) * 2004-08-24 2006-03-09 Canon Inc Image processing device, its method and program
JP2006163821A (en) * 2004-12-07 2006-06-22 Casio Comput Co Ltd Photographing device, image processor, image processing method, and program


Similar Documents

Publication Publication Date Title
US8559748B2 (en) Edge detection
WO2019187967A1 (en) Image processing device and image processing method
JP5041458B2 (en) Device for detecting three-dimensional objects
CN107316047B (en) Image processing apparatus, image processing method, and storage medium
JP4911340B2 (en) Two-dimensional code detection system and two-dimensional code detection program
JP2016513320A (en) Method and apparatus for image enhancement and edge verification using at least one additional image
JP5573618B2 (en) Image processing program and image processing apparatus
US20050196070A1 (en) Image combine apparatus and image combining method
WO2018147059A1 (en) Image processing device, image processing method, and program
WO2018061997A1 (en) Medium recognition device and medium recognition method
WO2019107141A1 (en) Image processing device and image processing method
CN108335266B (en) Method for correcting document image distortion
Zhang et al. Estimation of 3D shape of warped document surface for image restoration
JP6006675B2 (en) Marker detection apparatus, marker detection method, and program
Zhang et al. Restoringwarped document images using shape-from-shading and surface interpolation
JP5264956B2 (en) Two-dimensional code reading apparatus and method
JP2010091525A (en) Pattern matching method of electronic component
CN110097065B (en) Freeman chain code-based line detection method and terminal
WO2019188194A1 (en) Method for determining center of pattern on lens marker, device for same, program for causing computer to execute said determination method, and recording medium for program
JP6171165B2 (en) Driver's license reading device and driver's license reading method
TWI493475B (en) Method for determining convex polygon object in image
JP2009025992A (en) Two-dimensional code
JP2020027000A (en) Correction method for lens marker image, correction device, program, and recording medium
JP2015192389A (en) Marker embedding device, marker detection device, and program
JP2018084925A (en) Parking frame detection device, parking frame detection method, program, and non-temporary recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18882387

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18882387

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP