WO2019187967A1 - Image processing device and method - Google Patents

Image processing device and method

Info

Publication number
WO2019187967A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
image
projection
medium
unit
Prior art date
Application number
PCT/JP2019/007979
Other languages
English (en)
Japanese (ja)
Inventor
中村 宏
Original Assignee
日本電産サンキョー株式会社
Priority date
Filing date
Publication date
Application filed by 日本電産サンキョー株式会社
Publication of WO2019187967A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras

Definitions

  • The present invention relates to an image processing apparatus and an image processing method for processing information on a rectangular medium.
  • The Hough transform is well known as a technique for detecting and recognizing specific geometric figures, such as straight lines, on a rectangular medium to be processed using digital images, and it is widely used in various industrial fields.
  • In a typical procedure, preprocessing such as noise removal and edge enhancement is performed on an image containing the target figure, the Hough transform is applied to the extracted image pattern to convert it into accumulation points, and the inverse Hough transform is then applied to the accumulation points having the maximum accumulated frequencies to obtain straight lines in image space (see, for example, Patent Documents 1 and 2).
  • In one known approach, position coordinates are detected using the Hough transform for positioning a workpiece in a semiconductor device manufacturing process and are used for position correction; after noise removal, the inverse Hough transform is performed to obtain a reference straight line used for correcting the workpiece angle.
  • In another known approach, the Hough transform is applied to estimate a line segment constituting part of the periphery of an object: the number of points converted to each polar coordinate is counted, polar coordinates with a high frequency are selected, and for each selected polar coordinate a straight-line equation is calculated by the inverse Hough transform and a rotation angle is detected.
  • The present invention has been made in view of the above situation, and its goal is to provide a technique that can reduce the calculation load of image processing for detecting the rotation angle of a rectangular medium on an image.
  • The present invention is an image processing apparatus for detecting the rotation angle of a rectangular medium on an image using a digital image. The apparatus includes: a projection generation unit that generates, for each of the horizontal axis and the vertical axis of the image data of the rectangular medium, a projection of pixel values by luminance projection; an end point detection unit that determines both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis and the projection onto the vertical axis; a processing target cutout unit that cuts out, as the processing target, first image data of a rectangular portion defined by the four left, right, upper, and lower end points; an image adjustment unit that generates third image data by superimposing the cut-out first image data and second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculation unit that, for the third image data in the processing target area, draws two parallel lines parallel to at least the horizontal axis or the vertical axis at positions passing through the area, obtains the edge position of the rectangular medium on each parallel line, and obtains the edge interval between the two edge positions; and an angle calculation unit that calculates an inclination angle from the edge interval and the separation distance between the two parallel lines.
  • The image processing apparatus of the present invention can detect the rotation angle of a rectangular medium on an image using a digital image, so the calculation load can be reduced without using the Hough transform.
  • Preferably, the image adjustment unit generates the third image data by taking the average of the pixel values of the first image data and the second image data obtained by rotating the first image data by 180 degrees.
  • By taking this average, the image adjustment unit becomes less susceptible to noise and the like, and a difference in luminance can be produced between the background and the edge of the medium.
  • The present invention also relates to an image processing method for detecting the rotation angle of a rectangular medium on an image using a digital image. The method includes: a projection generation step of generating, for each of the horizontal axis and the vertical axis of the image data of the rectangular medium, a projection of pixel values by luminance projection; an end point detection step of determining both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis and the projection onto the vertical axis; a processing target cutout step of cutting out, as the processing target, first image data of a rectangular portion defined by the four left, right, upper, and lower end points; an image adjustment step of generating third image data by superimposing the cut-out first image data and second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculating step of, for the third image data in the processing target area, drawing two parallel lines parallel to at least the horizontal axis or the vertical axis at positions passing through the area, obtaining the edge position of the rectangular medium on each parallel line, and obtaining the edge interval between the two edge positions; and an angle calculating step of calculating an inclination angle from the edge interval and the separation distance between the two parallel lines.
  • According to the present invention, the rotation angle of a rectangular medium on an image can be detected using a digital image, and the calculation load can therefore be reduced without using the Hough transform.
  • FIG. 5 is a diagram illustrating an example of an image captured from the card and its luminance projections according to the embodiment: FIG. 5A is an example of first image data cut out from the image data captured from the card, FIG. 5B is an example of the projection onto the horizontal axis (X axis), and FIG. 5C is an example of the projection onto the vertical axis (Y axis).
  • FIG. 6 is a diagram for explaining the image adjustment processing of the image adjustment unit according to the embodiment: FIG. 6A shows the first image data cut out by the processing target cutout unit, FIG. 6B shows the second image data obtained by rotating the first image data by 180 degrees, and FIG. 6C shows the third image data obtained by superimposing the first image data and the second image data.
  • FIG. 7 is a diagram showing the relationship between the rectangular area of the extracted third image data, the first horizontal line, and the second horizontal line according to the embodiment.
  • FIG. 8 is a diagram showing the change in luminance in the vicinity of the medium edge position on the parallel lines (first horizontal line, second horizontal line) according to the embodiment: FIG. 8A corresponds to the region on the left side of the rectangular area, and FIG. 8B corresponds to the region on the right side of the rectangular area.
  • FIG. 9 is a diagram showing the relationship between the rectangular area of the extracted third image data, the first vertical line, and the second vertical line according to the embodiment.
  • FIG. 1 is a diagram illustrating a configuration example of a main part of an image processing apparatus 10 according to the embodiment.
  • FIG. 2 is a diagram schematically showing an appearance of a card 100 which is an example of a rectangular medium.
  • The image processing apparatus 10 is an apparatus that detects the rotation angle of a rectangular medium 100 on an image using a digital image.
  • The rectangular medium 100 is a general card compliant with JIS (hereinafter referred to as the "card").
  • The card 100 has a rectangular shape and is, for example, a plastic card having a width of 86 mm, a height of 54 mm, and a thickness of 0.76 mm.
  • The rectangular medium is not limited to the card 100 described above, and may be, for example, a passport book with a width of 125 mm and a height of 88 mm.
  • Nor is it limited to a plastic card or a passport book; it may be an ID card or a driver's license.
  • The longitudinal direction of the card 100 is the X-axis direction, and the short direction is the Y-axis direction.
  • A black magnetic stripe 110 is formed on the card 100 along its longitudinal direction, and a barcode 120 is also formed along the longitudinal direction.
  • As shown in FIGS. 2 and 5, the origin of the image of the card 100 is the origin O at the upper left; the direction from the origin O toward the right along the longitudinal direction of the card 100 is the X-axis direction, and the direction perpendicular to the X-axis direction (downward from the origin O) is the Y-axis direction.
  • The image processing apparatus 10 includes a table 20 on which the card 100 is placed, an image reading unit 30 serving as an image data input unit, an analog/digital converter (A/D converter) 40, an image memory 50, and a data processing unit 60.
  • The image reading unit 30 has a CCD (Charge Coupled Device) image sensor, which is a solid-state imaging device using photoelectric conversion elements that detect light and generate charge, and an optical system (a lens and the like) that guides incident light to the pixel area of the image sensor to form a subject image; it images a predetermined area including the entire card 100 placed on the table 20 and illuminated by the illumination light source 31.
  • A CMOS (Complementary Metal Oxide Semiconductor) image sensor may also be used as the solid-state imaging device (image sensor).
  • The A/D converter 40 converts the image including the card 100 captured by the image reading unit 30 into digital image data and stores the digital image data in the image memory 50.
  • The function of the A/D converter 40 may also be incorporated in the image reading unit 30.
  • The image memory 50 stores the digitized image data, captured by the image reading unit 30, of the card 100 including the magnetic stripe 110 and the barcode 120 on which the information to be read is recorded.
  • The original image stored in the image memory 50 is formed by a plurality of pixels arranged in a matrix; specifically, although not shown in the drawing, M rows are arranged in the X-axis direction and N columns in the Y-axis direction. Each pixel has a pixel value (luminance value). In the present embodiment, each pixel value takes a value between 0 and 255 when expressed in 8 bits, for example; the closer a pixel is to black, the smaller its value, and the closer to white, the larger its value.
  • The image memory 50 may be any memory capable of storing image data, such as RAM, SDRAM, DDR SDRAM, or RDRAM.
  • The data processing unit 60 has a function of detecting the rotation angle of the card 100 on the image using the digital image and of recognizing, with reference to the detected rotation angle, information recorded on the card 100 such as the magnetic stripe 110, the barcode 120, and characters.
  • The data processing unit 60 is configured as a part of a CPU or the like that controls the entire image processing apparatus 10.
  • The data processing unit 60 reads multi-valued image data (a multi-gradation grayscale image, for example with 256 gradations) from the image memory 50.
  • FIG. 3 is a block diagram illustrating a configuration example of the data processing unit 60 in the image processing apparatus 10.
  • FIG. 4 is a flowchart showing medium rotation angle detection processing by the data processing unit 60.
  • FIG. 5 is a diagram illustrating an example of first image data cut out from image data obtained by capturing the card 100, together with its luminance projections (hereinafter simply referred to as "projections").
  • FIG. 5A shows an example of the first image data IMG1 cut out from the image data obtained by capturing the card 100, FIG. 5B shows an example of the projection onto the horizontal axis (X axis), and FIG. 5C shows an example of the projection onto the vertical axis (Y axis).
  • In FIG. 5B, the horizontal axis indicates the X-axis position and the vertical axis indicates the pixel value P; in FIG. 5C, the horizontal axis indicates the pixel value P and the vertical axis indicates the Y-axis position.
  • The data processing unit 60 includes a projection generation unit 610, an end point detection unit 620, a processing target cutout unit 630, an image adjustment unit 640, a medium edge position deviation calculation unit 650, and an angle calculation unit 660, and performs the rotation angle detection process using these units.
  • The projection generation unit 610 executes the projection generation process ST11. Specifically, the projection generation unit 610 acquires the image data IMG from the image memory 50 and generates, for each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image IMG shown in FIG. 5, a first projection prjX (FIG. 5B) and a second projection prjY (FIG. 5C) of pixel values by luminance projection.
  • The first projection prjX is the average of the pixel values (luminance values) of each line in the direction perpendicular to the X axis.
  • The second projection prjY is the average of the pixel values (luminance values) of each line in the direction perpendicular to the Y axis.
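  • As a concrete illustration of the projection generation process ST11, the following is a minimal Python/NumPy sketch (not part of the patent disclosure; the array layout and function name are assumptions). It averages the pixel values along each image axis to obtain the two projections.

```python
import numpy as np

def generate_projections(img: np.ndarray):
    """Luminance projections of a grayscale image (rows = Y axis, columns = X axis).

    prj_x[i] is the mean pixel value of image column i (first projection prjX);
    prj_y[j] is the mean pixel value of image row j (second projection prjY).
    """
    prj_x = img.mean(axis=0)  # average over rows: one value per X position
    prj_y = img.mean(axis=1)  # average over columns: one value per Y position
    return prj_x, prj_y
```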
  • The end point detection unit 620 executes the end point detection process ST12. Specifically, the end point detection unit 620 detects the four end points of the area including the card 100, based on the first projection prjX and the second projection prjY, in order to extract that area. More specifically, the end point detection unit 620 first obtains the points at which the output values at the left end portion and the right end portion of the first projection prjX are minimum, and sets these as the left end point XL and the right end point XR, respectively. Similarly, the end point detection unit 620 obtains the points at which the output values at the upper end portion and the lower end portion of the second projection prjY are minimum, and sets these as the upper end point YU and the lower end point YL, respectively.
  • The processing target cutout unit 630 executes the processing target cutout process ST13. Specifically, the processing target cutout unit 630 cuts out, as the first image data IMG1 to be processed, the rectangular area 150 whose four vertices are given by the positions of both end points on the horizontal axis X (left end point XL, right end point XR) and both end points on the vertical axis Y (upper end point YU, lower end point YL), and removes the other areas as being outside the processing target area.
  • That is, the processing target cutout unit 630 cuts out the first image data IMG1 of the rectangular area (ABCD) 150 defined by point A (XL, YU), point B (XL, YL), point C (XR, YL), and point D (XR, YU). Thereby, the portions unnecessary for processing are separated.
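  • A sketch of the end point detection process ST12 and the processing target cutout process ST13 under the same assumptions follows (Python/NumPy, hypothetical helper names). Taking the end point on each side as the position of the minimum projection value in that half of the projection is one plausible reading of the description above, not a quotation of the patented method.

```python
import numpy as np

def detect_end_points(prj: np.ndarray):
    """Return the two end points of one projection (XL/XR for prjX, YU/YL for prjY).

    The end point on each side is taken here as the position of the minimum
    output value in that half of the projection (an assumption)."""
    mid = len(prj) // 2
    first = int(np.argmin(prj[:mid]))           # left end point XL / upper end point YU
    second = mid + int(np.argmin(prj[mid:]))    # right end point XR / lower end point YL
    return first, second

def cut_out_processing_target(img: np.ndarray, prj_x: np.ndarray, prj_y: np.ndarray):
    """Cut out the rectangular area 150 (points A-D) as the first image data IMG1."""
    xl, xr = detect_end_points(prj_x)
    yu, yl = detect_end_points(prj_y)
    img1 = img[yu:yl + 1, xl:xr + 1]  # rectangle A(XL,YU), B(XL,YL), C(XR,YL), D(XR,YU)
    return img1, (xl, xr, yu, yl)
```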
  • The image adjustment unit 640 executes the image adjustment process ST14.
  • FIG. 6 is a diagram for explaining the image adjustment processing of the image adjustment unit 640: FIG. 6A shows the first image data IMG1 cut out by the processing target cutout unit 630, FIG. 6B shows the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, and FIG. 6C shows the third image data IMG3 obtained by superimposing the first image data IMG1 and the second image data IMG2.
  • The image adjustment unit 640 superimposes the first image data IMG1 cut out by the processing target cutout unit 630, shown in FIG. 6A, and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, to generate the third image data IMG3 as shown in FIG. 6C.
  • Specifically, the image adjustment unit 640 generates the third image data IMG3 by taking the average of the respective pixel values of the original first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees.
  • By taking the average of the pixel values of the first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, the pixel values of the portion containing the magnetic stripe 110 in the magnetic recording area are averaged with those of the corresponding portion without the magnetic stripe 110, so that the luminance of the averaged portion becomes higher than the luminance of the magnetic stripe 110 and a difference in luminance can be produced between the background and the card edge.
  • Since the luminance of the background portion is unchanged, a luminance difference remains between the background and the card edge, which prevents erroneous detection of the card edge position in the medium edge position deviation calculation process that follows. That is, if the image adjustment unit 640 were not provided, the luminance difference between the card and the background at the position to be determined as the card edge could become almost zero in the process of specifying the card edge position, erroneous detection could occur, and accurate angle detection could be hindered. By providing the image adjustment unit 640 as in the present embodiment, however, the luminance difference between the card and the background is secured, and erroneous detection of the card edge position is less likely to occur.
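  • The image adjustment process ST14 can be sketched as follows (again an illustrative Python/NumPy sketch, not the patented implementation itself): IMG2 is IMG1 rotated by 180 degrees, and IMG3 is the pixel-wise average of the two.

```python
import numpy as np

def adjust_image(img1: np.ndarray) -> np.ndarray:
    """Generate the third image data IMG3 from the first image data IMG1.

    IMG2 is IMG1 rotated by 180 degrees; IMG3 is the pixel-wise average of IMG1
    and IMG2, which brightens dark features such as the magnetic stripe while
    leaving the background luminance essentially unchanged."""
    img2 = np.rot90(img1, 2)  # 180-degree rotation
    img3 = (img1.astype(np.float32) + img2.astype(np.float32)) / 2.0
    return img3
```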
  • The medium edge position deviation calculation unit 650 executes the medium edge position deviation calculation process ST15.
  • FIG. 7 is a diagram illustrating the relationship between the rectangular area 150 of the extracted third image data IMG3, the first horizontal line L1, and the second horizontal line L2.
  • As shown in FIG. 7, the medium edge position deviation calculation unit 650 draws, for the third image data IMG3 of the rectangular area 150 that is the processing target area, two parallel lines (first horizontal line L1, second horizontal line L2) at positions passing through the rectangular area 150.
  • The first horizontal line L1 is on the lower side and the second horizontal line L2 is on the upper side: the first horizontal line L1 consists of the pixels (XL to XR) of row Y1 in the X-axis direction, and the second horizontal line L2 consists of the pixels (XL to XR) of row Y2 in the X-axis direction.
  • The medium edge position deviation calculation unit 650 obtains the medium edge position (first edge position X1, second edge position X2) on each parallel line (first horizontal line L1, second horizontal line L2), and obtains the distance W between the two horizontal edge positions by Equation (1).
  • The two parallel lines (first horizontal line L1 and second horizontal line L2) are drawn at positions where the medium edge positions (first edge position X1 and second edge position X2) are obtained on the same side of the card 100; specifically, they are placed at the same predetermined width above and below the vertical center of the rectangular area 150, and this predetermined width can be about 1/4 of the vertical length.
  • FIG. 8 shows the change in luminance in the vicinity of the medium edge position on the parallel lines (first horizontal line L1, second horizontal line L2): FIG. 8A corresponds to the region Q1 on the left side of the rectangular area 150, and FIG. 8B corresponds to the region Q2 on the right side of the rectangular area 150.
  • The vertical distance H between the two parallel lines is obtained by Equation (2).
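  • A sketch of the medium edge position deviation calculation ST15 for the left-side region Q1 follows. Equations (1) and (2) are not reproduced in this text, so W and H are reconstructed here as W = |X1 - X2| and H = |Y1 - Y2|; the edge test (position of the strongest luminance change along the scan line) and the line placement are likewise assumptions based on the description above.

```python
import numpy as np

def find_edge_position(line: np.ndarray) -> int:
    """Position of the strongest luminance change along the given scan line.

    Using the maximum absolute difference between neighbouring pixels is an
    assumption; the text only describes locating the card edge from the change
    in luminance along the line (FIG. 8)."""
    diffs = np.abs(np.diff(line.astype(np.float32)))
    return int(np.argmax(diffs))

def edge_interval_and_separation(img3: np.ndarray):
    """Edge interval W and line separation H for the left-side region Q1 of IMG3."""
    height, width = img3.shape
    y1 = height // 2 + height // 4   # first horizontal line L1 (lower side)
    y2 = height // 2 - height // 4   # second horizontal line L2 (upper side)
    x1 = find_edge_position(img3[y1, :width // 2])  # first edge position X1 (left half)
    x2 = find_edge_position(img3[y2, :width // 2])  # second edge position X2 (left half)
    w = abs(x1 - x2)                 # Equation (1), reconstructed
    h = abs(y1 - y2)                 # Equation (2), reconstructed
    return w, h
```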
  • The angle calculation unit 660 executes the angle calculation process ST16. Specifically, the angle calculation unit 660 calculates the inclination angle θ from the horizontal edge position distance W and the vertical distance H between the two parallel lines (first horizontal line L1 and second horizontal line L2) according to Equation (3).
  • The calculated inclination angle θ may be adopted as the final inclination angle as it is. From the viewpoint of improving accuracy, however, the same operation may also be performed on the right-side region Q2 of the rectangular area 150 and the angle calculated together with the result for the left-side region; for example, the average value can be used as the final inclination angle.
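  • Equation (3) is also not reproduced in this text. Given that W is the offset between the two edge positions and H is the distance between the two lines, a natural reconstruction is θ = arctan(W / H); the sketch below uses this assumption and averages the angles of the left region Q1 and the right region Q2 as suggested above.

```python
import math

def inclination_angle(w: float, h: float) -> float:
    """Inclination angle in degrees from edge interval W and line separation H.

    theta = arctan(W / H) is a reconstruction of Equation (3) based on the
    geometry described above, not a quotation of the patent text."""
    return math.degrees(math.atan2(w, h))

# Example: average the angles obtained for the left region Q1 and the right region Q2.
# theta_q1 = inclination_angle(w_q1, h)
# theta_q2 = inclination_angle(w_q2, h)
# theta = (theta_q1 + theta_q2) / 2.0  # final inclination angle (average)
```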
  • FIG. 9 is a diagram showing the relationship between the rectangular area 150 of the extracted third image data IMG3, the first vertical line V1, and the second vertical line V2. From the viewpoint of further improving accuracy, as shown in FIG. 9, two vertical lines (first vertical line V1 and second vertical line V2) may additionally be drawn in the image of the same rectangular area 150. In the upper region R1, the intersection of the first vertical line V1 with the upper edge of the medium is defined as the first edge position YY1, the intersection of the second vertical line V2 with the upper edge of the medium is defined as the second edge position YY2, and the distance HH between the two vertical-direction edge positions is obtained.
  • The first vertical line V1 consists of the pixels (YU to YL) of column X1 in the Y-axis direction, and the second vertical line V2 consists of the pixels (YU to YL) of column X2 in the Y-axis direction.
  • The horizontal distance WW between the two vertical lines is obtained by Equation (5).
  • The inclination angle φ is calculated by Equation (6).
  • The same operation may be performed on the lower region R2 side of the medium, the angle calculated together with the result for the upper region R1 side, and the average value used as the final inclination angle.
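  • The vertical-line variant can be sketched in the same way. WW and the angle are reconstructed here as WW = |X1 - X2| and φ = arctan(HH / WW) (assumed forms of Equations (5) and (6)); the column positions chosen for V1 and V2 are also assumptions.

```python
import math
import numpy as np

def inclination_angle_vertical(img3: np.ndarray) -> float:
    """Inclination angle (degrees) from two vertical lines V1 and V2 in the upper region R1."""
    height, width = img3.shape
    x1 = width // 2 - width // 4   # first vertical line V1 (left of center)
    x2 = width // 2 + width // 4   # second vertical line V2 (right of center)
    # Edge of the upper side of the medium on each vertical line (upper half only).
    col1 = img3[:height // 2, x1].astype(np.float32)
    col2 = img3[:height // 2, x2].astype(np.float32)
    yy1 = int(np.argmax(np.abs(np.diff(col1))))  # first edge position YY1
    yy2 = int(np.argmax(np.abs(np.diff(col2))))  # second edge position YY2
    hh = abs(yy1 - yy2)            # vertical edge interval HH
    ww = abs(x1 - x2)              # horizontal distance WW (Equation (5), reconstructed)
    return math.degrees(math.atan2(hh, ww))      # Equation (6), reconstructed
```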
  • The inclination angle θ (or the inclination angle φ) determined by the angle calculation unit 660 is output to an information recognition processing unit (not shown) in the data processing unit 60.
  • The information recognition processing unit corrects the inclination, on the image, of the magnetic stripe 110 that is the recording area on the card 100 according to the inclination angle θ (and/or the inclination angle φ), and performs recognition processing of the information recorded on the magnetic stripe 110 on the corrected image.
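  • As one way of applying the detected angle before recognition (the text above does not specify how the correction itself is implemented), the cut-out image can simply be rotated back by the detected inclination angle, for example with scipy, used here purely as an illustration:

```python
from scipy import ndimage

def correct_inclination(image, angle_degrees):
    """Rotate the image by -angle_degrees so that the card edges become axis-aligned.

    scipy.ndimage.rotate is an external library function used only for this
    illustration; it is not part of the patent disclosure."""
    return ndimage.rotate(image, -angle_degrees, reshape=False, order=1)
```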
  • As described above, the image processing apparatus 10 detects the rotation angle of the card 100 on the image using a digital image.
  • The image processing apparatus 10 includes: a projection generation unit 610 that generates a projection of pixel values by luminance projection for each of the horizontal axis and the vertical axis of the image data of the card 100; an end point detection unit 620 that determines both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis (X axis) and the projection onto the vertical axis (Y axis); a processing target cutout unit 630 that cuts out, as the processing target, the first image data IMG1 of a rectangular portion defined by the four left, right, upper, and lower end points (the rectangular area 150 in this embodiment); an image adjustment unit 640 that generates the third image data IMG3 by superimposing the first image data IMG1 cut out by the processing target cutout unit 630 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees; a medium edge point deviation calculation unit 650 that draws, in the third image data IMG3, two parallel lines (a pair of the first horizontal line L1 and the second horizontal line L2, or a pair of the first vertical line V1 and the second vertical line V2) parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis) at positions passing through the processing target area, obtains the edge position (edge point) of the card 100 on each parallel line, and obtains the edge interval between the two edge positions; and an angle calculation unit 660 that calculates the inclination angle from the edge interval and the separation distance between the two parallel lines.
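  • Tying the sketches above together, an end-to-end use of the hypothetical helper functions might look like this (illustrative only; it assumes the functions defined in the previous examples are in scope):

```python
import numpy as np

def detect_card_rotation_angle(img: np.ndarray) -> float:
    """Full pipeline built from the sketch functions defined in the previous examples."""
    prj_x, prj_y = generate_projections(img)                 # ST11: luminance projections
    img1, _ = cut_out_processing_target(img, prj_x, prj_y)   # ST12 + ST13: end points and cutout
    img3 = adjust_image(img1)                                # ST14: average with 180-degree rotation
    w, h = edge_interval_and_separation(img3)                # ST15: edge interval and line separation
    return inclination_angle(w, h)                           # ST16: rotation angle of the card
```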
  • Since the calculated inclination angle is used as the rotation angle of the card 100, the calculation load can be reduced without using the Hough transform, and erroneous detection of the card edge position can also be prevented.
  • The image adjustment unit 640 generates the third image data IMG3 by averaging the pixel values of the first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees.
  • By taking the average of the pixel values of the first image data IMG1 and the second image data IMG2 obtained by rotating the first image data IMG1 by 180 degrees, the pixel values of the portion containing the magnetic stripe 110 in the magnetic recording area are averaged with those of the corresponding portion without the magnetic stripe 110, so that the luminance of the averaged portion becomes higher than the luminance of the magnetic recording area and a difference in luminance can be produced between the background and the edge of the medium.
  • Since the luminance of the background portion is unchanged, a luminance difference remains between the background and the card edge, which prevents erroneous detection of the card edge position in the medium edge position deviation calculation process.
  • The medium edge position deviation calculation unit 650 may obtain the edge interval in the third image data IMG3 at one location on each of two opposite sides of the card 100 with respect to the two parallel lines, and the angle calculation unit 660 may calculate two inclination angles from the edge intervals at the two locations and the separation distance between the two parallel lines and use their average value as the rotation angle.
  • In this case, a pair of edge positions is obtained with the two parallel lines on each of the two mutually opposite sides (long sides or short sides) of the card 100, so that the inclination angle is obtained at two opposite locations and the detection accuracy can be further improved.
  • The medium edge position deviation calculation unit 650 may also draw at least one set of two parallel lines in the third image data IMG3 for each of the horizontal axis (X axis) and the vertical axis (Y axis) and obtain the edge interval and the separation distance for each set; the angle calculation unit 660 may then calculate an inclination angle for each obtained pair of edge interval and separation distance and use their average value as the rotation angle.
  • In this case, one or more inclination angles are obtained on the long sides and the short sides of the card 100, respectively, so that the detection accuracy can be further improved.
  • If the inclination angles are obtained on all four sides of the card 100 and their average value is used as the rotation angle, the detection accuracy can be improved still further.
  • The image processing method according to the present embodiment is a method for detecting the rotation angle of the card 100 on an image using a digital image, and includes: a projection generation step ST11 of generating a projection of pixel values by luminance projection for each of the horizontal axis (X axis) and the vertical axis (Y axis) of the image data of the card 100; an end point detection step ST12 of determining both end points of the projection pattern of the projection waveform for each of the projection onto the horizontal axis and the projection onto the vertical axis; a processing target cutout step ST13 of cutting out, as the processing target, the first image data of a rectangular portion defined by the four left, right, upper, and lower end points; an image adjustment step ST14 of generating the third image data by superimposing the cut-out first image data and the second image data obtained by rotating the first image data by 180 degrees; a medium edge point deviation calculating step ST15 of drawing, in the third image data, two parallel lines (a pair of the first horizontal line L1 and the second horizontal line L2, or a pair of the first vertical line V1 and the second vertical line V2) parallel to at least the horizontal axis (X axis) or the vertical axis (Y axis) at positions passing through the processing target area, obtaining the edge position of the card 100 on each parallel line, and obtaining the edge interval between the edge positions; and an angle calculating step ST16 of calculating the inclination angle from the edge interval and the separation distance between the two parallel lines.
  • Since the calculated inclination angle is used as the rotation angle of the card 100, the calculation load can be reduced without using the Hough transform, and erroneous detection of the card edge position can also be prevented.
  • SYMBOLS: 10 ... Image processing apparatus, 20 ... Table, 30 ... Image reading unit, 31 ... Illumination light source, 40 ... Analog/digital converter (A/D converter), 50 ... Image memory, 60 ... Data processing unit, 100 ... Card (medium), 110 ... Magnetic stripe, 120 ... Barcode, 610 ... Projection generation unit, 620 ... End point detection unit, 630 ... Processing target cutout unit, 640 ... Image adjustment unit, 650 ... Medium edge position deviation calculation unit (medium edge point deviation calculation unit), 660 ... Angle calculation unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an image processing technology for detecting the rotation angle of a medium on an image that is capable of reducing the computational load and of preventing erroneous detection of the card edge position. A data processing unit of the image processing device comprises: a projection generation unit 610 that generates projections of the pixel values of the image data by luminance projection with respect to an X axis and a Y axis; an end point detection unit that determines the two end points of the projection pattern of the projection waveform on each of the X axis and the Y axis; a processing target cutout unit that cuts out, as the processing target, first image data of a rectangular section defined by the four left, right, upper, and lower end points; an image adjustment unit 640 that generates third image data by superimposing the cut-out first image data and second image data, which is the first image data rotated by 180 degrees; and a medium edge point deviation calculation unit that, for the third image data, finds the card edge positions on two parallel lines parallel to the X axis at positions passing through the processing target area, and finds the edge intervals.
PCT/JP2019/007979 2018-03-30 2019-03-01 Dispositif et procédé de traitement d'image WO2019187967A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-067419 2018-03-30
JP2018067419A JP2019179342A (ja) 2018-03-30 2018-03-30 画像処理装置および画像処理方法

Publications (1)

Publication Number Publication Date
WO2019187967A1 true WO2019187967A1 (fr) 2019-10-03

Family

ID=68061354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/007979 WO2019187967A1 (fr) 2018-03-30 2019-03-01 Dispositif et procédé de traitement d'image

Country Status (2)

Country Link
JP (1) JP2019179342A (fr)
WO (1) WO2019187967A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115229804A (zh) * 2022-09-21 2022-10-25 荣耀终端有限公司 组件贴合方法和装置

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022169874A (ja) 2021-04-28 2022-11-10 株式会社Pfu 画像処理装置、画像処理方法、及びプログラム
CN113828948B (zh) * 2021-11-23 2022-03-08 济南邦德激光股份有限公司 一种激光切割机的板材寻边方法、标定系统及寻边系统
CN113829673B (zh) * 2021-11-26 2022-03-18 武汉宏博纸品包装有限公司 一种基于霍夫变换的蜂窝纸芯拉伸控制方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0735509A (ja) * 1993-07-16 1995-02-07 Toshiba Corp 切手検出装置
JP2006031166A (ja) * 2004-07-13 2006-02-02 Glory Ltd 画像照合装置、画像照合方法および画像照合プログラム。
JP2006135211A (ja) * 2004-11-09 2006-05-25 Nikon Corp 表面検査装置および表面検査方法および露光システム
JP2010245788A (ja) * 2009-04-03 2010-10-28 Sharp Corp 画像出力装置、携帯端末装置、撮像画像処理システム、画像出力方法、プログラムおよび記録媒体


Also Published As

Publication number Publication date
JP2019179342A (ja) 2019-10-17

Similar Documents

Publication Publication Date Title
WO2019187967A1 (fr) Dispositif et procédé de traitement d'image
JP4911340B2 (ja) 二次元コード検出システムおよび二次元コード検出プログラム
US8649593B2 (en) Image processing apparatus, image processing method, and program
CN107316047B (zh) 图像处理装置、图像处理方法以及存储介质
JP2012243307A (ja) 入力画像における歪を検出する方法、入力画像における歪を検出する装置およびコンピューター読み取り可能な媒体
US20050196070A1 (en) Image combine apparatus and image combining method
CN111164959B (zh) 图像处理装置、图像处理方法以及记录介质
WO2012172817A1 (fr) Appareil de stabilisation d'image, procédé de stabilisation d'image et document
JP2016513320A (ja) 少なくとも1つの追加イメージを用いたイメージ改善及びエッジ検証のための方法及び装置
JP2011058812A (ja) 視差算出方法、および視差算出装置
JP2000244729A (ja) 画像処理装置
JP2009020613A (ja) 画像処理プログラム、画像処理方法及び画像処理装置
WO2018147059A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, et programme
JP2008147976A (ja) 画像傾き補正装置及び画像傾き補正方法
EP1408681A1 (fr) Procédé pour déterminer l'angle d'inclinaison et l'emplacement d'un document dans une image scannée
WO2012029658A1 (fr) Dispositif d'imagerie, dispositif de traitement d'image, procédé de traitement d'image et programme de traitement d'image
US7079265B2 (en) Distortion correction device for correcting imaged object to produce plane image without distortion
JP2003304561A (ja) ステレオ画像処理装置
WO2018061997A1 (fr) Dispositif et procédé de reconnaissance de support
CN108335266B (zh) 一种文档图像畸变的矫正方法
US20190108617A1 (en) Image processing apparatus, system, image processing method, calibration method, and computer-readable recording medium
WO2019107141A1 (fr) Dispositif et procédé de traitement d'image
JP6006675B2 (ja) マーカ検出装置、マーカ検出方法、及びプログラム
JP6068080B2 (ja) 画像結合装置、画像結合方法及びプログラム
JP2018148513A (ja) 画像処理装置、画像処理方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19777502

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19777502

Country of ref document: EP

Kind code of ref document: A1