WO2014199824A1 - Image processing method, image processing device, and image processing program - Google Patents
Image processing method, image processing device, and image processing program
- Publication number
- WO2014199824A1 (Application No. PCT/JP2014/064066 / JP2014064066W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- rotation amount
- template
- matching
- rotation
- Prior art date
Links
- 238000003672 processing method Methods 0.000 title claims abstract description 17
- 238000012545 processing Methods 0.000 title claims description 104
- 238000006073 displacement reaction Methods 0.000 claims abstract description 11
- 238000001514 detection method Methods 0.000 claims description 55
- 230000008859 change Effects 0.000 claims description 12
- 230000000737 periodic effect Effects 0.000 claims description 10
- 238000003702 image correction Methods 0.000 claims description 5
- 238000000034 method Methods 0.000 description 117
- 230000008569 process Effects 0.000 description 53
- 238000006243 chemical reaction Methods 0.000 description 28
- 238000012937 correction Methods 0.000 description 25
- 238000004364 calculation method Methods 0.000 description 23
- 238000010586 diagram Methods 0.000 description 23
- 230000006870 function Effects 0.000 description 6
- 238000004891 communication Methods 0.000 description 4
- 238000000605 extraction Methods 0.000 description 4
- 230000009466 transformation Effects 0.000 description 4
- 230000008901 benefit Effects 0.000 description 3
- 230000015654 memory Effects 0.000 description 3
- 238000012935 Averaging Methods 0.000 description 2
- 238000005286 illumination Methods 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 238000000354 decomposition reaction Methods 0.000 description 1
- 230000000593 degrading effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000009499 grossing Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000003936 working memory Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/7515—Shifting the patterns to accommodate for positional errors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/14—Transformations for image registration, e.g. adjusting or mapping for alignment of images
- G06T3/147—Transformations for image registration, e.g. adjusting or mapping for alignment of images using affine transformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/32—Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/11—Technique with transformation invariance effect
Definitions
- the present invention relates to an image processing method, an image processing apparatus, and an image processing program related to template matching using a template image.
- a technique for detecting a position corresponding to a template image from search target images using a template image registered in advance has been put into practical use.
- As such a technique, the phase-only correlation method (Phase-Only Correlation; hereinafter also referred to as the "POC method") is known.
- the POC method is a method of searching for corresponding points between images using phase difference information of spatial frequencies included in the images. By using the phase difference information, it is possible to improve the robustness against illumination fluctuations.
- the general POC method assumes that there is no change in the rotation direction between the template image and the search target image. Therefore, when an object registered in advance as a template image is rotated in the search target image, the position cannot be accurately detected.
- This can be dealt with by adopting the rotation-invariant phase-only correlation method (hereinafter referred to as the "RIPOC method"), which is an extension of the POC method.
- In the RIPOC method, the amount of rotation (rotational displacement) between the images is first estimated using the amplitude information contained in the template image and the search target image, and either the template image or the search target image is corrected based on the estimated rotation amount. Subsequently, the corresponding position is detected using the corrected template image or search target image. Through this two-step process, the corresponding position can be detected accurately even when the target object can rotate.
- Patent Document 1 discloses a pattern matching apparatus using the RIPOC method. This pattern matching device estimates the amount of rotation from an image obtained by converting the amplitude information into polar coordinates, and estimates the position after correcting the template image.
- Patent Document 2 discloses a position detection method by image processing for obtaining a rotation angle from a reference position set in an image for a target graphic in a given input image.
- In that method, the search target image is matched against templates rotated to a plurality of rotation angles, and the rotation angle and position of the object are estimated from the template with the highest similarity and the position found with that template.
- The RIPOC method disclosed in Patent Document 1 and the like is highly robust against variations in the rotation direction and the like, but if the preceding estimation of the rotation amount (rotational displacement) between the images fails, the image cannot be corrected appropriately and the subsequent position detection does not work well. For example, for an image having periodicity in the rotation direction, correct estimation of the rotation amount (rotational displacement) is not guaranteed, so the subsequent position detection may not be performed correctly. Further, the position detection method disclosed in Patent Document 2 requires matching for every one of the plurality of rotation angles, which requires a great deal of calculation time.
- An object of the present invention is to provide an image processing method, an image processing apparatus, and an image processing program capable of speeding up processing while improving robustness in template matching using a template image.
- an image processing method for performing template matching with a search target image using a template image.
- The image processing method includes: a step of acquiring information on periodicity in the rotation direction for at least one of the template image and the search target image; a step of performing a first matching in the rotation direction between the template image and the search target image; a step of determining, based on the result of the first matching and the periodicity information, a plurality of rotation amount candidates for correcting the relative shift in the rotation direction between the template image and the search target image; a step of generating, using each of the rotation amount candidates, a plurality of image sets in which the relative shift in the rotation direction between the template image and the search target image is corrected; a step of performing a second matching with respect to position for each of the image sets; and a step of outputting, among the results of the second matching, a result with relatively high accuracy.
- the step of determining a plurality of rotation amount candidates includes a step of limiting a search range of the rotation amount based on periodicity information.
- The step of limiting the range over which the rotation amount is searched includes a step of generating corrected similarity information for the search range by integrating the relationship between the rotation amount and the similarity inside the search range with the relationship between the rotation amount and the similarity outside the search range.
- The step of determining a plurality of rotation amount candidates includes a step of determining the candidates by adding and/or subtracting the rotation amount indicated by the periodicity information to/from the rotation amount determined by the first matching.
- The step of determining a plurality of rotation amount candidates includes a step of referring to the change characteristic of the similarity between the template image and the search target image with respect to the rotation amount, and excluding, from the rotation amount candidates, candidates that do not satisfy the condition of the periodicity information.
- the step of acquiring periodicity information includes a step of receiving periodicity information set by a user when generating a template image.
- The step of acquiring the periodicity information includes a step of performing a third matching in the rotation direction between template images, and a step of determining the periodicity from the cycle of the similarity appearing in the result of the third matching.
- the step of acquiring periodicity information includes a step of receiving arbitrary periodicity information from the user.
- an image processing apparatus that performs template matching with a search target image using a template image.
- The image processing apparatus includes: a periodicity information acquisition unit that acquires information on periodicity in the rotation direction for at least one of the template image and the search target image; a rotation amount estimation unit that performs a first matching in the rotation direction between the template image and the search target image and determines, based on the result of the first matching and the periodicity information, a plurality of rotation amount candidates for correcting the relative shift in the rotation direction between the template image and the search target image; an image correction unit that generates, using each of the rotation amount candidates, a plurality of image sets in which the relative shift in the rotation direction between the template image and the search target image is corrected; and a position detection unit that performs a second matching with respect to position for each of the image sets and outputs, among the results of the second matching, a result with relatively high accuracy.
- an image processing program for performing template matching with a search target image using a template image.
- The image processing program causes a computer to execute: acquiring information on periodicity in the rotation direction for at least one of the template image and the search target image; performing a first matching in the rotation direction between the template image and the search target image; determining, based on the result of the first matching and the periodicity information, a plurality of rotation amount candidates for correcting the relative shift in the rotation direction between the template image and the search target image; generating, using each of the rotation amount candidates, a plurality of image sets in which the relative displacement in the rotation direction between the template image and the search target image is corrected; performing a second matching with respect to position for each of the image sets; and outputting, among the results of the second matching, a result with relatively high accuracy.
- FIG. 1 is a schematic diagram showing an application example of template matching according to the present embodiment.
- FIG. 2 is a block diagram showing a configuration when template matching according to the present embodiment is realized by a personal computer.
- FIG. 3 is a schematic diagram showing a functional configuration of the image processing apparatus according to the first embodiment.
- FIG. 4 is a schematic diagram for describing an overview of the rotation correction process in the rotation correction unit of the image processing apparatus according to the first embodiment.
- FIG. 5 is a more detailed functional block diagram of the rotation amount estimation unit of the image processing apparatus according to the first embodiment.
- FIG. 6 is a schematic diagram showing the algorithm of the POC method.
- FIG. 7 is a diagram showing an example of POC values calculated according to the general POC method shown in FIG. 6.
- FIG. 8 is a diagram showing an example of the result of the rotation correction process by the rotation correction unit of the image processing apparatus according to the first embodiment.
- FIG. 9 is a flowchart showing the overall procedure of the position detection process according to the first embodiment.
- FIG. 10 is a diagram for explaining the rotation amount estimation process according to the second embodiment.
- FIG. 11 is a diagram showing an example of the change in the POC value with respect to the rotation amount when a noise component is included in the image.
- FIG. 12 is a diagram for explaining the rotation amount estimation process according to the third embodiment.
- FIG. 13 is a schematic diagram showing a functional configuration of the image processing apparatus according to the fifth embodiment.
- FIG. 14 is a more detailed functional block diagram of the rotation amount estimation unit and the period information calculation unit of the image processing apparatus according to Processing Example 3 of the fifth embodiment.
- FIG. 15 is a diagram for explaining the rotation amount estimation process according to the seventh embodiment.
- an image processing method, an image processing apparatus, and an image processing program for performing template matching with a search target image using a template image are provided.
- periodicity information in the rotation direction is acquired for at least one of the template image and the search target image.
- the first matching in the rotation direction is executed between the template image and the search target image. That is, a relative shift (rotation amount) in the rotation direction between images is estimated.
- A plurality of rotation amount candidates for correcting the relative shift in the rotation direction between the template image and the search target image are then determined based on the result of the first matching and the periodicity information. Using each of the rotation amount candidates, a plurality of image sets in which the relative shift in the rotation direction between the template image and the search target image is corrected is generated, and a second matching with respect to position is performed for each of the image sets. That is, a plurality of rotation amount correction candidates are prepared, and a position search is executed for each candidate. Finally, among the results of the second matching, a result with relatively high accuracy is output as the final result.
- FIG. 1 is a schematic diagram showing an application example of template matching according to the present embodiment.
- system 1 according to the present embodiment is applied to a production line including belt conveyor 3 as an example.
- Workpieces 2 (hereinafter also referred to as "objects") are continuously conveyed on the belt conveyor 3 and photographed by the camera 10, whereby an image including the appearance of the workpiece 2 (hereinafter also referred to as the "search target image 16") is generated.
- the search target image 16 is transmitted to the image processing apparatus 100.
- the image processing apparatus 100 detects the position of the workpiece 2 included in the search target image 16 using the template image 18 registered in advance.
- the image processing apparatus 100 outputs position information including information such as a rotation amount and a magnification obtained by the template matching.
- FIG. 2 is a block diagram showing a configuration when template matching according to the present embodiment is realized by a personal computer.
- The image processing apparatus 100 realized by a personal computer is mainly implemented on a computer having a general-purpose architecture.
- the image processing apparatus 100 includes, as main components, a central processing unit (CPU) 102, a random access memory (RAM) 104, a read only memory (ROM) 106, a network interface (I / F) 108, a hard disk 110, and the like.
- Each component is communicably connected to each other via a bus 130.
- the CPU 102 controls the entire image processing apparatus 100 by executing various programs such as an operating system (OS) and a template matching execution program 112 stored in the ROM 106, the hard disk 110, and the like.
- the RAM 104 functions as a working memory for the CPU 102 to execute various programs.
- the ROM 106 stores an initial program (boot program) that is executed when the image processing apparatus 100 is started.
- The network interface 108 exchanges data with other devices (such as server devices) via various communication media. More specifically, the network interface 108 performs data communication via a wired line such as Ethernet (registered trademark) (a LAN (Local Area Network), a WAN (Wide Area Network), etc.) and/or a wireless line such as a wireless LAN.
- the hard disk 110 stores an image processing program (mainly the template matching execution program 112) for realizing various processes according to the present embodiment, a template image 18, and the like.
- the hard disk 110 may further store a program such as an operating system.
- the display unit 120 displays a GUI (Graphical User Interface) screen provided by the operating system, an image generated by executing the template matching execution program 112, and the like.
- the input unit 122 typically includes a keyboard, a mouse, a touch panel, and the like, and outputs the content of an instruction received from the user to the CPU 102 or the like.
- the memory card interface 124 reads / writes data from / to various memory cards (nonvolatile recording media) 126 such as an SD (Secure Digital) card and a CF (Compact Flash (registered trademark)) card.
- The camera interface 128 captures, from the camera 10, the image obtained by photographing the subject, which is used to generate the search target image 16.
- the camera 10 functions as an image acquisition unit for acquiring an image.
- the main body of the image processing apparatus 100 may not have a function of photographing a subject.
- a necessary image is captured via a memory card 126 that stores various images acquired by some device. That is, the memory card 126 is attached to the memory card interface 124, and various images read from the memory card 126 are stored (copied) in the hard disk 110 or the like.
- the template matching execution program 112 stored in the hard disk 110 is stored and distributed in a recording medium such as a CD-ROM (Compact Disk-Read Only Memory), or distributed from a server device or the like via a network.
- The template matching execution program 112 may implement its processing by calling, at predetermined timings and in the necessary order, required modules from among the program modules provided as part of the operating system executed by the image processing apparatus 100 (personal computer).
- In that case, the template matching execution program 112 itself does not include the modules provided by the operating system, and implements image processing in cooperation with the operating system.
- the template matching execution program 112 may be provided by being incorporated in a part of some program instead of a single program.
- the program itself does not include a module that is used in common with other programs, and instead, image processing is realized in cooperation with the other programs.
- Even such a template matching execution program 112 that does not include some of the modules does not depart from the spirit of the image processing apparatus 100 according to the present embodiment.
- Alternatively, the template matching execution program 112 may be realized by dedicated hardware.
- <<b3: Realization Example with Other Configurations>>
- Instead of the personal computer described above, the image processing according to the present embodiment may be implemented on a digital camera, a mobile phone, or a smartphone. Further, it may take the form of a so-called cloud service in which at least one server device realizes the processing according to the present embodiment.
- In this case, the user transmits at least two processing target images to the server device (cloud side) using his or her own terminal (such as a personal computer or a smartphone), and the server device performs the image processing according to the present embodiment on the transmitted processing target images.
- However, it is not necessary for the server device side to perform all of the functions (processing); the user terminal and the server device may cooperate to realize the image processing according to the present embodiment.
- FIG. 3 is a schematic diagram showing a functional configuration of image processing apparatus 100 according to the first embodiment.
- The image processing apparatus 100 according to the first embodiment estimates the change (rotation amount) in the rotation direction between the template image 18 and the search target image 16, and corrects at least one of the template image 18 and the search target image 16 accordingly.
- the image processing apparatus 100 corrects the change in the rotation direction and then detects a corresponding position between the images by matching processing between the images.
- processing in the case where rotation correction is performed by rotating the search target image 16 will be exemplified.
- the template image 18 may be rotationally corrected.
- the image processing apparatus 100 includes a template holding unit 150, an image acquisition unit 152, a rotation correction unit 154, and a position detection unit 160 as its functional configuration.
- the template holding unit 150 holds a template image 18 prepared in advance.
- the template holding unit 150 registers the template image 18 in advance.
- the template image 18 is arbitrarily created and / or set by a user or the like according to the purpose of template matching.
- the template image 18 prepared in advance by a user or the like is acquired by the image processing apparatus 100 via an arbitrary recording medium or communication medium, that is, read by file input or the like and held in the template holding unit 150.
- the template image 18 may be created using all or part of the image taken by the camera 10.
- the template image 18 held by the template holding unit 150 is used in rotation amount estimation processing and position detection processing described later.
- the image acquisition unit 152 acquires the search target image 16 that is a target of template matching by an arbitrary method. Typically, the image acquisition unit 152 acquires the search target image 16 generated when the camera 10 captures a subject.
- the acquisition method of the search target image 16 is not limited to the input method from the camera 10, and the search target image 16 captured in advance is acquired via an arbitrary recording medium or communication medium, that is, input as a file. Also good.
- the search target image 16 to be acquired may be a single image, or may be a plurality of images captured continuously.
- For example, the camera 10 acquires images at a predetermined cycle (that is, captures a moving image), and the continuously acquired images are temporarily stored in a frame buffer. When a certain trigger occurs (a shutter release by the user, automatic shooting after a predetermined time has elapsed, a trigger from another recognition process, or the like), the corresponding image among the images stored in the frame buffer is acquired and processed.
- the rotation correction unit 154 estimates a relative rotation amount between the template image 18 and the search target image 16, and uses at least one of the template image 18 and the search target image 16 with the estimated relative rotation amount. Correct the rotation. As described above, image processing apparatus 100 according to the first embodiment performs rotation correction on search target image 16. More specifically, the rotation correction unit 154 includes a rotation amount estimation unit 156 and an image correction unit 158.
- the rotation amount estimation unit 156 estimates a relative shift in the rotation amount between the template image 18 and the search target image 16 and also estimates a plurality of displacement amount candidates in the rotation direction between the images. That is, the rotation amount estimation unit 156 performs matching (first matching) with respect to the rotation direction between the template image 18 and the search target image 16, and the result of matching with respect to the rotation direction (first matching). Based on the periodicity information, a plurality of rotation amount candidates for correcting the relative shift in the rotation direction between the template image 18 and the search target image 16 are determined.
- Various methods can be employed as the logic for estimating the relative rotation amount. In this embodiment, the rotation amount is estimated using the POC method.
- The image correction unit 158 corrects the template image 18 or the search target image 16 (in the following description, the search target image 16) based on the plurality of rotation amount candidates estimated by the rotation amount estimation unit 156, and outputs a plurality of corrected images. That is, the image correction unit 158 generates, using each of the rotation amount candidates, a plurality of image sets in which the relative shift in the rotation direction between the template image 18 and the search target image 16 is corrected.
- Rotation correction is realized by relatively rotating one image using affine transformation.
- the affine transformation includes an interpolation process.
- a known interpolation method such as a Bi-linear method or a CubicConvolution method can be employed.
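- For illustration only, the following is a minimal sketch of how such a rotation correction could be implemented with an inverse affine mapping and bilinear interpolation, assuming a 2-D grayscale image stored as a NumPy array; the function name, the counterclockwise angle convention, and the edge-clamping behavior are illustrative choices, not part of the disclosure.

```python
import numpy as np

def rotate_bilinear(img, angle_deg):
    # Rotate a 2-D grayscale image about its center by angle_deg (degrees,
    # counterclockwise) using inverse mapping and bilinear interpolation.
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = np.deg2rad(angle_deg)
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse rotation: for each output pixel, find the source location.
    sx = np.cos(t) * (xs - cx) + np.sin(t) * (ys - cy) + cx
    sy = -np.sin(t) * (xs - cx) + np.cos(t) * (ys - cy) + cy
    x0 = np.clip(np.floor(sx).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(sy).astype(int), 0, h - 2)
    ax = np.clip(sx - x0, 0.0, 1.0)
    ay = np.clip(sy - y0, 0.0, 1.0)
    # Bilinear blend of the four neighboring source pixels (edges clamped).
    return (img[y0, x0] * (1 - ax) * (1 - ay)
            + img[y0, x0 + 1] * ax * (1 - ay)
            + img[y0 + 1, x0] * (1 - ax) * ay
            + img[y0 + 1, x0 + 1] * ax * ay)
```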
- the position detection unit 160 executes processing for detecting a corresponding position using each corrected image generated by the rotation correction. In other words, the position detection unit 160 performs position detection processing (second matching) on the position of each of the image sets generated by the image correction unit 158, and has relatively high accuracy among the results of the position detection processing. Output high results.
- As the position detection logic, various methods can be employed; in the present embodiment, position detection using the POC method is adopted.
- a plurality of rotation amounts are estimated as candidates using period information, and position detection is performed for each rotation amount candidate.
- the position detection unit 160 outputs the most probable result as final position information. Details of the position detection processing in the position detection unit 160 will be described later.
- FIG. 4 is a schematic diagram for explaining an outline of the rotation correction process in rotation correction unit 154 of image processing apparatus 100 according to the first embodiment.
- a plurality of rotation amount candidates used for the rotation correction process are estimated.
- the position detection robustness is enhanced by performing the subsequent position detection process using each of the estimated plurality of rotation amounts.
- a relative rotational shift is detected between the template image 18 and the search target image 16.
- the similarity between the images is calculated while sequentially changing the relative rotation amount between the template image 18 and the search target image 16. That is, matching processing regarding the rotation direction is executed between the template image 18 and the search target image 16.
- the POC method is adopted.
- a POC value is calculated as the similarity.
- a POC value as shown by reference numeral 30 is calculated according to the rotation amount of the search target image 16. That is, the grid-like objects as shown in the template image 18 and the search target image 16 have periodicity, and the peak of the POC value (similarity) appears at a cycle of 90 °.
- The angle corresponding to the peak position is determined as one of the rotation amount candidates to be corrected, and further candidates are derived from the determined candidate using the period.
- the period information is information on the periodicity in the rotation direction for at least one of the template image 18 and the search target image 16. Details of the method for acquiring the period information will be described later.
- In the present embodiment, the period information is acquired in advance, and in addition to the rotation amount θ showing the maximum peak, the rotation amount θ ± (rotation amount according to the period information) is determined as a rotation correction candidate.
- the corrected images 16A and 16B are generated by correcting the search target image 16 according to each estimation result.
- The subsequent position detection process is executed using these corrected images 16A and 16B.
- In other words, the process of generating a plurality of image sets includes a process of determining rotation amount candidates by adding and/or subtracting the rotation amount indicated by the periodicity information to/from the rotation amount determined by the matching in the rotation direction (first matching).
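- As a non-limiting sketch, this candidate generation by adding/subtracting the period to/from the estimated rotation amount could look as follows; the function name, the use of degrees, and the default search span of ±90° are illustrative assumptions.

```python
def rotation_candidates(theta_est, period_deg, span_deg=90.0):
    # Candidates obtained by adding/subtracting the period (assumed > 0) to/from
    # the rotation amount determined by the first matching, kept within +/- span_deg.
    cands = {theta_est}
    k = 1
    while (theta_est + k * period_deg <= span_deg
           or theta_est - k * period_deg >= -span_deg):
        if theta_est + k * period_deg <= span_deg:
            cands.add(theta_est + k * period_deg)
        if theta_est - k * period_deg >= -span_deg:
            cands.add(theta_est - k * period_deg)
        k += 1
    return sorted(cands)

# Example: rotation_candidates(-10.0, 90.0) -> [-10.0, 80.0],
# matching the two corrected images 16A and 16B in FIG. 4.
```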
- FIG. 5 is a more detailed functional block diagram of rotation amount estimation unit 156 of image processing apparatus 100 according to the first embodiment.
- The rotation amount estimation unit 156 has, as its functional configuration, frequency conversion units 1561 and 1562, logarithmization units 1563 and 1564, polar coordinate conversion units 1565 and 1566, a POC processing unit 1567, and a candidate generation unit 1568.
- the frequency conversion units 1561 and 1562 calculate frequency components (amplitude component and phase component) included in the template image 18 and the search target image 16, respectively.
- an amplitude component is used instead of a phase component.
- the logarithmization unit 1563 and the polar coordinate conversion unit 1565 logarithmize the amplitude component of the template image 18 and convert it into a polar coordinate image.
- the logarithmization unit 1564 and the polar coordinate conversion unit 1566 logarithmize the amplitude component of the search target image 16 and convert it into a polar coordinate image.
- the rotation amount is expressed as a coordinate point on two-dimensional coordinates.
- the horizontal coordinate of the polar coordinate image corresponds to the rotation amount.
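- The following sketch illustrates, under simplifying assumptions (nearest-neighbor resampling, a 2-D grayscale NumPy input, and the angle mapped to the horizontal axis over 0–180°), how the logarithmized amplitude spectrum could be converted into such a polar coordinate image; the function name is illustrative.

```python
import numpy as np

def polar_log_amplitude(img, n_theta=360, n_rho=None):
    # Logarithmized amplitude spectrum resampled to polar coordinates so that a
    # rotation of the input image becomes a horizontal shift of the output.
    A = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(img))))
    h, w = A.shape
    cy, cx = h / 2.0, w / 2.0
    if n_rho is None:
        n_rho = min(h, w) // 2
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)  # spectrum is symmetric
    rhos = np.arange(n_rho, dtype=float)
    tt, rr = np.meshgrid(thetas, rhos)            # rows: radius, columns: angle
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return A[ys, xs]                               # horizontal axis <-> rotation amount
```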
- the POC processing unit 1567 calculates a similarity and a parallel movement amount (corresponding to a rotation amount) between polar coordinate images output from the polar coordinate conversion units 1565 and 1566, respectively.
- the similarity calculated by the POC processing unit 1567 is also referred to as a correlation value or a POC value.
- FIG. 6 is a schematic diagram showing an algorithm of the POC method.
- The POC processing unit 1567 includes a standard window setting unit 1571, a reference window setting unit 1572, frequency conversion units 1573 and 1574, phase information extraction units 1575 and 1576, a phase difference calculation unit 1577, and a frequency inverse transform unit 1578.
- the standard window setting unit 1571 and the reference window setting unit 1572 set windows for the template image (polar coordinate conversion image) and the search target image (polar coordinate conversion image), respectively.
- The frequency conversion unit 1573 performs frequency conversion (typically, a Fourier transform) on the standard window set on the template image, thereby converting the image information contained therein into information in the frequency space.
- Similarly, the frequency conversion unit 1574 performs frequency conversion (typically, a Fourier transform) on the reference window set on the search target image, thereby converting the image information contained therein into information in the frequency space.
- the information on the converted frequency space includes amplitude information and phase information for each frequency. That is, frequency conversion units 1573 and 1574 perform frequency decomposition on the partial images included in the respective windows.
- the calculation result of a general frequency conversion is output in a complex number format including a real part and an imaginary part.
- frequency conversion is performed according to equation (1).
- The frequency information is output in a complex number format including a real part Re(u, v) and an imaginary part Im(u, v). It can be converted into amplitude information A(u, v) and phase information θ(u, v) using the values of the real part and the imaginary part.
- the result of frequency conversion may be stored in a format that combines amplitude and phase, or may be stored in a format that combines a real part and an imaginary part.
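- As an illustrative aid, the conversion of the complex frequency-conversion result into amplitude information A(u, v) and phase information θ(u, v) could be written as follows, here using NumPy's FFT as a stand-in for the frequency conversion of equation (1); the function name is illustrative.

```python
import numpy as np

def amplitude_and_phase(window):
    # Frequency conversion (Fourier transform) of a window image; the complex
    # result Re(u, v) + j*Im(u, v) is converted into amplitude A(u, v) and
    # phase theta(u, v).
    F = np.fft.fft2(window)
    re, im = F.real, F.imag
    amplitude = np.sqrt(re ** 2 + im ** 2)   # A(u, v)
    phase = np.arctan2(im, re)               # theta(u, v)
    return amplitude, phase
```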
- The phase information extraction units 1575 and 1576 extract the phase information for the standard window and the reference window, respectively, using the frequency conversion results (typically, complex numbers) output from the frequency conversion units 1573 and 1574.
- the phase difference calculation unit 1577 calculates the phase information difference for the windows extracted by the phase information extraction units 1575 and 1576, respectively. That is, the phase difference calculation unit 1577 generates phase difference information.
- The frequency inverse transform unit 1578 performs an inverse frequency transform on the phase difference information calculated by the phase difference calculation unit 1577, thereby calculating a POC value indicating the similarity between the partial images included in the respective set windows.
- This POC value calculation is repeatedly executed each time the reference window setting unit 1572 updates the position of the reference window set on the search target image.
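- A minimal sketch of the POC value computation described above (phase extraction, phase-difference formation, inverse frequency transform), assuming two same-sized grayscale window images; the normalization constant and the function name are illustrative.

```python
import numpy as np

def poc(f, g, eps=1e-12):
    # Phase-Only Correlation between two same-sized grayscale images:
    # keep only the phase difference of the spectra and transform it back.
    F = np.fft.fft2(f)
    G = np.fft.fft2(g)
    cross = F * np.conj(G)
    r = cross / np.maximum(np.abs(cross), eps)    # phase-difference information
    surface = np.fft.fftshift(np.real(np.fft.ifft2(r)))
    peak = np.unravel_index(np.argmax(surface), surface.shape)
    return surface, peak    # peak height ~ similarity, peak offset ~ shift
```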
- FIG. 7 is a diagram showing an example of the POC value calculated according to the general POC method shown in FIG.
- FIG. 7 shows an example of a POC value calculation result obtained from the template image 18 and the search target image 16 shown in FIG.
- two peaks appear.
- The template image 18 and the search target image 16 shown in FIG. 4 contain grid-like objects. That is, the amplitude component of each of the template image 18 and the search target image 16 has a cross shape as shown in FIG. 7, and accordingly, when considered in terms of the phase component, peaks appear at the rotation amounts θ and θ ± 90°.
- a plurality of peaks also appear in the POC value (correlation value) calculated by the POC method.
- the rotation amount corresponding to the maximum peak is estimated as the rotation fluctuation amount. Therefore, when a plurality of peaks occur, it is impossible to accurately determine which peak shows a truly correct rotation amount.
- Therefore, in the present embodiment, a plurality of rotation amount candidates are estimated at the rotation amount estimation stage.
- FIG. 8 is a diagram showing a result example of the rotation correction process by the rotation correction unit 154 of the image processing apparatus 100 according to the first embodiment.
- the search target image 16 is corrected with a plurality of rotation amount candidates based on the amount of change in the rotation direction between the template image 18 and the search target image 16 and the period information.
- images 16A and 16B are generated.
- the position search process is executed using each of the corrected images 16A and 16B.
- Period information can be acquired using various methods.
- the frequency component (amplitude component and phase component) included in the template image may be calculated, and the period information may be determined from the periodicity of the frequency component. More specifically, when creating the template image 18, the user can know what periodicity is present from the image characteristics. Specifically, by performing frequency conversion on the template image 18 and extracting amplitude information, an image as in the image after logarithmic processing shown in FIG. 5 can be obtained. By viewing such an image, the user can confirm what periodicity is present in the rotation direction.
- the processing for acquiring periodicity information includes processing for receiving periodicity information set by the user when the template image 18 is generated.
- the search target image is corrected with each of a plurality of rotation amount candidates, and position detection is performed using each of the corrected images. From the position detection result, the most probable result is output as final position information.
- the position detection unit 160 of the image processing apparatus 100 illustrated in FIG. 3 determines whether any position detection result is likely by comparing the magnitudes of the maximum peaks associated with these position detection results.
- In the example shown in FIG. 8, the corrected images 16A and 16B are obtained by correcting the search target image 16 by −10° and 80°, respectively.
- The corrected image 16B obtained by correcting the search target image 16 by 80° differs in appearance from the template image 18, so the value of its maximum peak is small and the accuracy of its position detection result is likely to be low.
- On the other hand, since the corrected image 16A obtained by correcting the search target image 16 by −10° looks almost the same as the template image 18, the value of its maximum peak is high and the position can be detected accurately.
- FIG. 9 is a flowchart showing an overall procedure of position detection processing according to the first embodiment. Each step shown in FIG. 9 is typically realized by the CPU 102 executing the template matching execution program 112 (both in FIG. 2).
- CPU 102 obtains a template image (step S100).
- the user may cut out an image obtained by photographing the reference workpiece with the camera 10, or may obtain an image generated from the design data of the workpiece 2.
- the CPU 102 acquires period information (step S102). This period information may be acquired from the outside in association with the template image, or may be set by the user based on the image information included in the template image.
- Through steps S100 and S102, the information necessary for template matching according to the present embodiment is collected.
- the CPU 102 acquires a search target image (step S104).
- the CPU 102 captures the work 2 by the camera 10 and acquires a search target image.
- an image obtained by photographing a work with another camera or the like may be captured as a search target image.
- For example, the workpiece 2 may be photographed by the camera 10 on condition that a sensor arranged on the belt conveyor 3 detects the arrival of the workpiece 2.
- The CPU 102 estimates the change (rotation amount) in the rotation direction between the template image and the search target image (step S106). More specifically, as shown in FIG. 5, the CPU 102 performs frequency conversion, logarithmic processing, and polar coordinate conversion on each image, and then estimates the rotation amount using the POC method. In this rotation amount estimation, the CPU 102 determines the rotation amount showing the highest similarity (correlation value). Then, the CPU 102 estimates rotation amount candidates from the rotation amount determined in step S106 using the period information (step S108). More specifically, the CPU 102 determines the rotation amount candidates by adding/subtracting the rotation amount indicated by the period information to/from the rotation amount estimated in step S106.
- Next, the CPU 102 rotates the search target image 16 using each of the plurality of candidate rotation amounts to generate a plurality of corrected images (step S110). Then, the CPU 102 performs position detection between the template image 18 and each of the plurality of generated corrected images (step S112). More specifically, the CPU 102 typically searches for the corresponding position between the images using the POC method. Finally, the CPU 102 outputs, as the final position information, the result showing the highest similarity (correlation value) among the position detection results obtained in step S112 (step S114). That is, the CPU 102 outputs the most probable result among the results of the position searches performed using the plurality of candidate rotation amounts as the final position information. Then, the process ends.
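- For illustration, steps S106 to S114 could be strung together as in the following sketch, reusing the illustrative helpers sketched earlier (polar_log_amplitude, poc, rotation_candidates, rotate_bilinear); these names, the angle-to-column mapping, and the sign conventions are assumptions and would need to match the actual polar sampling, not a definitive implementation of the disclosure.

```python
def detect_position(template, target, period_deg):
    # Sketch of steps S106-S114; template and target are assumed to be
    # same-sized 2-D grayscale NumPy arrays.
    lp_t = polar_log_amplitude(template)
    lp_s = polar_log_amplitude(target)
    surface, peak = poc(lp_t, lp_s)                        # S106: first matching
    deg_per_col = 180.0 / lp_t.shape[1]                    # columns span 0-180 deg
    theta_est = (peak[1] - lp_t.shape[1] // 2) * deg_per_col
    best = None
    for theta in rotation_candidates(theta_est, period_deg):   # S108
        corrected = rotate_bilinear(target, -theta)             # S110: rotation correction
        surf, pk = poc(template, corrected)                      # S112: second matching
        score = surf[pk]
        if best is None or score > best[0]:
            best = (score, theta, pk)
    return best   # (similarity, rotation amount, detected shift) -> S114
```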
- <<c6: Advantage>> According to the present embodiment, even when the rotation amount for the rotation correction cannot be obtained accurately, a plurality of rotation amount candidates are estimated by using the periodicity of the image in the rotation direction, and position detection is performed for each candidate. Then, by determining the optimum rotation amount and the corresponding displacement amount from the position detection results, position detection with higher robustness and accuracy can be realized.
- the POC value (similarity) is calculated using the amplitude component included in the template image 18 and the search target image 16, and the rotation amount indicating the maximum peak is determined.
- one or a plurality of rotation amount candidates are estimated from the period information with reference to the rotation amount corresponding to the determined maximum peak.
- In the POC value (similarity) search of the rotation amount estimation process, it is basically necessary to calculate POC values (similarities) for all rotation amounts.
- However, if the period information is acquired, the period at which peaks appear can be predicted in advance, so the search range of the POC value (similarity) can be limited based on the period information.
- FIG. 10 is a diagram for explaining the rotation amount estimation processing according to the second embodiment.
- FIG. 10 schematically shows the one-dimensional change in the POC value with respect to the rotation amount. In the example shown in FIG. 10, peaks appear at three places in total, namely at rotation amounts of 0°, −90°, and 90°.
- Since the period information in this example is 90°, only the range of ±45° around 0° needs to be searched. That is, the peak at the 0° position can be identified by searching the range of ±45° centered on 0°. Then, using the identified 0° position as a reference, it can be estimated from the period information that peaks also exist at the remaining −90° and 90° positions.
- For example, when the search target image 16 is rotated by 40° with respect to the template image 18, peaks appear at the two locations 40° and −50°, and when the search target image is rotated by 60°, peaks appear at the two locations 60° and −30°. In either case, at least one peak always exists within the ±45° range (search range), so the peak can be found even with the limited search range.
- In general, the search range may be limited to ±(period/2). Alternatively, the search range may be set to ±(period/2 + α) using a margin α.
- In the second embodiment, the POC value is calculated for the amplitude information, the maximum peak position is searched for in the POC value, and a plurality of rotation amount candidates are estimated from the period information using the rotation amount corresponding to the found maximum peak position as a reference.
- At that time, the range in which the peak position is searched for (search range) is limited based on the period information. That is, in the second embodiment, the matching in the rotation direction (first matching) between the template image 18 and the search target image 16 includes a process of limiting the range over which the rotation amount is searched based on the periodicity information. The calculation load can be reduced by adopting such a process of limiting the search range from the period information.
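- A minimal sketch of limiting the search range from the period information, assuming a one-dimensional POC curve indexed by rotation amount with zero rotation at the center column; the names and the margin handling are illustrative.

```python
import numpy as np

def limit_search_range(poc_1d, period_cols, margin_cols=0):
    # Mask restricting the peak search to +/- (period/2 + margin) columns
    # around the zero-rotation column (assumed to be the centre of the curve).
    n = len(poc_1d)
    center = n // 2
    half = period_cols // 2 + margin_cols
    mask = np.zeros(n, dtype=bool)
    mask[max(0, center - half):min(n, center + half + 1)] = True
    return mask   # search argmax only where mask is True
```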
- FIG. 11 is a diagram illustrating an example of a change in the POC value with respect to the rotation amount generated when a noise component is included in the image.
- FIG. 12 is a diagram for explaining a rotation amount estimation process according to the third embodiment.
- When some noise component (shot noise, shading, or the like) occurs in acquiring the search target image, a peak (false peak) may appear at a position where it should not exist (position b in FIG. 11), as shown in FIG. 11.
- the maximum peak should occur at position a.
- If such a false peak is picked up, rotation amount candidates are determined from the period information with the rotation amount corresponding to position b as the reference. As a result, the subsequent position detection is performed on images corrected with the erroneously estimated rotation amount and with rotation amounts derived from it using the period information, so the position cannot be detected accurately.
- Such a false detection can occur even when the search range is limited to ±45°. Therefore, in the third embodiment, as shown in FIG. 12, the influence of false peaks on accuracy is eliminated by integrating the POC value information outside the search range into the limited search range. More specifically, the amplitude of the false peak appearing at position b in FIG. 11 can be reduced by folding the characteristics (waveforms) of the POC values outside the search range back onto the search range, adding them, and then averaging.
- In this way, by limiting the search range based on the period information and integrating the POC value information outside the search range into the search range, the influence of noise components and the like can be reduced and the robustness improved. That is, in the third embodiment, the process of limiting the range over which the rotation amount is searched includes a process of generating corrected similarity information for the search range by integrating the relationship between rotation amount and similarity inside the search range with the relationship between rotation amount and similarity outside the search range.
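- The fold-and-average integration described above could be sketched as follows, assuming the same one-dimensional POC curve with zero rotation at the center column and a period expressed in columns; this is an illustrative approximation, not the exact procedure of the disclosure.

```python
import numpy as np

def fold_into_search_range(poc_1d, period_cols):
    # Fold the POC curve back onto one period around the zero-rotation column
    # and average, so that a false peak outside the search range is attenuated.
    n = len(poc_1d)
    center = n // 2
    half = period_cols // 2
    acc = np.zeros(2 * half + 1)
    count = 0
    k_max = center // period_cols
    for k in range(-k_max, k_max + 1):
        lo = center + k * period_cols - half
        hi = lo + 2 * half + 1
        if lo >= 0 and hi <= n:
            acc += poc_1d[lo:hi]
            count += 1
    return acc / max(count, 1)   # averaged similarity over the limited range
```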
- As described above, when a noise component (shot noise, shading, or the like) occurs in acquiring the search target image, a peak (false peak) may appear at a position where it should not exist (position b in FIG. 11), as shown in FIG. 11.
- the presence or absence of periodicity may be determined for each peak.
- More specifically, the POC value is evaluated at the positions obtained by adding/subtracting the period obtained from the period information (90° in this example) to/from the rotation amount corresponding to position b (that is, at the peak position rotation amount ± 90°). If the POC value at these positions is smaller than a predetermined threshold value, no periodicity is observed, so the peak appearing at position b can be judged to be a false peak and excluded.
- a peak having the next largest amplitude at a position other than the position b is searched.
- the presence or absence of periodicity is also determined for the newly searched peak. If it is determined by this determination that there is periodicity, the position of the peak and the position corresponding to the period information are extracted as rotation amount candidates.
- position detection processing is executed for each extracted rotation amount candidate, and the most probable result (rotation amount and corresponding position) is output as final position information.
- In this way, whether or not an extracted peak is a false peak can be judged by determining whether a periodic peak exists with the rotation amount of the extracted maximum peak as the reference. That is, in the fourth embodiment, a process is executed that refers to the change characteristic of the similarity between the template image 18 and the search target image 16 with respect to the rotation amount and excludes, from the rotation amount candidates, candidates that do not satisfy the condition of the periodicity information. By determining the presence or absence of periodicity for each extracted peak in this way, false peaks can be eliminated and the robustness improved.
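- As an illustrative sketch, the periodicity check used to reject a false peak could be written as follows, assuming the peak position and the period are expressed as column indices of the one-dimensional POC curve and that the threshold is given; the function name is illustrative.

```python
def has_periodic_support(poc_1d, peak_idx, period_cols, threshold):
    # A peak is considered genuine only if the POC value one period away on at
    # least one side also exceeds the threshold; otherwise it is a false peak.
    n = len(poc_1d)
    for idx in (peak_idx - period_cols, peak_idx + period_cols):
        if 0 <= idx < n and poc_1d[idx] >= threshold:
            return True
    return False
```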
- FIG. 13 is a schematic diagram showing a functional configuration of the image processing apparatus 100A according to the fifth embodiment.
- The image processing apparatus 100A according to the fifth embodiment differs from the image processing apparatus 100 shown in FIG. 3 in that a period information calculation unit 162 is added. Since the other components have been described with reference to FIG. 3, their detailed description will not be repeated.
- the period information calculation unit 162 shown in FIG. 13 calculates the period information included in the template image using any one of the methods 1 to 3 described later.
- As Processing Example 1, the period information calculation unit 162 performs rotation estimation between template images set by the user. This rotation estimation between template images is performed offline, that is, before the position search process for the search target image is performed.
- In this rotation estimation, a plurality of peaks appear in accordance with the periodicity, and the periodicity contained in the template image can be extracted from these peaks. More specifically, among the appearing peaks, those whose peak value (amplitude) exceeds a predetermined threshold are extracted, and the period information can be calculated by determining at what pixel intervals the extracted peaks occur.
- the rotation amount can be estimated in consideration of the cycle information as described in the above embodiment.
- When rotation estimation is performed between two identical template images, only a sharp peak at the center of the image is output as the POC value, which makes it difficult to calculate the period information. For this reason, it is preferable to introduce some error between the template images subjected to rotation estimation.
- As a method for generating such an error, for example, some noise such as random noise may be added to one of the template images, or one of the template images may be degraded by applying a smoothing filter or the like.
- In Processing Example 2, the period information calculation unit 162 calculates correlation values between the template image and images generated by rotating the template image, and thereby calculates the periodicity of the template image.
- More specifically, the period information calculation unit 162 generates a plurality of images by rotating the template image set by the user in increments of a predetermined angle (for example, 1°). This image generation process is executed offline. The period information calculation unit 162 then performs position detection matching between the template image set by the user and each of the generated images, thereby calculating a correlation value for each image.
- The correlation value calculated for each image varies according to the periodicity; that is, a higher degree of matching appears at rotation amounts corresponding to the periodicity. Therefore, the period information can be determined by calculating, from the correlation values exceeding a predetermined threshold, at what rotation amount intervals the peaks occur.
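- The following sketch illustrates Processing Example 2 under simplifying assumptions, reusing the illustrative poc and rotate_bilinear helpers sketched earlier; the 1° step, the correlation threshold, and reading the period off the first strong match are illustrative choices, not the patented procedure.

```python
def estimate_period(template, step_deg=1.0, threshold=0.3):
    # Rotate the template in step_deg increments, correlate each rotated copy
    # with the original via the POC peak value, and take the first angle whose
    # correlation exceeds the threshold as an estimate of the period.
    scores = []
    steps = int(round(360.0 / step_deg))
    for k in range(1, steps):
        angle = k * step_deg
        surface, peak = poc(template, rotate_bilinear(template, angle))
        scores.append((angle, surface[peak]))
    strong = [angle for angle, s in scores if s >= threshold]
    return strong[0] if strong else None   # None: no rotational periodicity found
```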
- FIG. 14 is a more detailed functional block diagram of rotation amount estimation unit 156 and period information calculation unit 162 of the image processing apparatus according to processing example 3 of the fifth embodiment.
- In Processing Example 3, the period information calculation unit 162 applies a frequency transformation (typically, a Fourier transform) along the axis representing the rotation amount (the horizontal axis of the polar coordinate conversion image shown in FIG. 14) of the image (polar coordinate conversion image) obtained by polar coordinate conversion of the amplitude information of the template image, and determines the period information from the resulting periodic components.
- In this way, the process of acquiring the periodicity information includes a process of performing a matching in the rotation direction (third matching) between template images 18, and a process of determining the periodicity from the cycle of the similarity appearing in the result of that matching.
- The user may also set arbitrary periodicity information in addition to, or instead of, the automatically determined periodicity information. That is, the process of acquiring the periodicity information may include a process of receiving arbitrary periodicity information from the user.
- a peak position having an amplitude equal to or greater than a predetermined threshold value may be calculated, and a periodic position with respect to each peak position may be calculated.
- Rotation amount estimation uses amplitude information, which is unstable against illumination fluctuations and the like, so the probability of error in the rotation amount estimation may increase.
- robust detection is possible by performing the subsequent position detection for a plurality of rotation amount candidates.
- Specifically, a plurality of peak positions having amplitudes equal to or greater than a predetermined threshold value are calculated, and it is determined whether a periodic relationship exists among the calculated peak positions.
- For a peak position having no periodic relationship, the rotation amount candidates are determined after excluding that peak position, and position detection is performed in the subsequent stage. This process amounts to calculating the maximum peak position that matches the period information.
- FIG. 15 is a diagram for explaining a rotation amount estimation process according to the seventh embodiment.
- The peak (1) shown in FIG. 15 has the maximum peak value, and the peaks (2) to (4) have peak values exceeding a predetermined threshold value (horizontal broken line).
- the peaks (1) to (4) shown in FIG. 15 are extracted as candidates for rotation amount estimation.
- the probability of the peak is determined using the period information in order from the maximum peak.
- the determination of the probability does not necessarily need to be performed in order from the maximum peak, and the order is arbitrary. For example, it may be performed in order from the peak located on the left side shown in FIG. 15, or may be performed in order from the peak located on the right side.
- period information A here, 90 ° as an example.
- the length corresponding to 90 ° is indicated by an arrow
- the peak of (1) is excluded from the candidates.
- the peak of (2) using the period information B (here, also 90 °), it is determined whether or not there is a peak exceeding the threshold at the corresponding position. to decide.
- the peak (3) exists at a position that is exactly 90 ° away from the peak (2), the peak (2) is left as a candidate because it is likely.
- the plausibility of each peak is determined in this way for all peaks exceeding the predetermined threshold (except for peaks already paired with another peak, such as the peak in (3)); a minimal code sketch of this check is given after this list.
- alternatively, the plausibility may be judged in order from the highest peak value, and the remaining peaks may be excluded once at least one peak matching the period information is found.
- the processing example using the POC method as the position detection method has been described.
- any method capable of performing position detection between images may be employed.
- a technique capable of detecting a position such as a SAD (Sum of Absolute Difference) method can be employed.
- the SAD method is a method of evaluating the similarity by calculating the sum of absolute values of differences for each pixel between the template image and the search target image.
- since the rotation amount of the template image is optimized in the rotation amount correction process, a technique that is somewhat inferior in accuracy may be used as long as it narrows down the final candidates.
- since the SAD method has a smaller calculation load than the POC method, the processing speed can be increased.
- the present embodiment includes the following aspects.
- the image processing apparatus includes means for acquiring a template image and a search target image, means for acquiring periodicity regarding the rotation direction of the amplitude information of the acquired images, and means for estimating a relative rotation amount between the template image and the search target image using the amplitude information and the periodicity.
- the image processing apparatus includes means for limiting, based on the periodicity information, the search range for estimating the relative rotation amount and estimating a relative rotation amount from the limited search range, and means for estimating a plurality of relative rotation amounts based on the estimated relative rotation amount and the periodicity information.
- the image processing apparatus includes means for integrating the correlation value in the limited search range and the correlation value outside the search range.
- the image processing apparatus includes means for excluding, from the estimation result, a rotation amount that does not match the periodicity information with respect to the estimated plurality of relative rotation amounts.
- the means for acquiring periodicity acquires periodicity information set by the user when the template is created.
- the means for acquiring the periodicity performs a process for estimating the rotation amount between the templates set by the user, and calculates the periodicity from the obtained correlation value.
- in an algorithm such as the RIPOC method, which first estimates the amount of deviation in the rotation direction and then executes a position search, it is important to estimate the initial amount of deviation in the rotation direction accurately.
- when the amplitude information of the image has periodicity in the rotation direction, it cannot be determined unambiguously which rotation amount is the correct shift amount, and an incorrect rotation amount may be estimated.
- a rotation amount unrelated to the periodicity may also be estimated due to the influence of noise on the image. Therefore, in a position detection method that takes the rotation direction of the image into account, using the periodicity of the image in the rotation direction makes it possible to improve the robustness and accuracy of the initially calculated rotation amount estimation.
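As referenced in the list above, the plausibility check of the seventh embodiment could be sketched as follows. This is only an illustrative reading of the procedure, not the patent's implementation: it assumes the similarity is sampled at one-degree steps over a full turn and that the period information is an integer number of degrees, and the function name `filter_peaks_by_period` and the tolerance parameter are ours.

```python
import numpy as np

def filter_peaks_by_period(similarity, period_deg, threshold, tol_deg=2):
    """Keep only rotation-amount peaks that have a partner peak one period away.

    similarity : similarity value for each integer rotation amount in degrees.
    period_deg : rotation period taken from the periodicity information (e.g. 90).
    threshold  : minimum value for a local maximum to count as a peak.
    tol_deg    : tolerance when looking for the partner peak.
    """
    n = len(similarity)

    def circ_dist(a, b):                      # circular distance on the rotation axis
        d = abs(a - b) % n
        return min(d, n - d)

    # Local maxima above the threshold (circular neighbourhood).
    peaks = [i for i in range(n)
             if similarity[i] >= threshold
             and similarity[i] >= similarity[(i - 1) % n]
             and similarity[i] >= similarity[(i + 1) % n]]
    # Examine the peaks in descending order of their value (largest peak first).
    peaks.sort(key=lambda i: similarity[i], reverse=True)

    plausible = []
    for p in peaks:
        # A peak is plausible when another above-threshold peak lies one period
        # away (within the tolerance); otherwise it is excluded from the candidates.
        if any(q != p and abs(circ_dist(p, q) - period_deg) <= tol_deg for q in peaks):
            plausible.append(p)
    return plausible                          # remaining rotation amount candidates
```

As noted above, the loop could also stop as soon as the first peak matching the period information is found, instead of checking every peak.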
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- General Health & Medical Sciences (AREA)
- Data Mining & Analysis (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
Description
According to the present embodiment, an image processing method, an image processing apparatus, and an image processing program that perform template matching between a template image and a search target image are provided. In this processing, information on the periodicity in the rotation direction is acquired for at least one of the template image and the search target image. In addition, a first matching in the rotation direction is performed between the template image and the search target image; that is, the relative displacement in the rotation direction (rotation amount) between the images is estimated.
First, an implementation example of an image processing apparatus having a template matching function using a template image according to the present embodiment will be described.
FIG. 1 is a schematic diagram showing an application example of template matching according to the present embodiment. Referring to FIG. 1, a system 1 according to the present embodiment is applied, as an example, to a production line including a belt conveyor 3. In the system 1, workpieces 2 are conveyed continuously on the belt conveyor 3, and by photographing a workpiece 2 with a camera 10, an image containing the appearance of the workpiece 2 (hereinafter also referred to as an "object") is acquired; this image is hereinafter also referred to as a "search target image 16".
FIG. 2 is a block diagram showing a configuration in which template matching according to the present embodiment is realized by a personal computer. Referring to FIG. 2, an image processing apparatus 100 realized by a personal computer is mainly implemented on a computer having a general-purpose architecture. The image processing apparatus 100 includes, as its main components, a CPU (Central Processing Unit) 102, a RAM (Random Access Memory) 104, a ROM (Read Only Memory) 106, a network interface (I/F) 108, a hard disk 110, a display unit 120, an input unit 122, a memory card interface (I/F) 124, and a camera interface (I/F) 128. These components are connected to one another via a bus 130 so that they can communicate with each other.
In addition to the personal computer implementation described above, the apparatus may also be implemented on, for example, a digital camera, a mobile phone, or a smartphone. Furthermore, it may take the form of a so-called cloud service in which at least one server apparatus realizes the processing according to the present embodiment. In this case, a configuration is assumed in which the user transmits at least two processing target images from the user's own terminal (such as a personal computer or a smartphone) to the server apparatus (cloud side), and the server apparatus performs the image processing according to the present embodiment on the transmitted images. However, the server apparatus does not need to perform all of the functions (processing); the user's terminal and the server apparatus may cooperate to realize the image processing according to the present embodiment.
《c1: Overall Configuration》
FIG. 3 is a schematic diagram showing the functional configuration of the image processing apparatus 100 according to the first embodiment. Referring to FIG. 3, the image processing apparatus 100 according to the first embodiment estimates the change in the rotation direction (rotation amount) between the template image 18 and the search target image 16, and corrects at least one of the template image 18 and the search target image 16. After correcting this change in the rotation direction, the image processing apparatus 100 detects corresponding positions between the images by a matching process between the images. For convenience of description, the following description exemplifies the case where the rotation correction is performed by rotating the search target image 16. Of course, the rotation correction may instead be applied to the template image 18.
The position detection unit 160 executes processing for detecting corresponding positions using each of the corrected images generated by the rotation correction. That is, the position detection unit 160 performs a position detection process (second matching) on position for each of the image pairs generated by the image correction unit 158, and outputs the result with relatively high accuracy among the results of the position detection processes.
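For illustration only, a minimal sketch of what this second matching might look like when phase-only correlation (POC) is used is given below. It assumes equally sized grayscale numpy arrays, omits refinements such as windowing and sub-pixel peak estimation, and the function name `poc_match` is ours, not the patent's.

```python
import numpy as np

def poc_match(template, corrected_image):
    """Basic phase-only correlation between two equally sized grayscale images.

    Returns (dy, dx, peak): the detected displacement and the correlation peak
    value, which serves as the confidence of this candidate.
    """
    F = np.fft.fft2(template)
    G = np.fft.fft2(corrected_image)
    cross = F * np.conj(G)
    cross /= np.abs(cross) + 1e-12           # keep only the phase information
    corr = np.real(np.fft.ifft2(cross))      # POC surface
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:                          # wrap large shifts to negative values
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx), float(corr.max())
```

The position detection unit would evaluate such a correlation once per rotation-corrected image pair and keep the candidate whose peak value is largest.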
Next, the details of the rotation correction process described above will be described. FIG. 4 is a schematic diagram for explaining an overview of the rotation correction process in the rotation correction unit 154 of the image processing apparatus 100 according to the first embodiment.
The period information can be acquired using various methods. Typically, the frequency components (amplitude components and phase components) contained in the template image may be calculated, and the period information may be determined from the periodicity of these frequency components. More specifically, when creating the template image 18, the user can know from its image features what kind of periodicity it has. Concretely, by applying a frequency transform to the template image 18 and extracting the amplitude information, an image such as the logarithmized image shown in FIG. 5 can be obtained. By looking at such an image, the user can confirm what kind of periodicity exists in the rotation direction.
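For reference, the log-scaled amplitude image that the user inspects could be produced along the following lines. This is a sketch under the assumption of a 2-D grayscale numpy array, not a description of the apparatus's actual implementation.

```python
import numpy as np

def log_amplitude_spectrum(template):
    """Log-scaled amplitude spectrum of the template (cf. the logarithmized image
    of FIG. 5), in which rotational periodicity of the pattern becomes visible."""
    spectrum = np.fft.fftshift(np.fft.fft2(template))   # centre the zero frequency
    return np.log1p(np.abs(spectrum))                   # amplitude information, log-scaled
```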
Next, the details of the position detection process described above will be described.
Next, the overall procedure of the position detection process according to the present embodiment will be described. FIG. 9 is a flowchart showing the overall procedure of the position detection process according to the first embodiment. Each step shown in FIG. 9 is typically realized by the CPU 102 executing the template matching execution program 112 (both shown in FIG. 2).
According to the present embodiment, even when the rotation amount for rotation correction cannot be determined exactly, a plurality of rotation amount candidates are estimated by using the periodicity of the image in the rotation direction, and position detection is executed using each candidate. Then, by determining the optimal rotation amount and the corresponding positional displacement from the position detection results, position detection with higher robustness and accuracy can be realized.
In the first embodiment described above, in the rotation amount estimation process, POC values (similarities) are calculated using the amplitude components contained in the template image 18 and the search target image 16, the rotation amount showing the maximum peak is determined, and one or more rotation amount candidates are estimated from the period information with the rotation amount corresponding to the determined maximum peak as a reference. In the search for POC values (similarities) in this rotation amount estimation process, the POC value (similarity) basically needs to be calculated for every rotation amount. On the other hand, when the period information has been acquired, the period at which peaks appear can be predicted in advance, so the search range for the POC values (similarities) can be limited based on the period information.
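A minimal sketch of this candidate generation, assuming the similarity has been sampled at one-degree steps over a full rotation and that the period information is an integer number of degrees (function and variable names are illustrative, not taken from the patent):

```python
import numpy as np

def rotation_candidates(similarity, period_deg):
    """Rotation amount candidates: the rotation amount of the maximum peak plus
    all rotation amounts obtained by adding/subtracting the period to/from it.

    similarity : POC value (similarity) for each integer rotation amount in degrees.
    period_deg : rotation period taken from the periodicity information.
    """
    n = len(similarity)                       # e.g. 360 samples for a full turn
    base = int(np.argmax(similarity))         # rotation amount of the maximum peak
    candidates = {base}
    k = 1
    while k * period_deg < n:
        candidates.add((base + k * period_deg) % n)   # add the period ...
        candidates.add((base - k * period_deg) % n)   # ... and subtract it
        k += 1
    return sorted(candidates)
```

With the 90° period of the grid-like example, this yields four candidates 90° apart; when the search range is limited as in the second embodiment, the similarity would only need to be evaluated in a neighbourhood of these expected peak positions.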
In the second embodiment described above, a configuration in which the calculation load is reduced by limiting the search range has been described. However, limiting the search range may lower the robustness. Therefore, the third embodiment describes a process in which the search range is limited in the rotation amount estimation process and the robustness is improved by using information outside the search range. The parts other than this rotation amount estimation process are the same as in the first embodiment described above, so their detailed description will not be repeated.
In the third embodiment described above, a process of improving the robustness by integrating the POC value information inside and outside the search range has been described. In contrast, the fourth embodiment describes a process of excluding false peaks by a different process rather than by the integration process. The parts other than this rotation amount estimation process are the same as in the first embodiment described above, so their detailed description will not be repeated.
In the first embodiment described above, an example in which the user acquires and sets the period information has been described. With a simple grid-like template image pattern such as that shown in FIG. 4, the user can know the periodicity. However, with a somewhat more complicated pattern, it becomes difficult to know what kind of periodicity it has. Therefore, the fifth embodiment exemplifies a configuration in which, once the user sets a template image, what kind of periodicity the template image has is determined automatically.
As processing example 1, the period information calculation unit 162 performs rotation estimation between the template images set by the user. This rotation estimation between template images is executed offline, that is, before the position search process for the search target image is executed.
As processing example 2, the period information calculation unit 162 calculates correlation values between the template image and images generated by rotating the template image, and thereby calculates the periodicity of the template image.
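Processing example 2 could be sketched as follows; this assumes a square grayscale template as a numpy array, uses scipy's image rotation for brevity, and reads the period off the dominant frequency of the similarity curve, which is one possible way (not necessarily the patent's) of turning the correlation values into period information.

```python
import numpy as np
from scipy.ndimage import rotate

def estimate_period_by_rotation(template, step_deg=1):
    """Rotation-direction periodicity of the template, from correlations between
    the template and rotated copies of itself (processing-example-2 style)."""
    angles = np.arange(0, 360, step_deg)
    t = template.astype(np.float64) - template.mean()
    sims = []
    for a in angles:
        r = rotate(template.astype(np.float64), float(a), reshape=False, order=1)
        r = r - r.mean()
        denom = np.linalg.norm(t) * np.linalg.norm(r) + 1e-12
        sims.append(float((t * r).sum() / denom))     # normalized correlation value
    sims = np.asarray(sims)
    # The dominant non-DC frequency of the similarity curve gives the period.
    spec = np.abs(np.fft.rfft(sims - sims.mean()))
    k = int(np.argmax(spec[1:])) + 1                  # cycles per full turn
    return 360.0 / k                                  # period in degrees
```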
FIG. 14 is a more detailed functional block diagram of the rotation amount estimation unit 156 and the period information calculation unit 162 of the image processing apparatus according to processing example 3 of the fifth embodiment. Referring to FIG. 14, the period information calculation unit 162 applies a frequency transform (typically, a Fourier transform) to an image obtained by polar coordinate conversion of the amplitude information of the template image (polar coordinate conversion image), in the direction of the axis representing the rotation amount (the horizontal axis of the polar coordinate conversion image shown in FIG. 14); as a result, the frequency peak corresponding to the periodicity becomes high. The period information is acquired by using this property.
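A simplified sketch of processing example 3: the amplitude information is resampled into polar coordinates (here by nearest-neighbour lookup) and a Fourier transform is applied along the rotation axis; the dominant non-DC frequency corresponds to the periodicity. The resampling method, the averaging over the radius axis, and the function name are our simplifications, not the patent's exact procedure.

```python
import numpy as np

def period_from_polar_spectrum(template, n_theta=360, n_r=None):
    """Rotation period (degrees) from the polar-resampled amplitude spectrum."""
    amp = np.abs(np.fft.fftshift(np.fft.fft2(template)))   # amplitude information
    h, w = amp.shape
    cy, cx = h / 2.0, w / 2.0
    if n_r is None:
        n_r = int(min(h, w) // 2)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    radii = np.arange(n_r)
    # Nearest-neighbour polar resampling of the amplitude image.
    yy = np.clip((cy + radii[None, :] * np.sin(thetas)[:, None]).round().astype(int), 0, h - 1)
    xx = np.clip((cx + radii[None, :] * np.cos(thetas)[:, None]).round().astype(int), 0, w - 1)
    polar = amp[yy, xx]                       # shape (n_theta, n_r); rotation axis first
    profile = polar.mean(axis=1)              # collapse the radius axis
    # Frequency transform along the rotation axis; pick the dominant non-DC peak.
    spec = np.abs(np.fft.rfft(profile - profile.mean()))
    k = int(np.argmax(spec[1:])) + 1          # cycles per full turn
    return 360.0 / k                          # rotation period in degrees
```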
In the fifth embodiment, the process of acquiring the periodicity information includes a process of performing matching (third matching) in the rotation direction between the template images 18, and a process of determining the periodicity from the period of the similarity appearing in the result of that matching. According to the fifth embodiment, once the user sets a template image, the user does not need to set the periodicity information for the set template image by himself or herself. In other words, since the periodicity of the set template image is analyzed automatically, a system with improved usability can be realized.
As described above, the user may set the periodicity information arbitrarily, in addition to or instead of the automatically determined periodicity information. That is, the process of acquiring the periodicity information may include a process of receiving arbitrary periodicity information from the user.
In the first embodiment described above, only the rotation amount of the maximum peak is obtained at the time of rotation amount estimation, and a plurality of rotation amounts are estimated based on the period information; however, another processing method may be adopted. The sixth embodiment describes a variation of the rotation amount estimation process. The parts other than this rotation amount estimation process are the same as in the first embodiment described above, so their detailed description will not be repeated.
In the first embodiment described above, only the rotation amount of the maximum peak is obtained at the time of rotation amount estimation, and a plurality of rotation amounts are estimated based on the period information; however, another processing method may be adopted. The seventh embodiment describes a further variation of the rotation amount estimation process. The parts other than this rotation amount estimation process are the same as in the first embodiment described above, so their detailed description will not be repeated.
In the first to seventh embodiments described above, processing examples using the POC method as the position detection method have been described, but any method capable of performing position detection between images may be adopted. For example, a position detection method such as the SAD (Sum of Absolute Difference) method can be adopted. The SAD method is a method of evaluating the similarity by calculating, for each pixel, the sum of the absolute values of the differences between the template image and the search target image.
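A brute-force sketch of the SAD evaluation described here, assuming 2-D grayscale numpy arrays (a real implementation would restrict the search window and use faster summation; the function name is illustrative):

```python
import numpy as np

def sad_match(template, search_image):
    """Exhaustive SAD search: the position with the smallest sum of absolute
    per-pixel differences is taken as the detected position."""
    th, tw = template.shape
    sh, sw = search_image.shape
    t = template.astype(np.float64)
    best_pos, best_sad = None, np.inf
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            window = search_image[y:y + th, x:x + tw].astype(np.float64)
            sad = np.abs(window - t).sum()    # sum of absolute differences
            if sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos, best_sad
```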
The present embodiment includes the following aspects.
Preferably, the means for acquiring the periodicity performs a process for estimating the rotation amount between templates set by the user, and calculates the periodicity from the obtained correlation values.
According to the present embodiment, in a position detection method that takes the rotation of the image into account, by using the periodicity of the image in the rotation direction at the time of the initially calculated rotation amount estimation, the influence of peaks that do not match the periodicity can be reduced, and by limiting the search range, the robustness can be improved and the calculation load can be reduced.
Claims (10)
- An image processing method for performing template matching between a template image and a search target image, the method comprising:
a step of acquiring information on periodicity in the rotation direction for at least one of the template image and the search target image;
a step of performing a first matching in the rotation direction between the template image and the search target image, and determining, based on the result of the first matching and the periodicity information, a plurality of rotation amount candidates for correcting a relative displacement in the rotation direction between the template image and the search target image;
a step of generating, using each of the rotation amount candidates, a plurality of image pairs in which the relative displacement in the rotation direction between the template image and the search target image has been corrected; and
a step of performing a second matching on position for each of the image pairs, and outputting a result with relatively high accuracy among the results of the second matching. - The image processing method according to claim 1, wherein the step of determining a plurality of rotation amount candidates includes a step of limiting, based on the periodicity information, the range over which the rotation amount is searched.
- The image processing method according to claim 2, wherein the step of limiting the range over which the rotation amount is searched includes a step of generating similarity information for the search range by correcting the relationship between the rotation amount and the similarity within the search range with the relationship between the rotation amount and the similarity outside the search range.
- The image processing method according to any one of claims 1 to 3, wherein the step of determining a plurality of rotation amount candidates includes a step of determining the rotation amount candidates by adding and/or subtracting the rotation amount indicated by the periodicity information to/from the rotation amount determined by the first matching.
- The image processing method according to any one of claims 1 to 4, wherein the step of determining a plurality of rotation amount candidates includes a step of excluding, from the rotation amount candidates, candidates that do not satisfy the condition of the periodicity information, with reference to the change characteristic of the similarity between the template image and the search target image with respect to the rotation amount.
- The image processing method according to any one of claims 1 to 5, wherein the step of acquiring the periodicity information includes a step of receiving the periodicity information set by a user at the time of generating the template image.
- The image processing method according to any one of claims 1 to 5, wherein the step of acquiring the periodicity information includes:
a step of performing a third matching in the rotation direction between the template images; and
a step of determining the periodicity from the period of the similarity appearing in the result of the third matching. - The image processing method according to any one of claims 1 to 5, wherein the step of acquiring the periodicity information includes a step of receiving arbitrary periodicity information from a user.
- An image processing apparatus for performing template matching between a template image and a search target image, the apparatus comprising:
a period information acquisition unit that acquires information on periodicity in the rotation direction for at least one of the template image and the search target image;
a rotation amount estimation unit that performs a first matching in the rotation direction between the template image and the search target image, and determines, based on the result of the first matching and the periodicity information, a plurality of rotation amount candidates for correcting a relative displacement in the rotation direction between the template image and the search target image;
an image correction unit that generates, using each of the rotation amount candidates, a plurality of image pairs in which the relative displacement in the rotation direction between the template image and the search target image has been corrected; and
a position detection unit that performs a second matching on position for each of the image pairs and outputs a result with relatively high accuracy among the results of the second matching. - An image processing program for performing template matching between a template image and a search target image, the image processing program causing a computer to execute:
a step of acquiring information on periodicity in the rotation direction for at least one of the template image and the search target image;
a step of performing a first matching in the rotation direction between the template image and the search target image, and determining, based on the result of the first matching and the periodicity information, a plurality of rotation amount candidates for correcting a relative displacement in the rotation direction between the template image and the search target image;
a step of generating, using each of the rotation amount candidates, a plurality of image pairs in which the relative displacement in the rotation direction between the template image and the search target image has been corrected; and
a step of performing a second matching on position for each of the image pairs, and outputting a result with relatively high accuracy among the results of the second matching.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/897,570 US20160140419A1 (en) | 2013-06-13 | 2014-05-28 | Image Processing Method, Image Processing Apparatus, And Image Processing Program |
JP2015522705A JPWO2014199824A1 (ja) | 2013-06-13 | 2014-05-28 | 画像処理方法、画像処理装置および画像処理プログラム |
EP14810909.3A EP3009985A4 (en) | 2013-06-13 | 2014-05-28 | Image processing method, image processing device, and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013124439 | 2013-06-13 | ||
JP2013-124439 | 2013-06-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014199824A1 true WO2014199824A1 (ja) | 2014-12-18 |
Family
ID=52022123
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/064066 WO2014199824A1 (ja) | 2013-06-13 | 2014-05-28 | 画像処理方法、画像処理装置および画像処理プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20160140419A1 (ja) |
EP (1) | EP3009985A4 (ja) |
JP (1) | JPWO2014199824A1 (ja) |
WO (1) | WO2014199824A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160321810A1 (en) * | 2015-04-28 | 2016-11-03 | Pixart Imaging (Penang) Sdn. Bhd. | Optical navigation sensor, electronic device with optical navigation function and operation method thereof |
JP6713185B2 (ja) * | 2015-10-15 | 2020-06-24 | 株式会社日立ハイテク | テンプレートマッチングを用いた検査装置および検査方法 |
CN106355607B (zh) * | 2016-08-12 | 2019-01-22 | 辽宁工程技术大学 | 一种宽基线彩色图像模板匹配方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10124667A (ja) | 1996-08-26 | 1998-05-15 | Yamatake Honeywell Co Ltd | パターン照合装置 |
JPH10206134A (ja) | 1997-01-17 | 1998-08-07 | Matsushita Electric Works Ltd | 画像処理による位置検出方法 |
JP2012069084A (ja) * | 2010-08-27 | 2012-04-05 | Hitachi High-Technologies Corp | 重み付きテンプレートマッチング実行装置およびプログラム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100308456B1 (ko) * | 1999-07-09 | 2001-11-02 | 오길록 | 주파수 공간상에서의 질감표현방법 및 질감기반 검색방법 |
JP2004240931A (ja) * | 2003-02-05 | 2004-08-26 | Sony Corp | 画像照合装置、画像照合方法、およびプログラム |
US7212673B2 (en) * | 2003-06-05 | 2007-05-01 | National Instruments Corporation | Rotational symmetry detection for configurations of discrete curves |
WO2005096218A1 (en) * | 2004-03-31 | 2005-10-13 | Canon Kabushiki Kaisha | Imaging system performance measurement |
US7340089B2 (en) * | 2004-08-17 | 2008-03-04 | National Instruments Corporation | Geometric pattern matching using dynamic feature combinations |
JP4909859B2 (ja) * | 2007-09-28 | 2012-04-04 | 株式会社日立ハイテクノロジーズ | 検査装置及び検査方法 |
JP5409237B2 (ja) * | 2009-09-28 | 2014-02-05 | キヤノン株式会社 | パターン検出装置、その処理方法及びプログラム |
US9613285B2 (en) * | 2012-03-22 | 2017-04-04 | The Charles Stark Draper Laboratory, Inc. | Compressive sensing with local geometric features |
- 2014-05-28 EP EP14810909.3A patent/EP3009985A4/en not_active Withdrawn
- 2014-05-28 JP JP2015522705A patent/JPWO2014199824A1/ja active Pending
- 2014-05-28 WO PCT/JP2014/064066 patent/WO2014199824A1/ja active Application Filing
- 2014-05-28 US US14/897,570 patent/US20160140419A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10124667A (ja) | 1996-08-26 | 1998-05-15 | Yamatake Honeywell Co Ltd | パターン照合装置 |
JPH10206134A (ja) | 1997-01-17 | 1998-08-07 | Matsushita Electric Works Ltd | 画像処理による位置検出方法 |
JP2012069084A (ja) * | 2010-08-27 | 2012-04-05 | Hitachi High-Technologies Corp | 重み付きテンプレートマッチング実行装置およびプログラム |
Non-Patent Citations (3)
Title |
---|
GERHARD X. RITTER; JOSEPH N. WILSON: "Handbook of Computer Vision Algorithms in Image Algebra", 1 May 1996 |
KOJI KOBAYASHI ET AL.: "Filtering on Phase Only Correlation Domain and its Applications", ITE TECHNICAL REPORT, vol. 21, no. 42, 22 July 1997 (1997-07-22), pages 31 - 36, XP008181648 * |
See also references of EP3009985A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014199824A1 (ja) | 2017-02-23 |
US20160140419A1 (en) | 2016-05-19 |
EP3009985A4 (en) | 2017-02-15 |
EP3009985A1 (en) | 2016-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7297018B2 (ja) | ビジョンシステムでラインを検出するためのシステム及び方法 | |
KR101283262B1 (ko) | 영상 처리 방법 및 장치 | |
JP4372051B2 (ja) | 手形状認識装置及びその方法 | |
EP2911116A1 (en) | Image-processing device, image-processing method, and image-processing program | |
US8582810B2 (en) | Detecting potential changed objects in images | |
US10664946B2 (en) | Signal processors and methods for estimating transformations between signals with phase deviation | |
US9971954B2 (en) | Apparatus and method for producing image processing filter | |
EP3336588A2 (en) | Method and apparatus for matching images | |
US10249058B2 (en) | Three-dimensional information restoration device, three-dimensional information restoration system, and three-dimensional information restoration method | |
JP2023120281A (ja) | ビジョンシステムでラインを検出するためのシステム及び方法 | |
WO2014199824A1 (ja) | 画像処理方法、画像処理装置および画像処理プログラム | |
EP2994884A2 (en) | Method and apparatus for image matching | |
TW201719572A (zh) | 三維模型分析及搜尋方法 | |
JP6163868B2 (ja) | 画像処理方法、画像処理装置および画像処理プログラム | |
US20170140549A1 (en) | Method of perceiving 3d structure from a pair of images | |
JP2017091202A (ja) | 物体認識方法及び物体認識装置 | |
JP5643147B2 (ja) | 動きベクトル検出装置、動きベクトル検出方法及び動きベクトル検出プログラム | |
WO2016142965A1 (ja) | 映像処理装置、映像処理方法及び映像処理プログラムを記憶する記録媒体 | |
JP2014029677A (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
JP3811474B2 (ja) | 顔部品位置検出方法及び顔部品位置検出装置 | |
CN111814869B (zh) | 一种同步定位与建图的方法、装置、电子设备及存储介质 | |
WO2015005425A1 (ja) | 顔照合装置、顔照合方法及び顔照合プログラム | |
Amankwah | Image registration by automatic subimage selection and maximization of combined mutual information and spatial information | |
EP2214138A1 (en) | Detecting potential changed objects in images | |
JP6582618B2 (ja) | テンプレートマッチングプログラム、テンプレートマッチング方法およびテンプレートマッチング装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14810909 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015522705 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14897570 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
REEP | Request for entry into the european phase |
Ref document number: 2014810909 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2014810909 Country of ref document: EP |