WO2024013901A1 - Match rate calculation device, match rate calculation method, and match rate calculation program - Google Patents

Match rate calculation device, match rate calculation method, and match rate calculation program

Info

Publication number
WO2024013901A1
Authority
WO
WIPO (PCT)
Prior art keywords
match rate
image
rate calculation
template image
pixels
Application number
PCT/JP2022/027611
Other languages
French (fr)
Japanese (ja)
Inventor
史拓 横瀬
公雄 土川
泰輔 若杉
諒 内田
晴夫 大石
Original Assignee
日本電信電話株式会社
Application filed by 日本電信電話株式会社 filed Critical 日本電信電話株式会社
Priority to PCT/JP2022/027611 priority Critical patent/WO2024013901A1/en
Publication of WO2024013901A1 publication Critical patent/WO2024013901A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • the present invention relates to a match rate calculation device, a match rate calculation method, and a match rate calculation program.
  • image matching is used to search a captured image of the PC screen (the searched image) for areas that match a template image (see Non-Patent Document 1).
  • in image matching, in addition to the process of finding parts of the searched image that completely match the template image, a process of finding similar parts of the searched image whose match rate with the template image is greater than or equal to a predetermined threshold is also used.
  • the match rate in image matching changes if the size of the area corresponding to the margin changes depending on how the template image is cut out.
  • controls on the PC screen may have visual effects such as changing color when the mouse cursor is placed over the control (mouse over), and when the searched image changes due to such visual effects, the match rate changes as well.
  • the present invention has been made in view of the above, and an object of the present invention is to enable humans to easily set a match rate threshold by bringing the calculation of the match rate in image matching closer to human intuition.
  • a match rate calculation device includes a background detection unit that detects a background area of a template image for image matching, and a calculation unit that calculates a match rate between the template image and the searched image while excluding the detected background area.
  • FIG. 1 is a diagram for explaining an overview of a match rate calculation device.
  • FIG. 2 is a diagram for explaining the outline of the match rate calculation device.
  • FIG. 3 is a diagram for explaining the outline of the match rate calculation device.
  • FIG. 4 is a diagram for explaining the outline of the match rate calculation device.
  • FIG. 5 is a schematic diagram illustrating a schematic configuration of the match rate calculation device of this embodiment.
  • FIG. 6 is a diagram for explaining the match rate calculation process.
  • FIG. 7 is a diagram for explaining the processing of the background detection section.
  • FIG. 8 is a diagram for explaining the processing of the edge detection section.
  • FIG. 9 is a diagram for explaining the processing of the calculation unit.
  • FIG. 10 is a diagram for explaining the processing of the calculation unit.
  • FIG. 11 is a diagram for explaining the processing of the calculation unit.
  • FIG. 12 is a diagram for explaining the processing of the calculation unit.
  • FIG. 13 is a flowchart showing the match rate calculation processing procedure.
  • FIG. 14 is a diagram for explaining the effect of the match rate calculation process.
  • FIG. 15 is a diagram illustrating an example of a computer that executes a match rate calculation program.
  • FIGS. 1 to 4 are diagrams for explaining the outline of the match rate calculation device.
  • Image matching is a process of comparing a searched image and a template image to find a position in the searched image that has a high degree of similarity to the template image.
  • image matching involves shifting (scanning) a position within the searched image and comparing the part of the searched image that has the same size as the template image, calculating the match rate for each position. If this match rate is greater than or equal to a predetermined threshold, that position of the searched image is determined to be similar to the template image.
  • the image to be compared is a grayscale image in which each pixel takes a value of 0 to 255.
  • the comparison is not limited to this, and may be a comparison between color images.
  • the match rate is normalized to 0% to 100%, that is, 0.0 to 1.0, and is set to 100% when there is a complete match.
  • the range of the match rate is not limited to this.
  • the match rate is the normalized sum of the absolute values of each pixel difference.
  • the value is not limited to this, and may be a value calculated by, for example, the square root of the sum of squares of each pixel difference, a cosine similarity of each pixel difference, or the like.
  • the match rate varies greatly depending on how the template image is cropped from the captured image, including the margins.
  • the margin is the area around a control (operated element such as a button, check box, text box, etc.) on the PC screen to be operated, and is usually the background on the PC screen.
  • the template image shown in FIG. 3(a) is cut out to be larger than the template image shown in FIG. 3(b), so that the margin portion is larger.
  • the proportion of the differing part (the presence or absence of the "レ" check mark) relative to the whole becomes relatively small, so the match rate is as high as 98.9%.
  • the match rate will change even if there is no major change in appearance to the human eye.
  • the portion "Register" of the searched image illustrated in FIG. 4 normally has a match rate of 100% with the template image "Register", as illustrated in FIG. 4(a).
  • when the color of "Register" in the searched image changes due to mouse over, its match rate with the template image "Register" decreases and becomes lower than that of "Reset", which has different characters.
  • the match rate calculation device of this embodiment detects the background region of the template image and excludes it from contributing to the match rate. This makes it possible to exclude the blank space of the template image, which has little meaning when specifying elements on the PC screen, thereby reducing the influence on the match rate.
  • edge regions, which are regions that change sharply compared to their surroundings, are detected, and the weight of their contribution to the match rate is increased.
  • the match rate calculation device increases the contribution of the more characteristic regions of the graphic design, such as the edges of a control image, and reduces the influence on the match rate of changes that are individually small but occur over relatively large regions of the control, such as those caused by visual effects like mouse over.
  • the match rate calculation device brings changes in the match rate closer to human perception and calculates the match rate without being affected by how the template image is cropped or by mouse over, which makes it possible to easily set a threshold for match rate determination in image matching.
  • FIG. 5 is a schematic diagram illustrating a schematic configuration of the match rate calculation device of this embodiment.
  • the match rate calculation device 10 of this embodiment is realized by a general-purpose computer such as a personal computer, and includes an input unit 11, an output unit 12, a communication control unit 13, a storage unit 14, and a control unit 15.
  • the input unit 11 is realized using an input device such as a keyboard or a mouse, and inputs various instruction information such as starting processing to the control unit 15 in response to an input operation by an operator.
  • the output unit 12 is realized by a display device such as a liquid crystal display, a printing device such as a printer, and the like. For example, the output unit 12 displays the results of match rate calculation processing, which will be described later.
  • the communication control unit 13 is realized by a NIC (Network Interface Card) or the like, and controls communication between an external device and the control unit 15 via a telecommunication line such as a LAN (Local Area Network) or the Internet.
  • the communication control unit 13 controls communication between the control unit 15 and a management device that manages various types of information used in match rate calculation processing.
  • the storage unit 14 is realized by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk.
  • a processing program for operating the match rate calculation device 10, data used during execution of the processing program, and the like are stored in the storage unit 14 in advance, or are stored temporarily each time processing is performed.
  • the storage unit 14 may be configured to communicate with the control unit 15 via the communication control unit 13.
  • the control unit 15 is realized using a CPU (Central Processing Unit) or the like, and executes a processing program stored in a memory. Thereby, as illustrated in FIG. 5, the control unit 15 functions as an acquisition unit 15a, a background detection unit 15b, an edge detection unit 15c, and a calculation unit 15d, and executes the match rate calculation process. Note that these functional units, or some of them, may be implemented in different hardware. For example, the acquisition unit 15a may be implemented in hardware different from the other functional units. Further, the control unit 15 may include other functional units.
  • the acquisition unit 15a acquires a searched image and a template image to be subjected to image matching processing. For example, the acquisition unit 15a acquires the searched image and template image generated by screen capture via the input unit 11 or from a user terminal, management device, or the like via the communication control unit 13.
  • the acquisition unit 15a may acquire the searched image and template image to be processed in advance and store them in the storage unit 14, or may transfer them immediately to the subsequent functional units without storing them in the storage unit 14.
  • the background detection unit 15b detects the background area of the template image for image matching. Specifically, the background detection unit 15b detects pixels having the same color as the four outer sides of the template image as a background area.
  • FIG. 6 is a diagram for explaining the match rate calculation process.
  • FIG. 6(a) is the searched image for the processing described below.
  • FIG. 6(b) is a template image.
  • all images are grayscale images.
  • since the searched image and the template image are the same size, 20 pixels wide by 15 pixels high, there is no need to calculate the match rate multiple times while scanning the searched image, and the match rate is calculated only once.
  • FIG. 7 is a diagram for explaining the processing of the background detection section.
  • if the color information of the pixels on the outer four sides of the template image is all the same, the background detection unit 15b uses that color as the background color. Then, the background detection unit 15b detects pixels of the same color contiguous inward from the pixels on the four sides as the background area.
  • the edge detection unit 15c detects edge regions of the template image. Specifically, the edge detection unit 15c detects a pixel having the maximum color difference between the target pixel of the template image and surrounding pixels as an edge region.
  • FIG. 8 is a diagram for explaining the processing of the edge detection section.
  • the edge detection unit 15c calculates the maximum value among the absolute values of the differences between the target pixel of interest and its surrounding pixels.
  • FIG. 8A illustrates a case where the center pixel is the target pixel, the lower right pixel has a color difference of 150, and the other pixels have the same color with a difference of 0.
  • the maximum value of the difference between each pixel and surrounding pixels is calculated for each pixel as a target pixel, and pixels for which the calculated value is greater than or equal to a predetermined threshold are defined as edge regions.
  • pixels with a difference value of 150 that are shaded with diagonal lines are detected as edge regions.
  • FIGS. 9 to 12 are diagrams for explaining the processing of the calculation unit.
  • the calculation unit 15d excludes the detected background area and calculates the matching rate between the template image and the searched image.
  • the calculation unit 15d calculates the match rate using the difference value for each pixel between the template image and the search target image.
  • the calculation unit 15d compares the template image illustrated in FIG. 9(a) with a comparison target portion of the searched image that has the same size as the template image, and calculates the difference value for each pixel as illustrated in FIG. 9(b).
  • the calculation unit 15d identifies the background region of the template image from among the difference values for each pixel, as illustrated in FIG. 10(a). Further, as illustrated in FIG. 10B, the calculation unit 15d identifies pixels in the background region of the template image whose difference value is equal to or less than a predetermined threshold value (for example, 0) as background pixels. In FIG. 10(b), diagonally shaded pixels from the upper right to the lower left are specified as background pixels. The calculation unit 15d excludes the specified background pixel from the comparison target pixels for calculating the match rate.
  • the calculation unit 15d calculates the match rate between the template image and the searched image by adding a predetermined weight to the edge region. For example, as illustrated in FIG. 11A, the calculation unit 15d identifies an edge region of the template image among the pixel-by-pixel difference values, and identifies pixels in the edge region as weight-increasing pixels.
  • the calculation unit 15d excludes pixels identified as background pixels as illustrated in FIG. 10(b) from the weight-increasing pixels.
  • in FIG. 11(b), diagonally shaded pixels from the upper left to the lower right are specified as weight-increasing pixels. Note that instead of prioritizing background pixels, priority may be given to weight-increasing pixels, and pixels identified as weight-increasing pixels may be excluded from the background pixels.
  • the calculation unit 15d calculates the match rate between the comparison target portion of the searched image and the template image, as illustrated in FIG. 12. For example, the calculation unit 15d calculates the match rate using the ratio of the weighted sum of the difference values to the maximum possible value. That is, the calculation unit 15d calculates 1-(weighted sum of difference values) ⁇ (maximum possible value of weighted sum of difference values) as the match rate.
  • the calculation unit 15d first calculates the weighted sum of the difference values between both images.
  • a predetermined weight of 2.0 is applied to the sum of the difference values of the 133 weight-increasing pixels illustrated in FIG. 11(b).
  • the sum of the difference values of the 26 pixels obtained by excluding the 144 background pixels illustrated in FIG. 10(b) from the total of 300 pixels is calculated.
  • the calculation unit 15d calculates the maximum value that the weighted sum of the difference values between both images can take.
  • with the maximum difference value of each pixel being 255, the maximum value for the 133 weight-increasing pixels with weight 2.0 is calculated as 255×133×2.0.
  • the maximum possible value for the 26 pixels excluding the background pixels is calculated as 255×26.
  • the calculation unit 15d calculates ⁇ 1-(weighted sum of difference values) ⁇ (maximum possible value of weighted sum of difference values) ⁇ as a match rate.
  • FIG. 13 is a flowchart showing the match rate calculation processing procedure.
  • the flowchart in FIG. 13 is started, for example, at the timing when the user instructs to start the apparatus.
  • the acquisition unit 15a acquires a search target image and a template image for image matching processing. Further, the background detection unit 15b detects a background area of a template image for image matching (step S1). For example, the background detection unit 15b detects pixels having the same color as the four outer sides of the template image as a background area.
  • the edge detection unit 15c detects an edge region of the template image (step S2). For example, the edge detection unit 15c detects, as an edge region, a pixel that has the maximum color difference between the target pixel and surrounding pixels in the template image.
  • the calculation unit 15d calculates the match rate between the template image and the searched image, excluding the detected background area. Further, the calculation unit 15d applies a predetermined weight to the edge region and calculates the match rate between the template image and the searched image (step S3). For example, the calculation unit 15d uses the per-pixel difference values between the template image and the searched image and calculates 1-(weighted sum of difference values)÷(maximum possible value of the weighted sum of difference values) as the match rate. This completes the series of match rate calculation processes.
  • the background detection unit 15b detects the background area of the template image for image matching.
  • the calculation unit 15d calculates the matching rate between the template image and the searched image, excluding the detected background area.
  • the calculation unit 15d calculates the match rate using the difference value for each pixel between the template image and the searched image. Furthermore, the background detection unit 15b detects pixels having the same color as the four outer sides of the template image as a background area.
  • the match rate calculation device 10 can calculate the match rate while excluding the margin portion of the template image, which has little meaning when identifying elements on the PC screen, and reducing its influence on the match rate.
  • the edge detection unit 15c detects edge areas of the template image. In that case, the calculation unit 15d adds a predetermined weight to the edge region and calculates the matching rate between the template image and the search target image.
  • the edge detection unit 15c detects, as an edge region, a pixel that has the maximum color difference between the target pixel of the template image and surrounding pixels.
  • the match rate calculation device 10 can calculate the match rate while emphasizing common design elements such as the edges of control buttons and reducing the influence on the match rate of changes such as color changes within the edge area caused by mouse over.
  • FIG. 14 is a diagram for explaining the effect of the match rate calculation process.
  • the match rate with the searched image changes depending on how the template image is cut out from the captured image and the color changes due to mouse over.
  • the match rate of the searched image "Reset" with respect to the template image "Register" varies between 95% and 98% depending on the size of the margin of the template image.
  • even when the size of the margin is the same, the match rate is calculated as the same 95% for the searched image "Reset" and for "Register" whose color has changed due to mouse over.
  • with the match rate calculation device 10 of this embodiment, as illustrated in FIG. 14(b), it is possible to calculate the match rate while eliminating the influence of the margin portion of the template image and suppressing the influence of color changes due to mouse over and the like.
  • the match rate with the template image "Register" is 70% for the searched image "Reset", whereas it is 99% for "Register" whose color has changed due to mouse over, which is higher and closer to human perception.
  • the match rate calculation device 10 is able to calculate the match rate in image matching by bringing the match rate closer to the human intuitive feeling and suppressing the effects of cropping methods, mouse overs, etc. Therefore, it becomes possible for humans to more easily set the matching rate threshold for determining that the search target image and the template image are similar.
  • the match rate calculation device 10 can be implemented by installing a match rate calculation program that executes the above-described match rate calculation process on a desired computer as packaged software or online software. For example, by causing the information processing device to execute the above match rate calculation program, the information processing device can be made to function as the match rate calculation device 10.
  • the information processing device referred to here includes a desktop or notebook personal computer.
  • information processing devices include mobile communication terminals such as smartphones, mobile phones, and PHSs (Personal Handyphone Systems), as well as slate terminals such as PDAs (Personal Digital Assistants).
  • the functions of the match rate calculation device 10 may be implemented in a cloud server.
  • FIG. 15 is a diagram showing an example of a computer that executes a match rate calculation program.
  • Computer 1000 includes, for example, memory 1010, CPU 1020, hard disk drive interface 1030, disk drive interface 1040, serial port interface 1050, video adapter 1060, and network interface 1070. These parts are connected by a bus 1080.
  • the memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM 1012.
  • the ROM 1011 stores, for example, a boot program such as BIOS (Basic Input Output System).
  • Hard disk drive interface 1030 is connected to hard disk drive 1031.
  • Disk drive interface 1040 is connected to disk drive 1041.
  • a removable storage medium such as a magnetic disk or an optical disk is inserted into the disk drive 1041, for example.
  • a mouse 1051 and a keyboard 1052 are connected to the serial port interface 1050.
  • a display 1061 is connected to the video adapter 1060.
  • the hard disk drive 1031 stores, for example, an OS 1091, an application program 1092, a program module 1093, and program data 1094. Each piece of information described in the above embodiments is stored in, for example, the hard disk drive 1031 or the memory 1010.
  • the match rate calculation program is stored in the hard disk drive 1031, for example, as a program module 1093 in which commands to be executed by the computer 1000 are written. Specifically, a program module 1093 in which each process executed by the match rate calculation device 10 described in the above embodiment is described is stored in the hard disk drive 1031.
  • data used for information processing by the match rate calculation program is stored as program data 1094 in, for example, the hard disk drive 1031.
  • the CPU 1020 reads out the program module 1093 and program data 1094 stored in the hard disk drive 1031 to the RAM 1012 as necessary, and executes each of the above-described procedures.
  • the program module 1093 and program data 1094 related to the match rate calculation program are not limited to being stored in the hard disk drive 1031; for example, they may be stored in a removable storage medium and read by the CPU 1020 via the disk drive 1041 or the like.
  • the program module 1093 and program data 1094 related to the match rate calculation program may be stored in another computer connected via a network such as a LAN or WAN (Wide Area Network) and read by the CPU 1020 via the network interface 1070.

Abstract

In the present invention, a background detection unit (15b) detects a background region in a template image for image matching. A calculation unit (15d) excludes the detected background region and calculates the match rate between the template image and a searched image.

Description

Match rate calculation device, match rate calculation method, and match rate calculation program
The present invention relates to a match rate calculation device, a match rate calculation method, and a match rate calculation program.
In operation automation on a PC such as RPA (Robotic Process Automation) and in analysis of operations on a PC, image matching is used to search a captured image of the PC screen (the searched image) for areas that match a template image (see Non-Patent Document 1). In image matching, in addition to the process of finding parts of the searched image that completely match the template image, a process of finding similar parts of the searched image whose match rate with the template image is greater than or equal to a predetermined threshold is also used.
However, with the conventional technology, it may be difficult to set a match rate threshold in image matching. For example, the match rate in image matching changes if the size of the area corresponding to the margin changes depending on how the template image is cut out. In addition, controls (operated elements) on the PC screen may have visual effects such as changing color when the mouse cursor is placed over the control (mouse over), and when the searched image changes due to such visual effects, the match rate changes as well. Because the size of the margin of the template image and changes in the searched image caused by visual effects are difficult for humans to notice, the resulting fluctuations in the match rate diverge from human intuition and make it feel difficult to set a threshold for judging similarity.
The present invention has been made in view of the above, and an object of the present invention is to enable humans to easily set a match rate threshold by bringing the calculation of the match rate in image matching closer to human intuition.
In order to solve the above-described problems and achieve the object, a match rate calculation device according to the present invention includes a background detection unit that detects a background area of a template image for image matching, and a calculation unit that calculates a match rate between the template image and a searched image while excluding the detected background area.
According to the present invention, it becomes possible for humans to easily set the match rate threshold in image matching.
FIG. 1 is a diagram for explaining an overview of the match rate calculation device.
FIG. 2 is a diagram for explaining the overview of the match rate calculation device.
FIG. 3 is a diagram for explaining the overview of the match rate calculation device.
FIG. 4 is a diagram for explaining the overview of the match rate calculation device.
FIG. 5 is a schematic diagram illustrating the schematic configuration of the match rate calculation device of this embodiment.
FIG. 6 is a diagram for explaining the match rate calculation process.
FIG. 7 is a diagram for explaining the processing of the background detection unit.
FIG. 8 is a diagram for explaining the processing of the edge detection unit.
FIG. 9 is a diagram for explaining the processing of the calculation unit.
FIG. 10 is a diagram for explaining the processing of the calculation unit.
FIG. 11 is a diagram for explaining the processing of the calculation unit.
FIG. 12 is a diagram for explaining the processing of the calculation unit.
FIG. 13 is a flowchart showing the match rate calculation processing procedure.
FIG. 14 is a diagram for explaining the effect of the match rate calculation process.
FIG. 15 is a diagram illustrating an example of a computer that executes the match rate calculation program.
Hereinafter, an embodiment of the present invention will be described in detail with reference to the drawings. Note that the present invention is not limited to this embodiment. In the description of the drawings, the same parts are denoted by the same reference numerals.
[Overview of match rate calculation device]
FIGS. 1 to 4 are diagrams for explaining the outline of the match rate calculation device. Image matching is a process of comparing a searched image and a template image to find positions in the searched image that have a high degree of similarity to the template image. As conceptually illustrated in FIG. 1, image matching shifts (scans) a position within the searched image and compares the part of the searched image that has the same size as the template image, calculating a match rate for each position. If this match rate is greater than or equal to a predetermined threshold, that position of the searched image is determined to be similar to the template image.
Note that in this embodiment, the images to be compared are grayscale images in which each pixel takes a value from 0 to 255. However, the comparison is not limited to this and may be a comparison between color images. The match rate is normalized to 0% to 100%, that is, 0.0 to 1.0, and is 100% when the images match completely. However, the range of the match rate is not limited to this. Further, as illustrated in FIG. 2, the match rate is obtained by normalizing the sum of the absolute values of the per-pixel differences. However, it is not limited to this, and may be a value calculated using, for example, the square root of the sum of squares of the per-pixel differences, the cosine similarity of the per-pixel differences, or the like.
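The following sketch illustrates, under the assumption of 2-D grayscale NumPy arrays, how the per-pixel differences could be aggregated with each of the alternatives mentioned above; the function names and the exact normalizations are illustrative assumptions, not definitions from the patent.

```python
import numpy as np

def rate_abs_sum(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized sum of absolute per-pixel differences, mapped so 1.0 means identical."""
    d = np.abs(a.astype(np.int32) - b.astype(np.int32))
    return 1.0 - d.sum() / (255.0 * a.size)

def rate_rmse(a: np.ndarray, b: np.ndarray) -> float:
    """Square root of the sum of squared per-pixel differences, normalized the same way."""
    d = a.astype(np.float64) - b.astype(np.float64)
    return 1.0 - np.sqrt((d ** 2).sum()) / (255.0 * np.sqrt(a.size))

def rate_cosine(a: np.ndarray, b: np.ndarray) -> float:
    """One interpretation of the cosine variant: cosine similarity of the pixel vectors."""
    va, vb = a.astype(np.float64).ravel(), b.astype(np.float64).ravel()
    denom = np.linalg.norm(va) * np.linalg.norm(vb)
    return float(va @ vb / denom) if denom else 1.0
```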
Here, the match rate varies greatly depending on how the template image is cropped from the captured image, including the margins. The margin is the area around a control (an operated element such as a button, check box, or text box) on the PC screen to be operated, and is usually the background of the PC screen. For example, in the example shown in FIG. 3, the template image shown in FIG. 3(a) is cropped larger than the template image shown in FIG. 3(b), so its margin portion is larger. In this case, the proportion of the differing part (the presence or absence of the "レ" check mark) relative to the whole becomes relatively small, so the match rate is as high as 98.9%.
In addition, when the color of a control such as a button on the PC screen changes due to mouse over (the state in which the mouse cursor is on the element) or the like, the match rate may change even though there is no major change in appearance to the human eye. For example, the portion "Register" of the searched image illustrated in FIG. 4 normally has a match rate of 100% with the template image "Register", as illustrated in FIG. 4(a). On the other hand, as illustrated in FIG. 4(b), when the color of "Register" in the searched image changes due to mouse over, its match rate with the template image "Register" decreases and becomes lower than that of "Reset", which has different characters.
Therefore, the match rate calculation device of this embodiment detects the background region of the template image and excludes it from contributing to the match rate. This makes it possible to exclude the margin portion of the template image, which has little meaning when identifying elements on the PC screen, and to reduce its influence on the match rate.
In addition, edge regions, which are regions that change sharply compared to their surroundings, are detected, and the weight of their contribution to the match rate is increased. As a result, the match rate calculation device increases the contribution of the more characteristic regions of the graphic design, such as the edges of a control image, and can reduce the influence on the match rate of changes that are individually small but occur over relatively large regions of the control, such as those caused by visual effects like mouse over.
In this way, the match rate calculation device brings changes in the match rate closer to human perception and calculates the match rate without being affected by how the template image is cropped or by mouse over, which makes it possible to easily set a threshold for match rate determination in image matching.
[Configuration of match rate calculation device]
FIG. 5 is a schematic diagram illustrating the schematic configuration of the match rate calculation device of this embodiment. As illustrated in FIG. 5, the match rate calculation device 10 of this embodiment is realized by a general-purpose computer such as a personal computer, and includes an input unit 11, an output unit 12, a communication control unit 13, a storage unit 14, and a control unit 15.
The input unit 11 is realized using an input device such as a keyboard or a mouse, and inputs various instruction information, such as an instruction to start processing, to the control unit 15 in response to input operations by an operator. The output unit 12 is realized by a display device such as a liquid crystal display, a printing device such as a printer, or the like. For example, the output unit 12 displays the results of the match rate calculation process described later.
The communication control unit 13 is realized by a NIC (Network Interface Card) or the like, and controls communication between external devices and the control unit 15 over a telecommunication line such as a LAN (Local Area Network) or the Internet. For example, the communication control unit 13 controls communication between the control unit 15 and a management device that manages various kinds of information used in the match rate calculation process.
The storage unit 14 is realized by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or by a storage device such as a hard disk or an optical disk. The storage unit 14 stores in advance a processing program for operating the match rate calculation device 10, data used during execution of the processing program, and the like, or stores them temporarily each time processing is performed. Note that the storage unit 14 may be configured to communicate with the control unit 15 via the communication control unit 13.
The control unit 15 is realized using a CPU (Central Processing Unit) or the like and executes a processing program stored in memory. Thereby, as illustrated in FIG. 5, the control unit 15 functions as an acquisition unit 15a, a background detection unit 15b, an edge detection unit 15c, and a calculation unit 15d, and executes the match rate calculation process. Note that these functional units, or some of them, may be implemented in different hardware. For example, the acquisition unit 15a may be implemented in hardware different from the other functional units. The control unit 15 may also include other functional units.
The acquisition unit 15a acquires the searched image and the template image to be subjected to the image matching process. For example, the acquisition unit 15a acquires the searched image and the template image generated by screen capture via the input unit 11, or from a user terminal, a management device, or the like via the communication control unit 13.
Note that the acquisition unit 15a may acquire the searched image and template image to be processed in advance and store them in the storage unit 14, or may transfer them immediately to the subsequent functional units without storing them in the storage unit 14.
The background detection unit 15b detects the background area of the template image for image matching. Specifically, the background detection unit 15b detects pixels having the same color as the four outer sides of the template image as the background area.
Here, FIG. 6 is a diagram for explaining the match rate calculation process. FIG. 6(a) is the searched image for the processing described below, and FIG. 6(b) is the template image. In this embodiment, both are grayscale images. Since the searched image and the template image are the same size, 20 pixels wide by 15 pixels high, there is no need to calculate the match rate multiple times while scanning the searched image, and the match rate is calculated only once.
FIG. 7 is a diagram for explaining the processing of the background detection unit. For example, as shown in FIG. 7(a), if the color information of the pixels on the outer four sides of the template image is all the same, the background detection unit 15b uses that color as the background color. The background detection unit 15b then detects pixels of the same color contiguous inward from the pixels on the four sides as the background area.
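A minimal sketch of this border-color background detection, assuming that "pixels of the same color contiguous inward from the four sides" means a flood fill of the border color starting at the border; the function name and the flood-fill interpretation are assumptions, not the patent's own wording.

```python
import numpy as np
from collections import deque

def detect_background(template: np.ndarray) -> np.ndarray:
    """Return a boolean mask of background pixels, or an all-False mask if the
    four outer sides of the template are not a single uniform color."""
    h, w = template.shape
    border = np.concatenate([template[0, :], template[-1, :],
                             template[:, 0], template[:, -1]])
    mask = np.zeros((h, w), dtype=bool)
    if not np.all(border == border[0]):
        return mask  # no uniform border color, so no background is detected
    bg_color = border[0]
    # Start from every border pixel and flood-fill the background color inward.
    queue = deque((y, x) for y in range(h) for x in range(w)
                  if (y in (0, h - 1) or x in (0, w - 1)) and template[y, x] == bg_color)
    for y, x in queue:
        mask[y, x] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] and template[ny, nx] == bg_color:
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```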
The edge detection unit 15c detects edge regions of the template image. Specifically, the edge detection unit 15c detects, as the edge region, pixels at which the color difference between the target pixel of the template image and its surrounding pixels is maximum.
Here, FIG. 8 is a diagram for explaining the processing of the edge detection unit. For example, as shown in FIG. 8(a), the edge detection unit 15c calculates the maximum of the absolute values of the differences between the target pixel of interest and its surrounding pixels. FIG. 8(a) illustrates a case where, with the center pixel as the target pixel, the color difference from the lower-right pixel is 150 and the other pixels have the same color with a difference of 0. Then, as shown in FIG. 8(b), the maximum difference from the surrounding pixels is calculated for each pixel as the target pixel, and pixels for which the calculated value is greater than or equal to a predetermined threshold are defined as the edge region. In the example shown in FIG. 8(b), the diagonally shaded pixels with a difference value of 150 are detected as the edge region.
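A minimal sketch of this edge detection, assuming 8-connected neighbors and an arbitrary example threshold of 100; the patent only specifies "a predetermined threshold".

```python
import numpy as np

def detect_edges(template: np.ndarray, threshold: int = 100) -> np.ndarray:
    """Return a boolean mask of pixels whose maximum absolute difference from
    any of their 8 neighbors is greater than or equal to the threshold."""
    img = template.astype(np.int32)
    # Pad with edge values so border pixels are compared only against real neighbors.
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    max_diff = np.zeros((h, w), dtype=np.int32)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            neighbor = padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
            max_diff = np.maximum(max_diff, np.abs(img - neighbor))
    return max_diff >= threshold
```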
Next, FIGS. 9 to 12 are diagrams for explaining the processing of the calculation unit. First, the calculation unit 15d excludes the detected background area and calculates the match rate between the template image and the searched image. Here, the calculation unit 15d calculates the match rate using the per-pixel difference values between the template image and the searched image.
For example, the calculation unit 15d compares the template image illustrated in FIG. 9(a) with a comparison target portion of the searched image that has the same size as the template image, and calculates the difference value for each pixel as illustrated in FIG. 9(b).
Then, as illustrated in FIG. 10(a), the calculation unit 15d identifies the background region of the template image among the per-pixel difference values. Further, as illustrated in FIG. 10(b), the calculation unit 15d identifies, as background pixels, pixels that are in the background region of the template image and whose difference value is less than or equal to a predetermined threshold (for example, 0). In FIG. 10(b), the pixels shaded with diagonal lines running from upper right to lower left are identified as background pixels. The calculation unit 15d excludes the identified background pixels from the comparison target pixels used to calculate the match rate.
The calculation unit 15d also applies a predetermined weight to the edge region and calculates the match rate between the template image and the searched image. For example, as illustrated in FIG. 11(a), the calculation unit 15d identifies the edge region of the template image among the per-pixel difference values and identifies the pixels of the edge region as weight-increasing pixels.
At that time, as illustrated in FIG. 11(b), the calculation unit 15d excludes the pixels identified as background pixels in FIG. 10(b) from the weight-increasing pixels. In FIG. 11(b), the pixels shaded with diagonal lines running from upper left to lower right are identified as weight-increasing pixels. Note that, instead of giving priority to background pixels, priority may be given to weight-increasing pixels, and pixels identified as weight-increasing pixels may be excluded from the background pixels.
Then, as illustrated in FIG. 12, the calculation unit 15d calculates the match rate between the comparison target portion of the searched image and the template image. For example, the calculation unit 15d calculates the match rate using the ratio of the weighted sum of the difference values to its maximum possible value. That is, the calculation unit 15d calculates 1-(weighted sum of difference values)÷(maximum possible value of the weighted sum of difference values) as the match rate.
Specifically, as illustrated in FIG. 12, the calculation unit 15d first calculates the weighted sum of the difference values between the two images. In the example shown in FIG. 12, a predetermined weight of 2.0 is applied to the sum of the difference values of the 133 weight-increasing pixels illustrated in FIG. 11(b). In addition, the sum of the difference values of the 26 pixels obtained by excluding the 144 background pixels illustrated in FIG. 10(b) from the total of 300 pixels is calculated.
The calculation unit 15d also calculates the maximum value that the weighted sum of the difference values between the two images can take. In the example shown in FIG. 12, with the maximum difference value of each pixel being 255, the maximum value for the 133 weight-increasing pixels with weight 2.0 is calculated as 255×133×2.0, and the maximum possible value for the 26 pixels excluding the background pixels is calculated as 255×26.
Then, the calculation unit 15d calculates {1-(weighted sum of difference values)÷(maximum possible value of the weighted sum of difference values)} as the match rate. In the example shown in FIG. 12, the match rate is calculated to be approximately 0.611, that is, 61.1%.
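The following sketch combines these steps into the weighted match rate of FIG. 12: pixels that are in the background region and have a difference value of at most a threshold (0 here, following the example) are excluded, edge pixels that are not background receive a weight of 2.0, and the match rate is 1 minus the weighted sum of differences divided by its maximum possible value. The function signature is an assumption; it expects the background and edge masks produced by helpers such as those sketched above.

```python
import numpy as np

def weighted_match_rate(patch: np.ndarray, template: np.ndarray,
                        bg_mask: np.ndarray, edge_mask: np.ndarray,
                        edge_weight: float = 2.0, bg_diff_threshold: int = 0) -> float:
    """Match rate = 1 - (weighted sum of differences) / (maximum possible weighted sum)."""
    diff = np.abs(patch.astype(np.int32) - template.astype(np.int32))
    # Background pixels: inside the detected background region and nearly identical.
    background = bg_mask & (diff <= bg_diff_threshold)
    weights = np.ones_like(diff, dtype=np.float64)
    # Weight-increasing pixels: edge pixels that were not excluded as background.
    weights[edge_mask & ~background] = edge_weight
    weights[background] = 0.0  # excluded from the comparison entirely
    weighted_sum = float((diff * weights).sum())
    max_sum = float((255.0 * weights).sum())
    return 1.0 - weighted_sum / max_sum if max_sum > 0 else 1.0
```

For the example of FIG. 12, the denominator computed this way has the same structure as 255×133×2.0 + 255×26.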
[Match rate calculation process]
Next, the match rate calculation process performed by the match rate calculation device 10 according to this embodiment will be described with reference to FIG. 13. FIG. 13 is a flowchart showing the match rate calculation processing procedure. The flowchart in FIG. 13 is started, for example, at the timing when the user instructs the device to start.
First, the acquisition unit 15a acquires the searched image and the template image for the image matching process. The background detection unit 15b then detects the background area of the template image for image matching (step S1). For example, the background detection unit 15b detects pixels having the same color as the four outer sides of the template image as the background area.
Next, the edge detection unit 15c detects the edge region of the template image (step S2). For example, the edge detection unit 15c detects, as the edge region, pixels at which the color difference between the target pixel of the template image and its surrounding pixels is maximum.
Then, the calculation unit 15d excludes the detected background area and calculates the match rate between the template image and the searched image. The calculation unit 15d also applies a predetermined weight to the edge region when calculating the match rate between the template image and the searched image (step S3). For example, the calculation unit 15d uses the per-pixel difference values between the template image and the searched image and calculates 1-(weighted sum of difference values)÷(maximum possible value of the weighted sum of difference values) as the match rate. This completes the series of match rate calculation processes.
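Putting steps S1 to S3 together, a minimal end-to-end sketch is shown below; it assumes a searched image of the same size as the template, as in FIG. 6, and reuses the hypothetical detect_background, detect_edges, and weighted_match_rate helpers sketched above.

```python
import numpy as np

def calculate_match_rate(searched: np.ndarray, template: np.ndarray) -> float:
    """Run steps S1 to S3 for a searched image of the same size as the template."""
    bg_mask = detect_background(template)   # step S1: background area of the template
    edge_mask = detect_edges(template)      # step S2: edge region of the template
    return weighted_match_rate(searched, template, bg_mask, edge_mask)  # step S3

# Example usage with a 15 x 20 grayscale template, as in FIG. 6.
template = np.random.randint(0, 256, (15, 20), dtype=np.uint8)
print(calculate_match_rate(template.copy(), template))  # 1.0 for a perfect match
```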
[Effects]
As described above, in the match rate calculation device 10 of this embodiment, the background detection unit 15b detects the background area of the template image for image matching. The calculation unit 15d excludes the detected background area and calculates the match rate between the template image and the searched image.
Specifically, the calculation unit 15d calculates the match rate using the per-pixel difference values between the template image and the searched image. Further, the background detection unit 15b detects pixels having the same color as the four outer sides of the template image as the background area.
As a result, the match rate calculation device 10 can calculate the match rate while excluding the margin portion of the template image, which has little meaning when identifying elements on the PC screen, and reducing its influence on the match rate.
The edge detection unit 15c also detects the edge region of the template image. In that case, the calculation unit 15d applies a predetermined weight to the edge region and calculates the match rate between the template image and the searched image.
Specifically, the edge detection unit 15c detects, as the edge region, pixels at which the color difference between the target pixel of the template image and its surrounding pixels is maximum.
As a result, the match rate calculation device 10 can calculate the match rate while emphasizing common design elements such as the edges of control buttons and reducing the influence on the match rate of changes such as color changes within the edge area caused by mouse over.
Here, FIG. 14 is a diagram for explaining the effect of the match rate calculation process. Conventionally, as illustrated in FIG. 14(a), the match rate with the searched image changes depending on how the template image is cropped from the captured image and on color changes caused by mouse over and the like. In the example shown in FIG. 14(a), the match rate of the searched image "Reset" with respect to the template image "Register" varies between 95% and 98% depending on the size of the margin of the template image. Furthermore, even when the size of the margin is the same, the match rate is calculated as the same 95% for the searched image "Reset" and for "Register" whose color has changed due to mouse over.
In contrast, according to the match rate calculation device 10 of this embodiment, as illustrated in FIG. 14(b), it is possible to calculate the match rate while eliminating the influence of the margin portion of the template image and suppressing the influence of color changes due to mouse over and the like. In the example shown in FIG. 14(b), the match rate with the template image "Register" is 70% for the searched image "Reset", whereas it is 99% for "Register" whose color has changed due to mouse over, which is higher and closer to human perception.
In this way, the match rate calculation device 10 can calculate the match rate in image matching while bringing it closer to human intuition and suppressing the effects of how the template image is cropped, of mouse over, and the like. Therefore, it becomes possible for humans to more easily set the match rate threshold for determining that the searched image and the template image are similar.
[Program]
It is also possible to create a program in which the processing executed by the match rate calculation device 10 according to the above embodiment is written in a computer-executable language. As one embodiment, the match rate calculation device 10 can be implemented by installing, on a desired computer, a match rate calculation program that executes the above match rate calculation process as packaged software or online software. For example, by causing an information processing device to execute the above match rate calculation program, the information processing device can be made to function as the match rate calculation device 10. The information processing device referred to here includes desktop and notebook personal computers. In addition, mobile communication terminals such as smartphones, mobile phones, and PHS (Personal Handyphone System) devices, as well as slate terminals such as PDAs (Personal Digital Assistants), also fall into this category. The functions of the match rate calculation device 10 may also be implemented in a cloud server.
 FIG. 15 is a diagram showing an example of a computer that executes the match rate calculation program. The computer 1000 includes, for example, a memory 1010, a CPU 1020, a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. These units are connected by a bus 1080.
 The memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM 1012. The ROM 1011 stores, for example, a boot program such as a BIOS (Basic Input Output System). The hard disk drive interface 1030 is connected to a hard disk drive 1031. The disk drive interface 1040 is connected to a disk drive 1041, into which a removable storage medium such as a magnetic disk or an optical disk is inserted. A mouse 1051 and a keyboard 1052, for example, are connected to the serial port interface 1050. A display 1061, for example, is connected to the video adapter 1060.
 The hard disk drive 1031 stores, for example, an OS 1091, an application program 1092, a program module 1093, and program data 1094. Each piece of information described in the above embodiment is stored in, for example, the hard disk drive 1031 or the memory 1010.
 The match rate calculation program is stored in the hard disk drive 1031 as, for example, a program module 1093 in which the instructions executed by the computer 1000 are written. Specifically, a program module 1093 describing each process executed by the match rate calculation device 10 of the above embodiment is stored in the hard disk drive 1031.
 Data used for information processing by the match rate calculation program is stored as program data 1094 in, for example, the hard disk drive 1031. The CPU 1020 then reads the program module 1093 and the program data 1094 stored in the hard disk drive 1031 into the RAM 1012 as necessary and executes each of the procedures described above.
 The program module 1093 and the program data 1094 related to the match rate calculation program are not limited to being stored in the hard disk drive 1031; for example, they may be stored in a removable storage medium and read by the CPU 1020 via the disk drive 1041 or the like. Alternatively, the program module 1093 and the program data 1094 may be stored in another computer connected via a network such as a LAN or a WAN (Wide Area Network) and read by the CPU 1020 via the network interface 1070.
 Although an embodiment to which the invention made by the present inventors is applied has been described above, the present invention is not limited by the description and drawings that form part of this disclosure. That is, all other embodiments, examples, operational techniques, and the like made by those skilled in the art on the basis of this embodiment are included within the scope of the present invention.
 10 Match rate calculation device
 11 Input unit
 12 Output unit
 13 Communication control unit
 14 Storage unit
 15 Control unit
 15a Acquisition unit
 15b Background detection unit
 15c Edge detection unit
 15d Calculation unit

Claims (7)

  1.  A match rate calculation device comprising:
      a background detection unit that detects a background area of a template image for image matching; and
      a calculation unit that calculates a match rate between the template image and a searched image while excluding the detected background area.
  2.  The match rate calculation device according to claim 1, further comprising an edge detection unit that detects an edge region of the template image,
      wherein the calculation unit adds a predetermined weight to the edge region when calculating the match rate between the template image and the searched image.
  3.  The match rate calculation device according to claim 1, wherein the calculation unit calculates the match rate using a pixel-by-pixel difference value between the template image and the searched image.
  4.  The match rate calculation device according to claim 1, wherein the background detection unit detects, as the background area, pixels having the same color as the four outer sides of the template image.
  5.  The match rate calculation device according to claim 2, wherein the edge detection unit detects, as the edge region, a pixel for which the color difference between the target pixel of the template image and its surrounding pixels is largest.
  6.  A match rate calculation method executed by a match rate calculation device, the method comprising:
      a background detection step of detecting a background area of a template image for image matching; and
      a calculation step of calculating a match rate between the template image and a searched image while excluding the detected background area.
  7.  A match rate calculation program for causing a computer to execute:
      a background detection step of detecting a background area of a template image for image matching; and
      a calculation step of calculating a match rate between the template image and a searched image while excluding the detected background area.
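 As a supplementary illustration (not part of the claims), the following is a minimal sketch of one possible reading of the background detection recited in claim 4 and the edge detection recited in claim 5, assuming an RGB NumPy array as the template. Treating every color that appears on the four outer sides as background, using an 8-neighborhood, and applying a fixed difference threshold are illustrative assumptions rather than the claimed definitions.

```python
import numpy as np

def detect_background(template):
    """Mark as background every pixel whose color also appears on the
    template's four outer sides (one reading of claim 4)."""
    border = np.concatenate([
        template[0, :], template[-1, :],   # top and bottom rows
        template[:, 0], template[:, -1],   # left and right columns
    ])
    border_colors = {tuple(c) for c in border}
    h, w, _ = template.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            if tuple(template[y, x]) in border_colors:
                mask[y, x] = True
    return mask

def detect_edges(template, threshold=32):
    """Mark as edge every pixel whose largest color difference to an
    8-neighbor is large (one reading of claim 5; threshold is assumed)."""
    img = template.astype(np.float64)
    h, w, _ = img.shape
    max_diff = np.zeros((h, w))
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            # np.roll wraps around at the borders; that inaccuracy is ignored here.
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            diff = np.abs(img - shifted).mean(axis=2)
            max_diff = np.maximum(max_diff, diff)
    return max_diff >= threshold
```

 The masks produced by these two functions can be passed to the match-rate sketch given earlier in this description, which corresponds to combining the background exclusion of claim 1 with the edge weighting of claim 2.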
PCT/JP2022/027611 2022-07-13 2022-07-13 Match rate calculation device, match rate calculation method, and match rate calculation program WO2024013901A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027611 WO2024013901A1 (en) 2022-07-13 2022-07-13 Match rate calculation device, match rate calculation method, and match rate calculation program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027611 WO2024013901A1 (en) 2022-07-13 2022-07-13 Match rate calculation device, match rate calculation method, and match rate calculation program

Publications (1)

Publication Number Publication Date
WO2024013901A1 true WO2024013901A1 (en) 2024-01-18

Family

ID=89536212

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/027611 WO2024013901A1 (en) 2022-07-13 2022-07-13 Match rate calculation device, match rate calculation method, and match rate calculation program

Country Status (1)

Country Link
WO (1) WO2024013901A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011002965A (en) * 2009-06-17 2011-01-06 Canon Inc Image retrieval method and device
WO2014132414A1 (en) * 2013-02-28 2014-09-04 グローリー株式会社 Character recognition method and character recognition system
JP2015032248A (en) * 2013-08-06 2015-02-16 富士ゼロックス株式会社 Image search device, system and program for searching for data
JP2017129903A (en) * 2016-01-18 2017-07-27 日本電信電話株式会社 Searched book display device, method, and program
JP2018147262A (en) * 2017-03-06 2018-09-20 国立大学法人豊橋技術科学大学 Image feature quantity and three-dimensional shape retrieval system using the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Edge Detection - MathWorks India - MATLAB & Simulink", 21 April 2021 (2021-04-21), XP093129014, Retrieved from the Internet <URL:https://web.archive.org/web/20210421012519/https://jp.mathworks.com/discovery/edge-detection.html> *

Similar Documents

Publication Publication Date Title
US11373275B2 (en) Method for generating high-resolution picture, computer device, and storage medium
US20240078646A1 (en) Image processing method, image processing apparatus, and non-transitory storage medium
US11170210B2 (en) Gesture identification, control, and neural network training methods and apparatuses, and electronic devices
US9697416B2 (en) Object detection using cascaded convolutional neural networks
US10846870B2 (en) Joint training technique for depth map generation
US9721387B2 (en) Systems and methods for implementing augmented reality
KR102435365B1 (en) Certificate recognition method and apparatus, electronic device, computer readable storage medium
US20230237841A1 (en) Occlusion Detection
US8995772B2 (en) Real-time face detection using pixel pairs
CN112991180B (en) Image stitching method, device, equipment and storage medium
CN108647351B (en) Text image processing method and device, storage medium and terminal
JP7389824B2 (en) Object identification method and device, electronic equipment and storage medium
US10049268B2 (en) Selective, user-mediated content recognition using mobile devices
JP6079449B2 (en) Apparatus, method and electronic equipment for extracting edge of object in image
CN113657518B (en) Training method, target image detection method, device, electronic device, and medium
WO2020147258A1 (en) Remote desktop operation method and apparatus, readable storage medium, and terminal device
US20210342972A1 (en) Automatic Content-Aware Collage
WO2024013901A1 (en) Match rate calculation device, match rate calculation method, and match rate calculation program
CN112150347A (en) Image modification patterns learned from a limited set of modified images
CN108304840B (en) Image data processing method and device
WO2021229809A1 (en) User operation recording device and user operation recording method
CN114663418A (en) Image processing method and device, storage medium and electronic equipment
CN114494686A (en) Text image correction method, text image correction device, electronic equipment and storage medium
US11176720B2 (en) Computer program, image processing method, and image processing apparatus
JP6609181B2 (en) Character attribute estimation apparatus and character attribute estimation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22951115

Country of ref document: EP

Kind code of ref document: A1