CN110807483B - FPGA-based template matching implementation device and method

FPGA-based template matching implementation device and method

Info

Publication number
CN110807483B
CN110807483B (application number CN201911047750.3A)
Authority
CN
China
Prior art keywords
matching
image
template
fpga
interest
Prior art date
Legal status
Active
Application number
CN201911047750.3A
Other languages
Chinese (zh)
Other versions
CN110807483A (en)
Inventor
张方元
吕猛
张华东
Current Assignee
Yi Si Si Hangzhou Technology Co ltd
Original Assignee
Isvision Hangzhou Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Isvision Hangzhou Technology Co Ltd filed Critical Isvision Hangzhou Technology Co Ltd
Priority to CN201911047750.3A priority Critical patent/CN110807483B/en
Publication of CN110807483A publication Critical patent/CN110807483A/en
Application granted granted Critical
Publication of CN110807483B publication Critical patent/CN110807483B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Biology (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an FPGA-based template matching implementation device and method. The device comprises a memory and an FPGA. The memory stores an original image, on which features are framed to obtain region-of-interest images. A ROM storing template data and a plurality of region-of-interest matching modules are arranged in the FPGA. The region-of-interest images are stored in respective region-of-interest matching modules; each module contains a plurality of dual-port BRAMs, and each BRAM is connected to two matching calculation modules; all matching calculation modules are connected to a comparison module. A single-frame region-of-interest image is divided into several segments, which are stored in the BRAMs respectively. Each matching calculation module computes the degree of match between the template and the corresponding segment of the region-of-interest image and outputs the matching metric value of that segment; the comparison module then takes the maximum or minimum of the segment metric values as the matching metric value of the region of interest. The technical scheme performs template matching in parallel and obtains results quickly.

Description

FPGA-based template matching implementation device and method
Technical Field
The invention relates to the field of image processing, in particular to a template matching implementation device and method based on an FPGA (field programmable gate array).
Background
Template matching is one of the most basic and most commonly used matching methods in image processing. It is a technique for finding, within an image, the region most similar to a template image, and can be used to locate and identify objects. Because template matching is computationally intensive, it is mostly implemented on a PC or an industrial personal computer, which brings drawbacks such as high cost, large volume and high power consumption and limits its application scenarios. Meanwhile, when template matching is performed on existing embedded platforms, the computation time is long and real-time requirements are difficult to meet.
Disclosure of Invention
To solve the above technical problems, the invention provides an FPGA-based template matching implementation device and method that perform template matching in a parallel computing mode, obtain results quickly and meet real-time monitoring requirements. The technical scheme of the invention is as follows:
An FPGA-based template matching implementation device comprises a memory and an FPGA;
the memory is used for storing an original image; the original image contains n features to be matched, and each feature is framed to obtain n region-of-interest images;
a ROM and N region-of-interest matching modules are arranged in the FPGA, where N ≥ n; the n region-of-interest images are stored in respective region-of-interest matching modules;
template data are stored in the ROM;
a plurality of dual-port BRAMs, matching calculation modules and a comparison module are arranged in each region-of-interest matching module, and each dual-port BRAM is connected to two matching calculation modules; all the matching calculation modules are connected to the comparison module; a single region-of-interest image is divided into a plurality of segments, adjacent segments share part of their data, and each segment is stored in one dual-port BRAM; the matching calculation module is connected to the ROM and can read the template data from it; it matches the data received from the BRAM against the template data to obtain a matching metric value, compares that value with the larger (or smaller) value retained from the previous comparison, keeps the larger (or smaller) of the two, and repeats the comparison with the next result until the maximum (or minimum) matching metric value between the template and the image segment stored in that dual-port BRAM, together with the corresponding matching point coordinates, is obtained; this is output as the matching metric result of a single segment within a single region-of-interest image;
the comparison module compares the matching metric values of the segments within a single region-of-interest image to obtain the maximum or minimum value, i.e. the optimal matching metric value between that region of interest and the template image, and at the same time obtains the corresponding matching point coordinates.
After the n region-of-interest images in the same frame are matched respectively, n results are obtained.
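For illustration only, the following Python sketch mimics the parallel structure described above in software: a region-of-interest image is split into overlapping segments, each segment is matched against the template, and the per-segment results are reduced to a single best match. The function names (match_segment, match_roi) and the use of a squared-difference metric are illustrative assumptions, not part of the claimed hardware, and the two matching calculation modules attached to each BRAM (which split a segment's work in hardware) are not modelled here.

```python
import numpy as np

def match_segment(segment, template, row_offset):
    """Slide the template over one segment and keep the running best
    (here: minimum squared-difference) metric value and its coordinates."""
    sh, sw = segment.shape
    th, tw = template.shape
    best_val, best_xy = None, None
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            window = segment[y:y + th, x:x + tw].astype(np.int64)
            val = int(np.sum((window - template) ** 2))
            if best_val is None or val < best_val:   # keep the smaller value
                best_val, best_xy = val, (x, y + row_offset)
    return best_val, best_xy

def match_roi(roi, template, segments):
    """segments: 1-based, inclusive (start_row, end_row) pairs; adjacent
    segments share (template rows - 1) rows so no window position is lost."""
    results = [match_segment(roi[s - 1:e, :], template, s - 1)
               for s, e in segments]
    return min(results, key=lambda r: r[0])          # comparison-module step

# Sizes taken from the embodiment described later: 299 x 299 ROI, 100 x 100 template.
roi = np.random.randint(0, 256, (299, 299), dtype=np.uint8)
template = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
print(match_roi(roi, template, [(1, 149), (51, 199), (101, 249), (151, 299)]))
```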
Further, the memory has two separate storage spaces, and the original images are stored in the two storage spaces in an alternating manner in sequence.
Further, the method for dividing a single region-of-interest image into a plurality of segments is as follows:
segment 1: row 1 to row ⌈(B-B'+1)/M⌉+B'-1;
segment 2: row ⌈(B-B'+1)/M⌉+1 to row 2⌈(B-B'+1)/M⌉+B'-1;
segment 3: row 2⌈(B-B'+1)/M⌉+1 to row 3⌈(B-B'+1)/M⌉+B'-1;
……
segment M: row (M-1)⌈(B-B'+1)/M⌉+1 to row B;
wherein B is the total number of rows of the region-of-interest image, B' is the total number of rows of the template image, M is the total number of segments into which the region-of-interest image is divided, and ⌈ ⌉ denotes rounding up.
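As a sanity check on the row ranges above, the following sketch computes the segment boundaries from B, B' and M under the division scheme just described; the function name segment_rows is illustrative. With B = 299, B' = 100 and M = 4 (the values used in the embodiment below) it reproduces rows 1 to 149, 51 to 199, 101 to 249 and 151 to 299.

```python
import math

def segment_rows(B, B_t, M):
    """Return 1-based, inclusive (start_row, end_row) pairs for the M segments
    of a B-row region-of-interest image matched against a B_t-row template.
    Adjacent segments overlap by B_t - 1 rows so no window position is lost."""
    step = math.ceil((B - B_t + 1) / M)          # window row positions handled per segment
    ranges = []
    for k in range(1, M + 1):
        start = (k - 1) * step + 1
        end = B if k == M else k * step + B_t - 1
        ranges.append((start, end))
    return ranges

print(segment_rows(299, 100, 4))   # [(1, 149), (51, 199), (101, 249), (151, 299)]
```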
Further, the minimum value of the number of segments M into which the region-of-interest image is divided is determined according to the following formula:
[formula provided as an image in the original publication]
wherein: the sliding window range refers to the area over which the template image can move freely on the region-of-interest image during matching; the number of effective points of the template image is the total number of pixel points in the template image; the preset template matching time refers to the time allowed for completing the matching of the n region-of-interest images on the same frame of the original image; ⌈ ⌉ in the formula denotes rounding up.
An FPGA-based template matching implementation method comprises the following steps:
1) framing the n features to be matched on an original image to obtain n region-of-interest images; storing the original image in a memory;
2) storing the n region-of-interest images in respective region-of-interest matching modules in the FPGA, each region-of-interest matching module comprising a plurality of dual-port BRAMs, matching calculation modules and a comparison module; the region-of-interest images are stored as follows: each region-of-interest image is divided in order into a plurality of segments, adjacent segments share part of their data, and each segment is stored independently in one dual-port BRAM;
3) each dual-port BRAM is connected to two matching calculation modules and transmits data to both for matching calculation; the matching calculation module reads template data from the ROM and matches it against the received region-of-interest image data to obtain a matching metric value, compares that value with the larger (or smaller) value retained from the previous comparison, keeps the larger (or smaller) of the two, and repeats the comparison with the next result until the maximum (or minimum) matching metric value between the template and the image segment stored in that dual-port BRAM, together with the corresponding matching point coordinates, is obtained; this is output as the matching metric result of a single segment within a single region-of-interest image;
4) the comparison module compares the matching metric values of the segments within a single region-of-interest image to obtain the maximum or minimum value, i.e. the optimal matching metric value between that region of interest and the template image, and at the same time obtains the corresponding matching point coordinates.
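Step 3) describes a streaming reduction: each newly computed metric is compared with the value retained so far and only the better one is kept. A minimal software analogue is shown below; whether "better" means larger or smaller depends on the metric chosen (smaller for squared-difference, larger for correlation), and the names and example values used here are illustrative only.

```python
def running_best(metric_stream, keep_larger):
    """metric_stream yields (metric_value, (x, y)) pairs as a matching
    calculation module would produce them, one window position at a time;
    only the best value seen so far and its coordinates are retained."""
    best_val, best_xy = None, None
    for val, xy in metric_stream:
        if best_val is None:
            best_val, best_xy = val, xy
        elif (val > best_val) if keep_larger else (val < best_val):
            best_val, best_xy = val, xy        # keep the larger/smaller value
    return best_val, best_xy

# Step 4) reduces the per-segment results in the same way:
segment_results = [(0.91, (12, 3)), (0.97, (14, 61)), (0.88, (10, 118))]
print(running_best(iter(segment_results), keep_larger=True))   # (0.97, (14, 61))
```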
Further, the memory has two separate storage spaces, and the original images are stored in the two storage spaces in an alternating manner in sequence.
Further, the method for dividing each region-of-interest image in order into a plurality of segments in step 2) is as follows:
segment 1: row 1 to row ⌈(B-B'+1)/M⌉+B'-1;
segment 2: row ⌈(B-B'+1)/M⌉+1 to row 2⌈(B-B'+1)/M⌉+B'-1;
segment 3: row 2⌈(B-B'+1)/M⌉+1 to row 3⌈(B-B'+1)/M⌉+B'-1;
……
segment M: row (M-1)⌈(B-B'+1)/M⌉+1 to row B;
wherein B is the total number of rows of the region-of-interest image, B' is the total number of rows of the template image, M is the total number of segments into which the region-of-interest image is divided, and ⌈ ⌉ denotes rounding up.
Further, the minimum value of the number of segments M into which the region-of-interest image is divided is determined according to the following formula:
[formula provided as an image in the original publication]
wherein: the sliding window range refers to the area over which the template image can move freely on the region-of-interest image during matching; the number of effective points of the template image is the total number of pixel points in the template image; the preset template matching time refers to the time allowed for completing the matching of the n region-of-interest images on the same frame of the original image; ⌈ ⌉ in the formula denotes rounding up.
At present, the following six conventional template matching metrics can be computed with the method or device provided by the invention. Naturally, the invention is not limited to these six metrics and can also be applied to other calculation methods.
1. Squared difference matching
R(x,y) = ∑_{x',y'} [T(x',y') - I(x+x',y+y')]²
wherein R(x,y) is the matching metric value, T denotes the template image, I denotes the region currently covered by the template image, and x', y' are the x and y coordinate values within the template image T (these symbols have the same meanings in the formulas below).
2. Standard squared difference matching
R(x,y) = ∑_{x',y'} [T(x',y') - I(x+x',y+y')]² / sqrt( ∑_{x',y'} T(x',y')² · ∑_{x',y'} I(x+x',y+y')² )
3. Correlation matching
R(x,y) = ∑_{x',y'} T(x',y') · I(x+x',y+y')
4. Standard correlation matching
R(x,y) = ∑_{x',y'} [T(x',y') · I(x+x',y+y')] / sqrt( ∑_{x',y'} T(x',y')² · ∑_{x',y'} I(x+x',y+y')² )
5. Correlation coefficient matching
R(x,y) = ∑_{x',y'} T'(x',y') · I'(x+x',y+y')
wherein:
T'(x',y') = T(x',y') - 1/(w·h) · ∑_{x'',y''} T(x'',y'')
I'(x+x',y+y') = I(x+x',y+y') - 1/(w·h) · ∑_{x'',y''} I(x+x'',y+y'')
where w and h denote the width and height of the template image.
6. Standard correlation coefficient matching
R(x,y) = ∑_{x',y'} [T'(x',y') · I'(x+x',y+y')] / sqrt( ∑_{x',y'} T'(x',y')² · ∑_{x',y'} I'(x+x',y+y')² )
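For reference, the standard correlation matching metric (method 4, which is also the metric used in the embodiment below) can be written in software as follows; the function name ccorr_normed and the use of NumPy are illustrative, and any of the six metrics above could be substituted in the matching calculation module.

```python
import numpy as np

def ccorr_normed(image, template, x, y):
    """Standard (normalized) correlation metric R(x, y) between the template T
    and the image patch I it currently covers, as in formula 4 above."""
    th, tw = template.shape
    patch = image[y:y + th, x:x + tw].astype(np.float64)
    t = template.astype(np.float64)
    num = np.sum(t * patch)
    den = np.sqrt(np.sum(t * t) * np.sum(patch * patch))
    return num / den if den != 0 else 0.0

img = np.random.randint(0, 256, (299, 299), dtype=np.uint8)
tpl = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
print(ccorr_normed(img, tpl, 0, 0))
```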
The FPGA-based template matching device and method perform template matching in a parallel computing mode, obtain results quickly, reduce equipment power consumption, and allow parameters to be adjusted according to the required computation time so as to meet real-time monitoring requirements.
Drawings
FIG. 1 is a diagram of the relationship between the region-of-interest images and the original image to be processed;
FIG. 2 is a schematic structural diagram of the FPGA-based template matching implementation device provided in the present application;
FIG. 3 is a schematic diagram of a region-of-interest matching module;
FIG. 4 is a flow chart of calculating the matching metric result of a single segment within a single region-of-interest image.
In the figure: ROI refers to the region of interest image.
Detailed Description
The technical solution of the present invention is described in detail below with reference to the accompanying drawings and the detailed description.
An FPGA-based template matching implementation device comprises a memory and an FPGA;
the memory is used for storing an original image; the original image contains n features to be matched, and each feature is framed to obtain n region-of-interest images (as shown in FIG. 1);
a ROM and N region-of-interest matching modules are arranged in the FPGA, where N ≥ n; the n region-of-interest images are stored in respective region-of-interest matching modules;
template data are stored in the ROM;
a plurality of dual-port BRAMs, matching calculation modules and a comparison module are arranged in each region-of-interest matching module, and each dual-port BRAM is connected to two matching calculation modules; all the matching calculation modules are connected to the comparison module; a single region-of-interest image is divided into a plurality of segments, adjacent segments share part of their data, and each segment is stored in one dual-port BRAM; specifically, a single-frame region-of-interest image is divided into M segments as follows:
segment 1: row 1 to row ⌈(B-B'+1)/M⌉+B'-1;
segment 2: row ⌈(B-B'+1)/M⌉+1 to row 2⌈(B-B'+1)/M⌉+B'-1;
segment 3: row 2⌈(B-B'+1)/M⌉+1 to row 3⌈(B-B'+1)/M⌉+B'-1;
……
segment M: row (M-1)⌈(B-B'+1)/M⌉+1 to row B;
wherein B is the total number of rows of the region-of-interest image, B' is the total number of rows of the template image, M is the total number of segments into which the region-of-interest image is divided, and ⌈ ⌉ denotes rounding up;
the minimum value of the number of segments M into which the region-of-interest image is divided is determined according to the following formula:
[formula provided as an image in the original publication]
wherein: the sliding window range refers to the area over which the template image can move freely on the region-of-interest image during matching; the number of effective points of the template image is the total number of pixel points in the template image; the preset template matching time refers to the time allowed for completing the matching of the n region-of-interest images on the same frame of the original image; ⌈ ⌉ in the formula denotes rounding up.
The matching calculation module is connected to the ROM and can read the template data from it; it matches the data received from the BRAM against the template data to obtain a matching metric value, compares that value with the larger (or smaller) value retained from the previous comparison, keeps the larger (or smaller) of the two, and repeats the comparison with the next result until the maximum (or minimum) matching metric value between the template and the image segment stored in that dual-port BRAM, together with the corresponding matching point coordinates, is obtained; this is output as the matching metric result of a single segment within a single region-of-interest image.
The comparison module compares the matching metric values of the segments within a single region-of-interest image to obtain the maximum or minimum value, i.e. the optimal matching metric value between that region of interest and the template image, and at the same time obtains the corresponding matching point coordinates.
After the n region-of-interest images in the same frame are matched respectively, n results are obtained.
To speed up image access, the memory provides two separate storage spaces, and the original images are written to the two storage spaces alternately, in order. The memory may be implemented as two physical blocks or as a single block divided into two areas.
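A minimal software analogue of this double-buffering ("ping-pong") arrangement is sketched below; the class name PingPongStore and its methods are illustrative, not part of the patent.

```python
class PingPongStore:
    """Two separate storage spaces written alternately: while one holds the
    frame currently being matched, the next frame is written into the other."""
    def __init__(self):
        self.banks = [None, None]
        self.write_idx = 0

    def store_frame(self, frame):
        self.banks[self.write_idx] = frame      # write the incoming frame
        self.write_idx ^= 1                     # next frame goes to the other bank

    def current_frame(self):
        return self.banks[self.write_idx ^ 1]   # most recently written bank

store = PingPongStore()
store.store_frame("frame 0")
store.store_frame("frame 1")
print(store.current_frame())   # frame 1
```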
An FPGA-based template matching implementation method comprises the following steps:
1) framing the n features to be matched on an original image to obtain n region-of-interest images; storing the original image in a memory;
2) storing the n region-of-interest images in respective region-of-interest matching modules in the FPGA, each region-of-interest matching module comprising a plurality of dual-port BRAMs, matching calculation modules and a comparison module; the region-of-interest images are stored as follows: each region-of-interest image is divided in order into M segments, adjacent segments share part of their data, and each segment is stored independently in one dual-port BRAM; the minimum value of the number of segments is determined according to the following formula:
[formula provided as an image in the original publication]
wherein: the sliding window range refers to the area over which the template image can move freely on the region-of-interest image during matching; the number of effective points of the template image is the total number of pixel points in the template image; the preset template matching time refers to the time allowed for completing the matching of the n region-of-interest images on the same frame of the original image; ⌈ ⌉ in the formula denotes rounding up.
Specifically, the method for dividing each region-of-interest image in order into a plurality of segments is as follows:
segment 1: row 1 to row ⌈(B-B'+1)/M⌉+B'-1;
segment 2: row ⌈(B-B'+1)/M⌉+1 to row 2⌈(B-B'+1)/M⌉+B'-1;
segment 3: row 2⌈(B-B'+1)/M⌉+1 to row 3⌈(B-B'+1)/M⌉+B'-1;
……
segment M: row (M-1)⌈(B-B'+1)/M⌉+1 to row B;
wherein B is the total number of rows of the region-of-interest image, B' is the total number of rows of the template image, M is the total number of segments into which the region-of-interest image is divided, and ⌈ ⌉ denotes rounding up; besides this division method, other division methods may of course be chosen, as long as the storage conditions are satisfied;
3) each dual-port BRAM is connected to two matching calculation modules and transmits data to both for matching calculation; the matching calculation module reads template data from the ROM and matches it against the received region-of-interest image data to obtain a matching metric value, compares that value with the larger (or smaller) value retained from the previous comparison, keeps the larger (or smaller) of the two, and repeats the comparison with the next result until the maximum (or minimum) matching metric value between the template and the image segment stored in that dual-port BRAM, together with the corresponding matching point coordinates, is obtained; this is output as the matching metric result of a single segment within a single region-of-interest image;
4) the comparison module compares the matching metric values of the segments within a single region-of-interest image to obtain the maximum or minimum value, i.e. the optimal matching metric value between that region of interest and the template image, and at the same time obtains the corresponding matching point coordinates.
To read image data in parallel, the memory has two separate storage spaces, which can be realized either with two memories or by dividing one memory into two areas; the original images are written to the two storage spaces alternately, in order.
At present, the following six conventional template matching metrics can be computed with the method or device provided by the invention. Naturally, the invention is not limited to these six metrics and can also be applied to other calculation methods.
1. Squared difference matching
R(x,y) = ∑_{x',y'} [T(x',y') - I(x+x',y+y')]²
wherein R(x,y) is the matching metric value, T denotes the template image, I denotes the region currently covered by the template image, and x', y' are the x and y coordinate values within the template image T (these symbols have the same meanings in the formulas below).
2. Standard squared difference matching
R(x,y) = ∑_{x',y'} [T(x',y') - I(x+x',y+y')]² / sqrt( ∑_{x',y'} T(x',y')² · ∑_{x',y'} I(x+x',y+y')² )
3. Correlation matching
R(x,y) = ∑_{x',y'} T(x',y') · I(x+x',y+y')
4. Standard correlation matching
R(x,y) = ∑_{x',y'} [T(x',y') · I(x+x',y+y')] / sqrt( ∑_{x',y'} T(x',y')² · ∑_{x',y'} I(x+x',y+y')² )
5. Correlation coefficient matching
R(x,y) = ∑_{x',y'} T'(x',y') · I'(x+x',y+y')
wherein:
T'(x',y') = T(x',y') - 1/(w·h) · ∑_{x'',y''} T(x'',y'')
I'(x+x',y+y') = I(x+x',y+y') - 1/(w·h) · ∑_{x'',y''} I(x+x'',y+y'')
where w and h denote the width and height of the template image.
6. Standard correlation coefficient matching
R(x,y) = ∑_{x',y'} [T'(x',y') · I'(x+x',y+y')] / sqrt( ∑_{x',y'} T'(x',y')² · ∑_{x',y'} I'(x+x',y+y')² )
The FPGA-based template matching device and method perform template matching in a parallel computing mode, obtain results quickly, reduce equipment power consumption and meet real-time monitoring requirements.
The following compares the effect of the method of the present invention with that of an existing method on the same processing object:
An FPGA-based template matching implementation method comprises the following steps:
1) there are 5 features to be matched on the original image; the original image is stored in the memory; 5 templates are matched against the 5 features, each template is 100 × 100 in size, and the number of effective points is 10000.
2) The 5 region-of-interest images are stored in respective region-of-interest matching modules in the FPGA; each region-of-interest image is 299 × 299 in size, and the sliding window range is 200 × 200. The region-of-interest images are stored as follows: each region-of-interest image is divided in order into four segments, adjacent segments share part of their data, and each segment is stored independently in one dual-port BRAM;
specifically, each region-of-interest image is divided in order into four segments as follows:
segment 1: rows 1 to 149;
segment 2: rows 51 to 199;
segment 3: rows 101 to 249;
segment 4: rows 151 to 299;
Steps 3) and 4) are the same as above, using the 4th metric, standard correlation matching; the FPGA runs at a clock frequency of 200000000 Hz, and the time required to match the 5 features is:
[computation provided as an image in the original publication]
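A rough cross-check of this figure is sketched below in Python, under assumptions that are not stated explicitly in the text: each matching calculation module evaluates one template point per clock cycle, the two matching calculation modules on each dual-port BRAM split that segment's window positions evenly, and the 5 region-of-interest matching modules run in parallel.

```python
# Rough estimate only; the per-cycle throughput assumptions above are not
# given as such in the patent text.
clock_hz = 200_000_000           # FPGA running clock frequency
window_positions = 200 * 200     # sliding window range of one ROI
segments = 4                     # M, segments per ROI (one dual-port BRAM each)
modules_per_bram = 2             # two matching calculation modules per BRAM
template_points = 100 * 100      # effective points of the template

positions_per_module = window_positions // segments // modules_per_bram  # 5000
cycles = positions_per_module * template_points                          # 5.0e7
# All 5 ROI matching modules run in parallel, so this is also the per-frame time.
print(f"estimated matching time: {cycles / clock_hz:.2f} s")              # ~0.25 s
```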
comparative example: under the condition that the template size and the sliding window range are the same, the consumed time is 10s when the dual-core Cortex A9 runs by using the same template matching algorithm.
Therefore, the template matching speed can be greatly improved by using the FPGA-based template matching method provided by the application.
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable others skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications thereof. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (8)

1. An FPGA-based template matching implementation device, comprising a memory and an FPGA, characterized in that:
the memory is used for storing an original image; the original image contains n features to be matched, and each feature is framed to obtain n region-of-interest images;
a ROM and N region-of-interest matching modules are arranged in the FPGA, where N ≥ n; the n region-of-interest images are stored in respective region-of-interest matching modules;
template data are stored in the ROM;
a plurality of dual-port BRAMs, matching calculation modules and a comparison module are arranged in each region-of-interest matching module, and each dual-port BRAM is connected to two matching calculation modules; all the matching calculation modules are connected to the comparison module; a single region-of-interest image is divided into a plurality of segments, adjacent segments share part of their data, and each segment is stored in one dual-port BRAM; the matching calculation module is connected to the ROM and can read the template data from it; it matches the data received from the BRAM against the template data to obtain a matching metric value, compares that value with the larger (or smaller) value retained from the previous comparison, keeps the larger (or smaller) of the two, and repeats the comparison with the next result until the maximum (or minimum) matching metric value between the template and the image segment stored in that dual-port BRAM, together with the corresponding matching point coordinates, is obtained; this is output as the matching metric result of a single segment within a single region-of-interest image;
the comparison module compares the matching metric values of the segments within a single region-of-interest image to obtain the maximum or minimum value, i.e. the optimal matching metric value between that region of interest and the template image, and at the same time obtains the corresponding matching point coordinates.
2. The FPGA-based template matching implementation apparatus of claim 1, wherein: the memory has two separate storage spaces, and the original images are stored in the two storage spaces in an alternating manner in sequence.
3. The FPGA-based template matching implementation apparatus of claim 1, wherein the method for dividing a single region-of-interest image into a plurality of segments is as follows:
segment 1: row 1 to row ⌈(B-B'+1)/M⌉+B'-1;
segment 2: row ⌈(B-B'+1)/M⌉+1 to row 2⌈(B-B'+1)/M⌉+B'-1;
segment 3: row 2⌈(B-B'+1)/M⌉+1 to row 3⌈(B-B'+1)/M⌉+B'-1;
……
segment M: row (M-1)⌈(B-B'+1)/M⌉+1 to row B;
wherein B is the total number of rows of the region-of-interest image, B' is the total number of rows of the template image, M is the total number of segments into which the region-of-interest image is divided, and ⌈ ⌉ denotes rounding up.
4. The FPGA-based template matching implementation apparatus of claim 1, wherein the minimum value of the number of segments M into which the region-of-interest image is divided is determined according to the following formula:
[formula provided as an image in the original publication]
wherein: the sliding window range refers to the area over which the template image can move freely on the region-of-interest image during matching; the number of effective points of the template image is the total number of pixel points in the template image; the preset template matching time refers to the time allowed for completing the matching of the n region-of-interest images on the same frame of the original image; ⌈ ⌉ in the formula denotes rounding up.
5. An FPGA-based template matching implementation method, characterized by comprising the following steps:
1) framing the n features to be matched on an original image to obtain n region-of-interest images; storing the original image in a memory;
2) storing the n region-of-interest images in respective region-of-interest matching modules in the FPGA, each region-of-interest matching module comprising a plurality of dual-port BRAMs, matching calculation modules and a comparison module; the region-of-interest images are stored as follows: each region-of-interest image is divided in order into a plurality of segments, adjacent segments share part of their data, and each segment is stored independently in one dual-port BRAM;
3) each dual-port BRAM is connected to two matching calculation modules and transmits data to both for matching calculation; the matching calculation module reads template data from the ROM and matches it against the received region-of-interest image data to obtain a matching metric value, compares that value with the larger (or smaller) value retained from the previous comparison, keeps the larger (or smaller) of the two, and repeats the comparison with the next result until the maximum (or minimum) matching metric value between the template and the image segment stored in that dual-port BRAM, together with the corresponding matching point coordinates, is obtained; this is output as the matching metric result of a single segment within a single region-of-interest image;
4) the comparison module compares the matching metric values of the segments within a single region-of-interest image to obtain the maximum or minimum value, i.e. the optimal matching metric value between that region of interest and the template image, and at the same time obtains the corresponding matching point coordinates.
6. The FPGA-based template matching implementation method of claim 5, characterized in that: the memory has two separate storage spaces, and the original images are stored in the two storage spaces in an alternating manner in sequence.
7. The FPGA-based template matching implementation method of claim 5, characterized in that the method for dividing each region-of-interest image in order into a plurality of segments in step 2) is as follows:
segment 1: row 1 to row ⌈(B-B'+1)/M⌉+B'-1;
segment 2: row ⌈(B-B'+1)/M⌉+1 to row 2⌈(B-B'+1)/M⌉+B'-1;
segment 3: row 2⌈(B-B'+1)/M⌉+1 to row 3⌈(B-B'+1)/M⌉+B'-1;
……
segment M: row (M-1)⌈(B-B'+1)/M⌉+1 to row B;
wherein B is the total number of rows of the region-of-interest image, B' is the total number of rows of the template image, M is the total number of segments into which the region-of-interest image is divided, and ⌈ ⌉ denotes rounding up.
8. The FPGA-based template matching implementation method of claim 5, characterized in that the minimum value of the number of segments M into which the region-of-interest image is divided is determined according to the following formula:
[formula provided as an image in the original publication]
wherein: the sliding window range refers to the area over which the template image can move freely on the region-of-interest image during matching; the number of effective points of the template image is the total number of pixel points in the template image; the preset template matching time refers to the time allowed for completing the matching of the n region-of-interest images on the same frame of the original image; ⌈ ⌉ in the formula denotes rounding up.
CN201911047750.3A 2019-10-30 2019-10-30 FPGA-based template matching implementation device and method Active CN110807483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911047750.3A CN110807483B (en) 2019-10-30 2019-10-30 FPGA-based template matching implementation device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911047750.3A CN110807483B (en) 2019-10-30 2019-10-30 FPGA-based template matching implementation device and method

Publications (2)

Publication Number Publication Date
CN110807483A CN110807483A (en) 2020-02-18
CN110807483B true CN110807483B (en) 2022-08-16

Family

ID=69489647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911047750.3A Active CN110807483B (en) 2019-10-30 2019-10-30 FPGA-based template matching implementation device and method

Country Status (1)

Country Link
CN (1) CN110807483B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112967310B (en) * 2021-02-04 2023-07-14 成都国翼电子技术有限公司 Template matching acceleration method based on FPGA

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101576961A (en) * 2009-06-16 2009-11-11 天津大学 High-speed image matching method and device thereof
CN103400153A (en) * 2013-07-15 2013-11-20 中国航天科工集团第三研究院第八三五八研究所 Serial filtering matching method and system for real-time image identification
CN106061376A (en) * 2014-03-03 2016-10-26 瓦里安医疗系统公司 Systems and methods for patient position monitoring
JP2018201102A (en) * 2017-05-26 2018-12-20 キヤノン株式会社 Imaging apparatus
CN110047100A (en) * 2019-04-01 2019-07-23 四川深瑞视科技有限公司 Depth information detection method, apparatus and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9053372B2 (en) * 2012-06-28 2015-06-09 Honda Motor Co., Ltd. Road marking detection and recognition

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101576961A (en) * 2009-06-16 2009-11-11 天津大学 High-speed image matching method and device thereof
CN103400153A (en) * 2013-07-15 2013-11-20 中国航天科工集团第三研究院第八三五八研究所 Serial filtering matching method and system for real-time image identification
CN106061376A (en) * 2014-03-03 2016-10-26 瓦里安医疗系统公司 Systems and methods for patient position monitoring
JP2018201102A (en) * 2017-05-26 2018-12-20 キヤノン株式会社 Imaging apparatus
CN110047100A (en) * 2019-04-01 2019-07-23 四川深瑞视科技有限公司 Depth information detection method, apparatus and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An FPGA-based accelerator for multiple real-time template matching; Erika S. Albuquerque et al.; 2016 29th Symposium on Integrated Circuits and Systems Design (SBCCI); 20161031; full text *
Template Matching Using DSP Slices on the FPGA; Kaoru Hashimoto et al.; 2013 First International Symposium on Computing and Networking; 20140130; full text *

Also Published As

Publication number Publication date
CN110807483A (en) 2020-02-18


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee after: Yi Si Si (Hangzhou) Technology Co.,Ltd.

Address before: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee before: ISVISION (HANGZHOU) TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder