CN110264488B - Binary image edge extraction device - Google Patents

Binary image edge extraction device Download PDF

Info

Publication number
CN110264488B
CN110264488B (application CN201910535278.1A)
Authority
CN
China
Prior art keywords
unit
data
row
image
column
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910535278.1A
Other languages
Chinese (zh)
Other versions
CN110264488A (en)
Inventor
宋宇鲲
杜诗强
阳欣
王泽中
张多利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN201910535278.1A priority Critical patent/CN110264488B/en
Publication of CN110264488A publication Critical patent/CN110264488A/en
Application granted granted Critical
Publication of CN110264488B publication Critical patent/CN110264488B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/60Memory management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Abstract

The application discloses a binary image edge extraction device, which includes: an acquisition unit, a storage unit, a processing unit and an image output unit. The acquisition unit is used for acquiring a binary image of the image to be identified and reading pixel point data point by point according to the row and column addresses of the pixel points in the binary image. The storage unit is used for storing the pixel point data according to the row and column addresses in the form of a three-dimensional neighborhood matrix and outputting it to the processing unit. The processing unit is used for performing the edge feature extraction operation on the received neighborhood-matrix pixel point data according to the row and column addresses, recording the result of the operation as an edge feature value, and inputting the edge feature value to the image output unit. The image output unit is used for generating and outputting an edge feature image of the image to be recognized. The technical scheme of the application reduces the complexity of the edge extraction operation and saves data storage space.

Description

Binary image edge extraction device
Technical Field
The present application relates to the field of image processing devices, and in particular, to a binary image edge extraction device.
Background
With the rapid development of information technology, image processing technology has advanced quickly and is now widely applied in daily life. In digital image processing, edge features are an important feature of an image and a key component of image processing, pattern recognition and computer vision. Edge feature extraction is widely applied in industrial inspection, image segmentation, motion detection, face recognition, target tracking and other fields, and the result of image edge extraction directly influences the effect of further image processing and pattern recognition.
The edges of an image typically occur where the image color changes drastically, often caused by the shape and structure of an object, external ambient lighting, and light reflected from the object's surface. Image edges directly reflect the outline and topological structure of an object, and image edge extraction is one of the important foundations of image processing, pattern recognition and computer vision. With the development of artificial intelligence, big data and cloud computing, the demands on the speed of machine recognition and image processing keep rising, which makes the speed of image edge extraction a concern that cannot be ignored.
In the prior art, color images are mostly converted to grayscale and then subjected to the edge extraction operation directly. Although the grayscale data is only 8 bits wide, its numerical values are still relatively large, so the edge extraction retains a high computational complexity.
Disclosure of Invention
The purpose of this application lies in: providing a binary image edge extraction device that reduces the complexity of the edge feature extraction operation and the required storage space, so that the edge features of an image can be extracted quickly.
The technical scheme of the application is as follows: there is provided a binary image edge extraction device, including: an acquisition unit, a storage unit, a processing unit and an image output unit; the acquisition unit is used for acquiring a binary image of the image to be identified and reading pixel point data point by point according to the row and column addresses of the pixel points in the binary image; the storage unit is used for storing the pixel point data according to the row and column addresses in the form of a three-dimensional neighborhood matrix and outputting it to the processing unit; the processing unit is used for performing the edge feature extraction operation on the received neighborhood-matrix pixel point data according to the row and column addresses, recording the result of the operation as an edge feature value, and inputting the edge feature value to the image output unit; the image output unit is used for generating and outputting an edge feature image of the image to be recognized.
In any one of the above technical solutions, further, the acquisition unit includes: a row-column counter and a judgment unit; the row-column counter is arranged at the input end of the acquisition unit and is used for counting the pixel points in the acquired binary image and generating row and column addresses, wherein the row and column addresses comprise row addresses and column addresses; the judgment unit is used for transmitting the pixel point data of a pixel point to the image output unit when it judges that the column address of the pixel point is equal to a first preset threshold or the row address of the pixel point is equal to a second preset threshold.
In any of the above technical solutions, further, the storage unit includes a first storage module, a second storage module and a first-level register, wherein the first storage module and the second storage module are first-in first-out memories; the data input end of the storage unit is respectively connected with the input end of the first storage module and the input end of the first-level register; the output end of the first-level register is connected with the third output end of the storage unit; the output end of the first storage module is respectively connected with the second output end of the storage unit and the input end of the second storage module; the output end of the second storage module is connected to the first output end of the storage unit.
In any one of the above technical solutions, further, the processing unit includes: a data extraction unit and an operation unit; the data extraction unit is used for extracting and sending pixel point data in the form of a three-dimensional neighborhood matrix to the operation unit when it judges that the row address and the column address are both greater than or equal to a third preset threshold; the operation unit is used for performing the edge feature extraction operation on the pixel point data to generate an edge feature value.
In any one of the above technical solutions, further, the operation unit includes: a first adder, a second adder, a third adder, a first shifting unit, a second shifting unit and a comparing unit; the data output ends of the first adder and the second adder are connected to the data input end of the third adder through the first shifting unit and the second shifting unit respectively, and the data output end of the third adder is connected to the comparing unit; the comparing unit is used for extracting edge features by comparing the operation result of the third adder with the edge preset threshold and outputting an edge feature value.
In any one of the above technical solutions, further, extracting and sending pixel point data in the form of a three-dimensional neighborhood matrix to the operation unit specifically includes:
generating a data selecting frame, wherein the data selecting frame is in 3×3 matrix form;
transmitting the pixel data output in the form of the three-dimensional neighborhood matrix to a data selecting frame;
sequentially extracting pixel point data of a first row and a first column, a first row and a second column, and a second row and a first column of a data selecting frame, and transmitting the pixel point data to a first adder;
and sequentially extracting pixel point data of the second row third column, the third row second column and the third row third column of the data selecting frame, and transmitting the pixel point data to the second adder.
The beneficial effect of this application is:
before the edge feature extraction operation, a binary image is obtained from the image to be identified, so that each pixel point is represented by only 1 bit of data; this reduces the amount of calculation in the edge feature extraction operation and greatly simplifies the operation process.
During storage and processing of the image, a storage unit equipped with a first-level register caches two rows of pixel data to form the three-dimensional neighborhood matrix, achieving high storage efficiency with few resources. By performing the edge feature extraction operation on pixel point data in 3×3 three-dimensional neighborhood matrix form, the operation complexity is reduced and the edge features of the image can be extracted efficiently and quickly.
For boundary pixel points of the image, the pixel points are neither symmetrically extended nor periodically extended; instead, the result pixel value (edge feature value) at a boundary position is represented directly by the pixel value (pixel point data) of the initial image at the same position, omitting the operation on these pixel points and greatly simplifying the storage and calculation of the whole frame.
Drawings
The advantages of the above and/or additional aspects of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic block diagram of a binary image edge extraction apparatus according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a state machine according to one embodiment of the present application;
FIG. 3 is a schematic diagram of data storage in a memory cell according to one embodiment of the present application;
FIG. 4 is a schematic diagram of an arithmetic unit according to an embodiment of the present application;
FIG. 5 is a schematic illustration of an image to be recognized according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an edge feature image according to one embodiment of the present application.
Detailed Description
In order that the above objects, features and advantages of the present application can be more clearly understood, the present application will be described in further detail with reference to the accompanying drawings and detailed description. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, however, the present application may be practiced in other ways than those described herein, and therefore the scope of the present application is not limited by the specific embodiments disclosed below.
In the present embodiment, the size of the image to be recognized is set to 640 × 480; that is, when the image is stored in the storage unit point by point, the pixel point column address row_count takes the values 1, 2, …, 640, and the pixel point row address col_count takes the values 1, 2, …, 480.
As shown in fig. 1, the present embodiment provides a binary image edge extraction device, including: an acquisition unit 10, a storage unit 20, a processing unit 30, and an image output unit 40;
the obtaining unit 10 is configured to obtain a binary image of an image to be identified, and obtain pixel point data of a pixel point by point according to a row address and a column address of the pixel point in the binary image;
specifically, according to the prior art, the pixel value of the pixel point in the image to be recognized may be converted into 8 bits, and the converted image to be recognized is recorded as a grayscale image, in order to reduce the data amount and the calculation complexity, the obtaining unit 10 generates a pixel point culling frame, and the size of the culling frame may be set according to the actual calculation requirement, such as 21 × 21.
And moving the selecting frame line by adopting a traversal algorithm, calculating the pixel mean value of the pixels in the selecting frame, recording the pixel mean value as a pixel threshold, comparing the pixel value of the pixel corresponding to the center point of the selecting frame with the pixel threshold, setting the pixel value of the pixel to be 1 when the pixel value is judged to be greater than the pixel threshold, and otherwise, setting the pixel value of the pixel to be 0, so that a gray image with the pixel value of 8 bits can be converted into a binary image with the pixel value of 1bit, and the binary image of the image to be identified can be obtained.
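The local-mean binarization described above can be sketched in software; the function below is an illustrative model, not the patent's hardware. The 3×3 frame size (the embodiment suggests 21 × 21) and the border policy of ignoring pixels outside the image are assumptions made only to keep the example small.

```python
# Illustrative sketch (not from the patent text): local-mean binarization of
# a grayscale image. A 3x3 selecting frame is used here for brevity; pixels
# outside the image are simply ignored when computing the local mean (an
# assumed border policy).
def binarize(gray, frame=3):
    h, w = len(gray), len(gray[0])
    r = frame // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # collect the pixels covered by the frame centred on (y, x)
            vals = [gray[i][j]
                    for i in range(max(0, y - r), min(h, y + r + 1))
                    for j in range(max(0, x - r), min(w, x + r + 1))]
            threshold = sum(vals) / len(vals)  # pixel mean inside the frame
            # centre pixel above the local mean -> 1, otherwise 0
            out[y][x] = 1 if gray[y][x] > threshold else 0
    return out

gray = [[10, 200, 10],
        [200, 200, 200],
        [10, 200, 10]]
print(binarize(gray))  # -> [[0, 1, 0], [1, 1, 1], [0, 1, 0]]
```

Each output pixel thus costs one comparison against a locally adaptive threshold, which is what lets the rest of the pipeline work on 1-bit data.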
Further, the acquisition unit 10 includes: a row/column counter 11 and a judgment unit 12;
the row-column counter 11 is arranged at the input end of the obtaining unit 10, and the row-column counter 11 is used for counting the pixel points in the obtained binary image and generating row and column addresses, wherein the row and column addresses comprise row addresses and column addresses;
the determining unit 12 is configured to transmit the pixel data of the pixel to the image output unit 40 when it is determined that the column address of the pixel is equal to the first preset threshold or the row address is equal to the second preset threshold.
The storage unit 20 is configured to store pixel point data according to the row and column addresses and in a three-dimensional neighborhood matrix form, and output the pixel point data to the processing unit 30;
specifically, as shown in fig. 2, the running process of the whole module, including reading, storing and arithmetic processing of pixel point data, is controlled by a finite state machine. The finite state machine comprises the following states in total:
a low-level reset signal rst_en, which is determined by an external input of the edge extraction apparatus 100;
a read-in data valid signal data_en, which is determined by the acquisition unit 10 of the edge extraction device 100;
an initial signal cal_en for entering the calculation part, which is determined by the row-column counter 11 according to the count; that is, when the column address satisfies 2 < row_count < 641 and the row address satisfies 2 < col_count < 481, data from the storage unit 20 enters the processing unit 30. In other words, from the pixel point in row 3, column 3 up to the last pixel point, the data all enter the processing unit 30, so as to calculate the edge feature values of the pixel points from row 2, column 2 through row 479, column 639;
a pixel processing start signal s_en at the image boundary, generated when the row-column counter 11 determines from the count that the column address row_count is 1, or the column address row_count is 640, or the row address col_count is 1, or the row address col_count is 480; the obtaining unit 10 then transmits the pixel point data of the pixel point to the image output unit 40, that is, the pixel point data of the 1st row, the 480th row, the 1st column and the 640th column is transmitted to the image output unit 40 as the edge feature value. It should be noted that when the column address row_count is 1 and the row address col_count is 1, no output operation is performed and data is only input to the storage unit 20. Here, the first preset threshold is set to 1 and 640, and the second preset threshold is set to 1 and 480;
a data enable signal s_last_en for processing the last row and the last column, generated when the row-column counter 11 determines from the count that the column address row_count is 640 and the row address col_count is 480; at this point the edge extraction of the image to be recognized is complete.
Further, the storage unit 20 includes a first storage module 21, a second storage module 22 and a first-level register 23, wherein the first storage module 21 and the second storage module 22 are first-in first-out (FIFO) memories; the data input end of the storage unit 20 is respectively connected to the input end of the first storage module 21 and the input end of the first-level register 23; the output end of the first-level register 23 is connected to the third output end of the storage unit 20; the output end of the first storage module 21 is respectively connected to the second output end of the storage unit 20 and the input end of the second storage module 22; the output end of the second storage module 22 is connected to the first output end of the storage unit 20.
Specifically, as shown in fig. 3, the data register data_in is set as the data input end of the storage unit 20, and two identical first-in first-out memories, each with a depth of 640 and a width of 1 bit, are employed as the first storage module fifo1 and the second storage module fifo0. Since the depth of each FIFO equals the image width of the binary image of the image to be recognized, each FIFO can store the pixel point data of one row of the binary image.
In this embodiment, 6 adjacent columns of data in the 32 th row to the 34 th row are selected to describe the edge extraction process, as shown in table 1.
TABLE 1
Line 32 1 1 1 0 0 0
Line 33 0 1 1 0 0 0
Line 34 1 1 1 0 0 0
The pixel point data of the pixel points are acquired point by point by the obtaining unit 10, that is, one pixel point datum is read in each cycle, and the values are obtained in the sequence …,1,1,1,0,0,0,…,0,1,1,0,0,0,…,1,1,1,0,0,0,…. When the storage unit 20 stores data, the pixel point data of row 32 is read sequentially by the data register data_in, and each datum is sent to the first storage module fifo1 in the order it was read. When the pixel datum of row 33, column 1 is read, all 640 pixel data of row 32 are stored in the first storage module fifo1, and the row 32 data then begin to be shifted sequentially into the second storage module fifo0 by fifo1. When the pixel datum of row 34, column 1 is read, the pixel data of row 33 are stored in fifo1 and the pixel data of row 32 in fifo0. Therefore, by setting the first-level register data_in_r at the data output end of the data register data_in, the pixel point data is delayed by one cycle, so that the second storage module fifo0, the first storage module fifo1 and the first-level register data_in_r output the same column of data from three adjacent rows; that is, a 3×3 pixel matrix is formed in three-dimensional neighborhood matrix form, such as:
1 1 1
0 1 1
1 1 1
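A minimal software model of the fifo1/fifo0/data_in_r arrangement may help to see how the window forms; the function below is an illustrative sketch (the function name and structure are not from the patent), which emits, per clock, the same column from three adjacent rows once both line buffers are full:

```python
from collections import deque

# Illustrative software model of the storage unit (not the patent's RTL).
# fifo1 buffers the previous image row and fifo0 the row before that; the
# incoming pixel plays the role of the one-cycle register data_in_r. Once
# both line buffers are full, every new pixel produces the same column of
# three adjacent rows; three consecutive outputs form one 3x3 window.
def stream_columns(image, width):
    fifo1, fifo0 = deque(), deque()
    for row in image:
        for pixel in row:
            if len(fifo1) == width:
                shifted = fifo1.popleft()        # oldest pixel of the middle row
                if len(fifo0) == width:
                    # top, middle, bottom pixels of one column
                    yield (fifo0.popleft(), shifted, pixel)
                fifo0.append(shifted)            # middle-row pixel moves up
            fifo1.append(pixel)

cols = list(stream_columns([[1, 2, 3], [4, 5, 6], [7, 8, 9]], 3))
print(cols)  # -> [(1, 4, 7), (2, 5, 8), (3, 6, 9)]
```

With only two row-deep buffers and one register, the whole image never needs to be held in memory at once, which is the resource saving the storage unit is built around.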
It should be noted that, after receiving the binary image of the image to be recognized, the obtaining unit 10 of the binary image edge extraction apparatus 100 starts to obtain pixel data point by point and generates the read-in data valid signal data_en. Each time one pixel point datum is obtained, the row-column counter 11 counts: first the row address col_count is 1 while the column address row_count increases from 1 to 640; then the row address col_count becomes 2 and the column address row_count again increases from 1; and so on until the last pixel point datum is read, at which point the column address row_count is 640 and the row address col_count is 480. That is, the row-column address of each pixel point datum may be written in coordinate form: (col_count, row_count).
The first preset threshold is set to 1 and 640, and the second preset threshold is set to 1 and 480; that is, when the judgment unit 12 determines that a pixel point satisfies at least one of the following four conditions, the pixel point is marked as a boundary point, and its pixel point data is taken directly as the result value for edge feature extraction. The obtaining unit 10 generates the pixel processing start signal s_en at the image boundary, and the pixel is transmitted directly through the storage unit 20 to the image output unit 40. The four conditions are: the column address row_count is 1, the column address row_count is 640, the row address col_count is 1, and the row address col_count is 480.
Note that when the column address row_count is 640 and the row address col_count is 480, the obtaining unit 10 generates the data enable signal s_last_en for processing the last row and the last column.
When the pixel point is not a boundary point, the obtaining unit 10 generates the initial signal cal_en for entering the calculation part, the pixel point data enters the processing unit 30 for processing, and the processing unit 30 transmits the processed data to the image output unit 40.
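The boundary test made by the judgment unit 12 can be sketched as follows (illustrative; the helper name is hypothetical), using the 640 × 480 geometry of this embodiment:

```python
# Illustrative sketch of the boundary test of the judgment unit 12 (the
# helper name is hypothetical). row_count is the column address (1..640)
# and col_count is the row address (1..480), as in this embodiment.
def is_boundary(row_count, col_count, width=640, height=480):
    # first preset threshold: 1 and 640 (columns); second: 1 and 480 (rows)
    return row_count in (1, width) or col_count in (1, height)

print(is_boundary(1, 100), is_boundary(2, 2))  # boundary point vs. interior point
```

Boundary points bypass the Sobel datapath entirely; their 1-bit pixel values pass straight through to the image output unit.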
The processing unit 30 is configured to perform the edge feature extraction operation on the received pixel point data in the form of the three-dimensional neighborhood matrix according to the row and column addresses, record the result of the operation as an edge feature value, and input the edge feature value to the image output unit 40; the image output unit 40 is used for generating and outputting the edge feature image of the image to be recognized.
In this embodiment, a Sobel-operator-based method is used to compute the pixel data of the whole binary image, where the processing of a pixel point falls into two cases: first, pixel points at the image boundary, i.e., boundary points; second, pixel points not at the image boundary. The first case covers the pixel points of the first row, the first column, the last row and the last column; for these points no Sobel convolution is performed, and the pixel point data is taken directly as the result value for edge feature extraction. In the second case, the neighborhood matrix of the pixel point, i.e., the three-dimensional neighborhood matrix, is convolved with the Sobel operator matrices in two directions, and the output result is the central value of the three-dimensional neighborhood matrix, i.e., the row-column address of the output result is (col_count-1, row_count-1). Combining the boundary points with the output results thus yields the edge feature values of all pixel points in the binary image.
Since the input data in this embodiment is pixel point data in a binary image, and the value of the pixel point data is 1 or 0, in order to save the operation resources, convolution operators in two directions may be merged, and the calculation formula is as follows:
Gx:
-1  0  1
-2  0  2
-1  0  1

Gy:
-1 -2 -1
 0  0  0
 1  2  1

G = Gx + Gy:
-2 -2  0
-2  0  2
 0  2  2
Data is then extracted from the three-dimensional neighborhood matrix according to the merged matrix, and a combination of adders and shifters is used to increase the operation rate and reduce the operation complexity.
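Assuming the merge is the elementwise sum of the two directional Sobel operators (a reconstruction consistent with the adder and shifter structure of the operation unit; the patent's own formula image is not reproduced here), a short sketch shows the merged kernel and the reduced arithmetic:

```python
# Sketch under an assumption: the merged kernel is taken as the elementwise
# sum Gx + Gy. For 1-bit inputs the merged convolution then reduces to two
# 3-input additions, two left shifts, and one subtraction.
GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
MERGED = [[gx + gy for gx, gy in zip(rx, ry)] for rx, ry in zip(GX, GY)]

def merged_response(win):
    """win: 3x3 binary neighbourhood; returns |merged-kernel response|."""
    s1 = win[0][0] + win[0][1] + win[1][0]   # first adder:  data_00+data_01+data_10
    s2 = win[1][2] + win[2][1] + win[2][2]   # second adder: data_12+data_21+data_22
    # the shifts implement the x2 weights; only zero/nonzero matters against
    # the edge preset threshold of 0, so the magnitude is returned
    return abs((s1 << 1) - (s2 << 1))

print(MERGED)  # -> [[-2, -2, 0], [-2, 0, 2], [0, 2, 2]]
```

The six nonzero ±2 entries of the merged kernel are exactly the six data positions routed to the two adders, which is why only one subtraction remains.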
Further, the processing unit 30 includes: a data extraction unit 31 and an operation unit; the data extraction unit 31 is configured to extract and send pixel data in the form of a three-dimensional neighborhood matrix to the operation unit when it determines that the row address and the column address are both greater than or equal to a third preset threshold;
specifically, the third preset threshold is set to determine that enough pixel data for generating the three-dimensional neighborhood matrix is stored in the storage unit 20, that is, only when the obtaining unit 10 obtains the pixel data of the third row and the third column, the storage unit 20 can generate the first three-dimensional neighborhood matrix by using the pixel data of the first three columns in the first to third rows, and meanwhile, to expand the data processing range of the binary image edge extraction device 100, the value of the third preset threshold is set to be 3.
Further, the operation unit includes: a first adder 32, a second adder 33, a third adder 34, a first shifting unit 35, a second shifting unit 36, and a comparing unit 37; the first adder 32 and the second adder 33 are arranged in parallel, their data output ends are connected to the data input end of the third adder 34 through the first shifting unit 35 and the second shifting unit 36 respectively, and the data output end of the third adder 34 is connected to the comparing unit 37; the comparing unit 37 is configured to perform edge feature extraction by comparing the operation result of the third adder 34 with the edge preset threshold and to output an edge feature value, where the edge preset threshold is 0.
In an implementation manner of this embodiment, extracting and sending pixel point data in the form of a three-dimensional neighborhood matrix to an arithmetic unit specifically includes:
generating a data selecting frame, wherein the data selecting frame is in 3×3 matrix form;
transmitting the pixel data output in the form of the three-dimensional neighborhood matrix to a data selecting frame;
sequentially extracting pixel point data of a first row and a first column, a first row and a second column, and a second row and a first column of the data selecting frame, and transmitting the pixel point data to a first adder 32;
the pixel point data of the second row, the third column, the third row, the second column and the third column of the data selection frame are sequentially extracted and sent to the second adder 33.
The operation unit is used for carrying out edge feature extraction operation according to the pixel point data to generate an edge feature value.
Specifically, as shown in fig. 4, taking the pixel data of rows 32 to 34 as an example, when the obtaining unit 10 obtains the pixel point datum of row 34, column 4, the generated data selecting frame is as follows:
data_00 data_01 data_02
data_10 data_11 data_12
data_20 data_21 data_22
data_00 corresponds to the pixel point data of row 32, column 1; data_01 to row 32, column 2; data_02 to row 32, column 3; data_10 to row 33, column 1; data_11 to row 33, column 2; data_12 to row 33, column 3; data_20 to row 34, column 1; data_21 to row 34, column 2; and data_22 to row 34, column 3.
According to the calculation formula of the merged Sobel convolution operators, data is extracted according to the position of each pixel point datum in the data selecting frame: data_00, data_01 and data_10 are transmitted to the first adder 32, and data_12, data_21 and data_22 are transmitted to the second adder 33, where the additions are performed respectively. The addition results are then shifted by the first shifting unit 35 and the second shifting unit 36, and the third adder 34 performs the subtraction.
Note that since the input image is a binary image, the edge preset threshold is set to 0 for determining whether a point has a boundary feature. When the operation results in both directions are 0, the point has no edge feature in either direction and is assigned the value 1, i.e., it is a background point; when either direction's result is not 0, the point has a boundary feature and its edge feature value is set to 0.
The result of the Sobel convolution on a three-dimensional neighborhood matrix is taken as the result for its central datum; taking the pixel data of rows 32 to 34 as an example, each calculation result corresponds to a pixel in row 33.
The results of operating on the data in Table 1 are shown in Table 2, where x denotes unknown, because the operation result of the corresponding pixel point cannot be obtained from the data in Table 1 alone.
TABLE 2
Line 33 x 2 4 4 0 x
An edge feature means that the value of a point differs from the values of the points around it. The first output three-dimensional matrix is:
(3 x 3 matrix shown in Figure BDA0002101019310000111)
Obviously, the central point is 1 and the values above and below it in the same column are also 1, i.e. the central point has no vertical boundary feature; in the horizontal direction, however, the value to its left is 0 and to its right is 1, so the central point differs from its left neighbor, meaning the point has a horizontal boundary feature. Overall, then, the point has a boundary feature. After the convolution operation, the corresponding result is 2, which is not equal to the preset edge threshold 0, so the edge feature value of the point is assigned 0;
likewise, the second matrix is:
(3 x 3 matrix shown in Figure BDA0002101019310000112)
Obviously, the center point is 1 and has no edge feature in the vertical direction but does have one in the horizontal direction, so the point is an edge point; the convolution result for the point is 4, which is not equal to the preset edge threshold 0, so its edge feature value is assigned 0. Proceeding in the same way, the final edge feature values are obtained as shown in Table 3.
TABLE 3
Row 33: 0 0 0 1
That is to say, whenever the result of the convolution operation is non-zero, the point has a boundary feature and its edge feature value is assigned 0; a convolution result of 0 indicates no boundary feature, and the value is assigned 1, marking the point as a background point.
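The assignment rule above can be stated compactly in code (a sketch; `edge_value` is a hypothetical name, and the example values are the row-33 convolution results of Table 2 together with the edge feature values of Table 3):

```python
def edge_value(conv_result):
    # Non-zero convolution result -> boundary feature present -> edge value 0;
    # zero result -> no boundary feature -> edge value 1 (background point).
    return 0 if conv_result != 0 else 1

# The known row-33 results of Table 2 (2, 4, 4, 0) map to the
# edge feature values of Table 3 (0, 0, 0, 1).
row33 = [edge_value(r) for r in (2, 4, 4, 0)]
```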
As the above analysis of the formula shows, the merged implementation makes the convolution of the input data with the operator considerably cheaper in resources while producing the same convolution results, i.e. the processing effect is unchanged. Compared with the traditional scheme of convolving with the two directional operators separately, this implementation saves the cost of 5 adders, a resource saving rate of 55.5%.
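The equivalence of the merged formula and the two separate directional convolutions can be checked exhaustively in software. The sketch below assumes the standard Sobel kernel orientations shown (sign conventions vary between references); under that assumption, the merged result equals the sum of the two directional convolutions for every binary 3 x 3 neighborhood:

```python
from itertools import product

GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]    # assumed horizontal operator
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]    # assumed vertical operator

def conv3(w, k):
    """Plain 3x3 correlation of a window with one directional operator."""
    return sum(w[r][c] * k[r][c] for r in range(3) for c in range(3))

def merged(w):
    """The merged formula implemented by the three adders and two shift units."""
    sum1 = w[0][0] + w[0][1] + w[1][0]
    sum2 = w[1][2] + w[2][1] + w[2][2]
    return (sum2 << 1) - (sum1 << 1)

# Exhaustive check over all 2^9 binary 3x3 neighborhoods.
for bits in product((0, 1), repeat=9):
    w = [list(bits[0:3]), list(bits[3:6]), list(bits[6:9])]
    assert merged(w) == conv3(w, GX) + conv3(w, GY)
```

Because GX + GY has weight -2 at the positions of data_00, data_01 and data_10 and weight +2 at data_12, data_21 and data_22, the identity holds term by term.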
To visually verify the effect of the binary image edge extraction device 100 of this embodiment, a binary image is selected at random, as shown in fig. 5, and processed by the binary image edge extraction device 100 of this embodiment to generate an edge feature image, as shown in fig. 6.
Comparing the original binary image with the extracted edge feature image shows that even the edge features of fine details in the image are extracted; after synthesizing the hardware modules of the whole extraction device, its main clock frequency reaches 140 MHz, so the extraction device is superior to existing processing equipment in both operation speed and processing effect and meets real-time processing requirements.
The technical solution of the present application has been described in detail above with reference to the accompanying drawings. The present application provides a binary image edge extraction device, including: an acquisition unit, a storage unit, a processing unit and an image output unit. The acquisition unit is used for acquiring a binary image of the image to be identified and acquiring pixel point data point by point according to the row and column addresses of the pixel points in the binary image; the storage unit is used for storing the pixel point data according to the row and column addresses in the form of a three-dimensional neighborhood matrix and outputting it to the processing unit; the processing unit is used for performing the edge feature extraction operation on the received pixel point data in the form of the three-dimensional neighborhood matrix according to the row and column addresses, recording the result of the operation as the edge feature value, and inputting the edge feature value to the image output unit; the image output unit is used for generating and outputting the edge feature image of the image to be identified. Through the technical scheme of the present application, the complexity of the edge extraction operation is reduced and data storage space is saved.
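The data path summarized above can be sketched as a software model (an illustration only: `extract_edges` is a hypothetical name, and treating border pixels as background points is an assumption, since the judgment unit's output value for boundary rows and columns is not restated in this summary):

```python
def extract_edges(image):
    """Software model of the device: run the merged Sobel operation on each
    interior pixel's 3x3 neighborhood and assign the edge feature value
    (0 = boundary feature present, 1 = background point)."""
    rows, cols = len(image), len(image[0])
    # Border pixels lack a full neighborhood; keep them as background (assumption).
    out = [[1] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            w = [row[c - 1:c + 2] for row in image[r - 1:r + 2]]
            sum1 = w[0][0] + w[0][1] + w[1][0]   # first adder
            sum2 = w[1][2] + w[2][1] + w[2][2]   # second adder
            result = (sum2 << 1) - (sum1 << 1)   # shifts + third adder
            out[r][c] = 0 if result != 0 else 1  # comparison unit
    return out
```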
The steps in the present application may be reordered, combined, or deleted according to actual requirements.
The units in the device may be combined, divided, or deleted according to actual requirements.
Although the present application has been disclosed in detail with reference to the accompanying drawings, it is to be understood that such description is merely illustrative and does not restrict the scope of the present application. The scope of the present application is defined by the appended claims and may include various modifications, adaptations, and equivalents without departing from its scope and spirit.

Claims (5)

1. A binary image edge extraction device, characterized by comprising: the device comprises an acquisition unit, a storage unit, a processing unit and an image output unit;
the acquiring unit is used for acquiring a binary image of an image to be identified and acquiring pixel point data of pixel points point by point according to row and column addresses of the pixel points in the binary image, wherein the acquiring unit comprises: a row-column counter and a judgment unit;
the row and column counter is arranged at the input end of the acquisition unit and is used for counting the pixel points in the acquired binary image and generating row and column addresses, wherein the row and column addresses comprise row addresses and column addresses;
the judging unit is used for transmitting the pixel point data of the pixel point to the image output unit when judging that the column address of the pixel point is equal to a first preset threshold value or the row address is equal to a second preset threshold value;
the storage unit is used for storing the pixel point data according to the row and column addresses and in a three-dimensional neighborhood matrix form and outputting the pixel point data to the processing unit;
the processing unit is used for performing edge feature extraction operation on the received pixel point data in the form of the three-dimensional neighborhood matrix according to the row and column addresses, recording the result of the edge feature extraction operation as an edge feature value, and inputting the edge feature value to the image output unit;
the image output unit is used for generating and outputting the edge characteristic image of the image to be identified.
2. The binary image edge extraction device according to claim 1, wherein the storage unit includes a first storage module, a second storage module, and a first-level register, wherein the first storage module and the second storage module are first-in-first-out memories;
the data input end of the storage unit is respectively connected with the input end of the first storage module and the input end of the first-stage register;
the output end of the primary register is connected to the third output end of the storage unit;
the output end of the first storage module is respectively connected with the second output end of the storage unit and the input end of the second storage module;
the output end of the second storage module is connected to the first output end of the storage unit.
3. The binary image edge extraction device according to claim 1, wherein the processing unit includes: a data extraction unit and an arithmetic unit;
the data extraction unit is used for extracting and sending the pixel data in a three-dimensional neighborhood matrix form to the operation unit when the column address and the row address are judged to be both larger than or equal to a third preset threshold value;
the operation unit is used for performing edge feature extraction operation according to the pixel point data to generate the edge feature value.
4. The binary image edge extraction device according to claim 3, wherein the operation unit further includes: the first adder, the second adder, the third adder, the first shifting unit, the second shifting unit and the comparing unit;
the first adder and the second adder are arranged in parallel, the data output ends of the first adder and the second adder are connected to the data input end of the third adder through the first shifting unit and the second shifting unit respectively, and the data output end of the third adder is connected to the comparing unit;
and the comparison unit is used for extracting edge characteristics according to the size between the operation result of the third adder and an edge preset threshold value and outputting the edge characteristic value.
5. The binary image edge extraction device according to claim 4, wherein extracting and sending the pixel data in the form of a three-dimensional neighborhood matrix to the operation unit specifically comprises:
generating a data selecting frame, wherein the data selecting frame is in a 3 x 3 matrix form;
the pixel point data output in the form of the three-dimensional neighborhood matrix is transmitted to the data selecting frame;
sequentially extracting the pixel point data of a first row and a first column, a first row and a second column and a second row and a first column of the data selecting frame and transmitting the pixel point data to the first adder;
and sequentially extracting the pixel point data of a second row and third column, a third row and second column, and a third row and third column of the data selecting frame and transmitting the pixel point data to the second adder.
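As a non-normative illustration of claims 2 and 5, the storage unit's two first-in-first-out line buffers and the 3 x 3 data selecting frame can be modelled in software as follows (the generator name, coordinate convention, and raster-order input stream are assumptions):

```python
from collections import deque

def sliding_windows(pixels, width):
    """pixels: raster-order stream of binary pixel data for one image of the
    given width. Two FIFOs delay the stream by one and by two image rows, so
    each cycle produces three vertically aligned pixels; the 3x3 data
    selecting frame shifts one such column in per cycle, and a full window is
    available once three columns of the same row span have arrived."""
    fifo1, fifo2 = deque(), deque()   # first and second storage modules (FIFOs)
    frame = deque(maxlen=3)           # data selecting frame, stored as columns
    for i, p in enumerate(pixels):
        fifo1.append(p)
        if len(fifo1) <= width:
            continue                  # first line buffer still filling
        mid = fifo1.popleft()         # pixel one row above p
        fifo2.append(mid)
        if len(fifo2) <= width:
            continue                  # second line buffer still filling
        top = fifo2.popleft()         # pixel two rows above p
        frame.append((top, mid, p))   # shift a new aligned column into the frame
        row, col = i // width, i % width
        if col >= 2:                  # frame holds three columns of one window
            window = [[frame[c][r] for c in range(3)] for r in range(3)]
            yield (row - 1, col - 1), window   # centre-pixel coordinates
```

Feeding a 4 x 4 image, for example, yields the four windows centred on its interior pixels.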
CN201910535278.1A 2019-06-20 2019-06-20 Binary image edge extraction device Active CN110264488B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910535278.1A CN110264488B (en) 2019-06-20 2019-06-20 Binary image edge extraction device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910535278.1A CN110264488B (en) 2019-06-20 2019-06-20 Binary image edge extraction device

Publications (2)

Publication Number Publication Date
CN110264488A (en) 2019-09-20
CN110264488B (en) 2021-03-16

Family

ID=67919679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910535278.1A Active CN110264488B (en) 2019-06-20 2019-06-20 Binary image edge extraction device

Country Status (1)

Country Link
CN (1) CN110264488B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113068045A (en) * 2021-03-17 2021-07-02 厦门雅基软件有限公司 Data storage method and device, electronic equipment and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003105087A1 (en) * 2002-06-10 2003-12-18 Lockheed Martin Corporation Edge detection using hough transformation
CN106485255A (en) * 2016-09-29 2017-03-08 深圳元启智能技术有限公司 A kind of DM code positioning and the method and system of identification
WO2017193876A1 (en) * 2016-05-12 2017-11-16 深圳市太赫兹科技创新研究院 Method and device for detecting dangerous object hidden on human body from microwave image
CN108537786A (en) * 2018-03-30 2018-09-14 百度在线网络技术(北京)有限公司 For handling image method and device
CN108716890A (en) * 2018-08-17 2018-10-30 苏州富鑫林光电科技有限公司 A kind of high-precision size detecting method based on machine vision
CN109427066A (en) * 2017-08-31 2019-03-05 中国科学院微电子研究所 Edge detection method at any angle

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2548303B (en) * 2015-10-14 2018-02-21 Shanghai United Imaging Healthcare Co Ltd System and method for image correction

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003105087A1 (en) * 2002-06-10 2003-12-18 Lockheed Martin Corporation Edge detection using hough transformation
WO2017193876A1 (en) * 2016-05-12 2017-11-16 深圳市太赫兹科技创新研究院 Method and device for detecting dangerous object hidden on human body from microwave image
CN106485255A (en) * 2016-09-29 2017-03-08 深圳元启智能技术有限公司 A kind of DM code positioning and the method and system of identification
CN109427066A (en) * 2017-08-31 2019-03-05 中国科学院微电子研究所 Edge detection method at any angle
CN108537786A (en) * 2018-03-30 2018-09-14 百度在线网络技术(北京)有限公司 For handling image method and device
CN108716890A (en) * 2018-08-17 2018-10-30 苏州富鑫林光电科技有限公司 A kind of high-precision size detecting method based on machine vision

Also Published As

Publication number Publication date
CN110264488A (en) 2019-09-20

Similar Documents

Publication Publication Date Title
CN110189285B (en) Multi-frame image fusion method and device
CN107833238B (en) Maximum connected domain marking method, target tracking method and augmented reality/virtual reality device
CN112837303A (en) Defect detection method, device, equipment and medium for mold monitoring
CN109272509B (en) Target detection method, device and equipment for continuous images and storage medium
CN111681273B (en) Image segmentation method and device, electronic equipment and readable storage medium
CN109977952B (en) Candidate target detection method based on local maximum
CN110390681B (en) Depth image object contour rapid extraction method and device based on depth camera
CN114049499A (en) Target object detection method, apparatus and storage medium for continuous contour
CN108960247B (en) Image significance detection method and device and electronic equipment
CN113362238A (en) Test image processing method and device, electronic equipment and storage medium
CN114359665B (en) Training method and device of full-task face recognition model and face recognition method
CN110264488B (en) Binary image edge extraction device
CN111507340A (en) Target point cloud data extraction method based on three-dimensional point cloud data
CN111340835A (en) FPGA-based video image edge detection system
WO2015031350A1 (en) Systems and methods for memory utilization for object detection
CN114170596A (en) Posture recognition method and device, electronic equipment, engineering machinery and storage medium
CN110930423B (en) Object edge feature recognition and extraction method
CN110728692A (en) Image edge detection method based on Scharr operator improvement
CN113643290B (en) Straw counting method and device based on image processing and storage medium
CN112146834B (en) Method and device for measuring structural vibration displacement
RU2383925C2 (en) Method of detecting contours of image objects and device for realising said method
CN111242140B (en) Method for rapidly extracting contour line under non-uniform illumination
CN112418109A (en) Image processing method and device
Okarma et al. A fast image analysis technique for the line tracking robots
CN116403200A (en) License plate real-time identification system based on hardware acceleration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant