CN103164702B - Marker center point extraction method, device and image processing system - Google Patents


Info

Publication number
CN103164702B (application number CN201110414925.7A)
Authority
CN (China)
Prior art keywords
pixel, gray, gray value, marking, search
Legal status
Active (the legal status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Other languages
Chinese (zh)
Other versions
CN103164702A (en)
Inventors
邓伟, 李卫伟, 柯有勇, 王敏乐
Current assignee
Beijing Huiyan Zhixing Technology Co., Ltd.
Original assignee
Beijing Huiyan Zhixing Technology Co., Ltd.
Application filed by Beijing Huiyan Zhixing Technology Co., Ltd.; priority to CN201110414925.7A; publication of CN103164702A; application granted; publication of CN103164702B; legal status: active

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention relates to a marker center point extraction method, including: selecting a search region from the pixel acquisition area of an image; calculating the average gray value of the pixels in the search region, and correcting the calculated average gray value to obtain a gray threshold; comparing the gray threshold with the gray values of the pixels in the search region, and determining all pixels in the search region whose gray value is less than the gray threshold; traversing the determined pixels, and comparing the gray value of the currently traversed pixel with the gray values of its neighborhood pixels; and, if the gray value of the currently traversed pixel is less than the gray values of the neighborhood pixels, applying a first-type mark to the currently traversed pixel, so that in a subsequent flow the coordinates of pixels bearing the first-type mark can be sent to a data processing device, which then determines the marker center points. The invention also relates to a marker center point extraction device and an image processing system. The invention enables accurate extraction of the dot centers of the information points in a captured image.

Description

Marker center point extraction method, device and image processing system
The present invention relates to the field of graphic coding, and in particular to a marker center point extraction method, a device, and an image processing system.
Background Art
With the development of graphic coding technology in recent years, optical geometric patterns printed in a certain regular arrangement on various print media (such as books and periodicals, certificates, advertisements, product labels, etc.) to record particular data symbol information have become a convenient way of providing additional information. A reader or user of the print media can use a general-purpose or special-purpose recognition instrument to read the optical geometric pattern into a computer or host, so as to obtain the information corresponding to the pattern, or a link to that information.
These geometric patterns typically record particular data symbol information either as multi-colored blocks (such as one-dimensional barcodes or QR codes) or as dot patterns composed of sparse dot matrices. For coding schemes that use sparse dot patterns, the extraction of each information point in the dot pattern is the basis of dot pattern computation and decoding. To achieve accurate decoding, accurate extraction of the dot centers of the information points in the captured image is generally necessary. If dot center extraction is inaccurate or dots are lost, the positions of information points may be misaligned, points may be dropped, and false information points may be obtained.
In some existing graphic coding reading schemes, in order to extract dot centers from the captured image, the gray-level image captured by the camera is first binarized, that is, converted into a black-and-white image containing only the intensity values 0 and 255; the centers of the binary blobs in the black-and-white image are then taken as the extracted dot centers. This approach has certain drawbacks: the binarization threshold is difficult to determine, and the binarized black-and-white image loses gray-level information of considerable reference value. For images of poor quality, dot center extraction may therefore produce false information points, lose useful information points, and yield dot center positions that are inaccurate.
Summary of the invention
The purpose of the present invention is to propose a marker center point extraction method, a device, and an image processing system capable of accurately extracting the dot centers of the information points in a captured image, overcoming as far as possible the problems of dot center positional misalignment, point loss, and false points.
To achieve the above object, the invention provides a marker center point extraction method, including:
selecting a search region of a preset size from the pixel acquisition area of an image, calculating the average gray value of the pixels in the search region, and correcting the calculated average gray value to obtain a gray threshold;
comparing the gray threshold with the gray value of each pixel in the search region, and determining all pixels in the search region whose gray value is less than the gray threshold;
traversing the determined pixels and, when processing each currently traversed pixel, comparing its gray value with the gray value of each neighborhood pixel, a neighborhood pixel being any one of the pixels of the adjacent area;
if the gray value of the currently traversed pixel is less than the gray values of the neighborhood pixels, applying a first-type mark to the currently traversed pixel, so that in a subsequent flow the coordinates of pixels bearing the first-type mark can be sent to a data processing device, which then determines the marker center points from the coordinates of the first-type marks.
Further, if the gray value of the currently traversed pixel is greater than the gray value of any one of the neighborhood pixels, a second-type mark is applied to the currently traversed pixel, or the currently traversed pixel is left unmarked.
Further, if the gray value of the currently traversed pixel equals the gray value of at least one neighborhood pixel having the minimum gray value, a first-type mark is applied to the currently traversed pixel, and the first-type mark of the neighborhood pixel whose gray value equals that of the currently traversed pixel is either cancelled or retained.
Further, all pixels in the pixel acquisition area whose gray value equals the gray threshold may also participate in the traversal process.
Further, the first-type mark is a black image corresponding to gray value 0, and the second-type mark is a white image corresponding to gray value 255.
Further, the gray threshold of the search region is recalculated each time the search region changes.
Further, the size of the search region is set according to the image resolution, the center point extraction error rate, the reserved cache size, and/or the number of search region changes; the size and scope of the adjacent area are determined according to the image resolution and/or the center point extraction error rate.
Further, the search region is a 5 × 12 rectangular area, and the pixels of the adjacent area are the 8 pixels surrounding the currently traversed pixel.
Further, the operation of correcting the calculated average gray value to obtain the gray threshold is specifically:
linearly combining the calculated average gray value with a preset gray correction value and/or a preset gray correction coefficient, and taking the result as the gray threshold.
To achieve the above object, the invention also provides a marker center point extraction device, including:
a search region selection unit for selecting a search region of a preset size from the pixel acquisition area of an image;
an average gray calculation unit for calculating the average gray value of the pixels in the search region;
a gray threshold calculation unit for correcting the calculated average gray value to obtain a gray threshold;
a gray threshold comparison unit for comparing the gray threshold with the gray value of each pixel in the search region;
a traversal pixel determination unit for determining all pixels in the search region whose gray value is less than the gray threshold;
a pixel traversal processing unit for traversing the determined pixels and, when processing each currently traversed pixel, comparing its gray value with the gray value of each neighborhood pixel, a neighborhood pixel being any one of the pixels of the adjacent area, and, if the gray value of the currently traversed pixel is less than the gray values of the neighborhood pixels, applying a first-type mark to the currently traversed pixel;
a pixel coordinate transfer unit for sending the coordinates of pixels bearing the first-type mark to a data processing device, which then determines the marker center points from the coordinates of the first-type marks.
Further, the pixel traversal processing unit is also configured, when the gray value of the currently traversed pixel is greater than the gray value of any one of the neighborhood pixels, to apply a second-type mark to the currently traversed pixel or to leave the currently traversed pixel unmarked.
Further, the pixel traversal processing unit is also configured, when the gray value of the currently traversed pixel equals the gray value of at least one neighborhood pixel having the minimum gray value, to apply a first-type mark to the currently traversed pixel, and to cancel or retain the first-type mark of the neighborhood pixel whose gray value equals that of the currently traversed pixel.
Further, the pixel traversal processing unit is also configured to perform traversal processing on all pixels in the pixel acquisition area whose gray value equals the gray threshold.
Further, the first-type mark is a black image corresponding to gray value 0, and the second-type mark is a white image corresponding to gray value 255.
Further, the gray threshold of the search region is recalculated each time the search region changes.
Further, the size of the search region is set according to the image resolution, the center point extraction error rate, the reserved cache size, and/or the number of search region changes; the size and scope of the adjacent area are determined according to the image resolution and/or the center point extraction error rate.
Further, the search region is a 5 × 12 rectangular area, and the pixels of the adjacent area are the 8 pixels surrounding the currently traversed pixel.
Further, the gray threshold calculation unit specifically includes:
a gray correction component for linearly combining the calculated average gray value with a preset gray correction value and/or a preset gray correction coefficient, and taking the result as the gray threshold.
To achieve the above object, the invention also provides an image processing system including any of the aforementioned marker center point extraction devices, further including:
a data processing device for determining the marker center points according to the coordinates of the first-type marks obtained from the marker center point extraction device.
Further, the marker center point extraction device is implemented by a field programmable gate array (Field Programmable Gate Array, FPGA) device.
Based on the above technical solutions, the present invention extracts the coordinates of marker dot centers directly from the gray-level image. The gray threshold is determined directly from the gray-level image of the search region, and the actual gray values of the traversed pixels serve as the basis for selecting marker points. This not only mitigates, as far as possible, the effects of uneven illumination, reflective material, and lens contrast on the gray-level image, but also avoids the loss of valuable gray-level information caused by the prior art's initial binarization. Compared with the marker center extraction of the prior art, the marker dot center extraction of the present invention is therefore more reliable and accurate. Furthermore, in another embodiment, the dot center extraction process can be implemented by an FPGA device: the steps of selecting the qualifying pixels in the search region, matching each selected pixel, and determining the coordinates of the marker dot centers are realized in a pipelined fashion, adding no extra processing time and thus offering higher processing efficiency.
Brief Description of the Drawings
The drawings described herein are provided for a further understanding of the present invention and constitute a part of this application. The schematic embodiments and their descriptions are used to explain the present invention and do not constitute an undue limitation of it. In the drawings:
Fig. 1 is a schematic diagram of the process by which a digital printing apparatus forms an ink dot on the surface of a print medium (such as paper).
Fig. 2 is a schematic comparison of an original digital coding pattern with the coding pattern actually captured by a device.
Fig. 3 is a schematic diagram of pixel gray levels grading from black to white.
Fig. 4 is a schematic diagram of the distribution and shape of the marker points of a sparse dot pattern collected by an optical reading device.
Fig. 5 is a schematic diagram of the extent of the marker points and neighboring pixels of a sparse dot pattern collected by an optical reading device.
Fig. 6 is a schematic flow chart of an embodiment of the marker center point extraction method of the present invention.
Fig. 7 is a schematic diagram of the search region and the adjacent area involved in the embodiment of Fig. 6.
Fig. 8 is a schematic diagram of the gray matching of the currently traversed pixel in another embodiment of the marker center point extraction method of the present invention.
Fig. 9 is a schematic structural diagram of an embodiment of the marker center point extraction device of the present invention.
Fig. 10 is a schematic structural diagram of an embodiment of the image processing system of the present invention.
Detailed description of the invention
The technical solutions of the present invention are described in further detail below with reference to the drawings and embodiments.
In existing coding schemes, the marker points used to form the dot matrix are essentially ink dots formed during printing by a digital printing apparatus. During printing, an ink dot collides with the paper surface; because of its surface tension it gathers into a spherically convex shape, thus appearing as a dot. Referring to Fig. 1, the size of the contact area between the printed ink dot and the paper surface depends on the volume of the ink dot, the impact strength, and the affinity of the ink for the paper surface. Owing to the shape of the ink dot and the varying degree to which the ink is absorbed by the paper, in the dot matrix image actually captured by an optical device the gray value at the center of an ink dot is lower than that at its edge; in other words, the center of an ink dot is generally darker than its edge.
Take, as an example, illumination using four 850 nm infrared light-emitting diodes that locally illuminate a light-homogenizing body, with the light scattered by that body illuminating the whole acquisition plane. Because the lens of the optical reading device (such as a gray-level camera) exhibits aberration, gray transitions appear at pattern edges with high brightness contrast. As shown in Fig. 2, the figure on the left shows a pair of marker points in the original document, where the marker points are black and the background is white; the figure on the right is the image collected by the optical reading device, where point A is a background pixel with gray value 123, point B is a pattern edge pixel with gray value 73, and point C is the pattern center with gray value 37. These gray values serve only as a reference to aid understanding of the present solution and are not the gray values actually observed.
As for the image sensor used by the optical reading device, take a CMOS image sensor as an example: the image it collects is composed of pixels. Fig. 3 shows pixel gray levels grading from black to white. Each pixel has 256 gray levels (0 to 255); the smaller the gray value, the darker the target image, and conversely, the larger the gray value, the brighter the target image. The percentages shown in Fig. 3 are the gray value expressed as a percentage of the total number of gray levels, 255. As can be seen from Fig. 3, as the gray value percentage increases, the target image brightness goes from dark to bright.
The following briefly introduces a sparse dot pattern collected by an optical reading device. Referring to Fig. 4, low-brightness marker points are arranged on a high-brightness background; the marker points are sparsely distributed, do not overlap, and have low coverage. The shape of a marker point is not limited to circular, polygonal, or other solid shapes. In the image sensor a marker point appears as 1 to 36 neighboring pixels of relatively low gray value against the background; these pixels together represent the position and size of a single marker point. During marker center point extraction, only the position information of the marker point (such as coordinate information) needs to be extracted; the position of a marker point is best taken as its geometric center, that is, the coordinates of the center point need to be calculated accurately. Further, according to the shape characteristics of marker points (circular or nearly circular, centrally symmetric images), the gray values of pixels near the geometric center are lower than those of pixels farther from it.
As shown in Fig. 5, the image on the left is the code pattern of a sparse dot pattern after actual acquisition, with a resolution of 240 × 240; the image on the right is a 16 × 16 pixel region around a pair of marker points, observed after 12× magnification. The pixel composition of the two marker points and the gray transitions are clearly visible in the right image. Point A is the center pixel of a marker point, with gray value 42, and is the darkest pixel in this neighboring region; point B is a background pixel with gray value 146; point C is a near-center pixel of the marker point with gray value 47; point D is a marker point edge pixel with gray value 84. These gray values serve only as a reference to aid understanding of the present solution and are not the gray values actually observed.
Based on the foregoing explanation of captured coding patterns, the present invention provides a marker center point extraction scheme superior to the prior art, which can obtain marker center points more quickly and accurately. Fig. 6 is a schematic flow chart of an embodiment of the marker center point extraction method of the present invention. In this embodiment, the marker center point extraction flow includes:
Step 100: select a search region of a preset size from the pixel acquisition area of the image, calculate the average gray value of the pixels in the search region, and correct the calculated average gray value to obtain a gray threshold;
Step 200: compare the gray threshold with the gray value of each pixel in the search region, and determine all pixels in the search region whose gray value is less than the gray threshold;
Step 300: traverse the determined pixels and, when processing each currently traversed pixel, compare its gray value with the gray value of each neighborhood pixel, a neighborhood pixel being any one of the pixels of the adjacent area;
Step 400: if the gray value of the currently traversed pixel is less than the gray values of the neighborhood pixels, apply a first-type mark to the currently traversed pixel, so that in a subsequent flow the coordinates of pixels bearing the first-type mark can be sent to the data processing device, which then determines the marker center points from the coordinates of the first-type marks.
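Steps 100 through 400 can be sketched in Python as follows. This is an illustration only: the patent describes a hardware-oriented flow, and the function name, parameters, and default correction constants here are invented (the defaults apply no correction to the mean).

```python
def extract_marker_candidates(image, region, correction=0.0, coeff=1.0):
    """Sketch of Steps 100-400: threshold the search region by its
    corrected mean gray value, then mark pixels strictly darker than
    all 8 neighbors (first-type marks).

    image  -- 2D list of gray values (0-255)
    region -- (row0, col0, rows, cols) search region inside the image
    Returns the coordinates of first-type marks.
    """
    r0, c0, rows, cols = region
    pixels = [image[r][c] for r in range(r0, r0 + rows)
                          for c in range(c0, c0 + cols)]
    # Step 100: corrected average gray value as threshold
    threshold = (sum(pixels) / len(pixels)) * coeff + correction

    marks = []
    for r in range(r0, r0 + rows):
        for c in range(c0, c0 + cols):
            g = image[r][c]
            if g >= threshold:          # Step 200: keep only sub-threshold pixels
                continue
            # Step 300: gather the 8-neighborhood (clipped at image borders)
            neighbors = [image[r + dr][c + dc]
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if not (dr == 0 and dc == 0)
                         and 0 <= r + dr < len(image)
                         and 0 <= c + dc < len(image[0])]
            # Step 400: first-type mark for strict local minima
            if all(g < n for n in neighbors):
                marks.append((r, c))
    return marks
```

For example, a single dark dot on a bright background yields exactly one first-type mark at the dot center.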
In this embodiment, the search region selected from the pixel acquisition area of the image is shown in Fig. 7. For the sparse dot matrix coding scheme involved in Fig. 4 or Fig. 5, with a print resolution of 600 dpi, the diameter of one print pixel is 25.4 mm/600 = 0.0423 mm. Each marker point is typically composed of 2 × 2 print pixels when printed, so the diameter of each marker point is 0.0423 × 2 = 0.0846 mm. The image sensor resolution used is 240 × 240 and the captured coding pattern measures 8 mm × 8 mm, so in the captured image each pixel corresponds to a code pattern size of 8 mm/240 = 0.0333 mm. Marker point diameter divided by pixel size is 0.0846/0.0333 ≈ 3 pixels (rounding up). Assuming the marker point is approximately square in the image, the number of pixels a marker point occupies in the captured image is 3 × 3 = 9 pixels.
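The arithmetic above can be checked with a short script (values taken from the text; the final pixel count is rounded up to whole pixels, which is how the embodiment arrives at 3):

```python
import math

# Geometry of the sparse dot pattern described in the embodiment.
print_dpi = 600
print_pixel_mm = 25.4 / print_dpi           # ~0.0423 mm per print pixel
marker_diameter_mm = 2 * print_pixel_mm     # marker printed as 2x2 print pixels

sensor_resolution = 240                     # 240 x 240 sensor
pattern_size_mm = 8.0                       # captured pattern is 8 mm x 8 mm
capture_pixel_mm = pattern_size_mm / sensor_resolution  # ~0.0333 mm per pixel

# diameter ratio ~2.54, rounded up to whole pixels
marker_diameter_px = math.ceil(marker_diameter_mm / capture_pixel_mm)
marker_area_px = marker_diameter_px ** 2    # square marker assumption
```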
Depending on the performance of the printing apparatus and on the printing material, the size of the actual marker point can vary. To ensure the anti-interference performance of the code pattern, the number of captured pixels per marker point may be required to be more than 3 pixels and no more than 36 pixels. According to the above requirements, a 5 × 12 rectangular area can be chosen as the search region; the size of the search region can be set according to one or more of factors such as the image resolution, the center point extraction error rate, the reserved cache size, and the number of search region changes. Selecting a suitable search region makes it easier to determine a more appropriate gray threshold, and thus to achieve more accurate marker center point extraction.
After the search region has been determined, the gray values of the pixels in the search region are accumulated and the average gray value of the region is calculated; on this basis the gray threshold is obtained by correction. The corrected gray threshold can be calculated by linearly combining the calculated average gray value with a preset gray correction value and/or a preset gray correction coefficient, and taking the result as the gray threshold. For example: the gray threshold may be the average gray value plus a preset gray correction value, or the average gray value multiplied by a preset gray correction coefficient, or the average gray value multiplied by a preset gray correction coefficient and then added to a preset gray correction value.
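The three example corrections can be written out as follows. This is a sketch only; the patent does not specify concrete correction constants, so any values passed to these functions are illustrative.

```python
def threshold_add(mean_gray, correction_value):
    # Variant 1: average gray value plus a preset gray correction value.
    return mean_gray + correction_value

def threshold_mul(mean_gray, correction_coeff):
    # Variant 2: average gray value times a preset gray correction coefficient.
    return mean_gray * correction_coeff

def threshold_mul_add(mean_gray, correction_coeff, correction_value):
    # Variant 3: multiply by the coefficient, then add the correction value.
    return mean_gray * correction_coeff + correction_value
```

A negative correction value (or a coefficient below 1) pulls the threshold below the regional mean, so that only clearly dark pixels become candidates.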
After the gray threshold has been determined, the candidate marker center pixels are determined by comparing the gray value of each pixel in the search region with the gray threshold; a candidate pixel can be identified by checking whether its gray value is less than the gray threshold. Pixels whose gray value equals the gray threshold can either be excluded directly from the candidate set, or be included as candidate pixels in the subsequent traversal process.
After the candidate pixels have been determined by comparison against the gray threshold, all of the determined pixels are traversed, and the following processing is performed for each traversed pixel: the gray value of the currently traversed pixel is compared with the gray value of each neighborhood pixel, a neighborhood pixel being any one of the pixels of the adjacent area; if the gray value of the currently traversed pixel is less than the gray values of the neighborhood pixels, a first-type mark is applied to the currently traversed pixel.
A neighborhood pixel here refers to any one of the pixels in the adjacent area (also called the neighborhood); the size and scope of the neighborhood can be determined by factors such as the image resolution and/or the center point extraction error rate. The pixels of the adjacent area shown in Fig. 7 are the 8 pixels immediately surrounding the currently traversed pixel.
In the gray comparison process for the currently traversed pixel, it can be determined whether the neighborhood contains a pixel darker than the currently traversed pixel. If no darker pixel is found, the currently traversed pixel can be designated a marker center candidate. For identification, a first-type mark can be applied to the currently traversed pixel, for example by marking it as a black image with gray value 0; other characters or numbers may also be used as the mark. After the marks have been formed, either during traversal or after all or part of the traversal has completed and any waiting time has elapsed, the coordinates of pixels bearing the first-type mark are sent to the data processing device, which then determines the marker center points from the coordinates of the first-type marks.
In the gray comparison process for the currently traversed pixel, if the gray value of the currently traversed pixel is greater than the gray value of any one of the neighborhood pixels, a second-type mark can be applied to the currently traversed pixel, for example by marking it as a white image with gray value 255; other characters or numbers may also be used as the mark. Since neither the currently traversed pixel nor the background pixels participate further in the determination of the marker center point, the currently traversed pixel may also be left unmarked. Accordingly, when coordinates are sent to the data processing device, either the coordinates of the first-type marks alone, or the coordinates of both the first-type and the second-type marks, can be transmitted. The other non-candidate pixels above the gray threshold may likewise all be given a second-type mark and passed to the data processing device together with the first-type marks.
In the gray comparison process for the currently traversed pixel, if the gray value of the currently traversed pixel equals the gray value of at least one neighborhood pixel having the minimum gray value, a first-type mark can be applied to the currently traversed pixel, and the first-type mark of the neighborhood pixel whose gray value equals that of the currently traversed pixel can be retained. Within the 3 × 3 neighborhood, the currently traversed pixel and the neighborhood pixels with equal gray value are then all marked as first-type; after their coordinates have been submitted to the data processing device, the data processing device selects one of the several pixels within a certain range, or takes the centroid of the several pixels, as the marker center point.
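The centroid option mentioned above might be sketched as follows (the function name and the rounding to integer pixel coordinates are assumptions; the patent only says the centroid of several pixels may serve as the marker center point):

```python
def marks_centroid(marks):
    """Centroid of several equally dark first-type marks within a range,
    used as the marker center point (rounded to pixel coordinates).

    marks -- non-empty list of (row, col) coordinates
    """
    n = len(marks)
    mean_row = sum(r for r, _ in marks) / n
    mean_col = sum(c for _, c in marks) / n
    return (round(mean_row), round(mean_col))
```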
In another embodiment there is a further processing mode, in which the existing first-type mark of a neighborhood pixel whose gray value equals that of the currently traversed pixel is cancelled, as shown in Fig. 8. In the left figure, pixel PB is a neighborhood pixel of the currently traversed pixel P and has a gray value equal to that of P; after processing, in the right figure, the currently traversed pixel P is marked black and the original black mark of PB is cancelled. In this way only one first-type mark exists within the specified 3 × 3 range, and after the coordinates of this first-type mark have been transmitted to the data processing device, the data processing device can use the coordinates of the first-type mark directly as the marker center point.
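As an illustration of this cancellation mode, the following sketch keeps only the most recently traversed mark among equal-gray 8-neighbors, matching the Fig. 8 example in which P's mark survives and PB's is cancelled. The function name and the "later pixel wins" traversal-order convention are assumptions not fixed by the patent.

```python
def cancel_equal_neighbor_marks(marks, image):
    """Keep at most one first-type mark among 8-connected marked pixels
    of equal gray value: when the currently traversed pixel is marked,
    an already-marked equal-gray neighbor (like PB in Fig. 8) loses
    its mark.

    marks -- list of (row, col) coordinates in traversal (raster) order
    image -- 2D list of gray values
    Returns the surviving marks.
    """
    kept = []
    for r, c in marks:
        # Drop any earlier-kept mark that is an 8-neighbor with equal gray.
        kept = [(kr, kc) for kr, kc in kept
                if not (abs(kr - r) <= 1 and abs(kc - c) <= 1
                        and image[kr][kc] == image[r][c])]
        kept.append((r, c))
    return kept
```

Equal-gray marks that are not adjacent, and adjacent marks of differing gray value, are both left untouched.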
In each of the above marker point extraction method embodiments, as the search region changes, the gray threshold of the new search region is recalculated accordingly. This better adapts to the influence of factors such as uneven ambient illumination, reflective material, and lens contrast, enabling accurate extraction of marker center points.
One of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be completed by hardware under the control of program instructions; the program can be stored in a readable storage medium of a computing device and, when executed, performs the steps of the above method embodiments. The storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, and optical discs.
Figure 9 is a schematic structural diagram of an embodiment of the mark center point extraction device of the present invention. In this embodiment, the mark center point extraction device includes: a search region selection unit 1, an average gray computation unit 2, a gray threshold computation unit 3, a gray threshold comparison unit 4, a traversal pixel determination unit 5, a pixel traversal processing unit 6, and a pixel coordinate transmission unit 7.
The search region selection unit 1 selects a search region of preset size from the pixel acquisition region of the image. The average gray computation unit 2 computes the average gray value of the pixels in the search region. The gray threshold computation unit 3 corrects the computed average gray value to obtain a gray threshold. The gray threshold comparison unit 4 compares the gray threshold with the gray value of each pixel in the search region. The traversal pixel determination unit 5 determines all pixels in the search region whose gray value is less than the gray threshold.
The pixel traversal processing unit 6 traverses the determined pixels; in the processing of each current traversal pixel, it compares the gray value of the current traversal pixel with the gray value of each neighborhood pixel respectively, the neighborhood pixels being each pixel of an adjacent region; if the gray value of the current traversal pixel is less than the gray values of the neighborhood pixels, the current traversal pixel is given a first-type mark. The pixel coordinate transmission unit 7 transmits the coordinates of the first-type marks to the data processing device, which then determines the mark center point according to the obtained coordinates of the first-type marks. Here the first-type mark is preferably a black image with gray value 0.
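A minimal sketch of the flow these units implement, in Python with NumPy. This is an illustration of the described technique only, not the literal FPGA implementation: the function name, the default correction coefficient, and restricting the scan to interior pixels are assumptions.

```python
import numpy as np

def mark_first_type(region, correction=0.8):
    """Mark local-minimum pixels (first type) in a gray-scale search region.

    `region` is a 2-D uint8 array; `correction` is an assumed gray correction
    coefficient applied linearly to the mean gray value to form the threshold.
    Returns a boolean mask: True where a pixel received the first-type mark.
    """
    region = region.astype(np.int32)
    threshold = region.mean() * correction        # corrected gray threshold
    marks = np.zeros(region.shape, dtype=bool)
    h, w = region.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            g = region[y, x]
            if g >= threshold:                    # only below-threshold pixels are traversed
                continue
            neighbors = region[y - 1:y + 2, x - 1:x + 2].copy()
            neighbors[1, 1] = np.iinfo(np.int32).max  # exclude the pixel itself
            if g < neighbors.min():               # strictly darker than all 8 neighbors
                marks[y, x] = True                # first-type mark (black, gray value 0)
    return marks
```

The coordinates of the marked pixels, e.g. `np.argwhere(marks)`, would then be handed to the data processing device.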
In this embodiment, the size of the search region selected by the search region selection unit 1 can be set according to the image resolution, the center-point extraction error rate, the reserved cache size, and/or the number of search region changes; preferably the search region is a 5×12 rectangular area. When the search region selection unit 1 has changed the search region, the gray threshold is also recalculated for the new region. The gray threshold computation unit 3 may specifically include a gray correction component, which performs a linear computation on the computed average gray value with a preset gray correction value and/or gray correction coefficient and takes the result as the gray threshold.
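The linear correction performed by the gray correction component amounts to T = a·m + b, where m is the computed average gray value, a the preset correction coefficient, and b the preset correction value. The patent leaves the preset numbers open, so the defaults below are placeholder assumptions; the recalculation per search region change is as described above.

```python
def region_threshold(region, coeff=0.85, offset=-5.0):
    """Recompute the gray threshold for a (new) search region.

    `region` is an iterable of pixel gray values, e.g. the 5 x 12 = 60 pixels
    of the preferred rectangular search region. `coeff` and `offset` are
    assumed preset correction parameters; the threshold is the linearly
    corrected mean, recalculated whenever the search region is changed.
    """
    pixels = list(region)
    mean_gray = sum(pixels) / len(pixels)
    return coeff * mean_gray + offset
```

Calling this once per search region adapts the threshold to local lighting, which is what lets the method cope with uneven illumination across the image.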
When the pixel traversal processing unit 6 compares neighborhood pixels, the size and range of the selected adjacent region can be determined according to the image resolution and/or the center-point extraction error rate. Preferably, the pixels of the adjacent region are the 8 pixels surrounding the current traversal pixel.
In another device embodiment, the pixel traversal processing unit 6 may also, when the gray value of the current traversal pixel is greater than the gray value of any neighborhood pixel, give the current traversal pixel a second-type mark or leave the current traversal pixel unmarked. Here the second-type mark is preferably a white image with gray value 255.
In another device embodiment, the pixel traversal processing unit 6 may also, when the gray value of the current traversal pixel equals the gray value of at least one minimum-gray pixel among the neighborhood pixels, give the current traversal pixel a first-type mark and cancel or retain the first-type marks of the neighborhood pixels whose gray values equal that of the current traversal pixel.
In another device embodiment, the pixel traversal processing unit 6 may also apply the traversal processing to all pixels in the pixel acquisition region whose gray value equals the gray threshold.
Figure 10 is a schematic structural diagram of an embodiment of the image processing system of the present invention. In this embodiment, the image processing system includes an embodiment of any of the foregoing mark center point extraction devices 10 and a data processing device 20. The mark center point extraction device has already been described in detail above and is not repeated here.
The data processing device 20 determines the mark center point according to the coordinates of the first-type marks obtained from the mark center point extraction device.
After the mark center point extraction device 10 has produced the marks, it can transmit the coordinates of the first-type marks to the data processing device 20 immediately after the marks are formed, or after the traversal has completed in full or in part; the data processing device 20 then determines the mark center point according to the obtained coordinates of the first-type marks. If, within the 3×3 neighborhood, the current traversal pixel and the neighborhood pixels with equal gray values are all marked as the first type, then after the coordinates are submitted, the data processing device 20 selects one of the multiple pixels in that range, or the center of gravity of the multiple pixels, as the mark center point. If only one pixel within the 3×3 neighborhood is marked as the first type, the data processing device 20 can use the coordinate of that first-type mark directly as the mark center point.
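The center-of-gravity selection performed by the data processing device can be sketched as follows. This is an illustrative assumption of one reasonable implementation; the patent also allows simply picking one of the pixels instead of the centroid.

```python
def marker_center(coords):
    """Determine the mark center point from first-type mark coordinates.

    If a single coordinate was received it is used directly; otherwise the
    center of gravity (centroid) of the marked pixels is returned as a
    (possibly fractional) (y, x) pair.
    """
    coords = list(coords)
    if len(coords) == 1:
        return coords[0]                     # unique mark: use its coordinate directly
    n = len(coords)
    cy = sum(y for y, _ in coords) / n       # mean row coordinate
    cx = sum(x for _, x in coords) / n       # mean column coordinate
    return (cy, cx)
```

Returning a fractional centroid gives sub-pixel precision when several equal-gray pixels share the mark, which is one motivation for the retain-all-marks variant.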
When the mark center point extraction device 10 transmits coordinates to the data processing device 20 after the marks have been produced, it can transmit either only the coordinates of the first-type marks, or the coordinates of both the first-type and second-type marks. The remaining pixels that did not pass the gray-threshold comparison can also all be given the second-type mark after the comparison and be transmitted to the data processing device 20 together with the first-type marks.
The mark center point extraction device 10 can be implemented with an FPGA device: the steps of selecting the qualifying pixels in the search region and then matching the selected pixels to determine the coordinates of the mark point centers are carried out in a pipelined fashion, which adds no extra processing time and therefore yields high processing efficiency.
Each embodiment in this specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts may be cross-referenced between them. Since the device embodiments are essentially similar to the method embodiments, their description is relatively brief; for relevant details, refer to the description of the method embodiments.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solution of the present invention. Although the present invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art will understand that the embodiments may still be modified, or some technical features replaced with equivalents, without departing from the spirit of the technical solution of the present invention; all such modifications shall fall within the scope of the technical solutions claimed by the present invention.

Claims (20)

1. A mark center point extraction method, including:
selecting a search region of preset size from a pixel acquisition region of an image, computing the average gray value of each pixel in the search region, and correcting the computed average gray value to obtain a gray threshold;
comparing the gray threshold with the gray value of each pixel in the search region, and determining all pixels in the search region whose gray value is less than the gray threshold;
traversing the determined pixels, and in the processing of each current traversal pixel, comparing the gray value of the current traversal pixel with the gray value of each neighborhood pixel respectively, the neighborhood pixels being each pixel of an adjacent region;
if the gray value of the current traversal pixel is less than the gray values of the neighborhood pixels, giving the current traversal pixel a first-type mark, so that in a subsequent flow the coordinates of the first-type marks are transmitted to a data processing device, which then determines the mark center point according to the obtained coordinates of the first-type marks.
2. The mark center point extraction method according to claim 1, wherein if the gray value of the current traversal pixel is greater than the gray value of any neighborhood pixel, the current traversal pixel is given a second-type mark or is left unmarked.
3. The mark center point extraction method according to claim 2, wherein if the gray value of the current traversal pixel equals the gray value of at least one minimum-gray pixel among the neighborhood pixels, the current traversal pixel is given a first-type mark, and the first-type marks of the neighborhood pixels whose gray values equal that of the current traversal pixel are cancelled or retained.
4. The mark center point extraction method according to any one of claims 1 to 3, wherein all pixels in the pixel acquisition region whose gray value equals the gray threshold also participate in the traversal processing.
5. The mark center point extraction method according to claim 2, wherein the first-type mark corresponds to a black image with gray value 0, and the second-type mark corresponds to a white image with gray value 255.
6. The mark center point extraction method according to claim 1, wherein the gray threshold of the search region is recalculated when the search region is changed.
7. The mark center point extraction method according to claim 6, wherein the size of the search region is set according to the image resolution, the center-point extraction error rate, the reserved cache size, and/or the number of search region changes, and the size and range of the adjacent region are determined according to the image resolution and/or the center-point extraction error rate.
8. The mark center point extraction method according to claim 7, wherein the search region is a 5×12 rectangular area, and the pixels of the adjacent region are the 8 pixels surrounding the current traversal pixel.
9. The mark center point extraction method according to claim 1, wherein the operation of correcting the computed average gray value to obtain the gray threshold specifically comprises:
performing a linear computation on the computed average gray value with a preset gray correction value and/or gray correction coefficient, and taking the result as the gray threshold.
10. A mark center point extraction device, including:
a search region selection unit, for selecting a search region of preset size from a pixel acquisition region of an image;
an average gray computation unit, for computing the average gray value of each pixel in the search region;
a gray threshold computation unit, for correcting the computed average gray value to obtain a gray threshold;
a gray threshold comparison unit, for comparing the gray threshold with the gray value of each pixel in the search region;
a traversal pixel determination unit, for determining all pixels in the search region whose gray value is less than the gray threshold;
a pixel traversal processing unit, for traversing the determined pixels and, in the processing of each current traversal pixel, comparing the gray value of the current traversal pixel with the gray value of each neighborhood pixel respectively, the neighborhood pixels being each pixel of an adjacent region, and, if the gray value of the current traversal pixel is less than the gray values of the neighborhood pixels, giving the current traversal pixel a first-type mark;
a pixel coordinate transmission unit, for transmitting the coordinates of the first-type marks to a data processing device, which then determines the mark center point according to the obtained coordinates of the first-type marks.
11. The mark center point extraction device according to claim 10, wherein the pixel traversal processing unit is further configured to, when the gray value of the current traversal pixel is greater than the gray value of any neighborhood pixel, give the current traversal pixel a second-type mark or leave the current traversal pixel unmarked.
12. The mark center point extraction device according to claim 11, wherein the pixel traversal processing unit is further configured to, when the gray value of the current traversal pixel equals the gray value of at least one minimum-gray pixel among the neighborhood pixels, give the current traversal pixel a first-type mark and cancel or retain the first-type marks of the neighborhood pixels whose gray values equal that of the current traversal pixel.
13. The mark center point extraction device according to claim 10, wherein the pixel traversal processing unit is further configured to apply the traversal processing to all pixels in the pixel acquisition region whose gray value equals the gray threshold.
14. The mark center point extraction device according to claim 11, wherein the first-type mark corresponds to a black image with gray value 0, and the second-type mark corresponds to a white image with gray value 255.
15. The mark center point extraction device according to claim 10, wherein the gray threshold of the search region is recalculated when the search region is changed.
16. The mark center point extraction device according to claim 15, wherein the size of the search region is set according to the image resolution, the center-point extraction error rate, the reserved cache size, and/or the number of search region changes, and the size and range of the adjacent region are determined according to the image resolution and/or the center-point extraction error rate.
17. The mark center point extraction device according to claim 16, wherein the search region is a 5×12 rectangular area, and the pixels of the adjacent region are the 8 pixels surrounding the current traversal pixel.
18. The mark center point extraction device according to claim 10, wherein the gray threshold computation unit specifically includes:
a gray correction component, for performing a linear computation on the computed average gray value with a preset gray correction value and/or gray correction coefficient, and taking the result as the gray threshold.
19. An image processing system including the mark center point extraction device according to any one of claims 10 to 18, further including:
a data processing device, for determining the mark center point according to the coordinates of the first-type marks obtained from the mark center point extraction device.
20. The image processing system according to claim 19, wherein the mark center point extraction device is implemented by an FPGA.
CN201110414925.7A 2011-12-13 2011-12-13 Mark center point extracting method, device and image processing system Active CN103164702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110414925.7A CN103164702B (en) 2011-12-13 2011-12-13 Mark center point extracting method, device and image processing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110414925.7A CN103164702B (en) 2011-12-13 2011-12-13 Mark center point extracting method, device and image processing system

Publications (2)

Publication Number Publication Date
CN103164702A CN103164702A (en) 2013-06-19
CN103164702B true CN103164702B (en) 2016-09-28

Family

ID=48587774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110414925.7A Active CN103164702B (en) 2011-12-13 2011-12-13 Mark center point extracting method, device and image processing system

Country Status (1)

Country Link
CN (1) CN103164702B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593862A (en) * 2013-11-21 2014-02-19 广东威创视讯科技股份有限公司 Image display method and control unit
CN106338335A (en) * 2016-11-07 2017-01-18 西华大学 Image based online sleeper vibration monitoring method
CN108956639B (en) * 2018-06-13 2021-10-01 广东美的智能机器人有限公司 Pipe fitting detection method and pipe fitting detection device
CN113167628A (en) * 2018-12-03 2021-07-23 比奥-拉德实验室公司 Liquid level determination
CN109657672A (en) * 2018-12-20 2019-04-19 上海曼恒数字技术股份有限公司 Space-location method, device, equipment and storage medium
CN109558927A (en) * 2018-12-26 2019-04-02 上海钦轩网络科技有限公司 A kind of art two-dimensional code generation method and device
CN109840546B (en) * 2019-01-04 2023-02-03 南方医科大学南方医院 Method, system and storage medium for identifying mark and matching information of X-ray image
CN109903216B (en) * 2019-01-23 2022-12-23 武汉精立电子技术有限公司 System and method for realizing positioning image dot matrix extraction based on FPGA platform
CN110543798B (en) * 2019-08-12 2023-06-20 创新先进技术有限公司 Two-dimensional code identification method and device
CN112581536B (en) * 2019-09-30 2022-06-17 华中科技大学 OLED mobile phone screen pixel positioning method based on region growing
CN111476795A (en) * 2020-02-27 2020-07-31 浙江工业大学 Binary icon notation method based on breadth-first search
CN113538479B (en) * 2020-04-20 2023-07-14 深圳市汉森软件有限公司 Image edge processing method, device, equipment and storage medium
CN115204383A (en) * 2021-04-13 2022-10-18 北京三快在线科技有限公司 Training method and device for central point prediction model
CN116246265B (en) * 2023-05-12 2023-08-08 威海凯思信息科技有限公司 Property management method and device based on image processing
CN117333803B (en) * 2023-10-09 2024-03-12 广东科驭科技有限公司 Illumination operation and maintenance method and system based on image recognition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963669A (en) * 1997-01-02 1999-10-05 Ncr Corporation Method of extracting relevant character information from gray scale image data for character recognition
JP4302854B2 (en) * 2000-04-06 2009-07-29 富士フイルム株式会社 Image processing method and apparatus, and recording medium
CN101645742A (en) * 2009-09-04 2010-02-10 中国科学院上海技术物理研究所 Tracking system of satellite-ground quantum communication link direction
CN102043940A (en) * 2009-10-14 2011-05-04 北大方正集团有限公司 Method and device for reading two-dimensional code symbol data

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963669A (en) * 1997-01-02 1999-10-05 Ncr Corporation Method of extracting relevant character information from gray scale image data for character recognition
JP4302854B2 (en) * 2000-04-06 2009-07-29 富士フイルム株式会社 Image processing method and apparatus, and recording medium
CN101645742A (en) * 2009-09-04 2010-02-10 中国科学院上海技术物理研究所 Tracking system of satellite-ground quantum communication link direction
CN102043940A (en) * 2009-10-14 2011-05-04 北大方正集团有限公司 Method and device for reading two-dimensional code symbol data

Also Published As

Publication number Publication date
CN103164702A (en) 2013-06-19

Similar Documents

Publication Publication Date Title
CN103164702B (en) Mark center point extracting method, device and image processing system
CN111325203B (en) American license plate recognition method and system based on image correction
CN106599792B (en) Method for detecting hand driving violation behavior
US6830197B2 (en) Compact matrix code and one-touch device and method for code reading
CN1954339B (en) Methods and systems for converting images from low dynamic range to high dynamic range
CN102598648B (en) Printed medium, information processing method, information processor
CN102682301B (en) Adaptation for clear path detection with additional classifiers
CN104899870A (en) Depth estimation method based on light-field data distribution
CN101929867A (en) Clear path detection using road model
CN114155527A (en) Scene text recognition method and device
CN105654015A (en) Systems and methods for decoding two-dimensional matrix symbols
CN106096610A (en) A kind of file and picture binary coding method based on support vector machine
CN102739911B (en) Image processing apparatus and image processing method
CN105574542A (en) Multi-vision feature vehicle detection method based on multi-sensor fusion
CN103942762A (en) Two-dimension code preprocessing method and device
CN110751646A (en) Method and device for identifying damage by using multiple image frames in vehicle video
JP5288691B2 (en) Two-dimensional code reading program
CN107395996A (en) The system and method for determining and adjusting camera parameter using more gain images
CN111340810A (en) Intelligent evaluation method for Chinese character writing quality
CN109816041A (en) Commodity detect camera, commodity detection method and device
CN110084327A (en) Bill Handwritten Digit Recognition method and system based on the adaptive depth network in visual angle
CN109947110A (en) Lane self-checking algorithm assemblage on-orbit control method and system towards automatic Pilot
CN102262733B (en) Laser point detection method and apparatus thereof
CN101609508A (en) Sign structure and recognition methods to object identification and orientation information calculating
EP1417636B1 (en) Device, method and computer program for position determination

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
ASS Succession or assignment of patent right

Owner name: BEIJING HUIYAN ZHIXING TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: LI WEIWEI

Effective date: 20130923

C41 Transfer of patent application or patent right or utility model
C53 Correction of patent of invention or patent application
CB03 Change of inventor or designer information

Inventor after: Deng Wei

Inventor after: Li Weiwei

Inventor after: Ke Youyong

Inventor after: Wang Minle

Inventor before: Li Weiwei

COR Change of bibliographic data

Free format text: CORRECT: INVENTOR; FROM: LI WEIWEI TO: DENG WEI LI WEIWEI KE YOUYONG WANG MINLE

TA01 Transfer of patent application right

Effective date of registration: 20130923

Address after: 100093 Haidian District apricot stone road, No. 10102, room 1, building 99

Applicant after: BEIJING HUIYAN ZHIXING TECHNOLOGY Co.,Ltd.

Address before: 100093 room 2522, block B, apricot Road, No. 99, Beijing, Haidian District

Applicant before: Li Weiwei

C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
PP01 Preservation of patent right
PP01 Preservation of patent right

Effective date of registration: 20180709

Granted publication date: 20160928

PD01 Discharge of preservation of patent
PD01 Discharge of preservation of patent

Date of cancellation: 20210709

Granted publication date: 20160928

PP01 Preservation of patent right
PP01 Preservation of patent right

Effective date of registration: 20220930

Granted publication date: 20160928