CN104184918A - Image processing apparatus and method, and upper-build image scanning apparatus - Google Patents

Image processing apparatus and method, and upper-build image scanning apparatus

Info

Publication number
CN104184918A
CN104184918A (application number CN201310201320.9A)
Authority
CN
China
Prior art keywords
image
reference line
edge
color
clutter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310201320.9A
Other languages
Chinese (zh)
Other versions
CN104184918B (en)
Inventor
He Yuan
Sun Jun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to CN201310201320.9A priority Critical patent/CN104184918B/en
Publication of CN104184918A publication Critical patent/CN104184918A/en
Application granted granted Critical
Publication of CN104184918B publication Critical patent/CN104184918B/en
Expired - Fee Related
Anticipated expiration

Abstract

The invention relates to an image processing apparatus, an image processing method, and an overhead image scanning apparatus containing the image processing apparatus. The image processing apparatus comprises: a reference line determination unit for determining a reference line that assists in extracting an edge of a foreground object in an image to be processed; and an interference edge identification unit for identifying, according to the reference line determined by the reference line determination unit, an interference edge present in the image. An interference edge is an edge that is caused by a background object in the image and interferes with the edge extraction of the foreground object in the image.

Description

Image processing apparatus and method, and overhead image scanning device
Technical field
The present disclosure relates to image processing techniques, and more specifically to an image processing apparatus, an image processing method, and an overhead image scanning device comprising the image processing apparatus.
Background technology
At present, techniques for capturing, from the vertical direction, an image of an object placed on a carrier are known. The image acquisition device used in such techniques is commonly called an overhead image scanning device. An overhead image scanning device is a contactless imaging device; compared with a traditional contact-type flatbed or drum image scanning device, it can scan three-dimensional objects. For example, an overhead image scanning device can scan a thick book without the book having to be separated into single pages for scanning.
Here, for convenience of description, the image that the overhead image scanning device obtains by scanning an object (for example, a book) is called a scan image, the page image of the book presented in the scan image is called the foreground object, and the part of the scan image other than the foreground object is called the background object.
Owing to the three-dimensional curved-surface imaging of a book, the page image obtained by scanning exhibits obvious distortion, so it is necessary to apply a surface-deformation correction operation to the obtained page image. Correction algorithms based on the page edges are known in the art; because these algorithms place no restriction on the content of the obtained page image, they are widely applicable.
A common overhead image scanning device usually requires the book to be placed directly on the user's desktop. However, factors such as the color and texture of the desktop and reflections caused by the ambient illumination may interfere with the automatic extraction of the page edges. Therefore, to improve the accuracy of automatic page-edge extraction, a simple and feasible scheme is to place on the desktop a solid-color pad whose color is identical to that of the base of the overhead image scanning device, and then place the book on the pad, so that the background of the page image serving as the foreground object in the scan image is controlled and the precision of page-edge extraction is ensured.
However, because the base of the overhead image scanning device and the pad are separate parts, the edges of the base and the pad presented in the scan image may interfere with the edge extraction of the page image serving as the foreground object.
Summary of the invention
A brief summary of the present disclosure is given below in order to provide a basic understanding of some aspects of the disclosure. It should be appreciated that this summary is not an exhaustive overview of the disclosure. It is not intended to identify key or critical parts of the disclosure, nor to limit its scope. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description discussed later.
In view of the above state of the prior art, an object of the present disclosure is to provide, for the scanning scene of an overhead image scanning device that employs the above-mentioned pad, an image processing apparatus and method that can effectively identify the interference that edges of background objects in the scan image cause to the edge extraction of the foreground object.
According to one aspect of the present disclosure, an image processing apparatus is provided, comprising: a reference line determination unit for determining, in an image to be processed, a reference line that assists in extracting the edge of the foreground object in the image; and an interference edge recognition unit for identifying, according to the reference line determined by the reference line determination unit, an interference edge present in the image, where an interference edge is an edge caused by a background object in the image that interferes with the edge extraction of the foreground object in the image.
According to another aspect of the present disclosure, an overhead image scanning device comprising the above image processing apparatus is provided.
According to another aspect of the present disclosure, an image processing method is provided, comprising: determining, in an image to be processed, a reference line that assists in extracting the edge of the foreground object in the image; and identifying, according to the determined reference line, an interference edge present in the image, where an interference edge is an edge caused by a background object in the image that interferes with the edge extraction of the foreground object in the image.
In addition, embodiments of the present disclosure also provide a computer program for implementing the above image processing method.
In addition, embodiments of the present disclosure also provide a computer program product in at least the form of a computer-readable medium, on which computer program code for implementing the above image processing method is recorded.
Brief description of the drawings
The above and other objects, features, and advantages of the present disclosure will be more easily understood with reference to the following description of embodiments of the disclosure taken in conjunction with the accompanying drawings. The components in the drawings are not drawn to scale and merely illustrate the principles of the disclosure. In the drawings, identical or similar technical features or components are denoted by identical or similar reference signs.
Figs. 1a and 1b are schematic diagrams showing, respectively, the ideal placement and the staggered placement of the base and the pad of an overhead image scanning device;
Figs. 2a and 2b show the scan images obtained in the two situations schematically shown in Figs. 1a and 1b, respectively;
Fig. 3a shows the gradient map obtained by applying a gradient-map conversion operation to the upper part of the scan image shown in Fig. 2a;
Fig. 3b shows the edge-extraction result obtained by applying an edge-extraction operation to the gradient map of Fig. 3a;
Fig. 4a shows the gradient map obtained by applying the gradient-map conversion operation to the upper part of the scan image shown in Fig. 2b;
Fig. 4b shows the edge-extraction result obtained by applying the edge-extraction operation to the gradient map of Fig. 4a;
Fig. 5 is a schematic block diagram of an image processing apparatus according to an embodiment of the present disclosure;
Fig. 6 schematically shows the interference edges appearing in the staggered placement of the base and the pad;
Fig. 7 is a partial enlarged view, in the staggered placement, of the base, the pad, and the gap between them;
Figs. 8a and 8b schematically show the gradient map before and after the removal processing of the first interference edge, respectively;
Fig. 9a shows the directional gradient map obtained by applying the gradient-map conversion to the scan image shown in Fig. 7;
Fig. 9b shows the diagram obtained by projecting the maximal-gradient-sum line shown in Fig. 9a back into the scan image;
Fig. 10 schematically shows the gradient map after the removal processing of the second interference edge;
Fig. 11 is a schematic diagram of the scan image after the first and second interference edges have been removed;
Fig. 12 is a flow chart of the processing of an image processing method according to the present disclosure;
Fig. 13 is a flow chart showing details of an embodiment of the image processing method illustrated in Fig. 12; and
Fig. 14 is a structural diagram of a general-purpose computer system that can be used to implement the apparatus and method according to embodiments of the present disclosure.
Embodiment
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings. For clarity and conciseness, not all features of an actual implementation are described in the specification. It should be understood, however, that in developing any such actual implementation, many implementation-specific decisions must be made to achieve the developer's specific goals, and that these decisions may vary from one implementation to another.
It should also be noted that, to avoid obscuring the disclosure with unnecessary detail, only the device structures closely related to the scheme of the present disclosure are shown in the drawings, and other details of little relevance to the disclosure are omitted.
Figs. 1a and 1b are schematic diagrams showing, respectively, the ideal placement and the staggered placement of the base and the pad of an overhead image scanning device. It is assumed here that the base and the pad both have a uniform, solid color, and that their colors are identical. The ideal placement of the base and the pad is shown in Fig. 1a; this situation is the most favorable for obtaining the edges of the foreground object in the image. However, because the base and the pad are placed manually and are two separate pieces of equipment, they may be staggered when placed, so that a gap appears between them and the desktop shows through in the obtained scan image. The situation in which a gap exists between the base and the pad is shown in Fig. 1b. In this case, both the upper and the lower edge of the gap may interfere with the extraction of the page edges in the scan image.
Figs. 2a and 2b show the scan images of a book obtained in the two situations schematically shown in Figs. 1a and 1b, respectively. As shown in Fig. 2a, in the ideal placement the gap between the base and the pad is not visible in the scan image. In the scan image shown in Fig. 2b, however, a gap appears between the base and the pad. It should be noted that, although a book is used herein as an example of the object to be scanned, those skilled in the art will appreciate that the scanned object is not limited to books; the technical scheme of the present disclosure is applicable to any object whose scan image requires edge extraction.
Fig. 3a shows the gradient map obtained by applying a gradient-map conversion operation to the upper part of the scan image shown in Fig. 2a. By applying predetermined processing to the obtained gradient map, the positions of the edges can be obtained. The edge positions are then projected back into the original scan image to determine the edges appearing in it. Fig. 3b shows the edge-extraction result obtained by applying an edge-extraction operation to the gradient map of Fig. 3a.
Similarly, Fig. 4a shows the gradient map obtained by applying the gradient-map conversion operation to the upper part of the scan image shown in Fig. 2b. The same predetermined processing yields the edge positions, which are projected back into the original scan image. Fig. 4b shows the edge-extraction result obtained by applying the edge-extraction operation to the gradient map of Fig. 4a.
For the edge-extraction operation performed on the scan images shown in Figs. 2a and 2b, the most common method is, as described above, to first convert the scan image into a gradient map and then process the gradient map to obtain the edges in the image. Those skilled in the art will note that, besides this method, many image processing methods known in the art can be used to perform the edge-extraction operation.
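The gradient-map-then-projection pipeline described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's exact algorithm: the scan image is assumed to be a 2-D grayscale NumPy array, a simple row-to-row difference stands in for the gradient-map conversion, and the function names are invented for the example.

```python
import numpy as np

def vertical_gradient_map(image):
    """Absolute row-to-row difference as a simple vertical gradient map."""
    img = image.astype(np.int32)
    grad = np.zeros_like(img)
    grad[1:, :] = np.abs(img[1:, :] - img[:-1, :])
    return grad

def candidate_edge_rows(grad, threshold):
    """Rows whose summed gradient exceeds a threshold are edge candidates."""
    row_sums = grad.sum(axis=1)
    return [y for y, s in enumerate(row_sums) if s > threshold]

if __name__ == "__main__":
    # Dark pad (50) above a bright page (200): one strong horizontal edge.
    image = np.full((10, 8), 50, dtype=np.uint8)
    image[5:, :] = 200
    grad = vertical_gradient_map(image)
    print(candidate_edge_rows(grad, threshold=100))  # row 5 stands out
```

The candidate rows found in the gradient map correspond to horizontal edges; projecting them back into the scan image amounts to reusing the same row indices there.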
As can be seen from Figs. 4a and 4b, the gap appearing between the base and the pad of the overhead image scanning device can interfere with the extraction of the page edges in the scan image. As shown in Fig. 4a, owing to the gap between the base and the pad, the extracted edges include the lower edge of the base, the upper edge of the pad, and the upper edge of the page of the book; among these, the lower edge of the base and the upper edge of the pad are edges that disturb the edge extraction of the page image. When a gap exists between the base and the pad, the upper edge of the pad (i.e. the lower edge of the gap) and the lower edge of the base (i.e. the upper edge of the gap) may be identified as edges of the page image, thereby interfering with its edge extraction. In the example shown in Fig. 4b, the lower edge of the base, i.e. the upper edge of the gap, is mistakenly identified as an edge of the page image. In the present disclosure, an edge that may interfere with the edge extraction of the foreground image is called an "interference edge".
To address the above problem, an image processing apparatus is proposed herein that can identify the edges of the gap that may exist between the base and the pad in a scan image. As a preferred implementation, subsequent processing can, as actually needed, eliminate the influence of the identified edges while retaining the true edges of the page image, thereby ensuring that the edge extraction of the page image, i.e. the foreground object, is free of interference. In the examples of the present disclosure, the images of the base and the pad present in the scan image are examples of the background image.
Fig. 5 shows a schematic block diagram of an image processing apparatus 100 according to an embodiment of the present disclosure. As shown in Fig. 5, the image processing apparatus 100 according to this embodiment comprises a reference line determination unit 102 for determining a reference line in the scan image. The reference line assists in extracting the edge of the foreground object in the scan image. The image processing apparatus 100 further comprises an interference edge recognition unit 104 for identifying, according to the reference line determined by the reference line determination unit 102, an interference edge present in the scan image. An interference edge is an edge caused by a background object in the scan image that interferes with the edge extraction of the foreground object.
The structure and concrete operations of the image processing apparatus 100 according to this embodiment are described below through concrete examples.
As shown in Fig. 6, when a gap exists between the base and the pad of the overhead image scanning device, the interference edges caused by the gap can be divided into the following two classes:
1) the first interference edge, defined as the edge between the gap and the pad, i.e. the lower edge of the gap; and
2) the second interference edge, defined as the edge between the gap and the base, i.e. the upper edge of the gap.
Those skilled in the art will appreciate that the classification of interference edges is not limited to this; interference edges may be classified differently depending on actual conditions, and any edge that may interfere with the edge extraction of the foreground object can be classified as an interference edge.
The concrete operations for identifying the first interference edge and the second interference edge with an image processing apparatus according to the present disclosure are described below in turn.
The concrete operations for identifying the first interference edge are described first. Let the image obtained by scanning with the overhead image scanning device be I(x, y), where I(x, y) denotes the color of the pixel at coordinates (x, y) in the scan image. Let the gradient map obtained by applying the gradient-map conversion operation to the scan image I(x, y) be G(x, y), where G(x, y) denotes the gradient of the pixel at coordinates (x, y) in the gradient map.
Here it is assumed that the colors of the pad and the base are both known; this is taken as the first color of the background objects, denoted I0 = (r0, g0, b0), where r0, g0, and b0 are the R (red), G (green), and B (blue) color values, respectively.
Furthermore, in the staggered placement of the pad and the base, the background objects in the scan image include, besides the base and the pad, the gap between them; the color of the gap is taken as the second color of the background objects, denoted I1 = (r1, g1, b1), where r1, g1, and b1 are the R (red), G (green), and B (blue) color values, respectively. The second color of the background objects is unknown.
First, the reference line determination unit 102 determines the first reference line, which is determined according to first prior information about the background objects in the scan image. The first prior information is information about the approximate position of the lower edge of the base in the scan image. Those skilled in the art will understand that this first prior information can be obtained through a limited number of experiments or based on empirical values; the details are not repeated here. As shown in Fig. 7, suppose the lower edge of the base extends approximately from (x0, y0) to (x1, y0); the first reference line can then be set as the straight line with ordinate y0 shown in Fig. 7, where x0, x1, and y0 are known.
It should be explained here that the selected first reference line is not the accurate position of the lower edge of the base in the scan image; its purpose is to select a reasonable initial value for the search for first noise points described below. Those skilled in the art will appreciate that the setting and choice of the first reference line are not limited to the above example; any information related to the background objects that may cause interference edges can be used to determine the first reference line. For example, in the present embodiment, the first reference line may also be chosen as any horizontal line within the gap between the base and the pad in the scan image.
Since the pad and the base may not be parallel to each other, each pixel on the first interference edge needs to be identified individually. The noise points forming the first interference edge are called first noise points.
Then, the interference edge recognition unit 104 scans, pixel by pixel and in a predetermined order along the first reference line, a predetermined region of the scan image containing the first reference line.
Here, the predetermined region containing the first reference line is a region determined based on second prior information about the background objects in the scan image; the second prior information is information about the region in which the first interference edge may appear. Those skilled in the art will understand that this second prior information can be obtained through a limited number of experiments or based on empirical values.
Specifically, in the actual use of an overhead image scanning device, the gap between the base and the pad is usually not caused intentionally, but unintentionally when the user places the base and the pad. It can therefore be assumed that when the user notices a large gap between the base and the pad, the user will actively adjust them. Accordingly, the maximum width of the gap that may appear between the base and the pad in the scan image can be assumed to be Gmax; when the gap width exceeds Gmax, the user will actively adjust.
That is to say, the predetermined region can be set as the whole region enclosed by the pixels (x0, y0), (x1, y0), (x0, y0+Gmax), and (x1, y0+Gmax). Those skilled in the art will appreciate that the purpose of setting this predetermined region is to simplify the processing and reduce the computational cost; without setting this predetermined region, the interference edge recognition unit 104 may also scan the whole scan image pixel by pixel.
In this embodiment, the interference edge recognition unit 104 scans the predetermined region containing the first reference line point by point along the first reference line, in left-to-right, top-to-bottom order. Those skilled in the art will appreciate that the search order given here is only an example; in fact, the search order can be arbitrary, as long as every pixel in the search range is traversed.
For each scanned pixel, the interference edge recognition unit 104 sets, in the vertical direction, a first run comprising a plurality of pixels extending downward from that pixel as the starting point. If the color difference between each pixel of the first run starting at a certain pixel and the first color of the background objects is less than a first color-difference threshold, that pixel is determined to be a first noise point. The set of the determined first noise points is the first interference edge.
Specifically, referring to Fig. 7, for each pixel {(x*, y0), x0 < x* < x1} on the first reference line, a downward search is performed to find the first noise points on the first interference edge that may exist. The search range is the above-mentioned predetermined region containing the first reference line, i.e. the region enclosed by the pixels (x0, y0), (x1, y0), (x0, y0+Gmax), and (x1, y0+Gmax).
During the search, for each pixel (x*, y*) in the search range, if the color differences between all pixels of the pixel run (i.e. the first run) extending downward from that pixel as the starting point and the first color of the background objects (i.e. the color of the base and the pad) are all less than the first color-difference threshold T1, the pixel (x*, y*) is considered a first noise point on the first interference edge. This can be expressed by the following formula (1):
diff(I(x*, y*+i), I0) < T1,  i = 0, 1, 2, …, t1    (1)
where diff(·, ·) is the difference between two RGB color values, T1 is the first color-difference threshold, and t1 is the first run length. The first color-difference threshold T1 and the first run length t1 are predefined constants. Those skilled in the art will understand that T1 and t1 can be obtained through a limited number of experiments or based on empirical values. In particular, factors such as the resolution of the image scanning device, the resolution required by the practical application, and the time required for page-edge extraction may also affect the setting of T1 and t1.
The criterion described above for judging whether a pixel (x*, y*) is a first noise point on the first interference edge has a clear physical meaning: if a pixel (x*, y*) satisfying the above decision condition is found, the pixel (x*, y*) is considered to lie on the first interference edge, and all pixels below it belong to the pad.
The set of all determined first noise points constitutes the first interference edge.
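Under the assumptions above, the downward-run test of formula (1) can be sketched in Python. This is an illustrative sketch, not the patent's exact implementation: the image is assumed to be an H x W x 3 RGB NumPy array, the sum of absolute RGB differences is one plausible choice of diff(·, ·), and the function names (`color_diff`, `find_first_noise_points`) are invented for the example.

```python
import numpy as np

def color_diff(c0, c1):
    """Sum of absolute RGB differences: one possible diff() for formula (1)."""
    return int(np.abs(c0.astype(np.int32) - c1.astype(np.int32)).sum())

def find_first_noise_points(image, region, base_color, T1, t1):
    """Scan region = (x0, x1, y0, y_max) column by column; a pixel (x, y) is a
    first noise point if every pixel in the downward run of length t1+1
    starting at it differs from the base/pad color by less than T1."""
    x0, x1, y0, y_max = region
    points = []
    for x in range(x0, x1):
        for y in range(y0, y_max):
            run = image[y:y + t1 + 1, x]
            if len(run) == t1 + 1 and all(
                color_diff(p, base_color) < T1 for p in run
            ):
                points.append((x, y))
                break  # first match per column, as in the top-down scan
    return points

if __name__ == "__main__":
    pad = np.array([40, 40, 40], dtype=np.uint8)     # known base/pad color I0
    image = np.full((20, 4, 3), 200, dtype=np.uint8)  # bright desk/gap region
    image[10:, :, :] = pad                            # pad begins at row 10
    print(find_first_noise_points(image, (0, 4, 5, 15), pad, T1=30, t1=3))
```

In this toy example each column's first noise point lands on row 10, where the pad-colored run begins, mirroring how the set of matches traces the lower edge of the gap.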
Preferably, the interference edge recognition unit 104 may subsequently remove the image gradients at the first noise points. For example, the interference edge recognition unit 104 may set the image gradients of several pixels above each found first noise point to white. This can be expressed by the following formula (2):
G(x*,y*-i)=255,i=0,1,2,…,w1 (2)
where setting G(·) to 255 adjusts the gradient-map value of the pixel to white, and w1 is a predefined integer constant representing the width of the first interference edge in the gradient map. Those skilled in the art will understand that the value of w1 can be obtained through a limited number of experiments or based on empirical values; the details are not repeated here.
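The whitening step of formula (2) can be sketched as follows, assuming the gradient map is a 2-D NumPy array in which 255 means "no edge response"; the helper name `erase_first_edge` is invented for the example.

```python
import numpy as np

def erase_first_edge(grad, noise_points, w1):
    """Set the w1+1 gradient pixels at and above each noise point to 255,
    as in formula (2): G(x*, y*-i) = 255 for i = 0..w1."""
    out = grad.copy()
    for x, y in noise_points:
        top = max(0, y - w1)       # clamp at the image border
        out[top:y + 1, x] = 255
    return out

if __name__ == "__main__":
    grad = np.zeros((6, 3), dtype=np.uint8)
    cleaned = erase_first_edge(grad, [(1, 4)], w1=2)
    print(cleaned[:, 1])  # rows 2..4 of column 1 are whitened
```

Only the columns that actually contain first noise points are touched, so genuine page-edge responses elsewhere in the gradient map are preserved.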
It should be explained that, after the first interference edge has been identified, its influence may be eliminated in the scan image as in the preferred embodiment above; alternatively, the first interference edge may be left unprocessed and the information merely saved for use in subsequent further processing. For example, after a candidate edge has been identified, it can be compared with the saved first interference edge to judge whether that candidate edge is an interference edge.
Figs. 8a and 8b schematically show the gradient map before and after the removal processing of the first interference edge, respectively. After the first interference edge is removed, the gradient map of the original scan image shown in Fig. 8a becomes the gradient map shown in Fig. 8b, in which the first interference edge of the gap between the base and the pad, which interferes with the edge extraction of the page image, has been removed.
It should be noted that, in the actual use of an overhead image scanning device, the following situation may occur: the base, the pad, and the book to be scanned may be placed such that the upper edge of the book covers the upper edge of the pad, so that the scan image presents a gap between the base and the book rather than between the base and the pad. In this case the first interference edge does not exist; in other words, the above search operation will find no first noise point satisfying the decision condition. It then suffices to identify only the second interference edge of the gap between the base and the pad in the scan image.
Next, the concrete operations for identifying the second interference edge are described.
First, the reference line determination unit 102 determines the second reference line. Specifically, the reference line determination unit 102 converts the scan image into a gradient map in the vertical direction. This gradient map is directional; that is, for any pixel in this gradient map, the gradient of the pixel is enhanced only when the color of the pixel above it is the first color of the background objects (the color of the base and the pad) and the color of the pixel below it is not the first color of the background objects.
Subsequently, the reference line determination unit 102 determines the horizontal line with the maximal gradient sum, and judges whether the positional difference between the line in the scan image corresponding to this line and the above-mentioned first reference line is less than a positional-difference threshold. If so, the corresponding line in the scan image is determined to be the second reference line. Those skilled in the art will appreciate that the positional-difference threshold can be obtained through a limited number of experiments or based on empirical values; the details are not repeated here.
In this embodiment, the position of the second reference line is regarded as the relatively accurate position, in the scan image, of the lower edge of the base of the overhead image scanning device. It should be appreciated that, since it is not known in advance whether a gap exists between the base and the pad, when no gap exists the position in the scan image corresponding to the maximal-gradient-sum line is not necessarily the position of the lower edge of the base. Its position can therefore be compared with that of the first reference line described above. Since the position of the first reference line can be regarded as the approximate position of the lower edge of the base in the scan image, only when the positional difference between the position corresponding to the maximal-gradient-sum line and the position of the first reference line is less than the positional-difference threshold is the corresponding position of the horizontal maximal-gradient-sum line considered the relatively accurate position of the lower edge of the base.
Fig. 9a shows the directional gradient map obtained by applying the gradient-map conversion to the scan image shown in Fig. 7. For clarity of description, Fig. 9a shows only the partial image relevant to identifying the second interference edge. Here, the vertical-direction gradient map computed from the original scan image I(x, y) is VG(x, y), and the line shown is the horizontal line with the maximal gradient sum; VG(x, y) denotes the vertical gradient of the pixel at coordinates (x, y) in the vertical-direction gradient map. Fig. 9b shows the diagram obtained by projecting the horizontal maximal-gradient-sum line shown in Fig. 9a back into the scan image. In Fig. 9b, the second reference line determined in the above manner is assumed to extend from (x0, yb) to (x1, yb) in the scan image.
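The determination of the second reference line can be sketched as follows, under the assumption that the base/pad color and the thresholds are given. The directional test (above pixel close to the base color, below pixel not) and the color-difference measure are illustrative choices, and all names are invented for the example.

```python
import numpy as np

def directional_gradient(image, base_color, T1):
    """Vertical gradient map that responds only where the pixel above is the
    base/pad color and the pixel below is not (the directional condition)."""
    H, W, _ = image.shape
    vg = np.zeros((H, W), dtype=np.int32)
    img = image.astype(np.int32)
    for y in range(1, H):
        above_is_base = np.abs(img[y - 1] - base_color).sum(axis=1) < T1
        below_not_base = np.abs(img[y] - base_color).sum(axis=1) >= T1
        diff = np.abs(img[y] - img[y - 1]).sum(axis=1)
        vg[y] = np.where(above_is_base & below_not_base, diff, 0)
    return vg

def second_reference_row(vg, y_first, max_offset):
    """Row with the maximal gradient sum, accepted only if it lies close
    enough to the first reference line at ordinate y_first."""
    y_best = int(vg.sum(axis=1).argmax())
    return y_best if abs(y_best - y_first) < max_offset else None

if __name__ == "__main__":
    base = np.array([40, 40, 40])
    image = np.full((12, 5, 3), 40, dtype=np.uint8)   # base/pad color on top
    image[6:, :, :] = 180                             # desk/gap from row 6
    vg = directional_gradient(image, base, T1=30)
    print(second_reference_row(vg, y_first=5, max_offset=3))  # -> 6
```

Rejecting a maximal-gradient row that is too far from the first reference line mirrors the positional-difference check described above for the no-gap case.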
After the reference line determination unit 102 has determined the second reference line, the interference edge recognition unit 104 determines, based on the second reference line, the second color I1 of the background objects, i.e. the color of the gap between the base and the pad. The color of the gap between the base and the pad is generally considered to be the color of the desktop, on which the overhead image scanning device is placed, showing through. Without loss of generality, the color of the desktop can be assumed to be uniform and different from the color of the base and the pad. It is easily understood that, in this embodiment, the desktop is also an example of a background object.
Specifically, the clutter edge recognition unit 104 finds the pixels on the second reference line whose vertical gradient in the gradient map exceeds a gradient threshold. When the base and the backing plate are placed with a stagger, these found pixels can be considered to lie on the lower edge of the base. Those skilled in the art will appreciate that the gradient threshold can be obtained through a limited number of experiments or from empirical values. Taking each of these pixels as a starting point, a plurality of second runs each comprising a plurality of pixels are set downward in the vertical direction; that is, the starting point of a second run is its topmost pixel in the vertical direction. The color with the highest frequency of occurrence within these second runs is determined to be the second color I1 of the background object, i.e., the color of the gap between the base and the backing plate.
Specifically, among all pixels (x, yb) on the second reference line, those whose vertical gradient value in the vertical gradient map VG(x, y) exceeds the gradient threshold are found. The resulting set of pixels, denoted S_edge, can be expressed by the following formula (3):

S_edge = {(x, yb) | x = x0, ..., x1 and VG(x, yb) > TG}    (3)

where TG is the gradient threshold mentioned above.
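A minimal sketch of formula (3), with the function name assumed and the gradient map passed in as a NumPy array indexed as vg[y, x]:

```python
import numpy as np

def edge_pixel_set(vg, yb, x0, x1, tg):
    # Formula (3): pixels on the second reference line y = yb whose
    # vertical gradient exceeds the gradient threshold TG.
    return [(x, yb) for x in range(x0, x1 + 1) if vg[yb, x] > tg]
```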
For each pixel (x, yb) in the obtained set S_edge, a pixel run (i.e., a second run) RL_x = {(x, yb+i), i = 1, 2, ..., t2} is set downward in the vertical direction starting from that pixel, where t2 is the second run length. Those skilled in the art will understand that the second run length t2 can be obtained through a limited number of experiments or from empirical values; the details are not repeated here. In particular, factors such as the resolution of the image scanning apparatus, the resolution required by the practical application, and the time allowed for page edge extraction may also affect the setting of the second run length t2. The RGB color histogram of all pixels in these runs is computed, and the color with the highest frequency of occurrence is taken as the second color I1 of the background object, i.e., the color of the gap between the base and the backing plate.
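The run construction and dominant-color estimate above can be sketched as follows. The function name is assumed, and counting exact color tuples with a Counter stands in for the RGB color histogram mentioned in the text:

```python
import numpy as np
from collections import Counter

def estimate_gap_color(image, seeds, t2):
    # For each seed pixel (x, yb) in S_edge, walk a downward run of t2
    # pixels and tally the RGB colors; the most frequent color over all
    # runs is taken as the gap color I1.
    counts = Counter()
    height = image.shape[0]
    for x, yb in seeds:
        for i in range(1, t2 + 1):
            if yb + i < height:
                counts[tuple(int(c) for c in image[yb + i, x])] += 1
    return counts.most_common(1)[0][0] if counts else None
```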
Subsequently, the clutter edge recognition unit 104 determines the clutter edge based on the estimated second color I1 of the background object.
Specifically, for each second run, if the color difference between the color of every pixel in the run and the determined second color of the background object is less than a second color-difference threshold T2, the topmost pixel (x, yb) corresponding to that run is determined to be a second noise point on the second clutter edge. The set of second noise points so determined forms the second clutter edge. Those skilled in the art will understand that the second color-difference threshold T2 can be obtained through a limited number of experiments or from empirical values. In particular, factors such as the resolution of the image scanning apparatus, the resolution required by the practical application, and the time allowed for page edge extraction may also affect the setting of the second color-difference threshold T2.
In other words, for each pixel (x, yb) in the set S_edge, it is judged whether the pixel lies on the second clutter edge. When the color difference between the color value of every pixel in the second run RL_x starting from (x, yb) and the color I1 of the gap between the base and the backing plate is less than the second color-difference threshold T2, the pixel (x, yb) is considered a second noise point on the second clutter edge.
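A sketch of the noise-point decision just described. The function name is assumed, and the Euclidean RGB distance is one plausible choice of color difference; the text does not fix a particular metric:

```python
import numpy as np

def second_noise_points(image, seeds, gap_color, t2, t_color):
    # A seed (x, yb) from S_edge is a second noise point when every pixel
    # of its downward run of length t2 lies within the color-difference
    # threshold T2 of the estimated gap color I1.
    gap = np.asarray(gap_color, dtype=np.float64)
    height = image.shape[0]
    noise = []
    for x, yb in seeds:
        run = [image[yb + i, x].astype(np.float64)
               for i in range(1, t2 + 1) if yb + i < height]
        if run and all(np.linalg.norm(p - gap) < t_color for p in run):
            noise.append((x, yb))
    return noise
```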
The above criterion for judging whether a pixel (x, yb) is a second noise point on the second clutter edge has a clear physical meaning: if a pixel (x, yb) satisfying the criterion is found, the pixel is considered to lie on the second clutter edge, and all pixels below it belong to the gap.
As a preferred embodiment, the clutter edge recognition unit 104 may subsequently remove the image gradient at the second noise points. For example, the clutter edge recognition unit 104 may set to white the pixels above each of the found second noise points. This can be expressed by the following formula (4):

G(x, yb-i) = 255, i = 0, 1, 2, ..., w2    (4)

where G(·) denotes setting the color of the pixel to white, and w2 is a predefined integer constant representing the width of the second clutter edge in the gradient map. Those skilled in the art will understand that the value of w2 can be obtained through a limited number of experiments or from empirical values; the details are not repeated here. Those skilled in the art should also understand that the width w1 of the first clutter edge and the width w2 of the second clutter edge may be the same or different.
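Formula (4) can be sketched as follows, operating on the gradient map in place; the function name is an assumption:

```python
import numpy as np

def erase_second_clutter_edge(grad, noise_points, w2):
    # Formula (4): set the gradient-map pixels at each second noise point
    # and the w2 pixels above it to white (255), removing the second
    # clutter edge from the gradient map.
    for x, yb in noise_points:
        grad[max(0, yb - w2):yb + 1, x] = 255
    return grad
```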
It should be noted that, similarly, after the second clutter edge is identified, its influence may either be eliminated in the scanned image as in the preferred embodiment above, or the second clutter edge may be left unprocessed and merely recorded for use in subsequent processing. For example, after a candidate edge is extracted, it can be compared with the recorded second clutter edge to judge whether it is a clutter edge.
Figure 10 schematically shows the gradient map after the second clutter edge removal processing. After the second clutter edge is removed, the gradient map of Fig. 8b, from which the first clutter edge has already been removed, becomes the gradient map shown in Figure 10. As can be seen from Figure 10, both the second clutter edge caused by the gap between the base and the backing plate and the first clutter edge, which interfere with the edge extraction of the page image, have been removed.
At this point, the correct page edge can easily be extracted, as shown in Figure 11.
It should be noted here that, in practice, since it is not known in advance whether a gap exists between the base and the backing plate, the reference line determining unit 102 may fail to determine a second reference line, or the clutter edge recognition unit 104 may find no pixel satisfying the above criterion for a noise point on the second clutter edge. In such cases, it can be concluded that no gap exists between the base and the backing plate, i.e., the base and the backing plate are ideally placed.
In addition, although the description above identifies the first clutter edge before the second clutter edge, those skilled in the art should understand that the second clutter edge may instead be identified first, or the identification of the first clutter edge and the identification of the second clutter edge may be performed simultaneously.
The present disclosure also proposes an overhead image scanning apparatus comprising the image processing apparatus described above. With the overhead image scanning apparatus according to the present disclosure, the interference that edges of background objects in the scanned image cause to the edge extraction of the foreground object can be effectively identified, so that the edges of the foreground object image can be accurately extracted for subsequent processing.
Figure 12 shows a flowchart of the processing of the image processing method according to the present disclosure. The method starts at step S1202, in which a reference line is determined in the image to be processed; the reference line assists in extracting the edge of the foreground object in the image. Subsequently, in step S1204, the clutter edge present in the image is identified according to the determined reference line, the clutter edge being an edge caused by a background object in the image that interferes with the edge extraction of the foreground object in the image.
Figure 13 shows a flowchart detailing an embodiment of the image processing method illustrated in Figure 12. The processing starts at step S1302, in which the reference line determining unit 102 determines the first reference line. In step S1304, the clutter edge recognition unit 104 determines whether there exist pixels satisfying the criterion for a noise point on the first clutter edge. If such pixels exist ("Yes" in step S1304), the clutter edge recognition unit 104 removes them in step S1306, and the processing proceeds to step S1308. Otherwise ("No" in step S1304), the processing goes directly to step S1308.
In step S1308, the reference line determining unit 102 determines whether a second reference line exists. When no second reference line exists ("No" in step S1308), the processing ends. When a second reference line exists ("Yes" in step S1308), the clutter edge recognition unit 104 determines in step S1310 whether there exist pixels satisfying the criterion for a noise point on the second clutter edge. If such pixels exist ("Yes" in step S1310), the clutter edge recognition unit 104 removes them in step S1312, and the processing ends. Otherwise ("No" in step S1310), the processing ends directly.
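The control flow of steps S1302 through S1312 can be sketched as a plain driver. The five callables are hypothetical stand-ins for operations of the reference line determining unit and the clutter edge recognition unit:

```python
def identify_clutter_edges(find_first_line, first_noise_points, remove,
                           find_second_line, second_noise_points):
    # Driver following the Fig. 13 flowchart.
    find_first_line()                      # S1302
    points = first_noise_points()          # S1304
    if points:
        remove(points)                     # S1306
    if find_second_line():                 # S1308
        points = second_noise_points()     # S1310
        if points:
            remove(points)                 # S1312
```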
The processing of steps S1302 and S1308 is performed, for example, by the reference line determining unit 102 described above with reference to Fig. 5; for details, refer to the description above, which is not repeated here for brevity. Likewise, the processing of steps S1304, S1306, S1310 and S1312 is performed, for example, by the clutter edge recognition unit 104 described above with reference to Fig. 5; for details, refer to the description above.
The embodiments of the apparatus and/or method according to the embodiments of the present disclosure have been described in detail above through block diagrams, flowcharts and/or examples. When such block diagrams, flowcharts and/or examples contain one or more functions and/or operations, it will be apparent to those skilled in the art that each function and/or operation therein can be implemented, individually and/or jointly, by a wide variety of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several parts of the subject matter described in this specification can be realized by an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), or other integrated form. However, those skilled in the art will recognize that some aspects of the embodiments described in this specification can be implemented, wholly or in part, equivalently in integrated circuits, as one or more computer programs running on one or more computers (for example, as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (for example, as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof; and that, in light of the content disclosed in this specification, designing the circuitry and/or writing the code for the software and/or firmware of the present disclosure is entirely within the ability of those skilled in the art.
For example, the image processing apparatus 100 shown in Fig. 5 and all of its modules, the reference line determining unit 102 and the clutter edge recognition unit 104, can be configured by means of software, firmware, hardware, or any combination thereof. When implemented by software or firmware, a program constituting the software can be installed from a storage medium or a network onto a computer having a dedicated hardware structure (for example, the general-purpose computer 1400 shown in Figure 14), which, with the various programs installed, can perform the various functions described above.
Figure 14 shows a structural diagram of a general-purpose computer system that can be used to implement the method and apparatus according to the embodiments of the present disclosure. The computer system 1400 is merely an example and does not imply any limitation on the scope of application or functionality of the method and apparatus of the present disclosure. Nor should the computer system 1400 be interpreted as depending on or requiring any component, or combination of components, shown in the exemplary computer system 1400.
In Figure 14, a central processing unit (CPU) 1401 performs various processing according to a program stored in a read-only memory (ROM) 1402 or a program loaded from a storage section 1408 into a random access memory (RAM) 1403. The RAM 1403 also stores, as needed, the data required when the CPU 1401 performs the various processing. The CPU 1401, the ROM 1402 and the RAM 1403 are connected to one another via a bus 1404. An input/output interface 1405 is also connected to the bus 1404.
The following components are connected to the input/output interface 1405: an input section 1406 (including a keyboard, a mouse, etc.), an output section 1407 (including a display, such as a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker, etc.), a storage section 1408 (including a hard disk, etc.), and a communication section 1409 (including a network interface card such as a LAN card, a modem, etc.). The communication section 1409 performs communication processing via a network such as the Internet. A drive 1410 can also be connected to the input/output interface 1405 as needed. A removable medium 1411, such as a magnetic disk, an optical disc, a magneto-optical disc or a semiconductor memory, can be mounted on the drive 1410 as needed, so that a computer program read therefrom is installed into the storage section 1408 as needed.
When the above series of processing is implemented by software, a program constituting the software can be installed from a network such as the Internet, or from a storage medium such as the removable medium 1411.
Those skilled in the art will understand that such a storage medium is not limited to the removable medium 1411 shown in Figure 14, in which the program is stored and which is distributed separately from the apparatus to provide the program to the user. Examples of the removable medium 1411 include a magnetic disk (including a floppy disk), an optical disc (including a compact disc read-only memory (CD-ROM) and a digital versatile disc (DVD)), a magneto-optical disc (including a MiniDisc (MD) (registered trademark)) and a semiconductor memory. Alternatively, the storage medium may be the ROM 1402, a hard disk contained in the storage section 1408, etc., in which the program is stored and which is distributed to the user together with the apparatus containing it.
Accordingly, the present disclosure also proposes a program product storing machine-readable instruction codes. When the instruction codes are read and executed by a machine, the image processing method according to the embodiments of the present disclosure described above can be performed. Correspondingly, the various storage media enumerated above for carrying such a program product are also included within the scope of the present disclosure.
In the foregoing description of the specific embodiments of the present disclosure, features described and/or illustrated for one embodiment may be used in one or more other embodiments in the same or a similar manner, combined with features in other embodiments, or substituted for features in other embodiments.
It should be emphasized that the term "comprises/comprising", as used herein, refers to the presence of features, elements, steps or components, but does not exclude the presence or addition of one or more other features, elements, steps or components. The ordinal terms "first", "second", etc., do not indicate an order of implementation or a degree of importance of the features, elements, steps or components they qualify; they are used merely for clarity of description and to distinguish these features, elements, steps or components from one another.
In addition, the methods of the embodiments of the present disclosure are not limited to being performed in the chronological order described in the specification or shown in the drawings; they may also be performed in other chronological orders, in parallel, or independently. The execution order of the methods described in this specification therefore does not limit the technical scope of the present disclosure.
In summary, according to the embodiments of the present disclosure, the present disclosure provides the following schemes, but is not limited thereto:
Scheme 1. An image processing apparatus, comprising:
A reference line determining unit for determining a reference line in an image to be processed, the reference line assisting in extracting an edge of a foreground object in the image; and
A clutter edge recognition unit for identifying, according to the reference line determined by the reference line determining unit, a clutter edge present in the image, wherein the clutter edge is an edge caused by a background object in the image that interferes with the edge extraction of the foreground object in the image.
Scheme 2. The image processing apparatus according to scheme 1, wherein
The clutter edge recognition unit identifies the clutter edge according to the reference line and based on the color of the background object, the background object having at least a first color.
Scheme 3. The image processing apparatus according to scheme 1 or 2, wherein
The reference line determining unit determines, based on first prior information about the background object, a first reference line comprised in the reference line; and
The clutter edge recognition unit performs a pixel-by-pixel scan, in a predetermined order along the first reference line, of a predetermined region of the image containing the first reference line, wherein, for each scanned pixel, a first run comprising a plurality of pixels is set downward in the vertical direction starting from that pixel; if, starting from a certain pixel in a first run, the color difference between the color of each pixel and the first color of the background object is less than a first color-difference threshold, that certain pixel of the first run is determined to be a first noise point, and the first noise points so determined form a first clutter edge comprised in the clutter edge.
Scheme 4. The image processing apparatus according to scheme 3, wherein the predetermined region containing the first reference line is a region determined based on second prior information about the background object.
Scheme 5. The image processing apparatus according to scheme 3 or 4, wherein
The clutter edge recognition unit scans the predetermined region of the image containing the first reference line point by point along the first reference line, in left-to-right, top-to-bottom order.
Scheme 6. The image processing apparatus according to any one of schemes 1 to 5, wherein
The reference line determining unit converts the image to be processed into a vertical gradient image, determines the line with the maximum gradient sum in the horizontal direction, and determines whether the position difference between the corresponding line in the image to be processed and the first reference line is less than a position-difference threshold; if the position difference is less than the position-difference threshold, the corresponding line in the image to be processed is determined to be a second reference line comprised in the reference line; and
The clutter edge recognition unit estimates a second color of the background object based on the second reference line, and determines, based on the estimated second color of the background object, a second clutter edge comprised in the clutter edge.
Scheme 7. The image processing apparatus according to scheme 6, wherein the clutter edge recognition unit is configured to:
Find the pixels on the second reference line whose gradient in the gradient image exceeds a gradient threshold, and set downward in the vertical direction, starting from each of these pixels, a plurality of second runs each comprising a plurality of pixels, the color with the highest frequency of occurrence in the plurality of second runs being determined as the second color of the background object; and
For each second run, if the color difference between the color of every pixel in the run and the determined second color of the background object is less than a second color-difference threshold, determine the pixel serving as the starting point of the run to be a second noise point, the second noise points so determined forming the second clutter edge.
Scheme 8. The image processing apparatus according to any one of schemes 1 to 7, wherein the clutter edge recognition unit sets the image gradient of the clutter edge to white.
Scheme 9. An overhead image scanning apparatus, comprising the image processing apparatus according to any one of schemes 1 to 8.
Scheme 10. An image processing method, comprising:
Determining a reference line in an image to be processed, the reference line assisting in extracting an edge of a foreground object in the image; and
Identifying, according to the determined reference line, a clutter edge present in the image, wherein the clutter edge is an edge caused by a background object in the image that interferes with the edge extraction of the foreground object in the image.
Scheme 11. The image processing method according to scheme 10, wherein
The step of identifying, according to the determined reference line, the clutter edge present in the image comprises: identifying the clutter edge according to the reference line and based on the color of the background object, the background object having at least a first color.
Scheme 12. The image processing method according to scheme 10 or 11, wherein
The step of determining the reference line in the image to be processed comprises: determining, based on first prior information about the background object, a first reference line comprised in the reference line; and
The step of identifying, according to the determined reference line, the clutter edge present in the image comprises: performing a pixel-by-pixel scan, in a predetermined order along the first reference line, of a predetermined region of the image containing the first reference line, wherein, for each scanned pixel, a first run comprising a plurality of pixels is set downward in the vertical direction starting from that pixel; if, starting from a certain pixel in a first run, the color difference between the color of each pixel and the first color of the background object is less than a first color-difference threshold, that certain pixel of the first run is determined to be a first noise point, and the first noise points so determined form a first clutter edge comprised in the clutter edge.
Scheme 13. The image processing method according to scheme 12, wherein
The predetermined region containing the first reference line is a region determined based on second prior information about the background object.
Scheme 14. The image processing method according to scheme 12 or 13, wherein
The step of performing the pixel-by-pixel scan, in the predetermined order along the first reference line, of the predetermined region of the image containing the first reference line comprises: scanning the predetermined region of the image containing the first reference line point by point along the first reference line, in left-to-right, top-to-bottom order.
Scheme 15. The image processing method according to any one of schemes 10 to 14, wherein
The step of determining the reference line in the image to be processed comprises: converting the image to be processed into a vertical gradient image, determining the line with the maximum gradient sum in the horizontal direction, and determining whether the position difference between the corresponding line in the image to be processed and the first reference line is less than a position-difference threshold; if the position difference is less than the position-difference threshold, determining the corresponding line in the image to be processed to be a second reference line comprised in the reference line; and
The step of identifying, according to the determined reference line, the clutter edge present in the image comprises: estimating a second color of the background object based on the second reference line, and determining, based on the estimated second color of the background object, a second clutter edge comprised in the clutter edge.
Scheme 16. The image processing method according to scheme 15, wherein
The step of determining, based on the estimated second color of the background object, the second clutter edge comprised in the clutter edge comprises:
Finding the pixels on the second reference line whose gradient in the gradient image exceeds a gradient threshold, and setting downward in the vertical direction, starting from each of these pixels, a plurality of second runs each comprising a plurality of pixels, the color with the highest frequency of occurrence in the plurality of second runs being determined as the second color of the background object; and
For each second run, if the color difference between the color of every pixel in the run and the determined second color of the background object is less than a second color-difference threshold, determining the pixel serving as the starting point of the run to be a second noise point, the second noise points so determined forming the second clutter edge.
Scheme 17. The image processing method according to any one of schemes 10 to 16, wherein
The step of identifying, according to the determined reference line, the clutter edge present in the image comprises: setting the image gradient of the clutter edge to white.
Although the present disclosure has been disclosed above through the description of its specific embodiments, it should be understood that those skilled in the art may devise various modifications, improvements or equivalents of the present disclosure within the spirit and scope of the appended claims. Such modifications, improvements or equivalents should also be considered to fall within the protection scope of the present disclosure.

Claims (10)

1. An image processing apparatus, comprising:
A reference line determining unit for determining a reference line in an image to be processed, the reference line assisting in extracting an edge of a foreground object in the image; and
A clutter edge recognition unit for identifying, according to the reference line determined by the reference line determining unit, a clutter edge present in the image, wherein the clutter edge is an edge caused by a background object in the image that interferes with the edge extraction of the foreground object in the image.
2. The image processing apparatus according to claim 1, wherein
The clutter edge recognition unit identifies the clutter edge according to the reference line and based on the color of the background object, the background object having at least a first color.
3. The image processing apparatus according to claim 1 or 2, wherein
The reference line determining unit determines, based on first prior information about the background object, a first reference line comprised in the reference line; and
The clutter edge recognition unit performs a pixel-by-pixel scan, in a predetermined order along the first reference line, of a predetermined region of the image containing the first reference line, wherein, for each scanned pixel, a first run comprising a plurality of pixels is set downward in the vertical direction starting from that pixel; if, starting from a certain pixel in a first run, the color difference between the color of each pixel and the first color of the background object is less than a first color-difference threshold, that certain pixel of the first run is determined to be a first noise point, and the first noise points so determined form a first clutter edge comprised in the clutter edge.
4. The image processing apparatus according to claim 3, wherein the predetermined region containing the first reference line is a region determined based on second prior information about the background object.
5. The image processing apparatus according to claim 3, wherein
The clutter edge recognition unit scans the predetermined region of the image containing the first reference line point by point along the first reference line, in left-to-right, top-to-bottom order.
6. image processing apparatus according to claim 3, wherein
Described reference line determining unit converts described pending image to gradient image on vertical direction, determine gradient and maximum line in horizontal direction, and determine whether line corresponding with this line in described pending image and the alternate position spike between the first reference line are less than alternate position spike threshold value, if described alternate position spike is less than alternate position spike threshold value, line corresponding with this line in described pending image is defined as to the second reference line that described reference line comprises; And
Described Clutter edge recognition unit carrys out the second color of estimated background object based on described the second reference line, and the second color of the background object based on estimated is determined the second Clutter edge that described Clutter edge comprises.
7. The image processing apparatus according to claim 6, wherein the clutter edge identification unit is configured to:
find, on the second reference line, the pixels whose gradient in the gradient image is greater than a gradient threshold, and, taking each of these pixels as a starting point, set a plurality of second runs extending vertically downward, each comprising a plurality of pixels, the color occurring most frequently across the plurality of second runs being determined to be the second color of the background object; and
for each second run, if the color difference between every pixel in the second run and the determined second color of the background object is less than a second color difference threshold, determine the pixel serving as the starting point of the second run to be a second clutter point, the determined second clutter points forming the second clutter edge.
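The run-based test of claim 7 can be sketched as follows. This is a minimal illustration, not the patented implementation: a grayscale image stands in for color, the run length and all thresholds are assumed parameters, and the color-difference measure is taken to be absolute intensity difference.

```python
import numpy as np
from collections import Counter

def find_clutter_points(image, gradient, ref_row, grad_thresh=30,
                        run_len=5, color_diff_thresh=20):
    """Run-based clutter-point test along a second reference line.

    image    : 2-D uint8 array (grayscale stands in for color here)
    gradient : 2-D array of vertical-gradient magnitudes
    ref_row  : row index of the second reference line
    Returns the column indices of detected second clutter points.
    """
    h, w = image.shape
    # 1. Pixels on the reference line whose gradient exceeds the threshold.
    cols = [c for c in range(w) if gradient[ref_row, c] > grad_thresh]

    # 2. A vertical run of pixels extending downward from each such pixel.
    runs = {c: image[ref_row:min(ref_row + run_len, h), c] for c in cols}

    # 3. The most frequent value over all runs estimates the background color.
    votes = Counter(int(v) for run in runs.values() for v in run)
    if not votes:
        return []
    bg_color = votes.most_common(1)[0][0]

    # 4. A run whose pixels all lie within the color-difference threshold of
    #    the background color marks its starting pixel as a clutter point.
    return [c for c, run in runs.items()
            if np.all(np.abs(run.astype(int) - bg_color) < color_diff_thresh)]
```

A run that contains any foreground-colored pixel fails the uniform-color test, so edges belonging to the foreground object survive while background-caused edges are flagged.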
8. The image processing apparatus according to claim 1 or 2, wherein the clutter edge identification unit adjusts the image gradient at the clutter edge to white.
9. An overhead image scanning apparatus, comprising the image processing apparatus according to any one of claims 1 to 8.
10. An image processing method, comprising:
determining a reference line in an image to be processed, the reference line being used to assist in extracting the edge of a foreground object in the image; and
identifying, according to the determined reference line, a clutter edge present in the image, wherein the clutter edge is an edge caused by a background object in the image that interferes with edge extraction of the foreground object.
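The two steps of claim 10 can be illustrated with a small sketch. This assumes the second-reference-line criterion of claim 6 (the row with the maximum summed vertical gradient) and a simplified background-color match on that line; the background color `bg_color` and the threshold are hypothetical inputs, not values fixed by the patent.

```python
import numpy as np

def determine_reference_line(image):
    """Candidate reference line: the row with the largest summed vertical
    gradient (the second-reference-line criterion of claim 6)."""
    grad = np.abs(np.diff(image.astype(int), axis=0))  # vertical gradient image
    return int(np.argmax(grad.sum(axis=1)))            # row with max gradient sum

def identify_clutter_edge(image, ref_row, bg_color, color_diff_thresh=20):
    """Pixels on the reference line whose color matches the estimated
    background color; these form the clutter edge to be suppressed
    before foreground edge extraction."""
    diff = np.abs(image[ref_row].astype(int) - bg_color)
    return np.where(diff < color_diff_thresh)[0]
```

For a scan where the background (e.g. a document holder) produces a strong horizontal edge, `determine_reference_line` locates that edge and `identify_clutter_edge` marks its background-colored pixels so they can be excluded from foreground edge extraction.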
CN201310201320.9A 2013-05-27 2013-05-27 Image processing apparatus and method and overhead image-scanning device Expired - Fee Related CN104184918B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310201320.9A CN104184918B (en) 2013-05-27 2013-05-27 Image processing apparatus and method and overhead image-scanning device


Publications (2)

Publication Number Publication Date
CN104184918A true CN104184918A (en) 2014-12-03
CN104184918B CN104184918B (en) 2017-07-07

Family

ID=51965635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310201320.9A Expired - Fee Related CN104184918B (en) 2013-05-27 2013-05-27 Image processing apparatus and method and overhead image-scanning device

Country Status (1)

Country Link
CN (1) CN104184918B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104933430A (en) * 2015-06-03 2015-09-23 北京好运到信息科技有限公司 Interactive image processing method and interactive image processing system for mobile terminal

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0505729A2 (en) * 1991-03-29 1992-09-30 EASTMAN KODAK COMPANY (a New Jersey corporation) Image binarization system
JP2004096435A (en) * 2002-08-30 2004-03-25 Minolta Co Ltd Image analyzing device, image analysis method, and image analysis program
US20080024845A1 (en) * 2006-07-28 2008-01-31 Canon Kabushiki Kaisha Image reading apparatus
CN201585029U (en) * 2009-12-25 2010-09-15 虹光精密工业(苏州)有限公司 Scanner with reference patterns assisting boundary detection and original clamp for scanning
CN102833459A (en) * 2011-06-15 2012-12-19 富士通株式会社 Image processing method, image processing device and scanner



Also Published As

Publication number Publication date
CN104184918B (en) 2017-07-07

Similar Documents

Publication Publication Date Title
US9710109B2 (en) Image processing device and image processing method
WO2018028306A1 (en) Method and device for recognizing license plate number
JP5538435B2 (en) Image feature extraction method and system
CN103942797B (en) Scene image text detection method and system based on histogram and super-pixels
CN108830133B (en) Contract image picture identification method, electronic device and readable storage medium
US8897600B1 (en) Method and system for determining vanishing point candidates for projective correction
JP2016516245A (en) Classification of objects in images using mobile devices
US20120257833A1 (en) Image processing apparatus, image processing method, and computer readable medium
US10169673B2 (en) Region-of-interest detection apparatus, region-of-interest detection method, and recording medium
JP2014056572A (en) Template matching with histogram of gradient orientations
US8923610B2 (en) Image processing apparatus, image processing method, and computer readable medium
CN104766037A (en) Two-dimension code recognition method and device
CN105096347A (en) Image processing device and method
WO2019019595A1 (en) Image matching method, electronic device method, apparatus, electronic device and medium
CN109509200A (en) Checkerboard angle point detection process, device and computer readable storage medium based on contours extract
US10229498B2 (en) Image processing device, image processing method, and computer-readable recording medium
CN104008542A (en) Fast angle point matching method for specific plane figure
EP2536123B1 (en) Image processing method and image processing apparatus
CN104616019A (en) Identification method for electronic equipment signboard image
CN105354570A (en) Method and system for precisely locating left and right boundaries of license plate
US20180225536A1 (en) Image processing device, image processing method, and computer program product
KR101282663B1 (en) Apparatus for detecting image region of vehicle number plate and method thereof
CN104184918A (en) Image processing apparatus and method, and upper-build image scanning apparatus
CN108492345B (en) Data block dividing method based on scale transformation
CN108470351B (en) Method, device and storage medium for measuring body shift by image patch tracking

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170707

Termination date: 20180527