KR20170045846A - Device and method for obtaining edge line by detecting outline - Google Patents
Device and method for obtaining edge line by detecting outline
- Publication number
- KR20170045846A KR1020150145846A KR20150145846A
- Authority
- KR
- South Korea
- Prior art keywords
- line
- point
- edge
- defining
- detecting
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The present invention relates to an apparatus and method for obtaining a dividing line by detecting an edge line. The method according to the present invention includes: detecting an edge line of an opened book image; defining the lowest point of the bending region of the upper edge line constituting the edge line as a first point; defining a second point and a third point based on the left line, the right line, and the lower line detected from the edge line; and acquiring a straight line connecting the first point with either the second point or the third point as the dividing line of the opened book image. The dividing line can thus be detected accurately, and a clearer image can be provided to the user.
Description
The present invention relates to an apparatus and method for detecting an edge line to obtain a dividing line, and more particularly, to an apparatus and method for detecting an edge line of an opened book image to obtain a dividing line of an opened book image.
Background Art [0002] With recent advances in electronic device technology, various electronic devices are being developed. In particular, the use of electronic devices such as smartphones and tablet PCs is rapidly increasing, and the various applications that run on them provide convenience to users.
In this regard, user satisfaction grows when clearer images or text can be obtained from the content stored in an electronic device. For example, there is a need for a technique that accurately finds the dividing line between the two pages of an opened book image photographed with an electronic device, so that the body text of each page is not damaged.
Conventionally, however, the opened book is usually photographed by hand, so the user seldom captures the book exactly centered. Accordingly, there is a need for a method that can find an accurate dividing line of the book and provide it to the user even when the book image is photographed off-center in the left, right, up, or down direction.
[Related Technical Literature]
1. A method for automatically detecting a border of a medium image, a medium image processing system using the same, and a processing method (Patent Application No. 10-2013-0124216)
SUMMARY OF THE INVENTION It is an object of the present invention to provide an apparatus and method for obtaining a dividing line by detecting an edge line, which detect the dividing line between the two pages of an opened book so that the body text of the opened book image is not damaged and a clearer image can be provided to the user.
Another object of the present invention is to provide an apparatus and method for obtaining a dividing line by detecting an edge line, which can acquire the dividing line more accurately even when the opened book image cannot be captured accurately owing to the shooting conditions or the surrounding environment.
Another object of the present invention is to provide an apparatus and method for obtaining a dividing line by detecting an edge line, which reduce the original image by a set size, thereby reducing the amount of data to be processed, improving the processing speed, and improving the accuracy.
The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.
According to an aspect of the present invention, a method of obtaining a dividing line by detecting an edge line includes: detecting an edge line of an opened book image; defining the lowest point of the bending region of the upper edge line constituting the edge line as a first point; defining a second point and a third point based on the left line, the right line, and the lower line detected from the edge line; and acquiring a straight line connecting the first point with either the second point or the third point as the dividing line of the opened book image.
According to another aspect of the present invention, the method further includes: calling an original image including the opened book image; and reducing the original image by a predetermined size.
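The image-reduction step can be sketched as follows: a minimal Python illustration, assuming the image is a 2-D array of grayscale values and that reduction is done by simple stride subsampling (the patent does not specify the reduction method, so this is only one plausible realization):

```python
def shrink(image, factor):
    """Reduce a 2-D grayscale image (list of rows) by keeping every
    `factor`-th row and column. A stand-in for the 'reduce by a
    predetermined size' step; a real implementation would typically
    use an averaging or interpolating resize instead."""
    return [row[::factor] for row in image[::factor]]

original = [[r * 10 + c for c in range(8)] for r in range(8)]
small = shrink(original, 2)   # 8x8 -> 4x4: a quarter of the pixels per axis
```

Because every later step (edge detection, line sampling) scans the pixels, shrinking first cuts the work roughly by the square of the factor, which is the speed benefit the summary describes.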
According to another aspect of the present invention, the step of detecting the edge line of the opened book image includes: converting the opened book image into a gradient image; and detecting the edge line of the opened book image by applying a watershed algorithm to the gradient image.
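The gradient-image conversion can be sketched as a central-difference gradient magnitude (Python; the segmentation algorithm the patent then applies to this gradient image is omitted here, and the helper name is illustrative):

```python
def gradient_magnitude(img):
    """Central-difference gradient magnitude of a 2-D grayscale image.
    Border pixels are left at zero for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal difference
            gy = img[y + 1][x] - img[y - 1][x]   # vertical difference
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge produces a strong response along the step.
step = [[0, 0, 10, 10] for _ in range(4)]
grad = gradient_magnitude(step)
```

Book boundaries appear as ridges in this gradient image, which is why a ridge-following segmentation such as watershed can recover the edge line from it.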
According to another aspect of the present invention, the step of defining the lowest point of the bending region of the upper edge line constituting the edge line as the first point includes: when a plurality of bending regions are present, defining the lowest point of the bending region located closest to the center of the upper edge line as the first point.
According to still another aspect of the present invention, the step of defining the second point and the third point includes: dividing the entire image including the edge line vertically into equal parts of a predetermined area; and extracting, from the divided regions, the region including the left edge line as a left region and the region including the right edge line as a right region.
According to still another aspect of the present invention, the step of defining the second point and the third point includes: selecting two pieces of edge data among the plurality of pieces of edge data located in the left region, and selecting two pieces of edge data among the plurality of pieces of edge data located in the right region; obtaining a left sampling line connecting, by a straight line, the two edge data located in the left region, and a right sampling line connecting, by a straight line, the two edge data located in the right region; detecting the number of edge data located within a set range based on the left sampling line, and detecting the number of edge data located within a set range based on the right sampling line; and detecting, as the left line, the left sampling line for which the number of edge data located within the set range is the largest, and detecting, as the right line, the right sampling line for which the number of edge data located within the set range is the largest.
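This sampling procedure resembles a RANSAC-style line fit: pairs of edge points define candidate lines, and the candidate supported by the most edge data within the set range wins. A minimal sketch (Python; it enumerates all pairs rather than sampling randomly, and the tolerance value is an assumption):

```python
import itertools

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (x0, y0), (x1, y1), (x2, y2) = p, a, b
    num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
    den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5
    return num / den

def best_sampling_line(edge_points, tolerance=2.0):
    """Return the point pair whose line has the most edge data within
    `tolerance` of it, together with that inlier count."""
    best, best_count = None, -1
    for a, b in itertools.combinations(edge_points, 2):
        count = sum(1 for p in edge_points
                    if point_line_distance(p, a, b) <= tolerance)
        if count > best_count:
            best, best_count = (a, b), count
    return best, best_count

# Four edge points near the vertical line x = 5, plus one outlier at x = 8.
points = [(5, 0), (5, 3), (5, 7), (5, 9), (8, 4)]
line, inliers = best_sampling_line(points)
```

The same vote-counting logic, applied separately to the left and right regions, yields the left line and right line of the summary.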
According to still another aspect of the present invention, the step of defining the second point and the third point includes the step of defining an intersection of extension lines of the left line and the right line as a second point.
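Finding the intersection of the extensions of the left and right lines is a standard two-line intersection. A sketch (Python, with each line given by two points — a hypothetical representation, since the patent does not state one):

```python
def line_intersection(l1, l2):
    """Intersection of the infinite extensions of two lines, each given
    as a pair of points; returns None for parallel lines."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        return None  # parallel or coincident: no unique intersection
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    px = (a * (x3 - x4) - (x1 - x2) * b) / d
    py = (a * (y3 - y4) - (y1 - y2) * b) / d
    return (px, py)

left_line = ((0, 0), (2, 2))
right_line = ((0, 2), (2, 0))
second_point = line_intersection(left_line, right_line)
```

Working on extensions matters because the two side edges of a photographed book usually converge somewhere outside the visible page area.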
According to still another aspect of the present invention, the step of defining the second point and the third point includes calculating an accuracy value of the left line and calculating an accuracy value of the right line, wherein the accuracy value of the second point is the lower of the accuracy value of the left line and the accuracy value of the right line.
According to still another aspect of the present invention, the step of defining the second point and the third point includes the steps of: horizontally dividing the entire image including the edge line equally by a predetermined area; And extracting an area including a lower edge line of the divided area as a lower area.
According to still another aspect of the present invention, the step of defining the second point and the third point includes: selecting two pieces of edge data among a plurality of pieces of edge data located in the lower region; obtaining a lower sampling line connecting the two edge data by a straight line; detecting the number of edge data located within a set range based on the lower sampling line; and detecting, as the lower line, the lower sampling line for which the number of edge data located within the set range is the largest.
According to still another aspect of the present invention, the step of defining the second point and the third point includes calculating an accuracy value of the lower line, which serves as the accuracy value of the third point.
According to still another aspect of the present invention, the step of defining the second point and the third point includes defining, as the third point, the intersection of the lower line and the straight line that is perpendicular to the lower line among the straight lines drawn from the first point to the lower line.
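The third point, the foot of the perpendicular dropped from the first point onto the lower line, can be computed by vector projection. A sketch under the same two-point line representation (an assumption):

```python
def foot_of_perpendicular(p, a, b):
    """Project point p onto the line through a and b, i.e. the intersection
    of that line with the perpendicular dropped from p."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    # Parameter t of the projection along the direction vector (dx, dy).
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

first_point = (3, 5)
lower_line = ((0, 0), (10, 0))        # a horizontal lower edge, for illustration
third_point = foot_of_perpendicular(first_point, *lower_line)
```

Among all lines from the first point to the lower line, the perpendicular one is unique, which is why it can serve as a fallback candidate when the side-line intersection is unreliable.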
According to still another aspect of the present invention, the step of acquiring a straight line connecting the first point with either the second point or the third point as the dividing line of the opened book image includes: defining, as a fourth point, whichever of the second point and the third point has the higher accuracy value; and acquiring a straight line connecting the first point and the fourth point as the dividing line of the opened book image.
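The final selection reduces to picking whichever candidate has the higher accuracy value and joining it to the first point. A sketch (Python; the accuracy values come from the earlier line-fitting steps and are plain numbers here, and the tie-breaking rule is an assumption):

```python
def dividing_line(first_point, second_point, acc2, third_point, acc3):
    """Define the fourth point as the candidate with the higher accuracy
    value and return the dividing line as a pair of endpoints."""
    fourth_point = second_point if acc2 >= acc3 else third_point
    return (first_point, fourth_point)

# Here the third point's accuracy (0.9) beats the second point's (0.7),
# so the dividing line runs from the first point to the third point.
line = dividing_line((4, 0), (5, 12), 0.7, (4, 12), 0.9)
```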
According to another aspect of the present invention, an apparatus for obtaining a dividing line by detecting an edge line includes: a detector for detecting an edge line of an opened book image; and a control unit for defining the lowest point of the bending region of the upper edge line constituting the edge line as a first point, defining a second point and a third point based on the left line, the right line, and the lower line detected from the edge line, and acquiring a straight line connecting the first point with either the second point or the third point as the dividing line of the opened book image.
According to another aspect of the present invention, a computer-readable recording medium stores instructions that detect an edge line of an opened book image, define the lowest point of the bending region of the upper edge line constituting the edge line as a first point, define a second point and a third point based on the left line, the right line, and the lower line detected from the edge line, and acquire a straight line connecting the first point with either the second point or the third point as the dividing line of the opened book image.
The details of other embodiments are included in the detailed description and drawings.
The present invention has the effect of providing a clearer image to the user by accurately detecting the dividing line of the book so that the body text is not damaged in the two-page opened book image.
The present invention can acquire a more accurate dividing line even when the book image cannot be captured accurately owing to the photographing conditions or the surrounding environment, thereby improving user convenience.
Since the original image is reduced by a set size, the amount of data to be processed decreases, which improves both the processing speed and the accuracy of acquiring the dividing line of the opened book image.
The effects according to the present invention are not limited by the contents exemplified above, and more various effects are included in the specification.
1 is a block diagram showing a schematic configuration of an apparatus for detecting an edge line to obtain a dividing line according to an embodiment of the present invention.
2 is a flowchart illustrating a method of detecting an edge line to obtain a dividing line according to an embodiment of the present invention.
3A illustrates an exemplary embodiment for calling and reducing an entire image including an unfolded book image according to an embodiment of the present invention.
Figures 3B and 3C illustrate an exemplary embodiment for detecting an edge line of an unfolded book image according to an embodiment of the present invention.
FIG. 3D illustrates an exemplary embodiment for defining a first point in accordance with an embodiment of the present invention.
Figure 3E illustrates an exemplary embodiment for extracting the left region according to an embodiment of the present invention.
FIG. 3F shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to an embodiment of the present invention.
3G shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention.
3H illustrates an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention.
FIG. 3I shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention.
3J illustrates an exemplary embodiment for detecting the left line and calculating the accuracy value of the left line in accordance with an embodiment of the present invention.
3K illustrates an exemplary embodiment for extracting the right region according to an embodiment of the present invention.
FIG. 3L illustrates an exemplary embodiment for detecting the right line and calculating the accuracy value of the right line according to an embodiment of the present invention.
FIG. 3M illustrates an exemplary embodiment for defining a second point and calculating an accuracy value of the second point according to an embodiment of the present invention.
FIG. 3N illustrates an exemplary embodiment for extracting a lower region according to an embodiment of the present invention.
3O illustrates an exemplary embodiment for detecting the bottom line and calculating the accuracy value of the bottom line in accordance with an embodiment of the present invention.
FIG. 3P illustrates an exemplary embodiment for defining a third point and a fourth point according to an embodiment of the present invention, and calculating an accuracy value for the third point.
FIG. 3Q illustrates an exemplary embodiment for obtaining a dividing line of an unfolded book image according to an embodiment of the present invention.
FIG. 3R shows an exemplary embodiment for obtaining a dividing line of an opened book image according to another embodiment of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described hereinafter in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art; the invention is defined only by the scope of the claims.
Each block of the accompanying block diagrams and each combination of steps of the flowcharts may be performed by algorithms or by computer program instructions implemented in firmware, software, or hardware. These algorithms or computer program instructions may be loaded into a processor of a general purpose computer, special purpose computer, or other programmable digital signal processing device, so that the instructions executed by the processor of the computer or other programmable data processing apparatus create means for performing the functions described in each block of the block diagrams or each step of the flowcharts. These algorithms or computer program instructions may also be stored in a computer-usable or computer-readable memory capable of directing a computer or other programmable data processing apparatus to implement a function in a particular manner, so that the instructions stored in the computer-usable or computer-readable memory produce an article of manufacture containing instruction means for performing the function described in each block of the block diagrams or each step of the flowcharts. The computer program instructions may also be loaded onto a computer or other programmable data processing equipment so that a series of operational steps is performed on the computer or other programmable data processing equipment to produce a computer-executed process, whereby the instructions executed on the computer or other programmable data processing equipment provide steps for executing the functions described in each block of the block diagrams and each step of the flowcharts.
Also, each block or each step may represent a module, segment, or portion of code that includes one or more executable instructions for executing the specified logical function (s). It should also be noted that in some alternative embodiments, the functions mentioned in the blocks or steps may occur out of order. For example, two blocks or steps shown in succession may in fact be performed substantially concurrently, or the blocks or steps may sometimes be performed in reverse order according to the corresponding function.
Although the first, second, etc. are used to describe various components, it goes without saying that these components are not limited by these terms. These terms are used only to distinguish one component from another. Therefore, it goes without saying that the first component mentioned below may be the second component within the technical scope of the present invention.
Like reference numerals refer to like elements throughout the specification.
The features of the various embodiments of the present invention may be partially or entirely combined with one another and, as those skilled in the art will appreciate, may technically interoperate in various ways; the embodiments may be practiced independently of one another or in association with one another.
Various embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
1 is a block diagram showing a schematic configuration of an apparatus for detecting an edge line to obtain a dividing line according to an embodiment of the present invention. Referring to FIG. 1, an
The
The
The
The display unit 130 provides an output interface to the user. For example, the display unit 130 may be configured to display various screens while providing an output interface to a user, such as a liquid crystal display or an organic light emitting display.
The
The
In the following, reference is made to Fig. 2 for describing more specific functions of an
2 is a flowchart illustrating a method of detecting an edge line to obtain a dividing line according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
The
Thereafter, the
Thereafter, the
The
Then, the
If a plurality of bending regions exist on the upper edge line, the
Then, the
Then, the
Then, the
Then, the
Thereafter, the
Thereafter, the
Thereafter, the
Then, the
Then, the
Thereafter, the
Then, the
Thereafter, the
Then, the
In an
3A illustrates an exemplary embodiment for calling and reducing an entire image including an unfolded book image according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3A, the
In the
Figures 3B and 3C illustrate an exemplary embodiment for detecting an edge line of an unfolded book image according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3B, the
Referring again to FIG. 3C, the
In an
In an
FIG. 3D illustrates an exemplary embodiment for defining a first point in accordance with an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3D, the
Then, the
If there are a plurality of bending regions on the
As described above, in the
Figure 3E illustrates an exemplary embodiment for extracting the left region according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3E, the
Then, the
As described above, in the
FIG. 3F shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
3F, the
The
In an
3G shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
3G, the
Then, the
Then, the
3H illustrates an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
3H, the
The
FIG. 3I shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
3I, after selecting any two pieces of
The
3J illustrates an exemplary embodiment for detecting the left line and calculating the accuracy value of the left line in accordance with an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3J, the
Thereafter, the
In an
3K illustrates an exemplary embodiment for extracting the right region according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3K, the
Then, the
FIG. 3L illustrates an exemplary embodiment for detecting the right line and calculating the accuracy value of the right line according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3L, the
Thereafter, the
FIG. 3M illustrates an exemplary embodiment for defining a second point and calculating an accuracy value of the second point according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3M, the
Thereafter, the
In the
FIG. 3N illustrates an exemplary embodiment for extracting a lower region according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3N, the
Then, the
3O illustrates an exemplary embodiment for detecting the bottom line and calculating the accuracy value of the bottom line in accordance with an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3O, the
Thereafter, the
FIG. 3P illustrates an exemplary embodiment for defining a third point and a fourth point according to an embodiment of the present invention, and calculating an accuracy value for the third point. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3P, the
The
Thereafter, the
Since the
FIG. 3Q illustrates an exemplary embodiment for obtaining a dividing line of an unfolded book image according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3Q, the
In the
FIG. 3R shows an exemplary embodiment for obtaining a dividing line of an opened book image according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.
Referring to FIG. 3R, the
In this specification, each block or each step may represent a part of a module, segment or code that includes one or more executable instructions for executing the specified logical function (s). It should also be noted that in some alternative embodiments, the functions mentioned in the blocks or steps may occur out of order. For example, two blocks or steps shown in succession may in fact be performed substantially concurrently, or the blocks or steps may sometimes be performed in reverse order according to the corresponding function.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in a RAM memory, a flash memory, a ROM memory, an EPROM memory, an EEPROM memory, a register, a hard disk, a removable disk, a CD-ROM or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor, which is capable of reading information from, and writing information to, the storage medium. Alternatively, the storage medium may be integral with the processor. The processor and the storage medium may reside within an application specific integrated circuit (ASIC). The ASIC may reside within the user terminal. Alternatively, the processor and the storage medium may reside as discrete components in a user terminal.
Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, it is to be understood that the present invention is not limited to those embodiments and various changes and modifications may be made without departing from the scope of the present invention. . Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. Therefore, it should be understood that the above-described embodiments are illustrative in all aspects and not restrictive. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents should be construed as falling within the scope of the present invention.
100 device
110 control unit
120 detector
130 display unit
140 communication section
150 storage unit
301 Total Images
302 Expanded Book Image
303 upper edge line
304 Left edge line
305 right edge line
306 lower edge line
307 Edge Line
308 First point
309 Bottom of the bend area at the corner of the image of the spread book
310 left area
311 Edge data
312 Edge data
312-1 The left sampling line and the interior angle of each straight line of the two straight lines
312-2 Interior angle formed by the two straight lines
313 Left Sampling Line
314 Edge data
315 Straight line passing edge data
316 Straight line through edge data
317 Set area
318 Left line
319 right area
320 right line
321 Second point
322 lower region
323 Lower line
324 Third point
325 partition line
Claims (15)
Detecting an edge line of an opened book image;
Defining the lowest point of the bending region of the upper edge line constituting the edge line as a first point;
Defining a second point and a third point based on the left line, the right line and the lower line detected from the edge line; And
And acquiring a straight line connecting the first point with either the second point or the third point as a dividing line of the opened book image.
Calling an original image including the opened book image; And
And reducing the original image by a predetermined size.
Wherein the step of detecting an edge line of the opened book image comprises:
Converting the expanded book image into a gradient image; And
And detecting an edge line of the opened book image using the watershed algorithm on the gradient image.
Wherein the step of defining, as a first point, the lowest point of the bending region among the upper edge lines constituting the edge line,
And defining, as the first point, the lowest point of the bending region located closest to the center of the upper edge line when a plurality of bending regions are present.
Wherein defining the second point and the third point comprises:
Dividing the entire image including the edge line vertically equally by a predetermined area; And
Extracting, from the divided regions, a region including a left edge line as a left region and a region including a right edge line as a right region.
Wherein defining the second point and the third point comprises:
Selecting two pieces of edge data among a plurality of pieces of edge data located in the left region and selecting two pieces of edge data among a plurality of pieces of edge data located in the right region;
Obtaining a left sampling line connecting, by a straight line, the two edge data located in the left region, and a right sampling line connecting, by a straight line, the two edge data located in the right region;
Detecting the number of edge data located within a set range based on the left sampling line and detecting the number of edge data located within a set range based on the right sampling line; And
And detecting, as the left line, the left sampling line for which the number of edge data located within the set range is the largest, and detecting, as the right line, the right sampling line for which the number of edge data located within the set range is the largest.
Wherein defining the second point and the third point comprises:
And defining an intersection of an extension of the left line and the right line as the second point.
Wherein defining the second point and the third point comprises:
Calculating an accuracy value of the left line and calculating an accuracy value of the right line,
Wherein the accuracy value of the second point is the lower of the accuracy value of the left line and the accuracy value of the right line.
Wherein defining the second point and the third point comprises:
Dividing the entire image including the edge line equally horizontally by a predetermined area; And
And extracting an area including a lower edge line of the divided area as a lower area.
Wherein defining the second point and the third point comprises:
Selecting two edge data among a plurality of edge data located in the lower region;
Obtaining a bottom sampling line connecting the two edge data by a straight line;
Detecting the number of edge data located within a set range based on the lower sampling line; And
And detecting, as the lower line, the lower sampling line for which the number of edge data located within the set range is the largest.
Wherein defining the second point and the third point comprises:
And calculating an accuracy value of the lower line, which is an accuracy value of the third point.
Wherein defining the second point and the third point comprises:
And defining, as the third point, the intersection of the lower line and the straight line that is perpendicular to the lower line among the straight lines drawn from the first point to the lower line.
Wherein acquiring the straight line connecting either the second point or the third point with the first point as the dividing line of the opened book image comprises:
Defining, as a fourth point, whichever of the second point and the third point has the higher accuracy value; And
And obtaining a straight line connecting the first point and the fourth point as the dividing line of the opened book image.
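Putting the final steps together: the fourth point is whichever of the second and third points carries the higher accuracy value (the second point's accuracy being the lower of the left- and right-line accuracies, the third point's being the lower-line accuracy), and the dividing line connects it to the first point. A sketch under those assumptions (names and tie-breaking are illustrative):

```python
def dividing_line(first_point, second_point, third_point,
                  left_acc, right_acc, lower_acc):
    """Choose the more reliable endpoint and return the dividing line
    as a pair of points (first point, fourth point)."""
    second_acc = min(left_acc, right_acc)  # second point inherits the lower accuracy
    third_acc = lower_acc                  # third point inherits the lower-line accuracy
    fourth = second_point if second_acc >= third_acc else third_point
    return (first_point, fourth)
```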
A device for obtaining a dividing line by detecting an edge line, comprising a control unit that defines the lowest point of the bending region of the upper edge line constituting the edge line as a first point, defines a second point and a third point based on the left line, the right line and the lower line detected from the edge line, and obtains a straight line connecting either the second point or the third point with the first point as a dividing line of the opened book image.
A computer-readable recording medium storing instructions that define the lowest point of the bending region of the upper edge line constituting the edge line as a first point, define a second point and a third point based on the left line, the right line and the lower line detected from the edge line, and obtain a straight line connecting either the second point or the third point with the first point as a dividing line of the opened book image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150145846A KR101761641B1 (en) | 2015-10-20 | 2015-10-20 | Device and method for obtaining edge line by detecting outline |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150145846A KR101761641B1 (en) | 2015-10-20 | 2015-10-20 | Device and method for obtaining edge line by detecting outline |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170045846A (en) | 2017-04-28 |
KR101761641B1 (en) | 2017-08-08 |
Family
ID=58701907
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150145846A KR101761641B1 (en) | 2015-10-20 | 2015-10-20 | Device and method for obtaining edge line by detecting outline |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101761641B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113588667A (en) * | 2019-05-22 | 2021-11-02 | 合肥联宝信息技术有限公司 | Method and device for detecting object appearance |
CN115908429A (en) * | 2023-03-08 | 2023-04-04 | 山东歆悦药业有限公司 | Foot bath powder grinding precision detection method and system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8270044B2 (en) * | 2006-10-26 | 2012-09-18 | Samsung Electronics Co., Ltd. | Scanning apparatus having image correction function |
CN102196112B (en) | 2010-03-01 | 2014-09-24 | 佳能株式会社 | Page border detection method and device |
JP2013192100A (en) * | 2012-03-14 | 2013-09-26 | Panasonic Corp | Image processor and original reading system equipped with the same |
JP2013242826A (en) * | 2012-05-23 | 2013-12-05 | Panasonic Corp | Image processing device and document reading system including the same |
- 2015-10-20: KR KR1020150145846A patent/KR101761641B1/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
KR101761641B1 (en) | 2017-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11743571B2 (en) | Electronic device and operating method thereof | |
CA2941143C (en) | System and method for multi-focus imaging | |
JP5826081B2 (en) | Image processing apparatus, character recognition method, and computer program | |
US9076221B2 (en) | Removing an object from an image | |
US9262690B2 (en) | Method and device for detecting glare pixels of image | |
US9196055B2 (en) | Method and apparatus for providing a mechanism for gesture recognition | |
US10027878B2 (en) | Detection of object in digital image | |
JP2008283649A (en) | Image processing method, image region detecting method, image processing program, image region detection program, image processing apparatus, and image region detecting apparatus | |
KR20150059989A (en) | Apparatus and Method for recognition a documentation with text and image | |
US9898800B2 (en) | Image processing apparatus and image processing method | |
KR101761641B1 (en) | Device and method for obtaining edge line by detecting outline | |
AU2011265380B2 (en) | Determining transparent fills based on a reference background colour | |
WO2022088946A1 (en) | Method and apparatus for selecting characters from curved text, and terminal device | |
JP2010021656A (en) | Image processor and image processing method | |
KR20130134546A (en) | Method for create thumbnail images of videos and an electronic device thereof | |
JP6669390B2 (en) | Information processing apparatus, information processing method, and program | |
US10147169B2 (en) | Image processing device and program | |
JP2005316958A (en) | Red eye detection device, method, and program | |
EP2800349B1 (en) | Method and electronic device for generating thumbnail image | |
US9886767B2 (en) | Method, apparatus and computer program product for segmentation of objects in images | |
JP2016167258A (en) | Method, device and computer program product of reducing chromatic aberration in deconvolution images | |
US9355456B2 (en) | Method, apparatus and computer program product for compensating eye color defects | |
JP2010197968A (en) | Focus evaluation apparatus, camera and program | |
JP2016048851A (en) | Marker embedding device, marker detection device, method and program | |
JP2015028735A (en) | Image processing device and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right |