KR20170045846A - Device and method for obtaining edge line by detecting outline - Google Patents

Device and method for obtaining edge line by detecting outline

Info

Publication number
KR20170045846A
Authority
KR
South Korea
Prior art keywords
line
point
edge
defining
detecting
Prior art date
Application number
KR1020150145846A
Other languages
Korean (ko)
Other versions
KR101761641B1 (en)
Inventor
장희라
김재원
김태성
Original Assignee
주식회사 셀바스에이아이
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 셀바스에이아이 filed Critical 주식회사 셀바스에이아이
Priority to KR1020150145846A priority Critical patent/KR101761641B1/en
Publication of KR20170045846A publication Critical patent/KR20170045846A/en
Application granted granted Critical
Publication of KR101761641B1 publication Critical patent/KR101761641B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to an apparatus and method for detecting an edge line to obtain a dividing line. A method for detecting an edge line to obtain a dividing line according to the present invention includes: detecting an edge line of an opened book image; defining, as a first point, the lowest point of the bending region on the upper edge line constituting the edge line; defining a second point and a third point based on the left line, the right line, and the lower line detected from the edge line; and acquiring, as the dividing line of the opened book image, the straight line connecting the first point with one of the second point and the third point. The invention can thereby accurately detect the dividing line and provide a clearer image to the user.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to an apparatus and a method for detecting an edge line to obtain a dividing line.

The present invention relates to an apparatus and method for detecting an edge line to obtain a dividing line, and more particularly, to an apparatus and method for detecting an edge line of an opened book image to obtain a dividing line of an opened book image.

Background Art [0002] With recent advances in electronic device technology, various electronic devices are being developed. In particular, the use of electronic devices such as smart phones and tablet PCs is rapidly increasing, and at the same time various applications that run on smart phones and tablet PCs are being developed, providing convenience to users.

In this regard, when clearer images or texts can be obtained from the contents stored in the electronic device, user satisfaction becomes even greater. For example, there is a need for a technique for accurately finding the dividing line between two pages so that the body text of a page is not damaged in an opened book image photographed with an electronic device.

Conventionally, however, since an opened book is usually photographed by hand, the user often cannot capture the book exactly at its center. Accordingly, there is a need for a method that can find an accurate dividing line of the book and provide it to the user even when the book image is photographed off-center in the left, right, up, or down direction.

[Related Technical Literature]

1. A method for automatically detecting a border of a medium image, a medium image processing system using the same, and a processing method (Patent Application No. 10-2013-0124216)

SUMMARY OF THE INVENTION It is an object of the present invention to provide an apparatus and a method for detecting an edge line to obtain a dividing line, which detect the dividing line between the two pages of an opened book so that the body text is not damaged in the opened book image, and which provide a clearer image to the user.

Another object of the present invention is to provide an apparatus and method for acquiring a dividing line by detecting an edge line, capable of acquiring the dividing line more accurately even when the opened book image cannot be captured accurately owing to the shooting conditions or the surrounding environment.

Another object of the present invention is to provide an apparatus and method for detecting an edge line to obtain a dividing line in which the original image is reduced by a set size, so that the amount of data to be processed is reduced, thereby improving both the processing speed and the accuracy.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, a method of detecting an edge line to acquire a dividing line according to an embodiment of the present invention includes: detecting an edge line of an opened book image; defining, as a first point, the lowest point of the bending region on the upper edge line constituting the edge line; defining a second point and a third point based on the left line, the right line, and the lower line detected from the edge line; and acquiring, as a dividing line of the opened book image, a straight line connecting the first point with one of the second point and the third point.

According to another aspect of the present invention, the method further comprises: calling an original image including the opened book image; and reducing the original image by a predetermined size.

According to another aspect of the present invention, the step of detecting an edge line of the opened book image includes: converting the opened book image into a gradient image; and detecting an edge line of the opened book image by applying a watershed algorithm to the gradient image.

According to another aspect of the present invention, the step of defining, as the first point, the lowest point of the bending region on the upper edge line constituting the edge line includes, when a plurality of bending regions exist, defining as the first point the lowest point of the bending region located closest to the center of the upper edge line.

According to still another aspect of the present invention, the step of defining the second point and the third point includes: vertically dividing the entire image including the edge line into equal predetermined areas; and extracting, among the divided areas, the area including the left edge line as a left area and the area including the right edge line as a right area.

According to still another aspect of the present invention, the step of defining the second point and the third point includes: selecting two pieces of edge data among the plurality of pieces of edge data located in the left area, and selecting two pieces of edge data among the plurality of pieces of edge data located in the right area; obtaining a left sampling line connecting the two pieces of edge data located in the left area by a straight line, and obtaining a right sampling line connecting the two pieces of edge data located in the right area by a straight line; detecting the number of pieces of edge data located within a set range of the left sampling line, and detecting the number of pieces of edge data located within a set range of the right sampling line; and detecting, as the left line, the left sampling line for which the most edge data are detected within the set range, and detecting, as the right line, the right sampling line for which the most edge data are detected within the set range.

According to still another aspect of the present invention, the step of defining the second point and the third point includes the step of defining an intersection of extension lines of the left line and the right line as a second point.

According to still another aspect of the present invention, the step of defining the second point and the third point includes calculating an accuracy value of the left line and calculating an accuracy value of the right line, wherein the accuracy value of the second point is the lower of the accuracy value of the left line and the accuracy value of the right line.

According to still another aspect of the present invention, the step of defining the second point and the third point includes: horizontally dividing the entire image including the edge line into equal predetermined areas; and extracting, among the divided areas, the area including the lower edge line as a lower area.

According to still another aspect of the present invention, the step of defining the second point and the third point includes: selecting two pieces of edge data among the plurality of pieces of edge data located in the lower area; obtaining a lower sampling line connecting the two pieces of edge data by a straight line; detecting the number of pieces of edge data located within a set range of the lower sampling line; and detecting, as the lower line, the lower sampling line for which the most edge data are detected within the set range.

According to still another aspect of the present invention, the step of defining the second point and the third point includes calculating an accuracy value of the lower line, wherein the accuracy value of the lower line is used as the accuracy value of the third point.

According to still another aspect of the present invention, the step of defining the second point and the third point includes defining, as the third point, the intersection of the lower line with the straight line that, among the straight lines drawn from the first point to the lower line, meets the lower line perpendicularly.

According to still another aspect of the present invention, the step of acquiring, as the dividing line of the opened book image, a straight line connecting the first point with one of the second point and the third point includes: defining, as a fourth point, whichever of the second point and the third point has the higher accuracy value; and obtaining the straight line connecting the first point and the fourth point as the dividing line of the opened book image.

According to another aspect of the present invention, there is provided an apparatus for detecting an edge line to obtain a dividing line, comprising: a detection unit for detecting an edge line of an opened book image; and a control unit for defining, as a first point, the lowest point of the bending region on the upper edge line constituting the edge line, defining a second point and a third point based on the left line, the right line, and the lower line detected from the edge line, and acquiring, as a dividing line of the opened book image, a straight line connecting the first point with one of the second point and the third point.

According to another aspect of the present invention, there is provided a computer-readable recording medium storing instructions that detect an edge line of an opened book image, define as a first point the lowest point of the bending region on the upper edge line constituting the edge line, define a second point and a third point based on the left line, the right line, and the lower line detected from the edge line, and acquire, as a dividing line of the opened book image, a straight line connecting the first point with one of the second point and the third point.

The details of other embodiments are included in the detailed description and drawings.

The present invention has the effect of providing a clearer image to the user by accurately detecting the dividing line of the book so that the body text is not damaged in the two-page opened book image.

The present invention can acquire a more accurate dividing line even when the book image can not be accurately obtained according to the photographing state or the surrounding environment, thereby improving the user's convenience.

Since the original image is reduced by a set size, the amount of data to be processed is reduced, making it possible to improve both the processing speed and the accuracy of acquiring the dividing line of the opened book image.

The effects according to the present invention are not limited by the contents exemplified above, and more various effects are included in the specification.

FIG. 1 is a block diagram showing a schematic configuration of an apparatus for detecting an edge line to obtain a dividing line according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of detecting an edge line to obtain a dividing line according to an embodiment of the present invention.
FIG. 3A illustrates an exemplary embodiment for calling and reducing an entire image including an opened book image according to an embodiment of the present invention.
FIGS. 3B and 3C illustrate an exemplary embodiment for detecting an edge line of an opened book image according to an embodiment of the present invention.
FIG. 3D illustrates an exemplary embodiment for defining a first point according to an embodiment of the present invention.
FIG. 3E illustrates an exemplary embodiment for extracting the left area according to an embodiment of the present invention.
FIG. 3F illustrates an exemplary embodiment for detecting the number of edge data located within a set range of the left sampling line according to an embodiment of the present invention.
FIG. 3G illustrates an exemplary embodiment for detecting the number of edge data located within a set range of the left sampling line according to another embodiment of the present invention.
FIG. 3H illustrates an exemplary embodiment for detecting the number of edge data located within a set range of the left sampling line according to another embodiment of the present invention.
FIG. 3I illustrates an exemplary embodiment for detecting the number of edge data located within a set range of the left sampling line according to another embodiment of the present invention.
FIG. 3J illustrates an exemplary embodiment for detecting the left line and calculating the accuracy value of the left line according to an embodiment of the present invention.
FIG. 3K illustrates an exemplary embodiment for extracting the right area according to an embodiment of the present invention.
FIG. 3L illustrates an exemplary embodiment for detecting the right line and calculating the accuracy value of the right line according to an embodiment of the present invention.
FIG. 3M illustrates an exemplary embodiment for defining a second point and calculating the accuracy value of the second point according to an embodiment of the present invention.
FIG. 3N illustrates an exemplary embodiment for extracting a lower area according to an embodiment of the present invention.
FIG. 3O illustrates an exemplary embodiment for detecting the lower line and calculating the accuracy value of the lower line according to an embodiment of the present invention.
FIG. 3P illustrates an exemplary embodiment for defining a third point and a fourth point and calculating the accuracy value of the third point according to an embodiment of the present invention.
FIG. 3Q illustrates an exemplary embodiment for obtaining a dividing line of an opened book image according to an embodiment of the present invention.
FIG. 3R illustrates an exemplary embodiment for obtaining a dividing line of an opened book image according to another embodiment of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described hereinafter in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims.

Each block of the accompanying block diagrams and each combination of steps of the flowcharts may be performed by algorithms or computer program instructions comprised of firmware, software, or hardware. These algorithms or computer program instructions may be loaded into a processor of a general purpose computer, special purpose computer, or other programmable digital signal processing device, so that the instructions, executed by the processor of the computer or other programmable data processing apparatus, create means for performing the functions described in each block of the block diagrams or each step of the flowcharts. These algorithms or computer program instructions may also be stored in a computer-usable or computer-readable memory capable of directing a computer or other programmable data processing apparatus to implement a function in a particular manner, so that the instructions stored in the computer-usable or computer-readable memory produce an article of manufacture containing instruction means for performing the function described in each block of the block diagrams or each step of the flowcharts. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus, so that a series of operational steps is performed on the computer or other programmable data processing apparatus to produce a computer-executed process, and the instructions executed on the computer or other programmable data processing apparatus provide steps for executing the functions described in each block of the block diagrams and each step of the flowcharts.

Also, each block or each step may represent a module, segment, or portion of code that includes one or more executable instructions for executing the specified logical function (s). It should also be noted that in some alternative embodiments, the functions mentioned in the blocks or steps may occur out of order. For example, two blocks or steps shown in succession may in fact be performed substantially concurrently, or the blocks or steps may sometimes be performed in reverse order according to the corresponding function.

Although the first, second, etc. are used to describe various components, it goes without saying that these components are not limited by these terms. These terms are used only to distinguish one component from another. Therefore, it goes without saying that the first component mentioned below may be the second component within the technical scope of the present invention.

Like reference numerals refer to like elements throughout the specification.

Each of the features of the various embodiments of the present invention may be combined or associated with the others, partially or entirely, and, as will be appreciated by those skilled in the art, various technical interlocking and driving are possible; the embodiments may be carried out independently of one another or in association with one another.

Various embodiments of the present invention will now be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram showing a schematic configuration of an apparatus for detecting an edge line to obtain a dividing line according to an embodiment of the present invention. Referring to FIG. 1, an apparatus 100 for detecting an edge line to obtain a dividing line includes a control unit 110, a detection unit 120, a display unit 130, a communication unit 140, and a storage unit 150.

The apparatus 100 for detecting an edge line to obtain a dividing line may be any of a variety of electronic devices. For example, the device 100 for detecting an edge line to obtain a dividing line may be any of a variety of electronic devices such as a smart phone, a tablet PC, a notebook, a desktop, and the like.

The control unit 110 executes various software programs to perform various functions of the apparatus 100 for detecting an edge line to obtain a dividing line, and also performs processing and control for voice communication and data communication.

The detection unit 120 detects various data under the control of the control unit 110. For example, the detection unit 120 can detect an edge line of an opened book image. Specifically, the detection unit 120 can detect the upper edge line, the left edge line, the right edge line, and the lower edge line of the opened book image.

The display unit 130 provides an output interface to the user. For example, the display unit 130 may be configured to display various screens while providing an output interface to a user, such as a liquid crystal display or an organic light emitting display.

The communication unit 140 enables communication with a computer, a server, or a portable terminal.

The storage unit 150 stores various kinds of information under the control of the controller 110.

In the following, more specific functions of the apparatus 100 for detecting an edge line to obtain a dividing line according to an embodiment of the present invention are described with reference to FIG. 2.

FIG. 2 is a flowchart illustrating a method of detecting an edge line to obtain a dividing line according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

The apparatus 100 detects an edge line of the opened book image (S210). First, the control unit 110 calls up the opened book image. For example, the control unit 110 may call the entire image including the opened book image stored in the storage unit 150. Here, an opened book image refers to an image in which a book is opened so that both its left and right pages are visible.

Thereafter, the apparatus 100 reduces the entire image including the opened book image by a set amount. For example, the control unit 110 may reduce the original image of A × B size including the opened book image to an image of a × b size (A > a, B > b).
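The reduction step above can be sketched as follows. This is a minimal illustration only: the patent does not specify an interpolation method, so nearest-neighbor sampling is assumed, and the function name `reduce_image` is hypothetical.

```python
def reduce_image(pixels, a, b):
    """Reduce an image (a list of rows of pixel values) of size A x B
    down to a x b by nearest-neighbor sampling (assumed method)."""
    A = len(pixels)        # original height (rows)
    B = len(pixels[0])     # original width (columns)
    return [
        [pixels[(i * A) // a][(j * B) // b] for j in range(b)]
        for i in range(a)
    ]

# Shrinking a 4 x 4 image to 2 x 2 keeps one representative pixel
# per 2 x 2 block, reducing the data to be processed fourfold.
img = [[1, 1, 2, 2],
       [1, 1, 2, 2],
       [3, 3, 4, 4],
       [3, 3, 4, 4]]
small = reduce_image(img, 2, 2)
```

In practice a resize with area or linear interpolation would give smoother results; the point here is only that the reduced image carries far fewer pixels into the edge-detection step.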

Thereafter, the apparatus 100 converts the reduced opened book image into a gradient image. Here, the gradient image is an image expressing, step by step, how one color gradually changes to another color on the image.

The apparatus 100 then detects the edge line of the reduced opened book image by applying the watershed algorithm to the gradient image. Here, the watershed algorithm is an algorithm that regards the set of pixels in an image as terrain and analyzes its elevation. Here, the edge line is the group of edge data constituting the outline of the opened book image, and includes an upper edge line, a left edge line, a right edge line, and a lower edge line. Although the apparatus 100 according to an embodiment of the present invention has been described as detecting the edge line of the opened book image using the watershed algorithm, the apparatus 100 is not limited thereto, and the edge line of the image may also be detected in other ways.
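The gradient image that the watershed algorithm treats as terrain can be sketched as a simple gradient-magnitude computation. This is not the patent's implementation (which is unspecified beyond naming the watershed algorithm, available in libraries such as OpenCV's `cv2.watershed`); forward differences and the function name `gradient_magnitude` are assumptions for illustration.

```python
def gradient_magnitude(gray):
    """Approximate the gradient magnitude of a grayscale image
    (list of rows) using forward differences. Strong color changes,
    such as the book outline, produce large values -- the 'elevation'
    that the watershed algorithm analyzes."""
    h, w = len(gray), len(gray[0])
    grad = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = gray[y][x + 1] - gray[y][x]   # horizontal change
            gy = gray[y + 1][x] - gray[y][x]   # vertical change
            grad[y][x] = (gx * gx + gy * gy) ** 0.5
    return grad

# A vertical brightness step (0 -> 9) produces a strong response
# along the step and zero response in the flat regions.
gray = [[0, 0, 9, 9]] * 4
g = gradient_magnitude(gray)
```

Flooding this terrain from markers (the watershed step proper) then yields basin boundaries that follow the ridges of high gradient, i.e. the edge lines.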

Then, the apparatus 100 defines the lowest point of the bending region on the upper edge line constituting the edge line as a first point (S220). Specifically, the control unit 110 defines, as the first point, the lowest point of the bending region included in the upper edge line among the upper, left, right, and lower edge lines constituting the edge line of the opened book image. Here, the first point is the lowest point of the bending region on the upper edge line constituting the edge line of the opened book image, that is, the central depression point of the upper edge line.

If a plurality of bending regions exist on the upper edge line, the control unit 110 defines the lowest point of the bending region located closest to the center of the upper edge line as the first point. For example, if there is a first bending region near the left or right end of the upper edge line and a second bending region near the center, the control unit 110 can define the lowest point of the second bending region as the first point.
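The first-point rule above can be sketched as follows. The edge-line representation as (x, y) pairs with y growing downward, and the function name `first_point`, are assumptions for illustration; a depression in the top edge then appears as a local maximum of y.

```python
def first_point(top_edge, image_width):
    """Among local depressions (local maxima of y, since image y grows
    downward) on the top edge line, return the one whose x coordinate
    is closest to the horizontal center of the image."""
    dips = []
    for i in range(1, len(top_edge) - 1):
        x, y = top_edge[i]
        if y > top_edge[i - 1][1] and y > top_edge[i + 1][1]:
            dips.append((x, y))
    center = image_width / 2
    return min(dips, key=lambda p: abs(p[0] - center))

# Two bending regions: one dip near the left edge at x=10 and one near
# the center at x=50; the central depression is chosen as the first point.
edge = [(0, 5), (10, 12), (20, 5), (45, 6), (50, 14), (55, 6), (90, 5)]
p1 = first_point(edge, 100)
```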

Then, the apparatus 100 defines a second point and a third point based on the left line, the right line and the lower line detected from the edge line (S230). First, the apparatus 100 extracts the left region and the right region from the entire image including the edge line, and detects the left line and the right line. Specifically, the control unit 110 vertically divides the entire image including the edge lines by the predetermined area. For example, the control unit 110 can vertically divide the entire image including the edge line into four equal parts.

Then, the control unit 110 extracts the area including the left edge line as the left area and the area including the right edge line as the right area among the divided areas. Here, the left region is a region including the left edge line among the regions vertically divided by the same area as the entire image including the edge line. The right side region is an area including the right side edge line out of the vertically divided regions of the whole image including the edge line equally by the set region.

Then, the apparatus 100 detects the left line and the right line in the extracted left and right areas. First, the control unit 110 selects two pieces of edge data among the plurality of pieces of edge data located in the left area, and selects two pieces of edge data among the plurality of pieces of edge data located in the right area. Thereafter, the control unit 110 obtains a left sampling line connecting the two pieces of edge data located in the left area by a straight line, and obtains a right sampling line connecting the two pieces of edge data located in the right area by a straight line. Here, the left sampling line is a line connecting any two pieces of edge data among the plurality of pieces of edge data located in the left area by a straight line, and the right sampling line is a line connecting any two pieces of edge data located in the right area by a straight line.

Then, the control unit 110 detects the number of pieces of edge data located within the set range of the left sampling line, and detects the number of pieces of edge data located within the set range of the right sampling line. The control unit 110 then detects, as the left line, the left sampling line for which the most edge data are detected within the set range, and detects, as the right line, the right sampling line for which the most edge data are detected within the set range.
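The sampling-line selection above resembles a RANSAC-style robust line fit: candidate lines through pairs of edge points are scored by how many edge points fall within the set range. A minimal sketch follows; exhaustive pair enumeration and the names `best_sampling_line` and `point_line_distance` are assumptions (the patent does not say how the pairs are chosen, e.g. randomly or exhaustively).

```python
import itertools

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    den = ((by - ay) ** 2 + (bx - ax) ** 2) ** 0.5
    return num / den

def best_sampling_line(edge_data, set_range=2.0):
    """Try every pair of edge points as a sampling line and keep the
    one with the most edge data within the set range (inlier count)."""
    best, best_count = None, -1
    for a, b in itertools.combinations(edge_data, 2):
        count = sum(1 for p in edge_data
                    if point_line_distance(p, a, b) <= set_range)
        if count > best_count:
            best, best_count = (a, b), count
    return best, best_count

# Seven edge points on the vertical line x = 10 plus two outliers:
# the near-vertical sampling line collects the most inliers and wins.
pts = [(10, y) for y in range(7)] + [(30, 2), (40, 5)]
line, inliers = best_sampling_line(pts)
```

The inlier count also feeds directly into the accuracy values discussed below (inliers as a proportion of all edge data in the area).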

Thereafter, the apparatus 100 calculates the accuracy values of the detected left and right lines. Here, the accuracy value of the left line is the proportion of the edge data located within the set range of the left line among the plurality of edge data located in the left area. For example, when 100 pieces of edge data are located in the left area and 70 pieces of edge data are located within the set range of the left line, the accuracy value of the left line can be calculated to be 70. The accuracy value of the right line is the proportion of the edge data located within the set range of the right line among the plurality of edge data located in the right area. For example, when 100 pieces of edge data are located in the right area and 80 pieces of edge data are located within the set range of the right line, the accuracy value of the right line can be calculated to be 80.

Thereafter, the apparatus 100 defines the intersection of the extension lines of the left line and the right line as the second point. Here, the second point is the intersection of the extension lines of the left line and the right line. Such a second point can be defined when the apparatus 100 photographing the opened book and the opened book are not parallel to each other, or when they are parallel but the apparatus 100 is not aligned exactly with the center of the opened book.

Thereafter, the apparatus 100 calculates the accuracy value of the second point. Here, the accuracy value of the second point is the lower of the accuracy value of the left line and the accuracy value of the right line. For example, if the accuracy value of the left line is 60 and the accuracy value of the right line is 70, the accuracy value of the second point can be calculated to be 60, the lower of the two.
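The second point, as the intersection of the two extension lines, is a standard two-line intersection; a sketch follows, with each line given by two of its points. The function name `line_intersection` and the example coordinates are assumptions for illustration.

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the (extended) line through p1, p2 with the
    (extended) line through p3, p4; returns None if they are parallel."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if den == 0:
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Left line leaning right and right line leaning left (a book
# photographed off-center): their extensions cross above the page.
second = line_intersection((0, 100), (20, 0), (100, 100), (80, 0))

# The accuracy value of the second point is the lower of the two
# line accuracies, per the example in the paragraph above.
accuracy_second = min(60, 70)
```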

Then, the apparatus 100 extracts the lower area from the entire image including the edge line, and detects the lower line. Here, the lower area is the area including the lower edge line among the areas of the entire image including the edge line divided horizontally into equal set areas. Then, the apparatus 100 detects the lower line within the extracted lower area. First, the control unit 110 selects two pieces of edge data among the plurality of pieces of edge data located in the lower area, and obtains a lower sampling line connecting the two pieces of edge data by a straight line. Here, the lower sampling line is a line connecting any two pieces of edge data among the plurality of pieces of edge data located in the lower area by a straight line.

Then, the control unit 110 detects the number of pieces of edge data located within the set range of the lower sampling line, and then detects, as the lower line, the lower sampling line for which the most edge data are detected within the set range. Here, the lower line is the lower sampling line for which the most edge data are detected within the set range.

Thereafter, the apparatus 100 calculates the accuracy value of the detected lower line. Here, the accuracy value of the lower line is the proportion of the edge data located within the set range of the lower line among the plurality of edge data located in the lower area. For example, when 100 pieces of edge data are located in the lower area and 70 pieces of edge data are located within the set range of the lower line, the accuracy value of the lower line can be calculated to be 70.

Then, the apparatus 100 defines, as the third point, the intersection of the lower line with the straight line that, among the straight lines drawn from the first point to the lower line, meets the lower line perpendicularly. Here, the third point is that intersection, that is, the foot of the perpendicular dropped from the first point onto the lower line.
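The third point, as the foot of the perpendicular from the first point onto the lower line, is an orthogonal projection. A sketch follows; the function name `foot_of_perpendicular` and the example coordinates are assumptions for illustration.

```python
def foot_of_perpendicular(p, a, b):
    """Project point p onto the line through a and b: the intersection
    of the lower line with the straight line from p that meets the
    lower line perpendicularly."""
    px, py = p; ax, ay = a; bx, by = b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

# With a horizontal lower line, the third point sits directly below
# the first point, giving a vertical candidate dividing line.
third = foot_of_perpendicular((50, 10), (0, 200), (100, 200))
```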

Thereafter, the apparatus 100 calculates the accuracy value of the third point. Here, the accuracy value of the third point is the proportion of the edge data located within the set range of the lower line among the plurality of edge data located in the lower area, that is, the accuracy value of the lower line. For example, when 100 pieces of edge data are located in the lower area and 80 pieces of edge data are located within the set range of the lower line, the accuracy value of the third point can be calculated to be 80.

Then, the apparatus 100 acquires, as a dividing line of the opened book image, a straight line connecting the first point with either the second point or the third point (S240). First, the control unit 110 defines the point having the relatively higher accuracy value among the second point and the third point as the fourth point. For example, when the accuracy value of the second point is 70 and the accuracy value of the third point is 80, the fourth point is the third point, which has the relatively higher accuracy value. Then, the control unit 110 obtains the straight line connecting the first point and the fourth point as the dividing line of the opened book image. For example, when the accuracy value of the second point is higher than that of the third point, the controller 110 defines the second point as the fourth point and then obtains the straight line connecting the first point and the fourth point as the dividing line of the opened book image. In another example, when the accuracy value of the third point is higher than that of the second point, the control unit 110 defines the third point as the fourth point and then obtains the straight line connecting the first point and the fourth point as the dividing line of the opened book image.
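The selection of the fourth point and of the dividing line in step S240 can be sketched as follows; the function name, the tie-breaking toward the second point, and the representation of the dividing line as a pair of endpoints are assumptions for illustration:

```python
def choose_dividing_line(first_point, second_point, second_acc,
                         third_point, third_acc):
    """Pick the candidate with the higher accuracy value as the fourth
    point and return the dividing line as (first_point, fourth_point)."""
    fourth = second_point if second_acc >= third_acc else third_point
    return (first_point, fourth)
```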

In the apparatus 100 for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, an optimal point for acquiring the dividing line of the opened book image is selected in the process of defining the first point and the fourth point, so a dividing line with higher accuracy can be acquired. Specifically, since the first point is defined as the lowest point of the bend region located closest to the center of the upper edge line of the opened book, and the fourth point is defined as the point having the higher accuracy value among the second point and the third point, a dividing line with higher accuracy can be obtained.

FIG. 3A illustrates an exemplary embodiment for calling and reducing an entire image including an unfolded book image according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3A, the apparatus 100 calls up a stored entire image 301 that includes an expanded book image 302. Specifically, when the control unit 110 calls up the entire image 301 including the expanded book image 302 stored in the storage unit 150, the display unit 130 displays the entire image 301 including the expanded book image 302. Thereafter, the apparatus 100 reduces the entire image including the expanded book image by a set amount. For example, the control unit 110 may reduce the original image of size A×B including the opened book image to an image of size a×b (A>a, B>b).
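The reduction step can be sketched with a simple nearest-neighbour subsampling of a 2-D pixel grid; the function name `reduce_image` and the integer `scale` parameter are illustrative assumptions (the disclosure only requires reducing an A×B image to a×b with A>a, B>b, not any particular resampling method):

```python
def reduce_image(image, scale):
    """Nearest-neighbour reduction of a 2-D pixel grid: keep every
    `scale`-th row and column, so an A x B image becomes roughly
    (A // scale) x (B // scale)."""
    return [row[::scale] for row in image[::scale]]
```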

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an exemplary embodiment of the present invention, the size of the original image is reduced by a set amount as described above, so the amount of data to be processed is reduced, the processing speed in acquiring the dividing line can be improved, and the accuracy of acquiring the dividing line can be improved.

Figures 3B and 3C illustrate an exemplary embodiment for detecting an edge line of an unfolded book image according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3B, the control unit 110 calls the entire image including the expanded book image stored in the storage unit 150, reduces the entire image by a set amount, and then converts the expanded book image into a gradient image. Thereafter, the apparatus 100 detects the edge line of the expanded book image by applying the watershed algorithm to the gradient image. When the apparatus 100 detects the edge line of the expanded book image using the watershed algorithm, the edge line consisting of the plurality of edge data is indicated in white and the remaining image in black; however, for convenience of explanation, in FIG. 3C to FIG. 3R the edge line is shown in black and the remaining image in white.

Referring again to FIG. 3C, the control unit 110 detects the edge line 307 of the expanded book image from the gradient image using the watershed algorithm. Specifically, the control unit 110 detects, using the watershed algorithm, the upper edge line 303, the left edge line 304, the right edge line 305, and the lower edge line 306, which constitute the edge line 307 of the opened book image.

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, the edge line of the expanded book image is detected using the watershed algorithm, but the present invention is not limited thereto; other algorithms may be used to detect the edge line of the expanded book image converted into the gradient image.

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, since the edge line is detected by applying the watershed algorithm to the expanded book image converted into a gradient image as described above, there is an advantage that the edge line of the book image can be detected more accurately.

FIG. 3D illustrates an exemplary embodiment for defining a first point in accordance with an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3D, the control unit 110 calls the entire image including the expanded book image stored in the storage unit 150, and then reduces the entire image by a set amount. Then, the control unit 110 converts the opened book image into a gradient image so as to highlight only the outer portion of the opened book image, and then detects the edge line of the expanded book image by applying the watershed algorithm to the converted gradient image. Specifically, the control unit 110 detects the upper edge line 303, the left edge line, the right edge line, and the lower edge line, which constitute the edge line of the opened book image.

Then, the apparatus 100 defines the first point 308 as the lowest point of the curved region of the upper edge line 303 constituting the edge line. Specifically, among the upper edge line 303, the left edge line, the right edge line, and the lower edge line that constitute the edge line of the opened book image, the control unit 110 defines the lowest point of the bending region included in the upper edge line 303 as the first point 308. That is, the controller 110 defines the center point of the upper edge line 303, which constitutes the outline of the opened book image, as the first point 308.

If there are a plurality of bending regions on the upper edge line 303, the controller 110 defines the first point 308 as the lowest point of the bend region located closest to the center of the upper edge line 303. For example, if there is a first bend region near the left edge of the upper edge line 303 and a second bend region near the center, the controller 110 defines the lowest point of the second, most central bend region as the first point 308 rather than the lowest point of the first bend region 309.
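The definition of the first point — the lowest point of the most central bend region — can be sketched as below, assuming the upper edge line is given as (x, y) samples with y increasing downward as in image coordinates (so the "lowest" point of a bend region is a local maximum of y); the function name and the local-maximum test are illustrative assumptions:

```python
def define_first_point(upper_edge, image_width):
    """Among the local maxima of y along the upper edge line (the
    'lowest' points of bend regions in image coordinates), return the
    one closest to the horizontal centre of the image."""
    center = image_width / 2
    candidates = [upper_edge[i] for i in range(1, len(upper_edge) - 1)
                  if upper_edge[i][1] >= upper_edge[i - 1][1]
                  and upper_edge[i][1] >= upper_edge[i + 1][1]]
    if not candidates:
        # No interior bend region found: fall back to the globally
        # lowest point of the edge line.
        return max(upper_edge, key=lambda p: p[1])
    return min(candidates, key=lambda p: abs(p[0] - center))
```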

As described above, in the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, even when a plurality of bending regions exist on the upper edge line of the reduced expanded book image, the lowest point of the bend region located closest to the center is defined as the first point, so there is an advantage that the position of the first point, which is the reference point of the dividing line, can be accurately detected.

Figure 3E illustrates an exemplary embodiment for extracting the left region according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3E, the apparatus 100 extracts the left region 310 from the entire image including the edge line. First, the control unit 110 vertically divides the entire image including the edge line into the set number of equal parts. For example, the control unit 110 can vertically divide the entire image including the edge line into four equal parts.

Then, the controller 110 extracts, as the left region 310, the region including the left edge line 304 among the divided regions. That is, regardless of how many equal or unequal parts the entire image including the edge line is divided into vertically, the region including the left edge line 304 is the left region 310. If, as a result of dividing the entire image vertically into equal or unequal parts, there is no region including the entire left edge line 304, the control unit 110 may change the number of divisions and repeat the process until a region including the entire left edge line 304 (the left region 310) exists.
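The left-region extraction with its retry behaviour can be sketched as follows; the representation of a region as an (x_start, x_end) strip, the fallback to fewer divisions, and the function name are illustrative assumptions:

```python
def extract_left_region(image_width, left_edge_xs, parts=4):
    """Divide [0, image_width) into `parts` vertical strips and return
    the (x_start, x_end) of the strip containing the whole left edge
    line; retry with fewer strips if no single strip contains it."""
    while parts >= 1:
        strip = image_width / parts
        for i in range(parts):
            lo, hi = i * strip, (i + 1) * strip
            if all(lo <= x < hi for x in left_edge_xs):
                return (lo, hi)
        parts -= 1  # change the number of divisions and repeat
    return (0, image_width)
```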

As described above, in the apparatus 100 and method for detecting an edge line according to an embodiment of the present invention to obtain a dividing line, only the left region including the left edge line is extracted from the entire image including the edge line There is an advantage that the amount of calculation can be reduced and the accuracy can be further increased when detecting the left line.

FIG. 3F shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3F, the controller 110 selects any two pieces of edge data 311 and 312 among the plurality of edge data located in the left region 310, and obtains the left sampling line 313 by connecting the two pieces of edge data 311 and 312 with a straight line. Then, the control unit 110 selects the reference edge data 314 located at the center between the two edge data 311 and 312. The control unit 110 then detects the two straight lines 315 and 316 passing through the reference edge data 314 and forming the set angle 312-1 with the left sampling line 313. For example, the control unit 110 may detect two straight lines 315 and 316 passing through the reference edge data 314 and forming an interior angle of 10 degrees with the left sampling line 313.

The controller 110 then detects the number of edge data located between the two straight lines 315 and 316, and repeats the above-described process (selecting any two different pieces of edge data, detecting the two straight lines forming the set interior angle with the left sampling line, and detecting the number of edge data located within the set range).
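Counting the edge data lying between the two straight lines that form the set interior angle with the sampling line amounts to an angular test around the reference point: a point is inside the wedge when the undirected angle between its direction from the reference point and the sampling-line direction does not exceed the set angle. A sketch, with all names and the degree-based representation assumed for illustration:

```python
import math

def count_within_angle(edge_data, ref, sample_dir_deg, set_angle_deg):
    """Count edge points lying between the two straight lines that pass
    through `ref` and form `set_angle_deg` with the sampling line,
    whose direction is `sample_dir_deg` (degrees)."""
    count = 0
    for (x, y) in edge_data:
        if (x, y) == ref:
            count += 1  # the reference point itself lies on both lines
            continue
        ang = math.degrees(math.atan2(y - ref[1], x - ref[0]))
        # Undirected angular difference, folded into [0, 90].
        diff = abs(((ang - sample_dir_deg) + 90) % 180 - 90)
        if diff <= set_angle_deg:
            count += 1
    return count
```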

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, a process of detecting the number of edge data within the set range based on the left sampling line is introduced in order to detect the left line from the left edge line, so a more accurate left line can be detected.

FIG. 3G shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3G, the controller 110 selects any two pieces of edge data 311 and 312 among the plurality of edge data located in the left region 310, and obtains the left sampling line 313 by connecting the two pieces of edge data 311 and 312 with a straight line.

Then, the controller 110 detects the number of edge data located in the set range 317 including the two edge data 311 and 312. For example, when 200 pieces of edge data are located within the set range 317, the controller 110 can detect the number of 200 pieces of edge data.

Then, the control unit 110 repeats the above-described process (a process of acquiring a new left sampling line and detecting the number of edge data located within the set range) by selecting any two different edge data.

FIG. 3H illustrates an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3H, the controller 110 selects any two pieces of edge data 311 and 312 among the plurality of edge data located in the left region 310, and obtains the left sampling line 313 by connecting the two pieces of edge data 311 and 312 with a straight line. Thereafter, the control unit 110 selects the edge data 311 located on the upper side of the two edge data 311 and 312. The controller 110 then detects the two straight lines 315 and 316 passing through the edge data 311 and forming the set angle with the left sampling line 313. For example, the control unit 110 can detect two straight lines 315 and 316 passing through the edge data 311 and forming an interior angle 312-2 of 15 degrees with the left sampling line 313.

The controller 110 then detects the number of edge data located between the two straight lines 315 and 316, and repeats the above-described process (selecting any two different pieces of edge data, detecting the two straight lines forming the set interior angle with the left sampling line, and detecting the number of edge data located within the set range).

FIG. 3I shows an exemplary embodiment for detecting the number of edge data located within a set range based on the left sampling line according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3I, the controller 110 selects any two pieces of edge data 311 and 312 among the plurality of edge data located in the left region 310, and obtains the left sampling line 313 by connecting the two pieces of edge data 311 and 312 with a straight line. Thereafter, the controller 110 selects the edge data 312 located at the lower side of the two edge data 311 and 312. The control unit 110 then detects the two straight lines 315 and 316 passing through the edge data 312 and forming the set angle 312-1 with the left sampling line 313. For example, the control unit 110 can detect two straight lines 315 and 316 passing through the edge data 312 and forming an interior angle 312-1 of 15 degrees with the left sampling line 313.

The controller 110 then detects the number of edge data located between the two straight lines 315 and 316, and repeats the above-described process (selecting any two different pieces of edge data, detecting the two straight lines forming the set interior angle with the left sampling line, and detecting the number of edge data located within the set range).

FIG. 3J illustrates an exemplary embodiment for detecting the left line and calculating the accuracy value of the left line in accordance with an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3J, the controller 110 detects the number of edge data within the set range based on each left sampling line, and then detects, as the left line 318, the left sampling line for which the greatest number of edge data is located within the set range. For example, when 60, 70, and 80 edge data are detected within the set ranges of the first left sampling line, the second left sampling line, and the Nth left sampling line, respectively, the controller 110 can detect the Nth left sampling line as the left line 318.

Thereafter, the apparatus 100 calculates the accuracy value of the detected left line 318. For example, when 100 pieces of edge data are located in the left region 310 and 70 pieces of edge data are located in the range set with reference to the left line 318 in the process of detecting the left line 318, the control unit 110 may calculate the accuracy value of the left line 318 to be 70.

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, as described above, even when other noise (e.g., a shadowed portion of the book, a portion of a finger, etc.) is detected, the left line can be detected more accurately.

FIG. 3K illustrates an exemplary embodiment for extracting the right region according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3K, the apparatus 100 extracts the right region 319 from the entire image including the edge line. First, the control unit 110 vertically divides the entire image including the edge line into the set number of equal parts. For example, the control unit 110 can vertically divide the entire image including the edge line into four equal parts.

Then, the control unit 110 extracts, as the right region 319, the region including the right edge line 305 among the divided regions. That is, regardless of how many equal or unequal parts the entire image including the edge line is divided into vertically, the region including the right edge line 305 is the right region 319. If, as a result of dividing the entire image vertically into equal or unequal parts, there is no region including the entire right edge line 305, the control unit 110 may change the number of divisions and repeat the process until a region including the entire right edge line 305 (the right region 319) exists.

FIG. 3L illustrates an exemplary embodiment for detecting the right line and calculating the accuracy value of the right line according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3L, the controller 110 detects the number of edge data within the set range based on each right sampling line, and then detects, as the right line 320, the right sampling line for which the greatest number of edge data is located within the set range. For example, when 60, 70, and 80 edge data are respectively detected within the set ranges of the first right sampling line, the second right sampling line, and the Nth right sampling line, the control unit 110 can detect the Nth right sampling line as the right line 320. Here, the various embodiments for detecting the right line are the same as those described above with reference to FIGS. 3F to 3I, and are omitted for convenience.

Thereafter, the apparatus 100 calculates the accuracy value of the detected right line 320. For example, when 100 pieces of edge data are located in the right region 319 and 70 pieces of edge data are located in the range set with reference to the right line 320 in the process of detecting the right line 320, the controller 110 may calculate the accuracy value of the right line 320 to be 70.

FIG. 3M illustrates an exemplary embodiment for defining a second point and calculating the accuracy value of the second point according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3M, the apparatus 100 defines the intersection of the extension lines of the left line 318 and the right line 320 as the second point 321. The second point 321 can be defined when the device 100 that photographed the opened book and the opened book are not parallel to each other, or when the device 100 and the opened book are parallel but the lines do not exactly coincide with the center of the opened book.
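The second point is a standard intersection of two infinite lines; a sketch, with each line represented by a pair of its points (an illustrative assumption — the disclosure does not specify a line representation):

```python
def line_intersection(l1, l2):
    """Intersection of the infinite extensions of two lines, each given
    as a pair of (x, y) points; None if the lines are parallel."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if den == 0:
        return None  # parallel lines never meet
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```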

Thereafter, the apparatus 100 calculates the accuracy value of the second point 321. Here, the accuracy value of the second point 321 refers to the lower of the accuracy value of the left line 318 and the accuracy value of the right line 320. For example, if the accuracy value of the left line 318 is 60 due to noise such as the user's finger and the accuracy value of the right line 320 is 70, the accuracy value of the second point 321 can be calculated to be 60, the relatively lower of the two.

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, as described above, since the lower of the accuracy value of the left line and the accuracy value of the right line is calculated as the accuracy value of the second point, there is an advantage that the reliability of the second point can be reflected more accurately.

FIG. 3N illustrates an exemplary embodiment for extracting a lower region according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3N, the apparatus 100 extracts the lower region 322 from the entire image including the edge line. First, the control unit 110 horizontally divides the entire image including the edge line into the set number of equal parts. For example, the control unit 110 can horizontally divide the entire image including the edge line into three equal parts.

Then, the control unit 110 extracts, as the lower region 322, the region including the lower edge line 306 among the divided regions. That is, regardless of how many equal or unequal parts the entire image including the edge line is divided into horizontally, the region including the lower edge line 306 is the lower region 322. If, as a result of dividing the entire image horizontally into equal or unequal parts, there is no region including the entire lower edge line 306, the control unit 110 may change the number of divisions and repeat the process until a region including the entire lower edge line 306 (the lower region 322) exists.

FIG. 3O illustrates an exemplary embodiment for detecting the lower line and calculating the accuracy value of the lower line in accordance with an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3O, the controller 110 detects the number of edge data within the set range based on each lower sampling line, and then detects, as the lower line 323, the lower sampling line for which the greatest number of edge data is located within the set range. For example, when 60, 70, and 80 edge data are detected within the set ranges of the first lower sampling line, the second lower sampling line, and the Nth lower sampling line, respectively, the controller 110 can detect the Nth lower sampling line as the lower line 323. Here, the various embodiments for detecting the lower line are the same as those described above with reference to FIGS. 3F to 3I, and are omitted for simplicity.

Thereafter, the apparatus 100 calculates the accuracy value of the detected lower line 323. For example, when 100 pieces of edge data are located in the lower region 322 and 70 pieces of edge data are located in the range set with reference to the lower line 323 in the process of detecting the lower line 323, the controller 110 can calculate the accuracy value of the lower line 323 to be 70.

FIG. 3P illustrates an exemplary embodiment for defining a third point and a fourth point according to an embodiment of the present invention, and calculating an accuracy value for the third point. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3P, the apparatus 100 defines, as the third point 324, the intersection of the lower line 323 and the straight line that is perpendicular to the lower line 323 among the straight lines connecting the first point 308 to the lower line 323.
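The third point is the foot of the perpendicular dropped from the first point onto the lower line; a sketch, with the lower line represented by two of its points (an illustrative assumption):

```python
def perpendicular_foot(p, a, b):
    """Foot of the perpendicular dropped from point p onto the line
    through a and b, obtained by projecting p onto the line."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)
```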

The device 100 then calculates the accuracy value of the third point 324. For example, if 100 edge data are located in the lower region and 80 edge data are located within the set range of the lower line 323, the accuracy value can be calculated as 80. That is, the accuracy value of the third point 324 is the accuracy value obtained in the process of detecting the lower line 323.

Thereafter, the apparatus 100 defines either the second point or the third point 324 as the fourth point. Specifically, the control unit 110 defines the point having the relatively higher accuracy value among the second point and the third point 324 as the fourth point. For example, when the accuracy value of the second point is 70 and the accuracy value of the third point 324 is 80, the fourth point is the third point 324, which has the relatively higher accuracy value.

Since the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention use the first to third points defined precisely as described above, there is an advantage that an accurate dividing line can be obtained.

FIG. 3Q illustrates an exemplary embodiment for obtaining a dividing line of an unfolded book image according to an embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3Q, the control unit 110 obtains, as the dividing line of the opened book image, the straight line 325 connecting the first point 308 with whichever of the second point 321 and the third point 324 has the relatively higher accuracy value. For example, when the accuracy value of the second point 321 among the second point 321 and the third point 324 is higher, the controller 110 defines the second point 321 as the fourth point 321, and the straight line 325 connecting the first point 308 and the fourth point 321 is obtained as the dividing line of the opened book image.

In the apparatus 100 and method for detecting an edge line to obtain a dividing line according to an embodiment of the present invention, since the first to fourth points are precisely defined as described above, a more accurate dividing line can be obtained.

FIG. 3R shows an exemplary embodiment for obtaining a dividing line of an opened book image according to another embodiment of the present invention. For convenience of explanation, FIG. 1 will be described together.

Referring to FIG. 3R, the control unit 110 obtains, as the dividing line of the opened book image, the straight line 325 connecting the first point 308 with whichever of the second point and the third point 324 has the relatively higher accuracy value. For example, when the accuracy value of the third point 324 among the second point and the third point 324 is higher, the control unit 110 defines the third point 324 as the fourth point 324, and the straight line 325 connecting the first point 308 and the fourth point 324 is obtained as the dividing line of the opened book image.

In this specification, each block or each step may represent a part of a module, segment or code that includes one or more executable instructions for executing the specified logical function (s). It should also be noted that in some alternative embodiments, the functions mentioned in the blocks or steps may occur out of order. For example, two blocks or steps shown in succession may in fact be performed substantially concurrently, or the blocks or steps may sometimes be performed in reverse order according to the corresponding function.

The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in a RAM memory, a flash memory, a ROM memory, an EPROM memory, an EEPROM memory, a register, a hard disk, a removable disk, a CD-ROM or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor, which is capable of reading information from, and writing information to, the storage medium. Alternatively, the storage medium may be integral with the processor. The processor and the storage medium may reside within an application specific integrated circuit (ASIC). The ASIC may reside within the user terminal. Alternatively, the processor and the storage medium may reside as discrete components in a user terminal.

Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, it is to be understood that the present invention is not limited to those embodiments, and various changes and modifications may be made without departing from the scope of the present invention. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. It should be understood that the above-described embodiments are illustrative in all aspects and not restrictive. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents should be construed as falling within the scope of the present invention.

100 device
110 control unit
120 detector
130 display unit
140 communication section
150 storage unit
301 Entire image
302 Expanded Book Image
303 upper edge line
304 Left edge line
305 right edge line
306 lower edge line
307 Edge Line
308 First point
309 Lowest point of the bend region near the edge of the opened book image
310 left area
311 Edge data
312 Edge data
312-1 Interior angle between the left sampling line and each of the two straight lines
312-2 Interior angle of the two straight lines
313 Left Sampling Line
314 Edge data
315 Straight line passing edge data
316 Straight line through edge data
317 Set range
318 Left line
319 right area
320 right line
321 Second point
322 lower region
323 Lower line
324 Third point
325 Dividing line

Claims (15)

Detecting an edge line of the opened book image;
Defining a lowest point of the bending region among the upper side edge lines constituting the edge line as a first point;
Defining a second point and a third point based on the left line, the right line and the lower line detected from the edge line; And
And acquiring a straight line connecting any one of the second point and the third point with the first point as a dividing line of the opened book image.
The method according to claim 1,
Calling an original image including the opened book image; And
Further comprising reducing the original image by a predetermined amount.
The method according to claim 1,
Wherein the step of detecting an edge line of the opened book image comprises:
Converting the expanded book image into a gradient image; And
And detecting an edge line of the expanded book image by applying the watershed algorithm to the gradient image.
The method according to claim 1,
Wherein the step of defining, as a first point, the lowest point of the bending region among the upper edge lines constituting the edge line,
And defining the lowest point of the bend region located closest to the center of the upper edge line as the first point when a plurality of bend regions are present.
The method according to claim 1,
Wherein defining the second point and the third point comprises:
Dividing the entire image including the edge line vertically equally by a predetermined area; And
Extracting, as a left region, a region including a left edge line among the divided regions, and extracting, as a right region, a region including a right edge line.
The method of claim 5, wherein defining the second point and the third point comprises:
selecting two of a plurality of edge data located in the left region and two of a plurality of edge data located in the right region;
obtaining a left sampling line connecting the two edge data in the left region by a straight line and a right sampling line connecting the two edge data in the right region by a straight line;
counting the edge data located within a set range of the left sampling line and counting the edge data located within a set range of the right sampling line; and
detecting, as the left line, the left sampling line for which the largest number of edge data falls within the set range, and detecting, as the right line, the right sampling line for which the largest number of edge data falls within the set range.
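Claims 6 and 10 describe a two-point line-sampling scheme with inlier counting, close in spirit to RANSAC. A hedged sketch, using perpendicular point-to-line distance as the "set range" test (an exhaustive pair search here where a real implementation might sample randomly; all names are illustrative):

```python
from itertools import combinations

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    den = ((by - ay) ** 2 + (bx - ax) ** 2) ** 0.5
    return num / den

def best_sampling_line(points, set_range=1.0):
    """Try every pair of edge points as a candidate sampling line and
    keep the one with the most edge data within set_range of it."""
    best, best_count = None, -1
    for a, b in combinations(points, 2):
        count = sum(1 for p in points
                    if point_line_dist(p, a, b) <= set_range)
        if count > best_count:
            best, best_count = (a, b), count
    return best, best_count

pts = [(0, 0), (1, 1), (2, 2), (3, 3), (5, 0)]  # four collinear points + one outlier
line, inliers = best_sampling_line(pts)
```

The inlier count doubles as a natural "accuracy value" for the fitted line, which is how the later claims can compare the second and third points.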
The method of claim 6, wherein defining the second point and the third point comprises:
defining the intersection of the extensions of the left line and the right line as the second point.
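The claim-7 second point is an ordinary line-line intersection of the extended left and right lines; a minimal sketch (two-point line representation and coordinates are illustrative):

```python
def intersection(l1, l2):
    """Intersection of two lines, each given as a pair of points
    ((x1, y1), (x2, y2)); returns None for parallel lines."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        return None  # parallel lines: no unique second point
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Extensions of a left page edge and a right page edge meeting above the book:
second_point = intersection(((0, 10), (4, 2)), ((10, 10), (6, 2)))
```

Because the page edges converge under perspective, their extensions meet at a vanishing-point-like location that anchors the dividing line.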
The method of claim 7, wherein defining the second point and the third point comprises:
calculating an accuracy value of the left line and an accuracy value of the right line,
wherein the accuracy value of the second point is the lower of the accuracy value of the left line and the accuracy value of the right line.
The method according to claim 1, wherein defining the second point and the third point comprises:
dividing the entire image including the edge line horizontally into equal regions of a predetermined size; and
extracting, from the divided regions, the region including the lower edge line as a lower region.
The method of claim 9, wherein defining the second point and the third point comprises:
selecting two of a plurality of edge data located in the lower region;
obtaining a lower sampling line connecting the two edge data by a straight line;
counting the edge data located within a set range of the lower sampling line; and
detecting, as the lower line, the lower sampling line for which the largest number of edge data falls within the set range.
The method of claim 10, wherein defining the second point and the third point comprises:
calculating an accuracy value of the lower line, which serves as the accuracy value of the third point.
The method of claim 10, wherein defining the second point and the third point comprises:
defining, as the third point, the intersection of the lower line and the straight line that extends from the first point perpendicularly to the lower line.
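The claim-12 third point is the foot of the perpendicular dropped from the first point onto the lower line, which can be computed by projection; a minimal sketch with illustrative names:

```python
def perpendicular_foot(p, a, b):
    """Foot of the perpendicular dropped from point p onto the line
    through a and b: the intersection of that line with the one line
    from p that meets it at a right angle."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    # Scalar projection of (p - a) onto the line direction:
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

# First point above a horizontal lower edge line:
third_point = perpendicular_foot((5, 1), (0, 10), (20, 10))
```

Dropping a perpendicular makes the candidate dividing line as close to vertical (relative to the lower page edge) as possible, which matches the geometry of a book fold.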
The method according to claim 1, wherein acquiring a straight line connecting the first point with either the second point or the third point as a dividing line of the opened book image comprises:
defining, as a fourth point, whichever of the second point and the third point has the higher accuracy value; and
acquiring the straight line connecting the first point and the fourth point as the dividing line of the opened book image.
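The final selection in claim 13 reduces to comparing the two candidates' accuracy values; a hedged sketch, assuming the accuracy values (e.g. inlier counts or ratios of the fitted lines) are already computed and that ties favor the second point — a detail the claims leave open:

```python
def dividing_line(first_point, second_point, acc2, third_point, acc3):
    """Pick whichever candidate has the higher accuracy value as the
    fourth point, and return the segment from the first point to it
    as the dividing line of the opened book image."""
    fourth = second_point if acc2 >= acc3 else third_point
    return (first_point, fourth)

# First point at the top of the fold; second point from the left/right
# intersection, third from the perpendicular foot on the lower line:
line = dividing_line((50, 5), (49, 90), 0.6, (51, 92), 0.8)
```

Here the third point wins because its lower-line fit scored higher, so the dividing line runs from the fold's lowest upper-edge point down to the perpendicular foot.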
A device for obtaining a dividing line by detecting an edge line, the device comprising:
a detector configured to detect an edge line of an opened book image; and
a controller configured to define, as a first point, the lowest point of a bending region of the upper edge line constituting the edge line, to define a second point and a third point based on a left line, a right line, and a lower line detected from the edge line, and to acquire a straight line connecting the first point with either the second point or the third point as a dividing line of the opened book image.
A storage medium storing instructions that, when executed, cause a device to: detect an edge line of an opened book image; define, as a first point, the lowest point of a bending region of the upper edge line constituting the edge line; define a second point and a third point based on a left line, a right line, and a lower line detected from the edge line; and acquire a straight line connecting the first point with either the second point or the third point as a dividing line of the opened book image.
KR1020150145846A 2015-10-20 2015-10-20 Device and method for obtaining edge line by detecting outline KR101761641B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150145846A KR101761641B1 (en) 2015-10-20 2015-10-20 Device and method for obtaining edge line by detecting outline


Publications (2)

Publication Number Publication Date
KR20170045846A true KR20170045846A (en) 2017-04-28
KR101761641B1 KR101761641B1 (en) 2017-08-08

Family

ID=58701907

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150145846A KR101761641B1 (en) 2015-10-20 2015-10-20 Device and method for obtaining edge line by detecting outline

Country Status (1)

Country Link
KR (1) KR101761641B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113588667A (en) * 2019-05-22 2021-11-02 合肥联宝信息技术有限公司 Method and device for detecting object appearance
CN115908429A (en) * 2023-03-08 2023-04-04 山东歆悦药业有限公司 Foot bath powder grinding precision detection method and system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8270044B2 (en) * 2006-10-26 2012-09-18 Samsung Electronics Co., Ltd. Scanning apparatus having image correction function
CN102196112B (en) 2010-03-01 2014-09-24 佳能株式会社 Page border detection method and device
JP2013192100A (en) * 2012-03-14 2013-09-26 Panasonic Corp Image processor and original reading system equipped with the same
JP2013242826A (en) * 2012-05-23 2013-12-05 Panasonic Corp Image processing device and document reading system including the same


Also Published As

Publication number Publication date
KR101761641B1 (en) 2017-08-08

Similar Documents

Publication Publication Date Title
US11743571B2 (en) Electronic device and operating method thereof
CA2941143C (en) System and method for multi-focus imaging
JP5826081B2 (en) Image processing apparatus, character recognition method, and computer program
US9076221B2 (en) Removing an object from an image
US9262690B2 (en) Method and device for detecting glare pixels of image
US9196055B2 (en) Method and apparatus for providing a mechanism for gesture recognition
US10027878B2 (en) Detection of object in digital image
JP2008283649A (en) Image processing method, image region detecting method, image processing program, image region detection program, image processing apparatus, and image region detecting apparatus
KR20150059989A (en) Apparatus and Method for recognition a documentation with text and image
US9898800B2 (en) Image processing apparatus and image processing method
KR101761641B1 (en) Device and method for obtaining edge line by detecting outline
AU2011265380B2 (en) Determining transparent fills based on a reference background colour
WO2022088946A1 (en) Method and apparatus for selecting characters from curved text, and terminal device
JP2010021656A (en) Image processor and image processing method
KR20130134546A (en) Method for create thumbnail images of videos and an electronic device thereof
JP6669390B2 (en) Information processing apparatus, information processing method, and program
US10147169B2 (en) Image processing device and program
JP2005316958A (en) Red eye detection device, method, and program
EP2800349B1 (en) Method and electronic device for generating thumbnail image
US9886767B2 (en) Method, apparatus and computer program product for segmentation of objects in images
JP2016167258A (en) Method, device and computer program product of reducing chromatic aberration in deconvolution images
US9355456B2 (en) Method, apparatus and computer program product for compensating eye color defects
JP2010197968A (en) Focus evaluation apparatus, camera and program
JP2016048851A (en) Marker embedding device, marker detection device, method and program
JP2015028735A (en) Image processing device and program

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right