KR101725041B1 - Method and apparatus for detecting forged image - Google Patents

Method and apparatus for detecting forged image Download PDF

Info

Publication number
KR101725041B1
Authority
KR
South Korea
Prior art keywords
edge
image
blur
target image
blur index
Prior art date
Application number
KR1020150104105A
Other languages
Korean (ko)
Other versions
KR20170011454A (en)
Inventor
엄일규
정보규
Original Assignee
부산대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 부산대학교 산학협력단 filed Critical 부산대학교 산학협력단
Priority to KR1020150104105A priority Critical patent/KR101725041B1/en
Publication of KR20170011454A publication Critical patent/KR20170011454A/en
Application granted granted Critical
Publication of KR101725041B1 publication Critical patent/KR101725041B1/en

Links

Images

Classifications

    • G06K9/4604
    • G06K9/4609
    • G06K9/6204
    • G06K9/6218

Abstract

A forged image detection method is provided. In the method, an edge detector detects edge points of a check target image, a blur index measuring unit measures the blur index at each edge point detected by the edge detector, a statistical parameter detector models a blur index function using the blur indices measured by the blur index measuring unit and estimates statistical parameters based on the modeled blur index function, and a forged region determination unit clusters the statistical parameters estimated by the statistical parameter detector and determines the forged region of the check target image using the clusters.

Description

FIELD OF THE INVENTION [0001]

The present invention relates to a method and apparatus for detecting a counterfeit image.

Today, image forgery has become easier due to the development of digital cameras and various image processing programs.

Falsified images can be used for crime, etc., and it is important to judge whether the images are forged or not.

At the time of image synthesis, the edge portion of the synthesized region can be blurred to hide the heterogeneous portion of the synthesized region.

The natural blur may be uniformly distributed throughout the image. However, artificial blur can be concentrated in one area.

Judging the falsification area of the image using the blurred part may be an effective method for judging whether the image is falsified or not.

Korean Patent Publication No. 10-2013-0077357

The present invention provides a method for determining whether an image is falsified by using an edge blur index.

According to an aspect of the present invention, there is provided a computer-readable recording medium storing a program for determining whether or not an image is falsified by using a blur index of an edge of an image to be identified.

The technical objects of the present invention are not limited to the technical matters mentioned above, and other technical subjects not mentioned can be clearly understood by those skilled in the art from the following description.

According to another aspect of the present invention, there is provided a method for detecting a forged image, the method comprising: detecting, by an edge detector, edge points of a check target image; measuring, by a blur index measuring unit, the blur index at each edge point detected by the edge detector; modeling, by a statistical parameter detector, a blur index function using the measured blur indices and estimating statistical parameters based on the modeled blur index function; and clustering, by a forged region determination unit, the estimated statistical parameters and determining the forged region of the check target image using the clusters.

In some embodiments, the clustering of the statistical parameters estimated by the statistical parameter detector and the determining of the forged region of the check target image may include determining a region where the statistical parameters are locally dense as the forged region of the check target image.

In some embodiments, the edge points of the check target image include a plurality of edge points, and the measuring of the blur index by the blur index measuring unit may include measuring the blur index for each of the plurality of edge points.

In some embodiments, the detecting of the edge points of the check target image by the edge detector includes measuring an edge sample at each edge point of the check target image, and the blur index may be a value that minimizes a cost function using the edge sample.

According to still another aspect of the present invention, there is provided a computer-readable recording medium storing a program for detecting edge points of a check target image, measuring the blur index at each detected edge point, modeling a blur index function using the measured blur indices, estimating statistical parameters based on the modeled blur index function, clustering the estimated statistical parameters, and determining the forged region of the check target image using the clusters.

In some embodiments, determining the forgery region of the image to be identified may include determining a locally dense portion of the statistical parameter as a forgery region of the image to be identified.

In some embodiments, the edge points of the check target image include a plurality of edge points, and measuring the blur index of the detected edge points may include measuring the blur index for each of the plurality of edge points.

In some embodiments, detecting the edge points of the check target image includes measuring an edge sample at each edge point of the check target image, and the blur index may be a value that minimizes a cost function using the edge sample.

The details of other embodiments are included in the detailed description and drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a forged image detection system according to embodiments of the present invention.
FIG. 2 is a flowchart of a method for detecting a forged image according to embodiments of the present invention.
FIG. 3 is a flowchart of a blur index measuring method according to embodiments of the present invention.
FIG. 4 is a diagram for explaining an edge sample detection method according to embodiments of the present invention.
FIG. 5 is a graph showing the blur exponential function.
FIG. 6 is a view showing edge models according to the blur index.
FIG. 7 is a graph showing a probability density function of the blur index.
FIG. 8 is a graph showing the distribution of statistical parameters before and after blurring.
FIGS. 9A and 9B are diagrams for explaining a result of forged image detection according to embodiments of the present invention.
FIG. 10 is a diagram illustrating a computing system including a program for detecting forged images according to embodiments of the present invention.

The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described hereinafter in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. The dimensions and relative sizes of the components shown in the figures may be exaggerated for clarity of description. Like reference numerals refer to like elements throughout the specification, and "and/or" includes each and every combination of one or more of the mentioned items.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In the present specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising" used in the specification do not exclude the presence or addition of one or more other elements in addition to the stated elements.

Unless defined otherwise, all terms (including technical and scientific terms) used herein may be used in a sense commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.

Referring to FIG. 1, a forged image detection system according to embodiments of the present invention will be described.

FIG. 1 is a block diagram of a forged image detection system according to embodiments of the present invention.

The falsification image detection system 10 includes an edge detection unit 11, a blur index measurement unit 13, a statistical parameter detection unit 15, and a fake area determination unit 17.

The falsification image detecting system 10 according to the present embodiment may be implemented in a program form using, for example, command codes, commands, and the like, but the present invention is not limited thereto. In some embodiments, the counterfeit image detection system 10 may be implemented in hardware using, for example, an FPGA or the like.

The term 'part' (or 'unit') used in this embodiment means software or a hardware component such as an FPGA or an ASIC, and a 'part' performs certain roles. However, a 'part' is not limited to software or hardware. A 'part' may be configured to reside in an addressable storage medium and configured to execute on one or more processors. Thus, by way of example and not limitation, a 'part' includes components such as software components, object-oriented software components, class components and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functions provided in the components and 'parts' may be combined into a smaller number of components and 'parts' or further separated into additional components and 'parts'.

The edge detector 11 can detect an edge point of a check target image.

Specifically, the edge detecting unit 11 can use an edge detecting method of the check target image to detect an edge point of the check target image.

At this time, the edge of the check target image may be composed of an edge point. Further, the number of edge points of the verification target image may be plural. That is, the edge point of the check target image may include a plurality of edge points.

The detection of the edge point of the image to be checked by the edge detecting unit 11 may include the measurement of the edge sample Sr at the edge point of the image to be confirmed. The method of measuring the edge specimen (Sr) will be described later.

When the edge detecting unit 11 detects an edge point of the check target image and measures the edge sample Sr, the blur index measuring unit 13 can measure the blur index at the detected edge point of the check target image.

Specifically, the blur index measuring unit 13 can define a cost function that uses the edge specimen (Sr) at the edge point of the verification target image detected by the edge detection unit (11).

In addition, the blur index measuring unit 13 can measure the blur index at the edge point of the image to be checked using the cost function.

Here, the blur index measured by the blur index measuring unit 13 may be a value that minimizes the cost function using the edge specimen Sr.

For example, the blur index measuring unit 13 may perform an optimization method on the cost function to measure the blur index at the edge point of the image to be identified.

At this time, when there are a plurality of edge points of the verification target image, the blur index measuring unit 13 can measure the blur index with respect to a plurality of edge points of the verification target image.

The statistical parameter detector 15 can model the blur exponential function using the blur indices measured by the blur index measuring unit 13. Furthermore, the statistical parameter detector 15 can estimate statistical parameters based on the modeled blur exponential function.

The blur exponential function may, for example, consist of an impulse function and an arbitrary function. In addition, the statistical parameter detector 15 may use, for example, a maximum likelihood method when estimating a statistical parameter based on the modeled blur exponential function. Details of this will be described later.

The forged region determiner 17 clusters the statistical parameters estimated by the statistical parameter detector 15 and determines the forgery region of the image to be checked.

Specifically, the falsification area determination unit 17 can determine, as a result of clustering statistical parameters, a portion where statistical parameters are locally dense as a falsification area of the verification target image.

Hereinafter, a method of detecting a forged image in the forged image detection system according to embodiments of the present invention will be described with reference to FIGS. 1 to 8.

FIG. 2 is a flowchart of a method of detecting a forged image according to embodiments of the present invention.

Referring to FIG. 2, an edge of a check target image is detected (S100). For example, the edge detecting unit 11 of the counterfeit image detecting system 10 can detect the edge of the check target image for edge point detection of the check target image.

Here, the check target image may be an image to be checked for the presence of a forged portion.

The edge of the confirmation target image may be, for example, a part where the brightness rapidly changes in the confirmation target image. In addition, the edge of the confirmation target image may include, for example, a high frequency component.

Therefore, the edge detecting unit 11 can detect the edge of the check target image using a high-pass filter or other edge detection methods.

In some embodiments, the edge of the image to be identified may be detected using the Canny edge detection method.

When the edge detecting unit 11 detects the edge of the check target image, the edge point of the check target image can be detected.

The edge point of the check target image may be an object for measuring the blur index to be described below.
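The following minimal Python sketch illustrates one way step S100 could be realized, assuming OpenCV's Canny detector; the threshold values and the treatment of every edge pixel as an edge point are assumptions for illustration, not choices fixed by the patent.

    # Illustrative sketch only: Canny-based edge point detection for step S100.
    # The thresholds below and the use of every edge pixel as an "edge point"
    # are assumptions, not values given in the patent.
    import cv2
    import numpy as np

    def detect_edge_points(image_path, low_thresh=50, high_thresh=150):
        """Return an (N, 2) array of (row, col) edge-point coordinates."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        edges = cv2.Canny(gray, low_thresh, high_thresh)  # binary edge map
        rows, cols = np.nonzero(edges)                    # edge pixels as edge points
        return np.stack([rows, cols], axis=1)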

Hereinafter, a method of obtaining the edge sample (Sr) will be described with reference to FIGS. 3 and 4.

FIG. 3 is a flowchart of a blur index measuring method according to an embodiment of the present invention, and FIG. 4 is a view for explaining an edge sample detecting method according to embodiments of the present invention.

Referring to FIG. 3, after the edge of the check target image is detected (S100), a window centered on one edge point is set (S101). For example, the edge detection unit 11 of the forged image detection system 10 can set a window centered on a detected edge point of the check target image. At this time, for example, when there are a plurality of edge points in the check target image, the window can be set around one of the edge points.

The window may be, for example, a square shape.

A window may contain one edge point and may be set around the included edge point. However, the embodiments of the present invention are not limited thereto.

For example, one or more edge points may be included in a window. In this case, however, only the edge point in the center of the window can be subjected to blur index measurement, which will be described later.

Next, the edge specimen (Sr) can be measured by measuring the advancing direction of the edge of the confirmation target image at one edge point in the center of the window (S102). For example, the edge detection unit 11 of the falsification image detection system 10 can measure the edge specimen Sr.

Referring to FIG. 4, (a) shows a window set around an edge point of the check target image, (b) shows the edge direction and the normal direction at the edge point of the check target image, and (c) is a graph of the edge sample (Sr) measured at the edge point of the check target image.

In step S102, in order to measure the edge sample (Sr) of the check target image, it may be necessary to set the direction along which the edge is represented as a one-dimensional cross-section.

Referring to FIG. 4B, the edge sample (Sr) may be obtained by extracting values along the normal, which is perpendicular to the direction in which the edge of the check target image runs.

Specifically, in order to obtain the edge sample (Sr), the normal direction of the edge can be determined from the ratio of the gradients in the vertical and horizontal directions of the edge of the check target image.

In this case, the direction may be divided into vertical, horizontal, and two diagonal directions.

The edge sample (Sr) may be extended in steps of, for example, one pixel from the edge point of the check target image at the center of the window.

For example, the maximum width (W) may be a total of 13 points, 6 points to the left and right.

In Fig. 4, W may be the maximum width, l the position of the edge point, h the luminosity of the background, and k the variable representing the contrast of the edge.

Referring to (c) of FIG. 4, the edge sample (Sr) is measured under the above-described conditions, where the x-axis is the edge sample (Sr) and the y-axis is the maximum width (W).
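The following minimal Python sketch illustrates one way the edge sample (Sr) could be extracted, assuming Sobel gradients for the normal direction, quantization into the four directions mentioned above, and a 13-point profile (six pixels on each side of the edge point); the unit step size and the border clipping are assumptions.

    # Illustrative sketch only: extracting the 1-D edge sample Sr across one
    # edge point. The gradient (normal) direction is quantized into horizontal,
    # vertical and two diagonal directions, and 13 samples (6 on each side of
    # the edge point) are read along it; step size and clipping are assumptions.
    import cv2
    import numpy as np

    def edge_sample(gray, point, half_width=6):
        r, c = point
        gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)[r, c]
        gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)[r, c]
        angle = (np.degrees(np.arctan2(gy, gx)) + 180.0) % 180.0
        if angle < 22.5 or angle >= 157.5:
            dr, dc = 0, 1        # normal runs horizontally
        elif angle < 67.5:
            dr, dc = 1, 1        # diagonal
        elif angle < 112.5:
            dr, dc = 1, 0        # normal runs vertically
        else:
            dr, dc = 1, -1       # anti-diagonal
        h, w = gray.shape
        offsets = np.arange(-half_width, half_width + 1)
        rows = np.clip(r + dr * offsets, 0, h - 1)
        cols = np.clip(c + dc * offsets, 0, w - 1)
        return gray[rows, cols].astype(np.float64)  # 13-point profile Sr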

Referring again to FIG. 3, an optimization method is performed on a cost function using the edge specimen Sr (S103). For example, the blur index measuring unit 13 of the counterfeit image detecting system 10 may define a cost function using the edge sample Sr and perform an optimization method on the cost function.

The cost function may be the difference between the edge sample (Sr) of the check target image and the blurred edge model (Sa).

For example, the cost function (E) can be defined as Equation (1).

[Equation (1): definition of the cost function E; image not reproduced]

Here, the blurred edge model Sa may be a value that is predicted when the edge of the sample image is blurred.

The edge of the sample image may have the form of a step function.

The blurred edge model Sa can be obtained by gaussian blurring, for example, a step function representing an edge.

For example, the blurred edge model Sa can be defined as Equation (2).

[Equation (2): definition of the blurred edge model Sa; image not reproduced]

Here, erf is an error function and can be defined as Equation (3).

[Equation (3): definition of the error function erf; image not reproduced]

In Equations (1) and (2), σ can be defined as the blur index. The blur index can be used as a basis for judging the forged region of the check target image.

Here, the blur index can be obtained by least-squares fitting of the blurred edge model (Sa) to the edge sample (Sr) of the check target image.

That is, the blur index may be a value that minimizes the cost function using the edge sample Sr.

In order to obtain a blur index that minimizes the cost function, for example, an optimization method may be used.

As an optimization method, for example, the Nelder-Mead optimization method may be used.

The initial parameters to be input to the optimization method may be h0, k0, l0, and σ0, where h0 is the minimum value of the sample, k0 is the difference between the maximum and minimum values of the sample, l0 is the center position of the edge, and σ0 can be an arbitrary value.

Using the initial parameters, an optimization method can be performed on the cost function to obtain the appropriate parameters (h, k, l, σ).
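The following Python sketch illustrates the blur index estimation of steps S102 to S103 under explicit assumptions: the blurred edge model Sa is written as a step of contrast k over background h, centered at l and Gaussian-blurred with standard deviation σ (expressed with erf), the cost E is the sum of squared differences between Sr and Sa, and SciPy's Nelder-Mead method is started from the initial parameters described above. The closed forms stand in for Equations (1) and (2), whose images are not reproduced here.

    # Illustrative sketch only: blur index estimation for steps S102-S103.
    # The closed forms of the blurred edge model Sa and of the cost E are
    # assumptions standing in for Equations (1) and (2).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import erf

    def blurred_edge_model(x, h, k, l, sigma):
        # assumed form: a step of contrast k over background h, centred at l,
        # blurred by a Gaussian of standard deviation sigma
        return h + 0.5 * k * (1.0 + erf((x - l) / (np.sqrt(2.0) * sigma)))

    def estimate_blur_index(Sr):
        x = np.arange(len(Sr), dtype=float)

        def cost(params):  # E = sum of squared differences between Sr and Sa
            h, k, l, sigma = params
            if sigma <= 0:
                return np.inf
            return np.sum((Sr - blurred_edge_model(x, h, k, l, sigma)) ** 2)

        # initial parameters as described in the text: h0 = minimum of the
        # sample, k0 = max - min, l0 = centre position, sigma0 = arbitrary
        p0 = [Sr.min(), Sr.max() - Sr.min(), 0.5 * (len(Sr) - 1), 1.0]
        result = minimize(cost, p0, method="Nelder-Mead")
        h, k, l, sigma = result.x
        return sigma  # the blur index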

Referring to FIG. 2 again, the blur index is measured at one edge point of the image to be checked (S110). For example, the blur index measuring unit 13 of the counterfeit image detecting system 10 can measure the blur index at one edge point of the image to be checked.

At this time, when there are a plurality of edge points of the check target image, the blur index is measured at all edge points (S120). For example, the blur index measuring unit 13 of the counterfeit image detecting system 10 can measure the blur index for all the edge points of the check target image.

If the blur index is not measured at all edge points of the check target image, steps S101, S102, S103, and S110 may be repeated for other edge points (S130).

If the blur index is measured at all edge points of the target image, the blur index function is modeled using the blur index (S140). For example, the statistical parameter detector 15 of the falsification image detection system 10 can model the blur exponent function using the blur index.

In order to model the blur exponential function, the statistical parameter detector 15 may first measure the histogram of the blur indices at the edge points of the check target image.

Given a random variable that is a blur index, the blur exponent function can be estimated from the measured histogram.

For example, the estimated blur exponential function p(σ) can be modeled with an impulse function and an arbitrary function, as in Equation (4).

[Equation (4): model of the blur exponential function p(σ); image not reproduced]

In Equation (4), the impulse function δ(σ) having the coefficient k may be composed of values extracted at the edge points of the check target image that are not blurred.

Also, in Equation (4), the arbitrary function p_b(σ) may be the probability density function of the blur index.
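The following Python sketch illustrates one reading of step S140: the measured blur indices are collected into a histogram, the mass near σ = 0 is treated as the impulse term of Equation (4), and the remaining samples represent the continuous part p_b(σ). The boundary values used below follow the range discussed later in connection with FIG. 6 and are assumptions.

    # Illustrative sketch only: one reading of step S140. Blur indices at or
    # below sigma_min are treated as the impulse term of Equation (4); the
    # remaining samples within (sigma_min, sigma_max] represent p_b(sigma).
    import numpy as np

    def model_blur_index_pdf(blur_indices, sigma_min=0.22, sigma_max=10.0, bins=100):
        sig = np.asarray(blur_indices, dtype=float)
        impulse_weight = np.mean(sig <= sigma_min)             # weight of the impulse
        cont = sig[(sig > sigma_min) & (sig <= sigma_max)]     # samples of p_b(sigma)
        hist, bin_edges = np.histogram(cont, bins=bins,
                                       range=(sigma_min, sigma_max), density=True)
        return impulse_weight, cont, hist, bin_edges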

Referring to FIG. 5, a graph of the modeled blur exponential function can be seen.

FIG. 5 is a graph showing the blur exponential function.

In the graph, the x-axis (σ) is the blur index and the y-axis (p(σ)) represents the modeled blur exponential function.

At this time, the probability density function p_b(σ) of the blur index can have a range of σ_min < σ ≤ σ_max.

Here, σ_min and σ_max are values required for the statistical parameter estimation to be described later, and can be determined through experiments.

Referring to FIG. 6, these values will be described.

FIG. 6 is a view showing edge models according to the blur index.

Referring to the graphs of FIG. 6, it can be seen that the edge models change according to the blur index. FIG. 6 exemplarily shows a graph implemented using equation (2) when the parameters h = 0, k = 1, and l = 6.

For example, where σ is the blur index, the model diverges when σ = 0, and keeps the shape of a step function when 0 < σ < 0.22. When σ is 0.22 or more, the graph becomes gentler as the value increases, and if σ exceeds 10, it may no longer have the form of a step function (the sketch below reproduces these shapes).
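The following Python sketch plots the assumed blurred edge model for h = 0, k = 1, l = 6 (the parameters stated for FIG. 6) at several blur indices, showing how the step softens as σ grows; the closed form of the model is the same assumption used in the sketch above.

    # Illustrative sketch only: plotting the assumed blurred edge model for
    # h = 0, k = 1, l = 6 at several blur indices (cf. FIG. 6).
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.special import erf

    def blurred_edge_model(x, h, k, l, sigma):
        return h + 0.5 * k * (1.0 + erf((x - l) / (np.sqrt(2.0) * sigma)))

    x = np.linspace(0.0, 12.0, 200)
    for sigma in (0.1, 0.22, 1.0, 3.0, 10.0):
        plt.plot(x, blurred_edge_model(x, 0.0, 1.0, 6.0, sigma),
                 label="sigma = {}".format(sigma))
    plt.xlabel("position")
    plt.ylabel("edge model Sa")
    plt.legend()
    plt.show()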

Therefore, σ_min can be set to 0.22, and σ_max can be set to 10.

In this case, in the modeled blur exponential function, the probability density function p_b(σ) of the blur index can have a range of 0.22 < σ ≤ 10.

The blur index has a non-negative value and can be represented by a continuous probability distribution.

Referring again to FIG. 2, a statistical parameter is estimated based on the modeled blur exponential function (S150). For example, the statistical parameter detector 15 of the falsification image detection system 10 can estimate a statistical parameter based on the modeled blur exponential function.

In some embodiments, the statistical parameter estimation for the blur index may use a maximum likelihood method. At this time, the statistical parameter estimation can use the probability density function p_b(σ) of the blur index.

In the example described above, the probability density function p_b(σ) of the blur index can have a range of 0.22 < σ ≤ 10.

FIG. 7 is a graph showing the probability density function of the blur index within the range 0.22 < σ ≤ 10.

Here, the x-axis is the blur index (σ), and the y-axis represents the probability density function p_b(σ) of the blur index.

First, for the statistical parameter estimation, the statistical parameter detector 15 can estimate the distribution of the blur index using the maximum likelihood method within the range of the probability density function p_b(σ) of the blur index.

For example, when the maximum likelihood method is used, the probability density function p_b(σ) of the blur index modeled in step S140 can be defined as Equation (5).

[Equation (5): probability density function p_b(σ) with parameters c, k, and α; image not reproduced]

Here, c, k, and α may be the statistical parameters.

Specifically, c and k may be shape parameters and may have values greater than zero. Also, α may be a scale parameter and may have a value greater than zero.

The likelihood function of the random variables x = (x1, x2, ..., xN) for obtaining the maximum likelihood may be as shown in Equation (6).

[Equation (6): likelihood function; image not reproduced]

Here, θ can be a vector of the statistical parameters, and its maximum likelihood estimate can be expressed by Equation (7).

[Equation (7): maximum likelihood estimate of θ; image not reproduced]

Here, the estimate obtained from Equation (7) gives the parameters (c, k, α) of the probability density function p_b(σ) of the blur index, that is, the statistical parameters (c, k, α).
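The following Python sketch illustrates one way the maximum likelihood estimation of (c, k, α) in step S150 could be carried out. Since Equation (5) is not reproduced here, a three-parameter generalized gamma density with shape parameters c, k > 0 and scale parameter α > 0 is assumed purely for illustration, and the negative log-likelihood is minimized numerically.

    # Illustrative sketch only: maximum likelihood estimation of (c, k, alpha)
    # for step S150. A generalized gamma density is ASSUMED for p_b(sigma):
    #   p_b(s) = c * s**(c*k - 1) * exp(-(s/alpha)**c) / (alpha**(c*k) * Gamma(k))
    # with shape parameters c, k > 0 and scale parameter alpha > 0, matching
    # the constraints stated in the text; Equation (5) itself is not reproduced.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    def neg_log_likelihood(theta, s):
        c, k, alpha = theta
        if c <= 0 or k <= 0 or alpha <= 0:
            return np.inf
        log_pdf = (np.log(c) + (c * k - 1.0) * np.log(s)
                   - (s / alpha) ** c - c * k * np.log(alpha) - gammaln(k))
        return -np.sum(log_pdf)

    def estimate_statistical_parameters(blur_indices):
        s = np.asarray(blur_indices, dtype=float)
        s = s[(s > 0.22) & (s <= 10.0)]                       # range of p_b(sigma)
        theta0 = np.array([1.0, 1.0, max(np.mean(s), 1e-3)])  # crude start
        result = minimize(neg_log_likelihood, theta0, args=(s,),
                          method="Nelder-Mead")
        return result.x                                       # estimated (c, k, alpha)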

The statistical parameters (c, k, α) obtained in the above-described manner will now be described with reference to FIG. 8.

FIG. 8 is a graph showing the distribution of the statistical parameters (c, k, α) before and after blurring.

Referring to FIG. 8, the x-axis of the graph is the blur index obtained in the manner described above, and the y-axis is the probability density function p_b(σ) of the blur index within the range 0.22 < σ ≤ 10.

'No blur' on the top right of the graph is the unblurred natural image, and 'sigma' can be the standard deviation value of the Gaussian blur filter.

The higher the standard deviation value of the Gaussian blur filter, the stronger the blurring intensity with respect to the image.

In the case of 'No blur', which is a natural image with no manipulation, the graph can peak near 0 and the peak can be sharp.

The comparison of the statistical parameters before and after blurring can be as shown in Table 1.

Table 1. Average values of the statistical parameters before and after blurring

            α        c        k
No blur     0.7017   5.1992   0.2743
Sigma 1     1.0174   4.4292   0.3884
Sigma 2     1.8252   4.2956   0.5734
Sigma 3     1.8290   4.8836   0.5185

The scale parameter α may have a value of 1 or less in the case of a natural image to which no manipulation is applied. On the other hand, the scale parameter α of the blurred images may have a value of 1 or more.

The shape parameter c of a natural image to which no manipulation is applied may be substantially higher than that of a blurred image.

The shape parameter k of a natural image to which no manipulation is applied may have a value substantially smaller than that of a blurred image. However, the change tendency of the shape parameter k may differ slightly from image to image.

Referring again to FIG. 2, the statistical parameters estimated in step S150 are clustered (S160). For example, the counterfeit area determination unit 17 of the counterfeit image detection system 10 can cluster the estimated statistical parameters (c, k, α).

In some embodiments, the clustering of the statistical parameters may use a k-means algorithm.

Next, using the result of clustering the statistical parameters, a region where the statistical parameters are locally dense is determined as a fake area (S170). For example, the falsification area determination unit 17 of the falsification image detection system 10 can determine a portion where statistical parameters are locally dense as the falsification area.
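The following Python sketch illustrates one plausible reading of steps S160 and S170: statistical parameters estimated over local neighbourhoods of edge points are clustered with k-means, and the cluster whose members are most spatially concentrated is taken as the forged region. The two-cluster setting and the spatial-spread criterion are assumptions, not details fixed by the patent.

    # Illustrative sketch only: one reading of steps S160-S170. Parameters
    # (c, k, alpha) estimated over local neighbourhoods of edge points are
    # clustered with k-means, and the cluster whose member regions are most
    # spatially concentrated is flagged as the forged region.
    import numpy as np
    from sklearn.cluster import KMeans

    def find_forged_cluster(params, positions, n_clusters=2, random_state=0):
        """params: (N, 3) array of (c, k, alpha) per local region;
        positions: (N, 2) array of (row, col) centres of those regions."""
        labels = KMeans(n_clusters=n_clusters, random_state=random_state,
                        n_init=10).fit_predict(params)
        spreads = []
        for lbl in range(n_clusters):
            pts = positions[labels == lbl]
            # spatial spread = mean distance of members from their centroid
            spreads.append(np.mean(np.linalg.norm(pts - pts.mean(axis=0), axis=1)))
        forged_label = int(np.argmin(spreads))  # the locally dense cluster
        return labels, forged_label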

Hereinafter, effects according to the embodiments of the present invention will be described with reference to Figs. 9A and 9B.

FIGS. 9A and 9B are diagrams for explaining a result of detection of a false image according to the embodiments of the present invention.

Referring to FIG. 9A, (a) is an image to be checked, and (b) shows the result of clustering the statistical parameters of the image to be checked according to embodiments of the present invention.

The confirmation target image (a) is obtained by inserting a monkey image into the original image and blurring the edge of the inserted monkey image.

Referring to (b), the statistical parameters of the check target image (a) are locally concentrated in the yellow and blue regions.

In this case, the fake area determination unit 17 of the fake image detection system 10 can determine the yellow and blue areas, that is, the regions where the statistical parameters of the check target image (a) are locally dense, as the fake area.

According to the forgery detection method according to the technical idea of the present invention, even if the synthesized monkey image itself has a natural blur (yellow area of (b)), the falsification area can be accurately determined.

Referring to FIG. 9B, (a) is an image to be checked, (b) is the result of judging the fake area of the image to be checked by a conventional method, and (c) shows the result of clustering the statistical parameters of the image to be checked according to embodiments of the present invention.

The confirmation target image (a) is obtained by synthesizing a face image of a specific entertainer on the body of the original image, and blurring the edge of the synthesized face image of the specific entertainer.

Referring to (c), the statistical parameters of the check target image (a) are locally concentrated in the green region.

In this case, the fake area determining unit 17 of the fake image detecting system 10 can determine the green area, that is, the region where the statistical parameters of the check target image (a) are locally dense, as the fake area.

Because the check target image (a) is very precisely forged, it is impossible to judge which part is the fake area when a conventional method is used, as shown in (b). However, as shown in (c), the fake area can be accurately determined according to the falsification area detection method based on the technical idea of the present invention.

That is, according to the falsification area detection method based on the technical idea of the present invention, the blur index is measured for all edge points of the check target image, so the non-blurred background area of the check target image is not falsely detected, and the forgery area can be determined more accurately.

The falsification area detecting method according to the technical idea of the present invention may be written as a computer-executable program and implemented in a general-purpose digital computer that runs the program using a computer-readable recording medium.

Hereinafter, with reference to FIG. 10, a computing system including a program for detecting false images according to embodiments of the present invention will be described.

FIG. 10 is a diagram illustrating a computing system including a program for detecting false images in accordance with embodiments of the present invention.

The computing system 500 may include a processor 510, a network interface 570, and a storage 560.

The computing system 500 may also include a system bus 550 coupled with the processor 510 to provide a data pathway.

The network interface 570 may be connected to a terminal device that is another computing device through a network. For example, the terminal device, which is another computing device connected to the network interface 570, may be a display device, a user terminal, or the like.

The network interface 570 may be Ethernet, FireWire, USB, or the like.

The storage 560 may be implemented, for example, as a non-volatile memory device such as a flash memory, a hard disk, or the like.

The storage 560 may store data of the counterfeit image detection program 561. The data of the counterfeit image detection program 561 may include a binary executable file and other resource files.

The processor 510 may execute the counterfeit image detection program 561. However, the processor 510 is not limited to executing only the counterfeit image detection program 561.

For example, the processor 510 may execute a program other than the counterfeit image detection program 561.

The counterfeit image detection program 561 may be stored as a program that detects edge points of the check target image, measures the blur index at each detected edge point, models the blur exponential function using the measured blur indices, estimates statistical parameters based on the modeled blur exponential function, clusters the estimated statistical parameters, and determines the forged region of the check target image using the clusters.

Further, the counterfeit image detecting program 561 may include a series of operations for determining a locally dense portion of the statistical parameter of the check target image as a counterfeit region of the check target image.

Also, the counterfeit image detection program 561 may include a series of operations in which the edge point of the image to be confirmed includes a plurality of edge points and the blur index is measured for a plurality of edge points.

Further, the counterfeit image detection program 561 may include a series of operations for measuring an edge sample at an edge point of the image to be checked.

Further, the counterfeit image detecting program 561 may include a series of operations in which the blur index is the value that minimizes the cost function using the edge sample.

The user can carry out the falsification area detecting method according to the technical idea of the present invention by downloading, storing, or executing, on the terminal device via the network, the falsification image detecting program 561 stored in the storage 560.

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments and may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The above-described embodiments are therefore to be understood as illustrative in all aspects and not restrictive.

Claims (8)

1. A method for detecting a forged image, comprising:
detecting, by an edge detector, an edge point of a check target image;
measuring, by a blur index measuring unit, a blur index of the edge point of the check target image detected by the edge detector;
modeling, by a statistical parameter detector, a blur exponential function using the blur index measured by the blur index measuring unit, and estimating statistical parameters based on the modeled blur exponential function; and
clustering, by a forged region determiner, the statistical parameters estimated by the statistical parameter detector, and determining a forged region of the check target image using the clustered statistical parameters,
wherein the detecting of the edge point of the check target image by the edge detector comprises measuring an edge sample at the edge point of the check target image, and
wherein the blur index is a value that minimizes a cost function using the edge sample.
2. The method according to claim 1, wherein the clustering of the statistical parameters estimated by the statistical parameter detector and the determining of the forged region of the check target image using the clusters comprises determining a region where the statistical parameters are locally dense as the forged region of the check target image.
3. The method according to claim 1, wherein the edge point of the check target image includes a plurality of edge points, and the measuring of the blur index by the blur index measuring unit comprises measuring the blur index for the plurality of edge points.
4. (Deleted)

5. A method comprising:
detecting an edge point of a check target image;
measuring a blur index of the detected edge point of the check target image;
modeling a blur exponential function using the measured blur index;
estimating statistical parameters based on the modeled blur exponential function; and
clustering the estimated statistical parameters, and determining a forged region of the check target image using the clustered statistical parameters,
wherein the detecting of the edge point of the check target image comprises measuring an edge sample at the edge point of the check target image, and
wherein the blur index is a value that minimizes a cost function using the edge sample.
6. The method of claim 5, wherein the determining of the forged region of the check target image comprises determining a region where the statistical parameters are locally dense as the forged region of the check target image.
7. The method of claim 5, wherein the edge point of the check target image includes a plurality of edge points, and the measuring of the blur index of the detected edge point of the check target image comprises measuring the blur index for the plurality of edge points.
8. (Deleted)
KR1020150104105A 2015-07-23 2015-07-23 Method and apparatus for detecting forged image KR101725041B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150104105A KR101725041B1 (en) 2015-07-23 2015-07-23 Method and apparatus for detecting forged image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150104105A KR101725041B1 (en) 2015-07-23 2015-07-23 Method and apparatus for detecting forged image

Publications (2)

Publication Number Publication Date
KR20170011454A KR20170011454A (en) 2017-02-02
KR101725041B1 true KR101725041B1 (en) 2017-04-10

Family

ID=58154072

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150104105A KR101725041B1 (en) 2015-07-23 2015-07-23 Method and apparatus for detecting forged image

Country Status (1)

Country Link
KR (1) KR101725041B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210084150A (en) 2019-12-27 2021-07-07 한국항공대학교산학협력단 System and method for the detection of multiple compression of image and video
KR20220084236A (en) 2020-12-13 2022-06-21 한국항공대학교산학협력단 Advanced system and method for detecting video forgery

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874845B (en) * 2018-09-03 2022-06-21 中国科学院深圳先进技术研究院 Method and device for detecting image smoothing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101436406B1 (en) 2011-12-29 2014-09-02 주식회사 안랩 Client, server, system and method for updating data based on peer to peer

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
G Cao 외 2인, "Edge-based blur metric for tamper detection", Journal of Information Hiding and Multimedia Signal Processing, January 2010, vol 1, 20~27 pages*
Matthew C. Stamm 외 1인, "Forensic detection of image manipulation using statistical intrinsic fingerprints", IEEE Transactions on Information Forensics and Security, 2010, Vol 5, Issue 3, pp. 492~506*

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210084150A (en) 2019-12-27 2021-07-07 한국항공대학교산학협력단 System and method for the detection of multiple compression of image and video
KR20220084236A (en) 2020-12-13 2022-06-21 한국항공대학교산학협력단 Advanced system and method for detecting video forgery

Also Published As

Publication number Publication date
KR20170011454A (en) 2017-02-02

Similar Documents

Publication Publication Date Title
CN109076198B (en) Video-based object tracking occlusion detection system, method and equipment
KR101899866B1 (en) Apparatus and method for detecting error of lesion contour, apparatus and method for correcting error of lesion contour and, apparatus for insecting error of lesion contour
US9269155B2 (en) Region growing method for depth map/color image
JP6482195B2 (en) Image recognition apparatus, image recognition method, and program
US8483480B2 (en) Method and system for factoring an illumination image
TWI687689B (en) Measurement device and measurement method for rotation of round body and non-transitory information readable medium
US9396411B2 (en) Method and system for generating intrinsic images using a single reflectance technique
CN107292269B (en) Face image false distinguishing method based on perspective distortion characteristic, storage and processing equipment
KR101761586B1 (en) Method for detecting borderline between iris and sclera
US20140050411A1 (en) Apparatus and method for generating image feature data
KR101725041B1 (en) Method and apparatus for detecting forged image
TWI500904B (en) Stereo camera and automatic range finding method for measuring a distance between stereo camera and reference plane
JP2015197376A (en) Device, method, and program for abrasion detection
KR101662407B1 (en) Method for vignetting correction of image and apparatus therefor
US9754155B2 (en) Method and system for generating intrinsic images using a single reflectance technique
EP2791865B1 (en) System and method for estimating target size
CN106886796B (en) Icon position identification method and device and terminal equipment
JP4685711B2 (en) Image processing method, apparatus and program
CN110889817B (en) Image fusion quality evaluation method and device
JP6855938B2 (en) Distance measuring device, distance measuring method and distance measuring program
CN114581433A (en) Method and system for obtaining metal ball cavity inner surface appearance detection image
CN111951254A (en) Source camera identification method and system based on edge-guided weighted average
JP4560434B2 (en) Change region extraction method and program of the method
CN115019069A (en) Template matching method, template matching device and storage medium
JP2009151445A (en) Subarea detection device, object identification apparatus, and program

Legal Events

Date Code Title Description
A201 Request for examination
E701 Decision to grant or registration of patent right
GRNT Written decision to grant