KR20160139801A - Method and Apparatus for Detecting Object in an Image - Google Patents

Method and Apparatus for Detecting Object in an Image Download PDF

Info

Publication number
KR20160139801A
Authority
KR
South Korea
Prior art keywords
less
sharpness
interest
image
current frame
Prior art date
Application number
KR1020150075346A
Other languages
Korean (ko)
Other versions
KR101951900B1 (en)
Inventor
김봉모
박태서
조동찬
Original Assignee
에스케이텔레콤 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스케이텔레콤 주식회사 filed Critical 에스케이텔레콤 주식회사
Priority to KR1020150075346A priority Critical patent/KR101951900B1/en
Publication of KR20160139801A publication Critical patent/KR20160139801A/en
Application granted granted Critical
Publication of KR101951900B1 publication Critical patent/KR101951900B1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20182Noise reduction or smoothing in the temporal domain; Spatio-temporal filtering

Abstract

Disclosed are an apparatus and method for detecting an object in a frame. According to an embodiment of the present invention, the object detection apparatus includes: an object detection unit which detects a moving object using a current frame and one or more previous frames; and a filtering unit which determines whether the sharpness of the object detected by the object detection unit in the current frame is less than a preset level and whether the brightness of the object is equal to or less than a preset value, and which classifies the object as a non-interest object when both conditions are satisfied.

Description

[0001] The present invention relates to an apparatus and a method for detecting an object in a frame.

This embodiment relates to a method and apparatus for detecting objects in a frame.

The contents described in this section merely provide background information on the present embodiment and do not constitute the prior art.

Conventionally, a Gaussian Mixture Model (GMM) based on background model learning has been used as a method of detecting an object in a frame. In this method, a fixed number of Gaussian models is generated from the input values of each pixel of an image. When the input value of a pixel falls outside a preset range, the pixel is judged to belong to an object; otherwise it is judged to be background (a non-object).

However, such a method has a problem in that any moving non-interest object can also be detected as an object of interest. For example, this becomes a problem when a non-interest object such as a worm approaches a camera installed outdoors. Because the non-interest object is moving, its pixel values are likely to leave the preset range, and it may be falsely detected as an object of interest. Accordingly, there is a need for an object detection method and apparatus that reduces the probability of falsely detecting a non-interest object as an object of interest.

The present embodiment aims to provide an object detection apparatus and method that detect motion within a frame and then filter out non-interest objects from among the objects detected as moving, so that only objects of interest remain.

According to an aspect of the present invention, there is provided an object detection apparatus including: an object detection unit that detects an object in which motion is present using a current frame and one or more previous frames; and a filtering unit that determines whether the sharpness of the detected object in the current frame is less than a predetermined level and whether the brightness of the object is equal to or less than a preset value, and classifies the object as a non-interest object when both conditions hold.

According to another aspect of the present invention, there is provided an object detection method comprising: detecting an object by determining, using a current frame and one or more previous frames, whether an object with motion exists; determining whether the sharpness of the detected object in the current frame is less than a predetermined level and whether the brightness of the object is equal to or less than a preset value; and classifying the object as a non-interest object when the sharpness is less than the predetermined level and the brightness is equal to or less than the preset value.

As described above, according to one aspect of the present invention, an object detection method, in particular one that detects objects of interest using background model learning, filters non-interest objects out of the objects detected as moving, which has the effect of reducing false detections.

FIG. 1 is a block diagram illustrating an object detection system in accordance with an embodiment of the present invention.
FIG. 2 is a block diagram illustrating an object detection apparatus according to an exemplary embodiment of the present invention.
FIG. 3 is a flowchart illustrating an object detection method according to an exemplary embodiment of the present invention.
FIGS. 4A and 4B are diagrams illustrating a conventional object detection system and a result of using the object detection system according to an embodiment of the present invention, respectively.

Hereinafter, some embodiments of the present invention will be described in detail with reference to the exemplary drawings. In adding reference numerals to the elements of the drawings, the same elements are denoted by the same reference numerals wherever possible, even when they appear in different drawings. In the following description, detailed descriptions of known functions and configurations are omitted when they would obscure the subject matter of the present invention.

In describing the components of the present invention, terms such as first, second, A, B, (a), and (b) may be used. These terms serve only to distinguish one component from another and do not limit the nature, order, or sequence of the components. Throughout the specification, when an element is described as "including" or "comprising" another element, this means that it may include further elements as well, unless specifically stated otherwise. In addition, terms such as "unit" and "module" refer to a unit that processes at least one function or operation, and may be implemented in hardware, in software, or in a combination of hardware and software.

FIG. 1 is a block diagram illustrating an object detection system in accordance with an embodiment of the present invention.

Referring to FIG. 1, an object detection system 100 according to an embodiment of the present invention includes an object 110, imaging devices 120, 122, and 124, an object detection device 130, and a display device 140.

The imaging devices 120, 122, and 124 are devices capable of capturing an image of an object 110 in order to monitor it. The imaging devices 120, 122, and 124 according to the present embodiment are preferably implemented as CCTV cameras, surveillance cameras, or the like, capable of capturing images for monitoring the object 110 against theft or abandonment, but the present invention is not limited thereto; any device capable of capturing an image of an object may be used. The imaging devices 120, 122, and 124 transmit the captured image of the object to the object detection device 130. The images captured by the imaging devices 120, 122, and 124 may include a plurality of frames. Although a plurality of imaging devices 120, 122, and 124 are shown in FIG. 1, the system is not limited thereto; a single imaging device may capture the image of the object.

The object detection apparatus 130 receives the image of the object captured by an imaging device, detects moving objects, and filters each detected object according to whether it is an object of interest or a non-interest object. It then processes the filtered image into a displayable signal so that the display device 140 can output it. A more detailed description is given with reference to FIG. 2.

The display device 140 displays the image that has been filtered by the object detection device and processed into a displayable signal.

FIG. 2 is a block diagram illustrating an object detection apparatus according to an exemplary embodiment of the present invention.

Referring to FIG. 2, the object detection apparatus 130 includes an object extraction unit 210, a filtering unit 220, and an image processing unit 230.

The object extracting unit 210 extracts an object in a frame of the object image received from the image device. In extracting objects, objects included in a frame can be extracted using background model learning. When the background model learning is used, a specific number of Gaussian models are generated for each pixel of the image, and it is determined whether the pixel value exceeds the set range. At this time, if the value of the pixel exceeds the predetermined range, it is judged as an object. If the value of the pixel does not exceed the predetermined range, it is judged as a non-object.
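As a minimal sketch of the background-model idea described above, the following uses a single running Gaussian per pixel rather than a full mixture; the function names, the learning rate `alpha`, and the deviation threshold `k` are illustrative choices, not values from the patent:

```python
import numpy as np

def update_background(mean, var, frame, alpha=0.05):
    """Update a per-pixel running Gaussian background model.

    A full GMM keeps several Gaussians per pixel; this simplified sketch
    keeps one running mean and variance per pixel.
    """
    mean = (1 - alpha) * mean + alpha * frame
    var = (1 - alpha) * var + alpha * (frame - mean) ** 2
    return mean, var

def foreground_mask(mean, var, frame, k=2.5):
    """Flag pixels whose value deviates more than k standard deviations
    from the background mean, i.e. pixels that left the preset range."""
    return np.abs(frame - mean) > k * np.sqrt(var + 1e-6)
```

A pixel flagged by `foreground_mask` corresponds to the patent's "value of the pixel exceeds the predetermined range" case and is treated as part of a moving object.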

The filtering unit 220 determines whether an object extracted by the object extraction unit in the frame is an object of interest or a non-interest object, and filters it accordingly. As criteria for this determination, the filtering unit 220 uses the degree of focus (sharpness) and the average brightness level.

First, the filtering unit determines the sharpness of the area in which the object is located in the frame. Sharpness indicates how crisp an image is; by judging it, the filtering unit determines the degree to which the object is in focus in the frame. The imaging device is correctly focused when the object is located at a suitable distance for focusing. However, particularly when the imaging device is located outdoors, a non-interest object, such as a worm, or a liquid such as rainwater or snow, tends to attach to or come very close to the imaging device, so a non-interest object is rarely located at a distance suitable for focusing. For this reason, a non-interest object appears out of focus in the image captured by the imaging device. Whether the extracted object is in focus in the image therefore indicates whether it is an object of interest or a non-interest object.

To determine sharpness, that is, the degree to which the object is in focus in the frame, the filtering unit can analyze the frequency components of the region in which the object is located. If mainly low-frequency components are distributed in the region, the object is in a defocused (blurred) state. The degree of focus can therefore be found by frequency analysis. In this case, a Laplacian value can be calculated using the following Equation (1), and the frequency components analyzed from it.

[Equation 1: image not reproduced in this text. Given the definitions below, the standard 4-neighbour discrete Laplacian has the form G(x, y) = I(x+1, y) + I(x-1, y) + I(x, y+1) + I(x, y-1) - 4*I(x, y).]

In Equation (1), I(x, y) denotes the pixel value at position (x, y) in the frame, and G(x, y) denotes the Laplacian value at position (x, y). The frequency components of the region in which the object is located are analyzed using the variance of the Laplacian values obtained in this way. If high-frequency components dominate, the object is in focus; if they are weak, the object is in a defocused state.
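The variance-of-Laplacian focus measure described above can be sketched in plain NumPy, assuming the standard 4-neighbour discrete Laplacian; the function names are illustrative:

```python
import numpy as np

def laplacian(img):
    """4-neighbour discrete Laplacian G(x, y), computed with edge padding
    so the output has the same shape as the input."""
    p = np.pad(img.astype(float), 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1]
            + p[1:-1, :-2] + p[1:-1, 2:]
            - 4 * p[1:-1, 1:-1])

def sharpness(img):
    """Variance of the Laplacian: large for focused (high-frequency)
    regions, near zero for defocused or flat ones."""
    return laplacian(img).var()
```

Comparing `sharpness` against a preset level mirrors the patent's sharpness test: a blurred non-interest object attached to the lens yields a low value.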

Secondly, when the object is in a defocused state, the filtering unit determines the average brightness level of the area in which the object is located in the image. Non-interest objects attached to or close to the imaging device are subject to backlighting, resulting in a low overall brightness of the object. This phenomenon occurs more clearly when the imaging device is installed outdoors. Accordingly, if the average brightness level of the object is lower than that of the other parts of the image, the filtering unit determines the object to be a non-interest object. The average brightness level of the object can be determined as shown in the following Equation (2).

[Equation 2: image not reproduced in this text. Given the description below, a plausible form is M = (1 / ((J - j + 1) * (K - k + 1))) * sum of I(j', k') over j' = j..J and k' = k..K.]

M represents the average brightness level of a predetermined partial area within the region where the object is located in the image. Specifically, M is the average brightness over the rectangular area spanned horizontally from coordinate j to J and vertically from coordinate k to K within the object region. The filtering unit determines whether the average brightness level of this partial area is equal to or greater than a preset value: if M exceeds the preset value, the filtering unit classifies the object as an object of interest. The preset value may be defined empirically or through analysis of the distribution of measurements in a database.
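The brightness check above can be sketched as follows; the coordinate convention (inclusive j..J columns and k..K rows) mirrors the description, and the function names and the threshold value are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def mean_brightness(img, j, J, k, K):
    """Average brightness M over the rectangle spanning columns j..J and
    rows k..K (inclusive) of the object region."""
    patch = img[k:K + 1, j:J + 1].astype(float)
    return patch.mean()

def is_backlit(img, j, J, k, K, threshold=40.0):
    """An object whose mean brightness M falls below the threshold is a
    backlighting candidate, i.e. a likely non-interest object; the
    threshold here is an illustrative placeholder."""
    return mean_brightness(img, j, J, k, K) < threshold
```

In the apparatus, this check is applied only to objects whose sharpness already fell below the preset level.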

The image processing unit 230 processes the image into a displayable signal so that it can be output by the display device. That is, the image processing unit performs preprocessing so that the display device can output the image filtered by the filtering unit.

Each component of the object detection apparatus shown in FIG. 2 is connected via a communication path linking the software or hardware modules within the apparatus, and the components operate organically with one another. These components communicate using one or more communication buses or signal lines.

FIG. 3 is a flowchart illustrating an object detection method according to an exemplary embodiment of the present invention.

It is determined whether an object with motion exists, using the current frame and one or more previous frames (S310). The object is extracted from a frame of the object image received from the imaging device. In extracting the object, objects included in the image can be extracted using background model learning.

It is determined whether the sharpness of the detected object in the current frame is less than a preset level (S320). Sharpness indicates how crisp the image is; by judging it, the degree to which the object is in focus in the frame is determined. As described above, a non-interest object often attaches to or approaches the imaging device, so it is rarely located at a distance suitable for focusing. Accordingly, a non-interest object appears out of focus in the captured image, and whether the extracted object is in focus indicates whether it is an object of interest or a non-interest object. In determining the degree of focus, the frequency components of the region in which the object is located can be analyzed by calculating Laplacian values.

If the sharpness of the object in the current frame is less than the preset level, it is determined whether the brightness of the object in the current frame is equal to or less than a preset value (S330). Non-interest objects attached to or close to the imaging device are subject to backlighting, resulting in a low overall brightness; this phenomenon occurs more clearly when the imaging device is installed outdoors. Therefore, it is determined whether the average brightness level of the area in which the object is located is equal to or less than the preset value.

If the sharpness of the object in the current frame is less than the preset level, and the brightness of the object in the current frame is equal to or less than the preset value, the object is classified as a non-interest object (S340). In this case, the object is defocused within the frame and its average brightness is low, so it can be judged to be in a backlit state.

If the sharpness of the object in the current frame is higher than a predetermined level, or if the brightness of the object in the current frame exceeds a predetermined value, the object is classified as an object of interest (S350).
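The decision logic of steps S320 to S350 can be summarized in a short sketch; the threshold values below are illustrative placeholders, since the patent does not specify concrete numbers:

```python
def classify_object(sharpness_value, brightness_value,
                    sharpness_level=50.0, brightness_level=40.0):
    """Decision logic of S320-S350: an object is classified as
    non-interest only when it is BOTH defocused (sharpness below the
    level, S320) AND dark (brightness at or below the value, S330);
    otherwise it remains an object of interest (S350)."""
    if (sharpness_value < sharpness_level
            and brightness_value <= brightness_level):
        return "non-interest"  # S340
    return "interest"          # S350
```

Note that a sharp-but-dark object or a blurred-but-bright object is still reported as an object of interest; only the combination of both conditions triggers filtering.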

Although FIG. 3 describes processes S310 to S350 as being executed sequentially, this merely illustrates the technical idea of the embodiment. Those skilled in the art will recognize that the order of the steps in FIG. 3 may be changed, or one or more of steps S310 to S350 may be executed in parallel, without departing from the essential characteristics of the embodiment; FIG. 3 is therefore not limited to a time-series order.

Meanwhile, the processes shown in FIG. 3 can be implemented as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes any kind of recording device in which data readable by a computer system is stored, such as magnetic storage media (e.g., ROM, floppy disks, hard disks) and optical media (e.g., CD-ROM and the like). The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner.

FIGS. 4A and 4B are diagrams illustrating a conventional object detection system and a result of using the object detection system according to an embodiment of the present invention, respectively.

FIG. 4A illustrates the detection result when a conventional object detection system is used. In a conventional system, non-interest objects also have motion, so when background model learning is used, they are likely to be classified as objects. Accordingly, in the object detection system of FIG. 4A, a non-interest object, illustrated as a worm in FIG. 4A, is detected as an object of interest. Since some systems generate an alarm or sound effect whenever an object is detected, alarms caused by non-interest objects occur frequently. Particularly when the imaging device is installed outdoors, this occurs more often and distracts the observer's attention.

In contrast, the object detection system of FIG. 4B determines both the degree of focus and the average brightness level, so such an object can be detected as a non-interest object. The worm shown in FIG. 4B is thus filtered out as a non-interest object and is not detected as an object of interest by the object detection system.

The foregoing description merely illustrates the technical idea of the present embodiment, and various modifications and changes may be made by those skilled in the art without departing from the essential characteristics of the embodiments. Therefore, the present embodiments are to be construed as illustrative rather than restrictive, and the scope of the technical idea of the present embodiment is not limited by them. The scope of protection of the present embodiment should be construed according to the following claims, and all technical ideas within the scope of their equivalents should be construed as being included in the scope of the present invention.

100: object detection system 110: object
120, 122, 124: video device 130: object detection device
140: Display device 210: Object extraction unit
220: filtering unit 230: image processing unit

Claims (6)

An object detection apparatus comprising:
an object detection unit for detecting an object in which motion is present, using a current frame and one or more previous frames; and
a filtering unit for determining whether the sharpness of the object detected by the object detection unit in the current frame is less than a predetermined level and whether the brightness of the object is equal to or less than a preset value, and for classifying the object as a non-interest object when the sharpness is less than the predetermined level and the brightness is equal to or less than the preset value.
The apparatus according to claim 1,
wherein the filtering unit determines the sharpness of the object by analyzing how strongly high-frequency components are distributed among the frequency components in the current frame.
The apparatus according to claim 1,
wherein the filtering unit determines whether the brightness of the object is equal to or less than the preset value based on an average brightness value of a predetermined partial area of an object whose sharpness is less than the predetermined level.
An object detection method comprising:
detecting an object by determining whether an object in which motion is present exists, using a current frame and one or more previous frames;
determining whether the sharpness of the detected object in the current frame is less than a predetermined level and whether the brightness of the object is equal to or less than a preset value; and
classifying the object as a non-interest object when the sharpness of the object is less than the predetermined level and the brightness of the object is equal to or less than the preset value.
The method of claim 4,
wherein the sharpness of the detected object is determined by analyzing how strongly high-frequency components are distributed among the frequency components in the current frame.
The method of claim 4,
wherein, in classifying the object as a non-interest object, whether the brightness of the object is equal to or less than the preset value is determined based on an average brightness value of a predetermined partial area of an object whose sharpness is less than the predetermined level.
KR1020150075346A 2015-05-28 2015-05-28 Method and Apparatus for Detecting Object in an Image KR101951900B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150075346A KR101951900B1 (en) 2015-05-28 2015-05-28 Method and Apparatus for Detecting Object in an Image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150075346A KR101951900B1 (en) 2015-05-28 2015-05-28 Method and Apparatus for Detecting Object in an Image

Publications (2)

Publication Number Publication Date
KR20160139801A true KR20160139801A (en) 2016-12-07
KR101951900B1 KR101951900B1 (en) 2019-02-25

Family

ID=57573279

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150075346A KR101951900B1 (en) 2015-05-28 2015-05-28 Method and Apparatus for Detecting Object in an Image

Country Status (1)

Country Link
KR (1) KR101951900B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019147024A1 (en) * 2018-01-23 2019-08-01 광주과학기술원 Object detection method using two cameras having different focal distances, and apparatus therefor

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100605226B1 (en) * 2004-06-10 2006-07-31 주식회사 팬택앤큐리텔 Apparatus and method for detecting foreign material in digital camera module
KR20120133645A (en) * 2011-05-31 2012-12-11 삼성테크윈 주식회사 Security camera and Method of controlling thereof
KR20150025714A (en) * 2013-08-30 2015-03-11 현대모비스 주식회사 Image recognition apparatus and method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100605226B1 (en) * 2004-06-10 2006-07-31 주식회사 팬택앤큐리텔 Apparatus and method for detecting foreign material in digital camera module
KR20120133645A (en) * 2011-05-31 2012-12-11 삼성테크윈 주식회사 Security camera and Method of controlling thereof
KR20150025714A (en) * 2013-08-30 2015-03-11 현대모비스 주식회사 Image recognition apparatus and method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019147024A1 (en) * 2018-01-23 2019-08-01 광주과학기술원 Object detection method using two cameras having different focal distances, and apparatus therefor

Also Published As

Publication number Publication date
KR101951900B1 (en) 2019-02-25

Similar Documents

Publication Publication Date Title
CN104519318B (en) Frequency image monitoring system and surveillance camera
EP2549738B1 (en) Method and camera for determining an image adjustment parameter
US10032283B2 (en) Modification of at least one parameter used by a video processing algorithm for monitoring of a scene
EP2665039B1 (en) Detection of near-field camera obstruction
KR102217253B1 (en) Apparatus and method for analyzing behavior pattern
JP2009005198A (en) Image monitoring system
US8294765B2 (en) Video image monitoring system
US9030559B2 (en) Constrained parametric curve detection using clustering on Hough curves over a sequence of images
JP4999794B2 (en) Still region detection method and apparatus, program and recording medium
CN109255360B (en) Target classification method, device and system
KR20090044957A (en) Theft and left baggage survellance system and meothod thereof
KR101951900B1 (en) Method and Apparatus for Detecting Object in an Image
JP5897950B2 (en) Image monitoring device
KR101581162B1 (en) Automatic detection method, apparatus and system of flame, smoke and object movement based on real time images
JP5222908B2 (en) Collapse detection system and collapse detection method
TWI476735B (en) Abnormal classification detection method for a video camera and a monitering host with video image abnormal detection
WO2005109893A2 (en) System and method for detecting anomalies in a video image sequence
WO2012141574A1 (en) Intrusion detection system for determining object position
JP4998955B2 (en) Collapse detection system and method
JP2008040724A (en) Image processing device and image processing method
Hagui et al. Comparative study and enhancement of Camera Tampering Detection algorithms
CN110853127A (en) Image processing method, device and equipment
JP3490196B2 (en) Image processing apparatus and method
CN116958897A (en) Pedestrian monitoring and early warning device and method based on image processing
JP4394053B2 (en) Image recognition device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant