KR20190056458A - Seperation Method for Overlapping Objects in Object Tracking - Google Patents

Seperation Method for Overlapping Objects in Object Tracking Download PDF

Info

Publication number
KR20190056458A
Authority
KR
South Korea
Prior art keywords
objects
detected
overlapped
image
tracking
Prior art date
Application number
KR1020170149203A
Other languages
Korean (ko)
Inventor
송혁
최인규
고민수
Original Assignee
전자부품연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전자부품연구원 filed Critical 전자부품연구원
Priority to KR1020170149203A priority Critical patent/KR20190056458A/en
Publication of KR20190056458A publication Critical patent/KR20190056458A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/20 Analysis of motion
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

A method of separating overlapping objects in object tracking is provided. An object separation method according to an embodiment of the present invention detects objects in an image, separates the overlapping objects when the detected objects are determined to be overlapped, and tracks the objects separated from the detected objects. Accordingly, objects judged to be overlapped in an image can be precisely identified and separated, which enables continuous object tracking and, ultimately, accurate information transmission and situation awareness.

Description

A method for separating overlapping objects in object tracking

The present invention relates to an image processing technique and, more particularly, to a method for detecting and tracking objects such as vehicles and pedestrians for traffic safety, and to an image system using the method.

When objects such as vehicles and pedestrians are detected using pattern recognition techniques, most objects are detected with high accuracy. However, when part of an object overlaps with another object, detection accuracy drops sharply.

Thus, one of the important problems in detecting objects with image analysis technology is that overlapped objects cannot be detected accurately. If objects are not detected, the subsequent tracking procedure, which is performed on the basis of detection, becomes impossible; ultimately, dangerous situations cannot be grasped, or inaccurate information is transmitted.

Therefore, a method for accurately separating overlapping objects is needed as a means of accurate object detection.

It is an object of the present invention to provide a method of separating objects judged to overlap in an image and an image system using the same.

According to an aspect of the present invention, there is provided an object separation method comprising: detecting objects in an image; If it is determined that the detected objects are overlapped, separating the overlapping objects; And tracking the objects separated from the detected objects.

The separating step may separate a first object having a smaller depth in the overlapping area where the objects overlap.

Also, the separating step may predict a second object having a depth larger than the first object from the overlapping area in which the first object is separated.

And, the separating step can predict the second object in the overlapping area by referring to the second object appearing in the previous frames.

Further, the object may include at least one of a pedestrian and a vehicle.

Then, the tracking step can track the detected objects based on the size information and the position information of the detected objects.

Also, in the separating step, if the size information of the objects detected in the previous frame and the position information are not similar to the size information and the position information of the objects detected in the current frame, it can be determined that the detected objects are overlapped.

According to another aspect of the present invention, an image system includes an input unit for inputting an image; And a processor for detecting objects in the input image through the input unit, separating the overlapped objects when the detected objects are judged to be overlapped, and tracking the objects separated from the detected objects.
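Read together, these aspects describe a simple detect, separate, track loop. The skeleton below is only a minimal sketch of that control flow under assumed data types (bounding boxes as (x, y, w, h) tuples); every function name is an illustrative placeholder, not something named in the specification.

```python
# Minimal control-flow sketch of the claimed method; all names are placeholders.

def detect_objects(frame):
    """Stand-in for the detection step; would return (x, y, w, h) boxes."""
    return []

def boxes_overlap(a, b):
    """True if two (x, y, w, h) boxes intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def separate_overlapping(boxes):
    """Stand-in for the separation step applied to overlapping detections."""
    return boxes

def track(prev_tracks, boxes):
    """Stand-in for the tracking step based on size and position."""
    return boxes

def process_frame(frame, prev_tracks):
    boxes = detect_objects(frame)
    overlapped = any(boxes_overlap(a, b)
                     for i, a in enumerate(boxes) for b in boxes[i + 1:])
    if overlapped:
        boxes = separate_overlapping(boxes)
    return track(prev_tracks, boxes)
```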

As described above, according to the embodiments of the present invention, objects judged to be overlapped in an image can be precisely identified and separated, which enables continuous object tracking and, ultimately, accurate information transmission and situation awareness.

In addition, according to the embodiments of the present invention, it is possible to accurately analyze the individual behaviors of the objects overlapping each other in the image by accurately discriminating and separating the objects judged to overlap in the image.

Brief Description of the Drawings
FIG. 1 is a flowchart provided in the description of an object separation method according to an embodiment of the present invention;
FIGS. 2 and 3 illustrate images in which object overlapping occurs;
FIG. 4 is an enlarged view of overlapping objects; and
FIG. 5 is a block diagram of an object tracking system according to another embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the drawings.

FIG. 1 is a flowchart illustrating an object separation method according to an embodiment of the present invention. The object separation method according to an embodiment of the present invention enables continuous tracking of overlapping objects by separating the overlapping objects in an image and tracking them.

That is, unlike existing pattern recognition methods that process overlapping objects as a single object, the object separation method according to the embodiment of the present invention separates them into two or more objects and processes each of them.

To this end, as shown in FIG. 1, an image captured by a CCTV (Closed-Circuit Television) camera installed at a pedestrian crossing, or by another camera, is input (S110).

Next, objects are detected in the image input in step S110 (S120). Here, the objects include vehicles, pedestrians, and the like. Object detection can be performed using a pattern recognition technique or a deep learning technique.
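The specification does not fix a particular detector. As one hedged illustration, the sketch below uses OpenCV's built-in HOG pedestrian detector as a stand-in for the pattern recognition technique; a pre-trained deep learning model could be substituted for the detection call.

```python
import cv2

# Illustrative detector only: OpenCV's HOG + linear SVM pedestrian detector
# stands in for the "pattern recognition technique" mentioned above.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame):
    """Return pedestrian detections as a list of (x, y, w, h) boxes."""
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return [tuple(map(int, box)) for box in boxes]
```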

In the case of the deep learning technique, the deep learning model must be trained in advance on a vehicle database and a pedestrian database.

Then, it is determined whether the detected objects are overlapped or not, that is, whether overlapping objects exist in the image (S130). Whether objects overlap is determined based on size information and position information of the detected objects.

Specifically, the sizes and positions of the objects detected in the current frame (the frame at time T+1) are compared with those of the objects detected in the previous frame (the frame at time T); if they are not similar, it is judged that the objects are overlapped.

In addition, it is determined that overlapping of objects has occurred even when the number of adjacent objects detected in the previous frame decreases sharply in the current frame.
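As a rough sketch of this check, assuming boxes are (x, y, w, h) tuples and using illustrative thresholds that are not taken from the patent:

```python
def similar(box_a, box_b, pos_tol=30, size_tol=0.3):
    """Return True if two (x, y, w, h) boxes have similar position and size.
    Thresholds are illustrative assumptions, not values from the patent."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    close = abs(ax - bx) <= pos_tol and abs(ay - by) <= pos_tol
    same_size = abs(aw * ah - bw * bh) <= size_tol * (aw * ah)
    return close and same_size

def overlap_suspected(prev_boxes, curr_boxes):
    """Sketch of S130: suspect overlap if a previous detection has no similar
    match in the current frame, or if the detection count drops sharply."""
    unmatched = [p for p in prev_boxes
                 if not any(similar(p, c) for c in curr_boxes)]
    sharp_drop = len(curr_boxes) < len(prev_boxes) - 1
    return bool(unmatched) or sharp_drop
```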

FIG. 2 and FIG. 3 illustrate images in which object overlapping occurs. It can be seen in FIG. 2 and FIG. 3 that some pedestrians overlap in the frame at time T+1.

If it is determined that the detected objects are overlapped (S130-Y), a process of separating the overlapping objects is performed (S140 to S150).

To separate the objects, first, the object having the smaller depth (the object in front and/or on top in the image) is separated from the overlapping region where the objects overlap (S140). As shown in FIG. 4, the object shown in red, which is the object with the smaller depth, can be separated from the other objects because its entire region appears in the image.
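A minimal sketch of this idea, assuming binary masks for the overlapping region and for the fully visible front object are available from the detection step (an assumption, since the patent does not specify the representation):

```python
import cv2

def split_front_object(overlap_mask, front_mask):
    """Sketch of S140: keep the fully visible (small-depth) object's pixels,
    and retain the rest of the overlapping region for the occluded object.
    Both arguments are same-size binary uint8 masks; this pixel-level
    decomposition is an illustrative assumption."""
    front_object = cv2.bitwise_and(overlap_mask, front_mask)
    remainder = cv2.bitwise_and(overlap_mask, cv2.bitwise_not(front_mask))
    return front_object, remainder
```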

Next, the image of the other object, which has the larger depth, is predicted from the overlapping region from which the small-depth object has been separated (S150). In step S150, the object's image is predicted with reference to the images of that object appearing in previous frames, and a deep learning technique can be applied here.
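The patent points to deep-learning-based prediction from previous frames; as a much simpler hedged stand-in, the sketch below extrapolates the occluded object's bounding box under a constant-velocity assumption.

```python
def predict_occluded_box(history):
    """Sketch of S150 under a simple constant-velocity assumption: extrapolate
    the occluded object's (x, y, w, h) box from its boxes in previous frames.
    This linear model is only an illustrative substitute for the deep-learning
    prediction mentioned in the description."""
    if len(history) < 2:
        return history[-1] if history else None
    (x1, y1, _w1, _h1), (x2, y2, w2, h2) = history[-2], history[-1]
    return (2 * x2 - x1, 2 * y2 - y1, w2, h2)  # carry size, extrapolate position
```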

Thereafter, the objects detected in step S120 or separated in steps S140 and S150 are tracked (S160). The object tracking in step S160 may be performed based on the size information and position information of the objects.

Specifically, as shown in FIG. 2 and FIG. 3, object tracking is performed by searching for and matching, in the frame at time T+1, objects having a size and position similar to those of the objects detected in the frame at time T.
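A minimal sketch of such matching, using a greedy nearest-neighbour association with illustrative thresholds (the patent itself does not specify the matching rule):

```python
def match_tracks(prev_boxes, curr_boxes, max_dist=50):
    """Sketch of S160: greedily match each box from frame T to the closest
    unclaimed box of similar size in frame T+1. The distance threshold and
    greedy strategy are illustrative assumptions."""
    matches, used = [], set()
    for i, (px, py, pw, ph) in enumerate(prev_boxes):
        best_j, best_d = None, max_dist
        for j, (cx, cy, cw, ch) in enumerate(curr_boxes):
            if j in used:
                continue
            d = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
            if d < best_d and abs(pw * ph - cw * ch) < 0.5 * pw * ph:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            matches.append((i, best_j))
    return matches
```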

On the other hand, if the detected objects are not overlapped (S130-N), steps S140 and S150 for separating overlapping objects are not performed.

FIG. 5 is a block diagram of an object tracking system according to another embodiment of the present invention. The object tracking system according to an embodiment of the present invention is a computing system including an image input unit 210, a processor 220, an output unit 230, and a storage unit 240, as shown in FIG. 5.

The image input unit 210 receives an image captured by a CCTV camera installed at a pedestrian crossing or by another camera.

The processor 220 detects objects (vehicles, pedestrians, etc.) from the image input through the image input unit 210 using the pattern recognition technique or the deep learning technique, and tracks the detected objects.

In this process, the processor 220 judges whether the detected objects are overlapped; if they are judged to be overlapped, it separates the overlapping objects and then continues object tracking.

The output unit 230 comprises a display for displaying the object tracking results of the processor 220 and communication means for delivering the object tracking results to an external device or network.

The storage unit 240 provides a storage space necessary for the processor 220 to perform the object separation / tracking algorithm.
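For illustration only, the components of FIG. 5 could be wired together roughly as follows; the interfaces shown here are assumptions made for the sketch, not part of the disclosure.

```python
class ObjectTrackingSystem:
    """Sketch of the FIG. 5 structure: image input (210), processor (220),
    output (230), storage (240). Component interfaces are illustrative
    assumptions."""

    def __init__(self, detector, separator, tracker):
        self.detector = detector      # processor 220: detection step
        self.separator = separator    # processor 220: separation step
        self.tracker = tracker        # processor 220: tracking step
        self.storage = {}             # storage unit 240

    def process(self, frame):
        boxes = self.detector(frame)                              # S120
        boxes = self.separator(self.storage.get("prev", []), boxes)  # S140-S150
        tracks = self.tracker(self.storage.get("prev", []), boxes)   # S160
        self.storage["prev"] = boxes
        return tracks                 # handed to output unit 230
```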

A method and system for separating and tracking overlapping objects in an image, so as to enable continuous tracking of overlapping objects, have been described above with reference to preferred embodiments.

In the above embodiments, it was assumed that vehicles, pedestrians, and the like are tracked. Needless to say, the technical idea of the present invention can also be applied to tracking other kinds of moving objects.

In the object separating method according to the embodiment of the present invention, the overlapped objects are separated into two or more objects by utilizing time and space information, unlike the existing technique of processing the overlapped objects as one object.

It goes without saying that the technical idea of the present invention can also be applied to a computer-readable recording medium having a computer program for performing the functions of the apparatus and method according to the present embodiment. In addition, the technical idea according to various embodiments of the present invention may be embodied in computer-readable code form recorded on a computer-readable recording medium. The computer-readable recording medium is any data storage device that can be read by a computer and can store data. For example, the computer-readable recording medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical disk, a hard disk drive, or the like. In addition, the computer readable code or program stored in the computer readable recording medium may be transmitted through a network connected between the computers.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

210: Image input unit
220: Processor
230: Output unit
240: Storage unit

Claims (8)

Detecting objects in an image;
if it is determined that the detected objects are overlapped, separating the overlapping objects; and
tracking the objects separated from the detected objects.
The method according to claim 1,
In the separation step,
Wherein a first object having a smaller depth is separated in the overlapping area where the objects are overlapped.
The method of claim 2,
In the separation step,
Wherein a second object having a depth larger than that of the first object is predicted from the overlapping area from which the first object is separated.
The method of claim 3,
In the separation step,
Wherein the second object in the overlapping area is predicted by referring to the second object appearing in previous frames.
The method according to claim 1,
Wherein the object includes at least one of a pedestrian and a vehicle.
The method according to claim 1,
In the tracking step,
Wherein the detected objects are tracked based on size information and position information of the detected objects.
The method of claim 6,
In the separation step,
Wherein, if the size information and position information of the objects detected in the previous frame are not similar to the size information and position information of the objects detected in the current frame, it is determined that the detected objects are overlapped.
An input unit for inputting an image; and
a processor for detecting objects in the image input through the input unit, separating the overlapping objects when the detected objects are judged to be overlapped, and tracking the objects separated from the detected objects.
KR1020170149203A 2017-11-10 2017-11-10 Seperation Method for Overlapping Objects in Object Tracking KR20190056458A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020170149203A KR20190056458A (en) 2017-11-10 2017-11-10 Seperation Method for Overlapping Objects in Object Tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020170149203A KR20190056458A (en) 2017-11-10 2017-11-10 Seperation Method for Overlapping Objects in Object Tracking

Publications (1)

Publication Number Publication Date
KR20190056458A true KR20190056458A (en) 2019-05-27

Family

ID=66679381

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020170149203A KR20190056458A (en) 2017-11-10 2017-11-10 Seperation Method for Overlapping Objects in Object Tracking

Country Status (1)

Country Link
KR (1) KR20190056458A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021167365A3 (en) * 2020-02-21 2021-10-14 삼성전자 주식회사 Electronic device and method for tracking movement of object
KR102696863B1 (en) * 2024-05-24 2024-08-20 건아정보기술 주식회사 Bi-directional object tracking system using object tracking prediction algorithm in occlusion area

Similar Documents

Publication Publication Date Title
US8902053B2 (en) Method and system for lane departure warning
US20150248587A1 (en) Image processing system, image processing method, and program
CN109727275B (en) Object detection method, device, system and computer readable storage medium
US8724851B2 (en) Aerial survey video processing
CN106295598A (en) A kind of across photographic head method for tracking target and device
JP6595375B2 (en) Traffic condition analysis device, traffic condition analysis method, and traffic condition analysis program
JP5931662B2 (en) Road condition monitoring apparatus and road condition monitoring method
CN111008600A (en) Lane line detection method
US20160343144A1 (en) Method of detecting vehicle, database structure for detecting vehicle, and method of establishing database for detecting vehicle
WO2010109831A1 (en) Drive recorder
JP2016058085A (en) Method and device for detecting shielding of object
CN112861673A (en) False alarm removal early warning method and system for multi-target detection of surveillance video
Fradi et al. Spatio-temporal crowd density model in a human detection and tracking framework
KR20110035662A (en) Intelligent image search method and system using surveillance camera
KR20150026178A (en) Apparatus for Providing Video Synopsis Computer-Readable Recording Medium with Program therefore
Shine et al. Fractional data distillation model for anomaly detection in traffic videos
KR20190056458A (en) Seperation Method for Overlapping Objects in Object Tracking
KR101492366B1 (en) Car accident detection method and apparatus
EP3761228A1 (en) Computer-implemented method
JP6405606B2 (en) Image processing apparatus, image processing method, and image processing program
US9183448B2 (en) Approaching-object detector, approaching object detecting method, and recording medium storing its program
CN109800685A (en) The determination method and device of object in a kind of video
US20150178577A1 (en) Image processing apparatus, and image processing method
CN103714552B (en) Motion shadow removing method and device and intelligent video analysis system
EP3671537A1 (en) Smoke detection method and apparatus