KR101731461B1 - Apparatus and method for behavior detection of object - Google Patents
Apparatus and method for behavior detection of object
- Publication number
- KR101731461B1 (application KR1020150175219A)
- Authority
- KR
- South Korea
- Prior art keywords
- behavior
- image data
- target object
- behavior model
- model
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
Abstract
Description
The present invention relates to a behavior detection device for an object and a behavior detection method using the same.
The importance of surveillance systems such as CCTV has been increasing in recent years. However, most CCTV surveillance systems remain labor-intensive environments centered on simple monitoring, and the development of intelligent surveillance systems capable of more autonomous situation recognition is required.
A typical approach in intelligent surveillance systems is abnormal behavior detection based on behavior analysis. Conventional abnormal behavior detection methods either learn from collected normal and abnormal behavior or judge abnormal behavior against a predetermined threshold value. For example, such a method may define a threshold value for abnormal behavior such as a fight or a fall and detect anomalous activity in real time based on that predetermined threshold.
However, because such threshold-based abnormal behavior detection relies on low-level features such as color and brightness changes extracted from data collected in real time, it has the disadvantage of judging behavior from only fragmentary information.
Abnormal behavior detection can instead use context information, which is a high-level characteristic. One example is behavior analysis based on trajectory information, which is mainly used to recognize abnormal behavior in traffic situations and typically relies on topic models such as latent Dirichlet allocation (LDA). However, trajectory-based analysis makes it difficult to detect abnormal behavior in real time and does not properly reflect the characteristics of human behavior.
In this regard, Korean Patent Laid-Open Publication No. 10-2013-0056170 (entitled "Method and apparatus for detecting real-time abnormal behavior using a motion sequence") discloses a real-time abnormal behavior detection method and apparatus that detects abnormal behavior by analyzing the motion sequences constituting human behavior.
SUMMARY OF THE INVENTION The present invention has been made to solve the above problems of the prior art, and it is an object of the present invention to provide a behavior detection apparatus and method that detect abnormal behavior of an object based on a behavior model generated by analyzing the normal behavior of the object.
It should be understood, however, that the technical scope of the present invention is not limited to the above-described technical problems, and other technical problems may exist.
According to a first aspect of the present invention, there is provided an apparatus for detecting the behavior of an object, the apparatus comprising: an image receiving unit; a database storing a behavior model; a memory storing a behavior detection program for the object; and a processor for executing the program. According to the execution of the program, the processor recognizes a target object from image data input through the image receiving unit and detects abnormal behavior of the target object from the image data based on the recognized target object and the generated behavior model. The behavior model is generated based on the normal behavior of objects extracted from a plurality of image data collected for model generation, and is stored in the database.
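The relationship between the claimed components can be illustrated with a minimal sketch. The class and method names below are hypothetical stand-ins for the image receiving unit, database, memory, and processor of the first aspect; the method bodies are placeholders rather than the claimed implementation.

```python
# Illustrative skeleton only: names mirror the claimed units; the detection
# logic itself is sketched in the later sections of the description.
from dataclasses import dataclass, field


@dataclass
class BehaviorDetectionApparatus:
    # database storing the behavior model generated offline from normal behavior
    behavior_model_db: dict = field(default_factory=dict)

    def receive_image(self, frame):
        """Image receiving unit: accepts one frame of image data."""
        return frame

    def detect(self, image_data):
        """Processor executing the stored behavior detection program."""
        target = self.recognize_target_object(image_data)
        model = self.behavior_model_db.get("behavior_model")
        return self.detect_abnormal_behavior(target, model)

    def recognize_target_object(self, image_data):
        raise NotImplementedError  # placeholder for the recognition step

    def detect_abnormal_behavior(self, target, model):
        raise NotImplementedError  # placeholder for the detection step
```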
According to a second aspect of the present invention, there is provided a behavior detection method for an object performed by a behavior detection apparatus, the method comprising: recognizing a target object from image data; and detecting abnormal behavior of the target object from the image data based on the recognized target object and the generated behavior model. The behavior model is generated based on the normal behavior of objects extracted from a plurality of image data collected for model generation, and is stored in a database.
The present invention generates a behavior model based on the normal behavior of an object, so that it is possible to detect various abnormal behaviors without separately defining the abnormal behavior. Since the present invention uses the pre-generated behavioral model, it is possible to effectively detect abnormal behavior in real time.
The present invention can classify normal and abnormal behavior based on a global behavior model and local behavior models, so that the user can easily understand the basis for judging behavior as normal or abnormal, and the results are easy to visualize. The present invention also receives user feedback on normal and abnormal behavior and can re-learn the behavior model according to the received feedback. Therefore, as the detection period grows, misrecognition can be reduced and recognition accuracy improved.
FIG. 1 is a block diagram of a behavior detection apparatus for an object according to an exemplary embodiment of the present invention.
FIG. 2 is a diagram illustrating an example of a preprocessing process of the behavior detection apparatus according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of a feature extraction process of the behavior detection apparatus according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of a global behavior model generation process of the behavior detection apparatus according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating an example of a local behavior model generation process of the behavior detection apparatus according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example of a behavior recognition process of the behavior detection apparatus according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an example of visualization of a behavior recognition result according to an embodiment of the present invention.
FIG. 8 is a block diagram of a behavior detection program according to an embodiment of the present invention.
FIG. 9 is a flowchart of a behavior detection method according to an embodiment of the present invention.
FIG. 10 is a flowchart of a behavior model generation method according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. For clarity of description, parts not related to the description are omitted, and like parts are denoted by like reference characters throughout the specification.
Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another part in between. Also, when a part is described as "including" an element, this does not exclude other elements unless specifically stated otherwise.
Next, a behavior detection apparatus 100 for an object according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a block diagram of a behavior detection apparatus 100 for an object according to an embodiment of the present invention.
The
The
The
The
When the image data is inputted through the
The
Specifically, the
For example, the
The
FIG. 2 is a diagram illustrating an example of a preprocessing process of the behavior detection apparatus 100 according to an embodiment of the present invention.
The
For example, the
The
For example, the
At this time, the detected background and foreground may include pixels misclassified due to minute changes in light, shadows, or the like. Such misclassification results included in the background and foreground may act as noise when generating the behavior model and may reduce the accuracy of discriminating normal from abnormal behavior. Therefore, the
For example, the
In other words, the
The
For example, the
In this way, the
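As an illustration of the preprocessing stage described above, the following sketch separates foreground from background and suppresses pixels misclassified because of light changes or shadows. The use of OpenCV's MOG2 background subtractor and a morphological opening is an assumption made for illustration; the patent does not prescribe a particular algorithm or library.

```python
# A minimal preprocessing sketch, assuming OpenCV as a stand-in implementation.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

def preprocess(frame):
    mask = subtractor.apply(frame)                          # separate foreground from background
    mask[mask == 127] = 0                                   # drop pixels MOG2 labels as shadow
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # suppress small misclassified blobs
    return mask
```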
FIG. 3 is an exemplary diagram illustrating a feature extraction process of the behavior detection apparatus 100 according to an embodiment of the present invention.
The
For example, the
The
The
The
For example, when N frames are included in the image data, the
Meanwhile, when the
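The feature extraction stage can be sketched as follows, assuming each recognized object has already been tracked so that its centroid is known in every frame of the N-frame image data. The exact composition of the feature vector (positions, per-step direction, and speed here) is an illustrative assumption rather than the patent's definition.

```python
# A sketch of trajectory-based feature extraction for one recognized object.
import numpy as np

def trajectory_features(centroids):
    """centroids: (N, 2) array of an object's (x, y) position over N frames."""
    pts = np.asarray(centroids, dtype=float)
    steps = np.diff(pts, axis=0)                       # frame-to-frame motion
    directions = np.arctan2(steps[:, 1], steps[:, 0])  # movement direction per step
    speeds = np.linalg.norm(steps, axis=1)             # movement magnitude per step
    return np.concatenate([pts.ravel(), directions, speeds])
```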
FIG. 4 is a diagram illustrating a global behavior model generation process of the behavior detection apparatus 100 according to an embodiment of the present invention.
In order to generate the global behavior model, the
The
By performing clustering, trajectories of objects that pass through similar movement paths can be grouped into the same cluster among the plurality of extracted clusters. A cluster can therefore be used to represent a movement path existing in the image data. Since the feature vector includes object tracking information indicating the direction of the object along with the motion trajectory information, feature vectors with similar trajectories can still be assigned to different clusters when their directions differ.
The
For example, when performing scene segmentation based on a k-d tree, the
When performing scene division based on the binary space division technique, the
After completing the partitioning process, the
Through this process, the
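The global behavior model described above can be sketched as a transition-probability matrix over the divided spaces. The `region_of` lookup, which maps a point to the space produced by the scene partitioning, and the Laplace smoothing are illustrative assumptions, not details taken from the patent.

```python
# A minimal sketch of the global behavior model as transition probabilities
# between divided spaces, estimated from normal-behavior trajectories.
import numpy as np

def global_behavior_model(trajectories, region_of, num_regions):
    """trajectories: list of (N, 2) arrays; region_of: (x, y) -> region index."""
    counts = np.ones((num_regions, num_regions))        # Laplace-smoothed transition counts
    for traj in trajectories:
        regions = [region_of(x, y) for x, y in traj]
        for src, dst in zip(regions[:-1], regions[1:]):
            counts[src, dst] += 1
    return counts / counts.sum(axis=1, keepdims=True)   # row-wise transition probabilities
```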
FIG. 5 is a diagram illustrating an example of a process of generating a local behavior model of the behavior detection apparatus 100 according to an embodiment of the present invention.
After generating the
For this purpose, the
The
For example, the
The
As such, the
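A minimal sketch of the local behavior model follows: one single-class classifier is trained per divided space on the features of normal behavior observed in that space. scikit-learn's OneClassSVM is used only as an example of a single-class classification technique; the patent does not name a specific classifier or library.

```python
# A sketch of per-space local behavior models using a single-class classifier.
from sklearn.svm import OneClassSVM

def local_behavior_models(features_by_region, nu=0.05):
    """features_by_region: {region index: (n_samples, n_features) array of normal behavior}."""
    models = {}
    for region, feats in features_by_region.items():
        models[region] = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(feats)
    return models
```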
FIG. 6 is a diagram illustrating an example of a behavior recognition process of the behavior detection apparatus 100 according to an embodiment of the present invention.
When the
Then, the
At this time, the
The
Based on the local behavior model, the
For example, the
Alternatively, the
The
In this way, the
The
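The recognition stage can be sketched as follows: a path score from the global behavior model and a per-space score from the local behavior model are combined into a similarity value, and behavior whose similarity is at or below a predetermined threshold is detected as abnormal. The simple averaging of the two scores below is an illustrative combination rule, not the patent's formula.

```python
# A sketch of abnormal-behavior detection from the global and local models.
import numpy as np

def detect_abnormal(traj_regions, features, global_model, local_models, threshold=0.3):
    """traj_regions: region indices visited by the target object; features: 1-D numpy array."""
    transitions = list(zip(traj_regions[:-1], traj_regions[1:]))
    # path likelihood under the global (transition-probability) model
    path_score = np.mean([global_model[s, d] for s, d in transitions]) if transitions else 1.0
    # normality score of the behavior in its current space under the local model
    region = traj_regions[-1]
    local_score = local_models[region].decision_function(features.reshape(1, -1))[0]
    similarity = 0.5 * (path_score + local_score)
    return similarity <= threshold      # abnormal if similarity is at or below the threshold
```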
FIG. 7 is a diagram illustrating an example of visualization of a behavior recognition result according to an embodiment of the present invention.
The
Through this visualization, the administrator of the
In FIG. 7B, the image data includes a total of six divided spaces P711, P712, P713, P714, P715, and P716. In this case, the divided spaces where the abnormal behavior occurs are 'partition space 3' (P713) and 'partition space 4' (P714). The
The
For example, if the manager of the
As described above, the
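The feedback and re-learning step can be sketched as follows: detections the administrator marks as false alarms are folded back into the normal-behavior samples of the affected space, and the corresponding local model is refit. OneClassSVM is reused here purely as the illustrative single-class technique from the previous sketch.

```python
# A sketch of re-learning a local model from administrator feedback.
import numpy as np
from sklearn.svm import OneClassSVM

def relearn_local_model(normal_feats, false_alarm_feats, nu=0.05):
    """Both arguments are (n_samples, n_features) arrays for one divided space."""
    enlarged = np.vstack([normal_feats, false_alarm_feats])   # fold feedback into the normal pool
    return OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(enlarged)
```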
FIG. 8 is a block diagram of a behavior detection program according to an embodiment of the present invention.
The behavior detection program includes a
In order to generate a behavior model, the
In particular, the
The
The behavior
The
When the
The
Similar to the
In this way, upon receiving the image data to detect the abnormal behavior, the
When a feature vector is extracted by the
The
The
Next, a behavior detection method for an object of the behavior detection apparatus 100 will be described in detail with reference to FIG. 9 and FIG. 10.
FIG. 9 is a flowchart of a behavior detection method according to an embodiment of the present invention.
The
Then, the
FIG. 10 is a flowchart of a behavior model generation method according to an embodiment of the present invention.
At this time, the
Then, the
The
At this time, in order to generate the global behavior model, the
And to generate a local behavior model, the
On the other hand, in order to detect an abnormal behavior of the target object from the image data, the
The
Referring again to FIG. 9, the
After performing the evaluation of the abnormal behavior, the
The
The
One embodiment of the present invention may also be implemented in the form of a recording medium including computer-executable instructions, such as program modules, that are executed by a computer. Computer-readable media can be any available media that can be accessed by a computer and include both volatile and nonvolatile media, and removable and non-removable media. Computer-readable media may also include computer storage media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information such as computer-readable instructions, data structures, program modules, or other data.
While the methods and systems of the present invention have been described in connection with specific embodiments, some or all of those elements or operations may be implemented using a computer system having a general purpose hardware architecture.
It will be understood by those skilled in the art that the foregoing description of the present invention is for illustrative purposes only, and that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention. It is therefore to be understood that the above-described embodiments are illustrative in all respects and not restrictive. For example, each component described as a single entity may be implemented in a distributed manner, and components described as distributed may likewise be implemented in a combined form.
The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.
100: Behavior detection device
110: Image receiving unit
120: Memory
130: Processor
140: Database
Claims (15)
Image receiving unit,
A database for storing behavioral models,
A memory in which the behavior detection program for the object is stored,
And a processor for executing the program,
Wherein, according to the execution of the program, the processor recognizes a target object from the image data input through the image receiving unit, detects an abnormal path of the target object from the image data based on a global behavior model for the target object, detects a change in the behavior of the target object from the image data based on a local behavior model for the target object, and detects abnormal behavior of the target object based on the abnormal path of the target object and the change in the behavior of the target object,
Wherein the global behavior model and the local behavior model are generated based on normal behavior of an object extracted from a plurality of previously collected image data and then stored in the database,
Wherein the global behavior model is generated by calculating a transition probability of the recognized object with respect to a plurality of spaces included in each image data after recognizing the object from the plurality of image data collected in the past,
Wherein the local behavior model is generated based on an action of the recognized object in each space included in the plurality of spaces.
Wherein the processor recognizes an object in any one of the plurality of collected image data, performs clustering on the one image data and divides the image data into a plurality of spaces, calculates a transition probability of the recognized object in the divided plurality of spaces, generates the global behavior model based on the calculated transition probability, and generates a local behavior model corresponding to each space based on the behavior of the recognized object in each of the divided spaces.
Wherein the processor performs pre-processing on the one image data and recognizes the object based on the performed pre-processing.
Wherein the processor extracts a trajectory of the recognized object and performs the clustering based on the extracted trajectory.
Wherein the clustering is performed based on a graph cut algorithm.
Wherein the processor generates the global behavior model based on the transition probability of the recognized object in the divided plurality of spaces and a spatial partitioning technique.
Wherein the processor generates the local behavior model based on the behavior of the recognized object for each partitioned space and a single class classification technique.
Wherein the processor calculates a degree of similarity corresponding to the target object based on the global behavior model and the local behavior model,
And detects the behavior of the target object as an abnormal behavior when the calculated similarity is equal to or less than a predetermined value.
Wherein the processor performs an evaluation of abnormal behavior detection for the target object,
And regenerates the global behavior model and the local behavior model based on the result of the evaluation.
Wherein the processor visualizes an abnormal behavior detection result for the target object.
Recognizing a target object from the image data; And
Detecting an abnormal path of the target object from the image data based on a global behavior model for the target object;
Detecting a change in behavior of the target object from the image data based on a local action model of the recognized target object; And
Detecting abnormal behavior of the target object from the image data based on an abnormal path of the target object and a change in behavior of the target object,
Wherein the global behavior model and the local behavior model are generated based on normal behavior of the object extracted from the plurality of image data collected in advance,
Wherein the global behavior model is generated by calculating a transition probability of the recognized object with respect to a plurality of spaces included in each image data after recognizing the object from the plurality of image data collected in the past,
Wherein the local behavior model is generated based on an action of the recognized object in each space included in the plurality of spaces.
Recognizing an object of any one of the plurality of image data collected before the recognition of the target object;
Performing clustering on any one of the image data and dividing the image data into a plurality of spaces; And
Further comprising generating the global behavior model and the local behavior model for an action of an object in the plurality of divided spaces.
Wherein the generating the global behavior model and the local behavior model comprises:
Calculating a transition probability of the object recognized through the recognizing step in the divided plurality of spaces;
Generating the global behavior model based on the calculated transition probability; and
Generating a local behavior model corresponding to each space based on the behavior of the object in each of the divided spaces.
Wherein the step of detecting abnormal behavior of the target object comprises:
Calculating a similarity degree corresponding to the target object based on the global behavior model and the local behavior model; And
Detecting a behavior of the target object as an abnormal behavior when the calculated similarity is equal to or less than a predetermined value.
Performing an evaluation of abnormal behavior detection on the target object after the detection of abnormal behavior of the target object; And
Further comprising regenerating the global behavior model and the local behavior model based on a result of the performed evaluation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150175219A KR101731461B1 (en) | 2015-12-09 | 2015-12-09 | Apparatus and method for behavior detection of object |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101731461B1 (en) | 2017-05-11 |
Family
ID=58740821
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150175219A KR101731461B1 (en) | 2015-12-09 | 2015-12-09 | Apparatus and method for behavior detection of object |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101731461B1 (en) |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101179276B1 (en) * | 2011-06-13 | 2012-09-03 | 고려대학교 산학협력단 | Device and method for detecting abnormal crowd behavior |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018226882A1 (en) * | 2017-06-07 | 2018-12-13 | Amazon Technologies, Inc. | Behavior-aware security systems and associated methods |
US10936655B2 (en) | 2017-06-07 | 2021-03-02 | Amazon Technologies, Inc. | Security video searching systems and associated methods |
CN110473232A (en) * | 2017-07-14 | 2019-11-19 | 腾讯科技(深圳)有限公司 | Image-recognizing method, device, storage medium and electronic equipment |
CN110473232B (en) * | 2017-07-14 | 2024-02-09 | 腾讯科技(深圳)有限公司 | Image recognition method and device, storage medium and electronic equipment |
KR101900237B1 (en) * | 2017-10-20 | 2018-09-19 | 주식회사 삼진엘앤디 | On-site judgment method using situation judgment data on acquired images |
KR101979375B1 (en) * | 2018-02-23 | 2019-08-28 | 주식회사 삼알글로벌 | Method of predicting object behavior of surveillance video |
KR101956008B1 (en) * | 2018-06-07 | 2019-03-08 | 쿠도커뮤니케이션 주식회사 | Apparatus for Providing Trajectory and Data Building Method Thereof |
KR102043366B1 (en) * | 2018-11-21 | 2019-12-05 | (주)터보소프트 | Method for measuring trajectory similarity between geo-referenced videos using largest common view |
KR102174656B1 (en) | 2019-03-26 | 2020-11-05 | 연세대학교 산학협력단 | Apparatus and method for recognizing activity and detecting activity area in video |
KR20200119386A (en) * | 2019-03-26 | 2020-10-20 | 연세대학교 산학협력단 | Apparatus and method for recognizing activity and detecting activity area in video |
KR20200119391A (en) * | 2019-03-27 | 2020-10-20 | 연세대학교 산학협력단 | Apparatus and method for recognizing activity and detecting activity duration in video |
KR102174658B1 (en) | 2019-03-27 | 2020-11-05 | 연세대학교 산학협력단 | Apparatus and method for recognizing activity and detecting activity duration in video |
KR102148607B1 (en) * | 2019-07-26 | 2020-08-26 | 연세대학교 산학협력단 | Audio-video matching area detection apparatus and method |
KR20210084330A (en) * | 2019-12-27 | 2021-07-07 | 권세기 | A monitering system for wearing muzzles of dog using deep learning and monitering method |
KR102581941B1 (en) | 2019-12-27 | 2023-09-22 | 권세기 | A monitering system for wearing muzzles of dog using deep learning and monitering method |
CN111539339A (en) * | 2020-04-26 | 2020-08-14 | 北京市商汤科技开发有限公司 | Data processing method and device, electronic equipment and storage medium |
CN112200081A (en) * | 2020-10-10 | 2021-01-08 | 平安国际智慧城市科技股份有限公司 | Abnormal behavior identification method and device, electronic equipment and storage medium |
KR102341715B1 (en) * | 2021-02-23 | 2021-12-21 | 주식회사 딥팜 | Apparatus and method for livestock monitoring |
KR102347811B1 (en) * | 2021-05-31 | 2022-01-06 | 한국교통대학교산학협력단 | Apparatus and method for detecting object of abnormal behavior |
WO2023096092A1 (en) * | 2021-11-25 | 2023-06-01 | 한국전자기술연구원 | Method and system for detecting abnormal behavior on basis of composite image |
CN117932233A (en) * | 2024-03-21 | 2024-04-26 | 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) | User behavior model fine-tuning method, system and medium based on similar abnormal behaviors |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101731461B1 (en) | Apparatus and method for behavior detection of object | |
US10248860B2 (en) | System and method for object re-identification | |
US9959630B2 (en) | Background model for complex and dynamic scenes | |
US9652863B2 (en) | Multi-mode video event indexing | |
Afiq et al. | A review on classifying abnormal behavior in crowd scene | |
US8548198B2 (en) | Identifying anomalous object types during classification | |
US8374393B2 (en) | Foreground object tracking | |
RU2475853C2 (en) | Behaviour recognition system | |
US8218819B2 (en) | Foreground object detection in a video surveillance system | |
US8416296B2 (en) | Mapper component for multiple art networks in a video analysis system | |
CN109829382B (en) | Abnormal target early warning tracking system and method based on intelligent behavior characteristic analysis | |
KR101720781B1 (en) | Apparatus and method for prediction of abnormal behavior of object | |
KR101472674B1 (en) | Method and apparatus for video surveillance based on detecting abnormal behavior using extraction of trajectories from crowd in images | |
Azimjonov et al. | Vision-based vehicle tracking on highway traffic using bounding-box features to extract statistical information | |
Yang et al. | Cluster-based crowd movement behavior detection | |
Mudjirahardjo | A Study On Human Motion Detection-Toward Abnormal Motion Identification | |
Jiang | Anomalous event detection from surveillance video | |
Varadharajan | Object Detection in a video based on Frame Differencing using Deep learning | |
Li | Road User Detection and Analysis in Traffic Surveillance Videos | |
Yang | Multiple humans tracking by learning appearance and motion patterns |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |