KR101600314B1 - Smart CCTV control system - Google Patents
- Publication number
- KR101600314B1
- Authority
- KR
- South Korea
- Prior art keywords
- image
- camera
- sensor
- sensors
- sensing
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19617—Surveillance camera constructional details
- G08B13/1963—Arrangements allowing camera rotation to change view, e.g. pivoting camera, pan-tilt and zoom [PTZ]
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19654—Details concerning communication with a camera
Abstract
In one embodiment, the present invention provides a system for controlling one or more deployed CCTV cameras, comprising at least one image acquisition device and a control center that communicates with the image acquisition device and displays the images obtained from it. The image acquisition device includes a camera for acquiring an image; a rotation drive module for rotating the camera in a forward or reverse direction; recognition sensors, arranged radially on a horizontal plane about the camera's installation point, for measuring the azimuth of an object moving within a limited sensing angle range and a limited sensing distance; and a communication module for transmitting the image information from the camera and the object information calculated from the recognition sensors to the control center, and for receiving from the control center a control signal that controls the operation of the rotation drive module. A smart CCTV control system including these modules is presented.
Description
The present invention relates to a system for controlling one or more deployed CCTV cameras.
CCTV technology monitors the many images acquired from CCTV cameras installed in an observation area in order to monitor events occurring there, and preserves the images for use as evidence.
In order to ensure effective monitoring, blind spots in the observation area should be eliminated as far as possible. Since a camera usually has a limited angle of view, it is difficult to capture all directions from a single point simultaneously. To solve this problem, either a plurality of cameras are arranged radially to form a video device that simultaneously acquires images in all directions, or the camera is rotated to cover as wide an angular range as possible.
Installation cost is one of the factors determining the extent of the observation area in a CCTV system. For a wide observation area, installation cost becomes a very sensitive factor in the decision to introduce a control system, so the system should be designed to cover a wide observation area at low cost.
Some control systems adopting rotating CCTV have been proposed that extract moving objects, such as people, and follow them. This requires a technique for recognizing a moving object, which falls into two categories: using a sensor that senses the movement of the object, or extracting the moving object by sequentially comparing the acquired images with one another. Either the additional sensors, or a computing device to extract objects from the images, must be added at each camera installation location, which considerably increases the construction cost of the control system.
The present invention proposes an object-tracking CCTV control system that can be constructed at a relatively low cost. It also aims to manage multiple cameras effectively so as to increase the tracking efficiency for moving objects, and to provide the monitoring personnel with various information in a timely manner.
Other objects and advantages of the present invention will become apparent to those skilled in the art from the following detailed description.
According to an embodiment of the present invention, there is provided a smart CCTV control system including at least one image acquisition device and a control center that communicates with the image acquisition device and displays the images obtained from it. The image acquisition device includes a camera for acquiring an image; a rotation drive module for rotating the camera in a forward or reverse direction; recognition sensors, arranged radially on a horizontal plane about the camera's installation point, which measure the azimuth of an object moving within a limited sensing angle range and a limited sensing distance; and a communication module for transmitting the image information from the camera and the object information calculated from the recognition sensors to the control center, and for receiving from the control center a control signal that controls the operation of the rotation drive module.
Here, the recognition sensors may be arranged so that their sensing angle ranges partially overlap with those of neighboring sensors. Specifically, each recognition sensor may be a Doppler sensor or a PIR sensor.
Meanwhile, the image acquisition device or the control center may include an operation control module that controls the rotation drive module to rotate the camera toward the object sensed by the recognition sensors.
Further, if, while the camera is controlled to face an object, another object is recognized through the recognition sensors, the operation control module may rotate the camera toward the other object, capture its image for a predetermined time, and then control the rotation drive module to rotate back toward the existing object.
In addition, a plurality of image acquisition devices may be provided, spaced apart from each other. When a moving object is recognized by a recognition sensor of one image acquisition device, and no object is recognized by a neighboring image acquisition device, the camera of the neighboring device can be controlled to face the object based on the object information from the device that recognized the moving object.
Further, the control signal for rotating the camera of the neighboring image acquisition device may be transmitted from the control center.
According to another aspect of the present invention, there is provided a smart CCTV control system comprising a plurality of image acquisition devices arranged apart from each other, and a sensing unit, located in a blind spot between neighboring image acquisition devices, that recognizes objects. The sensing unit includes a plurality of recognition sensors that measure the azimuth of a moving object within a limited sensing angle range and a limited sensing distance, and a communication module that transmits the object information calculated from the recognition sensors to the control center or to an adjacent image acquisition device.
In this system, when a moving object is recognized by any one of the image acquisition devices or sensing units, and no object is recognized by a neighboring image acquisition device, the rotation drive module of the neighboring device may be controlled to direct its camera toward the object, based on the object information from the image acquisition device or sensing unit that recognized the moving object.
According to embodiments of the present invention, a control system can be constructed at low cost, and images can be recorded while effectively following an object. In addition, since the position and route of the object are grasped by the recognition sensors as well as by the acquired images, the camera and the recognition sensors provide double detection, and the blind spots of the monitoring are reduced accordingly. A reduction of the crime rate and a crime-prevention effect can also be expected in sparsely monitored areas, and evidence of various incidents can be preserved.
The effects of the present invention will be clearly understood by those skilled in the art from the details described below or in the course of practicing the invention.
FIG. 1 is a block diagram showing the configuration of a smart CCTV control system according to an embodiment of the present invention.
FIG. 2 is a perspective view conceptually showing an image acquisition device employed in the embodiment shown in FIG. 1.
FIG. 3 is a plan view conceptually showing the image acquisition device shown in FIG. 2.
FIG. 4 schematically shows the sensing range of the recognition sensors; FIG. 4(a) is a plan view related to the embodiment shown in FIG. 3, and FIG. 4(b) is a plan view showing an example in which four recognition sensors are arranged radially.
FIG. 5 is a plan view schematically showing a use state of a smart CCTV control system according to an embodiment of the present invention.
FIG. 6 is a plan view schematically showing another use state of the present invention.
FIG. 7 is a plan view schematically showing another use state of the present invention.
FIG. 8 is a plan view schematically showing another use state of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, the configuration and operation of a smart CCTV control system according to the present invention will be described with reference to the accompanying drawings. Note that the same reference numerals are used for the same or similar components throughout the drawings and embodiments.
In the drawings, p1, p2, p3, etc. indicate positions along the movement trajectory of the object over time, and the block arrow indicates the direction the camera is facing.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and are not to be construed as limiting its technical spirit. The shapes and dimensions shown in the drawings may be exaggerated for clarity of description.
FIG. 1 is a block diagram showing the configuration of a control system according to an embodiment of the present invention, and FIG. 2 is a perspective view showing an example of an image acquisition device.
The smart CCTV control system includes at least one image acquisition device and a control center. The control center communicates with the image acquisition device and displays the images obtained from it; as shown in FIG. 1, it may comprise a control center server, a control center DB, and a monitor.
Referring to FIG. 2, the camera acquires images while being rotated by the rotation drive module, which can rotate the camera in a forward or reverse direction.
On the other hand, a plurality of recognition sensors are arranged radially on a horizontal plane with respect to the installation point of the camera, and measure the azimuth of an object moving within a limited sensing angle range and a limited sensing distance.
As a specific example of the recognition sensor, a Doppler sensor or a PIR sensor may be used.
The Doppler sensor measures the speed of an object from the difference in wavelength between the emitted wave and the wave that returns after striking the moving object. A Doppler sensor can therefore detect moving objects.
A single Doppler sensor has difficulty specifying the position of a moving object relative to itself. However, when a plurality of Doppler sensors are installed radially, the orientation of the moving object can be calculated by combining the absolute coordinates of each sensor's installation position with its measured values. This will be described in detail later.
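As a rough illustration of the measurement principle just described, the sketch below converts a Doppler frequency shift into a radial speed. This is the generic reflection-Doppler formula, not taken from the patent; the 40 kHz carrier and the function name are assumptions for illustration.

```python
def radial_speed(f_emitted_hz, f_received_hz, wave_speed_mps=343.0):
    """Radial speed of a reflecting object from the Doppler shift.

    For a wave reflected off a moving target, the shift is
    f_d = 2 * v * f0 / c, so v = f_d * c / (2 * f0).
    A positive result means the object is approaching the sensor.
    """
    f_shift = f_received_hz - f_emitted_hz
    return f_shift * wave_speed_mps / (2.0 * f_emitted_hz)

# A 40 kHz ultrasonic burst returning at 40,233 Hz implies roughly
# 1 m/s of approach speed.
v = radial_speed(40_000.0, 40_233.0)
```

Note that this gives only the speed along the line of sight, which is why the patent combines several radially arranged sensors to recover orientation.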
The PIR sensor, on the other hand, recognizes the infrared radiation emitted from a heat source, such as a person, and thereby recognizes the approach of the object. Although a single PIR sensor cannot accurately specify the position of an object, a plurality of PIR sensors arranged at different angles can calculate the position of the object from the coordinates of the arranged sensors and from which sensor the object was detected.
Doppler and PIR sensors have the advantage of low price, so the installation cost of the control system does not increase greatly even when several of them are used as a set.
Meanwhile, FIGS. 2 and 3 show the arrangement of the recognition sensors.
Each recognition sensor has a limited sensing angle range and a limited sensing distance. The limit of the sensing angle range means that a single sensor can sense an object only within a certain horizontal angle about its facing direction.
If the sensing angle of the recognition sensor is 90 to 100 degrees and objects are to be detected in all azimuths, at least four recognition sensors should be arranged radially.
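The sensor count above follows from dividing the full circle by the sensing angle and rounding up. A minimal sketch (the function name is an assumption):

```python
import math

def min_sensors_for_full_coverage(sensing_angle_deg):
    """Minimum number of radially arranged sensors whose sensing angle
    ranges can together cover all 360 degrees without a gap."""
    return math.ceil(360.0 / sensing_angle_deg)

# 90-100 degree sensors need at least four units; the 130-degree
# sensors of FIG. 4(a) need only three.
```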
On the other hand, the limit of the sensing distance means that a sensor can sense an object only within a certain distance from its installation point.
In addition, when the sensing distance of the adopted recognition sensor is short and there are few supporting structures or buildings on which a camera can be installed, the installation interval of the image acquisition devices becomes long, and blind spots are widely distributed. In another embodiment of the present invention, a blind spot is reduced as much as possible by installing a sensing unit in it. This will be described later.
Referring again to FIGS. 2 to 4, the plurality of radially arranged recognition sensors may be arranged so that the sensing angle ranges of neighboring sensors overlap. In FIGS. 3 and 4(a), s1, s2, and s3 are recognition sensors arranged radially, and their sensing angle ranges are arranged to overlap so that there is no blind spot in any direction (360 degrees).
In addition, the orientation of the object can be grasped more precisely by arranging the recognition sensors so that the sensing angle ranges of neighboring sensors overlap each other.
For example, in FIG. 3, if an object is recognized simultaneously by sensors s1 and s2, it can be inferred that the object is located within the overlap of the sensing angle ranges of s1 and s2.
Further, in arranging the recognition sensors radially, the proportion of each sensor's sensing angle range that overlaps with its neighbors can be chosen by design. By configuring about two-thirds of each sensor's sensing angle range to overlap with the ranges of the neighboring sensors, the orientation of the object can be measured more precisely.
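The two-thirds figure can be checked with a small calculation: for n identical sensors spaced evenly over 360 degrees, each sensor overlaps each neighbor by (sensing angle minus spacing) degrees. A hedged sketch with assumed names:

```python
def overlap_fraction(fov_deg, n_sensors):
    """Fraction of one sensor's sensing angle range that is shared with
    its two neighbours, for n identical sensors spaced evenly."""
    spacing = 360.0 / n_sensors
    per_side = max(0.0, fov_deg - spacing)   # overlap with one neighbour
    return min(1.0, 2.0 * per_side / fov_deg)

# Four 130-degree sensors spaced 90 degrees apart share about 62%
# (roughly two-thirds) of each sensing angle range with neighbours.
frac = overlap_fraction(130.0, 4)
```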
For example, in FIG. 4(a), three recognition sensors s1, s2, and s3, each with a horizontal sensing angle of about 130 degrees, are arranged at regular intervals. In the drawing, p1, p2, p3, and p4 represent points on the movement trajectory of the object J, which has passed through p1, p2, and p3 and is now located at p4.
In this case, if the adopted recognition sensor cannot detect the orientation of an object by itself, sensor s3 only recognizes that an object exists somewhere in front of it; it cannot distinguish the positions p1 and p2, i.e., the specific orientation of the object.
In other words, from the information that the object is detected by sensor s3, it can only be inferred that the object is located toward the upper left of the figure, based on the direction in which s3 is installed. At this time, if the angle of view of the camera is narrower than the sensing angle range of s3, the camera may fail to capture the object.
In contrast, FIG. 4(b) shows four recognition sensors (s1, s2, s3, s4) having the same sensing angle range as in FIG. 4(a). The left third and the right third of each sensor's sensing angle range overlap with the ranges of the sensors adjacent to its left and right.
For an object J moving along the same trajectory as before, p1 is detected by sensors s3 and s4, while p2 is detected by s4 alone, so the orientations of p1 and p2 can be distinguished. In this way, FIG. 4(b) shows that the four recognition sensors divide the omnidirectional area into eight sectors, so the set of sensors can distinguish the object's orientation more finely.
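The eight-sector inference described above can be sketched as a lookup on which sensors currently fire: one active sensor implies its boresight direction, two adjacent active sensors imply the bisector of their boresights. The function and data layout below are assumptions for illustration, not the patent's implementation:

```python
def estimate_bearing(active, boresights):
    """Estimate object azimuth (degrees) from the set of sensor indices
    that currently detect it. One active sensor -> its boresight; two
    adjacent active sensors -> the bisector of their boresights (the
    overlap zone). Returns None when no unambiguous estimate exists."""
    if not active:
        return None
    if len(active) == 1:
        return boresights[next(iter(active))] % 360.0
    if len(active) == 2:
        a, b = sorted(active)
        ca, cb = boresights[a], boresights[b]
        # handle the wrap-around pair (e.g. boresights at 270 and 0)
        if cb - ca > 180.0:
            ca += 360.0
        return ((ca + cb) / 2.0) % 360.0
    return None  # three or more: ambiguous under this simple rule

# Four sensors with boresights 90 degrees apart, as in FIG. 4(b):
boresights = [0.0, 90.0, 180.0, 270.0]
```

With this layout, the four solo zones and four overlap zones give exactly the eight distinguishable sectors described in the text.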
Thus, by constructing the sensing angle ranges of neighboring sensors to partially overlap, the orientation of the object can be calculated precisely. Accordingly, the object can be tracked even with a camera having a narrow angle of view, and a clear image can be obtained because the object is placed at the center of the acquired image.
The operation of calculating the orientation of the object from the measured values of the plurality of recognition sensors can be performed in each individual image acquisition device, or in the control center after the measured values are transmitted to it.
Referring again to FIG. 1, the communication module transmits the image information from the camera and the object information calculated from the recognition sensors to the control center, and receives from the control center a control signal for controlling the operation of the rotation drive module. The image acquisition device or the control center includes an operation control module that generates this control signal so that the camera rotates toward the object sensed by the recognition sensors. When the operation control module is provided in the image acquisition device, the device itself can generate the control signal from the measured values of its recognition sensors.
On the other hand, when the operation control module is provided in the control center, the image acquisition device functions as a simple terminal and operates its rotation drive module only in response to control signals from the control center.
The image acquisition device further includes power supply means (not shown). Although not shown in FIG. 2, the set of recognition sensors, the communication module, and the like may be housed inside an enclosure so as not to be directly exposed to the outside.
FIG. 5 illustrates the object-tracking operation of the camera by the operation control module when an object enters the recognition range of the image acquisition device and moves along a certain path.
In the illustrated image acquisition device, three recognition sensors s1, s2, and s3 are arranged radially so that all directions are observed. The object J moves along p1, p2, and p3 and is currently located at p3.
When the object J is located at p1, entering the recognizable range for the first time, sensor s3 detects it. As described above, the remaining sensors s1 and s2 cannot detect the object at position p1, so the operation control module generates a control signal to point the camera in the direction covered by s3 alone (the block arrow indicates the direction of the camera). This control signal is transmitted to the rotation drive module, which rotates the camera accordingly.
Thereafter, as the object moves to p2, its presence is detected by both s1 and s3, and the operation control module recognizes that the object is positioned between them. A control signal is generated so that the camera faces the region (toward 2 o'clock in the drawing) where the sensing angle ranges of s1 and s3 overlap.
When the object moves to p3, it is no longer detected by s3 and is detected only by s1. The operation control module generates a control signal so that the camera looks in the same direction as s1, allowing the object J at p3 to enter the camera image. In this way, the direction of the camera can be controlled using only the detections of the recognition sensors.
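Once a target azimuth is known, the control signal for the rotation drive module must choose between the forward and reverse directions mentioned earlier. A plausible sketch of that choice (shortest arc; all names are assumptions):

```python
def rotation_command(current_deg, target_deg):
    """Choose forward or reverse rotation over the shorter arc.

    Returns (direction, degrees), where direction is 'forward'
    (increasing azimuth), 'reverse', or 'hold'."""
    delta = (target_deg - current_deg) % 360.0
    if delta == 0.0:
        return ("hold", 0.0)
    if delta <= 180.0:
        return ("forward", delta)
    return ("reverse", 360.0 - delta)

# Camera at 10 degrees, object bearing 350: rotate 20 degrees in reverse.
cmd = rotation_command(10.0, 350.0)
```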
If a recognition sensor capable of determining the orientation of an object on its own is employed, the control signal can be generated so that the camera always faces the object directly. In this case, the object is placed at the center of the captured image, so a more accurate image can be obtained, and a continuous image of the moving object is easy to acquire even with a camera having a narrow angle of view.
FIG. 6 relates to image acquisition of a new object sensed by a recognition sensor while a plurality of objects approach in sequence.
Referring to the drawing, object 1 (J1) enters the observable region of the image acquisition device and moves through p1 and p2. As described above, while the camera tracks object 1 (J1) along p1 and p2 and acquires its image, another object 2 (J2) may newly enter the observable area.
That is, this is the case where another moving object is recognized through the recognition sensors while the camera is being controlled to face the first object. In this case, the camera may turn toward the newly entered object, capture its image for a predetermined time, and then the rotation drive module may be controlled so that the camera faces the existing object again.
If object 2 (J2) is detected by sensor s2 while the camera is tracking object 1 (J1), the camera turns toward object 2 (J2) for a predetermined time to capture its image, and then returns to object 1 (J1).
Then, when object 1 (J1) leaves the observable area of the image acquisition device while object 2 (J2) still remains within it, the operation control module generates the control signal so that the camera faces object 2 (J2).
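The glance-then-return behaviour described for FIG. 6 can be sketched as a small scheduler. The `point_at` callback, the glance duration, and the injected clock are all assumptions for illustration, standing in for the real rotation-drive interface:

```python
import time  # only for the default monotonic clock

class CameraScheduler:
    """Track a 'primary' object; when a new object appears, point the
    camera at it for `glance_s` seconds, then return to the primary."""

    def __init__(self, point_at, glance_s=2.0, clock=time.monotonic):
        self.point_at = point_at        # callback: bearing -> None
        self.glance_s = glance_s
        self.clock = clock
        self._glance_until = 0.0

    def update(self, primary_bearing, new_bearing=None):
        if new_bearing is not None:     # a new object entered the range
            self._glance_until = self.clock() + self.glance_s
            self.point_at(new_bearing)
            return
        if self.clock() >= self._glance_until:  # glance window expired
            self.point_at(primary_bearing)
```

A fake clock makes the behaviour easy to see: the camera points at the primary, glances at the newcomer, ignores the primary until the window expires, then returns.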
With this control method, images of the many objects entering the observation area can be acquired. By securing an image of a newly appearing object while photographing the existing one, the presence of the new object can be expressed visually to the monitoring agent, and its image can be acquired as well.
The presence of a new object and its approximate orientation can be determined by analyzing the measured values of the recognition sensors in the observable area of the image acquisition device. In addition, by acquiring images of new objects, confirming them visually, and leaving them in the records, the reliability of various analysis data, such as object movement paths, is improved.
FIG. 7 relates to a control system with a plurality of image acquisition devices.
Two or more image acquisition devices may be installed in the target area, spaced at certain intervals from each other. To secure a wide field of view, the camera needs to be installed above the ground, so image acquisition devices are mounted mainly on street lamps or utility poles. In general, the installation interval depends on the arrangement of buildings and structures in the target area. In some places the devices may be arranged densely, so that the observable areas of neighboring devices partially overlap; in others they are arranged sparsely, and a blind spot B may arise between them.
In a control system having a plurality of image acquisition devices, the object information acquired by one device can serve as the basis for controlling the camera direction of a neighboring device.
Specifically, when an object is recognized by a recognition sensor of one image acquisition device, and a neighboring device installed nearby does not recognize any object, the camera of the neighboring device is controlled to face the object based on the object information from the device that recognized it.
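This neighbor pre-pointing presupposes that the detecting device's object information can be converted into an azimuth for each idle neighbor, using the devices' known installation coordinates. A hedged sketch with assumed data shapes (the dictionaries and field names are illustrative, not from the patent):

```python
import math

def bearing_from(origin_xy, target_xy):
    """Compass-style azimuth in degrees (0 = +y axis, clockwise)
    from an origin point to a target point."""
    dx = target_xy[0] - origin_xy[0]
    dy = target_xy[1] - origin_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def handoff_commands(devices, object_xy):
    """For every device that does not currently detect the object,
    compute the azimuth it should pre-point to, given the object
    position estimated by the detecting device.
    `devices`: id -> {'xy': (x, y), 'detects': bool}."""
    return {dev_id: bearing_from(state["xy"], object_xy)
            for dev_id, state in devices.items()
            if not state["detects"]}
```

A device that already detects an object is left alone, consistent with the priority rule discussed below for cameras with their own targets.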
In FIG. 7, the object J moves from the upper right toward the middle of the drawing, passing from p1 to p2 and entering the observable area of the central first image acquisition device. According to the above, the recognition sensors of the first image acquisition device detect the object J, its orientation is calculated, and its camera tracks and photographs the object. The image and object information generated by the first image acquisition device are transmitted to the control center. Since the camera of the neighboring image acquisition device is pointed in advance at the object J, which is likely to enter its own observation area, an image of the object can be acquired promptly as soon as it does.
When the camera of a neighboring image acquisition device is to be controlled based on the object information of another device, the control center is the subject that issues the control signal.
If the neighboring image acquisition device recognizes an object in its own observable area, the control center does not transmit the control signal to it, so that the object in its own observable area is photographed preferentially.
Alternatively, the control center may transmit a control signal so that neighboring devices always face the object whenever object information is generated by any one device, while giving priority to control signals for photographing an object already present in a device's own observable area, so that such an object continues to be photographed.
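The priority rule in this paragraph reduces to a simple decision: a camera that detects an object in its own observable area ignores the control center's pre-pointing command. A minimal sketch (names assumed):

```python
def select_command(own_detection, own_bearing, center_bearing):
    """Priority rule: a camera keeps photographing an object in its own
    observable area; only an idle camera follows the control center's
    pre-pointing command. Returns the bearing the camera should face,
    or None when there is nothing to do."""
    if own_detection:
        return own_bearing      # a locally detected object always wins
    return center_bearing       # may be None: hold position
```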
FIG. 8, on the other hand, relates to a smart CCTV control system according to another embodiment of the present invention.
The configuration and operation of the image acquiring device provided in another embodiment of the present invention may be the same as the configuration and operation of the image acquiring device of the above-described embodiment within a range that does not conflict with the following description.
Referring to FIG. 8, a plurality of image acquisition devices are arranged apart from each other, and a sensing unit is located in the blind spot B between neighboring image acquisition devices.
The sensing unit includes a plurality of recognition sensors, arranged radially on a horizontal plane, that measure the azimuth of a moving object within a limited sensing angle range and a limited sensing distance, and a communication module that transmits the object information calculated from the recognition sensors to the control center or to an adjacent image acquisition device.
In a control system provided with both sensing units and image acquisition devices, an object is at any moment located within the observation region of some sensing unit or of some image acquisition device.
When an object enters the observation area of a sensing unit or an image acquisition device and is sensed there, the camera of the neighboring image acquisition device is controlled to face the object, so that an image of the object entering the neighboring device's area can be acquired quickly.
The smart CCTV control system according to the embodiment of the present invention can be constructed by adding the recognition sensors, communication module, operation control module, sensing units, and the like to an existing rotating CCTV camera control system. The cost of the additional components, including the recognition sensors, is low enough that the system can be deployed over a large area at low cost by incorporating it into existing control systems.
In particular, even when the installation interval of the cameras in the existing CCTV system is wide and effective monitoring of a wide area is not easy, a blind spot can be greatly reduced by additionally installing a sensing unit.
100: System
10, 10a, 10b: Image acquisition device
11: camera 111: angle of view 12: rotation drive module
13: Recognition sensor 131: Sensing angle range 132: Sensing distance
14: Communication module 15: Operation control module
20: Control center
21: control center server 22: control center DB 23: monitor
T: Support B: Blind spot 30: Sensing unit J, J1, J2: Object
Claims (9)
A smart CCTV control system comprising at least one image acquisition device and a control center communicating with the image acquisition device and displaying images obtained from it, wherein the image acquisition device includes:
A camera for acquiring an image;
A rotation driving module for rotating the camera in a forward direction or a reverse direction;
Recognition sensors disposed radially on a horizontal plane with respect to an installation point of the camera and measuring an azimuth angle with respect to an object moving within a limited sensing angle range and a limited sensing distance; And
A communication module for transmitting image information from the camera and object information calculated from the recognition sensor to the control center and receiving a control signal for controlling operation of the rotation drive module from the control center;
The recognition sensors are Doppler sensors or PIR sensors, at least three of which are arranged radially on a horizontal plane,
The left third and the right third of the sensing angle range of each recognition sensor are arranged to overlap with the sensing angle ranges of the recognition sensors adjacent to it on the left and right,
The image acquiring device or the control center,
And an operation control module operable to control the rotation drive module to rotate the camera toward the object sensed by the recognition sensor,
The operation control module calculates the orientation of the object according to whether a single sensor perceives the object by itself or two adjacent sensors perceive it simultaneously, and transmits the control signal based on the calculated orientation.
Smart CCTV control system.
The operation control module
When another object is recognized through the recognition sensors while the camera is controlled to face the object,
the camera is rotated toward the other object, an image of the other object is captured for a predetermined time, and then the rotation drive module is controlled to rotate back toward the existing object
Smart CCTV control system.
Wherein the image acquiring device includes a plurality of image acquiring devices spaced apart from each other,
When a moving object is recognized by a recognition sensor provided in any one of the image acquisition devices, and no object is recognized by the neighboring image acquisition devices adjacent to that device,
the camera of the neighboring image acquisition device is controlled to face the object based on the object information of the image acquisition device that recognized the moving object
Smart CCTV control system.
A control signal for rotating the camera of the neighboring image acquiring device is transmitted from the control center
Smart CCTV control system.
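One plausible way for the control center to derive the rotation command for the neighboring device is to project the object's position from the detecting device's azimuth and sensing distance, then take the bearing from the neighboring device to that point. The planar coordinate frame and the bearing convention (0° = +y, clockwise) are assumptions made for this sketch, not details from the patent.

```python
import math

def handoff_bearing(detector_xy, azimuth_deg, distance_m, neighbor_xy):
    """Estimate which way a neighbouring device should point.

    The detecting device reports the object's azimuth and a (limited)
    sensing distance; from its known position, the object's location
    is projected, and the bearing from the neighbouring device to
    that location gives the rotation command. Coordinate frame and
    bearing convention (0 deg = +y, clockwise) are assumptions.
    """
    az = math.radians(azimuth_deg)
    # Project the object's position from the detecting device.
    ox = detector_xy[0] + distance_m * math.sin(az)
    oy = detector_xy[1] + distance_m * math.cos(az)
    # Bearing from the neighbouring device to that position.
    dx = ox - neighbor_xy[0]
    dy = oy - neighbor_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0
```

For example, a detector at the origin reporting the object at azimuth 90° and 10 m places it at (10, 0); a neighbor at (10, -10) would then be told to point due north (0°).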
Wherein a plurality of the image acquiring devices are arranged spaced apart from one another,
a sensing unit that detects objects is positioned in a blind spot between neighboring image acquiring devices,
and the sensing unit includes a plurality of recognition sensors arranged radially in a horizontal plane, which measure the azimuth of a moving object within a limited sensing angle range and a limited sensing distance,
and a communication module that transmits the object information computed by the recognition sensors to the control center or to an adjacent image acquiring device,
Smart CCTV control system.
Wherein, when a moving object is detected by one of the image acquiring devices or sensing units but no object is detected by a neighboring image acquiring device,
the rotation drive module of the neighboring image acquiring device controls the camera to face the object based on the object information from the detecting image acquiring device or sensing unit,
Smart CCTV control system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150118650A KR101600314B1 (en) | 2015-08-24 | 2015-08-24 | Smart CCTV control system |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101600314B1 true KR101600314B1 (en) | 2016-03-07 |
Family
ID=55540344
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101600314B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101786525B1 (en) * | 2017-06-14 | 2017-10-18 | 주식회사 신아시스템 | System for saving power and controling operation using human body detection sensor with 360 degree |
EP4296985A1 (en) * | 2022-06-15 | 2023-12-27 | Arlo Technologies, Inc. | Electronic monitoring system with activity zone alignment tool |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007323563A (en) * | 2006-06-05 | 2007-12-13 | Akira Kawada | Object-detecting and automatic tracking device |
KR100980586B1 (en) * | 2010-05-07 | 2010-09-06 | 주식회사 에스엘티 | Method for intelligent image security using single or multi camera and system thereof |
JP2011258031A (en) * | 2010-06-09 | 2011-12-22 | Dx Antenna Co Ltd | Tracker for monitoring camera |
KR101328246B1 (en) | 2013-05-15 | 2013-11-14 | 강성진 | Apparatus for tracking of moving target and the method thereof |
KR101340897B1 (en) | 2013-04-22 | 2013-12-13 | 주식회사 사라다 | The anticrime system in a school zone |
KR20140058192A (en) | 2012-11-06 | 2014-05-14 | 에스케이텔레콤 주식회사 | Control image relocation method and apparatus according to the direction of movement of the object of interest |
KR101425505B1 (en) * | 2013-10-25 | 2014-08-13 | 홍승권 | The monitering method of Intelligent surveilance system by using object recognition technology |
2015-08-24: Application KR1020150118650A filed in KR; granted as patent KR101600314B1, status active (IP Right Grant).
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2710801B1 (en) | Surveillance system | |
US7535353B2 (en) | Surveillance system and surveillance method | |
US5980123A (en) | System and method for detecting an intruder | |
EP2779130B1 (en) | GPS directed intrusion system with real-time data acquisition | |
US20130208123A1 (en) | Method and System for Collecting Evidence in a Security System | |
US8717439B2 (en) | Surveillance system and method | |
CN106657921A (en) | Portable radar perimeter security and protection system | |
EP3452848B1 (en) | Monitoring method using a camera system with an area movement detection | |
KR101275297B1 (en) | Camera Apparatus of tracking moving object | |
CN103179342A (en) | Monitoring camera and method for monitoring | |
KR20150060626A (en) | Active Type Unmanned Security System | |
CN103592901A (en) | Holder control airport monitoring system | |
KR101600314B1 (en) | Smart CCTV control system | |
KR101648292B1 (en) | Unmanned monitoring system apparatus | |
RU2542873C1 (en) | System for technical surveillance of protected area | |
KR20170112714A (en) | System for automatic monitoring of distributed area using flying photographing unit | |
EP3891711B1 (en) | Method of optical alignment and verification of field of view integrity for a flame detector and system | |
WO2020135519A1 (en) | Mobile detection device | |
US20210248384A1 (en) | Building evacuation method and building evacuation system | |
JP7176868B2 (en) | monitoring device | |
CN113068000A (en) | Method, device, equipment and system for monitoring video target and storage medium | |
KR101870151B1 (en) | Enforcement system of illegal parking | |
KR20230089493A (en) | Multi-camera fire detector | |
RU2563557C2 (en) | Multispectral system and method for electro-optical surveillance of protected area | |
KR100452092B1 (en) | Unidentified People Tracking Device Using Dual Cameras And Method Thereof |
Legal Events
Date | Code | Title | Description
---|---|---|---
| E701 | Decision to grant or registration of patent right | |
| GRNT | Written decision to grant | |
2018-12-10 | FPAY | Annual fee payment | Year of fee payment: 4 |
2019-12-12 | FPAY | Annual fee payment | Year of fee payment: 5 |