KR101600314B1 - Smart CCTV control system - Google Patents

Smart CCTV control system

Info

Publication number
KR101600314B1
Authority
KR
South Korea
Prior art keywords
image
camera
sensor
sensors
sensing
Prior art date
Application number
KR1020150118650A
Other languages
Korean (ko)
Inventor
김용진
조호룡
박창호
Original Assignee
(주)유리네트웍스
인천광역시 서구(인천광역시서구청장)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)유리네트웍스, 인천광역시 서구(인천광역시서구청장)
Priority to KR1020150118650A
Application granted
Publication of KR101600314B1


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00: Burglar, theft or intruder alarms
    • G08B 13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189: Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194: Actuation using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196: Actuation using image scanning and comparing systems using television cameras
    • G08B 13/19617: Surveillance camera constructional details
    • G08B 13/1963: Arrangements allowing camera rotation to change view, e.g. pivoting camera, pan-tilt and zoom [PTZ]
    • G08B 13/19654: Details concerning communication with a camera

Abstract

The present invention relates to a system for controlling one or more deployed CCTVs. In one embodiment, the system comprises at least one image acquisition device and a control center that communicates with the image acquisition device and displays the images obtained from it. The image acquisition device includes a camera for acquiring images; a rotation drive module for rotating the camera in a forward or reverse direction; recognition sensors, arranged radially about the camera installation point, for measuring the azimuth of an object moving within a limited sensing angle range and a limited sensing distance; and a communication module for transmitting the image information from the camera and the object information calculated from the recognition sensors to the control center, and for receiving a control signal that controls the operation of the rotation drive module. A smart CCTV control system including these modules is presented.

Description

Smart CCTV control system

The present invention relates to a system for controlling one or more deployed CCTVs.

CCTV control technology monitors the images acquired from multiple CCTVs installed in an observation area in order to detect events occurring there and to retain the recorded images for use as evidence.

In order to ensure effective monitoring, blind spots in the observation area must be eliminated as far as possible. Since a camera usually has a limited angle of view, it is difficult for a camera at any one point to capture all directions simultaneously. To solve this problem, either a plurality of cameras are arranged radially to form an imaging device that simultaneously acquires images in all directions, or a single camera is rotated so as to cover as wide an angular range as possible.

Installation cost is one of the factors that determines how large an observation area a CCTV system can cover. For a wide observation area, the installation cost of the control system becomes a very sensitive factor in the decision to introduce it, since the aim is to cover as wide an area as possible at the lowest possible construction cost.

Some control systems adopting rotating CCTVs have been proposed that extract moving objects, such as people, and follow them. This requires a technique for recognizing a moving object, which falls broadly into two categories: using a sensor that senses the movement of the object, or extracting the moving object from the video by comparing successively acquired images. Either way, an additional sensor or a computing device for extracting objects from the images is required at each camera installation location, which considerably increases the construction cost of the control system.

Prior art documents: Korean Registered Patent No. 10-1328246 (registered November 2013); Korean Patent Publication No. 10-2014-0058192 (published Apr. 2014); Korean Registered Patent No. 10-1340897 (registered December 13, 2013).

The present invention proposes an object-tracking CCTV control system that can be constructed at relatively low cost. It also aims to manage multiple cameras effectively so as to increase the tracking efficiency for moving objects, and to provide various information to the monitoring personnel in a timely manner.

Other objects and advantages of the present invention will become apparent to those skilled in the art from the following detailed description.

According to an embodiment of the present invention, there is provided a smart CCTV control system including at least one image acquisition device and a control center that communicates with the image acquisition device and displays the images obtained from it. The image acquisition device includes a camera for acquiring images; a rotation drive module for rotating the camera in a forward or reverse direction; recognition sensors arranged radially on a horizontal plane about the installation point of the camera, which measure the azimuth of an object moving within a limited sensing angle range and a limited sensing distance; and a communication module that transmits the image information from the camera and the object information calculated from the recognition sensors to the control center, and receives from the control center a control signal for controlling the operation of the rotation drive module.

Here, the recognition sensors may be arranged such that their sensing angle ranges partially overlap with those of neighboring recognition sensors. Specifically, each recognition sensor may be a Doppler sensor or a PIR sensor.

Meanwhile, the image acquisition device or the control center may include an operation control module that controls the rotation drive module so as to rotate the camera toward the object sensed by the recognition sensors.

Further, if another object is recognized through the recognition sensors while the camera is being controlled to face an object, the operation control module may rotate the camera toward the other object, capture its image for a predetermined time, and then control the rotation drive module to rotate back toward the existing object.

In addition, a plurality of image acquisition devices may be provided spaced apart from each other. When a moving object is recognized by the recognition sensors of any one image acquisition device and no object is recognized by a neighboring image acquisition device, the camera of the neighboring device can be controlled to face the object based on the object information from the device that recognized the moving object.

Further, a control signal for rotating the camera of the neighboring image acquiring device may be transmitted from the control center.

According to another aspect of the present invention, there is provided a smart CCTV control system in which a plurality of image acquisition devices are arranged spaced apart from each other and a sensing unit that recognizes objects is located in a blind spot between neighboring image acquisition devices. The sensing unit includes a plurality of recognition sensors, arranged radially on a horizontal plane, for sensing the azimuth of a moving object within a limited sensing angle range and a limited sensing distance, and a communication module for transmitting the object information calculated from the recognition sensors to the control center or to an adjacent image acquisition device.

At this time, when a moving object is recognized by any one of the image acquisition devices or sensing units, and no object is recognized by the image acquisition device adjacent to that device or sensing unit, the rotation drive module of the neighboring image acquisition device may be controlled to direct its camera toward the object, based on the object information of the image acquisition device or sensing unit that recognized the moving object.

According to the embodiments of the present invention, a control system can be constructed at low cost, and images can be recorded while effectively following an object. In addition, since the position and route of the object are grasped by the recognition sensors as well as through the acquired images, the camera and the recognition sensors provide double detection, and the blind spots of monitoring are reduced accordingly. A reduction in crime rates and a deterrent effect in sparsely monitored areas can also be expected, and evidence of various incidents can be secured.

The effects of the present invention will become clear to those skilled in the art from the specific details described below or in the course of practicing the present invention.

FIG. 1 is a block diagram showing the configuration of a smart CCTV control system according to an embodiment of the present invention.
FIG. 2 is a perspective view conceptually showing an image acquisition device employed in the embodiment shown in FIG. 1.
FIG. 3 is a plan view conceptually showing the image acquisition device shown in FIG. 2.
FIG. 4 schematically shows the sensing range of the recognition sensors; FIG. 4(a) is a plan view related to the embodiment shown in FIG. 3, and FIG. 4(b) is a plan view showing an example in which four recognition sensors are arranged radially.
FIG. 5 is a plan view schematically showing a use state of a smart CCTV control system according to an embodiment of the present invention.
FIG. 6 is a plan view schematically showing another use state of the present invention.
FIG. 7 is a plan view schematically showing another use state of the present invention.
FIG. 8 is a plan view schematically showing yet another use state of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS. Hereinafter, the configuration, functions, and operation of a smart CCTV control system according to the present invention will be described with reference to the accompanying drawings. The same reference numerals are used for the same or similar components throughout the drawings and embodiments.

In the drawings, the triangles labeled p1, p2, p3, and so on indicate positions over time along the movement trajectory of an object, and the block arrow indicates the direction in which the camera is facing.

The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and are not to be construed as limiting its technical spirit. It should be understood by those skilled in the art that the shapes shown in the drawings are not necessarily to scale and may be exaggerated or simplified for ease of explanation.

FIG. 1 is a block diagram showing the configuration of a control system according to an embodiment of the present invention, and FIG. 2 is a perspective view showing an example of an image acquisition device.

The control system 100 according to an embodiment of the present invention includes at least one image capturing apparatus 10 and a control center 20 communicating with the image capturing apparatus 10 to provide images to monitoring personnel and the like.

The image acquisition device 10 includes a camera 11, a rotation drive module 12, a plurality of recognition sensors 13, and a communication module 14.

The camera 11 is for acquiring an image, and generates a moving image. The camera 11 has an angle of view 111 within a certain range although there is a difference depending on the lenses provided. When one camera 11 is operated in the image acquiring device, since the image that can be acquired is limited according to the limitation of the angle of view 111, the camera 11 is rotated and controlled to acquire an image in a desired direction.

The rotation drive module 12 is a member that rotates the camera 11 in a horizontal plane, and includes a motor capable of rotating in a normal and reverse direction, as well as gear parts, a camera housing, and a control circuit. The detailed configuration and operation relationship of the rotation drive module are well known in the art, and a description thereof will be omitted here.

Referring to FIG. 2, the camera 11 and the rotation drive module 12 are installed on the upper part of the streetlight post T. In other embodiments not shown, the camera and the rotation drive module may be installed in various other suitable places.

The rotation of the camera 11 by the rotation drive module 12 can be made over 360 degrees, or can be made in any limited angular range. The range of the camera rotation angle may vary depending on the installation location of the camera and the design intention of the control system. In the illustrated embodiment, it is assumed that the camera is rotatable in all directions (360 degrees).

On the other hand, a plurality of recognition sensors 13 constitute one set and are arranged radially on a horizontal plane around the installation point of the camera 11. In this example, the camera 11 is installed at the upper end of the street lamp post T, and the recognition sensors 13 are arranged radially at the middle height of the post T, as shown in FIG. 2. The installation height of the recognition sensors 13 is determined in consideration of the object to be sensed; when the object is a person, it can be approximately the height of an adult's waist.

The recognition sensor 13 is for sensing an object; it mainly aims to recognize a moving person, and further aims to calculate positional information of the perceived object.

As a specific example of the recognition sensor 13, there is a Doppler sensor or a PIR sensor (Pyroelectric Infrared Ray Sensor).

A Doppler sensor measures the speed of an object from the shift in wavelength when the wave it emits hits the moving object and returns. A Doppler sensor can therefore detect moving objects.

A single Doppler sensor, however, cannot easily specify the position of a moving object relative to itself. When a plurality of Doppler sensors are installed radially, the orientation of the moving object can be calculated by combining the absolute coordinates of each sensor's installation position with its measured values. This will be described in detail later.

A PIR sensor, on the other hand, detects the infrared radiation emitted by a heat source, such as a person, and thereby recognizes the approach of the object. Although a single PIR sensor cannot accurately specify the position of an object, a plurality of PIR sensors oriented at different angles can be combined, and the position of the object can then be calculated from the coordinates of the arranged sensors and from which of them detected the object.

Both Doppler sensors and PIR sensors have the advantage of low cost, so the installation cost of the control system does not increase greatly even when several of them are used as a set.

On the other hand, FIG. 2 and FIG. 3 show the arrangement relationship of the perception sensors.

The recognition sensor 13 has a limited detection range within which an object can be detected. Any one sensor is limited in the sensing angle range 131 within which an object can be recognized, and limited in the sensing distance 132 at which an object can be recognized. These limits depend on the type, specification, operating conditions, and the like of the sensor selected as the recognition sensor.

The limit of the sensing angle range 131 of the adopted recognition sensor 13 determines how many recognition sensors must be arranged to cover the azimuth range over which objects are to be detected. As shown in FIGS. 2 and 3, when the sensing angle range 131 of the recognition sensor 13 is approximately 130 degrees and objects are to be detected over a full 360 degree azimuth, three recognition sensors s1, s2, and s3 can be arranged radially so that objects are recognized in all directions.

If the sensing angle of the recognition sensor is 90° to 100° and objects are to be detected in all directions, at least four recognition sensors should be arranged radially.
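As a quick illustration of this sizing rule (not part of the patent itself; the function name and defaults are assumptions), the minimum number of identical, evenly spaced sensors needed to cover a given azimuth span follows directly from the sensing angle:

```python
import math

def min_sensor_count(sensing_angle_deg: float, coverage_deg: float = 360.0) -> int:
    """Smallest number of identical sensors, mounted at equal angular
    intervals, whose sensing angle ranges can cover coverage_deg
    without gaps (neighboring ranges may overlap)."""
    if sensing_angle_deg <= 0:
        raise ValueError("sensing angle must be positive")
    return max(1, math.ceil(coverage_deg / sensing_angle_deg))

# Cases mentioned in the text: ~130 deg sensors -> 3 units for all directions,
# 90 to 100 deg sensors -> 4 units.
print(min_sensor_count(130))   # 3
print(min_sensor_count(90))    # 4
print(min_sensor_count(100))   # 4
```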

On the other hand, the limit of the sensing distance 132 of the cognitive sensor 13 is related to the problem of how closely to arrange the image capturing apparatus when introducing the control system to a target area of a predetermined size. The longer the detection distance, the larger the area can be covered by a small number of image acquisition devices.

In addition, when the sensing distance of the adopted recognition sensor is short, or when supporting structures or buildings on which cameras can be installed are sparse and the installation interval of the image acquisition devices is therefore long, wide blind spots may remain. In another embodiment of the present invention, such blind spots are reduced as much as possible by installing a sensing unit in each blind spot. This will be described later.

Referring again to Figs. 2 to 4, a plurality of radially arranged cognitive sensors may be arranged so that the sensing angular ranges of neighboring cognitive sensors overlap. In FIGS. 3 and 4A, s1, s2, and s3 are cognition sensors arranged in a radial direction, and there is no blind spot with respect to all directions (360 degrees) by arranging the sensing angle ranges 131 to overlap.

In addition, the orientation of the object can be grasped more precisely by arranging the recognition sensors so that the sensing angle ranges 131 of neighboring sensors overlap by a certain amount.

For example, in FIG. 3, if an object is recognized simultaneously by recognition sensor s1 and recognition sensor s2, it can be inferred that the object is located within the region where the sensing angle ranges of s1 and s2 overlap.

Further, when arranging the recognition sensors radially, the sensors can be placed so that the left one-third of the sensing angle range of any one sensor overlaps the sensing angle range of the sensor to its left, and the right one-third overlaps the sensing angle range of the sensor to its right.

By configuring roughly two-thirds of each sensor's total sensing angle range to overlap with the ranges of its neighbors in this way, the orientation in which the object is positioned can be measured more precisely.

For example, in FIG. 4(a), three recognition sensors s1, s2, and s3, each having a horizontal sensing angle of about 130 degrees, are arranged at regular intervals. In the drawing, p1, p2, p3, and p4 represent specific points on the movement trajectory of the object J, which has passed p1, p2, and p3 and is now located at p4.

In this case, if the recognition sensor in use cannot determine the orientation of an object on its own, only recognition sensor s3 recognizes that an object exists somewhere in front of it; it can neither distinguish between positions p1 and p2 nor determine the specific orientation of the object.

In other words, from the information that the object is detected in the cognitive sensor s3, it is inferred that the object will be located at the upper left of the figure based on the direction in which the cognitive sensor s3 is installed. At this time, if the angle of view 111 of the camera 11 is sufficiently wide, an object can be captured in the image of the camera 11 by controlling the camera to be directed to the upper left side with respect to the drawing.

In contrast, FIG. 4(b) shows four recognition sensors (s1, s2, s3, s4) having the same sensing angle range as in FIG. 4(a). The left 1/3 of the sensing angle range 131 of recognition sensor s4 overlaps the sensing angle range of sensor s1, and the right 1/3 overlaps the sensing angle range of sensor s3.

For the object J moving along the same trajectory, p1 is detected by both recognition sensors s3 and s4, while p2 is detected only by recognition sensor s4, so the orientations of p1 and p2 can be distinguished. In this way, FIG. 4(b) shows that the omnidirectional area is divided into eight sectors by the four recognition sensors, so the sensor set can distinguish the orientation of the object J more finely.

Thus, the orientation of the object can be precisely calculated by constructing the detection angles of the neighboring sensors to be partially overlapped. Accordingly, the object can be traced even with a camera having a narrow angle of view, and a clear image of the object can be obtained because the object is placed at the center of the acquired image.
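As a rough sketch of the sector-based orientation estimate described above (illustrative only; the Sensor class, the estimate_bearing function, and the mounting azimuths are assumptions, not taken from the patent), each sensor can be modeled by the azimuth it faces and its sensing angle, and the estimated bearing of the object is the center of the azimuth region seen by exactly the sensors that currently report a detection:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sensor:
    name: str
    facing_deg: float         # azimuth the sensor points toward
    sensing_angle_deg: float  # total horizontal sensing angle (e.g. ~130)

    def covers(self, bearing_deg: float) -> bool:
        """True if the bearing lies inside this sensor's sensing angle range."""
        diff = (bearing_deg - self.facing_deg + 180.0) % 360.0 - 180.0
        return abs(diff) <= self.sensing_angle_deg / 2.0

def estimate_bearing(sensors: list[Sensor], triggered: set[str],
                     step_deg: float = 1.0) -> Optional[float]:
    """Center of the azimuth region covered by every triggered sensor and
    by no idle sensor; None if the pattern does not match the layout."""
    candidates = []
    bearing = 0.0
    while bearing < 360.0:
        hit = {s.name for s in sensors if s.covers(bearing)}
        if hit == triggered:
            candidates.append(bearing)
        bearing += step_deg
    if not candidates:
        return None
    x = sum(math.cos(math.radians(c)) for c in candidates)
    y = sum(math.sin(math.radians(c)) for c in candidates)
    return math.degrees(math.atan2(y, x)) % 360.0

# Four ~130 deg sensors mounted 90 deg apart, as in FIG. 4(b), so that
# neighboring ranges overlap by roughly one third of their width.
layout = [Sensor("s1", 0, 130), Sensor("s2", 90, 130),
          Sensor("s3", 180, 130), Sensor("s4", 270, 130)]
print(estimate_bearing(layout, {"s3", "s4"}))  # overlap zone of s3 and s4, ~225 deg
print(estimate_bearing(layout, {"s4"}))        # zone seen only by s4, ~270 deg
```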

The operation of calculating the orientation of the object from the measured values of the plurality of recognition sensors can be performed in each individual image acquisition device 10 by the operation control module 15 installed together with the set of recognition sensors 13. Alternatively, the recognition sensors may transmit their measured values over the network to the control center, and an operation control module installed in the control center server may calculate the orientation and position of the object from those values.

Referring again to FIG. 1, the communication module 14 transmits the measurement value of the recognition sensor to the control center or receives the control signal transmitted from the control center.

Specifically, the communication module 14 transmits the image or photograph, that is, the image information obtained from the camera 11, to the control center 20.

When the operation control module 15 is provided in the image acquisition device 10, the data on the orientation of the object calculated by the operation control module 15 is transmitted to the control center 20 as object information. Alternatively, if the operation control module is provided in the control center instead of the image acquisition device, the communication module transmits data containing the measured values of the recognition sensors to the control center as object information.

Further, the communication module 14 receives from the control center 20 a control signal for controlling the operation of the rotation drive module 12. The received control signal is passed to the rotation drive module 12 so that the camera 11 is rotated toward a specific orientation (or object). Here, the control signal is an electrical signal that rotates the camera forward or in reverse to face the object detected by the recognition sensors, or an electrical signal that rotates the camera in a specific direction intended by the control center 20.
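The forward/reverse control signal itself can be pictured with a minimal sketch (an illustration under assumed names, not the patent's implementation): given the camera's current azimuth and the target azimuth derived from the object information, pick the rotation direction with the smaller angular travel.

```python
def rotation_command(current_deg: float, target_deg: float) -> tuple[str, float]:
    """Return the rotation direction ('forward' or 'reverse') and the angular
    travel needed to turn the camera from current_deg to target_deg,
    taking the shorter way around."""
    delta = (target_deg - current_deg) % 360.0
    if delta <= 180.0:
        return ("forward", delta)         # e.g. clockwise
    return ("reverse", 360.0 - delta)     # e.g. counter-clockwise

# Camera at 0 deg, object estimated at 300 deg: reversing by 60 deg is shorter.
print(rotation_command(0.0, 300.0))   # ('reverse', 60.0)
print(rotation_command(0.0, 90.0))    # ('forward', 90.0)
```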

The network formed by the communication module 14 may be a star-shaped network centered on the control center 20, and may further include communication links to other image acquisition devices or to the sensing units described later.

The operation control module 15 provided in the image acquisition device 10 calculates the orientation of the object from the measured values of the recognition sensors 13 and generates the corresponding control signal.

When the operation control module 15 is provided in the image acquisition device 10, the generated control signal is passed to the rotation drive module 12 without involvement of the control center 20, so that images of the object can be obtained autonomously. In this case, the control center monitors the images transmitted from the image acquisition device, and if necessary the monitoring personnel can transmit a control signal that takes precedence over the operation control module in controlling the rotation drive module.

On the other hand, when the operation control module is provided in the control center, the image acquisition device functions as a simple terminal, and is configured to operate the rotation drive module only by the control signal of the control center.

The image acquiring apparatus further includes power supply means (not shown). Although not shown in FIG. 2, the set of the recognition sensor, the communication module, and the like may be provided inside the enclosure so as not to be directly exposed to the outside.

The control center 20 is provided with a plurality of monitors 23 for presenting the image information transmitted from the image acquisition devices 10 to the monitoring personnel, along with a control center server 21, a DB 22 for storing and recording various information, and the like.

FIG. 5 relates to the object tracking operation of the camera by the operation control module when an object enters the recognition range of the image acquisition device and moves along a certain path.

In the illustrated image acquiring device, three recognition sensors s1, s2, and s3 are radially provided, and all directions are observed. The object J is moving along p1, p2, and p3, and the location of the object is p3.

When the object J first enters the recognizable range at p1, recognition sensor s3 detects it. At this time, as described above, the remaining sensors s1 and s2 cannot detect the object at position p1, so the operation control module generates a control signal that turns the camera 11 (the block arrow indicates the direction the camera is facing) toward the direction covered by s3. This control signal is transmitted to the rotation drive module so that the camera 11 faces the same direction as s3; since the object located at p1 falls within the range allowed by the angle of view 111 of the camera 11, the object appears in the acquired image.

Thereafter, as the object moves to p2, its presence is detected by both recognition sensors s1 and s3, and the operation control module recognizes that the object is positioned between them. A control signal is generated so that the camera faces the portion (the 2 o'clock direction in the drawing) where the sensing angle ranges of s1 and s3 overlap.

When the object moves to p3, the object J is no longer detected by recognition sensor s3 and is detected only by recognition sensor s1. The operation control module generates a control signal so that the camera looks in the same direction as s1, allowing the object J located at p3 to enter the camera image. In this way, the direction of the camera can be controlled using only the detections of the recognition sensors.
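Reusing the Sensor and estimate_bearing sketch given earlier (both are illustrative assumptions; the mounting azimuths and detection patterns below merely mimic the FIG. 5 walkthrough, not the patent's own data), the sequence p1, p2, p3 can be expressed as triggered-sensor patterns driving the camera bearing:

```python
# Three ~130 deg recognition sensors mounted 120 deg apart, as in FIG. 5.
# Assumes Sensor and estimate_bearing from the earlier sketch are in scope.
layout3 = [Sensor("s1", 0, 130), Sensor("s2", 120, 130), Sensor("s3", 240, 130)]

# Which sensors report the object at each point of the trajectory p1 -> p2 -> p3.
trajectory = {
    "p1": {"s3"},          # only s3 sees the object
    "p2": {"s1", "s3"},    # object inside the overlap of s1 and s3
    "p3": {"s1"},          # only s1 sees the object
}

for point, triggered in trajectory.items():
    bearing = estimate_bearing(layout3, triggered)
    print(point, "-> aim camera toward",
          "unknown" if bearing is None else f"{bearing:.0f} deg")
```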

If a cognitive sensor capable of distinguishing the orientation of an object with a single cognitive sensor is employed, the control signal can be generated so that the camera always faces the object. In this case, since the object is placed at the center of the photographed image, a more accurate image of the object can be obtained. Also, even if a camera with a narrow angle of view is used, it is easy to acquire a continuous image of a moving object.

FIG. 6 relates to image acquisition of a new object sensed by a recognition sensor in a situation where multiple objects enter one after another.

Referring to the drawing, object 1 (J1) enters the observable region of an image acquisition device and moves through p1 and p2. As described above, while the camera is tracking object 1 (J1) along p1 and p2 and acquiring its image, another object 2 (J2) may newly enter the observable area of the image acquisition device.

That is, this is the case where another moving object is recognized through the recognition sensors while the camera is being controlled to face the existing object. In this case, the camera may be turned toward the newly entered object to capture its image for a predetermined time, after which the rotation drive module is controlled so that the camera faces the existing object again.

Referring again to FIG. 6, if object 2 (J2) is detected by recognition sensor s2 (object 2 at position p4) while object 1 (J1) is located at p2, the camera is turned to capture an image of object 2 for a predetermined time (object 2 meanwhile moving to position p5), and a control signal is then generated so that the camera faces object 1 (J1) again. The time for acquiring the image of object 2 (J2) may be set to several seconds, depending on the configuration.

Then, when object 1 (J1) leaves the observable area of the image acquisition device, if object 2 (J2) still exists in the observable area, the operation control module generates the control signal so that the camera faces object 2.
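The time-sliced handling of a second object can be pictured with a small scheduler sketch (illustrative only; the callback names and the glance duration are assumptions rather than the patent's own logic):

```python
import time

GLANCE_SECONDS = 3.0  # "several seconds" for a newly appeared object

def track_with_glances(aim_camera, existing_object, detect_new_object,
                       existing_still_visible, glance_seconds=GLANCE_SECONDS):
    """Keep the camera on the existing object; when the recognition sensors
    report a new object, glance at it briefly, then return to the existing
    object. All callbacks are supplied by the surrounding system."""
    while existing_still_visible():
        newcomer = detect_new_object()
        if newcomer is not None:
            aim_camera(newcomer)           # capture the newcomer briefly
            time.sleep(glance_seconds)
        aim_camera(existing_object)        # resume tracking the existing object
        time.sleep(0.2)                    # control loop period
    newcomer = detect_new_object()         # existing object has left the area
    if newcomer is not None:
        aim_camera(newcomer)               # stay on the newcomer if still present
```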

With this control method, images of many objects entering the observation area can be acquired. By capturing an image of a newly appearing object while the existing object is being filmed, the presence of the new object can be presented visually to the monitoring personnel, and the monitoring personnel may also take direct control and have the camera acquire images of object 2 instead of object 1.

The presence of a new object and its approximate orientation can be determined by analyzing the measured values of the recognition sensors for objects in the observable area of the image acquisition device. In addition, acquiring images of new objects allows them to be confirmed visually and kept on record, which improves the reliability of various analysis data such as object movement paths.

Figure 7 relates to a control system with a plurality of image acquisition devices.

Two or more image acquisition devices may be installed in the target area where the control system is to be deployed, spaced apart from one another at certain intervals. To secure images over a wide field of view, the cameras need to be installed above ground level, so image acquisition devices are mounted mainly on street lamps or utility poles. In general, the installation interval depends on the arrangement of buildings, structures, and the like in the target area. In some cases the image acquisition devices are arranged densely, so that the observable areas of neighboring devices partially overlap; in other cases they are arranged sparsely, so that a blind spot B may be formed between them.

In a control system having a plurality of image acquisition devices, the object information acquired by one image acquisition device can serve as the basis for controlling the direction of the camera of another, neighboring image acquisition device.

Specifically, when an object is recognized by the recognition sensors of one image acquisition device and a neighboring image acquisition device installed nearby does not recognize any object, the camera of the neighboring device is controlled to face the object based on the object information from the device that recognized it.

In FIG. 7, the object J moves from the upper right toward the middle of the drawing, passing through p1 and p2 and into the observable area of the central first image acquisition device 10a.

As described above, the recognition sensors of the first image acquisition device 10a detect the object, and the direction of the camera 11 is controlled so that the object is contained in the image.

The image and object information generated by the first image acquisition device 10a are transmitted to the control center, and the control center transmits to the neighboring image acquisition device 10b a control signal directing its camera toward the object J. Accordingly, the camera 11 of the neighboring device points at the object even before the recognition sensors of the image acquisition device 10b have detected it; when the object subsequently enters its own observation area, the device follows the object and acquires its image.

Since the camera of the neighboring image acquisition device 10b is pointed in advance at the object J that is likely to enter its observation area, an image of the object is acquired quickly once it enters. In other words, the time that would otherwise be needed for the camera to rotate toward the object after the recognition sensors detect it while the camera is pointing elsewhere is eliminated. From the viewpoint of the control center, a continuous image of the moving object can be acquired.

When the camera of a neighboring image acquisition device is to be controlled based on the object information of another image acquisition device, the control signal originates from the control center.

If the neighboring image acquisition device recognizes an object in its own observable area, the control center does not transmit a control signal to that device, so that the object in its own observable area is photographed preferentially.

Alternatively, the control center may transmit a control signal so that the neighboring image acquisition device always faces the object based on the object information whenever object information is generated in any one of the image acquisition devices; in that case, priority can be given to the control signal for photographing an object already present in the device's own observable area, so that such an object continues to be photographed.
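The control-center side of this hand-off can be sketched roughly as follows (the ControlCenter class, its neighbour table, and the device identifiers are illustrative assumptions): a device reporting object information causes idle neighbours to be pre-aimed, while neighbours already tracking something in their own area are left alone.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceState:
    has_own_object: bool = False       # its own sensors currently see an object
    camera_bearing_deg: float = 0.0

@dataclass
class ControlCenter:
    neighbours: dict[str, list[str]]   # device id -> ids of neighbouring devices
    devices: dict[str, DeviceState] = field(default_factory=dict)

    def on_object_info(self, source_id: str, bearing_deg: float) -> None:
        """Handle object information reported by an image acquisition device
        (or sensing unit): pre-aim idle neighbours toward the object."""
        for nid in self.neighbours.get(source_id, []):
            state = self.devices.setdefault(nid, DeviceState())
            if not state.has_own_object:
                state.camera_bearing_deg = bearing_deg
                # a real system would send the rotation control signal here
                print(f"pre-aim {nid} toward {bearing_deg:.0f} deg")

# Device 10a reports an object heading toward its neighbour 10b, which is idle.
cc = ControlCenter(neighbours={"10a": ["10b"]})
cc.devices["10b"] = DeviceState(has_own_object=False)
cc.on_object_info("10a", bearing_deg=135.0)
```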

FIG. 8, on the other hand, relates to a smart CCTV control system according to another embodiment of the present invention.

The configuration and operation of the image acquiring device provided in another embodiment of the present invention may be the same as the configuration and operation of the image acquiring device of the above-described embodiment within a range that does not conflict with the following description.

Referring to FIG. 8, a plurality of image acquisition devices 10 are spaced apart from each other, which leaves a blind spot, an empty space between their observable regions. In this blind spot, a sensing unit 30 that recognizes moving objects is located.

The sensing unit 30 includes a plurality of recognition sensors 13, arranged radially on a horizontal plane, which measure the azimuth of (or distance to) an object within a limited sensing angle range and a limited measurement distance, and a communication module that transmits the calculated object information to the control center or to an adjacent image acquisition device 10. The configuration and function of these recognition sensors and communication module are the same as those of the embodiment described above, so a duplicate description is omitted. The sensing unit also has power supply means for operating the recognition sensors and the communication module.

The sensing unit 30 is independent of the surrounding image acquisition devices: it has its own observable area, generates object information, and transmits it to the control center or to a nearby image acquisition device.

The sensing unit 30 is installed in a blind spot between sparsely installed image acquisition devices, thereby contributing to reducing the blind spot as much as possible in the target area occupied by the control system.

Since the sensing unit 30 has no camera, it cannot acquire an image of the detected object directly. However, when an object is detected by the recognition sensors of the sensing unit, its existence and approximate azimuth become known. The control center displays the existence of the object on a map, based on the map coordinates of the sensing unit and the received object information, and can identify the object and trace its movement route by linking the information from multiple image acquisition devices and sensing units.
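How the control center might place a sensing-unit detection on the map can be sketched as follows (a rough illustration with assumed names and an assumed plotting distance; the patent only states that the unit's map coordinates and the received object information are combined):

```python
import math

def object_map_position(unit_x: float, unit_y: float,
                        azimuth_deg: float, plot_range_m: float) -> tuple[float, float]:
    """Approximate map position of a detected object, given the sensing unit's
    map coordinates, the azimuth reported by its recognition sensors
    (0 deg = north, clockwise) and an assumed distance within the unit's
    limited sensing distance."""
    rad = math.radians(azimuth_deg)
    east = plot_range_m * math.sin(rad)
    north = plot_range_m * math.cos(rad)
    return (unit_x + east, unit_y + north)

# Sensing unit at (100, 200) on the map; object reported toward 45 deg,
# plotted at half of an assumed 30 m sensing distance.
print(object_map_position(100.0, 200.0, 45.0, 15.0))  # about (110.6, 210.6)
```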

When a sensing unit 30 senses an object and a neighboring image acquisition device is not aware of any object in its own observable area, the camera of the neighboring image acquisition device 10 may be controlled to face the object J based on the object information obtained by the sensing unit 30.

In a control system provided with both sensing units and image acquisition devices, any given object is located within the observation region of one of the sensing units or within the observation region of one of the image acquisition devices.

When an object enters the observation area of any one sensing unit or image acquisition device and the recognition sensors of that sensing unit or image acquisition device sense it, the camera of a neighboring image acquisition device can be controlled in advance to face the object, so that the image of the object is acquired quickly as it enters the neighboring device's area.

The smart CCTV control system according to the embodiments of the present invention can be constructed by additionally installing the recognition sensors, the communication module, the operation control module, the sensing units, and the like in an existing rotating CCTV camera control system. The cost of the additional components, including the recognition sensors, is low enough that the system can be deployed over a large area at low cost by incorporating them into an existing control system.

In particular, even when the installation interval of the cameras in the existing CCTV system is wide and effective monitoring of a wide area is not easy, a blind spot can be greatly reduced by additionally installing a sensing unit.

100: System
10, 10a, 10b: Image acquisition device
11: camera 111: angle of view 12: rotation drive module
13: Cognitive sensor 131: Sensing angle range 132: Sensing distance
14: Communication module 15: Operation control module
20: Control center
21: control center server 22: control center DB 23: monitor
T: Support B: Blind spot 30: Sensing unit J, J1, J2: Object

Claims (9)

A control system comprising at least one image acquisition device and a control center communicating with the image acquisition device to display an image obtained from the image acquisition device,
The image acquiring device includes:
A camera for acquiring an image;
A rotation driving module for rotating the camera in a forward direction or a reverse direction;
Recognition sensors disposed radially on a horizontal plane with respect to an installation point of the camera and measuring an azimuth angle with respect to an object moving within a limited sensing angle range and a limited sensing distance; And
A communication module for transmitting image information from the camera and object information calculated from the recognition sensor to the control center and receiving a control signal for controlling operation of the rotation drive module from the control center;

wherein the recognition sensors are Doppler sensors or PIR sensors, at least three of which are arranged radially on a horizontal plane,
and the left 1/3 portion and the right 1/3 portion of the sensing angle range of any one recognition sensor are arranged to overlap the sensing angle ranges of the recognition sensors adjacent to its left and right sides, respectively,

wherein the image acquisition device or the control center includes an operation control module that controls the rotation drive module to rotate the camera toward the object sensed by the recognition sensors,

and wherein the operation control module calculates the orientation of the object according to whether a single recognition sensor perceives the object by itself or two adjacent recognition sensors perceive the object simultaneously, and transmits the control signal based on the calculated orientation.
A smart CCTV control system.
(Claims 2 to 4 deleted)
The system of claim 1,
wherein, when another object is recognized through the recognition sensors while the camera is being controlled to face the object,
the operation control module rotates the camera toward the other object, captures an image of the other object for a predetermined time, and then controls the rotation drive module to rotate back toward the existing object.
A smart CCTV control system.
The system of claim 1,
wherein a plurality of the image acquisition devices are provided spaced apart from each other,
and when a moving object is recognized by the recognition sensors provided in any one of the image acquisition devices and no object is recognized in the neighboring image acquisition devices adjacent to that device,
the camera of the neighboring image acquisition device is controlled to face the object based on the object information of the image acquisition device that recognized the moving object.
A smart CCTV control system.
The system of claim 6,
wherein a control signal for rotating the camera of the neighboring image acquisition device is transmitted from the control center.
A smart CCTV control system.
The system of claim 1,
wherein a plurality of the image acquisition devices are arranged so as to be spaced apart from each other,
a sensing unit that recognizes objects is positioned in a blind spot between neighboring image acquisition devices,
and the sensing unit includes:
a plurality of recognition sensors arranged radially on a horizontal plane and measuring an azimuth angle with respect to a moving object within a limited sensing angle range and a limited sensing distance, and
a communication module for transmitting the object information calculated from the recognition sensors to the control center or to an adjacent image acquisition device.
A smart CCTV control system.
The system of claim 8,
wherein, when a moving object is recognized by any one of the image acquisition devices or sensing units, and no object is recognized in the image acquisition device adjacent to that image acquisition device or sensing unit,
the rotation drive module of the neighboring image acquisition device is controlled so that its camera faces the object,
based on the object information of the image acquisition device or the object information of the sensing unit that recognized the moving object.
A smart CCTV control system.
KR1020150118650A 2015-08-24 2015-08-24 Smart CCTV control system KR101600314B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150118650A KR101600314B1 (en) 2015-08-24 2015-08-24 Smart CCTV control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150118650A KR101600314B1 (en) 2015-08-24 2015-08-24 Smart CCTV control system

Publications (1)

Publication Number Publication Date
KR101600314B1 true KR101600314B1 (en) 2016-03-07

Family

ID=55540344

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150118650A KR101600314B1 (en) 2015-08-24 2015-08-24 Smart CCTV control system

Country Status (1)

Country Link
KR (1) KR101600314B1 (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007323563A (en) * 2006-06-05 2007-12-13 Akira Kawada Object-detecting and automatic tracking device
KR100980586B1 (en) * 2010-05-07 2010-09-06 주식회사 에스엘티 Method for intelligent image security using single or multi camera and system thereof
JP2011258031A (en) * 2010-06-09 2011-12-22 Dx Antenna Co Ltd Tracker for monitoring camera
KR20140058192A (en) 2012-11-06 2014-05-14 에스케이텔레콤 주식회사 Control image relocation method and apparatus according to the direction of movement of the object of interest
KR101340897B1 (en) 2013-04-22 2013-12-13 주식회사 사라다 The anticrime system in a school zone
KR101328246B1 (en) 2013-05-15 2013-11-14 강성진 Apparatus for tracking of moving target and the method thereof
KR101425505B1 (en) * 2013-10-25 2014-08-13 홍승권 The monitering method of Intelligent surveilance system by using object recognition technology

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101786525B1 (en) * 2017-06-14 2017-10-18 주식회사 신아시스템 System for saving power and controling operation using human body detection sensor with 360 degree
EP4296985A1 (en) * 2022-06-15 2023-12-27 Arlo Technologies, Inc. Electronic monitoring system with activity zone alignment tool

Similar Documents

Publication Publication Date Title
EP2710801B1 (en) Surveillance system
US7535353B2 (en) Surveillance system and surveillance method
US5980123A (en) System and method for detecting an intruder
EP2779130B1 (en) GPS directed intrusion system with real-time data acquisition
US20130208123A1 (en) Method and System for Collecting Evidence in a Security System
US8717439B2 (en) Surveillance system and method
CN106657921A (en) Portable radar perimeter security and protection system
EP3452848B1 (en) Monitoring method using a camera system with an area movement detection
KR101275297B1 (en) Camera Apparatus of tracking moving object
CN103179342A (en) Monitoring camera and method for monitoring
KR20150060626A (en) Active Type Unmanned Security System
CN103592901A (en) Holder control airport monitoring system
KR101600314B1 (en) Smart CCTV control system
KR101648292B1 (en) Unmanned monitoring system apparatus
RU2542873C1 (en) System for technical surveillance of protected area
KR20170112714A (en) System for automatic monitoring of distributed area using flying photographing unit
EP3891711B1 (en) Method of optical alignment and verification of field of view integrity for a flame detector and system
WO2020135519A1 (en) Mobile detection device
US20210248384A1 (en) Building evacuation method and building evacuation system
JP7176868B2 (en) monitoring device
CN113068000A (en) Method, device, equipment and system for monitoring video target and storage medium
KR101870151B1 (en) Enforcement system of illegal parking
KR20230089493A (en) Multi-camera fire detector
RU2563557C2 (en) Multispectral system and method for electro-optical surveillance of protected area
KR100452092B1 (en) Unidentified People Tracking Device Using Dual Cameras And Method Thereof

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20181210

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20191212

Year of fee payment: 5