KR20110136907A - Wide area surveillance system and monitoring data processing method in the same - Google Patents

Wide area surveillance system and monitoring data processing method in the same Download PDF

Info

Publication number
KR20110136907A
Authority
KR
South Korea
Prior art keywords
image
camera
surveillance
wide
control device
Prior art date
Application number
KR1020100056806A
Other languages
Korean (ko)
Inventor
이지환
Original Assignee
주식회사 영국전자
Priority date
Filing date
Publication date
Application filed by 주식회사 영국전자 filed Critical 주식회사 영국전자
Priority to KR1020100056806A
Publication of KR20110136907A

Links

Images

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 - Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608 - Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19617 - Surveillance camera constructional details
    • G08B13/19626 - Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses
    • G08B13/19628 - Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses of wide angled cameras and camera groups, e.g. omni-directional cameras, fish eye, single units having multiple cameras achieving a wide angle view
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 - Details of the system layout
    • G08B13/19645 - Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

Disclosed is a surveillance system that can monitor a wide surveillance target area while reducing the computing burden of the remote surveillance unit and the bandwidth of the lines used for signal transmission from multiple cameras to the remote surveillance unit, thereby reducing line installation and maintenance costs, and that can reproduce the path of a moving object in the surveillance area and its focused surveillance image as needed. The surveillance system of the present invention comprises: a remote control device; a plurality of master cameras, each of which is connected to the remote control device, acquires a wide area image of a surveillance area that is at least a part of the area to be monitored, detects a moving object in the surveillance area, transmits moving-object detection information to the remote control device, and stores the wide area image and a focused surveillance image; and a plurality of slave cameras, each of which is connected to any one of the plurality of master cameras and acquires the focused surveillance image for a portion of the surveillance area.


Description

Wide Area Surveillance System and Monitoring Data Processing Method in the Same

The present invention relates to a surveillance system, and more particularly, to a closed-circuit television (CCTV) surveillance system. The present invention also relates to a monitoring data processing method in such a system.

In general, a CCTV surveillance system includes a camera for photographing an area to be monitored, and a remote surveillance unit connected to the camera. Typically, the camera is controlled by the remote monitoring unit, and the image taken by the camera is transmitted to the remote monitoring unit, displayed on the monitor of the remote monitoring unit, and stored in the storage device.

Although various kinds of surveillance cameras are in use, one of the most widely used at the time of this application is the fixed camera with a fixed focal length. A fixed camera, however, has a narrow viewing angle, and its photographing range is limited to a narrow area determined by the monitoring direction initially set by the operator.

Pan-tilt-zoom (PTZ) cameras, which are capable of horizontal rotation (panning), vertical rotation (tilting), and zoom in/zoom out, are also widely used. Because PTZ cameras can be remotely rotated horizontally and vertically and zoomed in and out, the surveillance area can be changed according to instructions from the remote monitoring unit, or a specific target can be tracked and monitored intensively. However, even a PTZ camera has a limited lens viewing angle, so blind spots remain in which shooting is impossible outside the current camera direction. In particular, when the lens is zoomed in and the panning and tilting mechanisms are driven to track a specific object, the blind spots become wider because it is impossible to monitor any area other than the vicinity of the tracked object.

As a method for extending the surveillance range, the use of a panoramic camera (also referred to as a wide-angle camera or an omnidirectional camera) employing a wide-angle lens such as a fisheye lens has also been proposed. Korean Patent No. 663483 (title of the invention: unmanned monitoring method and apparatus using an omnidirectional camera) and Korean Patent Publication No. 2009-15311 (title of the invention: video surveillance system) are examples. However, the image captured by a fisheye lens camera is circular and severely distorted, and tracking and monitoring of moving objects is made more difficult by the curvilinear image properties. Fisheye cameras are therefore useful for viewing the overall context, but not for intensive monitoring of moving objects.

Accordingly, surveillance systems combining a wide-range surveillance camera and a focused surveillance camera have recently been spreading. For example, Korean Patent Laid-Open Publication No. 2005-0103597 (title of the invention: surveillance system using real-time panoramic video images and method of controlling the system) describes a system that selects a specific portion of the panoramic images obtained by a plurality of component cameras and controls a PTZ camera to photograph the selected portion. With such a system, omnidirectional monitoring can be performed by the panoramic camera, and when motion is captured, the moving object can be tracked and monitored by the PTZ camera.

In an existing surveillance system combining a panoramic camera and a PTZ camera, the PTZ camera is controlled entirely either by the operator manipulating an input device at the remote monitoring unit or by a computer program executed at the remote monitoring unit. If the PTZ camera is controlled manually by the operator, sufficient manpower must be constantly on duty. On the other hand, when the PTZ camera is controlled by motion detection software and/or hardware, the data processing burden on the remote monitoring unit becomes large, and a system with a large number of cameras places an even greater computing burden on it.

Moreover, since the video signal obtained by each camera is transmitted to the remote monitoring unit through a separate line, line laying and maintenance costs increase with the number of cameras. Even where line bandwidth is sufficiently wide, separate multiplexer equipment must be installed where the lines converge in order to transmit the video signals of several cameras over fewer lines than the number of cameras, and a demultiplexer must be provided at the remote monitoring unit, which increases system complexity.

On the other hand, although a variety of moving objects can appear in the images obtained by the cameras, many applications do not need immediate tracking and an intensive surveillance image for most of them. Such applications include internal monitoring systems for business or commercial buildings, in which the moving objects are often resident employees or regular customers. In such a system, the image of a moving object may be needed only afterwards, for example to identify the cause or the object after an accident or problem has occurred. Therefore, it is not necessary to display the focused surveillance images of all moving objects on the monitor of the remote monitoring unit; it is sufficient to be able to reproduce the path and the image of an object of interest when necessary.

The present invention is intended to solve these problems. Its technical object is to provide a surveillance system that can monitor a wide surveillance target area, reduce the computing burden of the remote monitoring unit, and reduce the bandwidth of the lines used for signal transmission from multiple cameras to the remote monitoring unit, thereby reducing line installation and maintenance costs, while allowing the path and the focused surveillance image of a moving object that has appeared in the monitored area to be reproduced as needed.

Another technical object of the present invention is to provide a monitoring data processing method which, in a wide area surveillance system having a plurality of cameras and a remote monitoring unit, can reduce the computing burden of the remote monitoring unit and the bandwidth of the lines for signal transmission from the plurality of cameras to the remote monitoring unit, thereby reducing line laying and maintenance costs, and which allows the path and the focused surveillance video of a moving object in the surveillance area to be reproduced as needed.

The surveillance system of the present invention for achieving the above technical object comprises:

a remote control device;

a plurality of master cameras, each of which is connected to the remote control device, acquires a wide area image of a surveillance area that is at least a part of the area to be monitored, detects a moving object in the surveillance area, transmits moving-object detection information to the remote control device, and stores the wide area image and a focused surveillance image; and

a plurality of slave cameras, each of which is connected to any one of the plurality of master cameras and acquires the focused surveillance image for a portion of the surveillance area.

Preferably, each of the plurality of slave cameras is capable of pan/tilt/zoom driving, and the master camera controls the slave camera to photograph the moving object, based on the moving-object detection information, without the help of the remote control device.

At least some of the plurality of master cameras may include a wide-angle lens so as to obtain a wide-angle image as the wide area image.

Each of the plurality of master cameras preferably provides the centralized monitoring image to the remote control device only when there is a request from the remote control device.

Each of the plurality of master cameras may continuously provide the centralized monitoring image to the remote control device together with the object log data.

Meanwhile, the monitoring data processing method of the present invention for achieving the other technical object above is implemented in a surveillance system having a remote control device, a master camera which is connected to the remote control device and acquires a wide area image of a surveillance area that is at least a part of the area to be monitored, and at least one slave camera which is connected to the master camera and acquires a focused surveillance image.

First, the master camera acquires the wide area image and receives the focused surveillance image from the slave camera. The master camera detects a moving object from the wide area image or the centralized monitoring image, generates object log data including position information and identification information of the moving object, and transmits the object log data to the remote control device. The concentrated monitoring image is stored in a storage device. Finally, in response to the request of the remote control device, the master camera transmits the stored image to the remote control device.
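The object log data referred to above only needs to carry identification information and position information of the detected object. The sketch below is a minimal illustration of such a record and of how it might be serialized for transmission to the remote control device; the field names and the JSON encoding are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class ObjectLogEntry:
    """One object-log record for a detected moving object (illustrative fields)."""
    object_id: int      # identification information (ID) assigned by the master camera
    sector_code: int    # position information: image sector within the wide area image
    heading: str        # direction of movement, e.g. "NE"
    event: str          # "appeared", "moved" or "left"
    timestamp: float    # detection time

# Example: a record the master camera could push to the remote control device.
entry = ObjectLogEntry(object_id=7, sector_code=12, heading="NE",
                       event="appeared", timestamp=time.time())
print(json.dumps(asdict(entry)))   # compact text payload suitable for a serial link
```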

According to the present invention, each multifunctional master camera directly detects motion in its wide area image and transmits detection information, or appearance information, of the moving object to the remote monitoring/control device. The remote monitoring/control device may control other multifunctional master cameras according to this detection or appearance information, so that the multifunctional master cameras operate in conjunction with one another. Each master camera may control its focused surveillance camera to track and monitor the moving object, and may store the focused surveillance image together with the wide area image in a storage device provided in the camera. The remote monitoring/control device can reconstruct the path of the moving object in the monitored area based on the detection or appearance information, and reproduce the focused surveillance image from the image data stored in the multifunctional master camera.

Because motion detection, control of the focused surveillance cameras, and video storage are performed by the multifunctional master camera, a wide surveillance target area can be monitored while the computing burden of the remote monitoring/control device is reduced, and the bandwidth of the lines for signal transmission from each camera to the remote monitoring/control device, and thus line laying and maintenance costs, are reduced as well. In addition, monitoring can be recorded without interruption even if an abnormality occurs in the remote monitoring/control device, and the remote monitoring/control device can be operated by a small number of personnel.

Furthermore, since the master camera may be provided in plural rather than as a single unit, the system size can easily be extended to match the area of the monitored region.

Hereinafter, exemplary embodiments of the present invention will be described with reference to the accompanying drawings. For convenience, the same reference numerals will be used for the same or corresponding members in the drawings. In the drawings,
1A is a schematic block diagram of an embodiment of a video surveillance system according to the present invention;
1B is a diagram showing an arrangement of cameras in a video surveillance system according to the present invention;
2 is a detailed block diagram of one embodiment of the master camera shown in FIG. 1;
3 is a detailed block diagram of one embodiment of the slave camera shown in FIG. 1;
4 is a diagram illustrating an example of a method of converting a wide-angle image into a rectangular panoramic image;
5 shows configuration examples of an output image;
6 is a flowchart showing an operation process of the master camera shown in FIG. 1;
7 is a diagram illustrating a state where a wide-angle image is divided into a plurality of sectors for motion detection and slave motor control;
8 is a schematic diagram illustrating an embodiment of a motion object detection process;
9 is a schematic diagram illustrating another embodiment of a motion object detection process;
10 is a schematic diagram illustrating yet another embodiment of a motion object detection process;
FIG. 11 is a diagram for explaining a method of determining a tilting angle value when a wide-angle lens has orthogonal refraction characteristics;
12 is a view for explaining a method of determining a tilt angle value when a wide-angle lens has an equidistant projection refractive characteristic;
13 is a diagram illustrating an example of an output image in which a motion object pointer is added to a panoramic image portion;
14 shows an example of a tracking table for storing movement history information for each moving object;
15 is a view for explaining a process of controlling a slave camera when a new moving object is detected in the wide-angle image;
FIG. 16 is a diagram for describing a process of controlling a slave camera when an additional moving object is detected in a wide angle image;
FIG. 17 illustrates a process of controlling a slave camera when a plurality of moving objects are located in the same image sector;
18 is a view for explaining a process of controlling a slave camera when the moving objects in the same image sector are separated again;
19 illustrates an example of an output image in which a different type of pointer is added to the motion objects when a plurality of motion objects exist in the panoramic image;
20 is a diagram for describing a master camera control process according to movement of a moving object in the example of the camera arrangement of FIG. 1B;
21 is a block diagram of a modified embodiment of the master camera shown in FIG. 2;
FIG. 22 is a block diagram of another modified embodiment of the master camera shown in FIG. 2;
23 to 30 show installation examples of the master camera and the slave camera;
31 is a block diagram of another embodiment of a video surveillance system according to the present invention;
32 is a detailed block diagram of an embodiment of the camera device shown in FIG. 31;
33 is a view for explaining a panorama image composition method by the panorama image configuration unit shown in FIG. 32;
34 is a perspective view of one embodiment of the camera device of FIG. 32;
FIG. 35 is a detailed block diagram of another embodiment of the camera device shown in FIG. 31;
FIG. 36 is a view for explaining a panorama image construction method by the panorama image construction unit shown in FIG. 35;
FIG. 37 is a side view of an embodiment of the camera device of FIG. 35;
38 is a bottom perspective view of another embodiment of the camera device of FIG. 35; And
39 is a bottom perspective view of yet another embodiment of the camera device of FIG. 35.

FIG. 1A shows an embodiment of a video surveillance system according to the present invention. The video surveillance system according to the present embodiment includes a multifunctional master camera 10 (hereinafter referred to as a "master camera"), a slave camera 20, and a remote monitoring/control device 40. In a preferred embodiment, the master camera 10 is a fisheye lens camera or a panoramic camera that captures the surveillance area in all directions, but the present invention is not limited thereto. Likewise, in the preferred embodiment, the slave camera 20 is a pan-tilt-zoom (PTZ) camera capable of horizontal rotation, vertical rotation, and zoom in/zoom out, but the present invention is not limited thereto, and other types of cameras such as fixed cameras may be used as the slave camera 20.

In FIG. 1A, for simplicity, only one master camera 10 is shown connected to the remote monitoring/control device 40, and only one slave camera 20 is shown connected to the master camera 10; the video surveillance system may include a greater number of cameras 10 and 20. FIG. 1B illustratively shows an actual arrangement of cameras in the video surveillance system according to the present invention.

In FIG. 1B, five master cameras 10a to 10e are installed in the monitored region. Each of the master cameras 10a to 10e captures an image of one cell, that is, one surveillance area into which the monitored region is spatially divided, and all of them may be connected to the remote monitoring/control device 40. In addition, one or more slave cameras 20 may be connected to each of the master cameras 10a to 10e. In FIG. 1B, two slave cameras each (20a-20b, 20c-20d, 20e-20f, 20g-20h, 20i-20j) are connected to the master cameras 10a to 10e, respectively.

As will be described later, some of the master cameras 10a to 10e may be integrated with any one of the slave cameras connected thereto. Therefore, it should be noted that in the present specification, including the claims, 'master camera' and 'slave camera' are distinguished from a functional point of view and do not necessarily have to be manufactured as separate devices.

Referring back to FIG. 1A, the master camera 10 photographs its own surveillance area, detects a moving object from the captured image (hereinafter referred to as the 'wide area image'), and transmits object log data indicating the motion detection information to the remote monitoring/control device 40. In addition, the master camera 10 may control the slave camera 20 to monitor intensively the spot where the moving object is detected. The master camera 10 combines the focused surveillance image acquired by the slave camera 20 with the wide area image and stores the combined image in its image storage unit. In response to a request from the remote monitoring/control device 40, the master camera 10 may provide the stored image to the remote monitoring/control device 40.

In the most preferred embodiment, the master camera 10 provides the object log data to the remote monitoring / control device 40 in a normal operation but does not provide an output image. However, in a modified embodiment, the master camera 10 may provide the combined image to the remote monitoring / control device 40 as an output image. However, for convenience of description, the following description focuses on an embodiment in which the master camera 10 provides an output image to the remote monitoring / control device 40 together with object log data in a normal operation process.

The slave camera 20 obtains the centralized monitoring image under the control of the master camera 10 and outputs it to the master camera 10. On the other hand, in the preferred embodiment, the centralized video signal supplied from the slave camera 20 to the master camera 10 is a composite video signal conforming to the NTSC standard. However, the centralized monitoring video signal is not limited thereto, and may be a signal conforming to PAL, SECAM, or other standard, or may be a digital signal.

As mentioned above, the slave camera 20 may be a PTZ camera. However, in a variant embodiment, the slave camera 20 may be a combination of multiple stationary cameras; even in such a case, it is preferable that each fixed camera be capable of zooming in and out. The following description focuses on an embodiment in which the slave camera 20 is a PTZ camera. The slave camera 20 receives control signals from the master camera 10 by serial communication, and its horizontal rotation, vertical rotation, and zoom in/zoom out are controlled accordingly.

The remote monitoring/control device 40 is installed at a location remote from the master camera 10 and includes a data processing unit 42, a monitor 44, and an input unit 46. The data processor 42 receives the object log data representing motion detection information from the master camera 10 and stores it in a storage unit (not shown). In one embodiment, the data processor 42 may receive the output image from the master camera 10 and display it on the monitor 44. In addition, the data processor 42 may control the master camera 10 according to user input received through the input unit 46, or control the slave camera 20 via the master camera 10. In particular, the data processor 42 may request from the master camera 10 an image related to a specific moving object according to user input. The data processor 42 may be implemented by a general PC, and may further include a matrix, a screen splitter, an image distribution amplifier, and the like for processing the displayed image. The input unit 46 may be a keyboard, a mouse, a joystick input device, or a combination thereof.

FIG. 2 shows in detail an embodiment of the master camera 10 shown in FIG. 1A. For convenience, FIG. 2 is drawn on the assumption that two slave cameras 20a and 20b are connected.

In the illustrated embodiment, the master camera 10 includes a wide-angle imaging unit 110 and a control/signal processing circuit 120. In addition, the master camera 10 is equipped with video signal input terminals 150a and 150b for receiving video signals from the slave cameras 20, a first serial port 152 for supplying control signals to the slave cameras 20, a second serial port 160 for receiving camera control signals from the remote monitoring/control device 40, and a video signal output terminal 162 for supplying a video signal to the remote monitoring/control device 40.

In the wide-angle imaging unit 110, the wide-angle lens 112 is an optical structure composed of a 180-degree fisheye lens, a 360-degree reflector, or a combination of lenses or mirrors, and condenses light incident omnidirectionally from the monitored area to form an image on the image sensor 114. The image sensor 114 includes a complementary metal oxide semiconductor (CMOS) or charge coupled device (CCD) sensor and an analog-to-digital converter, converts the light collected by the wide-angle lens 112 into an electrical signal, and digitizes it to output a digital wide-angle video signal. The wide-angle image acquired by the wide-angle imaging unit 110 is circular or annular.

In the control / signal processing circuit 120, the motion detector 122 receives the wide-angle image signal from the wide-angle image pickup unit 110 and compares the wide-angle image in units of frames to determine whether a motion object exists in the image. Here, the frames to be compared may be consecutive frames or frames separated in time by a plurality of frame periods. A detailed motion detection algorithm will be described later.

The panoramic image configuring unit 124 converts the wide-angle image from the wide-angle imaging unit 110 into a rectangular panoramic image. FIG. 4 shows an example of a method of converting a wide-angle image into a rectangular panoramic image. In the illustrated example, the panoramic image configuring unit 124 divides the circular or annular wide-angle image 200 into an upper region and a lower region, expands each region, and fills the empty pixels by interpolation to convert it into a rectangular image. In this case, in order to reduce the computational burden of interpolation, the central portion of the circular wide-angle image, which carries little information, may be excluded from the conversion.

In the following description, the image 212 obtained by converting the lower region of the wide-angle image 200 into a rectangle is referred to as the 'front panoramic image', and the image 214 obtained by converting the upper region of the wide-angle image 200 into a rectangle is referred to as the 'rear panoramic image'. The image 210 formed by joining the front panoramic image 212 and the rear panoramic image 214 side by side will be referred to as the 'panoramic image'. In the drawing, points P1 to P4 are marked to indicate corresponding points of the wide-angle image 200, the front panoramic image 212, and the rear panoramic image 214. The width of the front panoramic image 212 and the rear panoramic image 214 may be equal to the width of the focused surveillance image from the slave camera 20; in a modified example, the width of the panoramic image 210 may be equal to the width of the focused surveillance image.
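The conversion described above amounts to resampling the circular image along radius and azimuth. The NumPy sketch below is one possible way to unwrap each half into a rectangular strip; it is only a hedged illustration of the idea, not the patented implementation, and it uses nearest-neighbour sampling where the text mentions interpolation.

```python
import numpy as np

def unwrap_half(wide_img, r_min_frac=0.2, out_h=120, out_w=720, lower_half=True):
    """Unwrap one half of a circular wide-angle image into a rectangular strip.

    r_min_frac drops the low-information centre of the circular image, as the
    text suggests; output size and sampling scheme are illustrative assumptions.
    """
    h, w = wide_img.shape[:2]
    cy, cx, radius = h / 2.0, w / 2.0, min(h, w) / 2.0
    # Azimuth range: lower half (front panorama) or upper half (rear panorama).
    theta0, theta1 = (0.0, np.pi) if lower_half else (np.pi, 2 * np.pi)
    thetas = np.linspace(theta0, theta1, out_w)
    radii = np.linspace(r_min_frac * radius, radius, out_h)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    ys = np.clip((cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip((cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return wide_img[ys, xs]

# Example with a synthetic 480x480 "wide-angle" frame:
frame = np.random.randint(0, 256, (480, 480), dtype=np.uint8)
front = unwrap_half(frame, lower_half=True)        # front panoramic image
rear = unwrap_half(frame, lower_half=False)        # rear panoramic image
panorama = np.concatenate([front, rear], axis=1)   # side-by-side panoramic image
```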

Referring to FIG. 2 again, the first slave camera 20a is connected to the video signal input terminal 150a through a coaxial cable, and the second slave camera 20b is connected to the video signal input terminal 150b through a coaxial cable. The serial communication units of the slave cameras 20a and 20b are commonly connected to the first serial port 152 of the master camera 10. When transmitting control signals, the serial communication unit 132 of the master camera 10 can send a control signal selectively to each slave camera 20a, 20b, either by time division multiplexing or by specifying the ID or address of the slave camera.

The first signal converter 126 converts the centralized monitoring video signal received through the video signal input terminal 150 into a digital video signal. That is, the first signal converter 126 converts the centralized monitoring video signal in the form of a composite video signal into a digital YCbCr or RGB component video signal in the same manner as the wide-angle video signal.

The image storage unit 128 includes a storage medium such as a hard disk or a solid state drive (SSD), and stores the digital focused surveillance video signal and the wide-angle video signal. In a preferred embodiment, the image storage unit 128 stores the wide-angle image and the focused surveillance image for a certain period of time, for example two weeks, one month, or two months; when the end of the storage medium is reached, storage resumes from the initial position, overwriting the previously saved video. The image storage unit 128 may store a compressed video signal instead of the original video signal. In such an embodiment, the image storage unit 128 further includes a compression/decompression unit for compressing the original video signal or restoring the compressed video signal, which may be implemented by a computer program running on the microprocessor that implements the controller 130.
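The "wrap around and overwrite" retention policy described above behaves like a ring buffer over the storage medium. The snippet below is a minimal in-memory sketch of that policy, assuming a fixed frame capacity; a real image storage unit would of course write to the HDD/SSD rather than keep frames in memory.

```python
import collections

class RingImageStore:
    """Fixed-capacity store that overwrites the oldest frames once full,
    mimicking the overwrite-from-the-beginning policy of the image storage
    unit. Capacity and in-memory layout are assumptions for illustration."""
    def __init__(self, capacity_frames):
        self.buf = collections.deque(maxlen=capacity_frames)

    def append(self, timestamp, wide_frame, focused_frame):
        self.buf.append((timestamp, wide_frame, focused_frame))

    def read_range(self, t_start, t_end):
        return [item for item in self.buf if t_start <= item[0] <= t_end]

store = RingImageStore(capacity_frames=100)
for t in range(250):                       # more frames than capacity...
    store.append(t, b"wide", b"focused")
print(len(store.buf), store.buf[0][0])     # ...only the newest 100 remain (oldest t = 150)
```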

The controller 130 controls the overall operation of the master camera 10, detects a moving object within the wide area image, that is, the wide-angle image, and transmits object log data indicating the motion detection information to the remote monitoring/control device 40 through the serial communication unit 132. In addition, the controller 130 controls the pan-tilt-zoom operation of the slave camera 20 so that the point where the moving object is detected is photographed intensively. The basic control operation of the controller 130 is performed by a computer program, but it may also be performed in response to a camera control signal received from the remote monitoring/control device 40.

The controller 130 may control the image combination unit 134 to change the configuration of the output image according to the camera control signal. In addition, the controller 130 allows the video signal stored in the image storage unit 128 to be read and provided to the remote monitoring/control device 40 as directed by the camera control signal. At this time, in response to the request of the remote monitoring/control device 40, the controller 130 may have the wide-angle image and the focused surveillance image stored in the image storage unit 128 combined by the image combination unit 134 and transmitted to the remote monitoring/control device 40, or may transmit only the wide-angle image or only the focused surveillance image to the remote monitoring/control device 40 through the second signal converter 136.

The serial communication unit 132 allows the controller 130 to communicate with the slave camera 20 through the first serial port 152 and with the remote monitoring/control device 40 through the second serial port 160. That is, the controller 130 may transmit control signals, including PTZ control signals, to the slave camera 20 through the serial communication unit 132 and receive status information from the slave camera 20. In addition, the controller 130 may transmit object log data or status information of the master camera 10 or the slave camera 20 to the remote monitoring/control device 40 through the serial communication unit 132 and receive camera control signals. The connection between the serial communication unit 132 and the slave camera 20 or the remote monitoring/control device 40 can be made according to the RS232C, RS422, or RS485 standard.

The image combination unit 134 selects one or more of the wide-angle image 200, the front panoramic image 212, the rear panoramic image 214, the panoramic image 210, and the first and second focused surveillance images to configure the output image.

FIG. 5 shows examples of the configuration of the output image generated by the image combination unit 134. In the basic output screen, the panoramic image 210 is disposed at the bottom of the screen, and the focused surveillance image 220 obtained by the slave camera 20 is disposed in the upper and center region of the screen (upper left portion of FIG. 5). While the output image is transmitted to the remote monitoring/control device 40 and displayed on the monitor 44, the operator may change the configuration of the output image by operating the input unit 46. For example, when the operator selects the focused surveillance image 220 part of the output image, the remote monitoring/control device 40 transmits a control signal requesting a configuration change of the output image, and the controller 130, receiving this signal through the serial communication unit 132, causes the image combination unit 134 to include only the focused surveillance image 220 in the output image (upper right portion of FIG. 5). Similarly, when the operator selects the panoramic image 210 part of the output image, the image combination unit 134 combines only the front panoramic image 212 and the rear panoramic image 214 to form the output image (lower right portion of FIG. 5). In this state, when the operator selects an arbitrary point in the output image, the image combination unit 134 places only the wide-angle image 200 in the output image (lower left portion of FIG. 5). Finally, the operator can return to the basic output screen by pressing a particular key (e.g., the ESC key). Meanwhile, the output image may also be switched automatically in a specific sequence according to moving-object detection events.
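As an illustration of the basic output screen only, the sketch below pastes a panoramic strip along the bottom of an output canvas and a focused surveillance image in the upper/centre region. The canvas and image sizes are assumptions; the actual image combination unit 134 may compose the screen differently.

```python
import numpy as np

def compose_basic_output(panorama, focused, out_h=480, out_w=720):
    """Basic output screen of FIG. 5: panoramic image along the bottom,
    focused surveillance image in the upper/centre region (sizes illustrative)."""
    canvas = np.zeros((out_h, out_w), dtype=np.uint8)
    ph = panorama.shape[0]
    canvas[out_h - ph:, :panorama.shape[1]] = panorama          # bottom strip
    fh, fw = focused.shape
    x0 = (out_w - fw) // 2
    canvas[0:fh, x0:x0 + fw] = focused                          # upper centre
    return canvas

pano = np.full((120, 720), 40, dtype=np.uint8)    # stand-in panoramic image 210
foc = np.full((320, 400), 200, dtype=np.uint8)    # stand-in focused image 220
out = compose_basic_output(pano, foc)
print(out.shape)
```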

Referring back to FIG. 2, the second signal converter 136 generates a composite video signal for the output image configured by the image combination unit 134 and transmits it to the remote monitoring/control device 40 through the video signal output terminal 162. Accordingly, the output image configured by the image combination unit 134 can be displayed on the monitor 44 of the remote monitoring/control device 40.

In the master camera 10 shown in FIG. 2, all components other than the wide-angle imaging unit 110 may be implemented in hardware, but some of them may be implemented in software. In addition, although all components including the wide-angle imaging unit 110 may be housed in one housing, the present invention is not limited thereto, and the components may be divided between two or more housings. Even in such a case, it is preferable that the housings be placed close together or at the same geographical site so that the components installed in the plural housings can exchange signals without requiring a separate telecommunication standard or communication protocol.

3 shows an embodiment of a slave camera 20 in detail.

The slave camera 20 includes a focusing lens 170, an image sensor 172, a signal converter 176, a panning motor 180, a panning motor driver 182, a tilting motor 184, a tilting motor driver 186, a zoom motor 188, a zoom motor driver 190, a controller 192, and a serial communication unit 194. In addition, the slave camera 20 has a video signal output terminal 196 for transmitting the acquired focused surveillance image to the master camera 10, and a serial port 198 for receiving control signals from the master camera 10 and transmitting camera status information.

The focusing lens 170 collects light incident from the front side. The image sensor 172 includes a CMOS or CCD sensor and an analog-to-digital converter, converts the light collected by the focusing lens 170 into an electrical signal, and digitizes it to output a digital focused surveillance video signal. The signal converter 176 generates a composite video signal for the focused surveillance image from the digital focused surveillance video signal and outputs it through the video signal output terminal 196.

The panning motor driver 182 drives the panning motor 180 under the control of the controller 192 to rotate the camera structure including the focusing lens 170 and the image sensor 172 in the horizontal direction. The tilting motor driver 186 drives the tilting motor 184 under the control of the controller 192 to rotate the camera structure in the vertical direction. The zoom motor driver 190 drives the zoom motor 188 under the control of the controller 192, thereby varying the focal length of the focusing lens 170 and implementing the zoom in/zoom out function.

The controller 192 drives the panning motor driver 182, the tilting motor driver 186, and the zoom motor driver 190 according to control signals received from the master camera 10 through the serial communication unit 194. When the control signal from the master camera 10 is generated based on detection of a moving object, the slave camera 20 can obtain a tracking surveillance image of the moving object. On the other hand, when a camera control signal from the remote monitoring/control device 40 is relayed by the master camera 10, the controller 192 drives the motors according to that signal so that the area of interest to the operator is photographed. Each motor 180, 184, 188 is preferably implemented by a stepping motor. Meanwhile, the controller 192 preferably resets each motor to its initial position periodically or aperiodically so that the direction of each motor is registered correctly.

The panning/tilting control signal provided by the master camera 10 to the slave camera 20 may be a value indicating a specific panning and tilting angle. In a modified embodiment, however, the PTZ control signal may be provided in the form of a preset code. In such an embodiment, a non-volatile memory (not shown) of the slave camera 20 stores a preset code lookup table (LUT) indicating the correspondence between each preset code and a panning and tilting value, and the controller 192 preferably drives the panning motor driver 182 and the tilting motor driver 186 with reference to this LUT. The configuration and use of the preset code LUT is described further below.
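The sketch below illustrates how such a preset-code LUT might be resolved inside the slave camera; the code-to-angle values and the driver callbacks are placeholders invented for illustration, not data or interfaces from the patent.

```python
# Illustrative preset-code LUT held in the slave camera's non-volatile memory.
# The numeric angles are placeholders, not values from the patent.
PRESET_LUT = {
    0: {"pan": 0.0,  "tilt": -35.0},
    1: {"pan": 22.5, "tilt": -35.0},
    2: {"pan": 45.0, "tilt": -40.0},
    # ... one entry per preset code / image sector
}

def handle_preset_command(preset_code, drive_pan, drive_tilt):
    """Resolve a preset code received over the serial link into motor commands."""
    target = PRESET_LUT[preset_code]
    drive_pan(target["pan"])     # panning motor driver 182
    drive_tilt(target["tilt"])   # tilting motor driver 186

handle_preset_command(1, drive_pan=lambda a: print("pan to", a),
                         drive_tilt=lambda a: print("tilt to", a))
```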

FIG. 6 shows an operation process of the master camera 10 shown in FIG. 1.

First, the master camera 10 obtains a wide-angle image through the imaging unit 110 (step 300). The motion detector 122 detects a motion object while comparing the wide-angle image in units of frames, and provides a detection signal to the controller 130.

The controller 130 continuously monitors whether a motion object detection signal is received from the motion detector 122 or a wait for detection command is received from the remote monitoring / control device 40 (step 302).

When a moving-object detection signal is received from the motion detector 122 or a detection wait command is received from the remote monitoring/control device 40, the controller 130 causes the motion detector 122 to determine the position and direction of movement of the moving object (step 304). When the position and direction of movement have been determined, the controller 130 composes moving-object log data including the position and direction of movement of the moving object and the identification information (ID) assigned to it, and transmits the data to the remote monitoring/control device 40 (step 306).

Subsequently, the controller 130 controls the panning and tilting of the slave camera 20 according to the region in which the detected moving object is present, so that the slave camera 20 can track the moving object (step 308).

In step 310, the master camera 10 receives the focused surveillance video signal from the slave camera 20 through the video signal input terminal 150, and the first signal converter 126 restores the digital focused surveillance video signal from the received signal. Steps 300 to 310 are performed repeatedly, so that the moving object is tracked continuously. In this process, the motion detector 122 may recalculate the exact position of the moving object using the digital focused surveillance video signal, and the slave camera 20 may be controlled precisely based on the recalculated result (step 312).

In parallel with step 312, the image combination unit 134 may combine the panoramic image 210 and the focused surveillance image signal to form an output image, and the second signal converter 136 may generate a composite video signal for the output image and transmit it to the remote monitoring/control device 40. In this case, as described above, when the operator applies a control command by operating the input unit 46 of the remote monitoring/control device 40, the images selected by the image combination unit 134 may vary.

In step 314, the controller 130 determines whether the moving object has left the surveillance area. If the moving object has not left the surveillance area, the process returns to step 300. If the moving object has left the surveillance area, moving-object log data notifying the departure of the moving object is composed and transmitted to the remote monitoring/control device 40 through the serial communication unit 132 (step 316), and the process ends.
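The overall flow of FIG. 6 can be summarised as a loop. The sketch below mirrors the step numbers as comments; the stub functions standing in for the imaging, detection, and departure checks are assumptions for illustration only.

```python
import random

def acquire_wide_image():          # step 300 (stub: returns a frame id)
    return random.random()

def detect_motion(frame):          # motion detector 122 (stub)
    return random.random() < 0.3

def object_left_area():            # step 314 condition (stub)
    return random.random() < 0.2

def run_master_camera(max_frames=20):
    for _ in range(max_frames):
        frame = acquire_wide_image()                      # step 300
        if not detect_motion(frame):                      # step 302: wait for detection
            continue
        position, direction = "sector 01", "NE"           # step 304 (placeholder values)
        print("send object log:", position, direction)    # step 306
        print("pan/tilt slave camera toward", position)   # step 308
        print("receive focused image, refine position")   # steps 310-312
        if object_left_area():                            # step 314
            print("send departure log")                   # step 316
            return

run_master_camera()
```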

Referring to FIGS. 7 to 19, the motion detection process (step 304) and the slave camera 20 control process (step 308) of FIG. 6 will be described in more detail.

In a preferred embodiment, the motion detector 122 virtually divides the wide-angle image 200 into a plurality of sectors or blocks and determines the presence or absence of a moving object by counting the amount of change in pixel values for each sector. FIG. 7 illustrates a state in which the wide-angle image 200 is divided into a plurality of sectors. In the illustrated embodiment, the wide-angle image 200 is divided into rectangular sectors 202 of the same size. In a modified embodiment, however, the size and/or shape of each sector may vary with its position in the image; for example, the sectors may be divided such that sectors far from the center point of the wide-angle image 200 are smaller than sectors close to it, and each sector may be given an arc shape.
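For the uniform rectangular division described above, mapping a pixel to its sector code is simple integer arithmetic, as in the hedged sketch below; the 8x8 grid and the image size are assumptions, and the variable-size or arc-shaped sectors of the modified embodiment would require a different mapping.

```python
def sector_code(x, y, img_w=480, img_h=480, cols=8, rows=8):
    """Map a pixel of the wide-angle image to its rectangular sector code.

    A uniform 8x8 grid is assumed; the patent also allows sectors whose size
    varies with distance from the image centre, or arc-shaped sectors.
    """
    col = min(x * cols // img_w, cols - 1)
    row = min(y * rows // img_h, rows - 1)
    return row * cols + col

print(sector_code(10, 10))      # top-left sector -> 0
print(sector_code(470, 470))    # bottom-right sector -> 63
```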

8 is a schematic diagram illustrating an embodiment of a moving object detection process.

According to the present embodiment, a non-volatile memory (not shown) of the master camera 10 stores a control value lookup table (LUT) indicating the correspondence between a sector code assigned in advance to each sector and the panning and tilting values of the slave camera 20. The control value LUT 320 stores the sector code and the panning and tilting values for each image sector. In case the master camera 10 is linked to a plurality of slave cameras 20, the ID of the slave camera responsible for focused surveillance of the monitored area corresponding to each image sector may additionally be stored. Meanwhile, a motion object table 322 for storing the motion coefficient values calculated by the motion detector 122 for each image sector is stored and maintained during operation in a volatile memory, such as SRAM or DRAM, of the master camera 10.

When the wide-angle image 200 is obtained by the imaging unit 110, the motion detector 122 compares the wide-angle image 200 in units of frames. Here, the frames to be compared may be consecutive frames or frames separated in time by a plurality of frame periods. In detail, the motion detector 122 compares the pixel values or luminance values of the current frame and the previous frame pixel by pixel for each image sector, and counts the number of pixels in the image sector whose pixel value difference is larger than a first reference value. When the count value is larger than a second reference value, it is determined that a moving object exists in the corresponding image sector, and a motion detection signal is output to the controller 130. Here, to minimize errors, it may be determined that a moving object exists only when the count value exceeds the second reference value a predetermined number of times or more. In the illustrated example, a count value of 67, which is larger than the second reference value of 40, is counted in the second image sector (sector code = "01").
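A minimal NumPy sketch of this two-threshold test follows. The per-pixel difference threshold (first reference value) and the per-sector count threshold (second reference value of 40 mentioned in the example) are illustrative, and the persistence check over several frames is omitted.

```python
import numpy as np

def sectors_with_motion(prev, curr, rows=8, cols=8, ref1=25, ref2=40):
    """Return sector codes whose motion count exceeds the second reference value.

    ref1: per-pixel difference threshold (first reference value, assumed).
    ref2: per-sector count threshold (second reference value; 40 as in the example).
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > ref1
    h, w = diff.shape
    detected = []
    for code in range(rows * cols):
        r, c = divmod(code, cols)
        block = diff[r * h // rows:(r + 1) * h // rows,
                     c * w // cols:(c + 1) * w // cols]
        if int(block.sum()) > ref2:            # motion coefficient for this sector
            detected.append(code)
    return detected

prev = np.zeros((480, 480), dtype=np.uint8)
curr = prev.copy()
curr[60:90, 100:140] = 255                     # a synthetic moving object
print(sectors_with_motion(prev, curr))         # sectors covering the changed block
```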

When the controller 130 receives the motion detection signal, it first checks the image sector in which the moving object exists by referring to the motion object table 322. The controller 130 then reads the predetermined panning and tilting values for that sector from the control value LUT 320 and transmits them to the slave camera 20 through the serial communication unit 132, so that the slave camera 20 intensively monitors the area corresponding to the image sector. If motion is detected in two or more image sectors, the motion detector 122 assigns priorities according to the motion coefficient values, and the controller 130 causes the slave camera 20 to monitor the image sector having the highest priority. Meanwhile, focused surveillance of a specific moving object may be performed regardless of the priority, according to a control signal from the remote monitoring/control device 40.

In the embodiment of FIG. 8, the panning and tilting values stored in the control value LUT 320 may be determined experimentally and stored for each slave camera 20. For example, while the slave camera 20 is driven sequentially, the panning and tilting values of a point corresponding approximately to the center of each sector of the wide-angle image 200 may be determined. At this time, the installer may determine the panning and tilting values of the slave camera 20 for each point while an identification plate bearing a number or other mark is placed in the monitored area. Alternatively, the panning and tilting values may be determined experimentally for only a few points within the wide-angle image 200 or the surveillance region, and the values for the remaining positions interpolated according to the law of proportional parts. In addition to the panning and tilting values, a zoom magnification value may also be stored in the control value LUT 320 and provided when the slave camera 20 is controlled. The panning, tilting, and zoom magnification values stored in the control value LUT 320 and provided to the slave camera 20 may be expressed as absolute values in a spherical coordinate system centered on the slave camera 20, or, for example, as relative values based on a specific position. In this case, the panning, tilting, and zoom magnification values may be specified in units of motor resolution instead of angle or focal length.
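The sketch below illustrates one way a control-value LUT with a few calibrated sectors could be interpolated linearly (the law of proportional parts) for un-calibrated sectors; the angles, zoom factors, and slave IDs are placeholders invented for illustration.

```python
# Illustrative control-value LUT of the master camera: sector code -> PTZ values.
# The angles/zoom factors below are placeholders, not calibration data from the patent.
CONTROL_LUT = {
    0: {"pan": -60.0, "tilt": -30.0, "zoom": 2.0, "slave_id": 1},
    7: {"pan":  60.0, "tilt": -30.0, "zoom": 2.0, "slave_id": 1},
}

def interpolate_pan(sector, low=0, high=7):
    """Linear interpolation between two calibrated sectors, for sectors whose
    panning value was not measured directly (law of proportional parts)."""
    frac = (sector - low) / (high - low)
    p0, p1 = CONTROL_LUT[low]["pan"], CONTROL_LUT[high]["pan"]
    return p0 + frac * (p1 - p0)

print(interpolate_pan(3))    # pan angle interpolated between sector 0 and sector 7
```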

As described above, according to the embodiment of FIG. 8, upon motion detection the controller 130 may provide the panning and tilting values in the manner of a GOTO command to determine the focused surveillance direction of the slave camera 20. In addition, the controller 130 may provide these control values to the slave camera 20 according to a control signal from the remote monitoring/control device 40.

9 is a schematic diagram illustrating another embodiment of a motion object detection process.

According to the present embodiment, specific parameter values for driving the slave camera 20 are not stored in the camera LUT 320a held in the non-volatile memory (not shown) of the master camera 10; only preset codes are stored. The specific panning and tilting values for each preset code are stored in the control value LUT 320b maintained in the slave camera 20. The preset code values may be chosen to be equal to the sector code values.

When the wide-angle image 200 is obtained by the imaging unit 110, the motion detector 122 compares the wide-angle image 200 in units of frames with respect to each image sector to calculate a motion coefficient. If the count value is larger than the second reference value, the motion detector 122 determines that a motion object exists in the corresponding image sector, and outputs a motion detection signal to the controller 130.

Upon receiving the motion detection signal, the controller 130 first checks the image sector in which the moving object exists by referring to the motion object table 322. The controller 130 reads the preset code value for the sector in which the moving object exists from the camera LUT 320a and transmits it to the slave camera 20 through the serial communication unit 132. The slave camera 20 reads the panning and tilting values corresponding to the preset code value from the control value LUT 320b stored in its internal memory and drives the panning motor 180 and the tilting motor 184 with the read values, so that the monitored area corresponding to the image sector is monitored intensively.

As described above, according to the embodiment of FIG. 9, when motion is detected the controller 130 transmits to the slave camera 20 only a preset code value that abbreviates the PTZ control parameters, and the slave camera 20 interprets the preset code value and drives the panning motor 180 and the tilting motor 184. The control process is therefore simplified from the standpoint of the master camera 10, and data transmission and reception between the master camera 10 and the slave camera 20 are facilitated. Since the other features of the embodiment of FIG. 9 are similar to those of FIG. 8, a detailed description thereof will be omitted.

As described above, according to the embodiments shown in FIGS. 8 and 9, motion is detected directly from the wide-angle image 200, which is the original image before the panoramic image 210 is generated, so the motion detection speed is high. In addition, since the wide-angle image 200 is divided into sectors and the presence of a moving object is determined using only pixel values, complicated calculations such as the coordinate conversions required to expand the wide-angle image 200 are unnecessary, which is a further advantage in terms of motion detection speed. Furthermore, since the slave camera 20 is controlled uniformly on an image-sector basis, the time required for initial tracking is also shortened.

Meanwhile, in the preferred embodiment, the focused surveillance image 220 obtained by the slave camera 20 is used to verify the moving-object detection, for example by a block matching algorithm, which prevents errors and further improves the tracking performance. In addition, the motion detector 122 may detect the contour of each moving object while detecting the moving object using the focused surveillance image 220, and provide the controller 130 with size information of the moving object. In this case, the controller 130 may determine the zoom magnification value for the focused surveillance image 220 based on the size information of the moving object and control the slave camera 20 accordingly.

However, the present invention is not necessarily limited to performing motion detection or controlling the slave camera 20 on an image-sector basis. That is, the motion detector 122 may detect the moving object using the entire wide-angle image 200, determine the position of the moving object in units of pixels, and then control the slave camera 20 according to the coordinates of the moving object. FIG. 10 shows such a modified embodiment.

In the embodiment of FIG. 10, when the wide-angle image 200 is obtained by the imaging unit 110, the motion detector 122 compares the wide-angle image 200 in units of frames and detects a moving object. In one embodiment, motion detection may be performed by calculating a motion coefficient value for each pixel as described above and grouping neighboring pixels whose coefficient values are larger than a reference value. The motion detector 122 determines the coordinates of the center point of each object in the form of two-dimensional polar coordinates (r, θ) by calculating the average value or median value of the coordinates of the pixels constituting each moving object. Here, r represents the distance of the object center point from the image center point, and θ represents the azimuth angle from a fixed reference line. In addition, the motion detector 122 determines the size of each moving object in the up, down, left, and right directions. The motion detector 122 assigns an ID to each moving object and stores the center point coordinates (r, θ) and size data in the motion object table 322a.
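A hedged sketch of this step follows: given the pixel coordinates of one grouped object, it computes the centroid, converts it to (r, θ) about the image center, and reports a rough bounding-box size. The image size and the use of the mean rather than the median are assumptions.

```python
import numpy as np

def object_polar_center(pixel_ys, pixel_xs, img_h=480, img_w=480):
    """Centre point of a moving object in polar form (r, theta) about the
    wide-angle image centre, plus a rough bounding-box size.
    The centroid (mean) is used here; the text also allows a median."""
    cy, cx = img_h / 2.0, img_w / 2.0
    y0, x0 = float(np.mean(pixel_ys)), float(np.mean(pixel_xs))
    r = np.hypot(x0 - cx, y0 - cy)
    theta = np.degrees(np.arctan2(y0 - cy, x0 - cx)) % 360.0   # azimuth from the +x axis
    size = (int(np.ptp(pixel_xs)) + 1, int(np.ptp(pixel_ys)) + 1)
    return r, theta, size

ys = np.array([100, 101, 102, 110])
xs = np.array([300, 305, 310, 312])
print(object_polar_center(ys, xs))
```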

The controller 130 determines the panning and tilting values of the slave camera 20 from the center point coordinates (r, θ) of each moving object. The controller 130 also determines a zoom magnification value for each moving object by using the object size and the distance coordinate (r) value. The controller 130 transmits the determined panning, tilting, and zoom magnification values to the slave camera 20 through the serial communication unit 132, so that the slave camera 20 intensively monitors the surveillance target area in which the moving object is located. If motion is detected for two or more moving objects, the motion detector 122 assigns priorities according to the motion coefficient values, and the controller 130 causes the slave camera 20 to intensively monitor the moving object having the highest priority. Meanwhile, a specific moving object may be intensively monitored regardless of its priority according to a control signal from the remote monitoring / control device 40.

In the embodiment of FIG. 10, when the installation reference position of the slave camera 20 substantially coincides with the center point of the wide-angle image, the panning and tilting values can be determined from the object center point coordinates (r, θ) as follows.

First, assuming that the measurement reference of the panning angle is the same as the measurement reference of the azimuth coordinate (θ) in the wide-angle image 200, the panning angle is determined to be equal to the azimuth coordinate (θ) value of the object center point. Even when the measurement reference plane of the panning angle differs from the azimuth measurement reference plane of the wide-angle image 200, the panning angle value can easily be determined by a linear equation from the azimuth coordinate (θ) value of the center point coordinates (r, θ).

Meanwhile, the tilting angle value can be obtained from the distance coordinate (r) value of the object center point. In general, the refractive characteristics of a fisheye lens or the reflective characteristics of a 360° reflector follow a profile in which the incident light can be described by a specific mathematical model. For example, when the fisheye lens 112 has orthographic (orthogonal) refraction characteristics as shown in FIG. 11, the distance of the imaging point of the incident light from the image center is expressed by Equation 1 below.

y = f · sin(x)   …(1)

Here, x is the angle of incidence, f is the focal length, which corresponds to the radius of the image in the wide-angle image, and y is the distance of the imaging point from the image center point.

Therefore, the incident angle x for an object center point located at a distance r from the image center, in an image of radius R, can be obtained by Equation 2 below.

x = sin⁻¹(r / R)   …(2)

When the measurement reference plane of the tilting angle is the horizontal plane, the tilting angle value for driving the slave camera 20 has a magnitude of (90° − x) and a negative sign, and is expressed as in Equation 3 below.

tilting angle = −(90° − sin⁻¹(r / R))   …(3)

Meanwhile, when the fisheye lens 112 has equidistant projection refraction characteristics as shown in FIG. 12, the distance of the imaging point of the incident light from the image center is expressed by Equation 4 below.

y = f · x   (x in radians)   …(4)

Therefore, the incident angle x for an object center point located at a distance r from the image center, in an image of radius R, can be obtained by Equation 5 below.

x = (r / R) × 90°   …(5)

When the measurement reference plane of the tilting angle is the horizontal plane, the tilting angle value for driving the slave camera 20 has a magnitude of (90° − x) and a negative sign, and is expressed as in Equation 6 below.

tilting angle = −(90° − (r / R) × 90°)   …(6)

Even when the wide-angle lens or fisheye lens 112 has other refractive characteristics, the tilting angle value can be determined in a similar manner.
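
Equations 1 to 6 can be gathered into one small helper, shown below as a minimal sketch; it assumes, as in the description above, that the panning measurement reference coincides with the azimuth reference of the wide-angle image and that the tilting reference plane is horizontal, and the function and model names are illustrative only.

import math

def pan_tilt_from_polar(r, theta_deg, R, model="orthographic"):
    """Convert the object center point (r, theta) in a circular wide-angle image of
    radius R into panning/tilting angles for the slave camera."""
    pan = theta_deg                                    # panning angle = azimuth coordinate
    if model == "orthographic":                        # Equations 1-3: y = f*sin(x), R = f
        x = math.degrees(math.asin(min(r / R, 1.0)))
    elif model == "equidistant":                       # Equations 4-6: y = f*x, R = f*(pi/2)
        x = 90.0 * (r / R)
    else:
        raise ValueError("unknown projection model")
    tilt = -(90.0 - x)                                 # negative sign: camera looks downward
    return pan, tilt

# Example: an object halfway out from the center of an orthographic fisheye image.
print(pan_tilt_from_polar(r=0.5, theta_deg=135.0, R=1.0))   # pan 135 deg, tilt approx. -60 deg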

The method of determining the panning and tilting values described above is applicable when the installation reference position of the slave camera 20 nearly coincides with the center point of the wide-angle lens 112, and is therefore useful when the slave camera 20 and the master camera 10 are installed adjacent to each other. On the other hand, when the slave camera 20 is installed away from the master camera 10, panning and tilting values for several points within the wide-angle image 200 or the surveillance target area can be determined experimentally and stored in memory, as in the embodiment of FIG. 8, and the panning and tilting values for the remaining positions can be interpolated from the stored values according to the law of proportional parts. Such embodiments can easily be implemented by those skilled in the art to which the present invention pertains on the basis of the present specification, and thus a detailed description thereof will be omitted.
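
The experimental-calibration alternative described in the preceding paragraph might be sketched as follows; the calibration data structure and function name are assumptions, and the law of proportional parts is realized here as plain linear interpolation between the two nearest stored points.

import math

def interpolate_pan_tilt(calibration, point):
    """Estimate panning/tilting values for an arbitrary image position from
    experimentally measured points stored in memory.
    calibration: list of ((u, v), (pan, tilt)) pairs."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    (p1, v1), (p2, v2) = sorted(calibration, key=lambda c: dist(c[0], point))[:2]
    d1, d2 = dist(p1, point), dist(p2, point)
    if d1 + d2 == 0:
        return v1
    w = d1 / (d1 + d2)                       # proportional part toward the second point
    return (v1[0] * (1 - w) + v2[0] * w,     # interpolated panning value
            v1[1] * (1 - w) + v2[1] * w)     # interpolated tilting value

# Two hypothetical calibration points and a query position between them.
cal = [((100, 100), (30.0, -20.0)), ((300, 100), (60.0, -25.0))]
print(interpolate_pan_tilt(cal, (200, 100)))   # approximately (45.0, -22.5)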

In the embodiments of FIGS. 8 to 10, when a moving object is detected in the wide-angle image 200 and the PTZ operation of the slave camera 20 is controlled accordingly, it is preferable that a pointer indicating the position of the moving object being tracked by the slave camera 20 is additionally displayed on the panoramic image 210 in the output image illustrated in FIG. 5. FIG. 13 shows an example of such an output image. The controller 130 determines the position of the moving object by calculating an approximate midpoint of the mutually adjacent sectors among the image sectors in which motion is detected. The controller 130 provides this position information to the panoramic image configuring unit 124 or the image combination unit 134, so that the panoramic image configuring unit 124 or the image combination unit 134 can add to the panoramic image 210 the pointer 211, for example a rectangle in a distinctive color such as red. The size of the pointer 211 is preferably constant regardless of the size of the moving object. The pointer 211 helps the operator recognize the current situation more accurately while comparing the panoramic image 210 with the centralized monitoring image 220.
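
The pointer overlay can be illustrated with the short sketch below; the sector-center lookup table, the fixed half-size of the rectangle, and the use of a NumPy RGB array for the panoramic image 210 are assumptions made for illustration only.

import numpy as np

def pointer_position(detected_sectors, sector_centers):
    """Approximate the moving-object position as the mean of the centers of the
    motion-detected (assumed adjacent) sectors.
    sector_centers: dict mapping sector number -> (x, y) in panorama pixels."""
    xs = [sector_centers[s][0] for s in detected_sectors]
    ys = [sector_centers[s][1] for s in detected_sectors]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def draw_pointer(panorama, pos, half=8):
    """Overlay a fixed-size red rectangle (pointer 211) on an H x W x 3 array;
    the size is constant regardless of the size of the moving object."""
    x, y = int(pos[0]), int(pos[1])
    h, w = panorama.shape[:2]
    x0, x1 = max(x - half, 0), min(x + half, w - 1)
    y0, y1 = max(y - half, 0), min(y + half, h - 1)
    panorama[y0, x0:x1 + 1] = panorama[y1, x0:x1 + 1] = (255, 0, 0)   # top and bottom edges
    panorama[y0:y1 + 1, x0] = panorama[y0:y1 + 1, x1] = (255, 0, 0)   # left and right edges
    return panorama

pano = np.zeros((120, 640, 3), dtype=np.uint8)
centers = {17: (200.0, 60.0), 18: (220.0, 60.0)}
draw_pointer(pano, pointer_position([17, 18], centers))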

Referring to FIGS. 14 to 19, changes in the tracking and monitoring operation according to the generation and destruction of moving objects will now be described.

As mentioned above, each time a new moving object is detected, the master camera 10 stores information about the moving object in the motion object table 322 or 322a, determines a PTZ control value, and transmits it to the slave camera 20. Here, each moving object is given a tracking priority along with a unique object ID. The tracking priority may change depending on the appearance and disappearance of other moving objects and on the movement of the moving objects. In one embodiment, the closer a moving object is to the center of the wide-angle image 200, the higher the priority given to it. In addition, the higher the moving speed of a moving object in the wide-angle image 200, the higher the priority that may be given. In general, the distance between an object and the center of the image 200 and the moving speed of the object are highly correlated owing to the nonlinearity of the wide-angle image 200; however, when the two criteria do not coincide, the priority may be determined by giving a higher weight to one of the two criteria. Meanwhile, in a modified embodiment, a high priority may be given to newly detected objects.
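
One way to express the priority rule above is the weighted score sketched below; the weight values, the field names, and the normalization by the maximum observed speed are illustrative assumptions rather than values taken from the embodiment.

def tracking_priority(objects, weight_center=0.6, weight_speed=0.4, image_radius=1.0):
    """Rank moving objects for centralized surveillance: objects closer to the center
    of the wide-angle image and faster objects score higher; the weights decide which
    criterion dominates when the two disagree.
    objects: list of dicts with keys 'id', 'r' (distance from center) and 'speed'."""
    max_speed = max((o["speed"] for o in objects), default=1.0) or 1.0
    def score(o):
        closeness = 1.0 - min(o["r"] / image_radius, 1.0)    # 1 at the center, 0 at the rim
        return weight_center * closeness + weight_speed * (o["speed"] / max_speed)
    return sorted(objects, key=score, reverse=True)           # highest priority first

objs = [{"id": "MDE_003", "r": 0.2, "speed": 3.0},
        {"id": "MDE_008", "r": 0.7, "speed": 5.0}]
print([o["id"] for o in tracking_priority(objs)])   # ['MDE_003', 'MDE_008']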

The priority of a moving object may also be changed by the operator of the remote monitoring / control device 40. In particular, the highest priority may be given to a tracking target object selected by the operator. That is, when the operator selects a moving object through the input unit 46, a control signal including the selected object information is transmitted to the master camera 10, and the controller 130 gives the object the highest priority so that the slave camera 20 tracks it.

The controller 130 of the master camera 10 generates and maintains a tracking table 330 in volatile memory, in addition to the motion object table 322 or 322a, and stores in it the movement history information of each currently active object. FIG. 14 shows an example of the tracking table 330. In the example shown, the tracking table 330 sequentially stores the image sector numbers through which each moving object has passed so far. In a modified embodiment, the history of the preset code values or the object center point coordinate values used to drive the slave camera 20 may be stored instead of the image sector numbers.

Referring to FIG. 15, when a new moving object ('A' in the drawing) is detected, the motion detector 122 stores the object information and the image sector information in the motion object table 322 or 322a and the tracking table 330. At this time, when motion is detected in several adjacent sectors, the moving object is assumed to exist in the sector having the largest motion count. The controller 130 transmits a PTZ control signal to the slave camera 20 according to the parameters mapped to the image sector in which the moving object exists, and sends an alarm signal to the remote monitoring / control device 40 through the serial communication unit 132.

When the moving object moves as time elapses and the motion count values of the image sectors change, the motion detector 122 determines that the object has moved to the sector having the largest motion count value, and stores the information about the new image sector in the motion object table 322 or 322a and the tracking table 330. The controller 130 transmits a PTZ control signal to the slave camera 20 according to the parameters corresponding to the updated image sector information. On the other hand, if the moving object moves beyond the boundary of the wide-angle image 200 or disappears from the image completely, its information is deleted from the motion object table 322 or 322a and the tracking table 330.
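
The bookkeeping performed on the tracking table 330 in the cases above (registration, movement, expiry) can be summarized by the minimal sketch below; the class and method names are assumptions, and a real table may store preset codes or center-point coordinates instead of sector numbers, as noted earlier.

class TrackingTable:
    """Per-object movement history: for each active object ID, the ordered list of
    image sectors the object has visited."""
    def __init__(self):
        self.history = {}                    # object id -> [sector, sector, ...]

    def register(self, obj_id, sector):
        self.history[obj_id] = [sector]      # new moving object detected

    def update(self, obj_id, sector):
        track = self.history.setdefault(obj_id, [])
        if not track or track[-1] != sector: # record only actual sector changes
            track.append(sector)
        return track[-1]

    def expire(self, obj_id):
        return self.history.pop(obj_id, None)   # object left the wide-angle image

table = TrackingTable()
table.register("A", 5)
table.update("A", 6)          # object moved to the sector with the largest motion count
print(table.history)          # {'A': [5, 6]}
print(table.expire("A"))      # [5, 6]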

Referring to FIG. 16, when another moving object ('B' in the drawing) is detected in an image sector that is not adjacent to the existing moving object ('A' in the drawing), the motion detector 122 stores the object information and image sector information of the new moving object 'B' in the motion object table 322 or 322a and the tracking table 330. In this case, the controller 130 may transmit an alarm signal indicating the appearance of a new moving object to the remote monitoring / control device 40 through the serial communication unit 132. Meanwhile, the controller 130 determines the priorities of the two moving objects 'A' and 'B' according to the criteria described above. The controller 130 transmits a PTZ control signal to the slave camera 20 according to the parameters mapped to the image sector in which the higher-priority object exists, so that the slave camera 20 acquires an intensive surveillance image of the moving object having the higher priority.

Referring to FIG. 17, a plurality of moving objects 'A' and 'B' may be located in the same image sector. In this case, the motion detector 122 stores the image sector information common to the two moving objects 'A' and 'B' in the motion object table 322 or 322a and the tracking table 330. The controller 130 transmits a PTZ control signal to the slave camera 20 according to the parameters mapped to the common image sector, so that the slave camera 20 acquires an intensive surveillance image including both objects 'A' and 'B'.

As shown in FIG. 18, when both or one of the two moving objects 'A' and 'B' moves so that they are located in different image sectors, the controller 130 re-prioritizes the two moving objects 'A' and 'B' according to the criteria described above. The controller 130 transmits a PTZ control signal to the slave camera 20 according to the parameters mapped to the image sector in which the higher-priority object exists, so that the slave camera 20 acquires an intensive surveillance image of the moving object having the higher priority.

In the situations of FIGS. 15 to 18, when there are a plurality of moving objects in the wide-angle image 200, the plurality of moving objects are also displayed in the panoramic image 210 included in the output image, and some moving objects are displayed in the centralized monitoring image 220. In this case, the panoramic image configuring unit 124 or the image combination unit 134 may mark the highest-priority moving object, for which the intensive surveillance image is obtained, with a pointer 211 having a different appearance from those of the other moving objects. FIG. 19 shows an example of such an output image. For example, a red pointer 211a may be displayed in the panoramic image 210 for the highest-priority moving object, and green pointers 211b may be displayed for the other moving objects. Besides using different colors as described above, the pointer 211a for the highest-priority moving object may be made to flicker, or may be differentiated by changing the shape of its contour. Meanwhile, when there are a plurality of moving objects, it is preferable to arrange the object IDs (MDE_003, MDE_008, MDE_xxx) on one side of the screen so that the operator can identify the objects more accurately.

Referring to FIG. 20, a process of controlling the master cameras according to the movement of a moving object in the camera arrangement example of FIG. 1B will be described. In the drawing, it is assumed that the moving object appears at the lower left and disappears at the upper right, as indicated by the thick dotted arrow.

First, immediately after detecting the moving object, the master camera 10b transmits object log data including the position and moving direction of the moving object to the remote monitoring / control device 40, and the remote monitoring / control device 40 stores the new moving object in the storage device according to the received object log data. At this time, the master camera 10b causes the slave camera 20a or 20b to intensively monitor the moving object.

As the moving object moves, the remote monitoring / control device 40 transmits a motion detection / tracking standby command, including the position and ID of the moving object, to the master cameras 10a, 10c, and 10e around the master camera 10b, whereby the master cameras 10a, 10c, and 10e enter the standby state.

When the moving object leaves its surveillance area, the first master camera 10b transmits object log data indicating that the moving object has expired to the remote monitoring / control device 40, and ends tracking of the object. Meanwhile, the master camera 10c managing the area into which the moving object has entered transmits object log data indicating that the awaited moving object has entered to the remote monitoring / control device 40. Then, the master camera 10c causes the slave camera 20e or 20f to intensively monitor the moving object.

As the moving object moves further, the remote monitoring / control device 40 again transmits a motion detection / tracking standby command, including the position and ID of the moving object, to the master cameras 10a, 10d, and 10e around the master camera 10c, whereby the master cameras 10a, 10d, and 10e enter the standby state.

When the moving object leaves its surveillance area, the master camera 10c transmits object log data indicating that the moving object has expired to the remote monitoring / control device 40, and terminates tracking of the object. Meanwhile, the master camera 10d managing the area into which the moving object has entered transmits object log data indicating that the awaited moving object has entered to the remote monitoring / control device 40. Then, the master camera 10d causes the slave camera 20g or 20h to intensively monitor the moving object. When the moving object finally disappears from the surveillance area, the master camera 10d transmits object log data indicating that the moving object has expired to the remote monitoring / control device 40, and terminates tracking of the object.
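
The hand-off sequence of FIG. 20 can be paraphrased, from the point of view of the remote monitoring / control device 40, by the sketch below; the message field names, the 'appeared' / 'entered' / 'expired' event labels, and the neighbor table are assumptions introduced only to illustrate the exchange of object log data and standby commands.

import json, time

def make_object_log(camera_id, obj_id, event, position=None, direction=None):
    """Build an object log record of the kind a master camera sends to the remote
    monitoring / control device."""
    return {"camera": camera_id, "object": obj_id, "event": event,
            "position": position, "direction": direction, "time": time.time()}

def handle_object_log(log, neighbors, send):
    """Remote-device side: store the log and, while the object is active, place the
    neighboring master cameras into motion detection / tracking standby.
    neighbors: dict camera id -> list of adjacent camera ids.
    send: callable delivering a command to a master camera."""
    print("store:", json.dumps(log))                        # stand-in for the storage device
    if log["event"] in ("appeared", "entered"):
        for cam in neighbors.get(log["camera"], []):
            send(cam, {"cmd": "standby", "object": log["object"],
                       "position": log["position"]})

# Example hand-off: the object is detected by 10b and later enters the area of 10c.
neigh = {"10b": ["10a", "10c", "10e"], "10c": ["10a", "10d", "10e"]}
to_camera = lambda cam, msg: print("to", cam, "->", msg)
handle_object_log(make_object_log("10b", "MDE_003", "appeared", (120, 40), "NE"), neigh, to_camera)
handle_object_log(make_object_log("10b", "MDE_003", "expired"), neigh, to_camera)
handle_object_log(make_object_log("10c", "MDE_003", "entered", (10, 40), "NE"), neigh, to_camera)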

FIG. 21 shows another modified embodiment of the master camera shown in FIG. 2. The master camera 10 shown in FIG. 2 receives a camera control signal from the remote monitoring / control device 40 by serial communication via the serial communication unit 132, transmits status information of the master camera 10 and/or the slave cameras 20a and 20b to the remote monitoring / control device 40, and transmits the output image configured by the image combination unit 134 to the remote monitoring / control device 40 through a coaxial cable in the form of a composite video signal. In contrast, the master camera 10a according to the present embodiment communicates with the remote monitoring / control device 40 by a network protocol.

The master camera 10a has a network adapter 138, for example an Ethernet interface card, which is connected to the remote monitoring / control device 40 via a LAN cable. The network adapter 138 can receive camera control signals from the remote monitoring / control device 40 according to the TCP/IP protocol and transmit status information of the master camera 10a and/or the slave cameras 20a and 20b to the remote monitoring / control device 40. In addition, the network adapter 138 transmits the output image configured by the image combination unit 134 to the remote monitoring / control device 40 in the form of a digital signal.

According to the present embodiment, even when the master camera 10a is installed far from the remote monitoring / control device 40, the remote monitoring / control device 40 can easily control the master camera 10a through a network such as the Internet and can easily monitor the surveillance target area by receiving video signals from the master camera 10a and the slave cameras 20a and 20b. In addition, a plurality of master cameras 10a can be connected to the remote monitoring / control device 40, and the wiring required for this can be minimized.

Meanwhile, the slave cameras 20a and 20b may provide the centralized monitoring image to the master camera 10a in the form of a digital signal. In this case, the first signal converter 126 may be omitted in FIG. 21, and the master camera 10a may operate in a full digital manner.

FIG. 22 shows another modified embodiment of the master camera shown in FIG. 2. In the present embodiment, the image combination unit 134 combines the images in the analog signal state to generate an output image.

The wide-angle imaging unit 110 outputs an analog wide-angle video signal together with a digital wide-angle video signal. If the image sensor can output only a digital wide-angle image signal, a D/A converter may be additionally provided in the wide-angle imaging unit 110 or the control / signal processing circuit 120.

The first signal converter 126 converts the centralized monitoring video signal received through the video signal input terminal 150 into a digital video signal. The image storage unit 128 stores the digital centralized monitoring video signal and the wide-angle video signal. The second signal converting unit 140 converts the digital panoramic image signal from the panoramic image forming unit 124 into an analog signal.

The image combination unit 134 composes the output image by selecting one or more of the wide-angle image 200, the front panoramic image 212, the rear panoramic image 214, the panoramic image 210, and the centralized monitoring image, and combining the selected images in the analog signal state. The image combination unit 134 transmits the output image to the remote monitoring / control device 40 through the image signal output terminal 162 in the form of a composite video signal.

As described above, the camera device shown in FIG. 2 and its modified embodiments can process video signals in various ways depending on the output signal format of the centralized monitoring cameras 20a and 20b or the video signal input requirements of the remote monitoring / control device 40, and can be linked to the remote monitoring / control device 40 through serial communication or network communication, so that they can be adopted very easily in various monitoring systems without replacing existing equipment. Since the output image can be provided to the remote monitoring / control device 40 in a format fully composed by the master camera 10, the computing burden for high-speed signal processing in the remote monitoring / control device 40 is reduced, the limitation imposed by the bandwidth of the transmission channel is greatly relaxed, and the amount of wiring required to link the plurality of centralized monitoring cameras 20a and 20b to the remote monitoring / control device 40 can be minimized.

FIG. 23 shows an example of the installation of the master camera 10 and the slave camera 20.

In this example, the "U" shaped support member 402 is attached to the upper outer peripheral surface of the support 400. The “U” shaped support member 402 includes a first horizontal bar 404, a vertical bar 406 that is bent upward at an outer end of the first horizontal bar 404, and an upper end of the vertical bar 406. And a second horizontal bar 408 that is bent inwardly and continuously. The first horizontal bar 404 may be welded to the upper outer circumferential surface of the support 400, or may be attached to the outer circumferential surface of the support 400 using a U-band bracket.

The slave camera 20, which is a PTZ camera, is installed on the top of the support 400. The master camera 10, which is an omnidirectional camera, is installed on the bottom surface of the inner end of the second horizontal bar 408 of the “U” shaped support member 402. At this time, it is preferable that the master camera 10 is installed directly above the slave camera 20 so that the optical axis of the master camera 10 and the panning central axis of the slave camera 20 coincide.

According to this embodiment, since the optical axis of the master camera 10 and the panning central axis of the slave camera 20 coincide, the slave camera 20 can be controlled during the motion detection and tracking monitoring process by directly using the azimuth coordinate (θ) value of the center point of the moving object in the wide-angle image 200 as the panning angle value of the PTZ camera.

Since the installation height of the master camera 10 can be adjusted by adjusting the length of the vertical bar 406 of the "U"-shaped support member 402, there is an advantage in that the range of the motion detection and tracking area can be adjusted according to the user's needs.

Although a surveillance blind spot is created for the master camera 10 by the "U"-shaped support member 402 and the slave camera 20, the installation example of FIG. 23 can be useful in applications such as monitoring stations where the surveillance target area is localized or cameras employing a panning / tilting drive with a limited rotation angle.

FIG. 24 shows another installation example of the master camera 10 and the slave camera 20.

In this example, an "L" shaped support member 412 is attached to the upper outer peripheral surface of the support 410. The "L" shaped support member 412 has a horizontal bar and a vertical bar bent downward at the outer end of the horizontal bar to be continuous. The horizontal bar 414 may be welded to the upper outer circumferential surface of the support 410 or may be mounted on the outer circumferential surface of the support 410 using a U-band bracket.

The master camera 10 is installed at the lower end of the vertical bar of the "L"-shaped support member 412, and the slave camera 20 is installed at the top of the support 410.

According to the present exemplary embodiment, it is easy to apply to the imaging unit 110 of the master camera 10 a 360° reflecting mirror capable of acquiring an image with less distortion than a fisheye lens. In addition, since the installation height of the master camera 10 can be adjusted, the range of the surveillance target area can be adjusted according to the user's needs.

Although surveillance blind spots are created for the slave camera 20 by the "L"-shaped support member 412 and the master camera 10, and blind spots caused by the "L"-shaped support member 412 also occur for the master camera 10, this installation example can be usefully used in monitoring stations where some blind spots are allowed, monitoring stations whose rear is shielded, and monitoring stations where a relatively low position of the master camera 10 does not interfere with operation.

FIG. 25 shows another installation example of the master camera 10 and the slave camera 20.

A horizontal bar 422 is attached to the upper outer circumferential surface of a support 420. The master camera 10 is installed below the outer end of the horizontal bar 422 using a bracket (not shown), and the slave camera 20 is installed on top of the outer end of the horizontal bar 422 using a bracket (not shown) so as to be directly above the master camera 10. The height at which the horizontal bar 422 is installed on the support 420 may be selected arbitrarily by the user in consideration of the range of the surveillance target area.

Since the optical axis of the master camera 10 and the panning central axis of the slave camera 20 coincide, the slave camera 20 can be controlled during the motion detection and tracking monitoring process by directly using the azimuth coordinate (θ) value of the center point of the moving object in the wide-angle image 200 as the panning angle value of the PTZ camera. This installation example has the advantage that the blind area of each camera can be minimized.

FIG. 26 shows another installation example of the master camera 10 and the slave camera 20.

A camera installation room 432, whose wall surface is made of a material such as transparent tempered glass or reinforced resin, is provided on the upper part of a support 430. The master camera 10 is installed facing downward on the ceiling of the camera installation room 432. The slave camera 20 is installed at the upper end of the support 430 so as to be directly above the master camera 10. The height of the support 430 and the location of the camera installation room 432 may be selected in consideration of the range of the surveillance area and the range of the blind area.

Also in this example, since the optical axis of the master camera 10 and the panning central axis of the slave camera 20 coincide, the slave camera 20 can be controlled during the motion detection and tracking monitoring process by directly using the azimuth coordinate (θ) value of the center point of the moving object in the wide-angle image 200 as the panning angle value of the PTZ camera.

According to this installation example, a surveillance blind spot may be created for the master camera 10 by the support portion under the camera installation room 432 and by the slave camera 20, image quality may be degraded by the light transmittance of the outer wall of the camera installation room 432, and there may be additional considerations such as weight limits. Nevertheless, this installation example can be usefully used for video conference relays at conference tables, monitoring stations where the installation position may be low, areas requiring remote motion detection and tracking, and the like.

FIG. 27 shows another installation example of the master camera 10 and the slave camera 20.

A first support member 442 having an "L" shape is mounted on the upper outer circumferential surface of a support 440, and a second support member 444 is mounted on the outer circumferential surface of the support 440 below the mounting portion of the first support member 442. The first and second support members 442 and 444 may be welded to the outer circumferential surface of the support 440 or may be mounted using U-band brackets.

The slave camera 20 is installed at the bottom of the vertical bar of the first support member 442, and the master camera 10 is installed at the bottom of the vertical bar of the second support member 444. The height of the master camera 10 may be selected in consideration of the range of the area to be monitored and the range of the blind area.

Also in this example, since the optical axis of the master camera 10 and the panning central axis of the slave camera 20 coincide, the slave camera 20 can be controlled during the motion detection and tracking monitoring process by directly using the azimuth coordinate (θ) value of the center point of the moving object in the wide-angle image 200 as the panning angle value of the PTZ camera. In addition, it is easy to apply to the imaging unit 110 of the master camera 10 a 360° reflecting mirror capable of acquiring an image with less distortion than a fisheye lens.

Such an installation example is suitable for a monitoring station where some blind spots are allowed, a monitoring station with a shielded rear surface, and a monitoring station that does not interfere with operation even if the position of the master camera 10 is generally low.

FIG. 28 shows another installation example of the master camera 10 and the slave camera 20.

A first support member 452 having an "L" shape is mounted on the upper outer circumferential surface of a support 450, and a second support member 454 is mounted on the outer circumferential surface of the support 450 below the mounting portion of the first support member 452. The first and second support members 452 and 454 may be welded to the outer circumferential surface of the support 450 or may be mounted using U-band brackets.

The master camera 10 is installed at the bottom of the vertical bar of the first support member 452, and the slave camera 20 is installed at the bottom of the vertical bar of the second support member 454. The height of the master camera 10 may be selected in consideration of the range of the area to be monitored and the range of the blind area.

Also in this example, since the optical axis of the master camera 10 and the panning central axis of the slave camera 20 coincide, the slave camera 20 can be controlled during the motion detection and tracking monitoring process by directly using the azimuth coordinate (θ) value of the center point of the moving object in the wide-angle image 200 as the panning angle value of the PTZ camera.

Such an installation example can be usefully used in monitoring areas where some blind spots are allowed, monitoring stations whose rear is shielded, and cameras employing panning / tilting drives with limited rotation angles.

FIG. 29 shows another installation example of the master camera 10 and the slave camera 20.

A horizontal bar 462 is attached to the upper outer circumferential surface of a support 460. The master camera 10 is installed on the bottom of the horizontal bar 462 at one point. One or more slave cameras 20a and 20b are installed on the bottom of the horizontal bar 462, inward or outward of the point where the master camera 10 is installed. Depending on the surveillance target region, only one of the slave cameras 20a and 20b may be installed, or both may be installed.

When the master camera 10 and the slave cameras 20a and 20b are installed side by side in this way, surveillance blind spots may occur due to interference between the cameras; however, when the horizontal bar 462 is sufficiently high, such interference can be ignored. This installation example is therefore particularly useful in surveillance areas monitored from a high position.

FIG. 30 shows another installation example of the master camera 10 and the slave camera 20. This installation example is suitable for indoor surveillance: the master camera 10 and the slave camera 20 are installed side by side on the ceiling 470 inside a building. Surveillance blind spots may occur due to interference between the master camera 10 and the slave camera 20, but such interference can be ignored indoors, where surveillance is mainly performed for areas lower than the cameras 10 and 20.

FIG. 31 shows another embodiment of a video surveillance system according to the present invention. The video surveillance system according to the present embodiment includes a multi-function camera device 500 and a remote monitoring / control device 40.

The camera device 500 integrates an omnidirectional camera and a PTZ camera, and includes a wide-angle imaging unit 510, a PTZ imaging unit 520, and a control / drive unit 530. The wide-angle imaging unit 510 photographs the surveillance area in all directions, and the control / drive unit 530 detects moving objects in the omnidirectional image and, when a moving object is detected, controls the PTZ imaging unit 520 to intensively photograph the region where the moving object was detected. In addition, the control / drive unit 530 generates an output image by combining the centralized surveillance image obtained by the PTZ imaging unit 520 with the omnidirectional image, and provides it to the remote monitoring / control device 40 as an analog or digital video signal.

The remote monitoring / control device 40 is installed at a remote location apart from the camera device 500 and includes a data processing unit 42, a monitor 44, and an input unit 46. The data processor 42 receives the output image from the camera device 500 and displays it on the monitor 44. In addition, the data processor 42 controls the camera device 500 according to user input received through the input unit 46. The data processor 42 may be implemented by a general PC, and may further include a matrix, a screen splitter, an image distribution amplifier, and the like for processing the displayed image. The input unit 46 may be a keyboard, a mouse, a joystick input device, or a combination thereof.

In FIG. 31, for simplicity, only one camera device 500 is shown connected to the remote monitoring / control device 40, but a plurality of camera devices 500 may be connected to the remote monitoring / control device 40.

FIG. 32 is a detailed block diagram of one embodiment of the camera apparatus 500.

The wide-angle imaging unit 510 includes a fisheye lens 512 and a first image sensor 514 optically coupled thereto. The fisheye lens 512 has an omnidirectional viewing angle of more than 150 degrees, and collects light incident from the space within the viewing angle to form an image on the first image sensor 514. The first image sensor 514 includes a CMOS or CCD element and an A/D converter, converts the light collected by the fisheye lens 512 into an electrical image signal, digitizes it, and outputs a digital wide-angle image signal. The wide-angle image acquired through the fisheye lens 512 is circular.

The PTZ imaging unit 520 includes a focusing lens 522 and a second image sensor 524. The focusing lens 522 condenses the light incident from the front, and the second image sensor 524 converts the light condensed by the focusing lens 522 into an electrical image signal, digitizes it, and outputs a digital centralized monitoring image signal.

In the control / drive unit 530, the motion detector 122 receives the wide-angle image signal from the wide-angle imaging unit 510 and compares the wide-angle images in units of frames to determine whether a moving object exists in the image. Here, the frames to be compared may be consecutive frames or frames separated in time by a plurality of frame periods.

The panoramic image configuring unit 124 converts the wide-angle image from the wide-angle imaging unit 510 into a rectangular panoramic image. FIG. 33 illustrates a panoramic image composition method performed by the panoramic image configuring unit 124. As mentioned above, the wide-angle image 600 captured by the wide-angle imaging unit 510 having the fisheye lens 512 is circular. In an exemplary embodiment, the panoramic image configuring unit 124 selects only a rectangular portion of the circular wide-angle image 600, and outputs the video signal for the selected image area 610 as the panoramic video signal. The area taken as the panoramic image 610 may be fixed or may vary according to motion detection. In addition, in the remote monitoring / control device 40, the operator may select an area by specifying two diagonal points P1 and P4 (or P2 and P3) through the input unit 46. In this case, the area selector 102 receives, through the control unit 104, an area setting signal indicating the coordinates of the two points P1 and P4 (or P2 and P3), and in response extracts the panoramic image 610 from the wide-angle image 600.
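
The rectangular selection described above amounts to a simple crop of the circular wide-angle image; the sketch below assumes the frame is available as a NumPy array and that the operator's two diagonal points arrive as pixel coordinates, and the function name is illustrative only.

import numpy as np

def extract_panorama(wide_image, p_a, p_b):
    """Cut the rectangular panorama region 610 out of the circular wide-angle image 600
    from two diagonal corner points (e.g. P1 and P4, or P2 and P3)."""
    (xa, ya), (xb, yb) = p_a, p_b
    x0, x1 = sorted((int(xa), int(xb)))
    y0, y1 = sorted((int(ya), int(yb)))
    return wide_image[y0:y1 + 1, x0:x1 + 1]

# Example: a 480 x 480 circular wide-angle frame and two operator-chosen points.
frame = np.zeros((480, 480, 3), dtype=np.uint8)
pano = extract_panorama(frame, (60, 180), (420, 300))
print(pano.shape)    # (121, 361, 3)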

Referring again to FIG. 32, the image storage unit 128 includes a storage medium such as a hard disk or a semiconductor storage device (SSD), and stores the digital centralized monitoring video signal and the wide-angle video signal. The image storage unit 128 may store a compressed video signal instead of the original video signal. In such an embodiment, the image storage unit 128 further includes a compression / decompression unit for compressing the original video signal or restoring the compressed video signal, and the compression / decompression unit may be implemented by a computer program running on the microprocessor that implements the controller 130.

The controller 130 controls the overall operation of the camera device 500. In particular, the controller 130 controls the pan-tilt-zoom operation for the PTZ image capturing unit 520 according to the motion object detected by the motion detector 122. In addition, the controller 130 may control the image combination unit 134 to change the configuration of the output image according to the camera control signal. In addition, the controller 130 allows the video signal stored in the image storage unit 128 to be read and provided to the remote monitoring / control device 40 as indicated by the camera control signal. The basic control operation of the controller 130 is performed by a computer program, but may be performed in response to a camera control signal received from the remote monitoring / control device 40.

The serial communication unit 132 allows the control unit 130 to communicate with the remote monitoring / control device 40 through the serial port 160. That is, the controller 130 may receive the camera control signal from the remote monitoring / control device 40 through the serial communication unit 132 and transmit the status information of the camera device 500 to the remote monitoring / control device 40. . The connection between the serial communication unit 132 and the remote monitoring / control device 40 may be made according to RS232C, RS422 or RS485 standards.

The image combination unit 134 configures an output image by selecting one or more of the wide-angle image 600, the panoramic image 610, and the concentrated monitoring image. In this embodiment, since the output image is similar to that shown in FIG. 5, a detailed description thereof will be omitted. The signal converter 132 generates a composite video signal for the output video configured by the video combination unit 134 and transmits the composite video signal to the remote monitoring / control device 40 through the video signal output terminal 162. Accordingly, the output image configured by the image combination unit 134 may be displayed on the monitor 44 of the remote monitoring / control device 40.

Meanwhile, the panning motor driver 182 drives the panning motor 180 under the control of the controller 130 to mechanically rotate the PTZ imaging unit 520, including the focusing lens 522 and the image sensor 524, in the horizontal direction. The tilting motor driver 186 drives the tilting motor under the control of the controller 130 to rotate the PTZ imaging unit 520 in the vertical direction. The zoom motor driver 190 drives the zoom motor 188 under the control of the controller 130, thereby varying the focal length of the focusing lens 522 and implementing a zoom-in / zoom-out function. When the motor drivers 182, 186, and 190 are driven based on the detection result of the moving object, the PTZ imaging unit 520 can acquire a tracking and monitoring image of the moving object.

FIG. 34 is a perspective view of an embodiment of the camera apparatus 500 of FIG. 32.

The camera device 500 according to the preferred embodiment includes a housing 540 made of metal or synthetic resin and having a bell-like shape, and a dome 544 for housing and protecting the PTZ imaging unit 520 on the bottom of the housing 540. On the lower outer circumferential surface of the housing 540, the fisheye lens 512 of the wide-angle imaging unit 510 is exposed to the outside through a support protrusion 542. A bracket 550 for mounting the camera device 500 on a wall may be provided on the upper surface of the housing 540.

The support protrusion 542 is provided on the lower outer circumferential surface of the housing 540 such that the optical axis of the fisheye lens 512 is directed obliquely outward and downward from the camera apparatus 500. The support protrusion 542 determines the photographing direction of the wide-angle imaging unit 510 and structurally supports it, so that the wide-angle imaging unit 510 can capture an image including the point directly below the camera device 500.

The bracket 550 is made of a metal material and includes a vertical portion extending in the vertical direction, whose lower end can be coupled to the upper surface of the housing 540, a horizontal portion bent backward from the upper end of the vertical portion, and an attachment plate provided at the rear end of the horizontal portion. A plurality of holes are formed in the attachment plate so that the bracket can be attached to a support post or a wall surface with bolts 552.

Other features of the camera apparatus 500 illustrated in FIGS. 32 and 34 are similar to those of the master camera 10 illustrated in FIG. 2, and thus a detailed description thereof will be omitted.

FIG. 35 shows another embodiment of the camera apparatus 500 shown in FIG. 31.

According to the present embodiment, the camera device 500 includes two optical imaging units 512a and 512b. As will be described later, the optical imaging units 512a and 512b are arranged horizontally symmetrically and face in opposite directions, thereby obtaining an omnidirectional image. The configuration of each of the optical imaging units 512a and 512b is similar to that of the optical imaging unit 512 shown in FIG. 32. Meanwhile, the PTZ imaging unit 520 of FIG. 35 has the same configuration and function as that shown in FIG. 32.

The motion detector 122 receives wide-angle image signals from the optical imaging units 512a and 512b and determines whether a motion object exists in the image separately for each wide-angle image.

The panoramic image configuring unit 124 converts the wide-angle images from the optical imaging units 512a and 512b into rectangular panoramic images. Referring to FIG. 36, the wide-angle images 600a and 600b respectively photographed by the optical imaging units 512a and 512b are circular. The panoramic image configuration unit 124 selects only the rectangular portions 612a and 612b from the circular wide-angle images 600a and 600b, respectively, and connects them.

In relation to the present embodiment, in the following description, the image 612a extracted in rectangular form from the wide-angle image 600a is referred to as a 'front panoramic image', and the image 612b extracted in rectangular form from the wide-angle image 600b is referred to as a 'rear panoramic image'. In addition, an image 610 in which the front panoramic image 612a and the rear panoramic image 612b are connected side by side will be referred to as a 'panoramic image'. In the drawing, points P1 to P8 are displayed to indicate corresponding points of the wide-angle images 600a and 600b, the front panoramic image 612a, and the rear panoramic image 612b. The width of the front panoramic image 612a and the rear panoramic image 612b may be equal to the width of the centralized monitoring image from the PTZ imaging unit 520; however, in a modified example, the width of the panoramic image 610 may instead be equal to the width of the centralized monitoring image.
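
Joining the two crops into the panoramic image 610 is a plain side-by-side concatenation, as the sketch below illustrates; the assumption that both crops share the same height (trimming if they differ slightly) is made here only for illustration.

import numpy as np

def compose_panorama(front, rear):
    """Join the front panoramic image 612a and the rear panoramic image 612b side by
    side into the panoramic image 610."""
    h = min(front.shape[0], rear.shape[0])       # equalize heights by trimming if needed
    return np.hstack((front[:h], rear[:h]))

front = np.zeros((120, 320, 3), dtype=np.uint8)
rear = np.zeros((120, 320, 3), dtype=np.uint8)
print(compose_panorama(front, rear).shape)       # (120, 640, 3)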

Referring back to FIG. 35, the storage unit 128 stores the digital centralized monitoring video signal and the wide-angle video signals. The image storage unit 128 may store a compressed video signal instead of the original video signal. In such an embodiment, the storage unit 128 further includes a compression / decompression unit for compressing the original video signal or restoring the compressed video signal, and the compression / decompression unit may be implemented by a computer program running on the microprocessor that implements the controller 130.

The controller 130 controls the overall operation of the camera device 500. In particular, the controller 130 controls the pan-tilt-zoom operation for the PTZ image capturing unit 520 according to the motion object detected by the motion detector 122. In addition, the controller 130 may control the image combination unit 134 to change the configuration of the output image according to the camera control signal. In addition, the controller 130 allows the video signal stored in the image storage unit 128 to be read and provided to the remote monitoring / control device 40 as indicated by the camera control signal. The basic control operation of the controller 130 is performed by a computer program, but may be performed in response to a camera control signal received from the remote monitoring / control device 40.

The image combination unit 134 configures an output image by selecting one or more of the wide-angle images 600a and 600b, the front panoramic image 612a, the rear panoramic image 612b, the panoramic image 610, and the centralized monitoring image. The signal converter 132 generates a composite video signal for the output image configured by the image combination unit 134 and transmits it to the remote monitoring / control device 40 through the video signal output terminal 162. Accordingly, the output image configured by the image combination unit 134 may be displayed on the monitor 44 of the remote monitoring / control device 40.

FIG. 37 is a side view of an embodiment of the camera device 500 of FIG. 35.

The camera device 500 according to the present embodiment includes a housing 540 made of metal or synthetic resin and having a substantially bell-like shape, and a dome 544 for housing and protecting the PTZ imaging unit 520 on the bottom of the housing 540. On the lower outer circumferential surface of the housing 540, two support protrusions 542a and 542b are provided horizontally symmetrically. The fisheye lens 512a of the first wide-angle imaging unit 510a is installed so as to be exposed to the outside at the support protrusion 542a, and the fisheye lens 512b of the second wide-angle imaging unit 510b is installed so as to be exposed to the outside at the support protrusion 542b. A bracket 554 for mounting the camera device 500 on a wall may be provided on the upper surface of the housing 540.

The support protrusions 542a and 542b are provided on the lower outer circumferential surface of the housing 540 such that the optical axes of the fisheye lenses 512a and 512b face outward from the camera apparatus 500. The support protrusions 542a and 542b determine the photographing directions of the wide-angle imaging units 510a and 510b and structurally support them, so that the wide-angle imaging units 510a and 510b can capture the surrounding image, including the point directly below the camera device 500.

FIG. 38 shows another embodiment of the camera device 500 of FIG. 35.

The camera device 500 according to the present embodiment includes an upper frame 550, a horizontal rotating frame 560, and a PTZ camera unit 570.

The upper frame 550 has a column shape with a substantially circular or polygonal cross section, and two support protrusions 552a and 552b are provided horizontally symmetrically on its lower outer circumferential surface. A plurality of supporting / fastening protrusions 554a to 554c having through holes are formed on the upper side of the upper frame 550 so that the upper frame 550 can be stably supported on the mounting surface above it and fixed to the mounting surface with bolts.

The fisheye lens 512a of the first wide-angle imaging unit 510a is installed so as to be exposed to the outside at the support protrusion 552a, and the fisheye lens 512b of the second wide-angle imaging unit 510b is installed so as to be exposed to the outside at the support protrusion 552b. The surfaces of the support protrusions 552a and 552b on which the fisheye lenses 512a and 512b are installed are inclined so that the wide-angle imaging units 510a and 510b can photograph the area down to directly below the camera device 500, within the range in which they are not covered by the PTZ camera unit 570.

The horizontal rotating frame 560 is installed on the bottom of the upper frame 550 so that it can pan, that is, rotate horizontally with respect to the upper frame 550. A panning motor (not shown) is installed in the upper frame 550 or the horizontal rotating frame 560 so that the horizontal rotating frame 560 rotates horizontally at the bottom of the upper frame 550, and the upper frame 550 and the horizontal rotating frame 560 are connected via a panning shaft (not shown) driven by the panning motor.

The PTZ camera unit 570 is installed below the horizontal rotating frame 560 so that it can tilt, that is, rotate vertically. In the present embodiment, a tilting motor is installed in the horizontal rotating frame 560, and a tilting shaft (not shown) connected to the tilting motor extends across the horizontal rotating frame 560 in the horizontal direction. Brackets 562 are rotatably connected to both ends of the tilting shaft, and the PTZ camera unit 570 is fixedly installed under the brackets 562. Since the specific configuration and connection of the panning motor and the panning shaft, and of the tilting motor and the tilting shaft, are well known to those skilled in the art to which the present invention pertains, a detailed description thereof will be omitted.

A transparent window 572 is provided at the front of the PTZ camera unit 570 to protect the lens while transmitting light. Meanwhile, LED lights 562a and 562b are provided on both sides of the horizontal rotating frame 560 so that illumination can be directed toward the front at night.

FIG. 39 shows yet another embodiment of the camera device 500 of FIG. 35.

The camera device 500 according to the present embodiment includes an upper frame 550, a horizontal rotating frame 560, a PTZ camera unit 580, and an LED light 590.

The upper frame 550 has a column shape with a substantially circular or polygonal cross section, and two support protrusions 552a and 552b are provided horizontally symmetrically on its lower outer circumferential surface. A plurality of supporting / fastening protrusions 554a to 554c having through holes are formed on the upper side of the upper frame 550 so that the upper frame 550 can be stably supported on the mounting surface above it and fixed to the mounting surface with bolts.

The fisheye lens 512a of the first wide-angle imaging unit 510a is installed so as to be exposed to the outside at the support protrusion 552a, and the fisheye lens 512b of the second wide-angle imaging unit 510b is installed so as to be exposed to the outside at the support protrusion 552b. The surfaces of the support protrusions 552a and 552b on which the fisheye lenses 512a and 512b are installed are inclined so that the wide-angle imaging units 510a and 510b can photograph the area down to directly below the camera device 500, within the range in which they are not covered by the PTZ camera unit 580.

The horizontal rotating frame 560 is installed on the bottom of the upper frame 550 so that it can pan, that is, rotate horizontally with respect to the upper frame 550. A panning motor (not shown) is installed in the upper frame 550 or the horizontal rotating frame 560 so that the horizontal rotating frame 560 rotates horizontally at the bottom of the upper frame 550, and the upper frame 550 and the horizontal rotating frame 560 are connected via a panning shaft (not shown) driven by the panning motor.

The PTZ camera unit 580 is installed on a lateral side of the horizontal rotating frame 560 so that it can tilt. In the present embodiment, a tilting motor is installed in the horizontal rotating frame 560, and a tilting shaft (not shown) connected to the tilting motor extends across the horizontal rotating frame 560 in the horizontal direction. The PTZ camera unit 580 is installed at one end of the tilting shaft, and the LED light 590 is installed at the other end. Accordingly, when the tilting motor rotates the tilting shaft, the PTZ camera unit 580 and the LED light 590 rotate correspondingly in the vertical direction. In addition, since the PTZ camera unit 580 and the LED light 590 balance each other to some extent, damage to the device due to load imbalance can be prevented. Meanwhile, a transparent window 582 is provided at the front of the PTZ camera unit 580 to protect the lens while transmitting light.

Other features of the camera apparatus 500 illustrated in FIGS. 35 to 39 are similar to those of the apparatus illustrated in FIG. 32, and thus a detailed description thereof will be omitted.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the invention.

For example, although the image combination unit 134 has been described as configuring an output image by combining one or more of the wide-angle image 200, the front panoramic image 212, the rear panoramic image 214, the panoramic image 210, and the centralized monitoring image 220, the camera apparatus may instead simply multiplex these video signals with a multiplexer and transmit them to the remote monitoring / control device 40, and the remote monitoring / control device 40 may then configure the output image.

When a plurality of intensive surveillance cameras are connected to one master camera 10, all or some of the intensive surveillance cameras may be fixed cameras instead of PTZ cameras. In such a case, the master camera 10 may select only the image from the camera photographing the area in which the moving object exists among the plurality of intensive surveillance cameras, and use it as the intensive surveillance image. Also in this embodiment, it is preferable that each centralized monitoring camera has a zoom function.

In the above description, the master camera 10 detects the moving object in the wide-angle image and controls the slave camera 20 according to the position of the detected moving object; however, in a modified embodiment, the moving object detection may be performed by the remote monitoring / control device 40. In such an embodiment, the master camera 10 may receive the position information or PTZ control information of the moving object from the remote monitoring / control device 40 and control the slave camera 20 accordingly.

The image storage unit 128 may store the panoramic image instead of the original wide-angle image.

Meanwhile, it has been described above that the master camera 10 or the camera device 500 transmits the intensive surveillance image, or an output image including the intensive surveillance image, to the remote monitoring / control device 40. In a modified embodiment, however, the master camera 10 or the camera device 500 may perform only the control function for the intensive surveillance camera, while the intensive surveillance camera transmits the intensive surveillance image directly to the remote monitoring / control device 40.

Although various embodiments have been illustrated above, the features described in each exemplary embodiment may be applied to other embodiments unless they are inherently incompatible, and the features described in separate example embodiments may be combined in one embodiment. For example, embodiments have been described above in which the master camera 10 or the camera device 500 transmits an output image to the remote monitoring / control device 40 through serial communication or through a TCP/IP-based network; in another embodiment of the present invention, the master camera 10 or the camera device 500 may have both the function of communicating through serial communication and the function of communicating through a TCP/IP-based network. Meanwhile, the master camera 10 or the camera device 500 may transmit the output image signal to the remote monitoring / control device 40 in the form of digital image data instead of an analog image signal. Also, the master camera 10 may receive the centralized monitoring image from the slave camera 20 in the form of a digital signal instead of an analog signal.

Therefore, the embodiments described above are to be understood in all respects as illustrative and not restrictive. The scope of the present invention is indicated by the following claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be construed as being included within the scope of the present invention.

Claims (9)

A surveillance system comprising:
a remote control device;
a plurality of master cameras, each connected to the remote control device, each of which obtains a wide area image of a surveillance area that is at least a part of an area to be monitored, detects a moving object in the surveillance area, transmits moving object detection information to the remote control device, and stores the wide area image and a focused surveillance image; and
a plurality of slave cameras, each of which is connected to one of the plurality of master cameras and obtains the focused surveillance image for a portion of the surveillance area.
The surveillance system according to claim 1,
wherein each of the plurality of slave cameras is capable of pan/tilt/zoom driving,
and the master camera controls the slave camera to photograph the moving object, based on the moving object detection information, without the help of the remote control device.
The surveillance system according to claim 1 or 2,
wherein at least some of the plurality of master cameras are provided with a wide-angle lens to obtain a wide-angle image as the wide area image.
The surveillance system according to claim 1, wherein each of the plurality of master cameras provides the focused surveillance image to the remote control device only when there is a request from the remote control device.

The surveillance system according to claim 1, wherein each of the plurality of master cameras continuously provides the focused surveillance image to the remote control device together with the object log data.

A surveillance data processing method in a surveillance system in which a master camera is connected to a remote control device and obtains a wide area image of a surveillance area that is at least a part of an area to be monitored, and at least one slave camera is connected to the master camera to obtain a focused surveillance image, the method comprising:
(a) the master camera acquiring the wide area image and receiving the focused surveillance image from the slave camera;
(b) the master camera detecting a moving object from the wide area image or the focused surveillance image, generating object log data including position information and identification information of the moving object, transmitting the object log data to the remote control device, and storing the wide area image and the focused surveillance image in a storage device; and
(c) the master camera sending the stored image to the remote control device in response to a request from the remote control device.
The method of claim 6,
wherein the slave camera is capable of pan/tilt/zoom driving, and
wherein step (a) comprises controlling, by the master camera, the slave camera to photograph the moving object based on the moving object detection information without the help of the remote control device.
The method according to claim 6 or 7,
wherein the master camera is provided with a wide-angle lens to obtain a wide-angle image as the wide area image.
The method of claim 6, wherein the master camera continuously provides the wide area image and the focused surveillance image to the remote control device together with the object log data.
KR1020100056806A 2010-06-16 2010-06-16 Wide area surveillance system and monitoring data processing method in the same KR20110136907A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100056806A KR20110136907A (en) 2010-06-16 2010-06-16 Wide area surveillance system and monitoring data processing method in the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100056806A KR20110136907A (en) 2010-06-16 2010-06-16 Wide area surveillance system and monitoring data processing method in the same

Publications (1)

Publication Number Publication Date
KR20110136907A true KR20110136907A (en) 2011-12-22

Family

ID=45503344

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100056806A KR20110136907A (en) 2010-06-16 2010-06-16 Wide area surveillance system and monitoring data processing method in the same

Country Status (1)

Country Link
KR (1) KR20110136907A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140039927A (en) * 2012-09-25 2014-04-02 에스케이텔레콤 주식회사 Method and apparatus for detecting event from multiple image
KR101440537B1 (en) * 2014-04-01 2014-09-15 주식회사 베스트디지탈 IP camera apparatus with network connecting function, and operating method thereof
KR101502448B1 (en) * 2014-09-25 2015-03-13 주식회사 영국전자 Video Surveillance System and Method Having Field of Views of 360 Degrees Horizontally and Vertically
KR20150120096A (en) * 2014-04-17 2015-10-27 현대인프라코어 주식회사 Optically connected closed circuit television camera apparatus using passive optical elements and cctv system
US20210096245A1 (en) * 2018-03-02 2021-04-01 Furuno Electric Co., Ltd. Underwater detection apparatus and underwater detection method
KR102344606B1 (en) * 2021-07-09 2021-12-31 주식회사 대산시스템 CCTV system for tracking and monitoring of object, and tracking and monitoring method therefor

Similar Documents

Publication Publication Date Title
JP2012520650A (en) Intelligent surveillance camera device and video surveillance system employing the same
KR20130071510A (en) Surveillance camera apparatus, wide area surveillance system, and cooperative tracking method in the same
JP3951191B2 (en) Image forming and processing apparatus and method using camera without moving parts
US20050036036A1 (en) Camera control apparatus and method
KR101002066B1 (en) Camera apparatus for monitoring and tracking objects and remote monitoring system employing the same
US7667730B2 (en) Composite surveillance camera system
KR102087450B1 (en) A System and Method for Processing a Very Wide Angle Image
KR101502448B1 (en) Video Surveillance System and Method Having Field of Views of 360 Degrees Horizontally and Vertically
KR20110136907A (en) Wide area surveillance system and monitoring data processing method in the same
KR101685418B1 (en) Monitoring system for generating 3-dimensional picture
CN102868875A (en) Multidirectional early-warning positioning and automatic tracking and monitoring device for monitoring area
KR20050051575A (en) Photographing apparatus and method, supervising system, program and recording medium
JPH11510342A (en) Image division, image formation and processing apparatus and method using camera without moving parts
US20100141733A1 (en) Surveillance system
JP4736381B2 (en) Imaging apparatus and method, monitoring system, program, and recording medium
KR20100129125A (en) Intelligent panorama camera, circuit and method for controlling thereof, and video monitoring system
CN202818503U (en) Multidirectional monitoring area early warning positioning automatic tracking and monitoring device
CN114554093B (en) Image acquisition system and target tracking method
KR101452342B1 (en) Surveillance Camera Unit And Method of Operating The Same
JP2011109630A (en) Universal head for camera apparatus
CN112218048B (en) Intelligent monitoring system
JP2005175970A (en) Imaging system
JP2006261871A (en) Image processor in hands-free camera
JP2004153605A (en) Image pickup device and system for transmitting pick-up image
JP2002101408A (en) Supervisory camera system

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination