WO2011031128A1 - Control mechanism for automated surveillance system - Google Patents
- Publication number
- WO2011031128A1 (PCT/MY2010/000157)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- zoom
- image
- control points
- tilt
- Prior art date
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
A system for an automated surveillance system comprises the use of at least two cameras in which a first camera acts as a viewfinder and a second camera acts as a zoom, wherein a mechanism is provided for controlling the second camera based on the control signals resultant from the processed image produced by the first camera through image processing. Preferably, the first camera is a static camera with a wide-angle lens and the second camera is a movable zoomed camera.
Description
Control Mechanism for Automated Surveillance System

Field of the Invention

The present invention relates generally to a system for an automated surveillance application, and more particularly to a mechanism for interacting cameras used in the said system.
Background of the Invention
Surveillance typically includes the use of cameras for monitoring, tracking, and focusing on moving or stationary objects in an area. In some cases, a surveillance system may lose useful scene information and detail due to the behaviour of the cameras, which are likely to focus on a particular stationary or moving object even if another moving object appears in the scene.
The use of two different cameras could overcome the above-mentioned problem, wherein a first camera acts as a viewfinder covering a wide area for detecting and tracking moving objects, and a second camera, a movable zoomed camera, zooms in on the same. In this way, the close-up view and the overall scene can be monitored closely and simultaneously. These cameras, however, may need to be operated manually, as there is no interaction between the first camera and the second camera; they are usually two independent pieces of equipment.
In light of the above, it is an aim of the present invention to provide a mechanism for interacting cameras used in an automated surveillance system.
Summary of the Invention
According to the present invention, an automated surveillance system comprises the use of at least two cameras in which a first camera acts as a viewfinder and a second camera acts as a zoom, wherein a mechanism is provided for controlling the second camera based on the control signals resultant from the processed image produced by the first camera through image processing. Preferably, the first camera is a static camera with a wide-angle lens and the second camera is a movable camera.
In a first embodiment, a static camera equipped with a fisheye lens and a movable pan-tilt-zoom (PTZ) camera are used. The first camera is referred to as the viewfinder and the second camera as the movable zoomed camera; the images produced by these two cameras are called the circular image and the zoomed image, respectively.
The circular image from the viewfinder static fisheye camera covers a wide area and is used as the source image through which the corresponding locations between the two cameras are defined. A location in the circular image is called a primary coordinate and the corresponding coordinate in the zoomed image is called a secondary coordinate. The secondary coordinate in this case may be described in terms of the control signals to be sent to the movable camera, i.e. i) the pan signal, ii) the tilt signal and iii) the zoom factor. The pan control parameter is derived from the distance between a control point and the vertical reference line in the circular image. The tilt control parameter is derived from the distance between a control point and the horizontal reference line in the circular image. The zoom control parameter is derived from the distance between a control point and the bottom-most point of the circular image.
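The three distances described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the reference-line intersection `(cx, cy)` and the bottom-most row `y_bottom` are hypothetical names standing in for the values fixed when the reference lines are defined during setup.

```python
def point_deltas(x, y, cx, cy, y_bottom):
    """Derive the three control distances for a primary coordinate (x, y)
    in the circular image. (cx, cy) is the intersection of the vertical
    and horizontal reference lines; y_bottom is the bottom-most row.
    All values are hypothetical pixel coordinates for illustration."""
    delta_x = x - cx        # signed distance to the vertical reference line (pan)
    delta_y = y - cy        # signed distance to the horizontal reference line (tilt)
    delta_z = y_bottom - y  # distance to the bottom-most point (zoom)
    return delta_x, delta_y, delta_z
```

Each distance is later mapped to a pan, tilt, or zoom value through the relationship equations extracted during calibration.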
Brief Description of the Drawings
The present invention will now be described by way of example only with reference to the accompanying drawings, in which:
Fig. 1 shows an arrangement of cameras used in an automated surveillance system according to a preferred embodiment of the present invention.
Fig. 2 is a flowchart of a process for establishing a relationship between a static camera and a movable camera.
Fig. 3 is a flowchart of a process for generating pan, tilt, and zoom values in operation mode.
Fig. 4 shows a distribution of control points over an image.
Fig. 5 shows tabulated data obtained from the control points.
Fig. 6 shows graphs plotted with data shown in Fig. 5.
Fig. 7 shows graphs plotted with data shown in Fig. 5.
Detailed Description of the Preferred Embodiment
The automated surveillance system comprises a static camera with a wide-angle lens, which acts as a viewfinder, and a movable zoomed camera.
The first camera acquires an image and locates points of particular objects in a surveillance area or physical space, whereby the points are generated into control signals. The control signals are sent to the second camera, which is movable, to allow it to move (pan and tilt) to the corrected position with the corrected zoom factor. This interaction is useful for tracking an object, whereby an object identified in the viewfinder can be zoomed automatically so that it can be viewed in greater detail. In operation, the cameras are arranged such that they face the same direction, as shown in Fig. 1. Fig. 1 shows an example arrangement whereby the static camera (101) is disposed at the top and the PTZ camera (103) is disposed at the bottom.
The process of establishing the interaction between the first camera and the second camera is illustrated in the flowchart shown in Fig. 2. The process involves the following steps.
1. Define a number of control points in the physical space within the first camera's field of view; these control points are used as references to obtain x and y coordinates in the resultant circular image. The control points should be distributed across the region of interest and cover as much of the first camera's field of view as possible. For a standard image, the typical number of points is in the range of 40 to 60. The distribution of the control points is illustrated in Fig. 4.
2. Set the movable zoomed camera to its home position, at which the pan parameter value is reset to "0" and the tilt parameter value is reset to "-90". This is used as the home position (datum for calculation) for both the viewfinder and the movable zoomed camera.

3. Define vertical and horizontal reference lines in the circular image and set the intersection point of these lines to the home position of the movable zoomed camera (the object point that appears in the PTZ camera image when it is at home). From this point onwards the home position of the circular image and the zoomed image is the same.

4. Move the movable zoomed camera to each of the control points defined in step 1. Record the x and y coordinates in the circular image as well as the pan and tilt parameter values from the movable zoomed camera. It should be noted that there is a polarity/sign in both the x and y parameters in the circular image, as they are taken with respect to the intersection point of the cross. The tabulated data is shown in Fig. 5.

5. Determine the distance, Delta-x, from each control point to the vertical reference line. Plot a graph of pan value versus the signed Delta-x and extract the relationship equation. The graph is shown in Fig. 6.

6. Determine the distance, Delta-y, from each control point to the horizontal reference line. Plot a graph of tilt value versus the signed Delta-y and extract the relationship equation. The graph is shown in Fig. 7.

7. Define a number of zoom factor control points in the physical space; these control points must be located on the floor in the image. Image each zoom control point with the PTZ camera and record the resulting zoom factor. Measure the distance, Delta-z, between the coordinates of these zoom factor control points and the bottom-most point of the fisheye image. Plot a graph of zoom factor versus Delta-z, interpolate the points to obtain the closest line estimation, and extract the relationship equation.

8. In operation mode, feed in the Delta-x, Delta-y, and Delta-z values to obtain the pan, tilt, and zoom factor for a particular object in the scene.
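The "extract the relationship equation" steps above can be sketched as an ordinary least-squares line fit. This is a minimal sketch under the assumption that the plotted relationship is approximately linear; the sample data points are hypothetical stand-ins for the tabulated measurements of Fig. 5, not values from the patent.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b over paired samples.
    Used here to extract a relationship equation from recorded
    (Delta, control-value) pairs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Hypothetical calibration data: signed Delta-x (pixels) vs. pan value (degrees)
delta_x = [-200, -100, 0, 100, 200]
pan     = [-100,  -50, 0,  50, 100]

a, b = fit_line(delta_x, pan)  # relationship equation: pan = a * Delta-x + b
```

The same fit would be repeated for tilt versus Delta-y and zoom factor versus Delta-z to obtain the three relationship equations used in operation mode.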
Claims
1. A system for automated surveillance comprising:
a first camera that acts as a viewfinder;
a second camera that acts as a zoom; and
a mechanism for interacting the first camera with the second camera, wherein the mechanism controls an operation of the second camera based on information acquired by the first camera so that the second camera will automatically zoom to a corrected position.
2. A system according to Claim 1 wherein the information is in the form of coordinated points derived from a view finder image of the first camera.
3. A system according to Claim 1 or 2 wherein the information is generated into signals having pan, tilt and zoom values.
4. A system according to Claim 1 wherein the first camera is static and has a wide angle lens.
5. A system according to Claim 1 wherein the second camera is movable and has zoom capability.
6. A system according to Claim 4 wherein the first camera is a standard camera coupled with a fisheye lens.
7. A system according to Claim 5 wherein the second camera is a pan-tilt-zoom (PTZ) camera.
8. A system according to Claim 1 wherein the first and the second cameras are faced in the same direction in the forward looking position.
9. A system according to Claim 3 wherein the pan value is derived from polar angle in Cartesian and normalized to the first camera optical axis.
10. A system according to Claim 3 wherein the tilt value is derived from the radial information of the viewfinder image as an interpolated formula.
11. A system according to Claim 3 wherein the zoom control value is derived from the tilt angle.
12. A system according to Claim 1 wherein the mechanism comprises the steps of
a. defining a number of control points of an image in the first camera view as reference coordinates;
b. setting an initial position for the first and second cameras wherein the second camera pans at 0 and tilts at -90;
c. defining vertical and horizontal reference lines in the first camera view relative to the home position of the second camera;
d. locating the intersection point of the lines at the initial position;
e. moving the second camera to the predefined control points and obtaining the actual control values (pan, tilt and zoom) of the second camera;
f. recording the control values (pan, tilt) and the corresponding control points' x and y coordinates;
g. determining the distance, delta-x, of the control points from the vertical reference line in the image of the first camera;
h. plotting a graph of pan versus delta-x and determining the relationship equation of the graph;
i. determining the distance, delta-y, of the control points from the horizontal reference line in the image of the first camera;
j. plotting a graph of tilt versus delta-y and determining the relationship equation of the graph;
k. defining a number of zoom factor control points which are located on the floor of the first camera's image;
l. moving the second camera to image the zoom control points in item k;
m. obtaining the actual zoom values of the second camera at each control point;
n. recording the actual zoom values of the second camera;
o. determining the distance, delta-z, of the control points from the bottom point of the first camera's image;
p. plotting a graph of zoom versus delta-z and determining the relationship equation of the graph;
q. feeding the calibrated pan, tilt and zoom value equations to generate control values in operation mode.
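Step q, the operation mode, can be sketched as applying the three calibrated relationship equations to an object's deltas. The (slope, intercept) pairs below are hypothetical placeholders for the equations fitted in steps h, j and p, chosen only so that the home position maps to pan 0, tilt -90, zoom 1.

```python
# Hypothetical calibrated relationship equations: value = slope * delta + intercept
PAN_EQ  = (0.5, 0.0)    # pan  = 0.5  * delta_x
TILT_EQ = (0.4, -90.0)  # tilt = 0.4  * delta_y - 90
ZOOM_EQ = (0.02, 1.0)   # zoom = 0.02 * delta_z + 1

def apply_eq(eq, delta):
    """Evaluate a calibrated linear relationship equation at a given delta."""
    slope, intercept = eq
    return slope * delta + intercept

def operation_mode(delta_x, delta_y, delta_z):
    """Generate the control values sent to the second camera for an
    object whose deltas were measured in the first camera's image."""
    return (apply_eq(PAN_EQ, delta_x),
            apply_eq(TILT_EQ, delta_y),
            apply_eq(ZOOM_EQ, delta_z))
```

An object at the reference-line intersection on the bottom-most row would thus produce the second camera's home position, consistent with the datum defined in step b.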
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MYPI20093735 | 2009-09-08 | ||
MYPI20093735A MY158543A (en) | 2009-09-08 | 2009-09-08 | Control mechanism for automated surveillance system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011031128A1 true WO2011031128A1 (en) | 2011-03-17 |
Family
ID=43838266
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/MY2010/000157 WO2011031128A1 (en) | 2009-09-08 | 2010-09-02 | Control mechanism for automated surveillance system |
Country Status (2)
Country | Link |
---|---|
MY (1) | MY158543A (en) |
WO (1) | WO2011031128A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030071891A1 (en) * | 2001-08-09 | 2003-04-17 | Geng Z. Jason | Method and apparatus for an omni-directional video surveillance system |
US20060284971A1 (en) * | 2005-06-15 | 2006-12-21 | Wren Christopher R | Composite surveillance camera system |
-
2009
- 2009-09-08 MY MYPI20093735A patent/MY158543A/en unknown
-
2010
- 2010-09-02 WO PCT/MY2010/000157 patent/WO2011031128A1/en active Application Filing
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016000572A1 (en) * | 2014-06-30 | 2016-01-07 | 华为技术有限公司 | Image processing method and video camera |
US10425608B2 (en) | 2014-06-30 | 2019-09-24 | Huawei Technologies Co., Ltd. | Image processing method and camera |
CN104506776A (en) * | 2015-01-09 | 2015-04-08 | 成都新舟锐视科技有限公司 | Automatic focusing system for real-time balling machine tracking |
WO2019114617A1 (en) * | 2017-12-12 | 2019-06-20 | 华为技术有限公司 | Method, device, and system for fast capturing of still frame |
Also Published As
Publication number | Publication date |
---|---|
MY158543A (en) | 2016-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8488001B2 (en) | Semi-automatic relative calibration method for master slave camera control | |
US8390686B2 (en) | Surveillance camera apparatus and surveillance camera system | |
US8279283B2 (en) | Methods and systems for operating a video surveillance system | |
US8085300B2 (en) | Surveillance camera system, remote-controlled monitoring device, control method, and their control program | |
US20050036036A1 (en) | Camera control apparatus and method | |
US11190698B2 (en) | Imaging apparatus, control method for imaging apparatus, information processing apparatus, and storage medium | |
US10334150B2 (en) | Camera system and method of tracking object using the same | |
JP2006523043A (en) | Method and system for monitoring | |
CN102957862A (en) | Image capturing apparatus and control method thereof | |
JP5183152B2 (en) | Image processing device | |
KR101452342B1 (en) | Surveillance Camera Unit And Method of Operating The Same | |
KR20150130901A (en) | Camera apparatus and method of object tracking using the same | |
US11750922B2 (en) | Camera switchover control techniques for multiple-camera systems | |
WO2011031128A1 (en) | Control mechanism for automated surveillance system | |
JP2017098613A (en) | Imaging apparatus, its control method, and control program | |
CN109104562B (en) | Information processing apparatus, information processing method, and recording medium | |
JP2019054378A (en) | Imaging apparatus and control method thereof, and program | |
KR20060112721A (en) | System and method for camera actuator with direction sensor | |
KR20200081057A (en) | Method and Apparatus for Center Calibration of Camera System | |
TW201722145A (en) | 3D video surveillance system capable of automatic camera dispatching function, and surveillance method for using the same | |
WO2014119991A1 (en) | Directing steerable camera with user bias tracking | |
KR101341632B1 (en) | Optical axis error compensation system of the zoom camera, the method of the same | |
KR20120125037A (en) | Method for controlling surveillance system | |
JP6898826B2 (en) | Monitoring device | |
JP2004336569A (en) | Mobile object monitor system and mobile object monitor method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10815671 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10815671 Country of ref document: EP Kind code of ref document: A1 |