WO2003030550A1 - Optimal multi-camera setup for computer-based visual surveillance
- Publication number
- WO2003030550A1 (PCT/IB2002/003717)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- deployment
- measure
- effectiveness
- camera
- computer
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19645—Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/1968—Interfaces for setting up or customising the system
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0438—Sensor means for detecting
- G08B21/0476—Cameras to detect unsafe condition, e.g. video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Definitions
- This invention relates to the field of security systems, and in particular to the placement of multiple cameras to facilitate computer-vision applications.
- Cameras are often used in security systems and other visual monitoring applications.
- Computer programs and applications are continually being developed to process the image information obtained from a camera, or from multiple cameras.
- Face and figure recognition systems provide the capability of tracking identified persons or items as they move about a field of view, or among multiple fields of view.
- The placement of each camera affects the performance and effectiveness of the image processing system.
- Conventionally, the determination of proper placement of each camera is a manual process, wherein a security professional assesses the area and places the cameras in locations that provide effective and efficient coverage.
- Effective coverage is commonly defined as a camera placement that minimizes "blind spots" within each camera's field of view.
- Efficient coverage is commonly defined as coverage using as few cameras as possible, to reduce cost and complexity.
- That is, the conventional objective of the placement is to maximize the visual coverage of the secured area using a minimum number of cameras. Achieving such an objective, however, is often neither effective nor efficient for computer-vision applications.
- These objects and others are achieved by defining a measure of effectiveness of a camera's deployment that includes the camera's effectiveness in providing image information to computer-vision applications.
- That is, the effectiveness of the deployment includes measures based on the ability of one or more computer-vision applications to perform their intended functions using the image information provided by the deployed cameras.
- The deployment of the cameras also includes consideration of the perspective information that is provided by the deployment.
- Fig. 1 illustrates an example flow diagram of a multi-camera deployment system in accordance with this invention.
- Fig. 2 illustrates a second example flow diagram of a multi-camera deployment system in accordance with this invention.
- This invention is premised on the observation that a camera deployment that provides effective visual coverage does not necessarily provide sufficient image information for effective computer-vision processing. Camera locations that provide a wide coverage area may not provide perspective information; camera locations that provide perspective discrimination may not provide discernible context information; and so on.
- For visual coverage, a regular-shaped room with no obstructions will typically be allocated a single camera, located at an upper corner of the room, aimed coincident with the diagonal of the room and slightly downward. Assuming that the field of view of the camera is wide enough to encompass the entire room, or adjustable to sweep the entire room, a single camera will be sufficient for visual coverage of the room.
- In a conventional deployment, a room or hallway rarely contains more than one camera, an additional camera being used only when an obstruction interferes with the camera's field of view.
- Computer-vision systems often require more than one camera's view of a scene to identify the context of the view and to provide an interpretation of the scene based on the 3-dimensional location of objects within the scene. As such, a placement of cameras that merely provides visual coverage is often insufficient.
- Although algorithms are available for estimating 3-D dimensions from a single 2-D image, or from multiple 2-D images from a single camera with pan-tilt-zoom capability, such approaches are substantially less effective or less efficient than algorithms that use images of the same scene from different viewpoints.
- Some 2-D images from a single camera do provide for excellent 3-D dimension determination, such as a top-down view from a ceiling-mounted camera, because the image identifies where in the room a target object is located, and the type of object identifies its approximate height.
- Fig. 1 illustrates an example flow diagram of a multi-camera deployment system that includes consideration of a deployment's computer-vision effectiveness in accordance with this invention.
- At 110, a proposed initial camera deployment is defined, for example, by identifying camera locations on a displayed floor plan of the area that is being secured.
- At 120, the visual coverage provided by the deployment is assessed, using techniques common in the art.
- At 130, the "computer-vision effectiveness" of the deployment is determined, as discussed further below.
- Each computer-vision application performs its function based on select parameters that are extracted from the image.
- The particular parameters, and the function's sensitivity to each, are identifiable.
- For example, a gesture-recognition function may be very sensitive to horizontal and vertical movements (waving arms, etc.), and somewhat insensitive to depth movements. Defining x, y, and z as the horizontal, vertical, and depth dimensions, respectively, the gesture-recognition function can be said to be sensitive to delta-x and delta-y detection. Therefore, in this example, determining the computer-vision effectiveness of the deployment for gesture-recognition will be based on how well the deployment provides delta-x and delta-y parameters from the image.
- Such a determination is made based on each camera's location and orientation relative to the secured area, using, for example, a geometric model and conventional differential mathematics. Heuristics and other simplifications may also be used. For example, a downward-pointing camera will provide minimal, if any, delta-y information, and its measure of effectiveness for gesture-recognition will be poor. In lieu of a formal geometric model, a rating system may be used, wherein each camera is assigned a score based on its viewing angle relative to the horizontal.
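The angle-rating heuristic just described can be sketched as follows. This is an illustrative implementation, not the patent's: the cosine projection model and the 0-to-1 score range are assumptions chosen to make the heuristic concrete.

```python
import math

def delta_y_score(tilt_deg: float) -> float:
    """Heuristic score in [0, 1] for how well a camera observes vertical
    (delta-y) motion, given its tilt below the horizontal in degrees.

    Vertical motion projects onto the image plane roughly in proportion
    to cos(tilt): a horizontally aimed camera (0 degrees) sees it fully,
    a straight-down ceiling camera (90 degrees) sees almost none.
    """
    return max(0.0, math.cos(math.radians(tilt_deg)))

# A ceiling camera pointing straight down is a poor gesture-recognition view:
assert delta_y_score(90.0) < 0.01
# A wall camera aimed horizontally scores well:
assert delta_y_score(0.0) == 1.0
```

The same projection idea could be repeated per parameter (delta-x, delta-z) to build a per-camera score vector.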
- Similarly, an image-recognition function may be sensitive to the resolution of the image in the x and y directions, and the measure of image-recognition effectiveness will be based on the achievable resolution throughout the area being covered.
- A camera on a wall of a room may provide good x and y resolution for objects near that wall, but poor x and y resolution for objects near the far-opposite wall.
- In such a case, placing an additional camera on the far-opposite wall will increase the available resolution throughout the room, even though it is redundant with respect to visual coverage of the room.
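The resolution falloff behind this example can be quantified with a simple pinhole-camera approximation. The image width and field-of-view values below are illustrative assumptions, not figures from the patent.

```python
import math

def pixels_per_meter(distance_m: float, image_width_px: int = 640,
                     hfov_deg: float = 60.0) -> float:
    """Approximate horizontal resolution (pixels per meter of scene width)
    at a given distance from the camera, for a pinhole camera with the
    given image width and horizontal field of view."""
    # Width of the scene slice visible at this distance:
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return image_width_px / scene_width_m

near = pixels_per_meter(2.0)    # object near the camera's own wall
far = pixels_per_meter(10.0)    # object near the far-opposite wall
assert near > far               # resolution degrades with distance
```

A deployment tool could require `pixels_per_meter` to exceed a recognition threshold everywhere in the room, which is what motivates the "redundant" second camera.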
- As another example, a motion-estimation function that predicts the path of an intruder through a secured area may be sensitive to horizontal and depth movements (delta-x and delta-z), but relatively insensitive to vertical movements (delta-y), in areas such as rooms that do not provide a vertical egress, while being sensitive to vertical movements in areas such as stairways that do provide vertical egress.
- In such a case, the measure of computer-vision effectiveness will include a measure of the delta-x and delta-z sensitivity provided by the cameras in the rooms and a measure of the delta-y sensitivity provided by the cameras in the stairways.
- The sensitivities of a computer-vision system need not be limited to the example x, y, and z parameters discussed above.
- For example, a face-recognition system may be expected to recognize a person regardless of the direction that the person is facing. As such, in addition to x and y resolution, the system will be sensitive to the orientation of each camera's field of view, and the effectiveness of the deployment will depend upon having intersecting fields of view from a plurality of directions.
- The assessment of the deployment's effectiveness is typically a composite measure based on each camera's effectiveness, as well as the effectiveness of combinations of cameras. For example, if the computer-vision application is sensitive to delta-x, delta-y, and delta-z, the relationship of two cameras to each other and to the secured area may provide sufficient perspective information to determine all three parameters, even though neither of the two cameras provides all three on its own. In such a situation, the deployment system of this invention is configured to "ignore" the poor scores that may be determined for an individual camera when a higher score is determined for a combination of that camera with another camera.
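The "ignore individual scores when a pair scores higher" rule might be sketched as follows. The parameter sets assigned to each hypothetical camera are assumptions made for illustration only.

```python
from itertools import combinations

# Each camera is modeled by the set of motion parameters it resolves well.
# These assignments are illustrative, not derived from the patent.
cameras = {
    "wall_cam":    {"delta-x", "delta-y"},   # horizontal side view
    "ceiling_cam": {"delta-x", "delta-z"},   # top-down view
}

required = {"delta-x", "delta-y", "delta-z"}

def best_pair_score(cams: dict) -> float:
    """Score the deployment by its best camera pair: a pair whose combined
    view covers all required parameters scores 1.0, even when no single
    camera covers them all."""
    best = 0.0
    for a, b in combinations(cams, 2):
        covered = cams[a] | cams[b]
        best = max(best, len(covered & required) / len(required))
    return best

# Neither camera alone covers all three parameters...
assert all(params != required for params in cameras.values())
# ...but the pair does, so the individual shortfalls are "ignored":
assert best_pair_score(cameras) == 1.0
```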
- The deployment system may be configured to assume that the deployment must provide proper x, y, and z coordinates for objects in the secured area, and to measure the computer-vision effectiveness in terms of the perspective information provided by the deployment.
- This perspective measure is generally determined based on the location and orientation of two or more cameras with intersecting fields of view in the secured area.
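The perspective information available from two cameras with intersecting fields of view can be illustrated by plan-view triangulation: given each camera's floor position and its bearing to a sighted object, the object's floor coordinates follow from intersecting the two rays. This is a generic textbook sketch, not the patent's own method, and the camera positions below are invented.

```python
import math

def triangulate(c1, bearing1_deg, c2, bearing2_deg):
    """Intersect two bearing rays (plan view) from cameras at points c1 and
    c2, returning the (x, z) floor position of the sighted object."""
    t1, t2 = math.radians(bearing1_deg), math.radians(bearing2_deg)
    d1 = (math.cos(t1), math.sin(t1))    # unit direction of ray from c1
    d2 = (math.cos(t2), math.sin(t2))    # unit direction of ray from c2
    # Solve c1 + s*d1 == c2 + u*d2 for s using the 2-D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    s = ((c2[0] - c1[0]) * d2[1] - (c2[1] - c1[1]) * d2[0]) / denom
    return (c1[0] + s * d1[0], c1[1] + s * d1[1])

# Two corner-mounted cameras sighting the same object at floor point (3, 3):
x, z = triangulate((0, 0), 45.0, (6, 0), 135.0)
assert abs(x - 3.0) < 1e-9 and abs(z - 3.0) < 1e-9
```

When the rays are nearly parallel (`denom` near zero), the cameras provide little perspective discrimination, which is exactly the condition a perspective-based effectiveness measure would penalize.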
- At 140, the acceptability of the deployment is assessed, based on the measure of computer-vision effectiveness from 130 and, optionally, the visual coverage provided by the deployment from 120. If the deployment is unacceptable, it is modified at 150, and the process 130-140 (optionally 120-130-140) is repeated until an acceptable deployment is found.
- The modification at 150 may include a relocation of existing camera placements, or the addition of new cameras to the deployment, or both.
- The modification at 150 may be automated, manual, or a combination of both.
- In one approach, the deployment system highlights the area or areas having insufficient computer-vision effectiveness and suggests a location for an additional camera. Because the initial deployment 110 will typically be designed to assure sufficient visual coverage, adding a camera is assumed to be preferable to changing the initial camera locations, although the user is given the option of changing those locations. This deployment system is also particularly well suited for enhancing existing multi-camera installations, where adding a camera is generally easier than moving a previously installed one.
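The assess-and-modify loop of Fig. 1 might look like the following toy sketch, where "effectiveness" is reduced to a coverage fraction along a one-dimensional corridor. The threshold, camera range, and candidate positions are all illustrative assumptions, not values from the patent.

```python
# Toy assess/modify loop: sample points along a corridor, an acceptability
# threshold, and cameras that each cover +/- RANGE positions. All numbers
# here are invented for illustration.
points = list(range(20))        # sample positions along the corridor
candidates = list(range(20))    # allowed mounting positions
RANGE, THRESHOLD = 3, 0.9

def covered(cams):
    """Set of sample points seen by at least one camera."""
    return {p for p in points for c in cams if abs(p - c) <= RANGE}

deployment = [0]                # initial single-camera deployment
# Assess; while unacceptable, modify by adding the most useful new camera.
while len(covered(deployment)) / len(points) < THRESHOLD:
    best = max(candidates, key=lambda c: len(covered(deployment + [c])))
    deployment.append(best)

assert len(covered(deployment)) / len(points) >= THRESHOLD
```

A real implementation would replace the coverage fraction with the composite computer-vision effectiveness measure discussed above.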
- Fig. 2 illustrates a second example flow diagram of a multi-camera deployment system in accordance with this invention.
- In this embodiment, the camera locations are first determined, at 210, in order to provide sufficient visual coverage.
- This deployment at 210 may correspond to an existing deployment that had been installed to provide visual coverage, or to a proposed deployment, such as one provided by the techniques disclosed in the above-referenced PCT Application PCT/US00/40011, or by other automated deployment processes common in the art.
- The computer-vision effectiveness of the deployment is determined at 220, as discussed above with regard to block 130 of Fig. 1.
- At 230, the acceptability of the deployment is determined.
- In this embodiment, the acceptability of the deployment at 230 is based solely on the determined computer-vision effectiveness from 220.
- If the deployment is unacceptable, a new camera is added to the deployment, and, at 250, the location for each new camera is determined.
- To determine the location, the particular deficiency of the existing deployment is assessed relative to the aforementioned sensitivities of the particular computer-vision application. For example, if delta-z sensitivity is not provided by the current deployment, a ceiling-mounted camera location is a likely solution.
- The user is provided the option of identifying areas within which new cameras may be added and/or areas within which new cameras may not be added. For example, in an external area, the locations of existing poles or other structures upon which a camera can be mounted may be identified.
- The process at 250 is configured to re-determine the location of each of the added cameras each time a new camera is added. That is, as is known in the art, the optimal placement of one camera may not correspond to that camera's optimal placement when another camera is also available for placement. Similarly, if a third camera is added, the optimal locations of the first two cameras may change.
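The effect described here, that the best single-camera position is not necessarily part of the best pair of positions, can be shown with a toy one-dimensional example. The points, candidate positions, and coverage radius are invented for illustration.

```python
from itertools import combinations

# Positions 0..8 must be observed; cameras may be mounted at 1, 4, or 7,
# and each sees +/- RADIUS positions. All values are illustrative.
points = range(9)
candidates = [1, 4, 7]
RADIUS = 2

def coverage(cams):
    """Number of points seen by at least one camera."""
    return len({p for p in points for c in cams if abs(p - c) <= RADIUS})

best_single = max(candidates, key=lambda c: coverage([c]))
# Greedy: keep the best single position fixed, add the best second camera.
greedy_pair = (best_single,
               max(candidates, key=lambda c: coverage([best_single, c])))
# Joint: re-determine both locations together, as at block 250.
joint_pair = max(combinations(candidates, 2), key=coverage)

assert best_single == 4                  # the central spot wins alone...
assert best_single not in joint_pair     # ...but is dropped in the best pair
assert coverage(joint_pair) > coverage(greedy_pair)
```

This is why re-running the placement for all added cameras, rather than fixing earlier placements, can strictly improve the deployment.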
- The secured area may be partitioned into sub-areas, wherein the deployment of cameras in one sub-area is virtually independent of the deployment in another sub-area.
- For example, the deployment of cameras in each room may be processed as an independent deployment process.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Vascular Medicine (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Gerontology & Geriatric Medicine (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02765217A EP1433326A1 (en) | 2001-09-27 | 2002-09-11 | Optimal multi-camera setup for computer-based visual surveillance |
JP2003533612A JP2005505209A (en) | 2001-09-27 | 2002-09-11 | Optimal multi-camera setup for computer-based visual surveillance |
KR10-2004-7004440A KR20040037145A (en) | 2001-09-27 | 2002-09-11 | Optimal multi-camera setup for computer-based visual surveillance |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US32539901P | 2001-09-27 | 2001-09-27 | |
US60/325,399 | 2001-09-27 | ||
US10/165,089 US20030058342A1 (en) | 2001-09-27 | 2002-06-07 | Optimal multi-camera setup for computer-based visual surveillance |
US10/165,089 | 2002-06-07 | ||
US10/189,272 | 2002-07-03 | ||
US10/189,272 US20030058111A1 (en) | 2001-09-27 | 2002-07-03 | Computer vision based elderly care monitoring system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003030550A1 (en) | 2003-04-10 |
Family
ID=27389101
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2002/003717 WO2003030550A1 (en) | 2001-09-27 | 2002-09-11 | Optimal multi-camera setup for computer-based visual surveillance |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1433326A1 (en) |
JP (1) | JP2005505209A (en) |
KR (1) | KR20040037145A (en) |
CN (1) | CN1561640A (en) |
WO (1) | WO2003030550A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010125489A1 (en) * | 2009-04-29 | 2010-11-04 | Koninklijke Philips Electronics N.V. | Method of selecting an optimal viewing angle position for a camera |
CN101853399B (en) * | 2010-05-11 | 2013-01-09 | 北京航空航天大学 | Method for realizing blind road and pedestrian crossing real-time detection by utilizing computer vision technology |
JP6218089B2 (en) * | 2013-06-18 | 2017-10-25 | パナソニックIpマネジメント株式会社 | Imaging position determination device and imaging position determination method |
US9955124B2 (en) * | 2013-06-21 | 2018-04-24 | Hitachi, Ltd. | Sensor placement determination device and sensor placement determination method |
EP2835792B1 (en) * | 2013-08-07 | 2016-10-05 | Axis AB | Method and system for selecting position and orientation for a monitoring camera |
CN106716447B (en) * | 2015-08-10 | 2018-05-15 | 皇家飞利浦有限公司 | Take detection |
CN108234900B (en) * | 2018-02-13 | 2020-11-20 | 深圳市瑞立视多媒体科技有限公司 | Camera configuration method and device |
CN108495057B (en) * | 2018-02-13 | 2020-12-08 | 深圳市瑞立视多媒体科技有限公司 | Camera configuration method and device |
CN108471496B (en) * | 2018-02-13 | 2020-11-03 | 深圳市瑞立视多媒体科技有限公司 | Camera configuration method and device |
CN108449551B (en) * | 2018-02-13 | 2020-11-03 | 深圳市瑞立视多媒体科技有限公司 | Camera configuration method and device |
US20230288527A1 (en) * | 2020-10-29 | 2023-09-14 | Nec Corporation | Allocation determination apparatus, allocation determination method, and computer-readable medium |
CN114724323B (en) * | 2022-06-09 | 2022-09-02 | 北京科技大学 | Point distribution method of portable intelligent electronic fence for fire scene protection |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0529317A1 (en) * | 1991-08-22 | 1993-03-03 | Sensormatic Electronics Corporation | Surveillance system with master camera control of slave cameras |
US5331413A (en) * | 1992-09-28 | 1994-07-19 | The United States Of America As Represented By The United States National Aeronautics And Space Administration | Adjustable control station with movable monitors and cameras for viewing systems in robotics and teleoperations |
EP0714081A1 (en) * | 1994-11-22 | 1996-05-29 | Sensormatic Electronics Corporation | Video surveillance system |
US6215519B1 (en) * | 1998-03-04 | 2001-04-10 | The Trustees Of Columbia University In The City Of New York | Combined wide angle and narrow angle imaging system and method for surveillance and monitoring |
2002
- 2002-09-11 JP JP2003533612A patent/JP2005505209A/en not_active Withdrawn
- 2002-09-11 WO PCT/IB2002/003717 patent/WO2003030550A1/en not_active Application Discontinuation
- 2002-09-11 CN CNA028190580A patent/CN1561640A/en active Pending
- 2002-09-11 EP EP02765217A patent/EP1433326A1/en not_active Withdrawn
- 2002-09-11 KR KR10-2004-7004440A patent/KR20040037145A/en not_active Application Discontinuation
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008142504A1 (en) * | 2007-05-19 | 2008-11-27 | Videotec S.P.A. | Method and system for monitoring an environment |
EP2533535A1 (en) * | 2007-05-19 | 2012-12-12 | Videotec S.p.a. | Method and system for monitoring an environment |
US8350911B2 (en) | 2007-05-19 | 2013-01-08 | Videotec S.P.A. | Method and system for monitoring an environment |
RU2494567C2 (en) * | 2007-05-19 | 2013-09-27 | Видеотек С.П.А. | Environment monitoring method and system |
CN101572804B (en) * | 2009-03-30 | 2012-03-21 | 浙江大学 | Multi-camera intelligent control method and device |
US8817102B2 (en) | 2010-06-28 | 2014-08-26 | Hitachi, Ltd. | Camera layout determination support device |
US9591272B2 (en) | 2012-04-02 | 2017-03-07 | Mcmaster University | Optimal camera selection in array of monitoring cameras |
JP2015517247A (en) * | 2012-04-02 | 2015-06-18 | マックマスター ユニバーシティー | Optimal camera selection in an array of cameras for monitoring and surveillance applications |
US9942468B2 (en) | 2012-04-02 | 2018-04-10 | Mcmaster University | Optimal camera selection in array of monitoring cameras |
US20140278281A1 (en) * | 2013-03-15 | 2014-09-18 | Adt Us Holdings, Inc. | Security system using visual floor plan |
US9898921B2 (en) | 2013-03-15 | 2018-02-20 | Adt Us Holdings, Inc. | Security system installation |
US10073929B2 (en) * | 2013-03-15 | 2018-09-11 | Adt Us Holdings, Inc. | Security system using visual floor plan |
WO2021035012A1 (en) * | 2019-08-22 | 2021-02-25 | Cubic Corporation | Self-initializing machine vision sensors |
US11380013B2 (en) | 2019-08-22 | 2022-07-05 | Cubic Corporation | Self-initializing machine vision sensors |
WO2022060442A1 (en) * | 2020-09-18 | 2022-03-24 | Microsoft Technology Licensing, Llc | Camera placement guidance |
US11496674B2 (en) | 2020-09-18 | 2022-11-08 | Microsoft Technology Licensing, Llc | Camera placement guidance |
CN112291526A (en) * | 2020-10-30 | 2021-01-29 | 重庆紫光华山智安科技有限公司 | Monitoring point determining method and device, electronic equipment and storage medium |
CN112291526B (en) * | 2020-10-30 | 2022-11-22 | 重庆紫光华山智安科技有限公司 | Monitoring point determining method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN1561640A (en) | 2005-01-05 |
EP1433326A1 (en) | 2004-06-30 |
KR20040037145A (en) | 2004-05-04 |
JP2005505209A (en) | 2005-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20030058342A1 (en) | Optimal multi-camera setup for computer-based visual surveillance | |
EP1433326A1 (en) | Optimal multi-camera setup for computer-based visual surveillance | |
US7397929B2 (en) | Method and apparatus for monitoring a passageway using 3D images | |
KR100660762B1 (en) | Figure tracking in a multiple camera system | |
RU2251739C2 (en) | Objects recognition and tracking system | |
US20020196330A1 (en) | Security camera system for tracking moving objects in both forward and reverse directions | |
US20050134685A1 (en) | Master-slave automated video-based surveillance system | |
JP5956248B2 (en) | Image monitoring device | |
WO2005026907A9 (en) | Method and apparatus for computerized image background analysis | |
WO1999045511A1 (en) | A combined wide angle and narrow angle imaging system and method for surveillance and monitoring | |
WO2011054971A2 (en) | Method and system for detecting the movement of objects | |
Snidaro et al. | Automatic camera selection and fusion for outdoor surveillance under changing weather conditions | |
GB2368482A (en) | Pose-dependent viewing system | |
US11227376B2 (en) | Camera layout suitability evaluation apparatus, control method thereof, optimum camera layout calculation apparatus, and computer readable medium | |
US7355626B2 (en) | Location of events in a three dimensional space under surveillance | |
Conci et al. | Camera placement using particle swarm optimization in visual surveillance applications | |
CN113841180A (en) | Method for capturing movement of an object and movement capturing system | |
KR102441436B1 (en) | System and method for security | |
Jung et al. | Tracking multiple moving targets using a camera and laser rangefinder | |
GB2352899A (en) | Tracking moving objects | |
JP6548683B2 (en) | Object image estimation device and object image determination device | |
JP6548682B2 (en) | Object image judgment device | |
JP4448249B2 (en) | Image recognition device | |
KR102672032B1 (en) | System and method for determining the position of the camera image center point by the vanishing point position | |
WO2024135342A1 (en) | Control system, control method, and program |
Legal Events
- AK (Designated states) — Kind code of ref document: A1; Designated state(s): CN JP
- AL (Designated countries for regional patents) — Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FR GB GR IE IT LU MC NL PT SE SK TR
- 121 — Ep: the epo has been informed by wipo that ep was designated in this application
- WWE (Wipo information: entry into national phase) — Ref document number: 2002765217; Country of ref document: EP
- WWE (Wipo information: entry into national phase) — Ref document number: 2003533612; Country of ref document: JP
- WWE (Wipo information: entry into national phase) — Ref document number: 20028190580; Country of ref document: CN. Ref document number: 1020047004440; Country of ref document: KR
- WWP (Wipo information: published in national office) — Ref document number: 2002765217; Country of ref document: EP
- WWW (Wipo information: withdrawn in national office) — Ref document number: 2002765217; Country of ref document: EP