WO2000008856A1 - Figure tracking in a multiple camera system - Google Patents
- Publication number
- WO2000008856A1 (PCT/EP1999/005505)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- location
- target
- view
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D3/00—Control of position or direction
- G05D3/12—Control of position or direction using feedback
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
Definitions
- This invention relates to a system for controlling multiple video cameras.
- This invention allows for an automated camera handoff for selecting and directing cameras within a multi-camera system, as might be used in a security system or a multi-camera broadcasting system.
- The automation is provided by tracking a figure within the image from an individual camera, coupled with an area representation of the fields of view of each of the other cameras.
- Security systems for airports, casinos, and the like typically employ a multitude of cameras that provide images of selected areas to a control station. The images from each of these cameras, or a subset of these cameras, are displayed on one or more monitors at the control station.
- The operator of the control station is provided the ability to select any one of the cameras for display of its image on a primary monitor and, if the camera is adjustable, to control the camera's field of view.
- Such control systems are also utilized for selecting from among multiple cameras at an event being broadcast, for example, multiple cameras at a sports arena, or studio.
- The selection and control of the cameras is typically accomplished by controlling a bank of switches, or by selecting from amongst a list of cameras on a computer terminal. To view a particular area, the operator selects the camera associated with that area. If the camera is adjustable, the operator subsequently adjusts the selected camera's field of view by adjusting its rotation about a vertical axis (pan) or horizontal axis (tilt), or its magnification (zoom).
- The entire range of an adjustable camera's view is termed herein the camera's potential field of view, whereas the view resulting from the particular pan, tilt, and zoom settings is termed the camera's actual field of view.
- Image processing algorithms are available which allow for the identification of a particular pattern, or figure, within an image, and the identification of any subsequent movement of that figure. Coupled with a security control system, such image processing algorithms allow for the automated adjustment of a camera so as to keep the figure in the center of the camera's actual field of view. When the figure travels beyond the potential field of view of the camera, the operator selects another camera whose potential field of view contains the figure at its new location, adjusts the camera, identifies the figure in the camera's actual field of view, and thereafter continues the automated tracking until the figure exits that camera's potential field of view.
- In the conventional camera selection scenario, the operator must be familiar with the layout of the secured area, as well as the correspondence between the displayed image and this layout. That is, for example, if a figure is seen exiting through one of several doorways, the operator must be able to quickly determine to which other area that particular doorway leads, and must further determine which camera covers that other area.
- The invention provides a camera handoff system, a security system, and a method as defined in the independent claims.
- The dependent claims define advantageous embodiments.
- The preferred system allows for the near-continuous display of a figure as the figure moves about throughout the multiple cameras' potential fields of view.
- The approximate physical location of a figure is determined from the displayed image, the identification of the figure within this image by the figure tracking system, and knowledge of the location and actual field of view of the camera producing the displayed image. If the figure exits a selected camera's field of view, another camera containing the figure within its field of view is selected.
- The bounds of each camera's potential field of view are stored in the system. The system determines which cameras' potential fields of view contain the figure by testing whether the figure's determined physical location lies within the bounds of each camera's field of view.
- The system determines which other camera's potential field of view contains the figure, then adjusts that camera's actual field of view to contain the figure.
- The system automatically selects another camera and communicates the appropriate information to the figure tracking process to continue tracking the figure using this other camera.
- The system also contains predictive location determination algorithms. By assessing the movement of the figure, the selection and adjustment of the next camera can be effected based upon the predicted subsequent location of the figure. Such predictive techniques are effective for tracking a figure in a secured area in which the cameras' fields of view do not necessarily overlap, and also for selecting from among multiple cameras containing the figure in their potential fields of view.
- The operator need not determine the potential egress points from each camera's field of view, nor know which camera or cameras cover a given area, nor which areas are adjacent to each other.
- The selection of a target is also automated.
- Security systems often automatically select a camera associated with an alarm, for the presentation of a view of the alarmed area to the operator.
- The system can automatically select and adjust the camera associated with the alarm to contain that target point, and identify the target as those portions of the image which exhibit movement. Thereafter, the system will track the target, as discussed above.
- FIG. 1 illustrates an example multi-camera security system in accordance with this invention.
- FIG. 2 illustrates an example graphic representation of a secured area with a multi-camera security system, in accordance with this invention.
- FIGs. 3a, 3b and 3c illustrate example field of view polygons associated with cameras in a multi-camera security system, in accordance with this invention.
- FIG. 4 illustrates an example three dimensional representation of a secured area and a camera's field of view polyhedron, in accordance with this invention.
- FIGs. 5a, 5b and 5c illustrate an example of the association between a figure in an image from a camera and the physical representation of the secured area, in accordance with this invention.

Description of the Preferred Embodiments
- FIG. 1 illustrates a multi-camera security system.
- The system comprises video cameras 101, 102, 103 and 104-106 (shown in FIG. 2). Cameras 101 and 102 are shown as adjustable, pan/tilt/zoom, cameras.
- The cameras 101, 102, 103 provide an input to a camera handoff system 120; the connections between the cameras 101, 102, 103 and the camera handoff system 120 may be direct or remote, for example, via a telephone connection.
- The camera handoff system 120 includes a controller 130, a location determinator 140, and a field of view determinator 150.
- The controller 130 effects the control of the cameras 101, 102, 103 based on inputs from the sensors 111, 112, the operator station 170, the location determinator 140, and the field of view determinator 150.
- An operator controls the security system via an operator's station 170, and controller 130.
- The operator typically selects from options presented on a screen 180 to select one of the cameras 101, 102, 103, and controls the selected camera to change its line of sight, via pan and tilt adjustments, or its magnification, via zoom adjustments.
- The image from the selected camera's field of view is presented to the operator for viewing via the switch 135.
- The optional alarm sensors 111, 112 provide for automatic camera selection when an alarm condition is sensed.
- Each alarm sensor has one or more cameras associated with it; when the alarm is activated, an associated camera is selected and adjusted to a predefined line of sight and the view is displayed on the screen 180 for the operator's further assessment and subsequent security actions.
- The field of view determinator 150 determines the field of view of each camera based upon its location and orientation.
- Non-adjustable camera 103 has a fixed field of view, whereas the adjustable cameras 101, 102 each have varying fields of view, depending upon the current pan, tilt, and zoom settings of the camera.
- The camera handoff system 120 includes a database 160 that describes the secured area and the location of each camera.
- The database 160 may include a graphic representation of the secured area, for example, a floor plan as shown in FIG. 2.
- The floor plan is created and entered into the control system when the security system is installed, using, for example, Computer Aided Design (CAD) techniques well known to one skilled in the art.
- The location determinator 140 determines the location of an object within a selected camera's field of view. Based upon the object's location within the image from the selected camera, and the camera's physical location and orientation within the secured area, the location determinator 140 determines the object's physical location within the secured area.
- The controller 130 determines which cameras' fields of view include the object's physical location and selects the appropriate camera when the object traverses from one camera's field of view to another's. The switching from one camera to another is termed a camera handoff.
- The camera handoff is further automated via the use of a figure tracking system 144 within the location determinator 140.
- Line segments P1 through P5 represent the path of a person (not shown) traversing the secured areas.
- The operator of the security system, upon detecting the figure of the person in the image from camera 105, identifies the figure to the figure tracking system 144, typically by outlining the figure on a copy of the image from camera 105 on the video screen 180.
- Alternatively, automated means can be employed to identify moving objects in an image that conform to a particular target profile, such as size, shape, or speed.
- Camera 105 is initially adjusted to capture the figure, and the figure tracking techniques continually monitor and report the location of the figure in the image produced from camera 105.
- The figure tracking system 144 associates the characteristics of the selected area, such as color combinations and patterns, with the identified figure, or target. Thereafter, the figure tracking system 144 determines the subsequent location of this same characteristic pattern, corresponding to the movement of the identified target as it moves about the camera's field of view.
- Manual figure tracking by the operator may be used in addition to, or in lieu of, the automated figure tracking system 144. In a busy scene, the operator may be better able to distinguish the target. In a manual figure tracking mode, the operator uses a mouse or other suitable input device to point to the target as it traverses the image on the display 180.
- The controller 130 adjusts camera 105 to maintain the target figure in the center of the image from camera 105. That is, camera 105's line of sight and actual field of view will be adjusted to continue to contain the figure as the person moves along path P1 within camera 105's potential field of view. Soon after the person progresses along path P2, the person will no longer be within camera 105's potential field of view.
- The controller 130 selects camera 106 when the person enters camera 106's potential field of view.
- The figure tracking techniques are subsequently applied to continue to track the figure in the image from camera 106.
- The system in accordance with this invention will select camera 103, then camera 102, then camera 104, and then camera 102 again, as the person proceeds along the P3-P4-P5 path.
- The camera handoff system 120 includes a representation of each camera's location and potential field of view, relative to each other.
- The camera locations are provided relative to the site plan of the secured area that is contained in the secured area database 160.
- Associated with each camera is a polygon or polyhedron outlining the camera's potential field of view.
- FIG. 3a illustrates the polygon associated with camera 102.
- FIG. 3b illustrates the polygon associated with camera 103.
- Camera 102 is a camera having an adjustable field of view, and thus can view any area within a full 360 degree arc, provided that it is not blocked by an obstruction.
- Camera 103 is a camera with a fixed field of view, as represented by the limited view angle 203.
- Camera 102's potential field of view is the polygon bounded by vertices 221 through 229.
- Camera 103's field of view is the polygon bounded by vertices 230-239.
- The field of view polygon can include details such as the ability to see through passages in obstructions, as shown by the vertices 238 and 239 in FIG. 3b.
- Also associated with each camera is the location of the camera, shown for example as 220, 230, 240 in FIGs. 3a, 3b, 3c.
- The polygon representing the field of view of camera 104 is shown in FIG. 3c, comprising vertices 240 through 256.
- As shown in FIG. 3c, the field of view polygon can omit details, as shown by the use of vertices 244-245, omitting the actual field of view vertices 264-265.
- The level of detail of the polygons is relatively arbitrary; typically, one would provide the detail necessary to cover the maximum surveillance area within the secured area. If an area is covered by multiple cameras, there is little need to note that a particular camera can also view that area through a doorway. Conversely, if the only view of an area is through such a doorway, encoding the polygon to include this otherwise uncovered area may be worthwhile.
- Polygon bounds can be defined to merely include the area of interest, as shown for example in FIG. 3c, where the bounds 249-250 and 253-254 are drawn just beyond the perimeter of the area being secured.
- The site map may also be represented as a three-dimensional model, as shown in FIG. 4.
- The cameras' fields of view are then represented by polyhedra, to capture the three-dimensional nature of a camera's field of view.
- The polyhedron associated with camera 104 is shown in FIG. 4, and is represented by the vertices 441 through 462.
- The detail of the polyhedron model is dependent upon the level of precision desired.
- Vertices 449 through 454 model the view through the portal 480 as a wedge-shaped area;
- vertices 455 through 462 model the view through the portal 481 as a block-shaped area.
- Three dimensional modeling will provide for greater flexibility and accuracy in the determination of actual location of the target, but at increased computational costs. For ease of understanding, two dimensional modeling will be discussed hereafter. The techniques employed are equally applicable to three dimensional site maps, as would be evident to one skilled in the art.
- The coordinate system utilized for encoding the camera locations and orientations can be any convenient form. Actual dimensions, relative to a reference such as the floor plan, may be used; or scaled dimensions, such as screen coordinates, may be used. Techniques for converting from one coordinate system to another are well known to one skilled in the art, and different coordinate systems may be utilized as required. Combinations of three-dimensional and two-dimensional modeling may also be employed, wherein, for example, the cameras at each floor of a multistoried building are represented by a two-dimensional plan, and each of these plans has a third, elevation, dimension associated with it. In this manner, the computationally complex process of associating an image with a physical locale can operate in the two-dimensional representation, and the third dimension need only be processed when the target enters an elevator or stairway.
- FIGs. 5a-5c demonstrate the association of a figure in an image with a target in the physical coordinate system, in accordance with this invention.
- An image 510 from camera 502 (shown in Fig. 5c), containing a figure 511, is shown in FIG. 5a.
- Figure tracking processes are available that determine a figure's location within an image and allow a camera control system to adjust camera 502's line of sight so as to center the figure in the image, as shown in FIG. 5b.
- The controller 130 in accordance with this invention will maintain camera 502's actual line of sight, in terms of the physical site plan, for subsequent processing.
- The line of sight from the camera to the figure is determined by the angular distance the figure is offset from the center of the image. By adjusting the camera to center the figure, a greater degree of accuracy can be achieved in resolving the actual line of sight to the figure.
- The direction of the target from the camera, in relation to the physical site plan, can thus be determined.
- The term line of sight is used herein to mean the straight line between the camera and the target in the physical coordinate site plan, independent of whether the camera is adjusted to effect this line of sight.
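The determination of the line of sight from the figure's angular offset in the image can be sketched as follows, using a pinhole-camera approximation. The function name and parameters (pan angle, horizontal angle of view, pixel coordinates) are illustrative assumptions, not part of the disclosure:

```python
import math

def line_of_sight_bearing(pan_deg, hfov_deg, image_width, figure_x):
    """Estimate the bearing (degrees, in the site-plan frame) from the camera
    to a figure, given the camera's current pan angle and the figure's
    horizontal pixel position in the image (pinhole-camera approximation)."""
    # Effective focal length in pixels for the current zoom setting.
    focal_px = (image_width / 2) / math.tan(math.radians(hfov_deg) / 2)
    # Angular offset of the figure from the image center.
    offset_deg = math.degrees(math.atan((figure_x - image_width / 2) / focal_px))
    return pan_deg + offset_deg
```

When the camera has been adjusted to center the figure, the offset term vanishes and the bearing is simply the camera's pan angle, which is why centering improves accuracy.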
- FIG. 5c illustrates the physical representation of a secured area, as well as the location of camera 502, the line of sight 580 to the target, and the camera's actual field of view, as bounded by rays 581 and 582 about an angle of view 585.
- If the target also lies along the line of sight of another camera, the intersection of the two lines of sight determines the target's actual location (triangulation). This triangulation method, however, requires that the target lie within the fields of view of two or more cameras.
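The triangulation step can be sketched as the intersection of two rays, each given as a camera position and a bearing in the site-plan frame. The function and coordinate conventions are illustrative assumptions:

```python
import math

def triangulate(cam_a, bearing_a_deg, cam_b, bearing_b_deg):
    """Locate a target at the intersection of two lines of sight, each given
    as a camera position (x, y) and a bearing in degrees (site-plan frame).
    Returns None for (near-)parallel lines of sight."""
    ax, ay = cam_a
    bx, by = cam_b
    # Unit direction vectors of the two lines of sight.
    dax, day = math.cos(math.radians(bearing_a_deg)), math.sin(math.radians(bearing_a_deg))
    dbx, dby = math.cos(math.radians(bearing_b_deg)), math.sin(math.radians(bearing_b_deg))
    denom = dax * dby - day * dbx  # 2-D cross product of the directions
    if abs(denom) < 1e-9:
        return None  # parallel lines of sight: no unique intersection
    # Distance along camera A's ray to the intersection point.
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)
```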
- Alternatively, the target's distance (range) from the camera can be determined from the setting of the focus adjustment required to bring the target into focus.
- The amount of focus control applied to bring the target into focus provides sufficient information to estimate the distance of the target from the camera, provided that the correlation between focus control and focal distance is known. Any number of known techniques can be employed for modeling this correlation.
- The camera itself may be able to report the focal distance directly to the camera handoff system.
- Alternatively, the focal distance may be provided by independent means, such as radar or sonar ranging associated with each camera.
- The correlation between focus control and focal distance is modeled as a polynomial, associating the angular rotation x of the focus control with the focal distance R as follows:
- R = a0 + a1·x + a2·x^2 + ... + an·x^n
- The coefficients a0 through an are determined empirically. At least n+1 measurements are taken, adjusting the focus x of the camera to focus upon an item placed at each of n+1 distances from the camera. Conventional least squares curve fitting techniques are applied to this set of measurements to determine the coefficients a0 through an.
- These measurements and curve fitting techniques can be applied to each camera, to determine the particular polynomial coefficients for each camera; or, a single set of polynomial coefficients can be applied to all cameras having the same auto-focus mechanism.
- The common single set of coefficients is provided as the default parameters for each camera, with the capability of subsequently modifying these coefficients via camera-specific measurements, as required.
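The empirical calibration described above can be sketched as a least squares polynomial fit; here `numpy.polyfit` stands in for the "conventional least squares curve fitting techniques", and the function names, degree, and sample values are illustrative:

```python
import numpy as np

def fit_focus_model(focus_positions, measured_distances, degree=2):
    """Fit R = a0 + a1*x + ... + an*x^n from calibration measurements
    (focus-ring position x vs. known target distance R, at least n+1 points)."""
    # np.polyfit returns the highest-order coefficient first; reverse to a0..an.
    return np.polyfit(focus_positions, measured_distances, degree)[::-1]

def focal_distance(coeffs, x):
    """Evaluate the fitted polynomial at focus position x to estimate range R."""
    return sum(a * x ** i for i, a in enumerate(coeffs))
```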
- If the camera is not adjustable, or has a fixed focus, alternative techniques can be employed to estimate the range of the target from the camera. For example, if the target to be tracked can be expected to be of a given average physical size, the size of the figure of the target in the image can be used to estimate the distance, using the conventional square law correlation between image size and distance. Similarly, if the camera's line of sight is set at an angle to the surface of the secured area, the vertical location of the figure in the displayed image will be correlated with the distance from the camera. These and other techniques for estimating an object's distance, or range, from a camera are well known in the art.
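The square law correlation between apparent figure size (area) and distance can be sketched relative to a reference observation at a known distance; the function name and values are illustrative assumptions:

```python
def range_from_figure_size(ref_distance, ref_area_px, observed_area_px):
    """Estimate target range from the apparent area of its figure: image
    area falls off with the square of distance, so
    d = d_ref * sqrt(A_ref / A_obs) relative to a reference observation."""
    return ref_distance * (ref_area_px / observed_area_px) ** 0.5
```

A figure whose image area shrinks to one quarter of the reference area is estimated to be at twice the reference distance.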
- The target location P in the site plan coordinate system, corresponding to the figure location in the displayed image from the camera, can thus be determined.
- Because the cameras' fields of view are modeled in this same coordinate system, the cameras within whose fields of view the location P lies can be determined.
- The cameras whose fields of view are in proximity to the location P can also be determined.
- Each camera whose potential field of view includes the target point can be automatically adjusted to center the target point in its actual field of view, independent of whether that camera is selected as the camera utilized for figure tracking.
- All cameras that contain the target in their potential fields of view, and that are not allocated to a higher priority task, are automatically redirected to contain the target in their actual fields of view.
- The techniques presented herein are also applicable to a manual figure tracking scenario. That is, for example, the operator points to a figure in the image from a camera, and the system determines the line of sight and range as discussed above. Thereafter, knowing the target location, the system automatically displays the same target location from the other cameras.
- Such a manual technique would be useful, for example, for managing multiple cameras at a sports event: the operator points to a particular player, and the other cameras having this player in their fields of view are identified for alternative selection and/or redirection to also include this player.
- A variety of techniques may be employed to determine whether to select a different camera from the one currently selected for figure tracking, as well as to select among multiple candidate cameras. Selection can be maintained with the current camera until the figure tracking system 144 reports that the figure is no longer within its view; at that time, one of the cameras determined to have contained the target at its prior location P can be selected. That camera is positioned to this location P, and the figure tracking system 144 is directed to locate the figure in the image from this camera.
- The assumption in this scenario is that the cameras are arranged to have overlapping fields of view, and that the edges of these fields of view are not coincident, such that the target cannot exit the fields of view of two cameras simultaneously.
- The camera handoff system includes a predictor 142 that estimates a next location Q, based upon the motion (sequence of prior locations) of the figure.
- A linear model can be used, wherein the next location is equal to the prior location plus the vector distance the target traveled from its next-prior location.
- Alternatively, a non-linear model can be used, wherein the next location is dependent upon multiple prior locations, so as to model both velocity and acceleration.
- The locations reported by the figure tracking system 144 exhibit jitter, or sporadic deviations, because the movement of a figure such as a person, comprising arbitrarily moving appendages and relatively indistinct edges, is difficult to determine absolutely.
- Data smoothing techniques can be applied so as to minimize the jitter in the predicted location Q, whether it is determined using a linear or non-linear model.
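A minimal sketch of the predictor, combining the linear motion model above with exponential smoothing to damp tracking jitter. The smoothing factor `alpha` and the function name are illustrative assumptions:

```python
def predict_next(history, alpha=0.5):
    """Predict the target's next (x, y) location: apply an exponential
    moving average to the observed track (jitter damping), then extrapolate
    linearly from the last two smoothed positions. Requires len(history) >= 2."""
    sx, sy = history[0]
    smoothed = [(sx, sy)]
    for x, y in history[1:]:
        # Exponential moving average: alpha=1 reproduces the raw track.
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    # Linear model: next = last + (last - previous).
    (px, py), (lx, ly) = smoothed[-2], smoothed[-1]
    return (2 * lx - px, 2 * ly - py)
```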
- The cameras containing the point Q within their potential fields of view can then be determined. If the predicted location Q lies outside the limits of the current camera's potential field of view, an alternative camera, containing location Q in its field of view, is selected and adjusted so as to contain the target in its actual field of view. The system need not wait until the predicted location is no longer within the current camera's field of view; if the predicted location Q is approaching the bounds of the selected camera's field of view, but is well within the bounds of another camera's field of view, the other camera can be selected. Similarly, the distance from each camera can be utilized in this selection process.
- A weighting factor can be associated with each of the parameters relevant to viewing a security scene, such as the distance from the camera, the distance from the edge of the camera's field of view, the likelihood that the target will be locatable in the camera's field of view (influenced, for example, by the complexity of the image from one camera versus another), whether the camera is currently selected, etc.
- From these weighted parameters, the preference for each candidate camera can be determined, and the most preferred camera selected and adjusted.
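One way to sketch the weighted preference computation. The particular parameter set, signs, and weight values are illustrative assumptions, not prescribed by the disclosure:

```python
def camera_preference(distance, edge_margin, scene_complexity, is_selected,
                      w_dist=1.0, w_edge=2.0, w_complex=0.5, w_sel=3.0):
    """Weighted preference score for a candidate camera: nearer cameras,
    larger margins to the field-of-view edge, simpler scenes, and the
    currently selected camera (handoff hysteresis) all raise the score."""
    return (-w_dist * distance
            + w_edge * edge_margin
            - w_complex * scene_complexity
            + w_sel * (1.0 if is_selected else 0.0))
```

The candidate with the highest score would be selected and adjusted; the `w_sel` term biases the choice toward the current camera to avoid needless handoffs.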
- Hereafter, target location P identifies the location to which a camera is to be directed, regardless of whether this location P is the prior location or the estimated next location.
- Each camera can have an associated list of those cameras having potential fields of view in proximity to its own. This list would exclude cameras whose fields of view are in proximity but physically separated and inaccessible from its field of view; for example, a camera in an adjacent room to which there is no access from this camera's room.
- The list could be segregated by sub-areas within the camera's field of view, so as to include only those cameras having fields of view with direct access from each of the sub-areas within the camera's field of view.
- Cameras 102 and 103 have fields of view adjacent to camera 104's field of view.
- The list associated with camera 104 would contain both cameras 102 and 103, but may be structured such that, if the target point lies to the left of the middle of camera 104's potential field of view, camera 102 is identified as the next camera to be utilized; if the target lies to the right, camera 103 is identified as the next camera to be utilized.
- The egresses from an area can be determined from the site plan, and the most likely egress identified based upon the pattern of activity of the target, or upon a model of likelihood factors associated with each egress point. For example, the route from the lobby of a bank to the bank's vault would have a high likelihood of use when the target first enters the bank; the bank exit would have a low likelihood of intended use at that time. These likelihoods would reverse once the target returns from the vault.
- The position of the figure within an image, and an identification of the camera providing this image, are provided by the figure tracking system and controller.
- The position of the figure in the image, relative to the orientation of the camera, determines the line of sight from the camera to the target in the physical coordinate space.
- The orientation of the camera in the physical coordinate space is determined when the camera is initially installed. If the camera is not adjustable, the direction in which the camera is aimed, in the physical domain, is the orientation of the camera. If the camera is adjustable, the initial orientation is determined by aiming the camera at a point having a known location in the physical representation of the secured site, such as a corner; subsequent rotations of the camera are then relative to this known direction.
- To determine the location of the target along the determined line of sight, either ranging or triangulation methods may be employed. If the ranging method is used, the distance between the camera and the target is determined using the methods discussed above. The target location P, in the physical domain, is determined as the point along the line of sight at that distance from the camera's location.
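Placing the target point P along the line of sight at the determined range can be sketched as follows; the function name and coordinate conventions are illustrative assumptions:

```python
import math

def target_location(camera_xy, bearing_deg, range_r):
    """Place target point P at distance range_r from the camera along the
    determined line of sight (bearing in degrees, site-plan frame)."""
    cx, cy = camera_xy
    theta = math.radians(bearing_deg)
    return (cx + range_r * math.cos(theta), cy + range_r * math.sin(theta))
```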
- the line of sight is stored and another image is assessed by the target tracking system to provide a position relative to another camera's image.
- the line of sight relative to this second camera is determined.
- the target location P, in the physical domain, is determined as the point at the intersection of the line of sight from the first camera's location and the line of sight from the second camera's location.
- this target location is processed so as to produce a predicted next position, or filtered to remove jitter, or a combination of both.
- the processed target location P is returned for presentation to the camera selection process.
- the camera handoff system determines which cameras contain this target point. Because each camera's potential field of view is represented as vertices in the physical domain coordinate system, the process merely comprises a determination of whether point P lies within the polygon or polyhedron associated with each camera. If the number of cameras is large, this search process can be optimized, as discussed above. The search process employed would replace this exhaustive search loop. The system thereafter selects one of the cameras containing the target point P in its potential field of view.
- the system comprises alarms, each alarm having an associated camera and predefined target point, in the physical domain coordinate system. Upon receipt of an alarm the system marks the associated camera for selection.
- the target point P is set to the associated predefined target point and the process continues, as discussed above.
- the system could signal the figure tracking system that a new target is to be identified, by noting movements in proximity to this target point in subsequent images; or, the operator, having been alerted to the alarm and presented with the image at the target point, could outline the figure directly. Thereafter, the system will track the figure, selecting alternate cameras to maintain the tracking, as discussed above.
- the user's ability to override the camera handoff system's selection is mentioned. Provided that the system gives preference to the currently selected camera, to minimize image changes, the user-selected camera will thereafter remain the currently selected camera for subsequent camera selections until the figure is no longer in that camera's potential field of view.
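The next-camera lists described above (camera 104 handing off to camera 102 or 103 depending on which side of its field of view the target occupies) amount to a simple per-camera lookup. A minimal sketch; the camera names, normalized image coordinate, and table layout are illustrative assumptions, not specified in the patent:

```python
# Hypothetical handoff table: for each camera, the camera to hand off to
# when the target is left vs. right of the field-of-view midline.
def next_camera(handoff_table, current_camera, target_x, fov_mid_x):
    """Pick the follow-on camera based on which side of the
    field-of-view midline the target point currently occupies."""
    left_cam, right_cam = handoff_table[current_camera]
    return left_cam if target_x < fov_mid_x else right_cam

# Example corresponding to the text: camera 104 hands off to 102 (left)
# or 103 (right); target_x is a normalized horizontal coordinate in [0, 1].
table = {"cam104": ("cam102", "cam103")}
```

A richer table could map arbitrary regions of the field of view to cameras, or rank several candidates, without changing the lookup structure.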
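The line-of-sight determination and the ranging method described above can be sketched in plan view (2-D). The degree-based pan convention and function names below are assumptions for illustration; the patent only requires that the camera's installed orientation plus the figure's offset within the image yield a direction in the physical coordinate space:

```python
import math

def line_of_sight(pan_deg, offset_deg):
    """Unit direction (plan view) from the camera toward the target:
    the installed orientation (pan_deg) plus the angular offset implied
    by the figure's position within the image (offset_deg)."""
    a = math.radians(pan_deg + offset_deg)
    return (math.cos(a), math.sin(a))

def range_locate(camera_pos, direction, distance):
    """Ranging: the target location P is the point along the line of
    sight at the measured distance from the camera's location."""
    return (camera_pos[0] + distance * direction[0],
            camera_pos[1] + distance * direction[1])
```

For a fixed camera, `pan_deg` is the direction the camera was aimed at installation; for an adjustable camera, it is the initial calibrated orientation plus the accumulated rotation since.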
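The two-camera alternative — locating P at the intersection of the lines of sight from two camera locations — reduces, in plan view, to solving a 2x2 linear system. A minimal sketch under that 2-D assumption:

```python
def triangulate(c1, d1, c2, d2):
    """Target location as the intersection of two plan-view lines of
    sight, one from each camera's known location. Each line is
    c + t * d for a camera position c and sight direction d.
    Returns None if the lines are parallel (no usable intersection)."""
    rx, ry = c2[0] - c1[0], c2[1] - c1[1]
    # Solve c1 + t1*d1 = c2 + t2*d2 for t1 by Cramer's rule.
    det = d2[0] * d1[1] - d1[0] * d2[1]
    if det == 0:
        return None
    t1 = (d2[0] * ry - d2[1] * rx) / det
    return (c1[0] + t1 * d1[0], c1[1] + t1 * d1[1])
```

In three dimensions the two lines of sight rarely intersect exactly, so an implementation would typically take the midpoint of the shortest segment between them instead.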
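The processing step — producing a predicted next position, removing jitter, or both — is left open in the text. An alpha-beta filter is one conventional scheme that does both; the sketch below is one axis of the target location (a 2-D or 3-D location would be filtered per axis), and the gain values are illustrative:

```python
class AlphaBetaFilter:
    """Smooths measured target positions (jitter removal) and
    maintains a velocity estimate for predicting the next position.
    One illustrative choice; the patent does not mandate a filter."""
    def __init__(self, alpha=0.85, beta=0.3):
        self.alpha, self.beta = alpha, beta
        self.x = None   # filtered position estimate
        self.v = 0.0    # estimated change in position per update

    def update(self, measured):
        if self.x is None:          # first measurement: adopt it directly
            self.x = measured
            return self.x
        predicted = self.x + self.v
        residual = measured - predicted
        self.x = predicted + self.alpha * residual  # smooth toward measurement
        self.v = self.v + self.beta * residual      # correct velocity estimate
        return self.x

    def predict(self):
        """Predicted next position, usable as the processed target location P."""
        return self.x + self.v
```

Feeding `predict()` rather than the raw measurement to the camera-selection step lets the handoff system switch cameras slightly ahead of the target's motion.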
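Because each camera's potential field of view is stored as vertices in the physical coordinate system, deciding which cameras contain target point P reduces, in the 2-D polygon case, to a standard point-in-polygon test per camera (the polyhedron case is the 3-D analogue). A sketch of the exhaustive search loop using ray casting:

```python
def point_in_polygon(p, vertices):
    """Ray-casting test: does point p lie inside the polygon whose
    vertices (in order) describe a camera's potential field of view?"""
    x, y = p
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal through p
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:       # crossing lies to the right of p
                inside = not inside
    return inside

def cameras_containing(point, fov_polygons):
    """Exhaustive search over every camera's stored field-of-view
    polygon; the text notes this loop can be replaced by an optimized
    search when the number of cameras is large."""
    return [cam for cam, poly in fov_polygons.items()
            if point_in_polygon(point, poly)]
```

The handoff system would then select one camera from the returned list, preferring the currently selected camera when it still contains P.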
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Closed-Circuit Television Systems (AREA)
- Detergent Compositions (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP99939424A EP1042917A1 (en) | 1998-08-07 | 1999-07-27 | Figure tracking in a multiple camera system |
JP2000564381A JP2002522980A (en) | 1998-08-07 | 1999-07-27 | Image tracking in multiple camera systems |
KR1020007003758A KR100660762B1 (en) | 1998-08-07 | 1999-07-27 | Figure tracking in a multiple camera system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/131,243 | 1998-08-07 | ||
US09/131,243 US6359647B1 (en) | 1998-08-07 | 1998-08-07 | Automated camera handoff system for figure tracking in a multiple camera system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000008856A1 true WO2000008856A1 (en) | 2000-02-17 |
Family
ID=22448579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP1999/005505 WO2000008856A1 (en) | 1998-08-07 | 1999-07-27 | Figure tracking in a multiple camera system |
Country Status (6)
Country | Link |
---|---|
US (1) | US6359647B1 (en) |
EP (1) | EP1042917A1 (en) |
JP (1) | JP2002522980A (en) |
KR (1) | KR100660762B1 (en) |
CN (1) | CN1192094C (en) |
WO (1) | WO2000008856A1 (en) |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002011431A1 (en) * | 2000-07-27 | 2002-02-07 | Revolution Company, Llc | Video system and method of operating a video system |
EP1189187A2 (en) * | 2000-08-31 | 2002-03-20 | Industrie Technik IPS GmbH | Method and system for monitoring a designated area |
GB2371936A (en) * | 2001-02-03 | 2002-08-07 | Hewlett Packard Co | Surveillance system for tracking a moving object |
EP1289282A1 (en) * | 2001-08-29 | 2003-03-05 | Dartfish SA | Video sequence automatic production method and system |
WO2003026281A1 (en) * | 2001-09-17 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Intelligent quad display through cooperative distributed vision |
WO2003041411A1 (en) * | 2001-11-08 | 2003-05-15 | Revolution Company, Llc | Video system and methods for operating a video system |
FR2834415A1 (en) * | 2001-12-27 | 2003-07-04 | Sport Universal | REAL-TIME MOBILE TRACKING SYSTEM ON A SPORTS FIELD |
WO2007101788A1 (en) * | 2006-03-03 | 2007-09-13 | Siemens Aktiengesellschaft | Apparatus and method for visually monitoring a room area |
WO2007104367A1 (en) * | 2006-03-16 | 2007-09-20 | Siemens Aktiengesellschaft | Video monitoring system |
EP2120452A1 (en) * | 2007-02-14 | 2009-11-18 | Panasonic Corporation | Monitoring camera and monitoring camera control method |
US7697720B2 (en) | 2004-09-18 | 2010-04-13 | Hewlett-Packard Development Company, L.P. | Visual sensing for large-scale tracking |
US7804519B2 (en) | 2004-09-18 | 2010-09-28 | Hewlett-Packard Development Company, L.P. | Method of refining a plurality of tracks |
US7929022B2 (en) | 2004-09-18 | 2011-04-19 | Hewlett-Packard Development Company, L.P. | Method of producing a transit graph |
EP1489847A3 (en) * | 2003-06-18 | 2011-05-04 | Panasonic Corporation | Video surveillance system, surveillance video composition apparatus, and video surveillance server |
WO2011071817A1 (en) * | 2009-12-10 | 2011-06-16 | Harris Corporation | Video processing system providing correlation between objects in different georeferenced video feeds and related methods |
WO2011071814A1 (en) * | 2009-12-10 | 2011-06-16 | Harris Corporation | Video processing system providing overlay of selected geospatially-tagged metadata relating to a geolocation outside viewable area and related methods |
WO2011037737A3 (en) * | 2009-09-02 | 2013-04-25 | Peopleary Inc | Methods for producing low-cost, high-quality video excerpts using an automated sequence of camera switches |
JP2013196199A (en) * | 2012-03-16 | 2013-09-30 | Fujitsu Ltd | User detection device, method and program |
US8675073B2 (en) | 2001-11-08 | 2014-03-18 | Kenneth Joseph Aagaard | Video system and methods for operating a video system |
CN103702030A (en) * | 2013-12-25 | 2014-04-02 | 浙江宇视科技有限公司 | Scene monitoring method and moving target tracking method based on GIS (Geographic Information System) map |
GB2516173A (en) * | 2013-07-11 | 2015-01-14 | Panasonic Corp | Tracking assistance device, tracking assistance system and tracking assistance method |
FR3039919A1 (en) * | 2015-08-04 | 2017-02-10 | Neosensys | TRACKING A TARGET IN A CAMERAS NETWORK |
DE112006002674B4 (en) | 2005-10-07 | 2019-05-09 | Cognex Corp. | Methods and apparatus for practical 3D vision system |
WO2022007998A1 (en) * | 2020-07-10 | 2022-01-13 | Raytheon Anschütz Gmbh | System and method for locating an object in a specified area |
EP4057624A4 (en) * | 2020-01-10 | 2023-01-04 | Kobelco Construction Machinery Co., Ltd. | Inspection system for construction machine |
EP4131934A4 (en) * | 2020-03-30 | 2023-08-23 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method and apparatus for determining picture switching, electronic device, and storage medium |
US12028653B2 (en) | 2020-01-10 | 2024-07-02 | Kobelco Construction Machinery Co., Ltd. | Inspection system for construction machine |
Families Citing this family (201)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020036617A1 (en) | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
US6750848B1 (en) | 1998-11-09 | 2004-06-15 | Timothy R. Pryor | More useful man machine interfaces and applications |
JP2000069346A (en) * | 1998-06-12 | 2000-03-03 | Canon Inc | Camera control device, its method, camera, tracking camera system, and computer-readable recording medium |
US7483049B2 (en) * | 1998-11-20 | 2009-01-27 | Aman James A | Optimizations for live event, real-time, 3D object tracking |
US9555310B2 (en) * | 1998-11-20 | 2017-01-31 | Maxx Holdings, Inc. | Sports scorekeeping system with integrated scoreboard and automatic entertainment system |
WO2005099423A2 (en) * | 2004-04-16 | 2005-10-27 | Aman James A | Automatic event videoing, tracking and content generation system |
FI112549B (en) * | 1999-03-01 | 2003-12-15 | Honeywell Oy | A method for synchronizing image information from process monitoring cameras |
JP4209535B2 (en) * | 1999-04-16 | 2009-01-14 | パナソニック株式会社 | Camera control device |
US7015950B1 (en) | 1999-05-11 | 2006-03-21 | Pryor Timothy R | Picture taking method and apparatus |
JP2001069496A (en) * | 1999-08-31 | 2001-03-16 | Matsushita Electric Ind Co Ltd | Supervisory camera apparatus and control method for supervisory camera |
US7995096B1 (en) * | 1999-09-23 | 2011-08-09 | The Boeing Company | Visual security operations system |
JP3809309B2 (en) * | 1999-09-27 | 2006-08-16 | キヤノン株式会社 | Camera control system, camera control method, and storage medium |
DE60040051D1 (en) * | 1999-12-03 | 2008-10-09 | Fujinon Corp | Automatic follower |
GB9929870D0 (en) * | 1999-12-18 | 2000-02-09 | Roke Manor Research | Improvements in or relating to security camera systems |
US6789039B1 (en) * | 2000-04-05 | 2004-09-07 | Microsoft Corporation | Relative range camera calibration |
US6931254B1 (en) * | 2000-08-21 | 2005-08-16 | Nortel Networks Limited | Personalized presentation system and method |
US7319479B1 (en) * | 2000-09-22 | 2008-01-15 | Brickstream Corporation | System and method for multi-camera linking and analysis |
GB2368482B (en) * | 2000-10-26 | 2004-08-25 | Hewlett Packard Co | Optimal image capture |
US7200246B2 (en) * | 2000-11-17 | 2007-04-03 | Honeywell International Inc. | Object detection |
KR100392727B1 (en) * | 2001-01-09 | 2003-07-28 | 주식회사 한국씨씨에스 | A computer-based remote surveillance CCTV system, a computer video matrix switcher and a control program adapted to the CCTV system |
US7423666B2 (en) * | 2001-05-25 | 2008-09-09 | Minolta Co., Ltd. | Image pickup system employing a three-dimensional reference object |
WO2002103634A1 (en) * | 2001-06-15 | 2002-12-27 | Datacard Corporation | Apparatus and method for machine vision |
US8430749B2 (en) * | 2001-08-10 | 2013-04-30 | Igt | Dynamic casino tracking and optimization |
US7342489B1 (en) | 2001-09-06 | 2008-03-11 | Siemens Schweiz Ag | Surveillance system control unit |
US20030058342A1 (en) * | 2001-09-27 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Optimal multi-camera setup for computer-based visual surveillance |
JP4100934B2 (en) * | 2002-02-28 | 2008-06-11 | シャープ株式会社 | Composite camera system, zoom camera control method, and zoom camera control program |
DE50201004D1 (en) * | 2002-03-01 | 2004-10-21 | Brainlab Ag | Operating lamp with camera system for 3D referencing |
JP2003284053A (en) * | 2002-03-27 | 2003-10-03 | Minolta Co Ltd | Monitoring camera system and monitoring camera control device |
US20030202102A1 (en) * | 2002-03-28 | 2003-10-30 | Minolta Co., Ltd. | Monitoring system |
EP1543509A4 (en) * | 2002-09-02 | 2008-09-24 | Samsung Electronics Co Ltd | Optical information storage medium and method of and apparatus for recording and/or reproducing information on and/or from the optical information storage medium |
US7385626B2 (en) * | 2002-10-21 | 2008-06-10 | Sarnoff Corporation | Method and system for performing surveillance |
EP1557038A4 (en) * | 2002-10-30 | 2009-05-13 | Nds Ltd | Interactive broadcast system |
US7221775B2 (en) * | 2002-11-12 | 2007-05-22 | Intellivid Corporation | Method and apparatus for computerized image background analysis |
ATE454789T1 (en) * | 2002-11-12 | 2010-01-15 | Intellivid Corp | METHOD AND SYSTEM FOR TRACKING AND MONITORING BEHAVIOR OF MULTIPLE OBJECTS MOVING THROUGH MULTIPLE FIELDS OF VIEW |
US7394916B2 (en) * | 2003-02-10 | 2008-07-01 | Activeye, Inc. | Linking tracked objects that undergo temporary occlusion |
US20050134685A1 (en) * | 2003-12-22 | 2005-06-23 | Objectvideo, Inc. | Master-slave automated video-based surveillance system |
US20100002070A1 (en) | 2004-04-30 | 2010-01-07 | Grandeye Ltd. | Method and System of Simultaneously Displaying Multiple Views for Video Surveillance |
US20050007453A1 (en) * | 2003-05-02 | 2005-01-13 | Yavuz Ahiska | Method and system of simultaneously displaying multiple views for video surveillance |
US7508956B2 (en) | 2003-06-04 | 2009-03-24 | Aps Technology Group, Inc. | Systems and methods for monitoring and tracking movement and location of shipping containers and vehicles using a vision based system |
US7956889B2 (en) * | 2003-06-04 | 2011-06-07 | Model Software Corporation | Video surveillance system |
US7242423B2 (en) * | 2003-06-16 | 2007-07-10 | Active Eye, Inc. | Linking zones for object tracking and camera handoff |
US7525570B2 (en) * | 2003-07-17 | 2009-04-28 | Igt | Security camera interface |
JP3975400B2 (en) * | 2003-08-20 | 2007-09-12 | ソニー株式会社 | Monitoring system, information processing apparatus and method, recording medium, and program |
US7286157B2 (en) * | 2003-09-11 | 2007-10-23 | Intellivid Corporation | Computerized method and apparatus for determining field-of-view relationships among multiple image sensors |
US6873924B1 (en) * | 2003-09-30 | 2005-03-29 | General Electric Company | Method and system for calibrating relative fields of view of multiple cameras |
JP4306397B2 (en) * | 2003-10-08 | 2009-07-29 | 株式会社日立製作所 | Recognition processing system |
US7346187B2 (en) * | 2003-10-10 | 2008-03-18 | Intellivid Corporation | Method of counting objects in a monitored environment and apparatus for the same |
US7280673B2 (en) * | 2003-10-10 | 2007-10-09 | Intellivid Corporation | System and method for searching for changes in surveillance video |
JP4134878B2 (en) * | 2003-10-22 | 2008-08-20 | 株式会社デンソー | Conductor composition, mounting substrate using the conductor composition, and mounting structure |
US20050104958A1 (en) * | 2003-11-13 | 2005-05-19 | Geoffrey Egnal | Active camera video-based surveillance systems and methods |
US7171024B2 (en) * | 2003-12-01 | 2007-01-30 | Brickstream Corporation | Systems and methods for determining if objects are in a queue |
DE10358017A1 (en) * | 2003-12-11 | 2005-07-21 | Siemens Ag | 3D camera control |
US7683937B1 (en) * | 2003-12-31 | 2010-03-23 | Aol Inc. | Presentation of a multimedia experience |
US20080055101A1 (en) * | 2004-03-19 | 2008-03-06 | Intexact Technologies Limited | Location Tracking System And A Method Of Operating Same |
US20050225634A1 (en) * | 2004-04-05 | 2005-10-13 | Sam Brunetti | Closed circuit TV security system |
US8427538B2 (en) * | 2004-04-30 | 2013-04-23 | Oncam Grandeye | Multiple view and multiple object processing in wide-angle video camera |
US7399127B2 (en) * | 2004-05-26 | 2008-07-15 | Incorporated Administrative Agency National Agricultural And Bio-Oriented Research Organization | Autonomous operation control system |
GB2414615A (en) * | 2004-05-28 | 2005-11-30 | Sony Uk Ltd | Object detection, scanning and labelling |
JP4478510B2 (en) * | 2004-06-03 | 2010-06-09 | キヤノン株式会社 | Camera system, camera, and camera control method |
FI117662B (en) * | 2004-06-29 | 2006-12-29 | Videra Oy | AV system as well as controls |
JP4140567B2 (en) * | 2004-07-14 | 2008-08-27 | 松下電器産業株式会社 | Object tracking device and object tracking method |
US7929017B2 (en) * | 2004-07-28 | 2011-04-19 | Sri International | Method and apparatus for stereo, multi-camera tracking and RF and video track fusion |
US8289390B2 (en) * | 2004-07-28 | 2012-10-16 | Sri International | Method and apparatus for total situational awareness and monitoring |
US7606425B2 (en) * | 2004-09-09 | 2009-10-20 | Honeywell International Inc. | Unsupervised learning of events in a video sequence |
US20060137414A1 (en) * | 2004-10-12 | 2006-06-29 | Triteq Lock And Security Llc | Vending-machine lock with motor-controlled slide-bar and hook mechanism |
US20060077253A1 (en) * | 2004-10-13 | 2006-04-13 | Honeywell International, Inc. | System and method for enhanced situation awareness |
EP1657927A1 (en) * | 2004-11-12 | 2006-05-17 | Saab Ab | Image-based movement tracking |
US7250853B2 (en) * | 2004-12-10 | 2007-07-31 | Honeywell International Inc. | Surveillance system |
US7924311B2 (en) * | 2004-12-21 | 2011-04-12 | Panasonic Corporation | Camera terminal and monitoring system |
WO2006068463A1 (en) * | 2004-12-24 | 2006-06-29 | Ultrawaves Design Holding B.V. | Intelligent distributed image processing |
JP4641424B2 (en) * | 2005-02-02 | 2011-03-02 | キヤノン株式会社 | Imaging device |
FR2883382A1 (en) | 2005-03-21 | 2006-09-22 | Giat Ind Sa | Object e.g. enemy vehicle, or event e.g. fire accident, locating and perceiving method, involves determining zone in which event or object is found and is detected by sensor, and recording images in zone by video cameras by recording unit |
EP1872345B1 (en) * | 2005-03-25 | 2011-03-02 | Sensormatic Electronics, LLC | Intelligent camera selection and object tracking |
US7760908B2 (en) * | 2005-03-31 | 2010-07-20 | Honeywell International Inc. | Event packaged video sequence |
EP1867167A4 (en) * | 2005-04-03 | 2009-05-06 | Nice Systems Ltd | Apparatus and methods for the semi-automatic tracking and examining of an object or an event in a monitored site |
US7852372B2 (en) * | 2005-04-04 | 2010-12-14 | Gary Sohmers | Interactive television system and method |
US20080291278A1 (en) * | 2005-04-05 | 2008-11-27 | Objectvideo, Inc. | Wide-area site-based video surveillance system |
US7583815B2 (en) * | 2005-04-05 | 2009-09-01 | Objectvideo Inc. | Wide-area site-based video surveillance system |
US7720257B2 (en) * | 2005-06-16 | 2010-05-18 | Honeywell International Inc. | Object tracking system |
US9036028B2 (en) | 2005-09-02 | 2015-05-19 | Sensormatic Electronics, LLC | Object tracking and alerts |
US20070064208A1 (en) * | 2005-09-07 | 2007-03-22 | Ablaze Development Corporation | Aerial support structure and method for image capture |
US9363487B2 (en) * | 2005-09-08 | 2016-06-07 | Avigilon Fortress Corporation | Scanning camera-based video surveillance system |
US20070058717A1 (en) * | 2005-09-09 | 2007-03-15 | Objectvideo, Inc. | Enhanced processing for scanning video |
US20070071404A1 (en) * | 2005-09-29 | 2007-03-29 | Honeywell International Inc. | Controlled video event presentation |
WO2007059301A2 (en) * | 2005-11-16 | 2007-05-24 | Integrated Equine Technologies Llc | Automated video system for context-appropriate object tracking |
US20070146543A1 (en) * | 2005-12-23 | 2007-06-28 | Demian Gordon | Volume configurations in motion capture |
JP4570159B2 (en) * | 2006-01-06 | 2010-10-27 | Kddi株式会社 | Multi-view video encoding method, apparatus, and program |
US7881537B2 (en) | 2006-01-31 | 2011-02-01 | Honeywell International Inc. | Automated activity detection using supervised learning |
JP5044237B2 (en) * | 2006-03-27 | 2012-10-10 | 富士フイルム株式会社 | Image recording apparatus, image recording method, and image recording program |
US7825792B2 (en) * | 2006-06-02 | 2010-11-02 | Sensormatic Electronics Llc | Systems and methods for distributed monitoring of remote sites |
US7671728B2 (en) | 2006-06-02 | 2010-03-02 | Sensormatic Electronics, LLC | Systems and methods for distributed monitoring of remote sites |
EP1862969A1 (en) * | 2006-06-02 | 2007-12-05 | Eidgenössische Technische Hochschule Zürich | Method and system for generating a representation of a dynamically changing 3D scene |
JP5041757B2 (en) * | 2006-08-02 | 2012-10-03 | パナソニック株式会社 | Camera control device and camera control system |
EP1909229B1 (en) * | 2006-10-03 | 2014-02-19 | Nikon Corporation | Tracking device and image-capturing apparatus |
CN101652999B (en) * | 2007-02-02 | 2016-12-28 | 霍尼韦尔国际公司 | System and method for managing live video data |
JP5121258B2 (en) * | 2007-03-06 | 2013-01-16 | 株式会社東芝 | Suspicious behavior detection system and method |
US7922085B2 (en) * | 2007-04-13 | 2011-04-12 | Aps Technology Group, Inc. | System, method, apparatus, and computer program product for monitoring the transfer of cargo to and from a transporter |
US8527655B2 (en) * | 2007-05-22 | 2013-09-03 | Vidsys, Inc. | Optimal routing of audio, video, and control data through heterogeneous networks |
WO2008154003A2 (en) * | 2007-06-09 | 2008-12-18 | Sensormatic Electronics Corporation | System and method for integrating video analytics and data analytics/mining |
WO2009006605A2 (en) | 2007-07-03 | 2009-01-08 | Pivotal Vision, Llc | Motion-validating remote monitoring system |
US8619140B2 (en) * | 2007-07-30 | 2013-12-31 | International Business Machines Corporation | Automatic adjustment of area monitoring based on camera motion |
US20090055205A1 (en) * | 2007-08-23 | 2009-02-26 | Igt | Multimedia player tracking infrastructure |
EP2185255A4 (en) * | 2007-09-21 | 2013-08-14 | Playdata Llc | Object location and movement detection system and method |
KR101187909B1 (en) * | 2007-10-04 | 2012-10-05 | 삼성테크윈 주식회사 | Surveillance camera system |
US20090103909A1 (en) * | 2007-10-17 | 2009-04-23 | Live Event Media, Inc. | Aerial camera support structure |
US7391886B1 (en) * | 2008-01-09 | 2008-06-24 | International Business Machines Corporation | Digital camera with image tracking system |
EP2093636A1 (en) * | 2008-02-21 | 2009-08-26 | Siemens Aktiengesellschaft | Method for controlling an alarm management system |
US20090265105A1 (en) * | 2008-04-21 | 2009-10-22 | Igt | Real-time navigation devices, systems and methods |
FR2930668B1 (en) * | 2008-04-25 | 2010-06-18 | Citilog | SYSTEM FOR AIDING THE OPERATION OF A QUALITY OF ROAD ROAD NETWORK |
JP5180733B2 (en) * | 2008-08-19 | 2013-04-10 | セコム株式会社 | Moving object tracking device |
US9053594B2 (en) * | 2008-10-01 | 2015-06-09 | International Business Machines Corporation | Monitoring objects in motion along a static route using sensory detection devices |
US20100097472A1 (en) * | 2008-10-21 | 2010-04-22 | Honeywell International Inc. | Method of efficient camera control and hand over in surveillance management |
TWI405457B (en) * | 2008-12-18 | 2013-08-11 | Ind Tech Res Inst | Multi-target tracking system, method and smart node using active camera handoff |
TWI388205B (en) * | 2008-12-19 | 2013-03-01 | Ind Tech Res Inst | Method and apparatus for tracking objects |
WO2010099575A1 (en) | 2009-03-04 | 2010-09-10 | Honeywell International Inc. | Systems and methods for managing video data |
CN101572804B (en) * | 2009-03-30 | 2012-03-21 | 浙江大学 | Multi-camera intelligent control method and device |
US8836601B2 (en) | 2013-02-04 | 2014-09-16 | Ubiquiti Networks, Inc. | Dual receiver/transmitter radio devices with choke |
US9496620B2 (en) | 2013-02-04 | 2016-11-15 | Ubiquiti Networks, Inc. | Radio system for long-range high-speed wireless communication |
DE102009025077A1 (en) * | 2009-06-10 | 2010-12-16 | Karl Storz Gmbh & Co. Kg | System for orientation support and representation of an instrument in the interior of an examination object, in particular in the human body |
US20110002548A1 (en) * | 2009-07-02 | 2011-01-06 | Honeywell International Inc. | Systems and methods of video navigation |
JP5402431B2 (en) * | 2009-09-11 | 2014-01-29 | 沖電気工業株式会社 | Camera control device |
US8251597B2 (en) * | 2009-10-16 | 2012-08-28 | Wavecam Media, Inc. | Aerial support structure for capturing an image of a target |
JP2011139262A (en) * | 2009-12-28 | 2011-07-14 | Sony Corp | Image processing device, image processing method, and program |
US20110228098A1 (en) * | 2010-02-10 | 2011-09-22 | Brian Lamb | Automatic motion tracking, event detection and video image capture and tagging |
FR2956789B1 (en) * | 2010-02-19 | 2012-11-16 | Canon Kk | METHOD AND DEVICE FOR PROCESSING A VIDEO SEQUENCE |
EP2549753B1 (en) * | 2010-03-15 | 2019-04-10 | Omron Corporation | Surveillance camera terminal |
KR20110115686A (en) * | 2010-04-16 | 2011-10-24 | 삼성전자주식회사 | Shutter galsses and display apparatus including the same |
CN102223473A (en) * | 2010-04-16 | 2011-10-19 | 鸿富锦精密工业(深圳)有限公司 | Camera device and method for dynamic tracking of specific object by using camera device |
KR20110121866A (en) * | 2010-05-03 | 2011-11-09 | 삼성전자주식회사 | Portable apparatus and method for processing measurement data thereof |
US20120092492A1 (en) * | 2010-10-19 | 2012-04-19 | International Business Machines Corporation | Monitoring traffic flow within a customer service area to improve customer experience |
CN102065279B (en) * | 2010-10-28 | 2015-11-25 | 北京中星微电子有限公司 | A kind of method and system of continuous tracking monitored object |
TWI514324B (en) * | 2010-11-30 | 2015-12-21 | Ind Tech Res Inst | Tracking system and method for image object region and computer program product thereof |
US8958478B2 (en) * | 2010-12-03 | 2015-02-17 | Technische Universitaet Berlin | Method and device for processing pixels contained in a video sequence |
JP5727207B2 (en) * | 2010-12-10 | 2015-06-03 | セコム株式会社 | Image monitoring device |
US9615064B2 (en) | 2010-12-30 | 2017-04-04 | Pelco, Inc. | Tracking moving objects using a camera network |
US9171075B2 (en) | 2010-12-30 | 2015-10-27 | Pelco, Inc. | Searching recorded video |
US11697372B1 (en) * | 2011-01-04 | 2023-07-11 | Spirited Eagle Enterprises, LLC | System and method for enhancing situational awareness in a transportation vehicle |
CN102176246A (en) * | 2011-01-30 | 2011-09-07 | 西安理工大学 | Camera relay relationship determining method of multi-camera target relay tracking system |
US8451344B1 (en) * | 2011-03-24 | 2013-05-28 | Amazon Technologies, Inc. | Electronic devices with side viewing capability |
CN102811340B (en) * | 2011-06-02 | 2017-11-21 | 中兴通讯股份有限公司 | A kind of intelligent video monitoring system and method |
US8868684B2 (en) * | 2011-06-17 | 2014-10-21 | At&T Intellectual Property I, L.P. | Telepresence simulation with multiple interconnected devices |
CN102495639A (en) * | 2011-12-02 | 2012-06-13 | 天津工业大学 | Target tracking experiment device |
US9227568B1 (en) * | 2012-01-04 | 2016-01-05 | Spirited Eagle Enterprises LLC | System and method for managing driver sensory communication devices in a transportation vehicle |
WO2013149340A1 (en) * | 2012-04-02 | 2013-10-10 | Mcmaster University | Optimal camera selection in array of monitoring cameras |
JP6065195B2 (en) * | 2012-05-08 | 2017-01-25 | パナソニックIpマネジメント株式会社 | Display image forming apparatus and display image forming method |
JP6055823B2 (en) * | 2012-05-30 | 2016-12-27 | 株式会社日立製作所 | Surveillance camera control device and video surveillance system |
US10536361B2 (en) | 2012-06-27 | 2020-01-14 | Ubiquiti Inc. | Method and apparatus for monitoring and processing sensor data from an electrical outlet |
US9256957B1 (en) | 2012-09-13 | 2016-02-09 | Bae Systems Information And Electronic Systems Integration Inc. | Method for moving-object detection tracking identification cueing of videos |
EP2913996B1 (en) * | 2012-10-23 | 2021-03-03 | Sony Corporation | Information-processing device, information-processing method, program, and information-processing system |
US10175751B2 (en) * | 2012-12-19 | 2019-01-08 | Change Healthcare Holdings, Llc | Method and apparatus for dynamic sensor configuration |
US9543635B2 (en) | 2013-02-04 | 2017-01-10 | Ubiquiti Networks, Inc. | Operation of radio devices for long-range high-speed wireless communication |
US9373885B2 (en) | 2013-02-08 | 2016-06-21 | Ubiquiti Networks, Inc. | Radio system for high-speed wireless communication |
JP5506990B1 (en) * | 2013-07-11 | 2014-05-28 | パナソニック株式会社 | Tracking support device, tracking support system, and tracking support method |
CN103414872B (en) * | 2013-07-16 | 2016-05-25 | 南京师范大学 | A kind of target location drives the method for Pan/Tilt/Zoom camera |
WO2015022339A1 (en) * | 2013-08-13 | 2015-02-19 | Navigate Surgical Technologies, Inc. | System and method for focusing imaging devices |
US10500479B1 (en) * | 2013-08-26 | 2019-12-10 | Venuenext, Inc. | Game state-sensitive selection of media sources for media coverage of a sporting event |
BR112016007701B1 (en) | 2013-10-11 | 2023-01-31 | Ubiquiti Inc | METHOD FOR CONTROLLING THE RECEPTION OF A WIRELESS BROADBAND RADIO |
CN103607569B (en) * | 2013-11-22 | 2017-05-17 | 广东威创视讯科技股份有限公司 | Method and system for tracking monitored target in process of video monitoring |
JP2015128254A (en) * | 2013-12-27 | 2015-07-09 | ソニー株式会社 | Information processor, information processing system, information processing method and program |
US9172605B2 (en) | 2014-03-07 | 2015-10-27 | Ubiquiti Networks, Inc. | Cloud device identification and authentication |
US20150256355A1 (en) | 2014-03-07 | 2015-09-10 | Robert J. Pera | Wall-mounted interactive sensing and audio-visual node devices for networked living and work spaces |
US9368870B2 (en) | 2014-03-17 | 2016-06-14 | Ubiquiti Networks, Inc. | Methods of operating an access point using a plurality of directional beams |
CN104981941B (en) | 2014-04-01 | 2018-02-02 | 优倍快网络公司 | Antenna module |
US9813600B2 (en) | 2014-04-08 | 2017-11-07 | Microsoft Technology Licensing, Llc | Multi-camera view selection based on calculated metric |
WO2015178540A1 (en) * | 2014-05-20 | 2015-11-26 | 삼성에스디에스 주식회사 | Apparatus and method for tracking target using handover between cameras |
KR102152725B1 (en) * | 2014-05-29 | 2020-09-07 | 한화테크윈 주식회사 | Control apparatus for camera |
US20160037209A1 (en) * | 2014-07-31 | 2016-02-04 | Kabushiki Kaisha Toshiba | Video audio output device and system |
JP2016046642A (en) * | 2014-08-21 | 2016-04-04 | キヤノン株式会社 | Information processing system, information processing method, and program |
US10110856B2 (en) | 2014-12-05 | 2018-10-23 | Avigilon Fortress Corporation | Systems and methods for video analysis rules based on map data |
US10306193B2 (en) | 2015-04-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Trigger zones for objects in projected surface model |
US10088549B2 (en) * | 2015-06-25 | 2018-10-02 | Appropolis Inc. | System and a method for tracking mobile objects using cameras and tag devices |
JP6828010B2 (en) | 2015-07-31 | 2021-02-10 | ダルマイアー エレクトロニック ゲゼルシャフト ミット ベシュレンクテル ハフツング ウント コンパニー コマンディートゲゼルシャフトDallmeier electronic GmbH & Co. KG | Systems of interest and the systems and appropriate methods that monitor and act on the objects of interest and the processes they perform. |
WO2017134706A1 (en) * | 2016-02-03 | 2017-08-10 | パナソニックIpマネジメント株式会社 | Video display method and video display device |
JP6778912B2 (en) * | 2016-02-03 | 2020-11-04 | パナソニックIpマネジメント株式会社 | Video display method and video display device |
JP6284086B2 (en) * | 2016-02-05 | 2018-02-28 | パナソニックIpマネジメント株式会社 | Tracking support device, tracking support system, and tracking support method |
US10764539B2 (en) | 2016-03-22 | 2020-09-01 | Sensormatic Electronics, LLC | System and method for using mobile device of zone and correlated motion detection |
US9965680B2 (en) | 2016-03-22 | 2018-05-08 | Sensormatic Electronics, LLC | Method and system for conveying data from monitored scene via surveillance cameras |
US11601583B2 (en) | 2016-03-22 | 2023-03-07 | Johnson Controls Tyco IP Holdings LLP | System and method for controlling surveillance cameras |
US10475315B2 (en) | 2016-03-22 | 2019-11-12 | Sensormatic Electronics, LLC | System and method for configuring surveillance cameras using mobile computing devices |
US11216847B2 (en) | 2016-03-22 | 2022-01-04 | Sensormatic Electronics, LLC | System and method for retail customer tracking in surveillance camera network |
US10733231B2 (en) | 2016-03-22 | 2020-08-04 | Sensormatic Electronics, LLC | Method and system for modeling image of interest to users |
US10318836B2 (en) * | 2016-03-22 | 2019-06-11 | Sensormatic Electronics, LLC | System and method for designating surveillance camera regions of interest |
US10665071B2 (en) | 2016-03-22 | 2020-05-26 | Sensormatic Electronics, LLC | System and method for deadzone detection in surveillance camera network |
US10192414B2 (en) * | 2016-03-22 | 2019-01-29 | Sensormatic Electronics, LLC | System and method for overlap detection in surveillance camera network |
US10347102B2 (en) | 2016-03-22 | 2019-07-09 | Sensormatic Electronics, LLC | Method and system for surveillance camera arbitration of uplink consumption |
US10325339B2 (en) * | 2016-04-26 | 2019-06-18 | Qualcomm Incorporated | Method and device for capturing image of traffic sign |
JP6701018B2 (en) * | 2016-07-19 | 2020-05-27 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
RU2719356C9 (en) * | 2016-10-03 | 2020-07-08 | Дзе Проктер Энд Гэмбл Компани | Detergent composition with low pH |
US20180181823A1 (en) * | 2016-12-28 | 2018-06-28 | Nxp Usa, Inc. | Distributed image cognition processing system |
CN106941580B (en) * | 2017-03-22 | 2019-12-03 | 北京昊翔信达科技有限公司 | Method and system for automatic teacher and student tracking based on a single detection camera lens |
JP7008431B2 (en) * | 2017-06-01 | 2022-01-25 | キヤノン株式会社 | Imaging equipment, control methods, programs and imaging systems |
US10764486B2 (en) | 2018-01-11 | 2020-09-01 | Qualcomm Incorporated | Multi-camera autofocus synchronization |
TWI779029B (en) * | 2018-05-04 | 2022-10-01 | 大猩猩科技股份有限公司 | A distributed object tracking system |
US10594987B1 (en) * | 2018-05-30 | 2020-03-17 | Amazon Technologies, Inc. | Identifying and locating objects by associating video data of the objects with signals identifying wireless devices belonging to the objects |
CN112400315A (en) * | 2018-07-13 | 2021-02-23 | Abb瑞士股份有限公司 | Monitoring method for shooting device |
TWI692969B (en) * | 2019-01-15 | 2020-05-01 | 沅聖科技股份有限公司 | Camera automatic focusing method and device thereof |
CN109905679B (en) * | 2019-04-09 | 2021-02-26 | 梅州讯联科技发展有限公司 | Monitoring method, device and system |
CN110491060B (en) * | 2019-08-19 | 2021-09-17 | 深圳市优必选科技股份有限公司 | Robot, safety monitoring method and device thereof, and storage medium |
CN111857188A (en) * | 2020-07-21 | 2020-10-30 | 南京航空航天大学 | Aerial remote target follow-shooting system and method |
JP6800509B1 (en) * | 2020-09-30 | 2020-12-16 | アースアイズ株式会社 | Shooting system and shooting method |
EP4350654A1 (en) * | 2022-10-03 | 2024-04-10 | Axis AB | Camera information handover |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0529317A1 (en) * | 1991-08-22 | 1993-03-03 | Sensormatic Electronics Corporation | Surveillance system with master camera control of slave cameras |
EP0714081A1 (en) * | 1994-11-22 | 1996-05-29 | Sensormatic Electronics Corporation | Video surveillance system |
DE19639728A1 (en) * | 1996-09-26 | 1998-04-09 | Siemens Ag | Video monitoring unit arrangement |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4511886A (en) * | 1983-06-01 | 1985-04-16 | Micron International, Ltd. | Electronic security and surveillance system |
JPH0811071A (en) * | 1994-06-29 | 1996-01-16 | Yaskawa Electric Corp | Controller for manipulator |
US5729471A (en) * | 1995-03-31 | 1998-03-17 | The Regents Of The University Of California | Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene |
US5699444A (en) * | 1995-03-31 | 1997-12-16 | Synthonics Incorporated | Methods and apparatus for using image data to determine camera location and orientation |
WO1997004428A1 (en) | 1995-07-20 | 1997-02-06 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Interactive surveillance system |
US6002995A (en) * | 1995-12-19 | 1999-12-14 | Canon Kabushiki Kaisha | Apparatus and method for displaying control information of cameras connected to a network |
1998

- 1998-08-07 US US09/131,243 patent/US6359647B1/en not_active Expired - Lifetime
- 1998-09-25 CN CNB988143291A patent/CN1192094C/en not_active Expired - Lifetime

1999

- 1999-07-27 EP EP99939424A patent/EP1042917A1/en not_active Withdrawn
- 1999-07-27 WO PCT/EP1999/005505 patent/WO2000008856A1/en not_active Application Discontinuation
- 1999-07-27 JP JP2000564381A patent/JP2002522980A/en not_active Withdrawn
- 1999-07-27 KR KR1020007003758A patent/KR100660762B1/en not_active IP Right Cessation
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7193645B1 (en) | 2000-07-27 | 2007-03-20 | Pvi Virtual Media Services, Llc | Video system and method of operating a video system |
WO2002011431A1 (en) * | 2000-07-27 | 2002-02-07 | Revolution Company, Llc | Video system and method of operating a video system |
EP1189187A2 (en) * | 2000-08-31 | 2002-03-20 | Industrie Technik IPS GmbH | Method and system for monitoring a designated area |
EP1189187A3 (en) * | 2000-08-31 | 2009-05-27 | Industrie Technik IPS GmbH | Method and system for monitoring a designated area |
GB2371936A (en) * | 2001-02-03 | 2002-08-07 | Hewlett Packard Co | Surveillance system for tracking a moving object |
EP1289282A1 (en) * | 2001-08-29 | 2003-03-05 | Dartfish SA | Video sequence automatic production method and system |
WO2003026281A1 (en) * | 2001-09-17 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Intelligent quad display through cooperative distributed vision |
WO2003041411A1 (en) * | 2001-11-08 | 2003-05-15 | Revolution Company, Llc | Video system and methods for operating a video system |
US8675073B2 (en) | 2001-11-08 | 2014-03-18 | Kenneth Joseph Aagaard | Video system and methods for operating a video system |
WO2003056809A1 (en) * | 2001-12-27 | 2003-07-10 | S.A. Sport Universal | System for real-time monitoring of mobile objects on a playing field |
FR2834415A1 (en) * | 2001-12-27 | 2003-07-04 | Sport Universal | Real-time mobile tracking system on a sports field |
EP1489847A3 (en) * | 2003-06-18 | 2011-05-04 | Panasonic Corporation | Video surveillance system, surveillance video composition apparatus, and video surveillance server |
US7697720B2 (en) | 2004-09-18 | 2010-04-13 | Hewlett-Packard Development Company, L.P. | Visual sensing for large-scale tracking |
US7804519B2 (en) | 2004-09-18 | 2010-09-28 | Hewlett-Packard Development Company, L.P. | Method of refining a plurality of tracks |
US7929022B2 (en) | 2004-09-18 | 2011-04-19 | Hewlett-Packard Development Company, L.P. | Method of producing a transit graph |
DE112006002674B4 (en) | 2005-10-07 | 2019-05-09 | Cognex Corp. | Methods and apparatus for practical 3D vision system |
WO2007101788A1 (en) * | 2006-03-03 | 2007-09-13 | Siemens Aktiengesellschaft | Apparatus and method for visually monitoring a room area |
WO2007104367A1 (en) * | 2006-03-16 | 2007-09-20 | Siemens Aktiengesellschaft | Video monitoring system |
EP2120452A1 (en) * | 2007-02-14 | 2009-11-18 | Panasonic Corporation | Monitoring camera and monitoring camera control method |
US10475312B2 (en) | 2007-02-14 | 2019-11-12 | Panasonic intellectual property Management co., Ltd | Monitoring camera and monitoring camera control method |
US10861304B2 (en) | 2007-02-14 | 2020-12-08 | Panasonic I-Pro Sensing Solutions Co., Ltd. | Monitoring camera and monitoring camera control method |
EP2120452A4 (en) * | 2007-02-14 | 2011-05-18 | Panasonic Corp | Monitoring camera and monitoring camera control method |
US9286775B2 (en) | 2007-02-14 | 2016-03-15 | Panasonic Intellectual Property Management Co., Ltd. | Monitoring camera and monitoring camera control method |
US9870685B2 (en) | 2007-02-14 | 2018-01-16 | Panasonic Intellectual Property Management Co., Ltd. | Monitoring camera and monitoring camera control method |
US9437089B2 (en) | 2007-02-14 | 2016-09-06 | Panasonic Intellectual Property Management Co., Ltd. | Monitoring camera and monitoring camera control method |
WO2011037737A3 (en) * | 2009-09-02 | 2013-04-25 | Peopleary Inc | Methods for producing low-cost, high-quality video excerpts using an automated sequence of camera switches |
US8970694B2 (en) | 2009-12-10 | 2015-03-03 | Harris Corporation | Video processing system providing overlay of selected geospatially-tagged metadata relating to a geolocation outside viewable area and related methods |
WO2011071817A1 (en) * | 2009-12-10 | 2011-06-16 | Harris Corporation | Video processing system providing correlation between objects in different georeferenced video feeds and related methods |
WO2011071814A1 (en) * | 2009-12-10 | 2011-06-16 | Harris Corporation | Video processing system providing overlay of selected geospatially-tagged metadata relating to a geolocation outside viewable area and related methods |
US8717436B2 (en) | 2009-12-10 | 2014-05-06 | Harris Corporation | Video processing system providing correlation between objects in different georeferenced video feeds and related methods |
JP2013196199A (en) * | 2012-03-16 | 2013-09-30 | Fujitsu Ltd | User detection device, method and program |
GB2516173B (en) * | 2013-07-11 | 2016-05-11 | Panasonic Corp | Tracking assistance device, tracking assistance system and tracking assistance method |
GB2516173A (en) * | 2013-07-11 | 2015-01-14 | Panasonic Corp | Tracking assistance device, tracking assistance system and tracking assistance method |
CN103702030A (en) * | 2013-12-25 | 2014-04-02 | 浙江宇视科技有限公司 | Scene monitoring method and moving target tracking method based on GIS (Geographic Information System) map |
FR3039919A1 (en) * | 2015-08-04 | 2017-02-10 | Neosensys | Tracking a target in a camera network |
EP4057624A4 (en) * | 2020-01-10 | 2023-01-04 | Kobelco Construction Machinery Co., Ltd. | Inspection system for construction machine |
US12028653B2 (en) | 2020-01-10 | 2024-07-02 | Kobelco Construction Machinery Co., Ltd. | Inspection system for construction machine |
EP4131934A4 (en) * | 2020-03-30 | 2023-08-23 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method and apparatus for determining picture switching, electronic device, and storage medium |
WO2022007998A1 (en) * | 2020-07-10 | 2022-01-13 | Raytheon Anschütz Gmbh | System and method for locating an object in a specified area |
DE102020118304A1 (en) | 2020-07-10 | 2022-01-13 | Raytheon Anschütz Gmbh | System and method for locating an object in a predetermined area |
Also Published As
Publication number | Publication date |
---|---|
CN1322237A (en) | 2001-11-14 |
US6359647B1 (en) | 2002-03-19 |
EP1042917A1 (en) | 2000-10-11 |
CN1192094C (en) | 2005-03-09 |
JP2002522980A (en) | 2002-07-23 |
KR100660762B1 (en) | 2006-12-26 |
KR20010024452A (en) | 2001-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6359647B1 (en) | Automated camera handoff system for figure tracking in a multiple camera system | |
US6215519B1 (en) | Combined wide angle and narrow angle imaging system and method for surveillance and monitoring | |
US10237478B2 (en) | System and method for correlating camera views | |
US10728459B2 (en) | System and method for tracking moving objects in a scene | |
CA2149730C (en) | Rail-based closed circuit t.v. surveillance system with automatic target acquisition | |
US7750936B2 (en) | Immersive surveillance system interface | |
US7385626B2 (en) | Method and system for performing surveillance | |
CA2794057C (en) | Effortless navigation across cameras and cooperative control of cameras | |
KR101204080B1 (en) | Surveillance camera system and method for controlling thereof | |
US20060028550A1 (en) | Surveillance system and method | |
EP1619897B1 (en) | Camera link system, camera device and camera link control method | |
EP2607952B1 (en) | Monitoring camera and method for monitoring | |
US20070183770A1 (en) | Camera terminal and imaging zone adjusting apparatus | |
WO2006132029A1 (en) | Monitoring system, monitoring method, and camera terminal | |
US20020052708A1 (en) | Optimal image capture | |
WO2006017402A2 (en) | Surveillance system and method | |
Erdem et al. | | Automated placement of cameras in a floorplan to satisfy task-specific constraints |
Sclaroff et al. | | Automated placement of cameras in a floorplan to satisfy task-specific constraints |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): JP KR |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
| WWE | Wipo information: entry into national phase | Ref document number: 1999939424. Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 1020007003758. Country of ref document: KR |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWP | Wipo information: published in national office | Ref document number: 1999939424. Country of ref document: EP |
| WWP | Wipo information: published in national office | Ref document number: 1020007003758. Country of ref document: KR |
| WWG | Wipo information: grant in national office | Ref document number: 1020007003758. Country of ref document: KR |
| WWW | Wipo information: withdrawn in national office | Ref document number: 1999939424. Country of ref document: EP |