WO2011114799A1 - Surveillance camera terminal - Google Patents
Surveillance camera terminal
- Publication number
- WO2011114799A1 (PCT/JP2011/052468)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- camera terminal
- tracking target
- tracking
- field
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19645—Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
Definitions
- The present invention relates to a surveillance camera terminal used in a wide-area monitoring system that tracks a suspicious person over a wide area in places where unspecified numbers of people gather, such as stations, shopping centers, and downtown areas.
- A wide-area monitoring system uses a plurality of surveillance camera terminals to detect a person who has taken a peculiar action, such as a suspicious person (hereinafter simply referred to as the tracking target person), and tracks the detected person.
- In such a system, each surveillance camera terminal changes its imaging field of view according to the movement of the tracking target person, which enlarges the monitoring target area that can be allocated to a single terminal (see Patent Document 1).
- Each surveillance camera terminal changes the imaging field of view so that the tracking target person is imaged at the center of the frame image; that is, it changes the imaging field of view with reference to the tracking target person.
- The surveillance camera terminal that is tracking the tracking target person (hereinafter referred to as the delivery-side surveillance camera terminal) issues a delivery request when the person is to be handed over.
- This delivery request is transmitted to another surveillance camera terminal (hereinafter referred to as the takeover-side surveillance camera terminal) that takes over the tracking of the tracking target person.
- Upon receiving the delivery request, the takeover-side surveillance camera terminal changes its imaging field of view to the field of view, defined with the delivery-side surveillance camera terminal, that is used when the tracking target person is delivered.
- The delivery-side surveillance camera terminal likewise changes to the imaging field of view used when the tracking target person is delivered to the takeover-side surveillance camera terminal.
- both the delivery side surveillance camera terminal and the takeover side surveillance camera terminal image the delivery area at substantially the same timing.
- the monitoring camera terminal on the takeover side identifies the tracking target person tracked by the monitoring camera terminal on the delivery side from the persons imaged in the frame image obtained by imaging the delivery area.
- the monitoring camera terminal on the takeover side takes over the tracking of the identified tracking target person.
- the tracking target person is identified based on the position of the person including the tracking target person in the frame image captured by each surveillance camera terminal.
- In the related art, both the delivery-side and the takeover-side surveillance camera terminals matched their imaging fields of view with the delivery field of view only once the handover began. For this reason, when these surveillance camera terminals synchronously imaged the delivery area, the tracking target person might already have moved out of the delivery area; in that case, the tracking target person is lost and tracking fails.
- An object of the present invention is to provide a monitoring camera terminal that can suppress losing sight of a tracking target when the tracking target being tracked is delivered to another monitoring camera terminal.
- the surveillance camera terminal of the present invention is configured as follows to achieve the above object.
- the imaging visual field control means changes the imaging visual field of the imaging means within the monitoring target area.
- the tracking target extraction unit processes the frame image captured by the imaging unit within the imaging field of view, and extracts the tracking target being captured.
- the tracking unit detects the position of the tracking target in the monitoring target area based on the position on the frame image of the tracking target extracted by the tracking target extraction unit and the imaging field of view of the imaging unit, and tracks the tracking target.
- the tracking unit tracks the tracking target based on a change in the position of the tracking target in the monitoring target area with time, which is obtained by processing the frame images in time series.
- the imaging visual field control means changes the imaging visual field of the imaging means according to the position in the monitoring target area of the tracking target tracked by the tracking means.
- Preferably, the imaging field of view of the imaging unit is changed so that the tracking target is imaged at the approximate center of the frame image. Further, by using the movement vector of the tracking target obtained from the tracking result of the tracking unit, the imaging field of view can be changed while predicting the position of the tracking target.
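The patent leaves this prediction step abstract. For illustration, a minimal sketch, assuming the tracker supplies timestamped positions in monitoring-target-area coordinates; the function name and the half-second look-ahead are illustrative choices, not taken from the patent:

```python
import numpy as np

def predict_target_position(track, lookahead_s=0.5):
    """Estimate where the target will be `lookahead_s` seconds from now.

    `track` is a list of (timestamp_s, x, y) samples in monitoring-target-area
    coordinates, oldest first; the last two samples give the movement vector.
    """
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    if dt <= 0:
        return np.array([x1, y1])  # degenerate timestamps: hold position
    velocity = np.array([x1 - x0, y1 - y0]) / dt  # movement vector
    return np.array([x1, y1]) + velocity * lookahead_s
```

The imaging field of view would then be aimed at the predicted position rather than the last observed one, compensating for the pan/tilt actuation delay.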
- the imaging field storage means stores the imaging field when the tracking object is transferred for each of the other monitoring camera terminals that transfer the tracking object.
- the detecting means detects whether or not the tracking target tracked by the tracking means has moved to the vicinity of the delivery area with another monitoring camera terminal.
- The detection means may be configured to detect whether the tracking target has moved to the vicinity of the delivery area with another surveillance camera terminal based on the imaging field of view of the imaging means, or based on the position of the tracking target being tracked by the tracking means.
- Alternatively, a predetermined mark may be painted around the delivery area, and whether the tracking target has moved to the vicinity of the delivery area with another surveillance camera terminal may be detected based on whether the mark is detected in the frame image.
- When the detection means detects that the tracking target being tracked has moved to the vicinity of the delivery area with another surveillance camera terminal, the imaging visual field control means matches the imaging field of view of the imaging means with the delivery field of view for that terminal stored in the imaging visual field storage means. At this time, the imaging visual field control means preferably switches the imaging field of view without letting the tracking target go out of the frame, so that the field of view reaches the stored delivery field of view without losing sight of the tracking target.
- When the detection means detects that the tracking target being tracked has moved to the vicinity of the delivery area, a delivery request may be made to the other surveillance camera terminal that is to take over the tracking target.
- The surveillance camera terminal on the other side may be configured to change its imaging field of view to the delivery field of view upon receiving this request.
- With this configuration, when the tracking target enters the delivery area, the imaging fields of view of both the delivery-side and the takeover-side surveillance camera terminals are already the delivery field of view. That is, no time is needed for these terminals to change their imaging fields of view after the tracking target enters the delivery area. Accordingly, losing sight of the tracking target when it is handed over to another surveillance camera terminal can be suppressed.
- According to the present invention, it is possible to suppress losing sight of a tracking target when the tracking target being tracked is handed over to another surveillance camera terminal.
- FIG. 1 is a schematic diagram showing a configuration of a wide area monitoring system using a monitoring camera terminal according to an embodiment of the present invention.
- This wide area monitoring system is a network system having a plurality of monitoring camera terminals 1 (1A to 1H).
- This wide area monitoring system is, for example, an ad hoc network system.
- Data communication can be performed between the monitoring camera terminals 1 directly or via another monitoring camera terminal 1.
- the data communication between the monitoring camera terminals 1 may be wireless or wired.
- The number of surveillance camera terminals 1 constituting the wide-area monitoring system is not limited to the eight illustrated here; any plural number may be used.
- The lines connecting the surveillance camera terminals 1 shown in FIG. 1 are links.
- the surveillance camera terminals 1A to 1H will be referred to as surveillance camera terminals 1 when they are not distinguished from each other.
- FIG. 2 is a diagram showing the configuration of the main part of the surveillance camera terminal.
- The surveillance camera terminal 1 includes a control unit 11, an imaging unit 12, a drive mechanism unit 13, an imaging visual field control unit 14, an image processing unit 15, a storage unit 16, a timer 17, and a communication unit 18.
- the control unit 11 controls the operation of each part of the main body.
- The imaging unit 12 outputs frame images obtained by imaging its imaging field of view (imaging area) at about 30 frames per second.
- the imaging unit 12 is attached to a pan head (not shown) that rotates individually in the pan direction and the tilt direction.
- the pan direction and the tilt direction are orthogonal to each other.
- the imaging unit 12 includes an optical system driving unit (not shown) that drives the imaging optical system, and can change the imaging magnification Zoom.
- the drive mechanism unit 13 has a drive source such as a motor that rotates the pan head to which the imaging unit 12 is attached in the pan direction and the tilt direction.
- A sensor is provided that detects the rotation angle θ of the pan head in the pan direction and the rotation angle φ in the tilt direction.
- the imaging visual field control unit 14 instructs the imaging unit 12 on the imaging magnification.
- the imaging unit 12 changes the imaging magnification Zoom according to this instruction.
- The imaging visual field control unit 14 instructs the drive mechanism unit 13 on the rotation angle θ in the pan direction and the rotation angle φ in the tilt direction of the pan head to which the imaging unit 12 is attached.
- The drive mechanism unit 13 changes the rotation angle θ in the pan direction and the rotation angle φ in the tilt direction of the pan head accordingly.
- The imaging field of view of the imaging unit 12 changes in accordance with changes in the pan rotation angle θ, the tilt rotation angle φ, and the imaging magnification Zoom.
- the monitoring camera terminal 1 has a monitoring target area. This monitoring target area is a range in which imaging can be performed by changing the imaging field of view of the imaging unit 12.
- the image processing unit 15 processes the frame image captured by the imaging unit 12, extracts the person being imaged, and attaches an ID to the extracted person.
- This ID is a unique value that can identify a person.
- The image processing unit 15 is supplied with the outputs of the sensors that detect the rotation angle θ in the pan direction and the rotation angle φ in the tilt direction of the pan head to which the imaging unit 12 is attached. That is, the image processing unit 15 obtains these angles from the sensor outputs.
- the image processing unit 15 obtains an imaging magnification Zoom of the imaging unit 12 based on a signal related to the imaging magnification input from the imaging unit 12.
- Using the pan rotation angle θ, the tilt rotation angle φ, and the imaging magnification Zoom, the image processing unit 15 can obtain the imaging field of view of the imaging unit 12, that is, the position of the imaging area within the monitoring target area. The image processing unit 15 can therefore convert the position of a person on the frame image captured by the imaging unit 12 into a position in the monitoring target area.
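The patent does not spell out this conversion. A rough sketch under a small-angle pinhole assumption, where pixel offsets from the frame center scale linearly with the angle of view (all names below are illustrative):

```python
def pixel_to_angles(px, py, width, height, h_fov, v_fov, theta, phi):
    """Map a pixel (px, py) in a width x height frame to approximate pan/tilt
    angles of its line of sight. h_fov and v_fov are the horizontal and
    vertical angles of view at the current magnification; (theta, phi) are
    the pan/tilt angles of the field-of-view center.
    """
    d_theta = (px - width / 2.0) / width * h_fov   # horizontal angular offset
    d_phi = (py - height / 2.0) / height * v_fov   # vertical angular offset
    return theta + d_theta, phi + d_phi
```

Intersecting the resulting line of sight with the floor plane would then give the person's position in monitoring-target-area coordinates.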
- The image processing unit 15 instructs the imaging visual field control unit 14 on the imaging field of view of the imaging unit 12 according to the movement of the person being tracked.
- The image processing unit 15 extracts and tracks the person being imaged using, for example, a spatio-temporal MRF (Markov Random Field) model.
- The spatio-temporal MRF model is an extension of the MRF model to a spatio-temporal model, focusing on the correlation of the spatio-temporal image in the time-axis direction.
- This spatio-temporal MRF model divides the frame image to be processed into blocks of several pixels × several pixels (for example, 8 × 8 pixels) and defines the correlation in the time-axis direction with reference to the motion vector of each block between temporally consecutive frame images.
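The patent does not give the MRF formulation itself; the following is a minimal sketch of only the block-wise motion vectors that such a model is anchored to, using exhaustive SAD block matching (block size and search radius are illustrative):

```python
import numpy as np

def block_motion_vectors(prev_frame, curr_frame, block=8, search=4):
    """Per-block motion vectors between two consecutive grayscale frames.

    For every block x block tile of curr_frame, find the offset (dy, dx)
    within +/-search pixels whose tile in prev_frame minimizes the sum of
    absolute differences (SAD). The stored vector points from the matching
    tile in the previous frame to the tile in the current frame.
    """
    h, w = prev_frame.shape
    rows, cols = h // block, w // block
    vectors = np.zeros((rows, cols, 2), dtype=int)
    for by in range(rows):
        for bx in range(cols):
            y, x = by * block, bx * block
            tile = curr_frame[y:y + block, x:x + block].astype(int)
            best_sad, best = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    sy, sx = y + dy, x + dx
                    if sy < 0 or sx < 0 or sy + block > h or sx + block > w:
                        continue  # candidate tile would leave the frame
                    cand = prev_frame[sy:sy + block, sx:sx + block].astype(int)
                    sad = int(np.abs(tile - cand).sum())
                    if best_sad is None or sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            vectors[by, bx] = best
    return vectors
```

Blocks with consistent vectors across consecutive frames can then be grouped into a person region, which is the correlation in the time-axis direction the model exploits.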
- the storage unit 16 stores an operation program for operating the main body, setting data used during operation, processing data generated during operation, and the like.
- the timer 17 measures the current time.
- the communication unit 18 controls data communication with other surveillance camera terminals 1.
- This wide-area monitoring system is a system for tracking a person such as a suspicious person who has taken a specific action (hereinafter referred to as a tracking target person).
- Each surveillance camera terminal 1 is assigned a monitoring target area as described above. As shown in FIG. 4, the monitoring target areas of two adjacent surveillance camera terminals 1A and 1B partially overlap.
- FIG. 4 illustrates the monitoring target areas of the two adjacent terminals 1A and 1B, but the monitoring target areas of every other pair of adjacent surveillance camera terminals 1 also partially overlap.
- The overlapping area is a delivery area for handing over the tracking target person from one surveillance camera terminal 1A (or 1B) to the other surveillance camera terminal 1B (or 1A); it is the area in which, for example, terminal 1B identifies the tracking target person and takes over the person from terminal 1A.
- For each adjacent surveillance camera terminal 1, the surveillance camera terminal 1 stores in the storage unit 16 the imaging field of view of the imaging unit 12 (the pan rotation angle θ, the tilt rotation angle φ, and the imaging magnification) used when the tracking target person is transferred to or from that terminal.
- The imaging field of view of the imaging unit 12 used when delivering the tracking target person includes the delivery area shared with the counterpart surveillance camera terminal 1.
- For each adjacent surveillance camera terminal 1, each terminal also stores in the storage unit 16 coordinate conversion information indicating the relative positional relationship between the two-dimensional coordinate system of the frame image its own imaging unit 12 captures of the delivery area and the two-dimensional coordinate system of the frame image the counterpart terminal's imaging unit 12 captures of the delivery area.
- The coordinate conversion information relates the two-dimensional coordinate systems of the frame images captured, in the respective delivery fields of view, by the imaging unit 12 of the own terminal and by the imaging unit 12 of the adjacent counterpart surveillance camera terminal 1.
- a first coordinate conversion parameter and a second coordinate conversion parameter shown below are stored in the storage unit 16 as the coordinate conversion information.
- The first coordinate conversion parameter is a parameter for projectively transforming the two-dimensional coordinate system of the frame image captured by the own terminal's imaging unit 12 in its delivery field of view into the two-dimensional coordinate system of the frame image captured by the adjacent counterpart terminal's imaging unit 12 in its delivery field of view.
- The second coordinate conversion parameter is a parameter for the projective transformation in the opposite direction: from the two-dimensional coordinate system of the frame image captured by the adjacent counterpart terminal's imaging unit 12 in its delivery field of view into that of the frame image captured by the own terminal's imaging unit 12 in its delivery field of view.
- the coordinate conversion information may be only one of the first coordinate conversion parameter and the second coordinate conversion parameter.
- the first coordinate conversion parameter and the second coordinate conversion parameter are values calculated using a frame image actually captured when the surveillance camera terminal 1 is installed.
- The coordinate position on the frame image is acquired for each person imaged in the delivery area of a frame image captured by the own terminal, and is likewise acquired from the counterpart terminal for each person imaged in the delivery area of a frame image captured by that terminal. Combination patterns are then created that associate the persons located in the delivery area as imaged by the own terminal one-to-one with those imaged by the counterpart terminal. The number of combination patterns is, for example, two if two persons are located in the delivery area and six if three persons are located there.
- the surveillance camera terminal 1 converts the coordinate position of the person into the coordinate system of the counterpart terminal using the first coordinate conversion parameter for each person located in the transfer area imaged by the own terminal.
- the surveillance camera terminal 1 calculates the first distance energy that is the sum of the distances between the corresponding persons in the coordinate system of the counterpart terminal for each person combination pattern.
- the surveillance camera terminal 1 converts the coordinate position of the person into the coordinate system of the own terminal using the second coordinate conversion parameter for each person located in the transfer area imaged by the counterpart terminal.
- the surveillance camera terminal 1 calculates the second distance energy that is the sum of the distances between corresponding persons in the coordinate system of the terminal for each person combination pattern.
- Among the combination patterns of persons located in the delivery area, the surveillance camera terminal 1 determines that the pattern with the smallest sum of the first distance energy and the second distance energy is the appropriate association. Therefore, if the tracking target person is located in the delivery area, the tracking target person can be identified.
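A minimal sketch of this selection, assuming equal person counts on both sides and taking the two projective transforms as given callables (the names are illustrative):

```python
from itertools import permutations
import numpy as np

def identify_tracking_target(own_pts, peer_pts, to_peer, to_own):
    """Choose the one-to-one association of persons in the delivery area
    that minimizes the sum of the first and second distance energies.

    own_pts / peer_pts map person IDs to (x, y) frame coordinates as seen
    by this terminal and the counterpart terminal (equal counts assumed).
    to_peer / to_own apply the first / second coordinate conversion
    parameters, i.e. the projective transforms between the two frames.
    """
    own_ids, peer_ids = list(own_pts), list(peer_pts)
    best_pairing, best_energy = None, float("inf")
    for perm in permutations(peer_ids):  # every one-to-one combination pattern
        # first distance energy: own positions mapped into the peer frame
        e1 = sum(float(np.hypot(*np.subtract(to_peer(own_pts[a]), peer_pts[b])))
                 for a, b in zip(own_ids, perm))
        # second distance energy: peer positions mapped into the own frame
        e2 = sum(float(np.hypot(*np.subtract(own_pts[a], to_own(peer_pts[b]))))
                 for a, b in zip(own_ids, perm))
        if e1 + e2 < best_energy:
            best_energy, best_pairing = e1 + e2, dict(zip(own_ids, perm))
    return best_pairing
```

Once the pairing is known, the peer's ID for the person paired with the delivery side's tracking target identifies the person to take over.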
- FIG. 5 is a flowchart showing the operation of the surveillance camera terminal 1.
- The surveillance camera terminal 1 starts a search process for finding a tracking target person in the monitoring target area (S1). At this time, it is not tracking a tracking target person (the tracking process described later is not being executed).
- The surveillance camera terminal 1 waits until a tracking target person is found in the monitoring target area (S2) or a delivery request for a tracking target person is received from an adjacent surveillance camera terminal 1 (S3).
- The search process started in S1 changes the imaging field of view of the imaging unit 12 at fixed intervals (every few seconds). Specifically, at each interval the imaging visual field control unit 14 instructs the drive mechanism unit 13, based on a predetermined condition, on the pan rotation angle θ and the tilt rotation angle φ of the pan head to which the imaging unit 12 is attached. For example, the condition is set so that the imaging field of view of the imaging unit 12 scans the monitoring target area.
- the imaging magnification Zoom is a predetermined magnification.
- The image processing unit 15 processes the frame images captured by the imaging unit 12, extracts the persons being imaged, and assigns an ID to each extracted person; a person who already has an ID is not assigned a new one. The image processing unit 15 also creates, for each extracted person, an object map that associates the ID with the position in the monitoring target area, and stamps the map with the time measured by the timer 17 at creation. Using temporally consecutive object maps, the image processing unit 15 analyzes the behavior of each person to whom an ID is assigned, and a person whose behavior is peculiar is detected as a tracking target person. Alternatively, face images of persons requiring attention may be registered in advance, and a person may be detected as a tracking target person by collating these with the face image captured by the imaging unit 12.
- the surveillance camera terminal 1 starts the tracking process when the tracking target person is searched (S4).
- FIG. 6 is a flowchart showing the tracking process.
- The image processing unit 15 captures the frame image from the imaging unit 12 (S11), processes it, and calculates the imaging field of view that places the tracking target person at the center (the pan rotation angle θ and the tilt rotation angle φ of the pan head) (S12).
- the imaging magnification Zoom of the imaging unit 12 is a predetermined magnification.
- The surveillance camera terminal 1 matches the imaging field of view of the imaging unit 12 with the field of view calculated in S12 (S13). That is, the pan rotation angle θ and the tilt rotation angle φ of the pan head are set to the angles calculated in S12. Specifically, the image processing unit 15 notifies the imaging visual field control unit 14 of the calculated angles, and based on this notification the imaging visual field control unit 14 has the drive mechanism unit 13 change the pan rotation angle θ and the tilt rotation angle φ of the pan head.
- The surveillance camera terminal 1 determines whether the tracking target person has moved to the vicinity of the delivery area with an adjacent surveillance camera terminal 1 (S14). If not, it returns to S11 and repeats the processing described above.
- By repeating S11 to S14, the surveillance camera terminal 1 can track the tracking target person X while keeping the person at the approximate center of the imaging field of view of the imaging unit 12.
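A sketch of this S11-S14 loop, assuming hypothetical interfaces for the camera, the pan head, and the per-frame processing (none of these names come from the patent):

```python
def tracking_loop(camera, pan_tilt, extract_target, angles_to_center, near_delivery):
    """Minimal sketch of the S11-S14 loop: capture, locate the target,
    re-center the field of view, and stop once the target nears the
    delivery area. All five arguments are hypothetical interfaces: a frame
    source, a pan-tilt head, a function locating the tracking target in a
    frame, a function converting that frame position into centering pan/tilt
    angles (theta, phi), and the delivery-area proximity test of S14.
    """
    while True:
        frame = camera.capture()                 # S11: capture a frame
        position = extract_target(frame)         # locate the target in it
        theta, phi = angles_to_center(position)  # S12: centering field of view
        pan_tilt.move(theta, phi)                # S13: match the field of view
        if near_delivery(theta, phi):            # S14: near the delivery area?
            return theta, phi                    # go on to the delivery request (S15)
```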
- During tracking, the image processing unit 15 also creates an object map associating the ID with the position in the monitoring target area for the tracking target person. Using this object map, a movement vector giving the moving direction and moving speed of the tracking target person can be obtained.
- If the tracking target person has moved to the vicinity of the delivery area, the surveillance camera terminal 1 makes a delivery request to the adjacent surveillance camera terminal 1 (S15).
- Whether the tracking target person has moved to the vicinity of the delivery area with an adjacent surveillance camera terminal 1 is determined from the imaging field of view stored for delivery with that terminal and the current imaging field of view of the imaging unit 12.
- Let the delivery field of view be the pan rotation angle θ1, the tilt rotation angle φ1, and the imaging magnification z1, and let the current field of view be the pan rotation angle θt, the tilt rotation angle φt, and the imaging magnification zt. The tracking target person is determined to have moved to the vicinity of the delivery area when both of the following conditions hold:
(a): φ1 − V ≤ φt ≤ φ1 + V (V is the vertical angle of view when the imaging magnification is z1)
(b): θ1 − H ≤ θt ≤ θ1 + H/2 (H is the horizontal angle of view when the imaging magnification is z1)
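A literal reading of these two conditions as code (the asymmetric upper bound H/2 in (b) follows the text as given):

```python
def near_delivery_area(theta_t, phi_t, theta1, phi1, h_fov, v_fov):
    """Check conditions (a) and (b): the current pan/tilt angles
    (theta_t, phi_t) are compared with the stored delivery angles
    (theta1, phi1) using the horizontal and vertical angles of view
    h_fov (H) and v_fov (V) at the delivery magnification z1.
    """
    cond_a = phi1 - v_fov <= phi_t <= phi1 + v_fov             # condition (a)
    cond_b = theta1 - h_fov <= theta_t <= theta1 + h_fov / 2   # condition (b)
    return cond_a and cond_b
```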
- When making the delivery request for the tracking target person in S15, the surveillance camera terminal 1 changes the imaging magnification Zoom to z1 described above (the imaging magnification at the time of delivery) (S16).
- The image processing unit 15 captures and processes the frame image from the imaging unit 12 (S17) and determines whether the tracking target person would go out of the frame if the imaging field of view of the imaging unit 12 were switched to the delivery field of view (S18).
- If the person would go out of the frame, the image processing unit 15 calculates an imaging field of view (the pan rotation angle θ and the tilt rotation angle φ of the pan head) that keeps the person in the frame (S19).
- The calculated field of view may be centered on the tracking target person; preferably, however, a field of view close to the delivery field of view is calculated by shifting the tracking target person away from the center.
- The surveillance camera terminal 1 matches the imaging field of view of the imaging unit 12 with the field of view calculated in S19 (S20) and returns to S17.
- If the tracking target person would not go out of the frame, the surveillance camera terminal 1 determines the imaging field of view of the imaging unit 12 to be the delivery field of view stored in the storage unit 16 (S21), and adjusts the imaging field of view of the imaging unit 12 to that delivery field of view (S22).
- Until the delivery request is made, the surveillance camera terminal 1 changes the imaging field of view of the imaging unit 12 with reference to the tracking target person (so that the person is at the center of the field of view). After the delivery request, the imaging field of view is changed with reference to the delivery field of view. Therefore, when the tracking target person enters the delivery area, the imaging field of view of the imaging unit 12 is already the delivery field of view.
- the adjacent surveillance camera terminal 1 that has received the tracking target person delivery request made in S15 described above starts the handover process (S5).
- This adjacent monitoring camera terminal 1 takes over the tracking of the tracking target person.
- another monitoring camera terminal 1 that takes over the tracking of the tracking target person is referred to as a takeover-side monitoring camera terminal 1.
- the surveillance camera terminal 1 on the side that has transmitted the tracking target person's delivery request in S15 is referred to as delivery side surveillance camera terminal 1.
- FIG. 8 is a flowchart showing a delivery process performed by the takeover-side monitoring camera terminal.
- The takeover-side surveillance camera terminal 1 matches the imaging field of view of the imaging unit 12 with the field of view used for delivering the tracking target person with the adjacent terminal that transmitted the delivery request (S31). That is, upon receiving the delivery request, the takeover-side terminal immediately changes its imaging field of view to the delivery field of view shared with the delivery-side surveillance camera terminal 1.
- the takeover-side monitoring camera terminal 1 waits for a synchronization signal from the delivery-side monitoring camera terminal 1 (S32).
- Having matched its field of view with the delivery field of view in S22, the delivery-side surveillance camera terminal 1 waits for the tracking target person to enter the delivery area (S23).
- When the tracking target person enters the delivery area, the delivery-side surveillance camera terminal 1 transmits a synchronization signal to the takeover-side surveillance camera terminal 1 (S24). The synchronization signal may convey the time or simply a reference timing.
- Both the delivery-side and the takeover-side surveillance camera terminals 1 capture the frame image from the imaging unit 12 at a timing based on the synchronization signal (S25, S33).
- Based on the captured images of the delivery area, the takeover-side surveillance camera terminal 1 identifies, in cooperation with the delivery-side terminal, the tracking target person being imaged (S26, S34).
- the tracking target person may be identified by a technique using the first distance energy and the second distance energy described above.
- When the identification is complete, the takeover-side surveillance camera terminal 1 ends the takeover process and starts the tracking process of S4 described above, while the delivery-side surveillance camera terminal 1 ends the tracking process of S4 and returns to S1.
- As described above, the delivery-side and the takeover-side surveillance camera terminals 1 have set the imaging field of view of the imaging unit 12 to the delivery field of view in advance. That is, no time is spent changing the imaging field of view to the delivery field of view after the tracking target person enters the delivery area. Therefore, losing sight of the tracking target person when handing over to another surveillance camera terminal 1 can be suppressed.
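The synchronized capture of S25/S33 can be sketched as follows, assuming the synchronization signal carries a target wall-clock time and a hypothetical `camera.capture` interface:

```python
import time

def capture_at(camera, t_sync):
    """Capture one frame when the local clock reaches t_sync, the moment
    carried by the synchronization signal. Running this on both the
    delivery-side and the takeover-side terminals makes them image the
    delivery area at substantially the same timing.
    """
    time.sleep(max(0.0, t_sync - time.time()))
    return camera.capture()
```

In practice the two terminals' clocks would need to agree to within a fraction of a frame interval, which is why the signal may alternatively convey just a reference timing.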
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Closed-Circuit Television Systems (AREA)
- Studio Devices (AREA)
- Alarm Systems (AREA)
Abstract
Description
Substituting the acquired coordinate pairs into
X=(a1x+b1y+c1)/(a0x+b0y+1)
Y=(a2x+b2y+c2)/(a0x+b0y+1)
yields eight simultaneous equations. The eight coefficients a0, b0, a1, b1, c1, a2, b2, c2 that solve these equations are the first coordinate conversion parameters for the adjacent counterpart surveillance camera terminal 1. The surveillance camera terminal 1 stores the first coordinate conversion parameters in the storage unit 16.
Likewise, substituting into
x=(A1X+B1Y+C1)/(A0X+B0Y+1)
y=(A2X+B2Y+C2)/(A0X+B0Y+1)
yields eight simultaneous equations. The eight coefficients A0, B0, A1, B1, C1, A2, B2, C2 that solve these equations are the second coordinate conversion parameters for the adjacent counterpart surveillance camera terminal 1. The surveillance camera terminal 1 stores the second coordinate conversion parameters in the storage unit 16.
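Each (x, y) → (X, Y) correspondence gives two linear equations in the eight unknowns, so four correspondences determine the parameters. A sketch of solving that 8×8 system (the function name is illustrative):

```python
import numpy as np

def solve_projective_params(src_pts, dst_pts):
    """Solve the eight coefficients (a0, b0, a1, b1, c1, a2, b2, c2) of
        X = (a1*x + b1*y + c1) / (a0*x + b0*y + 1)
        Y = (a2*x + b2*y + c2) / (a0*x + b0*y + 1)
    from four (x, y) -> (X, Y) correspondences. Each correspondence is
    rearranged into two linear equations, giving an 8x8 system.
    """
    A, b = [], []
    for (x, y), (X, Y) in zip(src_pts, dst_pts):
        # X*(a0*x + b0*y + 1) = a1*x + b1*y + c1, moved to linear form
        A.append([-X * x, -X * y, x, y, 1, 0, 0, 0]); b.append(X)
        # Y*(a0*x + b0*y + 1) = a2*x + b2*y + c2, moved to linear form
        A.append([-Y * x, -Y * y, 0, 0, 0, x, y, 1]); b.append(Y)
    return np.linalg.solve(np.array(A, float), np.array(b, float))
```

The second coordinate conversion parameters are obtained the same way with the roles of the two frames swapped.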
(a): φ1 − V ≤ φt ≤ φ1 + V (V is the vertical angle of view when the imaging magnification is z1)
(b): θ1 − H ≤ θt ≤ θ1 + H/2 (H is the horizontal angle of view when the imaging magnification is z1)
When both conditions are satisfied, it is determined that the tracking target person has moved to the vicinity of the delivery area with the adjacent surveillance camera terminal 1.
11…Control unit
12…Imaging unit
13…Drive mechanism unit
14…Imaging visual field control unit
15…Image processing unit
16…Storage unit
17…Timer
18…Communication unit
Claims (5)
- A surveillance camera terminal comprising: imaging means for outputting a frame image obtained by imaging within an imaging field of view; imaging visual field control means for changing the imaging field of view of the imaging means within a monitoring target area; tracking target extraction means for processing the frame image captured by the imaging means and extracting a tracking target being imaged; and tracking means for detecting the position of the tracking target within the monitoring target area based on the position of the tracking target on the frame image extracted by the tracking target extraction means and on the imaging field of view of the imaging means, and for tracking the tracking target, wherein the imaging visual field control means changes the imaging field of view of the imaging means according to the position, within the monitoring target area, of the tracking target being tracked by the tracking means, the surveillance camera terminal further comprising: imaging visual field storage means for storing, for each other surveillance camera terminal with which a tracking target is exchanged, the imaging field of view used when the tracking target is delivered; and detection means for detecting whether the tracking target being tracked by the tracking means has moved to the vicinity of a delivery area in which the tracking target is delivered to or from another surveillance camera terminal, wherein, when the detection means detects that the tracking target being tracked by the tracking means has moved to the vicinity of the delivery area with another surveillance camera terminal, the imaging visual field control means matches the imaging field of view of the imaging means with the imaging field of view, stored in the imaging visual field storage means, used when the tracking target is delivered to or from that other surveillance camera terminal.
- The surveillance camera terminal according to claim 1, wherein, when the detection means detects that the tracking target being tracked by the tracking means has moved to the vicinity of the delivery area with another surveillance camera terminal, the imaging visual field control means changes the imaging field of view of the imaging means while keeping the tracking target within the imaging field of view, and matches it with the imaging field of view used when the tracking target is delivered.
- The surveillance camera terminal according to claim 1 or 2, further comprising synchronization means for synchronizing the imaging timing of the imaging means with that of another surveillance camera terminal with which a tracking target is exchanged.
- The surveillance camera terminal according to any one of claims 1 to 3, further comprising delivery request notification means for making a delivery request to another surveillance camera terminal with which the tracking target is to be exchanged, when the detection means detects that the tracking target being tracked by the tracking means has moved to the vicinity of the delivery area with that other surveillance camera terminal.
- The surveillance camera terminal according to claim 4, wherein, when the delivery request is received from another surveillance camera terminal, the imaging visual field control means matches the imaging field of view of the imaging means with the imaging field of view, stored in the imaging visual field storage means, used when the tracking target is delivered to or from that other surveillance camera terminal.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012505558A JP5325337B2 (ja) | 2010-03-15 | 2011-02-07 | 監視カメラ端末 |
US13/634,735 US9398231B2 (en) | 2010-03-15 | 2011-02-07 | Surveillance camera terminal |
EP11755989.8A EP2549753B1 (en) | 2010-03-15 | 2011-02-07 | Surveillance camera terminal |
CN201180009511.4A CN102754436B (zh) | 2010-03-15 | 2011-02-07 | 监视摄像机终端 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-056960 | 2010-03-15 | ||
JP2010056960 | 2010-03-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011114799A1 true WO2011114799A1 (ja) | 2011-09-22 |
Family
ID=44648906
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/052468 WO2011114799A1 (ja) | 2010-03-15 | 2011-02-07 | 監視カメラ端末 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9398231B2 (ja) |
EP (1) | EP2549753B1 (ja) |
JP (1) | JP5325337B2 (ja) |
CN (1) | CN102754436B (ja) |
WO (1) | WO2011114799A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012124763A (ja) * | 2010-12-09 | 2012-06-28 | Sony Corp | 映像表示装置、映像表示システム、映像表示方法およびプログラム |
US20140218532A1 (en) * | 2012-08-06 | 2014-08-07 | Cloudparc, Inc. | Defining a Handoff Zone for Tracking a Vehicle Between Cameras |
JP2015060327A (ja) * | 2013-09-17 | 2015-03-30 | 株式会社リコー | 投影装置、投影方法及び情報処理システム |
JP2015064800A (ja) * | 2013-09-25 | 2015-04-09 | 株式会社 シリコンプラス | 自立型監視カメラ |
US9171382B2 (en) | 2012-08-06 | 2015-10-27 | Cloudparc, Inc. | Tracking speeding violations and controlling use of parking spaces using cameras |
US9489839B2 (en) | 2012-08-06 | 2016-11-08 | Cloudparc, Inc. | Tracking a vehicle using an unmanned aerial vehicle |
JP2020202000A (ja) * | 2016-12-22 | 2020-12-17 | 日本電気株式会社 | 映像収集システム、映像収集方法及びプログラム |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2913996B1 (en) * | 2012-10-23 | 2021-03-03 | Sony Corporation | Information-processing device, information-processing method, program, and information-processing system |
US9953304B2 (en) * | 2012-12-30 | 2018-04-24 | Buzd, Llc | Situational and global context aware calendar, communications, and relationship management |
RU2534962C2 (ru) * | 2013-01-09 | 2014-12-10 | Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Омский государственный технический университет" | Способ обработки изображения |
EP2757772A3 (en) * | 2013-01-17 | 2017-08-16 | Canon Kabushiki Kaisha | Image pickup apparatus, remote control apparatus, and methods of controlling image pickup apparatus and remote control apparatus |
US10043359B2 (en) * | 2013-03-08 | 2018-08-07 | Denso Wave Incorporated | Apparatus and method of monitoring moving objects |
FR3006842B1 (fr) * | 2013-06-06 | 2015-07-17 | Thales Sa | Systeme de videosurveillance |
US10491810B2 (en) | 2016-02-29 | 2019-11-26 | Nokia Technologies Oy | Adaptive control of image capture parameters in virtual reality cameras |
EP3423974A4 (en) | 2016-03-02 | 2020-05-06 | Tinoq Inc. | SYSTEMS AND METHODS FOR EFFICIENT FACE RECOGNITION |
US10728694B2 (en) | 2016-03-08 | 2020-07-28 | Tinoq Inc. | Systems and methods for a compound sensor system |
EP3436926A4 (en) * | 2016-03-30 | 2019-11-13 | Tinoq Inc. | SYSTEMS AND METHODS FOR USER DETECTION AND RECOGNITION |
AU2017290128B2 (en) * | 2016-06-29 | 2022-08-11 | Vision Quest Industries Incorporated Dba Vq Orthocare | Measurement and ordering system for orthotic devices |
CN107666590B (zh) * | 2016-07-29 | 2020-01-17 | 华为终端有限公司 | 一种目标监控方法、摄像头、控制器和目标监控系统 |
CN106791698A (zh) * | 2017-01-18 | 2017-05-31 | 张岩岩 | 视频监测装置以及系统 |
CN111034189B (zh) * | 2017-08-30 | 2021-03-26 | 三菱电机株式会社 | 拍摄对象追踪装置以及拍摄对象追踪方法 |
JP2019067813A (ja) * | 2017-09-28 | 2019-04-25 | 株式会社デンソー | 半導体モジュール |
JP6977492B2 (ja) | 2017-11-13 | 2021-12-08 | トヨタ自動車株式会社 | 救援システムおよび救援方法、ならびにそれに使用されるサーバおよびプログラム |
JP7052305B2 (ja) | 2017-11-13 | 2022-04-12 | トヨタ自動車株式会社 | 救援システムおよび救援方法、ならびにそれに使用されるサーバおよびプログラム |
JP7000805B2 (ja) | 2017-11-13 | 2022-01-19 | トヨタ自動車株式会社 | 動物救援システムおよび動物救援方法、ならびにそれに使用されるサーバおよびプログラム |
JP6870584B2 (ja) | 2017-11-13 | 2021-05-12 | トヨタ自動車株式会社 | 救援システムおよび救援方法、ならびにそれに使用されるサーバおよびプログラム |
FR3076377B1 (fr) * | 2017-12-29 | 2021-09-24 | Bull Sas | Prediction de deplacement et de topologie pour un reseau de cameras. |
WO2020041352A1 (en) | 2018-08-21 | 2020-02-27 | Tinoq Inc. | Systems and methods for member facial recognition based on context information |
CN110245546A (zh) * | 2018-12-06 | 2019-09-17 | 浙江大华技术股份有限公司 | 一种目标跟踪系统、方法及存储介质 |
TWI688924B (zh) * | 2019-04-15 | 2020-03-21 | 勝品電通股份有限公司 | 追蹤辨識監控系統 |
CN111063145A (zh) * | 2019-12-13 | 2020-04-24 | 北京都是科技有限公司 | 电子围栏智能处理器 |
US11593951B2 (en) * | 2020-02-25 | 2023-02-28 | Qualcomm Incorporated | Multi-device object tracking and localization |
CN111818303A (zh) * | 2020-07-06 | 2020-10-23 | 深圳博为教育科技有限公司 | 一种智能导播方法、装置、系统及控制主机 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002290962A (ja) * | 2001-03-27 | 2002-10-04 | Mitsubishi Electric Corp | 侵入者自動追尾方法および装置並びに画像処理装置 |
JP2005033827A (ja) * | 2004-09-10 | 2005-02-03 | Hitachi Kokusai Electric Inc | 物体監視装置及び監視システム |
JP2005244279A (ja) * | 2004-02-24 | 2005-09-08 | Matsushita Electric Ind Co Ltd | 監視システム、監視装置および監視方法 |
JP2006310901A (ja) | 2005-04-26 | 2006-11-09 | Victor Co Of Japan Ltd | 監視システム及び監視方法 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5696503A (en) * | 1993-07-23 | 1997-12-09 | Condition Monitoring Systems, Inc. | Wide area traffic surveillance using a multisensor tracking system |
US6359647B1 (en) * | 1998-08-07 | 2002-03-19 | Philips Electronics North America Corporation | Automated camera handoff system for figure tracking in a multiple camera system |
JP4250172B2 (ja) * | 2000-02-28 | 2009-04-08 | 株式会社日立国際電気 | 物体検出方法及び物体検出装置 |
JP3643513B2 (ja) * | 2000-03-01 | 2005-04-27 | 株式会社日立国際電気 | 侵入物体監視方法および侵入物体監視装置 |
JP2002092751A (ja) * | 2000-09-18 | 2002-03-29 | Matsushita Electric Ind Co Ltd | 監視システム |
CN1554193A (zh) | 2001-07-25 | 2004-12-08 | | 摄像机控制装置及方法 |
AU2003280516A1 (en) * | 2002-07-01 | 2004-01-19 | The Regents Of The University Of California | Digital processing of video images |
JP4195991B2 (ja) * | 2003-06-18 | 2008-12-17 | パナソニック株式会社 | 監視映像モニタリングシステム、監視映像生成方法、および監視映像モニタリングサーバ |
JP4587166B2 (ja) * | 2004-09-14 | 2010-11-24 | キヤノン株式会社 | 移動体追跡システム、撮影装置及び撮影方法 |
US8760519B2 (en) | 2007-02-16 | 2014-06-24 | Panasonic Corporation | Threat-detection in a distributed multi-camera surveillance system |
JP4937016B2 (ja) * | 2007-07-09 | 2012-05-23 | 三菱電機株式会社 | 監視装置及び監視方法及びプログラム |
-
2011
- 2011-02-07 WO PCT/JP2011/052468 patent/WO2011114799A1/ja active Application Filing
- 2011-02-07 CN CN201180009511.4A patent/CN102754436B/zh active Active
- 2011-02-07 US US13/634,735 patent/US9398231B2/en active Active
- 2011-02-07 EP EP11755989.8A patent/EP2549753B1/en active Active
- 2011-02-07 JP JP2012505558A patent/JP5325337B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002290962A (ja) * | 2001-03-27 | 2002-10-04 | Mitsubishi Electric Corp | 侵入者自動追尾方法および装置並びに画像処理装置 |
JP2005244279A (ja) * | 2004-02-24 | 2005-09-08 | Matsushita Electric Ind Co Ltd | 監視システム、監視装置および監視方法 |
JP2005033827A (ja) * | 2004-09-10 | 2005-02-03 | Hitachi Kokusai Electric Inc | 物体監視装置及び監視システム |
JP2006310901A (ja) | 2005-04-26 | 2006-11-09 | Victor Co Of Japan Ltd | 監視システム及び監視方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2549753A4 |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012124763A (ja) * | 2010-12-09 | 2012-06-28 | Sony Corp | 映像表示装置、映像表示システム、映像表示方法およびプログラム |
US9064414B2 (en) | 2012-08-06 | 2015-06-23 | Cloudparc, Inc. | Indicator for automated parking systems |
US9607214B2 (en) | 2012-08-06 | 2017-03-28 | Cloudparc, Inc. | Tracking at least one object |
US9171382B2 (en) | 2012-08-06 | 2015-10-27 | Cloudparc, Inc. | Tracking speeding violations and controlling use of parking spaces using cameras |
US8982215B2 (en) | 2012-08-06 | 2015-03-17 | Cloudparc, Inc. | Controlling use of parking spaces using cameras and smart sensors |
US8982214B2 (en) | 2012-08-06 | 2015-03-17 | Cloudparc, Inc. | Controlling use of parking spaces using cameras and smart sensors |
US9165467B2 (en) * | 2012-08-06 | 2015-10-20 | Cloudparc, Inc. | Defining a handoff zone for tracking a vehicle between cameras |
US10521665B2 (en) | 2012-08-06 | 2019-12-31 | Cloudparc, Inc. | Tracking a vehicle using an unmanned aerial vehicle |
US9036027B2 (en) | 2012-08-06 | 2015-05-19 | Cloudparc, Inc. | Tracking the use of at least one destination location |
US9208619B1 (en) | 2012-08-06 | 2015-12-08 | Cloudparc, Inc. | Tracking the use of at least one destination location |
US20140218532A1 (en) * | 2012-08-06 | 2014-08-07 | Cloudparc, Inc. | Defining a Handoff Zone for Tracking a Vehicle Between Cameras |
US9858480B2 (en) | 2012-08-06 | 2018-01-02 | Cloudparc, Inc. | Tracking a vehicle using an unmanned aerial vehicle |
US8982213B2 (en) | 2012-08-06 | 2015-03-17 | Cloudparc, Inc. | Controlling use of parking spaces using cameras and smart sensors |
US9064415B2 (en) | 2012-08-06 | 2015-06-23 | Cloudparc, Inc. | Tracking traffic violations within an intersection and controlling use of parking spaces using cameras |
US9330303B2 (en) | 2012-08-06 | 2016-05-03 | Cloudparc, Inc. | Controlling use of parking spaces using a smart sensor network |
US9390319B2 (en) | 2012-08-06 | 2016-07-12 | Cloudparc, Inc. | Defining destination locations and restricted locations within an image stream |
US9489839B2 (en) | 2012-08-06 | 2016-11-08 | Cloudparc, Inc. | Tracking a vehicle using an unmanned aerial vehicle |
US8937660B2 (en) | 2012-08-06 | 2015-01-20 | Cloudparc, Inc. | Profiling and tracking vehicles using cameras |
US9652666B2 (en) | 2012-08-06 | 2017-05-16 | Cloudparc, Inc. | Human review of an image stream for a parking camera system |
JP2015060327A (ja) * | 2013-09-17 | 2015-03-30 | 株式会社リコー | 投影装置、投影方法及び情報処理システム |
JP2015064800A (ja) * | 2013-09-25 | 2015-04-09 | 株式会社 シリコンプラス | 自立型監視カメラ |
JP7040580B2 (ja) | 2016-12-22 | 2022-03-23 | 日本電気株式会社 | 映像収集システム、映像収集方法及びプログラム |
JP2020202000A (ja) * | 2016-12-22 | 2020-12-17 | 日本電気株式会社 | 映像収集システム、映像収集方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2549753A1 (en) | 2013-01-23 |
CN102754436B (zh) | 2015-07-22 |
EP2549753A4 (en) | 2014-06-04 |
JP5325337B2 (ja) | 2013-10-23 |
US20130002869A1 (en) | 2013-01-03 |
CN102754436A (zh) | 2012-10-24 |
EP2549753B1 (en) | 2019-04-10 |
US9398231B2 (en) | 2016-07-19 |
JPWO2011114799A1 (ja) | 2013-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5325337B2 (ja) | 監視カメラ端末 | |
JPWO2011114770A1 (ja) | 監視カメラ端末 | |
WO2019138836A1 (ja) | 情報処理装置、情報処理システム、情報処理方法およびプログラム | |
JP2011193187A (ja) | 監視カメラ端末 | |
JP3905116B2 (ja) | 検出領域調整装置 | |
JP3700707B2 (ja) | 計測システム | |
Qureshi et al. | Planning ahead for PTZ camera assignment and handoff | |
JP2006086591A (ja) | 移動体追跡システム、撮影装置及び撮影方法 | |
JP4418805B2 (ja) | 検出領域調整装置 | |
JP2005117542A (ja) | 監視システム | |
JP4475164B2 (ja) | 監視システム及び監視方法 | |
JP5686435B2 (ja) | 監視システム、監視カメラ端末、および動作モード制御プログラム | |
KR101821159B1 (ko) | 다수의 카메라를 이용한 이동체의 이동 경로 추적 시스템 | |
US9930119B2 (en) | Heterogeneous cellular object tracking and surveillance network | |
JP2011101165A (ja) | 連動撮影システム | |
TWI471825B (zh) | 天台安全監控系統及方法 | |
JP4300060B2 (ja) | 監視システム及び監視端末 | |
JP5397281B2 (ja) | 監視カメラ端末 | |
JP2002199382A (ja) | 動画像処理カメラ及びこれを用いた画像処理システム | |
JP2007208659A (ja) | 連携カメラシステム | |
KR20200010691A (ko) | 이종 카메라를 활용한 인물인식 및 교통통제시스템 | |
KR101822429B1 (ko) | 객체 정보 획득 장치 및 그 방법 | |
JP2005244279A (ja) | 監視システム、監視装置および監視方法 | |
KR101332820B1 (ko) | 객체 추적 시스템 및 방법, 객체관리장치 및 그 장치의 구동방법, 촬영장치 및 그 장치의 구동방법 | |
JPWO2003088672A1 (ja) | 監視システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201180009511.4; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11755989; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2012505558; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 13634735; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2011755989; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |