WO2011114770A1 - Surveillance Camera Terminal - Google Patents
Surveillance Camera Terminal
- Publication number
- WO2011114770A1 (PCT/JP2011/051048)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- terminal
- wireless communication
- unit
- camera terminal
- imaging
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19645—Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
Definitions
- the present invention relates to a monitoring camera terminal that processes a frame image obtained by imaging a monitoring target area assigned to the own terminal and tracks a person located in the monitoring target area.
- In particular, the present invention relates to a surveillance camera terminal that hands over a tracked person to another surveillance camera terminal whose monitoring target area partially overlaps its own, thereby tracking the person over a wide area.
- A wide area surveillance system uses a plurality of monitoring camera terminals to detect a person who has taken an unusual action (hereinafter simply referred to as a suspicious person) and to track the detected suspicious person.
- the monitoring camera terminal to be tracked is switched according to the movement of the suspicious person.
- the suspicious person is handed over to the monitoring camera terminal taking over the tracking.
- A part of the monitoring target area is shared between the monitoring camera terminals that can hand over the suspicious person.
- While the suspicious person is located in this overlapping area (hereinafter referred to as the delivery area), the surveillance camera terminal taking over the tracking identifies the suspicious person among the persons being imaged, using the frame image captured at that time.
- If this identification fails, the surveillance camera terminal taking over the tracking will track a person other than the suspicious person tracked so far, and tracking of the suspicious person fails.
- It is also common to use surveillance camera terminals that change their imaging field of view according to the movement of the suspicious person, thereby enlarging the monitoring target area assigned to each terminal (see Patent Document 1).
- wireless communication between both monitoring camera terminals may be performed not directly but through a wireless communication channel via another monitoring camera terminal.
- When a surveillance camera terminal tracking a suspicious person forms a wireless communication path with the terminal taking over the tracking, a path routed via another surveillance camera terminal may be formed, depending on conditions such as reception strength.
- In that case, the delay of communication between the two terminals handing over the suspicious person increases.
- Identification of the suspicious person in the delivery area is then delayed, so tracking cannot be performed in real time and a security guard or the like cannot be properly instructed to deal with the suspicious person. That is, the suspicious person cannot be caught and will be missed.
- Moreover, because of the delayed identification, neither of the surveillance camera terminals handing over the suspicious person may obtain a frame image of the suspicious person while that person is still located in the delivery area, and tracking fails.
- The object of the present invention is to provide a surveillance camera terminal that identifies an object in the delivery area, together with the terminal taking over tracking of the object, while the object is still located in the delivery area, so that the object can be tracked in real time.
- the surveillance camera terminal of the present invention is configured as follows.
- the object extracting means processes a frame image obtained by imaging the monitoring target area assigned to the own terminal by the imaging means, and extracts the imaged object.
- the tracking means tracks the object extracted by the object extraction means in the monitoring target area. For example, the frame images captured by the imaging means are processed in time series, and the position of the object is tracked as time passes.
- The delivery means performs wireless communication with a partner terminal whose assigned monitoring target area partially overlaps that of the own terminal, and hands over the tracked object in the delivery area where the monitoring target areas overlap. That is, when the tracked object moves from the monitoring target area of the own terminal into that of the partner terminal, the partner terminal identifies the tracked object in the delivery area and takes over its tracking.
- When the delivery means hands over the object, the wireless communication path forming means forms a wireless communication path over which the wireless communication means communicates directly with the partner terminal involved in the handover.
- This suppresses the delay of wireless communication between the terminals at handover time, so the partner terminal can identify the tracked object while it is still located in the delivery area. As a result, the object can be tracked in real time.
- The wireless communication means may comprise a first wireless communication unit having an omnidirectional antenna and a second wireless communication unit having a directional antenna.
- In that case, when the delivery means hands over an object, the wireless communication path forming means may cause the first wireless communication unit to transmit a delivery request to the partner terminal, and may change the directivity of the directional antenna to a direction suitable for direct communication with that partner terminal.
- The partner terminal receiving the delivery request may likewise change the directivity of its directional antenna to a direction suitable for direct communication with the terminal that transmitted the request (the terminal tracking the object at that point).
- the wireless communication between the terminals at the time of delivery of the object may be performed by the second wireless communication unit.
- Thus, sufficient reception strength can be secured for the wireless communication between the terminals at handover time, and communication delay due to reduced throughput can also be suppressed.
- prediction means may be provided for predicting a partner terminal to which the object being tracked is to be delivered based on the movement path of the object being tracked in the area to be monitored by the tracking means.
- the wireless communication channel forming means may be configured to form a wireless communication channel that directly communicates with the opposite terminal predicted by the prediction means. As a result, it is possible to form a wireless communication path for direct communication between the terminals that deliver the object before the tracked object enters the delivery area.
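As an illustrative sketch (not from the patent text), the prediction means could extrapolate the tracked object's recent movement path and check which neighboring terminal's delivery area the predicted position falls in. The rectangle representation of delivery areas and the linear extrapolation used here are assumptions:

```python
def predict_partner(track, delivery_areas, horizon=3):
    """Predict the partner terminal to which the tracked object should be
    handed over, from the object's movement path.
    track: recent positions [(x, y), ...] in the monitoring-target-area
    coordinate system; delivery_areas: {terminal_id: (xmin, ymin, xmax, ymax)}.
    Linear extrapolation is an illustrative choice, not from the patent."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    # Extrapolate the last step `horizon` steps ahead.
    px, py = x1 + (x1 - x0) * horizon, y1 + (y1 - y0) * horizon
    for terminal, (xmin, ymin, xmax, ymax) in delivery_areas.items():
        if xmin <= px <= xmax and ymin <= py <= ymax:
            return terminal
    return None  # no handover expected yet
```

With such a prediction, the direct wireless communication path can be formed before the object actually enters the delivery area.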
- the imaging unit may have a fixed imaging field of view, or may be configured to change the imaging field of view according to the position of the object being tracked by the tracking unit by the imaging field control unit.
- According to the present invention, the terminal taking over tracking of an object can identify the object in the delivery area while the object is still located there, so the object can be tracked in real time.
- FIG. 1 is a schematic view showing the configuration of a wide area surveillance system using a surveillance camera terminal according to an embodiment of the present invention.
- This wide area surveillance system is a network system having a plurality of surveillance camera terminals 1 (1A to 1H).
- the wide area monitoring system is, for example, an ad hoc network system.
- Data communication can be performed directly between the monitoring camera terminals 1 or via another monitoring camera terminal 1.
- the data communication between the monitoring camera terminals 1 is performed by wireless communication.
- the number of monitoring camera terminals 1 constituting the wide area monitoring system is not limited to eight illustrated here, and may be any number as long as it is plural. Further, a line connecting between the monitoring camera terminals 1 shown in FIG. 1 is a link. Further, in the following description, the monitoring camera terminals 1A to 1H are referred to as the monitoring camera terminal 1 when the monitoring camera terminals 1A to 1H are described without distinction.
- FIG. 2 is a diagram showing the configuration of the main part of the monitoring camera terminal.
- The monitoring camera terminal 1 includes a control unit 11, an imaging unit 12, a drive mechanism unit 13, an imaging field control unit 14, an image processing unit 15, a storage unit 16, a timer 17, a first wireless communication unit 18, a second wireless communication unit 19, and a directivity control unit 20.
- the control unit 11 controls the operation of each unit of the main body.
- The imaging unit 12 outputs frame images obtained by imaging its imaging field (imaging area) at about 30 frames per second.
- the imaging unit 12 is attached to a camera platform (not shown) that individually rotates in the pan direction and the tilt direction.
- the pan direction and the tilt direction are orthogonal to each other.
- the imaging unit 12 also includes an optical system drive unit (not shown) that drives the imaging optical system, and can change the imaging magnification Zoom.
- the drive mechanism unit 13 has a drive source such as a motor that rotates the camera platform to which the imaging unit 12 is attached in the pan direction and in the tilt direction.
- the camera also has a sensor that detects a rotation angle ⁇ applied in the pan direction of the camera platform and a rotation angle ⁇ applied in the tilt direction.
- The imaging field control unit 14 instructs the imaging unit 12 on the imaging magnification Zoom.
- the imaging unit 12 changes the imaging magnification Zoom in accordance with this instruction.
- the imaging view field control unit 14 instructs the drive mechanism unit 13 about the rotation angle ⁇ applied in the pan direction of the camera platform to which the imaging unit 12 is attached and the rotation angle ⁇ applied in the tilt direction.
- the drive mechanism unit 13 changes the rotation angle ⁇ applied in the pan direction of the camera platform to which the imaging unit 12 is attached and the rotation angle ⁇ applied in the tilt direction.
- the imaging field of the imaging unit 12 changes with the changes of the rotation angle ⁇ applied in the pan direction of the camera platform, the rotation angle ⁇ applied in the tilt direction of the camera platform, and the imaging magnification Zoom.
- a monitoring target area is defined in the monitoring camera terminal 1.
- the monitoring target area is a range that can be imaged by changing the imaging field of the imaging unit 12.
- the image processing unit 15 processes the frame image captured by the imaging unit 12, extracts the person being imaged, and attaches an ID to the person extracted here. This ID is a unique value that can identify a person.
- the image processing unit 15 also receives an output of a sensor that detects a rotation angle ⁇ in the pan direction of the camera platform to which the imaging unit 12 is attached and a rotation angle ⁇ in the tilt direction. That is, the image processing unit 15 obtains the rotation angle ⁇ applied in the pan direction of the camera platform to which the imaging unit 12 is attached and the rotation angle ⁇ applied in the tilt direction from the output of the sensor. Further, the image processing unit 15 obtains the imaging magnification Zoom of the imaging unit 12 based on the signal of the imaging magnification Zoom input from the imaging unit 12.
- Accordingly, the image processing unit 15 can obtain the imaging field of the imaging unit 12, that is, the position of the imaging area within the monitoring target area, from the pan rotation angle θ, the tilt rotation angle φ, and the imaging magnification Zoom. The image processing unit 15 can therefore convert the position of a person on the frame image captured by the imaging unit 12 into a position in the monitoring target area.
- The image processing unit 15 instructs the imaging field control unit 14 to adjust the imaging field of the imaging unit 12 according to the movement of the person being tracked.
- the image processing unit 15 is configured to perform extraction and tracking of a person being imaged using, for example, a space-time MRF (Markov Random Field) model.
- the space-time MRF model is an extension of the MRF model as a space-time model, focusing on the correlation in the time axis direction of the space-time image.
- The spatio-temporal MRF model divides each frame image to be processed into blocks of several pixels × several pixels (for example, 8 × 8 pixels), and defines the correlation in the time-axis direction with reference to the motion vector of each block between temporally consecutive frame images.
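The per-block motion vectors that the spatio-temporal MRF model refers to can be sketched as a minimal block-matching search between consecutive frames. The sum-of-absolute-differences criterion and the search window size used here are illustrative assumptions, not taken from the patent:

```python
def block_motion_vectors(prev, curr, block=8, search=4):
    """For each block×block region of `prev`, find the displacement (dx, dy)
    within ±search whose region in `curr` gives the minimum sum of absolute
    differences (SAD). Frames are 2-D lists of gray levels."""
    h, w = len(prev), len(prev[0])
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            best = None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    # Skip candidate windows that fall outside the frame.
                    if not (0 <= by + dy <= h - block and 0 <= bx + dx <= w - block):
                        continue
                    sad = sum(abs(prev[by + r][bx + c] - curr[by + dy + r][bx + dx + c])
                              for r in range(block) for c in range(block))
                    if best is None or sad < best[0]:
                        best = (sad, (dx, dy))
            vectors[(bx, by)] = best[1]
    return vectors
```

Each entry maps a block origin to the displacement linking it to the next frame, which is the time-axis correlation the model builds on.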
- the storage unit 16 stores an operation program for operating the main body, setting data used at the time of operation, processing data generated at the time of operation, and the like.
- the timer 17 clocks the current time.
- the first wireless communication unit 18 is connected to the omnidirectional antenna 18 a and controls wireless communication with another monitoring camera terminal 1.
- the directional antenna 20 a is connected to the second wireless communication unit 19 via the directivity control unit 20.
- the directivity control unit 20 changes the direction of directivity of the directional antenna 20a.
- the second wireless communication unit 19 also controls wireless communication with another monitoring camera terminal 1.
- This wide area surveillance system is a system for tracking a person such as a suspicious person who has taken an unusual action (hereinafter simply referred to as a suspicious person).
- the monitoring target area is assigned to each monitoring camera terminal 1. Further, as shown in FIG. 4, in the two adjacent monitoring camera terminals 1A and 1B, a part of the monitoring target area overlaps.
- Although FIG. 4 illustrates the monitoring target areas of the two adjacent monitoring camera terminals 1A and 1B, the monitoring target areas also partially overlap for other combinations of two adjacent monitoring camera terminals 1.
- the overlapping area is a delivery area for delivering the suspicious person from one monitoring camera terminal 1A (or 1B) to the other monitoring camera terminal 1B (or 1A).
- When the suspicious person being tracked by the surveillance camera terminal 1A passes through the delivery area and enters the monitoring target area of the surveillance camera terminal 1B, the surveillance camera terminal 1B identifies the suspicious person in this delivery area and takes over the tracking from the surveillance camera terminal 1A.
- The imaging field of view of the imaging unit 12 at handover time (the pan rotation angle θ, the tilt rotation angle φ, and the imaging magnification Zoom) is stored in the storage unit 16.
- the imaging field of view of the imaging unit 12 at the time of passing a suspicious person includes a delivery area with the monitoring camera terminal 1 on the other side of the delivery of the suspicious person.
- For each adjacent monitoring camera terminal 1, each terminal stores in its storage unit 16 coordinate conversion information indicating the relative positional relationship between the two-dimensional coordinate system of the frame image that its own imaging unit 12 captures of the delivery area and the two-dimensional coordinate system of the frame image that the adjacent partner terminal's imaging unit 12 captures of the same delivery area.
- This coordinate conversion information allows the two-dimensional coordinate system of the frame image captured in the own terminal's handover imaging field and that of the frame image captured in the adjacent partner terminal's handover imaging field to be projectively transformed onto a common coordinate system.
- a first coordinate conversion parameter and a second coordinate conversion parameter shown below are stored in the storage unit 16.
- The first coordinate conversion parameter is a parameter for projectively transforming the two-dimensional coordinate system of the frame image captured in the own terminal's handover imaging field into the two-dimensional coordinate system of the frame image captured in the adjacent partner terminal's handover imaging field.
- The second coordinate conversion parameter is a parameter for projectively transforming the two-dimensional coordinate system of the frame image captured in the adjacent partner terminal's handover imaging field into the two-dimensional coordinate system of the frame image captured in the own terminal's handover imaging field.
- the coordinate conversion information may be only one of the first coordinate conversion parameter and the second coordinate conversion parameter.
- the first coordinate conversion parameter and the second coordinate conversion parameter are values calculated using a frame image actually captured when the monitoring camera terminal 1 is installed.
- The own terminal processes the frame image captured in its handover imaging field and detects the coordinate positions (x, y) of the four marked points on the frame image.
- The eight constants a0, b0, a1, b1, c1, a2, b2, c2, which are the solution of the resulting simultaneous equations in eight unknowns, are the first coordinate conversion parameters for this adjacent partner monitoring camera terminal 1.
- the monitoring camera terminal 1 stores the first coordinate conversion parameter in the storage unit 16.
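The computation of the eight constants from four point correspondences can be sketched as follows, assuming the common projective-transform form X = (a1·x + b1·y + c1)/(a0·x + b0·y + 1), Y = (a2·x + b2·y + c2)/(a0·x + b0·y + 1). The function names and the exact equation form are assumptions, since the patent's equations are not reproduced in this text:

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_projective(src, dst):
    """Eight constants (a0, b0, a1, b1, c1, a2, b2, c2) from four marked
    point pairs (x, y) -> (X, Y), by solving the 8-unknown linear system."""
    A, b = [], []
    for (x, y), (X, Y) in zip(src, dst):
        # X*(a0*x + b0*y + 1) = a1*x + b1*y + c1, rearranged to be linear:
        A.append([-x * X, -y * X, x, y, 1, 0, 0, 0]); b.append(X)
        A.append([-x * Y, -y * Y, 0, 0, 0, x, y, 1]); b.append(Y)
    return solve_linear(A, b)

def apply_projective(p, pt):
    """Map a point through the fitted projective transform."""
    a0, b0, a1, b1, c1, a2, b2, c2 = p
    x, y = pt
    w = a0 * x + b0 * y + 1
    return ((a1 * x + b1 * y + c1) / w, (a2 * x + b2 * y + c2) / w)
```

The second coordinate conversion parameter would be obtained the same way with `src` and `dst` swapped.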
- To identify the persons located in the delivery area, the terminal acquires the coordinate position on its own frame image for each person captured in the delivery area, and also acquires from the partner terminal the coordinate position on the partner's frame image for each person captured there.
- a combination pattern is created in which a person located in the delivery area captured by the own terminal and a person located in the delivery area captured by the other-party terminal are associated on a one-to-one basis.
- The number of patterns created here is, for example, two if there are two persons located in the delivery area and six if there are three; in general there are n! patterns for n persons.
- the monitoring camera terminal 1 converts the coordinate position of the person into the coordinate system of the opposite terminal using the first coordinate conversion parameter for each person located in the delivery area captured by the own terminal.
- the monitoring camera terminal 1 calculates, for each combination pattern of persons, a first distance energy which is a sum of distances between corresponding persons in the coordinate system of the opposite terminal.
- the monitoring camera terminal 1 also converts the coordinate position of the person into the coordinate system of the own terminal using the second coordinate conversion parameter for each person located in the delivery area captured by the other party terminal.
- the monitoring camera terminal 1 calculates, for each combination pattern of persons, a second distance energy which is a sum of distances between corresponding persons in the coordinate system of the own terminal.
- The surveillance camera terminal 1 then determines that the persons located in the delivery area correspond to one another according to the combination pattern for which the sum of the first distance energy and the second distance energy is the smallest. Therefore, when the suspicious person is located in the delivery area, the monitoring camera terminal 1 taking over the tracking can identify the suspicious person by obtaining, from the terminal currently tracking, which of the persons located in the delivery area is the suspicious person.
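The combination-pattern search described above can be sketched as follows. The dictionary-based interface and the Euclidean distance are illustrative assumptions:

```python
from itertools import permutations

def match_persons(own_pts, partner_pts, to_partner, to_own):
    """Pick the one-to-one pairing of persons in the delivery area that
    minimizes the sum of the first and second distance energies.
    own_pts / partner_pts: {person_id: (x, y)} in each terminal's frame.
    to_partner / to_own: coordinate-conversion functions standing in for
    the first and second coordinate conversion parameters."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    own_ids = list(own_pts)
    best = None
    for perm in permutations(partner_pts):  # all n! combination patterns
        # First distance energy: own positions mapped to partner's system.
        e1 = sum(dist(to_partner(own_pts[i]), partner_pts[j])
                 for i, j in zip(own_ids, perm))
        # Second distance energy: partner positions mapped to own system.
        e2 = sum(dist(own_pts[i], to_own(partner_pts[j]))
                 for i, j in zip(own_ids, perm))
        if best is None or e1 + e2 < best[0]:
            best = (e1 + e2, dict(zip(own_ids, perm)))
    return best[1]  # mapping: own person id -> partner person id
```

Once this mapping is known, the take-over terminal only needs to be told which own-side ID is the suspicious person.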
- Alternatively, to identify the suspicious person located in the delivery area, the monitoring camera terminal 1 tracking the suspicious person may notify the terminal taking over the tracking of the template information used for detecting the suspicious person, and the latter may identify the suspicious person using this template information.
- Feature information used to detect the suspicious person, for example image data of the suspicious person cut out from the frame image, information identifying the suspicious person by shape, color, size, or the like, or the shape and position of facial parts such as the eyes, nose, and mouth, may also be notified to the terminal taking over the tracking.
- FIG. 5 is a flowchart showing the operation of the monitoring camera terminal 1.
- The monitoring camera terminal 1 starts a search process for searching for a suspicious person in the monitoring target area (S1). At this point, the surveillance camera terminal 1 is not yet tracking a suspicious person (the tracking process described later is not running).
- The monitoring camera terminal 1 waits until a suspicious person is detected in the search process started in S1 (S2) or until a handover request for a suspicious person is received from an adjacent monitoring camera terminal 1 (S3).
- The search process started in S1 changes the imaging field of the imaging unit 12 at fixed intervals (every few seconds). Specifically, based on predetermined conditions, the imaging field control unit 14 instructs the drive mechanism unit 13, at predetermined time intervals, of the pan rotation angle θ and the tilt rotation angle φ of the camera platform to which the imaging unit 12 is attached. The conditions are set, for example, so that the imaging field of the imaging unit 12 scans the entire monitoring target area.
- the imaging magnification is a predetermined imaging magnification.
- The image processing unit 15 processes the frame images captured by the imaging unit 12, extracts the persons being captured, and assigns an ID to each extracted person. At this time, the image processing unit 15 does not assign an ID to a person who already has one. Further, for each extracted person, the image processing unit 15 creates an object map that associates the ID with the person's position in the monitoring target area. The time indicated by the timer 17 at creation is added to this object map as a time stamp. Using temporally consecutive object maps, the image processing unit 15 analyzes the behavior of each person who has been given an ID. If any person's behavior matches a predetermined suspicious action pattern, that person is detected as a suspicious person. Alternatively, face images of persons requiring caution may be registered in advance, and a person may be detected as a suspicious person when the face of a person captured by the imaging unit 12 matches a registered face image.
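A minimal sketch of the object map and of a behavior analysis over consecutive maps follows. The patent text does not specify the suspicious action patterns, so the loitering criterion below is purely illustrative:

```python
import time

def make_object_map(detections, timer=time.time):
    """Object map: extracted person IDs mapped to their positions in the
    monitoring target area, stamped with the creation time (timer 17)."""
    return {"timestamp": timer(), "positions": dict(detections)}

def loitering_ids(object_maps, radius=1.0):
    """Toy behavior analysis over temporally consecutive object maps:
    flag IDs whose position stayed within `radius` across all maps.
    Purely an illustrative stand-in for the patent's action patterns."""
    # Only persons present in every consecutive map can be analyzed.
    ids = set(object_maps[0]["positions"])
    for m in object_maps[1:]:
        ids &= set(m["positions"])
    flagged = []
    for pid in ids:
        xs = [m["positions"][pid][0] for m in object_maps]
        ys = [m["positions"][pid][1] for m in object_maps]
        if max(xs) - min(xs) <= radius and max(ys) - min(ys) <= radius:
            flagged.append(pid)
    return flagged
```

A real implementation would analyze richer patterns (speed, direction changes, entering restricted zones), but the map structure (ID, position, timestamp) is as described above.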
- The first wireless communication unit 18 waits for a handover request addressed to the own terminal from another monitoring camera terminal 1. Because the first wireless communication unit 18 is connected to the omnidirectional antenna 18a, it may also receive a handover request addressed to some other terminal. In that case, the terminal functions as a relay node: it selects the transfer destination of the received request from among the terminals other than the one that sent it (including the terminal the request is addressed to), and forwards the request to the selected terminal.
- For example, a handover request that the monitoring camera terminal 1A shown in FIG. transmits to the monitoring camera terminal 1B may be sent along the path 1A→1F→1C→1B or along the path 1A→1G→1B. In this case, the monitoring camera terminals 1F and 1C, or 1G, function as relay nodes.
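The relay-node forwarding in this example can be illustrated with a simple route search. The adjacency table below is assumed from the example deployment (terminals 1A to 1G), and breadth-first search is one plausible forwarding policy; the patent only requires that each relay forwards the request to a terminal other than the one that sent it.

```python
from collections import deque

# Assumed adjacency of the example deployment: 1A-1F-1C-1B and 1A-1G-1B.
NEIGHBORS = {
    "1A": ["1F", "1G"],
    "1F": ["1A", "1C"],
    "1C": ["1F", "1B"],
    "1G": ["1A", "1B"],
    "1B": ["1C", "1G"],
}

def relay_route(src, dst):
    """Breadth-first route search: each hop forwards to a terminal other than the sender."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in NEIGHBORS[path[-1]]:
            if nxt not in seen:   # never forward back to the terminal the request came from
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

With this table, `relay_route("1A", "1B")` yields the shorter of the two example paths, via relay node 1G.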
- When the surveillance camera terminal 1 detects a suspicious person, it starts the tracking process (S4). When it receives a handover request addressed to itself, it executes the takeover process (S5).
- Hereinafter, the monitoring camera terminal 1 that executes the tracking process of S4 is referred to as the delivery-side monitoring camera terminal 1, and the monitoring camera terminal 1 that executes the takeover process of S5 is referred to as the takeover-side monitoring camera terminal 1.
- FIG. 6 is a flowchart showing this tracking process.
- The image processing unit 15 of the delivery-side monitoring camera terminal 1 takes in the frame image captured by the imaging unit 12 (S11), processes it, and calculates the imaging field of view that places the suspicious person at the center of the imaging unit 12, that is, the rotation angle θ in the pan direction and the rotation angle φ in the tilt direction (S12).
- The imaging magnification Zoom of the imaging unit 12 is kept at a predetermined imaging magnification.
- an ID is assigned to the suspicious person when it is detected.
- The delivery-side monitoring camera terminal 1 adjusts the imaging field of view of the imaging unit 12 to the field calculated in S12 (S13). That is, the rotation angle θ in the pan direction of the camera platform and the rotation angle φ in the tilt direction are adjusted to the angles calculated in S12. Specifically, the image processing unit 15 notifies the imaging field-of-view control unit 14 of the calculated rotation angle θ in the pan direction and rotation angle φ in the tilt direction. Based on this notification, the imaging field-of-view control unit 14 has the drive mechanism unit 13 change the rotation angle θ in the pan direction of the camera platform and the rotation angle φ in the tilt direction.
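The pan/tilt calculation of S12-S13 can be sketched under a pinhole-camera assumption. The focal-length parameter and the angle model are illustrative; the patent does not specify how θ and φ are derived from the image.

```python
import math

def pan_tilt_to_center(target_px, frame_size, focal_px):
    """Return incremental pan/tilt rotations (degrees) that bring target_px to the
    frame center, assuming a pinhole camera with focal length focal_px (in pixels).
    This model is an assumption; the patent only states that S12 computes th and phi."""
    cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
    dx, dy = target_px[0] - cx, target_px[1] - cy
    theta = math.degrees(math.atan2(dx, focal_px))  # pan: horizontal pixel offset
    phi = math.degrees(math.atan2(dy, focal_px))    # tilt: vertical pixel offset
    return theta, phi
```

A person already at the frame center yields zero rotations; offsets map to the angles handed to the field-of-view control unit.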
- the delivery-side monitoring camera terminal 1 predicts another adjacent monitoring camera terminal 1 (takeover-side monitoring camera terminal 1) that takes over tracking of the suspicious person (S14).
- The delivery-side monitoring camera terminal 1 determines whether the suspicious person being tracked is located around the delivery area shared with the adjacent takeover-side monitoring camera terminal 1 predicted in S14 (S15). If the delivery-side monitoring camera terminal 1 determines that the suspicious person is not located around that delivery area, the processing returns to S11 and the above steps are repeated.
- In this way, the delivery-side monitoring camera terminal 1 can track the suspicious person X while keeping the suspicious person X substantially at the center of the imaging field of view of the imaging unit 12.
- During this tracking process as well, the image processing unit 15 creates object maps that associate the suspicious person's ID with the position in the monitoring target area. By using these object maps, the movement direction of the suspicious person and a movement vector reflecting the movement speed can be obtained.
- The delivery-side monitoring camera terminal 1 uses this movement vector to predict, in S14, the takeover-side monitoring camera terminal 1 that will take over tracking of the suspicious person.
- A threshold distance from the delivery area shared with the takeover-side monitoring camera terminal 1 may be set in advance, and the determination may be based on whether the distance between the suspicious person's position and the delivery area (for example, its center or boundary) is at or below this threshold. Alternatively, taking the suspicious person's movement speed into account, whether the suspicious person is located around the delivery area may be determined by whether the time needed for the tracked person to reach the delivery area has fallen to or below a predetermined time (for example, 2 to 3 seconds).
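The movement-vector and proximity tests described above can be sketched as follows; the thresholds and the use of the delivery area's center point are the illustrative choices the text itself mentions, and the function names are assumptions.

```python
import math

def movement_vector(pos_prev, pos_now, t_prev, t_now):
    """Velocity (units/s) from two temporally consecutive object-map entries."""
    dt = t_now - t_prev
    return ((pos_now[0] - pos_prev[0]) / dt, (pos_now[1] - pos_prev[1]) / dt)

def near_handover_area(pos, velocity, area_center,
                       dist_threshold=None, time_threshold=None):
    """True if the tracked person is 'around' the delivery area, judged either by
    distance to the area center or by estimated time to reach it (e.g. 2-3 s)."""
    dist = math.hypot(area_center[0] - pos[0], area_center[1] - pos[1])
    if dist_threshold is not None and dist <= dist_threshold:
        return True
    if time_threshold is not None:
        speed = math.hypot(*velocity)
        if speed > 0 and dist / speed <= time_threshold:
            return True
    return False
```

Either criterion alone suffices; the patent offers both as alternatives for the S15 decision.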
- When it is determined in S15 that the suspicious person being tracked is located around the delivery area shared with the takeover-side monitoring camera terminal 1 predicted in S14, the delivery-side monitoring camera terminal 1 transmits a handover request to that takeover-side monitoring camera terminal 1 (S16).
- The handover request of S16 is transmitted from the first wireless communication unit 18.
- This handover request is not necessarily received directly by the takeover-side monitoring camera terminal 1; it may reach the takeover-side monitoring camera terminal 1 over a wireless path via other monitoring camera terminals 1.
- The takeover-side monitoring camera terminal 1 that receives a handover request addressed to itself executes the takeover process of S5.
- Next, the delivery-side monitoring camera terminal 1 uses the directivity control unit 20 to change the directivity direction of the directional antenna 20a to the direction suitable for direct communication with the takeover-side monitoring camera terminal 1 to which the current handover request was sent (S17).
- The directivity control unit 20 stores, for each adjacent monitoring camera terminal 1, the directivity direction of the directional antenna 20a suitable for direct communication with that terminal.
- The delivery-side monitoring camera terminal 1 changes the imaging magnification Zoom to the handover-time imaging magnification (S18).
- The image processing unit 15 takes in and processes the frame image captured by the imaging unit 12 (S19), and determines whether the suspicious person would go out of frame if the imaging field of view of the imaging unit 12 were set to the handover-time imaging field of view (S20).
- If the suspicious person would go out of frame, an imaging field of view that contains the suspicious person (the rotation angle θ in the pan direction of the camera platform and the rotation angle φ in the tilt direction) is calculated (S21).
- This imaging field of view may be centered on the suspicious person, but it is preferable to shift the suspicious person off center and calculate a field of view close to the handover-time imaging field of view.
- The delivery-side monitoring camera terminal 1 adjusts the imaging field of view of the imaging unit 12 to the field calculated in S21 (S22), and returns to S17.
- If the suspicious person would not go out of frame, the delivery-side monitoring camera terminal 1 decides to use, as the imaging field of view of the imaging unit 12, the handover-time imaging field of view stored in the storage unit 16 (S23), and adjusts the imaging field of view of the imaging unit 12 to that handover-time field (S24).
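The frame-out check and field-of-view choice of S20-S23 can be sketched in miniature. Positions and view sizes here are abstract coordinates, and the `margin` used to keep the person just inside the frame is an assumption.

```python
def fits_in_view(target, view_center, view_half_size):
    """Would the target stay in frame if the view were centered at view_center?"""
    return (abs(target[0] - view_center[0]) <= view_half_size[0]
            and abs(target[1] - view_center[1]) <= view_half_size[1])

def choose_view(target, handover_center, view_half_size, margin=0.9):
    """S20-S23 in miniature: keep the handover-time view when the person fits (S23),
    otherwise shift the view just enough to contain the person (S21), reflecting the
    patent's preference for a field close to the handover-time one."""
    if fits_in_view(target, handover_center, view_half_size):
        return handover_center  # S23: use the handover-time view as-is
    cx, cy = handover_center
    hx, hy = view_half_size[0] * margin, view_half_size[1] * margin
    # S21: clamp the center so the target sits just inside the (slightly shrunk) frame
    cx = min(max(cx, target[0] - hx), target[0] + hx)
    cy = min(max(cy, target[1] - hy), target[1] + hy)
    return (cx, cy)
```

The shifted center keeps the person in frame while staying as close as possible to the handover-time field.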
- Meanwhile, the takeover-side monitoring camera terminal 1 that has received the handover request transmitted by the delivery-side monitoring camera terminal 1 in S16 starts the takeover process shown in FIG.
- The takeover-side monitoring camera terminal 1 uses the directivity control unit 20 to change the directivity direction of the directional antenna 20a to the direction suitable for direct communication with the delivery-side monitoring camera terminal 1 (S31).
- The takeover-side monitoring camera terminal 1 then matches the imaging field of view of the imaging unit 12 with the field used when receiving the suspicious person from the delivery-side monitoring camera terminal 1 (S32), and waits for a synchronization signal from the delivery-side monitoring camera terminal 1 (S33).
- At this point, the delivery-side and takeover-side monitoring camera terminals 1 are in a state where direct communication is possible via their second wireless communication units 19, and the takeover-side monitoring camera terminal 1 has already matched the imaging field of view of the imaging unit 12 with the field used for handing over the suspicious person.
- The delivery-side and takeover-side monitoring camera terminals 1 communicate directly via the second wireless communication units 19 until the handover of the tracking of the suspicious person is completed.
- When the delivery-side monitoring camera terminal 1 has matched the imaging field of view with the handover-time field in S24, it waits for the suspicious person to enter the delivery area shared with the takeover-side monitoring camera terminal 1 (S25). When the suspicious person enters the delivery area, the delivery-side monitoring camera terminal 1 transmits a synchronization signal to the takeover-side monitoring camera terminal 1 (S26).
- The synchronization signal may indicate a time, or may simply indicate a reference timing. This synchronization signal is transmitted to the takeover-side monitoring camera terminal 1 via the second wireless communication unit 19.
- The delivery-side and takeover-side monitoring camera terminals 1 each take in a frame image captured by the imaging unit 12 at the timing based on this synchronization signal (S27, S34).
- Each terminal then identifies the suspicious person being imaged, based on the captured images of the delivery area obtained by the delivery-side and takeover-side monitoring camera terminals 1 (S28, S35).
- The suspicious person may be identified by the method using the first distance energy and the second distance energy described above.
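The distance energies themselves are defined elsewhere in the patent; the underlying idea, pairing the IDs seen by the two terminals so that the summed distance between their positions (mapped into a common coordinate system) is minimal, can be sketched with a brute-force matcher. The function name and the exhaustive search are illustrative.

```python
from itertools import permutations
import math

def identify_by_distance_energy(positions_a, positions_b):
    """Pair person IDs seen by the delivery-side terminal (positions_a) with IDs seen
    by the takeover-side terminal (positions_b), both already mapped into a common
    coordinate system, by minimizing the summed inter-position distance
    ('distance energy'). Brute force over all pairings, for illustration only."""
    ids_a, ids_b = list(positions_a), list(positions_b)
    best, best_energy = None, float("inf")
    for perm in permutations(ids_b, len(ids_a)):
        energy = sum(math.dist(positions_a[i], positions_b[j])
                     for i, j in zip(ids_a, perm))
        if energy < best_energy:
            best_energy, best = energy, dict(zip(ids_a, perm))
    return best
```

For the handful of people inside a delivery area, exhaustive matching is cheap; a real implementation could use an assignment algorithm instead.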
- At this time, the delivery-side and takeover-side monitoring camera terminals 1 communicate directly via the second wireless communication units 19, and the delivery-side terminal notifies the takeover-side terminal of the result of processing the frame image captured in S27, namely the position (on the frame image) of every person located within the delivery area, including the suspicious person being tracked.
- Consequently, the communication delay between the delivery-side and takeover-side monitoring camera terminals 1 is kept small enough that the identification of the suspicious person can be completed while the suspicious person is still inside the delivery area. This enables real-time tracking of the suspicious person, and makes it possible to direct a security guard appropriately so that the suspicious person can be dealt with without being lost.
- When the identification is completed, the takeover-side monitoring camera terminal 1 starts the tracking process of S4 described above; that is, it begins tracking the suspicious person it has just taken over.
- the delivery-side monitoring camera terminal 1 returns to S1 when the tracking process in S4 is completed.
- In the above description, each monitoring camera terminal 1 was described as being able to change its imaging field of view, but the imaging field of view may instead be fixed.
- In that case, the range imaged in the fixed imaging field of view is the monitoring target area.
- In the above description, the second wireless communication unit 19 communicates using the directional antenna 20a. In an environment where sufficient reception strength can be obtained even with direct communication by the first wireless communication unit 18 using the omnidirectional antenna 18a, however, the second wireless communication unit 19, the directivity control unit 20, and the directional antenna 20a may be unnecessary.
- Although the tracking target has been described as a suspicious person, the tracking target is not limited to a person and may be another type of moving object, such as a vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
- Alarm Systems (AREA)
Abstract
Description
X=(a1x+b1y+c1)/(a0x+b0y+1)
Y=(a2x+b2y+c2)/(a0x+b0y+1)
Substituting the coordinate values of the corresponding point pairs into these equations yields a system of eight simultaneous equations. The eight constants a0, b0, a1, b1, c1, a2, b2 and c2 that solve this system are the first coordinate-conversion parameters for this adjacent counterpart monitoring camera terminal 1. The monitoring camera terminal 1 stores these first coordinate-conversion parameters in the storage unit 16.
x=(A1X+B1Y+C1)/(A0X+B0Y+1)
y=(A2X+B2Y+C2)/(A0X+B0Y+1)
Substituting the coordinate values of the corresponding point pairs into these equations yields a system of eight simultaneous equations. The eight constants A0, B0, A1, B1, C1, A2, B2 and C2 that solve this system are the second coordinate-conversion parameters for this adjacent counterpart monitoring camera terminal 1. The monitoring camera terminal 1 stores these second coordinate-conversion parameters in the storage unit 16.
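Both parameter sets reduce to solving an 8×8 linear system built from four corresponding point pairs. A sketch using numpy follows (the function name and the point format are illustrative); the second parameters A0 through C2 are obtained the same way with the roles of (x, y) and (X, Y) swapped.

```python
import numpy as np

def first_transform_params(points_own, points_peer):
    """Solve the eight simultaneous equations for a0, b0, a1, b1, c1, a2, b2, c2 in
        X = (a1*x + b1*y + c1) / (a0*x + b0*y + 1)
        Y = (a2*x + b2*y + c2) / (a0*x + b0*y + 1)
    from four point correspondences. points_own holds four (x, y) positions in this
    terminal's frame; points_peer holds the matching (X, Y) in the adjacent frame."""
    A, rhs = [], []
    for (x, y), (X, Y) in zip(points_own, points_peer):
        # Multiply out the denominators to get two linear equations per point pair.
        A.append([-x * X, -y * X, x, y, 1, 0, 0, 0]); rhs.append(X)
        A.append([-x * Y, -y * Y, 0, 0, 0, x, y, 1]); rhs.append(Y)
    a0, b0, a1, b1, c1, a2, b2, c2 = np.linalg.solve(np.array(A, float),
                                                     np.array(rhs, float))
    return a0, b0, a1, b1, c1, a2, b2, c2
```

This is the standard four-point projective (homography) fit with the bottom-right coefficient fixed to 1, matching the form of the equations above.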
11… control unit
12… imaging unit
13… drive mechanism unit
14… imaging field-of-view control unit
15… image processing unit
16… storage unit
17… timer
18… first wireless communication unit
18a… omnidirectional antenna
19… second wireless communication unit
20… directivity control unit
20a… directional antenna
Claims (5)
- A surveillance camera terminal comprising: wireless communication means for performing wireless communication with another terminal located around the own terminal, either directly or via other terminals; imaging means for imaging the monitoring target area allocated to the own terminal; object extraction means for processing frame images captured by the imaging means and extracting the objects being imaged; tracking means for tracking, within the monitoring target area, an object extracted by the object extraction means; handover means for performing, by the wireless communication means, wireless communication with a counterpart terminal whose allocated monitoring target area partially overlaps the own terminal's monitoring target area, and handing over the object being tracked by the tracking means in the handover area where the monitoring target areas overlap; and wireless channel forming means for forming, when the handover means hands over an object, a wireless channel over which the wireless communication means communicates directly with the counterpart terminal to which the object is handed over.
- The surveillance camera terminal according to claim 1, wherein the wireless communication means has a first wireless communication unit having an omnidirectional antenna and a second wireless communication unit having a directional antenna; the wireless channel forming means transmits, via the first wireless communication unit, a handover request to the counterpart terminal to which the handover means hands over the object, and changes the directivity direction of the directional antenna to a direction suitable for direct communication with that counterpart terminal; and the handover means performs wireless communication with the counterpart terminal via the second wireless communication unit.
- The surveillance camera terminal according to claim 1 or 2, wherein the handover means causes the counterpart terminal to identify the object being tracked by the tracking means, using frame images of the handover area captured by the own terminal and by the counterpart terminal.
- The surveillance camera terminal according to any one of claims 1 to 3, further comprising prediction means for predicting, based on the movement path of the object that the tracking means is tracking within the monitoring target area, the counterpart terminal to which the handover means will hand over the object, wherein the wireless channel forming means forms a wireless channel for direct communication with the counterpart terminal predicted by the prediction means.
- The surveillance camera terminal according to any one of claims 1 to 4, further comprising imaging field-of-view control means for changing the imaging field of view of the imaging means according to the position of the object that the tracking means is tracking within the monitoring target area.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/634,711 US20130002868A1 (en) | 2010-03-15 | 2011-01-21 | Surveillance camera terminal |
EP11755960.9A EP2549752A4 (en) | 2010-03-15 | 2011-01-21 | TERMINAL FOR SURVEILLANCE CAMERA |
JP2012505545A JPWO2011114770A1 (ja) | 2010-03-15 | 2011-01-21 | 監視カメラ端末 |
CN201180009393.7A CN102754435A (zh) | 2010-03-15 | 2011-01-21 | 监视摄像机终端 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-057122 | 2010-03-15 | ||
JP2010057122 | 2010-03-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011114770A1 true WO2011114770A1 (ja) | 2011-09-22 |
Family
ID=44648879
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/051048 WO2011114770A1 (ja) | 2010-03-15 | 2011-01-21 | 監視カメラ端末 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130002868A1 (ja) |
EP (1) | EP2549752A4 (ja) |
JP (1) | JPWO2011114770A1 (ja) |
CN (1) | CN102754435A (ja) |
WO (1) | WO2011114770A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102724773A (zh) * | 2012-05-25 | 2012-10-10 | 西安电子科技大学 | 基于m2m通信的无线智能监控装置及方法 |
JP2013118561A (ja) * | 2011-12-05 | 2013-06-13 | Nikon Corp | 電子カメラおよび電子カメラの制御方法 |
WO2016132769A1 (ja) * | 2015-02-19 | 2016-08-25 | シャープ株式会社 | 撮影装置、撮影装置の制御方法、および制御プログラム |
CN106331653A (zh) * | 2016-09-29 | 2017-01-11 | 浙江宇视科技有限公司 | 一种定位全景相机子画面显示区域的方法及装置 |
JP2017046023A (ja) * | 2015-08-24 | 2017-03-02 | 三菱電機株式会社 | 移動体追跡装置及び移動体追跡方法及び移動体追跡プログラム |
JP2017195617A (ja) * | 2017-05-31 | 2017-10-26 | 株式会社ニコン | カメラ |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5743221B2 (ja) * | 2012-06-29 | 2015-07-01 | カシオ計算機株式会社 | 無線同期システム、無線装置、センサ装置、無線同期方法、及びプログラム |
JP6022886B2 (ja) * | 2012-10-16 | 2016-11-09 | 株式会社日立国際電気 | 無線監視カメラシステム及び無線監視カメラ装置 |
CN103900554A (zh) * | 2012-12-26 | 2014-07-02 | 联想(北京)有限公司 | 信息处理方法和引导设备 |
CN103702030A (zh) * | 2013-12-25 | 2014-04-02 | 浙江宇视科技有限公司 | 一种基于gis地图的场景监控方法和移动目标追踪方法 |
CN105100577B (zh) * | 2014-04-21 | 2019-03-26 | 浙江大华技术股份有限公司 | 一种图像处理方法及装置 |
WO2015178540A1 (ko) * | 2014-05-20 | 2015-11-26 | 삼성에스디에스 주식회사 | 카메라간 핸드오버를 이용한 목표물 추적 장치 및 방법 |
JP6696422B2 (ja) | 2014-06-27 | 2020-05-20 | 日本電気株式会社 | 異常検知装置及び異常検知方法 |
US10664705B2 (en) * | 2014-09-26 | 2020-05-26 | Nec Corporation | Object tracking apparatus, object tracking system, object tracking method, display control device, object detection device, and computer-readable medium |
KR102174839B1 (ko) * | 2014-12-26 | 2020-11-05 | 삼성전자주식회사 | 보안 시스템 및 그 운영 방법 및 장치 |
JP6631619B2 (ja) * | 2015-03-27 | 2020-01-15 | 日本電気株式会社 | 映像監視システム及び映像監視方法 |
EP3089263B1 (de) * | 2015-04-29 | 2022-04-13 | Rohde & Schwarz GmbH & Co. KG | Tragbare richtantenne, messanordnung und messverfahren |
JP6631712B2 (ja) * | 2015-08-28 | 2020-01-15 | 日本電気株式会社 | 解析装置、解析方法、及びプログラム |
KR20170050028A (ko) * | 2015-10-29 | 2017-05-11 | 삼성에스디에스 주식회사 | 객체의 위치 검색 시스템 및 방법 |
CN106791586A (zh) * | 2015-11-19 | 2017-05-31 | 杭州海康威视数字技术股份有限公司 | 一种对移动目标进行监控的方法及监控设备、装置、系统 |
WO2018067058A1 (en) * | 2016-10-06 | 2018-04-12 | Modcam Ab | Method for sharing information in system of imaging sensors |
KR102575271B1 (ko) * | 2016-10-17 | 2023-09-06 | 한화비전 주식회사 | Pos 기기와 연동된 감시 카메라 및 이를 이용한 감시 방법 |
JP6992536B2 (ja) * | 2018-01-19 | 2022-01-13 | 富士通株式会社 | 観測システムおよび観測方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001136565A (ja) * | 1999-11-09 | 2001-05-18 | Mitsubishi Electric Corp | 移動体通信装置 |
JP2002290962A (ja) * | 2001-03-27 | 2002-10-04 | Mitsubishi Electric Corp | 侵入者自動追尾方法および装置並びに画像処理装置 |
JP2005244279A (ja) * | 2004-02-24 | 2005-09-08 | Matsushita Electric Ind Co Ltd | 監視システム、監視装置および監視方法 |
JP2006310901A (ja) | 2005-04-26 | 2006-11-09 | Victor Co Of Japan Ltd | 監視システム及び監視方法 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6690374B2 (en) * | 1999-05-12 | 2004-02-10 | Imove, Inc. | Security camera system for tracking moving objects in both forward and reverse directions |
GB2393350B (en) * | 2001-07-25 | 2006-03-08 | Neil J Stevenson | A camera control apparatus and method |
JP4478510B2 (ja) * | 2004-06-03 | 2010-06-09 | キヤノン株式会社 | カメラシステム、カメラ、及びカメラの制御方法 |
JP4587166B2 (ja) * | 2004-09-14 | 2010-11-24 | キヤノン株式会社 | 移動体追跡システム、撮影装置及び撮影方法 |
US7546624B2 (en) * | 2006-01-18 | 2009-06-09 | Vishloff T Lee | Systems and methods for wireless digital video monitoring |
US8432449B2 (en) * | 2007-08-13 | 2013-04-30 | Fuji Xerox Co., Ltd. | Hidden markov model for camera handoff |
-
2011
- 2011-01-21 WO PCT/JP2011/051048 patent/WO2011114770A1/ja active Application Filing
- 2011-01-21 EP EP11755960.9A patent/EP2549752A4/en not_active Withdrawn
- 2011-01-21 JP JP2012505545A patent/JPWO2011114770A1/ja active Pending
- 2011-01-21 US US13/634,711 patent/US20130002868A1/en not_active Abandoned
- 2011-01-21 CN CN201180009393.7A patent/CN102754435A/zh active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001136565A (ja) * | 1999-11-09 | 2001-05-18 | Mitsubishi Electric Corp | 移動体通信装置 |
JP2002290962A (ja) * | 2001-03-27 | 2002-10-04 | Mitsubishi Electric Corp | 侵入者自動追尾方法および装置並びに画像処理装置 |
JP2005244279A (ja) * | 2004-02-24 | 2005-09-08 | Matsushita Electric Ind Co Ltd | 監視システム、監視装置および監視方法 |
JP2006310901A (ja) | 2005-04-26 | 2006-11-09 | Victor Co Of Japan Ltd | 監視システム及び監視方法 |
Non-Patent Citations (2)
Title |
---|
See also references of EP2549752A4 |
YASUHIDE HYODO ET AL.: "Robust Multiple-human Tracking against Occlusion through Camera Network", IPSJ SIG NOTES, 2008-CVIM-164, COMPUTER VISION AND IMAGE MEDIA, vol. 2008, no. 82, 29 August 2008 (2008-08-29), pages 171 - 176, XP008167429 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013118561A (ja) * | 2011-12-05 | 2013-06-13 | Nikon Corp | 電子カメラおよび電子カメラの制御方法 |
CN102724773A (zh) * | 2012-05-25 | 2012-10-10 | 西安电子科技大学 | 基于m2m通信的无线智能监控装置及方法 |
WO2016132769A1 (ja) * | 2015-02-19 | 2016-08-25 | シャープ株式会社 | 撮影装置、撮影装置の制御方法、および制御プログラム |
JP2017046023A (ja) * | 2015-08-24 | 2017-03-02 | 三菱電機株式会社 | 移動体追跡装置及び移動体追跡方法及び移動体追跡プログラム |
CN106331653A (zh) * | 2016-09-29 | 2017-01-11 | 浙江宇视科技有限公司 | 一种定位全景相机子画面显示区域的方法及装置 |
JP2017195617A (ja) * | 2017-05-31 | 2017-10-26 | 株式会社ニコン | カメラ |
Also Published As
Publication number | Publication date |
---|---|
EP2549752A4 (en) | 2014-12-03 |
EP2549752A1 (en) | 2013-01-23 |
JPWO2011114770A1 (ja) | 2013-06-27 |
US20130002868A1 (en) | 2013-01-03 |
CN102754435A (zh) | 2012-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011114770A1 (ja) | 監視カメラ端末 | |
JP5325337B2 (ja) | 監視カメラ端末 | |
US8115814B2 (en) | Mobile tracking system, camera and photographing method | |
US20200068632A1 (en) | Wireless communication system, wireless relay device and wireless communication method | |
EP3772227B1 (en) | Cellular telecommunications network | |
WO2005081530A1 (ja) | 監視システム、監視装置および監視方法 | |
US9930119B2 (en) | Heterogeneous cellular object tracking and surveillance network | |
JP5470516B2 (ja) | 移動体操作用無線通信装置および通信制御プログラムならびに移動体 | |
JP7218321B2 (ja) | 無人航空機のビューを生成するためのシステムおよび方法 | |
US20240185707A1 (en) | Automated Vehicle Control Distributed Network Apparatuses and Methods | |
US11812320B2 (en) | Initiation of transfer of user equipment to base station according to visual data | |
KR101494884B1 (ko) | 근거리 무선통신 액세스 포인트를 구비한 감시 카메라 시스템 및 그 구동 방법 | |
KR100877227B1 (ko) | 다중이동물체 자동추적용 지그비 다중카메라시스템 및 그 운영방법 | |
JP4174448B2 (ja) | 検知対象検知システム | |
JP4206052B2 (ja) | 検知対象検知システム | |
CN103900554A (zh) | 信息处理方法和引导设备 | |
KR102093727B1 (ko) | 카메라를 이용한 송신측 안테나 및 이를 이용한 무선전력 전송시스템 및 방법 | |
KR102382503B1 (ko) | Vlc 조명을 이용한 관제 로봇의 제어 시스템 및 방법 | |
WO2023112501A1 (ja) | 移動体端末、無線通信システム、および通信制御プログラム | |
JP2011135327A (ja) | 無線通信システム,無線通信端末 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180009393.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11755960 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012505545 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13634711 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011755960 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |