WO2017038160A1 - Monitoring information generation device, shooting direction estimation device, monitoring information generation method, shooting direction estimation method, and program
- Publication number
- WO2017038160A1 (PCT/JP2016/063720)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- monitoring image
- moving
- monitoring
- shooting
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/53—Recognition of crowd images, e.g. recognition of crowd congestion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/206—Drawing of charts or graphs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to video surveillance.
- Patent Document 1 discloses a technique for calculating a moving direction of a crowd by analyzing an input monitoring video and controlling a monitoring device according to the calculated moving direction.
- a surveillance camera fixed to a building in this way often photographs a crowd from a distance.
- since the size of the people shown in the photographed image is small, it is difficult to grasp the state of the crowd (for example, the number of people included in the crowd and their distribution).
- the present invention has been made in view of the above problems.
- the objective of the present invention is to provide a technique for grasping the state of a crowd from an image of the crowd.
- the monitoring information generating apparatus of the present invention includes: 1) a first acquisition unit that acquires a first monitoring image captured by a fixed camera, which is a camera whose position is fixed; 2) a second acquisition unit that acquires a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and 3) a generation unit that generates monitoring information using the first monitoring image and the second monitoring image.
- the shooting direction estimation apparatus of the present invention includes: 1) a first movement direction estimation unit that estimates a first movement direction, which is the movement direction of an object in a first monitoring image captured by a fixed camera, which is a camera whose position is fixed; 2) a second movement direction estimation unit that estimates a second movement direction, which is the movement direction of the object in a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and 3) a shooting direction estimation unit that estimates the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and posture of the fixed camera, and the position of the moving camera.
- the first monitoring information generation method of the present invention is executed by a computer.
- the method includes: 1) a first acquisition step of acquiring a first monitoring image taken by a fixed camera, which is a camera whose position is fixed; 2) a second acquisition step of acquiring a second monitoring image taken by a moving camera, which is a camera whose position is not fixed; and 3) a generation step of generating monitoring information using the first monitoring image and the second monitoring image.
- the shooting direction estimation method of the present invention is executed by a computer.
- the method includes: 1) a first movement direction estimation step of estimating a first movement direction, which is the movement direction of an object in a first monitoring image taken by a fixed camera, which is a camera whose position is fixed; 2) a second movement direction estimation step of estimating a second movement direction, which is the movement direction of the object in a second monitoring image taken by a moving camera, which is a camera whose position is not fixed; and 3) a shooting direction estimation step of estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and posture of the fixed camera, and the position of the moving camera.
- a technique for grasping the state of the crowd from an image of the crowd is provided.
- FIG. 1 is a block diagram illustrating a monitoring information generation apparatus according to a first embodiment. FIG. 2 is a diagram conceptually illustrating the operation of the monitoring information generation apparatus of the first embodiment.
- FIG. 3 is a flowchart illustrating the flow of processing executed by the monitoring information generation apparatus according to the first embodiment. FIG. 4 is a diagram illustrating the hardware configuration of the computer that implements the monitoring information generation apparatus.
- FIG. 19 is a flowchart illustrating an example of the flow of processing executed by the monitoring information generation apparatus according to the second embodiment. FIG. 20 is a diagram illustrating optical flows computed for the first monitoring image. FIG. 21 is a diagram illustrating changes in the positions of objects. FIG. 22 is a diagram for explaining the operation of the shooting direction estimation unit.
- FIG. 1 is a block diagram illustrating a monitoring information generation apparatus 2000 according to the first embodiment.
- each block represents a functional unit configuration, not a hardware unit configuration.
- the monitoring information generation device 2000 uses two types of monitoring images, a monitoring image generated by a fixed camera and a monitoring image generated by a moving camera.
- a fixed camera is a camera whose position is fixed.
- the fixed camera is a surveillance camera that is fixedly installed in various places such as a wall, a pillar, or a ceiling.
- the place where the fixed camera is installed may be indoors or outdoors.
- the wall or the like on which the fixed camera is installed need only be fixed in place for a certain period of time, and is not limited to real estate.
- a wall or the like on which a fixed camera is installed may be a partition or a pillar that is temporarily installed in an event venue or the like.
- the moving camera is a camera whose position moves.
- the moving camera is worn by a person or attached to a car, a motorcycle, a flying object, or the like.
- the mobile camera worn by a person is, for example, a camera held in the hand (a camera of a mobile terminal, such as a video camera or a smartphone camera) or a camera fixed to the head or chest (a wearable camera or the like).
- a camera attached to a car, a motorcycle, or a flying object may be a camera attached for use as a so-called drive recorder, or may be a camera attached separately for surveillance photography.
- Both the moving camera and the fixed camera take a video of the location to be monitored.
- the location to be monitored is arbitrary.
- for example, the monitored location is a route between an event venue and the nearest station.
- the location to be monitored may be indoors or outdoors.
- the imaging range of the moving camera and the imaging range of the fixed camera may or may not overlap.
- FIG. 2 is a diagram conceptually illustrating the operation of the monitoring information generating apparatus 2000.
- the fixed camera 10 captures the crowd and generates the first monitoring image 12.
- a crowd here is one or more objects.
- the object may be a person or something other than a person (for example, a car, a motorcycle, an animal, etc.).
- the mobile camera 20 captures the crowd and generates a second monitoring image 22.
- the crowd photographed by the fixed camera 10 and the crowd photographed by the mobile camera 20 may be the same crowd or different crowds.
- the monitoring information generation device 2000 generates monitoring information 30 using the first monitoring image 12 and the second monitoring image 22.
- the monitoring information 30 is information related to object monitoring. Specific contents of the monitoring information 30 and a generation method thereof will be described later.
- the monitoring information generation device 2000 includes a first monitoring image acquisition unit 2020, a second monitoring image acquisition unit 2040, and a generation unit 2060.
- the first monitoring image acquisition unit 2020 acquires the first monitoring image 12.
- the second monitoring image acquisition unit 2040 acquires the second monitoring image 22.
- the generation unit 2060 generates the monitoring information 30 using the first monitoring image 12 and the second monitoring image 22.
- the monitoring information of the crowd is generated using the monitoring image generated by the mobile camera 20 in addition to the monitoring image generated by the fixed camera 10. Therefore, the state of the crowd can be grasped more accurately than when it must be grasped using only the fixed camera 10.
- FIG. 3 is a flowchart illustrating the flow of processing executed by the monitoring information generation apparatus 2000 according to the first embodiment.
- the first monitoring image acquisition unit 2020 acquires the first monitoring image 12 (S102).
- the second monitoring image acquisition unit 2040 acquires the second monitoring image 22 (S104).
- the generation unit 2060 generates monitoring information 30 using the first monitoring image 12 and the second monitoring image 22 (S106).
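- as a rough sketch (not part of the patent; all names are illustrative assumptions), the three steps above could be organized as follows in Python:

```python
import numpy as np


class MonitoringInformationGenerator:
    """Minimal sketch of the monitoring information generation device 2000."""

    def acquire_first_monitoring_image(self) -> np.ndarray:
        """First acquisition unit 2020 (S102): a frame from the fixed camera 10,
        e.g. received over the network or read from storage."""
        raise NotImplementedError

    def acquire_second_monitoring_image(self) -> np.ndarray:
        """Second acquisition unit 2040 (S104): a frame from the moving camera 20."""
        raise NotImplementedError

    def generate_monitoring_information(self, first: np.ndarray, second: np.ndarray):
        """Generation unit 2060 (S106): build the monitoring information 30
        (e.g. a superimposed display or distribution information) from both images."""
        raise NotImplementedError
```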
- FIG. 4 is a diagram illustrating a hardware configuration of a computer 1000 that implements the monitoring information generation apparatus 2000 according to the first embodiment.
- the computer 1000 may be implemented using a dedicated device designed exclusively for realizing the monitoring information generating device 2000, or using a general-purpose device such as a PC (Personal Computer) or a portable terminal.
- the computer 1000 includes a bus 1020, a processor 1040, a memory 1060, a storage 1080, an input / output interface 1100, and a network interface 1120.
- the bus 1020 is a data transmission path through which the processor 1040, the memory 1060, the storage 1080, the input / output interface 1100, and the network interface 1120 transmit / receive data to / from each other.
- the processor 1040 is an arithmetic processing device such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
- the memory 1060 is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory).
- the storage 1080 is a storage device such as a hard disk, an SSD (Solid State Drive), or a memory card. The storage 1080 may also be a memory such as a RAM or a ROM.
- the input / output interface 1100 is an interface for connecting the computer 1000 and an input / output device.
- a keyboard and a mouse are connected to the input / output interface 1100.
- the network interface 1120 is an interface for connecting the computer 1000 to an external device so as to be communicable.
- the network interface 1120 may be a network interface for connecting to a wired line or a network interface for connecting to a wireless line.
- the computer 1000 that implements the monitoring information generation device 2000 is connected to the fixed camera 10 and the moving camera 20 via a network.
- the method for connecting the computer 1000 to the fixed camera 10 and the mobile camera 20 is not limited to connection via a network.
- the computer 1000 may not be connected to the fixed camera 10 or the mobile camera 20 so as to be communicable.
- the storage 1080 stores program modules that implement the functions of the monitoring information generation device 2000.
- the processor 1040 implements each function corresponding to the program module by executing each program module.
- the processor 1040 may execute the modules after reading them onto the memory 1060 or without reading them onto the memory 1060.
- each program module may be stored in the memory 1060.
- the computer 1000 may not include the storage 1080.
- the first monitoring image acquisition unit 2020 acquires the first monitoring image 12 (S102).
- the first monitoring image acquisition unit 2020 receives the first monitoring image 12 transmitted from the fixed camera 10.
- the first monitoring image acquisition unit 2020 may access the fixed camera 10 and acquire the first monitoring image 12 stored in the fixed camera 10.
- the fixed camera 10 may store the first monitoring image 12 in a storage device provided outside the fixed camera 10. In this case, the first monitoring image acquisition unit 2020 may acquire the first monitoring image 12 by accessing this storage device.
- the first monitoring image acquisition unit 2020 may acquire the first monitoring image 12 in real time, or some time after the first monitoring image 12 is generated. In the latter case, for example, the monitoring information generation device 2000 acquires the first monitoring image 12 and the second monitoring image 22 captured in the past (for example, the previous day), generates monitoring information for these past monitoring images, and thereby analyzes the behavior of the crowd.
- the second monitoring image acquisition unit 2040 acquires the second monitoring image 22 (S104).
- the method in which the second monitoring image acquisition unit 2040 acquires the second monitoring image 22 is the same as the method in which the first monitoring image acquisition unit 2020 acquires the first monitoring image 12.
- the generation unit 2060 generates object monitoring information 30 using the first monitoring image 12 and the second monitoring image 22 (S106).
- the object is not limited to a person and may be any object. What the generation unit 2060 handles as an object may be set in advance in the generation unit 2060, stored in a storage device accessible from the generation unit 2060, or set manually when the generation unit 2060 operates.
- the monitoring information 30 generated by the generation unit 2060 can take various forms. Hereinafter, specific examples of the monitoring information 30 and their respective generation methods will be described.
- FIG. 5 is a diagram illustrating a display in which the second monitoring image 22 is superimposed on the first monitoring image 12.
- a screen in which the second monitoring image 22-1 to the second monitoring image 22-3 are superimposed on the first monitoring image 12 is displayed.
- the display screen 40 is viewed, for example, by a monitor in a security room. By watching this display screen 40, the monitor can grasp the detailed state of each place photographed by the mobile cameras 20 while grasping the overall state of the monitored place photographed by the fixed camera 10. Therefore, the monitor can grasp the state of the crowd flexibly and accurately.
- the position on the first monitoring image 12 at which the second monitoring image 22 is presented is preferably the position on the first monitoring image 12 that corresponds to the real-world position where the second monitoring image 22 was captured, or its vicinity.
- the generation unit 2060 determines the position to superimpose the second monitoring image 22 using the position information of the moving camera 20, the position information of the fixed camera 10, and camera parameters representing the attitude of the fixed camera 10.
- for example, the generation unit 2060 uses the position information and posture of the fixed camera 10 to determine, within the area shown in the first monitoring image 12, the position corresponding to the position information of the mobile camera 20.
- the posture of the fixed camera 10 includes the horizontal direction and the vertical direction of the shooting direction of the fixed camera 10.
- the position information of each camera is arbitrary information that can specify the position of the camera.
- for example, the position information of the camera is information indicating the GPS (Global Positioning System) coordinates of the camera.
- the generation unit 2060 acquires the position information of the mobile camera 20.
- the position information of the mobile camera 20 is included in the metadata of the second monitoring image 22, for example.
- the generation unit 2060 acquires the position information of the mobile camera 20 from the metadata of the second monitoring image 22.
- the generation unit 2060 may receive position information separately transmitted by the mobile camera 20. This transmission may be performed spontaneously by the mobile camera 20, or may be performed in response to a request from the generation unit 2060.
- the method in which the generation unit 2060 acquires the position information of the fixed camera 10 is the same as the method in which the generation unit 2060 acquires the position information of the mobile camera 20, for example. Since the position of the fixed camera 10 is fixed, the position information of the fixed camera 10 may be stored in advance in a storage unit accessible from the generation unit 2060. For example, the position information of the fixed camera 10 may be manually input to the generation unit 2060.
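- a minimal sketch of this positioning step is shown below. It assumes the fixed camera 10's position and posture have already been reduced to a ground-plane-to-image homography H (one common way to encode a fixed camera's pose); the helper gps_to_local is hypothetical:

```python
import numpy as np

def ground_to_pixel(H: np.ndarray, ground_xy) -> tuple:
    """Map a ground-plane point (metres, local frame) to pixel coordinates
    in the first monitoring image 12 via a 3x3 homography H."""
    gx, gy = ground_xy
    u, v, w = H @ np.array([gx, gy, 1.0])
    return u / w, v / w

# Usage sketch: place the second monitoring image 22 (or a mark 50) at the
# pixel corresponding to the moving camera 20's position.
# gps_to_local is a hypothetical helper converting GPS coordinates to the
# local metric ground frame in which H was calibrated.
# u, v = ground_to_pixel(H, gps_to_local(mobile_camera_gps))
```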
- the position of the second monitoring image 22 on the first monitoring image 12 is not limited to the position based on the position of the second monitoring image 22 in the real world.
- the second monitoring image 22 may be displayed at a predetermined position on the first monitoring image 12.
- FIG. 6 illustrates a state in which the second monitoring image is displayed side by side near the left end of the display screen 40.
- the predetermined position may be set in advance in the generation unit 2060, or may be stored in a storage device accessible from the generation unit 2060.
- the display position of the second monitoring image 22 may be changeable by a user operation.
- the monitoring information generation device 2000 accepts an operation such as dragging the second monitoring image 22 with a mouse, and changes the display position of the second monitoring image 22 according to the operation.
- the generation unit 2060 may display the first monitoring image 12 and the second monitoring image 22 side by side without superimposing the second monitoring image 22 on the first monitoring image 12.
- FIG. 7 illustrates a state where the first monitoring image 12 and the second monitoring image 22 are displayed side by side on the display screen 40.
- for example, the generation unit 2060 displays a mark or the like representing each mobile camera 20 (described later) at the position on the first monitoring image 12 corresponding to the real-world position of that mobile camera 20. The generation unit 2060 then displays, next to each second monitoring image 22, information (for example, a mark number) indicating which mark the second monitoring image 22 corresponds to.
- the generation unit 2060 generates a display in which a mark indicating the moving camera 20 is superimposed on the first monitoring image 12. Furthermore, when the mark is selected by a user (such as a monitor), the generation unit 2060 displays the second monitoring image 22 generated by the mobile camera 20 corresponding to the mark. As a result, monitoring information 30 that is a display in which the second monitoring image 22 is superimposed on the first monitoring image 12 is generated as in the case of the specific example 1 described above.
- compared with the case where all the second monitoring images 22 are displayed unconditionally, this makes it easier to grasp the overall state of the crowd in the first monitoring image 12, and also lets the observer easily grasp the detailed state of the crowd at a location of interest.
- FIG. 8 is a diagram illustrating a state in which a mark indicating the moving camera 20 is displayed on the first monitoring image 12.
- in FIG. 8, there are three moving cameras 20-1 to 20-3 within the shooting range of the fixed camera 10, and their positions are represented by marks 50-1 to 50-3.
- the position of each mark 50 in FIG. 8 is a position corresponding to the position of the corresponding mobile camera 20 in the real world.
- FIG. 9 is a diagram illustrating the display shown when a mark 50 in FIG. 8 is clicked with the mouse cursor 60.
- the user's selection operation for the mark 50 is not limited to the mouse operation.
- the display position of the mark of the mobile camera 20 displayed on the first monitoring image 12 is not limited to the position corresponding to the position of the mobile camera 20 in the real world. This is the same as the display position of the second monitoring image 22 on the first monitoring image 12 described above.
- FIG. 10 is a diagram illustrating a state in which the shooting direction of the mobile camera 20 is displayed.
- the direction indicated by the shooting direction 52 is the shooting direction of the moving camera 20 corresponding to the mark 50.
- the generation unit 2060 grasps the shooting direction of each mobile camera 20.
- the generation unit 2060 uses the direction indicated by the output of the electronic compass as the shooting direction of the mobile camera 20.
- the generation unit 2060 may use the shooting direction of the mobile camera 20 estimated by a method described in an embodiment described later.
- the generation unit 2060 generates a display in which information regarding the crowd is superimposed on the first monitoring image 12 as the monitoring information 30.
- the information regarding the crowd is information representing the distribution of objects included in the crowd, for example.
- information representing the distribution of objects included in the crowd is referred to as distribution information.
- FIG. 11 and 12 are diagrams illustrating an example of superimposing a display based on distribution information on the first monitoring image 12.
- a heat map 61 representing a human distribution is superimposed on the first monitoring image 12.
- this heat map is a heat map in which a place where the density of people is high is red and a place where the density of people is low is blue.
- the first monitoring image 12 is divided into a plurality of partial areas. Specifically, the first monitoring image 12 is divided into 24 partial areas by being divided into six equal parts vertically and four equal parts horizontally.
- a dotted line representing each partial area is displayed for easy understanding of the drawing. However, this dotted line may not actually be displayed.
- the generation unit 2060 generates, as the monitoring information 30, a display in which a frame line 62 that highlights an area where the degree of crowding is high is superimposed on the first monitoring image 12. Specifically, the generation unit 2060 generates distribution information indicating the number of people shown in each partial area, and sets the partial areas whose number is equal to or greater than a predetermined value as areas where the degree of crowding is high.
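- a minimal sketch of this counting and highlighting, assuming person detection has already produced centroid coordinates and using the 6 x 4 division from the example above:

```python
import numpy as np

def grid_distribution(person_centroids, img_w, img_h, rows=6, cols=4):
    """Distribution information: people counted per partial area, with the
    first monitoring image 12 split into rows x cols areas (24 in the example)."""
    counts = np.zeros((rows, cols), dtype=int)
    for x, y in person_centroids:
        r = min(int(y * rows / img_h), rows - 1)  # horizontal band
        c = min(int(x * cols / img_w), cols - 1)  # vertical strip
        counts[r, c] += 1
    return counts

def crowded_areas(counts, threshold):
    """Partial areas whose count is at or above the predetermined value,
    i.e. the areas to highlight with a frame line 62."""
    rows, cols = counts.shape
    return [(r, c) for r in range(rows) for c in range(cols)
            if counts[r, c] >= threshold]
```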
- according to this monitoring information 30, the monitor can easily grasp a state of the crowd that is difficult to grasp by looking only at the first monitoring image 12 or the second monitoring image 22.
- the generation unit 2060 generates distribution information using the first monitoring image 12 and the second monitoring image 22.
- the generation unit 2060 generates distribution information for the first monitoring image 12 by performing processing such as image recognition processing on the first monitoring image 12.
- the generation unit 2060 divides the first monitoring image 12 into a plurality of partial areas, and calculates the number of objects shown in each partial area. As a result, distribution information indicating the number of objects for each partial region of the first monitoring image 12 is generated.
- the generation unit 2060 determines which partial areas of the first monitoring image 12 are shown in the second monitoring image 22. The generation unit 2060 corrects the number of objects indicated in the distribution information for those areas by using the number of objects shown in the second monitoring image 22. Then, the generation unit 2060 produces a display in which the corrected distribution information is superimposed on the first monitoring image 12. Note that the number of objects shown in the second monitoring image 22 can be calculated by performing image recognition processing or the like on the second monitoring image 22, in the same manner as the number of objects shown in the first monitoring image 12.
- FIG. 13 is a diagram illustrating an overlap between a range shown in the first monitoring image 12 and a range shown in the second monitoring image 22.
- the partial area 64-1 is the above-described partial area obtained by dividing the first monitoring image 12 into a plurality of parts.
- a range 65 shown in the second monitoring image 22 is contained in one partial area 64-1.
- the generation unit 2060 calculates the number of objects shown in the entire second monitoring image 22, and corrects the number of objects in the partial area 64-1 indicated by the distribution information using the calculated number.
- for example, the generation unit 2060 performs processing such as 1) replacing the number of objects in the partial area 64-1 indicated by the distribution information with the number of objects calculated for the second monitoring image 22, or 2) replacing it with a statistical value computed from the number of objects indicated by the distribution information and the number of objects calculated for the second monitoring image 22.
- the statistical value in 2) is, for example, a weighted average in which the number of objects calculated for the second monitoring image 22 is given a larger weight than the number of objects indicated by the distribution information.
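- for instance, the correction of one partial area's count could look like the following sketch, where the weight value is an illustrative assumption:

```python
def corrected_count(count_fixed: float, count_mobile: float,
                    w_mobile: float = 0.8) -> float:
    """Correct a partial area's object count using the moving camera's count.

    w_mobile = 1.0 corresponds to option 1) (full replacement); any
    0.5 < w_mobile < 1.0 gives option 2)'s weighted average, in which the
    count from the second monitoring image 22 weighs more because it is
    considered more accurate. The default 0.8 is an illustrative assumption.
    """
    return w_mobile * count_mobile + (1.0 - w_mobile) * count_fixed
```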
- the generation unit 2060 divides the second monitoring image 22 into a region A that overlaps the partial area 64-1 and a region B that overlaps the partial area 64-2, and calculates the number of objects shown in each region. Then, the generation unit 2060 corrects the number of objects in the partial area 64-1 indicated by the distribution information using the number of objects calculated for the region A. Similarly, the generation unit 2060 corrects the number of objects in the partial area 64-2 indicated by the distribution information using the number calculated for the region B. Each correction is performed in the same manner as described with reference to FIG. 13.
- by correcting the distribution information in this way, the object distribution can be calculated more accurately. This is because the number of objects calculated for the second monitoring image 22 can be calculated more accurately than the number of objects calculated for the first monitoring image 12.
- the generation unit 2060 can calculate the number of objects shown in the second monitoring image 22 more accurately than the number of objects shown in the first monitoring image 12.
- for example, when the fixed camera 10 shoots at an angle that looks down obliquely from a distance, people close to each other overlap, and some people may not be captured in the first monitoring image 12.
- on the other hand, by attaching the moving camera 20 to a small flying object and photographing the crowd from directly above, it is possible to shoot so that persons in close proximity do not overlap. Therefore, the number of objects calculated using the second monitoring image 22 generated by a moving camera 20 that shoots in this manner is more accurate than the number of objects calculated using the first monitoring image 12.
- the monitoring information generation apparatus 2000 may repeatedly generate the distribution information described above and update the distribution information to be displayed. Thereby, distribution information such as a heat map can be displayed like an animation.
- the generation of distribution information may be performed using all of the first monitoring images 12 and second monitoring images 22, or using only some of them. In the latter case, for example, distribution information is generated at intervals such as once per second or once every 10 seconds. The monitoring information generation apparatus 2000 may also receive a user operation instructing generation of distribution information, and update the distribution information only when such an operation is received.
- the generation unit 2060 uses the display in which the distribution information is superimposed on the first monitoring image 12 as the monitoring information 30.
- the generation unit 2060 may use the distribution information itself as the monitoring information 30.
- the generation unit 2060 stores the distribution information in a storage device or the like, or displays the distribution information on the display screen in a table format or a graph format. This distribution information can be used for behavior analysis of the crowd.
- the monitoring information generation apparatus 2000 may use distribution information as another specific example shown below.
- the generation unit 2060 may display the above-described distribution information superimposed on the map of the monitoring target location.
- the monitoring information generation device 2000 includes a map information acquisition unit 2080.
- the map information acquisition unit 2080 acquires map information that is information relating to a map near the location to be monitored.
- FIG. 14 is a block diagram illustrating a monitoring information generation device 2000 having a map information acquisition unit 2080.
- the map information indicates positions such as sidewalks, roads, and buildings.
- the map information is stored in advance in a storage device that can be accessed from the monitoring information generation device 2000.
- FIG. 15 is a diagram illustrating a map 200 displayed on the display screen 40.
- the monitoring target in FIG. 15 is an indoor floor.
- a plurality of fixed cameras 10 are installed.
- the position of the moving camera 20 is indicated by a mark 50.
- the position of the fixed camera 10 and the position of the moving camera 20 on the map 200 can be calculated using the position information of the fixed camera 10, the position information of the moving camera 20, and the position information of the place represented by the map 200.
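- a minimal sketch of this conversion, assuming GPS position information and a map 200 whose geographic bounds are known:

```python
def latlon_to_map_pixel(lat, lon, map_bounds, map_w, map_h):
    """Convert GPS coordinates to a pixel position on the map 200.

    map_bounds = (lat_min, lat_max, lon_min, lon_max) of the area the map
    covers; a simple linear (equirectangular) mapping is assumed, which is
    adequate for the small areas a surveillance map covers.
    """
    lat_min, lat_max, lon_min, lon_max = map_bounds
    x = (lon - lon_min) / (lon_max - lon_min) * map_w
    y = (lat_max - lat) / (lat_max - lat_min) * map_h  # image y grows downward
    return x, y
```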
- the position of the fixed camera 10 may be shown on the map 200 in advance.
- the generation unit 2060 uses the first monitoring image 12 generated by each fixed camera 10 to generate object distribution information for the shooting range of each fixed camera 10. Further, the generation unit 2060 corrects each distribution information using the number of objects calculated for the second monitoring image 22. The method of correcting the distribution information using the number of objects calculated for the second monitoring image 22 is the same as the method described with reference to FIG. 13.
- the generation unit 2060 superimposes and displays a display based on each corrected distribution information on the map 200.
- FIG. 16 is a diagram illustrating a map 200 on which a heat map generated based on the corrected distribution information is superimposed.
- according to this display, the background is simplified, so that a monitor can easily grasp the distribution information visually. Further, when the distribution information is displayed on the map, the state of the crowd at the places monitored by the plurality of fixed cameras 10 can be shown on one screen. Therefore, a monitor can easily grasp the state of a crowd distributed over a wide range.
- the generation unit 2060 may receive an operation for selecting a mark 50 from the user, in the same manner as the processing described with reference to FIGS. 8 and 9, and display on the map 200 the second monitoring image 22 generated by the mobile camera 20 corresponding to the selected mark 50.
- instead of the second monitoring image 22, or together with it, the generation unit 2060 may display on the map 200 the distribution information generated using the second monitoring image 22 (information indicating the distribution of objects within the shooting range of the mobile camera 20).
- the generation unit 2060 may generate the above-described distribution information using only the second monitoring images 22, without using the first monitoring image 12. In this case, for example, the generation unit 2060 calculates the number of objects shown in the second monitoring image 22 generated by each of the plurality of mobile cameras 20, and generates distribution information as a list of combinations of the shooting range of each second monitoring image 22 and the number of objects. Then, the generation unit 2060 superimposes a display based on this distribution information on the map 200, for example.
- the generation unit 2060 may generate distribution information for each second monitoring image 22 by the same method as the method for generating distribution information for the first monitoring image 12. In this case, the generation unit 2060 generates a heat map or the like using a plurality of distribution information, and displays the heat map or the like superimposed on the first monitoring image 12 or the map 200.
- FIG. 17 is a block diagram illustrating a monitoring information generation apparatus 2000 according to the second embodiment.
- each block represents a functional unit configuration, not a hardware unit configuration.
- the monitoring information generation device 2000 has a function of estimating the shooting direction of the mobile camera 20.
- the shooting direction of the fixed camera 10 can be determined by acquiring camera parameters (such as a horizontal rotation angle and a vertical rotation angle) representing the current camera posture.
- for example, suppose that a camera parameter acquired from the fixed camera 10 represents a horizontal rotation angle of +45 degrees.
- the rotation angle in the horizontal direction is expressed with a counterclockwise rotation as a plus direction. The same applies to the following description.
- in this case, the shooting direction of the fixed camera 10 is northwest, which is the direction rotated 45 degrees counterclockwise from north.
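- the conversion from a rotation angle to a compass shooting direction can be sketched as follows, assuming compass bearings measured clockwise from north and the counterclockwise-positive rotation convention above:

```python
def shooting_direction_deg(reference_deg: float, rotation_deg: float) -> float:
    """Compass bearing of the shooting direction.

    reference_deg: compass bearing (clockwise from north) of the camera's
    reference orientation; rotation_deg: horizontal rotation angle with
    counterclockwise as the plus direction, as in the text. A +45 degree
    rotation from a north-facing reference therefore yields 315 degrees,
    i.e. northwest.
    """
    return (reference_deg - rotation_deg) % 360.0
```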
- on the other hand, for the mobile camera 20, which is worn by a person or the like, it is difficult to determine an initial orientation to use as a reference, unlike the fixed camera 10, and it is therefore difficult to determine the shooting direction of the mobile camera 20.
- FIG. 18 is a diagram conceptually illustrating the operation of the monitoring information generation apparatus 2000 according to the second embodiment.
- the monitoring information generation device 2000 acquires the first monitoring image 12 generated by the fixed camera 10 and estimates the movement direction of the crowd (hereinafter, the first movement direction) in the first monitoring image 12.
- the monitoring information generation device 2000 acquires the second monitoring image 22 generated by the mobile camera 20, and estimates the movement direction of the crowd in the second monitoring image 22 (hereinafter referred to as the second movement direction).
- the monitoring information generation apparatus 2000 estimates the shooting direction of the mobile camera 20 based on the movement direction of the crowd in the first monitoring image 12 and the movement direction of the crowd in the second monitoring image 22.
- the monitoring information generation apparatus 2000 further includes a first movement direction estimation unit 2100, a second movement direction estimation unit 2120, and a shooting direction estimation unit 2140.
- the first movement direction estimation unit 2100 estimates the movement direction of the crowd in the first monitoring image 12.
- the second movement direction estimation unit 2120 estimates the movement direction of the crowd in the second monitoring image 22.
- the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 based on the first moving direction, the second moving direction, the position and posture of the fixed camera 10, and the position of the moving camera 20.
- according to the monitoring information generating apparatus 2000 of the present embodiment, the shooting direction of the moving camera 20 is estimated using the first monitoring image 12 and the second monitoring image 22. Therefore, even when a device such as an electronic compass attached to the moving camera 20 cannot calculate the shooting direction of the moving camera 20 with high accuracy, the shooting direction of the moving camera 20 can be grasped with high accuracy.
- the shooting direction of the mobile camera 20 estimated by the monitoring information generation device 2000 can be used, for example, in the process of visualizing the shooting direction of the mobile camera 20 described in the first embodiment, or in the process of mapping the range shown in the second monitoring image 22 onto the first monitoring image 12 or onto a map.
- the usage method of the estimated shooting direction of the mobile camera 20 is arbitrary, and is not limited to the usage method described in the first embodiment.
- hereinafter, the monitoring information generation apparatus 2000 of this embodiment will be described in more detail.
- FIG. 19 is a flowchart illustrating the flow of processing executed by the monitoring information generation apparatus 2000 according to the second embodiment.
- the first movement direction estimation unit 2100 estimates the first movement direction using the first monitoring image 12 (S202).
- the second movement direction estimation unit 2120 estimates the second movement direction using the second monitoring image 22 (S204).
- the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 based on the first moving direction, the second moving direction, the position and posture of the fixed camera 10, and the position of the moving camera 20 (S206).
- the first movement direction estimation unit 2100 estimates a first movement direction that is the movement direction of the crowd in the first monitoring image 12 (S202).
- a method for estimating the first movement direction will be described.
- the first movement direction estimation unit 2100 calculates optical flows of pixels and feature points included in each of the plurality of first monitoring images 12 arranged in time series.
- FIG. 20 is a diagram illustrating an optical flow calculated for the first monitoring image 12. Each arrow shown in FIG. 20 represents the optical flow calculated for the first monitoring image 12.
- the first movement direction estimation unit 2100 estimates the first movement direction based on the calculated optical flow. For example, the first movement direction estimation unit 2100 selects one from the optical flows, and sets the selected optical flow as the first movement direction. For example, the first movement direction estimation unit 2100 randomly selects one optical flow.
- the first movement direction estimation unit 2100 statistically processes a plurality of calculated optical flows to calculate one vector, and sets this vector as the first movement direction.
- This statistical process is, for example, a process for calculating an average of vectors.
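- a sketch of this optical-flow-based estimation, using OpenCV's dense Farneback flow as one concrete choice (the magnitude threshold is an assumption):

```python
import cv2
import numpy as np

def first_movement_direction(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Estimate the first movement direction as the mean optical flow
    between two consecutive first monitoring images 12 (grayscale)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # Average only flow vectors with meaningful magnitude, so static
    # background pixels do not dominate the statistic.
    mag = np.linalg.norm(flow, axis=2)
    moving = flow[mag > 1.0]
    if moving.size == 0:
        return None
    return moving.mean(axis=0)  # (dx, dy): the estimated first movement direction
```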
- the first movement direction estimation unit 2100 detects an object that is commonly shown in the plurality of first monitoring images 12 arranged in time series, and estimates the first movement direction based on a change in the position of the object.
- FIG. 21 is a diagram illustrating a change in the position of the object.
- the object represented by the dotted line is shown in the t-th first monitoring image 12
- the object represented by the solid line is shown in the t + 1-th first monitoring image 12.
- An arrow represents a change in the position of each object.
- the change in the position of the object is, for example, a vector connecting the centroids of a plurality of areas representing the same object.
- the first movement direction estimation unit 2100 selects one from a plurality of objects, and sets a vector representing a change in the position of the selected object as the first movement direction. For example, the first movement direction estimation unit 2100 randomly selects one object. For example, the first movement direction estimation unit 2100 selects the largest object.
- the first movement direction estimation unit 2100 performs statistical processing on a plurality of vectors representing changes in the positions of a plurality of objects, calculates one vector, and sets this vector as the first movement direction.
- This statistical process is, for example, a process for calculating an average of vectors.
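- a corresponding sketch of the position-change-based estimation, assuming that object detection and frame-to-frame identity matching are performed elsewhere:

```python
import numpy as np

def movement_from_tracks(centroids_t: dict, centroids_t1: dict):
    """Estimate the movement direction from changes in object positions.

    centroids_t / centroids_t1 map an object id to its region centroid (x, y)
    in the t-th and (t+1)-th first monitoring images 12.
    """
    vectors = [np.subtract(centroids_t1[i], centroids_t[i])
               for i in centroids_t if i in centroids_t1]
    if not vectors:
        return None
    return np.mean(vectors, axis=0)  # average of the per-object vectors
```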
- the first movement direction estimation unit 2100 may detect the movement direction of the crowd based on the orientation of the objects shown in the first monitoring image 12. For example, when the object is a person or an animal, the first movement direction estimation unit 2100 determines the orientation of the face or body, and sets the direction in which the face or body is facing as the first movement direction. When the object is a car, a motorcycle, a flying object, or the like, the first movement direction estimation unit 2100 determines the traveling direction of the object from its shape and the positions of its various parts (such as bumpers and handlebars) shown in the first monitoring image 12, and sets the determined traveling direction as the first movement direction.
- the method by which the second movement direction estimation unit 2120 estimates the movement direction of the crowd shown in the second monitoring image 22 (the second movement direction) is the same as the method by which the first movement direction estimation unit 2100 estimates the first movement direction.
- due to camera shake or the like, the first monitoring image 12 and the second monitoring image 22 may be blurred, and the crowd may appear blurred in each monitoring image.
- in this case, the first movement direction estimation unit 2100 estimates the first movement direction after applying processing that reduces the blur (so-called camera shake correction) to each first monitoring image 12.
- similarly, the second movement direction estimation unit 2120 estimates the second movement direction after applying the blur-reducing processing to each second monitoring image 22.
- the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 based on the first movement direction, the second movement direction, the position and posture of the fixed camera 10, and the position of the moving camera 20 (S206).
- FIG. 22 is a diagram for explaining the operation of the shooting direction estimation unit 2140.
- specific processing in which the shooting direction estimation unit 2140 estimates the shooting direction of the mobile camera 20 will be described with reference to FIG.
- first, the shooting direction estimation unit 2140 maps the first movement direction onto a plane (for example, a map) obtained by viewing the monitored location from directly above.
- the plane 70 is a plane obtained by viewing the monitored location from directly above.
- the movement direction 80 represents a direction in which the movement direction of the crowd in the first monitoring image 12 is mapped to the plane 70.
- the plane 70 is a plane whose north is the upward direction.
- next, the shooting direction estimation unit 2140 uses the movement direction 80 to calculate the movement direction on the plane 70 of the crowd shown by the moving camera 20 (hereinafter, the third movement direction).
- the third movement direction is the same as the movement direction 80.
- FIG. 22 described above illustrates a case where the shooting range of the fixed camera 10 and the shooting range of the moving camera 20 overlap. The processing when the shooting range of the fixed camera 10 and the shooting range of the moving camera 20 do not overlap will be described later.
- the shooting direction estimation unit 2140 determines the shooting direction of the mobile camera 20 as an estimation result from among a plurality of candidates for the shooting direction (hereinafter, candidate shooting directions). First, for each candidate shooting direction, the shooting direction estimation unit 2140 calculates the movement direction of the crowd (hereinafter, the candidate movement direction) that would be seen in the moving camera 20 if it were facing that candidate shooting direction from its position.
- in FIG. 22, the candidate shooting directions 92 are the eight directions north, northeast, east, southeast, south, southwest, west, and northwest.
- candidate movement directions 94-1 to 94-3 are the candidate movement directions when the moving camera 20 faces the candidate shooting directions 92-1, 92-2, and 92-3, respectively, from its position. Note that in this example, when the moving camera 20 faces any candidate shooting direction other than these three, the crowd is hardly or not at all visible, so the other candidate shooting directions are omitted.
- the imaging direction estimation unit 2140 performs matching between each candidate movement direction and the second movement direction.
- the second movement direction is the second movement direction 95.
- the shooting direction estimation unit 2140 estimates that the candidate shooting direction 92 corresponding to the candidate movement direction 94 having the highest degree of coincidence with the second movement direction 95 is the shooting direction of the mobile camera 20.
- the candidate movement direction 94 having the highest degree of coincidence with the second movement direction 95 is the candidate movement direction 94-2. Therefore, the shooting direction estimation unit 2140 estimates that the candidate shooting direction 92-2 corresponding to the candidate moving direction 94-2 is the shooting direction of the moving camera 20.
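- this matching can be sketched as follows, using cosine similarity as one possible degree of coincidence (the text does not fix a particular measure):

```python
import numpy as np

def estimate_shooting_direction(candidates: dict, second_direction) -> str:
    """Pick the candidate shooting direction whose candidate movement
    direction best matches the second movement direction.

    candidates maps a candidate shooting direction (e.g. "north") to the
    candidate movement direction (dx, dy) it would produce in the second
    monitoring image 22.
    """
    v2 = np.asarray(second_direction, dtype=float)
    v2 = v2 / np.linalg.norm(v2)

    def coincidence(v):
        v = np.asarray(v, dtype=float)
        return float(np.dot(v / np.linalg.norm(v), v2))

    return max(candidates, key=lambda d: coincidence(candidates[d]))
```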
- the candidate shooting direction is not limited to the above-mentioned eight directions.
- the candidate shooting directions may be four directions of north, east, south, and west.
- the candidate shooting directions are not limited to directions in which general names such as east, west, south, and north are defined.
- the shooting direction estimation unit 2140 determines a candidate shooting direction based on the direction of a sidewalk near the mobile camera 20.
- FIG. 23 is a diagram for explaining a method of determining a candidate shooting direction based on the direction of the sidewalk.
- in FIG. 23, the moving camera 20 is located near the sidewalk 110. Therefore, a monitor observing the crowd moving along the sidewalk 110 is considered to monitor with the direction 120, which is the normal direction of the sidewalk 110, as the front direction.
- therefore, the shooting direction estimation unit 2140 determines each candidate shooting direction with reference to the direction 120, the normal direction of the sidewalk 110. For example, the shooting direction estimation unit 2140 takes as the candidate shooting directions the four directions: the direction 120, the direction 121 obtained by rotating the direction 120 by +90 degrees, the direction 122 obtained by rotating it by +180 degrees, and the direction 123 obtained by rotating it by +270 degrees.
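- generating these four candidates from the normal direction of the sidewalk can be sketched as follows (bearings in degrees):

```python
def sidewalk_candidates(normal_deg: float) -> list:
    """Candidate shooting directions derived from the sidewalk's normal
    direction 120: the normal and its +90, +180, and +270 degree rotations
    (the directions 120-123 in FIG. 23)."""
    return [(normal_deg + rot) % 360.0 for rot in (0.0, 90.0, 180.0, 270.0)]
```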
- in order to grasp the direction of the sidewalk, the shooting direction estimation unit 2140 uses, for example, the map information acquired by the map information acquisition unit 2080 described above.
- the shooting direction estimation unit 2140 uses the movement direction 80 to calculate the movement direction (third movement direction) of the object shown on the moving camera 20 on the plane 70.
- in the example described above, the shooting range of the fixed camera 10 and the shooting range of the moving camera 20 overlap.
- hereinafter, a case where the shooting range of the fixed camera 10 and the shooting range of the moving camera 20 do not overlap will be described.
- FIG. 24 is a diagram illustrating a case where the shooting ranges of the fixed camera 10 and the moving camera 20 do not overlap.
- the direction in which the first movement direction is mapped on the plane 70 is the movement direction 80-1.
- in this case, the movement direction on the plane 70 of the crowd shown by the moving camera 20 is not necessarily the movement direction 80-2, which is the same direction as the movement direction 80-1; it may also be the movement direction 80-3.
- therefore, the shooting direction estimation unit 2140 acquires map information of the monitored location, and calculates the movement direction on the plane 70 of the crowd shown by the moving camera 20 based on the map information. Specifically, the shooting direction estimation unit 2140 moves the movement direction on the plane 70 of the crowd shown in the first monitoring image 12 along a route that the crowd is supposed to follow on the map (for example, a sidewalk) to the vicinity of the moving camera 20, and takes the resulting direction as the movement direction on the plane 70 of the crowd shown by the moving camera 20.
- FIG. 25 is a diagram for explaining a method of estimating the movement direction of the crowd on the plane 70 using the map information.
- the object is a person.
- a sidewalk 110 is a sidewalk where people walk.
- in this case, the shooting direction estimation unit 2140 sets the movement direction 80-3 as the third movement direction. More specifically, the shooting direction estimation unit 2140 takes as the third movement direction the vector obtained by moving the movement direction 80-1 along a line passing through the center of the sidewalk to the point on that line closest to the position of the moving camera 20.
- FIG. 26 is a diagram illustrating a case where there are a plurality of movement paths of the crowd.
- the crowd can move in either the moving direction 80-2 or the moving direction 80-3.
- the shooting direction estimation unit 2140 acquires movement route information related to the movement route of the crowd.
- the monitoring information generation apparatus 2000 includes a movement route information acquisition unit 2160.
- FIG. 27 is a block diagram illustrating a monitoring information generation apparatus 2000 having a movement route information acquisition unit 2160.
- the movement route information indicates in which direction the crowd moves. For example, when an event is held at an event venue, it is considered that the crowd moves from the nearest station to the event venue.
- crowd guidance is generally performed so that the crowd is directed to an event venue along a predetermined route. Therefore, the movement route information indicates the movement route of the crowd determined in advance as described above.
- FIG. 28 is a diagram illustrating a process of estimating the movement direction of the crowd using the movement route information.
- the movement route information acquired in FIG. 28 is information indicating a map and a movement route on that map. In the map indicated by the movement route information, the sidewalk 110 branches, as in FIG. 26.
- in FIG. 28, the path 111 is shown in the movement route information acquired by the shooting direction estimation unit 2140. Therefore, the shooting direction estimation unit 2140 sets the movement direction 80-2, obtained by moving the movement direction 80-1 along the path 111, as the third movement direction.
- the movement route information acquisition unit 2160 acquires a combination of “the movement route of the crowd and the time zone in which the movement occurs” as the movement route information.
- the travel route information is stored in advance in a storage device accessible from the travel route information acquisition unit 2160.
- note that the shooting direction estimation unit 2140 may estimate the movement direction on the plane 70 of the crowd shown in the second monitoring image 22 from the movement route indicated in the movement route information alone. For example, in FIG. 28, the shooting direction estimation unit 2140 may estimate the movement direction on the plane 70 of the crowd shown in the second monitoring image 22 (the third movement direction) based on the path 111, without using the movement direction 80-1.
- the shooting direction estimation unit 2140 may narrow down the candidate shooting directions before calculating the candidate movement directions and matching them with the second movement direction. This reduces the number of matching operations, and thus the amount of computation performed by the shooting direction estimation unit 2140.
- several methods for narrowing the candidate shooting direction will be exemplified.
- the shooting direction estimation unit 2140 narrows down the candidate shooting directions using an electronic compass. Specifically, the shooting direction estimation unit 2140 excludes a direction having a large angle difference from the direction indicated by the electronic compass from the candidate shooting directions.
- FIG. 29 is a diagram for explaining a method of narrowing down the candidate shooting direction using an electronic compass.
- the candidate shooting directions are eight directions such as north and northwest.
- the electronic compass indicates northwest.
- the actual shooting direction of the mobile camera 20 is likely to be north or west.
- the southeast which is the opposite of the direction indicated by the electronic compass, is unlikely to be the shooting direction of the mobile camera 20.
- the shooting direction estimation unit 2140 excludes the southeast or the like where the difference in angle with the direction indicated by the electronic compass is large from the candidate shooting directions.
- a predetermined number representing the number of directions excluded from the candidate shooting directions is determined in advance. For example, if the predetermined number is 3, the shooting direction estimation unit 2140 excludes southeast, south, and east from the candidate shooting directions. If the predetermined number is 5, the shooting direction estimation unit 2140 excludes southeast, south, east, southwest, and northeast from the candidate shooting directions. In FIG. 29, southeast, south, and east are excluded from the candidate shooting directions.
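- a sketch of this narrowing step; with the eight candidate directions, a compass reading of northwest (a bearing of 315 degrees) and a predetermined number of 3, it excludes southeast, south, and east, matching the example above:

```python
def narrow_by_compass(candidates_deg, compass_deg, n_exclude=3):
    """Exclude the n_exclude candidate directions whose angle difference
    from the electronic compass reading is largest (bearings in degrees)."""
    def angular_diff(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)  # shortest angular distance

    ranked = sorted(candidates_deg,
                    key=lambda c: angular_diff(c, compass_deg), reverse=True)
    excluded = set(ranked[:n_exclude])
    return [c for c in candidates_deg if c not in excluded]
```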
- The shooting direction estimation unit 2140 may also narrow down the candidate shooting directions based on the background shown in the second monitoring image 22.
- For example, the shooting direction estimation unit 2140 excludes candidate shooting directions for which the background shown in the second monitoring image 22 is predicted not to fall within the angle of view of the moving camera 20.
- FIG. 30 is a diagram for explaining a method of narrowing down the candidate shooting directions based on the background shown in the second monitoring image 22.
- FIG. 30 shows a map of the area around the monitored location.
- Suppose the building 160 appears in the second monitoring image 22.
- If the shooting direction of the moving camera 20 were southeast, the building 160 would not appear in the second monitoring image 22.
- Conversely, if the shooting direction of the moving camera 20 is northwest, it is considered likely that the building 160 appears in the second monitoring image 22.
- In this way, the shooting direction estimation unit 2140 extracts, from the background shown in the second monitoring image 22, a background element representing a distinctive building, signboard, or the like around the monitored location. Then, from the relationship between the position of the extracted background on the map and the position of the moving camera 20, the shooting direction estimation unit 2140 excludes candidate shooting directions in which the extracted background is unlikely to appear. Specifically, it excludes candidate shooting directions whose angular difference from the direction whose start point is the position of the moving camera 20 and whose end point is the position of the extracted background on the map is large. For example, the shooting direction estimation unit 2140 excludes a predetermined number of candidate shooting directions, as in the narrowing using the electronic compass described above. In FIG. 30, southeast, south, and east are excluded from the candidate shooting directions.
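- The same exclusion helper can be driven by the bearing from the moving camera to the extracted landmark instead of a compass reading. The sketch below (reusing BEARING and narrow_candidates from the previous sketch) computes that bearing from hypothetical map coordinates; the coordinate convention (x pointing east, y pointing north) is an assumption.

```python
import math

def bearing_to(camera_xy, landmark_xy):
    """Bearing from the camera to the landmark, degrees clockwise from north."""
    dx = landmark_xy[0] - camera_xy[0]
    dy = landmark_xy[1] - camera_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360

# Building 160 lies to the northwest of the moving camera 20 on the map,
# so the candidates farthest from that bearing are excluded as in FIG. 30.
reference = bearing_to((50.0, 20.0), (30.0, 40.0))   # hypothetical positions
remaining = narrow_candidates(list(BEARING), reference, 3)
print(remaining)  # ['N', 'NE', 'SW', 'W', 'NW']
```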
- The shooting direction estimation unit 2140 may also narrow down the candidate shooting directions based on the position of the extracted background within the second monitoring image 22.
- FIG. 31 is a diagram for explaining a method of narrowing down the candidate shooting directions based on the position of a specific background on the second monitoring image 22.
- FIG. 31A shows the second monitoring image 22, and FIG. 31B shows the positional relationship between the moving camera 20 and the building 160 on the map.
- In FIG. 31A, the building 160 appears to the right of the center of the second monitoring image 22.
- In this case, the shooting direction of the moving camera 20 is a direction rotated in the + direction from the direction connecting the position of the moving camera 20 and the position of the building 160 on the map (the direction 170).
- Therefore, the shooting direction estimation unit 2140 excludes candidate shooting directions for which the signed angle formed with the direction connecting the position of the moving camera 20 and the position of the building 160 on the map lies in the range of -0° to -180°.
- In FIG. 31B, the candidate shooting direction 174 is excluded from the candidate shooting directions, while the candidate shooting direction 172 is not.
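- A sketch of this signed-angle test follows. The sign convention is an assumption for illustration: with bearings measured clockwise from north, a landmark appearing to the right of the image center is taken to mean that the optical axis lies counterclockwise of the camera-to-landmark direction, so candidates on the clockwise side are excluded.

```python
def signed_angle(reference, target):
    """Signed rotation from reference to target bearing, in [-180, 180)."""
    return (target - reference + 180.0) % 360.0 - 180.0

def exclude_clockwise_side(candidates, reference_bearing):
    """Keep only candidates on the counterclockwise side of the reference."""
    return [c for c in candidates
            if signed_angle(reference_bearing, BEARING[c]) <= 0.0]

# Direction 170 (moving camera 20 to building 160) as the reference bearing:
kept = exclude_clockwise_side(["N", "NE", "W", "NW"], BEARING["NW"])
print(kept)  # ['W', 'NW'] -- candidates such as N and NE are excluded
```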
- The shooting direction estimation unit 2140 may estimate the shooting direction of the moving camera 20 a plurality of times during a predetermined period and comprehensively estimate the shooting direction of the moving camera 20 based on the plurality of estimation results. For example, suppose the shooting direction estimation unit 2140 calculates a comprehensive estimation result of the shooting direction of the moving camera 20 once per second, and that the fixed camera 10 and the moving camera 20 capture images at a frequency of 30 frames per second (30 fps). In this case, by estimating the shooting direction of the moving camera 20 each time a first monitoring image 12 and a second monitoring image 22 are generated, the shooting direction estimation unit 2140 performs the estimation 30 times per second, and then calculates a comprehensive estimation result based on those 30 estimation results.
- For example, the shooting direction estimation unit 2140 statistically processes the plurality of estimation results calculated during the predetermined period to estimate the shooting direction of the moving camera 20. For example, the shooting direction estimation unit 2140 takes the mode of the plurality of estimation results as the shooting direction of the moving camera 20. As an example, suppose the breakdown of the 30 estimates performed in one second is "north: 20 times, northwest: 8 times, northeast: 2 times". In this case, the shooting direction estimation unit 2140 estimates that north, the most frequently calculated direction, is the shooting direction of the moving camera 20 for that one second.
- Alternatively, the shooting direction estimation unit 2140 may calculate an average of the plurality of estimation results and use it as the shooting direction of the moving camera 20. Specifically, the shooting direction estimation unit 2140 represents each of the estimated directions calculated for the moving camera 20 as a numerical value, with east as +0 degrees, obtains the average of these numerical values, and takes the resulting value as the shooting direction of the moving camera 20.
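- A minimal sketch of both statistics follows. The mode computation mirrors the example above; for the average, angles use the document's east = +0° convention (counterclockwise positive, so north = 90° and northwest = 135°). Using a circular mean instead of a plain arithmetic mean is our own refinement to avoid the 0°/360° wrap-around, not something the text prescribes.

```python
import math
from collections import Counter

estimates = ["N"] * 20 + ["NW"] * 8 + ["NE"] * 2   # 30 per-second estimates

# Mode: north is the most frequent result, as in the example above.
mode_direction = Counter(estimates).most_common(1)[0][0]

ANGLE_EAST0 = {"E": 0, "NE": 45, "N": 90, "NW": 135,
               "W": 180, "SW": 225, "S": 270, "SE": 315}

def circular_mean_deg(angles):
    """Mean direction of a set of angles, computed on unit vectors."""
    x = sum(math.cos(math.radians(a)) for a in angles)
    y = sum(math.sin(math.radians(a)) for a in angles)
    return math.degrees(math.atan2(y, x)) % 360

mean_angle = circular_mean_deg([ANGLE_EAST0[d] for d in estimates])
print(mode_direction, round(mean_angle, 1))  # N 98.9 (slightly west of north)
```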
- When the shooting direction estimation unit 2140 calculates a comprehensive estimation result from estimates performed a plurality of times during a predetermined period, the comprehensive estimation result may be calculated using only the results with high reliability among the plurality of estimation results.
- Suppose, as before, that the shooting direction estimation unit 2140 calculates a comprehensive estimation result of the shooting direction of the moving camera 20 once per second, and that the fixed camera 10 and the moving camera 20 capture images at 30 fps. By estimating the shooting direction of the moving camera 20 each time a first monitoring image 12 and a second monitoring image 22 are generated, the shooting direction estimation unit 2140 performs the estimation 30 times per second. Next, the shooting direction estimation unit 2140 divides the 30 estimation results per second into groups of 10 results that are consecutive in time series. Then, the shooting direction estimation unit 2140 calculates the comprehensive estimation result of the shooting direction of the moving camera 20 for that one second using only the groups with high reliability among the three groups.
- The reliability of a group is determined based on, for example, the magnitude of the variance of the estimation results within the group.
- For example, the shooting direction estimation unit 2140 calculates the variance of the estimation results within each group by representing each estimation result as a numerical value with east as +0°. Then, the shooting direction estimation unit 2140 calculates the comprehensive estimation result using only the estimation results included in groups whose calculated variance is equal to or less than a predetermined value.
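- A sketch of this reliability filter follows. The document measures spread as the variance of the numeric estimates; the circular variance used here is one concrete realization of that idea, and the group size and threshold are illustrative assumptions.

```python
import math

def circular_variance(angles_deg):
    """1 - mean resultant length: 0 for identical angles, near 1 when scattered."""
    n = len(angles_deg)
    x = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    y = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    return 1.0 - math.hypot(x, y)

def reliable_estimates(per_frame_angles, group_size=10, max_variance=0.1):
    """Split the per-second estimates into consecutive groups and keep
    only the estimates from groups whose spread is small enough."""
    kept = []
    for i in range(0, len(per_frame_angles), group_size):
        group = per_frame_angles[i:i + group_size]
        if circular_variance(group) <= max_variance:
            kept.extend(group)
    return kept
```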
- Depending on the location, the flow of the crowd may change regularly or irregularly. For example, near an intersection the flow of the crowd changes as the traffic lights switch. FIG. 32 is a diagram illustrating an example in which the crowd flow changes near an intersection.
- FIG. 32 shows the intersection viewed in plan from the vertical direction. Suppose that at a certain point in time the signal 131 of the pedestrian crossing 130 is green and the signal 141 of the pedestrian crossing 140 is red. In this case, the crowd flows, for example, in the direction 132 to cross the pedestrian crossing 130. Suppose that the signal 131 then turns red and the signal 141 turns green. In this case, the crowd flows, for example, in the direction 142 to cross the pedestrian crossing 140.
- In such a case, the shooting direction estimation unit 2140 may estimate the shooting direction of the moving camera 20 both before and after the change and comprehensively estimate the shooting direction of the moving camera 20 from those results. Specifically, the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 before and after the change using the first monitoring images 12 and the second monitoring images 22 captured before and after the change in the crowd flow. Then, the shooting direction estimation unit 2140 calculates the final estimation result by statistically processing the estimated shooting directions.
- For example, suppose that when the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 while the crowd is flowing in the direction 132, the candidate shooting direction whose candidate movement direction has a high degree of coincidence with the second movement direction is north, and that while the crowd is flowing in the direction 142, the candidate shooting directions with a high degree of coincidence are north and northeast. In this case, the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 to be north, the direction (the mode) whose degree of coincidence with the second movement direction is high both when the crowd flows in the direction 132 and when it flows in the direction 142.
- The shooting direction of the moving camera 20 can change. It is therefore preferable that the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 repeatedly.
- For example, the shooting direction estimation unit 2140 repeatedly estimates the shooting direction of the moving camera 20 at a frequency such as once per second or once every 10 seconds.
- Once the shooting direction estimation unit 2140 has estimated the shooting direction of the moving camera 20 by the method described above, it can thereafter estimate the shooting direction of the moving camera 20 based on changes in the shooting direction of the moving camera 20. Specifically, the shooting direction estimation unit 2140 calculates a change in the shooting direction of the moving camera 20, and can estimate the shooting direction of the moving camera 20 after the change from the calculated change and the shooting direction of the moving camera 20 estimated before the change. For example, suppose the shooting direction estimation unit 2140 estimates that the shooting direction of the moving camera 20 at time t is north. Next, suppose that t1 seconds later the shooting direction estimation unit 2140 calculates that the shooting direction of the moving camera 20 has changed by +45°. In this case, the shooting direction estimation unit 2140 estimates that the shooting direction of the moving camera 20 at time t + t1 is northwest.
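- The arithmetic of this update is simple; the sketch below uses the east = +0° counterclockwise convention, under which north (90°) plus a +45° change gives northwest (135°), matching the example.

```python
def propagate(previous_angle_deg, measured_change_deg):
    """Second estimation process: last estimate plus the measured change."""
    return (previous_angle_deg + measured_change_deg) % 360

direction_t = 90.0                            # "north" estimated at time t
direction_t1 = propagate(direction_t, +45.0)  # 135.0 -> northwest at t + t1
```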
- The shooting direction estimation unit 2140 may perform part of the repeatedly executed estimation of the shooting direction of the moving camera 20 as a process of estimating the shooting direction of the moving camera 20 based on changes in the shooting direction of the moving camera 20.
- Hereinafter, the process of estimating the shooting direction of the moving camera 20 using the first movement direction and the second movement direction is referred to as the first estimation process, and the process of estimating the shooting direction of the moving camera 20 by calculating a change in the shooting direction of the moving camera 20 is referred to as the second estimation process.
- FIG. 33 is a diagram illustrating, in time series, the breakdown of the estimation processes for the shooting direction of the moving camera 20 performed by the shooting direction estimation unit 2140.
- In FIG. 33, the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 at a frequency of once per second.
- The shooting direction estimation unit 2140 repeats the pattern "after performing the first estimation process once, perform the second estimation process nine times". The first estimation process is therefore performed at a frequency of once every 10 seconds.
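- The schedule of FIG. 33 can be expressed as the loop below, building on the propagate() helper above. first_estimation() and measure_change() are hypothetical placeholders standing in for the first estimation process and for the measurement of the direction change.

```python
def estimate_repeatedly(frame_pairs, first_estimation, measure_change):
    """Once per second: a first estimation every 10th step, otherwise the
    cheaper second estimation based on the measured direction change."""
    direction = None
    for i, pair in enumerate(frame_pairs):
        if i % 10 == 0:
            direction = first_estimation(pair)
        else:
            direction = propagate(direction, measure_change(pair))
        yield direction
```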
- As described above, the shooting direction estimation unit 2140 may estimate the shooting direction of the moving camera 20 a plurality of times and comprehensively estimate the shooting direction of the moving camera 20 based on those estimation results. In FIG. 33, each such comprehensive estimation of the shooting direction of the moving camera 20 is represented as one first estimation process.
- The change in the shooting direction of the moving camera 20 can be calculated, for example, using an acceleration sensor attached to the moving camera 20.
- When an acceleration sensor attached to the moving camera 20 is used, a relative change in the attitude of the moving camera 20 can be calculated from the change in the output of the acceleration sensor. For example, the shooting direction estimation unit 2140 calculates the change in the shooting direction of the moving camera 20 at time t + t1 from the difference between the output of the acceleration sensor at time t + t1 and the output of the acceleration sensor at the time when the shooting direction of the moving camera 20 at time t was estimated.
- The change in the shooting direction of the moving camera 20 can also be calculated by tracking changes in the feature points shown in the second monitoring image 22.
- FIG. 34 is a diagram illustrating a change in the position of a feature point in the second monitoring image 22.
- In FIG. 34, the feature point shown at the position 150-1 at a certain time t has moved to the position 150-2 at time t + t1.
- The distance moved in the horizontal direction is x.
- From the direction and magnitude of this movement of the feature point, the shooting direction estimation unit 2140 can calculate the direction and magnitude of the change in the shooting direction of the moving camera 20.
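- A rough sketch of this computation follows. It tracks feature points with OpenCV and converts the median horizontal displacement into a yaw change through the horizontal field of view, a small-angle pinhole approximation; the parameter values and the sign convention are assumptions.

```python
import cv2
import numpy as np

def yaw_change_from_frames(prev_gray, cur_gray, horizontal_fov_deg):
    """Estimate how far the camera panned between two frames from the
    median horizontal displacement x of tracked feature points."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=8)
    if p0 is None:
        return 0.0
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, p0, None)
    good = status.reshape(-1) == 1
    dx = p1[good, 0, 0] - p0[good, 0, 0]
    if dx.size == 0:
        return 0.0
    degrees_per_pixel = horizontal_fov_deg / prev_gray.shape[1]
    # Features shifting left in the image suggest the camera panned right;
    # the sign convention here is an assumption.
    return -float(np.median(dx)) * degrees_per_pixel
```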
- The computational cost of the second estimation process (the process of estimating the shooting direction of the moving camera 20 using a change in the shooting direction of the moving camera 20) is smaller than that of the first estimation process (the process of estimating the shooting direction of the moving camera 20 by calculating the first movement direction, the second movement direction, and so on). Therefore, when the shooting direction of the moving camera 20 is calculated repeatedly, using the first estimation process and the second estimation process together has the effect of reducing the processing load of the monitoring information generation apparatus 2000.
- The first movement direction estimation unit 2100, the second movement direction estimation unit 2120, and the shooting direction estimation unit 2140 preferably perform the processing described above after correcting the vertical tilt of the first monitoring image 12 and the second monitoring image 22. For example, the vertical tilt of each image can be corrected based on the tilt of lines of buildings and the like shown in the image. For example, when a building appears in the first monitoring image 12, the first movement direction estimation unit 2100 and the like extract a line along the height direction of the building shown in the first monitoring image 12 and correct the first monitoring image 12 so that the line becomes perpendicular to the horizontal direction of the first monitoring image 12, thereby correcting the vertical tilt of the first monitoring image 12.
- The vertical tilt of the first monitoring image 12 may also be corrected using a camera parameter representing the vertical tilt of the fixed camera 10.
- Similarly, the vertical tilt of the second monitoring image 22 may be corrected using the vertical tilt of the moving camera 20, which can be calculated from an acceleration sensor attached to the moving camera 20.
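- The building-line correction can be sketched as follows: near-vertical Hough line segments vote for the roll angle, and the image is rotated so that those lines become vertical. All thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def correct_vertical_tilt(image, near_vertical_deg=20.0):
    """Rotate the image so that near-vertical building edges become vertical."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=5)
    if segments is None:
        return image
    deviations = []
    for x1, y1, x2, y2 in segments[:, 0]:
        dev = np.degrees(np.arctan2(x2 - x1, y2 - y1))  # 0 deg = vertical
        dev = (dev + 90.0) % 180.0 - 90.0               # fold into [-90, 90)
        if abs(dev) < near_vertical_deg:                # keep building edges
            deviations.append(dev)
    if not deviations:
        return image
    roll = float(np.median(deviations))
    h, w = gray.shape
    rotation = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), -roll, 1.0)
    return cv2.warpAffine(image, rotation, (w, h))
```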
- The monitoring information generation apparatus 2000 of the second exemplary embodiment is realized using the computer 1000, as in the first exemplary embodiment (see FIG. 4).
- In the present exemplary embodiment, the program modules stored in the storage 1080 described above further include programs that realize the functions described in the present exemplary embodiment.
- An apparatus that estimates the shooting direction of the moving camera 20 may be provided separately from the monitoring information generation apparatus 2000; this apparatus is referred to as the shooting direction estimation apparatus 3000. FIG. 35 is a block diagram illustrating the shooting direction estimation apparatus 3000. In FIG. 35, each block represents a functional unit, not a hardware unit.
- The shooting direction estimation apparatus 3000 includes the first monitoring image acquisition unit 2020, the second monitoring image acquisition unit 2040, the first movement direction estimation unit 2100, the second movement direction estimation unit 2120, and the shooting direction estimation unit 2140.
- The function of each of these functional components is as described above.
- The hardware configuration of the shooting direction estimation apparatus 3000 is represented by, for example, FIG.
- The storage of the computer that implements the shooting direction estimation apparatus 3000 stores program modules that implement the functions of the shooting direction estimation apparatus 3000.
- In this case, the generation unit 2060 of the monitoring information generation apparatus 2000 acquires the shooting direction of the moving camera 20 from the shooting direction estimation apparatus 3000 and uses it.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
- Alarm Systems (AREA)
Abstract
Description
… second acquisition means for acquiring a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and 3) generation means for generating monitoring information of an object using the first monitoring image and the second monitoring image.
FIG. 1 is a block diagram illustrating the monitoring information generation apparatus 2000 according to the first exemplary embodiment. In FIG. 1, each block represents a functional unit, not a hardware unit.
According to the present exemplary embodiment, monitoring information about the crowd is generated using the monitoring image generated by the moving camera 20 in addition to the monitoring image generated by the fixed camera 10. The state of the crowd can therefore be grasped more accurately than when it must be grasped using only the fixed camera 10.
FIG. 3 is a flowchart illustrating the flow of processing executed by the monitoring information generation apparatus 2000 of the first exemplary embodiment. The first monitoring image acquisition unit 2020 acquires the first monitoring image 12 (S102). The second monitoring image acquisition unit 2040 acquires the second monitoring image 22 (S104). The generation unit 2060 generates the monitoring information 30 using the first monitoring image 12 and the second monitoring image 22 (S106).
FIG. 4 is a diagram illustrating the hardware configuration of a computer 1000 that implements the monitoring information generation apparatus 2000 of the first exemplary embodiment. The computer 1000 may be implemented using a dedicated apparatus designed specifically for the monitoring information generation apparatus 2000, or using a general-purpose apparatus such as a PC (Personal Computer) or a mobile terminal.
The first monitoring image acquisition unit 2020 acquires the first monitoring image 12 (S102). There are various ways in which the first monitoring image acquisition unit 2020 can acquire the first monitoring image 12. For example, the first monitoring image acquisition unit 2020 receives the first monitoring image 12 transmitted from the fixed camera 10. Alternatively, for example, the first monitoring image acquisition unit 2020 may access the fixed camera 10 and acquire the first monitoring image 12 stored in the fixed camera 10. Note that the fixed camera 10 may store the first monitoring image 12 in a storage device provided outside the fixed camera 10; in this case, the first monitoring image acquisition unit 2020 may access this storage device to acquire the first monitoring image 12.
The second monitoring image acquisition unit 2040 acquires the second monitoring image 22 (S104). The method by which the second monitoring image acquisition unit 2040 acquires the second monitoring image 22 is the same as the method by which the first monitoring image acquisition unit 2020 acquires the first monitoring image 12.
The generation unit 2060 generates the monitoring information 30 of the objects using the first monitoring image 12 and the second monitoring image 22 (S106). As described above, the object is not limited to a person and may be anything. What the generation unit 2060 treats as an object may be preset in the generation unit 2060, stored in a storage device or the like accessible from the generation unit 2060, or set manually when the generation unit 2060 operates.
The generation unit 2060 generates, as the monitoring information 30, a display in which the second monitoring image 22 is superimposed on the first monitoring image 12. FIG. 5 is a diagram illustrating such a display. On the display screen 40 of FIG. 5, a screen in which the second monitoring images 22-1 to 22-3 are superimposed on the first monitoring image 12 is displayed.
The generation unit 2060 generates a display in which marks indicating the moving cameras 20 are superimposed on the first monitoring image 12. Further, when a mark is selected by a user (such as an observer), the generation unit 2060 displays the second monitoring image 22 generated by the moving camera 20 corresponding to that mark. As a result, as in the first specific example described above, the monitoring information 30 is generated as a display in which the second monitoring image 22 is superimposed on the first monitoring image 12.
The generation unit 2060 generates, as the monitoring information 30, a display in which information about the crowd is superimposed on the first monitoring image 12. The information about the crowd is, for example, information representing the distribution of the objects included in the crowd. Hereinafter, the information representing the distribution of the objects included in the crowd is referred to as distribution information.
A method of generating the distribution information used to realize each of the displays described above will now be explained. The generation unit 2060 generates the distribution information using the first monitoring image 12 and the second monitoring image 22. First, the generation unit 2060 generates distribution information for the first monitoring image 12 by performing processing such as image recognition on the first monitoring image 12. Specifically, the generation unit 2060 divides the first monitoring image 12 into a plurality of partial regions and calculates the number of objects shown in each partial region. As a result, distribution information representing the number of objects in each partial region of the first monitoring image 12 is generated.
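As one concrete picture of this step, the sketch below counts detected object positions per partial region of a grid. The grid size and the detector that produces the (x, y) positions are outside the text and are assumptions.

```python
import numpy as np

def distribution_grid(detections, image_size, grid=(4, 4)):
    """Count detected objects per partial region of the first monitoring image.

    detections: list of (x, y) object positions from any detector.
    image_size: (width, height) of the monitoring image.
    """
    width, height = image_size
    rows, cols = grid
    counts = np.zeros((rows, cols), dtype=int)
    for x, y in detections:
        r = min(int(y * rows / height), rows - 1)
        c = min(int(x * cols / width), cols - 1)
        counts[r, c] += 1
    return counts

print(distribution_grid([(10, 10), (500, 300), (520, 310)], (640, 360)))
```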
Since the fixed camera 10 and the moving camera 20 are cameras that capture video, the first monitoring image 12 and the second monitoring image 22 are generated repeatedly. The monitoring information generation apparatus 2000 may therefore repeatedly generate the distribution information described above and update the displayed distribution information. This makes it possible, for example, to display distribution information such as a heat map like an animation. Note that the distribution information may be generated using all of the first monitoring images 12 and second monitoring images 22, or using only some of them. In the latter case, the distribution information is generated at intervals such as once per second or once every 10 seconds. The monitoring information generation apparatus 2000 may also accept a user operation instructing generation of the distribution information and update the distribution information only when such an operation is received.
In the example described above, the generation unit 2060 uses, as the monitoring information 30, a display in which the distribution information is superimposed on the first monitoring image 12. However, the generation unit 2060 may use the distribution information itself as the monitoring information 30. In this case, for example, the generation unit 2060 stores the distribution information in a storage device or the like, or displays the distribution information on a display screen in a table format, a graph format, or the like. This distribution information can be used for crowd behavior analysis and the like. The monitoring information generation apparatus 2000 may also use the distribution information as in the other specific examples shown below.
The generation unit 2060 may display the distribution information described above superimposed on a map of the monitored location. In this case, the monitoring information generation apparatus 2000 includes a map information acquisition unit 2080. The map information acquisition unit 2080 acquires map information, which is information about the map of the vicinity of the monitored location. FIG. 14 is a block diagram illustrating the monitoring information generation apparatus 2000 including the map information acquisition unit 2080. The map information indicates, for example, the positions of sidewalks, roads, buildings, and the like. The map information is assumed to be stored in advance in, for example, a storage device accessible from the monitoring information generation apparatus 2000.
The generation unit 2060 may generate the distribution information described above using only the second monitoring images 22, without using the first monitoring image 12. In this case, for example, the generation unit 2060 calculates the number of objects shown in the second monitoring image 22 generated by each of a plurality of moving cameras 20, and generates the distribution information as a list of combinations of "the shooting range of the second monitoring image 22 and the number of objects". The generation unit 2060 then generates a heat map or the like using this distribution information and displays it superimposed on the first monitoring image 12 or the map 200.
FIG. 17 is a block diagram illustrating the monitoring information generation apparatus 2000 according to the second exemplary embodiment. In FIG. 17, each block represents a functional unit, not a hardware unit.
According to the monitoring information generation apparatus 2000 of the present exemplary embodiment, the shooting direction of the moving camera 20 is estimated using the first monitoring image 12 and the second monitoring image 22. Therefore, even when a device such as an electronic compass attached to the moving camera 20 cannot calculate the shooting direction of the moving camera 20 accurately, the shooting direction of the moving camera 20 can be grasped accurately.
FIG. 19 is a flowchart illustrating the flow of processing executed by the monitoring information generation apparatus 2000 of the second exemplary embodiment. The first movement direction estimation unit 2100 estimates the first movement direction using the first monitoring image 12 (S202). The second movement direction estimation unit 2120 estimates the second movement direction using the second monitoring image 22 (S204). The shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 based on the first movement direction, the second movement direction, the position and orientation of the fixed camera 10, and the position of the moving camera 20 (S206).
The first movement direction estimation unit 2100 estimates the first movement direction, which is the movement direction of the crowd in the first monitoring image 12 (S202). There are various ways in which the first movement direction estimation unit 2100 can estimate the first movement direction. Methods of estimating the first movement direction are described below.
The first movement direction estimation unit 2100 calculates the optical flow of the pixels and feature points included in each of a plurality of first monitoring images 12 arranged in time series. FIG. 20 is a diagram illustrating optical flow calculated for the first monitoring image 12. Each arrow shown in FIG. 20 represents the optical flow calculated for the first monitoring image 12.
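As an illustration, the sketch below estimates a single dominant movement direction in the image plane from dense optical flow between two consecutive monitoring images. The Farneback parameters and the magnitude threshold used to ignore the static background are illustrative assumptions.

```python
import cv2
import numpy as np

def dominant_flow_direction(prev_gray, cur_gray, min_magnitude=0.5):
    """Return the mean direction (degrees, image plane) of the moving pixels,
    or None when nothing moves."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.hypot(flow[..., 0], flow[..., 1])
    moving = magnitude > min_magnitude          # ignore static background
    if not moving.any():
        return None
    mean_dx = float(flow[..., 0][moving].mean())
    mean_dy = float(flow[..., 1][moving].mean())
    return float(np.degrees(np.arctan2(mean_dy, mean_dx)))
```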
The first movement direction estimation unit 2100 detects an object appearing in common in a plurality of first monitoring images 12 arranged in time series, and estimates the first movement direction based on the change in the position of that object. FIG. 21 is a diagram illustrating a change in the positions of objects. In FIG. 21, the objects drawn with dotted lines appear in the t-th first monitoring image 12, and the objects drawn with solid lines appear in the (t+1)-th first monitoring image 12. The arrows represent the changes in the positions of the objects. The change in the position of an object is, for example, a vector connecting the centroids of a plurality of regions representing the same object.
Also, for example, the first movement direction estimation unit 2100 may detect the movement direction of the crowd based on the orientations of the objects shown in the first monitoring image 12. For example, when the objects are people or animals, the first movement direction estimation unit 2100 determines the orientations of their faces or bodies, and takes the direction in which the front of the face or body is facing as the first movement direction. When the objects are things such as cars, motorcycles, or flying objects, the first movement direction estimation unit 2100 determines the traveling direction of each object from the shape of the object shown in the first monitoring image 12, the positions of its various parts (such as bumpers and handlebars), and so on, and takes the determined traveling direction as the first movement direction.
The method by which the second movement direction estimation unit 2120 estimates the movement direction (the second movement direction) of the crowd shown in the second monitoring image 22 is the same as the method by which the first movement direction estimation unit 2100 estimates the first movement direction.
The crowd may appear blurred in the first monitoring image 12 or the second monitoring image 22. For example, when the fixed camera 10 or the moving camera 20 captures a moving crowd, the crowd may appear blurred in each monitoring image. Also, when the fixed camera 10 captures images while its orientation is being changed, or when the moving camera 20 captures images while its orientation or position is being changed, the crowd may appear blurred in each monitoring image.
The shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 based on the first movement direction, the second movement direction, the position and orientation of the fixed camera 10, and the position of the moving camera 20 (S206). FIG. 22 is a diagram for explaining the operation of the shooting direction estimation unit 2140. A specific process by which the shooting direction estimation unit 2140 estimates the shooting direction of the moving camera 20 is described below with reference to FIG. 22.
As described above, the shooting direction estimation unit 2140 uses the movement direction 80 to calculate the movement direction (the third movement direction), on the plane 70, of the objects captured by the moving camera 20. The example described above assumed that the shooting range of the fixed camera 10 and the shooting range of the moving camera 20 overlap. The case where the shooting range of the fixed camera 10 and the shooting range of the moving camera 20 do not overlap is described below.
1. A monitoring information generation apparatus comprising:
first acquisition means for acquiring a first monitoring image captured by a fixed camera, which is a camera whose position is fixed;
second acquisition means for acquiring a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and
generation means for generating monitoring information of an object using the first monitoring image and the second monitoring image.
2. The monitoring information generation apparatus according to 1., wherein the generation means generates, as the monitoring information, a display in which the second monitoring image is superimposed on the first monitoring image.
3. The monitoring information generation apparatus according to 2., wherein the generation means superimposes the second monitoring image at a position on the first monitoring image corresponding to the real-world position of the moving camera.
4. The monitoring information generation apparatus according to 2. or 3., wherein the generation means:
displays a mark representing the moving camera on the first monitoring image; and
when an operation selecting the mark is performed, generates, as the monitoring information, a display in which the second monitoring image generated by the moving camera corresponding to the mark is superimposed on the first monitoring image.
5. The monitoring information generation apparatus according to 1., wherein the generation means:
generates distribution information representing the distribution of the objects shown in the first monitoring image;
calculates the number of objects shown in the second monitoring image; and
corrects the distribution information using the number of objects shown in the second monitoring image, and generates the corrected distribution information as the monitoring information.
6. The monitoring information generation apparatus according to 5., wherein the generation means generates a display in which the corrected distribution information is superimposed on the first monitoring image.
7. The monitoring information generation apparatus according to 5., comprising map information acquisition means for acquiring map information of the monitored location,
wherein the generation means generates a display in which the corrected distribution information is superimposed on the map represented by the map information.
8. The monitoring information generation apparatus according to any one of 5. to 7., wherein the generation means:
calculates, using the shooting direction of the moving camera, the shooting range of the moving camera within the first monitoring image; and
corrects the number or distribution of the objects within the shooting range of the moving camera indicated by the distribution information, using the number of objects shown in the second monitoring image.
9. The monitoring information generation apparatus according to any one of 1. to 8., wherein the generation means superimposes a display representing the shooting direction of the moving camera on the first monitoring image.
10. The monitoring information generation apparatus according to 8. or 9., comprising:
first movement direction estimation means for estimating a first movement direction, which is the movement direction of an object in the first monitoring image;
second movement direction estimation means for estimating a second movement direction, which is the movement direction of an object in the second monitoring image; and
shooting direction estimation means for estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera,
wherein the generation means uses the estimated shooting direction of the moving camera as the shooting direction of the moving camera.
11. The monitoring information generation apparatus according to 10., wherein the shooting direction estimation means:
calculates, using the position and orientation of the fixed camera and the first movement direction, a third movement direction, which is the movement direction, on a plane obtained by viewing the monitored location in plan from the vertical direction, of the object shown in the second monitoring image;
calculates a plurality of candidate movement directions, each being the movement direction of an object moving in the third movement direction as seen from one of a plurality of candidate shooting directions at the position of the moving camera; and
estimates that the shooting direction of the moving camera is the candidate shooting direction corresponding to the candidate movement direction having the highest degree of coincidence with the second movement direction.
12. The monitoring information generation apparatus according to 11., wherein the shooting direction estimation means calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image, and sets the calculated movement direction as the third movement direction.
13. The monitoring information generation apparatus according to 11., comprising movement route information acquisition means for acquiring movement route information indicating a movement route of objects,
wherein the shooting direction estimation means:
calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image; and
sets, as the third movement direction, the direction obtained by moving the calculated movement direction along the movement route indicated by the movement route information to the vicinity of the moving camera.
14. A shooting direction estimation apparatus comprising:
first movement direction estimation means for estimating a first movement direction, which is the movement direction of an object in a first monitoring image captured by a fixed camera, which is a camera whose position is fixed;
second movement direction estimation means for estimating a second movement direction, which is the movement direction of an object in a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and
shooting direction estimation means for estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera.
15. The shooting direction estimation apparatus according to 14., wherein the shooting direction estimation means:
calculates, using the position and orientation of the fixed camera and the first movement direction, a third movement direction, which is the movement direction, on a plane obtained by viewing the monitored location in plan from the vertical direction, of the object shown in the second monitoring image;
calculates a plurality of candidate movement directions, each being the movement direction of an object moving in the third movement direction as seen from one of a plurality of candidate shooting directions at the position of the moving camera; and
estimates that the shooting direction of the moving camera is the candidate shooting direction corresponding to the candidate movement direction having the highest degree of coincidence with the second movement direction.
16. The shooting direction estimation apparatus according to 15., wherein the shooting direction estimation means calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image, and sets the calculated movement direction as the third movement direction.
17. The shooting direction estimation apparatus according to 15., comprising movement route information acquisition means for acquiring movement route information indicating a movement route of objects,
wherein the shooting direction estimation means:
calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image; and
sets, as the third movement direction, the direction obtained by moving the calculated movement direction along the movement route indicated by the movement route information to the vicinity of the moving camera.
18. A monitoring information generation method executed by a computer, comprising:
a first acquisition step of acquiring a first monitoring image captured by a fixed camera, which is a camera whose position is fixed;
a second acquisition step of acquiring a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and
a generation step of generating monitoring information of an object using the first monitoring image and the second monitoring image.
19. The monitoring information generation method according to 18., wherein the generation step generates, as the monitoring information, a display in which the second monitoring image is superimposed on the first monitoring image.
20. The monitoring information generation method according to 19., wherein the generation step superimposes the second monitoring image at a position on the first monitoring image corresponding to the real-world position of the moving camera.
21. The monitoring information generation method according to 19. or 20., wherein the generation step:
displays a mark representing the moving camera on the first monitoring image; and
when an operation selecting the mark is performed, generates, as the monitoring information, a display in which the second monitoring image generated by the moving camera corresponding to the mark is superimposed on the first monitoring image.
22. The monitoring information generation method according to 18., wherein the generation step:
generates distribution information representing the distribution of the objects shown in the first monitoring image;
calculates the number of objects shown in the second monitoring image; and
corrects the distribution information using the number of objects shown in the second monitoring image, and generates the corrected distribution information as the monitoring information.
23. The monitoring information generation method according to 22., wherein the generation step generates a display in which the corrected distribution information is superimposed on the first monitoring image.
24. The monitoring information generation method according to 22., comprising a map information acquisition step of acquiring map information of the monitored location,
wherein the generation step generates a display in which the corrected distribution information is superimposed on the map represented by the map information.
25. The monitoring information generation method according to any one of 22. to 24., wherein the generation step:
calculates, using the shooting direction of the moving camera, the shooting range of the moving camera within the first monitoring image; and
corrects the number or distribution of the objects within the shooting range of the moving camera indicated by the distribution information, using the number of objects shown in the second monitoring image.
26. The monitoring information generation method according to any one of 18. to 25., wherein the generation step superimposes a display representing the shooting direction of the moving camera on the first monitoring image.
27. The monitoring information generation method according to 25. or 26., comprising:
a first movement direction estimation step of estimating a first movement direction, which is the movement direction of an object in the first monitoring image;
a second movement direction estimation step of estimating a second movement direction, which is the movement direction of an object in the second monitoring image; and
a shooting direction estimation step of estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera,
wherein the generation step uses the estimated shooting direction of the moving camera as the shooting direction of the moving camera.
28. The monitoring information generation method according to 27., wherein the shooting direction estimation step:
calculates, using the position and orientation of the fixed camera and the first movement direction, a third movement direction, which is the movement direction, on a plane obtained by viewing the monitored location in plan from the vertical direction, of the object shown in the second monitoring image;
calculates a plurality of candidate movement directions, each being the movement direction of an object moving in the third movement direction as seen from one of a plurality of candidate shooting directions at the position of the moving camera; and
estimates that the shooting direction of the moving camera is the candidate shooting direction corresponding to the candidate movement direction having the highest degree of coincidence with the second movement direction.
29. The monitoring information generation method according to 28., wherein the shooting direction estimation step calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image, and sets the calculated movement direction as the third movement direction.
30. The monitoring information generation method according to 28., comprising a movement route information acquisition step of acquiring movement route information indicating a movement route of objects,
wherein the shooting direction estimation step:
calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image; and
sets, as the third movement direction, the direction obtained by moving the calculated movement direction along the movement route indicated by the movement route information to the vicinity of the moving camera.
31. A shooting direction estimation method executed by a computer, comprising:
a first movement direction estimation step of estimating a first movement direction, which is the movement direction of an object in a first monitoring image captured by a fixed camera, which is a camera whose position is fixed;
a second movement direction estimation step of estimating a second movement direction, which is the movement direction of an object in a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and
a shooting direction estimation step of estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera.
32. The shooting direction estimation method according to 31., wherein the shooting direction estimation step:
calculates, using the position and orientation of the fixed camera and the first movement direction, a third movement direction, which is the movement direction, on a plane obtained by viewing the monitored location in plan from the vertical direction, of the object shown in the second monitoring image;
calculates a plurality of candidate movement directions, each being the movement direction of an object moving in the third movement direction as seen from one of a plurality of candidate shooting directions at the position of the moving camera; and
estimates that the shooting direction of the moving camera is the candidate shooting direction corresponding to the candidate movement direction having the highest degree of coincidence with the second movement direction.
33. The shooting direction estimation method according to 32., wherein the shooting direction estimation step calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image, and sets the calculated movement direction as the third movement direction.
34. The shooting direction estimation method according to 32., comprising a movement route information acquisition step of acquiring movement route information indicating a movement route of objects,
wherein the shooting direction estimation step:
calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image; and
sets, as the third movement direction, the direction obtained by moving the calculated movement direction along the movement route indicated by the movement route information to the vicinity of the moving camera.
35. A program causing a computer to execute each step according to any one of 18. to 34.
Claims (17)
- 1. A monitoring information generation apparatus comprising: first acquisition means for acquiring a first monitoring image captured by a fixed camera, which is a camera whose position is fixed; second acquisition means for acquiring a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and generation means for generating monitoring information of an object using the first monitoring image and the second monitoring image.
- 2. The monitoring information generation apparatus according to claim 1, wherein the generation means generates, as the monitoring information, a display in which the second monitoring image is superimposed on the first monitoring image.
- 3. The monitoring information generation apparatus according to claim 2, wherein the generation means superimposes the second monitoring image at a position on the first monitoring image corresponding to the real-world position of the moving camera.
- 4. The monitoring information generation apparatus according to claim 2 or 3, wherein the generation means displays a mark representing the moving camera on the first monitoring image and, when an operation selecting the mark is performed, generates, as the monitoring information, a display in which the second monitoring image generated by the moving camera corresponding to the mark is superimposed on the first monitoring image.
- 5. The monitoring information generation apparatus according to claim 1, wherein the generation means generates distribution information representing the distribution of the objects shown in the first monitoring image, calculates the number of objects shown in the second monitoring image, corrects the distribution information using the number of objects shown in the second monitoring image, and generates the corrected distribution information as the monitoring information.
- 6. The monitoring information generation apparatus according to claim 5, wherein the generation means generates a display in which the corrected distribution information is superimposed on the first monitoring image.
- 7. The monitoring information generation apparatus according to claim 5, comprising map information acquisition means for acquiring map information of the monitored location, wherein the generation means generates a display in which the corrected distribution information is superimposed on the map represented by the map information.
- 8. The monitoring information generation apparatus according to any one of claims 5 to 7, wherein the generation means calculates, using the shooting direction of the moving camera, the shooting range of the moving camera within the first monitoring image, and corrects the number or distribution of the objects within the shooting range of the moving camera indicated by the distribution information, using the number of objects shown in the second monitoring image.
- 9. The monitoring information generation apparatus according to any one of claims 1 to 8, wherein the generation means superimposes a display representing the shooting direction of the moving camera on the first monitoring image.
- 10. The monitoring information generation apparatus according to claim 8 or 9, comprising: first movement direction estimation means for estimating a first movement direction, which is the movement direction of an object in the first monitoring image; second movement direction estimation means for estimating a second movement direction, which is the movement direction of an object in the second monitoring image; and shooting direction estimation means for estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera, wherein the generation means uses the estimated shooting direction of the moving camera as the shooting direction of the moving camera.
- 11. The monitoring information generation apparatus according to claim 10, wherein the shooting direction estimation means calculates, using the position and orientation of the fixed camera and the first movement direction, a third movement direction, which is the movement direction, on a plane obtained by viewing the monitored location in plan from the vertical direction, of the object shown in the second monitoring image; calculates a plurality of candidate movement directions, each being the movement direction of an object moving in the third movement direction as seen from one of a plurality of candidate shooting directions at the position of the moving camera; and estimates that the shooting direction of the moving camera is the candidate shooting direction corresponding to the candidate movement direction having the highest degree of coincidence with the second movement direction.
- 12. The monitoring information generation apparatus according to claim 11, wherein the shooting direction estimation means calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image, and sets the calculated movement direction as the third movement direction.
- 13. The monitoring information generation apparatus according to claim 11, comprising movement route information acquisition means for acquiring movement route information indicating a movement route of objects, wherein the shooting direction estimation means calculates, using the position and orientation of the fixed camera and the first movement direction, the movement direction, on the plane, of the object shown in the first monitoring image, and sets, as the third movement direction, the direction obtained by moving the calculated movement direction along the movement route indicated by the movement route information to the vicinity of the moving camera.
- 14. A shooting direction estimation apparatus comprising: first movement direction estimation means for estimating a first movement direction, which is the movement direction of an object in a first monitoring image captured by a fixed camera, which is a camera whose position is fixed; second movement direction estimation means for estimating a second movement direction, which is the movement direction of an object in a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and shooting direction estimation means for estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera.
- 15. A monitoring information generation method executed by a computer, comprising: a first acquisition step of acquiring a first monitoring image captured by a fixed camera, which is a camera whose position is fixed; a second acquisition step of acquiring a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and a generation step of generating monitoring information of an object using the first monitoring image and the second monitoring image.
- 16. A shooting direction estimation method executed by a computer, comprising: a first movement direction estimation step of estimating a first movement direction, which is the movement direction of an object in a first monitoring image captured by a fixed camera, which is a camera whose position is fixed; a second movement direction estimation step of estimating a second movement direction, which is the movement direction of an object in a second monitoring image captured by a moving camera, which is a camera whose position is not fixed; and a shooting direction estimation step of estimating the shooting direction of the moving camera based on the first movement direction, the second movement direction, the position and orientation of the fixed camera, and the position of the moving camera.
- 17. A program causing a computer to execute each step according to claim 15 or 16.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017537577A JP6741009B2 (ja) | 2015-09-01 | 2016-05-09 | 監視情報生成装置、撮影方向推定装置、監視情報生成方法、撮影方向推定方法、及びプログラム |
US15/754,613 US10748010B2 (en) | 2015-09-01 | 2016-05-09 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
US16/406,705 US10579881B2 (en) | 2015-09-01 | 2019-05-08 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
US16/406,730 US10977499B2 (en) | 2015-09-01 | 2019-05-08 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
US17/214,018 US11710322B2 (en) | 2015-09-01 | 2021-03-26 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
US18/204,754 US20230326213A1 (en) | 2015-09-01 | 2023-06-01 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-172082 | 2015-09-01 | ||
JP2015172082 | 2015-09-01 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/754,613 A-371-Of-International US10748010B2 (en) | 2015-09-01 | 2016-05-09 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
US16/406,705 Continuation US10579881B2 (en) | 2015-09-01 | 2019-05-08 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
US16/406,730 Continuation US10977499B2 (en) | 2015-09-01 | 2019-05-08 | Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017038160A1 true WO2017038160A1 (ja) | 2017-03-09 |
Family
ID=58187139
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/063720 WO2017038160A1 (ja) | 2015-09-01 | 2016-05-09 | 監視情報生成装置、撮影方向推定装置、監視情報生成方法、撮影方向推定方法、及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (5) | US10748010B2 (ja) |
JP (3) | JP6741009B2 (ja) |
WO (1) | WO2017038160A1 (ja) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6812976B2 (ja) * | 2015-09-02 | 2021-01-13 | 日本電気株式会社 | 監視システム、監視ネットワーク構築方法、およびプログラム |
KR102500838B1 (ko) * | 2016-01-11 | 2023-02-16 | 한화테크윈 주식회사 | 감시 영역 기반 경로 제공 방법 및 이를 위한 장치 |
WO2017122258A1 (ja) * | 2016-01-12 | 2017-07-20 | 株式会社日立国際電気 | 混雑状況監視システム |
JP6849430B2 (ja) * | 2016-12-27 | 2021-03-24 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
JP6918523B2 (ja) * | 2017-03-06 | 2021-08-11 | キヤノン株式会社 | 情報処理システム、情報処理装置、情報処理方法ならびに情報処理方法をコンピュータに実行させるプログラム |
WO2019130827A1 (ja) * | 2017-12-25 | 2019-07-04 | キヤノン株式会社 | 画像処理装置およびその制御方法 |
JP2019176306A (ja) * | 2018-03-28 | 2019-10-10 | キヤノン株式会社 | 監視システム、監視システムの制御方法及びプログラム |
JP7200493B2 (ja) * | 2018-03-29 | 2023-01-10 | 京セラドキュメントソリューションズ株式会社 | 監視システム |
KR102526113B1 (ko) * | 2018-05-31 | 2023-04-26 | 삼성에스디에스 주식회사 | 혼잡도 시각화 장치 및 방법 |
JP7143661B2 (ja) * | 2018-07-24 | 2022-09-29 | トヨタ自動車株式会社 | 情報処理システム、プログラム、及び制御方法 |
US10958854B2 (en) * | 2018-11-02 | 2021-03-23 | BriefCam Ltd. | Computer-implemented method for generating an output video from multiple video sources |
US10796725B2 (en) * | 2018-11-06 | 2020-10-06 | Motorola Solutions, Inc. | Device, system and method for determining incident objects in secondary video |
US11557194B2 (en) * | 2020-02-24 | 2023-01-17 | Intrado Corporation | Integrated emergency response and data network |
JP7482011B2 (ja) * | 2020-12-04 | 2024-05-13 | 株式会社東芝 | 情報処理システム |
US20220210376A1 (en) * | 2020-12-30 | 2022-06-30 | Honeywell International Inc. | Methods and systems for providing security monitoring of a procession as the procession moves along a procession route |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002084531A (ja) * | 2000-09-08 | 2002-03-22 | Nippon Telegr & Teleph Corp <Ntt> | 遠隔操縦移動型監視ロボットシステム |
JP2005268972A (ja) * | 2004-03-17 | 2005-09-29 | Matsushita Electric Ind Co Ltd | 映像表示システム、及び映像表示方法 |
JP2006331260A (ja) * | 2005-05-30 | 2006-12-07 | Sabo Frontier Foundation | 位置方位付き写真提供システム及びそのプログラム |
JP2010245628A (ja) * | 2009-04-01 | 2010-10-28 | Mitsubishi Electric Corp | カメラ校正装置 |
JP2011076316A (ja) * | 2009-09-30 | 2011-04-14 | Fujifilm Corp | 群衆監視装置および方法ならびにプログラム |
JP2011176452A (ja) * | 2010-02-23 | 2011-09-08 | Mitsubishi Electric Corp | 映像監視システム及び監視映像表示装置 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003329462A (ja) | 2002-05-08 | 2003-11-19 | Hitachi Ltd | 映像配信装置および映像情報配信システム |
JP3870124B2 (ja) * | 2002-06-14 | 2007-01-17 | キヤノン株式会社 | 画像処理装置及びその方法、並びにコンピュータプログラム及びコンピュータ可読記憶媒体 |
JP3946593B2 (ja) * | 2002-07-23 | 2007-07-18 | 株式会社エヌ・ティ・ティ・データ | 共同撮影システム |
JP4239621B2 (ja) | 2003-03-11 | 2009-03-18 | 株式会社明電舎 | 混雑度調査装置 |
JP4469148B2 (ja) | 2003-08-06 | 2010-05-26 | パナソニック株式会社 | 監視システム、固定カメラ、移動カメラおよび撮影方法 |
JP2005175852A (ja) | 2003-12-10 | 2005-06-30 | Canon Inc | 撮影装置及び撮影装置の制御方法 |
JP2007068010A (ja) * | 2005-09-01 | 2007-03-15 | Matsushita Electric Ind Co Ltd | 映像監視システム |
JP5052003B2 (ja) | 2005-12-20 | 2012-10-17 | パナソニック株式会社 | 情報配信システム |
JP2007201556A (ja) | 2006-01-23 | 2007-08-09 | Fujifilm Corp | 混雑情報提供システム及び方法 |
JP4901233B2 (ja) * | 2006-02-14 | 2012-03-21 | 株式会社日立製作所 | 監視システム、監視方法、及び、監視プログラム |
JP5056359B2 (ja) * | 2007-11-02 | 2012-10-24 | ソニー株式会社 | 情報表示装置、情報表示方法および撮像装置 |
JP2009188740A (ja) | 2008-02-06 | 2009-08-20 | Nec Corp | 監視システムおよび監視方法 |
JP5431695B2 (ja) * | 2008-08-19 | 2014-03-05 | 三菱電機ビルテクノサービス株式会社 | 建物設備保守用の表示装置 |
JP4547040B1 (ja) * | 2009-10-27 | 2010-09-22 | パナソニック株式会社 | 表示画像切替装置及び表示画像切替方法 |
JP5213883B2 (ja) | 2010-01-19 | 2013-06-19 | 三菱電機株式会社 | 合成表示装置 |
JP2013030924A (ja) * | 2011-07-27 | 2013-02-07 | Jvc Kenwood Corp | カメラ制御装置、カメラ制御方法及びカメラ制御プログラム |
WO2014174737A1 (ja) | 2013-04-26 | 2014-10-30 | 日本電気株式会社 | 監視装置、監視方法および監視用プログラム |
US20160182849A1 (en) * | 2013-08-09 | 2016-06-23 | Panasonic Intellectual Property Management Co., Ltd. | Wireless camera system, central device, image display method, and image display program |
GB2582512B (en) | 2017-12-15 | 2022-03-30 | Motorola Solutions Inc | Device, system and method for crowd control |
- 2016-05-09: JP application JP2017537577A filed (granted as patent JP6741009B2, active); PCT application PCT/JP2016/063720 filed; US application 15/754,613 filed (granted as US10748010B2, active)
- 2019-05-08: US application 16/406,705 filed (granted as US10579881B2, active); US application 16/406,730 filed (granted as US10977499B2, active)
- 2020-07-20: JP application JP2020123492 filed (granted as JP7371924B2, active)
- 2021-03-26: US application 17/214,018 filed (granted as US11710322B2, active)
- 2022-11-14: JP application JP2022181636 filed (granted as JP7480823B2, active)
- 2023-06-01: US application 18/204,754 filed (pending)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002084531A (ja) * | 2000-09-08 | 2002-03-22 | Nippon Telegr & Teleph Corp <Ntt> | 遠隔操縦移動型監視ロボットシステム |
JP2005268972A (ja) * | 2004-03-17 | 2005-09-29 | Matsushita Electric Ind Co Ltd | 映像表示システム、及び映像表示方法 |
JP2006331260A (ja) * | 2005-05-30 | 2006-12-07 | Sabo Frontier Foundation | 位置方位付き写真提供システム及びそのプログラム |
JP2010245628A (ja) * | 2009-04-01 | 2010-10-28 | Mitsubishi Electric Corp | カメラ校正装置 |
JP2011076316A (ja) * | 2009-09-30 | 2011-04-14 | Fujifilm Corp | 群衆監視装置および方法ならびにプログラム |
JP2011176452A (ja) * | 2010-02-23 | 2011-09-08 | Mitsubishi Electric Corp | 映像監視システム及び監視映像表示装置 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019180032A (ja) * | 2018-03-30 | 2019-10-17 | セコム株式会社 | 監視システム及び監視装置 |
JP7123604B2 (ja) | 2018-03-30 | 2022-08-23 | セコム株式会社 | 監視システム及び監視装置 |
JP2021043866A (ja) * | 2019-09-13 | 2021-03-18 | キヤノン株式会社 | 画像解析装置、画像解析方法、及びプログラム |
JP7443002B2 (ja) | 2019-09-13 | 2024-03-05 | キヤノン株式会社 | 画像解析装置、画像解析方法、及びプログラム |
WO2021144874A1 (ja) * | 2020-01-15 | 2021-07-22 | 日本電信電話株式会社 | 撮影範囲推定装置、撮影範囲推定方法およびプログラム |
JPWO2021144874A1 (ja) * | 2020-01-15 | 2021-07-22 | ||
JP7243867B2 (ja) | 2020-01-15 | 2023-03-22 | 日本電信電話株式会社 | 撮影範囲推定装置、撮影範囲推定方法およびプログラム |
JP7359922B1 (ja) | 2022-09-26 | 2023-10-11 | 株式会社デンソーテン | 情報処理装置、情報処理方法およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20180247135A1 (en) | 2018-08-30 |
JP2023016837A (ja) | 2023-02-02 |
JPWO2017038160A1 (ja) | 2018-06-14 |
JP7480823B2 (ja) | 2024-05-10 |
US20230326213A1 (en) | 2023-10-12 |
US20190266415A1 (en) | 2019-08-29 |
JP7371924B2 (ja) | 2023-10-31 |
US11710322B2 (en) | 2023-07-25 |
US10748010B2 (en) | 2020-08-18 |
US20210216789A1 (en) | 2021-07-15 |
US20190318172A1 (en) | 2019-10-17 |
JP2020184795A (ja) | 2020-11-12 |
JP6741009B2 (ja) | 2020-08-19 |
US10977499B2 (en) | 2021-04-13 |
US10579881B2 (en) | 2020-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7480823B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP6687204B2 (ja) | 投影イメージ生成方法およびその装置、イメージピクセルと深度値との間のマッピング方法 | |
US10061486B2 (en) | Area monitoring system implementing a virtual environment | |
CN111081199B (zh) | 选择用于显示的时间分布的全景图像 | |
US20170094227A1 (en) | Three-dimensional spatial-awareness vision system | |
EP3044726B1 (en) | Landmark identification from point cloud generated from geographic imagery data | |
JP5956248B2 (ja) | 画像監視装置 | |
JP2013171523A (ja) | Ar画像処理装置及び方法 | |
KR20130110156A (ko) | 이미 존재하는 정지 이미지 내의 비디오의 시각화 | |
JP6126501B2 (ja) | カメラ設置シミュレータ及びそのコンピュータプログラム | |
US11630857B2 (en) | User interaction event data capturing system for use with aerial spherical imagery | |
JP2016164518A (ja) | 屋内位置情報測位システム及び屋内位置情報測位方法 | |
KR101523643B1 (ko) | 감시 시스템에서 카메라 제어를 위한 시스템 및 방법 | |
CN113610702A (zh) | 一种建图方法、装置、电子设备及存储介质 | |
KR101700651B1 (ko) | 위치정보 기반의 공유 경로데이터를 이용한 객체 트래킹 장치 | |
KR101036107B1 (ko) | 고유식별 정보를 이용한 증강 현실 구현시스템 | |
JP5649842B2 (ja) | 情報提供装置、情報提供方法、及びプログラム | |
JP7274298B2 (ja) | 撮像装置、撮像方法及び撮像プログラム | |
JP2014182628A (ja) | 空間重要度判定装置、空間重要度判定方法、及びプログラム | |
KR20220050722A (ko) | 영상관제 시스템에서 카메라 영상내 관제 지점의 지도 매핑 방법 | |
JP2024070133A (ja) | 目標認識方法、プログラム及び装置 | |
US9769373B2 (en) | Situation comprehending apparatus, situation comprehending method, and program for situation comprehension | |
JP5168313B2 (ja) | 画像表示装置 | |
CN112985372A (zh) | 路径规划系统及其方法 | |
Watanabe et al. | A Walkthrough System to Display Video Corresponding to the Viewer's Face Orientation |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16841192; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2017537577; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 15754613; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16841192; Country of ref document: EP; Kind code of ref document: A1