WO2014203389A1 - Sensor placement determination device and sensor placement determination method
- Publication number
- WO2014203389A1 (PCT/JP2013/067047)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- human flow
- sensor
- candidate
- trajectory
- estimation
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/53—Recognition of crowd images, e.g. recognition of crowd congestion
Definitions
- the present invention relates to a sensor arrangement determining apparatus and a sensor arrangement determining method for determining an arrangement position of a sensor such as a camera for measuring a human flow.
- There is an increasing need for human flow measurement using sensors such as cameras. For example, cameras are used to measure human flow in order to grasp congestion inside a station. Human flow measurement may also be needed in a department store or similar venue in order to mitigate congestion. In such cases, the number of cameras to be installed must be kept to the minimum necessary because of equipment cost constraints.
- Patent Document 1 describes a method for automatically creating a camera arrangement that satisfies a customer request based on a customer request including a monitoring layout and supporting the determination of the camera arrangement.
- Non-Patent Document 1 describes a method for reducing an estimation error of a human flow generated by simulation based on partial human flow measurement data.
- Patent Document 1 focuses on camera functions such as motion detection and face detection as part of the customer request including the monitoring layout, and integrates cameras that require similar functions; it is not, however, directed to human flow measurement.
- Although Non-Patent Document 1 is directed to human flow measurement, it aims to reduce the estimation error of a simulated human flow and is not intended to determine a camera position that can accurately measure the human flow.
- The present invention solves the above problems, and its object is to provide a sensor placement determination device and a sensor placement determination method capable of evaluating sensor placement positions from the viewpoint of human flow measurement accuracy and of determining a sensor placement position suitable for human flow measurement.
- The sensor placement determination device of the present invention comprises: storage means (for example, the external storage device 15) that stores human flow data (for example, the record 62), which is an actual trajectory (for example, the trajectory 307) of a person actually passing through a target area (for example, the target area 300) in which a plurality of sensors (for example, the camera 314) for observing human flow are to be placed; human flow simulation observation means (for example, the human flow simulation observation unit 100) that, for candidate sensor placement positions in the target area, generates in a simulated manner the observation information each sensor would observe, based on the human flow data; human flow estimation means (for example, the human flow estimation unit 102) that estimates and generates trajectory candidates (for example, the trajectory candidates 702 and 704) describing how a person passes through the target area, with the observation information generated by the human flow simulation observation means as a constraint condition; estimation error calculation means (for example, the estimation error calculation unit 104) that calculates the estimation error of the human flow estimation means by comparing, for each candidate sensor placement position, the number of observed people calculated from the trajectory candidates with the number of observed people calculated from the human flow data; and sensor position determination means (for example, the additional camera determination unit 106) that selects, from among the plurality of candidate sensor placement positions, a position for which the estimation error calculated by the estimation error calculation means is small, and determines it as the sensor placement position.
- According to the present invention, a sensor placement position can be evaluated from the viewpoint of human flow measurement accuracy, and a placement position suitable for human flow measurement can be determined.
- Embodiments according to the sensor arrangement determination device of the present invention will be described in detail with reference to the drawings.
- A sensor capable of measuring human flow, such as a line sensor or a laser scanner, may also be used instead of a camera.
- the sensor detects the passage of people and the direction of passage.
- FIG. 1 is a diagram showing an overall configuration of a sensor arrangement determining apparatus according to the present invention.
- The sensor placement determination device 10 includes a human flow simulation observation unit 100 (human flow simulation observation means, see FIG. 5), a human flow estimation unit 102 (human flow estimation means, see FIGS. 6 and 7), an estimation error calculation unit 104 (estimation error calculation means, see FIG. 8), and an additional camera determination unit 106 (sensor position determination means, see FIG. 10).
- The human flow simulation observation unit 100 generates, in a simulated manner, passage information, that is, flow information (for example, the flow information 502 shown in FIG. 5), that would be obtained when the human flow true value data 108 is observed at a virtual camera position (for example, the observation position of the camera 314 shown in FIG. 5) determined by the additional camera determination unit 106 described later.
- The flow information is local information obtained when the human flow true value data 108, which is global information, is observed by discretely arranged cameras; here it is information on the number of people passing through a camera's field of view. Details will be described later with reference to FIG. 5.
- The human flow true value data 108 means a data set including a plurality of human flow data (individual movement trajectories) that serve as the reference when evaluating the estimation error of the human flow.
- Human flow data is represented by a set of trajectory points that represent the route that a person actually travels in a building or on the street.
- the true value means a correct value actually measured.
- the human flow data is a set of trajectory points tracking a person, and may be measured using, for example, a laser scanner.
- a laser scanner is a device that acquires three-dimensional coordinate data of surrounding objects by using a time until an irradiated laser beam is reflected by the object and returns.
- the human flow true value data 108 may be measured by detecting and tracking a person from a number of video camera images.
- the human flow true value data 108 may be created manually.
- human flow data may be created by human flow simulation. A method for acquiring human flow data will be described later with reference to FIGS.
- The human flow estimation unit 102 generates, by agent simulation described later, human flow data that closely satisfies, as constraint conditions, the flow information generated by the human flow simulation observation unit 100.
- the estimation error calculation unit 104 compares the human flow data estimated by the human flow estimation unit 102 with the human flow true value data 108 to calculate an estimation error.
- The additional camera determination unit 106 determines the placement position (installation position) of a new camera to be added to the group of cameras whose installation is already fixed by the camera installation information 110, so that the human flow estimation error calculated by the estimation error calculation unit 104 is minimized, and stores the result in the camera installation information 110.
- FIG. 11 is a diagram illustrating a hardware configuration of the sensor arrangement determination device.
- The sensor placement determination device 10 includes a display device 11, an input device 12, a central processing unit (CPU) 13, a communication control device 14, an external storage device 15 (storage means), a memory 16, and a bus 17 connecting them.
- the display device 11 is a display or the like, and displays an execution status, an execution result, and the like of processing performed by the sensor arrangement determination device 10.
- the input device 12 is a device for inputting an instruction to a computer such as a keyboard and a mouse, and inputs an instruction for starting a program.
- the central processing unit (CPU) 13 executes various programs stored in the memory.
- the communication control device 14 exchanges various data and commands with other devices via a LAN (Local Area Network) 18.
- the external storage device 15 stores various data for the sensor placement determination device 10 to execute processing.
- the memory 16 holds various programs executed by the sensor arrangement determination device 10 and temporary data.
- The memory 16 stores processing programs for the human flow simulation observation unit 100, the human flow estimation unit 102, the estimation error calculation unit 104, the additional camera determination unit 106, and a human flow data measurement unit 120 that measures the human flow data included in the human flow true value data 108.
- the external storage device 15 stores human flow true value data 108 and camera installation information 110.
- FIG. 2 is a diagram showing a processing flow of the entire apparatus of the sensor arrangement determination device. With reference to FIG. 1 and FIG. 2, the flow of processing of the sensor arrangement determination device 10 will be described. The processing flow is shown in a PAD (Problem-Analysis-Diagram) format.
- In step S200, the human flow simulation observation unit 100 or the estimation error calculation unit 104 reads the human flow true value data 108, normally stored as a file, into the memory 16 (see FIG. 11).
- In step S202, the additional camera determination unit 106 repeats the processing from step S204 to step S208 for each assumed installation position of the virtual additional camera.
- In step S204, the human flow simulation observation unit 100 generates, in a simulated manner, the flow information of people that would occur when the human flow true value data 108 is observed at the assumed additional camera installation position.
- In step S206, the human flow estimation unit 102 estimates the human flow by generating, through the agent simulation described later, human flow data that closely satisfies the flow information generated in step S204 as constraint conditions.
- In step S208, the estimation error calculation unit 104 compares the human flow data estimated in step S206 with the human flow true value data 108 to calculate the human flow estimation error.
- In step S210, the additional camera determination unit 106 selects, from the additional camera installation positions iterated over in step S202, the one that minimizes the estimation error calculated in step S208, and determines it as the installation position of the additional camera.
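The loop of steps S200 to S210 can be sketched as follows. This is a minimal illustration only: `observe`, `estimate`, and `error` stand in for the processing units 100, 102, and 104 described in the text, and all names are assumptions, not from the patent.

```python
# Hypothetical sketch of the overall processing flow (steps S200-S210).
# observe/estimate/error are placeholders for the human flow simulation
# observation unit, the human flow estimation unit, and the estimation
# error calculation unit.

def choose_camera_position(true_data, candidate_positions,
                           observe, estimate, error):
    """Return the candidate position whose human-flow estimation error
    is smallest, together with that error (step S210)."""
    best_pos, best_err = None, float("inf")
    for pos in candidate_positions:            # step S202: iterate candidates
        flow_info = observe(true_data, pos)    # step S204: simulated observation
        estimated = estimate(flow_info)        # step S206: human flow estimation
        err = error(estimated, true_data)      # step S208: estimation error
        if err < best_err:
            best_pos, best_err = pos, err
    return best_pos, best_err
```

With the example errors given later in the text (E1 = 980, E2 = 500, E10 = 400), the function would select the region A10.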
- FIG. 3 is a diagram showing an example of use of the sensor placement determination device. With reference to FIG. 3, the usage form of the sensor placement determination device 10 will be described.
- The target area 300 is an area to be processed by the sensor placement determination device 10, such as a station concourse or a sales floor of a department store or supermarket.
- The trajectory 307 is a trajectory of one person who enters from the entrance/exit 302 and exits from the entrance/exit 306.
- The trajectory 308 is a trajectory of one person who enters from the entrance/exit 302 and exits from the entrance/exit 306.
- The trajectory 309 is a trajectory of one person who enters from the entrance/exit 302 and exits from the entrance/exit 304.
- The sensor placement determination device 10 can determine the camera position at which the estimation error is lowest when the human flow occurring in the entire target area 300 is estimated from information measured by a small number of cameras. For example, when the cameras 310, 312, and 314 are candidate positions for an additional camera, it is possible to evaluate which of them is best and decide accordingly.
- The human flow true value data 108 is preferably surveyed in advance, before the additional camera is examined, in order to grasp the human flow in the target area 300 to be processed.
- FIG. 12 is a diagram showing an example of a method for acquiring human flow data using a video camera.
- A plurality of video cameras 40 are installed in the target area 300 so as to minimize places the video cameras 40 cannot capture, that is, blind spots.
- The video cameras 40 are not permanent installations; they are used only to acquire the human flow data and are removed once the human flow data has been acquired.
- The human flow data measurement unit 120 tracks a moving person in the video by image processing, in cooperation with the plurality of video cameras 40 connected to the LAN 18 (see FIG. 11). For example, when the person 41 moves along the trajectory 42, the plurality of video cameras 40 that capture the person 41 cooperate to track the person 41 in the video.
- In this way, the human flow data can be obtained from the video information.
- When image processing is used, it is generally difficult to identify which person the video camera 40 has photographed.
- When the persons to be identified are limited, however, they can be identified by using a method of matching customers based on video features, as described, for example, in Japanese Patent Application Laid-Open No. 2000-200377.
- In this method, the video features of the persons to be identified are stored in a database in advance, and the person whose features best match those of the person photographed during human flow data measurement is identified.
- FIG. 13 is a diagram showing an example of human flow data obtained by a video camera.
- An example of human flow data will be described with reference to FIG. 13. Since the human flow data measurement unit 120 (see FIG. 11) detects the position of a moving object continuously at fixed time intervals, when a person moves, a series of successive points on the trajectory is measured. For example, when the person 41 moves along the trajectory 42, a point sequence constituting the human flow data, namely the points 50, 51, 52, 53, and 54, is measured.
- The human flow data approximately represents the trajectory by such a point sequence.
- To improve the approximation accuracy, a commonly used free-curve representation may be employed, such as shortening the measurement interval of the human flow data measurement unit 120 or applying spline interpolation.
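The point-sequence representation above can be illustrated with a small sketch. Linear interpolation is used here in place of the spline interpolation mentioned in the text; the function name and the `(time, x, y)` tuple layout are assumptions for illustration.

```python
# Illustrative sketch: a trajectory stored as a timestamped point
# sequence, with the position at an intermediate time recovered by
# linear interpolation between the surrounding measurements.

def interpolate_position(points, t):
    """points: list of (time, x, y) sorted by time.
    Returns (x, y) at time t by linear interpolation."""
    for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)        # fraction of the segment covered
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
    raise ValueError("t outside measured range")
```

A spline-based variant would replace the per-segment linear rule with a smooth curve fitted through the same points.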
- FIG. 14 is a diagram showing an example of the data structure of human flow true value data.
- the human flow true value data 108 is a table for storing a plurality of human flow data measured by the human flow data measuring unit 120. One measured human flow data is stored in each row or record of this table.
- The record 62 shows an example in which the human flow data for the trajectory 42 is stored. Each record includes a human flow data ID, which is a unique number identifying the measured human flow data, and the position information of the point sequence, namely the set of points on the human flow data, together with the time at which each point was measured.
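One record of the table in FIG. 14 might look as follows in memory. The field names and the `(time, x, y)` layout are assumptions for illustration; only the ID-plus-timestamped-point-sequence structure comes from the text.

```python
# Hypothetical in-memory shape for one row of the human flow true value
# data table (FIG. 14): a unique ID plus a timestamped point sequence.

human_flow_record = {
    "human_flow_data_id": 62,          # unique ID (cf. record 62)
    "points": [                        # (time, x, y), one entry per measurement
        ("08:00:00.0", 1.0, 2.0),
        ("08:00:00.1", 1.2, 2.1),
        ("08:00:00.2", 1.4, 2.3),
    ],
}
```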
- FIG. 4 is a diagram showing the installation position of the additional camera.
- the iterative process for the installation position of the additional camera shown in step S202 of FIG. 2 will be described.
- the processing from step S204 to step S208 is repeated for the assumed installation position of the additional camera.
- As the assumed position of the additional camera, any place in the target area 300 where a human flow occurs can be assumed.
- The number of candidates can, however, be reduced. For example, when each passage section without branches is treated as one region, the regions A1 to A10 are the targets. Wherever a camera is placed within one such region, the measured flow is the same, because it can be considered that no human flow is generated or disappears within the region. Accordingly, if one camera candidate position is considered at an arbitrary point in each region, no other places in that region need to be considered. Therefore, for each such region, a single point, such as its center of gravity, is taken as the camera candidate position.
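The reduction above, one candidate per branchless region placed at its center of gravity, can be sketched as follows. The segment representation (a list of walkable cells per region) and the function name are assumptions for illustration.

```python
# Sketch: map each branchless passage region to a single camera
# candidate point, using the region's centroid as suggested in the text.

def candidate_positions(regions):
    """regions: list of regions, each a list of (x, y) cells.
    Returns one centroid candidate per region."""
    out = []
    for cells in regions:
        cx = sum(x for x, _ in cells) / len(cells)   # mean x of the region
        cy = sum(y for _, y in cells) / len(cells)   # mean y of the region
        out.append((cx, cy))
    return out
```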
- the candidate for the installation position of the additional camera may be specified directly on the screen by using the mouse of the input device 12 (see FIG. 11), for example, by the user of the sensor arrangement determination device 10.
- FIG. 5 is a diagram illustrating an example of flow information generation of the human flow simulation observation unit.
- the flow information generation shown in step S204 of FIG. 2 will be described with reference to FIG.
- Based on the human flow true value data 108, the human flow simulation observation unit 100 generates, in a simulated manner, the passage information of people, that is, the flow information, that would occur when observation is performed at the additional camera installation position of step S202.
- The trajectory 309 is a trajectory that enters from the entrance/exit 302 and exits from the entrance/exit 304.
- the human flow simulation observation unit 100 generates the flow information 502 as a result of measurement by a virtually installed camera 314.
- the flow information 502 holds the number of people who pass through the field of view of the camera 314 for each predetermined time interval.
- The human flow true value data 108 contains, for each human flow data ID, the position information of the point sequence on the human flow data together with the time at which each point was measured. The human flow simulation observation unit 100 can therefore count, for each predetermined interval, the number of people passing through the field of view captured by the camera 314. Specifically, in the flow information 502, the number of people passing from the entrance/exit 302 side toward the entrance/exit 304 side along the trajectory 309 is 20 from 8:00 to 8:10, and 15 from 8:10 to 8:20. In this way, the human flow simulation observation unit 100 can generate flow information that changes from moment to moment.
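The per-interval counting just described can be sketched as below. Crossing events are assumed to be precomputed as `(time_in_seconds, direction)` pairs; the function name and event layout are assumptions, not from the patent.

```python
# Sketch of flow-information generation (cf. flow information 502):
# count, per fixed time bin (e.g. 10 minutes), how many field-of-view
# crossings occur in each direction.
from collections import Counter

def flow_information(crossings, bin_seconds=600):
    """crossings: iterable of (t, direction) with t in seconds.
    Returns a Counter keyed by (bin_index, direction)."""
    return Counter((int(t // bin_seconds), d) for t, d in crossings)
```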
- FIG. 6 is a diagram illustrating a process flow of the human flow estimation process of the human flow estimation unit.
- The human flow estimation unit 102 generates, by multi-agent simulation, human flow data that closely satisfies, as constraint conditions, the flow information 502 generated in step S204.
- Multi-agent simulation is a simulation method that analyzes the social behavior arising when multiple agents (actors; here, people), each given rules in advance, execute their rules simultaneously and interact with one another.
- The technique described in Non-Patent Document 1 may be used for estimating the human flow by multi-agent simulation. The outline of this technique is described below.
- In step S600, the human flow estimation unit 102 repeats the processing from step S602 to step S606 over a predetermined time.
- The predetermined time is the collection time of the human flow true value data 108. For example, if the collection time of the human flow true value data 108 is 30 minutes and the data collection interval is 1/10 second, the processing in step S600 is repeated for 30 minutes at intervals of 1/10 second.
- In step S602, the human flow estimation unit 102 generates an agent when, at the current simulation time, the human flow true value data 108 contains a person entering the target area 300 (see FIG. 3).
- In step S604, the human flow estimation unit 102 generates a plurality of trajectory candidates for the agent using a pedestrian model.
- the pedestrian model is a rule regarding agent movement.
- As the pedestrian model, a potential model that selects a route based on the cost of traveling to the destination may be used, for example.
- The potential model is a method that considers a potential surface of the movement cost from each point to the destination and moves a person in the direction in which the gradient of the potential surface is steepest.
- As the movement cost, the time required for the movement can be used, for example.
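A grid-based sketch of the potential model follows. Plain grid distance stands in for travel time, and all names are assumptions; the patent specifies only that the agent follows the steepest decrease of a cost-to-destination potential surface.

```python
# Sketch of the potential model: the cost-to-destination over a walkable
# grid defines the potential surface (computed by BFS here), and the
# agent greedily steps toward the steepest decrease.
from collections import deque

def potential_field(grid, goal):
    """grid: set of walkable (x, y) cells. Returns {cell: cost-to-goal}."""
    cost = {goal: 0}
    q = deque([goal])
    while q:
        x, y = q.popleft()
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb in grid and nb not in cost:
                cost[nb] = cost[(x, y)] + 1
                q.append(nb)
    return cost

def descend(start, cost):
    """Follow the steepest decrease of the potential down to the goal."""
    path, cur = [start], start
    while cost[cur] > 0:
        x, y = cur
        cur = min(((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)),
                  key=lambda nb: cost.get(nb, float("inf")))
        path.append(cur)
    return path
```

Generating several trajectory candidates then amounts to running `descend` once per candidate destination (entrance/exit).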
- Using this pedestrian model, the human flow estimation unit 102 generates, from the point at which the agent entered the target area 300, trajectory candidates toward the plurality of entrances/exits of the target area 300.
- In step S606, the human flow estimation unit 102 selects trajectory candidates consistent with the observation values by evaluating and weighting them with a data assimilation method according to the observation values.
- a data assimilation method a method using a particle filter described in Non-Patent Document 1 may be used.
- The particle filter is a kind of Bayesian filter. Based on Bayes' theorem, a Bayesian filter estimates a state by sequentially repeating a time update, which predicts the state at the next time, and an observation update, which corrects the predicted state using the observation information obtained by a sensor. The estimated state is expressed as a probability density function.
- The particle filter obtains an approximate solution of this probability density function by Monte Carlo approximation.
- Using the particle filter, the human flow estimation unit 102 adjusts the weights of the trajectory candidates (particles) generated in step S604 so that the flow information they generate at the observation position (for example, the position observed by the camera 314 shown in FIG. 5) matches as closely as possible the flow information of the additional camera obtained in step S204 (see FIG. 2).
- The initial weight W_0(i) of the i-th trajectory candidate (particle) is set to 1/n. Letting m_t be the flow information observed at time t, the weight W_t(i) of the i-th particle is updated by equation (1) to obtain the weight W_{t+1}(i) at time t+1. By repeating this update, the weights concentrate on trajectory candidates that satisfy the constraint conditions of the flow information.
- The trajectory candidates thus weighted to match the observation values constitute the estimated trajectory data.
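A sketch of one weight-update step follows. Equation (1) is not reproduced in this text, so a common likelihood-based update is assumed here: each candidate's weight is scaled by a Gaussian likelihood of the observed count m_t given the count the candidate predicts, then normalized. All names and the Gaussian choice are assumptions.

```python
# Sketch of a particle-weight observation update (one step of the
# iteration described in the text; the exact form of equation (1) is
# not given here, so a Gaussian likelihood is assumed).
import math

def update_weights(weights, predicted_counts, observed_count, sigma=1.0):
    """weights[i]: W_t(i); predicted_counts[i]: flow count candidate i
    would produce at the camera. Returns normalized W_{t+1}(i)."""
    likes = [math.exp(-((c - observed_count) ** 2) / (2 * sigma ** 2))
             for c in predicted_counts]
    new = [w * l for w, l in zip(weights, likes)]
    total = sum(new)
    return [w / total for w in new]          # normalize so weights sum to 1
```

Repeating this step for each observation interval shifts weight onto candidates whose simulated flow matches the camera's flow information.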
- a specific example will be described with reference to FIG.
- FIG. 7 is a diagram illustrating an example of a locus candidate generation process. With reference to FIG. 7, a specific example of the locus candidate generation processing in step S604 of FIG. 6 will be described.
- For example, the human flow estimation unit 102 generates two trajectory candidates, the trajectory candidate 702 and the trajectory candidate 704, for the person 700 entering from the entrance/exit 302.
- The trajectory candidate 702 is a trajectory having the entrance/exit 302 as its start point and the entrance/exit 306 as its end point.
- The trajectory candidate 704 is a trajectory having the entrance/exit 302 as its start point and the entrance/exit 304 as its end point. As end points, all entrances/exits may be selected, or end points may be selected according to a predetermined probability.
- After determining the start point and the end point, the human flow estimation unit 102 generates a trajectory candidate from the start point to the end point by simulation using the potential model described above. As shown in the table 706, each generated trajectory candidate is given an initial weight of 0.5, the reciprocal of the number of trajectory candidates, which is 2.
- Suppose that, after the update, the weight of the trajectory candidate 702 is 0.9 and the weight of the trajectory candidate 704 is 0.1. That is, it is estimated that the person 700 flows as 0.9 person along the trajectory candidate 702 and 0.1 person along the trajectory candidate 704.
- The trajectory candidates 702 and 704 shown in FIG. 7 are movement routes that the human flow estimation unit 102 generates by simulation when estimating, as described above, the path of one person from entering the target area 300 to exiting it.
- In contrast, the trajectories 307, 308, and 309 shown in FIG. 5 are movement routes along which people actually moved. That is, the trajectories 307, 308, and 309 are actual trajectories, as opposed to the estimated trajectory candidates 702 and 704.
- FIG. 8 is a diagram illustrating an evaluation processing flow of the human flow estimation error of the estimation error calculation unit.
- The estimation error calculation unit 104 calculates the estimation error by comparing the trajectory data estimated in step S206 with the trajectories of the human flow true value data 108.
- In step S800, the estimation error calculation unit 104 repeats the process of step S802 for all human flow data IDs (see FIG. 14) included in the human flow true value data 108.
- In step S802, the number of passing people per grid cell is calculated for the trajectory, represented as a point sequence, of each human flow data ID to be processed. A specific example of the per-grid passing count is described with reference to FIG. 9.
- FIG. 9 is a diagram showing an example of the process of counting the number of passing people for each grid.
- A specific example of the per-grid passing-count calculation shown in steps S802 and S806 of FIG. 8 will be described.
- The target area 300 is divided into a grid with a predetermined number of divisions, and each grid cell holds information on the number of people passing through it.
- For example, when a trajectory passes through the grid cell 902, the passing count of the grid cell 902 is increased by 1.
- In step S802, the estimation error calculation unit 104 calculates, for the target area 300 divided into grid cells, the cumulative number of trajectories passing through each cell.
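The per-grid counting of steps S802 and S806 can be sketched as follows; the cell size, trajectory representation, and function name are assumptions for illustration.

```python
# Sketch of the per-grid passing count: each trajectory increments the
# count of every grid cell it visits, at most once per trajectory.
from collections import Counter

def grid_counts(trajectories, cell=1.0):
    """trajectories: list of point lists [(x, y), ...].
    Returns a Counter: grid cell -> number of trajectories through it."""
    counts = Counter()
    for traj in trajectories:
        cells = {(int(x // cell), int(y // cell)) for x, y in traj}
        counts.update(cells)       # a set, so one count per trajectory
    return counts
```

Running this once on the true-value trajectories (S802) and once on the estimated trajectory data (S806) yields the two count maps compared in step S808.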
- In step S804, the estimation error calculation unit 104 repeats the process of step S806 for all the trajectory data estimated in step S206.
- In step S806, the estimation error calculation unit 104 calculates the number of passing people per grid cell for the trajectory data to be processed, as in step S802.
- In step S808, the error is calculated by comparing, for each grid cell, the passing count of the human flow true value data 108 with the passing count of the trajectory data estimated in step S206.
- The root mean square error is used as the error. The root mean square error RMSE is expressed by equation (2):
- RMSE = √( (1/n) Σ_{i=1}^{n} (r_i − e_i)² ) … (2)
- where n is the number of grid cells, r_i is the passing count of the human flow true value data 108 for the i-th cell, and e_i is the passing count of the estimated trajectory data for the i-th cell.
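Equation (2) translates directly into code, with `true_counts` and `est_counts` holding r_i and e_i over the n grid cells:

```python
# RMSE of equation (2) over the n grid cells: r_i are the true-value
# counts, e_i the estimated counts.
import math

def rmse(true_counts, est_counts):
    n = len(true_counts)
    return math.sqrt(sum((r - e) ** 2
                         for r, e in zip(true_counts, est_counts)) / n)
```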
- FIG. 10 is a diagram illustrating an example of a process for determining the installation position of the additional camera in the target area.
- The numerical value shown at the lower right of each candidate region (regions A1 to A10) for the installation position of the additional camera is the root mean square error RMSE calculated in step S208.
- the error E1 when the camera is added to the area A1 corresponding to the camera 310 is 980.
- the error E2 when a camera is added to the area A2 is 500.
- The additional camera determination unit 106 compares the errors calculated for all the regions and determines the region with the smallest error, and the display device 11 (see FIG. 11) displays the determined sensor placement position on the target area 300 shown in FIG. 10.
- In this example, the error E10 of the region A10 is the smallest, at 400. Therefore, the camera 314 corresponding to the region A10 can be determined as the installation position of the additional camera. The same determination can be made when, for example, one camera is newly added on the premise that two cameras are already installed in the target area 300.
- As described above, the sensor placement determination device 10 comprises: storage means (for example, the external storage device 15) that stores human flow data, which is the actual trajectory of a person, for the target area 300 in which a sensor for measuring human flow (for example, the camera 314) is to be placed; human flow simulation observation means (for example, the human flow simulation observation unit 100) that, for candidate sensor positions in the target area 300, generates the passage information of people in the region observed by the sensor, based on the human flow data; human flow estimation means (for example, the human flow estimation unit 102) that generates trajectory candidates describing how a person passes through the target area, with the passage information as a constraint condition; estimation error calculation means (for example, the estimation error calculation unit 104) that calculates the estimation error of the human flow from the number of trajectory candidates and the number of actual trajectories passing through the region observed by the sensor; and sensor position determination means (for example, the additional camera determination unit 106) that determines the position of the sensor so that the estimation error of the human flow is reduced.
- Thus, the sensor placement determination device 10 of the present embodiment can quantitatively evaluate whether a camera placement position (installation position) is good or bad from the viewpoint of human flow measurement, making it possible to determine an optimal placement position when measuring human flow with a small number of cameras.
Abstract
Description
FIG. 5 is a diagram illustrating an example of the flow information generation performed by the human flow simulation observation unit. The flow information generation shown in step S204 of FIG. 2 will be described with reference to FIG. 5. Based on the human flow ground-truth data 108, the human flow simulation observation unit 100 simulates the passage information of persons, that is, the flow information, that would be produced if the additional camera installation position set in step S202 performed observation.

FIG. 6 is a diagram illustrating the processing flow of the human flow estimation process performed by the human flow estimation unit. The human flow estimation process shown in step S206 of FIG. 2 will be described with reference to FIG. 6. The human flow estimation unit 102 uses multi-agent simulation to generate human flow data that well satisfies the generation conditions of the flow information 502 produced in step S204.

FIG. 8 is a diagram illustrating the evaluation processing flow of the human flow estimation error performed by the estimation error calculation unit. The evaluation of the human flow estimation error shown in step S208 of FIG. 2 will be described with reference to FIG. 8. The estimation error calculation unit 104 calculates the estimation error by comparing the trajectory data estimated in step S206 with the trajectories in the human flow ground-truth data 108.

FIG. 10 is a diagram illustrating an example of the process of determining the installation position of an additional camera in the target area. With reference to FIG. 10, a specific example of the additional camera installation position determination process shown in step S210 of FIG. 2 will be described for the case where a camera is added to the target area 300 for the first time. The numerical value shown at the lower right of each candidate region for the additional camera installation position (regions A1 to A10) is the root mean square error RMSE calculated in step S208.
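The evaluation and selection described for steps S208 and S210 can be sketched roughly as follows. Note that the region names, the per-region counts, and the helper functions below are hypothetical illustrations; the patent itself derives the per-region numbers of observed persons from the estimated trajectory data and the ground-truth trajectories.

```python
import math

def rmse(estimated_counts, true_counts):
    """Root mean square error between per-region observed-person counts."""
    assert len(estimated_counts) == len(true_counts)
    return math.sqrt(
        sum((e - t) ** 2 for e, t in zip(estimated_counts, true_counts))
        / len(estimated_counts)
    )

def choose_additional_camera(candidates):
    """Pick the candidate region whose estimated flow has the smallest RMSE.

    `candidates` maps a region name (e.g. "A1") to a pair of count lists:
    (counts derived from estimated trajectory candidates,
     counts derived from ground-truth human flow data).
    """
    scores = {region: rmse(est, true) for region, (est, true) in candidates.items()}
    best = min(scores, key=scores.get)
    return best, scores

# Hypothetical per-region counts for three candidate regions.
candidates = {
    "A1": ([5, 8, 6], [4, 9, 6]),
    "A2": ([3, 3, 3], [7, 2, 5]),
    "A3": ([6, 6, 7], [6, 5, 7]),
}
best, scores = choose_additional_camera(candidates)
```

A smaller RMSE means the flow estimated from that camera position reproduces the true flow more faithfully, so the candidate with the minimum score is chosen as the installation position.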
11 display device
12 input device
13 central processing unit (CPU)
14 communication control device
15 external storage device (storage means)
16 memory
40 video camera
41 person
62 record
100 human flow simulation observation unit (human flow simulation observation means)
102 human flow estimation unit (human flow estimation means)
104 estimation error calculation unit (estimation error calculation means)
106 additional camera determination unit (sensor position determination means)
108 human flow ground-truth data
110 camera installation information
120 human flow data measurement unit
302, 304, 306 entrance/exit
310, 312, 314 camera
307, 308, 309 trajectory (actual trajectory)
502 flow information (passage information)
702, 704 trajectory candidate
706 table
902 grid
RMSE root mean square error
Claims (8)
- A sensor arrangement determination device comprising:
storage means in which, for a target area in which a plurality of sensors for observing a human flow are to be arranged, human flow data is stored, the human flow data being actual trajectories of persons passing through the target area;
human flow simulation observation means for generating, for candidate arrangement positions of the plurality of sensors in the target area and based on the human flow data, simulated observation information of the persons that each sensor would observe if the plurality of sensors were arranged;
human flow estimation means for estimating and generating, using the observation information generated by the human flow simulation observation means as a constraint condition, trajectory candidates describing how persons pass through the target area;
estimation error calculation means for calculating an estimation error incurred when the human flow estimation means estimates the trajectory candidates, by comparing the number of observed persons calculated for each candidate sensor arrangement position based on the trajectory candidates generated by the human flow estimation means with the number of observed persons calculated for each candidate sensor arrangement position based on the human flow data; and
sensor position determination means for selecting, from among the candidate arrangement positions of the plurality of sensors, a candidate position at which the estimation error calculated by the estimation error calculation means becomes small as the arrangement position of the sensor, and determining the arrangement position.
- The sensor arrangement determination device according to claim 1, wherein the human flow estimation means generates the trajectory candidates by multi-agent simulation.
- The sensor arrangement determination device according to claim 2, wherein the human flow estimation means generates trajectory candidates conforming to observed values, which are the observation information of the persons observed by each sensor, by weighting the trajectory candidates with the observed values using a data assimilation method.
- The sensor arrangement determination device according to claim 3, wherein the estimation error calculation means calculates the number of observed persons for each candidate sensor arrangement region in the target area and uses, as the estimation error, the root mean square error of the number of persons for each region.
- The sensor arrangement determination device according to claim 1, wherein the sensor is at least one of a camera, a line sensor, and a laser scanner.
- The sensor arrangement determination device according to any one of claims 1 to 5, wherein the sensor arrangement determination means selects the candidate arrangement positions of the sensor for each passage section in which no branch exists.
- The sensor arrangement determination device according to any one of claims 1 to 5, wherein the sensor arrangement determination means displays, on a display device, the target area and the determined arrangement position of the sensor.
- A sensor arrangement determination method for adding a sensor to a target area, using a sensor arrangement determination device comprising: storage means in which, for the target area in which a plurality of sensors for measuring a human flow are to be arranged, human flow data is stored, the human flow data being actual trajectories of persons passing through the target area; human flow simulation observation means; human flow estimation means; estimation error calculation means; and sensor position determination means, wherein:
the human flow simulation observation means generates, for candidate arrangement positions of the plurality of sensors in the target area and based on the human flow data, simulated observation information of the persons that each sensor would observe if the plurality of sensors were arranged;
the human flow estimation means estimates and generates, using the observation information generated by the human flow simulation observation means as a constraint condition, trajectory candidates describing how persons pass through the target area;
the estimation error calculation means calculates an estimation error incurred when the human flow estimation means estimates the trajectory candidates, by comparing the number of observed persons calculated for each candidate sensor arrangement position based on the trajectory candidates generated by the human flow estimation means with the number of observed persons calculated for each candidate sensor arrangement position based on the human flow data; and
the sensor position determination means selects, from among the candidate arrangement positions of the plurality of sensors, a candidate position at which the estimation error calculated by the estimation error calculation means becomes small as the arrangement position of the sensor, and determines the arrangement position.
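The data assimilation weighting of claim 3 can be sketched as a basic particle-filter-style update: trajectory candidates whose simulated sensor counts agree with the actual observed values receive larger weights. The Gaussian likelihood, the function name, and the example counts below are illustrative assumptions, not details taken from the patent.

```python
import math

def assimilate(candidates, observed, sigma=1.0):
    """Weight trajectory candidates by their agreement with sensor observations.

    Each candidate is a list of per-sensor pass counts predicted by the
    multi-agent simulation; `observed` holds the per-sensor counts actually
    reported. A Gaussian likelihood assigns larger (normalized) weights to
    candidates consistent with the observations.
    """
    weights = []
    for counts in candidates:
        sq = sum((c - o) ** 2 for c, o in zip(counts, observed))
        weights.append(math.exp(-sq / (2.0 * sigma ** 2)))
    total = sum(weights)
    return [w / total for w in weights]

# Two hypothetical trajectory candidates observed by three sensors; the first
# matches the observations exactly and should dominate the weights.
weights = assimilate([[4, 2, 1], [1, 5, 0]], observed=[4, 2, 1])
```

The weighted candidates can then be resampled or averaged, so that the estimated human flow conforms to the observed values as the claim describes.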
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015522451A JP6236448B2 (ja) | 2013-06-21 | 2013-06-21 | センサ配置決定装置およびセンサ配置決定方法 |
PCT/JP2013/067047 WO2014203389A1 (ja) | 2013-06-21 | 2013-06-21 | センサ配置決定装置およびセンサ配置決定方法 |
US14/899,834 US9955124B2 (en) | 2013-06-21 | 2013-06-21 | Sensor placement determination device and sensor placement determination method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/067047 WO2014203389A1 (ja) | 2013-06-21 | 2013-06-21 | センサ配置決定装置およびセンサ配置決定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014203389A1 true WO2014203389A1 (ja) | 2014-12-24 |
Family
ID=52104151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/067047 WO2014203389A1 (ja) | 2013-06-21 | 2013-06-21 | センサ配置決定装置およびセンサ配置決定方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9955124B2 (ja) |
JP (1) | JP6236448B2 (ja) |
WO (1) | WO2014203389A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017049954A (ja) * | 2015-09-04 | 2017-03-09 | 国立大学法人 東京大学 | 推定装置、推定方法及びプログラム |
WO2019087729A1 (ja) * | 2017-10-30 | 2019-05-09 | 株式会社日立製作所 | ビル内人流推定システムおよび推定方法 |
WO2019186890A1 (ja) * | 2018-03-29 | 2019-10-03 | 日本電気株式会社 | 映像監視装置、その制御方法、及び、コンピュータ可読媒体 |
WO2019186889A1 (ja) * | 2018-03-29 | 2019-10-03 | 日本電気株式会社 | カメラ配置適性度評価装置、その制御方法、最適カメラ配置算出装置、及び、コンピュータ可読媒体 |
JP2022166067A (ja) * | 2015-01-14 | 2022-11-01 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
JP7417476B2 (ja) | 2020-06-12 | 2024-01-18 | 公益財団法人鉄道総合技術研究所 | 配置計画作成支援システム及び配置計画作成支援方法 |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6176335B2 (ja) * | 2013-12-02 | 2017-08-09 | 富士通株式会社 | 通信網制御方法、通信網制御プログラム、およびシステム |
DE102014010937A1 (de) * | 2014-07-28 | 2016-01-28 | S.M.S, Smart Microwave Sensors Gmbh | Verfahren zum Bestimmen einer Position und/oder Ausrichtung eines Sensors |
JP5915960B1 (ja) * | 2015-04-17 | 2016-05-11 | パナソニックIpマネジメント株式会社 | 動線分析システム及び動線分析方法 |
US20160343099A1 (en) * | 2015-05-22 | 2016-11-24 | International Business Machines Corporation | Automated traffic sensor placement planning |
JP6558579B2 (ja) | 2015-12-24 | 2019-08-14 | パナソニックIpマネジメント株式会社 | 動線分析システム及び動線分析方法 |
EP3437957B1 (en) * | 2016-03-29 | 2021-12-01 | Mitsubishi Electric Corporation | Train traffic control system and train traffic control method |
US10497130B2 (en) | 2016-05-10 | 2019-12-03 | Panasonic Intellectual Property Management Co., Ltd. | Moving information analyzing system and moving information analyzing method |
US10565786B1 (en) | 2016-06-30 | 2020-02-18 | Google Llc | Sensor placement interface |
US10216983B2 (en) * | 2016-12-06 | 2019-02-26 | General Electric Company | Techniques for assessing group level cognitive states |
US10445565B2 (en) * | 2016-12-06 | 2019-10-15 | General Electric Company | Crowd analytics via one shot learning |
JP6905888B2 (ja) * | 2017-07-25 | 2021-07-21 | 日本電信電話株式会社 | 流量予測装置、方法、及びプログラム |
EP3841440B1 (en) * | 2018-10-25 | 2023-05-24 | Siemens Aktiengesellschaft | A method for computer-implemented determination of sensor positions in a simulated process of an automation system |
CN111324945B (zh) * | 2020-01-20 | 2023-09-26 | 阿波罗智能技术(北京)有限公司 | 传感器方案确定方法、装置、设备及存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007264706A (ja) * | 2006-03-27 | 2007-10-11 | Yokogawa Electric Corp | 画像処理装置、監視カメラ及び画像監視システム |
JP2011195290A (ja) * | 2010-03-19 | 2011-10-06 | Toshiba Elevator Co Ltd | エスカレータ監視装置 |
JP2012010210A (ja) * | 2010-06-28 | 2012-01-12 | Hitachi Ltd | カメラ配置決定支援装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1561640A (zh) * | 2001-09-27 | 2005-01-05 | 皇家飞利浦电子股份有限公司 | 用于基本计算机的视觉监视的最佳多摄像机设置 |
JP3915805B2 (ja) * | 2004-08-31 | 2007-05-16 | 住友電気工業株式会社 | 駐車場におけるカメラ設置条件の自動決定方法及び装置 |
US8013731B2 (en) * | 2007-07-03 | 2011-09-06 | 3M Innovative Properties Company | Apparatus and method for processing data collected via wireless network sensors |
JP5190970B2 (ja) * | 2009-10-13 | 2013-04-24 | 国立大学法人 鹿児島大学 | 監視カメラ配置位置評価装置 |
US8706458B2 (en) * | 2011-10-05 | 2014-04-22 | International Business Machines Corporation | Traffic sensor management |
- 2013
- 2013-06-21 WO PCT/JP2013/067047 patent/WO2014203389A1/ja active Application Filing
- 2013-06-21 JP JP2015522451A patent/JP6236448B2/ja not_active Expired - Fee Related
- 2013-06-21 US US14/899,834 patent/US9955124B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007264706A (ja) * | 2006-03-27 | 2007-10-11 | Yokogawa Electric Corp | 画像処理装置、監視カメラ及び画像監視システム |
JP2011195290A (ja) * | 2010-03-19 | 2011-10-06 | Toshiba Elevator Co Ltd | エスカレータ監視装置 |
JP2012010210A (ja) * | 2010-06-28 | 2012-01-12 | Hitachi Ltd | カメラ配置決定支援装置 |
Non-Patent Citations (1)
Title |
---|
TOSHIKAZU NAKAMURA: "Fundamental Study on Data Assimilation to Estimate People Flow in a Railway Station", Master's thesis, Department of Socio-Cultural Environmental Studies, Graduate School of Frontier Sciences, The University of Tokyo, 24 March 2011 (2011-03-24), pages 5-42 *
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7428213B2 (ja) | 2015-01-14 | 2024-02-06 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
JP2022166067A (ja) * | 2015-01-14 | 2022-11-01 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
JP2017049954A (ja) * | 2015-09-04 | 2017-03-09 | 国立大学法人 東京大学 | 推定装置、推定方法及びプログラム |
JP7029930B2 (ja) | 2017-10-30 | 2022-03-04 | 株式会社日立製作所 | ビル内人流推定システムおよび推定方法 |
WO2019087729A1 (ja) * | 2017-10-30 | 2019-05-09 | 株式会社日立製作所 | ビル内人流推定システムおよび推定方法 |
JP2019081626A (ja) * | 2017-10-30 | 2019-05-30 | 株式会社日立製作所 | ビル内人流推定システムおよび推定方法 |
CN111225867A (zh) * | 2017-10-30 | 2020-06-02 | 株式会社日立制作所 | 大楼内人流推算系统和推算方法 |
US11623842B2 (en) | 2017-10-30 | 2023-04-11 | Hitachi, Ltd. | Building human flow estimation system and estimation method |
CN111225867B (zh) * | 2017-10-30 | 2021-06-01 | 株式会社日立制作所 | 大楼内人流推算系统和推算方法 |
EP3705439A4 (en) * | 2017-10-30 | 2021-08-18 | Hitachi, Ltd. | SYSTEM AND METHOD FOR PREDICTING HUMAN FLOWS INSIDE A BUILDING |
WO2019186889A1 (ja) * | 2018-03-29 | 2019-10-03 | 日本電気株式会社 | カメラ配置適性度評価装置、その制御方法、最適カメラ配置算出装置、及び、コンピュータ可読媒体 |
US11227376B2 (en) | 2018-03-29 | 2022-01-18 | Nec Corporation | Camera layout suitability evaluation apparatus, control method thereof, optimum camera layout calculation apparatus, and computer readable medium |
US11381782B2 (en) | 2018-03-29 | 2022-07-05 | Nec Corporation | Video monitoring apparatus, control method thereof, and computer readable medium |
JPWO2019186889A1 (ja) * | 2018-03-29 | 2021-03-11 | 日本電気株式会社 | カメラ配置適性度評価装置、その制御方法、最適カメラ配置算出装置、及び、プログラム |
JPWO2019186890A1 (ja) * | 2018-03-29 | 2021-02-12 | 日本電気株式会社 | 映像監視装置、その制御方法、及び、プログラム |
WO2019186890A1 (ja) * | 2018-03-29 | 2019-10-03 | 日本電気株式会社 | 映像監視装置、その制御方法、及び、コンピュータ可読媒体 |
US11924585B2 (en) | 2018-03-29 | 2024-03-05 | Nec Corporation | Video monitoring apparatus, control method thereof, and computer readable medium |
JP7417476B2 (ja) | 2020-06-12 | 2024-01-18 | 公益財団法人鉄道総合技術研究所 | 配置計画作成支援システム及び配置計画作成支援方法 |
Also Published As
Publication number | Publication date |
---|---|
US20160142679A1 (en) | 2016-05-19 |
US9955124B2 (en) | 2018-04-24 |
JPWO2014203389A1 (ja) | 2017-02-23 |
JP6236448B2 (ja) | 2017-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6236448B2 (ja) | センサ配置決定装置およびセンサ配置決定方法 | |
Asadzadeh et al. | Sensor-based safety management | |
Rao et al. | Origin-destination pattern estimation based on trajectory reconstruction using automatic license plate recognition data | |
US10810442B2 (en) | People flow estimation device, people flow estimation method, and recording medium | |
JP2024041997A (ja) | 情報処理装置、情報処理方法およびプログラム | |
US20120020518A1 (en) | Person tracking device and person tracking program | |
JP6120404B2 (ja) | 移動体行動分析・予測装置 | |
Hassannayebi et al. | A hybrid simulation model of passenger emergency evacuation under disruption scenarios: A case study of a large transfer railway station | |
Dridi | Tracking individual targets in high density crowd scenes analysis of a video recording in hajj 2009 | |
Papaioannou et al. | Tracking people in highly dynamic industrial environments | |
WO2012111138A1 (ja) | 歩行者移動情報検出装置 | |
JP2019012494A (ja) | 人流推定装置及び人流推定システム並びに人流測定方法 | |
JP2020095292A (ja) | 混雑予測システムおよび歩行者シミュレーション装置 | |
Ronchi et al. | A probabilistic approach for the analysis of evacuation movement data | |
Voloshin et al. | Optimization-based calibration for micro-level agent-based simulation of pedestrian behavior in public spaces | |
Krbálek et al. | Pedestrian headways—Reflection of territorial social forces | |
JP2010139325A (ja) | 位置推定装置及び位置推定方法 | |
JP6813527B2 (ja) | 推定装置、推定方法及びプログラム | |
Stisen et al. | Task phase recognition for highly mobile workers in large building complexes | |
JP2008134939A (ja) | 動体追跡装置,動体追跡方法,その方法を記述した動体追跡プログラム及びそのプログラムを格納した記録媒体 | |
WO2019187288A1 (ja) | 情報処理装置、データ生成方法、及びプログラムが格納された非一時的なコンピュータ可読媒体 | |
Saunier et al. | Comparing data from mobile and static traffic sensors for travel time assessment | |
Abdelwahab et al. | Measuring “nigiwai” from pedestrian movement | |
Mishalani et al. | Evaluating real-time origin-destination flow estimation using remote sensing-based surveillance data | |
Ettehadieh | Systematic parameter optimization and application of automated tracking in pedestrian-dominant situations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13887464 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015522451 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14899834 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13887464 Country of ref document: EP Kind code of ref document: A1 |