JP2006031645A - Real-time estimation method for dynamic crowd density and crowd accident prevention system - Google Patents

Real-time estimation method for dynamic crowd density and crowd accident prevention system

Info

Publication number
JP2006031645A
Authority
JP
Japan
Prior art keywords
crowd
image
density
real
crowd density
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2004231672A
Other languages
Japanese (ja)
Inventor
Nariyuki Mitachi
成幸 三田地
Mitsuo Takano
光雄 高野
Tetsuya Kaneda
哲也 金田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to JP2004231672A priority Critical patent/JP2006031645A/en
Publication of JP2006031645A publication Critical patent/JP2006031645A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a real-time estimation technique for dynamic crowd density, based on image processing, for warning against and preventing crowd accidents at stores, event venues, and similar sites.
SOLUTION: In this real-time estimation method for dynamic crowd density and crowd accident prevention system, so that the congestion situation can be grasped in real time and used for crowd control, individual persons are not tracked and counted precisely; instead, the crowd is characterized as moving masses, or as masses whose brightness differs from that of the background. Strict rigor is not pursued, and the processing is simplified as far as possible without impairing practicality.
COPYRIGHT: (C)2006,JPO&NCIPI

Description

The present invention relates to an image processing technique for warning against and preventing accidents at stores, event venues, and similar sites.

At festivals, event venues, new store openings, and other sites where large numbers of people gather, an unspecified number of people and vehicles concentrate and become congested in a limited space. The task of crowd security is to guide visitors appropriately, regulate entry and exit, and prevent accidents and confusion caused by the crowding. Among recent accidents, the investigation report on the Akashi festival (fireworks display) accident in Hyogo Prefecture (Non-Patent Document 1) argues that although it was foreseeable that the pedestrian bridge at the accident site would become a bottleneck, the problem was that no countermeasures had been prepared in advance, and that the accident could have been prevented by restricting inflow onto the bridge. The report calculates the number of people accumulated on the bridge from various data at the time and also discusses the timing of inflow restriction, but no system capable of responding appropriately existed, and naturally no such system had been installed.

Akashi Civic Summer Festival Accident Investigation Committee: Investigation Report on the Fireworks Display Accident at the 32nd Akashi Civic Summer Festival (2002). http://www.city.akashi.hyogo.jp/soumu/h safety/natsumatsuri houkoku.html

Knowing how the number of people increases over time is important for deciding when to impose restrictions. Once congestion has become severe, that timing is hard to judge because the crowd will resist. By tracking the change in the number of people over a certain period, an objective and effective timing can be obtained. The investigation report on the fireworks display accident at the 32nd Akashi Civic Summer Festival (2002) also points out that flexible, ad-hoc response immediately after a congestion-related accident must be considered difficult; it is therefore important to take timely measures that remove the seeds of an accident in advance.

The present invention realizes a technique that grasps the congestion situation in real time and contributes to crowd security. Unlike existing techniques, it does not focus on individual persons to count them precisely; its originality lies in characterizing the properties of the crowd as a whole. It is also a technique that does not pursue strict rigor, simplifies the processing as far as possible without impairing practicality, and determines the necessary conditions.

Three previously reported or existing technologies related to the present invention are described below.

The first is "Real-time measurement of pedestrian distribution in town and creation of maps" (Non-Patent Document 2), a master's thesis at the Nara Institute of Science and Technology (2002) by Koji Maeda, supervised by Kunihiro Chihara. Its content is as follows. Pedestrian distribution information is useful town information that can serve many purposes, for example as material for marketing studies and urban development planning. The thesis proposes a system that extracts pedestrian distribution information in town and provides it to users. Pedestrian distribution is measured by image processing capable of sensing an unspecified number of people, using a camera fitted with a fish-eye lens for wide-range shooting. Measurements must be made at multiple locations, and the extracted information is collected at one place over a network, enabling integration onto a map. Whereas this existing technique uses image information from multiple cameras based on static pedestrian distribution information, the present invention is based on dynamic pedestrian distribution information (time-variation information) from a single camera, and is fundamentally different in this respect.

The second is the "people counting system (IBS counter)" of CED System Co., Ltd. (Non-Patent Document 3). Its content is as follows. Rather than relying mainly on spatio-temporal difference information of the video to recognize moving objects, the company has developed a technique that extracts object movement information in three-dimensional space from the apparent velocity field of optical flow estimation and identifies moving objects accurately. Applying this technique, it sells devices that count the number of people passing near building entrances, in large halls, public facilities, convenience stores, and elevator entrances and exits, as well as intruder detection devices. Whereas this existing technique focuses on individual persons and counts them precisely (extracting object movement information in three-dimensional space and identifying moving objects accurately), the present invention focuses on characterizing the properties of the crowd as a whole, does not pursue strict rigor, simplifies the processing as far as possible without impairing practicality, and determines the necessary conditions; it is fundamentally different in this respect.

The third is the "research on a comprehensive evacuation guidance simulation system for densely occupied spaces" by the National Research Institute for Earth Science and Disaster Prevention. Its content is as follows. A patent application is pending (Non-Patent Document 4) on the following four elemental technologies for an in-facility real-time management system: a current-occupancy counting system, a crowd simulation system, a database system, and a guidance information display system. Whereas this existing technique predicts crowd behavior from current-occupancy information (static headcount information) obtained by counting each person accurately with people counters, the present invention approximately determines the human density over the whole screen from the video of a monitor camera and provides dynamic pedestrian distribution information (time-variation information); it is fundamentally different in this respect.

Accordingly, the object of the present invention is to provide a technique that does not measure the number of people by watching each individual, but instead focuses on characterizing the properties of the crowd as a whole, does not pursue strict rigor, simplifies the processing as far as possible without impairing practicality, determines the necessary conditions, grasps the congestion situation in real time, and thereby contributes to crowd security.
http://chihara.aist-nara.ac.jp/public/thesis/master/2002/maeda0051096.pdf http://www.ced.co.jp/riv3.htm http://www.pref.miyagi.jp/hk-tihouken/sodan/jirei7-5.html, http://www.kedm.bosai.go.jp/japanese/kenkykaihatsu/005_nisshukukan.html

To address the above points, the present invention takes as the feature quantity of crowd density the proportion of the area occupied by the crowd in an image of the congestion at the site. A binarization operation that separates the crowd from the background is then applied to the image, using the discriminant analysis method (Non-Patent Document 5). In this method, when the pixels are divided into two groups at a density t, the density t_max that maximizes the between-group variance is taken as the threshold. The purpose of this operation is to estimate the density that separates the pixel group representing the background from the pixel group representing the crowd. Since the method does not work properly when applied to the entire image, the threshold is estimated on a restricted region and then used as the threshold for the whole image. Depending on the shooting angle θ, the image exhibits apparent distortion as shown in FIG. 1. With P as the camera position, a segment OA perpendicular to the ground appears as AA′ and a horizontal segment OB appears as BB′. Consequently, the crowd's area appears scaled by cos θ and the area of the whole image by sin θ. Therefore, with m denoting the number of pixels of the whole image and the pixels corresponding to the crowd entering the expression below (given in the original only as a formula image), the feature quantity is obtained

Figure 2006031645

from the area of the region. The actual number of people was counted visually from the persons appearing in the image and used as a reference value for calibration (a correction curve) against the number of people obtained by image processing. When people overlapped, the number of heads was counted, and the area of the target region was estimated by taking an object in the image as a reference for scale. Template matching was used to detect the direction of crowd movement. From images sampled at adjacent times by a fixed camera, a local part of the newer image (position (s_x, s_y), size m × n) was taken as a template and shifted up, down, left, and right within the older image while a mismatch degree was computed. It can then be inferred that an object moved from a position of low mismatch in the older image to the position of the local region in the newer image. With (i, j) as the position in the older image and a as the shift amount, the mismatch matrix R was computed as an a × a matrix by the following expression (given in the original only as a formula image):

Figure 2006031645

The present invention provides the method described above.
Satoshi Minami, Osamu Nakamura (南敏, 中村納): Image Engineering (画像工学), Corona Publishing (1989)
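
The binarization and the angle-corrected area ratio described above can be sketched as follows. This is a minimal Python sketch, not the patented implementation: the discriminant-analysis threshold is the standard between-class-variance (Otsu-style) criterion named in the text, and since the patent's feature formula is given only as an image, the cos θ / sin θ correction applied here is an assumption inferred from the surrounding description; all function and parameter names are illustrative.

```python
import numpy as np

def discriminant_threshold(gray_region):
    """Gray level that maximizes the between-class variance
    (discriminant-analysis / Otsu-style thresholding) on a small region."""
    hist, _ = np.histogram(gray_region, bins=256, range=(0, 256))
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

def crowd_density_feature(gray_frame, roi, theta_deg, crowd_is_dark=True):
    """Binarize the whole frame with a threshold estimated on `roi`, then
    return the angle-corrected ratio of crowd pixels to frame pixels."""
    t = discriminant_threshold(roi)
    crowd_mask = gray_frame < t if crowd_is_dark else gray_frame >= t
    n_crowd = crowd_mask.sum()      # pixels classified as crowd
    m_total = gray_frame.size       # pixels in the whole frame
    theta = np.radians(theta_deg)   # camera depression angle
    # Assumed correction: the crowd area appears scaled by cos(theta) and
    # the imaged ground area by sin(theta) (the exact formula is an image
    # in the original), so both counts are rescaled before taking the ratio.
    return (n_crowd / np.cos(theta)) / (m_total / np.sin(theta))
```

In use, `roi` would be a small sub-array of the frame on which the threshold can be estimated reliably (as the text notes, estimating it over the whole image does not work well), and the returned ratio is the quantity later calibrated against the visually counted density.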

The present invention has the following effects. The results of the embodiments, in which crowd density was estimated at different places and in different situations, show a strong correlation between the actual crowd density and the crowd-density feature quantity at each location. The method of the present invention is therefore sufficient for grasping increases and decreases of a crowd, and can contribute to effective deployment of personnel and immediate response in crowd security. Once its range of application is determined, it can be regarded as a practical elemental technology and can be built into a system.

Hereinafter, the present invention will be described in detail on the basis of embodiments.
<Example 1>

On November 4, 2003, in clear weather, the crossing area of the scramble intersection in front of the north exit of Hachioji Station was filmed for about ten minutes from 7:00 p.m. (Hachioji Station intersection (night)). A consumer video camera was fixed on a tripod on the station-front pedestrian bridge overlooking the intersection; the angle looking down on the crowd was about 10°. Still images were then extracted from the video once per second using a personal computer fitted with a video capture board. The actual crowd density was estimated from the area of a specific region in the image and the number of people in that region. The area was estimated by taking an object in the image as the reference for scale; after correcting the depth-direction distortion caused by the shooting angle, it was 149.7 m². The number of people in the region was counted visually, counting heads where people overlapped. The minimum was 13 people and the maximum 37, so the crowd density is estimated to have varied from 0.09 to 0.25 persons/m². Next, focusing on the difference in brightness between the people and the background, a binarization separating the two was performed, and the ratio of the person area within the specific region to the area of the whole region was taken as the crowd-density feature quantity; since both areas are distorted by the shooting angle, the feature obtained from them was corrected. As shown in FIG. 2, the resulting crowd-density feature (ratio) and the actual crowd density (persons/m²) were strongly correlated, with a correlation coefficient of 0.96. Next, for the same shooting region, still images at different times were extracted from the video and a local part was defined in the newer image. The same part of the older image was taken as the initial corresponding points. The mismatch degree of the two images was defined as the sum, over all corresponding points, of the squared density differences. The corresponding points in the older image were shifted up, down, left, and right by 10 dots to obtain the mismatch matrix. As shown in FIG. 3, the value of each element of the mismatch matrix was smallest at the element indicating the origin of the object's movement, so the direction of crowd movement could be estimated.
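
A sketch of the template-matching step used in this example may help. It is a hedged Python sketch, not the patent's code: the mismatch degree follows the definition in the text (sum of squared gray-level differences over corresponding points), but the matrix here covers shifts of ±a pixels in every direction, i.e. (2a+1)×(2a+1) entries rather than the a×a stated in the description, and the names are illustrative.

```python
import numpy as np

def mismatch_matrix(old_frame, new_frame, sx, sy, m, n, a=10):
    """Sum of squared gray-level differences between the template at
    (row sy, col sx), size n x m, in the newer frame and windows of the
    older frame shifted by up to `a` pixels in each direction."""
    template = new_frame[sy:sy + n, sx:sx + m].astype(np.float64)
    R = np.full((2 * a + 1, 2 * a + 1), np.inf)
    for dy in range(-a, a + 1):
        for dx in range(-a, a + 1):
            y0, x0 = sy + dy, sx + dx
            if (y0 < 0 or x0 < 0 or
                    y0 + n > old_frame.shape[0] or x0 + m > old_frame.shape[1]):
                continue  # shifted window would leave the older frame
            window = old_frame[y0:y0 + n, x0:x0 + m].astype(np.float64)
            R[dy + a, dx + a] = ((template - window) ** 2).sum()
    return R
```

The offset of the smallest entry, read relative to the matrix center, points from the estimated origin of the motion in the older frame toward the template position in the newer frame, which is how the movement direction is read off in FIGS. 3, 7 and 9.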
<Example 2>

On November 5, 2003, in clear weather, the crossing area of the scramble intersection in front of the north exit of Hachioji Station was filmed for about ten minutes from 9:00 a.m. (Hachioji Station intersection (morning)). A consumer video camera was fixed on a tripod on the station-front pedestrian bridge overlooking the intersection; the angle looking down on the crowd was about 10°. Still images were extracted from the video once per second using a personal computer fitted with a video capture board. The actual crowd density was estimated from the area of a specific region in the image and the number of people in that region. The area was estimated with an object in the image as the scale reference; after correcting the depth-direction distortion caused by the shooting angle, it was 149.7 m². The number of people was counted visually, counting heads where people overlapped; the minimum was 5 and the maximum 25, so the crowd density is estimated to have varied from 0.03 to 0.17 persons/m². Next, focusing on the difference in brightness between the people and the background, a binarization separating the two was performed, and the ratio of the person area within the specific region to the whole region, corrected for the shooting-angle distortion of both areas, was taken as the crowd-density feature. As shown in FIG. 4, the resulting feature and the actual crowd density were strongly correlated (correlation coefficient 0.92). Next, noting that all moving objects in the image are people, the ratio of the moving-body area within the specific region to the whole region was taken as the crowd-density feature. Moving bodies are detected by binarizing the density changes arising in the same shooting region: using images taken at different times, the parts where the density change at corresponding points exceeds a fixed amount are detected as moving bodies. As shown in FIG. 5, the resulting feature and the actual crowd density were strongly correlated (correlation coefficient 0.84).
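
The moving-body feature of this example can be sketched as follows. It is a minimal Python sketch under stated assumptions: the text only says that corresponding points whose density change exceeds a fixed amount are treated as moving bodies, so the absolute-difference threshold of 30 gray levels and the names are illustrative, not values from the patent.

```python
import numpy as np

def moving_body_feature(frame_t0, frame_t1, roi_mask, diff_threshold=30):
    """Ratio of moving-body pixels to all pixels of the target region,
    from the gray-level change between two frames of the same fixed view."""
    diff = np.abs(frame_t1.astype(np.int16) - frame_t0.astype(np.int16))
    moving = (diff > diff_threshold) & roi_mask   # binarized density change
    return moving.sum() / roi_mask.sum()
```

Here `roi_mask` is a boolean array marking the specific region whose area ratio is reported.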
<Example 3>

On November 5, 2003, in clear weather, a passage inside Hachioji Station where an escalator and stairs run side by side was filmed for about ten minutes from 9:30 a.m. (escalator inside Hachioji Station). A consumer video camera was fixed at a spot overlooking the passage; the angle looking down on the crowd was about 30°. Still images were extracted from the video once per second using a personal computer fitted with a video capture board. The actual crowd density was estimated from the area of a specific region in the image and the number of people in it. The area was estimated with an object in the image as the scale reference; after correcting the depth-direction distortion due to the shooting angle, it was 16 m². The number of people was counted visually, counting heads where people overlapped; the minimum was 2 and the maximum 9, so the crowd density is estimated to have varied from 0.13 to 0.56 persons/m². Focusing on the brightness difference between people and background, a binarization separating the two was performed, and the ratio of the person area within the specific region to the whole region, corrected for shooting-angle distortion, was taken as the crowd-density feature. As shown in FIG. 6, the resulting feature and the actual crowd density were strongly correlated (correlation coefficient 0.8). Next, for the same shooting region, still images at different times were extracted from the video, a local part was defined in the newer image, and the same part of the older image was taken as the initial corresponding points. The mismatch degree was defined as the sum of the squared density differences over all corresponding points, and the corresponding points in the older image were shifted up, down, left, and right by 10 dots to obtain the mismatch matrix. As shown in FIG. 7, the matrix took its minimum at the element indicating the origin of the object's movement, so the direction of crowd movement could be estimated.
<Example 4>

On November 5, 2003, inside Hachioji-Minamino Station, the crowd heading from the trains to the ticket gates was filmed for about ten minutes from 10:00 a.m. with a camera fixed at the top of the stairs connecting the gates and the platform (Hachioji-Minamino Station platform). The angle looking down on the crowd was about 50°. Still images were extracted from the video once per second using a personal computer fitted with a video capture board. The actual crowd density was estimated from the area of a specific region in the image and the number of people in it. The area was estimated with an object in the image as the scale reference; after correcting the depth-direction distortion due to the shooting angle, it was 8.9 m². The number of people was counted visually, counting heads where people overlapped; the minimum was 1 and the maximum 13, so the crowd density is estimated to have varied from 0.11 to 1.46 persons/m². Focusing on the brightness difference between people and background, a binarization separating the two was performed, and the ratio of the person area within the specific region to the whole region, corrected for shooting-angle distortion, was taken as the crowd-density feature. As shown in FIG. 8, the feature and the actual crowd density were strongly correlated (correlation coefficient 0.94). Next, still images at different times were extracted for the same shooting region, a local part was defined in the newer image, the same part of the older image was taken as the initial corresponding points, the mismatch degree was defined as the sum of squared density differences over all corresponding points, and the corresponding points in the older image were shifted up, down, left, and right by 10 dots to obtain the mismatch matrix. As shown in FIG. 9, the matrix took its minimum at the element indicating the origin of the object's movement, so the direction of crowd movement could be estimated.
<Example 5>

On December 22, 2003, in clear weather, the connecting passage between the Keio Line and the JR Line inside Shibuya Station was filmed for ten minutes from 10:00 a.m. (Shibuya Station, Keio-JR connecting passage). A consumer video camera was fixed on a tripod at a spot overlooking the passage, near the entrance of the adjoining restaurant area; the angle looking down on the crowd was about 35°. Still images were extracted from the video once per second using a personal computer fitted with a video capture board. The actual crowd density was estimated from the area of a specific region in the image and the number of people in it. The area was estimated with an object in the image as the scale reference; after correcting the depth-direction distortion due to the shooting angle, it was 60 m². The number of people was counted visually, counting heads where people overlapped; the minimum was 2 and the maximum 13, so the crowd density is estimated to have varied from 0.03 to 0.22 persons/m². Focusing on the brightness difference between people and background, a binarization separating the two was performed, and the ratio of the person area within the specific region to the whole region, corrected for shooting-angle distortion, was taken as the crowd-density feature. As shown in FIG. 10, the feature and the actual crowd density were strongly correlated (correlation coefficient 0.96). Next, noting that all moving objects in the image are people, the ratio of the moving-body area within the specific region to the whole region was taken as the crowd-density feature; moving bodies are detected by binarizing the density changes arising in the same shooting region, i.e., using images at different times, the parts where the density change at corresponding points exceeds a fixed amount are detected as moving bodies. As shown in FIG. 11, the feature and the actual crowd density were strongly correlated (correlation coefficient 0.95).
<Example 6>

On December 22, 2003, in clear weather, the crossing area of the intersection in front of Shibuya Station was filmed for ten minutes from 11:00 a.m. (Shibuya Station intersection). A consumer video camera was fixed on a tripod inside a coffee shop overlooking the intersection; the angle looking down on the crowd was about 20°. Still images were extracted from the video once per second using a personal computer fitted with a video capture board. The actual crowd density was estimated from the area of a specific region in the image and the number of people in it. The area was estimated with an object in the image as the scale reference; after correcting the depth-direction distortion due to the shooting angle, it was 180 m². The number of people was counted visually, counting heads where people overlapped; the minimum was 7 and the maximum 75, so the crowd density is estimated to have varied from 0.04 to 0.42 persons/m². Focusing on the brightness difference between people and background, a binarization separating the two was performed, and the ratio of the person area within the specific region to the whole region, corrected for shooting-angle distortion, was taken as the crowd-density feature. As shown in FIG. 12, the feature and the actual crowd density were strongly correlated (correlation coefficient 0.93). Next, noting that all moving objects in the image are people, the ratio of the moving-body area within the specific region to the whole region was taken as the crowd-density feature, with moving bodies detected by binarizing the density change at corresponding points between images at different times. As shown in FIG. 13, the feature and the actual crowd density were strongly correlated (correlation coefficient 0.96).
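
The calibration step that each example reports, a correlation coefficient between the feature series and the visually counted density together with a correction curve, might look like the following sketch. It assumes a simple least-squares line for the correction curve, which the patent does not specify; the correlation coefficient is the quantity quoted in the examples (0.8 to 0.96).

```python
import numpy as np

def calibrate(features, counted_density):
    """Pearson correlation and a least-squares line relating the image
    feature to the visually counted density (persons per square meter)."""
    f = np.asarray(features, dtype=float)
    d = np.asarray(counted_density, dtype=float)
    r = np.corrcoef(f, d)[0, 1]             # 0.8-0.96 in the examples
    slope, intercept = np.polyfit(f, d, 1)  # assumed form of the correction curve
    return r, slope, intercept
```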
<Application example 1>

The dynamic crowd density real-time estimation method of the present invention can be applied to preventing crowd accidents at an event venue by linking admission-ticket vending machines with the dynamic crowd density estimation system through a communication function, so that the crowd heading for the venue is regulated and does not become excessively dense. For verification, a student cafeteria was treated as the event venue, its meal-ticket vending machines as the admission-ticket machines, and the following system configuration was examined. The verification experiment was timed to coincide with an on-campus event such as an academic conference. In general, visitors from outside the university also use the cafeteria, and meal tickets are sold by automatic vending machines; even on ordinary days the cafeteria is crowded at lunchtime and queues form, so a special corner is set up for off-campus visitors. Even during ordinary congestion the kitchen staff are fully occupied serving meals to the people who keep arriving, and no one is assigned to monitor congestion inside the cafeteria. To let off-campus visitors eat in comfort, the entry of on-campus users has to be restricted. Changes in congestion were therefore monitored with a surveillance camera installed in the cafeteria, and the number of meal-ticket machines in operation was controlled according to the congestion. FIG. 14 shows the basic configuration of the system: three meal-ticket vending machines, one camera, a personal computer for data processing and control, and the communication cables connecting them. The personal computer is fitted with a video capture board for connection to the video camera and a control board for connection to the ticket machines. FIGS. 15 and 16 show how the system was installed. The cafeteria is on the third floor, and the ticket machines are installed at the top of the stairs. A user buys a meal ticket, enters the dining room through the entrance to the right of the machines, and leaves through the exit after the meal; the flow of people is indicated by arrows in the figures.
A camera installed high in the room captured a bird's-eye view. The camera was controlled from the data processing device to set the shooting range, and its angle of view and shooting angle were adjusted so that the entire room was covered. The angle θc at this point corresponds to the central part of the shooting range. Because the angles at the top, middle, and bottom of the shooting range differ, targets at the top and bottom of the range were temporarily moved to the center of the image to measure those angles, denoted θu and θd respectively; these angles are used to correct the distortion caused by the shooting angle. The shooting area was estimated by taking, as the scale reference for the top, middle, and bottom of the range, the distance on the floor onto which the height of a person appearing in each part projects, corrected with the angles measured when the shooting range was selected. Recording of the room was then started under control from the data processing device. Image adjustment (color and brightness) was left to the camera's automatic functions, and the video was stored in the data processing device through the video capture board. The data processing device obtained the feature quantity by binarizing the brightness of still images extracted from the video once per second, and each ticket machine was independently switched to either "selling" or "sales stopped" according to the feature quantity. The relation between the feature quantity and the number of machines in operation was as follows: three machines for feature < 0.6, two for 0.6 < feature < 0.8, one for 0.8 < feature < 0.9, and none for feature > 0.9.
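
The threshold table at the end of the preceding paragraph maps directly to a small control routine. The sketch below encodes those thresholds; the hook that actually switches a vending machine through the control board is not described in the patent, so `set_machine_state` is a hypothetical interface.

```python
def machines_to_run(feature):
    """Number of meal-ticket machines left selling, per the thresholds
    used in the verification experiment."""
    if feature < 0.6:
        return 3
    elif feature < 0.8:
        return 2
    elif feature < 0.9:
        return 1
    return 0

def update_machines(feature, set_machine_state):
    """Switch each of the three machines independently; `set_machine_state`
    is a hypothetical hook to the control board (index, selling: bool)."""
    active = machines_to_run(feature)
    for i in range(3):
        set_machine_state(i, i < active)
```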

As a result, the lunchtime congestion from 12:00 to 13:00 at the cafeteria on the third floor of the welfare building of Tokyo University of Technology, which was the subject of the experiment, was eliminated even though an academic conference was being held at the time; confusion at the serving area between off-campus conference attendees and students was avoided, and it became clear that the system is extremely effective for preventing accidents caused by dense crowds at event venues.
<Application example 2>

The dynamic crowd density real-time estimation method of the present invention can also be applied to preventing crowd accidents on the way to an event venue by linking the dynamic crowd density estimation system with a central monitoring center through a communication function, so that the center can regulate the crowd heading for the venue and keep it from becoming excessively dense. For verification, Tokyo University of Technology was treated as the event venue, and the route from the nearest station, Hachioji-Minamino, to the departure point of the buses bound for the university (FIG. 17) was treated as a section of the route to an event venue such as a fireworks display, including spots such as overpasses and stairs where crowd accidents tend to occur. A monitor camera was placed at the position shown in FIG. 18 to watch and guide the movement of visitors, and the system configuration shown in FIG. 19 was examined as a crowd accident prevention system. At the university's dedicated bus stop in front of Hachioji-Minamino Station, the queue extends for roughly 500 meters during the morning rush. At present several marshals are assigned to handle the congestion, and during the day the crowding in front of the bus stop is kept under control by a few of them. Buses run at intervals of two to five minutes during busy periods; outside those periods, only a single bus may arrive even when the queue is very long. The number of buses can be increased or decreased based on reports from the drivers, but it is difficult to respond when an empty state suddenly turns into a crowded one. The congestion was therefore measured by the real-time estimation method of the present invention using the monitor camera images, extra buses were dispatched proactively rather than after the fact, and smooth transportation was carried out from 8:30 to 11:00 in the morning.
The camera on the arcade ceiling shoots toward the station so that people walking on the stairs are not included, since people on the stairs come too close to the camera for the correction to work. A bird's-eye view was captured with this ceiling-mounted camera. The angle of view and shooting angle were adjusted, under control from the data processing device, so that the area up to just before the stairs was covered; the angle at this point corresponds to the central part of the shooting range. Because the angles at the top, middle, and bottom of the range differ, targets at the top and bottom were temporarily moved to the middle of the image to measure the angles, denoted θu, θc, and θd respectively, and these angles are used to correct the distortion caused by the shooting angle. The shooting area was estimated by taking, as the scale reference for the top, middle, and bottom of the range, the distance on the floor onto which the height of a person appearing in each part projects, corrected with the measured angles. Recording of the queue was started under control from the data processing device; image adjustment (color and brightness) was left to the camera's automatic functions, and the video was stored in the data processing device through the video capture board. The data processing device obtained the feature quantity by binarizing the brightness of still images extracted from the video once per second.
The congestion feature obtained from the stills of the station-direction footage was sent over a wireless network to the dispatch supervisor in a bus parked in the bus pool on the university campus several kilometers away, as an LED signal: no LED lit when the 10-second average feature was 0.1 or less, a blue LED for 0.1 < feature < 0.5, a yellow LED for 0.5 < feature < 0.8, and a red LED for a feature of 0.8 or more. The supervisor monitored this at all times in one bus (a bus fitted with a portable bus-operation monitoring terminal or monitoring box) and issued departure orders to the waiting buses by radio: zero buses (standby) for a feature of 0.1 or less, one bus for 0.1 < feature < 0.5 (blue LED), two buses for 0.5 < feature < 0.8 (yellow LED), and three buses for 0.8 < feature (red LED), departing consecutively from the campus bus pool for shuttle operation. On receiving an instruction, each bus heads for Hachioji-Minamino Station, carries its passengers to the university, and then waits in the campus bus pool for the next instruction.
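
The LED signalling and dispatch rule described above reduces to a simple mapping from the 10-second average feature to a signal and a bus count. The sketch below encodes the stated thresholds; how the boundary values (exactly 0.5 or 0.8) are treated is not specified in the text, so the comparisons here are an assumption, and the function name is illustrative.

```python
def dispatch_signal(feature_10s_avg):
    """Map the 10-second average congestion feature to the LED color shown
    to the dispatch supervisor and the number of buses to send."""
    if feature_10s_avg <= 0.1:
        return None, 0        # no LED lit, buses stay on standby
    elif feature_10s_avg < 0.5:
        return "blue", 1
    elif feature_10s_avg < 0.8:
        return "yellow", 2
    return "red", 3
```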

As a result, the queue of several hundred meters that used to form toward the bus stop for Tokyo University of Technology in front of Hachioji-Minamino Station did not occur at all between 8:30 and 11:00; at most, an orderly two-line queue of about 50 meters of people waiting for the shuttle buses was observed at any time. Outside the 8:30 to 11:00 period, the buses ran on the predetermined seasonal timetable.

FIG. 1 is a diagram for correcting image distortion caused by the shooting angle.
FIG. 2 compares, in time series, the crowd-density feature obtained by brightness binarization of images of the intersection in front of Hachioji Station (night) with the actual crowd density.
FIG. 3 visualizes the mismatch degree used to estimate the crowd's movement direction from images of the intersection in front of Hachioji Station (night); the left panel is a frame at one instant, and the right panel shows, as gray levels (black for minimum mismatch, white for maximum), the mismatch against the frame 1/30 second earlier when the corresponding points are shifted up, down, left, and right, the coordinates corresponding to the shift amounts.
FIG. 4 compares, in time series, the brightness-binarization crowd-density feature for the intersection in front of Hachioji Station (morning) with the actual crowd density.
FIG. 5 compares, in time series, the crowd-density feature based on moving-body detection for the intersection in front of Hachioji Station (morning) with the actual crowd density.
FIG. 6 compares, in time series, the brightness-binarization crowd-density feature for the escalator inside Hachioji Station with the actual crowd density.
FIG. 7 visualizes the mismatch degree used to estimate the crowd's movement direction from images of the escalator inside Hachioji Station, with the same left/right layout as FIG. 3.
FIG. 8 compares, in time series, the brightness-binarization crowd-density feature for the Hachioji-Minamino Station platform with the actual crowd density.
FIG. 9 visualizes the mismatch degree used to estimate the crowd's movement direction from images of the Hachioji-Minamino Station platform, with the same left/right layout as FIG. 3.
FIG. 10 compares, in time series, the brightness-binarization crowd-density feature for the Keio-JR connecting passage in Shibuya Station with the actual crowd density.
FIG. 11 compares, in time series, the crowd-density feature based on moving-body detection for the Keio-JR connecting passage in Shibuya Station with the actual crowd density.
FIG. 12 compares, in time series, the brightness-binarization crowd-density feature for the Shibuya Station intersection with the actual crowd density.
FIG. 13 compares, in time series, the crowd-density feature based on moving-body detection for the Shibuya Station intersection with the actual crowd density.
FIG. 14 shows the device configuration of the system that controls congestion in the cafeteria.
FIG. 15 shows the correction of distortion due to the shooting angle.
FIG. 16 shows the installation of each device in the cafeteria.
FIG. 17 is a route map to the departure point of the buses bound for Tokyo University of Technology.
FIG. 18 is a cross-sectional view of the camera placement.
FIG. 19 is a configuration diagram of the bus operation control system.

Claims (9)

1. A method for estimating dynamic crowd density in real time, characterized in that a plurality of temporally correlated still images are used, data processing that distinguishes people from the background is applied to these images, and the crowd density is estimated.
2. The real-time dynamic crowd density estimation method according to claim 1, characterized in that the still images of claim 1 are extracted from a video stream.
3. The real-time dynamic crowd density estimation method according to claim 1, characterized in that the still images of claim 1 are extracted from a video stream using a personal computer equipped with a video capture board.
4. The real-time dynamic crowd density estimation method according to claim 1, characterized in that, in the data processing of claim 1, binarization is applied to a still image extracted from the video, and the ratio of the area occupied by moving objects in the target region to the area of the entire region is used as the crowd-density feature value.
5. The real-time dynamic crowd density estimation method according to claim 4, characterized in that, in the detection of moving objects of claim 4, images of the same imaging region taken at different times are used, and the change in gray level at corresponding points is binarized.
6. The real-time dynamic crowd density estimation method according to claim 4, characterized in that, for the crowd-density feature value of claim 4, attention is paid to the difference in brightness between people and the background in the image, binarization is applied to separate the two, and the ratio of the area occupied by people in the target region to the area of the entire region is used as the crowd-density feature value.
7. The real-time dynamic crowd density estimation method according to claim 5, characterized in that, in order to estimate the direction of movement of the crowd, still images of the same target region taken at different times are extracted from the video, the position in the older image at which a local portion of the newer image has the highest similarity is searched for, and the object is estimated to have moved from the high-similarity position in the older image to the position of that local region in the newer image.
8. The real-time dynamic crowd density estimation method according to claim 7, characterized in that, in the crowd movement direction estimation of claim 7, the index representing the similarity between two images is the reciprocal of the sum of squared gray-level differences at corresponding points of the images.
9. A crowd accident prevention system, characterized in that crowding of people at event venues, cafeterias, bus stops, and the like is controlled using the real-time dynamic crowd density estimation method according to claim 1.
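As a concrete reading of claims 4 to 6, the following Python/NumPy sketch (an illustration under assumed threshold values diff_threshold and dark_threshold, not the patented implementation) binarizes either the gray-level change between two frames of the same region (claim 5) or the brightness of a single frame (claim 6), and in both cases takes the ratio of marked pixels to all pixels of the target region as the crowd-density feature value (claim 4).

import numpy as np

def density_from_motion(prev_frame: np.ndarray, curr_frame: np.ndarray,
                        diff_threshold: int = 20) -> float:
    """Claim 5 style: binarize the gray-level change at corresponding points
    of two frames of the same region taken at different times, then return
    the area ratio of moving pixels to the whole target region (claim 4)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > diff_threshold        # binarized change mask
    return float(moving.mean())           # moving area / total area

def density_from_brightness(frame: np.ndarray, dark_threshold: int = 80) -> float:
    """Claim 6 style: separate people from the background by a brightness
    threshold (here assuming people appear darker than the background),
    then return the area ratio of person pixels to the whole region."""
    person = frame < dark_threshold
    return float(person.mean())

For claims 7 and 8, the estimated direction of movement would correspond to the shift that maximizes the reciprocal of the summed squared gray-level differences, i.e. the darkest point of the mismatch map sketched after the figure descriptions above.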
JP2004231672A 2004-07-12 2004-07-12 Real-time estimation method for dynamic crowd density and crowd accident prevention system Pending JP2006031645A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004231672A JP2006031645A (en) 2004-07-12 2004-07-12 Real-time estimation method for dynamic crowd density and crowd accident prevention system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004231672A JP2006031645A (en) 2004-07-12 2004-07-12 Real-time estimation method for dynamic crowd density and crowd accident prevention system

Publications (1)

Publication Number Publication Date
JP2006031645A true JP2006031645A (en) 2006-02-02

Family

ID=35897872

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004231672A Pending JP2006031645A (en) 2004-07-12 2004-07-12 Real-time estimation method for dynamic crowd density and crowd accident prevention system

Country Status (1)

Country Link
JP (1) JP2006031645A (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9165372B2 (en) 2006-10-16 2015-10-20 Bae Systems Plc Improvements relating to event detection
CN101464944B (en) * 2007-12-19 2011-03-16 中国科学院自动化研究所 Crowd density analysis method based on statistical characteristics
JP2010140371A (en) * 2008-12-15 2010-06-24 Nippon Telegr & Teleph Corp <Ntt> System, method and program for monitoring video
KR101173786B1 (en) 2010-11-05 2012-08-16 성균관대학교산학협력단 System and method for automated measurement of crowd density using neural network
CN102436739A (en) * 2011-09-27 2012-05-02 重庆大学 Method for distinguishing traffic jam of toll plaza of highway based on video detection technology
CN107256377A (en) * 2012-09-12 2017-10-17 威智伦富智堡公司 Method, apparatus and system for detecting the object in video
JP2015528614A (en) * 2012-09-12 2015-09-28 Avigilon Fortress Corporation Method, apparatus and system for detecting an object in a video
JP5613815B1 (en) * 2013-10-29 2014-10-29 パナソニック株式会社 Residence status analysis apparatus, residence status analysis system, and residence status analysis method
JP2015087841A (en) * 2013-10-29 2015-05-07 パナソニック株式会社 Congestion status analyzer, congestion status analyzing system, and congestion status analyzing method
JP5597781B1 (en) * 2014-03-26 2014-10-01 パナソニック株式会社 Residence status analysis apparatus, residence status analysis system, and residence status analysis method
JP2015186202A (en) * 2014-03-26 2015-10-22 パナソニック株式会社 Residence condition analysis device, residence condition analysis system and residence condition analysis method
JP2016042277A (en) * 2014-08-18 2016-03-31 日本電信電話株式会社 Guidance system and guidance method for guidance system
JP2022166067A (en) * 2015-01-14 2022-11-01 日本電気株式会社 Information processing system, information processing method and program
JP7428213B2 (en) 2015-01-14 2024-02-06 日本電気株式会社 Information processing system, information processing method and program
CN104933867B (en) * 2015-06-17 2017-05-24 苏州大学 Road information real-time acquisition method based on traffic monitoring video
CN104933867A (en) * 2015-06-17 2015-09-23 苏州大学 Road information real-time acquisition device and method based on traffic monitoring video
CN108665691A (en) * 2018-08-22 2018-10-16 张菁菁 A kind of system and method for the early warning and water conservancy diversion of the anti-swarm and jostlement of intelligence
CN111080341B (en) * 2019-11-26 2023-04-07 微梦创科网络科技(中国)有限公司 Method and device for creating dynamic card of specific character
CN111080341A (en) * 2019-11-26 2020-04-28 微梦创科网络科技(中国)有限公司 Method and device for creating dynamic card of specific character
CN111144276A (en) * 2019-12-24 2020-05-12 北京深测科技有限公司 Monitoring and early warning method for pasture
CN111144276B (en) * 2019-12-24 2023-04-18 北京深测科技有限公司 Monitoring and early warning method for pasture
CN112001274A (en) * 2020-08-06 2020-11-27 腾讯科技(深圳)有限公司 Crowd density determination method, device, storage medium and processor
CN112001274B (en) * 2020-08-06 2023-11-17 腾讯科技(深圳)有限公司 Crowd density determining method, device, storage medium and processor
CN112632601A (en) * 2020-12-16 2021-04-09 苏州玖合智能科技有限公司 Crowd counting method for subway carriage scene
CN112632601B (en) * 2020-12-16 2024-03-12 苏州玖合智能科技有限公司 Crowd counting method for subway carriage scene
CN115223102A (en) * 2022-09-08 2022-10-21 枫树谷(成都)科技有限责任公司 Real-time crowd density fusion sensing method and model based on camera cluster
CN115223102B (en) * 2022-09-08 2022-12-16 枫树谷(成都)科技有限责任公司 Real-time crowd density fusion sensing method and model based on camera cluster
CN117058627A (en) * 2023-10-13 2023-11-14 阳光学院 Public place crowd safety distance monitoring method, medium and system
CN117058627B (en) * 2023-10-13 2023-12-26 阳光学院 Public place crowd safety distance monitoring method, medium and system

Similar Documents

Publication Publication Date Title
JP2006031645A (en) Real-time estimation method for dynamic crowd density and crowd accident prevention system
US8334906B2 (en) Video imagery-based sensor
JP6631619B2 (en) Video monitoring system and video monitoring method
JP3243234B2 (en) Congestion degree measuring method, measuring device, and system using the same
JP3490466B2 (en) Image monitoring device and elevator control device using the image monitoring device
CN106144862A (en) Passenger based on depth transducer for passenger traffic gate control senses
CN108206935A (en) A kind of personnel amount statistical monitoring analysis system
JP2007317052A (en) System for measuring waiting time for lines
AU2016405879A1 (en) Control system and procedure for vehicles and parking spaces for outdoor parking lots
JPH056500A (en) Moving body and equipment control system
US20210209909A1 (en) Information processing device and determination method
JP3480847B2 (en) Elevator control device using image monitoring device
US20220139199A1 (en) Accurate digital security system, method, and program
JP2008217289A (en) People counting system
CN107409198B (en) Camera image data-based situation confirmation system, control device, and camera image data-based situation confirmation method
CN109872561A (en) A kind of road brake system and method with information guiding and prompt facility
TWM551285U (en) Surveillance system with face recognition
KR102550673B1 (en) Method for visitor access statistics analysis and apparatus for the same
KR102059669B1 (en) Disaster Management System With KIOSK
KR101977111B1 (en) a walking line solution system
KR20210074424A (en) Parking sharing system for maximum use of parking spaces
KR20200025384A (en) Parking Information Providing Device Using Real-Time Image Processing and System thereof
KR20230020184A (en) Video analysis device using fixed camera and moving camera
JP2022104636A (en) Entry/exit management system, unit panel, and unit house
Khoudour et al. Project CROMATICA