TW201804445A - Moving object detection device - Google Patents
- Publication number
- TW201804445A (application TW105132825A)
- Authority
- TW
- Taiwan
- Prior art keywords
- moving
- moving object
- speed
- unit
- data
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06M—COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
- G06M7/00—Counting of objects carried by a conveyor
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
The present invention relates to a moving object detection device that estimates the number and positions of moving objects and their moving speeds including the moving directions.

Conventionally, techniques are known that use various sensors to estimate the number, positions, and moving directions of moving objects such as pedestrians within a sensor's observation range.

As a method of estimating the number, positions, or moving directions of moving objects, there is, for example, a method in which objects passing along a path are observed by a plurality of cameras, and the number, positions, and so on are estimated from the disparity between the images of a moving object captured by the respective cameras.

However, with this method of estimating the number, positions, and so on of moving objects with a plurality of cameras, when the intensity of light reflected by a moving object is low, such as when a visible-light camera is used at night, the shape of the moving object cannot be obtained correctly from the images, and the estimation accuracy deteriorates significantly. The estimation accuracy also deteriorates when the shape of the moving object changes over time, for example when a pedestrian turns his or her face.

As another method of estimating the number, positions, or moving directions of moving objects, there is, for example, a method in which objects are observed by a plurality of laser sensors and the observation information of the laser sensors is merged on the same coordinate system to estimate the number, positions, and so on of the moving objects.

A method that detects moving objects using a plurality of laser sensors in this way presupposes, in order to estimate the moving direction of a moving object, that the laser sensors observe a range far wider than the range a moving object can move within each time frame. However, a laser sensor that projects laser light has a narrower observable range per unit time than a camera, so the fewer the laser sensors, the narrower the observation range, and as a result the estimation accuracy of the moving direction deteriorates significantly.

Regarding the problems described above, Patent Document 1 discloses a technique that uses a laser that irradiates slit light onto a path and an imaging device that observes the reflected light of the slit light, and estimates the number, positions, and traveling directions of persons from the change in brightness when a person passes the irradiation position of the slit light.

With the technique disclosed in Patent Document 1, when one imaging device and one laser arranged in the vicinity of the imaging device are used, the number of persons passing along the path can be counted. Furthermore, when one imaging device and lasers arranged at a plurality of locations in the vicinity of the imaging device are used, the number of passing persons can be counted for each traveling direction.
[Prior Art Documents]

[Patent Documents]

[Patent Document 1] Japanese Patent Application Laid-Open No. 2005-25593
Since the range over which the flow of moving objects such as pedestrians needs to be observed may extend over a wide area, it is preferable that a moving object detection device be inexpensive.

The technique disclosed in Patent Document 1 has a problem in that lasers must be arranged at at least two locations in order to estimate the number of moving objects in each moving direction and so on.

The present invention has been made to solve the problems described above, and an object thereof is to provide a moving object detection device capable of estimating the number, positions, and moving speeds of moving objects with high accuracy using one camera and one laser sensor installed at a single location.

A moving object detection device according to the present invention includes: an object position data creation unit that calculates position coordinates of a moving object from ranging information measured by a ranging sensor; a pixel moving direction data creation unit that calculates an estimated moving direction of each region on an image from image information captured by an image acquisition sensor; and a position/moving-direction correlation unit that estimates the number and moving directions of moving objects from the position coordinates and the estimated moving directions.

According to the present invention, the number and positions of moving objects and their moving speeds including the moving directions can be estimated with high accuracy using one camera and one laser sensor installed at a single location.
11‧‧‧ranging data acquisition unit
12‧‧‧object position data creation unit
13‧‧‧image acquisition unit
14‧‧‧pixel moving direction data creation unit
15‧‧‧position/moving-direction correlation unit
16‧‧‧display unit
17‧‧‧recording unit
18‧‧‧moving direction hypothesis value setting unit
19‧‧‧moving direction determination unit
20‧‧‧object height data creation unit
21‧‧‧object attribute determination unit
53, 54‧‧‧observation range
61‧‧‧object center-of-gravity position
62, 65, 67a, 67b, 67c, 69a, 69b, 69c‧‧‧pixel velocity
63‧‧‧object size
66a, 66b, 66c, 68a, 68b, 68c‧‧‧estimated object position
100, 100a, 100b‧‧‧moving object detection device
101‧‧‧ranging sensor processing unit
102‧‧‧image acquisition sensor processing unit
103‧‧‧fusion processing unit
200‧‧‧ranging sensor
300‧‧‧image acquisition sensor
501‧‧‧processing circuit
502‧‧‧HDD
503‧‧‧display
504‧‧‧input interface device
505‧‧‧output interface device
506‧‧‧memory
507‧‧‧CPU
Fig. 1 is a configuration diagram of a moving object detection system including the moving object detection device according to the first embodiment of the present invention.

Fig. 2 is a diagram for explaining an example of the installation conditions of the ranging sensor and the image acquisition sensor and the observation conditions of the ranging sensor and the image acquisition sensor in the first embodiment.

Fig. 3 is a diagram for explaining an example of the installation conditions of the ranging sensor and the image acquisition sensor and the observation conditions of the ranging sensor and the image acquisition sensor in the first embodiment, and is a plan view of Fig. 2 viewed from the positive direction of the z-axis.

Fig. 4 is a configuration diagram of the moving object detection device according to the first embodiment of the present invention.

Figs. 5A and 5B are diagrams showing an example of the hardware configuration of the moving object detection device according to the first embodiment of the present invention.

Fig. 6 is a flowchart explaining the operation of the moving object detection device according to the first embodiment of the present invention.

Fig. 7 is a schematic diagram explaining the speed estimation operation for a moving object in step ST603 of Fig. 6.

Fig. 8 is a configuration diagram of a moving object detection device according to a second embodiment of the present invention.

Fig. 9 is a flowchart explaining the operation of the moving object detection device according to the second embodiment of the present invention.

Fig. 10 is a schematic diagram showing, taking the case of i = 1 as an example, the operation of setting speed hypothesis values by the moving direction hypothesis value setting unit in step ST904 of Fig. 9.

Fig. 11 is a flowchart explaining the operation of step ST904 of Fig. 9 in detail.

Fig. 12 is a schematic diagram showing, taking the cases of j = 1 and j = 2 as examples, the operation of updating the probabilities of the speed hypothesis values v0, v1, and v2 by the moving direction determination unit in step ST909 of Fig. 9.

Fig. 13 is a flowchart explaining the detailed operation of step ST909 of Fig. 9.

Fig. 14 is a configuration diagram of a moving object detection device according to a third embodiment of the present invention.

Fig. 15 is a flowchart explaining the operation of the moving object detection device according to the third embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

First Embodiment

In the following description, as an example, the moving object detection device 100 according to the first embodiment of the present invention is applied with pedestrians moving along a path extending in one direction as the detection targets. That is, a case in which pedestrians are detected as the moving objects is described below.

Fig. 1 is a configuration diagram of a moving object detection system including the moving object detection device 100 according to the first embodiment of the present invention.

As shown in Fig. 1, in the moving object detection system, the moving object detection device 100 is connected to a ranging sensor 200 and an image acquisition sensor 300 via a network.

The ranging sensor 200 is, for example, a laser rangefinder, a radar, an ultrasonic sensor, or an active stereo sensor, and scans within one time frame of the images captured by the image acquisition sensor 300 to acquire ranging data.

The image acquisition sensor 300 is a visible-light camera, an infrared camera, or the like, and is installed so as to look down on the path on which pedestrians are to be detected and to capture images of the path.

The moving object detection device 100 estimates the positions of pedestrians moving along the path, their moving speeds including the moving directions, and so on, from the ranging information of the ranging sensor observation range at each time frame acquired from the ranging sensor 200 and the images within the image acquisition sensor observation range at each time frame acquired from the image acquisition sensor 300.

One ranging sensor 200 and one image acquisition sensor 300 form a pair, and one such pair of the ranging sensor 200 and the image acquisition sensor 300 is installed for one moving object detection target area.

A plurality of pairs each consisting of one ranging sensor 200 and one image acquisition sensor 300 may be connected to one moving object detection device 100 via the network, and the moving object detection device 100 can perform, for each pair consisting of one ranging sensor 200 and one image acquisition sensor 300, the processing of estimating the positions, moving speeds, and so on of pedestrians moving along the path. The specific processing will be described later.
Figs. 2 and 3 are diagrams for explaining an example of the installation conditions of the ranging sensor 200 and the image acquisition sensor 300 and the observation conditions of the ranging sensor 200 and the image acquisition sensor 300 in the first embodiment. Fig. 3 is a plan view of Fig. 2 viewed from the positive direction of the z-axis.

In Figs. 2 and 3, the x-axis direction is the direction in which the path extends, the z-axis direction is the direction perpendicular to the ground, and the y-axis direction is the direction perpendicular to the direction in which the path extends and parallel to the ground; the positive directions of the x-, y-, and z-axes form a right-handed orthogonal coordinate system.

As shown in Figs. 2 and 3, the image acquisition sensor 300 and the ranging sensor 200 are installed so as to look down on the path.

The image acquisition sensor 300 acquires images of the path at a preset angle of view, and the range projected from the installation position of the image acquisition sensor 300 onto the xy plane on the path is taken as the observation range 53 of the image acquisition sensor 300.

The ranging sensor 200 scans in the y-axis direction at least once within one time frame to acquire ranging data, and the range projected from the installation position of the ranging sensor 200 onto a one-dimensional line on the xy plane parallel to the y-axis is taken as the observation range 54 of the ranging sensor 200.

The observation range 53 of the image acquisition sensor 300 and the observation range 54 of the ranging sensor 200 partly overlap, and the width of the observation range 54 of the ranging sensor 200 in the x-axis direction is sufficiently narrower than the width of the observation range 53 of the image acquisition sensor 300 in the x-axis direction.

The one-dimensional line that forms the observation range of the ranging sensor 200 may be selected in accordance with the observation environment. For example, the ranging sensor 200 may observe a plurality of one-dimensional lines and select, from among them, a one-dimensional line that is unlikely to be blocked by obstacles or the like as the observation range.
Fig. 4 is a configuration diagram of the moving object detection device 100 according to the first embodiment of the present invention.

As shown in Fig. 4, the moving object detection device 100 includes a ranging data acquisition unit 11, an object position data creation unit 12, an image acquisition unit 13, a pixel moving direction data creation unit 14, a position/moving-direction correlation unit 15, a display unit 16, and a recording unit 17.

The ranging data acquisition unit 11 and the object position data creation unit 12 constitute a ranging sensor processing unit 101, the image acquisition unit 13 and the pixel moving direction data creation unit 14 constitute an image acquisition sensor processing unit 102, and the position/moving-direction correlation unit 15 constitutes a fusion processing unit 103.

The ranging data acquisition unit 11 sends commands to the ranging sensor 200 and acquires from the ranging sensor 200 the ranging information of the ranging sensor observation range at each time frame. The ranging data acquisition unit 11 outputs the acquired ranging information to the object position data creation unit 12.

The object position data creation unit 12 identifies, from the ranging information at a plurality of time frames acquired from the ranging data acquisition unit 11, the time frame at which the center-of-gravity position of a moving object passes through the observation range 54 of the ranging sensor 200. The object position data creation unit 12 also calculates the center-of-gravity position of the moving object on the x-axis and y-axis from the plurality of time frames acquired from the ranging data acquisition unit 11, and creates object position data indicating that center-of-gravity position.

The object position data creation unit 12 outputs, to the position/moving-direction correlation unit 15, the time frame at which the center of gravity of the moving object passes, that is, the passing time of the observation range 54 of the ranging sensor 200, together with the object position data.

Here, as an example of a physical quantity representing the position coordinates of a moving object, a configuration that calculates the center-of-gravity position of the whole moving object is used, but position coordinates of a part of the moving object may be used instead of the center-of-gravity position. For example, the positions of a pedestrian's shoulders or the positions of the wheels of an automobile may be used.
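For extracting centroid candidates from a one-dimensional range scan, the description refers to the method of JP 2011-108223 (detecting objects at a specific height). Purely as a hedged illustration of the idea, and not of that method, a single y-axis scan could be processed roughly as follows; the array names, thresholds, and grouping rule are assumptions.

```python
import numpy as np

def extract_object_centroids(scan_y, scan_height, head_height=1.3, tol=0.3, min_points=3):
    """Illustrative sketch: find contiguous runs of scan points whose measured
    height is near an assumed pedestrian head height, and return the y-axis
    centroid of each run as a candidate moving-object position."""
    mask = np.abs(scan_height - head_height) < tol  # points near head height
    centroids, run = [], []
    for y, hit in zip(scan_y, mask):
        if hit:
            run.append(y)
        else:
            if len(run) >= min_points:
                centroids.append(float(np.mean(run)))  # centroid on the y-axis
            run = []
    if len(run) >= min_points:
        centroids.append(float(np.mean(run)))
    return centroids
```

The x-coordinate of each candidate can then be taken as the x-axis position of the center of the ranging sensor's observation line, as noted later in the description.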
The image acquisition unit 13 sends commands to the image acquisition sensor 300 and acquires from the image acquisition sensor 300 the image within the observation range 53 of the image acquisition sensor 300 at each time frame. The image acquisition unit 13 outputs the acquired image to the pixel moving direction data creation unit 14.

The pixel moving direction data creation unit 14 calculates the estimated velocity of each pixel in the x-axis and y-axis directions from the images at a plurality of time frames acquired from the image acquisition unit 13, and creates, for each time frame, pixel velocity data representing the estimated velocities. The pixel moving direction data creation unit 14 outputs the pixel velocity data to the position/moving-direction correlation unit 15.

Here, as an example of a configuration that calculates an estimated velocity for each region on the image, the pixel moving direction data creation unit 14 calculates an estimated velocity for each pixel; however, the regions for which estimated velocities are calculated on the image do not necessarily have to be individual pixels. For example, the processing load may be reduced by calculating an estimated velocity for each region of 10 pixels vertically by 10 pixels horizontally.

Here, the pixel moving direction data creation unit 14 calculates estimated velocities in the x-axis and y-axis directions as an example of a physical quantity representing the moving direction of an object, but for the purpose of estimating only the moving direction of each moving object, it is not necessary to calculate a velocity. For example, the pixel moving direction data creation unit 14 may calculate a fixed-magnitude vector representing only the moving direction, or an azimuth angle representing the moving direction.

The position/moving-direction correlation unit 15 receives the object position data output from the object position data creation unit 12 and the pixel velocity data output from the pixel moving direction data creation unit 14, and estimates the moving speed of the moving object identified by the object position data.

For each moving object, the position/moving-direction correlation unit 15 causes the display unit 16 to display the time at which the center-of-gravity position passes the observation range 54 of the ranging sensor 200, the center-of-gravity position on the x-axis and y-axis, and data representing the estimated velocity in the x-axis and y-axis directions. The position/moving-direction correlation unit 15 also causes the recording unit 17 to store, for each moving object, the time at which the center-of-gravity position passes the observation range 54 of the ranging sensor 200, the center-of-gravity position on the x-axis and y-axis, and data representing the estimated velocity in the x-axis and y-axis directions.

Here, for each moving object, the time at which the center-of-gravity position passes the observation range 54 of the ranging sensor 200, the center-of-gravity position on the x-axis and y-axis, and the data representing the estimated velocity in the x-axis and y-axis directions are collectively referred to as fusion data.
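The patent does not specify a data format for the fusion data; purely as an illustration, it could be represented as a simple record, where the field names below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class FusionData:
    """Hypothetical record for one moving object (field names are assumed)."""
    passing_time_frame: int  # time frame k at which the centroid crossed range 54
    centroid_x: float        # center-of-gravity position on the x-axis
    centroid_y: float        # center-of-gravity position on the y-axis
    velocity_x: float        # estimated velocity component along the x-axis
    velocity_y: float        # estimated velocity component along the y-axis
```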
The display unit 16 is a display device such as a display, and displays the fusion data output from the position/moving-direction correlation unit 15.

The recording unit 17 stores the fusion data output from the position/moving-direction correlation unit 15.

Here, as shown in Fig. 4, the display unit 16 and the recording unit 17 are provided in the moving object detection device 100, but this is not restrictive; the display unit 16 and the recording unit 17 may be provided outside the moving object detection device 100, and the moving object detection device 100, the display unit 16, and the recording unit 17 may be connected via a network.

Figs. 5A and 5B are diagrams showing an example of the hardware configuration of the moving object detection device 100 according to the first embodiment of the present invention.

In the first embodiment of the present invention, the functions of the ranging sensor processing unit 101, the image acquisition sensor processing unit 102, and the fusion processing unit 103 are implemented by a processing circuit 501. That is, the moving object detection device 100 includes the processing circuit 501 that performs control for displaying or storing information on the positions, speeds, and so on of moving objects based on the acquired ranging information and image information.

The processing circuit 501 may be dedicated hardware as shown in Fig. 5A, or may be a CPU (Central Processing Unit) 507 that executes programs stored in a memory 506 as shown in Fig. 5B.

When the processing circuit 501 is dedicated hardware, the processing circuit 501 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.

When the processing circuit 501 is the CPU 507, the functions of the ranging sensor processing unit 101, the image acquisition sensor processing unit 102, and the fusion processing unit 103 are implemented by software, firmware, or a combination of software and firmware. That is, the ranging sensor processing unit 101, the image acquisition sensor processing unit 102, and the fusion processing unit 103 are implemented by a processing circuit such as the CPU 507 or a system LSI (Large-Scale Integration) that executes programs stored in an HDD (Hard Disk Drive) 502, the memory 506, or the like. The programs stored in the HDD 502, the memory 506, and so on can also be said to cause a computer to execute the procedures or methods of the ranging sensor processing unit 101, the image acquisition sensor processing unit 102, and the fusion processing unit 103. Here, the memory 506 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory), or to a magnetic disk, flexible disk, optical disc, compact disc, mini disc, DVD (Digital Versatile Disc), or the like.

The functions of the ranging sensor processing unit 101, the image acquisition sensor processing unit 102, and the fusion processing unit 103 may also be partly implemented by dedicated hardware and partly by software or firmware. For example, the function of the ranging sensor processing unit 101 may be implemented by the processing circuit 501 as dedicated hardware, and the functions of the image acquisition sensor processing unit 102 and the fusion processing unit 103 may be implemented by the processing circuit reading and executing programs stored in the memory 506.

The display unit 16 is, for example, a display 503.

The recording unit 17 uses, for example, the HDD 502. This is merely an example, and the recording unit 17 may be constituted by a DVD, the memory 506, or the like.

The moving object detection device 100 also has an input interface device 504 and an output interface device 505 for communicating with external devices such as the ranging sensor 200 and the image acquisition sensor 300. For example, the ranging sensor processing unit 101 acquires the ranging information obtained by the ranging sensor 200 using the input interface device 504 such as USB (Universal Serial Bus) or Ethernet (registered trademark; hereinafter this notation is omitted). Also, for example, the image acquisition sensor processing unit 102 acquires the images captured by the image acquisition sensor 300 using the input interface device 504 such as DVI (Digital Visual Interface, registered trademark), HDMI (High-Definition Multimedia Interface, registered trademark), USB, or Ethernet. Also, for example, the fusion processing unit 103 is connected to the display 503 using the output interface device 505 such as USB or Ethernet, and outputs information such as the number, positions, and speeds of moving objects.
The operation of the moving object detection device 100 according to the first embodiment will now be described.

Fig. 6 is a flowchart explaining the operation of the moving object detection device 100 according to the first embodiment of the present invention.

Using Fig. 6, the operation of the moving object detection device 100 will be described, taking as an example the case of estimating the moving speed of a moving object whose center of gravity passes the observation range 54 of the ranging sensor 200 at the k-th time frame from the start of observation (k is a natural number).

In the following, the k-th time frame from the start of observation is denoted as time k. The installation positions and observation ranges of the ranging sensor 200 and the image acquisition sensor 300 are assumed to be preset and known, and the clocks used to measure the observation times of the ranging sensor 200 and the image acquisition sensor 300 are assumed to be synchronized between the two sensors.

The ranging sensor processing unit 101 extracts, from the ranging information at a plurality of time frames acquired from the ranging sensor 200, the moving objects whose center-of-gravity positions have passed the observation range 54 of the ranging sensor 200 at time k, and creates object position data at time k (step ST601). Specifically, the object position data creation unit 12 extracts the moving objects that have passed within the observation range 54 of the ranging sensor 200 based on the ranging information at a plurality of time frames output by the ranging data acquisition unit 11, identifies the time frame at which the center-of-gravity position of each moving object passes within the observation range of the ranging sensor 200, and creates object position data. When a plurality of moving objects have been extracted, the identification of the time frame and the creation of the object position data are performed for each moving object. The object position data creation unit 12 may cause a ranging information storage unit (not shown) to store the ranging information of the time frames output by the ranging data acquisition unit 11, and create the object position data from the plurality of pieces of ranging information stored in the ranging information storage unit. The object position data creation unit 12 outputs the created object position data to the position/moving-direction correlation unit 15.

As a method of extracting the center-of-gravity position of a moving object from the ranging information of each time frame, for example, the method disclosed in Japanese Patent Application Laid-Open No. 2011-108223 is used. The method disclosed in that document extracts the number and positions of moving objects at a specific height; for example, when extracting the positions of pedestrians, the number and positions of objects present at the height of a pedestrian's head can be obtained.

The method is not limited to this; the object position data creation unit 12 may instead acquire object position data at time k that has been created in advance. Specifically, in step ST601 the object position data creation unit 12 need not create the object position data at time k every time, but may store object position data created beforehand and then acquire the stored object position data.

The image acquisition sensor processing unit 102 creates pixel velocity data at time k from the images of each time frame acquired from the image acquisition sensor 300 (step ST602). Specifically, the pixel moving direction data creation unit 14 creates the pixel velocity data from the images at a plurality of time frames output by the image acquisition unit 13. The pixel moving direction data creation unit 14 may cause an image information storage unit (not shown) to store the images of the time frames output by the image acquisition unit 13, and create the pixel velocity data from the images stored in the image information storage unit. The pixel moving direction data creation unit 14 outputs the created pixel velocity data to the position/moving-direction correlation unit 15.

As a method of creating pixel velocity data from the images of each time frame, for example, the method disclosed in B. Lucas, T. Kanade, "An Iterative Image Registration Technique with an Application to Stereo Vision," Proceedings DARPA Image Understanding Workshop, pp. 121-130, April 1981 is used. The method disclosed in that document estimates, by the least-squares method, the velocity vector that minimizes the residual of the brightness intensity between time frames, on the premise that the brightness intensity of each pixel is spatially continuous and that the motion of the object is linear. According to this method, the velocity vectors of pixels representing stationary objects take values close to zero, and pixel velocity data is obtained in which the velocity vectors of pixels representing moving objects take values close to the moving speed.
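The text cites the Lucas-Kanade least-squares estimation; as a rough, non-authoritative sketch of producing per-pixel velocity data, the snippet below instead uses OpenCV's dense Farneback optical flow as a readily available stand-in. The parameter choices and the assumption of BGR input frames are illustrative, not taken from the patent.

```python
import cv2

def pixel_velocity_data(prev_frame, curr_frame, frame_interval_s=1.0):
    """Produce per-pixel velocity components Vx, Vy between two frames.
    Dense Farneback optical flow is used here only as a stand-in for the
    Lucas-Kanade estimation cited in the text."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    # flow[..., 0] is displacement along x, flow[..., 1] along y (pixels/frame)
    v_x = flow[..., 0] / frame_interval_s
    v_y = flow[..., 1] / frame_interval_s
    return v_x, v_y
```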
The method is not limited to this; the pixel moving direction data creation unit 14 may instead acquire pixel velocity data at time k that has been created in advance. Specifically, for example, in step ST602 the pixel moving direction data creation unit 14 need not create the pixel velocity data at time k every time, but may store pixel velocity data created beforehand and then acquire the stored pixel velocity data.

The pixel moving direction data creation unit 14 may also create pixel velocity data only for a partial region of the image in accordance with the observation environment. For example, if a region in which no moving object can exist, such as a region through which pedestrians cannot pass, is known from data obtained before acquisition or from past time frames, the creation of pixel velocity data for that region may be omitted.

The position/moving-direction correlation unit 15 uses the object position data at time k output by the object position data creation unit 12 in step ST601 and the pixel velocity data at time k output by the pixel moving direction data creation unit 14 in step ST602 to estimate the moving speed of each moving object identified by the object position data at time k, and creates fusion data at time k for that moving object (step ST603).

Here, Fig. 7 is a schematic diagram explaining the speed estimation operation for moving objects in step ST603 of Fig. 6. In Fig. 7, it is assumed that two moving objects whose centers of gravity have passed the observation range 54 of the ranging sensor 200 at time k have been extracted.

The position/moving-direction correlation unit 15 extracts, for the center-of-gravity position 61 of each moving object at time k, the pixel velocities 62 within the object size 63 centered on that center-of-gravity position 61. The position/moving-direction correlation unit 15 then calculates the estimated velocities v1 and v2 of the moving objects from the pixel velocities within the object size 63.

Here, the x-coordinate and y-coordinate in real space are taken to coincide with the x-coordinate and y-coordinate on the frame of the image obtained from the image acquisition sensor 300.

In the following, the method of calculating the estimated velocity v1 = (v1,x, v1,y) will be described, taking as an example a moving object whose center-of-gravity position at time k in the object position data is (xd, yd). Here, xd is the x-axis position of the center of gravity of the moving object, and yd is the y-axis position of the center of gravity of the moving object. Further, v1,x and v1,y are the x-axis direction component and the y-axis direction component of the estimated velocity. Since the observation range 54 of the ranging sensor 200 is narrow in the x-axis direction, xd may also be taken as the x-axis position of the center of the observation range of the ranging sensor 200.

In the following, the pixel velocity data at time k is denoted by Vx(xi, yi) and Vy(xi, yi). Here, xi and yi denote the x-axis center position and the y-axis center position of the i-th pixel, respectively, Vx(xi, yi) denotes the x-axis direction component of the pixel velocity, and Vy(xi, yi) denotes the y-axis direction component of the pixel velocity. The total number of pixels is denoted by NP.

The estimated velocity v1 of the moving object at the center-of-gravity position (xd, yd) is calculated from the following formulas (1) and (2).
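The images of formulas (1) and (2) are not reproduced in this text. From the surrounding definitions (a weight function W summed over all NP pixels), they presumably take the following form; this is a reconstruction, not a verbatim copy of the patent.

```latex
v_{1,x} = \sum_{i=1}^{N_P} W(x_i, y_i;\, x_d, y_d)\, V_x(x_i, y_i) \qquad (1)

v_{1,y} = \sum_{i=1}^{N_P} W(x_i, y_i;\, x_d, y_d)\, V_y(x_i, y_i) \qquad (2)
```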
Here, W(xi, yi; xd, yd) is a function representing the object size 63, and is a function that satisfies the following formula (3) for any (xd, yd).
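The image of formula (3) is likewise not reproduced; given that W must satisfy it for any (xd, yd), it is presumably the normalization condition below (again a reconstruction).

```latex
\sum_{i=1}^{N_P} W(x_i, y_i;\, x_d, y_d) = 1 \quad \text{for any } (x_d, y_d) \qquad (3)
```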
For example, when a circular object size 63 is defined with (xd, yd) as its center, it is defined as shown in the following formula (4).

Here, the parameter representing the radius of the object size 63 is denoted by R, and the number of pixels (xi', yi') satisfying the following formula (5) is denoted by n(R).

(xi' - xd)^2 + (yi' - yd)^2 < R^2   (5)
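The image of formula (4), which in the original appears just before the definitions of R and n(R), is not reproduced either. From those definitions, a plausible reconstruction is a uniform weight of 1/n(R) inside the circle of radius R and zero outside:

```latex
W(x_i, y_i;\, x_d, y_d) =
\begin{cases}
  \dfrac{1}{n(R)} & \text{if } (x_i - x_d)^2 + (y_i - y_d)^2 < R^2 \\[4pt]
  0 & \text{otherwise}
\end{cases} \qquad (4)
```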
The size of the object size 63 may be changed, or its shape may be made elliptical, rectangular, or the like, in accordance with the assumed type of moving object, the error of the pixel velocity data of the image acquisition sensor 300, and the error of the object position data of the ranging sensor 200. When pedestrians are the detection targets as in this embodiment, the object size 63 may be set so as to represent the average size of a pedestrian viewed from above.

Further, instead of uniformly averaging the pixel velocity data inside the object size 63 as described above, the value of the function W(xi, yi; xd, yd) may be given a gradient so that a weighted average is taken within the object size 63.
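A minimal sketch of the uniform-window estimation described above is given below; the array names and the circular window are assumptions, and a Gaussian taper could replace the uniform weights to realize the weighted average just mentioned.

```python
import numpy as np

def estimate_object_velocity(v_x, v_y, x_grid, y_grid, x_d, y_d, radius):
    """Average the pixel velocities inside a circle of the given radius
    centered on the object's center-of-gravity position (x_d, y_d)."""
    inside = (x_grid - x_d) ** 2 + (y_grid - y_d) ** 2 < radius ** 2
    n = np.count_nonzero(inside)
    if n == 0:
        return 0.0, 0.0  # no pixels inside the window
    # Uniform weights 1/n(R); replace with e.g. Gaussian weights for a weighted average
    return float(v_x[inside].mean()), float(v_y[inside].mean())
```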
Further, on the premise that the object position data may erroneously contain unnecessary data such as stationary objects or objects that do not actually exist, unnecessary data may be discarded based on the estimated velocity. For example, if the estimated speed is below a threshold, the object may be regarded as a stationary object and excluded from the number of moving objects. That is, the number of moving objects calculated by the object position data creation unit 12 may be treated as a provisional number of moving objects, and the final number of moving objects may be estimated by the position/moving-direction correlation unit 15.
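For instance (the threshold value and record fields are assumptions, reusing the hypothetical fusion record sketched earlier), the provisional detections could be filtered like this:

```python
import math

def filter_stationary(candidates, speed_threshold=0.3):
    """Drop provisional detections whose estimated speed (m/s) is below the
    threshold, treating them as stationary objects."""
    return [c for c in candidates
            if math.hypot(c.velocity_x, c.velocity_y) >= speed_threshold]
```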
Returning to the flowchart of Fig. 6.

The position/moving-direction correlation unit 15 determines whether the estimated velocities have been estimated for all the moving objects identified by the object position data at time k (step ST604).

If it is determined in step ST604 that the estimated velocities have not yet been estimated for all the moving objects identified by the object position data at time k ("NO" in step ST604), the process returns to step ST603 and the subsequent processing is repeated.

If it is determined in step ST604 that the estimated velocities have been estimated for all the moving objects identified by the object position data at time k ("YES" in step ST604), the position/moving-direction correlation unit 15 combines, for each of the moving objects identified by the object position data at time k, the passing time, the center-of-gravity position, and the estimated velocity estimated in step ST603, and outputs the result to the display unit 16 and the recording unit 17 as fusion data (step ST605).

The display unit 16 displays information based on the fusion data output from the position/moving-direction correlation unit 15. The information displayed by the display unit 16 is, for example, the number of moving objects, the center-of-gravity position of each moving object, and the estimated velocities. Since the estimated velocities also contain information on the moving direction, the display unit 16 may, for example, display per unit time the number of moving objects that have passed the observation range 54 of the ranging sensor 200 toward the positive x direction or the negative x direction of the path.
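A minimal sketch of such a directional count, again using the hypothetical fusion record introduced earlier (the aggregation interval is left to the caller):

```python
def count_by_direction(fusion_records):
    """Count objects crossing the ranging line toward +x and -x,
    using the sign of the estimated x-axis velocity component."""
    positive = sum(1 for r in fusion_records if r.velocity_x > 0)
    negative = sum(1 for r in fusion_records if r.velocity_x < 0)
    return positive, negative
```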
The information to be displayed on the display unit 16 may be determined according to preset conditions. This is not restrictive; the user may set in advance, from an input device (not shown), the information to be displayed on the display unit 16, or the user may switch the displayed information during display.

The recording unit 17 records the fusion data output from the position/moving-direction correlation unit 15. The information recorded by the recording unit 17 is, for example, the number of moving objects, the center-of-gravity position of each moving object, and the estimated velocities. The recording unit 17 may also record, per unit time, the number of moving objects that have passed the observation range 54 of the ranging sensor 200 toward the positive x direction or the negative x direction of the path.

The information to be recorded by the recording unit 17 may be determined according to preset conditions. This is not restrictive; the user may set in advance, from an input device (not shown), the information to be recorded by the recording unit 17, or the user may change the information to be recorded.

As described above, according to the first embodiment, the number and positions of moving objects and their moving speeds including the moving directions can be estimated with one image acquisition sensor 300 and one ranging sensor 200 arranged at one detection position.

In general, compared with an image acquisition sensor, which requires only a light-receiving element, a ranging sensor, which requires both a light-projecting element and a light-receiving element, consists of more components and requires more frequent correction for aging, so its manufacturing and operating costs are higher. In the moving object detection device 100 of the first embodiment, only one ranging sensor 200 such as a laser sensor is required for one detection position, so the same function can be realized at lower cost than with a method using a plurality of ranging sensors.

Further, according to the first embodiment, since the moving speed including the moving direction of a moving object can be estimated with the light-projecting element of a single ranging sensor 200 installed at one location, there are fewer restrictions on the installation conditions and the cost of installation work can be lower than with a method that projects slit light at a plurality of locations.

Further, according to the first embodiment, the number, positions, and moving directions of moving objects can be estimated with the image acquisition sensor 300 and the ranging sensor 200 that scans in only one dimension. In general, a ranging sensor that scans in one dimension requires less equipment and processing for scanning than a ranging sensor that scans in two dimensions, and therefore its manufacturing, installation, and operating costs are lower. Accordingly, the same function can be realized at lower cost than with a method using a ranging sensor such as a laser sensor that scans in two dimensions.

Further, since the ranging sensor 200 can also observe the shape of a moving object in the irradiation direction of the laser or the like emitted by the ranging sensor 200, the position estimation error of a moving object is smaller than that of the image acquisition sensor 300, and it is easy to separate the regions of the individual moving objects among a plurality of objects close to each other. Therefore, according to the first embodiment, the estimation accuracy of the number and positions of moving objects is higher than that of a method using only an image acquisition sensor 300 such as a visible-light camera.

Further, according to the first embodiment, the estimated velocity of each pixel is calculated from the brightness and intensity of the pixels of the image of the image acquisition sensor 300, and the moving direction of the moving object is estimated from it. Therefore, when the resolution of the image acquisition sensor 300 is low and the shape of an observed moving object is unclear, for example when the image pixels are coarse relative to the size of the moving object or when part of the outline of the moving object is missing, the deterioration of the moving direction estimation accuracy is smaller than with a method using only an image acquisition sensor 300 such as a visible-light camera.

Further, according to the first embodiment, since the number of moving objects is estimated from the observation results of the ranging sensor 200, the number of moving objects can be estimated without erroneously judging an insubstantial change in brightness intensity in the background as a moving object, for example when the shadow of a moving object appears in the image or when a moving picture is played on a screen in the background.
第2實施形態 Second embodiment
在第1實施形態,以在藉影像取得感測器300所 取得之影像上移動物體的亮度及強度是連續為前提,作成在相同之時刻圖框內將根據來自測距感測器200的測距資訊之移動物體的重心位置、與根據來自影像取得感測器300之影像的像素速度賦與對應。 In the first embodiment, the sensor 300 is obtained from the borrowed image. The premise that the brightness and intensity of the moving object on the acquired image is continuous. At the same time, the position of the center of gravity of the moving object based on the ranging information from the ranging sensor 200 at the same time frame and the sensing based on the obtained image The pixel speed of the image of the processor 300 corresponds.
可是,實際上,例如因光源照度之偏倚、或背景色的變化、移動物體的轉動與變形、夜間攝影之誇大雜訊等,移動物體之亮度強度連續地變化。在這種情況,移動物體之像素速度暫時成為與真正之移動物體的速度大為相異的值,在某時刻圖框之像素速度上會發生移動物體消失的情形。 However, in fact, the brightness intensity of a moving object changes continuously due to, for example, a bias in the illumination of the light source, a change in background color, rotation and deformation of a moving object, exaggerated noise of night photography, and the like. In this case, the pixel speed of the moving object temporarily becomes a value that is greatly different from the speed of the real moving object. At some point, the pixel speed of the frame may disappear.
因此,在本第2實施形態,說明從複數個時刻圖框的像素速度,設定複數種各移動物體之推定速度的候選,從各候選中判定最合理之推定速度的實施形態。 Therefore, in this second embodiment, an embodiment will be described in which a plurality of candidates of estimated speeds of each moving object are set from the pixel speeds of a plurality of time frames, and the most reasonable estimated speed is determined from each candidate.
本第2實施形態之移動物體檢測系統的構成係因為與在第1實施形態使用第1圖所說明者一樣,所以省略重複的說明。 The configuration of the moving object detection system according to the second embodiment is the same as that described with reference to FIG. 1 in the first embodiment, and redundant description is omitted.
又,關於測距感測器200與影像取得感測器300之設置狀況、及測距感測器200與影像取得感測器300的觀測條件,亦因為例如如在第1實施形態使用第2圖、第3圖之說明所示,所以省略重複的說明。 In addition, the installation conditions of the ranging sensor 200 and the image acquisition sensor 300 and the observation conditions of the ranging sensor 200 and the image acquisition sensor 300 are also used because, for example, the second embodiment is used in the first embodiment. As shown in the description of FIG. 3 and FIG. 3, overlapping description is omitted.
第8圖係本發明之第2實施形態之移動物體檢測裝置100a的構成圖。 Fig. 8 is a configuration diagram of a moving object detection device 100a according to a second embodiment of the present invention.
在第8圖,關於與使用第4圖所說明之第1實施形態的移動物體檢測裝置100一樣的構成,附加相同的符號,並省略說明。 In FIG. 8, the same configurations as those of the moving object detection device 100 according to the first embodiment described with reference to FIG. 4 are assigned the same reference numerals, and descriptions thereof are omitted.
本發明之第2實施形態之移動物體檢測裝置100a係與第1實施形態的移動物體檢測裝置100相比,僅在替代位置移動方向相關部15,而由移動方向假設值設定部18與移動方向決定部19構成融合處理部103上相異。 Compared with the moving object detection device 100 according to the first embodiment, the moving object detection device 100a according to the second embodiment of the present invention only has a moving direction correlation section 15 at a substitute position, and the moving direction assumption value setting section 18 and the moving direction The determination unit 19 differs from the fusion processing unit 103.
在本第2實施形態,物體位置資料製作部12係向移動方向假設值設定部18輸出物體位置資料。又,像素移動方向資料製作部14係向移動方向假設值設定部18及移動方向決定部19輸出像素速度資料。 In the second embodiment, the object position data creation unit 12 outputs the object position data to the movement direction assumption value setting unit 18. The pixel moving direction data creating unit 14 outputs the pixel speed data to the moving direction assumption value setting unit 18 and the moving direction determining unit 19.
The movement direction assumption value setting unit 18 obtains the object position data output from the object position data creation unit 12 and the pixel velocity data of the time frames output from the pixel moving direction data creation unit 14, and sets a plurality of candidate values for the moving speed of a moving object observed by the ranging sensor 200. Here, a candidate value of the moving speed of a moving object set by the movement direction assumption value setting unit 18 is referred to as a speed assumption value.
移動方向假設值設定部18係向移動方向決定部19輸出速度假設值。 The movement direction assumption value setting unit 18 outputs a speed assumption value to the movement direction determination unit 19.
The movement direction determination unit 19 obtains the pixel velocity data of the time frames output from the pixel moving direction data creation unit 14 and the speed assumption values of the moving object output from the movement direction assumption value setting unit 18, and determines the moving speed of the moving object from among the speed assumption values. The movement direction determination unit 19 outputs to the display unit 16 and the recording unit 17 fusion data indicating the time at which the center-of-gravity position of the moving object passed through the observation range 54 of the ranging sensor 200, the center-of-gravity position on the x axis and the y axis, and the estimated speed in the x-axis and y-axis directions.
移動物體檢測裝置100a的硬體構成係因為與在第 1實施形態使用第5A圖、第5B圖所說明者一樣,所以省略重複的說明。 The hardware configuration of the moving object detection device 100a is similar to that of the first embodiment. The first embodiment is the same as that described with reference to Figs. 5A and 5B, so redundant descriptions are omitted.
說明本第2實施形態之移動物體檢測裝置100a的動作。 The operation of the moving object detection device 100a according to the second embodiment will be described.
第9圖係說明本發明之第2實施形態的移動物體檢測裝置100a之動作的流程圖。 Fig. 9 is a flowchart illustrating the operation of the moving object detection device 100a according to the second embodiment of the present invention.
此外,在以下的說明,k係當作N+1以上的自然數。關於參數N將後述。 In the following description, k is taken as a natural number equal to or greater than N + 1. The parameter N will be described later.
測距感測器處理部101係從測距感測器200所取得之在複數個時刻圖框的測距資訊,抽出在時刻k重心位置通過測距感測器200之觀測範圍54的移動物體,並製作在時刻k的物體位置資料(步驟ST901)。具體的動作係與在第1實施形態所說明之第6圖的步驟ST601一樣。測距感測器處理部101之物體位置資料製作部12係向移動方向假設值設定部18輸出所製作之物體位置資料。 The ranging sensor processing unit 101 is based on the ranging information obtained from the ranging sensor 200 at a plurality of time frames, and extracts a moving object passing through the observation range 54 of the ranging sensor 200 at the position of the center of gravity at time k. And create object position data at time k (step ST901). The specific operation is the same as step ST601 of FIG. 6 described in the first embodiment. The object position data creation unit 12 of the ranging sensor processing unit 101 outputs the created object position data to the movement direction assumption value setting unit 18.
移動物體檢測裝置100a之控制部(省略圖示)係對變數i進行起始化(步驟ST902)。此處,變數i的起始值係當作0。 The control unit (not shown) of the moving object detection device 100a initializes the variable i (step ST902). Here, the starting value of the variable i is taken as 0.
The image acquisition sensor processing unit 102 creates the pixel velocity data at time k-i from the images of a plurality of time frames acquired by the image acquisition sensor 300 (step ST903). The specific method of creating the pixel velocity data is the same as step ST602 of FIG. 6 described in the first embodiment. The pixel moving direction data creation unit 14 of the image acquisition sensor processing unit 102 outputs the pixel velocity data at time k-i to the movement direction assumption value setting unit 18 and the movement direction determination unit 19.
The movement direction assumption value setting unit 18 uses the object position data at time k output by the object position data creation unit 12 in step ST901 and the pixel velocity data at time k-i output by the pixel moving direction data creation unit 14 in step ST903 to set speed assumption values for the moving objects included in the object position data at time k (step ST904).
此處,第10圖係以i=1的情況為例,表示在第9圖的步驟ST904之藉移動方向假設值設定部18的速度假設值之設定之動作的示意圖。 Here, FIG. 10 is a schematic diagram showing the operation of setting the speed assumption value by the movement direction assumption value setting unit 18 in step ST904 of FIG. 9 taking i = 1 as an example.
For the center-of-gravity position 61 of a moving object at time k, the movement direction assumption value setting unit 18 extracts the object position estimate 64 at time k-1, that is, the estimated center-of-gravity position of the moving object at time k-1, together with the pixel velocities 65 at time k-1 that lie within the object size 63 centered on the object position estimate 64 at time k-1. Then, from the extracted pixel velocities 65 at time k-1, the movement direction assumption value setting unit 18 calculates the speed assumption value v_1 of the moving object.
第11圖係詳細地說明第9圖之步驟ST904之動作的流程圖。 Fig. 11 is a flowchart illustrating the operation of step ST904 in Fig. 9 in detail.
此處,將作為對象之某移動物體的重心位置當作(xd,yd),將速度假設值當作vu=(vu,x,vu,y)。xd係當作移動物體之重心位置的x軸位置,yd係當作移動物體之重心位置的y軸位置,vu,x、vu,y係分別當作推定速度之x軸方向成分、y軸方向成分。此外,因為測距感測器200之觀測範圍54係在x軸方向窄,所以亦可xd係當作測距感測器200之觀測範圍之中心的x軸位置。 Here, the position of the center of gravity of a moving object as an object is taken as (x d , y d ), and the speed assumption value is taken as v u = (v u, x , v u, y ). x d is the x-axis position of the center of gravity of the moving object, y d is the y-axis position of the center of gravity of the moving object, v u, x , v u, y are the x-axis direction components of the estimated velocity, respectively , Y-axis direction component. In addition, since the observation range 54 of the ranging sensor 200 is narrow in the x-axis direction, x d may also be used as the x-axis position of the center of the observation range of the ranging sensor 200.
The pixel velocity data at time k-i are denoted by V_{k-i,x}(x_q, y_q) and V_{k-i,y}(x_q, y_q). Here, x_q and y_q denote the x-axis and y-axis center positions of the q-th pixel, V_{k-i,x}(x_q, y_q) denotes the x-axis component of the pixel velocity, and V_{k-i,y}(x_q, y_q) denotes the y-axis component of the pixel velocity. The total number of pixels is denoted by N_P.
首先,移動方向假設值設定部18係在從像素移動方向資料製作部14所取得之影像,即藉影像取得感測器300所取得之影像中,選擇一個未選擇的像素(步驟ST1101)。在以下,將在步驟ST1101移動方向假設值設定部18所選擇之像素稱為「選擇像素」,並將該選擇像素的x軸中心位置、y軸中心位置當作(xs,ys)。 First, the movement direction assumption value setting unit 18 selects an unselected pixel from the image obtained from the pixel movement direction data creation unit 14, that is, the image obtained by the image acquisition sensor 300 (step ST1101). Hereinafter, the pixel selected by the movement direction assumption value setting unit 18 in step ST1101 is referred to as a "selected pixel", and the x-axis center position and the y-axis center position of the selected pixel are referred to as (x s , y s ).
移動方向假設值設定部18係從選擇像素的附近之在時刻k-i的像素速度資料,算出選擇像素的平均速度vm=(vm,x,vm,y)(步驟ST1102)。此處,vm,x、vm,y係當作平均速度之x軸方向成分、y軸方向成分。 The movement direction hypothesis setting unit 18 calculates the average velocity v m = (v m, x , v m, y ) of the selected pixels from the pixel velocity data at the time ki in the vicinity of the selected pixels (step ST1102). Here, v m, x and v m, y are considered as the x-axis direction component and the y-axis direction component of the average speed.
The average velocity v_m = (v_{m,x}, v_{m,y}) is calculated from the following equations (6) and (7):

$$v_{m,x} = \sum_{q=1}^{N_P} W(x_q, y_q; x_s, y_s)\, V_{k-i,x}(x_q, y_q) \qquad (6)$$

$$v_{m,y} = \sum_{q=1}^{N_P} W(x_q, y_q; x_s, y_s)\, V_{k-i,y}(x_q, y_q) \qquad (7)$$

Here, W(x_q, y_q; x_s, y_s) is a function representing the object size 63, and satisfies the following equation (8) for any (x_s, y_s):

$$\sum_{q=1}^{N_P} W(x_q, y_q; x_s, y_s) = 1 \qquad (8)$$

For example, when a circular object size 63 centered on (x_s, y_s) is used, W is defined as in the following equation (9):

$$W(x_q, y_q; x_s, y_s) = \begin{cases} 1/n(R) & \text{if } (x_q - x_s)^2 + (y_q - y_s)^2 < R^2 \\ 0 & \text{otherwise} \end{cases} \qquad (9)$$

Here, R is the parameter representing the radius of the object size 63, and n(R) is the number of pixels (x_{q'}, y_{q'}) that satisfy the following equation (10):

$$(x_{q'} - x_s)^2 + (y_{q'} - y_s)^2 < R^2 \qquad (10)$$
The size of the object size 63 may be changed, or its shape may be made elliptical or rectangular, according to the type of moving object assumed, the error of the pixel velocity data of the image acquisition sensor 300, and the error of the object position data of the ranging sensor 200. When pedestrians are the detection target, as in this embodiment, the object size 63 may represent the average size of a pedestrian viewed from above. Also, instead of simply averaging the pixel velocity data inside the object size 63 as described above, the value of the function W(x_q, y_q; x_s, y_s) may be given a gradient so that a weighted average is taken inside the object size 63.
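As a purely illustrative aid (not part of the embodiment), the averaging of equations (6) to (10) with the uniform circular weight of equation (9) can be sketched in Python as follows; the function name, array layout, and use of NumPy are assumptions of this sketch only.

```python
import numpy as np

def average_velocity(vx, vy, x_coords, y_coords, xs, ys, R):
    """Average pixel velocity inside a circular window of radius R centred
    on (xs, ys), using the uniform weight W = 1/n(R) inside the circle and
    0 outside, in the spirit of equations (6)-(10)."""
    # vx, vy: 2-D arrays holding the x and y pixel-velocity components
    # x_coords, y_coords: 2-D arrays holding the pixel centre positions
    inside = (x_coords - xs) ** 2 + (y_coords - ys) ** 2 < R ** 2
    n = np.count_nonzero(inside)          # n(R): pixels inside the window
    if n == 0:
        return 0.0, 0.0                   # no pixel falls inside the window
    return vx[inside].sum() / n, vy[inside].sum() / n
```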
移動方向假設值設定部18係根據在步驟ST1102所算出之平均速度,算出以平均速度對時刻k外插選擇像素的位置(步驟ST1103)。此處,將以平均速度對時刻k外插選擇像素的位置稱為外插位置。 Based on the average speed calculated in step ST1102, the movement direction assumption value setting unit 18 calculates the position of the selected pixel by extrapolating the time k with the average speed (step ST1103). Here, the position where the selected pixel is extrapolated at an average speed from time k is referred to as an extrapolated position.
The movement direction assumption value setting unit 18 assumes that the moving object moves in a straight line at constant speed from time k-i to time k, and calculates the extrapolated position (x_e, y_e) from the following equations (11) and (12):

$$x_e = x_s + v_{m,x} T_{k-i,k} \qquad (11)$$

$$y_e = y_s + v_{m,y} T_{k-i,k} \qquad (12)$$
此處,Tk-i,k表示從時刻k-i至時刻k之圖框間經過時間。尤其,在i=0的情況係當作Tk,k=0。 Here, T ki, k represents the elapsed time between the frames from time ki to time k. In particular, in the case of i = 0, it is regarded as T k, k = 0.
In equations (11) and (12), uniform linear motion is assumed as the motion model of the moving object; however, the extrapolated position may be calculated with another motion model chosen according to the type of moving object, for example uniform-acceleration linear motion.
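For illustration only, the constant-velocity extrapolation of equations (11) and (12) amounts to the following small computation (a sketch; the names are illustrative):

```python
def extrapolate(xs, ys, vm_x, vm_y, T):
    """Extrapolate a pixel position (xs, ys) over the inter-frame time T
    with the average velocity (vm_x, vm_y), cf. equations (11) and (12)."""
    return xs + vm_x * T, ys + vm_y * T

# Example: a pixel at (1.0, 2.0) moving at (0.5, -0.2) units/s over 0.1 s
# is extrapolated to (1.05, 1.98).
```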
移動方向假設值設定部18算出在步驟ST1103所算出之外插位置、與移動物體之重心位置的殘差(步驟ST1104)。 The movement direction hypothesis value setting unit 18 calculates a residual from the extrapolated position calculated in step ST1103 and the position of the center of gravity of the moving object (step ST1104).
The movement direction assumption value setting unit 18 calculates the residual △r between the extrapolated position (x_e, y_e) calculated from the pixel velocity data at time k-i and the center-of-gravity position (x_d, y_d) of the moving object in the object position data at time k, according to the following equation (13):

$$\Delta r^2 = \begin{bmatrix} x_d - x_e \\ y_d - y_e \end{bmatrix}^{t} \begin{bmatrix} \sigma_x^2 & \sigma_{xy}^2 \\ \sigma_{xy}^2 & \sigma_y^2 \end{bmatrix}^{-1} \begin{bmatrix} x_d - x_e \\ y_d - y_e \end{bmatrix} \qquad (13)$$
此處,矩陣右上的t係表示矩陣的轉置,矩陣右上的「-1」係表示反矩陣。 Here, t in the upper right of the matrix represents the transpose of the matrix, and "-1" in the upper right of the matrix represents the inverse matrix.
Here, σ_x^2, σ_y^2, and σ_xy^2 are parameters representing the magnitude of the variation of the residual between the extrapolated position and the center-of-gravity position of the moving object: σ_x^2 is the variance of the x-axis component, σ_y^2 is the variance of the y-axis component, and σ_xy^2 is the covariance of the x-axis and y-axis components. For example, when the error of the y-axis component of the pixel velocity data is larger than that of the x-axis component, setting σ_y^2 to a value larger than σ_x^2 yields a residual that takes the difference in variation between the components into account. Alternatively, when it is difficult to quantify the variation of the residual between the extrapolated position and the center-of-gravity position of the moving object, the variations of the components may be assumed to be independent and identical, and the residual △r may be obtained from the simpler equation (14) below.
$$\Delta r^2 = (x_d - x_e)^2 + (y_d - y_e)^2 \qquad (14)$$
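A minimal sketch of the residual of equations (13) and (14), assuming a 2-by-2 covariance parameterised by σ_x^2, σ_y^2, and σ_xy^2 (names and defaults illustrative), might look like this:

```python
import numpy as np

def squared_residual(xd, yd, xe, ye,
                     sigma_x2=1.0, sigma_y2=1.0, sigma_xy2=0.0):
    """Squared residual between the measured centre of gravity (xd, yd)
    and the extrapolated position (xe, ye), weighted by the covariance of
    equation (13); with the default identity covariance it reduces to the
    simpler form of equation (14)."""
    d = np.array([xd - xe, yd - ye])
    cov = np.array([[sigma_x2, sigma_xy2],
                    [sigma_xy2, sigma_y2]])
    return float(d @ np.linalg.inv(cov) @ d)
```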
移動方向假設值設定部18係將在步驟ST1104所算出之殘差△r與在步驟ST1101~步驟ST1107的迴路處理中已算出之其他的殘差相比,判定是否是最小(步驟ST1105)。 The movement direction hypothesis value setting unit 18 determines whether the residual Δr calculated in step ST1104 is the smallest compared to other residuals calculated in the loop processing of steps ST1101 to ST1107 (step ST1105).
在步驟ST1105,與其他的殘差相比,在判定不是最小的情況,即,在正前之步驟ST1104所算出之殘差暫定不是最小值的情況(在步驟ST1105之”NO”的情況),跳越步驟ST1106的處理,移至步驟ST1107。 In step ST1105, compared with other residuals, when the determination is not the smallest, that is, when the residual calculated in step ST1104 immediately before is not tentatively the minimum value (in the case of "NO" in step ST1105), The process of step ST1106 is skipped, and the process proceeds to step ST1107.
In step ST1105, when the residual is determined to be the smallest compared with the other residuals, that is, when the residual calculated in the immediately preceding step ST1104 is tentatively the minimum value (in the case of "YES" in step ST1105), the movement direction assumption value setting unit 18 sets the average velocity v_m calculated in step ST1102 as the speed assumption value v_u of the moving object whose center-of-gravity position at time k is (x_d, y_d) (step ST1106). If another value has already been set as the speed assumption value of the moving object in the loop processing of steps ST1101 to ST1107, the movement direction assumption value setting unit 18 replaces it with the average velocity v_m calculated in step ST1102.
移動方向假設值設定部18判定是否已選擇在時刻k-i的像素速度資料中之全部的像素(步驟ST1107)。 The movement direction assumption value setting unit 18 determines whether or not all pixels in the pixel speed data at time k-i have been selected (step ST1107).
在步驟ST1107,判定未選擇全部之像素的情況(在步驟ST1107之”NO”的情況),回到步驟ST1101,並重複以後 的處理。 In step ST1107, it is determined that all the pixels have not been selected (in the case of "NO" in step ST1107), the process returns to step ST1101, and repeats thereafter. Processing.
在步驟ST1107,判定已選擇全部之像素的情況(在步驟ST1107之”YES”的情況),結束第11圖之處理。 In step ST1107, it is determined that all the pixels have been selected (in the case of "YES" in step ST1107), and the processing of Fig. 11 is ended.
如以上所示,藉由執行第11圖之處理,根據在時刻k-i的像素速度資料,設定移動物體之速度假設值。 As shown above, by executing the processing of FIG. 11, the speed assumed value of the moving object is set based on the pixel speed data at time k-i.
In the above processing of FIG. 11, the average velocity, the extrapolated position, and the residual with respect to the center-of-gravity position of the moving object are computed for all pixels of the pixel velocity data at time k-i in order to set the speed assumption value; however, the processing load may be reduced, for example, by limiting the selected pixels according to an assumed maximum speed of the moving object. Also, in the above processing of FIG. 11, only one speed assumption value is set from the pixel velocity data at time k-i; however, a plurality of speed assumption values may be set from the pixel velocity data at time k-i according to the magnitude of the residual.
The above processing of FIG. 11 has been described for the case where the pixel moving direction data creation unit 14 calculates the velocities in the x-axis and y-axis directions; however, even when the pixel moving direction data creation unit 14 calculates only the moving direction, the same processing can be performed, for example, by converting the estimated moving direction into velocities in the x-axis and y-axis directions using a preset parameter representing the typical moving speed of a pedestrian. Here, the pixel moving direction data creation unit 14 calculates the velocities in the x-axis and y-axis directions as one example of a physical quantity representing the moving direction of an object; for the purpose of estimating only the moving direction of an object, however, it is not necessarily required to calculate a velocity. For example, a vector of fixed magnitude indicating the moving direction, or an azimuth angle indicating the moving direction, may be calculated instead. In that case, the speed assumption value of an object set by the movement direction assumption value setting unit 18 is also referred to as a movement direction assumption value.
回到第9圖的流程圖。 Return to the flowchart of FIG. 9.
移動物體檢測裝置100a的控制部(省略圖示)判定變數i是否是參數N以上(步驟ST905)。 The control unit (not shown) of the moving object detection device 100a determines whether the variable i is greater than or equal to the parameter N (step ST905).
The parameter N represents the number of past time frames that are referred to when setting the speed assumption values. The parameter N is set, for example, to the maximum number of time frames over which the assumption that the target moves in a straight line at constant speed from time k-N to time k remains valid. In step ST905, when it is determined that the variable i is less than the parameter N (in the case of "NO" in step ST905), the movement direction assumption value setting unit 18 adds 1 to the variable i (step ST906) and returns to step ST903.
在步驟ST905,在判定變數i為參數N以上的情況(在步驟ST905之”YES”的情況),移動物體檢測裝置100a的控制部係對變數j進行起始化(步驟ST907)。變數j的起始值是1。即,控制部係將1代入變數j。控制部係向影像取得感測器處理部102輸出對變數j已進行起始化之主旨的資訊。 In step ST905, when it is determined that the variable i is greater than the parameter N (in the case of "YES" in step ST905), the control unit of the moving object detection device 100a initializes the variable j (step ST907). The starting value of the variable j is 1. That is, the control unit substitutes 1 into the variable j. The control unit outputs information to the image acquisition sensor processing unit 102 on the purpose of initializing the variable j.
影像取得感測器處理部102係從影像取得感測器300所取得之在複數個時刻圖框的影像,製作在時刻k+j的像素速度資料(步驟ST908)。具體之像素速度資料的製作方法係與在第1實施形態所說明之第6圖的步驟ST602一樣。影像取得感測器處理部102之像素移動方向資料製作部14係向移動方向假設值設定部18及移動方向決定部19輸出在時刻k+j的像素速度資料。 The image acquisition sensor processing unit 102 generates the pixel velocity data at the time k + j from the images obtained at the frames at a plurality of times from the image acquisition sensor 300 (step ST908). A specific method of generating the pixel speed data is the same as step ST602 of FIG. 6 described in the first embodiment. The pixel movement direction data creation unit 14 of the image acquisition sensor processing unit 102 outputs pixel speed data at the time k + j to the movement direction assumption value setting unit 18 and the movement direction determination unit 19.
The movement direction determination unit 19 uses the pixel velocity data at time k+j obtained in step ST908 to update, for each of the plurality of speed assumption values of the moving object set by the movement direction assumption value setting unit 18 in step ST904, a value representing the probability of that speed assumption value (step ST909). Here, the value representing the probability of each speed assumption value is referred to as the probability of the speed assumption value. The probability of each speed assumption value is updated successively in the loop processing over the variable j in steps ST908 to ST911. For example, for a certain speed assumption value v_i, the probability L_{i,k+j} updated from the pixel velocity data at time k+j is calculated from the probability L_{i,k+j-1} calculated in the loop processing for the previous value of the variable j.
FIG. 12 is a schematic diagram showing the operation of updating the probabilities of the speed assumption values v_0, v_1, and v_2 by the movement direction determination unit in step ST909 of FIG. 9, taking the cases of j = 1 and j = 2 as examples.
For the center-of-gravity position 61 of the moving object at time k, the movement direction determination unit 19 calculates, from each speed assumption value, the object position estimates 66a to 66c at time k+1 and the object position estimates 68a to 68c at time k+2. The movement direction determination unit 19 also extracts, from the pixel velocity data at times k+1 and k+2, the pixel velocities 67a to 67c and 69a to 69c at times k+1 and k+2 within the object size 63 centered on each object position estimate. Then, at each time frame, the movement direction determination unit 19 calculates the probability of each speed assumption value from the difference between the speed assumption value and the pixel velocities within the object size 63. This probability represents the degree to which a speed assumption value set from the pixel velocities at or before time k agrees with the pixel velocities at and after time k+1; the smaller the difference between the speed assumption value and the pixel velocities within the object size 63 over the time frames at and after time k+1, the higher the value.
在第12圖之例子,因為速度假設值v1與物體位置推定值66a周邊的像素速度67a、物體位置推定值68a周邊的像素速度69a的差異比剩下之速度假設值v0、v2小,所以移動物體之速度假設值v1之概率成為比速度假設值v0、v2之概率大的值。 In the example in FIG. 12, the difference between the speed assumption value v 1 and the pixel speed 67a around the object position estimation value 66a and the pixel speed 69a around the object position estimation value 68a is smaller than the remaining speed assumption values v 0 and v 2 Therefore, the probability of the velocity assumed value v 1 of the moving object is larger than the probability of the velocity assumed values v 0 and v 2 .
以下,詳細說明第9圖之步驟ST909的動作。 Hereinafter, the operation of step ST909 in FIG. 9 will be described in detail.
第13圖係說明第9圖之步驟ST909之細部動作的流程圖。 FIG. 13 is a flowchart illustrating detailed operations of step ST909 in FIG. 9.
此處,將作為對象之某移動物體的重心位置當作(xd,yd),將速度假設值當作vu=(vu,x,vu,y)。xd係當作移動物體之重心位置的x軸位置,yd係當作移動物體之重心位置的y軸位置,vu,x、vu,y係當作推定速度之x軸方向成分、y軸方向成分。此外,因為測距感測器200之觀測範圍54係在x軸方向窄,所以亦可xd係當作測距感測器200之觀測範圍之中心。 Here, the position of the center of gravity of a moving object as an object is taken as (x d , y d ), and the speed assumption value is taken as v u = (v u, x , v u, y ). x d is the x-axis position of the center of gravity of the moving object, y d is the y-axis position of the center of gravity of the moving object, v u, x , v u, y are the x-axis direction components of the estimated velocity, The y-axis component. In addition, since the observation range 54 of the ranging sensor 200 is narrow in the x-axis direction, x d can also be used as the center of the observation range of the ranging sensor 200.
又,以vk+j,x(xq,yq)、vk+j,y(xq,yq)表示在時刻k+j的像素速度資料。此處,xq及yq係表示第q個像素之x軸中心位置、y軸中心位置,vk+j,x(xq,yq)係表示像素速度之x軸方向成分,vk+j,y(xq,yq)係表示像素速度之y軸方向成分。此外,像素之總數係當作Np。 In addition, v k + j, x (x q , y q ) and v k + j, y (x q , y q ) represent pixel velocity data at time k + j. Here, x q and y q are the x-axis center position and y-axis center position of the q-th pixel, and v k + j, x (x q , y q ) are the x-axis direction components of the pixel velocity, and v k + j, y (x q , y q ) represents the y-axis direction component of the pixel velocity. In addition, the total number of pixels is taken as N p .
The movement direction determination unit 19 calculates, from the center-of-gravity position (x_d, y_d) of the moving object at time k and the speed assumption value v_u, the object position estimate (x_{e,k+j}, y_{e,k+j}) at time k+j for the speed assumption value v_u (step ST1301).
The movement direction determination unit 19 assumes that the moving object moves in a straight line at constant speed from time k to time k+j, and calculates the object position estimate (x_{e,k+j}, y_{e,k+j}) from the following equations (15) and (16) (step ST1301):

$$x_{e,k+j} = x_d + v_{u,x} T_{k,k+j} \qquad (15)$$

$$y_{e,k+j} = y_d + v_{u,y} T_{k,k+j} \qquad (16)$$
此處,Tk,k+j表示從時刻k至時刻k+j之圖框間經過時刻。 Here, T k, k + j represents the elapsed time between the frames from time k to time k + j.
In equations (15) and (16), uniform linear motion is assumed as the motion model of the moving object; however, the movement direction determination unit 19 may calculate the extrapolated position with another motion model chosen according to the type of moving object, for example uniform-acceleration linear motion.
The movement direction determination unit 19 calculates the average velocity v_{m,k+j} = (v_{m,k+j,x}, v_{m,k+j,y}) in the vicinity of the moving object at time k+j from the object position estimate calculated in step ST1301 and the pixel velocity data at time k+j (step ST1302). v_{m,k+j,x} and v_{m,k+j,y} are the x-axis and y-axis components of the average velocity. For example, as in step ST1102 of FIG. 11, the movement direction determination unit 19 calculates the average velocity v_{m,k+j} = (v_{m,k+j,x}, v_{m,k+j,y}) from the following equations (17) and (18):

$$v_{m,k+j,x} = \sum_{q=1}^{N_P} W(x_q, y_q; x_{e,k+j}, y_{e,k+j})\, v_{k+j,x}(x_q, y_q) \qquad (17)$$

$$v_{m,k+j,y} = \sum_{q=1}^{N_P} W(x_q, y_q; x_{e,k+j}, y_{e,k+j})\, v_{k+j,y}(x_q, y_q) \qquad (18)$$

Here, W(x_q, y_q; x_{e,k+j}, y_{e,k+j}) is a function representing the object size 63, and satisfies the following equation (19) for any (x_{e,k+j}, y_{e,k+j}):

$$\sum_{q=1}^{N_P} W(x_q, y_q; x_{e,k+j}, y_{e,k+j}) = 1 \qquad (19)$$

For example, when a circular object size 63 centered on (x_{e,k+j}, y_{e,k+j}) is used, W is defined as in the following equation (20):

$$W(x_q, y_q; x_{e,k+j}, y_{e,k+j}) = \begin{cases} 1/n(R) & \text{if } (x_{e,k+j} - x_q)^2 + (y_{e,k+j} - y_q)^2 < R^2 \\ 0 & \text{otherwise} \end{cases} \qquad (20)$$

Here, R is the parameter representing the radius of the object size 63, and n(R) is the number of pixels (x_{q'}, y_{q'}) that satisfy the following equation (21):

$$(x_{e,k+j} - x_{q'})^2 + (y_{e,k+j} - y_{q'})^2 < R^2 \qquad (21)$$
The size of the object size 63 may be changed, or its shape may be made elliptical or rectangular, according to the type of moving object assumed, the error of the pixel velocity data of the image acquisition sensor 300, and the error of the object position data of the ranging sensor 200. When pedestrians are the detection target, as in this embodiment, the object size 63 may represent the average size of a pedestrian viewed from above. Also, instead of simply averaging the pixel velocity data inside the object size 63 as described above, the value of the function W(x_q, y_q; x_{e,k+j}, y_{e,k+j}) may be given a gradient so that a weighted average is taken inside the object size 63.
The movement direction determination unit 19 calculates the probability L_{u,k+j} of the speed assumption value from the speed assumption value v_u = (v_{u,x}, v_{u,y}) and the average velocity v_{m,k+j} = (v_{m,k+j,x}, v_{m,k+j,y}) calculated in step ST1302 (step ST1303). The probability L_{u,k+j} of the speed assumption value v_u at time k+j is calculated from the following equation (22):

$$L_{u,k+j} = L_{u,k+j-1}\, \frac{1}{2\pi\sqrt{|S_{k+j}|}} \exp\!\left(-\frac{1}{2}\,(v_u - v_{m,k+j})^{t}\, S_{k+j}^{-1}\,(v_u - v_{m,k+j})\right) \qquad (22)$$

Here, the speed assumption value v_u and the average velocity v_{m,k+j} are treated as the column vectors given by the following equations (23) and (24):

$$v_u = \begin{bmatrix} v_{u,x} \\ v_{u,y} \end{bmatrix} \qquad (23)$$

$$v_{m,k+j} = \begin{bmatrix} v_{m,k+j,x} \\ v_{m,k+j,y} \end{bmatrix} \qquad (24)$$
S_{k+j} is a 2-by-2 parameter matrix representing the covariance of the residual v_u - v_{m,k+j}. Regarding the probability L_{u,k+j-1} at time k+j-1, in the particular case of j = 1 it is taken as

$$L_{u,k} = 1 \qquad (25)$$

The probability L_{u,k+j} above is based on a model in which the average velocity v_{m,k+j} equals the true speed of the moving object plus white Gaussian noise; a model with colored or non-Gaussian noise may instead be adopted for the error distribution of the average velocity.
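As an illustration of the recursive probability update described around equations (22) to (25), a sketch under the white-Gaussian-noise assumption could be written as follows; the Gaussian normalisation shown here, and the function and variable names, are assumptions of this sketch.

```python
import numpy as np

def update_probability(L_prev, v_u, v_m, S):
    """One update step of the probability of a speed assumption value v_u,
    given the averaged pixel velocity v_m at the current frame and the
    residual covariance S, starting from L_{u,k} = 1 (equation (25))."""
    d = np.asarray(v_u, float) - np.asarray(v_m, float)
    S = np.asarray(S, float)
    gauss = (np.exp(-0.5 * d @ np.linalg.inv(S) @ d)
             / (2.0 * np.pi * np.sqrt(np.linalg.det(S))))
    return L_prev * gauss

# The probability of each hypothesis is obtained by applying this update
# once per frame k+1, ..., k+M.
```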
移動方向決定部19判定是否對全部之速度假設值已更新在時刻k+j之假設值概率(步驟ST1304)。 The movement direction determination unit 19 determines whether or not the assumed value probability at time k + j has been updated for all the speed assumed values (step ST1304).
在步驟ST1304,判定對全部之速度假設值未更新在時刻k+j之假設值概率的情況(在步驟ST1304之”NO”的情況),回到步驟ST1301,並重複以後的處理。 In step ST1304, it is determined that the hypothesis value probability at time k + j has not been updated for all the speed assumed values (in the case of "NO" in step ST1304), and the process returns to step ST1301 and repeats subsequent processing.
在步驟ST1304,判定對全部之速度假設值已更新在時刻k+j之假設值概率的情況(在步驟ST1304之”YES”的情況),結束第13圖的處理。 In step ST1304, it is determined that the hypothesis value probability at time k + j has been updated for all the speed assumed values (in the case of "YES" in step ST1304), and the processing of Fig. 13 is ended.
藉由執行以上之第13圖的處理,移動方向決定部19係根據在時刻k+j的像素速度資料,對移動物體之各個速度假設值更新表示概率的值。 By executing the processing of FIG. 13 described above, the movement direction determination unit 19 updates the values representing the probabilities for each of the assumed speed values of the moving object based on the pixel speed data at time k + j.
回到第9圖之流程圖。 Return to the flowchart of FIG. 9.
The control unit determines whether the variable j is equal to or greater than the parameter M (step ST910). The parameter M represents the number of time frames used in calculating the probabilities of the speed assumption values. The parameter M is preset, for example, to the maximum number of time frames over which the assumption that the target moves in a straight line at constant speed from time k to time k+M remains valid. In step ST910, when it is determined that the variable j is less than the parameter M (in the case of "NO" in step ST910), the movement direction determination unit 19 adds 1 to the variable j (step ST911) and returns to step ST908.
In step ST910, when it is determined that the variable j is equal to or greater than the parameter M (in the case of "YES" in step ST910), the movement direction determination unit 19 calculates the posterior probability of each speed assumption value from the probabilities of the speed assumption values calculated and updated in step ST909 (step ST912). Specifically, for the probabilities L_{u,k+M} (u = 1, 2, ..., N_H, where N_H is the number of speed assumption values) updated using the pixel velocity data from time k+1 to time k+M, the movement direction determination unit 19 calculates the posterior probability P(H_u | V_{k:k+M}) from the following equation (26):

$$P(H_u \mid V_{k:k+M}) = \frac{L_{u,k+M}\, P(H_u)}{\sum_{u'=1}^{N_H} L_{u',k+M}\, P(H_{u'})} \qquad (26)$$

Here, P(H_u) is the prior probability of the speed assumption value v_u, and is a parameter set for each speed assumption value v_u. P(H_u) may be set, for example, according to the magnitude of the residual △r obtained when the speed assumption value v_u was set, the magnitude of v_u, or the light source and background color of the images from which the pixel velocity data are acquired. Alternatively, all speed assumption values may be regarded as equally probable at the time of setting, and P(H_u) may be taken as a constant.
The movement direction determination unit 19 determines, from among the posterior probabilities P(H_u | V_{k:k+M}) of the speed assumption values calculated in step ST912, the speed assumption value v_u with the largest posterior probability as the estimated speed of the moving object (step ST913).
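The combination of the accumulated probabilities with the priors P(H_u) and the selection of the most probable speed assumption value (steps ST912 and ST913) can be sketched, under the Bayes-style normalisation assumed here, as:

```python
import numpy as np

def decide_speed(likelihoods, priors, hypotheses):
    """Posterior probability of each speed assumption value and selection
    of the one with the largest posterior as the estimated speed.
    likelihoods: L_{u,k+M} for u = 1..N_H; priors: P(H_u);
    hypotheses: the corresponding candidate velocities v_u."""
    L = np.asarray(likelihoods, float)
    P = np.asarray(priors, float)
    posterior = L * P / np.sum(L * P)
    best = int(np.argmax(posterior))
    return hypotheses[best], posterior
```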
移動方向決定部19係在根據在時刻k的物體位置資料所特定的一個或複數個移動物體之中,決定對全部的移動物體是否已決定推定速度(步驟ST914)。 The moving direction determination unit 19 determines whether or not the estimated speed has been determined for all moving objects among one or a plurality of moving objects specified by the object position data at time k (step ST914).
在步驟ST914,對全部的移動物體未決定推定速度的情況(在步驟ST914之”NO”的情況),回到步驟ST902。 In step ST914, when the estimated speed has not been determined for all the moving objects (in the case of "NO" in step ST914), the process returns to step ST902.
In step ST914, when the estimated speed has been determined for all the moving objects (in the case of "YES" in step ST914), the movement direction determination unit 19 outputs to the display unit 16 and the recording unit 17 fusion data indicating the moving speed determined in step ST913 and the center-of-gravity position of each moving object (step ST915). In the above, the object position data creation unit 12 estimates the number of moving objects; however, on the premise that the object position data may erroneously contain unnecessary data such as stationary objects or data not corresponding to actually existing objects, the movement direction determination unit 19 may discard the unnecessary data on the basis of the estimated speed. For example, if the estimated speed is below a threshold, the object may be regarded as a stationary object and removed from the count of moving objects. That is, the number of moving objects calculated by the object position data creation unit 12 may be treated as a provisional number of moving objects, and the movement direction determination unit 19 may perform the final estimation of the number of moving objects.
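A possible sketch of the speed-threshold filtering mentioned above (the threshold value and data layout are illustrative assumptions):

```python
import numpy as np

def drop_stationary(detections, v_min=0.3):
    """Discard detections whose estimated speed is below v_min (in m/s,
    an illustrative value), treating them as stationary objects or
    spurious data; the remaining count is the final number of objects."""
    return [d for d in detections
            if np.hypot(d["velocity"][0], d["velocity"][1]) >= v_min]
```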
In the above, the fusion processing unit 103 sets the speed assumption values of a moving object from the pixel velocity data at and before time k, and determines the speed of the moving object on the basis of the pixel velocity data at and after time k+1; however, the choice of time frames used in each step of the processing is not limited to this. For example, the fusion processing unit 103 may set the speed assumption values on the basis of the pixel velocity data at and after time k+1, and determine the moving speed of the moving object from among the speed assumption values on the basis of the pixel velocity data at and before time k.
As described above, according to the second embodiment, the moving speed of a moving object is estimated from the pixel velocities of a plurality of time frames. Therefore, even when the pixel velocities around a moving object in the pixel velocity data calculated from the image acquisition sensor 300 temporarily take values greatly different from the true moving speed of the object, the number and positions of moving objects and their moving speeds including moving directions can still be estimated with one ranging sensor and one image acquisition sensor installed at a single detection position. This effect is useful, for example, when estimating the number and positions of moving objects and their moving speeds including moving directions at night. An image acquisition sensor 300 such as a camera has the characteristic that the lower the illuminance of the light source, as at night or in dark places, the more easily the brightness of each pixel fluctuates because of noise inside the sensor. In such a case, because the brightness of a moving object fluctuates due to noise, the pixel velocity data easily take values that deviate from the true moving speed of the moving object. With the configuration of the second embodiment, however, candidate values of the moving speed of an object are set from the pixel velocity data of a plurality of time frames and the candidates are compared to estimate the most probable moving speed, so the probability of estimating an incorrect moving speed can be reduced.
又,愈是在低照度環境亦雜訊所造成之亮度強度的變動小之影像取得感測器300,因為一般製程愈複雜,所以影像取得感測器300的費用變成昂貴。因此,若依據本第2實施形態,與使用在低照度環境難受到雜訊之影響之影像取得感測器300的方法相比,在能以少的費用實現相同之精度上係劃時代的。 In addition, the image acquisition sensor 300 with small changes in luminance intensity caused by noise in low-light environments is more expensive because the general manufacturing process is more complicated. Therefore, according to the second embodiment, compared with the method of using the image acquisition sensor 300 which is hardly affected by noise in a low-light environment, it is epoch-making in terms of achieving the same accuracy with less cost.
This effect is also useful, for example, when pedestrians are the type of moving object targeted. Because of the movement of arms and legs and changes in body orientation, the brightness of the surface of a pedestrian in an image tends to change irregularly compared with inanimate objects such as vehicles. Therefore, for example at the moment a pedestrian changes the direction of the face, the pixel velocity data easily take values greatly different from the moving speed of the object. With the configuration of the second embodiment, however, the pixel velocities of time frames in which the pixel velocity data temporarily become unstable are discarded because their consistency with the pixel velocity data of the other time frames is low, so the probability of estimating an incorrect moving speed can be reduced even when the brightness distribution of a pedestrian changes irregularly. Furthermore, because pedestrians wear clothing and accessories, their brightness distributions are more diverse than those of inanimate objects such as vehicles. It is therefore difficult in principle to choose an environment in which the background is unlikely to be confused with pedestrians, and in a time frame in which the brightness of the background becomes the same as that of a pedestrian, the pixel velocity data take values greatly different from the moving speed of the pedestrian. According to the second embodiment, however, among the plurality of speed assumption values, a speed assumption value set from the pixel velocity data of a time frame in which the pedestrian and the background are confused is discarded as an assumption with low consistency with the other time frames, so the probability of estimating an incorrect moving speed can be reduced even when a pedestrian and the background are temporarily easy to confuse. This effect is also useful when the brightness distribution in the image has a large bias, for example in an outdoor environment on a sunny day or when the background contains a light source. When the brightness distribution in the image has a large bias, as between sunlit and shaded areas outdoors, the continuity of the brightness distribution that is assumed when calculating pixel velocities does not hold, and the pixel velocity data around a moving object take values greatly different from the speed of the moving object. According to the second embodiment, however, a speed assumption value estimated from the pixel velocity data of a time frame in which the moving object passes through a region where the brightness distribution is strongly biased is discarded as an assumption with low consistency with the pixel velocities of the other time frames, so the probability of estimating an incorrect moving speed can be reduced even when the brightness distribution is strongly biased.
第3實施形態 Third Embodiment
In the first and second embodiments, the number and positions of moving objects and their moving speeds including moving directions are estimated from the center-of-gravity positions of moving objects based on the ranging information of the ranging sensor 200 and the pixel velocities based on the images of the image acquisition sensor 300. In the third embodiment, an embodiment is described in which, based on the ranging information of the ranging sensor 200, data on the height of a moving object from the ground surface and on the width of the moving object are further created as attribute data of the moving object.
本第3實施形態之移動物體檢測系統的構成係因為與在第1實施形態使用第1圖所說明者一樣,所以省略重複的說明。 The configuration of the moving object detection system according to the third embodiment is the same as that described with reference to FIG. 1 in the first embodiment, and therefore redundant description is omitted.
又,關於測距感測器200與影像取得感測器300之設置狀況、及測距感測器200與影像取得感測器300的觀測條件,亦因為例如如在第1實施形態使用第2圖、第3圖之說明所示,所以省略重複的說明。 In addition, the installation conditions of the ranging sensor 200 and the image acquisition sensor 300 and the observation conditions of the ranging sensor 200 and the image acquisition sensor 300 are also used because, for example, the second embodiment is used in the first embodiment. As shown in the description of FIG. 3 and FIG. 3, overlapping description is omitted.
第14圖係本發明之第3實施形態之移動物體檢測裝置100b的構成圖。 Fig. 14 is a configuration diagram of a moving object detection device 100b according to a third embodiment of the present invention.
在第14圖,關於與使用第4圖所說明之第1實施形態的移動物體檢測裝置100一樣的構成,附加相同的符號,並省略說明。 In FIG. 14, the same configurations as those of the moving object detection device 100 according to the first embodiment described with reference to FIG. 4 are denoted by the same reference numerals, and descriptions thereof will be omitted.
The moving object detection device 100b according to the third embodiment of the present invention differs from the moving object detection device 100 of the first embodiment only in that the ranging sensor processing unit 101 further includes an object height data creation unit 20 and an object width data creation unit 21, and the fusion processing unit 103 further includes an object attribute determination unit 22.
物體高度資料製作部20係從測距資料取得部11取得在複數個時刻圖框的測距資訊,並製作已推定移動物體之自地表面之高度的資料。此處,將已推定移動物體之自地表面之高度的資料稱為物體高度資料。 The object height data creation unit 20 obtains the distance measurement information of the frame at a plurality of times from the distance measurement data acquisition unit 11 and generates data of the estimated height of the moving object from the ground surface. Here, the data of the estimated height of the moving object from the ground surface is referred to as object height data.
物體高度資料製作部20係向物體屬性判定部22輸出所製作之物體高度資料。 The object height data creation unit 20 outputs the created object height data to the object attribute determination unit 22.
The moving objects in the object height data are associated with the moving objects in the object position data created by the object position data creation unit 12, and the correspondence can be referenced from the object height data.
物體寬度資料製作部21係從測距資料取得部11取得在複數個時刻圖框的測距資訊,並製作已推定移動物體之x軸方向及y軸方向之寬度的資料。此外,將推定移動物體之x軸方向及y軸方向之寬度的資料稱為物體寬度資料。 The object width data creation unit 21 obtains the distance measurement information of the frame at a plurality of times from the distance measurement data acquisition unit 11 and generates data of the estimated widths of the x-axis direction and the y-axis direction of the moving object. In addition, data for estimating the width in the x-axis direction and the y-axis direction of a moving object is referred to as object width data.
物體寬度資料製作部21係向物體屬性判定部22輸出所製作之物體寬度資料。 The object width data creation unit 21 outputs the created object width data to the object attribute determination unit 22.
The moving objects in the object width data are associated with the moving objects in the object position data created by the object position data creation unit 12, and the correspondence can be referenced from the object width data.
The object attribute determination unit 22 obtains the data on the position and moving speed of each object from the position movement direction correlation unit 15, obtains the object height data from the object height data creation unit 20, obtains the object width data from the object width data creation unit 21, and determines the attributes of the moving objects on the basis of the obtained data.
物體屬性判定部22係向顯示部16及記錄部17輸出表示移動物體之位置、移動速度以及屬性的資料。此處,將物體屬性判定部22所輸出之表示移動物體之位置、移動速度以及屬性的資料稱為具屬性融合資料。 The object attribute determination unit 22 outputs data indicating the position, movement speed, and attributes of a moving object to the display unit 16 and the recording unit 17. Here, the data indicating the position, moving speed, and attributes of the moving object output by the object attribute determination unit 22 is referred to as attribute fusion data.
移動物體檢測裝置100b的硬體構成係因為與在第1實施形態使用第5A圖、第5B圖所說明者一樣,所以省略重複的說明。 Since the hardware configuration of the moving object detection device 100b is the same as that described with reference to Figs. 5A and 5B in the first embodiment, redundant description is omitted.
說明本第3實施形態之移動物體檢測裝置100b的動作。 The operation of the moving object detection device 100b according to the third embodiment will be described.
第15圖係說明本發明之第3實施形態的移動物體檢測裝置100b之動作的流程圖。 Fig. 15 is a flowchart illustrating the operation of the moving object detection device 100b according to the third embodiment of the present invention.
此外,在以下的說明,對在時刻k重心位置已通過測距感測器200之觀測範圍54的移動物體,推定移動速度及屬性。 In the following description, the moving speed and attributes of the moving object having passed through the observation range 54 of the distance-measuring sensor 200 at time k are estimated.
The ranging data acquisition unit 11 and the object position data creation unit 12 of the ranging sensor processing unit 101 extract, from the ranging information of a plurality of time frames acquired from the ranging sensor 200, the moving objects whose center-of-gravity positions have passed through the observation range 54 of the ranging sensor 200 at time k, and create the object position data at time k (step ST1501). The specific operation is the same as step ST601 of FIG. 6 described in the first embodiment. The object position data creation unit 12 outputs the created object position data to the position movement direction correlation unit 15. The object height data creation unit 20 of the ranging sensor processing unit 101 obtains from the ranging data acquisition unit 11 the ranging information of a plurality of time frames acquired by the ranging sensor 200, calculates, for each moving object whose center-of-gravity position has passed through the observation range 54 of the ranging sensor 200 at time k, the height of the moving object from the ground surface, and creates the object height data at time k (step ST1502). As a method of calculating the height of a moving object from the ranging information of each time frame, the object height data creation unit 20 uses, for example, the method disclosed in the above-mentioned Japanese Patent Application Laid-Open No. 2011-108223. In this method, the ranging information of each moving object within the observation range 54 is divided in the height direction and the range in which the object is detected is calculated at each height; for example, when calculating the height of a pedestrian, the maximum height at which the pedestrian is detected can be calculated as the height of the pedestrian.
物體高度資料製作部20係向物體屬性判定部22輸出所製作之物體高度資料。 The object height data creation unit 20 outputs the created object height data to the object attribute determination unit 22.
The object width data creation unit 21 of the ranging sensor processing unit 101 obtains from the ranging data acquisition unit 11 the ranging information of a plurality of time frames acquired by the ranging sensor 200, calculates, for each moving object whose center-of-gravity position has passed through the observation range 54 of the ranging sensor 200 at time k, the widths of the moving object in the x direction and the y direction, and creates the object width data at time k (step ST1503). As a method of calculating the width of a moving object from the ranging information of each time frame, the object width data creation unit 21 uses, for example, the method disclosed in the above-mentioned Japanese Patent Application Laid-Open No. 2011-108223. In this method, the ranging information of each moving object within the observation range 54 is divided in the range direction, and the width of a pedestrian's body can be calculated from the maximum and minimum values of the range direction in which the moving object is detected; the width in the direction perpendicular to the range direction can be calculated from the time during which the moving object is continuously detected.
物體寬度資料製作部21係向物體屬性判定部22輸出所製作之物體寬度資料。 The object width data creation unit 21 outputs the created object width data to the object attribute determination unit 22.
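For illustration only, a much-simplified bounding-box version of the height and width computation (not the method of the cited publication, whose slicing by height and by range is only approximated here) could be sketched as:

```python
import numpy as np

def object_height_and_width(points):
    """Rough height and widths of one detected cluster of ranging points.
    points: (N, 3) array of (x, y, z) measurements for the object, with z
    the height above the ground surface."""
    p = np.asarray(points, float)
    height = p[:, 2].max()                    # highest measured point
    width_x = p[:, 0].max() - p[:, 0].min()   # extent along the x axis
    width_y = p[:, 1].max() - p[:, 1].min()   # extent along the y axis
    return height, width_x, width_y
```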
影像取得感測器處理部102係從影像取得感測器300所取得之在複數個時刻圖框的影像,製作在時刻k的像素速度資料(步驟ST1504)。具體之像素速度資料的製作方法係與在第1實施形態所說明之第6圖的步驟ST602一樣。影像取得感測器處理部102之像素移動方向資料製作部14係向位置移動方向相關部15輸出在時刻k的像素速度資料。 The image acquisition sensor processing unit 102 generates the pixel velocity data at time k from the images at the frames of time obtained by the image acquisition sensor 300 (step ST1504). A specific method of generating the pixel speed data is the same as step ST602 of FIG. 6 described in the first embodiment. The pixel movement direction data creation unit 14 of the image acquisition sensor processing unit 102 outputs the pixel speed data at the time k to the position movement direction correlation unit 15.
The position movement direction correlation unit 15 uses the object position data at time k output by the object position data creation unit 12 in step ST1501 and the pixel velocity data at time k output by the pixel moving direction data creation unit 14 in step ST1504 to estimate the estimated speed of each moving object specified by the object position data at time k, and creates the fusion data at time k for that moving object (step ST1505). The specific operation is the same as step ST603 of FIG. 6 described in the first embodiment.
位置移動方向相關部15係向物體屬性判定部22輸出所製作之融合資料。 The position moving direction correlation unit 15 outputs the created fusion data to the object attribute determination unit 22.
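The actual correlation is the one defined in step ST603 and is not reproduced here. As a loose sketch of one plausible association, the object's center of gravity from the ranging sensor could be projected into the image, the pixel velocities around that point averaged, and the result scaled to the ground plane; the projection callable, the window size, and the meters-per-pixel scale below are all assumptions.

```python
import numpy as np

def fuse_position_and_pixel_velocity(obj_xy, project_to_image, pixel_velocity,
                                     window=7, meters_per_pixel=0.02):
    """Rough sketch: estimate a moving object's speed and direction by
    averaging image pixel velocities around its projected center of gravity.

    obj_xy           : (x, y) center of gravity from the ranging sensor [m].
    project_to_image : callable mapping (x, y) -> (row, col) pixel indices
                       (a camera calibration is assumed to exist).
    pixel_velocity   : array of shape (H, W, 2) in pixels per second.
    """
    r, c = project_to_image(obj_xy)
    h = window // 2
    patch = pixel_velocity[max(r - h, 0):r + h + 1, max(c - h, 0):c + h + 1]
    v_pix = patch.reshape(-1, 2).mean(axis=0)      # mean (vx, vy) in px/s
    v_ground = v_pix * meters_per_pixel            # crude pixel -> meter scaling
    return {"position": obj_xy,
            "speed": float(np.linalg.norm(v_ground)),
            "direction": float(np.arctan2(v_ground[1], v_ground[0]))}
```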
The object attribute determination unit 22 determines an attribute of each moving object, namely its shape attribute, on the basis of the object height data at time k created by the object height data creation unit 20 in step ST1502, the object width data at time k created by the object width data creation unit 21 in step ST1503, and the center-of-gravity position and estimated velocity of the moving object contained in the fusion data at time k created by the position-movement direction correlation unit 15 in step ST1505 (step ST1506).

For example, when pedestrians are targeted as moving objects, if the height of a certain moving object is equal to or less than the average height of an adult, the width of the moving object is equal to or less than the typical body width of an adult, and the moving speed is faster than a general walking speed, the object attribute determination unit 22 determines that the shape attribute of that moving object is "child". The information required for determining the shape attribute, such as the average adult height, the typical adult body width, and the general walking speed, is preset and stored in a location that the object attribute determination unit 22 can refer to.

Also, for example, when pedestrians are targeted as moving objects, if the height of a certain moving object is equal to or less than the average height of an adult, the width of the moving object near the ground matches the size of wheelchair wheels, and the moving speed is slower than a general walking speed, the object attribute determination unit 22 determines that the shape attribute of that moving object is "wheelchair user".
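A minimal sketch of such rule-based shape-attribute determination is given below; every threshold (average adult height, typical adult body width, general walking speed, wheelchair wheel width) is an illustrative assumption standing in for the preset values the unit would actually refer to.

```python
def shape_attribute(height, width, ground_width, speed,
                    adult_height=1.6, adult_width=0.6,
                    walking_speed=1.3, wheel_width=0.65):
    """Illustrative shape-attribute rules; all thresholds are assumptions."""
    if height <= adult_height and width <= adult_width and speed > walking_speed:
        return "child"
    if (height <= adult_height and abs(ground_width - wheel_width) < 0.1
            and speed < walking_speed):
        return "wheelchair user"
    return "adult pedestrian"

print(shape_attribute(height=1.2, width=0.4, ground_width=0.4, speed=1.6))  # child
```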
The above gives examples in which pedestrians are the target, but the invention is not limited to this; it can also be used, for example, to determine the model of a vehicle, the model of an aircraft, the type of a ship, or the species of an animal.

The object attribute determination unit 22 determines whether the estimated velocity and the shape attribute have been obtained for all of the one or more moving objects specified by the object position data at time k (step ST1507).

If, in step ST1507, the estimated velocity and the shape attribute have not been obtained for all of the one or more moving objects specified by the object position data at time k ("NO" in step ST1507), the process returns to step ST1505.

If, in step ST1507, the estimated velocity and the shape attribute have been obtained for all of the one or more moving objects specified by the object position data at time k ("YES" in step ST1507), the object attribute determination unit 22 determines, from the position, estimated velocity, and shape attribute of each moving object at time k estimated in steps ST1505 to ST1506 and from the fusion data of time frames other than time k, an attribute of each moving object based on the relative positions and velocities of the moving objects (step ST1508). The attribute determined in step ST1508 is referred to as a group attribute.

For example, when pedestrians are targeted as moving objects, if, for a moving object whose center-of-gravity position has passed through the observation range 54 of the ranging sensor at time k, a fixed number or more of moving objects with the same position and the same moving direction are observed in past or future time frames within a fixed number of frames from time k, the object attribute determination unit 22 determines that the group attribute of that moving object at time k is "member of a pedestrian group".

Also, for example, when pedestrians are targeted as moving objects, if, for a moving object whose shape attribute is "child", another moving object is observed at a nearby position, with the same moving speed, and with a shape attribute other than "child", the group attribute of the moving object whose shape attribute is "child" is set to "child accompanied by a guardian", and the group attribute of the moving object whose shape attribute is other than "child" is set to "guardian accompanying a child".
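The sketch below illustrates such relative-position and relative-velocity rules; the distance and speed tolerances and the minimum group size are assumptions, and the check over neighboring time frames is omitted for brevity.

```python
import math

def group_attributes(tracks, min_members=3, dist_tol=1.5, speed_tol=0.3):
    """Illustrative group-attribute rules. `tracks` maps an object id to a
    dict with keys 'pos' (x, y), 'vel' (vx, vy) and 'shape' at time k."""
    labels = {}
    items = list(tracks.items())
    for oid, t in items:
        # Other objects close to this one and moving in (nearly) the same way.
        near = [o for o, u in items if o != oid
                and math.dist(t['pos'], u['pos']) < dist_tol
                and math.dist(t['vel'], u['vel']) < speed_tol]
        if len(near) + 1 >= min_members:
            labels[oid] = "member of a pedestrian group"
        elif t['shape'] == "child" and any(tracks[o]['shape'] != "child" for o in near):
            labels[oid] = "child accompanied by a guardian"
        elif t['shape'] != "child" and any(tracks[o]['shape'] == "child" for o in near):
            labels[oid] = "guardian accompanying a child"
        else:
            labels[oid] = "individual"
    return labels
```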
The above gives examples in which pedestrians are the target, but the invention is not limited to this; it can also be used, for example, to detect a traffic jam of vehicles, a formation of aircraft, a fleet of ships, or a herd of animals.

The object attribute determination unit 22 outputs the position and moving speed of each moving object estimated in step ST1505, the shape attribute of the moving object determined in step ST1506, and the group attribute of the moving object determined in step ST1508 to the display unit 16 and the recording unit 17 as fusion data with attributes.

Instead of the position-movement direction correlation unit 15 in Fig. 14, the number, positions, and moving speeds of moving objects may be estimated by the movement direction hypothesis value setting unit 18 and the movement direction determination unit 19 described in the second embodiment. In this case, the object attribute determination unit 22 determines the shape attribute and the group attribute of each moving object based on the output of the movement direction determination unit 19.
In this way, according to the third embodiment, the attributes of a moving object are determined using the position and moving speed of the moving object obtained from the ranging information of the ranging sensor 200 and the image information of the image acquisition sensor 300, together with the height and width of the moving object calculated from the ranging information of the ranging sensor 200. Therefore, attributes useful for analyzing moving objects can be determined even when the resolution of the image acquisition sensor 300 is low or when the contour of a moving object is unclear on the image acquisition sensor because of low illuminance.

This configuration is effective, for example, when estimating the proportion of children among pedestrians or the proportion of pedestrians moving in groups. The age composition of pedestrians and the presence or absence of pedestrian groups, which can cause congestion, are useful information for security systems, event management, market analysis, and the like. However, to determine the attributes of each pedestrian using only an image acquisition sensor such as a visible light camera, the parts of the pedestrian's body must be extracted from the contour in the image, and with a low-resolution image acquisition sensor the accuracy of this determination deteriorates. For example, identifying a child pedestrian from an image requires extracting the position of the pedestrian's head from the image, which is difficult with low-resolution images.

On the other hand, according to the third embodiment, the attributes of moving objects are determined using the height and width of each moving object acquired by the ranging sensor 200 together with the position and moving speed of each moving object obtained by fusing the observation results of the ranging sensor 200 and the image acquisition sensor 300. Therefore, even when the resolution of the image acquisition sensor 300 is low and the contour of a moving object is unclear, the probability of determining an incorrect attribute can be reduced.

Moreover, since the ranging sensor 200 can also observe the shape of a moving object in the depth direction, the probability of an erroneous determination in the attribute determination based on the position and velocity of the moving object can be reduced compared with the image acquisition sensor 300.
In the first embodiment, the moving object detection device 100 adopts the configuration shown in Fig. 4; however, the moving object detection device 100 can obtain the effects described above as long as it includes the object position data creation unit 12, the pixel movement direction data creation unit 14, and the position-movement direction correlation unit 15.

The above description deals with the case where pedestrians are detected as moving objects, but it is obvious that the present invention can also be used when automobiles, bicycles, wheelchairs, trains, aircraft, ships, animals other than humans, robots, and the like are detected as moving objects.

Within the scope of the invention, the embodiments may be freely combined, any constituent element of each embodiment may be modified, and any constituent element may be omitted in each embodiment.

[Industrial Applicability]

Since the moving object detection device of the present invention is configured to estimate the number, positions, and moving directions of moving objects with high accuracy using one camera and one laser sensor installed at a single location, it is applicable to moving object detection devices that estimate the number and positions of moving objects such as pedestrians, and their moving speeds including the moving direction.
11‧‧‧Ranging data acquisition unit
12‧‧‧Object position data creation unit
13‧‧‧Image acquisition unit
14‧‧‧Pixel movement direction data creation unit
15‧‧‧Position-movement direction correlation unit
16‧‧‧Display unit
17‧‧‧Recording unit
100‧‧‧Moving object detection device
101‧‧‧Ranging sensor processing unit
102‧‧‧Image acquisition sensor processing unit
103‧‧‧Fusion processing unit
200‧‧‧Ranging sensor
300‧‧‧Image acquisition sensor
Claims (5)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/JP2016/070943 | 2016-07-15 | |
PCT/JP2016/070943 (WO2018011964A1) | 2016-07-15 | 2016-07-15 | Moving object detection device
Publications (1)
Publication Number | Publication Date
---|---
TW201804445A (en) | 2018-02-01
Family
ID=60951985
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
TW105132825A (TW201804445A) | Moving object detection device | 2016-07-15 | 2016-10-12
Country Status (3)
Country | Link |
---|---|
JP (1) | JP6351917B2 (en) |
TW (1) | TW201804445A (en) |
WO (1) | WO2018011964A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113163110A (en) * | 2021-03-05 | 2021-07-23 | 北京宙心科技有限公司 | People stream density analysis system and analysis method |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11911918B2 (en) * | 2018-06-11 | 2024-02-27 | Panasonic Intellectual Property Management Co., Ltd. | Distance-measuring system and distance-measuring method |
CN109188419B (en) * | 2018-09-07 | 2021-06-15 | 百度在线网络技术(北京)有限公司 | Method and device for detecting speed of obstacle, computer equipment and storage medium |
WO2022074800A1 (en) * | 2020-10-08 | 2022-04-14 | 三菱電機株式会社 | Air-conditioning control device and air-conditioning control method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1062546A (en) * | 1996-08-26 | 1998-03-06 | Matsushita Electric Works Ltd | Composite infrared human body detector |
JP3233584B2 (en) * | 1996-09-04 | 2001-11-26 | 松下電器産業株式会社 | Passenger detection device |
JP2005025590A (en) * | 2003-07-04 | 2005-01-27 | Minolta Co Ltd | Counting system |
US9177195B2 (en) * | 2011-09-23 | 2015-11-03 | Shoppertrak Rct Corporation | System and method for detecting, tracking and counting human objects of interest using a counting system and a data capture device |
JP5812894B2 (en) * | 2012-02-24 | 2015-11-17 | 東芝エレベータ株式会社 | Elevator occupancy measuring device, and elevator system in which a plurality of elevators each have occupancy counting devices |
JP5834254B2 (en) * | 2014-04-11 | 2015-12-16 | パナソニックIpマネジメント株式会社 | People counting device, people counting system, and people counting method |
2016
- 2016-07-15 JP JP2018515317A patent/JP6351917B2/en active Active
- 2016-07-15 WO PCT/JP2016/070943 patent/WO2018011964A1/en active Application Filing
- 2016-10-12 TW TW105132825A patent/TW201804445A/en unknown
Also Published As
Publication number | Publication date |
---|---|
WO2018011964A1 (en) | 2018-01-18 |
JPWO2018011964A1 (en) | 2018-08-16 |
JP6351917B2 (en) | 2018-07-04 |