JP2017009572A5 - Google Patents

Info

Publication number
JP2017009572A5
Authority
JP
Japan
Prior art keywords
boundary
boundary position
detection range
objects
estimated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2015198675A
Other languages
Japanese (ja)
Other versions
JP2017009572A (en)
JP6557923B2 (en)
Filing date
Publication date
Application filed
Priority to US 14/951,481 (US10101448B2)
Priority to EP 15196636.3 (EP3032273A1)
Publication of JP2017009572A
Publication of JP2017009572A5
Application granted
Publication of JP6557923B2
Legal status: Active (current)
Anticipated expiration

Links

Description

An on-board radar apparatus according to one aspect of the present disclosure adopts a configuration comprising: a transmission/reception unit that transmits a radar signal toward a detection range for each frame and receives one or more reflected signals produced when the radar signal is reflected by one or more objects; a detection unit that, for each azimuth within the detection range, detects for each frame, as a boundary candidate position of a region within the detection range in which the one or more objects are not present, the position of the reflection point closest to the on-board radar apparatus among the one or more reflection points detected based on the one or more reflected signals; a calculation unit that calculates movement amount data relating to a movement amount of the on-board radar apparatus; an estimation unit that generates an estimated boundary position by converting, based on the movement amount data, the boundary candidate position detected in a past frame into a boundary position in the current frame; and a smoothing unit that performs smoothing processing using the boundary candidate position of the current frame and the estimated boundary position, determines whether a boundary position of the region in which the one or more objects are not present exists within the detection range, and, when determining that the boundary position exists, outputs the boundary position to a driving support apparatus.
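The detection step described above reduces to a per-azimuth nearest-point selection. The following is only a minimal sketch of that idea, not the patented implementation; it assumes the reflections detected in one frame are available as (azimuth bin, range) pairs, and the names `Reflection` and `detect_boundary_candidates` are illustrative.

```python
from typing import List, NamedTuple, Optional


class Reflection(NamedTuple):
    azimuth_bin: int   # index of the azimuth within the detection range
    range_m: float     # distance from the radar to the reflection point


def detect_boundary_candidates(
    reflections: List[Reflection], num_azimuth_bins: int
) -> List[Optional[float]]:
    """Per azimuth, keep the range of the reflection point closest to the radar.

    The returned list holds one boundary candidate distance per azimuth bin;
    None means no reflection was observed in that azimuth in this frame.
    """
    candidates: List[Optional[float]] = [None] * num_azimuth_bins
    for refl in reflections:
        current = candidates[refl.azimuth_bin]
        if current is None or refl.range_m < current:
            candidates[refl.azimuth_bin] = refl.range_m
    return candidates
```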

In a region detection method according to one aspect of the present disclosure, a radar signal is transmitted toward a detection range for each frame; one or more reflected signals produced when the radar signal is reflected by one or more objects are received; for each azimuth within the detection range, the position of the reflection point closest to an on-board radar apparatus, among the one or more reflection points detected based on the one or more reflected signals, is detected for each frame as a boundary candidate position of a region within the detection range in which the one or more objects are not present; movement amount data relating to a movement amount of the on-board radar apparatus is calculated; an estimated boundary position is generated by converting, based on the movement amount data, the boundary candidate position detected in a past frame into a boundary position in the current frame; smoothing processing is performed using the boundary candidate position of the current frame and the estimated boundary position; whether a boundary position of the region in which no object is present exists within the detection range is determined; and, when it is determined that the boundary position exists, the boundary position is output to a driving support apparatus.
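A key step in the method is converting boundary candidates detected in a past frame into the coordinate system of the current frame using the movement amount data. The sketch below illustrates one way to do this under the assumption that the movement amount is parameterized as a 2-D translation (dx, dy) in the past frame plus a yaw change; the function name and this parameterization are assumptions for illustration, not taken from the publication.

```python
import math
from typing import List, Optional, Tuple


def estimate_boundary_positions(
    past_candidates: List[Optional[float]],   # distance per azimuth bin, past frame
    azimuth_angles_rad: List[float],           # center angle of each azimuth bin
    dx: float, dy: float, dyaw: float,         # radar motion between the two frames
) -> List[Tuple[float, float]]:
    """Project past-frame boundary candidates into current-frame coordinates.

    Each candidate (azimuth, range) is converted to Cartesian coordinates in the
    past frame, then shifted and rotated by the ego-motion so it can be compared
    with the candidates detected in the current frame.  Re-binning the resulting
    points into current-frame azimuths would yield per-azimuth estimated
    boundary positions.
    """
    estimated: List[Tuple[float, float]] = []
    cos_y, sin_y = math.cos(-dyaw), math.sin(-dyaw)
    for angle, rng in zip(azimuth_angles_rad, past_candidates):
        if rng is None:
            continue
        # Past-frame Cartesian position of the boundary candidate.
        x, y = rng * math.cos(angle), rng * math.sin(angle)
        # Remove the radar's translation, then rotate into the current heading.
        x, y = x - dx, y - dy
        estimated.append((cos_y * x - sin_y * y, sin_y * x + cos_y * y))
    return estimated
```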

Claims (10)

An on-board radar apparatus comprising:
a transmission/reception unit that transmits a radar signal toward a detection range for each frame and receives one or more reflected signals produced when the radar signal is reflected by one or more objects;
a detection unit that, for each azimuth within the detection range, detects for each frame, as a boundary candidate position of a region within the detection range in which the one or more objects are not present, the position of the reflection point closest to the on-board radar apparatus among the one or more reflection points detected based on the one or more reflected signals;
a calculation unit that calculates movement amount data relating to a movement amount of the on-board radar apparatus;
an estimation unit that generates an estimated boundary position by converting, based on the movement amount data, the boundary candidate position detected in a past frame into a boundary position in a current frame; and
a smoothing unit that performs smoothing processing using the boundary candidate position of the current frame and the estimated boundary position, determines whether a boundary position of the region in which the one or more objects are not present exists within the detection range, and, when determining that the boundary position exists, outputs the boundary position to a driving support apparatus.
The on-board radar apparatus according to claim 1, wherein the smoothing unit
calculates a likelihood value for the boundary candidate position or the estimated boundary position in each azimuth within the detection range using a preset likelihood weight, and determines that the boundary position exists when the likelihood value is equal to or greater than a predetermined threshold, and,
when the boundary position exists in each azimuth within the detection range, weights the distance of the boundary candidate position or the estimated boundary position of the current frame using a preset output value weight, and outputs, as the boundary position, an output distance calculated using a weighted average.
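The claim above does not fix the exact likelihood computation, so the sketch below takes one plausible reading for a single azimuth: a likelihood value is accumulated from preset weights for whichever of the two positions are available, compared against a threshold, and, if the boundary is judged to exist, the output distance is a weighted average of the available distances. All weight values, the threshold, and the function name are illustrative assumptions, not values from the publication.

```python
from typing import Optional, Tuple

# Illustrative preset weights and threshold (assumptions for this sketch only).
LIKELIHOOD_WEIGHTS = {"candidate": 0.6, "estimated": 0.4}
OUTPUT_WEIGHTS = {"candidate": 0.7, "estimated": 0.3}
LIKELIHOOD_THRESHOLD = 0.5


def smooth_azimuth(
    candidate_m: Optional[float],   # boundary candidate distance, current frame
    estimated_m: Optional[float],   # estimated boundary distance from past frames
) -> Tuple[bool, Optional[float]]:
    """Decide whether a boundary exists in this azimuth and compute its distance."""
    likelihood = 0.0
    if candidate_m is not None:
        likelihood += LIKELIHOOD_WEIGHTS["candidate"]
    if estimated_m is not None:
        likelihood += LIKELIHOOD_WEIGHTS["estimated"]
    if likelihood < LIKELIHOOD_THRESHOLD:
        return False, None   # no boundary position is output for this azimuth

    # Weighted average of the available distances using the output value weights.
    num, den = 0.0, 0.0
    if candidate_m is not None:
        num += OUTPUT_WEIGHTS["candidate"] * candidate_m
        den += OUTPUT_WEIGHTS["candidate"]
    if estimated_m is not None:
        num += OUTPUT_WEIGHTS["estimated"] * estimated_m
        den += OUTPUT_WEIGHTS["estimated"]
    return True, num / den
```

Blending the current-frame candidate with the position propagated from past frames in this way suppresses frame-to-frame jitter in the reported boundary.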
The on-board radar apparatus according to claim 1, wherein the one or more objects include a stationary object and a moving object.
The on-board radar apparatus according to claim 1, wherein the one or more objects are stationary objects.
The on-board radar apparatus according to claim 1, further comprising:
a camera unit that captures the detection range;
a region setting unit that sets, based on the boundary position, an image region of the captured video to be used for object recognition; and
an object recognition unit that performs object recognition on the set image region.
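To illustrate the region setting of the preceding claim, the sketch below keeps only the image rows on the near side of the boundary, so that object recognition runs on the part of the frame where the radar actually found objects. The mapping from boundary distance to an image row is assumed to be given, and the function name and cropping convention are illustrative assumptions.

```python
import numpy as np


def set_recognition_region(image: np.ndarray, boundary_row: int) -> np.ndarray:
    """Crop an H x W x 3 camera frame to the region used for object recognition.

    boundary_row is the image row to which the radar boundary position projects
    (assumed precomputed).  Rows below it correspond to the area between the
    vehicle and the boundary, which is where detected objects can appear, so
    the recognizer only needs to process that part of the frame.
    """
    row = int(np.clip(boundary_row, 0, image.shape[0]))
    return image[row:, :, :]
```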
A region detection method comprising:
transmitting a radar signal toward a detection range for each frame, and receiving one or more reflected signals produced when the radar signal is reflected by one or more objects;
for each azimuth within the detection range, detecting for each frame, as a boundary candidate position of a region within the detection range in which the one or more objects are not present, the position of the reflection point closest to an on-board radar apparatus among the one or more reflection points detected based on the one or more reflected signals;
calculating movement amount data relating to a movement amount of the on-board radar apparatus;
generating an estimated boundary position by converting, based on the movement amount data, the boundary candidate position detected in a past frame into a boundary position in a current frame; and
performing smoothing processing using the boundary candidate position of the current frame and the estimated boundary position, determining whether a boundary position of a region in which no object is present exists within the detection range, and, when it is determined that the boundary position exists, outputting the boundary position to a driving support apparatus.
The region detection method according to claim 6, wherein the smoothing processing comprises:
calculating a likelihood value for the boundary candidate position or the estimated boundary position in each azimuth within the detection range using a preset likelihood weight, and determining that the boundary position exists when the likelihood value is equal to or greater than a predetermined threshold; and,
when the boundary position exists in each azimuth within the detection range, weighting the distance of the boundary candidate position or the estimated boundary position of the current frame using a preset output value weight, and outputting, as the boundary position, an output distance calculated using a weighted average.
The region detection method according to claim 6, wherein the one or more objects include a stationary object and a moving object.
The region detection method according to claim 6, wherein the one or more objects are stationary objects.
The region detection method according to claim 6, further comprising:
capturing the detection range using a camera unit;
setting, based on the boundary position, an image region of the captured video to be used for object recognition; and
performing object recognition on the set image region.
JP2015198675A 2014-12-12 2015-10-06 On-vehicle radar device and area detection method Active JP6557923B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/951,481 US10101448B2 (en) 2014-12-12 2015-11-24 On-board radar apparatus and region detection method
EP15196636.3A EP3032273A1 (en) 2014-12-12 2015-11-27 On-board radar apparatus and region detection method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014251876 2014-12-12
JP2015132621 2015-07-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP2019120096A Division JP2019194614A (en) 2014-12-12 2019-06-27 On-vehicle radar device, area detection device and area detection method

Publications (3)

Publication Number Publication Date
JP2017009572A JP2017009572A (en) 2017-01-12
JP2017009572A5 JP2017009572A5 (en) 2018-07-12
JP6557923B2 JP6557923B2 (en) 2019-08-14

Family

ID=57763251

Family Applications (2)

Application Number Title Priority Date Filing Date
JP2015198675A Active JP6557923B2 (en) 2014-12-12 2015-10-06 On-vehicle radar device and area detection method
JP2019120096A Pending JP2019194614A (en) 2014-12-12 2019-06-27 On-vehicle radar device, area detection device and area detection method

Family Applications After (1)

Application Number Title Priority Date Filing Date
JP2019120096A Pending JP2019194614A (en) 2014-12-12 2019-06-27 On-vehicle radar device, area detection device and area detection method

Country Status (1)

Country Link
JP (2) JP6557923B2 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6829833B2 (en) * 2017-02-01 2021-02-17 株式会社東海理化電機製作所 Radio wave propagation distance estimation device
JP6892600B2 (en) * 2017-07-12 2021-06-23 ミツミ電機株式会社 Object detection method and object detection device
KR102401176B1 (en) * 2017-09-14 2022-05-24 삼성전자주식회사 Radar image processing method, apparatus and system
JP7476495B2 (en) * 2019-08-20 2024-05-01 オムロン株式会社 Collision avoidance device, collision avoidance method, and collision avoidance program for autonomous vehicle
JP7417466B2 (en) * 2020-05-07 2024-01-18 株式会社トヨタマップマスター Information processing device, information processing method, and information processing program
JP7445890B2 (en) 2020-06-10 2024-03-08 パナソニックIpマネジメント株式会社 Processing device, processing method, program, and radar device
CN111999709A (en) * 2020-08-31 2020-11-27 安徽江淮汽车集团股份有限公司 Detection device for detection range of automobile radar

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3147541B2 (en) * 1992-11-06 2001-03-19 株式会社豊田中央研究所 Obstacle recognition device for vehicles
JP3371854B2 (en) * 1998-09-07 2003-01-27 株式会社デンソー Ambient situation detection device and recording medium
JP2008298544A (en) * 2007-05-30 2008-12-11 Toyota Motor Corp Object detection device and control device for vehicle
JP4453775B2 (en) * 2008-06-27 2010-04-21 トヨタ自動車株式会社 Object detection device
KR100985035B1 (en) * 2008-10-14 2010-10-05 한국과학기술연구원 Motion Tracking Method Using Scanning Device
JP5531474B2 (en) * 2008-12-12 2014-06-25 株式会社豊田中央研究所 Map generation device, runway estimation device, movable region estimation device, and program
JP5700940B2 (en) * 2010-03-16 2015-04-15 ダイハツ工業株式会社 Object recognition device
JP5678793B2 (en) * 2011-05-11 2015-03-04 トヨタ自動車株式会社 Perimeter monitoring device, perimeter monitoring method, and driving support device
JP2013040886A (en) * 2011-08-19 2013-02-28 Mitsubishi Electric Corp Method and program for measuring three-dimensional point group
