JP6819161B2 - Coordinate calculation method and coordinate calculation device


Info

Publication number
JP6819161B2
Authority
JP
Japan
Prior art keywords
sensor
region
value
detection result
azimuth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2016176440A
Other languages
Japanese (ja)
Other versions
JP2018040755A (en)
Inventor
沖 孝彦
公大 矢野
達弥 志野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Priority to JP2016176440A priority Critical patent/JP6819161B2/en
Publication of JP2018040755A publication Critical patent/JP2018040755A/en
Application granted granted Critical
Publication of JP6819161B2 publication Critical patent/JP6819161B2/en

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Description

The present invention relates to a coordinate calculation method and a coordinate calculation device.

Conventionally, a technique has been disclosed that detects an object in an observation region by irradiating the surrounding observation region with light from a sensor provided on a vehicle and receiving the reflected light with the sensor (see Patent Document 1).

Japanese Unexamined Patent Publication No. 2015-152428 (JP 2015-152428 A)

For example, by using two sensors whose observation regions partially overlap, positions that no observation region covers (so-called blind spots) can be reduced.
However, detection results in the overlapping region contain errors caused by the optical distortion of the sensors, so the coordinates of an object may not be calculated correctly.

The present invention has been made in view of the above problem, and an object thereof is to provide a coordinate calculation method and a coordinate calculation device capable of correctly calculating the coordinates of an object.

In a coordinate calculation method according to one aspect of the present invention, first and second sensors are mounted on a vehicle such that their observation regions partially overlap. The overlapping part of the observation regions includes a first region in which the absolute value of a first azimuth obtained by the first sensor is larger than a first value, and a second region in which the absolute value of a second azimuth obtained by the second sensor is larger than a second value; there exist an overlap of the first and second regions, a region obtained by removing that overlap from the first region, and a region obtained by removing that overlap from the second region. When the absolute value of the first azimuth is equal to or less than the first value, the coordinates are calculated using the detection result of the first sensor; when the absolute value of the second azimuth is equal to or less than the second value, the coordinates are calculated using the detection result of the second sensor. When the absolute value of the first azimuth is larger than the first value and the absolute value of the second azimuth is larger than the second value, the coordinates are calculated using both the detection result of the first sensor and that of the second sensor: specifically, using the linear distance of one of the two detection results, the azimuth of one of the two detection results, and the average of the vertical distances contained in the two detection results.

According to the present invention, the coordinates of an object can be calculated correctly.

FIG. 1 is a diagram showing the observation regions configured around the vehicle of Embodiment 1.
FIG. 2 is a block diagram showing the configuration of the coordinate calculation device mounted on the vehicle 1.
FIG. 3 is a block diagram showing the configuration of each sensor.
FIG. 4 is a diagram showing the distortion of the vertical distance d2 that occurs depending on the azimuth s.
FIG. 5 is a flowchart showing the flow of the coordinate calculation method in Embodiment 1.
FIG. 6 is a diagram showing the observation regions in Embodiment 2.
FIG. 7 is a diagram showing the observation regions in a modification of Embodiment 2.
FIG. 8 is a diagram showing the observation regions in another modification of Embodiment 2.

Hereinafter, embodiments will be described with reference to the drawings. The same members are denoted by the same reference numerals, and duplicate description is omitted.

(Embodiment 1)
The vehicle 1 shown in FIG. 1 can detect surrounding objects. Specifically, a sensor 111 arranged at the front of the vehicle 1 detects objects present in an observation region 21 in front of the vehicle 1. A sensor 112 arranged on the left side of the vehicle 1 detects objects present in an observation region 22 to the left of the vehicle 1. A sensor 113 arranged at the rear of the vehicle 1 detects objects present in an observation region 23 behind the vehicle 1. A sensor 114 arranged on the right side of the vehicle 1 detects objects present in an observation region 24 to the right of the vehicle 1.

Each observation region is set, for example, to a range of 150 degrees centered on the front direction of the corresponding sensor. As a result, adjacent observation regions partially overlap, so that objects all around the vehicle 1 can be detected. That is, the sensors corresponding to adjacent observation regions are mounted on the vehicle 1 such that their observation regions partially overlap. Details of the overlapping regions will be described later.

As shown in FIG. 2, the coordinate calculation device includes the sensor 111 at the front of the vehicle 1, the sensor 112 on the left side of the vehicle 1, the sensor 113 at the rear of the vehicle 1, the sensor 114 on the right side of the vehicle 1, and a coordinate calculation unit 12 that transmits a drive signal to each sensor, receives a detection result from each sensor, and calculates the coordinates of an object.

As shown in FIG. 3, each of the sensors 111 to 114 includes a light transmitting unit 1101, an image sensor 1102 (light receiving unit), a drive control unit 1103, a measuring unit 1104, and a calculation unit 1105.

The light transmitting unit 1101 irradiates light toward the surroundings of the vehicle 1, specifically in the direction of the corresponding observation region. The image sensor 1102 is configured by arranging, in a two-dimensional matrix, light receiving elements that receive light reflected from an object, and receives the reflected light. The drive control unit 1103 receives a drive signal from the coordinate calculation unit 12 and drives and controls the light transmitting unit 1101 and the image sensor 1102.

The measuring unit 1104 receives from the image sensor 1102 a luminance signal indicating the intensity of the reflected light received by each light receiving element of the image sensor 1102, and, based on the luminance signal, generates a luminance image consisting of the luminance values indicating the intensity of the reflected signal received by each light receiving element.

Further, based on the luminance signal, the measuring unit 1104 measures, for each light receiving element, the delay time of the light reception timing with respect to the light emission timing. Then, based on the delay time, it measures the distance to the reflection point for each light receiving element and generates a distance image consisting of these distances.
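As a rough illustration (the patent itself does not give the formula), a time-of-flight measurement of this kind typically converts the measured round-trip delay into a one-way distance as d = c * t / 2:

```python
C = 299_792_458.0  # speed of light, approximated as the vacuum value [m/s]

def distance_from_delay(delay_s: float) -> float:
    """One-way distance to the reflection point from a round-trip delay.

    The factor 1/2 accounts for the light travelling out and back.
    """
    return C * delay_s / 2.0
```

For example, a delay of 100 ns corresponds to a distance of roughly 15 m.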

The calculation unit 1105 calculates the linear distance d1 from the sensor to the object based on the distance image, and calculates, based on the luminance image, the azimuth s and the elevation angle g of the object with respect to the normal of the light receiving surface of the image sensor.

Here, the azimuth s takes the direction of the normal of the light receiving surface of the image sensor (the direction of light irradiation) as the reference (0 degrees); in a plan view, the clockwise direction around the sensor is positive and the counterclockwise direction is negative. The observation region is set to the range sandwiched between the +75 degree and −75 degree directions (a range of 150 degrees). That is, the observation region is horizontally symmetric about the direction in which the azimuth s is 0 degrees.

Further, here, the calculation unit 1105 calculates the vertical distance d2 between the sensor and the object (hereinafter referred to as the vertical distance d2) based on the linear distance d1 and the elevation angle g.

That is, the calculation unit 1105 obtains a detection result consisting of the linear distance d1, the azimuth s, the elevation angle g, and the vertical distance d2, and transmits at least the linear distance d1, the azimuth s, and the vertical distance d2 (these are also collectively referred to as the detection result) to the coordinate calculation unit 12.
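The patent states only that d2 is computed from d1 and g. A plausible sketch, under the assumption that g is the elevation angle measured from the horizontal plane, is d2 = d1 * sin(g):

```python
import math

def vertical_distance(d1: float, g_deg: float) -> float:
    """Vertical offset of the reflection point relative to the sensor.

    Assumes d2 = d1 * sin(g), with g the elevation angle in degrees;
    the exact formula is not specified in the patent text.
    """
    return d1 * math.sin(math.radians(g_deg))
```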

With reference to FIG. 4, the distortion of the vertical distance d2 that occurs depending on the azimuth s will be described.
In FIG. 4, the horizontal axis represents the azimuth s and the vertical axis represents the vertical distance d2. On the horizontal axis, 0 [degrees] indicates the direction of the normal of the light receiving surface of the image sensor. On the vertical axis, the position of 0 [m] is the vertical position of the sensor.

Each characteristic line p shows the detection result for a straight object arranged horizontally. Since the object is horizontal, the vertical distance d2 should be constant; in fact, the larger the absolute value of the azimuth s, the larger the absolute value of the vertical distance d2. That is, the closer to the horizontal edge of the observation region, the larger the optical distortion, and the larger the difference (error) between the measured vertical distance d2 and the true vertical distance.

An edge of an observation region having such a characteristic is also a portion that overlaps the adjacent observation region. For example, when the same object is detected by the sensors 111 and 112, the vertical distance d2 obtained by the sensor 111 may differ from the vertical distance d2 obtained by the sensor 112. In view of the characteristics shown in FIG. 4, the cause of this phenomenon is considered to be that the azimuth s obtained by the sensor 111 differs from the azimuth s obtained by the sensor 112.

In this embodiment, in such a case, the object is detected as a single object by the processing described later, without being erroneously detected as two different objects.

In FIG. 1, for example, the overlapping region A of the observation regions 21 and 22 includes a region (first region) in which the absolute value of the azimuth s obtained by the sensor 111 (first azimuth) is 45 degrees (the first value) or more, that is, the region consisting of the regions A1 and A3 in FIG. 1. The overlapping region A also includes a region (second region) in which the absolute value of the azimuth s obtained by the sensor 112 (second azimuth) is 45 degrees (the second value) or more, that is, the region consisting of the regions A2 and A3 in FIG. 1. Here the first value and the second value are the same, but they may differ.

In other words, the overlapping region A includes a region A3 in which the first region and the second region overlap. The region A1 is the first region excluding the region A3, and the region A2 is the second region excluding the region A3.

The overlapping region of the observation regions 22 and 23, that of the observation regions 23 and 24, and that of the observation regions 24 and 21 have the same configuration as the overlapping region A of the observation regions 21 and 22.

Next, the coordinate calculation method performed by the coordinate calculation unit 12 will be described with reference to FIG. 5.
Here, it is assumed that the coordinate calculation unit 12 has received from the sensor 111 the linear distance d1 (hereinafter, linear distance d11), the azimuth s (hereinafter, azimuth s1), and the vertical distance d2 (hereinafter, vertical distance d21) obtained by the sensor 111. The linear distance d11, the azimuth s1, and the vertical distance d21 are collectively referred to as the detection result r1.

Similarly, it is assumed that the coordinate calculation unit 12 has received from the sensor 112 the linear distance d1 (hereinafter, linear distance d12), the azimuth s (hereinafter, azimuth s2), and the vertical distance d2 (hereinafter, vertical distance d22) obtained by the sensor 112. The linear distance d12, the azimuth s2, and the vertical distance d22 are collectively referred to as the detection result r2.

First, the coordinate calculation unit 12 determines whether the difference between the linear distances d11 and d12 is equal to or less than a predetermined threshold th1, that is, whether the linear distances d11 and d12 are substantially the same (S1). When the difference between the linear distances is larger than the threshold th1 (S1: NO), the coordinates of one object are calculated based on the detection result r1 and the coordinates of a separate object are calculated based on the detection result r2 (S3), and the process ends.

The coordinates can be, for example, those of a coordinate system whose origin is the center of the vehicle, with the right of the vehicle as the positive X direction, the front of the vehicle as the positive Y direction, and the upward direction as the positive Z direction.
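As a minimal sketch of such a conversion (the sensor mounting offset, the sensor yaw, and the use of sqrt(d1^2 - d2^2) as the horizontal range are assumptions, not taken from the patent), a detection (d1, s, d2) could be mapped into this vehicle frame as follows:

```python
import math

def to_vehicle_coords(d1, s_deg, d2, sensor_x=0.0, sensor_y=2.0, yaw_deg=0.0):
    """Convert a detection (linear distance d1, azimuth s, vertical distance d2)
    into vehicle coordinates (X right, Y forward, Z up).

    Assumptions (not specified in the patent): the sensor sits at
    (sensor_x, sensor_y) in the vehicle frame with its normal rotated by
    yaw_deg clockwise from the +Y axis, azimuth is clockwise-positive in
    plan view, and the horizontal range is sqrt(d1**2 - d2**2).
    """
    horizontal = math.sqrt(max(d1 * d1 - d2 * d2, 0.0))
    heading = math.radians(yaw_deg + s_deg)  # clockwise from vehicle +Y
    x = sensor_x + horizontal * math.sin(heading)
    y = sensor_y + horizontal * math.cos(heading)
    z = d2  # vertical offset relative to the sensor height
    return (x, y, z)
```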

When the difference between the linear distances is equal to or less than the threshold th1 (S1: YES), it is determined whether the absolute value of the azimuth s1 is equal to or less than 45 degrees (the first value) (S5). When the absolute value of the azimuth s1 is equal to or less than 45 degrees (S5: YES), the coordinates of the object are calculated based on the detection result r1, without calculating them based on the detection result r2 (S7), and the process ends. That is, the coordinates of an object in the region A2 of FIG. 1 are calculated based on the detection result r1.

Suppose, for example, that the coordinates of an object in the region A2 were calculated based on the detection result r2. Since the vertical distance in the detection result r2 contains a large error, accurate coordinates could not be obtained. Therefore, by using the vertical distance of the detection result r1, which has a small error, instead of that of the detection result r2, which has a large error, more accurate coordinates can be obtained.

On the other hand, when the absolute value of the azimuth s1 is larger than 45 degrees (the first value) (S5: NO), it is determined whether the absolute value of the azimuth s2 is equal to or less than 45 degrees (the second value) (S9). When the absolute value of the azimuth s2 is equal to or less than 45 degrees (S9: YES), the coordinates of the object are calculated based on the detection result r2, without calculating them based on the detection result r1 (S11), and the process ends. That is, the coordinates of an object in the region A1 of FIG. 1 are calculated based on the detection result r2.

Suppose, for example, that the coordinates of an object in the region A1 were calculated based on the detection result r1. Since the vertical distance in the detection result r1 contains a large error, accurate coordinates could not be obtained. Therefore, by using the vertical distance of the detection result r2, which has a small error, instead of that of the detection result r1, which has a large error, more accurate coordinates can be obtained.

On the other hand, when the absolute value of the azimuth s2 is larger than 45 degrees (the second value) (S9: NO), it is determined whether the difference between the vertical distances d21 and d22 is equal to or less than a predetermined threshold th2, that is, whether the vertical distances d21 and d22 are substantially the same (S13).

When the difference between the vertical distances is larger than the threshold th2 (S13: NO), the coordinates of one object are calculated based on the detection result r1 and the coordinates of a separate object are calculated based on the detection result r2 (S3), and the process ends. Alternatively, when the difference between the vertical distances is larger than the threshold th2 (S13: NO), the detections may be treated as outside the observation target and no coordinates calculated.

On the other hand, when the difference between the vertical distances is equal to or less than the threshold th2 (S13: YES), the coordinates of the object are calculated using the detection results r1 and r2 (S15), and the process ends.

In step S15, for example, the average value dav of the vertical distances d21 and d22 is calculated, and the coordinates are calculated based on the linear distance contained in one of the two detection results, the azimuth contained in one of the two detection results, and the average value dav (S15).

If only one detection result were selected and used, the vertical distance contained in that result might contain a larger error than the vertical distance contained in the unselected result, and the error in the coordinates would also be large. Therefore, in this embodiment, by using both detection results, more accurate coordinates with a smaller error can be obtained.
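The branch structure of FIG. 5 (steps S1 through S15) can be sketched as follows. The threshold values and the tuple representation are placeholders, since the patent does not fix them, and the sketch stops at the selected (d1, s, dav) triples rather than converting them to vehicle coordinates:

```python
def calculate_coordinates(r1, r2, th1=0.5, th2=0.3,
                          first_value=45.0, second_value=45.0):
    """Sketch of the FIG. 5 flow. Each detection result is a dict with keys
    'd1' (linear distance), 's' (azimuth, degrees) and 'd2' (vertical
    distance). Returns a list of (d1, s, d2) triples, one per object.
    """
    if abs(r1["d1"] - r2["d1"]) > th1:                  # S1: NO
        return [(r1["d1"], r1["s"], r1["d2"]),          # S3: two separate objects
                (r2["d1"], r2["s"], r2["d2"])]
    if abs(r1["s"]) <= first_value:                     # S5: YES
        return [(r1["d1"], r1["s"], r1["d2"])]          # S7: use r1 only
    if abs(r2["s"]) <= second_value:                    # S9: YES
        return [(r2["d1"], r2["s"], r2["d2"])]          # S11: use r2 only
    if abs(r1["d2"] - r2["d2"]) > th2:                  # S13: NO
        # The text also allows excluding these detections entirely.
        return [(r1["d1"], r1["s"], r1["d2"]),          # S3: two separate objects
                (r2["d1"], r2["s"], r2["d2"])]
    dav = (r1["d2"] + r2["d2"]) / 2.0                   # S15: average d2,
    return [(r1["d1"], r1["s"], dav)]                   # d1 and s from one result
```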

In the above example, the coordinate calculation method based on the detection results of the sensors 111 and 112 has been described; the coordinate calculation methods based on the detection results of the sensors 112 and 113, of the sensors 113 and 114, and of the sensors 114 and 111 are the same.

As described above, in the coordinate calculation method of Embodiment 1, light is irradiated around the vehicle 1 and the reflected light is received by the image sensor 1102, whereby a detection result including the linear distance d1, the azimuth s, and the elevation angle g is obtained by each of the first sensor (the sensor 111 in the above example) and the second sensor (the sensor 112 in the above example). The coordinates are then calculated using these detection results.

The first sensor (111) and the second sensor (112) are mounted on the vehicle such that their observation regions 21 and 22 partially overlap (FIG. 1). The overlapping region A of the observation regions includes the first region (A1 + A3), in which the absolute value of the first azimuth obtained by the first sensor (111) is equal to or greater than the first value (45 degrees in the above example), and the second region (A2 + A3), in which the absolute value of the second azimuth obtained by the second sensor is equal to or greater than the second value (45 degrees in the above example). There is also an overlapping region (A3) of the first region and the second region.

When the absolute value of the first azimuth is equal to or less than the first value (S5: YES), the coordinates are calculated using the detection result of the first sensor (S7). When the absolute value of the second azimuth is equal to or less than the second value (S9: YES), the coordinates are calculated using the detection result of the second sensor (S11). When the absolute value of the first azimuth is larger than the first value and the absolute value of the second azimuth is larger than the second value (S9: NO), the coordinates are calculated using the detection result of the first sensor and the detection result of the second sensor (S15).

Thus, the coordinates of an object in the region (A1) obtained by removing the overlapping region (A3) from the first region are calculated based on the detection result of the second sensor, which has a small error, and the coordinates of an object in the region (A2) obtained by removing the overlapping region (A3) from the second region are calculated based on the detection result of the first sensor, which has a small error. As a result, more accurate coordinates can be obtained.

Further, since the coordinates of an object in the overlapping region (A3) are calculated based on the detection results of both the first and second sensors rather than on only one detection result, more accurate coordinates can be obtained.

Further, when the absolute value of the first azimuth is larger than the first value and the absolute value of the second azimuth is larger than the second value (S9: NO), and the difference between the vertical distance in the first sensor's detection result and that in the second sensor's detection result is equal to or less than the predetermined threshold th2 (S13: YES), the coordinates are calculated using both detection results (S15). As a result, the coordinates of the object are obtained only when the objects detected by the first and second sensors can be judged to be the same object (S13: YES).

Further, the average value dav of the vertical distances is calculated, and the coordinates are calculated using the linear distance of one of the two detection results, the azimuth of one of the two detection results, and the average value dav (S15). Thus, the coordinates of the object are obtained only when the objects detected by the first and second sensors can be judged to be the same object (S13: YES).

(Embodiment 2)
Next, Embodiment 2 will be described. Embodiment 2 differs from Embodiment 1 in the configuration of the overlapping regions of the observation regions and is otherwise the same, so the description focuses on the differences.

FIG. 6 shows the intersection point P of the outer edge of the observation region 21 and the outer edge of the observation region 22. The first region consisting of the regions A1 and A3 is still a region in which the absolute value of the azimuth s obtained by the sensor 111 (the first azimuth) is equal to or greater than the first value, but the first value now indicates the azimuth from the sensor 111 to the intersection point P. As a result, the first value is larger than the 45 degrees used as the first value in Embodiment 1.

Similarly, the second region consisting of the regions A2 and A3 is still a region in which the absolute value of the azimuth s obtained by the sensor 112 (the second azimuth) is equal to or greater than the second value, but the second value now indicates the azimuth from the sensor 112 to the intersection point P. As a result, the second value is larger than the 45 degrees used as the second value in Embodiment 1.

As a result, the area of the region A3 can be made smaller than in Embodiment 1. Consequently, when calculating the coordinates of objects, the number of times the average value must be computed decreases, reducing the computational load.
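As an illustrative computation of such a threshold (the sensor positions, normal directions, and point P below are made-up geometry, not values from the patent), the azimuth from a sensor to the intersection point P follows from plan-view trigonometry:

```python
import math

def azimuth_to_point(sensor_pos, normal_deg, point):
    """Azimuth of a point as seen from a sensor, measured clockwise
    (plan view) from the sensor's normal direction.

    Positions are (x, y) in the vehicle frame (X right, Y forward);
    normal_deg is the normal's clockwise angle from +Y.
    """
    dx = point[0] - sensor_pos[0]
    dy = point[1] - sensor_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))  # clockwise from +Y
    return bearing - normal_deg
```

The per-sensor threshold of Embodiment 2 would then be the absolute value of this azimuth evaluated at the intersection point P of the two observation-region edges.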

(Modification of Example 2)
The arrangement of the sensors is not limited to those of Examples 1 and 2; for example, as shown in FIG. 7, the sensors 112 and 114 may be shifted forward. Even in this case, by setting the first and second azimuth angles to indicate the azimuths to the intersection P, the area of region A3 can be reduced and the computational load lowered.

(Another Modification of Example 2)
As shown in FIG. 8, the sensors may also be arranged at the left and right ends of the front of the vehicle and at the left and right ends of the rear. Even in this case, by setting the first and second azimuth angles to indicate the azimuths to the intersection P, the area of region A3 can be reduced and the computational load lowered.

Although not shown, the number of sensors is not limited to four; it may be two, or five or more. The above technique may also be applied to only some of a plurality of sensors, for example to two out of three sensors.

In the above examples, the observation regions overlap in the horizontal direction, but they may instead overlap in another direction, for example in the direction of rotation about an axis along the traveling direction of the vehicle. In that case, the sensors may be arranged at the left and right ends of the vehicle roof, for example.

Although embodiments of the present invention have been described above, the statements and drawings forming part of this disclosure should not be understood as limiting the invention. Various alternative embodiments, examples, and operational techniques will become apparent to those skilled in the art from this disclosure.

1  vehicle
111-114  sensors
12  coordinate calculation unit
1101  light transmission unit
1102  image sensor
1103  drive control unit
1104  measurement unit
1105  calculation unit
21, 22, 23, 24  observation regions
d1  straight-line distance
d2  vertical distance
s  azimuth angle
g  elevation angle
th1  threshold for the difference in straight-line distance
th2  threshold for the difference in vertical distance
r1  detection result from sensor 111
r2  detection result from sensor 112
d11  straight-line distance included in detection result r1
d12  straight-line distance included in detection result r2
s1  azimuth angle included in detection result r1
s2  azimuth angle included in detection result r2
d21  vertical distance included in detection result r1
d22  vertical distance included in detection result r2

Claims (4)

1. A coordinate calculation method for calculating coordinates of an object by using a first sensor and a second sensor, each of which irradiates the surroundings of a vehicle with light and receives, with an image sensor, reflected light from an object that has received the light, thereby each obtaining a detection result including a straight-line distance to the object, an azimuth angle and an elevation angle with respect to the normal of the light-receiving surface of the image sensor, and a vertical distance to the object computed on the basis of the straight-line distance and the elevation angle, wherein
the first and second sensors are mounted on the vehicle so that parts of their observation regions overlap, the overlapping part of the observation regions includes a first region in which the absolute value of a first azimuth angle obtained by the first sensor is larger than a first value and a second region in which the absolute value of a second azimuth angle obtained by the second sensor is larger than a second value, and there exist an overlap of the first region and the second region, a part of the first region excluding that overlap, and a part of the second region excluding that overlap, the method comprising:
calculating the coordinates using the detection result of the first sensor when the absolute value of the first azimuth angle is equal to or less than the first value;
calculating the coordinates using the detection result of the second sensor when the absolute value of the second azimuth angle is equal to or less than the second value; and
when the absolute value of the first azimuth angle is larger than the first value and the absolute value of the second azimuth angle is larger than the second value, calculating the coordinates using the detection results of both the first and second sensors, namely computing the average value of the vertical distances included in the detection results of the first and second sensors and calculating the coordinates using the straight-line distance of one of the two detection results, the azimuth angle of that detection result, and the average value.
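The selection logic of claim 1 can be sketched as follows. The dictionary layout, key names, and the polar-to-Cartesian conversion are assumptions made for illustration, not part of the claimed method.

```python
import math

def polar_to_xy(d, s_deg, d2):
    # Project the straight-line distance d onto the horizontal plane
    # using the vertical distance d2, then resolve by azimuth s_deg.
    horiz = math.sqrt(max(d * d - d2 * d2, 0.0))
    rad = math.radians(s_deg)
    return (horiz * math.cos(rad), horiz * math.sin(rad), d2)

def compute_coordinates(r1, r2, first_value, second_value):
    # r1, r2: detection results as {'d': straight-line distance,
    # 's': azimuth in degrees, 'd2': vertical distance} (assumed layout).
    if r1 is not None and abs(r1['s']) <= first_value:
        return polar_to_xy(r1['d'], r1['s'], r1['d2'])   # first sensor alone
    if r2 is not None and abs(r2['s']) <= second_value:
        return polar_to_xy(r2['d'], r2['s'], r2['d2'])   # second sensor alone
    # Both azimuths exceed their thresholds: the object lies in the
    # shared region A3, so average the two vertical distances and pair
    # the average with one detection's distance and azimuth.
    d2_avg = (r1['d2'] + r2['d2']) / 2.0
    return polar_to_xy(r1['d'], r1['s'], d2_avg)
```

Averaging only the vertical distance (rather than all components) keeps the per-object computation in region A3 cheap, which is the load reduction the description attributes to this scheme.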
2. The coordinate calculation method according to claim 1, wherein, when the absolute value of the first azimuth angle is larger than the first value, the absolute value of the second azimuth angle is larger than the second value, and the difference between the vertical distance included in the detection result of the first sensor and the vertical distance included in the detection result of the second sensor is equal to or less than a predetermined threshold, the coordinates are calculated using both detection results.
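The consistency check of claim 2 (using th2, the threshold for the difference in vertical distance, from the reference-sign list) can be sketched as below; the function name and signature are illustrative assumptions.

```python
def use_both_results(d21, d22, th2):
    # Combine the two detections only when their vertical distances
    # (d21 from sensor 111, d22 from sensor 112) agree to within th2;
    # a larger difference suggests they belong to different objects.
    return abs(d21 - d22) <= th2
```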
3. The coordinate calculation method according to claim 1 or 2, wherein the first value indicates the azimuth from the first sensor to the intersection of the outer edge of the observation region of the first sensor and the outer edge of the observation region of the second sensor, and the second value indicates the azimuth from the second sensor to the intersection.
4. A coordinate calculation device that calculates coordinates of an object by using a first sensor and a second sensor, each of which irradiates the surroundings of a vehicle with light and receives, with an image sensor, reflected light from an object that has received the light, thereby each obtaining a detection result including a straight-line distance to the object, an azimuth angle and an elevation angle with respect to the normal of the light-receiving surface of the image sensor, and a vertical distance to the object computed on the basis of the straight-line distance and the elevation angle, wherein
the first and second sensors are mounted on the vehicle so that parts of their observation regions overlap, the overlapping part of the observation regions includes a first region in which the absolute value of a first azimuth angle obtained by the first sensor is larger than a first value and a second region in which the absolute value of a second azimuth angle obtained by the second sensor is larger than a second value, and there exist an overlap of the first region and the second region, a part of the first region excluding that overlap, and a part of the second region excluding that overlap,
the device comprising a coordinate calculation unit that calculates the coordinates using the detection result of the first sensor when the absolute value of the first azimuth angle is equal to or less than the first value, calculates the coordinates using the detection result of the second sensor when the absolute value of the second azimuth angle is equal to or less than the second value, and, when the absolute value of the first azimuth angle is larger than the first value and the absolute value of the second azimuth angle is larger than the second value, calculates the coordinates using the detection results of both the first and second sensors,
wherein the coordinate calculation unit computes the average value of the vertical distances included in the detection results of the first and second sensors and calculates the coordinates using the straight-line distance of one of the two detection results, the azimuth angle of that detection result, and the average value.
JP2016176440A 2016-09-09 2016-09-09 Coordinate calculation method and coordinate calculation device Active JP6819161B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016176440A JP6819161B2 (en) 2016-09-09 2016-09-09 Coordinate calculation method and coordinate calculation device


Publications (2)

Publication Number Publication Date
JP2018040755A JP2018040755A (en) 2018-03-15
JP6819161B2 true JP6819161B2 (en) 2021-01-27

Family

ID=61625929


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4802096A (en) * 1987-05-14 1989-01-31 Bell & Howell Company Controlled direction non-contact detection system for automatic guided vehicles
JP3196861B2 (en) * 1992-12-21 2001-08-06 同和鉱業株式会社 Object position detection method and object position detection device
JP4284652B2 (en) * 2004-03-08 2009-06-24 オムロン株式会社 Radar equipment
JP2011185664A (en) * 2010-03-05 2011-09-22 Panasonic Electric Works Co Ltd Object detector
JP2012247226A (en) * 2011-05-25 2012-12-13 Optex Co Ltd Distance image camera and distance image combination method
JP6029306B2 (en) * 2012-03-29 2016-11-24 住友建機株式会社 Perimeter monitoring equipment for work machines
JP6019959B2 (en) * 2012-09-06 2016-11-02 富士通株式会社 Object detection device, object detection program, and vehicle



Legal Events

Date        Code  Description
2019-03-28  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2020-03-12  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2020-03-24  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2020-05-18  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2020-09-23  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2020-11-13  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD              Decision of grant or rejection written
2020-12-01  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2020-12-14  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R151              Written notification of patent or utility model registration (Ref document number: 6819161; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R151)