JP6841553B2 - Imaging device - Google Patents

Imaging device

Info

Publication number
JP6841553B2
Authority
JP
Japan
Prior art keywords
dimensional position
unit
straight
imaging device
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2018559042A
Other languages
Japanese (ja)
Other versions
JPWO2018123640A1 (en)
Inventor
大 堤
直也 多田
永崎 健
寛人 三苫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Astemo Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo Ltd filed Critical Hitachi Astemo Ltd
Publication of JPWO2018123640A1 publication Critical patent/JPWO2018123640A1/en
Application granted granted Critical
Publication of JP6841553B2 publication Critical patent/JP6841553B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models, related to vehicle motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/593 - Depth or shape recovery from multiple images from stereo images
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Description

The present invention relates to an imaging device that is mounted on a moving body such as a vehicle and used to correct a steering angle.

In recent years, vehicle control technologies such as automated driving have attracted much attention. Accurate vehicle control requires a large number of in-vehicle sensor outputs updated in real time; among these, the steering angle sensor value used to calculate the behavior of the host vehicle must in particular be accurate.

However, an offset is added to the zero point of the steering angle sensor value as the sensor degrades over time. Therefore, to obtain an accurate steering angle value without delay, it is essential to calculate a steering angle correction value.

Furthermore, to reduce cost, it is desirable that the steering angle correction value be calculated on the side that receives the steering angle sensor value (for example, in an imaging device used for vehicle control, such as an in-vehicle stereo camera).

For this reason, various techniques for calculating the steering angle correction value, and imaging devices that implement them, have been proposed.

For example, Patent Document 1 describes an imaging device that calculates a steering angle correction value by focusing on a white line in an image captured by an in-vehicle camera.

Patent Document 1: Japanese Unexamined Patent Application Publication No. 2006-199242

However, in the prior art described in Patent Document 1 and elsewhere, a target travel line cannot be set unless a reference line in the image, such as a white line, has been captured, so the steering angle correction value cannot be calculated.

If the steering angle correction value could be calculated even on roads without a reference line such as a white line, the accuracy of vehicle control could be improved.

Realization of such a capability is therefore desired.

An object of the present invention is to realize an imaging device that can determine whether the host vehicle is traveling straight even while a moving body such as a vehicle is traveling on a road or the like that has no reference straight line such as a white line.

To achieve the above object, the present invention is configured as follows.

The imaging device of the present invention includes:
an imaging unit having a plurality of imaging elements that acquire a plurality of images;
a parallax image generation unit that generates, from the plurality of acquired images, a parallax image of an imaging target that is a stationary object;
a three-dimensional position information extraction unit that acquires a plurality of time-series pieces of three-dimensional position information of the imaging target based on the parallax image generated by the parallax image generation unit; and
a straight-ahead determination unit that determines whether a moving body is traveling straight based on the plurality of pieces of three-dimensional position information acquired by the three-dimensional position information extraction unit.
The three-dimensional position information extraction unit includes:
a recognition unit that recognizes the imaging target based on the parallax image generated by the parallax image generation unit; and
a three-dimensional position trajectory generation unit that acquires the three-dimensional position information of the imaging target recognized by the recognition unit and generates a three-dimensional position trajectory of the imaging target from three-dimensional position information acquired in the past and three-dimensional position information extracted at the present time.
The straight-ahead determination unit makes the straight-ahead determination based on whether the three-dimensional position trajectory generated by the three-dimensional position trajectory generation unit has a predetermined curvature or more.

According to the present invention, an imaging device can be realized that can determine whether the host vehicle is traveling straight even while a moving body such as a vehicle is traveling on a road or the like without a reference straight line such as a white line, so the steering angle correction value can be calculated regardless of road conditions.

Problems, configurations, and effects other than those described above will become apparent from the following description of the embodiments.

FIG. 1 is a schematic configuration diagram of an imaging device according to one embodiment of the present invention, showing an example of an imaging device mounted on a vehicle.
FIG. 2 is a diagram showing the internal configuration of the three-dimensional position information extraction unit shown in FIG. 1.
FIG. 3 is a diagram defining the conditions on the road surface on which the host vehicle travels, used to explain one embodiment of the present invention.
FIG. 4 is a bird's-eye view of the scene shown in FIG. 3.
FIG. 5 is a schematic explanatory diagram of the straight-ahead determination logic based on the three-dimensional position trajectory of a stationary three-dimensional object in one embodiment of the present invention.
FIG. 6 is a processing flow used when the straight-ahead determination unit determines whether the host vehicle is traveling straight.
FIG. 7 is a diagram showing an example of the flow for determining, in step 605 of FIG. 6, whether the plotted result forms a straight line.
FIG. 8 is a configuration diagram of an imaging device in which a steering angle correction value calculation unit is added to the configuration of the imaging device shown in FIG. 1.

An embodiment of the present invention will be described with reference to the accompanying drawings.

FIG. 1 is a schematic configuration diagram of an imaging device according to one embodiment of the present invention and shows an example of an imaging device mounted on a vehicle.

In FIG. 1, a plurality of CMOS elements (imaging elements) 101 and 101' constituting the imaging unit form a stereo camera and capture input images (that is, acquire input images). The parallax image generation unit 103 generates the parallax using the input images captured by the CMOS elements 101 and 101'.

The three-dimensional position information extraction unit 105 extracts (acquires) characteristic three-dimensional position information using the parallax generated by the parallax image generation unit 103. The straight-ahead determination unit 107 determines whether the host vehicle is traveling straight based on the three-dimensional position information output by the three-dimensional position information extraction unit 105.

FIG. 2 is a diagram showing the internal configuration of the three-dimensional position information extraction unit 105 shown in FIG. 1.

As shown in FIG. 2, the three-dimensional position information extraction unit 105 includes a recognition unit 113 and a three-dimensional position trajectory generation unit 115.

The recognition unit 113 recognizes the imaging target (for example, a stationary three-dimensional object or road surface texture) using the parallax image generated by the parallax image generation unit 103, and outputs the result to the three-dimensional position trajectory generation unit 115. The three-dimensional position trajectory generation unit 115 plots the three-dimensional position of the target recognized by the recognition unit 113 for each frame of the captured images and generates the time-series transition of the target's three-dimensional position (the three-dimensional position trajectory).

The three-dimensional position trajectory generation unit 115 then outputs the generated three-dimensional position trajectory to the straight-ahead determination unit 107.
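
To make the frame-by-frame bookkeeping concrete, the following Python sketch accumulates the per-frame 3D position of one tracked stationary object into a short history. It is a minimal sketch: the class name TrajectoryBuffer, the coordinate convention noted in the comments, and the numeric values are illustrative assumptions, not details taken from the patent.

```python
from collections import deque
from typing import Deque, List, Tuple

# (x, y, z) in the vehicle frame. Assumed convention for this sketch only:
# x = lateral offset [m], y = forward distance [m], z = height [m]; the patent
# itself only fixes the origin O at the tip of the hood.
Point3D = Tuple[float, float, float]


class TrajectoryBuffer:
    """Accumulates the per-frame 3D position of one recognized stationary object,
    keeping only the most recent `max_frames` observations."""

    def __init__(self, max_frames: int = 10) -> None:
        self._points: Deque[Point3D] = deque(maxlen=max_frames)

    def add_observation(self, position: Point3D) -> None:
        """Store the object's 3D position measured in the current frame."""
        self._points.append(position)

    def trajectory(self) -> List[Point3D]:
        """Return the time-ordered positions (oldest first)."""
        return list(self._points)


# Example with made-up values: the sign 302 as seen at frames n and n+1
# while the vehicle moves forward.
buffer = TrajectoryBuffer()
buffer.add_observation((1.8, 25.0, 1.5))   # frame n:   (x, y, z)
buffer.add_observation((1.8, 23.6, 1.5))   # frame n+1: (x', y', z')
print(buffer.trajectory())
```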

FIG. 3 is a diagram that defines the situation on the road surface 301 used to explain one embodiment of the present invention. In the embodiment described here, it is assumed that a preceding vehicle 303 (shown in FIG. 4), left and right white lines 301L and 301R, road surface texture (not shown), and a sign (a stationary three-dimensional object) 302 are present on the road surface 301.

The situation shown in FIG. 3 is only an example; in practice, the present invention functions and can be carried out as long as at least one stationary three-dimensional object or detectable texture is present.

FIG. 4 is a bird's-eye view of the scene in FIG. 3. In FIG. 4, the range imaged by the imaging units 101 and 101' of the imaging device is indicated by dotted lines. In the example shown in FIG. 4, the preceding vehicle 303 is ahead of the host vehicle 300, and the sign 302 is ahead on the left. The imaging units 101 and 101' are arranged at the front of the host vehicle 300.

FIG. 5 is a schematic explanatory diagram of the straight-ahead determination logic based on the three-dimensional position trajectory of a stationary three-dimensional object in one embodiment of the present invention. FIG. 5(a) is a bird's-eye view of the state in which the host vehicle 300 has moved for one frame from the state shown in FIG. 4, and FIG. 5(b) is a diagram conceptually illustrating the straight-ahead determination.

In FIGS. 3, 4, and 5, let n be the current frame of the captured images, and consider the case where the sign 302 is detected as a stationary three-dimensional object in frame n.

With the origin O placed at the tip of the hood at the front of the host vehicle 300, on the vehicle's lengthwise centerline, let the three-dimensional position of the sign 302 be (x, y, z).

Then let (x', y', z') be the three-dimensional position to which the sign 302 has moved in frame n+1, the frame following frame n.

Based on the information recognized by the recognition unit 113 of the three-dimensional position information extraction unit 105, the three-dimensional position trajectory generation unit 115 monitors the three-dimensional position of the sign 302 over a plurality of frames, from frame n to frame n+1, and computes the three-dimensional position trajectory 501 of the sign 302. The arrow 500 in FIG. 5(b) indicates the traveling direction of the host vehicle 300.

When the straight-ahead determination unit 107 determines, from the trajectory generated by the three-dimensional position trajectory generation unit 115, that the three-dimensional position trajectory of the sign 302 describes a straight line, it judges that the host vehicle 300 is traveling straight.

With this method, the straight-ahead determination can be made even on a road surface without a reference straight line such as a white line, as long as a target is available, for example a stationary three-dimensional object (such as the sign 302) or road surface texture.

FIG. 6 is a diagram showing an example of the processing flow used when the straight-ahead determination unit 107 determines whether the host vehicle 300 is traveling straight.

In FIG. 6, the three-dimensional position coordinates of the stationary three-dimensional object (the sign 302) are acquired (step 601). Next, the three-dimensional position information of the stationary three-dimensional object 302 acquired in the past and the three-dimensional position information extracted at the present time are plotted over a plurality of frames (step 603).

Then, it is determined whether the plotted result forms a straight line (step 605). If it is determined in step 605 that it does, the host vehicle 300 is determined to be traveling straight (step 607). If it is determined in step 605 that it does not, the host vehicle 300 is determined not to be traveling straight (step 609).
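
Expressed in code, the flow of FIG. 6 reduces to a few lines once the trajectory points are available. The sketch below is illustrative only: the function names judge_straight_travel and is_straight_line are hypothetical, and the straight-line test of step 605 is passed in as a callable; one possible implementation of that test, based on the circle fit of FIG. 7, is sketched after the description of FIG. 7 below.

```python
from typing import Callable, List, Tuple

Point3D = Tuple[float, float, float]


def judge_straight_travel(
    trajectory: List[Point3D],
    is_straight_line: Callable[[List[Point3D]], bool],
) -> bool:
    """FIG. 6 flow: steps 601 and 603 are assumed to have filled `trajectory`
    with the stationary object's past and current 3D positions; step 605
    applies the straight-line test; steps 607 and 609 report the result."""
    if len(trajectory) < 3:
        # Too few observations to evaluate the trajectory shape.
        return False
    return is_straight_line(trajectory)
```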

FIG. 7 is a diagram showing an example of the flow for determining, in step 605 of FIG. 6, whether the plotted result forms a straight line.

In FIG. 7, when n three-dimensional position coordinates have been obtained (for example, three points), the two-dimensional coordinates (x, y) of the three points are extracted from their three-dimensional coordinates (x, y, z) (step 701).

Using the two-dimensional coordinates of the three extracted points, simultaneous equations in the unknowns a, b, and R, for example equation (1) below, are solved to obtain the radius R (step 703).

(x - a)² + (y - b)² = R² ... (1)

It is then determined whether the radius R obtained from equation (1) is equal to or greater than a threshold (for example, R = 1000 m) (step 705). If the radius R is equal to or greater than the threshold (for example, R = 1000 m), the trajectory is determined to be a straight line and the process ends (step 707).

If the radius R is less than the threshold, the trajectory is determined not to be a straight line, and the process ends (step 709).
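
One way to implement steps 701 to 709 is to fit the circle of equation (1) through the three plotted points by subtracting the equations pairwise, which leaves a 2x2 linear system in a and b, and then compare the resulting radius with the threshold. The Python sketch below follows that reading; the function names and the choice of keeping the first two coordinate components in step 701 are assumptions made for illustration, not details fixed by the patent.

```python
import math
from typing import Sequence, Tuple

Point3D = Tuple[float, float, float]


def fit_circle_radius(p1, p2, p3):
    """Solve (x - a)^2 + (y - b)^2 = R^2 for three 2D points and return (a, b, R).
    Returns an infinite radius when the points are (nearly) collinear."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equation for point 2 (resp. point 3) from that of
    # point 1 removes the quadratic terms and gives a linear system in a and b:
    #   2(x1 - x2) a + 2(y1 - y2) b = (x1^2 + y1^2) - (x2^2 + y2^2)
    #   2(x1 - x3) a + 2(y1 - y3) b = (x1^2 + y1^2) - (x3^2 + y3^2)
    a11, a12 = 2.0 * (x1 - x2), 2.0 * (y1 - y2)
    a21, a22 = 2.0 * (x1 - x3), 2.0 * (y1 - y3)
    b1 = (x1 ** 2 + y1 ** 2) - (x2 ** 2 + y2 ** 2)
    b2 = (x1 ** 2 + y1 ** 2) - (x3 ** 2 + y3 ** 2)
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-9:
        return 0.0, 0.0, math.inf  # collinear points: the "circle" degenerates to a line
    a = (b1 * a22 - b2 * a12) / det
    b = (a11 * b2 - a21 * b1) / det
    return a, b, math.hypot(x1 - a, y1 - b)


def is_straight_line(trajectory: Sequence[Point3D], r_threshold_m: float = 1000.0) -> bool:
    """Step 701: take the last three 3D positions and keep only (x, y).
    Steps 703-709: fit the circle of equation (1) and treat the trajectory as
    straight when the radius R is at least the threshold (e.g. R = 1000 m)."""
    if len(trajectory) < 3:
        return False
    pts2d = [(p[0], p[1]) for p in trajectory[-3:]]
    _, _, radius = fit_circle_radius(*pts2d)
    return radius >= r_threshold_m


# Example with made-up, collinear positions of the sign 302: judged straight.
print(is_straight_line([(1.8, 25.0, 1.5), (1.8, 23.6, 1.5), (1.8, 22.2, 1.5)]))  # True
```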

FIG. 8 is a configuration diagram of an imaging device in which a steering angle correction value calculation unit 809 is added to the configuration of the imaging device of FIG. 1.

The steering angle correction value calculation unit 809 performs steering angle correction when the straight-ahead determination unit 107 determines that the vehicle is traveling straight. The steering angle correction can be performed using a known method.

The steering angle correction value calculation unit 809 can also be configured to calculate the steering angle correction value uniformly, using the curvature and deviation of the radius R obtained by the straight-ahead determination unit 107, not only when the straight-ahead determination unit 107 determines that the vehicle is traveling straight but also when it determines that the vehicle is not traveling straight.
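
The patent leaves the correction computation itself to known methods. As one generic illustration of such a method (an assumption here, not something the patent specifies), the zero-point offset of the steering angle sensor can be estimated by averaging the raw readings collected during intervals that the straight-ahead determination unit judged to be straight travel:

```python
from statistics import mean
from typing import List


def estimate_zero_point_offset(straight_travel_samples: List[float]) -> float:
    """Estimate the steering angle sensor's zero-point offset [deg] as the mean
    of the raw readings recorded while the vehicle was judged to be driving
    straight. A generic illustration only; the patent defers to known methods."""
    if not straight_travel_samples:
        return 0.0
    return mean(straight_travel_samples)


def corrected_steering_angle(raw_angle_deg: float, offset_deg: float) -> float:
    """Apply the estimated offset to a raw steering angle reading."""
    return raw_angle_deg - offset_deg


# Example with made-up readings that hover around +0.4 deg during straight travel.
offset = estimate_zero_point_offset([0.41, 0.39, 0.40, 0.42])
print(corrected_steering_angle(5.0, offset))  # about 4.6 deg
```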

As described above, according to one embodiment of the present invention, the imaging units 101 and 101' continuously image a stationary object 302 present around the host vehicle 300, and whether the host vehicle 300 is traveling straight is determined based on whether the trajectory of the object 302 in the captured images is a straight line. An imaging device can therefore be realized that can determine whether the host vehicle is traveling straight even while a moving body such as a vehicle is traveling on a road or the like without a reference straight line such as a white line.

In the example described above, the stationary object was the sign 302, but other stationary objects such as buildings and trees may also be used. The stationary object can be determined automatically from among the objects present; for example, the object first judged to be stationary in the obtained images may be selected as the target.

The straight-ahead determination of the host vehicle 300 may also be made from the respective trajectories of a plurality of stationary three-dimensional objects.

The straight-ahead determination is possible as long as the target lies within the range of distances at which it can be imaged by the imaging elements, for example between 0.1 m and 300 m.

In the example described above, a stationary object in front of the host vehicle 300 was used as the target of the straight-ahead determination, but stationary objects in any direction around the host vehicle can also be used. For example, a stationary object behind the host vehicle can be used as the target. In that case, an imaging unit having a plurality of imaging elements needs to be arranged at the rear of the host vehicle 300.

It is also possible to arrange a plurality of imaging units that image the surroundings of the host vehicle 300. For example, imaging units may be arranged at the front and rear of the host vehicle 300, and the stationary objects they image may be used as targets for the straight-ahead determination.

Furthermore, although the example above applies the present invention to an imaging device mounted on a vehicle, the invention is not limited to vehicles and can be applied to any moving body that recognizes its surroundings, steers, and moves.

For example, the present invention is also applicable to a load-carrying robot that transports packages or the like to a destination; in that case, the straight-ahead determination can be made while traveling even when no reference line such as a white line exists.

101, 101' ... imaging unit
103 ... parallax image generation unit
105 ... three-dimensional position information extraction unit
107 ... straight-ahead determination unit
113 ... recognition unit
115 ... three-dimensional position trajectory generation unit
300 ... host vehicle
301 ... road surface
301L, 301R ... white lines
302 ... sign (stationary object)
303 ... preceding vehicle
500 ... traveling direction of the host vehicle
501 ... trajectory of the stationary object
809 ... steering angle correction value calculation unit

Claims (6)

1. An imaging device comprising:
an imaging unit having a plurality of imaging elements that acquire a plurality of images;
a parallax image generation unit that generates, from the plurality of acquired images, a parallax image of an imaging target that is a stationary object;
a three-dimensional position information extraction unit that acquires a plurality of time-series pieces of three-dimensional position information of the imaging target based on the parallax image generated by the parallax image generation unit; and
a straight-ahead determination unit that determines whether a moving body is traveling straight based on the plurality of pieces of three-dimensional position information acquired by the three-dimensional position information extraction unit,
wherein the three-dimensional position information extraction unit includes:
a recognition unit that recognizes the imaging target based on the parallax image generated by the parallax image generation unit, and
a three-dimensional position trajectory generation unit that acquires the three-dimensional position information of the imaging target recognized by the recognition unit and generates a three-dimensional position trajectory of the imaging target from three-dimensional position information acquired in the past and three-dimensional position information extracted at the present time, and
wherein the straight-ahead determination unit makes the straight-ahead determination based on whether the three-dimensional position trajectory generated by the three-dimensional position trajectory generation unit has a predetermined curvature or more.

2. The imaging device according to claim 1, further comprising a steering angle correction value calculation unit that corrects a steering angle of the moving body based on the three-dimensional position trajectory generated by the three-dimensional position trajectory generation unit.

3. The imaging device according to claim 2, wherein the straight-ahead determination unit outputs the information used for the straight-ahead determination to the steering angle correction value calculation unit, and the steering angle correction value calculation unit performs the steering angle correction based on the information output from the straight-ahead determination unit.

4. The imaging device according to claim 1, wherein the three-dimensional position information extraction unit calculates a quadratic equation representing a movement trajectory through three points of the moving body, and the straight-ahead determination unit makes the straight-ahead determination of the moving body based on whether a curvature calculated from the quadratic equation calculated by the three-dimensional position information extraction unit is equal to or greater than a predetermined curvature.

5. The imaging device according to claim 1, wherein the imaging unit is arranged at a front portion of the moving body.

6. The imaging device according to claim 1, wherein the imaging unit is arranged at a rear portion of the moving body.
JP2018559042A 2016-12-26 2017-12-15 Imaging device Active JP6841553B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016251458 2016-12-26
JP2016251458 2016-12-26
PCT/JP2017/045031 WO2018123640A1 (en) 2016-12-26 2017-12-15 Imaging device

Publications (2)

Publication Number Publication Date
JPWO2018123640A1 (en) 2019-10-31
JP6841553B2 (en) 2021-03-10

Family

ID=62708164

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2018559042A Active JP6841553B2 (en) 2016-12-26 2017-12-15 Imaging device

Country Status (3)

Country Link
JP (1) JP6841553B2 (en)
CN (1) CN110088803B (en)
WO (1) WO2018123640A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111976717B (en) * 2019-11-29 2022-07-08 长城汽车股份有限公司 Intelligent parking method and device
US11948372B2 (en) * 2020-11-27 2024-04-02 Nissan Motor Co., Ltd. Vehicle assist method and vehicle assist device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10222665A (en) * 1997-01-31 1998-08-21 Fujitsu Ten Ltd Picture recognizing device
JP2006199242A (en) * 2005-01-24 2006-08-03 Toyota Motor Corp Behavior controller of vehicle
JP5074365B2 (en) * 2008-11-28 2012-11-14 日立オートモティブシステムズ株式会社 Camera device
US9168953B2 (en) * 2011-11-08 2015-10-27 Toyota Jidosha Kabushiki Kaisha Vehicle travel track control device
WO2013157301A1 (en) * 2012-04-16 2013-10-24 日産自動車株式会社 Device for detecting three-dimensional object and method for detecting three-dimensional object
JP2014046710A (en) * 2012-08-29 2014-03-17 Isuzu Motors Ltd Neutral point correction unit and neutral point correction method for steering angle sensor
JP6328369B2 (en) * 2012-11-27 2018-05-23 クラリオン株式会社 In-vehicle control device

Also Published As

Publication number Publication date
JPWO2018123640A1 (en) 2019-10-31
CN110088803B (en) 2023-05-12
CN110088803A (en) 2019-08-02
WO2018123640A1 (en) 2018-07-05


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20190520

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20200721

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20200902

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20210119

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20210215

R150 Certificate of patent or registration of utility model

Ref document number: 6841553

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250