JP2021093670A5 - Google Patents


Publication number
JP2021093670A5
JP2021093670A5 (application JP2019224292A)
Authority
JP
Japan
Prior art keywords
feature point
optical flow
camera
unit
posture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2019224292A
Other languages
Japanese (ja)
Other versions
JP2021093670A (en)
JP7256734B2 (en)
Filing date
Publication date
Application filed
Priority to JP2019224292A
Publication of JP2021093670A
Publication of JP2021093670A5
Application granted
Publication of JP7256734B2
Status: Active


Claims (11)

A posture estimation device comprising:
an acquisition unit that acquires a captured image taken by a camera mounted on a moving body;
an extraction unit that extracts, from the captured image, a plurality of feature points including a first feature point and a second feature point;
a flow derivation unit that derives an optical flow for each of the plurality of feature points;
a selection unit that selects, based on the positions of the optical flows derived by the flow derivation unit, the optical flow of the first feature point and the optical flow of the second feature point from among the derived optical flows;
a specifying unit that specifies, based on the optical flows of the first and second feature points selected by the selection unit, two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other; and
an estimation unit that estimates the posture of the camera based on the pairs of planes specified by the specifying unit.
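As a toy illustration of the first stages claimed above (the function name and the displacement-based flow model are illustrative assumptions, not the patent's implementation; real systems would typically track points with a method such as pyramidal Lucas–Kanade), the flow derivation unit's output for sparse feature points can be modeled as per-point displacement vectors between consecutive frames:

```python
import numpy as np

# Illustrative sketch: the optical flow of each tracked feature point
# is its displacement from the previous frame to the current frame.
def derive_optical_flows(pts_prev: np.ndarray, pts_curr: np.ndarray) -> np.ndarray:
    """One flow vector per feature point (rows of the input arrays)."""
    return pts_curr - pts_prev

# Two feature points tracked across consecutive frames (pixel coordinates)
pts_prev = np.array([[100.0, 200.0], [300.0, 220.0]])
pts_curr = np.array([[104.0, 196.0], [306.0, 214.0]])
flows = derive_optical_flows(pts_prev, pts_curr)
```

The selection, specifying, and estimation units of the claim would then operate on `flows` together with the points' image positions.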
The posture estimation device according to claim 1, wherein the selection unit selects the optical flow of the first feature point and the optical flow of the second feature point such that the distance, in a predetermined direction, between the real-world position of the first feature point and the real-world position of the second feature point is equal to or greater than a predetermined value.

The posture estimation device according to claim 2, wherein the selection unit increases the predetermined value as the distance between the real-world position of the first feature point and the real-world position of the second feature point in the direction perpendicular to the predetermined direction increases.

The posture estimation device according to claim 2 or 3, wherein the selection unit selects the optical flows of the first and second feature points located in a region within a predetermined distance from the center of the optical axis of the camera.

The posture estimation device according to claim 4, wherein the camera faces the front or the rear of the moving body.

The posture estimation device according to claim 2 or 3, wherein the selection unit selects the optical flow of the first feature point located in a region within a predetermined distance from the center of the optical axis of the camera and the optical flow of the second feature point located in a region farther than the predetermined distance from the center of the optical axis.

The posture estimation device according to claim 6, wherein the camera faces the left or the right of the moving body.

An anomaly detection device comprising:
the posture estimation device according to any one of claims 1 to 7; and
a determination unit that determines, based on the posture of the camera estimated by the estimation unit, whether the mounting of the camera is misaligned.
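The selection rule of claims 2 and 3 above can be sketched as a predicate on a candidate pair of feature points: the pair qualifies only if its real-world separation along the predetermined direction reaches a threshold, and that threshold grows with the perpendicular separation. All names and constants below are illustrative placeholders, not values from the patent:

```python
# Sketch of the claims 2-3 selection rule. Positions are (x, z) in the
# real world; the predetermined direction is taken here as the z axis
# (e.g. the travel direction) purely for illustration.
def pair_is_selectable(p1, p2, base_threshold=1.0, gain=0.5):
    dz = abs(p1[1] - p2[1])                 # separation in the predetermined direction
    dx = abs(p1[0] - p2[0])                 # separation perpendicular to it
    threshold = base_threshold + gain * dx  # claim 3: larger dx -> larger threshold
    return dz >= threshold

ok = pair_is_selectable((0.0, 0.0), (0.0, 2.0))       # wide z separation
bad = pair_is_selectable((0.0, 0.0), (3.0, 0.5))      # mostly x separation
```

The first pair passes (2.0 ≥ 1.0); the second fails because its large perpendicular separation raises the threshold to 2.5 while the z separation is only 0.5.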
A correction device comprising:
the posture estimation device according to any one of claims 1 to 7; and
a correction unit that corrects parameters of the camera based on the posture of the camera estimated by the estimation unit.
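One common way such a correction unit could apply the estimated posture (an assumption about a plausible implementation, not the patent's specified computation) is to fold the deviation between the nominal and estimated mounting rotations back into the camera's extrinsic parameters:

```python
import numpy as np

def rot_y(theta):
    """Rotation about the y axis (pan) by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def correction_rotation(R_nominal, R_estimated):
    """Rotation mapping the nominal mounting posture onto the
    estimated actual posture; applied to correct the extrinsics."""
    return R_estimated @ R_nominal.T

R_nominal = np.eye(3)                  # designed mounting posture
R_estimated = rot_y(np.radians(2.0))   # estimated posture: 2 deg pan drift
R_delta = correction_rotation(R_nominal, R_estimated)
```

With an identity nominal posture, the correction rotation equals the estimated drift itself; in general it isolates only the deviation to be compensated.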
A posture estimation method comprising:
an acquisition step of acquiring a captured image taken by a camera mounted on a moving body;
an extraction step of extracting, from the captured image, a plurality of feature points including a first feature point and a second feature point;
a flow derivation step of deriving an optical flow for each of the plurality of feature points;
a selection step of selecting, based on the positions of the optical flows derived in the flow derivation step, the optical flow of the first feature point and the optical flow of the second feature point from among the derived optical flows;
a specifying step of specifying, based on the optical flows of the first and second feature points selected in the selection step, two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other; and
an estimation step of estimating the posture of the camera based on the pairs of planes specified in the specifying step.
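As a toy illustration of the geometry a method like this can exploit (a sketch under simplifying assumptions, not the patent's algorithm): for a camera translating straight ahead over a static scene, the optical flows of stationary points lie on image lines that all pass through the vanishing point of the travel direction; intersecting two such flow lines recovers that point, and back-projecting it through the pinhole intrinsics yields pan and tilt. All numbers and function names below are illustrative:

```python
import numpy as np

def flow_line(p_prev, p_curr):
    """Homogeneous image line through a feature point's positions in two
    consecutive frames (its optical flow extended to a full line)."""
    return np.cross([p_prev[0], p_prev[1], 1.0], [p_curr[0], p_curr[1], 1.0])

def pan_tilt_from_flows(p1_prev, p1_curr, p2_prev, p2_curr, fx, fy, cx, cy):
    # Intersect the two flow lines: the vanishing point of the travel
    # direction (valid for pure forward translation, static scene).
    v = np.cross(flow_line(p1_prev, p1_curr), flow_line(p2_prev, p2_curr))
    vp = v[:2] / v[2]
    # Back-project to a direction ray in camera coordinates, then read
    # off pan (yaw) and tilt (pitch).
    d = np.array([(vp[0] - cx) / fx, (vp[1] - cy) / fy, 1.0])
    pan = np.arctan2(d[0], d[2])
    tilt = np.arctan2(-d[1], np.hypot(d[0], d[2]))
    return pan, tilt

# Two flows converging toward the principal point (320, 240)
pan, tilt = pan_tilt_from_flows((0.0, 0.0), (160.0, 120.0),
                                (0.0, 480.0), (160.0, 360.0),
                                fx=800.0, fy=800.0, cx=320.0, cy=240.0)
```

Here both flow lines pass through the principal point, so the estimated pan and tilt are zero; a misaligned camera would shift the vanishing point and produce nonzero angles.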
A posture estimation device comprising a control unit that estimates the posture of a camera mounted on a moving body, wherein the control unit:
acquires a captured image taken by the camera;
extracts, from the captured image, a plurality of feature points including a first feature point and a second feature point;
derives an optical flow for each of the plurality of feature points;
selects, based on the positions of the derived optical flows, the optical flow of the first feature point and the optical flow of the second feature point from among the derived optical flows;
specifies, based on the selected optical flows of the first and second feature points, two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other; and
estimates the posture of the camera based on the specified pairs of planes.
JP2019224292A 2019-12-12 2019-12-12 Posture estimation device, anomaly detection device, correction device, and posture estimation method Active JP7256734B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019224292A JP7256734B2 (en) 2019-12-12 2019-12-12 Posture estimation device, anomaly detection device, correction device, and posture estimation method


Publications (3)

Publication Number Publication Date
JP2021093670A JP2021093670A (en) 2021-06-17
JP2021093670A5 JP2021093670A5 (en) 2022-06-29
JP7256734B2 JP7256734B2 (en) 2023-04-12

Family

ID=76310846

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2019224292A Active JP7256734B2 (en) 2019-12-12 2019-12-12 Posture estimation device, anomaly detection device, correction device, and posture estimation method

Country Status (1)

Country Link
JP (1) JP7256734B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022185814A (en) 2021-06-03 2022-12-15 大王製紙株式会社 Tissue paper, and manufacturing method of tissue paper
JP2023076903A (en) * 2021-11-24 2023-06-05 三菱電機株式会社 Road surface deterioration detection system and road surface deterioration detection method
JP7359901B1 (en) 2022-04-28 2023-10-11 株式会社デンソーテン Information processing device, information processing method and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5864984B2 (en) * 2011-09-26 2016-02-17 東芝アルパイン・オートモティブテクノロジー株式会社 In-vehicle camera image correction method and in-vehicle camera image correction program
JP6947066B2 (en) * 2018-02-06 2021-10-13 株式会社デンソー Posture estimator
JP2019191808A (en) * 2018-04-23 2019-10-31 株式会社デンソーテン Abnormality detection device and abnormality detection method

Similar Documents

Publication Publication Date Title
JP2021093670A5 (en)
US9025009B2 (en) Method and systems for obtaining an improved stereo image of an object
MX2017011507A (en) Object distance estimation using data from a single camera.
JP2020529685A5 (en)
JP2012070389A5 (en)
JP2016533105A5 (en)
JP5895955B2 (en) Lane boundary detection device
JP2018092580A5 (en) Image processing apparatus, image processing method, and program
JP2015513662A5 (en)
JP2021189822A5 (en)
CN106031148B (en) Imaging device, method of auto-focusing in an imaging device and corresponding computer program
KR101850835B1 (en) Method of estimating the location of mobile robot using ray-tracing technique
JP2017208606A5 (en)
JP2013192804A5 (en)
KR102310286B1 (en) Apparatus and method for specific object detection
GB2603715A (en) Depth estimation using a neural network
JP2018195084A5 (en)
JP2018101942A5 (en)
JP2018022247A5 (en)
US20080226159A1 (en) Method and System For Calculating Depth Information of Object in Image
KR101896941B1 (en) Stereo matching apparatus and its method
US9836655B2 (en) Information processing apparatus, information processing method, and computer-readable medium
KR101217231B1 (en) Method and system of object recognition
CN110800020A (en) Image information acquisition method, image processing equipment and computer storage medium
JP2010145219A (en) Movement estimation device and program