JP2021093670A5 - - Google Patents
- Publication number
- JP2021093670A5 (application number JP2019224292A)
- Authority
- JP
- Japan
- Prior art keywords
- feature point
- optical flow
- camera
- unit
- posture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Claims (11)
A posture estimation device comprising:
an acquisition unit that acquires a captured image taken by a camera mounted on a moving body;
an extraction unit that extracts a plurality of feature points, including a first feature point and a second feature point, from the captured image;
a flow derivation unit that derives an optical flow for each of the plurality of feature points;
a selection unit that selects the optical flow of the first feature point and the optical flow of the second feature point from among the optical flows derived by the flow derivation unit, based on the positions of the derived optical flows;
an identification unit that identifies, based on the optical flows of the first and second feature points selected by the selection unit, two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other; and
an estimation unit that estimates the posture of the camera based on the pairs of planes identified by the identification unit.
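The claimed pipeline (extract feature points, derive one optical flow per point, then select flows by position) can be illustrated with a minimal numpy sketch. This is not the patented implementation: the synthetic points, the `x_split` threshold, and the function names are illustrative assumptions only.

```python
import numpy as np

def derive_flows(pts_prev: np.ndarray, pts_curr: np.ndarray) -> np.ndarray:
    """One optical-flow vector per feature point: displacement between frames."""
    return pts_curr - pts_prev

def select_flows_by_position(pts_prev: np.ndarray, flows: np.ndarray, x_split: float):
    """Split flows into left/right groups by feature position, mirroring the
    'selection based on flow position' step in the claim (hypothetical criterion)."""
    left = flows[pts_prev[:, 0] < x_split]
    right = flows[pts_prev[:, 0] >= x_split]
    return left, right

# Synthetic feature points tracked between two consecutive frames.
pts_prev = np.array([[100., 200.], [500., 210.], [120., 400.], [520., 410.]])
pts_curr = pts_prev + np.array([[4., 1.], [-4., 1.], [6., 2.], [-6., 2.]])

flows = derive_flows(pts_prev, pts_curr)
left, right = select_flows_by_position(pts_prev, flows, x_split=320.0)
```

In a real system the tracked points would come from a feature tracker rather than synthetic arrays; the per-point flow vectors feed the plane-pair identification step of the next claims.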
An anomaly detection device comprising:
the posture estimation device according to any one of claims 1 to 7; and
a determination unit that determines, based on the posture of the camera estimated by the estimation unit, whether or not the camera is mounted out of alignment.
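The determination unit above reduces to comparing the estimated posture against the nominal mounting posture. A one-function sketch, assuming a hypothetical per-axis angular tolerance (the patent does not specify the criterion):

```python
import numpy as np

def is_misaligned(estimated_angles, nominal_angles, tol_rad=0.02):
    """Flag a mounting misalignment when any estimated angle (e.g. roll/pitch/yaw)
    deviates from its nominal value by more than the tolerance (assumed value)."""
    diff = np.abs(np.asarray(estimated_angles) - np.asarray(nominal_angles))
    return bool(np.any(diff > tol_rad))

ok = is_misaligned([0.00, 0.01, 0.00], [0.0, 0.0, 0.0])   # within tolerance
bad = is_misaligned([0.00, 0.05, 0.01], [0.0, 0.0, 0.0])  # pitch off by 0.05 rad
```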
A correction device comprising:
the posture estimation device according to any one of claims 1 to 7; and
a correction unit that corrects the parameters of the camera based on the posture of the camera estimated by the estimation unit.
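The correction unit can be sketched as updating the camera extrinsics to cancel the estimated posture error. A minimal example assuming the error is a pure pitch rotation (the patent leaves the parameterization open):

```python
import numpy as np

def rot_x(a: float) -> np.ndarray:
    """Rotation about the camera x-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1., 0., 0.], [0., c, -s], [0., s, c]])

def corrected_extrinsics(R_actual: np.ndarray, pitch_err: float) -> np.ndarray:
    """Left-multiply by the inverse of the estimated pitch error so the
    corrected rotation cancels the detected tilt (illustrative scheme)."""
    return rot_x(-pitch_err) @ R_actual

R_actual = rot_x(0.03)                       # camera tilted 0.03 rad from nominal
R_fixed = corrected_extrinsics(R_actual, 0.03)  # back to the nominal orientation
```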
A posture estimation method comprising:
an acquisition step of acquiring a captured image taken by a camera mounted on a moving body;
an extraction step of extracting a plurality of feature points, including a first feature point and a second feature point, from the captured image;
a flow derivation step of deriving an optical flow for each of the plurality of feature points;
a selection step of selecting the optical flow of the first feature point and the optical flow of the second feature point from among the optical flows derived in the flow derivation step, based on the positions of the derived optical flows;
an identification step of identifying, based on the optical flows of the first and second feature points selected in the selection step, two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other; and
an estimation step of estimating the posture of the camera based on the pairs of planes identified in the identification step.
A posture estimation device wherein the control unit:
acquires a captured image taken by the camera;
extracts a plurality of feature points, including a first feature point and a second feature point, from the captured image;
derives an optical flow for each of the plurality of feature points;
selects, based on the positions of the derived optical flows, the optical flow of the first feature point and the optical flow of the second feature point from among the derived optical flows;
identifies, based on the selected optical flows, two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other; and
estimates the posture of the camera based on the identified pairs of planes.
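The "planes whose lines of intersection with a predetermined plane are parallel" step is closely related to a standard vanishing-point construction: image lines along parallel scene directions intersect at a vanishing point, and back-projecting that point through the intrinsics yields a 3-D ray from which a posture angle follows. A sketch under those textbook assumptions (the intrinsic matrix `K` and the two flow lines are made up; this is not asserted to be the patented computation):

```python
import numpy as np

def line_through(p: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """Homogeneous image line through point p along its optical-flow direction."""
    q = p + flow
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(l1: np.ndarray, l2: np.ndarray) -> np.ndarray:
    """Intersection of two homogeneous lines, normalized to (x, y, 1)."""
    v = np.cross(l1, l2)
    return v / v[2]

def pitch_from_vanishing_point(v: np.ndarray, K: np.ndarray) -> float:
    """Back-project the vanishing point to a 3-D ray and read off the pitch angle."""
    d = np.linalg.solve(K, v)
    return float(np.arctan2(d[1], d[2]))

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])  # assumed intrinsics
# Two flows assumed to belong to parallel scene lines (already selected).
l1 = line_through(np.array([100., 300.]), np.array([10., -5.]))
l2 = line_through(np.array([540., 300.]), np.array([-10., -5.]))
v = vanishing_point(l1, l2)        # converges at (320, 190)
pitch = pitch_from_vanishing_point(v, K)
```

Here the vanishing point sits above the principal point (240), so the recovered pitch is negative; a second pair of parallel directions, as in the claims, would constrain the remaining rotation axes.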
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019224292A JP7256734B2 (en) | 2019-12-12 | 2019-12-12 | Posture estimation device, anomaly detection device, correction device, and posture estimation method |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2021093670A JP2021093670A (en) | 2021-06-17 |
JP2021093670A5 true JP2021093670A5 (en) | 2022-06-29 |
JP7256734B2 JP7256734B2 (en) | 2023-04-12 |
Family
ID=76310846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2019224292A Active JP7256734B2 (en) | 2019-12-12 | 2019-12-12 | Posture estimation device, anomaly detection device, correction device, and posture estimation method |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP7256734B2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022185814A (en) | 2021-06-03 | 2022-12-15 | 大王製紙株式会社 | Tissue paper, and manufacturing method of tissue paper |
JP2023076903A (en) * | 2021-11-24 | 2023-06-05 | 三菱電機株式会社 | Road surface deterioration detection system and road surface deterioration detection method |
JP7359901B1 (en) | 2022-04-28 | 2023-10-11 | 株式会社デンソーテン | Information processing device, information processing method and program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5864984B2 (en) * | 2011-09-26 | 2016-02-17 | 東芝アルパイン・オートモティブテクノロジー株式会社 | In-vehicle camera image correction method and in-vehicle camera image correction program |
JP6947066B2 (en) * | 2018-02-06 | 2021-10-13 | 株式会社デンソー | Posture estimator |
JP2019191808A (en) * | 2018-04-23 | 2019-10-31 | 株式会社デンソーテン | Abnormality detection device and abnormality detection method |
- 2019-12-12: application JP2019224292A filed in Japan; granted as patent JP7256734B2 (status: active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2021093670A5 (en) | ||
US9025009B2 (en) | Method and systems for obtaining an improved stereo image of an object | |
MX2017011507A (en) | Object distance estimation using data from a single camera. | |
JP2020529685A5 (en) | ||
JP2012070389A5 (en) | ||
JP2016533105A5 (en) | ||
JP5895955B2 (en) | Lane boundary detection device | |
JP2018092580A5 (en) | Image processing apparatus, image processing method, and program | |
JP2015513662A5 (en) | ||
JP2021189822A5 (en) | ||
CN106031148B (en) | Imaging device, method of auto-focusing in an imaging device and corresponding computer program | |
KR101850835B1 (en) | Method of estimating the location of mobile robot using ray-tracing technique | |
JP2017208606A5 (en) | ||
JP2013192804A5 (en) | ||
KR102310286B1 (en) | Apparatus and method for specific object detection | |
GB2603715A (en) | Depth estimation using a neural network | |
JP2018195084A5 (en) | ||
JP2018101942A5 (en) | ||
JP2018022247A5 (en) | ||
US20080226159A1 (en) | Method and System For Calculating Depth Information of Object in Image | |
KR101896941B1 (en) | Stereo matching apparatus and its method | |
US9836655B2 (en) | Information processing apparatus, information processing method, and computer-readable medium | |
KR101217231B1 (en) | Method and system of object recognition | |
CN110800020A (en) | Image information acquisition method, image processing equipment and computer storage medium | |
JP2010145219A (en) | Movement estimation device and program |