WO2015068470A1 - 3次元形状計測装置、3次元形状計測方法及び3次元形状計測プログラム - Google Patents
- Publication number: WO2015068470A1 (PCT/JP2014/074424)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords: dimensional, region, image, dimensional shape, unit
- Prior art date
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T7/00—Image analysis; G06T7/50—Depth or shape recovery; G06T7/55—Depth or shape recovery from multiple images
- G06T2200/00—Indexing scheme for image data processing or generation, in general; G06T2200/24—involving graphical user interfaces [GUIs]
- G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/10—Image acquisition modality; G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/30—Subject of image; Context of image processing; G06T2207/30244—Camera pose
Definitions
- the present invention relates to a three-dimensional shape measurement apparatus, a three-dimensional shape measurement method, and a three-dimensional shape measurement program.
- This application claims priority based on Japanese Patent Application No. 2013-230321, filed in Japan on November 6, 2013, the contents of which are incorporated herein by reference.
- Non-Patent Document 1 describes an example of a technique for generating a three-dimensional shape model of an object from a plurality of two-dimensional images of the object captured while moving the imaging unit.
- In the system of Non-Patent Document 1, a three-dimensional shape model of an object is generated as follows. First, the entire object is imaged as a moving image while moving the stereo camera that constitutes the imaging unit.
- The stereo camera, also called a binocular stereoscopic camera, is an apparatus that images a target object from a plurality of different viewpoints. Next, for each predetermined frame, three-dimensional coordinate values corresponding to each pixel are calculated from a pair of two-dimensional images.
- The three-dimensional coordinate values calculated at this stage are expressed in a different coordinate system for each viewpoint of the stereo camera. Therefore, in the three-dimensional shape measurement system described in Non-Patent Document 1, the movement of the viewpoint of the stereo camera is estimated by tracking feature points included in the two-dimensional images captured as a moving image over a plurality of frames. Then, based on the estimated viewpoint movement, the three-dimensional shape models represented in the separate coordinate systems are integrated into a single coordinate system, yielding a three-dimensional shape model of the object.
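The integration step described above, in which per-viewpoint point clouds are brought into one coordinate system using the estimated camera motion, can be sketched as follows. This is a minimal illustration; the function name and the (R, t) pose convention are assumptions, not taken from Non-Patent Document 1.

```python
import numpy as np

def integrate_point_clouds(clouds, poses):
    """Merge per-viewpoint point clouds into a common world frame.

    clouds: list of (N_i, 3) arrays, each in its camera's local frame.
    poses:  list of (R, t) pairs estimated from feature tracking, where a
            local point p maps into the world frame as R @ p + t.
    """
    merged = []
    for pts, (R, t) in zip(clouds, poses):
        merged.append(pts @ R.T + t)  # rigid transform of all points at once
    return np.vstack(merged)
```

Each cloud is transformed by its own estimated pose before concatenation, so the result lives in a single shared coordinate system.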
- Non-Patent Document 2 discloses a technique for generating a three-dimensional shape model of an object based on a plurality of depth images (also referred to as distance images) acquired while moving an infrared depth sensor (hereinafter referred to as an imaging unit).
- the depth image is an image representing the measured distance information of the object (information on the distance from the imaging unit to the object) in units of pixels.
- the infrared depth sensor includes an infrared projection unit, an infrared imaging unit, and a signal processing unit.
- The infrared projection unit projects a random speckle pattern onto the target object, and the infrared imaging unit captures the light reflected from the object. The signal processing unit then computes the distance to the object based on the deviation of the captured pattern from the projected pattern.
- the configuration of the infrared depth sensor is described in, for example, Patent Documents 1 to 3.
- In the systems of Non-Patent Document 1 and Non-Patent Document 2, a plurality of two-dimensional images are captured while the imaging unit is moved, and a three-dimensional shape model of the object (hereinafter, a measured three-dimensional shape model) is generated based on the captured two-dimensional images.
- This measured three-dimensional shape model can be displayed on the display unit almost simultaneously with imaging.
- Likewise, a two-dimensional image captured by the imaging unit can be displayed on the display unit almost simultaneously with imaging. Therefore, by simply combining the three-dimensional shape measurement system of Non-Patent Document 1 or Non-Patent Document 2 with an imaging device in which an imaging unit and a display unit are integrated, the measured three-dimensional shape model can be displayed on the display unit, alongside the two-dimensional image including the captured object, almost simultaneously with imaging.
- The present invention has been made in view of the above circumstances, and its purpose is to provide a three-dimensional shape measurement apparatus, a three-dimensional shape measurement method, and a three-dimensional shape measurement program capable of easily identifying an unmeasured region when measuring a three-dimensional shape.
- A three-dimensional shape measurement apparatus according to a first aspect of the present invention includes: an imaging unit that sequentially outputs captured predetermined two-dimensional images; a storage unit that stores the two-dimensional images output by the imaging unit; a three-dimensional shape model generation unit that generates a three-dimensional shape model based on the two-dimensional images stored in the storage unit and stores the generated model in the storage unit; a region calculation unit that calculates a measured region in the two-dimensional image from the two-dimensional image and the three-dimensional shape model; and a display image generation unit that generates a display image from the two-dimensional image based on the measured region.
- In the apparatus of the first aspect, it is preferable that the region calculation unit calculates, from the two-dimensional image and the three-dimensional shape model stored in the storage unit, a non-measurement region, a measured region, and an unmeasured region in the two-dimensional image, and that the display image generation unit generates the display image from the two-dimensional image based on the non-measurement region, the measured region, and the unmeasured region.
- In the apparatus of the first aspect, it is preferable that the region calculation unit calculates a low-accuracy measured region and a high-accuracy measured region in the two-dimensional image from the two-dimensional image and the three-dimensional model stored in the storage unit.
- In the apparatus of the first aspect, it is preferable that the three-dimensional shape model generation unit further includes an accuracy setting change instruction unit that instructs a change of the accuracy setting.
- In the apparatus of the first aspect, it is preferable that the display image generation unit generates the display image so that pixel values are high in the measured region and low in the non-measurement region.
- It is preferable that the three-dimensional shape measurement apparatus according to the first aspect further includes an optimum movement direction calculation unit that calculates an optimum movement direction of the imaging unit based on the distribution of the unmeasured region in the two-dimensional image, and an optimum movement direction output unit that outputs information guiding the movement direction calculated by the optimum movement direction calculation unit.
- A three-dimensional shape measurement method according to a second aspect of the present invention sequentially acquires two-dimensional images (captured image acquisition step), stores the acquired two-dimensional images in a storage unit (storage step), generates a three-dimensional model based on the stored two-dimensional images and stores the generated three-dimensional shape model in the storage unit (three-dimensional shape model generation step), calculates a measured region in the two-dimensional image from the stored two-dimensional image and the three-dimensional shape model (region calculation step), and generates a display image from the two-dimensional image based on the measured region calculated in the region calculation step (display image generation step).
- A three-dimensional shape measurement program according to a third aspect of the present invention causes a computer to execute: a storage step of storing, in a storage unit, a two-dimensional image output from an imaging unit that captures two-dimensional images; a three-dimensional shape model generation step of generating a three-dimensional model based on the stored two-dimensional image and storing the generated three-dimensional shape model in the storage unit; a region calculation step of calculating a measured region in the two-dimensional image from the stored two-dimensional image and the three-dimensional shape model; and a display image generation step of generating a display image from the two-dimensional image based on the measured region.
- According to the aspects of the present invention, it is possible to realize a three-dimensional shape measurement apparatus, a three-dimensional shape measurement method, and a three-dimensional shape measurement program capable of easily identifying an unmeasured region when measuring a three-dimensional shape.
- Here, the three-dimensional shape model of an object means a model that quantitatively expresses, inside a computer, the shape of the object in three-dimensional space; for example, a point cloud model that restores the surface shape of the object as a set of points (a point cloud) in three-dimensional space, computed from multi-view two-dimensional images or from a two-dimensional image in which each pixel represents distance information.
- Also, three-dimensional shape measurement is a concept that includes both generating a three-dimensional shape model of an object by capturing a plurality of two-dimensional images and capturing the two-dimensional images used to generate that model.
- FIG. 1 is a functional block diagram showing an example of a three-dimensional shape measuring apparatus 1 according to the embodiment.
- The three-dimensional shape measurement apparatus 1 includes an imaging unit 11, a storage unit 12, a three-dimensional shape model generation unit 13 (three-dimensional shape update unit), a region calculation unit 14, a display image generation unit 15, and a display unit 16.
- the imaging unit 11 sequentially outputs a predetermined two-dimensional image captured.
- the two-dimensional image means an image based on each pixel value imaged by an imaging element having a plurality of pixels arranged two-dimensionally, or a signal or data representing the image.
- The image in this case may be a monochrome image (gray image), a color image, an infrared image, a distance image, or the like, or a pair of such images captured simultaneously (a stereo image).
- the storage unit 12 is a storage device that stores the two-dimensional image output from the imaging unit 11 and the measured three-dimensional shape model.
- the three-dimensional shape model generation unit 13 updates the three-dimensional shape model based on the two-dimensional image group stored in the storage unit 12.
- For example, the region of the imaged target included in each of a plurality of two-dimensional images is estimated as a silhouette (silhouette figure), and the spatial region in which the estimated silhouettes overlap is calculated as the occupied volume (see, for example, Patent Document 4). For this calculation, the visual volume intersection method (shape from silhouette) or the like can be used.
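As a toy illustration of the visual volume intersection idea, the sketch below carves a voxel grid with two orthographic silhouettes. A real system would use calibrated perspective projections and many viewpoints; all names here are illustrative.

```python
import numpy as np

def visual_hull(sil_xy, sil_yz):
    """Keep a voxel (x, y, z) only if it falls inside BOTH silhouettes:
    sil_xy seen along the z-axis (indexed [x, y]) and sil_yz seen along
    the x-axis (indexed [y, z]). The surviving voxels approximate the
    visual hull of the object."""
    n = sil_xy.shape[0]
    vol = np.zeros((n, n, n), dtype=bool)
    for z in range(n):
        vol[:, :, z] = sil_xy          # extrude the top-down silhouette
    for x in range(n):
        vol[x, :, :] &= sil_yz         # intersect with the side silhouette
    return vol
```

Adding more views only tightens the hull, since each silhouette can remove voxels but never add them.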
- the three-dimensional shape model generation unit 13 further includes an accuracy setting change instruction unit that instructs to change the accuracy setting.
- the region calculation unit 14 calculates a measured region in the two-dimensional image, that is, a region in which the three-dimensional shape model has been created, based on the two-dimensional image and the three-dimensional shape model stored in the storage unit 12.
- the area calculation unit 14 calculates a non-measurement area, a measured area, and an unmeasured area in the two-dimensional image based on the two-dimensional image and the three-dimensional shape model stored in the storage unit 12.
- the area calculation unit 14 calculates a low-precision measured area and a high-precision measured area in the two-dimensional image from the two-dimensional image and the three-dimensional model.
- the display image generation unit 15 generates a display image V based on the measured region R1 and non-measurement region R2 calculated by the region calculation unit 14 and the two-dimensional image I stored in the storage unit 12.
- the display image generation unit 15 may generate a display image from the two-dimensional image based on the measured region R1, the non-measurement region R2, and the unmeasured region.
- For example, the display image V can be generated as an image in which, among the pixels constituting the two-dimensional image I, pixel values are high in the measured region R1 and low in the non-measurement region R2.
- That is, using positive constants a and b, the pixel value q′(u, v) of the pixel q(u, v) on the display image can be defined with respect to the pixel value p′(u, v) of the pixel p(u, v) on the two-dimensional image I as follows.
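Since the defining equation itself is not reproduced in this text, the following is one plausible concrete form: scale pixel values by positive constants a and b (a > b) depending on the region. The function name and default values are assumptions for illustration.

```python
import numpy as np

def make_display_image(img, r1, r2, a=1.0, b=0.3):
    """q'(u, v) = a * p'(u, v) for pixels in the measured region R1 and
    b * p'(u, v) for pixels in the non-measurement region R2, so that
    measured pixels appear bright and non-measurement pixels dark."""
    out = img.astype(float)
    out[r1] *= a
    out[r2] *= b
    return np.clip(out, 0, 255).astype(np.uint8)
```

Pixels in neither mask keep their original value, which matches treating the unmeasured region R3 as neutral.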
- The display unit 16 is a display device such as a liquid crystal display or an organic EL display, and displays the display image generated by the display image generation unit 15.
- Furthermore, the three-dimensional shape measurement apparatus 1 may include an optimum movement direction calculation unit that calculates an optimum movement direction of the imaging unit based on the distribution of the unmeasured region in the two-dimensional image, and an optimum movement direction output unit that outputs information guiding the movement direction calculated by the optimum movement direction calculation unit.
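The patent does not fix a formula for the optimum movement direction. One simple heuristic consistent with "based on the distribution of the unmeasured region" is to point toward the centroid of R3 relative to the image center; everything below is an assumption for illustration.

```python
import numpy as np

def suggest_move_direction(r3):
    """Return (dx, dy) in pixel units from the image center toward the
    centroid of the unmeasured region R3, or None if R3 is empty."""
    ys, xs = np.nonzero(r3)
    if xs.size == 0:
        return None                       # nothing left to measure
    h, w = r3.shape
    return xs.mean() - (w - 1) / 2.0, ys.mean() - (h - 1) / 2.0
```

The output unit could then render this vector as an on-screen arrow guiding the measurer.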
- FIG. 2 is a functional block diagram showing an example of the area calculation unit shown in FIG.
- the area calculation unit 14 illustrated in FIG. 2 includes a measured area calculation unit 21, a non-measurement area calculation unit 22, and an unmeasured area calculation unit 23.
- The measured region calculation unit 21 calculates the measured region R1, i.e., the region of the two-dimensional image including the imaged target for which the three-dimensional shape model has been created, based on the two-dimensional image group and the three-dimensional shape model stored in the storage unit 12.
- The non-measurement region calculation unit 22 calculates the non-measurement region R2, i.e., the region of the two-dimensional image including the imaged target that is outside the creation target of the three-dimensional shape model, based on the two-dimensional image group and the three-dimensional shape model stored in the storage unit 12.
- the unmeasured area calculation unit 23 calculates an unmeasured area R3 of a two-dimensional image including the captured object based on the calculated measured area R1 and the non-measurement area R2.
- the measured region calculation unit 21 can calculate the measured region R1 using the estimation result of the movement of the three-dimensional shape measurement apparatus 1 stored in the storage unit 12.
- Specifically, when the three-dimensional shape model G stored in the storage unit 12 is projected onto a two-dimensional image from the viewpoint at which the two-dimensional image I including the imaged target was captured, the set of pixels onto which the model projects corresponds to the measured region R1. The measured region R1 can thereby be calculated.
- the measured region R1 can be defined as follows.
- int ((s, t)) is a function that returns an integer vector obtained by rounding off each element of the real vector (s, t).
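A sketch of this projection test, assuming a pinhole model with intrinsics K and a world-to-camera pose (R, t). These variable names and the helper itself are illustrative; only the int((s, t)) rounding mirrors the text's notation.

```python
import numpy as np

def measured_region(model_pts, R, t, K, shape):
    """Project the 3D model points G into the image captured at pose
    (R, t); each projected point is rounded to the pixel grid, mirroring
    int((s, t)), and the set of hit pixels approximates R1."""
    cam = model_pts @ R.T + t                 # world -> camera coordinates
    cam = cam[cam[:, 2] > 0]                  # keep points in front of the camera
    uv = cam @ K.T
    uv = uv[:, :2] / uv[:, 2:3]               # perspective division
    uv = np.rint(uv).astype(int)              # round each element to an integer
    h, w = shape
    ok = (0 <= uv[:, 0]) & (uv[:, 0] < w) & (0 <= uv[:, 1]) & (uv[:, 1] < h)
    r1 = np.zeros(shape, dtype=bool)
    r1[uv[ok, 1], uv[ok, 0]] = True           # mark measured pixels
    return r1
```

Points behind the camera or projecting outside the image bounds are simply discarded, so R1 never indexes out of range.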
- The non-measurement region calculation unit 22 can calculate the non-measurement region R2 using the depth values of the three-dimensional shape model stored in the storage unit 12.
- Specifically, the portions of the three-dimensional shape model G stored in the storage unit 12 whose depth values fall outside a certain range are regarded as foreground or distant background that is not a measurement target, and the set of corresponding pixels constitutes the non-measurement region R2. The non-measurement region R2 can thereby be calculated. That is, using the depth value d(u, v) and the depth thresholds dmin and dmax, the non-measurement region R2 can be defined as follows.
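In NumPy this thresholding is a one-liner. The sketch below assumes a per-pixel depth map d rendered from the model; the function name is illustrative.

```python
import numpy as np

def non_measurement_region(depth, dmin, dmax):
    """R2: pixels whose depth d(u, v) lies outside [dmin, dmax], i.e.
    foreground closer than dmin or background farther than dmax."""
    return (depth < dmin) | (depth > dmax)
```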
- the unmeasured region calculation unit 23 can calculate the unmeasured region R3 using the measured region R1 and the non-measurement region R2.
- The unmeasured region R3 corresponds to the set of pixels of the two-dimensional image I including the imaged target that belong to neither the measured region R1 nor the non-measurement region R2. The unmeasured region R3 can thereby be calculated. That is, for a pixel p(u, v) on the two-dimensional image I including the imaged target, the unmeasured region R3 can be defined as follows.
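With R1 and R2 represented as boolean masks, R3 is simply the complement of their union; a minimal sketch (names illustrative):

```python
import numpy as np

def unmeasured_region(r1, r2):
    """R3: pixels of the image I that belong to neither the measured
    region R1 nor the non-measurement region R2."""
    return ~(r1 | r2)
```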
- the imaging unit 11 acquires the two-dimensional image I (step S301).
- the storage unit 12 stores the two-dimensional image I (step S302).
- the three-dimensional shape model generation unit 13 updates the stored three-dimensional shape model G based on the two-dimensional image I and the stored two-dimensional image Iold (step S303).
- the measured region calculation unit 21 calculates a measured region R1 from the stored two-dimensional image I, the stored three-dimensional shape G, camera position P, and camera parameter A (step S304).
- the non-measurement area calculation unit 22 calculates the non-measurement area R2 from the stored two-dimensional image I and the stored three-dimensional shape model G (step S305).
- the unmeasured region calculation unit 23 calculates the unmeasured region R3 from the measured region R1 and the non-measurement region R2 (step S306).
- the display image generation unit 15 generates a display image V from the measured region R1, the non-measurement region R2, the unmeasured region R3, and the two-dimensional image I (Step S307).
- The display unit 16 displays the display image V (step S308). Thereafter, when the three-dimensional shape measurement process is to be terminated based on an instruction from the measurer, the process ends (YES in step S309); otherwise (NO in step S309), the process returns to step S301.
- In this case, the measurer moves the imaging unit 11.
- The imaging unit 11 then newly acquires a two-dimensional image I including the subject imaged from a viewpoint different from the previous one.
- The process returns to step S301, and the above processing is repeated.
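The steps S301 through S309 above form one loop iteration. The control flow can be sketched as below, with every concrete operation injected as a callable so that only the loop structure is shown; all names are illustrative.

```python
def measurement_loop(acquire, update_model, compute_regions, render, show, done):
    """Skeleton of the S301-S309 cycle: acquire and store an image,
    update the 3D shape model, compute R1/R2/R3, render and display the
    image V, then repeat until the measurer asks to stop."""
    images, model = [], None
    while True:
        img = acquire()                           # S301: acquire 2D image I
        images.append(img)                        # S302: store it
        model = update_model(model, images)       # S303: update model G
        r1, r2, r3 = compute_regions(img, model)  # S304-S306: regions
        show(render(img, r1, r2, r3))             # S307-S308: display image V
        if done():                                # S309: stop on request
            return model
```

Because the loop only orchestrates callables, the same skeleton works whether the model update is stereo-based (Non-Patent Document 1) or depth-sensor-based (Non-Patent Document 2).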
- As described above, in this embodiment the display image V is generated based on the measured region R1 calculated from the two-dimensional image I including the captured subject and the three-dimensional shape model G. That is, in this configuration, when the display image V is generated, whether each pixel in the image belongs to the measured region R1 can be used as reference information. For example, a display image in which the measured region R1 is emphasized can be generated and displayed on the display unit 16. Compared with simply displaying the measured three-dimensional shape model G on the display unit 16 alongside the two-dimensional image I including the subject, the measurer can thus easily determine where the imaging unit 11 should be moved so that unmeasured areas can be measured.
- Furthermore, the region calculation unit 14 of the present embodiment calculates the measured region R1, the non-measurement region R2, and the unmeasured region R3 based on the two-dimensional image Iold and the three-dimensional shape model G stored in the storage unit 12, and outputs them to the display image generation unit 15. Accordingly, when the display image V is generated, the measured region R1, the non-measurement region R2, and the unmeasured region R3 can be used as reference information, and the display image V can be generated so as to make clear which regions can still be measured.
- the three-dimensional shape measurement apparatus 1 includes an imaging unit 11, a storage unit 12, a three-dimensional shape model generation unit 13, a region calculation unit 14, a display image generation unit 15, and a display unit 16.
- Alternatively, one or more elements of the three-dimensional shape measurement apparatus may each be configured as a separate apparatus.
- For example, the imaging unit 11, the storage unit 12, the three-dimensional shape model generation unit 13, the region calculation unit 14, the display image generation unit 15, and the display unit 16 can be integrally configured as an electronic device such as a portable camera or a portable information terminal.
- Alternatively, the imaging unit 11, the storage unit 12, and the display unit 16 can be implemented as a portable camera, while the three-dimensional shape model generation unit 13, the region calculation unit 14, and the display image generation unit 15 are implemented on a personal computer or the like.
- the three-dimensional shape measuring apparatus 1 may include a wireless or wired communication device, and the components shown in FIG. 1 may be connected via a wireless or wired communication line.
- the three-dimensional shape measuring apparatus 1 is provided with means (apparatus) that performs processing for estimating the movement of the three-dimensional shape measuring apparatus 1 based on the two-dimensional image group and the three-dimensional shape model stored in the storage unit 12.
- This movement can be estimated, for example, by tracking the positions of a plurality of feature points included in each two-dimensional image in a plurality of two-dimensional images (see, for example, Non-Patent Document 1).
- As techniques for tracking feature points across a plurality of two-dimensional images such as video frames, several methods including the Kanade-Lucas-Tomasi (KLT) method are widely used.
- the estimation result of the movement can be stored in the storage unit 12, for example.
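The core of KLT-style tracking is solving for a small displacement from image gradients. The single-window, single-iteration sketch below illustrates that idea in NumPy; real trackers add feature windows, pyramids, and iteration, and this simplified form is my own reduction, not the implementation of Non-Patent Document 1.

```python
import numpy as np

def lk_translation(img0, img1):
    """Estimate the (dx, dy) translation carrying img0 to img1 by solving
    the Lucas-Kanade normal equations over the whole image: find d that
    best satisfies Ix*dx + Iy*dy = -(img1 - img0) in least squares."""
    gy, gx = np.gradient(img0.astype(float))   # np.gradient: d/dy first, d/dx second
    it = img1.astype(float) - img0.astype(float)
    A = np.stack([gx.ravel(), gy.ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, -it.ravel(), rcond=None)
    return d[0], d[1]                          # (dx, dy)
```

Applied per feature window across frames, such displacement estimates yield the feature tracks from which the viewpoint motion is estimated.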
- The three-dimensional shape measurement apparatus 1 may have a function of acquiring the position of the apparatus itself using, for example, a GPS (Global Positioning System) receiver, or a function of detecting its own movement using an acceleration sensor, a gyro sensor, or the like. The detection result of this movement can be stored in the storage unit 12, for example.
- the embodiment of the present invention is not limited to the above embodiment.
- For example, the three-dimensional shape measurement apparatus 1 can be configured using one or more CPUs and a program executed by the CPU(s), and the program can be distributed via a computer-readable recording medium or a communication line.
- the three-dimensional shape measuring apparatus, the three-dimensional shape measuring method, and the three-dimensional shape measuring program according to the present invention can be used for an imaging device such as a digital camera, a portable information terminal equipped with a camera, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Abstract
Description
11 Imaging unit
12 Storage unit
13 Three-dimensional shape model generation unit
14 Region calculation unit
15 Display image generation unit
16 Display unit
21 Measured region calculation unit
22 Non-measurement region calculation unit
23 Unmeasured region calculation unit
Claims (8)
- A three-dimensional shape measurement apparatus comprising: an imaging unit that sequentially outputs captured predetermined two-dimensional images; a storage unit that stores the two-dimensional images output by the imaging unit; a three-dimensional shape model generation unit that generates a three-dimensional shape model based on the two-dimensional images stored in the storage unit and stores the generated three-dimensional shape model in the storage unit; a region calculation unit that calculates a measured region in the two-dimensional image from the two-dimensional image and the three-dimensional shape model stored in the storage unit; and a display image generation unit that generates a display image from the two-dimensional image based on the measured region.
- The three-dimensional shape measurement apparatus according to claim 1, wherein the region calculation unit calculates a non-measurement region, a measured region, and an unmeasured region in the two-dimensional image from the two-dimensional image and the three-dimensional shape model stored in the storage unit, and the display image generation unit generates the display image from the two-dimensional image based on the non-measurement region, the measured region, and the unmeasured region.
- The three-dimensional shape measurement apparatus according to claim 1 or 2, wherein the region calculation unit calculates a low-accuracy measured region and a high-accuracy measured region in the two-dimensional image from the two-dimensional image and the three-dimensional model stored in the storage unit.
- The three-dimensional shape measurement apparatus according to claim 3, wherein the three-dimensional shape model generation unit further includes an accuracy setting change instruction unit that instructs a change of the accuracy setting.
- The three-dimensional shape measurement apparatus according to any one of claims 1 to 4, wherein the display image generation unit generates the display image so that pixel values are high in the measured region and low in the non-measurement region.
- The three-dimensional shape measurement apparatus according to any one of claims 1 to 5, further comprising: an optimum movement direction calculation unit that calculates an optimum movement direction of the imaging unit based on the distribution of the unmeasured region in the two-dimensional image; and an optimum movement direction output unit that outputs information guiding the movement direction calculated by the optimum movement direction calculation unit.
- A three-dimensional shape measurement method comprising: sequentially acquiring two-dimensional images; storing the acquired two-dimensional images in a storage unit; generating a three-dimensional model based on the stored two-dimensional images and storing the generated three-dimensional shape model in the storage unit; calculating a measured region in the two-dimensional image from the two-dimensional image and the three-dimensional shape model stored in the storage unit (region calculation step); and generating a display image from the two-dimensional image based on the measured region calculated in the region calculation step.
- A three-dimensional shape measurement program causing a computer to execute: a storage step of storing, in a storage unit, a two-dimensional image output from an imaging unit that captures two-dimensional images; a three-dimensional shape model generation step of generating a three-dimensional model based on the two-dimensional image stored in the storage step and storing the generated three-dimensional shape model in the storage unit; a region calculation step of calculating a measured region in the two-dimensional image from the two-dimensional image and the three-dimensional shape model stored in the storage unit; and a display image generation step of generating a display image from the two-dimensional image based on the measured region.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480060070.4A CN105705903A (zh) | 2013-11-06 | 2014-09-16 | 3维形状计测装置、3维形状计测方法及3维形状计测程序 |
JP2015546327A JP6589636B2 (ja) | 2013-11-06 | 2014-09-16 | 3次元形状計測装置、3次元形状計測方法及び3次元形状計測プログラム |
EP14860750.0A EP3067658B1 (en) | 2013-11-06 | 2014-09-16 | 3d-shape measurement device, 3d-shape measurement method, and 3d-shape measurement program |
US15/148,180 US10347029B2 (en) | 2013-11-06 | 2016-05-06 | Apparatus for measuring three dimensional shape, method for measuring three dimensional shape and three dimensional shape measurement program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013230321 | 2013-11-06 | ||
JP2013-230321 | 2013-11-06 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/148,180 Continuation US10347029B2 (en) | 2013-11-06 | 2016-05-06 | Apparatus for measuring three dimensional shape, method for measuring three dimensional shape and three dimensional shape measurement program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015068470A1 true WO2015068470A1 (ja) | 2015-05-14 |
Family
ID=53041253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/074424 WO2015068470A1 (ja) | 2013-11-06 | 2014-09-16 | 3次元形状計測装置、3次元形状計測方法及び3次元形状計測プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10347029B2 (ja) |
EP (1) | EP3067658B1 (ja) |
JP (1) | JP6589636B2 (ja) |
CN (1) | CN105705903A (ja) |
WO (1) | WO2015068470A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020098603A (ja) * | 2018-12-18 | 2020-06-25 | 富士通株式会社 | 画像処理方法及び情報処理装置 |
JP2020112981A (ja) * | 2019-01-10 | 2020-07-27 | 株式会社ニコン | 検出装置、情報処理装置、検出方法、検出プログラム、及び検出システム |
JP2020122653A (ja) * | 2019-01-29 | 2020-08-13 | 株式会社Subaru | 対象物確認装置 |
JP2021018071A (ja) * | 2019-07-17 | 2021-02-15 | 日本電気株式会社 | 地形情報生成装置、地形情報生成システム、情報生成方法 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105829829B (zh) * | 2013-12-27 | 2019-08-23 | 索尼公司 | 图像处理装置和图像处理方法 |
CN111157200B (zh) * | 2017-01-25 | 2024-07-19 | 松下知识产权经营株式会社 | 刚性测定装置以及刚性测定方法 |
US10527711B2 (en) * | 2017-07-10 | 2020-01-07 | Aurora Flight Sciences Corporation | Laser speckle system and method for an aircraft |
US10991160B1 (en) | 2019-06-25 | 2021-04-27 | A9.Com, Inc. | Depth hull for rendering three-dimensional models |
US11138789B1 (en) * | 2019-06-25 | 2021-10-05 | A9.Com, Inc. | Enhanced point cloud for three-dimensional models |
JP6708917B1 (ja) * | 2020-02-05 | 2020-06-10 | リンクウィズ株式会社 | 形状検出方法、形状検出システム、プログラム |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000180137A (ja) * | 1998-12-11 | 2000-06-30 | Sony Corp | 形状計測装置および形状表示方法 |
JP2001145128A (ja) * | 1999-11-12 | 2001-05-25 | Asahi Optical Co Ltd | 3次元画像検出装置 |
JP2002027500A (ja) * | 2000-07-06 | 2002-01-25 | Asahi Optical Co Ltd | 3次元画像入力装置 |
JP2006267031A (ja) * | 2005-03-25 | 2006-10-05 | Brother Ind Ltd | 3次元入力装置および3次元入力方法 |
JP2009511897A (ja) | 2005-10-11 | 2009-03-19 | プライム センス リミティド | 対象物再構成方法およびシステム |
JP2009530604A (ja) | 2006-03-14 | 2009-08-27 | プライム センス リミティド | 三次元検知のために深度変化させる光照射野 |
JP2010219825A (ja) * | 2009-03-16 | 2010-09-30 | Topcon Corp | 三次元計測用画像撮影装置 |
JP2011527790A (ja) | 2008-07-09 | 2011-11-04 | プライムセンス リミテッド | 3次元マッピング用集積処理装置 |
JP2012142779A (ja) * | 2010-12-28 | 2012-07-26 | Olympus Imaging Corp | 撮像装置および撮像プログラム |
JP2013186042A (ja) * | 2012-03-09 | 2013-09-19 | Hitachi Automotive Systems Ltd | 距離算出装置及び距離算出方法 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4136012B2 (ja) * | 1996-05-24 | 2008-08-20 | 株式会社リコー | 測距装置、撮影装置および背景処理装置 |
JP4387377B2 (ja) * | 1998-05-25 | 2009-12-16 | パナソニック株式会社 | カメラ |
US7538764B2 (en) * | 2001-01-05 | 2009-05-26 | Interuniversitair Micro-Elektronica Centrum (Imec) | System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display |
US6570952B2 (en) * | 2001-02-27 | 2003-05-27 | Siemens Corporate Research, Inc. | Memory efficient shear-warp voxel projection algorithm |
EP1607716A3 (en) | 2004-06-18 | 2012-06-20 | Topcon Corporation | Model forming apparatus and method, and photographing apparatus and method |
JP4588369B2 (ja) * | 2004-06-18 | 2010-12-01 | 株式会社トプコン | 撮影装置及び撮影方法 |
JP5060358B2 (ja) * | 2008-03-25 | 2012-10-31 | 株式会社トプコン | 測量システム |
JP5106375B2 (ja) | 2008-12-24 | 2012-12-26 | 日本放送協会 | 3次元形状復元装置及びそのプログラム |
US9042636B2 (en) * | 2009-12-31 | 2015-05-26 | Disney Enterprises, Inc. | Apparatus and method for indicating depth of one or more pixels of a stereoscopic 3-D image comprised from a plurality of 2-D layers |
US9113074B2 (en) | 2010-12-22 | 2015-08-18 | Olympus Corporation | Imaging apparatus, imaging method, and computer readable storage medium for applying special effects processing to an automatically set region of a stereoscopic image |
CN103053169B (zh) * | 2011-06-08 | 2016-03-16 | Panasonic Corp | Image processing device and image processing method |
US8259161B1 (en) * | 2012-02-06 | 2012-09-04 | Google Inc. | Method and system for automatic 3-D image creation |
US9972120B2 (en) * | 2012-03-22 | 2018-05-15 | University Of Notre Dame Du Lac | Systems and methods for geometrically mapping two-dimensional images to three-dimensional surfaces |
2014
- 2014-09-16 CN CN201480060070.4A patent CN105705903A active Pending
- 2014-09-16 JP JP2015546327A patent JP6589636B2 active Active
- 2014-09-16 EP EP14860750.0A patent EP3067658B1 active Active
- 2014-09-16 WO PCT/JP2014/074424 patent WO2015068470A1 active Application Filing
2016
- 2016-05-06 US US15/148,180 patent US10347029B2 not_active Expired - Fee Related
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000180137A (ja) * | 1998-12-11 | 2000-06-30 | Sony Corp | Shape measuring device and shape display method |
JP2001145128A (ja) * | 1999-11-12 | 2001-05-25 | Asahi Optical Co Ltd | Three-dimensional image detection device |
JP2002027500A (ja) * | 2000-07-06 | 2002-01-25 | Asahi Optical Co Ltd | Three-dimensional image input device |
JP2006267031A (ja) * | 2005-03-25 | 2006-10-05 | Brother Ind Ltd | Three-dimensional input device and three-dimensional input method |
JP2009511897A (ja) | 2005-10-11 | 2009-03-19 | Prime Sense Ltd | Object reconstruction method and system |
JP2009530604A (ja) | 2006-03-14 | 2009-08-27 | Prime Sense Ltd | Depth-varying light fields for three-dimensional sensing |
JP2011527790A (ja) | 2008-07-09 | 2011-11-04 | Primesense Ltd | Integrated processor for 3D mapping |
JP2010219825A (ja) * | 2009-03-16 | 2010-09-30 | Topcon Corp | Image capturing device for three-dimensional measurement |
JP2012142779A (ja) * | 2010-12-28 | 2012-07-26 | Olympus Imaging Corp | Imaging device and imaging program |
JP2013186042A (ja) * | 2012-03-09 | 2013-09-19 | Hitachi Automotive Systems Ltd | Distance calculation device and distance calculation method |
Non-Patent Citations (3)
Title |
---|
HIROKI UNTEN; TOMOHITO MASUDA; TOHRU MIHASHI; MAKOTO ANDO: "A practical VR-model generation method by utilizing moving-shots with stereo camera: Stereo Moving-shot Modeling System (SM2S)", THE VIRTUAL REALITY SOCIETY OF JAPAN, JOURNAL, vol. 12, no. 2, 2007 |
See also references of EP3067658A4 |
SHAHRAM IZADI; DAVID KIM; OTMAR HILLIGES; DAVID MOLYNEAUX; RICHARD NEWCOMBE; PUSHMEET KOHLI; JAMIE SHOTTON; STEVE HODGES; DUSTIN F: "KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera", October 2011, ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020098603A (ja) * | 2018-12-18 | 2020-06-25 | Fujitsu Ltd | Image processing method and information processing device |
JP7327140B2 (ja) | 2018-12-18 | 2023-08-16 | Fujitsu Ltd | Image processing method and information processing device |
JP2020112981A (ja) * | 2019-01-10 | 2020-07-27 | Nikon Corp | Detection device, information processing device, detection method, detection program, and detection system |
JP7392262B2 (ja) | 2019-01-10 | 2023-12-06 | Nikon Corp | Detection device, information processing device, detection method, detection program, and detection system |
JP2020122653A (ja) * | 2019-01-29 | 2020-08-13 | Subaru Corp | Object confirmation device |
JP7204504B2 (ja) | 2019-01-29 | 2023-01-16 | Subaru Corp | Object confirmation device |
JP2021018071A (ja) * | 2019-07-17 | 2021-02-15 | NEC Corp | Terrain information generation device, terrain information generation system, and information generation method |
Also Published As
Publication number | Publication date |
---|---|
US10347029B2 (en) | 2019-07-09 |
EP3067658A4 (en) | 2017-06-21 |
EP3067658B1 (en) | 2019-10-02 |
JP6589636B2 (ja) | 2019-10-16 |
US20160253836A1 (en) | 2016-09-01 |
JPWO2015068470A1 (ja) | 2017-03-09 |
CN105705903A (zh) | 2016-06-22 |
EP3067658A1 (en) | 2016-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6589636B2 (ja) | Three-dimensional shape measurement apparatus, three-dimensional shape measurement method, and three-dimensional shape measurement program | |
US10825198B2 (en) | 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images | |
US10068344B2 (en) | Method and system for 3D capture based on structure from motion with simplified pose detection | |
US20160260250A1 (en) | Method and system for 3d capture based on structure from motion with pose detection tool | |
US10277889B2 (en) | Method and system for depth estimation based upon object magnification | |
JP6304244B2 (ja) | Three-dimensional shape measurement apparatus, three-dimensional shape measurement method, and three-dimensional shape measurement program | |
JP2013535013A5 (ja) | ||
JP2010510559A5 (ja) | ||
TW201335888A (zh) | AR image processing apparatus and method | |
JP6969121B2 (ja) | Imaging system, image processing device, and image processing program | |
JP2008249431A (ja) | Three-dimensional image correction method and apparatus | |
CN113610702B (zh) | Mapping method and apparatus, electronic device, and storage medium | |
JP6409769B2 (ja) | Three-dimensional shape measurement apparatus, three-dimensional shape measurement method, and three-dimensional shape measurement program | |
GB2569609A (en) | Method and device for digital 3D reconstruction | |
KR101459522B1 (ko) | Position correction method using mobile-based additional information | |
WO2018134866A1 (ja) | Camera calibration device | |
JP2022190173A (ja) | Position estimation device | |
KR101863647B1 (ko) | Hypothesis line mapping and verification for 3D maps | |
JP6625654B2 (ja) | Projection device, projection method, and program | |
JP5230354B2 (ja) | Position specifying device and changed building detection device | |
KR20150119770A (ko) | Apparatus and method for measuring three-dimensional coordinates using a camera | |
JP5409451B2 (ja) | Three-dimensional change detection device | |
JP7075090B1 (ja) | Information processing system and information processing method | |
WO2022019128A1 (ja) | Information processing device, information processing method, and computer-readable recording medium | |
JP2011232982A (ja) | Image display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 14860750 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015546327 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2014860750 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014860750 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |