TWI630431B - Device and system for capturing 3-d images - Google Patents

Device and system for capturing 3-d images

Info

Publication number
TWI630431B
TWI630431B TW105127449A
Authority
TW
Taiwan
Prior art keywords
light
light source
module
capturing
time
Prior art date
Application number
TW105127449A
Other languages
Chinese (zh)
Other versions
TW201807443A (en)
Inventor
張榮喬
Original Assignee
光寶電子(廣州)有限公司
光寶科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 光寶電子(廣州)有限公司 and 光寶科技股份有限公司
Priority to TW105127449A
Publication of TW201807443A
Application granted
Publication of TWI630431B

Abstract

The present invention provides a device and system for capturing stereoscopic images. The device comprises a time-of-flight image-capturing module and a light-emitting module. The light-emitting module generates a structured light source and an illumination light source; the time-of-flight image-capturing module captures the first and second reflected light formed when the structured light and the illumination light are projected onto an object, thereby obtaining stereoscopic image information of the object. By integrating time-of-flight (TOF) and structured-light techniques, the invention obtains stereoscopic image information with high depth accuracy and low noise while reducing the overall volume and power consumption of the device and system.

Description

Device and system for capturing stereoscopic images

The present invention relates to an image-information capturing device and system, and more particularly to a device and system for capturing stereoscopic images.

Stereoscopic scanning technology is widely used in industry and in daily life, for example in 3D scanning, surveillance and recognition, and depth cameras. It is mainly used to detect and analyze the geometric structure and appearance of objects or environments, and to perform three-dimensional computation on the detected data in order to simulate and build digital models of those objects and environments.

Existing stereoscopic scanning techniques capture and compute 3D image data using methods such as time of flight (TOF), stereo vision, or structured light. Among these, the time-of-flight method computes the distance between the object under test and the image-capturing module from the time it takes light projected onto the object's surface to be reflected back to the sensor, thereby obtaining 3D image data of the object. Its advantage is that, compared with stereo vision and triangulation-based structured light, it can be implemented with a low-complexity system. For example, a time-of-flight image-capturing module can consist of a light-emitting element and an optical unit for focusing the reflected light (such as a lens) placed right next to each other, with no need for triangulation; that is, there is no need to account for the distance between the capturing module and the structured light source (the baseline), or for the angle between the baseline and the light reflected from the object. In addition, a time-of-flight module captures complete image information in a single projection-capture pass, making it suitable for real-time applications. However, the image information obtained with the time-of-flight method suffers from high depth noise, and the resulting image data has low accuracy.
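The time-of-flight relationship described above, recovering depth from the round-trip travel time of a light pulse, can be sketched as follows. This is a minimal illustration of the principle only, not the patent's implementation:

```python
# Minimal sketch of the TOF principle: depth = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(round_trip_seconds: float) -> float:
    """Distance from sensor to object, given the pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~1.334 ns corresponds to roughly 20 cm of depth.
print(round(tof_depth(1.334e-9), 3))  # -> 0.2
```

The nanosecond timescale in this example is why TOF sensors need fast modulation electronics, and why small timing jitter translates directly into the depth noise the text mentions.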

On the other hand, structured-light techniques project a specific pattern onto the object under test, capture a three-dimensional image of the object's surface with a sensor, and then compute the object's 3D coordinates by triangulation. Although structured light yields lower depth noise and higher image-data accuracy, computing the 3D point cloud with structured light is relatively time-consuming.

Therefore, there remains a need for a solution that combines the advantages of both the time-of-flight method and structured light, providing a system for capturing stereoscopic images that offers high depth accuracy with a smaller volume and lower power consumption.

To solve the above technical problem, one embodiment of the present invention provides a device for capturing stereoscopic images that comprises a time-of-flight image-capturing module and a light-emitting module. The time-of-flight image-capturing module captures stereoscopic image information of at least one object. The light-emitting module is adjacent to the time-of-flight image-capturing module and generates a structured light source and an illumination light source directed at the object. The structured light source is reflected by the object to form a first reflected light source; the illumination light source is reflected by the object to form a second reflected light source. The time-of-flight image-capturing module captures the first and second reflected light sources to obtain the stereoscopic image information of the object.

According to another embodiment of the present invention, a system for capturing stereoscopic images is provided for acquiring image information of at least one object. The system comprises a processor module, a time-of-flight processing module electrically connected to the processor module, a light-emitting module, and a time-of-flight image-capturing module electrically connected to the time-of-flight processing module. The light-emitting module generates a structured light source and an illumination light source directed at the object; reflection of the structured light by the object produces structured-light information, and reflection of the illumination light by the object produces illumination-light information. The time-of-flight image-capturing module captures both. The captured structured-light information is processed by the processor module to obtain a structured-light point cloud, and the captured illumination-light information is processed to obtain a time-of-flight point cloud. The two point clouds can then be merged into a stereoscopic depth map that provides the image information of the object.
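The merging step described in this embodiment might be sketched as below. The per-pixel overwrite policy and the (row, col, depth) layout are illustrative assumptions, not details taken from the patent:

```python
import numpy as np

def merge_point_clouds(sl_points: np.ndarray, tof_points: np.ndarray,
                       shape: tuple) -> np.ndarray:
    """Combine a structured-light point cloud (sparse but accurate) with a
    TOF point cloud (dense but noisier) into one depth map.

    Both inputs are (N, 3) arrays of (row, col, depth); `shape` is the
    sensor resolution. Where both clouds cover the same pixel, the
    structured-light value wins, since it carries lower depth noise.
    """
    depth = np.full(shape, np.nan)
    for r, c, z in tof_points:      # dense, noisy baseline
        depth[int(r), int(c)] = z
    for r, c, z in sl_points:       # overwrite with accurate samples
        depth[int(r), int(c)] = z
    return depth

tof = np.array([[0, 0, 0.21], [0, 1, 0.19]])
sl = np.array([[0, 0, 0.20]])
d = merge_point_clouds(sl, tof, (2, 2))
print(d[0, 0], d[0, 1])  # structured-light value kept at (0, 0)
```

Pixels covered by neither cloud stay NaN, a common way to mark invalid depth.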

The main technical means of the present invention is to generate a structured light source and an illumination light source within a single image-capture procedure, and to use a time-of-flight image-capturing module to capture the first and second reflected light sources formed when these two light sources are reflected by the surface of the object under test. This combines the advantages of the time-of-flight method and structured light, yielding stereoscopic image information of the object with low depth noise and high depth accuracy. In addition, the device and system of the present invention have a small overall volume and consume little power in operation.

For a further understanding of the features and technical content of the present invention, refer to the following detailed description and the accompanying drawings. The drawings are provided for reference and illustration only and are not intended to limit the invention.

D‧‧‧device for capturing stereoscopic images
1‧‧‧time-of-flight image-capturing module
11‧‧‧light-receiving surface
12‧‧‧switching module
13‧‧‧lens module
14‧‧‧time-of-flight sensing element
2‧‧‧light-emitting module
21‧‧‧light-generating unit
211‧‧‧laser generator
2111‧‧‧collimator
212‧‧‧optical assembly
2121‧‧‧beam-splitting element
2122‧‧‧reflective element
213‧‧‧light-emitting element
22‧‧‧light-diffraction element
221‧‧‧light-exit surface
23‧‧‧light-diffusing element
231‧‧‧light-exit surface
S‧‧‧system for capturing stereoscopic images
51‧‧‧memory unit
52‧‧‧processor module
53‧‧‧time-of-flight processing module
54‧‧‧first light-emitting module
55‧‧‧second light-emitting module
56‧‧‧time-of-flight image-capturing module
57‧‧‧computing unit
58‧‧‧light-emitting module
L11‧‧‧laser light source
L12‧‧‧projected light source
SL‧‧‧structured light source
IL‧‧‧illumination light source
R1‧‧‧first reflected light source
R2‧‧‧second reflected light source
O‧‧‧object
BA‧‧‧reference axis
BL‧‧‧baseline
θ‧‧‧triangulation angle

FIG. 1 is a schematic diagram of a device for capturing stereoscopic images according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a device for capturing stereoscopic images according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of a device for capturing stereoscopic images according to a third embodiment of the present invention;
FIG. 4 is a flowchart of a method for capturing stereoscopic images provided by the present invention;
FIG. 5A and FIG. 5B are schematic diagrams of a system for capturing stereoscopic images provided by the present invention; and
FIG. 6 is an operational flowchart of the system for capturing stereoscopic images provided by the present invention.

The following describes embodiments of the disclosed "device, method, and system for capturing stereoscopic images" through specific examples; those skilled in the art will understand the advantages and effects of the present invention from this disclosure. The invention may be carried out or applied in other specific embodiments, and the details in this specification may be modified and varied from different viewpoints and applications without departing from the spirit of the invention. The drawings are simplified schematics and are not drawn to actual scale. The following embodiments describe the related technical content of the invention in further detail, but the disclosure is not intended to limit its technical scope.

[First Embodiment]

First, refer to FIG. 1, a schematic diagram of the device for capturing stereoscopic images according to the first embodiment of the present invention. The device D comprises a time-of-flight image-capturing module 1 and a light-emitting module 2. The time-of-flight image-capturing module 1 includes at least a lens module 13 and a time-of-flight sensing element 14, and is used to capture stereoscopic image information of at least one object O. The light-emitting module 2 is adjacent to the time-of-flight image-capturing module 1; the distance between them depends on the desired image resolution and on the distance between the device D and the object O to be measured, as explained in detail later.

Following from the above, the light-emitting module 2 includes a light-generating unit 21, a light-diffraction element 22, and a light-diffusing element 23, and is used to generate the structured light source SL and the illumination light source IL directed at the object O. The light-generating unit 21 includes a light generator and may further include other optical elements, such as lenses; the type of light generator is not limited here. The light-generating unit 21 produces coherent light, so as long as that goal is met, the selection and arrangement of its internal components may be adjusted. For example, the light generator of the light-generating unit 21 may be a laser generator 211 producing a laser light source L11, such as one that emits infrared light. Alternatively, another light-emitting element, such as a light-emitting diode (LED), may serve as the light generator, paired with a narrow band-pass filter to convert the LED output into coherent light. Preferably, the light-generating unit 21 includes a laser generator 211, which directly produces a single-wavelength light source without a band-pass filter and supplies enough energy for stereoscopic image capture. In this embodiment, the light-generating unit includes the laser generator 211; when a laser generator 211 is used, it may be equipped with at least one collimator 2111.

As described above, the light-diffraction element 22 and the light-diffusing element 23 convert the coherent light produced by the light-generating unit 21 into the structured light source SL and the illumination light source IL, respectively. In other words, the structured light source SL produced by the light-emitting module 2 is formed by conversion through the light-diffraction element 22, while the illumination light source IL is formed by conversion through the light-diffusing element 23. Additional optical elements may be placed between the light-generating unit 21 and the light-diffraction element 22 or the light-diffusing element 23, for example lenses such as focusing lenses.

The light-diffraction element 22, also known as a diffractive optical element (DOE), may be a hologram, a grating, or another suitable optical element. Through the microstructure on its surface, it uses the light source to generate a two-dimensional code. In the embodiments of the present invention, the light-diffraction element 22 converts the coherent light produced by the light-generating unit 21 into the structured light source SL projected onto the surface of the object O (or of an environment composed of multiple objects O). The light-diffusing element 23 spreads the coherent light from the light-generating unit 21 evenly, producing the illumination light source IL projected onto the surface of the object O under test. To obtain complete stereoscopic image information of the object O or the environment, the light-diffusing element 23 converts the coherent light into an illumination light source IL that fully covers the object O or the environment under test.

It is worth mentioning that, because the light-generating unit 21 used in the embodiments of the present invention has a low-complexity structure, it has a small size and volume and can be applied to small electronic products such as mobile phones. Moreover, the light-generating unit 21 needs only a simple electronic control system to drive its laser generator 211 to produce pulsed light signals; the embodiments do not require a complicated panel or peripheral circuits to control the light-emission mode of the light-generating unit 21. Besides reducing the unit's volume, this also lowers process and production costs.

In the first embodiment, the light-generating unit 21 contains a single light generator (the laser generator 211). Therefore, to convert the laser light source L11 produced by the laser generator 211 into the structured light source SL and the illumination light source IL through the light-diffraction element 22 and the light-diffusing element 23 respectively, the light-generating unit 21 further includes an optical assembly 212 for splitting the laser light source L11. In other words, as shown in FIG. 1, the light-generating unit 21 includes the laser generator 211 and the optical assembly 212; the laser light source L11 passes in sequence through the optical assembly 212 and the light-diffraction element 22 to form the structured light source SL, and passes in sequence through the optical assembly 212 and the light-diffusing element 23 to form the illumination light source IL.

Specifically, the optical assembly 212 includes a beam-splitting element 2121 and a reflective element 2122. For example, the beam-splitting element 2121 may be a half mirror with a specific reflectance/transmittance ratio for splitting the laser light source L11, and the reflective element 2122 may be a light-guiding element such as a mirror. After the laser light source L11 is split, one part of the beam passes through the light-diffraction element 22 to form the structured light source SL, while the other part is reflected by the reflective element 2122 toward the light-diffusing element 23 to produce the illumination light source IL. When the structured light source SL is projected onto the surface of the object O, its reflection from that surface forms the first reflected light source R1; when the illumination light source IL is projected onto the surface of the object O, its reflection forms the second reflected light source R2.

Referring again to FIG. 1, the time-of-flight image-capturing module 1 captures the first reflected light source R1 and the second reflected light source R2 to obtain the stereoscopic image information of the object O. In other words, by using the lens module 13 of the time-of-flight image-capturing module 1 to capture the first reflected light source R1 produced by the structured light source SL, stereoscopic image information of the object O can be obtained through spatial modulation; by using the lens module 13 to capture the second reflected light source R2 produced by the illumination light source IL, stereoscopic image information of the object O can be obtained through time modulation.

The arrangement of the time-of-flight image-capturing module 1 and the light-emitting module 2 in the first embodiment is described in detail below. In the present invention, their arrangement depends on the desired resolution and accuracy of the stereoscopic image information and on the distance between the device D and the object O to be measured. Specifically, capturing stereoscopic image information with structured light requires triangulation, which computes the depth distance (the z direction in FIG. 1) from known parameters: the lateral distance between the time-of-flight module 1 and the emission point of the structured light source SL (the baseline in triangulation, shown as the baseline BL along the x direction in FIG. 1), and the angle between the first reflected light source R1 and the baseline BL (the triangulation angle θ in FIG. 1). Therefore, as shown in FIG. 1, the time-of-flight image-capturing module 1, the light-diffraction element 22, and the light-diffusing element 23 are arranged linearly, with the light-receiving surface 11 of the capturing module 1, the light-exit surface 221 of the light-diffraction element 22, and the light-exit surface 231 of the light-diffusing element 23 all aligned along the same reference axis BA.

Following from the above, based on triangulation considerations, the farther the light-diffraction element 22 is from the time-of-flight image-capturing module 1, the higher, in general, the accuracy of the resulting stereoscopic information. For example, when a Gray code algorithm is used with a 5-megapixel time-of-flight image-capturing module 1, a baseline BL of 14 cm between the light-diffraction element 22 and the module 1 and a triangulation angle θ of 20 degrees yield a resolution of 0.1 mm at a depth of 20 cm. With a phase-shift algorithm, or a speckle algorithm combined with phase shift, the resolution can be improved further, depending on the degree of light-dark dynamic contrast. However, as the distance between the light-diffraction element 22 and the capturing module 1 increases, so does the overall volume of the device D. On the other hand, because the illumination light source IL produced by the light-diffusing element 23 computes distance by the principle of time modulation, the distance between the light-diffusing element 23 and the capturing module 1 is not limited in the present invention. In other words, as long as the illumination light source IL projected from the light-exit surface 231 of the light-diffusing element 23 covers the visible area of the time-of-flight image-capturing module 1, that distance can be adjusted according to other parameters.
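The triangulation relationship above can be illustrated with a toy geometry. This is an assumption for illustration only: the projector is taken to emit along the depth axis, and the quoted resolution figure (0.1 mm at 20 cm) depends on the sensor and the Gray-code or phase-shift algorithm, which this sketch does not model:

```python
import math

def triangulated_depth(baseline_m: float, angle_deg: float) -> float:
    """Idealized triangulation: the projector emits along the depth (z)
    axis, and the camera, offset by `baseline_m` along the baseline, sees
    the reflected ray at `angle_deg` from the baseline. Then
    tan(angle) = depth / baseline, so depth = baseline * tan(angle).
    """
    return baseline_m * math.tan(math.radians(angle_deg))

# With a 14 cm baseline and a 20-degree angle in this toy geometry:
print(round(triangulated_depth(0.14, 20.0), 4))  # -> 0.051
```

The sketch also shows why a longer baseline helps: for a fixed angular measurement error, the recovered depth error shrinks as the baseline grows, at the cost of a larger device.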

In summary, the device D for capturing stereoscopic images provided by the first embodiment of the present invention uses a total of two projection-capture passes: the light-diffraction element 22 projects the structured light source SL and the time-of-flight image-capturing module 1 captures the resulting first reflected light source R1; then the light-diffusing element 23 projects the illumination light source IL and the module 1 captures the resulting second reflected light source R2. This yields stereoscopic image information obtained through spatial modulation and time modulation, respectively. The order of the two passes may also be reversed; the invention is not limited in this respect.
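The two-pass sequence can be sketched as follows; all class and method names here are illustrative stand-ins, not APIs from the patent:

```python
# Hypothetical orchestration of the two projection-capture passes described
# above; Emitter and Camera are stand-in stubs for modules 2 and 1.

class Emitter:
    """Stand-in for light-emitting module 2: DOE path or diffuser path."""
    def __init__(self):
        self.mode = None
    def project(self, mode: str):
        assert mode in ("structured", "illumination")
        self.mode = mode

class Camera:
    """Stand-in for TOF image-capturing module 1."""
    def capture(self, emitter: Emitter) -> str:
        # Tag which reflected light was recorded in this pass.
        return "R1" if emitter.mode == "structured" else "R2"

def capture_stereo_frame(emitter: Emitter, camera: Camera):
    """Structured-light pass (spatial modulation), then illumination pass
    (time modulation). The patent notes the order may be reversed."""
    emitter.project("structured")        # DOE path -> pattern SL
    sl_frame = camera.capture(emitter)   # first reflected light R1
    emitter.project("illumination")      # diffuser path -> flood IL
    tof_frame = camera.capture(emitter)  # second reflected light R2
    return sl_frame, tof_frame

print(capture_stereo_frame(Emitter(), Camera()))  # -> ('R1', 'R2')
```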

By integrating the two different stereoscopic scanning techniques, their respective advantages can be combined: structured light provides close-range, high-accuracy stereoscopic image capture, while the time-of-flight method handles mid-range and long-range stereoscopic scanning and at the same time improves the depth accuracy of close-range structured-light scanning.

Referring again to FIG. 1, the time-of-flight image-capturing module 1 of the first embodiment of the present invention further includes a switching module 12, which is used for color texturing of the obtained stereoscopic image information. Specifically, the switching module 12 includes an infrared band-pass filter, a visible-light band-pass filter, and an optical switch. The optical switch guides the first reflected light source R1 and the second reflected light source R2 to the infrared band-pass filter, or guides ambient light to the visible-light band-pass filter. The optical switch may be, for example, a piezoelectric motor, a voice coil motor (VCM), or an electromagnetic switch.

In this way, different types of image information can be obtained depending on which bandpass filter the light received by the lens module 14 passes through. For example, in the infrared mode, the first reflected light R1 and the second reflected light R2 are guided to the infrared bandpass filter by the optical switch, so that wavelengths outside the infrared band are filtered out; the 2D reflected images of the structured light SL and the illumination light IL are then converted, through spatial-modulation and time-modulation algorithms respectively, into black-and-white stereoscopic image information of the object O. In the visible-light mode, the reflected light produced by ambient light striking the object O is guided to the visible-light bandpass filter by the optical switch, so that wavelengths outside the visible band are filtered out and color image information of the object O is obtained. Specifically, in the visible-light mode, the light-emitting module 2 that generates the structured light SL and the illumination light IL may be switched off, while the optical switch guides the light reflected from the object O under ambient light, such as natural outdoor light (sunlight) or indoor lighting (e.g., electric lamps), to the visible-light bandpass filter; the filter removes wavelengths outside the visible band, thereby capturing color image information of the object O. In other words, no light projection by the light-emitting module is required in the visible-light mode.
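The IR/visible switching behavior described above can be modeled as a small state machine. The sketch below is illustrative only — the class and method names are assumptions, not part of the patent — and captures the one invariant the text states: in visible-light mode the light-emitting module stays off and only ambient light is captured.

```python
from enum import Enum


class Mode(Enum):
    INFRARED = "infrared"   # structured-light / illumination capture
    VISIBLE = "visible"     # ambient-light color capture


class SwitchingModule:
    """Toy model of the switching module 12: route incoming light to the
    IR or visible bandpass filter and gate the light-emitting module.
    Names are illustrative assumptions."""

    def __init__(self):
        self.mode = Mode.INFRARED
        self.emitter_on = True

    def set_mode(self, mode: Mode) -> None:
        self.mode = mode
        # In visible-light mode the structured/illumination emitters stay
        # off; only ambient light reflected by the object is captured.
        self.emitter_on = (mode == Mode.INFRARED)


sw = SwitchingModule()
sw.set_mode(Mode.VISIBLE)   # emitter is now off
```

The same object thus alternates between serving as a TOF camera and a color camera, which is what later makes pixel-aligned texturing possible.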

In other words, after the two projection-and-capture procedures described above (performed with the structured light SL and the illumination light IL, respectively), the optical switch in the switching module 12 can guide the light reflected from the object O under ambient light to the visible-light bandpass filter, thereby completing a third, color capture procedure. This third capture yields color image information of the object O; therefore, by integrating and processing the data, the black-and-white stereoscopic image information obtained from the first two projection-and-capture procedures can be converted into color stereoscopic image information using the data from the third capture.

In summary, by adding the switching module 12 to the structure of the time-of-flight image sensing module 1, color texturing can be applied to the black-and-white stereoscopic image information. Moreover, because the time-of-flight image sensing module 1 containing the switching module 12 serves simultaneously as a time-of-flight (TOF) camera and a color camera, the two cameras share the same pixels and a common world coordinate system. Fast and seamless color texturing can therefore be achieved without any additional calibration procedure, while eliminating the need for a separate color camera.

[Second Embodiment]

Please refer to FIG. 2, a schematic diagram of a device for capturing stereoscopic images according to a second embodiment of the present invention. The main difference between the second embodiment and the first embodiment lies in the design of the light-emitting module 2. The following describes the differences between the two embodiments; the parts of the second embodiment that are substantially the same as the first embodiment are not repeated.

As shown in FIG. 2, in the second embodiment the light-generating unit 21 includes a laser generator 211 and a light-emitting element 213. The laser light L11 produced by the laser generator 211 passes through the light-diffraction element 22 and is converted into the structured light SL, while the projection light L12 produced by the light-emitting element 213 passes through the light-diffusion element 23 and is converted into the illumination light IL. In other words, unlike the first embodiment, the second embodiment uses two independent light generators to produce the laser light L11 and the projection light L12 from which the structured light SL and the illumination light IL are respectively formed.

Specifically, since the structured light SL must be produced from coherent light, the second embodiment uses the laser generator 211 together with a collimator 2111 to produce the laser light L11. On the other hand, the type of the light-emitting element 213 used to generate the illumination light IL is not limited; for example, it may be an LED.

Referring again to FIG. 2, the light-diffraction element 22 and the light-diffusion element 23 are disposed on the same side of the time-of-flight image capturing module 1, with the light-diffusion element 23 located between the light-diffraction element 22 and the time-of-flight image capturing module 1. As in the first embodiment, the time-of-flight image capturing module 1, the light-diffraction element 22, and the light-diffusion element 23 are arranged linearly, and the light-receiving surface 111 of the module, the light-emitting surface 221 of the light-diffraction element 22, and the light-emitting surface 231 of the light-diffusion element 23 are all aligned along the same reference axis BA. The distance in the x direction (horizontal direction) between the time-of-flight image capturing module 1 and the light-diffraction element 22 serves as the baseline BL used in triangulation.
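The role of the baseline BL can be illustrated with the standard pinhole triangulation relation Z = f·B/d. The function and the numeric values below are illustrative assumptions; the patent specifies only that the module-to-diffraction-element distance serves as the baseline.

```python
# Hypothetical sketch: depth from disparity in a structured-light setup
# using a triangulation baseline like BL. Pinhole-camera model and the
# numbers are illustrative assumptions, not taken from the patent.

def depth_from_disparity(baseline_m: float, focal_px: float,
                         disparity_px: float) -> float:
    """Classic triangulation: Z = f * B / d (pinhole model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# For a fixed baseline, the same depth uncertainty in pixels maps to a
# smaller depth error at close range, which is why structured light is
# used for near-field scanning here.
z_near = depth_from_disparity(baseline_m=0.05, focal_px=600.0, disparity_px=60.0)
z_far = depth_from_disparity(baseline_m=0.05, focal_px=600.0, disparity_px=6.0)
```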

[Third Embodiment]

Please refer to FIG. 3, a schematic diagram of a device for capturing stereoscopic images according to a third embodiment of the present invention. The main difference between the third embodiment and the second embodiment lies in the arrangement of the components of the light-emitting module 2. The following describes the differences; the parts of the third embodiment that are substantially the same as the second embodiment are not repeated.

As shown in FIG. 3, the light-generating unit 21 includes a laser generator 211 and a light-emitting element 213. The laser light L11 produced by the laser generator 211 passes through the light-diffraction element 22 and is converted into the structured light SL, and the projection light L12 produced by the light-emitting element 213 passes through the light-diffusion element 23 and is converted into the illumination light IL. Here, however, the light-diffraction element 22 and the light-diffusion element 23 are disposed on two opposite sides of the time-of-flight image capturing module 1. It is worth noting that, in both the third embodiment and the second embodiment, the relative positions of the light-diffraction element 22, the light-diffusion element 23, and the time-of-flight image capturing module 1 can be adjusted according to the desired accuracy of the stereoscopic image information and other parameters of the process of manufacturing the device D for capturing stereoscopic images.

In addition, the present invention provides a method for capturing stereoscopic images. Please refer to FIG. 4, a flowchart of the method. The method includes the following steps: generating, by a light-emitting module, structured light and illumination light directed at an object (step S100); reflecting the structured light and the illumination light off the object to form first reflected light and second reflected light, respectively (step S102); and capturing the first reflected light and the second reflected light by a time-of-flight image capturing module to obtain stereoscopic image information of the object (step S104).

Please refer to FIGS. 1 to 3 together. First, in step S100, the structured light SL and the illumination light IL directed at the object O are generated by the light-emitting module 2. The light-emitting module 2 may be any of those described in the first to third embodiments, the details of which are not repeated here.

Next, in step S102, the surface of the object O reflects the structured light SL and the illumination light IL, thereby forming the first reflected light R1 and the second reflected light R2, respectively. Finally, in step S104, the time-of-flight image capturing module 1 captures the first reflected light R1 and the second reflected light R2 to obtain stereoscopic image information of the object O. In addition, the method provided by the present invention may further include a step of applying color texturing to the stereoscopic image information; the way color texturing is performed through the switching module 12 is substantially the same as described in the first embodiment.

Specifically, the time-of-flight image capturing module 1 may further include a switching module 12 comprising an infrared bandpass filter, a visible-light bandpass filter, and an optical switch; the optical switch guides the first reflected light R1 and the second reflected light R2 to the infrared bandpass filter, or guides the light reflected from the object O under ambient light to the visible-light bandpass filter. In the infrared mode, the first reflected light R1 and the second reflected light R2 are both guided by the optical switch to the infrared bandpass filter and then reach the time-of-flight sensing element 14, producing black-and-white stereoscopic image information. In the visible-light mode, the light reflected from the object O under ambient light is guided by the optical switch to the visible-light bandpass filter and then reaches the time-of-flight sensing element 14, producing color image information. By computing on the black-and-white stereoscopic image information together with the color image information, color stereoscopic image information can be obtained.

The present invention further provides a system for capturing stereoscopic images, which captures stereoscopic image information of at least one object O. Please refer to FIGS. 5A and 5B together with FIG. 6. FIGS. 5A and 5B are schematic diagrams of the system for capturing stereoscopic images provided by the present invention, and FIG. 6 is a flowchart of its operation. The system S includes a memory unit 51, a processor module 52, a time-of-flight processing module 53, a light-emitting module 58, and a time-of-flight image capturing module 56.

First, refer to FIG. 5A together with FIG. 1. The processor module 52 is electrically connected to the memory unit 51; the time-of-flight processing module 53 is electrically connected to the processor module 52; the light-emitting module 58 is electrically connected to the time-of-flight processing module 53; and the time-of-flight image capturing module 56 is electrically connected to the time-of-flight processing module 53. For example, the light-emitting module 58 may correspond to the light-emitting module 2 of the first embodiment, and the time-of-flight image capturing module 56 may correspond to the time-of-flight image capturing module 1 of the first embodiment.

As shown in FIG. 5A, under the control of the time-of-flight processing module 53, the light-emitting module 58 generates the structured light SL and the illumination light IL directed at the object O. The structured light SL and the illumination light IL are each reflected by the object O, producing structured-light information and illumination-light information, respectively. Under the control of the time-of-flight processing module 53, the time-of-flight image capturing module 56 captures the structured-light information and the illumination-light information.

Following the above, after processing by the processor module 52, the structured-light information captured by the time-of-flight image capturing module 56 yields a structured-light 3D point cloud of the object O, and the illumination-light information yields a time-of-flight 3D point cloud. The structured-light point cloud and the time-of-flight point cloud can then be computationally merged into a single 3D depth map. For example, the two point clouds may be merged by the processor module 52 into a 3D depth map that provides the image information of the object O; however, the present invention is not limited thereto.

Specifically, the time-of-flight processing module 53 modulates the illumination light IL generated by the light-emitting module 58 (i.e., it applies a modulation signal for time modulation) and, using the same modulation scheme, demodulates the sensor of the time-of-flight image capturing module 56. The information captured by the time-of-flight image capturing module 56 and the information computed by the processor module 52, including the structured-light point cloud, the time-of-flight point cloud, and other data, can all be stored in the memory unit. In other words, the information captured by the time-of-flight image capturing module 56 can be temporarily stored in the memory unit and later sent back to the processor module 52 for processing. In addition, the system S may further include a computing unit 57 electrically connected to the processor module 52 for correcting the 3D depth map. The processor module 52 also exercises overall control over the time-of-flight image capturing module 56 and the light-emitting module 58 to avoid timing interference between these components.
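The modulation/demodulation loop described here is not spelled out in the patent; a common scheme it is compatible with is four-phase continuous-wave demodulation, sketched below under assumed parameters (20 MHz modulation, ideal noiseless samples; sign conventions vary by sensor).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a90, a180, a270, mod_freq_hz):
    """Recover distance from four phase-offset samples of the reflected
    modulated light: the phase shift encodes the round-trip time."""
    phase = math.atan2(a90 - a270, a0 - a180) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * mod_freq_hz)

# Simulate ideal samples for a known distance and recover it.
f_mod = 20e6   # 20 MHz modulation frequency (illustrative assumption)
true_d = 2.0   # metres
phi = 4.0 * math.pi * f_mod * true_d / C
samples = [math.cos(phi - offset)
           for offset in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
recovered = tof_depth(*samples, f_mod)
```

Demodulating the sensor with the same signal used to modulate the emitter, as the text describes, is what makes the phase difference — and hence the depth — measurable.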

Next, refer to FIG. 5B together with FIGS. 2 and 3. As shown in FIG. 5B, the system S for capturing stereoscopic images includes a memory unit 51, a processor module 52, a time-of-flight processing module 53, a first light-emitting module 54, a second light-emitting module 55, and a time-of-flight image capturing module 56. The processor module 52 is electrically connected to the memory unit 51; the time-of-flight processing module 53 is electrically connected to the processor module 52; the first light-emitting module 54 is electrically connected to the processor module 52; the second light-emitting module 55 is electrically connected to the time-of-flight processing module 53; and the time-of-flight image capturing module 56 is electrically connected to the time-of-flight processing module 53. Under the control of the processor module 52, the first light-emitting module 54 generates the structured light SL directed at the object O, whose reflection off the object O produces structured-light information. Under the control of the time-of-flight processing module 53, the second light-emitting module 55 generates the illumination light IL directed at the object O, whose reflection off the object O produces illumination-light information. Under the control of the time-of-flight processing module 53, the time-of-flight image capturing module 56 captures the structured-light information and the illumination-light information.

For example, the first light-emitting module 54 and the second light-emitting module 55 correspond respectively to the laser generator 211 and the light-emitting element 213 shown in FIGS. 2 and 3, and the time-of-flight image capturing module 56 may correspond to the time-of-flight image capturing module 1 of the first to third embodiments. In other words, the system shown in FIG. 5B may use the devices for capturing stereoscopic images of FIGS. 2 and 3. The difference between the systems of FIGS. 5B and 5A is that, in the system of FIG. 5B, the first light-emitting module 54 can be switched on and off directly by the processor module 52; that is, the first light-emitting module 54 does not need to be time-modulated by the time-of-flight processing module 53. The remaining details of FIG. 5B are substantially the same as those of FIG. 5A and are not repeated here.

Please refer to FIG. 6, together with FIGS. 1 to 3, 5A, and 5B as needed. Using the system for capturing stereoscopic images shown in FIGS. 5A and 5B, the stereoscopic image capture procedure can be carried out as follows. First, the system S is operated in black-and-white mode to obtain black-and-white image information (step S200); in other words, step S200 operates the system S in infrared (IR) mode. Step S200 includes a procedure for obtaining a structured-light 3D point cloud through structured light (steps S2001, S2003, S2005) and a procedure for obtaining a time-of-flight 3D point cloud through the time-of-flight method (steps S2002, S2004, S2006). The order of these two procedures can be adjusted as needed.

For example, the structured light SL directed at the at least one object O may be generated first (step S2001); that is, the structured light SL can be produced by the combination of the light-generating unit 21 and the light-diffraction element 22 (contained in the light-emitting module 58 or the first light-emitting module 54). The structured light SL is reflected by the object O to form the first reflected light R1, which the time-of-flight image capturing module 56 captures to produce structured-light information (step S2003). Next, the structured-light information captured by the time-of-flight image capturing module 56 is processed by the processor module 52 to obtain the structured-light 3D point cloud (step S2005). At this point, the procedure for capturing a stereoscopic image of the object O with structured light is complete.

Next, the illumination light IL directed at the at least one object O is generated (step S2002); that is, the illumination light IL can be produced by the combination of the light-generating unit 21 and the light-diffusion element 23 (contained in the light-emitting module 58 or the second light-emitting module 55). The illumination light IL is reflected by the object O to form the second reflected light R2, which the time-of-flight image capturing module 56 captures to produce illumination-light information (step S2004). Next, the illumination-light information captured by the time-of-flight image capturing module 56 is processed by the processor module 52 to obtain the time-of-flight 3D point cloud (step S2006). At this point, the procedure for capturing a stereoscopic image of the object O by the time-of-flight method is complete.

Then, the structured-light 3D point cloud and the time-of-flight 3D point cloud are merged by the processor module 52 into a black-and-white 3D depth map (step S2007). In this step, the information obtained by close-range 3D scanning with structured light is integrated with the information obtained by mid- and long-range 3D scanning with the time-of-flight method. In this way, structured light provides high-accuracy 3D capture at close range, while the time-of-flight method covers mid and long ranges and at the same time improves the depth accuracy of close-range structured-light scanning.
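One hypothetical way to realize the merge of step S2007 is a per-pixel preference rule: trust the structured-light depth at close range and fall back to the TOF depth elsewhere. The cut-off distance, the use of 0.0 to mark invalid measurements, and the flat-list data layout are all illustrative assumptions, not details from the patent.

```python
def fuse_depths(d_sl, d_tof, near_limit=1.0):
    """Per-pixel fusion of a structured-light depth map (accurate at
    close range) with a time-of-flight depth map (covers mid/far
    range). 0.0 marks an invalid measurement."""
    fused = []
    for sl, tof in zip(d_sl, d_tof):
        if 0.0 < sl <= near_limit:
            fused.append(sl)    # trust structured light up close
        elif tof > 0.0:
            fused.append(tof)   # fall back to time of flight
        else:
            fused.append(sl)    # keep whatever remains (possibly invalid)
    return fused


# Pixel 1 is near (structured light wins), pixel 2 has no structured-light
# return (TOF fills the hole), pixel 3 is beyond the near limit (TOF wins).
depths = fuse_depths([0.42, 0.0, 1.8], [0.45, 2.3, 1.7])
```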

Following the above, since operating the system S in infrared mode yields only a black-and-white 3D depth map, steps S202 and S204 can be performed after step S200: the system S is operated in color mode to obtain color image information, and the black-and-white 3D depth map is colored based on the pixel information in that color image to obtain a color 3D depth map. It is worth noting that, because the pixels of the color image information coincide exactly with those of the information obtained using the structured light SL or the illumination light IL, the black-and-white 3D depth map can be colored with high accuracy without any external calibration. Step S202 can be accomplished using the switching module 12 described with reference to FIGS. 1 to 3. Specifically, step S202 includes illuminating the object O with ambient light, capturing the color light reflected from the surface of the object O through the time-of-flight image capturing module 56, and guiding that reflected color light to the visible-light bandpass filter using the optical switch in the switching module 12, thereby obtaining color image information of the object O. Finally, in step S204, the black-and-white 3D depth map obtained in step S200 is colored according to the color image information obtained in step S202 to obtain a color 3D depth map.
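Because the depth and color images come from the same sensor behind switchable filters, the coloring of step S204 reduces to a direct per-pixel pairing. The sketch below illustrates this under an assumed flat-list data layout; no re-projection or extrinsic calibration is involved.

```python
def texture_depth_map(depth, rgb):
    """Attach a color to every valid depth sample. Because the same
    sensor records both images, depth[i] and rgb[i] share the same
    pixel, so indices line up directly. Layout is an illustrative
    assumption."""
    if len(depth) != len(rgb):
        raise ValueError("depth and color images must be pixel-aligned")
    # Keep only valid depth samples (0.0 marks an invalid measurement).
    return [(d, c) for d, c in zip(depth, rgb) if d > 0.0]


cloud = texture_depth_map(
    [0.5, 0.0, 1.2],
    [(255, 0, 0), (0, 0, 0), (0, 128, 255)],
)
```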

In summary, the benefit of the present invention is that, by integrating structured light and time-of-flight technology into a single device or system for stereoscopic image capture, stereoscopic image information with high depth accuracy and low noise can be obtained, while the overall size and power consumption of the device and system are reduced.

Specifically, the device, method, and system provided by the present invention use structured light for close-range, high-accuracy stereoscopic image capture and the time-of-flight method for mid- and long-range 3D scanning. This effectively extends the range of 3D scanning and thereby broadens the product's range of applications. In addition, since the device and system of the present invention are small and structurally simple, they can be used in compact electronic products and operated on limited power. In detail, the structured light SL can be produced with simple components, namely a small laser generator 211 combined with the collimator 2111 and the light-diffraction element 22 (both small lenses); the laser generator 211 only needs to supply a pulsed signal and requires no complex control module, so manufacturing cost can be greatly reduced. Furthermore, the time-of-flight method can be used to improve the depth accuracy of close-range 3D scanning with structured light.

In addition, by adding a switching module to the time-of-flight image capturing module, the device, method, and system provided by the present invention achieve fast and seamless color texturing without increasing the number of cameras or the size of the device or system.

The above are merely preferred feasible embodiments of the present invention and do not limit the scope of the patent; all equivalent technical changes made using the specification and drawings of the present invention are included within the scope of protection of the present invention.

Claims (13)

1. A device for capturing stereoscopic images, comprising: a time-of-flight image capturing module for capturing stereoscopic image information of at least one object; and a light-emitting module adjacent to the time-of-flight image capturing module, wherein the light-emitting module generates structured light and illumination light directed at the object; wherein the structured light is reflected by the object to form first reflected light, the illumination light is reflected by the object to form second reflected light, and the time-of-flight image capturing module captures the first reflected light and the second reflected light to obtain the stereoscopic image information of the object.

2. The device for capturing stereoscopic images according to claim 1, wherein the light-emitting module comprises a light-generating unit, a light-diffraction element, and a light-diffusion element; the structured light generated by the light-emitting module is formed by conversion through the light-diffraction element, and the illumination light generated by the light-emitting module is formed by conversion through the light-diffusion element.
3. The device for capturing stereoscopic images according to claim 2, wherein the light-generating unit comprises a laser generator for generating laser light and an optical assembly; the laser light passes in sequence through the optical assembly and the light-diffraction element to form the structured light, and passes in sequence through the optical assembly and the light-diffusion element to form the illumination light.

4. The device for capturing stereoscopic images according to claim 3, wherein the laser generator has at least one collimator and the optical assembly comprises a beam-splitting element and a reflecting element; the laser light undergoes, in sequence, collimation by the at least one collimator, splitting by the beam-splitting element, and conversion by the light-diffraction element to form the structured light, and undergoes, in sequence, collimation by the at least one collimator, splitting by the beam-splitting element, reflection by the reflecting element, and conversion by the light-diffusion element to form the illumination light.
5. The device for capturing three-dimensional images of claim 2, wherein the light-generating unit comprises a laser generator and a light-emitting element; a laser beam produced by the laser generator passes through the diffractive optical element and is converted into the structured light source, and a projected light produced by the light-emitting element passes through the light-diffusing element and is converted into the illumination light source. 6. The device for capturing three-dimensional images of claim 2, wherein the time-of-flight image capturing module, the diffractive optical element, and the light-diffusing element are arranged linearly, and a light-receiving surface of the time-of-flight image capturing module, a light-exit surface of the diffractive optical element, and a light-exit surface of the light-diffusing element are all aligned along the same reference axis. 7. The device for capturing three-dimensional images of claim 6, wherein the diffractive optical element and the light-diffusing element are disposed on the same side of the time-of-flight image capturing module, with the light-diffusing element disposed between the diffractive optical element and the time-of-flight image capturing module. 8. The device for capturing three-dimensional images of claim 6, wherein the diffractive optical element and the light-diffusing element are respectively disposed on opposite sides of the time-of-flight image capturing module.
9. The device for capturing three-dimensional images of claim 1, wherein the time-of-flight image capturing module further comprises a switching module, the switching module comprising: an infrared band-pass filter; a visible-light band-pass filter; and an optical switch for guiding both the first reflected light and the second reflected light to the infrared band-pass filter, or guiding the reflected light produced by an ambient light source illuminating the object to the visible-light band-pass filter; wherein, when the optical switch guides the first reflected light and the second reflected light to the infrared band-pass filter, they pass through the infrared band-pass filter to produce monochrome stereoscopic image information; and when the optical switch guides the reflected ambient light to the visible-light band-pass filter, it passes through the visible-light band-pass filter to obtain color image information.
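Claim 9's switching module routes incoming light down one of two filter paths: reflected structured/illumination light to the infrared band-pass filter for depth, or ambient reflections to the visible-light band-pass filter for color. A toy model of that routing decision, with hypothetical names (`SwitchingModule`, `route`) that are illustrative only and do not appear in the patent:

```python
from dataclasses import dataclass

@dataclass
class SwitchingModule:
    """Illustrative model of the claimed optical switch.

    mode == "depth": the first and second reflected light are guided to the
    IR band-pass filter (monochrome 3-D information).
    mode == "color": ambient-light reflections are guided to the visible-light
    band-pass filter (color image information).
    """
    mode: str = "depth"

    def route(self, frame_kind: str) -> str:
        if self.mode == "depth" and frame_kind in ("structured", "illumination"):
            return "ir_bandpass"
        if self.mode == "color" and frame_kind == "ambient":
            return "visible_bandpass"
        raise ValueError(f"{frame_kind!r} is not routable in {self.mode!r} mode")

sw = SwitchingModule(mode="depth")
path = sw.route("structured")
```

In this sketch the switch is a pure routing decision; in hardware it would be a movable or electrically controlled optical element placed in front of the two filters.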
10. A system for capturing three-dimensional images, for acquiring image information of at least one object, the system comprising: a processor module; a time-of-flight processing module electrically connected to the processor module; a light-emitting module for generating a structured light source directed toward the object and an illumination light source directed toward the object, wherein the structured light source is reflected by the object to produce structured-light information and the illumination light source is reflected by the object to produce illumination-light information; and a time-of-flight image capturing module electrically connected to the time-of-flight processing module for capturing the structured-light information and the illumination-light information; wherein the structured-light information captured by the time-of-flight image capturing module is processed by the processor module to obtain a structured-light point cloud; the illumination-light information captured by the time-of-flight image capturing module is processed by the processor module to obtain a time-of-flight point cloud; and the structured-light point cloud and the time-of-flight point cloud can be merged into a depth map that provides the image information.
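Claim 10 merges a structured-light point cloud with a time-of-flight point cloud into a single depth map, but does not specify a fusion rule. One common approach, sketched here as an assumption: fall back to whichever source has a valid sample, and blend where both do (the 70/30 weighting and all names are hypothetical):

```python
import numpy as np

def merge_depth(d_sl, d_tof, w_sl=0.7):
    """Fuse a structured-light depth map with a TOF depth map.

    Invalid samples are marked NaN. Where only one source is valid, its
    value is used; where both are valid, a weighted blend is taken.
    The weight w_sl is an assumed parameter, not from the patent.
    """
    sl_ok = ~np.isnan(d_sl)
    tof_ok = ~np.isnan(d_tof)
    out = np.full(d_sl.shape, np.nan)
    out[tof_ok] = d_tof[tof_ok]          # TOF fills where it is valid
    out[sl_ok] = d_sl[sl_ok]             # structured light overrides where valid
    both = sl_ok & tof_ok
    out[both] = w_sl * d_sl[both] + (1 - w_sl) * d_tof[both]
    return out

# Tiny 2x2 example: each source has one hole that the other fills.
d_sl = np.array([[1.00, np.nan], [1.20, 1.10]])
d_tof = np.array([[1.04, 0.98], [np.nan, 1.30]])
fused = merge_depth(d_sl, d_tof)
```

This per-pixel blend reflects the usual trade-off the abstract alludes to: structured light contributes fine depth accuracy, while TOF contributes dense, low-noise coverage.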
11. The system for capturing three-dimensional images of claim 10, wherein the light-emitting module comprises a light-generating unit, a diffractive optical element, and a light-diffusing element; the structured light source generated by the light-emitting module is formed by conversion through the diffractive optical element, and the illumination light source generated by the light-emitting module is formed by conversion through the light-diffusing element. 12. The system for capturing three-dimensional images of claim 10, further comprising a memory unit, wherein the structured-light point cloud, the time-of-flight point cloud, and the depth map are stored in the memory unit. 13. The system for capturing three-dimensional images of claim 10, further comprising a computing unit electrically connected to the processor module for correcting the depth map.
TW105127449A 2016-08-26 2016-08-26 Device and system for capturing 3-d images TWI630431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW105127449A TWI630431B (en) 2016-08-26 2016-08-26 Device and system for capturing 3-d images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW105127449A TWI630431B (en) 2016-08-26 2016-08-26 Device and system for capturing 3-d images

Publications (2)

Publication Number Publication Date
TW201807443A TW201807443A (en) 2018-03-01
TWI630431B true TWI630431B (en) 2018-07-21

Family

ID=62189610

Family Applications (1)

Application Number Title Priority Date Filing Date
TW105127449A TWI630431B (en) 2016-08-26 2016-08-26 Device and system for capturing 3-d images

Country Status (1)

Country Link
TW (1) TWI630431B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111721232B (en) * 2020-06-19 2022-05-31 广州立景创新科技有限公司 Three-dimensional sensing device, light emitting module and control method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103558604A (en) * 2013-10-12 2014-02-05 浙江大学 Reflective imaging method and system for modulation-type diffuse reflection surface according to flight time principle
WO2014068073A1 (en) * 2012-11-05 2014-05-08 Hexagon Technology Center Gmbh Method and device for determining three-dimensional coordinates of an object
CN103472457B (en) * 2013-09-13 2015-06-10 中国科学院空间科学与应用研究中心 Three-dimensional imaging system and method for calculating correlation flight time by means of sparse aperture compression
US9131221B2 (en) * 2009-06-25 2015-09-08 Koninklijke Philips N.V. Stereoscopic image capturing method, system and camera

Also Published As

Publication number Publication date
TW201807443A (en) 2018-03-01
