TWI842465B - Establishment system for 3D point cloud information and method thereof - Google Patents


Info

Publication number: TWI842465B
Application number: TW112112961A
Authority: TW (Taiwan)
Prior art keywords: information, dimensional, imaging unit, processor, depth sensor
Other languages: Chinese (zh)
Inventors: 楊善, 林裕勛
Original assignee: 友達光電股份有限公司 (AU Optronics Corp.)
Application filed by 友達光電股份有限公司
Application granted
Publication of TWI842465B

Abstract

An establishment system for 3D point cloud information, and a method thereof, are used to establish the 3D point cloud information of an object. The establishment system includes a stage, a 2D imaging unit, a first processor, a depth sensor, and a second processor. The stage is configured to bear the object and is provided with a light-emitting unit configured to illuminate the object. The 2D imaging unit is configured to capture a plurality of images of the object. The first processor is coupled to the 2D imaging unit and is configured to analyze each of the images to obtain the edge information of each image, where the edge information includes the X information and the Y information of a plurality of points. The depth sensor is coupled to the first processor and is configured to capture a plurality of pieces of Z information for each of the points according to the edge information of each image. The second processor is coupled to the first processor and is configured to map the X information, the Y information, and the pieces of Z information of each point to generate the 3D point cloud information of the object.

Description

Three-dimensional point cloud information construction system and construction method

The present disclosure relates to three-dimensional point cloud information, and more particularly to a system and method for constructing three-dimensional point cloud information.

Compared with traditional flat screens, curved screens have gradually become mainstream products in high-end markets (such as automotive electronics, gaming, and televisions). However, various process technologies used for traditional flat liquid crystal display (LCD) panels cannot be applied directly to curved LCD panels, for example, laser cutting of the protective film (PET), anti-scattering film (ASF), and adhesive film (PSA), as well as automated optical inspection (AOI). To apply these laser cutting and AOI technologies to a curved LCD panel, the three-dimensional point cloud information of the panel must be established in advance; only then can such high-precision processes be carried out.

In view of this, the inventors propose a system and method for constructing three-dimensional point cloud information, so as to build the three-dimensional point cloud information of an object under test and thereby enable high-precision process technologies.

In some embodiments, a system for constructing three-dimensional point cloud information includes a stage, a two-dimensional imaging unit, a first processor, a depth sensor, and a second processor. The stage carries an object under test and is provided with a light-emitting unit that emits light to illuminate the object under test. The two-dimensional imaging unit captures a plurality of two-dimensional images of the object under test, with its optical axis pointing toward the stage. The first processor is coupled to the two-dimensional imaging unit and analyzes each two-dimensional image to obtain its edge information, where the edge information of each two-dimensional image includes the X information and Y information of a plurality of points. The depth sensor is coupled to the first processor and captures a plurality of pieces of Z information for each point according to the edge information of each two-dimensional image, with its optical axis also pointing toward the stage. The second processor is coupled to the first processor and maps the X information, Y information, and pieces of Z information of each point to generate the three-dimensional point cloud information of the object under test.

In some embodiments, the two-dimensional imaging unit includes a charge-coupled device (CCD) sensor and a telecentric lens; the optical axis of the CCD sensor and the optical axis of the telecentric lens are coaxial, and the CCD sensor is coupled to the first processor.

In some embodiments, the working distance of the two-dimensional imaging unit is not less than 65 millimeters.

In some embodiments, the depth sensor is a chromatic confocal sensor.

In some embodiments, the optical axis of the two-dimensional imaging unit and the optical axis of the depth sensor are parallel.

In some embodiments, the construction system further includes a beam splitter located between the stage and the two-dimensional imaging unit and between the stage and the depth sensor; the optical axis of the two-dimensional imaging unit and the optical axis of the depth sensor are made coaxial through the beam splitter.

In some embodiments, the construction system further includes a moving device coupled to the first processor. The two-dimensional imaging unit and the depth sensor are mounted on the moving device, and under the control of the first processor, the moving device moves the two-dimensional imaging unit along an initial imaging path and moves the depth sensor along a depth imaging path.

In some embodiments, the construction system further includes a moving device coupled to the first processor. The stage is mounted on the moving device, and under the control of the first processor, the moving device moves the stage according to an initial imaging path and a depth imaging path.

In some embodiments, a method for constructing three-dimensional point cloud information includes: capturing a plurality of two-dimensional images of an object under test; analyzing each two-dimensional image to obtain its edge information, where the edge information of each two-dimensional image includes the X information and Y information of a plurality of points; capturing a plurality of pieces of Z information for each point according to the edge information of each two-dimensional image; and mapping the X information, Y information, and pieces of Z information of each point to generate the three-dimensional point cloud information of the object under test.

In some embodiments, the construction method further includes establishing an initial imaging path for the two-dimensional images according to a specification drawing of the object under test, and the step of capturing the two-dimensional images of the object under test is performed based on the initial imaging path.

In some embodiments, the step of analyzing each two-dimensional image to obtain its edge information includes: binarizing the two-dimensional image to obtain an edge image; highlighting the edge image; connecting a plurality of points on the edge image to obtain the edge information; and performing anti-aliasing on the edge information.

In some embodiments, the step of capturing the pieces of Z information of each point according to the edge information of each two-dimensional image includes: calculating a depth imaging path according to the edge information; moving a depth sensor along the depth imaging path so that the optical axis of the depth sensor is aligned with the points in sequence; and, when the depth sensor is aligned with each point, sequentially detecting a plurality of film surfaces at different depths of the object under test through the depth sensor to obtain the Z information of each film surface.

In summary, according to any of the embodiments, the system or method for constructing three-dimensional point cloud information can obtain high-precision three-dimensional coordinates of an object under test (for example, contour dimensions and film thicknesses) and build the three-dimensional point cloud information of the object accordingly. The three-dimensional point cloud information can then be applied in high-precision processes for the object, such as laser cutting and automated optical inspection. In some embodiments, the system or method is particularly suitable for building high-precision three-dimensional point cloud information of an object under test having a multi-film-layer structure with transparent films.

Referring to FIG. 1, a system for constructing three-dimensional point cloud information (hereinafter, the construction system 10) is used to build the three-dimensional point cloud information of an object under test 20. In some embodiments, the object under test 20 may be an object with a transparent film layer, for example, but not limited to, a protective film, a lens, a glass panel, a touch panel, or a liquid crystal display (LCD) panel. In some embodiments, the object with a transparent film layer is a non-planar object with a multi-film-layer structure in which at least one layer is transparent. The non-planar object may be, for example, but not limited to, a large object such as a curved LCD panel or a free-form LCD panel. For instance, the non-planar object is a large curved panel whose long side is not less than 750 millimeters (mm), whose short side is not less than 150 mm, whose depth is not less than 120 mm, and whose radius of curvature is 2000 mm. In other words, the construction system 10 is particularly suitable for building the three-dimensional point cloud information of non-planar objects with transparent film layers.

The construction system 10 includes a stage 100, a two-dimensional imaging unit 110, a first processor 120, a depth sensor 130, and a second processor 140. The first processor 120 is coupled to the two-dimensional imaging unit 110 and the depth sensor 130, and the second processor 140 is coupled to the first processor 120.

The stage 100 carries the object under test 20 and is provided with a light-emitting unit 101. For example, during the construction process, the object under test 20 may be fixed on the surface of the stage 100 (hereinafter, the carrying surface) so that the three-dimensional coordinates of the object under test 20 can be acquired. The light-emitting unit 101 emits light to illuminate the object under test 20, so that the two-dimensional imaging unit 110 can capture two-dimensional images of it. In some embodiments, the stage 100 may hold the object under test 20 with a fixture (not shown) such as a clamp or a vacuum chuck, so that the object under test 20 sits stably on the stage 100 without shifting or shaking. In some embodiments, the light-emitting unit 101 may be an actively emitting light source device, for example, but not limited to, an incandescent light panel, an LED light panel, a fluorescent tube, or a light-emitting panel.

Referring to FIG. 1 and FIG. 2, the optical axis Ax1 of the two-dimensional imaging unit 110 and the optical axis Ax2 of the depth sensor 130 point toward the stage 100. In other words, the lens of the two-dimensional imaging unit 110 and the lens of the depth sensor 130 face the carrying surface of the stage 100.

Referring to FIG. 1 to FIG. 3, when the construction system 10 starts operating, the two-dimensional imaging unit 110 captures a plurality of two-dimensional images of the object under test 20 (step S110). Specifically, the two-dimensional imaging unit 110 successively photographs different regions of the object under test 20 to generate the corresponding two-dimensional images.

In some embodiments, the sensing light of the two-dimensional imaging unit 110 is a collimated beam. In some embodiments, the two-dimensional imaging unit 110 captures the two-dimensional images of the object under test 20 at a relatively large working distance WD, meaning a distance sufficient for the sensing light of the two-dimensional imaging unit 110 to approach collimation. The working distance WD is the distance between the front end of the lens of the two-dimensional imaging unit 110 (the side adjacent to the stage 100) and the upper surface of the object under test 20 (the surface facing away from the stage 100). In this way, the two-dimensional imaging unit 110 can obtain two-dimensional images with sharp, clear edges. In some embodiments, the working distance WD of the two-dimensional imaging unit 110 is not less than 65 millimeters (mm), for example, 65 mm to 305 mm, but is not limited thereto.

In some embodiments, referring to FIG. 1, FIG. 2, and FIG. 4, the two-dimensional imaging unit 110 includes a charge-coupled device (CCD) sensor 111 and a telecentric lens 112. The telecentric lens 112 is located between the CCD sensor 111 and the stage 100, and the CCD sensor 111 is coupled to the first processor 120.

In some embodiments, when the two-dimensional imaging unit 110 captures a two-dimensional image of the object under test 20, the telecentric lens 112 converts light reflected by the object under test 20 into a parallel optical path, and the CCD sensor 111 senses this parallel optical path to generate the corresponding two-dimensional image. In some embodiments, the optical axis Ax11 of the CCD sensor 111 and the optical axis Ax12 of the telecentric lens 112 are coaxial (forming the optical axis Ax1). Therefore, in high-precision measurement, the CCD sensor 111 captures the two-dimensional image of the object under test 20 through the telecentric lens 112, obtaining a high-resolution image with reduced error while avoiding distortion in the captured image.

Referring to FIG. 5 and taking a curved panel as an example, the long side Lth of the object under test 20 corresponds to the X axis of the three-dimensional coordinate system, the short side Wth corresponds to the Y axis, and the depth Dth corresponds to the Z axis. Referring to FIG. 1, FIG. 2, FIG. 5, FIG. 6, and FIG. 7, the two-dimensional imaging unit 110 faces the front-view direction of the object under test 20 (i.e., faces the XY plane; in other words, its optical axis Ax1 is substantially vertical), so that the field of view (FOV) Rf11~Rfst of the two-dimensional imaging unit 110 falls on the object under test 20, and the scene within the field of view Rf11~Rfst is captured to obtain a two-dimensional image of the object under test 20. In other words, the object image Obt presented in the two-dimensional image Img1 captured by the two-dimensional imaging unit 110 is the image of the region of the object under test 20 that falls within the field of view Rf1t of the two-dimensional imaging unit 110 (as shown in FIG. 7).

As shown in FIG. 6, in some embodiments, the fields of view Rf11~Rfst of the two-dimensional imaging unit 110 are much smaller than the object under test 20 (long side Lth × short side Wth).

In some embodiments, the two-dimensional imaging unit 110 continuously captures two-dimensional images of the object under test 20 all the way around its edge, with a fixed field of view Rf11/Rf12/…/Rf1t/Rf2t/…/Rf(s-1)t/Rfst, where s and t may be equal or distinct positive integers greater than 1. For example, when the two-dimensional imaging unit 110 captures images along the long side Lth of the object under test 20, the short sides of the fields of view of any two consecutive captures (Rf1t, Rf2t / Rf2t, Rf3t / … / Rf(s-1)t, Rfst) are stitched together; when it captures images along the short side Wth, the long sides of the fields of view of any two consecutive captures (Rf11, Rf12 / Rf12, Rf13 / … / Rf1(t-1), Rf1t) are stitched together. In other words, the two-dimensional images captured by the two-dimensional imaging unit 110 can be stitched into a composite image presenting the complete edge of the object under test 20.

In some embodiments, the fields of view of any two consecutive captures (Rf11, Rf12 / … / Rf(s-1)t, Rfst) are stitched with partial overlap. In other words, any two consecutively captured two-dimensional images share part of the same scene: of the two opposite sides of each two-dimensional image, the edge region on one side is identical to an edge region of the next captured image, and the edge region on the other side is identical to an edge region of the previously captured image.

After the two-dimensional images Img1 are obtained, the first processor 120 analyzes each two-dimensional image to obtain its edge information (step S120). Referring to FIG. 8, the edge information of each two-dimensional image Img1 includes the X information x1~xn and Y information y1~yn of a plurality of points P11~P1n, where n is a positive integer. The X information x1/x2/…/xn is the X coordinate, in the coordinate system of the object under test 20, of the position represented by the point P11/P12/…/P1n, and the Y information y1/y2/…/yn is the corresponding Y coordinate.

In some embodiments of step S120, referring to FIG. 1, FIG. 3, and FIG. 7, after the two-dimensional imaging unit 110 captures each two-dimensional image Img1 of the object under test 20, the first processor 120 analyzes that image to obtain its edge information Edi1. Specifically, the first processor 120 performs image processing on each two-dimensional image Img1 to find the image content presenting an edge (hereinafter, the edge image Edp), and then obtains, for every pixel forming the edge image Edp (i.e., the points P11~P1n), the two-dimensional coordinates of its position in the coordinate system of the object under test 20 (i.e., the X information x1~xn and the Y information y1~yn); these coordinates constitute the edge information Edi1.
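To make this concrete, the following is a minimal sketch (not part of the patent) of mapping edge-pixel indices into the object's coordinate system; the FOV origin and pixel pitch are assumed to be known from calibration, and all names are illustrative:

```python
import numpy as np

def edge_pixels_to_object_xy(edge_pixels, fov_origin_mm, pixel_pitch_mm):
    """Map edge-pixel indices (row, col) of one 2D image to (X, Y)
    coordinates in the coordinate system of the object under test.

    edge_pixels    : (n, 2) array of (row, col) indices on the edge image Edp
    fov_origin_mm  : (x0, y0), object-space position of the FOV's pixel (0, 0)
    pixel_pitch_mm : object-space size of one pixel; with telecentric optics
                     this stays constant over the working distance
    """
    rows = edge_pixels[:, 0].astype(float)
    cols = edge_pixels[:, 1].astype(float)
    x = fov_origin_mm[0] + cols * pixel_pitch_mm  # X information x1~xn
    y = fov_origin_mm[1] + rows * pixel_pitch_mm  # Y information y1~yn
    return np.stack([x, y], axis=1)
```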

In some embodiments, the first processor 120 may be a computing device with image processing functions, for example, but not limited to, a central processing unit (CPU), a graphics processing unit (GPU), or an image signal processor (ISP). The image processing functions include, but are not limited to, demosaicing, noise reduction, white balance, exposure correction, sharpening, color conversion, encoding, or any combination thereof.

Referring to FIG. 1, FIG. 3, FIG. 8, and FIG. 9, after step S120, the depth sensor 130 captures the Z information z10~z15 of each point P11/…/P1n according to the edge information of each two-dimensional image Img1 (step S130). Each piece of Z information z10/z11/…/z15 is the position, along the depth Dth of the object under test 20, of a film surface Sf0/Sf1/…/Sf5 of a film layer 21 at the corresponding point P11/…/P1n. In some embodiments, the depth sensor 130 is a chromatic confocal sensor, which can sense transparent materials and thus measure the distances (i.e., depth values) of the film layers of a multilayer transparent stack.

In some embodiments, when the object under test 20 has a structure with multiple film layers 21, the construction system 10 must obtain the thickness information of every film layer 21 before the three-dimensional point cloud information of the object under test 20 can be built. Therefore, the construction system 10 uses the chromatic confocal sensor (i.e., the depth sensor 130) to capture the Z information z10~z15 of all the film layers 21 at every point P11/…/P1n of the edge information Edi1 of each two-dimensional image Img1, thereby representing the thickness of each film layer 21 of the object under test 20.

Specifically, the optical axis Ax2 of the depth sensor 130 is aligned with the point P11/…/P1n to be measured, and the depth sensor then detects and captures the Z information z10/z11/…/z15 of each film surface Sf0/Sf1/…/Sf5 of the object under test 20. Each piece of Z information z10/z11/…/z15 corresponds to a film surface Sf0~Sf5 of a different film layer 21 at that point, so the difference between the Z information of two adjacent film surfaces (z10, z11 / z11, z12 / … / z14, z15) is the thickness of the corresponding film layer 21 at the point P11. In other words, the first processor 120 can obtain the thickness of a film layer 21 by computing the difference between two different pieces of Z information (two of z10~z15) at the same point P11/…/P1n.

For example, taking a curved liquid crystal panel as the object under test 20, in some embodiments the film layers 21 of the object under test 20 include a protective film (polyethylene terephthalate, PET) 201, an anti-scattering film (ASF) 202, an adhesive film (pressure-sensitive adhesive, PSA) 203, a glass layer 204, and a light-shielding layer (black matrix, BM) 205. The glass layer 204 is stacked on the light-shielding layer 205, the adhesive film 203 on the glass layer 204, the anti-scattering film 202 on the adhesive film 203, and the protective film 201 on the anti-scattering film 202. Among the Z information z10~z15 captured by the depth sensor 130 at the point P11, z10 corresponds to the surface Sf0 of the protective film 201, z11 to the surface Sf1 of the anti-scattering film 202, z12 to the surface Sf2 of the adhesive film 203, z13 to the surface Sf3 of the glass layer 204, and z14 to the surface Sf4 of the light-shielding layer 205. Therefore, the difference between z10 and z11 is the thickness of the protective film 201 at the point P11, the difference between z11 and z12 is the thickness of the anti-scattering film 202, the difference between z12 and z13 is the thickness of the adhesive film 203, the difference between z13 and z14 is the thickness of the glass layer 204, and the difference between z14 and z15 is the thickness of the light-shielding layer 205 at the point P11.
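To illustrate the arithmetic only (the Z values below are made up, not taken from the patent), the per-layer thicknesses are simply the differences of adjacent surface readings:

```python
import numpy as np

# Z information captured at one edge point (e.g., P11), top surface first,
# in millimeters; illustrative values only.
z = np.array([0.000, 0.050, 0.100, 0.125, 0.625, 0.628])  # z10..z15

layers = ["PET 201", "ASF 202", "PSA 203", "glass 204", "BM 205"]
thicknesses = np.diff(z)  # z11-z10, z12-z11, ..., z15-z14

for name, t in zip(layers, thicknesses):
    print(f"{name}: {t:.3f} mm")
```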

As shown in FIG. 2, in some embodiments, the two-dimensional imaging unit 110 and the depth sensor 130 may be arranged side by side; in other words, the optical axis Ax1 of the two-dimensional imaging unit 110 and the optical axis Ax2 of the depth sensor 130 are parallel. Here, both face the front-view direction of the object under test 20 (i.e., face the XY plane; in other words, their optical axes Ax1 and Ax2 are substantially parallel to the Z axis), and the optical axes Ax1 and Ax2 are separated by a fixed distance CD. The smaller the fixed distance CD, the higher the accuracy with which the depth sensor 130 senses the object under test 20; therefore, in some embodiments, the fixed distance CD is made as small as possible so that the depth sensor 130 can sense the depth Dth of the object under test 20 more precisely. In some embodiments of step S130, after the first processor 120 analyzes and obtains the X information (x1) and Y information (y1) of each point P11 in the edge information of the two-dimensional image Img1 (step S120), the first processor 120 computes the movement vector of the depth sensor 130 from a fixed position to the point P11 and, according to the movement vector, moves the depth sensor 130 until its optical axis Ax2 is aligned with the point P11. The depth sensor 130 then captures the Z information z10~z15 of all the points P11~P1n in the edge information of the two-dimensional image Img1.

Referring to FIG. 10, in some embodiments the two-dimensional imaging unit 110 and the depth sensor 130 may be made coaxial by a beam splitter 150; in other words, the construction system 10 may further include a beam splitter 150 located between the stage 100 and the two-dimensional imaging unit 110 and between the stage 100 and the depth sensor 130. The two-dimensional imaging unit 110 and the depth sensor 130 are oriented differently: one of them faces the front-view direction of the object under test 20 (i.e., faces the XY plane; in other words, its optical axis Ax1/Ax2 is substantially parallel to the Z axis), with the beam splitter 150 placed between it and the stage 100, while the other faces the front-view direction of the beam splitter 150 (i.e., faces the YZ plane; in other words, its optical axis Ax2/Ax1 is substantially parallel to the X axis). The beam splitter 150 turns the optical axis Ax2/Ax1 that is parallel to the X axis toward the front-view direction of the object under test 20 and overlaps it with the other optical axis Ax1/Ax2 into a single optical axis Ax3. Specifically, the optical axis Ax1 of the two-dimensional imaging unit 110 and the optical axis Ax3 are substantially parallel and coincident, and the optical axis Ax2 of the depth sensor 130 and the optical axis Ax3 are substantially perpendicular to each other.

After step S130, the first processor 120 transmits the X information, Y information, and pieces of Z information of every point on the edge information of all the two-dimensional images to the second processor 140, and the second processor 140 maps the X information, Y information, and pieces of Z information of each point to generate the three-dimensional point cloud information of the object under test 20 (step S140). The three-dimensional point cloud information of the object under test 20 is a data set of the three-dimensional (Cartesian) coordinates of every position (i.e., the points P11/…/P1n) on the contour of the object under test 20. In other words, when all the points P11~P1n of the three-dimensional point cloud information are connected together, they present the appearance of the object under test 20.

In some embodiments of step S140, the second processor 140 maps the X information, Y information, and pieces of Z information of every point P11/…/P1n on the edge information of all the two-dimensional images Img1 of the object under test 20 onto a mapping table, so as to record the three-dimensional coordinates of every such point and thereby build the three-dimensional point cloud information of the object under test 20.
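A minimal sketch of such a mapping step (the data layout is an assumption for illustration; the patent does not prescribe one):

```python
def build_point_cloud(edge_points):
    """Flatten each edge point's X, Y, and multiple Z values into a
    point-cloud table: one (x, y, z) row per film surface per point.

    edge_points: iterable of (x, y, zs) tuples, where zs lists the Z
    information of the film surfaces at that (x, y) position.
    """
    cloud = []
    for x, y, zs in edge_points:
        for z in zs:  # one 3D point per detected film surface
            cloud.append((x, y, z))
    return cloud

# Example: two edge points, each with three film surfaces.
points = [(0.0, 0.0, [0.00, 0.05, 0.10]),
          (0.1, 0.0, [0.00, 0.05, 0.10])]
print(build_point_cloud(points))
```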

In some embodiments, the first processor 120 is coupled to the second processor 140 by wired communication and transmits the X information, Y information, and pieces of Z information of every point on the edge information of all the two-dimensional images of the object under test 20 to the second processor 140 over the wired line, which may be, for example, but not limited to, a transmission line, a bus, or a network cable. In some embodiments, the second processor 140 may be a device with wired communication capability, for example, but not limited to, a laptop or desktop computer.

In some embodiments, the first processor 120 may instead be coupled to the second processor 140 by wireless communication and transmit the X information, Y information, and pieces of Z information of every point on the edge information of all the two-dimensional images of the object under test 20 to the second processor 140 wirelessly, for example, but not limited to, over Wi-Fi or Bluetooth. In some embodiments, the second processor 140 may be a device with wireless communication capability, for example, but not limited to, a cloud system, a server system, a laptop, or a desktop computer with wireless communication capability.

Referring to FIG. 1 and FIG. 3, in some embodiments, before step S110, the first processor 120 of the construction system 10 first establishes the initial imaging path for all the two-dimensional images according to the specification drawing of the object under test 20 (step S100), and then drives the two-dimensional imaging unit 110 to capture the two-dimensional images of the object under test 20 based on the established initial imaging path (step S110). In some embodiments of step S100, the first processor 120 can learn the dimensions of the object under test 20 in advance from its specification drawing and, according to those dimensions, establish an initial imaging path that moves along the edge of the object under test 20. Subsequently, in step S110, the two-dimensional imaging unit 110 moves along the initial imaging path to capture the two-dimensional images of the object under test 20, ensuring that the edge of the object under test 20 falls, segment by segment, within the field of view of the two-dimensional imaging unit 110.
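As a rough illustration of step S100 (the FOV size and overlap ratio are assumptions; the patent specifies neither), the capture positions along one edge could be derived from the panel dimensions like this:

```python
def imaging_path_along_edge(edge_length_mm, fov_mm, overlap=0.2):
    """Return FOV center positions along one edge of the panel so that
    consecutive fields of view overlap by the given ratio."""
    step = fov_mm * (1.0 - overlap)  # advance per capture
    centers, pos = [], fov_mm / 2.0
    while pos - fov_mm / 2.0 < edge_length_mm:
        # Clamp the last position so the FOV stays on the panel.
        centers.append(min(pos, edge_length_mm - fov_mm / 2.0))
        pos += step
    return centers

# Example: a 750 mm edge scanned with a 40 mm FOV and 20% overlap.
print(imaging_path_along_edge(750.0, 40.0))
```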

Referring to FIG. 1 to FIG. 3, FIG. 5, FIG. 6, and FIG. 11 to FIG. 14, in some embodiments of step S120, for each two-dimensional image Img1', the first processor 120 first binarizes the captured two-dimensional image Img1' to obtain its edge image Edp (step S121). In some embodiments, the first processor 120 first performs gray-level partitioning of the two-dimensional image Img1', converting every pixel to "0" or "1" (or to the minimum or maximum gray level) according to its gray-scale value and a gray-level threshold. In other words, the first processor 120 converts the two-dimensional image Img1' from a grayscale image into a binary image (as shown in FIG. 12): when a pixel's gray level is above the threshold, the first processor 120 sets the pixel to black, i.e., its value becomes 1 (or 255); when a pixel's gray level is below the threshold, the first processor 120 sets the pixel to white, i.e., its value becomes 0. In the binarized two-dimensional image Img1', the boundaries between adjacent pixels with different values constitute the edge image Edp.
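A minimal numpy sketch of this binarization and of reading off the edge pixels (the threshold value is illustrative):

```python
import numpy as np

def binarize(gray_img, threshold=128):
    """Step S121: pixels at or above the threshold become 1 (black in the
    patent's convention), pixels below become 0 (white)."""
    return (gray_img >= threshold).astype(np.uint8)

def edge_mask(binary_img):
    """Mark pixels where horizontally or vertically adjacent pixels differ;
    these adjacencies constitute the edge image Edp."""
    edges = np.zeros_like(binary_img)
    edges[:, :-1] |= binary_img[:, :-1] != binary_img[:, 1:]
    edges[:-1, :] |= binary_img[:-1, :] != binary_img[1:, :]
    return edges
```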

After step S121, the first processor 120 then highlights the edge image Edp of the two-dimensional image Img1' (step S122). In some embodiments, the first processor 120 groups pixels of the binarized two-dimensional image Img1' that share similar features such as color and texture into connected regions, thereby making the edge image Edp stand out (as shown in FIG. 13). Specifically, the first processor 120 groups the pixels converted to black into one connected region Rg1 and the pixels converted to white into another connected region Rg2, so that the boundary line between the two connected regions Rg1 and Rg2 (i.e., the edge image Edp) is highlighted.

After step S122, the first processor 120 connects every point on the edge image Edp to obtain the edge information (step S123). In some embodiments, once the edge image Edp has been highlighted, the first processor 120 connects all the points on the boundary between the connected region Rg1 of black pixels and the connected region Rg2 of white pixels (shown as gray pixels) to form the edge information Edi1' (as shown in FIG. 14).

After step S123, the first processor 120 performs anti-aliasing on the edge information Edi1' of the two-dimensional image Img1' (step S124). In some embodiments, because the pixels are arranged in a matrix, the edge information Edi1' formed by connecting a plurality of pixels presents a jagged line. The first processor 120 therefore smooths the line (i.e., the edge information Edi1') to eliminate all the jagged steps, producing the smoother, softened edge information Edi1 (as shown in FIG. 8).
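One way to realize such smoothing is a moving-average filter along the polyline (an assumption for illustration; the patent does not name a specific anti-aliasing algorithm):

```python
import numpy as np

def smooth_edge(points_xy, window=5):
    """Step S124 (illustrative): soften the jagged pixel-grid polyline by
    averaging each edge point with its neighbors along the line."""
    pts = np.asarray(points_xy, dtype=float)
    kernel = np.ones(window) / window
    # Pad with edge values so the endpoints keep their positions.
    padded = np.pad(pts, ((window // 2, window // 2), (0, 0)), mode="edge")
    return np.stack(
        [np.convolve(padded[:, i], kernel, mode="valid") for i in range(2)],
        axis=1,
    )
```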

Referring to FIG. 1 to FIG. 3, FIG. 8, FIG. 9, and FIG. 15, in some embodiments of step S130, for each two-dimensional image Img1, after obtaining its edge information Edi1, the first processor 120 calculates a depth imaging path according to the edge information Edi1 (step S131) and moves the depth sensor 130 along the depth imaging path so that the optical axis Ax2 of the depth sensor 130 is aligned in sequence with all the points P11~P1n on the edge information of the two-dimensional image Img1 (step S132). When the optical axis Ax2 of the depth sensor 130 is aligned with each point P11/P12/…/P1n, the depth sensor 130 sequentially detects the film surfaces Sf0~Sf5 at different depths of the object under test 20 to obtain the Z information z10/z11/…/z15 of every film surface Sf0/Sf1/…/Sf5 (step S133).

Referring to FIG. 1 to FIG. 3, FIG. 8, FIG. 15, and FIG. 16, in some embodiments, the depth imaging path comprises a movement vector Vt and the edge information Edi1. Specifically, whenever the two-dimensional imaging unit 110 captures a two-dimensional image Img1, the first processor 120 analyzes it to obtain its edge information Edi1 (step S120) and then, according to this edge information Edi1, calculates the movement vector Vt from a fixed position Da1 to the position of the first point P11 in the edge information Edi1 (hereinafter, the detection position Dp11) (step S131).

Here, the fixed position Da1 is the position on the XY plane of the optical axis Ax2 of the depth sensor 130 while the two-dimensional imaging unit 110 is capturing a two-dimensional image. The detection position Dp11 is the position on the object under test 20 of the first point P11 in the edge information Edi1, which is equivalent to the X information x1 and Y information y1 of that point. In other words, when the X and Y information are expressed in the three-dimensional coordinate system of the object under test 20, the detection position Dp11 is the X information x1 and Y information y1 of the first point P11 in the edge information Edi1. The depth sensor 130 then moves along the movement vector Vt so that its optical axis Ax2 moves from the fixed position Da1 to the first point P11 of the edge information Edi1 (step S132), and then measures the distances of all the film surfaces Sf0~Sf5 of the object under test 20 at the first point P11 (i.e., the Z information z10~z15) (step S133).
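A small sketch of the movement-vector computation in step S131 (the coordinate frames are assumed to be registered already; all values are illustrative):

```python
def movement_vector(fixed_pos, first_edge_point):
    """Vector Vt that moves the depth sensor's optical axis Ax2 from the
    fixed position Da1, where it sat during 2D capture, to the detection
    position Dp11 of the first edge point P11. Both positions are (x, y)
    in the coordinate system of the object under test."""
    return (first_edge_point[0] - fixed_pos[0],
            first_edge_point[1] - fixed_pos[1])

# Example: Da1 at (12.0, 8.0) mm; P11 at (x1, y1) = (10.5, 6.2) mm.
print(movement_vector((12.0, 8.0), (10.5, 6.2)))  # (-1.5, -1.8)
```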

After obtaining the Z information z10~z15 of the first point P11, the depth sensor 130 moves along the edge information Edi1, shifting its optical axis Ax2 from the detection position Dp11 of the current point P11 to the detection position Dp12 of the next point P12 in the edge information Edi1 (step S132) and obtaining the Z information of that point (step S133), and so on, until it reaches the detection position Dp1n of the last point P1n in the edge information Edi1 (step S132) and obtains the Z information of the point P1n (step S133). When the two-dimensional imaging unit 110 captures the next two-dimensional image, the same flow from step S120 through step S133 is applied to that image to obtain the Z information of all the points of its edge information, and so on for all the two-dimensional images of the object under test 20. A sketch of this per-image traversal follows.
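In the sketch below, move_to and read_surfaces stand in for the actuator command and the chromatic-confocal readout, which the patent does not expose as an API:

```python
def acquire_z_for_image(edge_points_xy, fixed_pos, move_to, read_surfaces):
    """Visit every edge point of one 2D image in order (Dp11, Dp12, ...,
    Dp1n) and collect the Z information of all film surfaces there.

    move_to(dx, dy): hypothetical relative-move command for the sensor
    read_surfaces(): hypothetical readout returning [z0, z1, ...] for the
                     film surfaces under the optical axis
    """
    results = []
    current = fixed_pos
    for x, y in edge_points_xy:
        move_to(x - current[0], y - current[1])
        current = (x, y)
        results.append((x, y, read_surfaces()))
    return results
```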

That is, during the construction process, the depth sensor 130 moves with the two-dimensional imaging unit 110, and its optical axis Ax2, either made coaxial with the optical axis Ax1 of the two-dimensional imaging unit 110 or pointing at the object under test 20 in parallel at the fixed distance CD, is therefore fixed at a certain point within the field of view Rf11/…/Rfst of the two-dimensional imaging unit 110 (i.e., the fixed position Da1). Although all the points of the edge information Edi1 fall within the field of view Rf11/…/Rfst of the two-dimensional imaging unit 110, they are not all aligned with the optical axis Ax1 of the two-dimensional imaging unit 110.

For example, referring further to FIG. 17 to FIG. 20, assume the two-dimensional imaging unit 110 and the depth sensor 130 point coaxially at the stage 100. The two-dimensional imaging unit 110 captures a two-dimensional image Img2 (as shown in FIG. 18) at one capture position along the edge Edo of the object under test 20, with the field of view Rf1(t-1) (as shown in FIG. 6 and FIG. 17). At this moment, the fixed position Da2 at which the optical axis Ax2 of the depth sensor 130 falls within the field of view Rf1(t-1) is separated by a distance Dis1 from the detection position Dp21 of the first point P21 in the edge information Edi2 of the two-dimensional image Img2. After the first processor 120 calculates the depth imaging path corresponding to the edge information Edi2, the depth sensor 130 follows that path to the detection position Dp21 of the point P21 and obtains the Z information of the point P21, and then moves, one by one, to the detection positions of the other points in the edge information Edi2 and obtains their Z information.

In some embodiments, because the two-dimensional imaging unit 110 and the depth sensor 130 share a coaxial optical architecture, the distance the depth sensor 130 has to move is very short (on the order of millimeters). The two-dimensional imaging unit 110 and the depth sensor 130 can therefore acquire the X information, Y information, and pieces of Z information of the points on the edge information of a two-dimensional image almost simultaneously, shortening the construction time of the three-dimensional point cloud information.

Subsequently, the two-dimensional imaging unit 110 moves along the initial imaging path to the next capture position on the edge Edo of the object under test 20, i.e., to the position of the field of view Rf1t. Meanwhile, the depth sensor 130 moves with the two-dimensional imaging unit 110 so that its optical axis Ax2 moves to the fixed position Da3 within the field of view Rf1t (as shown in FIG. 17). After the two-dimensional imaging unit 110 captures the two-dimensional image Img3 with the field of view Rf1t (as shown in FIG. 19), the first processor 120 calculates the depth imaging path corresponding to the edge information Edi3 of the two-dimensional image Img3, so that the depth sensor 130 follows this path to the detection position Dp31 of the first point P31 in the edge information Edi3, obtains the Z information z10~z15 of the point P31 (as shown in FIG. 21), and then moves, one by one, to the detection positions of the other points in the edge information Edi3 and obtains their Z information.

Likewise, after obtaining the Z information of all the points in the edge information Edi3, the two-dimensional imaging unit 110 moves along the initial imaging path to the next capture position on the edge Edo of the object under test 20, i.e., to the position of the field of view Rf2t. Meanwhile, the depth sensor 130 moves with the two-dimensional imaging unit 110 so that its optical axis Ax2 moves to the fixed position Da4 within the field of view Rf2t (as shown in FIG. 17). After the two-dimensional imaging unit 110 captures the two-dimensional image Img4 with the field of view Rf2t (as shown in FIG. 20), the first processor 120 obtains the edge information Edi4 of the two-dimensional image Img4 and calculates the corresponding depth imaging path, so that the depth sensor 130 follows this path to the detection position Dp41 of the first point P41 in the edge information Edi4, obtains the Z information z10~z15 of the point P41 (as shown in FIG. 21), and then moves, one by one, to the detection positions of the other points in the edge information Edi4 and obtains their Z information.

In FIG. 21, the box Rs1 is a side view of the object under test 20 within the field of view Rf1t, taken along a section line through the point P31, and the box Rs2 is a side view of the object under test 20 within the field of view Rf2t, taken along a section line through the point P41.

Referring to FIG. 1 and FIG. 22, in some embodiments, the construction system 10 may further include a moving device 160 coupled to and controlled by the first processor 120. The two-dimensional imaging unit 110 and the depth sensor 130 are mounted on the moving device 160. In step S110, the moving device 160 moves the two-dimensional imaging unit 110 two-dimensionally on the XY plane of the three-dimensional coordinate system according to the initial imaging path, so that the field of view of the two-dimensional imaging unit 110 travels along the edge of the object under test 20 and captures its two-dimensional images. In step S132, the moving device 160 moves the depth sensor 130 along the edge Edo of the object under test 20 according to the depth imaging path, so that the optical axis Ax2 of the depth sensor 130 travels along the edge information of each two-dimensional image and captures all the Z information of every point on the edge information. In some embodiments, the moving device 160 may be a device with a two-dimensional movement mechanism, for example, but not limited to, a robot arm.

Referring to FIG. 1 and FIG. 23, in some embodiments, the construction system 10 further includes a moving device 161 coupled to and controlled by the first processor 120. The stage 100 is mounted on the moving device 161. In step S110, the moving device 161 moves the stage 100 two-dimensionally on the XY plane of the three-dimensional coordinate system relative to the field of view of the two-dimensional imaging unit 110 according to the initial imaging path, so that the object under test 20 moves relative to the field of view and the field of view is displaced, relatively, along the edge Edo of the object under test 20. In step S132, the moving device 161 moves the stage 100 two-dimensionally on the XY plane relative to the optical axis Ax2 of the depth sensor 130 according to the depth imaging path, so that the object under test 20 moves relative to the optical axis Ax2 and the optical axis Ax2 is displaced, relatively, along the edge Edo of the object under test 20. In some embodiments, the moving device 161 may be a moving platform with a two-dimensional movement mechanism, for example, but not limited to, slide rails, gears, rollers, balls, a motor, or a conveyor belt.

In summary, according to any of the embodiments, the establishment system or establishment method for three-dimensional point cloud information is suited to obtaining high-precision three-dimensional coordinates of the object under test 20 (for example, its contour dimensions and film thicknesses) and building the three-dimensional point cloud information of the object under test 20 from them. The three-dimensional point cloud information can then be supplied to high-precision processes applied to the object under test 20, such as laser cutting and automated optical inspection (AOI). In some embodiments, the establishment system or establishment method is particularly suited to building high-precision three-dimensional point cloud information for an object under test 20 having a multi-film-layer structure that includes transparent films.
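The mapping step itself reduces to pairing each edge point's X and Y information with the several pieces of Z information measured at that point (one per film surface, e.g. Sf0 to Sf5). A minimal sketch, assuming the measurements are already gathered into arrays; the array layout and names are illustrative, not prescribed by the patent:

```python
import numpy as np

# Minimal sketch of the mapping step: each (x, y) edge point carries S depth
# readings (one per film surface), so the cloud holds one 3D point per film
# surface per edge point.

def build_point_cloud(xy_points, z_per_point):
    """xy_points: (N, 2) array; z_per_point: (N, S) array of S surfaces."""
    xy = np.asarray(xy_points, dtype=float)
    zs = np.asarray(z_per_point, dtype=float)
    n, s = zs.shape
    xyz = np.column_stack([
        np.repeat(xy, s, axis=0),   # each (x, y) repeated once per surface
        zs.reshape(-1, 1),          # matching z for each film surface
    ])
    return xyz                      # shape: (N * S, 3)

# e.g. 4 edge points with 6 film surfaces each -> 24 cloud points
cloud = build_point_cloud(np.zeros((4, 2)), np.zeros((4, 6)))
```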

Although the present disclosure has been described above by way of embodiments, the embodiments are not intended to limit the invention. Anyone with ordinary knowledge in the relevant technical field may make minor modifications and variations without departing from the spirit and scope of this disclosure, and such modifications and variations remain within the scope of the claims of this application.

10: Establishment system
100: Stage
101: Light-emitting unit
110: Two-dimensional imaging unit
111: Charge-coupled device (CCD) sensor
112: Telecentric lens
120: First processor
130: Depth sensor
140: Second processor
150: Beam splitter
160, 161: Moving device
20: Object under test
21: Film layer
201: Protective film
202: Explosion-proof film
203: Adhesive film
204: Glass layer
205: Light-shielding layer
Ax1, Ax2, Ax3: Optical axes
Ax11, Ax12: Optical axes
CD: Fixed distance
Da1~Da4: Fixed-point positions
Dis1: Distance
Dp11, Dp12, Dp21, Dp31, Dp41, Dp1n: Detection positions
Dth: Depth
Edi1~Edi4: Edge information
Edi1’: Edge information
Edo: Edge
Edp: Edge image
Img1~Img4: Two-dimensional images
Img1’: Two-dimensional image
Lth: Long side
Obt: Object image
P11~P1n: Points
P21, P31, P41: Points
Rf11~Rf1t: Visible ranges
Rf2t~Rfst: Visible ranges
Rg1, Rg2: Connected regions
Rs1, Rs2: Frames
S100~S140: Steps
S121~S124: Steps
S131~S133: Steps
Sf0~Sf5: Film surfaces
Vt: Motion vector
WD: Working distance
Wth: Short side
x1~xn: X information
y1~yn: Y information
z10~z15: Z information

FIG. 1 is a functional block diagram of a system for establishing three-dimensional point cloud information according to some embodiments.
FIG. 2 is a schematic diagram of a first embodiment of the two-dimensional imaging unit and the depth sensor in FIG. 1.
FIG. 3 is a flowchart of an establishment method according to some embodiments.
FIG. 4 is a schematic diagram of an embodiment of the two-dimensional imaging unit in FIG. 2.
FIG. 5 is a perspective schematic view of an exemplary aspect of the object under test in FIG. 1.
FIG. 6 is a schematic front view of the object under test in FIG. 5.
FIG. 7 is a schematic diagram of an exemplary two-dimensional image captured within one visible range in FIG. 6.
FIG. 8 is a schematic diagram of an exemplary aspect of the edge information of the two-dimensional image in FIG. 7.
FIG. 9 is a schematic cross-sectional view of the object under test corresponding to the two-dimensional image in FIG. 7.
FIG. 10 is a schematic diagram of a second embodiment of the two-dimensional imaging unit and the depth sensor in FIG. 1.
FIG. 11 is a detailed flowchart of an embodiment of step S120 in FIG. 3.
FIG. 12 is a schematic diagram of an exemplary aspect of the two-dimensional image in FIG. 7 after processing by step S121 in FIG. 11.
FIG. 13 is a schematic diagram of an exemplary aspect of the two-dimensional image in FIG. 12 after processing by step S122 in FIG. 11.
FIG. 14 is a schematic diagram of an exemplary aspect of the two-dimensional image in FIG. 13 after processing by step S123 in FIG. 11.
FIG. 15 is a detailed flowchart of an embodiment of step S130 in FIG. 3.
FIG. 16 is a schematic diagram of an exemplary aspect of the motion vector in step S131 of FIG. 15.
FIG. 17 is a schematic diagram of an exemplary aspect of the fixed-point positions in step S131 of FIG. 15.
FIG. 18 is a schematic diagram of the two-dimensional image corresponding to the visible range on the left in FIG. 17.
FIG. 19 is a schematic diagram of the two-dimensional image corresponding to the visible range in the middle in FIG. 17.
FIG. 20 is a schematic diagram of the two-dimensional image corresponding to the visible range on the right in FIG. 17.
FIG. 21 is a schematic cross-sectional view of the object under test within the middle and right visible ranges in FIG. 17.
FIG. 22 is a schematic diagram of a third embodiment of the two-dimensional imaging unit and the depth sensor in FIG. 1.
FIG. 23 is a schematic diagram of a fourth embodiment of the two-dimensional imaging unit and the depth sensor in FIG. 1.

S110~S140: Steps

Claims (10)

1. An establishment system for three-dimensional point cloud information, comprising: a stage for carrying an object under test, wherein the stage is provided with a light-emitting unit, and the light-emitting unit is configured to emit light to illuminate the object under test; a two-dimensional imaging unit configured to capture a plurality of two-dimensional images of the object under test, wherein an optical axis of the two-dimensional imaging unit points toward the stage; a first processor, coupled to the two-dimensional imaging unit, configured to analyze each of the two-dimensional images to obtain edge information of each of the two-dimensional images, wherein the edge information of each of the two-dimensional images comprises X information and Y information of a plurality of points; a depth sensor, coupled to the first processor, configured to capture a plurality of pieces of Z information of each of the points according to the edge information of each of the two-dimensional images, wherein an optical axis of the depth sensor points toward the stage; a second processor, coupled to the first processor, configured to map the X information, the Y information, and the pieces of Z information of each of the points to generate three-dimensional point cloud information of the object under test; and a beam splitter located between the stage and the two-dimensional imaging unit and between the stage and the depth sensor, wherein the optical axis of the two-dimensional imaging unit and the optical axis of the depth sensor are made coaxial through the beam splitter.
2. The establishment system of claim 1, wherein the two-dimensional imaging unit comprises a charge-coupled device (CCD) sensor and a telecentric lens, an optical axis of the CCD sensor and an optical axis of the telecentric lens are coaxial, and the CCD sensor is coupled to the first processor.
3. The establishment system of claim 1, wherein a working distance of the two-dimensional imaging unit is not less than 65 mm.
4. The establishment system of claim 1, wherein the depth sensor is a chromatic confocal sensor.
5. The establishment system of claim 1, wherein the optical axis of the two-dimensional imaging unit and the optical axis of the depth sensor are parallel.
6. The establishment system of claim 1, further comprising: a moving device coupled to the first processor, wherein the two-dimensional imaging unit and the depth sensor are disposed on the moving device, and, under control of the first processor, the moving device is configured to move the two-dimensional imaging unit according to an initial imaging path and to move the depth sensor according to a depth imaging path.
7. The establishment system of claim 1, further comprising: a moving device coupled to the first processor, wherein the stage is disposed on the moving device, and, under control of the first processor, the moving device is configured to move the stage according to an initial imaging path and a depth imaging path.
8. An establishment method for three-dimensional point cloud information, comprising: capturing a plurality of two-dimensional images of an object under test; analyzing each of the two-dimensional images to obtain edge information of each of the two-dimensional images, wherein the edge information of each of the two-dimensional images comprises X information and Y information of a plurality of points; capturing a plurality of pieces of Z information of each of the points according to the edge information of each of the two-dimensional images; and mapping the X information, the Y information, and the pieces of Z information of each of the points to generate three-dimensional point cloud information of the object under test; wherein capturing the pieces of Z information of each of the points according to the edge information of each of the two-dimensional images comprises: calculating a depth imaging path according to the edge information of each of the two-dimensional images; moving a depth sensor along the depth imaging path so that an optical axis of the depth sensor is aligned with the points in sequence; and, when the depth sensor is aligned with each of the points, sequentially detecting, via the depth sensor, a plurality of film surfaces of the object under test at different depths to obtain the Z information of each of the film surfaces.
9. The establishment method of claim 8, further comprising: establishing an initial imaging path for the two-dimensional images according to a specification drawing of the object under test, wherein the step of capturing the two-dimensional images of the object under test is performed based on the initial imaging path.
10. The establishment method of claim 8, wherein analyzing each of the two-dimensional images to obtain the edge information of each of the two-dimensional images comprises: binarizing each of the two-dimensional images to obtain an edge image of each of the two-dimensional images; highlighting the edge image of each of the two-dimensional images; connecting the points on the edge image of each of the two-dimensional images to obtain the edge information; and performing anti-aliasing on the edge information of each of the two-dimensional images.
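For illustration, the edge-analysis chain recited in claim 10 (binarize, obtain and highlight an edge image, connect the points, anti-alias) maps onto standard image-processing primitives. A minimal sketch using OpenCV; the threshold value, Canny parameters, and smoothing window are illustrative choices, not taken from the patent:

```python
import cv2
import numpy as np

# Minimal sketch of an edge-analysis chain in the spirit of claim 10, using
# off-the-shelf OpenCV primitives on a single grayscale image.

def extract_edge_info(image_gray, thresh=128):
    # Binarize the 2D image (with backlighting, the object appears as a
    # dark silhouette on a bright background).
    _, binary = cv2.threshold(image_gray, thresh, 255, cv2.THRESH_BINARY)

    # Obtain and highlight the edge image.
    edges = cv2.Canny(binary, 50, 150)

    # Connect the edge pixels into an ordered contour ("edge information").
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)

    # Anti-alias / smooth the contour with a small moving average.
    kernel = np.ones(5) / 5.0
    xs = np.convolve(contour[:, 0], kernel, mode="same")
    ys = np.convolve(contour[:, 1], kernel, mode="same")
    return np.column_stack([xs, ys])   # per-point X and Y information
```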
TW112112961A 2023-04-06 Establishment system for 3d point cloud information and method thereof TWI842465B (en)

Publications (1)

Publication Number Publication Date
TWI842465B (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220005154A1 (en) 2019-07-30 2022-01-06 SZ DJI Technology Co., Ltd. Method and device for processing point cloud

Similar Documents

Publication Publication Date Title
CN109341527B (en) Automatic shadow compensation structured light projection three-dimensional measurement system and method
US7956842B2 (en) Pointing input system and method using one or more array sensors
US9432655B2 (en) Three-dimensional scanner based on contours from shadow images
Song et al. An accurate and robust strip-edge-based structured light means for shiny surface micromeasurement in 3-D
US20150261899A1 (en) Robot simulation system which simulates takeout process of workpieces
US20170176178A1 (en) Measurement system, measurement method, robot control method, robot, robot system, and picking apparatus
JP2002543411A (en) Optical detection method of object shape
US20100188355A1 (en) Apparatus and method for detecting an object pointed by a user
JP2021527220A (en) Methods and equipment for identifying points on complex surfaces in space
JP4566769B2 (en) Defect information detection device
Mahdy et al. Projector calibration using passive stereo and triangulation
US20190325593A1 (en) Image processing apparatus, system, method of manufacturing article, image processing method, and non-transitory computer-readable storage medium
JP2009058503A (en) Method and system for noncontact coordinate measurement on object surface
JP6441581B2 (en) Light detection for bending motion of flexible display
TWI842465B (en) Establishment system for 3d point cloud information and method thereof
JP2786070B2 (en) Inspection method and apparatus for transparent plate
JPH071164B2 (en) 3D shape recognition device
JP2002032784A (en) Device and method for operating virtual object
TWI229186B (en) Dual-view-angle 3D figure image line-scan inspection device
KR102586407B1 (en) Pixel compensating apparatus and display system having the same
JP2001194128A (en) Method for estimating configuration of three-dimensional surface shape using stereoscopic camera with focal light source
JP2974223B2 (en) Optical shape recognition device
US20130307845A1 (en) Image processing apparatus and image processing method
Varga et al. Experimental vision system setup based on the serial configuration interface
WO2022038814A1 (en) Corneal curvature radius calculation device, line-of-sight detection device, corneal curvature radius calculation method, and corneal curvature radius calculation program