TW201910950A - Robot processing method and system based on 3d image - Google Patents

Robot processing method and system based on 3D image

Info

Publication number
TW201910950A
Authority
TW
Taiwan
Prior art keywords
robot arm
processing
workpiece
dimensional model
model information
Prior art date
Application number
TW106127687A
Other languages
Chinese (zh)
Other versions
TWI650626B (en)
Inventor
黃孝維
林伯聰
鄒嘉駿
Original Assignee
由田新技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 由田新技股份有限公司
Priority to TW106127687A (TWI650626B)
Priority to CN201711420259.1A (CN109397282B)
Priority to US15/942,571 (US10723020B2)
Application granted
Publication of TWI650626B
Publication of TW201910950A

Classifications

    • B25J 9/16: Programme controls for programme-controlled manipulators (B25J 9/00)
    • B25J 9/1628: Characterised by the control loop
    • B25J 9/163: Learning, adaptive, model based, rule based expert control
    • B25J 9/1656: Programming, planning systems for manipulators
    • B25J 9/1664: Motion, path, trajectory planning
    • B25J 9/1679: Characterised by the tasks executed
    • B25J 9/1694: Use of sensors other than normal servo-feedback (perception control, multi-sensor controlled systems, sensor fusion)
    • B25J 9/1697: Vision controlled systems
    • B25J 11/0075: Manipulators for painting or coating
    • B29C 65/48: Joining of preformed parts using adhesives
    • B29C 65/52: Characterised by the way of applying the adhesive
    • B29C 66/863: Robotised joining, e.g. mounted on a robot arm
    • B29C 66/90: Measuring or controlling the joining process
    • A63B 53/0466: Golf club heads, wood-type
    • F16B 11/006: Connecting constructional elements or machine parts by gluing
    • G05B 2219/39391: Visual servoing, track end effector with camera image feedback
    • G05B 2219/40323: Modeling robot environment for sensor based robot system
    • Y10S 901/02: Arm motion controller
    • Y10S 901/09: Closed loop, sensor feedback controls arm movement
    • Y10S 901/30: End effector
    • Y10S 901/41: Tool
    • Y10S 901/46: Sensing device
    • Y10S 901/47: Optical

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)

Abstract

A robot processing method and system based on 3D images are provided. The processing method includes the following steps: providing 3D model data of the robot and of the processing environment; obtaining 3D model data of a workpiece and generating, according to that data, a processing path formed by multiple contact points, wherein a free end of the robot moves along the processing path to complete the processing; generating a posture candidate group according to the relationship between the free end of the robot and any corresponding one of the contact points; selecting an actual moving posture from the posture candidate group; moving the free end of the robot to the corresponding contact point according to the selected moving posture; and moving the free end of the robot along the processing path according to the actual moving postures to complete the processing.

Description

Robot arm processing method and system based on three-dimensional image

The present invention relates to a processing method and system, and more particularly to a robot arm processing method and system based on three-dimensional images.

Many processing steps in the contract-manufacturing industry are single, repetitive operations. Manual labor is gradually being replaced by machinery; using machines to perform processing helps increase product output and reduce labor expenditure. In addition, controlling the machining path through a program reduces the uncertainty inherent in manual machining, so machine processing has become the preferred choice in the production of many kinds of products.

In current machine processing, the machining path of equipment such as dispensing machines is usually designed by an engineer. If the machining path is simple, such as a straight-line movement or a rotation through a single angle, programming it is straightforward. Complex machining paths, however, such as irregular arc movements or irregular pattern movements, are difficult to program.

In addition, a processing machine typically performs machining by controlling a robot arm to move the workpiece to the position to be processed. Manufacturing tolerances mean that each individual workpiece of the same model may differ slightly, and the gripping of the workpiece by the robot arm may introduce deviations in the workpiece's distance or angle relative to the arm; both affect the machining result.

The present invention provides a robot arm processing method and system based on three-dimensional images that can improve assembly yield.

The present invention provides a robot arm processing method based on three-dimensional images, which uses a robot arm to perform a machining process on at least one workpiece in a processing environment. The method includes at least the following steps: providing three-dimensional model information of the robot arm and of the processing environment; obtaining three-dimensional model information of the workpiece, and generating from that information a machining path formed by a plurality of contact points, wherein the free end of the robot arm moves along the machining path to complete the process; generating a candidate group of moving postures of the robot arm according to the relationship between the free end of the arm and any corresponding contact point; selecting an actual moving posture from the candidate group; moving the free end of the arm to the corresponding contact point according to the actual moving posture; and, according to the plurality of actual moving postures, moving the free end along the machining path to complete the process.
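The claimed sequence of steps can be illustrated with a minimal, hypothetical sketch. Every function and data structure below is an assumption made for illustration only (the patent names no APIs), with posture selection reduced to a smallest-joint-offset rule:

```python
# Illustrative sketch of the claimed steps; all names are assumptions.

def plan_path(workpiece_model):
    # S120: a path formed by contact points derived from the workpiece model
    return workpiece_model["contact_points"]

def enumerate_postures(point):
    # S130: candidate joint configurations that would reach `point` (stubbed:
    # same point, three alternative wrist angles)
    return [tuple(point) + (angle,) for angle in (0, 30, 60)]

def choose(candidates, current_joints, env_model):
    # S140: drop candidates that interfere with the environment, then keep
    # the one with the smallest total joint-angle offset
    free = [c for c in candidates if c not in env_model["blocked"]]
    return min(free, key=lambda c: sum(abs(a - b)
                                       for a, b in zip(c, current_joints)))

def run_processing(arm_model, env_model, workpiece_model):
    # S150-S160: visit every contact point with the selected postures
    joints = arm_model["home"]
    executed = []
    for point in plan_path(workpiece_model):
        joints = choose(enumerate_postures(point), joints, env_model)
        executed.append(joints)
    return executed
```

The collision check and posture enumeration are placeholders; the text leaves their concrete implementation to the processing module.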

The present invention further provides a robot arm processing system based on three-dimensional images, comprising: a robot arm that performs a machining process on at least one workpiece in a processing environment; a database storing three-dimensional model information of the workpiece, of the robot arm, and of the processing environment; and a processing module, coupled between the robot arm and the database, that controls the robot arm to execute the machining process. The processing module generates, from the workpiece's three-dimensional model information, a machining path formed by a plurality of contact points, and controls the free end of the robot arm to move along that path to complete the process. The processing module generates a candidate group of moving postures of the robot arm according to the relationship between the free end of the arm and any corresponding contact point, selects an actual moving posture from the candidate group, controls the free end to move to the corresponding contact point according to that posture, and, according to the plurality of actual moving postures, moves the free end along the machining path to complete the process.

In summary, the robot arm processing method and system based on three-dimensional images provided by the present invention can run as an automated process, which not only saves manpower but also reduces human error and thereby improves assembly yield.

To make the above features and advantages of the present invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.

The present invention provides a robot arm processing method and system based on three-dimensional images. Information about the real-world processing environment and processing equipment is entered into a database to build a virtual 3D space, and the virtual 3D space is calibrated against the real world. After the 3D information of the workpiece to be processed is acquired in the real world, it is entered into the database and computed; combined with the processing equipment and processing environment in the virtual 3D space, a machining path is generated, so that the real-world robot arm can carry out the machining process in the real world according to the path computed in the virtual 3D space.

[First Embodiment]

FIG. 1A is a schematic view of the robot arm processing system based on three-dimensional images of the first embodiment, FIG. 1B is a schematic view of the machining path formed by the contact points of the first embodiment, and FIG. 2 is a flow chart of the robot arm processing method based on three-dimensional images of the first embodiment. Referring to FIG. 1A, FIG. 1B and FIG. 2 together, the robot arm processing system 100 based on three-dimensional images includes a processing module 110, a robot arm 120, and a database 130. The robot arm 120 performs a machining process on at least one workpiece 190 in the processing environment; the database 130 stores the three-dimensional model information of the workpiece 190, of the robot arm 120, and of the processing environment; and the processing module 110 is electrically coupled between the robot arm 120 and the database 130 to control the robot arm 120 to execute the machining process.

The processing module 110 and the database 130 may be built into the same electronic device (for example, a computer host), or they may be two independent units; for example, the database 130 may be a portable hard disk electrically connected to the processing module 110 through a medium.

When the workpiece 190 is processed using the foregoing processing system, at least the following steps S110 to S160 are performed.

In step S110, three-dimensional model information of the robot arm 120 and of the processing environment is provided.

In step S120, three-dimensional model information of the workpiece 190 is obtained. The processing module 110 may obtain preset three-dimensional model information of the workpiece 190 from the database 130, or the information may be produced by a non-contact detecting device 140 that detects the contour and dimensions of the workpiece 190. The workpiece 190 may be the housing of any device (electronic or otherwise), the shell of a golf club head, and so on; any article that needs processing can serve as the workpiece in this processing system or method, and the examples given in this embodiment are not limiting. In this embodiment, the workpiece 190 is described as the housing of an electronic device; the housing is rectangular and the surface to be processed is flat. The non-contact detecting device 140 may detect the workpiece 190 by capturing its image with a depth camera, by scanning its contour with a 3D laser scanner, or by a combination of the two. Of course, the way the non-contact detecting device 140 detects the workpiece 190 is not limited to these examples; those skilled in the art can select an appropriate method as required.
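As a purely illustrative sketch of how a depth camera could supply such workpiece model information, the standard pinhole back-projection is shown below; the function name and the intrinsic parameters are assumptions, not part of the patent:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud using
    pinhole camera intrinsics (fx, fy: focal lengths; cx, cy: principal
    point). One common way a depth camera yields a workpiece point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid zero-depth pixels
```

The resulting point cloud would then be fitted or registered to yield the workpiece's three-dimensional model information.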

Incidentally, between step S110 and step S120, a step S112 may further be included, in which the processing module 110 corrects the error, in the real-world coordinate system, between the three-dimensional model information of the robot arm 120 and that of the processing environment.

The foregoing correction means that the processing module 110 selects at least one calibration point coordinate position from the three-dimensional model information of the robot arm 120 and of the processing environment, moves the free end of the robot arm 120 to the corresponding coordinate position in the real-world coordinate system according to that information, and compares the calibration point position information with the corresponding coordinate position.

Please refer to FIG. 1C; the left side shows the robot arm in the real world, and the right side shows the robot arm in the virtual 3D space. In detail, at least one reference point S is selected on the real-world robot arm and its three-dimensional information is obtained; a comparison point C is taken at the position on the virtual arm corresponding to the reference point S, and the three-dimensional model information of the comparison point C is obtained. The three-dimensional model information of the comparison point C is then compared with the three-dimensional information of the reference point S to obtain a conversion coefficient between the real world and the virtual 3D space. Through this conversion coefficient, three-dimensional model information can be converted into real-world three-dimensional information, so that components in the virtual 3D space match their real-world counterparts; conversely, real-world three-dimensional information can also be converted into three-dimensional model information in the virtual 3D space.
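The text does not specify how the conversion coefficient is computed. One conventional way to estimate a rigid transform between corresponding reference points S and comparison points C is the Kabsch algorithm, sketched below as an assumption rather than as the patent's own procedure:

```python
import numpy as np

def estimate_transform(virtual_pts, real_pts):
    """Estimate rotation R and translation t mapping virtual-model
    coordinates onto real-world coordinates (Kabsch algorithm).
    A standard stand-in for the unspecified conversion coefficient."""
    V = np.asarray(virtual_pts, float)
    W = np.asarray(real_pts, float)
    cv, cw = V.mean(axis=0), W.mean(axis=0)      # centroids
    H = (V - cv).T @ (W - cw)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ cv
    return R, t

def to_real(R, t, p):
    """Apply the estimated transform to a virtual-model point."""
    return R @ np.asarray(p, float) + t
```

With three or more non-collinear point pairs the transform is fully determined, which is consistent with the multiple reference points S1 to S4 used in the calibration described next.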

In particular, the machining posture and movement path of the virtual robot arm are further calibrated against the real-world robot arm. In detail, the real-world robot arm is operated so that its free end moves arbitrarily to four reference points S1, S2, S3 and S4 in the real world; these four points are mapped into the virtual 3D space as comparison points C1, C2, C3 and C4, whose three-dimensional model information is obtained. While the free end of the real-world arm moves, its moving posture and path are recorded. Then, following the order in which the real arm's free end visited S1, S2, S3 and S4, the free end of the virtual arm is moved to C1, C2, C3 and C4 while its moving posture and path are recorded; the two recordings are compared to find and correct the error between them.

The result is then returned to the processing module 110, which performs calculations and adjustments so that the machining posture and movement path of the virtual robot arm are synchronized with those of the real-world robot arm. Synchronization here mainly means that the free ends of the real and virtual arms move along the same path with the same posture; it is not limited to doing the same thing at the same time, and the same task may be completed at different times. Simply put, a machining program can be set up in the virtual 3D space, and the real-world processing system can then be scheduled to run it at a preset time.

In addition, after step S120, a step S122 may further be included, in which the processing module 110 detects at least one contact point feature 192 from the three-dimensional model information of the workpiece, establishes the positions of a plurality of contact points 194, and generates the machining path formed by those contact points 194. After the robot arm 120 is driven by the processing module 110, the free end of the robot arm 120 moves along the machining path, as shown in FIG. 1C. Incidentally, the contact points may be arranged at equal intervals or at different intervals, as required.
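A minimal sketch of the contact-point placement in step S122, supporting both equal and unequal intervals along a straight edge of the workpiece model; the function and its parameters are illustrative assumptions:

```python
import math

def contact_points(start, end, spacing):
    """Place contact points along a straight edge from `start` to `end`.
    `spacing` may be a single number (equal intervals) or a sequence of
    distances (unequal intervals), matching the text's note that either
    arrangement is allowed."""
    (x0, y0), (x1, y1) = start, end
    length = math.hypot(x1 - x0, y1 - y0)
    if isinstance(spacing, (int, float)):
        n = int(length // spacing)
        offsets = [i * spacing for i in range(n + 1)]
    else:
        offsets, d = [0.0], 0.0
        for s in spacing:
            d += s
            if d > length:
                break
            offsets.append(d)
    ux, uy = (x1 - x0) / length, (y1 - y0) / length  # unit direction
    return [(x0 + ux * d, y0 + uy * d) for d in offsets]
```

A real implementation would place the points along the detected contact point features of the 3D model rather than a single straight segment.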

Continuing with FIG. 1A, FIG. 1B and FIG. 1C, in step S130 the processing module 110 generates a candidate group of moving postures of the robot arm 120 according to the relationship between the free end of the robot arm 120 and any corresponding contact point 194. The moving posture candidate group refers to the arm postures corresponding to all possible paths by which the robot arm 120 can move from one place to another and complete the machining process.

In step S140, an actual moving posture is selected from the candidate group. Specifically, the processing module 110 generates a corresponding candidate group of three-dimensional posture models of the robot arm 120 from the moving posture candidates. Based on the moving posture candidates, the three-dimensional posture model candidates, and the three-dimensional model information of the processing environment, it deletes from the candidate group any moving posture that would cause the robot arm 120 to interfere with the environment, and from the non-interfering candidates it selects the moving posture that produces the smallest offset in the arm's joint angles.

In detail, when the processing module 110 computes the multiple movement paths by which the robot arm 120 can reach the machining position, the arm's type, contour and dimensions must be taken into account, as must the other components arranged in the processing environment. Without these considerations, the robot arm 120 might, because of its own type, contour or size, be obstructed during movement by components in the environment, so that the machining cannot be completed; the arm might even collide with those components, damaging the robot arm 120 or the components.

Therefore, after computing the possible movement paths, the processing module 110 further takes into account and compares the three-dimensional posture model candidates related to the shapes, contours, dimensions, and mutual relationship of the robot arm 120 and the workpiece 190. At the same time, since the arrangement of components in the processing environment also affects whether the machining process can be completed, the processing module 110 also takes the three-dimensional model information of the environment into account. After synthesizing and analyzing the moving posture candidates, the posture model candidates, and the environment model, the candidates that would cause interference are deleted; from the remaining, non-interfering candidates, the posture with the shortest movement distance and the smallest joint rotation angle of the robot arm 120 is selected. This not only lets the arm complete the machining with the least effort but also helps improve the completion rate of the machining process.
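The filter-then-select logic of step S140 can be sketched as follows, with collision checking stubbed out; all names here are hypothetical, and a real system would test the arm's full 3D posture model against the environment model rather than a lookup:

```python
def collides(posture, environment):
    """Placeholder collision test: a real implementation would check the
    arm's 3D posture model against the environment's 3D model."""
    return posture["joints"] in environment["blocked"]

def pick_posture(candidates, current_joints, environment):
    """Delete interfering candidates, then select the posture with the
    smallest total joint-angle offset from the current configuration."""
    free = [c for c in candidates if not collides(c, environment)]
    if not free:
        raise RuntimeError("no collision-free posture reaches the point")
    return min(free, key=lambda c: sum(abs(a - b)
                                       for a, b in zip(c["joints"],
                                                       current_joints)))
```

The minimum-offset criterion stands in for the text's "smallest axis-angle offset"; distance along the path could be folded into the same cost function.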

In step S150, the processing module 110 moves the free end of the robot arm 120 to the corresponding contact point 194 according to the actual moving posture. In detail, after the moving posture allowing the robot arm 120 to complete the machining with the least effort has been selected, it is converted via the aforementioned conversion coefficient into the actual moving posture in the real-world coordinate system, and the processing module 110 then drives the robot arm 120 to move to the contact point 194 according to that posture.

In step S160, according to the plurality of actual moving postures, the free end of the robot arm 120 moves along the machining path to complete the machining process. Precisely because the moving postures have been converted by the conversion coefficient into actual moving postures, the robot arm 120 moves along the machining path in the real world according to the actual movement indicated by the processing module 110.

In addition, by synchronizing the robot arm in the real world with the robot arm in the virtual 3D space as described above, it can be ensured that the robot arm in the real world reliably completes the processing procedure according to the actual movement posture, avoiding the situation in which the optimal processing path selected in the virtual 3D space differs from the path the robot arm actually travels in the real world, which would prevent the processing procedure from being completed.

In particular, each workpiece 190 of the same type gripped by the robot arm 120 may differ slightly in external dimensions due to tolerances, or the placement angle of the workpiece 190 may deflect the workpiece 190 relative to the robot arm 120, so that the contact points established on each workpiece 190 are not identical. The processing paths calculated by the processing module 110 for the individual workpieces 190 therefore differ. Simply put, every single workpiece 190 obtains its own dedicated processing path.

Through the above three-dimensional-image-based robot arm processing method and system, the processing procedure can be automated, which not only saves labor but also reduces human error and thereby improves the assembly yield.

[Second Embodiment]

FIG. 3 is a schematic view of the dispensing apparatus of the second embodiment, FIG. 4 is a flowchart of the dispensing apparatus processing the housing of a golf club head, and FIG. 5 is a schematic view of the housing of the golf club head in the virtual 3D space.

Referring to FIG. 3, FIG. 4, and FIG. 5 together, the dispensing apparatus 200 includes the processing module 110, the robot arm 120, the database 130, the non-contact detection device 140, and a dispensing device 250. The robot arm 120 is used to perform the processing procedure on the housing 300 of a golf club head within the processing environment 270 provided by the dispensing apparatus 200. The database 130 stores the three-dimensional model information of the dispensing apparatus 200, the processing environment 270, and the housing 300 of the golf club head. The processing module 110 is electrically coupled between the robot arm 120 and the database 130 and controls the robot arm 120 to execute the processing procedure. The non-contact detection device 140 is, for example, a camera lens 242 electrically coupled to the processing module 110 and the database 130, and detects the contour and dimensions of the housing 300 to generate the three-dimensional model information of the housing 300. The position of the dispensing device 250 can be set according to actual needs. In the present embodiment, the dispensing device 250 is disposed at a fixed position within the processing environment 270, the free end of the robot arm 120 is connected to a clamping device 212, and the robot arm 120 grips the housing 300 with the clamping device 212 and moves it along the processing path, so that the dispensing device 250 dispenses adhesive onto each contact point 300b on the housing 300. In another embodiment, not illustrated, the dispensing device 250 may instead be disposed at the free end of the robot arm 120 while the housing 300 of the golf club head is fixed, so that the robot arm 120 dispenses adhesive onto each contact point 300b on the processing path.

Furthermore, the dispensing apparatus 200 also includes a laminating device 260, which is located within the processing environment 270 and disposed adjacent to the robot arm 120, and provides pressure to bond the housing 300 of the golf club head to another housing 400 of the golf club head, where at least one of the two housings 300 and 400 has already been dispensed with adhesive. The laminating device 260 may be a pneumatic cylinder, but is not limited thereto.

In addition, the dispensing apparatus 200 also includes a stock preparation area 200a and a placement area 200b, and the robot arm 120 is adapted to move between the stock preparation area 200a and the placement area 200b, where the stock preparation area 200a is used to place the housings 300 of golf club heads to be processed, and the placement area 200b is used to place the two housings 300 and 400 of a golf club head that have already been bonded together.

When the dispensing apparatus 200 performs the automatic dispensing procedure on the housing 300 of the golf club head, in step S210, the pre-entered three-dimensional model information of the robot arm 120 and the three-dimensional model information of the processing environment 270 are first obtained from the database 130. The three-dimensional model information of the robot arm 120 includes the number of axes of the robot arm 120, the rotatable angle of each axis, the movement directions and distances of the robot arm 120, and so on. The three-dimensional model information of the processing environment 270 covers possible components or devices other than the aforementioned non-contact detection device 140 (camera lens 242) and the dispensing device 250; these may be structural members of the dispensing apparatus 200 or devices provided so that the dispensing apparatus 200 can perform other processes.

In step S212, by comparing the errors between the three-dimensional model information and the real-world coordinate system, the conversion coefficient between the virtual 3D space built from the three-dimensional model information and the real-world coordinate system is found. Through the conversion coefficient, the three-dimensional model information of the robot arm 120 and of the processing environment 270 can be correctly matched against the real-world robot arm 120 and processing environment 270, thereby correcting the errors of the two in the real-world coordinate system. Although the foregoing describes matching the three-dimensional model information of the robot arm 120 and the processing environment 270 against the real-world robot arm 120 and processing environment 270, those skilled in the art will appreciate that the conversion coefficient can also be used to project the real-world robot arm 120 and processing environment 270 into the virtual 3D space. Simply put, through the conversion coefficient, coordinates in the real world can be matched to the model information of the virtual 3D space, and the model information in the virtual 3D space can likewise be applied to real-world coordinates.
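The patent does not specify the mathematical form of the conversion coefficient. One common concrete choice is a rigid transform (rotation plus translation) fitted from paired calibration points, sketched here with NumPy under that assumption:

```python
import numpy as np

def fit_conversion(virtual_pts, real_pts):
    """Estimate a rotation R and translation t mapping virtual coordinates
    to real-world coordinates from paired calibration points (Kabsch method).
    This rigid-transform form of the 'conversion coefficient' is an assumption."""
    V = np.asarray(virtual_pts, dtype=float)  # rows: points in the virtual 3D space
    W = np.asarray(real_pts, dtype=float)     # rows: the same points measured in the real world
    cv, cw = V.mean(axis=0), W.mean(axis=0)
    H = (V - cv).T @ (W - cw)                 # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ cv
    return R, t

def to_real(R, t, p):
    """Apply the fitted conversion to one virtual-space point."""
    return R @ np.asarray(p, dtype=float) + t
```

Fitting the transform in both directions (virtual-to-real and, via the inverse, real-to-virtual) matches the text's remark that the conversion can be applied either way.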

Moreover, the manner in which the real-world robot arm 120 moves and the manner in which the robot arm in the virtual 3D space moves are further calibrated against each other, so that the movement of the real-world robot arm 120 is synchronized with the movement of the robot arm in the virtual 3D space. Synchronization here mainly means that the free ends of the robot arms in the real world and in the virtual 3D space move along the same path with the same posture; it is not limited to doing the same thing at the same time, and the same task may also be completed at different times.

In step S220, the three-dimensional model information of the housing 300 of the golf club head is obtained through the camera lens 242. In detail, the camera lens 242 is disposed in the central region of the dispensing apparatus 200. In actual operation, after the robot arm 120 first picks up the housing 300, the robot arm 120 moves to a position adjacent to the camera lens 242 so that the camera lens 242 captures images to obtain the three-dimensional model information of the housing 300.

The three-dimensional model information of the housing 300 of the golf club head includes the contour shape and dimensions of the housing 300, whether the surface to be processed is flat or curved, and other physical features. Of course, in another embodiment, the three-dimensional model information of the housing 300 may also be built into the database 130, and the processing module 110 may directly access the housing 300's three-dimensional model information stored there. Alternatively, the captured images of the housing 300 may be used together with a preset three-dimensional model image of the housing 300 in the database to generate the final three-dimensional model information of the housing 300.

The aforementioned approach of building the three-dimensional model information of the housing 300 into the database 130 assumes that every individual housing 300 of the same type is completely identical and unaffected by tolerances. In the actual manufacturing process, however, individual housings 300 of the same type inevitably differ due to tolerances. The advantage of photographing each housing 300 about to enter the stock preparation area 200a through the camera lens 242 is therefore that real-time feature recognition can be performed on each single housing 300, which facilitates optimizing the design of the processing path.

In addition, the camera lens 242 of the present embodiment is disposed on the path along which the housing 300 of the golf club head moves from the stock preparation area 200a to the placement area 200b, but the camera lens 242 may also be disposed at any suitable location in the dispensing apparatus 200 as required, with the robot arm 120 gripping the housing 300 and moving it to the camera lens 242 for imaging.

In step S222, the processing module 110 detects at least one contact point feature 300a according to the three-dimensional model information of the housing 300 to establish the positions of a plurality of contact points 300b. In detail, the contact point feature 300a may be a single manually set feature, such as a recessed point, a protruding point, or an edge reference point of the housing 300, or a plurality of contact point features 300a may be built into the database 130, one of which is then randomly selected by the processing module 110 as a reference. The processing module 110 then selects a plurality of positions on the surface to be processed as the other contact points 300b according to the selected contact point feature 300a. In other embodiments, the user may also set a plurality of contact points 300b directly to form a preset processing path.
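As a minimal sketch of how step S222 might anchor the other contact points to the selected reference feature — the function name, coordinate layout, and offset template below are all illustrative assumptions, not part of the patent:

```python
def place_contacts(feature_pos, template_offsets):
    """Derive absolute contact-point positions from one detected reference
    feature. The template of per-point offsets (in workpiece coordinates)
    is an assumption; the patent only states that the other contact points
    are chosen relative to the selected contact point feature."""
    fx, fy, fz = feature_pos
    return [(fx + dx, fy + dy, fz + dz) for dx, dy, dz in template_offsets]

# The detected feature anchors the whole path, so a shifted workpiece
# yields a correspondingly shifted set of contact points.
contacts = place_contacts((10.0, 0.0, 5.0), [(0, 0, 0), (1, 0, 0), (1, 1, 0)])
```

Because every contact point is expressed relative to the detected feature, per-part variation in placement shows up only in `feature_pos`, which is consistent with each workpiece obtaining its own processing path.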

Incidentally, although the present embodiment uses the housing 300 of a golf club head as the workpiece to be processed, those skilled in the art will appreciate that the type of workpiece is not limited by this embodiment. The type of workpiece to be processed may even be determined from the images captured by the camera lens 242 together with the detected contact point feature 300a, allowing the processing module 110 to find the corresponding information in the database 130 according to the recognition result and perform the corresponding processing operation.

Referring again to FIG. 3, FIG. 4, and FIG. 5, when the robot arm 120 grips a housing 300 placed in the stock preparation area 200a through the clamping device 212 connected to its free end, differences in the manufacturing tolerances, placement positions, or placement angles of the housings 300 may cause the clamping device 212 to grip each individual housing 300 in a slightly different manner. Therefore, in step S230, the processing module 110 can generate the movement posture candidate group of the robot arm 120 according to the three-dimensional model information of the robot arm 120, using the relationship between the free end of the robot arm 120 and any corresponding contact point 300b.

In detail, after the camera lens 242 captures the images, the processing module 110 can calculate whether there may be a distance offset or an angular deflection between a given end point of the free end of the robot arm 120 and the corresponding contact point 300b, further calculate the offset distance and the deflection angle, and compensate by changing the posture of the robot arm 120. This posture includes the movement distance of the robot arm 120 relative to a certain reference point, the angles between the individual axes of the robot arm 120, the relative rotation angles between pairs of axes, and the included angle and rotation angle of the clamping device 212 relative to its axis. The information of the movement posture candidate group therefore includes the distance compensation and angle compensation of the robot arm 120 relative to a certain reference point, the processing angle when the robot arm 120 moves the housing 300 to the dispensing device 250, all possible movement postures of the robot arm 120 from one given point to another, the processing paths, and so on.
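A minimal sketch of the offset-and-deflection computation described above, assuming the free end and the contact point are each given as a 3D position plus a direction vector (this vector layout is an assumption; the patent does not specify one):

```python
import math

def offset_and_deflection(end_pt, end_dir, target_pt, target_normal):
    """Compute the positional offset distance and the angular deflection
    between the arm's free end and a contact point, as inputs to a posture
    compensation. All vectors are plain (x, y, z) tuples."""
    offset = tuple(t - e for e, t in zip(end_pt, target_pt))
    distance = math.sqrt(sum(c * c for c in offset))
    # Deflection = angle between the tool axis and the contact-point normal.
    dot = sum(a * b for a, b in zip(end_dir, target_normal))
    na = math.sqrt(sum(a * a for a in end_dir))
    nb = math.sqrt(sum(b * b for b in target_normal))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))
    return distance, angle

dist, ang = offset_and_deflection((0, 0, 0), (0, 0, 1), (3, 4, 0), (0, 1, 1))
# dist = 5.0; ang = angle between (0, 0, 1) and (0, 1, 1) = 45 degrees
```

Clamping the cosine into [-1, 1] before `acos` guards against floating-point round-off; the resulting distance and angle would feed the distance and angle compensation terms of a candidate posture.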

In step S240, the actual movement posture is selected from the movement posture candidate group. Simply put, from all possible movement postures, the optimal posture (the actual movement posture), in which the robot arm 120 travels the shortest distance and each axis of the robot arm 120 rotates through the smallest angle, is selected.

Specifically, the processing module 110 generates the corresponding three-dimensional posture model candidate group of the robot arm 120 according to the movement posture candidate group, where this three-dimensional posture model candidate group includes the optimized angle of the housing 300 relative to the dispensing device 250 during dispensing and the optimized posture of the robot arm 120, both calculated by the processing module 110. In addition, the processing module 110 also comprehensively considers the movement posture candidate group, the three-dimensional posture model candidate group, and the three-dimensional model information of the processing environment 270 to calculate whether the movement paths of the movement posture candidates interfere with other components in the processing environment 270, deletes from the movement posture candidate group those movement postures that cause the robot arm 120 to interfere with the environment space, and then, from the non-interfering movement posture candidates, selects the movement posture with the shortest movement distance and the smallest axis-angle offset for the robot arm 120.

In step S250, the processing module 110 converts the movement posture calculated and selected in the virtual 3D space into the actual movement posture in the real world through the conversion coefficient; this posture includes the processing movement path and the axis rotation angles of the robot arm 120.

Incidentally, owing to the manufacturing tolerances of each individual housing 300 and the differing positions or angles of the housings 300 once they enter the stock preparation area 200a, the angle at which the clamping device 212 connected to the free end of the robot arm 120 grips the housing 300 may also need to change, so the actual movement posture differs for each individual housing 300. In other words, across the processing of all housings 300, the processing path of each individual housing 300 differs slightly and is not entirely identical.

In step S260, the processing module 110 sends a signal according to the actual movement posture to actuate the free end of the robot arm 120. After the clamping device 212 connected to the free end grips the housing 300, the free end of the robot arm 120 moves along the processing path according to the actual movement posture and approaches the dispensing device 250 at the optimal angle for dispensing. After dispensing is completed, the robot arm 120 bonds the housing 300 to the other auxiliary housing 400, and then places the two bonded housings 300 and 400 in the placement area 200b. In addition, pressure can further be applied to the two bonded housings 300 and 400 via the laminating device 260 disposed in the placement area 200b, so that the two housings 300 and 400 fit tightly together to form the golf club head.

Incidentally, by synchronizing the robot arm 120 in the real world with the robot arm in the virtual 3D space as described above, it can be ensured that the robot arm 120 in the real world reliably completes the processing procedure according to the actual movement posture, avoiding the situation in which the optimal processing path selected in the virtual 3D space differs from the path the robot arm 120 actually travels in the real world, which would prevent the processing procedure from being completed.

The processing procedure is thus completed. The golf club head can then be removed from the placement area 200b manually or mechanically.

[Third Embodiment]

This embodiment is substantially the same as the foregoing second embodiment, differing only in that, in the third embodiment shown in FIG. 6, the housing 300 of the golf club head can be transported into the stock preparation area 200a on a conveyor belt P, and the camera lens 242 can be disposed on the transport path of the conveyor belt P before the stock preparation area 200a, so that the housing 300 is photographed before it enters the stock preparation area 200a, after which the images of the housing 300 of the golf club head are processed into three-dimensional model information by image processing.

The remaining steps of obtaining the three-dimensional model information of the housing 300 to be processed are the same as in the foregoing embodiment and are therefore not repeated here.

Although the present embodiment differs slightly from the second embodiment in the structure of the system and the steps of the process, it still falls within the framework of the three-dimensional-image-based robot arm processing method and system of the present invention.

In summary, in the three-dimensional-image-based robot arm processing method and system of the present invention, a model is built in a virtual 3D space, and the virtual 3D space and the real world are linked through calibration. In addition to allowing the processing module to calculate and filter the optimal processing path, the robot arm also has automatic judgment and learning functions. Furthermore, no human intervention is required during processing, which saves labor. Moreover, through the automatic learning function, deviated angles or distances can be corrected in real time, which helps improve the assembly yield of workpieces.

Although the invention has been disclosed above by way of embodiments, these are not intended to limit the invention. Any person having ordinary knowledge in the relevant art may make minor changes and refinements without departing from the spirit and scope of the invention, and the protection scope of the invention is therefore defined by the appended claims.

100‧‧‧3D image-based robot arm processing system

110‧‧‧processing module

120‧‧‧robot arm

130‧‧‧database

140‧‧‧non-contact detection device

190‧‧‧workpiece

200‧‧‧dispensing apparatus

212‧‧‧clamping device

242‧‧‧camera lens

250‧‧‧dispensing device

260‧‧‧laminating device

270‧‧‧processing environment

300, 400‧‧‧housing

200a‧‧‧stock preparation area

200b‧‧‧placement area

192, 300a‧‧‧contact point feature

194, 300b‧‧‧contact point

P‧‧‧conveyor belt

S‧‧‧reference point

C‧‧‧comparison point

S110~S160, S210~S260‧‧‧steps

FIG. 1A is a schematic view of the 3D image-based robot arm processing system of the first embodiment. FIG. 1B is a schematic view of the processing path formed by the contact points in the first embodiment. FIG. 1C is a schematic view of coordinate conversion between the virtual 3D space and the real-world coordinate system. FIG. 2 is a flowchart of the 3D image-based robot arm processing method of the first embodiment. FIG. 3 is a schematic view of the dispensing apparatus of the second embodiment. FIG. 4 is a flowchart of the processing method of the dispensing apparatus. FIG. 5 is a schematic view of the housing of the golf club head in the virtual 3D space. FIG. 6 is a schematic view of another embodiment of the dispensing apparatus.

Claims (23)

1. A 3D image-based robot arm processing method for performing a processing procedure on at least one workpiece within a processing environment by using a robot arm, the processing method comprising: providing three-dimensional model information of the robot arm and three-dimensional model information of the processing environment; obtaining three-dimensional model information of the workpiece, and generating, according to the three-dimensional model information of the workpiece, a processing path formed by a plurality of contact points, wherein a free end of the robot arm moves according to the processing path to complete the processing procedure; generating a movement posture candidate group of the robot arm according to a relationship between the free end of the robot arm and any corresponding one of the contact points; selecting an actual movement posture from the movement posture candidate group; moving the free end of the robot arm to the corresponding one of the contact points according to the actual movement posture; and moving the free end of the robot arm along the processing path according to a plurality of the actual movement postures to complete the processing procedure.

2. The robot arm processing method according to claim 1, further comprising: correcting errors of the three-dimensional model information of the robot arm and the three-dimensional model information of the processing environment in a real-world coordinate system.
3. The robot arm processing method according to claim 2, wherein the correcting step comprises: selecting at least one piece of calibration point coordinate position information from the three-dimensional model information of the robot arm and the three-dimensional model information of the processing environment; moving the free end of the robot arm to a corresponding coordinate position in the real-world coordinate system according to the at least one piece of calibration point coordinate position information; and comparing the at least one piece of calibration point position information with the corresponding coordinate position.

4. The robot arm processing method according to claim 1, wherein obtaining the three-dimensional model information of the workpiece comprises: obtaining preset three-dimensional model information of the workpiece from a database, or detecting a contour and dimensions of the workpiece through a non-contact detection device to generate the three-dimensional model information of the workpiece.

5. The robot arm processing method according to claim 1, wherein generating the processing path formed by the plurality of contact points comprises: setting positions of the plurality of contact points according to the three-dimensional model information of the workpiece to form the processing path.
6. The robot arm processing method according to claim 1, wherein generating the processing path formed by the plurality of contact points comprises: detecting at least one contact point feature according to the three-dimensional model information of the workpiece to establish positions of the plurality of contact points.

7. The robot arm processing method according to claim 1, wherein selecting the actual movement posture from the movement posture candidate group comprises: generating a corresponding three-dimensional posture model candidate group of the robot arm according to the movement posture candidate group; and deleting, from the movement posture candidate group, movement postures that cause the robot arm to interfere with the environment space according to the movement posture candidate group, the three-dimensional posture model candidate group, and the three-dimensional model information of the processing environment.

8. The robot arm processing method according to claim 7, wherein selecting the actual movement posture from the movement posture candidate group comprises: selecting, from the movement posture candidates that do not interfere with one another, the movement posture producing a minimum offset of the axis angles of the robot arm.

9. The robot arm processing method according to claim 1, wherein the processing procedure comprises a dispensing procedure, wherein the free end of the robot arm is connected to a dispensing device so that the robot arm dispenses adhesive onto each of the contact points on the processing path.
如請求項1所述的機械手臂加工方法,其中所述之加工程序包括一點膠加工程序,其中該機械手臂之自由端連接一夾持裝置,利用該夾持裝置夾持並移動該工件,使該機械手臂透過固定位置之一點膠裝置,對該加工路徑上的每一個該接觸點進行點膠。The robot arm processing method according to claim 1, wherein the processing procedure comprises a dispensing process, wherein the free end of the robot arm is connected to a clamping device; the clamping device clamps and moves the workpiece, so that the robot arm dispenses adhesive onto each of the contact points on the processing path through a dispensing device at a fixed position.

如請求項9或10所述的機械手臂加工方法,更包含: 透過一貼合裝置,將已點膠的該工件貼合至一輔助工件。The robot arm processing method according to claim 9 or 10, further comprising: attaching the dispensed workpiece to an auxiliary workpiece through a bonding device.

一種基於三維影像之機械手臂加工系統,包括: 一機械手臂,於一加工環境內,對至少一工件進行一加工程序; 一資料庫,儲存該工件之三維模型資訊、該機械手臂之三維模型資訊、該加工環境之三維模型資訊;以及 一處理模組,耦合於該機械手臂與該資料庫之間,用以控制該機械手臂執行該加工程序; 其中該處理模組根據該工件之三維模型資訊,產生由複數個接觸點所形成的一加工路徑,控制該機械手臂之自由端根據該加工路徑而移動,以完成該加工程序; 其中該處理模組根據該機械手臂之自由端與對應的任一個該接觸點之間的關係,產生該機械手臂之一移動姿態候選組,並自該移動姿態候選組,選擇一實際移動姿態; 其中該處理模組根據該實際移動姿態,控制該機械手臂之自由端移動至對應之任一個該接觸點; 其中該處理模組根據複數個該實際移動姿態,控制該機械手臂之自由端移動於該加工路徑,以完成該加工程序。A robot arm processing system based on three-dimensional images, comprising: a robot arm that performs a processing procedure on at least one workpiece in a processing environment; a database that stores three-dimensional model information of the workpiece, three-dimensional model information of the robot arm, and three-dimensional model information of the processing environment; and a processing module, coupled between the robot arm and the database, for controlling the robot arm to execute the processing procedure; wherein the processing module generates, according to the three-dimensional model information of the workpiece, a processing path formed by a plurality of contact points, and controls the free end of the robot arm to move along the processing path to complete the processing procedure; wherein the processing module generates a moving-posture candidate group for the robot arm according to the relationship between the free end of the robot arm and any corresponding contact point, and selects an actual moving posture from the moving-posture candidate group; wherein the processing module controls the free end of the robot arm to move to the corresponding contact point according to the actual moving posture; wherein the processing module controls the free end of the robot arm to move along the processing path according to a plurality of the actual moving postures, to complete the processing procedure.

如請求項12所述的機械手臂加工系統,其中所述之處理模組校正該機械手臂之三維模型資訊與該加工環境之三維模型資訊,兩者在真實世界座標系的誤差。The robot arm processing system according to claim 12, wherein the processing module corrects the error, in the real-world coordinate system, between the three-dimensional model information of the robot arm and the three-dimensional model information of the processing environment.

如請求項13所述的機械手臂加工系統,其中所述之該處理模組在該機械手臂之三維模型資訊與該加工環境之三維模型資訊中,選擇至少一個校正點座標位置資訊; 其中該處理模組根據該至少一個校正點座標位置資訊,使該機械手臂之自由端移動至真實世界坐標系上之一對應座標位置; 其中該處理模組比較該至少一個校正點位置資訊與該對應座標位置。The robot arm processing system according to claim 13, wherein the processing module selects at least one calibration-point coordinate position from the three-dimensional model information of the robot arm and the three-dimensional model information of the processing environment; wherein the processing module moves the free end of the robot arm to a corresponding coordinate position in the real-world coordinate system according to the at least one calibration-point coordinate position; wherein the processing module compares the at least one calibration-point position with the corresponding coordinate position.

如請求項12所述的機械手臂加工系統,其中所述之處理模組透過該資料庫,獲得預設的該工件之三維模型資訊。The robot arm processing system according to claim 12, wherein the processing module obtains the preset three-dimensional model information of the workpiece through the database.
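The calibration in claims 13-14 amounts to commanding the free end to each model-frame calibration point and comparing where it actually arrives in the real-world frame. The patent publishes no code; the following is a minimal sketch of that comparison, in which the function name `calibration_error` and the mean-offset correction are illustrative assumptions, not the patented method.

```python
import numpy as np

def calibration_error(model_points, measured_points):
    """Compare model-frame calibration points with the positions the free
    end actually reached in the real-world frame (claims 13-14 sketch)."""
    model = np.asarray(model_points, dtype=float)
    measured = np.asarray(measured_points, dtype=float)
    offsets = measured - model                   # error vector per point
    distances = np.linalg.norm(offsets, axis=1)  # scalar error per point
    # One simple correction (an assumption here): shift the model frame
    # by the mean offset so the two coordinate systems line up on average.
    correction = offsets.mean(axis=0)
    return distances, correction

# Example: the second calibration point is reached 2 mm off along x.
d, corr = calibration_error([[0, 0, 0], [100, 0, 0]],
                            [[0, 0, 0], [102, 0, 0]])
```

A real system would fold `correction` (or a full rigid transform fitted from several points) back into the stored three-dimensional model information before planning.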
如請求項12所述的機械手臂加工系統,更包括一非接觸式的偵測裝置,耦合於該處理模組與該資料庫之間,偵測該工件輪廓與尺寸,產生該工件之三維模型資訊。The robot arm processing system according to claim 12, further comprising a non-contact detecting device, coupled between the processing module and the database, which detects the contour and dimensions of the workpiece to generate the three-dimensional model information of the workpiece.

如請求項12所述的機械手臂加工系統,其中所述之處理模組根據該工件之三維模型資訊,設定複數個該接觸點之位置,以形成該加工路徑。The robot arm processing system according to claim 12, wherein the processing module sets the positions of the plurality of contact points according to the three-dimensional model information of the workpiece, to form the processing path.

如請求項12所述的機械手臂加工系統,其中所述之處理模組根據該工件之三維模型資訊,偵測至少一個接觸點特徵,以建立複數個該接觸點之位置。The robot arm processing system according to claim 12, wherein the processing module detects at least one contact-point feature according to the three-dimensional model information of the workpiece, to establish the positions of the plurality of contact points.

如請求項12所述的機械手臂加工系統,其中所述之處理模組根據該移動姿態候選組,產生對應的該機械手臂之一三維姿態模型候選組; 其中該處理模組根據該移動姿態候選組、該三維姿態模型候選組,以及該加工環境之三維模型資訊,自該移動姿態候選組中,刪除造成該機械手臂與該環境空間相互干涉的該移動姿態。The robot arm processing system according to claim 12, wherein the processing module generates a corresponding three-dimensional posture-model candidate group for the robot arm according to the moving-posture candidate group; wherein the processing module, according to the moving-posture candidate group, the three-dimensional posture-model candidate group, and the three-dimensional model information of the processing environment, deletes from the moving-posture candidate group any moving posture that causes the robot arm to interfere with the environment space.

如請求項19所述的機械手臂加工系統,其中所述之處理模組自未相互干涉的該移動姿態候選組中,選擇對該機械手臂之軸角度產生最小偏移量的該移動姿態。The robot arm processing system according to claim 19, wherein the processing module selects, from the non-interfering moving-posture candidates, the moving posture that produces the minimum offset of the axis angles of the robot arm.
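Claims 19-20 describe a two-stage posture selection: discard candidate postures whose three-dimensional posture model interferes with the environment, then keep the one closest to the current axis angles. A minimal sketch of that selection logic follows; the function name `select_posture`, the tuple-of-angles representation, and the toy `hits_fixture` collision test are assumptions for illustration, since the patent does not specify the collision-checking interface.

```python
def select_posture(candidates, current_axes, interferes):
    """From a moving-posture candidate group (each posture a tuple of joint
    axis angles), drop postures that interfere with the environment, then
    pick the one with the smallest total axis-angle offset (claims 19-20)."""
    feasible = [c for c in candidates if not interferes(c)]
    if not feasible:
        raise ValueError("every candidate posture interferes with the environment")
    # Minimum total deviation from the current axis angles.
    return min(feasible,
               key=lambda c: sum(abs(a - b) for a, b in zip(c, current_axes)))

# Hypothetical interference test: axis 0 beyond 90 degrees hits a fixture.
hits_fixture = lambda pose: pose[0] > 90
best = select_posture([(95, 10, 0), (30, 0, 0), (45, 5, 0)],
                      current_axes=(40, 0, 0),
                      interferes=hits_fixture)
```

In the patented system the `interferes` role would be played by checking the posture-model candidate against the stored three-dimensional model of the processing environment.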
如請求項12所述的機械手臂加工系統,其中所述之機械手臂之自由端連接一點膠裝置,使該機械手臂對該加工路徑上的每一個該接觸點進行點膠。The robot arm processing system according to claim 12, wherein the free end of the robot arm is connected to a dispensing device, so that the robot arm dispenses adhesive onto each of the contact points on the processing path.

如請求項12所述的機械手臂加工系統,其中所述之機械手臂之自由端連接一夾持裝置,利用該夾持裝置夾持並移動該工件,使該機械手臂透過固定位置之一點膠裝置,對該加工路徑上的每一個該接觸點進行點膠。The robot arm processing system according to claim 12, wherein the free end of the robot arm is connected to a clamping device; the clamping device clamps and moves the workpiece, so that the robot arm dispenses adhesive onto each of the contact points on the processing path through a dispensing device at a fixed position.

如請求項21或22所述的機械手臂加工系統,更包含: 一貼合裝置,位於該加工環境內,且鄰近該機械手臂,將已點膠的該工件貼合至一輔助工件。The robot arm processing system according to claim 21 or 22, further comprising: a bonding device, located in the processing environment and adjacent to the robot arm, which attaches the dispensed workpiece to an auxiliary workpiece.
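The dispensing claims reduce to walking the free end through the ordered contact points of the processing path and actuating the dispensing device at each one. A minimal sketch of that loop, assuming hypothetical `move_to` and `dispense` callables that stand in for the robot-arm and dispensing-device interfaces the patent leaves unspecified:

```python
def run_dispensing(contact_points, move_to, dispense):
    """Drive the free end along the processing path and dispense adhesive
    at every contact point (claims 21-22 sketch)."""
    visited = []
    for point in contact_points:   # the path is the ordered contact-point list
        move_to(point)             # posture selection happens inside move_to
        dispense(point)
        visited.append(point)
    return visited

# Usage with stand-in hardware interfaces that just record the calls.
moves, glue = [], []
order = run_dispensing([(0, 0), (1, 0), (1, 1)], moves.append, glue.append)
```

In the claim-22 variant the roles invert: `move_to` would reposition the clamped workpiece under a fixed dispensing device, but the per-contact-point loop is the same.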
TW106127687A 2017-08-15 2017-08-15 Robot processing method and system based on 3d image TWI650626B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
TW106127687A TWI650626B (en) 2017-08-15 2017-08-15 Robot processing method and system based on 3d image
CN201711420259.1A CN109397282B (en) 2017-08-15 2017-12-25 Method and system for machining robot arm and computer readable recording medium
US15/942,571 US10723020B2 (en) 2017-08-15 2018-04-02 Robotic arm processing method and system based on 3D image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW106127687A TWI650626B (en) 2017-08-15 2017-08-15 Robot processing method and system based on 3d image

Publications (2)

Publication Number Publication Date
TWI650626B TWI650626B (en) 2019-02-11
TW201910950A true TW201910950A (en) 2019-03-16

Family

ID=65360167

Family Applications (1)

Application Number Title Priority Date Filing Date
TW106127687A TWI650626B (en) 2017-08-15 2017-08-15 Robot processing method and system based on 3d image

Country Status (3)

Country Link
US (1) US10723020B2 (en)
CN (1) CN109397282B (en)
TW (1) TWI650626B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI747151B (en) * 2020-02-04 2021-11-21 達奈美克股份有限公司 Robot manipulator motion compensation method
TWI761891B (en) * 2020-07-22 2022-04-21 國立臺灣科技大學 Uninterrupted automation system and execution method thereof

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6469159B2 (en) * 2017-04-10 2019-02-13 ファナック株式会社 Offline programming apparatus and method with work position detection program generation function by contact sensor
KR102091935B1 (en) * 2018-08-29 2020-03-20 주식회사 프로텍 Viscous Liquid Dispensing Method Using 3 Dimensional Scanner
TWI696529B (en) * 2018-11-30 2020-06-21 財團法人金屬工業研究發展中心 Automatic positioning method and automatic control apparatus
US11762369B2 (en) * 2019-02-06 2023-09-19 Sensory Robotics, Inc. Robotic control via a virtual world simulation
TWI701122B (en) * 2019-07-15 2020-08-11 由田新技股份有限公司 Multi-axis robot arm system and path planning method thereof
TWI725630B (en) * 2019-11-21 2021-04-21 財團法人工業技術研究院 Processing path generating device and method thereof
EP3834998A1 (en) * 2019-12-11 2021-06-16 Siemens Aktiengesellschaft Method, computer program product and robot controller for configuring a robot-object system environment and robot
EP4074474A1 (en) * 2019-12-13 2022-10-19 Kawasaki Jukogyo Kabushiki Kaisha Robot system and method for forming three-dimensional model of workpiece
CN111524184B (en) * 2020-04-21 2024-01-16 湖南视普瑞智能科技有限公司 Intelligent unstacking method and unstacking system based on 3D vision
CN111889900A (en) * 2020-08-04 2020-11-06 上海柏楚电子科技股份有限公司 Data processing method and device for cutting special-shaped pipe, electronic equipment and medium
CN114101917A (en) * 2020-08-26 2022-03-01 复盛应用科技股份有限公司 Laser engraving method
EP4291369A1 (en) * 2021-02-10 2023-12-20 Abb Schweiz Ag Method and apparatus for tuning robot path for processing workpiece
CN113156607B (en) * 2021-04-14 2023-07-14 广景视睿科技(深圳)有限公司 Method for assembling prism, device for assembling prism and equipment for assembling prism
CN113288477B (en) * 2021-07-01 2022-06-03 吕涛 Orthodontic self-ligating bracket system with open auxiliary groove
CN113910259A (en) * 2021-11-03 2022-01-11 珠海格力智能装备有限公司 Robot paint spraying system and paint spraying method
CN113768640B (en) * 2021-11-09 2022-02-08 极限人工智能有限公司 Method and device for determining working pose of mechanical arm

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4305130A (en) * 1979-05-29 1981-12-08 University Of Rhode Island Apparatus and method to enable a robot with vision to acquire, orient and transport workpieces
US6421048B1 (en) * 1998-07-17 2002-07-16 Sensable Technologies, Inc. Systems and methods for interacting with virtual objects in a haptic virtual reality environment
US6552722B1 (en) * 1998-07-17 2003-04-22 Sensable Technologies, Inc. Systems and methods for sculpting virtual objects in a haptic virtual reality environment
TW434495B (en) * 1999-03-31 2001-05-16 Lin Gu Chin Image servo positioning and path-tracking control system
US6587752B1 (en) * 2001-12-25 2003-07-01 National Institute Of Advanced Industrial Science And Technology Robot operation teaching method and apparatus
US6812665B2 (en) * 2002-04-19 2004-11-02 Abb Ab In-process relative robot workcell calibration
US7168935B1 (en) * 2002-08-02 2007-01-30 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Solid freeform fabrication apparatus and methods
DE10305384A1 (en) * 2003-02-11 2004-08-26 Kuka Roboter Gmbh Method and device for visualizing computer-aided information
JPWO2004106009A1 (en) * 2003-06-02 2006-07-20 松下電器産業株式会社 Article handling system and article handling server
US6836702B1 (en) * 2003-06-11 2004-12-28 Abb Ab Method for fine tuning of a robot program
GB0405014D0 (en) * 2004-03-05 2004-04-07 Qinetiq Ltd Movement control system
JP2007015037A (en) * 2005-07-05 2007-01-25 Sony Corp Motion editing device of robot, motion editing method, computer program and robot device
JP3971773B2 (en) * 2005-10-12 2007-09-05 ファナック株式会社 Offline teaching device for robots
JP4961860B2 (en) * 2006-06-27 2012-06-27 トヨタ自動車株式会社 Robot apparatus and control method of robot apparatus
US7974737B2 (en) * 2006-10-31 2011-07-05 GM Global Technology Operations LLC Apparatus and method of automated manufacturing
JP4347386B2 (en) * 2008-01-23 2009-10-21 ファナック株式会社 Processing robot program creation device
US10650608B2 (en) * 2008-10-08 2020-05-12 Strider Labs, Inc. System and method for constructing a 3D scene model from an image
US20100179689A1 (en) * 2009-01-09 2010-07-15 National Taiwan University Of Science And Technology Method of teaching robotic system
CN106994684B (en) * 2009-02-03 2021-04-09 范努克机器人技术美国有限公司 Method for controlling a robot tool
JP5436460B2 (en) * 2009-02-12 2014-03-05 三菱電機株式会社 Industrial robot system
US8204623B1 (en) * 2009-02-13 2012-06-19 Hrl Laboratories, Llc Planning approach for obstacle avoidance in complex environment using articulated redundant robot arm
US8437537B2 (en) * 2009-03-27 2013-05-07 Mitsubishi Electric Research Laboratories, Inc. Method and system for estimating 3D pose of specular objects
US8747188B2 (en) * 2011-02-24 2014-06-10 Apple Inc. Smart automation of robotic surface finishing
US9266241B2 (en) * 2011-03-14 2016-02-23 Matthew E. Trompeter Robotic work object cell calibration system
US9669546B2 (en) * 2011-03-14 2017-06-06 Matthew E. Trompeter Robotic work object cell calibration method
CN104411248B (en) * 2012-06-28 2017-09-26 皇家飞利浦有限公司 It is used for the C-arm trajectory planning that optimized image is gathered in endo-surgical
US9971339B2 (en) * 2012-09-26 2018-05-15 Apple Inc. Contact patch simulation
US9393686B1 (en) * 2013-03-15 2016-07-19 Industrial Perception, Inc. Moveable apparatuses having robotic manipulators and conveyors to facilitate object movement
US10665128B2 (en) * 2014-06-27 2020-05-26 Illinois Tool Works Inc. System and method of monitoring welding information
US9272417B2 (en) * 2014-07-16 2016-03-01 Google Inc. Real-time determination of object metrics for trajectory planning
JP6379874B2 (en) * 2014-08-29 2018-08-29 株式会社安川電機 Teaching system, robot system, and teaching method
US10518409B2 (en) * 2014-09-02 2019-12-31 Mark Oleynik Robotic manipulation methods and systems for executing a domain-specific application in an instrumented environment with electronic minimanipulation libraries
TWI517101B (en) * 2014-12-09 2016-01-11 財團法人工業技術研究院 Calibration system and method for 3d scanner
US20160210882A1 (en) * 2014-12-29 2016-07-21 Help Me See Inc. Surgical Simulator System and Method
JP6088563B2 (en) * 2015-02-10 2017-03-01 ファナック株式会社 Work picking robot system having position and orientation conversion operation function, and work picking method
US9486921B1 (en) * 2015-03-26 2016-11-08 Google Inc. Methods and systems for distributing remote assistance to facilitate robotic object manipulation
US9561941B1 (en) * 2015-03-30 2017-02-07 X Development Llc Autonomous approach and object pickup
US9878447B2 (en) * 2015-04-10 2018-01-30 Microsoft Technology Licensing, Llc Automated collection and labeling of object data
US9964398B2 (en) * 2015-05-06 2018-05-08 Faro Technologies, Inc. Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform
CN106733341A (en) * 2015-11-20 2017-05-31 宝成工业股份有限公司 Polyaxial automation sole process equipment and processing method
US20170348854A1 (en) * 2015-12-16 2017-12-07 Mbl Limited Robotic manipulation methods and systems for executing a domain-specific application in an instrumented environment with containers and electronic minimanipulation libraries
US9687983B1 (en) * 2016-05-11 2017-06-27 X Development Llc Generating a grasp pose for grasping of an object by a grasping end effector of a robot
US10122995B2 (en) * 2016-09-22 2018-11-06 X Development Llc Systems and methods for generating and displaying a 3D model of items in a warehouse
TWI587994B (en) * 2016-11-18 2017-06-21 Hiwin Tech Corp Non-contact gestures teach robots
US10060857B1 (en) * 2017-11-16 2018-08-28 General Electric Company Robotic feature mapping and motion control


Also Published As

Publication number Publication date
US20190054617A1 (en) 2019-02-21
CN109397282B (en) 2021-12-14
US10723020B2 (en) 2020-07-28
TWI650626B (en) 2019-02-11
CN109397282A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
TWI650626B (en) Robot processing method and system based on 3d image
CN111331592B (en) Mechanical arm tool center point correcting device and method and mechanical arm system
JP7207851B2 (en) Control method, robot system, article manufacturing method, program and recording medium
CN111452040B (en) System and method for associating machine vision coordinate space in a pilot assembly environment
US11117262B2 (en) Intelligent robots
US10805546B2 (en) Image processing system, image processing device, and image processing program
JP7027299B2 (en) Calibration and operation of vision-based operation system
JP6222898B2 (en) Three-dimensional measuring device and robot device
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
US20130011018A1 (en) Information processing apparatus and information processing method
CN106272424A (en) A kind of industrial robot grasping means based on monocular camera and three-dimensional force sensor
US20040172164A1 (en) Method and apparatus for single image 3D vision guided robotics
JPWO2018043525A1 (en) Robot system, robot system control apparatus, and robot system control method
CN111067197A (en) Robot sole dynamic gluing system and method based on 3D scanning
US20200130189A1 (en) Reconfigurable, fixtureless manufacturing system and method assisted by learning software
JP6885856B2 (en) Robot system and calibration method
US20190287258A1 (en) Control Apparatus, Robot System, And Method Of Detecting Object
JP2004243215A (en) Robot teaching method for sealer applicator and sealer applicator
CN112109072A (en) Method for measuring and grabbing accurate 6D pose of large sparse feature tray
US11126844B2 (en) Control apparatus, robot system, and method of detecting object
JP7427370B2 (en) Imaging device, image processing device, image processing method, calibration method for imaging device, robot device, method for manufacturing articles using robot device, control program, and recording medium
CN113102882A (en) Geometric error compensation model training method and geometric error compensation method
TWI799310B (en) Robot and robot hand-eye callibrating method
TWI675000B (en) Object delivery method and system
CN109807890B (en) Equipment error calibration method and device