TW201520875A - A method and apparatus for creating animations - Google Patents

A method and apparatus for creating animations

Info

Publication number
TW201520875A
Authority
TW
Taiwan
Prior art keywords
animation
user interface
block
natural user
frame
Prior art date
Application number
TW102143360A
Other languages
Chinese (zh)
Other versions
TWI505176B (en)
Inventor
Jin-Ren Chern
Jean-Han Wu
Original Assignee
Univ Chienkuo Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Chienkuo Technology filed Critical Univ Chienkuo Technology
Priority to TW102143360A priority Critical patent/TWI505176B/en
Publication of TW201520875A publication Critical patent/TW201520875A/en
Application granted granted Critical
Publication of TWI505176B publication Critical patent/TWI505176B/en

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

Embodiments of the invention disclose an apparatus and a method for producing novel and interesting animations using a natural user interface. The animations produced can be shared with friends by transmitting them to remote digital devices.

Description

Method and apparatus for creating animations

The present invention relates generally to a method for creating animations, and more particularly to an animation method that uses a natural user interface as its tool.

Conventional animation is produced either by drawing frame by frame or by using tweening techniques. The former yields better image quality but requires considerable drawing labor, while the latter reduces the labor required but produces results that are comparatively stiff and mechanical. Both approaches share a common characteristic: every motion is drawn by a person, or by a person together with a machine, so the real movements of the actual character being animated cannot be expressed. In recent years, advances in technology have produced many motion capture systems used in 3D animation production. Although these systems can record the true motion of an animated character, they require the performer to wear numerous sensors before any motion can be captured, which imposes considerable inconvenience and limitations in practice. U.S. Patent Application No. 12/788,731, "ACTIVE CALIBRATION OF A NATURAL USER INTERFACE", discloses an object tracking system in the form of a natural user interface that tracks the skeleton information of an object and provides it to a digital processing device for value-added applications such as animation simulation.

The present invention mainly comprises a natural user interface 12 and an animation processing device 11, together forming an apparatus and method for drawing novel and interesting animations. The animation processing device 11 comprises a digital processing device 111, an animation module 112, and a computer animation character 113, with the animation module 112 electrically connected to the digital processing device 111. First, the computer animation character 113 is drawn; when drawing, the main character must be drawn as separate parts corresponding to the parts of its skeleton, and the parts are then imported into the animation processing device 11. After the digital processing device 111 receives the skeleton information of the actual animated character 13 tracked by the natural user interface 12, it uses that skeleton information to set the position and rotation angle of the corresponding skeleton parts of the computer animation character 113, thereby drawing the animation in real time. By redrawing the computer animation character 113 to follow the movements of the actual animated character 13 on site, a distinctive and entertaining animation is created.

As shown in FIG. 1, the present invention uses a natural user interface 12 to obtain the skeleton information of an actual animated character 13 and applies it to a computer animation character 113 to create an animation. The system includes a natural user interface 12 for obtaining the skeleton information of the actual animated character 13 and an animation processing device 11 for drawing the animation according to the skeleton information returned by the natural user interface 12. The animation processing device 11 comprises a digital processing device 111 for executing instructions from the animation module and processing data from the natural user interface, an animation module 112 containing the animation drawing programs used to draw the animation, and a computer animation character 113 used to present the resulting animation. The animation module is electrically connected to the digital processing device; the digital processing device comprises at least one processor and a memory module, and the natural user interface is electrically connected to the digital processing device.

The animation module 112 mainly contains the programs and data for animation processing. The digital processing device 111 mainly comprises at least one processor and a memory module for executing instructions from the animation module 112 and processing data from the natural user interface 12, and it uses the computer animation character 113 to draw the animation. The natural user interface 12 is electrically connected to the digital processing device 111 of the animation processing device 11 and is responsible for monitoring the actual animated character 13 in the environment. When an actual animated character 13 appears in the environment, the natural user interface 12 captures and tracks its movements, converts the tracked information into skeleton information, and returns it to the connected animation processing device 11. Within the animation processing device 11, the computer animation character 113 is the figure and motion used to present the captured movements of the actual animated character 13.

FIG. 2 shows a computer animation character 113 produced by drawing; FIG. 3 shows the divided blocks obtained by cutting FIG. 2 by body part; and FIG. 4 shows the first preferred embodiment of the present invention. The computer animation character 113 is cut according to the characteristics of the skeleton, and the natural user interface 12 then detects the movements of the actual animated character 13 in the monitored environment. The natural user interface 12 returns the detected joint positions of the actual animated character 13 to the animation processing device 11. Based on the bone and joint position information returned by the natural user interface 12, the animation processing device 11 calculates the rotation angle of each corresponding divided block of the computer animation character 113, sets the rotation angle of the corresponding block, and then sets its position according to the corresponding joint position, thereby obtaining the animation frame for that point in time. FIG. 11 and FIG. 12 show the second and third preferred embodiments, which function similarly to FIG. 4, except that the actual animated character 13 and the computer animation character 113 in FIG. 11 and FIG. 12 can be varied to create even more fun. FIG. 3 is an initial drawing; subsequent animation is based on this initial drawing, the skeleton information returned by the natural user interface 12 is captured at a fixed interval, and the initial drawings of the divided blocks in FIG. 3 are translated and rotated to obtain each frame of the animation one by one.

The computer animation character 113 is cut into blocks by body part, and two anchor points are selected in each block, namely a first anchor point and a second anchor point. These anchor points coincide with the skeleton positions detected by the natural user interface 12, which simplifies the subsequent setting of each divided block's position. Anchor points 2111 and 2112 are the first and second anchor points of block 211; anchor points 2121 and 2122 are those of block 212; anchor points 2131 and 2132 are those of block 213; anchor points 2141 and 2142 are those of block 214; anchor points 2151 and 2152 are those of block 215; anchor points 2161 and 2162 are those of block 216; anchor points 2171 and 2172 are those of block 217; anchor points 2181 and 2182 are those of block 218; anchor points 2191 and 2192 are those of block 219; anchor points 2201 and 2202 are those of block 220; anchor points 2211 and 2212 are those of block 221; and anchor points 2221 and 2222 are those of block 222. The division into blocks and the placement of anchor points may be modified, in the spirit of the foregoing technique, to suit the information provided by the natural user interface 12 and the needs of the animation being produced.

The anchor points of adjacent blocks may be placed at the same position or at different positions. When they are placed at the same position, the overall size of the character remains consistent during playback; when they are placed at different positions, the parts of the animated character join more closely during playback. Referring to FIG. 3, FIG. 9, and FIG. 10, and taking block 214 as an example, its first anchor point is anchor point 2141 and its second anchor point is anchor point 2142, while in block 215 the first anchor point is anchor point 2151 and the second anchor point is anchor point 2152. Blocks 214 and 215 are adjacent, and there are two ways to place the second anchor point of block 214 (anchor point 2142) and the first anchor point of block 215 (anchor point 2151): as shown in FIG. 9, anchor points 2142 and 2151 may be at the same position; alternatively, as shown in FIG. 10, to obtain a better joint when the joint between blocks 214 and 215 is bent, the second anchor point of block 213 (anchor point 2132) and the first anchor point of block 214 (anchor point 2141) may be two points at different positions. In this design a distance is kept between the two points, and this distance is measured against the shorter of the two blocks, preferably being less than 80% of that block's length. A sketch of such a block representation is given below.
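Below is a minimal sketch, not taken from the patent, of how one divided block and its two anchor points might be represented in code; the names `Block`, `Point`, and `max_joint_gap` are illustrative assumptions, and the 80% rule follows the preferred distance described above.

```python
from dataclasses import dataclass
from typing import Tuple

Point = Tuple[float, float, float]  # (x, y, z); for 2D animation the z component can simply be 0


@dataclass
class Block:
    """One divided block of the computer animation character (e.g. block 214)."""
    block_id: int
    image: object          # the initial drawing of this block, e.g. a bitmap or sprite
    first_anchor: Point    # first anchor point, e.g. anchor point 2141
    second_anchor: Point   # second anchor point, e.g. anchor point 2142


def max_joint_gap(length_a: float, length_b: float) -> float:
    """Largest recommended distance between the facing anchor points of two adjacent
    blocks placed at different positions: less than 80% of the shorter block's length."""
    return 0.8 * min(length_a, length_b)
```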

FIG. 5A is a partial block diagram showing blocks 213, 214, and 215 of FIG. 3, and FIG. 5B shows the figure formed after the motion of FIG. 5A. Corresponding to blocks 213, 214, and 215 in FIG. 5A are blocks 313, 314, and 315 in FIG. 5B, and anchor points 2131, 2132, 2141, 2142, 2151, and 2152 in FIG. 5A correspond respectively to anchor points 3131, 3132, 3141, 3142, 3151, and 3152 in FIG. 5B. In FIG. 5A, line 1213 connects anchor points 2131 and 2132 and represents the direction of block 213; line 1214 connects anchor points 2141 and 2142 and represents the direction of block 214; and line 1215 connects anchor points 2151 and 2152 and represents the direction of block 215. In FIG. 5B, line 1313 connects anchor points 3131 and 3132 and represents the direction of block 313; line 1314 connects anchor points 3141 and 3142 and represents the direction of block 314; and line 1315 connects anchor points 3151 and 3152 and represents the direction of block 315.

FIG. 5A is a subset of FIG. 3. FIG. 6 shows the angle between the direction of block 213 in FIG. 5A and that of the moved block 313 in FIG. 5B; FIG. 7 shows the angle between block 214 and the moved block 314; and FIG. 8 shows the angle between block 215 and the moved block 315. To illustrate how this angle is calculated, take the angle between block 213 and block 313 as an example. The position in FIG. 3 and FIG. 5A is the first animation position, and the position in FIG. 5B is the second animation position. In the first animation position the coordinates of anchor point 2131 are $(x_{11}, y_{11}, z_{11})$ and those of anchor point 2132 are $(x_{12}, y_{12}, z_{12})$; in the second animation position the coordinates of anchor point 3131 are $(x_{21}, y_{21}, z_{21})$ and those of anchor point 3132 are $(x_{22}, y_{22}, z_{22})$. Let

$$\vec{v}_1 = (x_{12}-x_{11},\; y_{12}-y_{11},\; z_{12}-z_{11}), \qquad \vec{v}_2 = (x_{22}-x_{21},\; y_{22}-y_{21},\; z_{22}-z_{21}),$$

where $\vec{v}_1$ is the vector from anchor point 2131 to anchor point 2132 and $\vec{v}_2$ is the vector from anchor point 3131 to anchor point 3132. The rotation angle, that is, the angle between block 213 and block 313, is then

$$\theta = \cos^{-1}\!\left(\hat{v}_1 \cdot \hat{v}_2\right), \qquad \hat{v}_1 = \frac{\vec{v}_1}{\lVert \vec{v}_1 \rVert}, \quad \hat{v}_2 = \frac{\vec{v}_2}{\lVert \vec{v}_2 \rVert},$$

where $\cos^{-1}$ is the arccosine function and $\cdot$ denotes the vector inner product; the angle is obtained from the two unit vectors $\hat{v}_1$ and $\hat{v}_2$. To keep the invention general, the calculation above is expressed in three dimensions; in the two-dimensional case it suffices to set the unused dimension in these formulas to 0.

FIG. 5A above is called the reference drawing or reference frame and is used to calculate the rotation angles of FIG. 5B or of subsequent corresponding blocks, as shown in FIG. 13, in which frame 401 is the initial drawing of the animation, frame 402 is the first frame, frame 403 is the (n-1)th frame, and frame 404 is the nth frame, the frame currently to be drawn. The reference drawing is chosen as follows. At the very start of production, when there is no previous animation position to refer to, that is, for the first frame of the computer animation character, the initial drawing of frame 401 is used as the reference drawing for calculating the rotation angles of the corresponding divided blocks. When the frame currently being drawn is not the first frame, the previous animation frame can be used as the reference frame (for frame 404, this is frame 403), while the second animation position is the current position of each skeleton block of the actual animated character 13 as returned by the natural user interface. A minimal sketch of this angle calculation is given below.
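The rotation-angle calculation above can be sketched as follows; this is only an illustration of the formula, not the patent's implementation, and the function name `block_rotation_angle` and the use of NumPy are assumptions.

```python
import numpy as np


def block_rotation_angle(first_a, first_b, second_a, second_b):
    """Angle (radians) between a block's direction in the first animation position
    (the reference drawing) and in the second animation position returned by the
    natural user interface.

    first_a, first_b:   the block's first and second anchor points in the reference
                        drawing, e.g. anchor points 2131 and 2132.
    second_a, second_b: the corresponding anchor points in the second animation
                        position, e.g. anchor points 3131 and 3132.
    For 2D animation, pass 0 for the unused coordinate.
    """
    v1 = np.asarray(first_b, dtype=float) - np.asarray(first_a, dtype=float)
    v2 = np.asarray(second_b, dtype=float) - np.asarray(second_a, dtype=float)
    u1 = v1 / np.linalg.norm(v1)
    u2 = v2 / np.linalg.norm(v2)
    # Clip guards against floating-point values slightly outside [-1, 1].
    return float(np.arccos(np.clip(np.dot(u1, u2), -1.0, 1.0)))


# Example: a block pointing along +y that now points along +x has rotated 90 degrees.
theta = block_rotation_angle((0, 0, 0), (0, 1, 0), (0, 0, 0), (1, 0, 0))
print(np.degrees(theta))  # 90.0
```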
In addition, for frames other than the first, the initial drawing may also be used as the reference drawing instead of the previous animation frame; however, compared with the initial drawing, using the previous frame as the reference drawing generally yields smaller rotation angles and translations, so the animation plays back more smoothly.

The animation production of the present invention is shown in FIG. 14. Step 501 first divides the computer animation character into a plurality of blocks, and step 502 then draws the initial drawing of each block of the computer animation character one by one, places the anchor points of each block, and records their positions. Steps 501 and 502 are the preparatory work for drawing the animation. Once the preparation is complete, step 503 determines whether drawing of the animation has ended; this determination is based on factors such as a user operation (for example, pressing an end key), the drawing time reaching the configured duration, or no actual animated character 13 being detectable in the environment. If the result is to end animation drawing, the end-of-drawing operation of step 506 is performed; if the result is to continue drawing, step 504 is performed to draw the current frame. After the current frame has been drawn, step 505 is performed: if the reference drawing is set to the previous frame, the reference drawing is updated to the current frame, and the process then waits for a fixed time; when the waiting time has elapsed, the process jumps back to step 503 to draw the next frame. The length of the wait is determined by the animation's frames per second (FPS), so the time available per frame is 1/FPS. The fixed wait can be implemented in an event-driven manner, that is, by setting a timer with an interval of 1/FPS, or by waiting in a conventional loop, in which case the waiting time is 1/FPS minus the time spent preparing and drawing the frame.

The procedure of step 504 is illustrated in FIG. 15, the flow for drawing a single frame: first, step 511 determines whether the animation frame currently being drawn is the first frame. If it is the first frame, step 512 sets the reference drawing for the subsequent animation to the initial drawing; if it is not, step 513 sets the reference drawing for the subsequent animation to the previous frame. Step 514 then draws the animation of all blocks using the single-block animation drawing method. The single-block drawing method is shown in FIG. 16: step 521 first receives the second animation position from the natural user interface; with the second animation position obtained, step 522 calculates the rotation angle between the block's first animation position and the block's second animation position; step 523 rotates the block's reference drawing by the calculated angle; and finally, step 524 translates the first anchor point of the rotated block to the position of the block's first anchor point in the second animation position. It is particularly emphasized here that the first animation position is the position in the reference drawing, while the second animation position is the current position returned by the natural user interface.

One way to calculate positions is to express them as relative coordinates: a specific position of a specific block or object in the scene, such as anchor point 2121 of block 212, is selected as the reference point. The relative coordinates of this reference point can be set to any value, though for convenience of computation it is usually set to the origin, that is, (0,0) or (0,0,0); the relative coordinates of every other point are then obtained by subtracting the absolute coordinates of the reference point from that point's absolute coordinates. A sketch of the single-block and single-frame drawing steps is given below.
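The single-block steps 521 to 524 and the single-frame steps 511 to 514 might be outlined roughly as below. This sketch reuses `block_rotation_angle` from the earlier example; `rotate_image`, `place_at`, and the pose dictionaries keyed by block id are assumed helpers, not the patent's own API.

```python
def draw_single_block(block, reference_pose, second_pose, canvas):
    """Steps 521-524: rotate the block's reference drawing and move it into place."""
    # Step 521: second animation position of this block, as returned by the NUI;
    # each pose maps block_id -> (first anchor point, second anchor point).
    new_a, new_b = second_pose[block.block_id]
    ref_a, ref_b = reference_pose[block.block_id]   # first animation position (reference drawing)

    # Step 522: rotation angle between the first and second animation positions.
    theta = block_rotation_angle(ref_a, ref_b, new_a, new_b)

    # Step 523: rotate the block's reference drawing by the calculated angle
    # (rotate_image is an assumed helper of whatever drawing library is used).
    rotated = rotate_image(block.image, theta)

    # Step 524: translate the rotated block so that its first anchor point lies at the
    # first anchor point of the second animation position.
    place_at(canvas, rotated, anchor=new_a)


def draw_frame(blocks, frame_index, initial_pose, previous_pose, second_pose, canvas):
    """Steps 511-514: choose the reference drawing, then draw every block."""
    # Steps 511-513: the first frame uses the initial drawing as its reference;
    # later frames use the previous frame, giving smaller rotations and smoother playback.
    reference_pose = initial_pose if frame_index == 0 else previous_pose
    for block in blocks:
        draw_single_block(block, reference_pose, second_pose, canvas)
    return second_pose   # becomes the reference pose for the next frame
```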

The animation module 112 produces the animation of each divided block according to the foregoing procedure and then merges them into one complete frame. It reads the second animation position returned by the natural user interface 12 at a fixed time interval, generates the frame for each point in time one by one, and plays the frames in time order, thereby completing the animation. A sketch of this drawing loop is given below.
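The overall drawing loop of FIG. 14 (steps 501 to 506), including the fixed 1/FPS wait, might look roughly like the sketch below, which reuses `draw_frame` from the previous example; the `nui.read_skeleton()` call and the `should_stop` callback are assumptions made for illustration.

```python
import time

FPS = 30                      # frames per second chosen for the animation
FRAME_TIME = 1.0 / FPS        # time available per frame


def run_animation(blocks, initial_pose, nui, canvas, should_stop):
    """Steps 501-506 of FIG. 14, implemented with a conventional waiting loop."""
    # Steps 501-502 (splitting the character and drawing each block's initial drawing)
    # are assumed done already; `blocks` and `initial_pose` are their result.
    previous_pose = initial_pose
    frame_index = 0
    while not should_stop():                 # step 503: end key, time limit, or no character detected
        start = time.monotonic()
        second_pose = nui.read_skeleton()    # current skeleton positions from the NUI (assumed API)
        previous_pose = draw_frame(          # step 504: draw the current frame
            blocks, frame_index, initial_pose, previous_pose, second_pose, canvas)
        frame_index += 1
        # Step 505: wait out the remainder of the 1/FPS slot; an event-driven timer
        # with interval 1/FPS would serve equally well.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, FRAME_TIME - elapsed))
    # Step 506: end-of-drawing operations (e.g. saving or transmitting the animation).
```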

As further shown in FIG. 17, the present invention can connect to the Internet 15 through the digital processing device 111, store the completed animation in a particular format, and transmit it to a remote digital device 14. The remote digital device 14 may be a mail server, so that the animation can be e-mailed to relatives and friends, or indeed to any e-mail account. The remote digital device 14 may also be a video server such as YouTube, through which the uploaded animation can be shared with other users. The remote digital device 14 may also be a web server, so that the uploaded animation can be linked directly from a web page and played on the viewer's computer. The remote digital device 14 may also be the server of a social website: the animation uploaded by the digital processing device 111 is distributed to friends as a message via the social website, and friends can watch the uploaded animation after logging in to the site. The remote digital device 14 may also be a remote animation playback device, so that the animation of the computer animation character 113, generated by the present invention from the movements of the actual animated character 13, is transmitted over the network and played on the remote playback device. Likewise, the remote digital device 14 may be a cloud server, using cloud mechanisms to distribute the entertaining animations produced by the present invention to relatives and friends, creating a richer variety of enjoyment. A sketch of such an upload is given below.
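As one hedged illustration of transmitting the finished animation to a remote digital device, the stored file could be uploaded over HTTP as sketched below; the URL and form field name are placeholders, not details from the patent.

```python
import requests


def upload_animation(video_path: str, server_url: str = "https://example.com/upload") -> bool:
    """Send the completed animation file (e.g. an MP4) to a remote server.

    `server_url` is a placeholder; the remote digital device could equally be a mail,
    video, web, social-network, remote-playback, or cloud server as described above.
    """
    with open(video_path, "rb") as f:
        response = requests.post(server_url, files={"animation": f}, timeout=30)
    return response.status_code == 200
```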

The aforementioned natural user interface 12 can be any device capable of providing the skeleton information of an object and can then operate with the present invention. The skeleton information mainly consists of the 3D position, including depth, of each skeleton part; the device may be a present-day 3D camera, a device combining an infrared depth sensor with a camera, or a device using radio-wave tags and radio-wave sensors to obtain the 3D information of the skeleton. For producing 2D animation, the natural user interface 12 only needs to provide 2D skeleton position information. Furthermore, if the natural user interface provides only 3D coordinates, the 3D-to-2D techniques of computer graphics can be applied; that is, a 3D-to-2D coordinate conversion module is added at the output of the natural user interface, and after conversion this module outputs 2D coordinates. A sketch of such a conversion is given below.
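A minimal sketch of such a 3D-to-2D coordinate conversion module follows; a simple pinhole-style perspective projection is assumed purely for illustration, since the text does not specify which computer-graphics projection is used.

```python
def project_3d_to_2d(point, focal_length: float = 1.0):
    """Convert a 3D skeleton coordinate (x, y, z) from the NUI into a 2D coordinate.

    Uses a basic perspective projection onto the plane z = focal_length; an
    orthographic projection (simply dropping z) would also satisfy the text.
    """
    x, y, z = point
    if z <= 0:
        raise ValueError("point must lie in front of the camera (z > 0)")
    return (focal_length * x / z, focal_length * y / z)


# Example: a joint one metre to the right and two metres away maps to x = 0.5.
print(project_3d_to_2d((1.0, 0.0, 2.0)))  # (0.5, 0.0)
```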

11‧‧‧Animation processing device
12‧‧‧Natural user interface
13‧‧‧Actual animated character
14‧‧‧Internet
15‧‧‧Remote digital device
111‧‧‧Digital processing device
112‧‧‧Animation module
113‧‧‧Computer animation character
211‧‧‧Block 211
212‧‧‧Block 212
213‧‧‧Block 213
214‧‧‧Block 214
215‧‧‧Block 215
216‧‧‧Block 216
217‧‧‧Block 217
218‧‧‧Block 218
219‧‧‧Block 219
220‧‧‧Block 220
221‧‧‧Block 221
222‧‧‧Block 222
401‧‧‧Initial drawing
402‧‧‧First frame
403‧‧‧(n-1)th frame
404‧‧‧nth frame
501‧‧‧Step 501
502‧‧‧Step 502
503‧‧‧Step 503
504‧‧‧Step 504
505‧‧‧Step 505
506‧‧‧Step 506
511‧‧‧Step 511
512‧‧‧Step 512
513‧‧‧Step 513
514‧‧‧Step 514
521‧‧‧Step 521
522‧‧‧Step 522
523‧‧‧Step 523
524‧‧‧Step 524
1213‧‧‧Line 1213
1214‧‧‧Line 1214
1215‧‧‧Line 1215
1313‧‧‧Line 1313
1314‧‧‧Line 1314
1315‧‧‧Line 1315
2111‧‧‧Anchor point 2111
2112‧‧‧Anchor point 2112
2121‧‧‧Anchor point 2121
2122‧‧‧Anchor point 2122
2131‧‧‧Anchor point 2131
2132‧‧‧Anchor point 2132
2141‧‧‧Anchor point 2141
2142‧‧‧Anchor point 2142
2151‧‧‧Anchor point 2151
2152‧‧‧Anchor point 2152
2161‧‧‧Anchor point 2161
2162‧‧‧Anchor point 2162
2171‧‧‧Anchor point 2171
2172‧‧‧Anchor point 2172
2181‧‧‧Anchor point 2181
2182‧‧‧Anchor point 2182
2191‧‧‧Anchor point 2191
2192‧‧‧Anchor point 2192
2201‧‧‧Anchor point 2201
2202‧‧‧Anchor point 2202
2211‧‧‧Anchor point 2211
2212‧‧‧Anchor point 2212
2221‧‧‧Anchor point 2221
2222‧‧‧Anchor point 2222

FIG. 1 Architecture of the present invention
FIG. 2 Computer animation character
FIG. 3 Divided blocks of the computer animation character
FIG. 4 First preferred embodiment of the present invention
FIG. 5A Initial drawing of some divided blocks of the computer animation character
FIG. 5B The same divided blocks after the actual animated character has moved
FIG. 6 Calculation of the angle of block 213 before and after moving
FIG. 7 Calculation of the angle of block 214 before and after moving
FIG. 8 Calculation of the angle of block 215 before and after moving
FIG. 9 First arrangement of the anchor points
FIG. 10 Second arrangement of the anchor points
FIG. 11 Second preferred embodiment of the present invention
FIG. 12 Third preferred embodiment of the present invention
FIG. 13 Reference drawing and initial drawing
FIG. 14 Animation drawing flow
FIG. 15 Drawing a single frame
FIG. 16 Drawing a single block
FIG. 17 Network sharing mode for the animation

11‧‧‧Animation processing device

12‧‧‧Natural user interface

13‧‧‧Actual animated character

111‧‧‧Digital processing device

112‧‧‧Animation module

113‧‧‧Computer animation character

Claims (8)

1. A method of creating an animation using a natural user interface, for drawing a single block, comprising the steps of: (a) receiving a second animation position from the natural user interface; (b) calculating the rotation angle between the block's first animation position and the block's second animation position; (c) rotating the block's reference drawing by the rotation angle calculated in the previous step; and (d) translating the first anchor point of the block rotated in the previous step to the position of the block's first anchor point in the second animation position.

2. A method of creating an animation using a natural user interface, for drawing a single frame, comprising the steps of: (a) first determining whether the animation frame currently being drawn is the first frame; (b) if it is the first frame, setting the reference drawing for subsequent animation drawing to the first frame; (c) if it is not the first frame, setting the reference drawing for subsequent animation drawing to the initial drawing; and (d) drawing the animation of all blocks of the frame with the single-block animation drawing method of claim 1.

3. A method of creating an animation using a natural user interface, comprising the steps of: (a) first dividing a computer animation character into a plurality of blocks; (b) drawing the initial drawing of each block of the computer animation character one by one; (c) determining whether drawing of the animation has ended; (d) if it is determined that animation drawing has ended, performing the end-of-animation-drawing operation; (e) if it is determined that animation drawing is to continue, drawing the current frame with the single-frame drawing method of claim 2; (f) if the reference drawing is set to the previous frame, updating the reference drawing to the current frame and waiting a fixed time; and (g) returning to step (c).

4. An apparatus for creating animations using a natural user interface, which uses a natural user interface to obtain the skeleton information of an actual animated character and applies it to a computer animation character to create an animation, comprising: a natural user interface for obtaining the skeleton information of the actual animated character; and an animation processing device for processing the skeleton information returned by the natural user interface to draw the animation, the animation processing device further comprising a digital processing device for executing instructions from an animation module, an animation module containing animation drawing programs for drawing the animation, and a computer animation character for presenting the result of the animation, wherein the animation module is electrically connected to the digital processing device, the digital processing device comprises at least one processor for processing all instructions of the invention and a memory module for storing data, and the natural user interface is electrically connected to the digital processing device.

5. The method of creating an animation using a natural user interface according to claim 1, 2, or 3, wherein the natural user interface is a device that provides 2D or 3D position information of an object's skeleton.

6. The apparatus for creating animations using a natural user interface according to claim 4, wherein the natural user interface is a device that provides 2D or 3D position information of an object's skeleton.

7. The apparatus for creating animations using a natural user interface according to claim 4 or 6, wherein the digital processing device is connected to the Internet and transmits the drawn animation to a server via the Internet.

8. The apparatus for creating animations using a natural user interface according to claim 4, 6, or 7, wherein the server may be a mail server, a web server, a file server, a social website server, a video server, a remote animation player, or a cloud server.
TW102143360A 2013-11-28 2013-11-28 A Method and Apparatus for Creating Animations TWI505176B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW102143360A TWI505176B (en) 2013-11-28 2013-11-28 A Method and Apparatus for Creating Animations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW102143360A TWI505176B (en) 2013-11-28 2013-11-28 A Method and Apparatus for Creating Animations

Publications (2)

Publication Number Publication Date
TW201520875A true TW201520875A (en) 2015-06-01
TWI505176B TWI505176B (en) 2015-10-21

Family

ID=53935042

Family Applications (1)

Application Number Title Priority Date Filing Date
TW102143360A TWI505176B (en) 2013-11-28 2013-11-28 A Method and Apparatus for Creating Animations

Country Status (1)

Country Link
TW (1) TWI505176B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184863A (en) * 2020-10-21 2021-01-05 网易(杭州)网络有限公司 Animation data processing method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002319036A (en) * 2001-02-13 2002-10-31 Sega Corp Animation generation program
TWI295037B (en) * 2005-11-11 2008-03-21 Best Wise Internat Computing Co Ltd Method and system for fast animation manufacture
US8427503B2 (en) * 2009-05-18 2013-04-23 Nokia Corporation Method, apparatus and computer program product for creating graphical objects with desired physical features for usage in animation
US8325192B2 (en) * 2009-07-10 2012-12-04 Microsoft Corporation Creating animations
US20110296352A1 (en) * 2010-05-27 2011-12-01 Microsoft Corporation Active calibration of a natural user interface

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184863A (en) * 2020-10-21 2021-01-05 网易(杭州)网络有限公司 Animation data processing method and device
CN112184863B (en) * 2020-10-21 2024-03-15 网易(杭州)网络有限公司 Animation data processing method and device

Also Published As

Publication number Publication date
TWI505176B (en) 2015-10-21

Similar Documents

Publication Publication Date Title
US11640694B2 (en) 3D model reconstruction and scale estimation
US8913809B2 (en) Monitoring physical body changes via image sensor
US20130293686A1 (en) 3d reconstruction of human subject using a mobile device
US20180276882A1 (en) Systems and methods for augmented reality art creation
US8724849B2 (en) Information processing device, information processing method, program, and information storage medium
CN109671141B (en) Image rendering method and device, storage medium and electronic device
CN109448099A (en) Rendering method, device, storage medium and the electronic device of picture
US20200098177A1 (en) System for reconstructing three-dimensional (3d) human body model using depth data from single viewpoint
WO2019109828A1 (en) Ar service processing method, device, server, mobile terminal, and storage medium
US10818078B2 (en) Reconstruction and detection of occluded portions of 3D human body model using depth data from single viewpoint
CN113822970A (en) Live broadcast control method and device, storage medium and electronic equipment
CN115769260A (en) Photometric measurement based 3D object modeling
JP2024502407A (en) Display methods, devices, devices and storage media based on augmented reality
CN105791390A (en) Data transmission method, device and system
CN115803783A (en) Reconstruction of 3D object models from 2D images
CN113965773A (en) Live broadcast display method and device, storage medium and electronic equipment
CN111046198B (en) Information processing method, device, equipment and storage medium
CN112866741A (en) Gift animation effect display method and system based on 3D face animation reconstruction
CN115802076A (en) Three-dimensional model distributed cloud rendering method and system and electronic equipment
CN108932055B (en) Method and equipment for enhancing reality content
TWI505176B (en) A Method and Apparatus for Creating Animations
CN117221633B (en) Virtual reality live broadcast system based on meta universe and digital twin technology
US20240185496A1 (en) Systems and Methods for Immersive Digital Experiences
US9843642B2 (en) Geo-referencing media content
US20230177788A1 (en) 3d models for augmented reality (ar)

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees