TW201216136A - Interactive projection system and opertating method thereof - Google Patents

Interactive projection system and opertating method thereof

Info

Publication number
TW201216136A
Authority
TW
Taiwan
Prior art keywords
interactive
image
projection system
camera
effect
Prior art date
Application number
TW99133896A
Other languages
Chinese (zh)
Inventor
Ming-Hsien Chou
Te-Sung Liu
Min-Chien Hsiao
Hung-Hsin Chen
Chen-Ming Lu
Original Assignee
Hc Photonics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hc Photonics Corp filed Critical Hc Photonics Corp
Priority to TW99133896A priority Critical patent/TW201216136A/en
Publication of TW201216136A publication Critical patent/TW201216136A/en

Landscapes

  • Projection Apparatus (AREA)
  • User Interface Of Digital Computer (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

An interactive projection system comprises: a projector for projecting an image onto an interactive projection surface; at least one camera for capturing an interactive image, wherein the interactive image is generated when the user interacts with the image projected by the projector; a processor, connected to the projector and the at least one camera, for processing the interactive image, comparing the projected image with the interactive image, and changing the image projected by the projector according to the comparison result, wherein the projector, the at least one camera, and the processor are located on one side of the interactive projection surface; and a fill light, located on the other side of the interactive projection surface, for illuminating the user to enhance the interactive image so that the at least one camera captures the enhanced interactive image.

Description

TECHNICAL FIELD OF THE INVENTION

The present invention relates to a projection system, and more particularly to an interactive projection system.

PRIOR ART

Interactive projection systems are divided into two modes: front projection and rear projection. In the front-projection mode, the projector, the camera, and the user are located on the same side of the screen (which may be a projection screen, a wall, a floor, or any surface onto which an image can be projected). In the rear-projection mode, the projector and the camera are located on one side of the screen, while the user is located on the other side of the screen relative to the projector and the camera.

Because of technical limitations of rear-projection interactive projection systems, user commands are generally not entered by having a camera capture the user's motion and a processor translate that motion into an operating command. Instead, a touch film is provided for the user to touch, or infrared sensors are installed along the edge of the screen to sense the coordinate points blocked by the user, and these serve as the input commands.

Although a rear-projection interactive projection system saves space and is easier to install than a front-projection system, current rear-projection systems rely on touch films or infrared sensors to receive user commands, and therefore suffer from high manufacturing cost.

SUMMARY OF THE INVENTION

To solve the problems of the prior art, a primary objective of the present invention is to provide an interactive projection system that includes a fill light. The fill light is located on the side of the interactive projection surface opposite the projector and the at least one camera, so as to enhance the interactive image and allow the at least one camera to capture a more accurate interactive image.

To achieve the above objective, the present invention provides an interactive projection system. According to one embodiment, the interactive projection system includes a projector, at least one camera, a processor, and a fill light. The projector projects an image onto an interactive projection surface. The at least one camera captures an interactive image of the interactive projection surface, where the interactive image is another image produced when the user interacts with the image projected by the projector. The processor is connected to the projector and the at least one camera; it performs image processing on the captured interactive image, compares the projected image with the interactive image, and changes the image projected by the projector according to the comparison result. The projector, the at least one camera, and the processor are located on one side of the interactive projection surface. The fill light is located on the other side of the interactive projection surface and illuminates the user to enhance the interactive image, so that the at least one camera captures the enhanced interactive image.

According to another embodiment, the present invention provides a method of operating an interactive projection system. The method includes: projecting an image onto an interactive projection surface; performing at least one camera parameter setting; positioning the image-capture range of the at least one camera on the interactive projection surface; testing whether the interactive projection system operates normally; and, if the interactive projection system operates normally, executing a predetermined interactive-effect or video schedule. The interactive-effect or video schedule is arranged by the user, and interactive effects or videos are randomly executed in the idle slots of the schedule to achieve seamless switching.

Therefore, the embodiments of the present invention use a fill light to enhance the interactive image so that the at least one camera captures the enhanced interactive image. When the embodiments are applied to the rear-projection mode, the accuracy of the interactive image captured by the at least one camera is improved; the processor processes the interactive image, and the projector projects the correct image corresponding to the operating command represented by the interactive image. Compared with conventional rear-projection interactive projection systems, which require a touch film or infrared sensors for user input, the embodiments can accurately detect the user's input with only the camera and the fill light, which reduces the manufacturing cost of the interactive projection system.

DETAILED DESCRIPTION OF THE EMBODIMENTS

FIG. 1 is a schematic diagram of an interactive projection system 100 according to one embodiment of the present invention. The interactive projection system 100 includes a projector 110, at least one camera 120, a processor 130, and a fill light 140. The projector 110 projects an image onto an interactive projection surface 150. The at least one camera 120 captures an interactive image of the interactive projection surface 150, where the interactive image is another image produced when the user interacts with the image projected by the projector 110. The processor 130 is connected to the projector 110 and the at least one camera 120; it performs image processing on the interactive image captured by the at least one camera 120, compares the projected image with the interactive image, and changes the image projected by the projector 110 according to the comparison result. The projector 110 and the processor 130 are located on one side of the interactive projection surface 150, and the fill light 140 is located on the other side of the interactive projection surface 150 to illuminate the user and enhance the interactive image, so that the at least one camera 120 captures the enhanced interactive image.
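The disclosure does not specify how the processor 130 performs this comparison. Purely as an illustration, a minimal sketch assuming an OpenCV-style pipeline might isolate the region where the captured interactive image deviates from the projected image as follows; the use of OpenCV, the thresholds, and all names are assumptions, not part of the disclosed system.

```python
# Hypothetical sketch of the comparison step attributed to processor 130.
# Assumes OpenCV 4.x (cv2) and that captured_frame has already been warped
# into the projector's coordinate frame (e.g. via a calibration homography).
import cv2

def find_interaction_region(projected_frame, captured_frame, min_area=500):
    """Return the bounding box (x, y, w, h) of the region where the captured
    image deviates from the projected image (e.g. where the user occludes it),
    or None if no significant deviation is found."""
    # Work in grayscale so the comparison is insensitive to color casts.
    proj_gray = cv2.cvtColor(projected_frame, cv2.COLOR_BGR2GRAY)
    capt_gray = cv2.cvtColor(captured_frame, cv2.COLOR_BGR2GRAY)

    # Absolute difference highlights pixels changed by the user's body/shadow.
    diff = cv2.absdiff(proj_gray, capt_gray)
    diff = cv2.GaussianBlur(diff, (5, 5), 0)           # suppress sensor noise
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

    # Keep only the largest changed region above a minimum area.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)
```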

As shown in FIG. 1, the projector 110 projects a preset image onto the interactive projection surface 150. When the user interacts with this image, the user's hand or another body part blocks light entering the interactive projection system 100 from outside (for example, within the interactive projection area the user moves or waves a hand or foot relative to the image projected by the projector 110; or, when the interactive projection system 100 runs a soccer or brick-breaking game, a part of the user's body, or an object the user holds, stays in the path of the ball, thereby blocking the image projected by the projector 110; in each case the light is shaded). This shading produces the interactive image. The at least one camera 120 then captures the interactive image, and the processor 130 performs image processing on it. After the image processing, the processor 130 compares the interactive image with the original image projected on the interactive projection surface 150, and can thus identify the action the user performed with respect to the original image.

Specifically, when the user waves a hand or foot within the interactive projection area, or blocks the original image, as described above, the processor 130 compares the interactive image with the original image projected on the interactive projection surface 150, identifies the action performed with respect to the original image, and generates the operating command that the action represents. For example, the original image may contain a button image; if the user's action corresponds to touching this button, the processor 130 compares the images, recognizes the action, and generates the operating command represented by touching the button image. The projector 110 can then change the projected image according to the comparison result, so that it projects an image corresponding to the command entered by the user. The projector 110 and the processor 130 are located on the same side of the interactive projection surface 150.
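As an illustration of how such a comparison result could be mapped to an operating command (for example, the button-touch case just described), a hypothetical hit-test sketch follows; the button table, coordinates, and command names are invented for illustration, since the patent only states that the processor identifies the action and issues the corresponding command.

```python
# Hypothetical mapping from an occluded region to an operating command,
# continuing the differencing sketch above. All layout values are invented.
BUTTONS = {
    "play":  (100, 100, 120, 60),   # x, y, w, h in projector coordinates
    "pause": (260, 100, 120, 60),
}

def rects_overlap(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def region_to_command(region):
    """Return the name of the button that the occluding region touches."""
    if region is None:
        return None
    for name, rect in BUTTONS.items():
        if rects_overlap(region, rect):
            return name          # processor 130 would then update the
    return None                  # projected image to reflect this command
```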
As described above, the fill light 140 of this embodiment may be placed on the side of the interactive projection surface 150 opposite the at least one camera 120, to illuminate the user and enhance the interactive image so that the at least one camera 120 captures the enhanced interactive image. This improves the accuracy of the interactive image obtained by the at least one camera 120; the processor 130 processes the interactive image, and the projector 110 can then project the correct image corresponding to the operating command entered through the interactive image. In addition, the processor 130 has a dynamic contrast ratio (DCR) adjustment program that strengthens the contrast between the bright and dark portions of the image captured by the at least one camera 120, and thereby increases the contrast of the interactive image.

In this embodiment, the interactive projection system 100 supports both the front-projection mode and the rear-projection mode. As shown in FIG. 1, the projector 110 and the processor 130 are located on one side of the interactive projection surface 150. When the user is on the same side, the user can adjust the settings of the processor 130 so that the interactive projection system 100 operates in the front-projection mode; when the user is on the other side of the interactive projection surface 150, the user can adjust the settings of the processor 130 so that the system operates in the rear-projection mode.

The light emitted by the fill light 140 is invisible light. The interactive projection surface 150 may be glass, a wall, or a screen; when glass is used, a rear-projection imaging film is attached to the glass so that the image can be projected onto it.

In this embodiment, the processor 130 executes a predetermined interactive-effect or video schedule, and the projector 110 projects the interactive effects or videos. The schedule is arranged by the user, and interactive effects or videos are randomly executed in the idle slots of the schedule to achieve seamless switching. The processor 130 illustratively has a built-in system operation platform on which the interactive effects or videos to be played can be scheduled.

The interactive-effect or video schedule may be arranged by the user, and in the idle slots of the schedule the processor 130 selects suitable items on its own (for example, a single interactive effect or video whose playback time is close to, but shorter than, the idle slot; or several interactive effects or videos whose total playback time is close to, but shorter than, the idle slot) and supplies them to the projector 110 for projection onto the interactive projection surface 150. This random execution of interactive effects or videos in the idle slots of the schedule is referred to as "seamless switching." An interactive effect may be a flash effect, a C-language effect, or an executable file. In addition to C-language effects, the embodiments can mount flash effects, so that a user applying the interactive projection system 100 can mount self-written flash effects on the system operation platform. The user can also compile an effect into an executable file and mount it on the system operation platform. An interactive effect includes a base image and an effect pattern, both of which are set by the user.

FIG. 2 is a flowchart of a method of operating an interactive projection system according to another embodiment of the present invention. Referring to FIG. 1 and FIG. 2 together, the method proceeds as follows. First, an image is projected onto the interactive projection surface 150 (step 210). Next, at least one camera 120 parameter setting is performed (step 220). The image-capture range of the at least one camera 120 is then positioned on the interactive projection surface 150 (step 230), and the interactive projection system 100 is tested for normal operation (step 240). Finally, if the interactive projection system 100 operates normally, the predetermined interactive-effect or video schedule is executed (step 250); if it does not operate normally, the camera parameter setting step (step 220) is performed again, as sketched below. The schedule is arranged by the user, and interactive effects or videos are randomly executed in the idle slots of the schedule for seamless switching.
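The following minimal sketch shows one way the control flow of FIG. 2 (steps 210 to 250), including the return to step 220 on failure, could be organized. The five callables stand in for whatever the real system provides, and the attempt cap is an added safety assumption, not part of the disclosure.

```python
# Sketch of the operating flow of FIG. 2 (steps 210-250); only the control
# flow follows the description, all step implementations are placeholders.
def run_interactive_projection_system(project_image,         # step 210
                                      set_camera_params,     # step 220
                                      locate_capture_range,  # step 230
                                      test_normal_operation, # step 240
                                      run_schedule,          # step 250
                                      max_attempts=10):
    project_image()
    for _ in range(max_attempts):
        set_camera_params()
        locate_capture_range()
        if test_normal_operation():
            run_schedule()
            return True
        # Not operating normally: re-execute the parameter setting (step 220).
    return False

# Example invocation with trivial stand-ins:
ok = run_interactive_projection_system(
    lambda: print("projecting"), lambda: print("camera set"),
    lambda: print("range located"), lambda: True, lambda: print("schedule"))
```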
In step 220 of this embodiment, the parameters of the at least one camera 120 may include, for example, the signal source and the decoding standard (such as the European or the American standard). After the setting is completed, the image-capture range of the at least one camera 120 is positioned on the interactive projection surface 150 to capture the interactive image. The test in step 240 is illustratively carried out with a verification screen built into the interactive projection system 100: the verification screen lets the user actually interact with the image projected by the projector 110, so that normal operation of the interactive projection system 100 can be checked.

The interactive-effect or video schedule executed in step 250 may illustratively be run by the processor 130, which has a built-in system operation platform for scheduling the interactive effects or videos to be played. The schedule can be arranged by the user, and in the idle slots of the schedule the processor 130 randomly selects suitable interactive effects or videos on its own (for example, a single interactive effect or video whose playback time is close to, but shorter than, the idle slot; or several interactive effects or videos whose total playback time is close to, but shorter than, the idle slot) and supplies them to the projector 110 for projection onto the interactive projection surface 150. This random execution of interactive effects or videos in the idle slots is referred to as "seamless switching."
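The patent only states the selection criterion for the idle slots; one hypothetical way to realize it is a greedy random selection, sketched below. The data shape, tolerance, and clip names are assumptions made for illustration.

```python
# Hypothetical idle-slot ("seamless switching") selection: pick effects or
# videos whose total playback time comes close to, but does not exceed,
# the length of the idle slot.
import random

def fill_idle_slot(idle_seconds, library, tolerance=2.0):
    """library: list of (name, duration_seconds) pairs. Returns a playlist
    whose total duration fits within idle_seconds and, if possible, ends
    within `tolerance` seconds of filling the slot."""
    remaining = idle_seconds
    playlist = []
    candidates = [item for item in library if item[1] <= remaining]
    while candidates:
        name, duration = random.choice(candidates)
        playlist.append(name)
        remaining -= duration
        if remaining <= tolerance:          # close enough to the slot length
            break
        candidates = [item for item in library if item[1] <= remaining]
    return playlist

# Example: a 30-second gap between scheduled items.
clips = [("ripple_effect", 12.0), ("logo_video", 18.0), ("fireworks", 7.5)]
print(fill_idle_slot(30.0, clips))
```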

In this embodiment, after the step of testing whether the interactive projection system 100 operates normally (step 240), the camera parameter setting step (step 220) is performed again if the interactive projection system 100 does not operate normally.

統100時,若互動式投影系统1〇〇無法正常運作,則二回 步驟220以重新設定至少一攝影機12()之參數,接著執二 步驟230以定位至少一攝影機12〇擷取影像之範圍。然後仃 再次執行步驟240以測試互動式投影系統1〇〇是否得以正 常運作,直到互動式投影系統1〇〇可正常運作時, 步驟250。 订 此外,刖述互動特效為flash特效、c語言特效或執行 檔(例如:任何exe執行檔)。本發明實施例除可例示性地掛 載C語言特效外,尚可掛载flash特效,讓使用者於應用互 動式投影系統100時’可將flash特效掛載於前述系統操作 平台,因此使用者可掛载自行撰寫的flash特效。此外,使 用者可將特效製成執行檔,其亦可掛載於前述系統操作平 台。其中互動特效包含底圖與特效式樣,其中底圖與特效 式樣係由使用者自行設定。 第3圖係依照本發明再一實施方式繪示一種互動式投 影系統300俯視圖。相較於第i圖,第3圖中之互動式投 影系統300包含兩攝影機,前述兩攝影機是為第一攝 201216136 =以及第二攝影機320,分別位於互動投影面(例如:螢 )33〇的,側邊,用以增強前述兩攝影機所擷取之互動影 像,而提升價測物件的定位點,讓使用者與投影機所投 射之影像it行互動時所產生之互動影像更為精確。 在本實施例中,採用兩個攝影機的方式是模擬人眼觀 察物體的模式。如第3圖所示之影像擷取區340,無論是 第一攝影機310或者第二攝影機320皆可擷取到影像擷取 匚340内的互動影像,因此,利用則述兩攝影機所取得互 φ 動影像的影像差,可以更精確地偵測物件的定位點。其次, 當第一攝影機310所欲擷取的互動影像被阻擋時,第二攝 影機320尚可取得互動影像,反之亦然。如此一來,可以 降低攝影機擷取影像的錯誤。在本實施例中,使用者是位 於影像掏取區340内。 第4圖係依照本發明第3圖繪示一種互動式投影系統 400前視圖。如第4圖所示,互動式投影系統400之第一 攝影機410與第二攝影機420分別位於互動投影面(例如: $ 螢幕)430的兩側邊上緣。 第5圖係依照本發明第3圖繪示又一種互動式投影系 統500前視圖。相較於第4圖’第5圖中的互動式投影系 統500之第一攝影機510與第二攝影機520分別位於互動 投影面(例如:螢幕)530的兩側邊下緣。 第6圖係依照本發明再一實施方式繪示一種互動式投 影系統600俯視圖。相較於第1圖’第6圖中之互動式投 ' 影系統600包含兩攝影機’前述兩攝影機是為第一攝影機 610以及第二攝影機620,位在以互動投影面630為基準, 12 201216136 相對於投影機所在位置的另一方,用以增強前述兩攝影機 所擷取之互動影像,進而提升偵測物件的定位點,讓使用 者與投影機所投射之影像進行互動時所產生之互動影像更 為精確。 在本實施例中,採用兩個攝影機的方式是模擬人眠觀 察物體的模式。如第6圖所示之影像操取區640,無論是 ^ 一攝影機610或者第二攝影機620皆可擷取到影像擷取 區640内的互動影像,因此,利用前述兩攝影機所取得互 φ 動影像的影像差,可以更精確地偵測物件的定位點。其次, 當第一攝影機610所欲擷取的互動影像被阻擋時,第二攝 衫機620尚可取得互動影像’反之亦然。如此一來,可以 降低攝影機擷取影像的錯誤。在本實施例中,使用者是位 於影像榻取區640内。 第7圖係依照本發明第6圖繪示一種互動式投影系統 前視圖。如第7圖所示,互動式投影系統7〇〇之第一攝影 機710與第二攝影機720位於互動投影面(例如:螢幕)73〇 φ 之另一方的下緣。舉例而言,前述兩個攝影機可分別位於 使用者的腳跟後方,並與使用者相距一段距離,前述距離 以可使攝影機準確地擷取到互動影像為準。 由上述本發明實施方式可知’應用本發明具有下列優 點。本發明實施例利用補光燈以增強互動影像,使至少一 攝影機擷取增強後之互動影像,而使本發明實施例應用於 背投影模式時,得以提高至少一攝影機所擷取之互動影像 ' 的準確性,並經由處理器對互動影像進行處理,進而提供 投影機投射相應於互動影像所輸入之操作指令的正確影 13 201216136 11 μ述至少—攝影機的數目為二時,相對於使用單一 而:林/1可用以增強前述兩攝影機所掏取之互動影像’進 而袄升偵測物件的定位點’讓使用 像進行互動時所產生之互動影像更為精^機所技射之衫 外二例除:例示性的掛載c _ 議時’可…特效掛載於前述系 使用者可掛载自行撰寫的flash特效。再去 ' 此, φ _成執行檔,其村掛餘前料統操作=者可將特 雖然本發明已以實施方式揭露如上,妙 定本發日月,任何熟習此技藝者,在 2、、’非用以限 範圍内’當可作各種之更動與潤飾,因t發明之精神和 圍當視後附之申請專利範園所界定者為準。發明之保護範 【圖式簡單說明】 為讓本發明之上述和其他目的、 鲁能更明顯易懂,所附圖式之說明如下:、主、優點與實施例 第1圖騎核照本制—實_ 衫系統示意圖。 種互動式才又 第2圖係繪示依照本發明另—實 投影系統操作方法流程圖。 式的一種互動式 第3圖係繪示依照本發明再—實施 、 投影系統俯視圖。 式的一種互動式 _ 帛4圖躲示依照本發日卜實施例的 系統前視圖。 種互動式投衫 201216136 第5圖係繪示依照本發明又一實施例的一種互動式投 影系統前視圖。 第6圖係繪示依照本發明再一實施方式的一種互動式 投影系統俯視圖。 第7圖係繪示依照本發明一實施例的一種互動式投影 系統前視圖。In the case of 100, if the interactive projection system 1 is not functioning properly, the second step 220 is to reset the parameters of at least one camera 12(), and then the second step 230 is performed to locate at least one camera 12 to capture the range of the image. . Then, step 240 is performed again to test whether the interactive projection system 1 is functioning normally until the interactive projection system 1 is functioning normally, step 250. In addition, the interactive effects are flash effects, c-language effects or executables (for example: any exe executable). In addition to the exemplary C-language effect, the embodiment of the present invention can also mount a flash special effect, so that when the user applies the interactive projection system 100, the flash special effect can be mounted on the system operating platform, so the user You can mount your own flash effects. In addition, the user can make the effect into an executable file, which can also be mounted on the aforementioned system operation platform. The interactive effects include basemap and special effects styles, where the basemap and special effects styles are set by the user. 
FIG. 3 is a top view of an interactive projection system 300 according to a further embodiment of the present invention. Compared with FIG. 1, the interactive projection system 300 of FIG. 3 includes two cameras, a first camera 310 and a second camera 320, located respectively at the two sides of the interactive projection surface (for example, a screen) 330. The two cameras enhance the captured interactive images and improve the localization of the detected object, so that the interactive image produced when the user interacts with the image projected by the projector is more precise.

In this embodiment, using two cameras imitates the way human eyes observe an object. As shown by the image capture area 340 in FIG. 3, both the first camera 310 and the second camera 320 can capture the interactive image within the image capture area 340. The image difference between the views obtained by the two cameras therefore allows the position of the object to be detected more precisely. Furthermore, when the interactive image that the first camera 310 is intended to capture is blocked, the second camera 320 can still capture it, and vice versa, which reduces image-capture errors. In this embodiment, the user is located within the image capture area 340.

FIG. 4 is a front view of an interactive projection system 400 corresponding to FIG. 3. As shown in FIG. 4, the first camera 410 and the second camera 420 of the interactive projection system 400 are located at the upper edges of the two sides of the interactive projection surface (for example, a screen) 430. FIG. 5 is a front view of another interactive projection system 500 corresponding to FIG. 3; in contrast to FIG. 4, the first camera 510 and the second camera 520 of the interactive projection system 500 are located at the lower edges of the two sides of the interactive projection surface (for example, a screen) 530.

FIG. 6 is a top view of an interactive projection system 600 according to a further embodiment of the present invention. Compared with FIG. 1, the interactive projection system 600 of FIG. 6 includes two cameras, a first camera 610 and a second camera 620, located on the side of the interactive projection surface 630 opposite the projector. The two cameras enhance the captured interactive images and improve the localization of the detected object, so that the interactive image produced when the user interacts with the projected image is more precise. As in the embodiment of FIG. 3, the two cameras imitate the way human eyes observe an object: both the first camera 610 and the second camera 620 can capture the interactive image within the image capture area 640 shown in FIG. 6, the image difference between the two views allows the object position to be detected more precisely, and when the view of one camera is blocked the other can still capture the interactive image, which reduces image-capture errors. In this embodiment, the user is located within the image capture area 640.
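The patent does not describe how the two views are combined; the sketch below merely illustrates, under invented assumptions, the redundancy and refinement behavior described above: average the two detections when both cameras see the interaction, and fall back to the remaining camera when one view is blocked.

```python
# Hypothetical combination of detections from the two cameras (e.g. 310/320
# or 610/620). Each detection is an (x, y) point in a shared coordinate
# frame, or None when that camera's view of the interaction is blocked.
def fuse_detections(point_cam_a, point_cam_b):
    if point_cam_a is not None and point_cam_b is not None:
        # Both cameras see the interaction: combine the two views to refine
        # the estimated position of the object.
        return ((point_cam_a[0] + point_cam_b[0]) / 2.0,
                (point_cam_a[1] + point_cam_b[1]) / 2.0)
    # One view is blocked: keep whichever camera still sees the interaction.
    return point_cam_a if point_cam_a is not None else point_cam_b

print(fuse_detections((120.0, 80.0), (126.0, 84.0)))  # both cameras agree
print(fuse_detections(None, (126.0, 84.0)))           # camera A is blocked
```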
FIG. 7 is a front view of an interactive projection system 700 corresponding to FIG. 6. As shown in FIG. 7, the first camera 710 and the second camera 720 of the interactive projection system 700 are located at the lower edge on the far side of the interactive projection surface (for example, a screen) 730. For example, the two cameras may be placed behind the user's heels, at a distance from the user sufficient for the cameras to capture the interactive image accurately.

It is apparent from the embodiments described above that the present invention offers the following advantages. The embodiments use a fill light to enhance the interactive image so that the at least one camera captures the enhanced interactive image; when applied to the rear-projection mode, this improves the accuracy of the captured interactive image, and after the processor processes the interactive image, the projector can project the correct image corresponding to the operating command entered through the interactive image. When the number of cameras is two, the interactive images captured by the two cameras reinforce each other and improve the localization of the detected object, making the interactive image produced by the user's interaction with the projected image more precise than with a single camera. In addition to C-language effects, flash effects can be mounted on the system operation platform, so users can mount self-written flash effects, and effects can also be compiled into executable files and mounted on the platform.

Although the present invention has been disclosed above by way of embodiments, these embodiments are not intended to limit the invention. Anyone skilled in the art may make various changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

BRIEF DESCRIPTION OF THE DRAWINGS

To make the above and other objectives, features, advantages, and embodiments of the present invention more readily understandable, the accompanying drawings are described as follows. FIG. 1 is a schematic diagram of an interactive projection system according to one embodiment of the present invention. FIG. 2 is a flowchart of a method of operating an interactive projection system according to another embodiment of the present invention. FIG. 3 is a top view of an interactive projection system according to a further embodiment of the present invention. FIG. 4 is a front view of an interactive projection system according to an embodiment of the present invention. FIG. 5 is a front view of an interactive projection system according to yet another embodiment of the present invention. FIG. 6 is a top view of an interactive projection system according to a further embodiment of the present invention. FIG. 7 is a front view of an interactive projection system according to an embodiment of the present invention.

DESCRIPTION OF MAIN REFERENCE NUMERALS

100: interactive projection system; 110: projector; 120: at least one camera; 130: processor; 140: fill light; 150: interactive projection surface; 210–250: steps; 300: interactive projection system; 310: first camera; 320: second camera; 330: interactive projection surface; 340: image capture area; 400: interactive projection system; 410: first camera; 420: second camera; 430: interactive projection surface; 500: interactive projection system; 510: first camera; 520: second camera; 530: interactive projection surface; 600: interactive projection system; 610: first camera; 620: second camera; 630: interactive projection surface; 640: image capture area; 700: interactive projection system; 710: first camera; 720: second camera; 730: interactive projection surface

Claims (1)

Scope of the patent application:

1. An interactive projection system, comprising: a projector for projecting an image onto an interactive projection surface; at least one camera for capturing an interactive image of the interactive projection surface, wherein the interactive image is another image produced when a user interacts with the image projected by the projector; a processor, connected to the projector and the at least one camera, for performing image processing on the interactive image captured by the at least one camera, comparing the image with the interactive image, and changing the image projected by the projector according to the comparison result, wherein the projector and the processor are located on one side of the interactive projection surface; and a fill light, located on the other side of the interactive projection surface, for illuminating the user to enhance the interactive image so that the at least one camera captures the enhanced interactive image.

2. The interactive projection system of claim 1, wherein the interactive image is the other image produced when the user moves, waves, or blocks with respect to the image projected by the projector.

3. The interactive projection system of claim 1, wherein the interactive projection system has a front-projection mode and a rear-projection mode.

4. The interactive projection system of claim 1, wherein the light emitted by the fill light is invisible light.

5. The interactive projection system of claim 1, wherein the processor is configured to execute a predetermined interactive-effect or video schedule, and the projector projects the interactive effect or the video.

6. The interactive projection system of claim 5, wherein the interactive-effect or video schedule is arranged by the user, and the interactive effect or video is randomly executed in an idle slot of the interactive-effect or video schedule for seamless switching.

7. The interactive projection system of claim 5, wherein the interactive effect is a flash effect, a C-language effect, or an executable file, and the interactive effect comprises a base image and an effect pattern, wherein the base image and the effect pattern are set by the user.

8. The interactive projection system of claim 1, wherein, when the number of the at least one camera is two, the two cameras are respectively disposed at the two ends of the interactive projection surface or on said side of the interactive projection surface, for enhancing the interactive images captured by the two cameras.

9. A method of operating an interactive projection system, comprising: projecting an image onto an interactive projection surface; performing at least one camera parameter setting; positioning an image-capture range of the at least one camera on the interactive projection surface; testing whether the interactive projection system operates normally; and, if the interactive projection system operates normally, executing a predetermined interactive-effect or video schedule; wherein the interactive-effect or video schedule is arranged by the user, and the interactive effect or video is randomly executed in an idle slot of the interactive-effect or video schedule for seamless switching.

10. The method of claim 9, wherein, after the step of testing whether the interactive projection system operates normally, the at least one camera parameter setting step is performed again if the interactive projection system does not operate normally.

11. The method of claim 9, wherein the interactive effect is a flash effect, a C-language effect, or an executable file.

12. The method of claim 9, wherein the interactive effect comprises a base image and an effect pattern, and the base image and the effect pattern are set by the user.
TW99133896A 2010-10-05 2010-10-05 Interactive projection system and opertating method thereof TW201216136A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW99133896A TW201216136A (en) 2010-10-05 2010-10-05 Interactive projection system and opertating method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW99133896A TW201216136A (en) 2010-10-05 2010-10-05 Interactive projection system and opertating method thereof

Publications (1)

Publication Number Publication Date
TW201216136A true TW201216136A (en) 2012-04-16

Family

ID=46787098

Family Applications (1)

Application Number Title Priority Date Filing Date
TW99133896A TW201216136A (en) 2010-10-05 2010-10-05 Interactive projection system and opertating method thereof

Country Status (1)

Country Link
TW (1) TW201216136A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI506483B (en) * 2013-12-13 2015-11-01 Ind Tech Res Inst Interactive writing device and operating method thereof using adaptive color identification mechanism
US9454247B2 (en) 2013-12-13 2016-09-27 Industrial Technology Research Institute Interactive writing device and operating method thereof using adaptive color identification mechanism

Similar Documents

Publication Publication Date Title
CN110581947B (en) Taking pictures within virtual reality
US9001226B1 (en) Capturing and relighting images using multiple devices
US9787943B2 (en) Natural user interface having video conference controls
US8957856B2 (en) Systems, methods, and apparatuses for spatial input associated with a display
US8818027B2 (en) Computing device interface
US8350896B2 (en) Terminal apparatus, display control method, and display control program
TWI307637B (en)
US20140002472A1 (en) Augmented reality surface painting
CN101017316A (en) Correction method for deformation of multiscreen playing suitable for irregular screen
JP5916248B2 (en) Image generating apparatus, image generating method, program, and computer-readable information storage medium
CN107291221B (en) Across screen self-adaption accuracy method of adjustment and device based on natural gesture
JP2002247602A (en) Image generator and control method therefor, and its computer program
CN103748893A (en) Display as lighting for photos or video
CN107770513A (en) Image-pickup method and device, terminal
TW578031B (en) Projection system with an image capturing device
Marner et al. Exploring interactivity and augmented reality in theater: A case study of Half Real
CN106210701A (en) A kind of mobile terminal for shooting VR image and VR image capturing apparatus thereof
TW201337639A (en) Finger-pointing direction image control device and method thereof
TW201216136A (en) Interactive projection system and opertating method thereof
TW201016275A (en) Image system for adjusting displaying angle by detecting human face and visual simulation control apparatus thereof
CN105302283B (en) The control system and its control method of mapping projections
TWI357063B (en)
JP2013218423A (en) Directional video control device and method
Gokcezade et al. Lighttracker: An open-source multitouch toolkit
CN108351576A (en) Method for shooting image by mobile device