TWI822267B - Immersive ecological house interactive learning system - Google Patents


Info

Publication number
TWI822267B
TWI822267B (application TW111131469A)
Authority
TW
Taiwan
Prior art keywords
module
immersive
house
ecological
interactive learning
Prior art date
Application number
TW111131469A
Other languages
Chinese (zh)
Other versions
TW202409991A (en)
Inventor
林逸農
Original Assignee
淡江大學學校財團法人淡江大學 (Tamkang University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 淡江大學學校財團法人淡江大學 filed Critical 淡江大學學校財團法人淡江大學
Priority to TW111131469A priority Critical patent/TWI822267B/en
Application granted granted Critical
Publication of TWI822267B publication Critical patent/TWI822267B/en
Publication of TW202409991A publication Critical patent/TW202409991A/en

Links

Images

Abstract

An immersive eco-house interactive learning system is disclosed. It includes a house body, at least one display module, a guided walker module, a computer system, a question-and-answer module, and an image recognition module. The at least one display module displays an ecological-environment image. The guided walker module lets the user walk or run in place and outputs a displacement signal, according to which the computer system controls the display modules to synchronously present moving images. The question-and-answer module extracts the corresponding voice answer from the computer system according to the user's spoken question and replies to the user. The image recognition module performs an image recognition procedure on the part of the ecological-environment image touched by the user and feeds the recognition result back to the computer system, which retrieves the corresponding ecological teaching data and displays it on the display modules.

Description

Immersive Eco-House Interactive Learning System

The present invention relates to an interactive learning system, and in particular to an immersive eco-house interactive learning system.

In recent years, research on educational applications of virtual reality (VR) technology has grown rapidly, and the results indicate that VR-assisted learning has a significant positive effect on learning motivation. Given this advantage, it is worth exploring how learning content can be designed around VR-assisted instruction.

Take VR technology as an example: the sense of immersion delivered by a head-mounted display is a major attraction. However, the weight of a head-mounted display has long been a difficult problem; young users with insufficient head and neck strength must use it with particular care. Interaction also requires prior positioning: sensing cameras must be installed in the virtual-reality space to capture the user's position and movements. As for hand-motion sensing, in order to improve interactivity and the precision of command input, VR users usually need to wear wired gloves or hold handheld controllers to issue commands to objects in the virtual environment, such as "execute" and "cancel", or more complex "grab" and "twist" actions.

However, wearing the aforementioned head-mounted display, wired gloves, or handheld controller is not only inconvenient but also restricts the learner's freedom of movement. In addition, a head-mounted display causes heat build-up and pressure on the user's head.

Therefore, how to provide an "immersive eco-house interactive learning system" has become a problem awaiting a solution in the field.

An embodiment of the present invention provides an immersive eco-house interactive learning system, including a house body, at least one display module, a guided walker module, a computer system, a question-and-answer module, and an image recognition module. The at least one display module displays an ecological-environment image. The guided walker module allows the user to walk or run in place and outputs a displacement signal. The computer system controls the display modules to synchronously present moving images according to the displacement signal. The question-and-answer module extracts the corresponding voice answer from the computer system according to the user's spoken question and replies to the user. The image recognition module performs an image recognition procedure on the part of the ecological-environment image touched by the user and feeds the recognition result back to the computer system, which retrieves the corresponding ecological teaching data and presents it on the display modules.

In some embodiments, the system further includes an image-recognition autonomous robot, deployed at a remote site and connected to the computer system, which presents the footage it captures on the display modules.

In some embodiments, the house body further includes a dome structure and a polygonal wall connected to the dome structure, on which the display modules are mounted.

In some embodiments, the dome structure is hemispherical and the polygonal wall is hexagonal.

In some embodiments, a projection module is also provided on the polygonal wall to project the ecological-environment image.

In some embodiments, the ecological teaching data covers mammals, birds, reptiles, amphibians, fish, and insects.

In some embodiments, the image recognition module includes a neural network model.

In some embodiments, the neural network model is a YOLO model, an R-CNN model, a CTPN model, or an EAST model.

In some embodiments, the guided walker module is an omnidirectional treadmill.

To make the present invention clearer and easier to understand, embodiments are described in detail below in conjunction with the accompanying drawings.

10: House body
12: Dome structure
14: Polygonal wall
20: Display module
22: Projection module
30: Guided walker module
40: Computer system
50: Question-and-answer module
60: Image recognition module
70: Image-recognition autonomous robot
100, 110, 120: Immersive eco-house interactive learning system

Figure 1 is a system block diagram of an embodiment of the present invention.

Figure 2 is a system block diagram of another embodiment of the present invention.

Figure 3 is a system block diagram of yet another embodiment of the present invention.

Figure 4 is a schematic diagram of the external structure of an embodiment of the present invention.

Figure 5 is a schematic perspective view of an embodiment of the present invention.

Specific implementations of the present invention are further described below with reference to the accompanying drawings and embodiments. The following embodiments serve only to illustrate the technical solution of the present invention more clearly and do not limit its scope of protection.

For clarity and convenience of illustration, the components in the drawings may be enlarged or reduced in size and proportion. In the following description and/or claims, when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to that element, or intervening elements may be present; when an element is referred to as being "directly connected" or "directly coupled" to another element, no intervening elements are present. Other words used to describe relationships between elements or layers should be interpreted in the same manner. Ordinal terms such as "first" and "second" imply no sequence; they merely distinguish two different elements that share the same name. For ease of understanding, identical elements in the following embodiments are labeled with the same reference numerals.

Please refer to Figure 1, a system block diagram of an embodiment of the present invention. As shown in Figure 1, the immersive eco-house interactive learning system 100 includes a house body 10, a display module 20, a guided walker module 30, a computer system 40, a question-and-answer module 50, and an image recognition module 60.

The house body 10 provides an enclosed space that accommodates the display module 20, the guided walker module 30, the computer system 40, the question-and-answer module 50, and the image recognition module 60. In practice, the house body 10 may be built from materials such as wood, concrete, Styrofoam, and/or cardboard.

The display module 20 is located inside the house body 10 and displays an ecological-environment image. It may consist of, for example, one or more liquid-crystal displays or touch displays.

The guided walker module 30 is located inside the house body 10 and is connected to the display module 20. It allows the user to walk or run in place and outputs a displacement signal. The guided walker module 30 may be, for example, an omnidirectional treadmill. The user can turn toward the direction in which he or she wants to walk, and the projected view follows: walking forward, the user sees the wall ahead display the camera moving forward; turning right in place and walking on shows a different ecological scene. In other words, the guided walker module 30 determines the orientation of the camera view as the user walks; the module and the camera orientation are linked. It also lets the user select which image content to view, providing a mouse-like pointing function.
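The linkage between the treadmill's displacement signal and the camera view described above can be sketched in a few lines. This is a minimal illustration only; the signal format, units, and all names here are assumptions, not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class CameraPose:
    """Virtual camera pose driven by the guided walker module (illustrative)."""
    x: float = 0.0        # position in the virtual ecological scene (metres, assumed)
    y: float = 0.0
    yaw_deg: float = 0.0  # heading; 0 = straight ahead, increasing clockwise

    def apply_displacement(self, distance: float, turn_deg: float = 0.0) -> None:
        """Consume one displacement signal: turn in place, then step forward."""
        self.yaw_deg = (self.yaw_deg + turn_deg) % 360.0
        rad = math.radians(self.yaw_deg)
        self.x += distance * math.sin(rad)
        self.y += distance * math.cos(rad)

pose = CameraPose()
pose.apply_displacement(distance=2.0)                 # walk straight ahead
pose.apply_displacement(distance=1.0, turn_deg=90.0)  # turn right in place, walk on
print(round(pose.x, 2), round(pose.y, 2), pose.yaw_deg)
```

Each treadmill reading advances the camera along the current heading, so the displayed scene moves in lockstep with the user's steps, as the paragraph above describes.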

The computer system 40 is connected to the guided walker module 30 and, according to the displacement signal, controls the display module 20 to synchronously present a moving image corresponding to the ecological-environment image. The computer system 40 provides instruction processing, a database, display output, and wired or wireless network connectivity.

The question-and-answer module 50 is connected to the computer system 40. It extracts the corresponding voice answer from the computer system 40 according to the user's spoken question and replies to the user. For example, a microphone can be installed in the house body 10 to pick up the user's question; keywords in the question are identified and matched against the database of the computer system 40, and the voice answer is played back to the user.
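The keyword-matching step described above can be sketched as follows. The keyword table and answer texts are invented for the example and stand in for the computer system's database; they are not from the patent.

```python
# Minimal keyword-based answer lookup, standing in for the question-and-answer
# module's match against the computer system's database (illustrative data).
ANSWER_DB = {
    "breeding season": "The blue magpie's breeding season runs from spring to summer.",
    "diet": "Blue magpies are omnivorous, eating fruit, insects, and small animals.",
}

def answer(question: str) -> str:
    """Return the stored answer whose keyword appears in the spoken question."""
    q = question.lower()
    for keyword, text in ANSWER_DB.items():
        if keyword in q:
            return text
    return "Sorry, no matching answer was found."

print(answer("When is the blue magpie's breeding season?"))
```

In the real system the question text would come from speech recognition and the reply would be synthesized as audio; this sketch shows only the database-matching step in between.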

The image recognition module 60 is connected to the computer system 40. It performs an image recognition procedure on the part of the ecological-environment image that the user touches on the display module 20 and feeds the recognition result back to the computer system 40, which retrieves the corresponding ecological teaching data and presents it on the display module 20. The ecological teaching data covers mammals, birds, reptiles, amphibians, fish, and insects. The image recognition module 60 includes a neural network model, such as a YOLO, R-CNN, CTPN, or EAST model.
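The touch-to-teaching-data flow can be illustrated with a toy example: detections are represented as labeled bounding boxes, the kind of output a YOLO-style detector produces, and a touch coordinate selects the box it falls inside. All names and data here are assumptions for illustration, not the patent's implementation.

```python
from typing import Optional

# (label, x_min, y_min, x_max, y_max) boxes, as a YOLO-style detector might emit
Detection = tuple[str, int, int, int, int]

TEACHING_DATA = {  # stand-in for the computer system's ecological database
    "blue magpie": "Taiwan blue magpie: a long-tailed corvid found in Taiwan...",
}

def lookup_touched(detections: list[Detection], tx: int, ty: int) -> Optional[str]:
    """Map a touch coordinate to the detection it hits, then fetch teaching data."""
    for label, x0, y0, x1, y1 in detections:
        if x0 <= tx <= x1 and y0 <= ty <= y1:
            return TEACHING_DATA.get(label)
    return None  # touch landed on background; nothing to display

dets = [("blue magpie", 100, 50, 300, 220)]
print(lookup_touched(dets, 150, 120))  # touch inside the bird's bounding box
```

The recognition result fed back to the computer system is, in effect, the label of the touched region; the database lookup then supplies the teaching material shown on the displays.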

Please refer to Figure 2, a system block diagram of another embodiment of the present invention. As shown in Figure 2, the immersive eco-house interactive learning system 110 includes a house body 10, a projection module 22, a guided walker module 30, a computer system 40, a question-and-answer module 50, and an image recognition module 60. This embodiment differs from the embodiment of Figure 1 only in the projection module 22; the remaining modules are the same as in Figure 1 and are not described again.

The projection module 22 is connected to the guided walker module 30 and projects the ecological-environment image. It may consist of a full-color laser projector.

Please refer to Figure 3, a system block diagram of yet another embodiment of the present invention. As shown in Figure 3, the immersive eco-house interactive learning system 120 includes a house body 10, a display module 20, a guided walker module 30, a computer system 40, a question-and-answer module 50, an image recognition module 60, and an image-recognition autonomous robot 70. This embodiment differs from the embodiment of Figure 1 only in the image-recognition autonomous robot 70; the remaining modules are the same as in Figure 1 and are not described again. The image-recognition autonomous robot 70 is deployed at a remote site, is connected to the computer system 40, and presents the footage it captures on the display module 20.

The image-recognition autonomous robot 70 includes components such as a microcomputer system, a network module, a display module, a battery module, a camera module, and flight propellers and/or omnidirectional wheels. It can therefore move through an ecological zone while capturing and recording ecological-environment footage. In some embodiments, multiple image-recognition autonomous robots 70 are deployed, each in a different ecological zone (for example, a forest ecosystem, an aquatic ecosystem, or a terrestrial ecosystem), to capture and record ecological-environment image data.

In some embodiments, the image-recognition autonomous robot 70 may be a drone that captures and records ecological-environment image data in flight, a submarine that does so underwater, or a self-propelled vehicle that does so on land.

Please refer to Figure 4, a schematic diagram of the external structure of an embodiment of the present invention. As shown in Figure 4, the house body 10 includes a dome structure 12 and a polygonal wall 14. The dome structure 12 covers the top of the polygonal wall 14 and is connected to it. The dome structure 12 is hemispherical and the polygonal wall 14 is hexagonal, but the invention is not limited to these shapes; other geometries may be used according to actual needs.

Please refer to Figure 5, a schematic perspective view of an embodiment of the present invention. As shown in Figure 5, the user stands on the guided walker module 30 inside the house body 10. A plurality of display modules 20 are mounted on the polygonal wall 14. More specifically, the polygonal wall 14 is hexagonal, with one display module 20 on each face, giving the user a full 360-degree panoramic viewing angle. In some embodiments, projection modules 22 may be installed on the dome structure 12 and the polygonal wall 14 to project the ecological-environment image across the entire interior.
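With one display per face of the hexagonal wall, choosing which wall should carry the forward view is a simple quantization of the user's heading. This is a sketch under the assumption that wall 0 faces heading 0 degrees and the walls are numbered clockwise; the patent does not specify such a convention.

```python
def forward_wall(yaw_deg: float, n_walls: int = 6) -> int:
    """Index of the wall facing the user's current heading in an n-wall room."""
    sector = 360.0 / n_walls  # 60 degrees per wall for the hexagonal body
    # Shift by half a sector so each wall owns the arc centred on its face
    return int(((yaw_deg % 360.0) + sector / 2) // sector) % n_walls

print(forward_wall(0.0), forward_wall(90.0), forward_wall(359.0))
```

The same function also covers other polygon counts, matching the remark that the wall may take other geometric shapes as needed.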

For example, the scene in the immersive eco-house interactive learning system 100 is real footage projected from a camera: the image-recognition autonomous robot 70 streams live footage from the ecological environment, and the real scenes it captures are what the user watches.

When the user sees an animal captured by the image-recognition autonomous robot 70 and becomes interested in it, the user can stop walking and stand in place to watch. The user can touch the animal on the display module 20, whereupon the image recognition module 60 determines that the user wants to learn more about that animal.

The image-recognition autonomous robot 70 then performs image recognition on the animal and provides the corresponding ecological teaching data, which is presented on the display module 20 for further study.

For example, if the image-recognition autonomous robot 70 captures a blue magpie on the trail and the user wants to learn more about it, the user can tap the blue magpie on the display module 20. The system identifies the tapped bird and provides related ecological information (for example, the blue magpie's sex, age, and breeding season), giving the user a deeper understanding of the species.

If the user wants to know more, he or she can ask follow-up questions through the question-and-answer module 50. Continuing the example above, if anything about the blue magpie remains unclear, the user can ask in spoken language; the question-and-answer module 50 interprets the question, extracts the corresponding voice answer from the computer system 40, and replies to the user by voice and/or on screen.

In summary, the immersive eco-house interactive learning system of the present invention lets users choose the ecological environment they wish to explore, including forest, aquatic, and terrestrial ecosystems. Each ecosystem offers different ecological courses, roughly divided into mammals, birds, reptiles, amphibians, fish, insects, and so on. Learners can experience and immerse themselves in these courses to understand the current state of the ecological environment and acquire the knowledge and skills the courses introduce. The hope is that the immersive eco-house interactive learning system will strengthen learners' ecological awareness and translate it into practical conservation in daily life.

With the immersive eco-house interactive learning system of the present invention, there is no need to wear a head-mounted display, wired gloves, or a handheld controller, which improves the user's freedom of movement and avoids heat build-up on the head.

Although the present invention has been disclosed above by way of embodiments, they are not intended to limit it. Anyone with ordinary skill in the art may make minor changes and refinements without departing from the spirit and scope of the present invention; the scope of protection of the present invention is therefore defined by the appended claims.

10: House body
20: Display module
30: Guided walker module
40: Computer system
50: Question-and-answer module
60: Image recognition module
100: Immersive eco-house interactive learning system

Claims (8)

1. An immersive eco-house interactive learning system, comprising: a house body, including a dome structure and a polygonal wall connected to the dome structure; at least one display module, located inside the house body and mounted on the polygonal wall, for displaying an ecological-environment image; a guided walker module, located inside the house body, for allowing a user to walk or run in place and outputting a displacement signal; a computer system, for controlling the display modules according to the displacement signal to synchronously present a moving image corresponding to the ecological-environment image; a question-and-answer module, for extracting a corresponding voice answer from the computer system according to the user's spoken question and replying to the user; and an image recognition module, for performing an image recognition procedure on the ecological-environment image touched by the user on the display modules and feeding a recognition result back to the computer system, so that corresponding ecological teaching data is retrieved and presented on the display modules.

2. The immersive eco-house interactive learning system of claim 1, further comprising an image-recognition autonomous robot, deployed at a remote site and connected to the computer system, for presenting captured footage on the display modules.

3. The immersive eco-house interactive learning system of claim 1, wherein the dome structure is hemispherical and the polygonal wall is hexagonal.

4. The immersive eco-house interactive learning system of claim 1, wherein a projection module is further provided on the dome structure and/or the polygonal wall for projecting the ecological-environment image.

5. The immersive eco-house interactive learning system of claim 1, wherein the ecological teaching data includes mammals, birds, reptiles, amphibians, fish, and insects.

6. The immersive eco-house interactive learning system of claim 1, wherein the image recognition module includes a neural network model.

7. The immersive eco-house interactive learning system of claim 6, wherein the neural network model is a YOLO model, an R-CNN model, a CTPN model, or an EAST model.

8. The immersive eco-house interactive learning system of claim 1, wherein the guided walker module is an omnidirectional treadmill.
TW111131469A 2022-08-22 2022-08-22 Immersive ecological house interactive learning system TWI822267B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW111131469A TWI822267B (en) 2022-08-22 2022-08-22 Immersive ecological house interactive learning system


Publications (2)

Publication Number Publication Date
TWI822267B true TWI822267B (en) 2023-11-11
TW202409991A TW202409991A (en) 2024-03-01

Family

ID=89722452


Country Status (1)

Country Link
TW (1) TWI822267B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI568475B (en) * 2015-10-26 2017-02-01 南臺科技大學 Virtual reality movable platform
CN108140383A (en) * 2016-07-19 2018-06-08 门箱股份有限公司 Display device, topic selection method, topic option program, image display method and image show program
CN109876369A (en) * 2019-03-28 2019-06-14 天津运筹天下科技有限公司 A kind of VR human-computer interaction all-round athletic event and universal treadmill
CN210072666U (en) * 2019-07-23 2020-02-14 浪潮软件集团有限公司 Information inquiry device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Web article: 蔡宗勳, "National Palace Museum Southern Branch artifacts may be viewable from home in the future / Audrey Tang live-streams via remote-controlled robot and panoramic VR camera", Liberty Times Net, 2017/02/20 *

Similar Documents

Publication Publication Date Title
Kulich et al. SyRoTek—Distance teaching of mobile robotics
LaValle Virtual reality
Kadous et al. Effective user interface design for rescue robotics
Riecke et al. Visual homing is possible without landmarks: A path integration study in virtual reality
Nourbakhsh et al. Human-robot teaming for search and rescue
JP2021515336A (en) Augmented reality adjustment of human-robot interaction
US8902255B2 (en) Mobile platform for augmented reality
Higuchi et al. Flying head: a head motion synchronization mechanism for unmanned aerial vehicle control
Piumsomboon et al. Superman vs giant: A study on spatial perception for a multi-scale mixed reality flying telepresence interface
KR20070023905A (en) Immersive training system for live-line workers
Lund et al. Robot soccer with LEGO mindstorms
CN108428375A (en) A kind of teaching auxiliary and equipment based on augmented reality
CN105844997A (en) Interactive electronic dance coaching system
Yun et al. Easy interface and control of tele-education robots
CN115933868B (en) Three-dimensional comprehensive teaching field system of turnover platform and working method thereof
TWI822267B (en) Immersive ecological house interactive learning system
Chen et al. Virtual, Augmented and Mixed Reality. Applications and Case Studies: 11th International Conference, VAMR 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Orlando, FL, USA, July 26–31, 2019, Proceedings, Part II
Αδαμίδης User interfaces for human-robot interaction: Application on a semi-autonomous agricultural robot sprayer
Ciupe et al. New trends in service robotics
TW202409991A (en) Immersive ecological house interactive learning system
Maeyama et al. Experiments on a remote appreciation robot in an art museum
Lenz et al. Nimbro wins ana avatar xprize immersive telepresence competition: Human-centric evaluation and lessons learned
KR101505598B1 (en) Apparatus for educating a robot
Miura et al. Device-free personal response system based on fiducial markers
CN206326608U (en) A kind of family assiatant intelligent robot system