TW201913292A - Mobile device and method for blending display content with environment scene - Google Patents

Mobile device and method for blending display content with environment scene

Info

Publication number
TW201913292A
Authority
TW
Taiwan
Prior art keywords
unit
image
mobile device
environment
display
Prior art date
Application number
TW106128076A
Other languages
Chinese (zh)
Inventor
黃國倫
Original Assignee
宏碁股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 宏碁股份有限公司 filed Critical 宏碁股份有限公司
Priority to TW106128076A priority Critical patent/TW201913292A/en
Publication of TW201913292A publication Critical patent/TW201913292A/en


Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

A mobile device capable of blending its display content with an environment scene is provided. The mobile device includes: a sensing unit configured to sense distance information between a user and the mobile device; an image capturing unit configured to capture an environment image, and depth information of that image, from the environment scene; a display unit configured to display the environment image; and a processing unit, coupled to the sensing unit and the image capturing unit, configured to adjust the display size of the environment image on the display unit based on the distance information obtained by the sensing unit and the depth information obtained by the image capturing unit, so that the environment image blends with the environment scene the user sees outside the display unit.

Description

Mobile device capable of blending display content with an environment scene, and method thereof

The present invention relates to an augmented reality mobile device and a method thereof, and more particularly to a mobile device and method capable of blending display content with the real environment scene.

Augmented reality (AR) is a technique that combines computer-generated virtual information with images of the real world, allowing the virtual world to be merged with, and to interact with, real-world scenes through a screen. It has been applied in many areas of daily life, including advertising, gaming, travel, navigation, education, sports, and entertainment. In recent years, as the computing power of portable electronic products has increased, the applications of augmented reality have become ever broader.

In most augmented reality solutions currently on the market, a mobile device or computer overlays newly added virtual objects, through the screen, within the camera's field of view. However, when a user wants the device's display content to match the viewing angle of the human eye and blend into the real environment scene while using augmented reality, typical current products still cannot meet this need. For example, FIG. 1 shows the display content 110 of a typical mobile device 100 in camera preview mode; without manually adjusting the magnification, the display content 110 cannot blend with the tree in the surrounding environment scene 120.

The present invention provides a mobile device, and a method thereof, capable of blending display content with an environment scene.

The present invention provides a mobile device capable of blending display content with an environment scene, including: a sensing unit for sensing distance information between a user and the mobile device; an image capturing unit for capturing an environment image, and depth information of the environment image, from the environment scene; a display unit for displaying the environment image; and a processing unit, coupled to the sensing unit and the image capturing unit, for adjusting the display size of the environment image on the display unit according to the distance information obtained by the sensing unit and the depth information obtained by the image capturing unit, so that the environment image blends with the environment scene the user sees outside the display unit.

The present invention further provides a method for blending display content with an environment scene, including the following steps: sensing, by a sensing unit of a mobile device, distance information between a user and the mobile device; capturing, by an image capturing unit, an environment image and depth information of the environment image from the environment scene; displaying the environment image by a display unit; and adjusting, by a processing unit, the display size of the environment image on the display unit according to the distance information obtained by the sensing unit and the depth information obtained by the image capturing unit, so that the environment image blends with the environment scene the user sees outside the display unit.

100‧‧‧mobile device

110‧‧‧display content

120‧‧‧environment scene

200‧‧‧mobile device

210‧‧‧image capturing unit

220‧‧‧processing unit

221‧‧‧augmented reality unit

222‧‧‧calculation unit

230‧‧‧sensing unit

240‧‧‧display unit

250‧‧‧positioning unit

300‧‧‧mobile device

310‧‧‧human eye

320‧‧‧environment scene

400‧‧‧mobile device

410‧‧‧display content

420‧‧‧environment scene

501, 502, 503, 504‧‧‧steps

601, 602, 603, 604, 605‧‧‧steps

FIG. 1 is a schematic diagram showing the display content of a typical mobile device in camera preview mode.

FIG. 2A is a schematic diagram of a mobile device capable of blending display content with an environment scene according to a first embodiment of the present invention.

FIG. 2B is a schematic diagram of a mobile device capable of blending display content with an environment scene according to a second embodiment of the present invention.

FIG. 3A and FIG. 3B are schematic diagrams of the relative positions of the human eye, the mobile device, and the environment scene.

FIG. 4A, FIG. 4B, FIG. 4C, and FIG. 4D are schematic diagrams of the correspondence between the display content and the distance information.

FIG. 5 is a flowchart of a method for blending display content with an environment scene according to the first embodiment of the present invention.

FIG. 6 is a flowchart of a method for blending display content with an environment scene according to the second embodiment of the present invention.

To make the above and other objects, features, and advantages of the present invention more readily understandable, preferred embodiments are described in detail below with reference to the accompanying drawings.

FIG. 2A is a schematic diagram of a mobile device 200 capable of blending display content with an environment scene according to a first embodiment of the present invention. The mobile device 200 mainly includes an image capturing unit 210, a processing unit 220, a sensing unit 230, and a display unit 240. In this embodiment, the mobile device 200 is, for example, a tablet computer, a smartphone, a mobile phone, a bezel-less or narrow-bezel mobile device, a notebook computer, or another electronic device, but is not limited thereto.

The image capturing unit 210 is disposed in the rear lens area of the mobile device 200, that is, the main lens area used for general photography, and is mainly used to capture an environment image of the environment scene. The environment image may be a continuous moving image or a still image, and has associated depth information. The image capturing unit 210 may be any device or equipment capable of capturing image and depth information, for example, a dual-lens camera, a stereo camera, a laser stereo camera (a camera that measures depth values with a laser), an infrared stereo camera (a camera that measures depth values with infrared light), or a three-dimensional scanner. The three-dimensional scanner may be a laser scanner, a structured-light scanner, or a time-of-flight scanner. Furthermore, the image capturing unit 210 may use a zoom lens with optical zoom or digital zoom to change the magnification, or use the zoom lens to push in or pull out, so that the environment image can be enlarged (zoom in) or reduced (zoom out).

The display unit 240 is coupled to the image capturing unit 210 through the processing unit 220 and is used to display the environment image captured by the image capturing unit 210, or the environment image after adjustment by the processing unit 220. The display unit 240 may be any device capable of displaying images, such as a liquid crystal display, a plasma display, an LED display, an OLED display, or a transparent display. The display unit 240 may further provide a touch function, so that the user can interact with the mobile device 200 through the display unit 240.

The sensing unit 230 is disposed in the front lens area of the mobile device 200, that is, the secondary lens area a user typically uses for self-portraits, and is mainly used to measure or sense distance information between the user and the mobile device 200. The sensing unit 230 may be any device or equipment capable of sensing the distance between the user and the mobile device 200, such as a proximity sensor, an infrared distance sensor, a dual-lens camera, or a stereo camera. When the user changes the distance to the mobile device 200 with both hands or one hand, the sensing range may be 10 cm to 70 cm, roughly the variable distance between the face and a mobile device held in one or both hands; however, the embodiments of the present invention do not limit this sensing range.

The processing unit 220 is coupled to the image capturing unit 210, the sensing unit 230, and the display unit 240, respectively. Based on the environment image captured by the image capturing unit 210, the depth information of that environment image, and the distance information between the user and the mobile device 200 sensed by the sensing unit 230, the processing unit 220 adjusts the display size of the environment image on the display unit 240, so that the environment image blends with the environment scene the user sees outside the display unit. The processing unit 220 may be any processing unit with computing capability, such as a central processing unit (CPU), a microprocessor, or a microcontroller.
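
To make the data flow concrete, the following is a minimal Python sketch of one pass of this adjustment loop. It is an illustration only: the callable interfaces, function names, and the linear distance-to-zoom mapping are assumptions made for this example (the mapping is chosen to be consistent with the FIG. 4A-4D discussion below) and are not defined by the patent.

```python
# Illustrative sketch only: the callables stand in for the sensing unit (230),
# the image capturing unit (210), and the display unit (240); the linear zoom
# mapping is an assumption consistent with the FIG. 4A-4D examples.

def blend_display_frame(read_eye_distance_cm, capture_environment, show_frame):
    """One pass of the blending loop performed by the processing unit (220)."""
    eye_distance_cm = read_eye_distance_cm()         # user <-> device distance
    frame, scene_depth_m = capture_environment()     # environment image + depth info
    # scene depth is assumed roughly constant here (cf. FIGS. 4A-4D), so only the
    # eye-to-device distance drives the magnification in this sketch
    d = max(10.0, min(70.0, eye_distance_cm))        # clamp to the 10-70 cm sensing range
    zoom = 1.0 + (d - 10.0) / 20.0                   # assumed mapping: 10 cm -> 1x, 70 cm -> 4x
    show_frame(frame, zoom)                          # resized preview aligns with the real scene

if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs end to end.
    blend_display_frame(
        read_eye_distance_cm=lambda: 30.0,                        # device held 30 cm from the face
        capture_environment=lambda: ("environment image", 5.0),   # scene (e.g. a tree) about 5 m away
        show_frame=lambda frame, zoom: print(f"show {frame!r} at {zoom:.1f}x"),
    )
```

Run as-is, the sketch prints a 2.0x magnification for a 30 cm viewing distance, matching the example given later for FIG. 4B.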

FIG. 2B is a schematic diagram of a mobile device 200 capable of blending display content with an environment scene according to a second embodiment of the present invention. In this embodiment, elements with the same names as in the first embodiment have the functions described above and are not repeated here. The main difference between FIG. 2B and FIG. 2A is that the processing unit 220 may further include an augmented reality unit 221 and a calculation unit 222, and that the processing unit 220 is coupled to a positioning unit 250. The augmented reality unit 221 provides an operation interface through which the user can enable the augmented reality (AR) function: a previously created virtual image, or a specific object image extracted from another image, is placed into the environment image captured by the image capturing unit 210 and the integrated result is presented as an augmented reality image through the display unit 240; working with the calculation unit 222, the augmented-reality virtual image is scaled proportionally or non-proportionally, and the newly added object is superimposed into the environment image. Further, the positioning unit 250 provides a positioning function for the mobile device. The positioning unit 250 may be a three-dimensional spatial positioning unit or an inertial measurement unit (IMU), such as a gyroscope, used to let the user control the rotation angle or the zoom of the augmented reality image.

In the various embodiments of the present invention, the mobile device 200 includes the image capturing unit 210, the processing unit 220, the sensing unit 230, and the display unit 240 described above; with other hardware, an operating system, application software, and a communication network, the user can use camera, video, Internet, and positioning functions.

Next, referring to FIGS. 3A and 3B, which show the relative positions of the human eye 310, the mobile device 300, and the environment scene 320: when the user enters the camera preview mode, an operation interface (not shown) provides a function for blending the display content with the environment scene. After the user enables this function, the image capturing unit of the mobile device 300 captures the current environment image and the depth information of the environment image, that is, the depth value between the environment scene 320 (for example, a tree) and the mobile device 300; the closer the mobile device 300 is to the environment scene 320, the smaller the depth value, and vice versa. In addition, the sensing unit senses distance information between the user and the mobile device 300, here, for example, the distance between the user's face and the mobile device 300. The sensing unit 230 may sense a point midway between the user's eyes (human eye 310) as a reference point for sensing the distance information, or sense the plane of the user's face as a reference plane for sensing the distance information. The sensing range may be 10 cm to 70 cm, roughly the variable distance between the face and the mobile device 300 held in one or both hands. Further, the processing unit receives the depth information of the environment image and the distance information of the user, and zooms the environment image in or out proportionally according to the value of the distance information, so as to align the display content with the environment scene 320. When the value of the distance information becomes smaller, the processing unit shrinks the environment image proportionally; conversely, when the value of the distance information becomes larger, the processing unit enlarges the environment image proportionally, so as to keep the environment image blended with the environment scene. The processing unit magnifies the environment image within a range of 1x to 4x, but the embodiments of the present invention do not limit this range.

For example, referring to FIGS. 4A to 4D, which show the correspondence between the display content 410 and the distance information: before the user enables the function of blending the display content with the environment scene, the display content 410 is similar to FIG. 1 and cannot blend with the tree in the environment scene in the background; after the user enables the function, the display content 410 gradually aligns with the environment scene 420, as shown in FIG. 4A. FIGS. 4A to 4D show, after the function is enabled, the display content 410 of the mobile device 400 as seen from the user's viewpoint at different distances. In FIG. 4A, when the eye is a given distance (for example, 10 cm) from the mobile device 400, combined with the depth information of the environment image (for example, a depth value of 5 meters between the mobile device 400 and the environment scene 420), and with the distance between the user and the environment scene 420 (for example, a tree) remaining roughly unchanged, the mobile device 400 aligns the display content 410 with the environment scene 420 and the environment image magnification is set to 1x. Then, as the mobile device 400 gradually moves away from the eye, as shown in FIG. 4B, when the distance between the eye and the mobile device 400 is 30 cm, the environment image is gradually enlarged to 2x; when the mobile device moves still farther from the eye, as shown in FIG. 4C, at a distance of 50 cm the environment image is enlarged to 3x and the mobile device 400 still aligns the display content 410 with the environment scene 420; finally, as shown in FIG. 4D, when the distance between the eye and the mobile device 400 reaches 70 cm, the environment image magnification is 4x, keeping the environment image blended with the environment scene 420. Conversely, when the mobile device 400 gradually approaches the eye, the environment image is shrunk proportionally. As described above, when the distance changes, the mobile device 400 senses the distance between the eye and the device in real time and simultaneously enlarges or shrinks the display size of the environment image, so that from the user's viewpoint the display content 410 of the mobile device 400 blends with the environment scene 420 seen outside the mobile device.
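
The four sample points above (10 cm → 1x, 30 cm → 2x, 50 cm → 3x, 70 cm → 4x) are consistent with a simple linear interpolation over the stated sensing range. The short Python check below uses such a mapping as a working assumption; the patent gives only the sample points, not the interpolation itself.

```python
# Assumed linear mapping: 10 cm -> 1x and 70 cm -> 4x, clamped to the sensing range.
def magnification(eye_distance_cm: float) -> float:
    d = max(10.0, min(70.0, eye_distance_cm))
    return 1.0 + (d - 10.0) / 20.0

# Reproduces the FIG. 4A-4D example values.
for distance_cm, expected in [(10, 1.0), (30, 2.0), (50, 3.0), (70, 4.0)]:
    assert magnification(distance_cm) == expected
    print(f"{distance_cm} cm -> {magnification(distance_cm):.0f}x")
```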

FIG. 5 is a flowchart of a method for blending display content with an environment scene according to the first embodiment of the present invention. Referring also to FIG. 2A of the first embodiment: in step 501, a sensing unit 230 of the mobile device 200 of FIG. 2A senses distance information between a user and the mobile device 200. In step 502, an image capturing unit 210 of FIG. 2A captures an environment image, and depth information of the environment image, from the environment scene. In step 503, a display unit 240 of FIG. 2A displays the environment image. In step 504, a processing unit 220 of FIG. 2A adjusts the display size of the environment image on the display unit according to the distance information obtained by the sensing unit and the depth information obtained by the image capturing unit, so that the environment image blends with the environment scene the user sees outside the display unit. In step 504, the processing unit 220 zooms the environment image in or out proportionally according to the value of the distance information and the depth information: when the value of the distance information becomes smaller, the environment image is shrunk proportionally, and conversely, when the value of the distance information becomes larger, the environment image is enlarged proportionally, so as to keep the environment image blended with the environment scene. In addition, in step 501, the range of the distance information is 10 to 70 cm, and in step 504, the processing unit 220 magnifies the environment image within a range of 1x to 4x.

FIG. 6 is a flowchart of a method for blending display content with an environment scene according to the second embodiment of the present invention. Steps that are the same as in the flowchart of the first embodiment in FIG. 5 are as described above and are not repeated here. Referring also to FIG. 2B of the second embodiment, the main difference between FIG. 6 and FIG. 5 is that the method further includes step 605: in step 605, the processing unit 220, which further includes an augmented reality unit 221 and a calculation unit 222, provides an augmented reality function and displays an augmented reality image through the display unit 240. In step 605, the processing unit 220 scales the augmented reality image proportionally according to the adjusted display size.
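
A minimal sketch of the proportional scaling in step 605 follows, assuming the augmented reality object is simply rescaled by the same factor applied to the environment image; the helper function and its pixel-size interface are illustrative assumptions, not part of the patent.

```python
# Hypothetical helper: scale an AR overlay with the adjusted display size (step 605).
def scale_ar_overlay(overlay_px: tuple[int, int], display_zoom: float) -> tuple[int, int]:
    width, height = overlay_px
    return round(width * display_zoom), round(height * display_zoom)

# Example: an AR object drawn at 120x80 px at 1x follows the display to 2x.
print(scale_ar_overlay((120, 80), 2.0))   # -> (240, 160)
```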

Therefore, with the mobile device and method of the present invention for blending display content with the environment scene, the mobile device automatically enlarges or shrinks the display size of the environment image without requiring the user to manually adjust the magnification, so that from the user's viewpoint the display content of the mobile device blends with the environment scene seen outside the mobile device, giving the mobile device a better user experience in augmented reality.

While the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Anyone skilled in the art may make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the scope of protection of the present invention shall be as defined by the appended claims.

Claims (20)

1. A mobile device capable of blending display content with an environment scene, comprising: a sensing unit for sensing distance information between a user and the mobile device; an image capturing unit for capturing an environment image, and depth information of the environment image, from the environment scene; a display unit for displaying the environment image; and a processing unit, coupled to the sensing unit and the image capturing unit, for adjusting the display size of the environment image on the display unit according to the distance information obtained by the sensing unit and the depth information obtained by the image capturing unit, so that the environment image blends with the environment scene the user sees outside the display unit.
2. The mobile device as claimed in claim 1, wherein the processing unit zooms the environment image in or out proportionally according to the value of the distance information and the depth information.
3. The mobile device as claimed in claim 2, wherein, when the value of the distance information becomes smaller, the processing unit shrinks the environment image proportionally, and conversely, when the value of the distance information becomes larger, the processing unit enlarges the environment image proportionally, so as to keep the environment image blended with the environment scene.
4. The mobile device as claimed in claim 3, wherein the range of the distance information is 10 to 70 cm, and the processing unit magnifies the environment image within a range of 1x to 4x.
5. The mobile device as claimed in claim 1, wherein the mobile device is a bezel-less or narrow-bezel mobile device.
6. The mobile device as claimed in claim 1, wherein the sensing unit is a proximity sensor, an infrared distance sensor, a dual-lens camera, or a stereo camera.
7. The mobile device as claimed in claim 1, wherein the image capturing unit is a dual-lens camera, a stereo camera, or a three-dimensional scanner.
8. The mobile device as claimed in claim 7, wherein the three-dimensional scanner is a laser scanner, a structured-light scanner, or a time-of-flight scanner.
9. The mobile device as claimed in claim 1, wherein the processing unit further includes an augmented reality unit and a calculation unit for providing an augmented reality (AR) function and displaying an augmented reality image through the display unit.
10. The mobile device as claimed in claim 9, wherein the processing unit scales the augmented reality image proportionally according to the adjusted display size.
11. The mobile device as claimed in claim 9 or claim 10, further comprising a positioning unit, coupled to the processing unit, for positioning the augmented reality image.
12. The mobile device as claimed in claim 11, wherein the positioning unit is a three-dimensional spatial positioning unit or an inertial measurement unit.
13. The mobile device as claimed in claim 1, wherein the sensing unit senses a point midway between the user's eyes as a reference point for sensing the distance information.
14. The mobile device as claimed in claim 1, wherein the sensing unit senses the plane of the user's face as a reference plane for sensing the distance information.
15. A method for blending display content with an environment scene, for a mobile device, comprising the following steps: sensing, by a sensing unit of the mobile device, distance information between a user and the mobile device; capturing, by an image capturing unit, an environment image and depth information of the environment image from the environment scene; displaying the environment image by a display unit; and adjusting, by a processing unit, the display size of the environment image on the display unit according to the distance information obtained by the sensing unit and the depth information obtained by the image capturing unit, so that the environment image blends with the environment scene the user sees outside the display unit.
16. The method as claimed in claim 15, wherein the step of adjusting, by the processing unit, the display size of the environment image on the display unit zooms the environment image in or out proportionally according to the value of the distance information and the depth information.
17. The method as claimed in claim 16, wherein, when the value of the distance information becomes smaller, the environment image is shrunk proportionally, and conversely, when the value of the distance information becomes larger, the environment image is enlarged proportionally, so as to keep the environment image blended with the environment scene.
18. The method as claimed in claim 17, wherein the range of the distance information is 10 to 70 cm, and the processing unit magnifies the environment image within a range of 1x to 4x.
19. The method as claimed in claim 15, further comprising the following step: providing, by the processing unit, which further includes an augmented reality unit and a calculation unit, an augmented reality function, and displaying an augmented reality image through the display unit.
20. The method as claimed in claim 19, wherein the processing unit scales the augmented reality image proportionally according to the adjusted display size.
TW106128076A 2017-08-18 2017-08-18 Mobile device and method for blending display content with environment scene TW201913292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW106128076A TW201913292A (en) 2017-08-18 2017-08-18 Mobile device and method for blending display content with environment scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW106128076A TW201913292A (en) 2017-08-18 2017-08-18 Mobile device and method for blending display content with environment scene

Publications (1)

Publication Number Publication Date
TW201913292A true TW201913292A (en) 2019-04-01

Family

ID=66991882

Family Applications (1)

Application Number Title Priority Date Filing Date
TW106128076A TW201913292A (en) 2017-08-18 2017-08-18 Mobile device and method for blending display content with environment scene

Country Status (1)

Country Link
TW (1) TW201913292A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI807751B (en) * 2022-04-01 2023-07-01 香港商冠捷投資有限公司 Display that automatically adjusts the size of the screen display area and method for automatically adjusting the size of the screen display area

Similar Documents

Publication Publication Date Title
WO2022000992A1 (en) Photographing method and apparatus, electronic device, and storage medium
US10210664B1 (en) Capture and apply light information for augmented reality
JP6177872B2 (en) I / O device, I / O program, and I / O method
US9123272B1 (en) Realistic image lighting and shading
JP6333801B2 (en) Display control device, display control program, and display control method
US20160004320A1 (en) Tracking display system, tracking display program, tracking display method, wearable device using these, tracking display program for wearable device, and manipulation method for wearable device
US9727137B2 (en) User view point related image processing apparatus and method thereof
JP6250024B2 (en) Calibration apparatus, calibration program, and calibration method
WO2018054267A1 (en) Image display method and device utilized in virtual reality-based apparatus
WO2022022141A1 (en) Image display method and apparatus, and computer device and storage medium
US11720996B2 (en) Camera-based transparent display
WO2014128751A1 (en) Head mount display apparatus, head mount display program, and head mount display method
Hoberman et al. Immersive training games for smartphone-based head mounted displays
CN105866955A (en) Smart glasses
KR20130119094A (en) Transparent display virtual touch apparatus without pointer
CN115022614A (en) Method, system, and medium for illuminating inserted content
US11699412B2 (en) Application programming interface for setting the prominence of user interface elements
WO2023101881A1 (en) Devices, methods, and graphical user interfaces for capturing and displaying media
TWI603225B (en) Viewing angle adjusting method and apparatus of liquid crystal display
TW201913292A (en) Mobile device and method for blending display content with environment scene
JP2024512040A (en) Devices, methods, and graphical user interfaces for maps
CN111857461B (en) Image display method and device, electronic equipment and readable storage medium
US20240070931A1 (en) Distributed Content Rendering
Faaborg et al. METHODS AND APPARATUS TO SCALE ANNOTATIONS FOR DESIRABLE VIEWING IN AUGMENTED REALITY ENVIRONMENTS
Kwon et al. Implementation of immersive multi-view augmented reality system