WO2016059530A1 - Guiding system and method - Google Patents

Guiding system and method

Info

Publication number
WO2016059530A1
WO2016059530A1 (PCT/IB2015/057759)
Authority
WO
WIPO (PCT)
Prior art keywords
specific image
output end
input end
image
exhibit
Prior art date
Application number
PCT/IB2015/057759
Other languages
French (fr)
Chinese (zh)
Inventor
林维源
Original Assignee
在地实验文化事业有限公司
Priority date
Filing date
Publication date
Application filed by 在地实验文化事业有限公司 filed Critical 在地实验文化事业有限公司
Publication of WO2016059530A1 publication Critical patent/WO2016059530A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/903: Querying
    • G06F16/90335: Query processing

Definitions

  • The present invention relates to a guiding (navigation) system and method, and more particularly to a guiding system and method capable of automatic guidance.
  • Existing navigation methods used in exhibition halls can be roughly divided into docent-led guides, paper guides, mobile-device guides, and inductive (proximity-sensing) guides; mobile-device guides are further divided into touch or keypad audio guides and Quick Response (QR) code guides. In a docent-led guide, a guide introduces the exhibits in person, while a paper guide prints the navigation information on paper.
  • Mobile-device navigation equips each user with a mobile device. With a touch or keypad audio guide, the user enters the exhibit's number on the device while viewing the exhibit and the navigation information is then played; an inductive guide plays the navigation information when it detects that the user is close to the exhibit; and a QR-code guide relies on an application built into the mobile device that scans the QR code attached to each exhibit, after which the device displays the navigation information matching that code.
  • However, a touch or keypad audio guide requires the user to key in the serial number matching each exhibit, which is inconvenient. A QR-code guide requires a dedicated navigation application to be written and a QR code to be placed on every exhibit, which not only spoils the appearance of the exhibits but also adds the cost of developing the application. An inductive audio guide plays the navigation information only when the user is within a certain distance of the exhibit, which easily disrupts visitor flow through the exhibition hall. These are the shortcomings of the existing navigation methods.
  • One object of the present invention is a navigation system including: an image generating device that generates a specific image unrecognizable by the human eye and projects it onto an external exhibit; and a wearable device through which the user views the external exhibit and thereby receives the specific image. The wearable device determines whether the specific image lies near the external exhibit corresponding to the maximum viewing angle at which the user is looking, compares the specific image with built-in preset images, and after the comparison outputs the navigation information corresponding to the specific image, which the user views on the wearable device.
  • In the navigation system, the image generating device may comprise: a housing; a light source enclosed in the housing for emitting infrared light; a focusing sheet disposed in front of the light source and spaced a distance from it, for focusing the infrared light emitted by the light source; a mask sheet arranged parallel to and in front of the focusing sheet, spaced a distance from it, and bearing a pattern that produces the specific image; a first focusing sheet arranged parallel to and in front of the mask sheet and spaced a distance from it, for adjusting the focal length of the specific image; and a second focusing sheet arranged parallel to and in front of the first focusing sheet and spaced a distance from it, also for adjusting the focal length of the specific image.
  • Alternatively, the image generating device may include: a light-source light bar for emitting infrared light; a light guide plate, with the light bar arranged around it, for spreading the infrared light uniformly across the whole plate; a mask plate disposed in front of the light guide plate and hollowed out into the intended image pattern; light-transmitting paper disposed in front of the mask plate, which lets the infrared light pass while hiding the pattern of the mask plate behind it; and a transparent acrylic plate and aluminum strips used to fasten the plates and the paper together.
  • In the navigation system, the wearable device comprises: a receiving unit having an input end and an output end, the input end receiving the specific image projected on the external exhibit and the visible light of the external environment as the user views the exhibit; a filter unit having an input end connected to the output end of the receiving unit, which receives the specific image and the visible light, filters out the visible light, and outputs the specific image at its output end; a determining unit having an input end connected to the output end of the filter unit, which receives the specific image and determines whether it lies near the external exhibit corresponding to the maximum viewing angle at which the user is looking; a comparing unit having an input end connected to the output end of the determining unit, with preset images built in, which receives the visible-light-filtered specific image, compares it with the preset images, and outputs at its output end the navigation information corresponding to the specific image; and a display unit, one end of which is connected to the output end of the comparing unit, which receives and displays the navigation information.
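  • The receive, filter, determine, compare, and display chain described above can be pictured as a short data flow. The following Python sketch is purely illustrative and is not part of the patented design: the function names, the use of NumPy arrays for camera frames, and the simple mean-squared-difference matcher are assumptions made only to keep the example concrete.

```python
# Illustrative sketch of the wearable-device pipeline described above.
# Function names and the mean-squared-difference matcher are assumptions,
# not the patented implementation.
import numpy as np

def filter_visible(frame: np.ndarray, ir_channel: int = 0) -> np.ndarray:
    """Filter unit: keep only the infrared component, discard visible light."""
    return frame[..., ir_channel]

def is_within_viewing_angle(ir_image: np.ndarray, threshold: float = 0.2) -> bool:
    """Determining unit: decide whether a projected pattern occupies the
    central region of the frame, i.e. the user is looking at the exhibit."""
    h, w = ir_image.shape
    center = ir_image[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
    return float((center > 0.5).mean()) > threshold

def compare(ir_image: np.ndarray, preset_images: dict):
    """Comparing unit: return the key of the closest built-in preset image."""
    best_key, best_err = None, float("inf")
    for key, preset in preset_images.items():
        err = float(np.mean((ir_image - preset) ** 2))
        if err < best_err:
            best_key, best_err = key, err
    return best_key

def guide(frame: np.ndarray, preset_images: dict, guide_info: dict):
    """Receive -> filter -> determine -> compare -> display (here: return text)."""
    ir_image = filter_visible(frame)
    if not is_within_viewing_angle(ir_image):
        return None              # the user is not looking at an exhibit
    key = compare(ir_image, preset_images)
    return guide_info.get(key)   # navigation information for the display unit
```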
  • Another object of the present invention is a navigation system including: an image generating device that generates a specific image unrecognizable by the human eye and projects it onto an external exhibit; a wearable device through which the user views the external exhibit and thereby receives the specific image, determines whether the specific image lies near the external exhibit corresponding to the maximum viewing angle at which the user is looking, and outputs the result after the determination; and a cloud database wirelessly connected to the wearable device. The cloud database holds a plurality of preset images; after receiving the specific image transmitted by the wearable device, it compares the specific image with the preset images and returns to the wearable device the navigation information corresponding to the preset image that matches the specific image.
  • In this navigation system the wearable device includes: a receiving unit having a first input end, a second input end, a first output end, and a second output end, the first input end receiving the specific image projected on the external exhibit and the visible light of the external environment as the user views the exhibit, the second input end receiving the navigation information output by the cloud database, the first output end outputting the specific image and visible light, and the second output end outputting the navigation information; a filter unit whose input end is connected to the first output end of the receiving unit and which, after receiving the specific image and the visible light, filters out the visible light and outputs the specific image at its output end; a determining unit whose input end is connected to the output end of the filter unit and which receives the specific image and determines whether it lies near the external exhibit corresponding to the maximum viewing angle at which the user is looking; a sending unit having a first input end connected to the output end of the determining unit, which receives the visible-light-filtered specific image and sends it from its first output end to the cloud database for comparison with the preset images, and whose second input end is connected to the second output end of the receiving unit to receive the returned navigation information, output again at its second output end; and a display unit, one end of which is connected to the second output end of the sending unit, receiving and displaying the navigation information. The cloud database transmits the navigation information to the receiving unit, and the sending unit transmits the specific image to the cloud database, by wireless transmission.
  • A further object is a navigation method comprising the steps of: generating a specific image unrecognizable by the human eye, the specific image corresponding to a particular external exhibit; receiving, when the user views the external exhibit, the specific image and the visible light of the external environment with the wearable device worn by the user; filtering out the visible light; determining whether the specific image lies near the external exhibit corresponding to the maximum viewing angle at which the user is looking; comparing the specific image with a plurality of built-in preset images; and generating the navigation information corresponding to the preset image that matches the specific image.
  • FIG. 1 is a schematic diagram of a navigation system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a first embodiment of an image generating apparatus of the navigation system of the present invention.
  • FIG. 3 is a schematic diagram of a second embodiment of an image generating apparatus of the navigation system of the present invention.
  • FIG. 4 is a schematic diagram of various specific images according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a navigation system according to another embodiment of the present invention.
  • FIG. 6 is a flow chart of a navigation system according to an embodiment of the present invention.
  • Referring first to FIG. 1, a navigation system 1 according to an embodiment of the present invention is composed of an image generating device 11, 11' and a wearable device 12. The image generating device 11, 11' generates a specific image that cannot be recognized by the human eye and projects it onto the external exhibit 10. When the user views the external exhibit 10 through the wearable device 12, the device receives the specific image together with the visible light of the external environment, filters out the visible-light band so that only the specific image remains, and determines whether the specific image lies near the external exhibit 10 corresponding to the maximum viewing angle at which the user is looking, that is, whether the user is actually viewing the external exhibit 10. After this determination, the visible-light-filtered specific image is compared with a plurality of built-in preset images and the navigation information corresponding to the specific image is output, so that the user can view or receive the navigation information related to the external exhibit 10 on the wearable device 12.
  • Referring again to FIG. 1, the wearable device 12 includes a receiving unit 121, a filter unit 122, a determining unit 123, a comparing unit 124, and a display unit 125. The receiving unit 121, the filter unit 122, the determining unit 123, and the comparing unit 124 each have an input end and an output end. The input end of the filter unit 122 is connected to the output end of the receiving unit 121, the input end of the determining unit 123 is connected to the output end of the filter unit 122, the input end of the comparing unit 124 is connected to the output end of the determining unit 123, and one end of the display unit 125 is connected to the output end of the comparing unit 124.
  • The input end of the receiving unit 121 receives the specific image projected on the external exhibit 10 and the visible light of the external environment, and passes them from its output end to the input end of the filter unit 122. After receiving the specific image and the visible light, the filter unit 122 filters out the visible-light band, retains only the specific image, and outputs it to the input end of the determining unit 123. The determining unit 123 receives the specific image, determines whether it lies near the external exhibit 10 corresponding to the maximum viewing angle at which the user is looking, that is, whether the user is viewing the external exhibit 10, and then passes the image from its output end to the input end of the comparing unit 124. The comparing unit 124 has a plurality of built-in preset images and the exhibit navigation information corresponding to each preset image; after receiving the visible-light-filtered specific image, it compares the specific image with the preset images and outputs the exhibit navigation information corresponding to the matching preset image to the display unit 125, which receives and displays the navigation information.
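  • The check performed by the determining unit 123, namely whether the specific image lies within the maximum viewing angle at which the user is looking, can be pictured as a simple field-of-view test. The sketch below is only an illustration and is not taken from the patent: the pinhole-camera model, the field-of-view value, and the angular threshold are assumptions.

```python
# Illustrative viewing-angle test: is the detected pattern close enough to the
# optical axis of the wearable camera?  The camera model and thresholds are
# assumptions, not values from the patent.
import math

def within_max_viewing_angle(pattern_x: float, pattern_y: float,
                             frame_w: int, frame_h: int,
                             fov_deg: float = 60.0,
                             max_angle_deg: float = 20.0) -> bool:
    """Return True if the pattern centre (pixel coordinates) lies within
    max_angle_deg of the optical axis, assuming a pinhole camera whose
    horizontal field of view is fov_deg."""
    # focal length in pixels for the assumed horizontal field of view
    f = (frame_w / 2) / math.tan(math.radians(fov_deg / 2))
    dx = pattern_x - frame_w / 2
    dy = pattern_y - frame_h / 2
    angle = math.degrees(math.atan2(math.hypot(dx, dy), f))
    return angle <= max_angle_deg

# Example: a pattern near the image centre counts as "being viewed".
print(within_max_viewing_angle(660, 360, 1280, 720))   # True
print(within_max_viewing_angle(60, 40, 1280, 720))     # False
```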
  • Referring now to FIG. 2 and FIG. 4, FIG. 2 is a schematic diagram of the first embodiment of the image generating apparatus of the navigation system, and FIG. 4 is a schematic diagram of various specific images according to an embodiment of the present invention.
  • As shown in FIG. 2, the image generating device is composed of a casing 111, a first bracket 111a, a second bracket 111b, a light source 112, a focusing sheet 113, a mask sheet 114, a first focusing sheet 115, and a second focusing sheet 116. The first bracket 111a is disposed at one end of the casing 111, and the second bracket 111b is disposed at the other end of the casing 111.
  • The light source 112 is enclosed within the casing 111. One end face of the focusing sheet 113 is fixed to the first bracket 111a and the other end face to the second bracket 111b; the focusing sheet 113 is disposed in front of the light source 112 and spaced a distance from it. One end face of the mask sheet 114 is fixed to the first bracket 111a and the other end face to the second bracket 111b; the mask sheet 114 is arranged parallel to and in front of the focusing sheet 113, spaced a distance from it, and bears a pattern that produces the specific image, the patterns being shown in FIG. 4.
  • The first bracket 111a passes through both the first focusing sheet 115 and the second focusing sheet 116, as does the second bracket 111b, and the first bracket 111a and the second bracket 111b are each separated by a spacing within the first focusing sheet 115 and the second focusing sheet 116. The first focusing sheet 115 is arranged parallel to and in front of the mask sheet 114 and spaced a distance from it, and the second focusing sheet 116 is arranged parallel to and in front of the first focusing sheet 115 and spaced a distance from it. The first focusing sheet 115 and the second focusing sheet 116 are movable focusing sheets used to adjust the focal length.
  • The light source 112 emits infrared light (invisible light) that cannot be recognized by the human eye. After passing through the focusing sheet 113 the infrared light is focused; when the focused infrared light passes through the mask sheet 114, the pattern on the mask sheet 114 produces a specific image in the form of infrared light, whose focal length is then adjusted by the first focusing sheet 115 and the second focusing sheet 116, and finally the specific image is projected onto the external exhibit 10.
  • Referring now to FIG. 3 and FIG. 4, FIG. 3 is a schematic diagram of the second embodiment of the image generating apparatus of the navigation system, and FIG. 4 again shows the specific image patterns.
  • As shown in FIG. 3, the image generating device 11' is an infrared light box composed of a light-source light bar 111', a light guide plate 112', a mask plate 113', light-transmitting paper 114', a transparent acrylic plate 115', and an aluminum strip 116'. The light bar 111' is arranged around the light guide plate 112', the mask plate 113' is disposed in front of the light guide plate 112', and the mask plate 113' bears a pattern that produces the specific image; the patterns are shown in FIG. 4.
  • In this embodiment there are four specific image patterns, A, B, C, and D, each corresponding to a different exhibit. The black portions of patterns A, B, C, and D are opaque, while the white portions are light-transmitting. The light-transmitting paper 114' is disposed in front of the mask plate so that the infrared light passes through while the pattern of the mask plate 113' behind it is hidden, and the transparent acrylic plate 115' and the aluminum strip 116' in front of the paper fasten all the plates and the paper together to form the infrared light box.
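  • Since each of the four patterns A, B, C, and D corresponds to a different exhibit, the preset images and their navigation information behave like a small lookup table. The snippet below is only an illustration; the exhibit names and texts are invented placeholders, not content from the patent.

```python
# Hypothetical lookup table: one preset pattern per exhibit.
# Exhibit names and navigation texts are placeholders for illustration.
GUIDE_INFO = {
    "pattern_A": "Exhibit 1: background and description ...",
    "pattern_B": "Exhibit 2: background and description ...",
    "pattern_C": "Exhibit 3: background and description ...",
    "pattern_D": "Exhibit 4: background and description ...",
}

def navigation_text(matched_pattern: str) -> str:
    """Return the navigation information for the matched preset pattern."""
    return GUIDE_INFO.get(matched_pattern, "No navigation information available.")
```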
  • The light bar 111' emits infrared light (invisible light) that cannot be recognized by the human eye. The light guide plate 112' spreads this light, and when the guided infrared light passes through the mask plate 113', the pattern on the mask plate produces a specific image in the form of infrared light, which then passes through the light-transmitting paper 114' and the transparent acrylic plate 115' and is projected onto the external exhibit 10.
  • FIG. 5 is a schematic diagram of a navigation system according to another embodiment of the present invention.
  • As shown in FIG. 5, the navigation system 2 is composed of an image generating device 11, 11', a wearable device 21, and a cloud database 22, the wearable device 21 and the cloud database 22 being connected wirelessly. The image generating device 11, 11' generates a specific image that cannot be recognized by the human eye and projects it onto the external exhibit 10. When the user views the external exhibit 10 through the wearable device 21, the device receives the specific image and the visible light of the external environment, filters out the visible-light band so that only the specific image remains, and determines whether the specific image lies near the external exhibit 10 corresponding to the maximum viewing angle at which the user is looking, that is, whether the user is viewing the external exhibit 10. After this determination the specific image is output to the cloud database 22, which holds a plurality of preset images and the exhibit navigation information corresponding to each preset image. After receiving the visible-light-filtered specific image transmitted by the wearable device 21, the cloud database 22 compares the specific image with the preset images and returns the exhibit navigation information corresponding to the matching preset image to the wearable device 21, where the user can receive or view the navigation information related to the external exhibit 10.
  • Referring again to FIG. 5, the wearable device 21 is composed of a receiving unit 211, a filter unit 212, a determining unit 213, a sending unit 214, and a display unit 215. The receiving unit 211 has a first input end, a second input end, a first output end, and a second output end; the filter unit 212 and the determining unit 213 each have an input end and an output end; and the sending unit 214 has a first input end, a second input end, a first output end, and a second output end. The first input end of the receiving unit 211 receives the specific image projected on the external exhibit 10 and the visible light of the external environment as the user views the exhibit, while the second input end of the receiving unit 211 is wirelessly connected to the cloud database 22. The first output end of the receiving unit 211 is connected to the input end of the filter unit 212, and the second output end of the receiving unit 211 is connected to the second input end of the sending unit 214. The output end of the filter unit 212 is connected to the input end of the determining unit 213, the output end of the determining unit 213 is connected to the first input end of the sending unit 214, the first output end of the sending unit 214 is wirelessly connected to the cloud database 22, and the second output end of the sending unit 214 is connected to the display unit 215.
  • When the user views the external exhibit 10 while wearing the wearable device 21, the first input end of the receiving unit 211 receives the specific image projected on the external exhibit 10 and the visible light of the external environment. The first output end of the receiving unit 211 then outputs the specific image and the visible light to the input end of the filter unit 212, which filters out the visible-light band, retains only the specific image, and outputs it to the input end of the determining unit 213. After receiving the specific image, the determining unit 213 determines whether it lies near the external exhibit 10 corresponding to the maximum viewing angle at which the user is looking, that is, whether the user is viewing the external exhibit 10; if the received specific image indeed corresponds to an external exhibit 10 within that maximum viewing angle, the output end of the determining unit 213 outputs the visible-light-filtered specific image to the first input end of the sending unit 214. The sending unit 214 then sends the specific image from its first output end to the cloud database 22 for comparison with the preset images. After the comparison, the cloud database 22 outputs the exhibit navigation information matching the specific image to the second input end of the receiving unit 211, whose second output end passes it to the second input end of the sending unit 214, and the second output end of the sending unit 214 outputs the navigation information to one end of the display unit 215, which receives and displays the navigation information for the external exhibit 10 that the user is viewing. The internal structure and function of the image generating device 11, 11' of FIG. 5 are as shown in FIG. 2 and FIG. 3 and are not described again here.
  • Referring now to FIG. 1 and FIG. 6, FIG. 6 is a flow chart of a navigation system according to an embodiment of the present invention. First, in step S1, the image generating device 11, 11' generates a specific image that cannot be recognized by the human eye, the specific image corresponding to a particular external exhibit 10, and projects it onto the external exhibit 10. In step S2, when the user views the external exhibit 10 while wearing the wearable device 12, the receiving unit 121 of the wearable device 12 receives the specific image and the visible light of the external environment. In step S3, the filter unit 122 of the wearable device 12 filters out the visible-light band, leaving the specific image. In step S4, the determining unit 123 receives the specific image and determines whether it lies near the external exhibit 10 corresponding to the maximum viewing angle at which the user is looking, that is, whether the user is viewing the external exhibit 10. In step S5, the comparing unit 124, which has built-in preset images, receives the visible-light-filtered specific image and compares it with the preset images. Finally, in step S6, after the comparison the comparing unit 124 generates the exhibit navigation information corresponding to the preset image that matches the specific image and outputs it to the display unit 125, which displays the navigation information.
  • Referring now to FIG. 5 and FIG. 6 for the other embodiment: in step S1 the image generating device 11, 11' generates a specific image that cannot be recognized by the human eye, the specific image corresponding to a particular external exhibit 10, and projects it onto the external exhibit 10. In step S2, when the user views the external exhibit 10 while wearing the wearable device 21, the receiving unit 211 of the wearable device 21 receives the specific image and the visible light of the external environment. In step S3, the filter unit 212 of the wearable device 21 filters out the visible-light band. In step S4, the determining unit 213 receives the specific image and determines whether it lies near the external exhibit 10 corresponding to the maximum viewing angle at which the user is looking, that is, whether the user is viewing the external exhibit 10. In step S5, after this determination the determining unit 213 outputs the specific image to the cloud database 22, which has built-in preset images; the cloud database 22 receives the visible-light-filtered specific image transmitted by the wearable device 21 and compares it with the preset images. Finally, in step S6, after the comparison the cloud database 22 generates the exhibit navigation information corresponding to the preset image that matches the specific image and outputs it to the display unit 215, which displays the navigation information.
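  • A minimal sketch of this cloud-assisted variant is given below, simulating both sides in one process: the wearable device uploads the visible-light-filtered image and the cloud database returns the matching navigation text. The preset patterns, the navigation texts, and the mean-squared-difference matcher are assumptions for illustration; the patent only specifies that the exchange takes place over a wireless link.

```python
# Illustrative simulation of the cloud-assisted variant: the wearable device
# sends the filtered image, the cloud database compares it with its preset
# images and returns the matching navigation information.  All names, preset
# patterns and the matcher are assumptions, not the patented implementation.
import numpy as np

# --- cloud side -------------------------------------------------------------
PRESET_IMAGES = {                      # hypothetical preset patterns
    "pattern_A": np.eye(8),
    "pattern_B": np.fliplr(np.eye(8)),
}
GUIDE_INFO = {                         # hypothetical navigation texts
    "pattern_A": "Exhibit 1: ...",
    "pattern_B": "Exhibit 2: ...",
}

def cloud_compare(specific_image: np.ndarray) -> str:
    """Cloud database 22: compare the uploaded image with the preset images
    and return the navigation information of the closest match."""
    errors = {k: float(np.mean((specific_image - v) ** 2))
              for k, v in PRESET_IMAGES.items()}
    best = min(errors, key=errors.get)
    return GUIDE_INFO[best]

# --- wearable side ----------------------------------------------------------
def wearable_round_trip(filtered_image: np.ndarray) -> str:
    """Sending unit 214 uploads the filtered image (wirelessly in the real
    system); the returned navigation text is handed to display unit 215."""
    return cloud_compare(filtered_image)

print(wearable_round_trip(np.eye(8)))   # -> "Exhibit 1: ..."
```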
  • In the above embodiments, the mask plate 113 is a mask plate on which a specific pattern can be set so that some portions transmit light and other portions block it, for example a GOBO mask plate; the present invention is not limited in this respect.
  • In the above embodiments, the light emitted by the light-source light bars 111, 111' may, apart from infrared light, be any other invisible light that the human eye cannot recognize, as long as its wavelength lies between about 700 nm and 1000 nm, which is sufficient to achieve the effect of the present invention. The light-source light bar 111, 111' may be a light emitting diode (LED) or a lamp that emits high-power infrared light; the invention is not limited in this respect.
  • In the above embodiments, the visible-light band ranges from about 400 nm to 700 nm.
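  • Taken together, the two bands imply a simple pass/block rule for the filter units: wavelengths in roughly the 700 to 1000 nm infrared band (the projected specific image) pass, while the roughly 400 to 700 nm visible band is removed. The helper below merely restates that rule; the function name and the per-wavelength test are assumptions, not part of the patent.

```python
def passes_filter(wavelength_nm: float) -> bool:
    """Return True if light of this wavelength should survive the filter unit:
    the ~700-1000 nm infrared band used for the specific image passes,
    the ~400-700 nm visible band is filtered out."""
    return 700.0 < wavelength_nm <= 1000.0

print(passes_filter(850))   # True  - infrared image light
print(passes_filter(550))   # False - visible ambient light
```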
  • In the above embodiments, the wireless transmission is a transmission mode conforming to a network communication protocol, for example Ethernet, WiFi, 2.5G, or 3G; the invention is not limited in this respect.
  • In the above embodiments, the wearable devices 12, 21 are, for example, in the form of glasses or head-mounted devices; the invention is not limited in this respect.
  • In the above embodiments, the type and number of the patterns can be designed and set according to the user's requirements; the present invention is not limited in this respect.
  • Because the determining units 123, 213 are arranged in the wearable devices 12, 21, the system can determine which external exhibits lie within the maximum viewing angle that the user is currently looking at; in other words, it can actively determine which external exhibit 10 the user is currently viewing. The orientation problem is therefore solved without having to solve a positioning problem, and automatic guidance is achieved at the same time: the user does not need to enter an exhibit identification code, scan an exhibit's QR code with a mobile device, rely on printed guides, or ask for a docent. The navigation systems 1 and 2 of the present invention are therefore faster and more convenient than the existing navigation methods. Furthermore, the image generating devices 11, 11' project infrared light that cannot be recognized by the human eye onto the external exhibits 10, so the user's visual perception of the external exhibits 10 is not affected.

Abstract

A guiding system (1) composed of an image generating device (11, 11') and a wearable device (12). The image generating device (11, 11') generates a specific image that human eyes cannot recognize and that corresponds to an external exhibit (10), and projects the specific image onto the external exhibit (10). When the user looks at the external exhibit (10) through the wearable device (12), the specific image is received and compared with built-in preset images, and the corresponding guiding information is output after the comparison. This design automatically provides the guiding information a user needs without affecting the visual experience of looking at the exhibits.

Description

导览系统及方法 技术领域  Navigation system and method
本发明是有关于一种导览系统及方法, 特别是有关于一种可自动导览的导 览系统及方法。  The present invention relates to a navigation system and method, and more particularly to a navigation system and method that can be automatically navigated.
背景技术 现有应用于展览馆的导览方式, 大致可分为专人导览、 纸本导览、 行动装 置导览及感应式导览机, 行动装置导览又进一歩分为触控语音导览机、 按键式 语音导览机及快速响应码(Quick Response Code; QR Code)导览机, 专人导览 是由一位导览者介绍展览品的导览资讯, 而纸本导览则是将导览资讯印制于纸 本上, 行动装置导览是配置每个使用者一个行动装置, 以触控式或按键式语音 导览机而言, 使用者在观看展览品时, 可于行动装置输入此展览品的号码, 导 览资讯即拨放, 而感应式导览机则是侦测到使用者靠近展览品时, 导览资讯即 以语音拨放, 快速响应码导览机则是在行动装置内建应用程式, 行动装置通过 此应用程式可扫描每个展览品的快速响应码, 行动装置显示符合此快速响应码 的导览资讯。 然而, 触控式或按键式语音导览机需要使用者输入符合展览品的串号码, 造成使用者的不便, 快速响应码导览机则必须额外撰写导览专用的应用程式, 也必须在展览品上设置快速响应码, 不仅影响展览品的美观, 撰写应用程式也 增加额外的成本; 至于感应式语音导览机则需要使用者接近展览品一定的距离 才可拨放导览资讯, 容易影响展览馆的人口动线, 以上皆为既有导览方式所具 有的多项缺失。 发明内容 为了解决上述现有导览方式所具有的问题, 本发明的主要目的在于提供一 种导览系统, 导览系统是由图像产生装置与穿戴式装置组成, 通过图像产生装 置产生对应于特定外部展品的特定图像, 并投射特定图像至外部展品, 再经由 使用者透过配戴穿戴式装置观看外部展品时, 接收特定图像, 并比对特定图像 与内建的预设图像, 比对后输出导览资讯。 通过本发明的设计, 可自动提供使 用者所需的导览资讯, 且不影响使用者观看展品的视觉感受。 根据上述目的, 本发明主要目的在于提供一种导览系统, 包括:图像产生装 置, 其产生人眼无法识别的特定图像, 并将特定图像投射至外部展品; 及穿戴 式装置, 用于让使用者透过穿戴式装置观看外部展品而接收特定图像, 判断特 定图像是否位于使用者观看的最大视角处所对应的外部展品附近, 将特定图像 与内建的预设图像进行比对, 比对后输出对应于特定图像的导览资讯, 使用者 于穿戴式装置观看导览资讯。 所述的导览系统, 其中图像产生装置包括: 外壳; 光源, 其包覆于外壳内, 用于发出红外线光; 聚焦片, 其配置于光源的前方, 并与光源相隔一段距离, 用于将光源发出的红外线光进行聚焦; 遮罩片, 其平行配置于聚焦片的前方, 并与聚焦片相隔一段距离, 遮罩片具有产生特定图像的图案; 第一调焦片, 其 平行配置于遮罩片的前方, 并与遮罩片相隔一段距离, 用于调整特定图像的焦 距; 及第二调焦片, 其平行配置于第一调焦片的前方, 并与第一调焦片相隔一 段距离, 用于调整特定图像地的焦距。 所述的导览系统, 其中图像产生装置包括:光源灯条, 用于发出红外线光; 导光板, 光源灯条配置于导光板的周围, 导光板用于将光源发出的红外线光均 匀导光至整张版子; 遮罩板, 其配置于导光板的前方, 可将遮罩板挖空成预先 设想的成像图形; 透光纸, 其配置于遮罩板的前方, 用于使红外线光源透过但 可挡住后方的遮罩板的图形; 透明压克力板及铝条, 用于将板子及纸张固定接 合。 所述的导览系统, 其中穿戴式装置包含:接收单元, 具有输入端与输出端, 输入端接收使用者观看外部展品上投影的特定图像与外部环境的可见光; 滤镜 单元, 具有输入端与输出端, 输入端连接于接收单元的输出端, 滤镜单元的输 入端接收特定图像与可见光后, 将可见光滤除, 并于输出端输出特定图像; 判 断单元, 具有输入端与输出端, 输入端连接于滤镜单元的输出端, 判断单元的 输入端接收特定图像, 并判断特定图像是否位于使用者观看的最大视角处所对 应的外部展品附近; 比对单元, 具有输入端与输出端, 输入端连接于判断单元 的输出端, 比对单元内建预设图像, 且比对单元的输入端接收经滤除可见光的 特定图像后, 比对特定图像与预设图像, 并于输出端输出对应于特定图像的导 览资讯; 及显示单元, 其一端连接于比对单元的输出端, 接收并显示导览资讯。 根据上述目的, 本发明的另一目的在于提供一种导览系统, 包括:图像产生 装置, 其产生人眼无法识别的特定图像, 并将特定图像投射至外部展品; 穿戴 式装置, 用于让使用者透过穿戴式装置观看外部展品而接收特定图像, 判断特 定图像是否位于使用者观看的最大视角处所对应的外部展品附近, 判断后输出 对应于特定图像的导览资讯; 及云端资料库, 其无线连接于穿戴式装置, 云端 资料库内建多个预设图像, 用于接收穿戴式装置所传输的特定图像后, 比对特 定图像与预设图像, 并输出相同于特定图像的预设图像所对应的导览资讯至穿 戴式装置。 所述的导览系统, 其中穿戴式装置包括:接收单元, 具有第一输入端、 第二 输入端、 第一输出端及第二输出端, 第一输入端接收使用者观看外部展品上投 影的特定图像与外部环境的可见光, 第二输入端接收云端资料库输出的导览资 讯, 第一输出端输出特定图像与可见光, 第二输出端输出导览资讯; 滤镜单元, 具有输入端与输出端, 输入端连接于接收单元的第一输出端, 接收特定图像与 可见光后, 将可见光滤除, 并于输出端输出特定图像; 判断单元, 具有输入端 与输出端, 输入端连接于滤镜单元的输出端, 接收特定图像, 并判断特定图像 是否位于使用者观看的最大视角处所对应的外部展品附近; 发送单元, 具有第 一输入端、 第二输入端、 第一输出端及第二输出端, 第一输入端连接于判断单 元的输出端, 接收经滤除可见光的特定图像后, 并于第一输出端发送特定图像 至云端资料库进行特定图像与预设图像的比对, 第二输入端连接于接收单元的 第二输出端, 接收接收单元输出的导览资讯, 第二输出端输出导览资讯; 及显 示单元, 其一端连接于发送单元的第二输出端, 接收并显示导览资讯。 所述的导览系统, 其中云端资料库传递导览资讯至接收单元, 及发送单元 传递特定图像至云端资料库的方式为无线传递方式。 根据上述目的, 本发明的再一目的在于提供一种导览方法, 包括以下歩骤: 产生人眼无法识别的特定图像, 特定图像对应于特定的外部展品; 使用者观看 外部展品时, 由使用者所配戴的穿戴式装置接收特定图像与外部环境的可见光; 滤除可见光; 判断特定图像是否位于使用者观看的最大视角处所对应的外部展 品附近; 比对特定图像与内建的多个预设图像; 及产生相同于特定图像的预设 图像所对应的导览资讯。 经上述可知通过本发明的导览系统, 可自动提供使用者所需的导览资讯, 且不影响使用者观看展品的视觉感受。 附图说明 图 1 为本发明一实施例的导览系统的示意图。 图 2 为本发明的导览系统的图像产生装置的第一实施例示意图。 图 3 为本发明的导览系统的图像产生装置的第二实施例示意图 图 4 为本发明一实施例的各种特定图像的示意图。 图 5 为本发明另一实施例的导览系统的示意图。 图 6 为本发明一实施例的导览系统的流程图。 BACKGROUND OF THE INVENTION The existing navigation methods used in exhibition halls can be roughly divided into special person guides, paper guides, mobile device guides, and inductive guides. The mobile device guides are further divided into touch voice guides. A tour guide, a push-button audio guide, and a Quick Response Code (QR Code) tour guide. A guided tour is a guided tour of the exhibits, while the paper guide is The navigation information is printed on a paper. The mobile device navigation is to configure each user with a mobile device. 
In the case of a touch-sensitive or touch-tone audio guide, the user can act while viewing the exhibit. The device enters the number of the exhibit, the navigation information is dialed, and the inductive guide detects that the user is close to the exhibit, the navigation information is dialed, and the quick response code guide is An application is built in the mobile device, through which the mobile device can scan the quick response code of each exhibit, and the mobile device displays the navigation information conforming to the quick response code. However, the touch-sensitive or touch-tone audio guide requires the user to input the serial number that matches the exhibit, which is inconvenient for the user. The quick response code guide must additionally write a navigation-specific application, and must also be at the exhibition. Setting a quick response code on the product not only affects the aesthetics of the exhibit, but also adds additional cost to writing the application. As for the inductive audio guide, the user needs to be close to the exhibit to move the navigation information, which is easy to influence. The demographic line of the exhibition hall is all the missing of the existing navigation methods. SUMMARY OF THE INVENTION In order to solve the problems of the above conventional navigation method, it is a primary object of the present invention to provide a navigation system that is composed of an image generation device and a wearable device, and is generated by the image generation device to correspond to a specific Specific images of external exhibits, and project specific images to external exhibits When the user views the external exhibit by wearing the wearable device, the specific image is received, and the specific image and the built-in preset image are compared, and the navigation information is output. Through the design of the invention, the navigation information required by the user can be automatically provided without affecting the visual feeling of the user viewing the exhibit. In accordance with the above objects, it is a primary object of the present invention to provide a navigation system including: an image generating device that generates a specific image that is incapable of being recognized by a human eye, and projects a specific image to an external exhibit; and a wearable device for use The user receives the specific image through the wearable device to receive the specific image, determines whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user, compares the specific image with the built-in preset image, and compares the output. The user views the navigation information on the wearable device corresponding to the navigation information of the specific image. 
The navigation system, wherein the image generating device comprises: a housing; a light source enclosing the housing for emitting infrared light; and a focusing sheet disposed in front of the light source and spaced apart from the light source for The infrared light emitted by the light source is focused; the mask piece is arranged in parallel in front of the focusing piece and spaced apart from the focusing piece, the mask piece has a pattern for generating a specific image; the first focusing piece is arranged in parallel a front side of the cover sheet and spaced apart from the mask sheet for adjusting a focal length of the specific image; and a second focus sheet disposed in parallel with the front side of the first focus sheet and separated from the first focus sheet Distance, used to adjust the focal length of a specific image. The navigation system includes: a light source light bar for emitting infrared light; a light guide plate, the light source light bar is disposed around the light guide plate, and the light guide plate is used for uniformly guiding the infrared light emitted by the light source to a whole plate; a mask plate disposed in front of the light guide plate to hollow out the mask plate into a pre-conceived image pattern; a light transmissive paper disposed in front of the mask plate for allowing the infrared light source to pass through The pattern of the mask that blocks the rear; the transparent acrylic sheet and the aluminum strip are used to securely bond the board to the paper. The navigation system includes: a receiving unit having an input end and an output end, the input end receiving the visible light of the specific image projected by the user on the external exhibit and the external environment; the filter unit having the input end and the The output end is connected to the output end of the receiving unit, and the input end of the filter unit receives the specific image and visible light, filters the visible light, and outputs a specific image at the output end; the determining unit has an input end and an output end, and the input end Connected to the output of the filter unit, the judgment unit The input end receives a specific image, and determines whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user; the comparing unit has an input end and an output end, and the input end is connected to the output end of the judging unit, and the comparison unit is Constructing a preset image, and after receiving the specific image of the visible light filtered by the input end of the matching unit, comparing the specific image with the preset image, and outputting navigation information corresponding to the specific image at the output end; and displaying unit One end is connected to the output of the comparison unit to receive and display the navigation information. 
In accordance with the above objects, another object of the present invention is to provide a navigation system including: an image generating device that generates a specific image that is incapable of being recognized by a human eye, and projects a specific image to an external exhibit; a wearable device for The user receives the specific image through the wearable device to receive the specific image, determines whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user, and outputs the navigation information corresponding to the specific image after determining; and the cloud database. The wireless connection is connected to the wearable device, and the cloud database has a plurality of preset images for receiving the specific image transmitted by the wearable device, comparing the specific image with the preset image, and outputting the preset corresponding to the specific image. The navigation information corresponding to the image is to the wearable device. The navigation system includes: a receiving unit having a first input end, a second input end, a first output end, and a second output end, the first input end receiving the user viewing the projection on the external exhibit The specific image and the visible light of the external environment, the second input receives the navigation information output by the cloud database, the first output outputs a specific image and visible light, the second output outputs the navigation information; the filter unit has an input end and an output The input end is connected to the first output end of the receiving unit, and after receiving the specific image and the visible light, filtering the visible light and outputting a specific image at the output end; the determining unit has an input end and an output end, and the input end is connected to the filter The output end of the unit receives a specific image and determines whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user; the transmitting unit has a first input end, a second input end, a first output end, and a second output The first input end is connected to the output end of the judging unit, and receives a specific image that filters out visible light And transmitting a specific image to the cloud database at the first output end to compare the specific image with the preset image, the second input end is connected to the second output end of the receiving unit, and receiving the navigation information output by the receiving unit, The second output outputs navigation information; and the display unit has one end connected to the second output end of the transmitting unit, and receives and displays the navigation information. In the navigation system, the cloud database transmits the navigation information to the receiving unit, and the sending unit transmits the specific image to the cloud database in a wireless transmission manner. 
In accordance with the above objects, it is still another object of the present invention to provide a navigation method comprising the steps of: generating a specific image that is unrecognizable by a human eye, the specific image corresponding to a particular external exhibit; and the use of the external exhibit by the user The wearable device worn by the wearer receives the visible light of the specific image and the external environment; filters out the visible light; determines whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user; and compares the specific image with the built-in multiple pre- Set the image; and generate navigation information corresponding to the preset image that is the same as the specific image. It can be seen from the above that the navigation system of the present invention can automatically provide the navigation information required by the user without affecting the visual perception of the user viewing the exhibit. BRIEF DESCRIPTION OF DRAWINGS FIG. 1 is a schematic diagram of a navigation system according to an embodiment of the present invention. 2 is a schematic view showing a first embodiment of an image generating apparatus of the navigation system of the present invention. 3 is a schematic view showing a second embodiment of an image generating apparatus of a navigation system of the present invention. FIG. 4 is a schematic diagram of various specific images according to an embodiment of the present invention. FIG. 5 is a schematic diagram of a navigation system according to another embodiment of the present invention. 6 is a flow chart of a navigation system according to an embodiment of the present invention.
由于本发明揭露一种导览系统, 其中所利用的过滤可见光的光学方式, 已 为相关技术领域具有通常知识者所能明了, 故以下文中的说明, 不再作完整描 述。 同时, 以下结合附图和具体实施例对本发明作详细说明。 需说明的是, 附 图均采用非常简化的形式且均使用非精准的比率, 仅用以方便、 明晰地辅助说 明本发明实施例的目的。 首先, 请参阅图 1, 为本发明一实施例的导览系统的示意图。 如图 1所示, 本发明一实施例的导览系统 1, 是由图像产生装置 11、 IV 及穿戴式装置 12所 组成, 其中, 图像产生装置 11、 I V 产生人眼无法识别的特定图像, 并将此特 定图像投射至外部展品 10, 使用者透过穿戴式装置 12观看外部展品 10而接收 此特定图像及外部环境的可见光, 并将此可见光波段滤除, 仅留存特定图像, 并判断特定图像是否位于使用者观看的最大视角处所对应的外部展品 10附近, 意即判断使用者是否正在观赏外部展品 10, 判断后将经滤除可见光波段的特定 图像与内建的多个预设图像进行比对, 比对后输出对应于特定图像的导览资讯, 使用者可于穿戴式装置 12观看或接收与外部展品 10相关的导览资讯。 请继续参阅图 1, 穿戴式装置 12包含接收单元 121、 滤镜单元 122、 判断单 元 123、 比对单元 124及显示单元 125; 其中, 接收单元 121具有输入端与输出 端, 滤镜单元 122具有输入端与输出端, 判断单元 123具有输入端与输出端, 比对单元 124具有输入端与输出端,滤镜单元 122的输入端连接于接收单元 121 的输出端, 判断单元 123的输入端连接于滤镜单元 122的输出端, 比对单元 124 的输入端连接于判断单元 123的输出端, 显示单元 125的一端连接于比对单元 124的输出端。 请继续参阅图 1, 接收单元 121的输入端接收使用者观看外部展品 10上投 影的特定图像及外部环境的可见光后, 于输出端输出至滤镜单元 122的输入端, 滤镜单元 122的输入端接收特定图像与可见光后, 将可见光波段滤除, 仅留存 特定图像, 并于输出端输出特定图像至判断单元 123的输入端, 判断单元 123 的输入端接收特定图像并判断特定图像是否位于使用者观看的最大视角处所对 应的外部展品 10附近, 意即判断使用者是否正在观赏外部展品 10, 接着, 于输 出端输出至比对单元 124的输入端, 比对单元 124内建多个预设图像及对应于 各个预设图像的各种展品导览资讯, 且比对单元 124的输入端接收经滤除可见 光波段的特定图像后, 比对特定图像与预设图像, 并于输出端输出相同于特定 图像的预设图像所对应的展品导览资讯至显示单元 125的一端, 显示单元 125 接收并显示导览资讯。 接着, 请同时参阅图 2及图 4, 图 2为本发明的导览系统地图像产生装置的 第一实施例示意图, 图 4为本发明一实施例的各种特定图像的示意图。 如图 2 所示, 图像产生装置由外壳 111、 第一支架 l l la、 第二支架 光源 112、 聚焦片 113、 遮罩片 114、 第一调焦片 115及第二调焦片 116所组成; 其中, 第 一支架 111a设置于外壳 111的一端, 第二支架 111b设置于外壳 111的另一端, 光源 112包覆于外壳 111内, 聚焦片 113的一端面固定于第一支架 l lla, 另一 端面固定于第二支架 l llb, 聚焦片 113配置于光源 112的前方, 并与光源 112 相隔一段距离; 遮罩片 114的一端面固定于第一支架 ll la, 另一端面固定于第 二支架 lllb, 遮罩片 114平行配置于聚焦片 113的前方, 并与聚焦片 113相隔 一段距离, 遮罩片 U4具有产生特定图像的图案, 特定图像的图案如图 4所示, 以本实施例而言, 有四种特定图像的图案八、 B、 C、 D, 各个图案八、 B、 C、 D分 别对应不同的展品, 图案 A、 B、 C、 D中的黑色部分为不透光的部分, 而白色部 分为可透光的部分;第一支架 l lla同时穿透第一调焦片 115与第二调焦片 116, 第二支架 ll lb同时穿透第一调焦片 115与第二调焦片 116,第一支架 llla与第 二支架 lllb分别于第一调焦片 115及第二调焦片 116中相隔一个间距, 第一调 焦片 115平行配置于遮罩片 114的前方, 并与遮罩片 114相隔一段距离, 第二 调焦片 116平行配置于第一调焦片 115的前方, 并与第一调焦片 115相隔一段 距离, 第一调焦片 115与第二调焦片 116为可移动的调焦片, 用以调整焦距 请继续参阅图 2,光源 112用于发出人眼无法识别的红外线光 (属不可见光), 红外线光行经聚焦片 113后, 将红外线光进行聚焦, 经聚焦后的红外线光行经 遮罩片 114时, 透过遮罩片 114上的特定图像的图案产生一种红外线光形式的 特定图像, 接着, 再经由第一调焦片 115与第二调焦片 116调整特定图像的焦 距, 最后, 特定图像照射于外部展品 10上。 接着, 请同时参阅图 3及图 4, 图 3为本发明的导览系统的图像产生装置的 第二实施例示意图, 图 4为本发明一实施例的各种特定图像的示意图。 如图 3 所示, 图像产生装置 1 Γ 为红外光线灯箱, 由光源灯条 1U ' 、 导光板 112 ' 、 遮罩板 113 ' 、 透光纸 114' 、 透明压克力板 115 ' 及铝条 116' 所组成; 其中, 光源灯条 11 Γ 配置于导光板 112 ' 的周围, 遮罩板 113 ' 配置于导光板 112 ' 的前方, 遮罩板 113 ' 具有产生特定图像的图案, 特定图像的图案如图 4所示, 以本实施例而言, 有四种特定图像的图案八、 B、 C、 D, 各个图案八、 B、 C、 D分 别对应不同的展品, 图案 A、 B、 C、 D中的黑色部分为不透光的部分, 而白色部 分为可透光的部分; 透光纸 114' 配置于遮罩板的前方, 用于使红外线光源透过 但可挡住后方的遮罩板 113 ' 的图形;透光纸 114' 前方配置透明压克力板 115 ' 及铝条 116 ' , 用于将所有板子及纸张固定接合, 组成红外光线灯箱。 请继续参阅图 3, 光源灯条 11 Γ 用于发出人眼无法识别的红外线光 (属不 可见光), 红外线光由导光板 112 ' 进行导光, 经导光后的红外线光行经遮罩板 113 ' 时, 红外线光即会通过遮罩板 113 ' 上的特定图像的图案产生一种红外线 光形式的特定图像, 接着红外线光再通过透光纸 114 ' 及透明压克力板 115 ' , 将特定图像照射于外部展品 10上。 接着, 请参阅图 5, 本发明另一实施例的导览系统的示意图。 如图 5所示, 导览系统 2是由图像产生装置 11、 I V 、 穿戴式装置 21及云端资料库 22所组 成, 其中, 穿戴式装置 21与云端资料库 22是以无线方式连接。 请继续参阅图 5, 图像产生装置 11、 I V 产生人眼无法辨识的特定图像, 并将特定图像投射至外部展品 10, 使用者透过穿戴式装置 21观看外部展品 10 而接收特定图像及外部环境的可见光, 并将外部环境的可见光波段滤除, 仅留 存特定图像, 并判断特定图像是否位于使用者观看的最大视角处所对应的外部 展品 10附近, 意即判断使用者是否正在观赏外部展品 10, 判断后输出特定图像 至云端资料库 22,云端资料库 22中内建多个预设图像及对应于各个预设图像的 各种展品导览资讯, 云端资料库 22接收穿戴式装置 21所传输的经滤除可见光 波段的特定图像后, 比对特定图像与预设图像, 并输出相同于特定图像的预设 图像所对应的展品导览资讯至穿戴式装置 21,使用者可于穿戴式装置 21接收或 观看与外部展品 10相关的导览资讯。 请继续参阅图 5, 穿戴式装置 21由接收单元 211、 滤镜单元 212、 判断单元 213、 发送单元 214及显示单元 215所组成; 其中, 接收单元 211具有第一输入 端、 第二输入端、 第一输出端及第二输出端, 滤镜单元 212具有输入端与输出 端, 判断单元 213具有输入端与输出端, 发送单元 214具有第一输入端、 第二 输入端、 第一输出端及第二输出端, 接收单元 211的第一输入端接收使用者观 看外部展品 10所投影出的特定图像与外部环境的可见光, 
接收单元 211的第二 输入端无线连接于云端资料库 22, 接收单元 211的第一输出端连接于滤镜单元 212的输入端, 接收单元 211的第二输出端连接于发送单元 214的第二输入端, 滤镜单元 212的输出端连接于判断单元 213的输入端, 判断单元 213的输出端 连接于发送单元 214的第一输入端, 发送单元 214的第一输出端无线连接于云 端资料库 22, 发送单元 214的第二输出端连接于显示单元 215。 请继续参阅图 5, 当使用者透过配戴穿戴式装置 21观看外部展品 10时, 穿 戴式装置 21的接收单元 211的第一输入端接收使用者观看外部展品 10时, 投 影在外部展品 10上的特定图像及外部环境的可见光, 接着, 接收单元 211的第 一输出端输出特定图像与可见光至滤镜单元 212的输入端, 滤镜单元 212的输 入端接收特定图像与外部环境的可见光后, 将可见光波段滤除, 仅留存特定图 像, 并于输出端输出特定图像至判断单元 213的输入端, 判断单元 213的输入 端接收特定图像后, 判断特定图像是否位于使用者观看的最大视角处所对应的 外部展品 10附近, 意即判断使用者是否正在观赏外部展品 10, 判断所接收的特 定图像确实对应于使用者观看的最大视角范围内的外部展品 10,则判断单元 213 的输出端输出经滤除可见光波段的特定图像至发送单元 214的第一输入端, 发 送单元 214的第一输入端接收经滤除可见光波段的特定图像后, 并于第一输出 端发送特定图像至云端资料库 22进行特定图像与预设图像的比对, 经云端资料 库 22比对后输出符合特定图像的预设图像的展品导览资讯至接收单元 211的第 二输入端, 接收单元 211的第二输出端输出导览资讯至发送单元 214的第二输 入端, 发送单元 214的第二输出端输出导览资讯至显示单元 215的一端, 显示 单元 215接收并显示符合使用者所观看外部展品 10的导览资讯; 图 5的图像产 生装置 11、 I V 内部构造与功能如图 2及图 3所示, 在此不再详述。 接着,请同时参阅图 1及图 6,图 1为本发明一实施例的导览系统的示意图, 图 6为本发明一实施例的导览系统的流程图。 首先, 如歩骤 S1所示, 图像产生 装置 11、 I V 产生人眼无法辨识的特定图像, 特定图像对应于特定的外部展品Since the present invention discloses a navigation system in which the optical mode of filtering visible light is utilized, it will be apparent to those skilled in the art, and the description below will not be fully described. Meanwhile, the present invention will be described in detail below with reference to the accompanying drawings and specific embodiments. It should be noted that the drawings are in a very simplified form and both use non-precise ratios, and are merely for convenience and clarity of the purpose of the embodiments of the present invention. First, please refer to FIG. 1, which is a schematic diagram of a navigation system according to an embodiment of the present invention. As shown in FIG. 1, a navigation system 1 according to an embodiment of the present invention is provided by an image generating device 11, IV and a wearable device 12. In the composition, the image generating device 11 and IV generate a specific image that is not recognized by the human eye, and project the specific image to the external exhibit 10, and the user views the external exhibit 10 through the wearable device 12 to receive the specific image and the external environment. The visible light, and the visible light band is filtered out, only the specific image is retained, and it is determined whether the specific image is located near the external exhibit 10 corresponding to the maximum viewing angle viewed by the user, that is, whether the user is viewing the external exhibit 10, after judging Comparing the specific image filtered by the visible light band with a plurality of built-in preset images, and comparing the navigation information corresponding to the specific image, the user can view or receive the external exhibit 10 at the wearable device 12 Related navigation information. Referring to FIG. 1 , the wearable device 12 includes a receiving unit 121 , a filter unit 122 , a determining unit 123 , a comparing unit 124 , and a display unit 125 . The receiving unit 121 has an input end and an output end, and the filter unit 122 has The input end and the output end, the determining unit 123 has an input end and an output end, the comparing unit 124 has an input end and an output end, the input end of the filter unit 122 is connected to the output end of the receiving unit 121, and the input end of the judging unit 123 is connected. 
At the output end of the filter unit 122, the input end of the comparison unit 124 is connected to the output end of the determination unit 123, and one end of the display unit 125 is connected to the output end of the comparison unit 124. Referring to FIG. 1 , the input end of the receiving unit 121 receives the specific image projected on the external exhibit 10 and the visible light of the external environment, and then outputs the output to the input end of the filter unit 122 at the output end, and the input of the filter unit 122. After receiving the specific image and the visible light, the terminal filters the visible light band, retains only the specific image, and outputs a specific image to the input end of the determining unit 123 at the output end, and the input end of the determining unit 123 receives the specific image and determines whether the specific image is in use. In the vicinity of the external exhibit 10 corresponding to the maximum viewing angle, it is determined whether the user is viewing the external exhibit 10, and then outputted to the input end of the comparing unit 124 at the output end, and the comparing unit 124 has a plurality of built-in presets. The image and the various exhibit navigation information corresponding to each preset image, and the input end of the comparing unit 124 receives the specific image filtered out of the visible light band, compares the specific image with the preset image, and outputs the same output at the output end The display guide information corresponding to the preset image corresponding to the preset image of the specific image to the display unit 125, the display unit 125 Receive and display navigation information. 2 and FIG. 4, FIG. 2 is a schematic diagram of a first embodiment of an image generating apparatus of a navigation system according to the present invention, and FIG. 4 is a schematic diagram of various specific images according to an embodiment of the present invention. As shown in FIG. 2, the image generating device is composed of a casing 111, a first bracket 111a, a second bracket light source 112, The first bracket 111a is disposed at one end of the outer casing 111, and the second bracket 111b is disposed at the other end of the outer casing 111, wherein the first bracket 111a is disposed at one end of the outer casing 111, and the second bracket 111b is disposed at the other end of the outer casing 111. The light source 112 is covered in the outer casing 111. One end surface of the focusing piece 113 is fixed to the first bracket 111a, and the other end surface is fixed to the second bracket 111b. The focusing piece 113 is disposed in front of the light source 112 and is separated from the light source 112. One end face of the mask piece 114 is fixed to the first bracket 11 la, and the other end surface is fixed to the second bracket 111b. The mask piece 114 is disposed in parallel in front of the focusing piece 113, and is spaced apart from the focusing piece 113 by a distance. The cover sheet U4 has a pattern for generating a specific image, and the pattern of the specific image is as shown in FIG. 4. 
In the present embodiment, there are four patterns of specific images, eight, B, C, and D, and each pattern is eight, B, C, D corresponds to different exhibits, the black portion of the patterns A, B, C, D is the opaque portion, and the white portion is the permeable portion; the first bracket 111a simultaneously penetrates the first focus sheet 115 And the second focusing sheet 116, the second bracket 11 lb When the first focusing piece 115 and the second focusing piece 116 are penetrated, the first bracket 111a and the second bracket 111b are respectively spaced apart from each other by a distance between the first focusing sheet 115 and the second focusing sheet 116, and the first focusing is performed. The sheet 115 is disposed in parallel with the mask sheet 114 at a distance from the mask sheet 114. The second focus sheet 116 is disposed in parallel with the first focus sheet 115 and spaced apart from the first focus sheet 115. The distance between the first focusing sheet 115 and the second focusing sheet 116 is a movable focusing sheet for adjusting the focal length. Referring to FIG. 2, the light source 112 is used to emit infrared light (invisible light) that is not recognized by the human eye. After the infrared light passes through the focusing sheet 113, the infrared light is focused, and when the focused infrared light passes through the mask sheet 114, a specific image in the form of infrared light is generated through the pattern of the specific image on the mask sheet 114. Then, the focal length of the specific image is adjusted via the first focusing sheet 115 and the second focusing sheet 116, and finally, the specific image is irradiated onto the external exhibit 10. 3 and FIG. 4, FIG. 3 is a schematic diagram of a second embodiment of an image generating apparatus of a navigation system according to the present invention, and FIG. 4 is a schematic diagram of various specific images according to an embodiment of the present invention. As shown in FIG. 3, the image generating device 1 is an infrared light box, which is composed of a light source strip 1U', a light guide plate 112', a mask 113', a transparent paper 114', a transparent acrylic plate 115', and an aluminum strip. The light source strip 11 Γ is disposed around the light guide plate 112 ′, and the mask plate 113 ′ is disposed in front of the light guide plate 112 ′, and the mask plate 113 ′ has a pattern for generating a specific image, and a specific image. The pattern is as shown in FIG. 4. In this embodiment, there are four specific image patterns eight, B, C, and D, and each of the patterns eight, B, C, and D respectively correspond to different exhibits, patterns A, B, and C. The black part of D is the opaque part, while the white part Divided into a light-transmissive portion; the transparent paper 114' is disposed in front of the mask plate for transmitting the infrared light source but blocking the pattern of the rear mask plate 113'; the transparent paper 114' is disposed with a transparent pressure in front The Phillips board 115' and the aluminum strip 116' are used to fix all the boards and paper to form an infrared light box. Referring to FIG. 3, the light source strip 11 Γ is used to emit infrared light (invisible light) that is unrecognizable by the human eye, and the infrared light is guided by the light guide plate 112 ′, and the guided infrared light passes through the mask 113 . 
', the infrared light will generate a specific image in the form of infrared light through the pattern of the specific image on the mask 113', and then the infrared light will pass through the transparent paper 114' and the transparent acrylic plate 115', which will be specific The image is illuminated on the exterior exhibit 10. Next, please refer to FIG. 5, which is a schematic diagram of a navigation system according to another embodiment of the present invention. As shown in FIG. 5, the navigation system 2 is composed of an image generating device 11, IV, a wearable device 21, and a cloud database 22, wherein the wearable device 21 and the cloud database 22 are wirelessly connected. Referring to FIG. 5, the image generating device 11 and IV generate a specific image that is not recognized by the human eye, and project a specific image to the external exhibit 10, and the user views the external exhibit 10 through the wearable device 21 to receive the specific image and the external environment. The visible light, and the visible light band of the external environment is filtered out, only a specific image is retained, and it is determined whether the specific image is located near the external exhibit 10 corresponding to the maximum viewing angle viewed by the user, that is, whether the user is viewing the external exhibit 10, After the determination, a specific image is outputted to the cloud database 22, and a plurality of preset images and various exhibit navigation information corresponding to each preset image are built in the cloud database 22, and the cloud database 22 receives the transmission transmitted by the wearable device 21. After the specific image of the visible light band is filtered, the specific image and the preset image are compared, and the exhibit navigation information corresponding to the preset image of the specific image is output to the wearable device 21, and the user can wear the wearable device 21 Receive or view navigation information related to external exhibits 10. Continuing to refer to FIG. 5 , the wearable device 21 is composed of a receiving unit 211 , a filter unit 212 , a determining unit 213 , a sending unit 214 , and a display unit 215 . The receiving unit 211 has a first input end and a second input end. The first output end and the second output end, the filter unit 212 has an input end and an output end, the judging unit 213 has an input end and an output end, and the sending unit 214 has a first input end, a second input end, and a first output end. a second output end, the first input end of the receiving unit 211 receives the visible light projected by the user from the external exhibit 10 and the visible light of the external environment, and the second receiving unit 211 The input end is wirelessly connected to the cloud database 22, the first output end of the receiving unit 211 is connected to the input end of the filter unit 212, and the second output end of the receiving unit 211 is connected to the second input end of the sending unit 214, the filter unit The output end of the sending unit 213 is connected to the input end of the determining unit 213, the output end of the determining unit 213 is connected to the first input end of the sending unit 214, and the first output end of the sending unit 214 is wirelessly connected to the cloud data base 22, and the transmitting unit 214 The second output is connected to the display unit 215. Referring to FIG. 
Referring to FIG. 5, when the user views the external exhibit 10 while wearing the wearable device 21, the first input end of the receiving unit 211 of the wearable device 21 receives the specific image projected onto the external exhibit 10 together with the visible light of the external environment. The first output end of the receiving unit 211 then outputs the specific image and the visible light to the input end of the filter unit 212; after receiving the specific image and the visible light of the external environment, the filter unit 212 filters out the visible light band, retains only the specific image, and outputs the specific image at its output end to the input end of the determining unit 213. After receiving the specific image, the determining unit 213 judges whether the specific image is located near the external exhibit 10 corresponding to the maximum viewing angle viewed by the user, that is, whether the user is viewing the external exhibit 10. When it is judged that the received specific image does correspond to an external exhibit 10 within the maximum viewing angle viewed by the user, the output end of the determining unit 213 outputs the specific image (with the visible light band filtered out) to the first input end of the transmitting unit 214. The first input end of the transmitting unit 214 receives the filtered specific image, and the first output end sends the specific image to the cloud database 22 so that the specific image can be compared with the preset images. After the comparison, the cloud database 22 outputs the exhibit navigation information corresponding to the preset image that matches the specific image to the second input end of the receiving unit 211; the second output end of the receiving unit 211 outputs the navigation information to the second input end of the transmitting unit 214, and the second output end of the transmitting unit 214 outputs the navigation information to one end of the display unit 215, which receives and displays the navigation information for the external exhibit 10 viewed by the user. The internal structure and function of the image generating devices 11 and 11' of FIG. 5 are as described above with reference to FIG. 2 and FIG. 3 and are not described in detail here.
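The specification states only that the determining unit judges whether the specific image lies near the exhibit corresponding to the maximum viewing angle viewed by the user. One plausible, purely illustrative reading is to test whether the detected pattern is centred in the wearer's field of view, as sketched below; the centroid test and the central_fraction threshold are assumptions, not the claimed method.

```python
# Illustrative judgement for the determining unit (assumed criterion): the
# specific image counts as "being viewed" when its bright pixels are centred
# in the frame captured by the wearable device.
from typing import Optional, Tuple

import numpy as np


def centroid(specific_image: np.ndarray) -> Optional[Tuple[float, float]]:
    """Centroid (row, col) of the non-zero pixels of the filtered image."""
    rows, cols = np.nonzero(specific_image > 0)
    if rows.size == 0:
        return None
    return float(rows.mean()), float(cols.mean())


def is_viewing(specific_image: np.ndarray, central_fraction: float = 0.5) -> bool:
    """True when the projected pattern sits inside the central part of the
    user's view, i.e. the user is judged to be looking at that exhibit."""
    c = centroid(specific_image)
    if c is None:
        return False
    h, w = specific_image.shape[:2]
    r_lo, r_hi = h * (1 - central_fraction) / 2, h * (1 + central_fraction) / 2
    c_lo, c_hi = w * (1 - central_fraction) / 2, w * (1 + central_fraction) / 2
    return r_lo <= c[0] <= r_hi and c_lo <= c[1] <= c_hi
```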
Referring to FIG. 1 and FIG. 6, FIG. 1 is a schematic diagram of a navigation system according to an embodiment of the present invention, and FIG. 6 is a flowchart of a navigation system according to an embodiment of the present invention. First, as shown in step S1, the image generating devices 11 and 11' generate a specific image that the human eye cannot recognize, the specific image corresponding to a specific external exhibit 10, and project the specific image onto the external exhibit 10. Next, in step S2, when the user views the external exhibit 10 while wearing the wearable device 12, the receiving unit 121 of the wearable device 12 receives the specific image and the visible light of the external environment. In step S3, the filter unit 122 of the wearable device 12 filters out the visible light band, leaving the specific image. In step S4, the determining unit 123 receives the specific image and judges whether the specific image is located near the external exhibit 10 corresponding to the maximum viewing angle viewed by the user, that is, whether the user is viewing the external exhibit 10. In step S5, the comparison unit 124, which has the preset images built in, receives the specific image with the visible light band filtered out and compares the specific image with the preset images. Finally, in step S6, after comparing the specific image with the preset images, the comparison unit 124 generates the exhibit navigation information corresponding to the preset image that matches the specific image and outputs it to the display unit 125, which displays the navigation information.
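For the FIG. 1 embodiment, the comparison of step S5 takes place inside the wearable device itself. The sketch below illustrates one way the comparison unit 124 could score the filtered specific image against its built-in preset images; the overlap score is an assumption introduced for the example, since the specification only states that the images are compared and the matching exhibit's navigation information is produced.

```python
# Illustrative steps S5-S6 for the FIG. 1 embodiment: comparison unit 124
# holds the preset images locally and hands the matching navigation
# information to display unit 125. The overlap score is an assumption.
from typing import Dict, Optional

import numpy as np


def comparison_unit(specific_image: np.ndarray,
                    preset_images: Dict[str, np.ndarray],
                    guide_info: Dict[str, str]) -> Optional[str]:
    """Return the navigation information of the preset image that best
    matches the specific image, or None when nothing matches."""
    best_id, best_score = None, 0.0
    for exhibit_id, preset in preset_images.items():
        overlap = np.logical_and(specific_image > 0, preset > 0).sum()
        union = np.logical_or(specific_image > 0, preset > 0).sum()
        score = float(overlap) / float(union) if union else 0.0
        if score > best_score:
            best_id, best_score = exhibit_id, score
    return guide_info.get(best_id) if best_id is not None else None
```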
Next, please refer to FIG. 5 and FIG. 6 together; FIG. 5 is a schematic diagram of a navigation system according to another embodiment of the present invention, and FIG. 6 is a flowchart of a navigation system according to an embodiment of the present invention. First, as shown in step S1, the image generating devices 11 and 11' generate a specific image that the human eye cannot recognize, the specific image corresponding to a specific external exhibit 10, and project the specific image onto the external exhibit 10. Next, in step S2, when the user views the external exhibit 10 while wearing the wearable device 21, the receiving unit 211 of the wearable device 21 receives the specific image and the visible light of the external environment. In step S3, the filter unit 212 of the wearable device 21 filters out the visible light band. In step S4, the determining unit 213 receives the specific image and judges whether the specific image is located near the external exhibit 10 corresponding to the maximum viewing angle viewed by the user, that is, whether the user is viewing the external exhibit 10. In step S5, after the determination, the determining unit 213 outputs the specific image to the cloud database 22; the cloud database 22 has the preset images built in, and after receiving the specific image (with the visible light band filtered out) transmitted by the wearable device 21, it compares the specific image with the preset images. Finally, in step S6, after comparing the specific image with the preset images, the cloud database 22 generates the exhibit navigation information corresponding to the preset image that matches the specific image and outputs it to the display unit 215, which displays the navigation information.
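In the FIG. 5 embodiment the comparison is moved to the cloud database 22, which keeps the preset images and the navigation information for every exhibit and answers lookups from the wearable device. The sketch below is only an illustration of that role: the storage layout, the exact-match criterion, and the method names are assumptions, and the wireless link is not modelled. In practice a tolerant matcher such as the overlap score in the previous sketch would take the place of the exact pixel comparison.

```python
# Illustrative cloud database 22 (assumed interface): it is built with the
# preset images and each exhibit's navigation information, and returns the
# information whose preset image matches the received specific image.
from typing import Dict, Optional

import numpy as np


class CloudDatabase:
    def __init__(self) -> None:
        self.preset_images: Dict[str, np.ndarray] = {}
        self.guide_info: Dict[str, str] = {}

    def register(self, exhibit_id: str, preset: np.ndarray, info: str) -> None:
        """Build in one preset image and its exhibit navigation information."""
        self.preset_images[exhibit_id] = preset
        self.guide_info[exhibit_id] = info

    def lookup(self, specific_image: np.ndarray) -> Optional[str]:
        """Return the navigation information of the preset image that matches
        the received specific image (exact pixel match, purely illustrative)."""
        for exhibit_id, preset in self.preset_images.items():
            if preset.shape == specific_image.shape and \
                    np.array_equal(preset > 0, specific_image > 0):
                return self.guide_info[exhibit_id]
        return None
```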
In the above embodiments of the present invention, the mask plate 113 is a mask plate on which a specific pattern can be set so that part of the light passes through and part does not, for example a GOBO mask plate; the present invention is not limited in this respect.

In the above embodiments, the light emitted by the light source strips 111 and 111' may, besides infrared light, be any other invisible light that the human eye cannot recognize, as long as its wavelength band lies between about 700 nm and 1000 nm, which achieves the effect of the present invention. The light source strips 111 and 111' may be light emitting diodes (LEDs) capable of emitting high-power infrared light, or lamps; the present invention is not limited in this respect.

In the above embodiments, the visible light band ranges from approximately 400 nm to 700 nm.

In the above embodiments, the wireless transmission conforms to a network communication protocol, for example Ethernet, WiFi, 2.5G, or 3G; the present invention is not limited in this respect.

In the above embodiments, the wearable devices 12 and 21 are, for example, shaped like glasses or are head-mounted devices; the present invention is not limited in this respect.

In the above embodiments, the type and number of patterns can be designed and configured according to the user's needs; the present invention is not limited in this respect.

In the above embodiments, by arranging the determining units 123 and 213 inside the wearable devices 12 and 21, the system can judge which external exhibits lie within the maximum viewing angle currently viewed by the user; in other words, it can actively determine which external exhibit 10 the user is currently viewing. The orientation problem is therefore solved without having to solve a positioning problem, and automatic guiding is achieved: the user does not need to enter an exhibit identification code, scan a quick response code on the exhibit with a mobile device, use printed guides, or rely on a dedicated guide. The navigation systems 1 and 2 of the present invention are therefore faster and more convenient than existing guiding methods. Moreover, the image generating devices 11 and 11' project infrared light, which the human eye cannot recognize, onto the external exhibit 10, so the user's visual experience of the external exhibit 10 is not affected.

The above description covers only preferred embodiments of the present invention and is not intended to limit the scope of the rights of the present invention; the description should be clear to and implementable by those skilled in the relevant art, and equivalent changes or modifications made without departing from the spirit disclosed by the present invention shall be included within the scope of the patent claims.

Claims

Claims
1. A navigation system, comprising:
an image generating device for generating a specific image that the human eye cannot recognize and projecting the specific image onto an external exhibit; and
a wearable device for allowing a user, while viewing the external exhibit through the wearable device, to receive the specific image, determining whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user, comparing, after the determination, the specific image with a plurality of built-in preset images, and outputting, after the comparison, navigation information corresponding to the specific image, so that the user views the navigation information on the wearable device.
2. The navigation system according to claim 1, wherein the image generating device comprises:
a housing;
a light source enclosed within the housing for emitting infrared light;
a focusing sheet disposed in front of the light source and spaced a distance from the light source, for focusing the infrared light emitted by the light source;
a mask sheet disposed in parallel in front of the focusing sheet and spaced a distance from the focusing sheet, the mask sheet having a pattern that generates the specific image;
a first focus-adjusting sheet disposed in parallel in front of the mask sheet and spaced a distance from the mask sheet, for adjusting the focal length of the specific image; and
a second focus-adjusting sheet disposed in parallel in front of the first focus-adjusting sheet and spaced a distance from the first focus-adjusting sheet, for adjusting the focal length of the specific image.
3. The navigation system according to claim 1, wherein the image generating device comprises:
a light source strip for emitting infrared light;
a light guide plate, the light source strip being disposed around the light guide plate, the light guide plate being used for guiding the infrared light emitted by the light source uniformly across the entire plate;
a mask plate disposed in front of the light guide plate, the mask plate being hollowed out into a predesigned imaging pattern;
a light-transmitting paper disposed in front of the mask plate, for transmitting the infrared light while concealing the pattern of the mask plate behind it; and
a transparent acrylic plate and aluminum strips for fixing and joining the plates and the paper.
4. The navigation system according to claim 1, wherein the wearable device comprises:
a receiving unit having an input end and an output end, the input end receiving the specific image projected onto the external exhibit viewed by the user and the visible light of the external environment, the output end outputting the specific image and the visible light;
a filter unit having an input end and an output end, the input end being connected to the output end of the receiving unit, the input end of the filter unit receiving the specific image and the visible light, filtering out the visible light, and outputting the specific image at the output end;
a determining unit having an input end and an output end, the input end being connected to the output end of the filter unit, the input end of the determining unit receiving the specific image and determining whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user;
a comparison unit having an input end and an output end, the input end being connected to the output end of the determining unit, the comparison unit having the preset images built in, the input end of the comparison unit receiving the specific image from which the visible light has been filtered, comparing the specific image with the preset images, and outputting, at the output end, the navigation information corresponding to the specific image; and
a display unit having one end connected to the output end of the comparison unit, for receiving and displaying the navigation information.
5. A navigation system, comprising:
an image generating device that generates a specific image that the human eye cannot recognize and projects the specific image onto an external exhibit;
a wearable device for allowing a user, while viewing the external exhibit through the wearable device, to receive the specific image, determining whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user, and outputting the specific image after the determination; and
a cloud database wirelessly connected to the wearable device, the cloud database having a plurality of preset images built in, for receiving the specific image output by the wearable device, comparing the specific image with the plurality of preset images, and outputting the navigation information corresponding to the preset image identical to the specific image to the wearable device, the wearable device displaying the navigation information.
6. The navigation system according to claim 4, wherein the wearable device further comprises:
a receiving unit having a first input end, a second input end, a first output end, and a second output end, the first input end receiving the specific image projected onto the external exhibit viewed by the user and the visible light of the external environment, the second input end receiving the navigation information output by the cloud database, the first output end outputting the specific image and the visible light, the second output end outputting the navigation information;
a filter unit having an input end and an output end, the input end being connected to the first output end of the receiving unit, the input end of the filter unit receiving the specific image and the visible light, filtering out the visible light, and outputting the specific image at the output end;
a determining unit having an input end and an output end, the input end being connected to the output end of the filter unit, the input end of the determining unit receiving the specific image and determining whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user;
a transmitting unit having a first input end, a second input end, a first output end, and a second output end, the first input end being connected to the output end of the determining unit and receiving the specific image from which the visible light has been filtered, the first output end sending the specific image to the cloud database for comparison of the specific image with the preset images, the second input end being connected to the second output end of the receiving unit and receiving the navigation information output by the receiving unit, the second output end outputting the navigation information; and
a display unit having one end connected to the second output end of the transmitting unit, for receiving and displaying the navigation information.
7. The navigation system according to claim 5, wherein the cloud database delivers the navigation information to the receiving unit, and the transmitting unit delivers the specific image to the cloud database, by wireless transmission.
8. A navigation method, comprising the following steps:
generating a specific image that the human eye cannot recognize, the specific image corresponding to a specific external exhibit;
receiving, by a wearable device worn by a user when the user views the external exhibit, the specific image and the visible light of the external environment;
filtering out the visible light;
determining whether the specific image is located near the external exhibit corresponding to the maximum viewing angle viewed by the user;
comparing the specific image with a plurality of built-in preset images; and
generating the navigation information corresponding to the preset image identical to the specific image.
PCT/IB2015/057759 2014-10-15 2015-10-10 Guiding system and method WO2016059530A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW103135651A TWI613615B (en) 2014-10-15 2014-10-15 Guiding system and method
TW103135651 2014-10-15

Publications (1)

Publication Number Publication Date
WO2016059530A1 true WO2016059530A1 (en) 2016-04-21

Family

ID=55746205

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/057759 WO2016059530A1 (en) 2014-10-15 2015-10-10 Guiding system and method

Country Status (3)

Country Link
CN (1) CN105528386A (en)
TW (1) TWI613615B (en)
WO (1) WO2016059530A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040090498A (en) * 2003-04-17 2004-10-26 이노에이스(주) Legacy Guidance System Using the Wireless Phone and Method Thereof
JP2005080116A (en) * 2003-09-02 2005-03-24 Fuji Photo Film Co Ltd Imaging device and information transmission system
CN2906798Y (en) * 2006-05-09 2007-05-30 彭建新 Invisible code optical phonation globe
CN202996080U (en) * 2012-06-29 2013-06-12 上海崇立文化传播有限公司 Handbook enabling independent travel via two-dimensional point codes and touch-reading electronic pen
CN103778261A (en) * 2014-03-04 2014-05-07 福建瑞恒信息技术有限公司 Self-guided tour method based on mobile cloud computing image recognition
CN103858073A (en) * 2011-09-19 2014-06-11 视力移动技术有限公司 Touch free interface for augmented reality systems

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103383446B (en) * 2013-04-09 2017-07-07 北京半导体照明科技促进中心 Indoor orientation method, device and system and light source based on visible ray
TWM498354U (en) * 2014-10-15 2015-04-01 Digital Art Fundation Guiding system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040090498A (en) * 2003-04-17 2004-10-26 이노에이스(주) Legacy Guidance System Using the Wireless Phone and Method Thereof
JP2005080116A (en) * 2003-09-02 2005-03-24 Fuji Photo Film Co Ltd Imaging device and information transmission system
CN2906798Y (en) * 2006-05-09 2007-05-30 彭建新 Invisible code optical phonation globe
CN103858073A (en) * 2011-09-19 2014-06-11 视力移动技术有限公司 Touch free interface for augmented reality systems
CN202996080U (en) * 2012-06-29 2013-06-12 上海崇立文化传播有限公司 Handbook enabling independent travel via two-dimensional point codes and touch-reading electronic pen
CN103778261A (en) * 2014-03-04 2014-05-07 福建瑞恒信息技术有限公司 Self-guided tour method based on mobile cloud computing image recognition

Also Published As

Publication number Publication date
TWI613615B (en) 2018-02-01
CN105528386A (en) 2016-04-27
TW201614575A (en) 2016-04-16

Similar Documents

Publication Publication Date Title
US10031579B2 (en) Automatic calibration for reflective lens
TWI615631B (en) Head-mounted display device and control method of head-mounted display device
JP6364715B2 (en) Transmission display device and control method of transmission display device
JP5834439B2 (en) Head-mounted display device and method for controlling head-mounted display device
JP6149403B2 (en) Display device and control method of display device
US10613333B2 (en) Head-mounted display device, computer program, and control method for head-mounted display device
US10191282B2 (en) Computer display device mounted on eyeglasses
US9336779B1 (en) Dynamic image-based voice entry of unlock sequence
JP6380091B2 (en) Head-mounted display device, head-mounted display device control method, and computer program
JP6432197B2 (en) Display device, display device control method, and program
US20160035137A1 (en) Display device, method of controlling display device, and program
WO2021031722A1 (en) Interactive method, head-mounted device, interactive system and storage medium
JP6479628B2 (en) Glasses-type information terminal, information processing apparatus, computer program
JP6539981B2 (en) Display device, control method of display device, display system, and program
TWM498354U (en) Guiding system
WO2016059530A1 (en) Guiding system and method
JP6476673B2 (en) Head-mounted display device, head-mounted display device control method, and computer program
JP6582374B2 (en) Display device, control method therefor, and computer program
JP6430347B2 (en) Electronic device and display method
JP6004053B2 (en) Head-mounted display device and method for controlling head-mounted display device
JP7055925B2 (en) Electronic devices and display methods
JP2019013050A (en) Electronic apparatus and display method
JP6394108B2 (en) Head-mounted display device, control method therefor, and computer program
JP2021002353A (en) Electronic apparatus and display method
JP2022089884A (en) Electronic device and display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15849880

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15849880

Country of ref document: EP

Kind code of ref document: A1