WO2018198320A1 - Wearable terminal display system, wearable terminal display method and program - Google Patents


Info

Publication number
WO2018198320A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable terminal
restaurant
menu
display
image
Prior art date
Application number
PCT/JP2017/016942
Other languages
French (fr)
Japanese (ja)
Inventor
俊二 菅谷 (Shunji Sugaya)
Original Assignee
株式会社オプティム (OPTiM Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社オプティム (OPTiM Corporation)
Priority to PCT/JP2017/016942
Publication of WO2018198320A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services
    • G06Q 50/12: Hotels or restaurants
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to a wearable terminal display system, a wearable terminal display method, and a program for displaying a collected restaurant menu as augmented reality on the display board of a wearable terminal, overlaid on the restaurant seen through that display board.
  • Patent Document 1 provides an electronic menu system that makes it easy to select a menu suited to one's own health condition.
  • However, because the system of Patent Document 1 merely replaces a restaurant's tabletop terminal, it cannot tell a user what menu a restaurant offers before the user enters it.
  • In view of the above problem, an object of the present invention is to provide a wearable terminal display system, a wearable terminal display method, and a program that identify a restaurant from an image in the field of view of a wearable terminal and display a menu collected for that restaurant as augmented reality on the display board of the wearable terminal.
  • The present invention provides the following solutions.
  • The invention according to the first feature provides a wearable terminal display system for displaying a restaurant menu on the display board of a wearable terminal, comprising: image acquisition means for acquiring an image of a restaurant that has entered the field of view of the wearable terminal; identifying means for analyzing the image and identifying the restaurant; collecting means for collecting the restaurant's menu; and menu display means for displaying, on the display board of the wearable terminal, the menu as augmented reality overlaid on the restaurant seen through the display board.
  • The invention according to the first feature also provides a wearable terminal display method for displaying a restaurant menu on the display board of a wearable terminal, comprising: an image acquisition step of acquiring an image of a restaurant that has entered the field of view of the wearable terminal; an identifying step of analyzing the image and identifying the restaurant; a collecting step of collecting the restaurant's menu; and a menu display step of displaying, on the display board of the wearable terminal, the menu as augmented reality overlaid on the restaurant seen through the display board.
  • The invention according to the first feature further provides a program that causes a computer to execute: an image acquisition step of acquiring an image of a restaurant that has entered the field of view of a wearable terminal; an identifying step of analyzing the image and identifying the restaurant; a collecting step of collecting the restaurant's menu; and a menu display step of displaying, on the display board of the wearable terminal, the menu as augmented reality overlaid on the restaurant seen through the display board.
  • With these, the restaurant's menu can be displayed on the display board of the wearable terminal before the user enters the restaurant.
  • FIG. 1 is a schematic diagram of a wearable terminal display system.
  • FIG. 2 is an example in which a restaurant menu is collected and displayed on the display board of the wearable terminal.
  • The wearable terminal display system of the present invention displays collected menus as augmented reality on the display board of a wearable terminal, overlaid on restaurants seen through the display board.
  • A wearable terminal is a terminal with a field of view, such as smart glasses or a head-mounted display.
  • FIG. 1 is a schematic diagram of a wearable terminal display system which is a preferred embodiment of the present invention.
  • The wearable terminal display system includes image acquisition means, identifying means, collecting means, and menu display means, each realized by a control unit reading a predetermined program.
  • It may similarly include determination means, changing means, detecting means, action result display means, position/direction acquisition means, estimating means, guideline display means, and selection accepting means. These may be application-based, cloud-based, or otherwise.
  • Each of the above means may be realized by a single computer or by two or more computers (for example, a server and a terminal).
  • The image acquisition means acquires an image of a restaurant that has entered the field of view of the wearable terminal. The image may be captured by the camera of the wearable terminal, or by any other device capable of capturing such an image.
  • The image may be a moving image or a still image; to display the menu in real time, a real-time image is preferable.
  • The identifying means identifies the restaurant by analyzing the image; for example, whether the restaurant is a McDonald's, a Yoshinoya, or a Gusto. The restaurant can be identified from its storefront, store name, sign color, logo, and so on. When identifying every restaurant in the image would take too long, only the restaurant at the center of the wearable terminal's field of view may be identified, which greatly reduces the time required.
  • The accuracy of the image analysis may be improved by machine learning, for example by using past images of restaurants as training data.
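The identification step above can be sketched as a simple feature lookup. This is a hypothetical stand-in for real image analysis: the sign color and sign text are assumed to have already been extracted from the camera frame, and the table of known chains is illustrative, not from the patent.

```python
# Hypothetical sketch: identify a restaurant chain from already-extracted
# image features (sign color, text seen on the sign). A real system would
# use a trained image classifier on the raw frame instead.

KNOWN_RESTAURANTS = {
    # (dominant sign color, keyword on the sign) -> chain name
    ("red", "M"): "McDonald's",
    ("orange", "Yoshinoya"): "Yoshinoya",
    ("red", "Gusto"): "Gusto",
}

def identify_restaurant(sign_color, sign_text):
    """Return the chain name matching the extracted features, or None."""
    for (color, keyword), name in KNOWN_RESTAURANTS.items():
        if color == sign_color and keyword in sign_text:
            return name
    return None
```

Restricting the input to the center of the field of view, as the text suggests, simply means running this lookup on one candidate instead of all of them.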
  • The collecting means collects a menu corresponding to the restaurant.
  • Menus may be collected by referring to a database in which menus are registered in advance.
  • Menus may also be collected by accessing Web content linked to the restaurant in advance; for example, by assigning a URL that ties the restaurant to its menu, the menu can be collected from that Web content.
  • Menus may also be collected from Web content found by searching for the restaurant on the Internet; for example, menus are often posted on restaurants' home pages, from which they can be collected.
  • Menus may also be collected from SNS (social networking services) or word-of-mouth review sites.
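The collection sources listed above form a natural fallback chain, which can be sketched as follows. The database, URL table, and web-fetch stub are all hypothetical placeholders; a real implementation would query an actual menu database and download and parse the linked page.

```python
# Illustrative sketch of the collection step: try a pre-registered menu
# database first, then fall back to Web content linked to the restaurant.

MENU_DB = {
    "McDonald's": ["Big Mac", "Fries"],       # assumed pre-registered entry
}
MENU_URLS = {
    "Yoshinoya": "https://example.com/yoshinoya/menu",  # placeholder URL
}

def fetch_menu_from_web(url):
    # A real system would download and parse the page; stubbed here.
    return ["Gyudon (from web)"]

def collect_menu(restaurant):
    if restaurant in MENU_DB:       # 1. database registered in advance
        return MENU_DB[restaurant]
    if restaurant in MENU_URLS:     # 2. Web content linked in advance
        return fetch_menu_from_web(MENU_URLS[restaurant])
    return []                       # 3. nothing found (search/SNS omitted)
```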
  • The menu display means displays the menu as augmented reality on the display board of the wearable terminal, overlaid on the restaurant seen through the display board.
  • In FIG. 2, a menu drawn with broken lines is displayed as augmented reality for a restaurant drawn with solid lines that is seen through the display board; for clarity, solid lines represent real objects and broken lines represent augmented reality.
  • The determination means determines whether the displayed menu has been browsed. Whether the menu has been browsed may be determined by acquiring an image of it being browsed and analyzing that image, or from sensor information of the wearable terminal or of sensors worn by the viewer, such as a gaze sensor, a motion sensor, or an acceleration sensor.
  • The changing means marks the menu as browsed when it is determined to have been browsed, and raises its attention level when it has not, so that it will be browsed. In this way, which menus have and have not been browsed can be grasped visually. For example, a menu may be marked as browsed by ticking its check box or by stamping it.
  • The attention level may be changed by changing the menu's color or size, or by stamping it so that it stands out.
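The determination and changing steps above amount to per-item state tracking, which can be sketched minimally. The class, field names, and the "attention level" counter are illustrative assumptions, not from the patent.

```python
# Sketch of the determination/changing steps: each displayed menu item keeps
# a browsed flag; items that were not viewed get their attention level
# raised so the wearer notices them (e.g. larger, recolored, stamped).

class MenuItem:
    def __init__(self, name):
        self.name = name
        self.browsed = False
        self.attention = 1   # 1 = normal; higher = more prominent

def update_after_gaze(item, was_viewed):
    """Mark viewed items as browsed; make unviewed ones stand out."""
    if was_viewed:
        item.browsed = True   # e.g. tick a check box or stamp the entry
    else:
        item.attention += 1   # e.g. enlarge or recolor the entry
```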
  • The detecting means detects an action on the displayed menu.
  • An action is, for example, a gesture, a hand movement, or a movement of the line of sight.
  • An action on the menu can be detected by acquiring an image of the menu being browsed and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer.
  • The action result display means displays the result corresponding to the action as augmented reality on the display board of the wearable terminal, overlaid on the restaurant seen through the display board.
  • For example, the menu display may be erased when an action to erase the menu is detected, or a link may be opened; other actions are also possible.
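The detection and action-result steps can be sketched as a dispatch from a detected gesture to a new display state. The gesture names and the dictionary-based display state are assumptions for illustration; gesture recognition itself is out of scope here.

```python
# Sketch of the action-result step: a detected gesture maps to a handler
# whose result would then be rendered back onto the display board.

def dispatch_action(gesture, menu):
    """Return the new display state for a detected gesture on `menu`."""
    if gesture == "swipe_away":    # action to erase the menu display
        return {"menu": None}
    if gesture == "tap_link":      # action to open the linked Web content
        return {"menu": menu, "open_url": menu.get("url")}
    return {"menu": menu}          # unrecognized gesture: no change
```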
  • The position/direction acquisition means acquires the terminal position and the imaging direction of the wearable terminal.
  • The terminal position can be acquired from the GPS (Global Positioning System) of the wearable terminal.
  • The imaging direction can be acquired from a geomagnetic sensor or an acceleration sensor of the wearable terminal at the time of imaging, although other sources may also be used.
  • The estimating means estimates the position of the restaurant from the terminal position and the imaging direction: if both are known, the position of the imaged restaurant can be estimated.
  • The identifying means may identify the restaurant from the estimated position together with the image analysis. Even within the same chain, menus sometimes differ from store to store; for example, if the store can be identified as the one in Minato Ward, Tokyo, that store's limited menu can also be displayed.
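The estimation step above can be sketched as projecting forward from the GPS fix along the compass bearing. The 50 m assumed distance and the small-offset flat-earth approximation are simplifying assumptions; a real system would estimate distance from the image or match against a store database.

```python
import math

# Sketch of the estimating step: from the terminal's GPS position and the
# imaging direction (compass bearing, 0 = north), project an assumed
# distance forward to estimate the store's position.

def estimate_store_position(lat, lon, bearing_deg, distance_m=50.0):
    """Project `distance_m` from (lat, lon) along `bearing_deg`."""
    earth_radius = 6371000.0  # meters
    d = distance_m / earth_radius           # angular distance in radians
    b = math.radians(bearing_deg)
    new_lat = lat + math.degrees(d * math.cos(b))
    new_lon = lon + math.degrees(d * math.sin(b) / math.cos(math.radians(lat)))
    return new_lat, new_lon
```

The estimated coordinates can then be matched against known store locations to pick, say, the Minato Ward branch over another branch of the same chain.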
  • The guideline display means displays, as augmented reality on the display board of the wearable terminal, a guideline for imaging the restaurant.
  • A guideline such as a frame or crosshairs may be displayed; having the image captured along the guideline makes image analysis easier.
  • The image acquisition means may acquire only images captured along the guideline.
  • By acquiring and analyzing only the image captured along the guideline, the restaurant can be identified efficiently.
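The guideline-based acquisition can be sketched as cropping the camera frame to the displayed guide frame, so that only the framed restaurant is analyzed. The (left, top, right, bottom) box and the row-major pixel representation are illustrative assumptions.

```python
# Sketch of the guideline step: keep only the part of the frame inside the
# displayed guide frame, narrowing analysis to a single restaurant.

def crop_to_guideline(image_rows, frame):
    """Crop a row-major image to the guide frame (left, top, right, bottom)."""
    left, top, right, bottom = frame
    return [row[left:right] for row in image_rows[top:bottom]]
```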
  • The selection accepting means accepts selection of a target restaurant seen through the display board of the wearable terminal. For example, selection may be accepted by gazing at a restaurant seen through the display board for a certain period of time, by touching the position where the restaurant is seen, or by placing a cursor on it; detection may use, for example, a gaze sensor, a motion sensor, or an acceleration sensor.
  • The menu display means may display the menu as augmented reality only for the selected target seen through the display board. Since the menu is displayed only for the selected target, it can be grasped precisely; if menus were displayed for every identified restaurant, the display board could become cluttered.
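The gaze-based selection described above (looking at a restaurant for a certain period of time) can be sketched as dwell-time accumulation. The 2-second threshold and the sample format are assumptions for illustration.

```python
# Sketch of the selection accepting step: a restaurant that stays under the
# wearer's gaze long enough becomes the selected target.

DWELL_THRESHOLD_S = 2.0  # assumed dwell time for a selection

def select_by_gaze(gaze_samples):
    """gaze_samples: sequence of (restaurant_id, seconds_gazed) readings.
    Return the first restaurant whose accumulated gaze meets the threshold,
    or None if no restaurant has been selected yet."""
    totals = {}
    for restaurant_id, seconds in gaze_samples:
        totals[restaurant_id] = totals.get(restaurant_id, 0.0) + seconds
        if totals[restaurant_id] >= DWELL_THRESHOLD_S:
            return restaurant_id
    return None
```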
  • The wearable terminal display method of the present invention displays collected menus as augmented reality on the display board of a wearable terminal, overlaid on restaurants seen through the display board.
  • The wearable terminal display method includes an image acquisition step, an identifying step, a collecting step, and a menu display step. Although not shown, it may similarly include a determination step, a changing step, a detecting step, an action result display step, a position/direction acquisition step, an estimating step, a guideline display step, and a selection accepting step.
  • The image acquisition step acquires an image of a restaurant that has entered the field of view of the wearable terminal. The image may be captured by the camera of the wearable terminal, or by any other device capable of capturing such an image.
  • The image may be a moving image or a still image; to display the menu in real time, a real-time image is preferable.
  • The identifying step identifies the restaurant by analyzing the image; for example, whether the restaurant is a McDonald's, a Yoshinoya, or a Gusto. The restaurant can be identified from its storefront, store name, sign color, logo, and so on. When identifying every restaurant in the image would take too long, only the restaurant at the center of the wearable terminal's field of view may be identified, which greatly reduces the time required.
  • The accuracy of the image analysis may be improved by machine learning, for example by using past images of restaurants as training data.
  • The collecting step collects a menu corresponding to the restaurant.
  • Menus may be collected by referring to a database in which menus are registered in advance.
  • Menus may also be collected by accessing Web content linked to the restaurant in advance; for example, by assigning a URL that ties the restaurant to its menu, the menu can be collected from that Web content.
  • Menus may also be collected from Web content found by searching for the restaurant on the Internet; for example, menus are often posted on restaurants' home pages, from which they can be collected.
  • Menus may also be collected from SNS (social networking services) or word-of-mouth review sites.
  • The menu display step displays the menu as augmented reality on the display board of the wearable terminal, overlaid on the restaurant seen through the display board.
  • In FIG. 2, a menu drawn with broken lines is displayed as augmented reality for a restaurant drawn with solid lines that is seen through the display board; for clarity, solid lines represent real objects and broken lines represent augmented reality.
  • The determination step determines whether the displayed menu has been browsed. Whether the menu has been browsed may be determined by acquiring an image of it being browsed and analyzing that image, or from sensor information of the wearable terminal or of sensors worn by the viewer.
  • The changing step marks the menu as browsed when it is determined to have been browsed, and raises its attention level when it has not, so that it will be browsed. In this way, which menus have and have not been browsed can be grasped visually. For example, a menu may be marked as browsed by ticking its check box or by stamping it.
  • The attention level may be changed by changing the menu's color or size, or by stamping it so that it stands out.
  • The detecting step detects an action on the displayed menu.
  • An action is, for example, a gesture, a hand movement, or a movement of the line of sight.
  • An action on the menu can be detected by acquiring an image of the menu being browsed and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer.
  • The action result display step displays the result corresponding to the action as augmented reality on the display board of the wearable terminal, overlaid on the restaurant seen through the display board.
  • For example, the menu display may be erased when an action to erase the menu is detected, or a link may be opened; other actions are also possible.
  • The position/direction acquisition step acquires the terminal position and the imaging direction of the wearable terminal.
  • The terminal position can be acquired from the GPS (Global Positioning System) of the wearable terminal.
  • The imaging direction can be acquired from a geomagnetic sensor or an acceleration sensor of the wearable terminal at the time of imaging, although other sources may also be used.
  • The estimating step estimates the position of the restaurant from the terminal position and the imaging direction: if both are known, the position of the imaged restaurant can be estimated.
  • The identifying step may identify the restaurant from the estimated position together with the image analysis. Even within the same chain, menus sometimes differ from store to store; for example, if the store can be identified as the one in Minato Ward, Tokyo, that store's limited menu can also be displayed.
  • The guideline display step displays, as augmented reality on the display board of the wearable terminal, a guideline for imaging the restaurant.
  • A guideline such as a frame or crosshairs may be displayed; having the image captured along the guideline makes image analysis easier.
  • The image acquisition step may acquire only images captured along the guideline.
  • By acquiring and analyzing only the image captured along the guideline, the restaurant can be identified efficiently.
  • The selection accepting step accepts selection of a target restaurant seen through the display board of the wearable terminal. For example, selection may be accepted by gazing at a restaurant seen through the display board for a certain period of time, by touching the position where the restaurant is seen, or by placing a cursor on it; detection may use, for example, a gaze sensor, a motion sensor, or an acceleration sensor.
  • The menu display step may display the menu as augmented reality only for the selected target seen through the display board. Since the menu is displayed only for the selected target, it can be grasped precisely; if menus were displayed for every identified restaurant, the display board could become cluttered.
  • The means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
  • The program may be, for example, an application installed on a computer, provided in SaaS (software as a service) form from a computer over a network, or provided recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM or the like), or a DVD (DVD-ROM, DVD-RAM, or the like).
  • In that case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it.
  • The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to the computer via a communication line.
  • As the machine learning technique described above, a nearest neighbor method, a naive Bayes method, a decision tree, a support vector machine, reinforcement learning, or the like may be used.
  • Deep learning, in which feature quantities for learning are generated using a neural network, may also be used.
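Of the learning methods listed, the nearest neighbor method is simple enough to sketch directly: a new feature vector receives the label of its closest training example. The toy 2-D feature vectors below are illustrative stand-ins for real image features.

```python
# Minimal nearest-neighbor sketch: label a query point with the label of
# the closest training example (squared Euclidean distance).

def nearest_neighbour(train, query):
    """train: list of (feature_vector, label); return the closest label."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda item: dist2(item[0], query))[1]
```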

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

[Problem] To identify a restaurant from an image in the field of view of a wearable terminal and to display menus collected for that restaurant as augmented reality on the display board of the wearable terminal. [Solution] A wearable terminal display system for displaying a restaurant menu on the display board of a wearable terminal, provided with: an image acquisition means for acquiring an image of a restaurant that has entered the field of view of the wearable terminal; an identifying means for analyzing the image to identify the restaurant; a collecting means for collecting menus of the restaurant; and a menu display means for displaying on the display board of the wearable terminal, as augmented reality, a menu of the restaurant visible through the display board.

Description

Wearable terminal display system, wearable terminal display method and program
The present invention relates to a wearable terminal display system, a wearable terminal display method, and a program for displaying a collected restaurant menu as augmented reality on the display board of a wearable terminal, overlaid on the restaurant seen through that display board.
In recent years, the use of IT in restaurants has been advancing. For example, an electronic menu system has been provided that makes it easy to select a menu suited to one's own health condition even when a store has no dedicated menu terminal (Patent Document 1).
JP 2014-157593 A
However, because the system of Patent Document 1 merely replaces a restaurant's tabletop terminal, it cannot tell a user what menu a restaurant offers before the user enters it.
In view of the above problem, an object of the present invention is to provide a wearable terminal display system, a wearable terminal display method, and a program that identify a restaurant from an image in the field of view of a wearable terminal and display a menu collected for that restaurant as augmented reality on the display board of the wearable terminal.
The present invention provides the following solutions.
The invention according to the first feature provides a wearable terminal display system for displaying a restaurant menu on the display board of a wearable terminal, comprising: image acquisition means for acquiring an image of a restaurant that has entered the field of view of the wearable terminal; identifying means for analyzing the image and identifying the restaurant; collecting means for collecting the restaurant's menu; and menu display means for displaying, on the display board of the wearable terminal, the menu as augmented reality overlaid on the restaurant seen through the display board.
The invention according to the first feature also provides a wearable terminal display method for displaying a restaurant menu on the display board of a wearable terminal, comprising: an image acquisition step of acquiring an image of a restaurant that has entered the field of view of the wearable terminal; an identifying step of analyzing the image and identifying the restaurant; a collecting step of collecting the restaurant's menu; and a menu display step of displaying, on the display board of the wearable terminal, the menu as augmented reality overlaid on the restaurant seen through the display board.
The invention according to the first feature further provides a program that causes a computer to execute: an image acquisition step of acquiring an image of a restaurant that has entered the field of view of a wearable terminal; an identifying step of analyzing the image and identifying the restaurant; a collecting step of collecting the restaurant's menu; and a menu display step of displaying, on the display board of the wearable terminal, the menu as augmented reality overlaid on the restaurant seen through the display board.
With these, the restaurant's menu can be displayed on the display board of the wearable terminal before the user enters the restaurant.
FIG. 1 is a schematic diagram of a wearable terminal display system. FIG. 2 is an example in which a restaurant menu is collected and displayed on the display board of the wearable terminal.
Hereinafter, the best mode for carrying out the present invention will be described. This is merely an example, and the technical scope of the present invention is not limited to it.
The wearable terminal display system of the present invention displays collected menus as augmented reality on the display board of a wearable terminal, overlaid on restaurants seen through the display board. A wearable terminal is a terminal with a field of view, such as smart glasses or a head-mounted display.
An outline of a preferred embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is a schematic diagram of a wearable terminal display system that is a preferred embodiment of the present invention.
As shown in FIG. 1, the wearable terminal display system includes image acquisition means, identifying means, collecting means, and menu display means, each realized by a control unit reading a predetermined program. Although not shown, it may similarly include determination means, changing means, detecting means, action result display means, position/direction acquisition means, estimating means, guideline display means, and selection accepting means. These may be application-based, cloud-based, or otherwise. Each of the above means may be realized by a single computer or by two or more computers (for example, a server and a terminal).
 画像取得手段は、ウェアラブル端末の視界に入った飲食店の画像を取得する。ウェアラブル端末のカメラで撮像された画像を取得してもよい。または、ウェアラブル端末以外であっても、このような画像を取得できるのであれば、それでも構わない。画像とは動画でも静止画でもよい。リアルタイムにメニューを表示するためには、リアルタイムな画像の方が好ましい。 The image acquisition means acquires an image of a restaurant that has entered the field of view of the wearable terminal. You may acquire the image imaged with the camera of the wearable terminal. Or even if it is other than a wearable terminal, as long as such an image can be acquired, it does not matter. The image may be a moving image or a still image. In order to display the menu in real time, a real time image is preferable.
 特定手段は、画像を画像解析して飲食店を特定する。例えば、飲食店が、マクドナルドであるのか、吉野家であるのか、ガストであるのか、などを特定する。店構え、店名、看板の色、ロゴマーク、などから特定できる。また、映った飲食店の全てを特定してしまうと時間が掛かる場合には、ウェアラブル端末の視界の中央にある飲食店だけを特定してもよい。視界の中央にある飲食店だけを特定することで、特定に要する時間を大幅に短縮できる。機械学習によって画像解析の精度を向上させてもよい。例えば、飲食店の過去画像を教師データとして機械学習を行う。 The identifying means identifies the restaurant by analyzing the image. For example, it is specified whether the restaurant is McDonald's, Yoshinoya, or Gust. It can be identified from the store structure, store name, sign color, logo mark, and so on. Moreover, when it takes time if all the restaurants shown are taken, only the restaurant in the center of the field of view of the wearable terminal may be specified. By specifying only the restaurant in the center of the field of view, the time required for identification can be greatly reduced. The accuracy of image analysis may be improved by machine learning. For example, machine learning is performed using a past image of a restaurant as teacher data.
収集手段は、飲食店に応じたメニューを収集する。予めメニューが登録されたデータベースを参照して飲食店に応じたメニューを収集してもよい。また、飲食店に予め紐付けられたWebコンテンツにアクセスしてメニューを収集してもよい。例えば、飲食店とメニューとを紐づけるURLなど割当てることでWebコンテンツから収取できる。また、飲食店をインターネット検索して検索されたWebコンテンツからメニューを収集してもよい。例えば、飲食店のホームページにメニューが掲載されているケースがあるので、インターネット検索から収集できる。または、SNS(social networking service)や口コミサイトなどから、メニューを収集できることもある。 The collecting means collects a menu corresponding to the restaurant. Menus corresponding to restaurants may be collected with reference to a database in which menus are registered in advance. In addition, menus may be collected by accessing Web contents previously linked to restaurants. For example, the URL can be collected from the Web content by assigning a URL that links the restaurant and the menu. In addition, menus may be collected from Web contents searched by searching for restaurants on the Internet. For example, there are cases where menus are posted on restaurants' homepages, which can be collected from internet searches. Alternatively, menus may be collected from SNS (social networking service) or word-of-mouth sites.
 The menu display means displays the menu as augmented reality on the display board of the wearable terminal, against the restaurant seen through the display board. For example, as shown in FIG. 2, a menu drawn with broken lines is displayed as augmented reality against a restaurant drawn with solid lines that is seen through the display board; for ease of understanding, solid lines represent real objects and broken lines represent augmented reality. By displaying the menu in augmented reality against the real restaurant, it is possible to grasp visually what menu items the restaurant offers. The menu may be displayed so as to overlap the restaurant seen through the display board, but since this makes the restaurant harder to see, display of the menu may be made switchable ON/OFF.
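One way to keep the storefront visible, as suggested above, is to anchor the AR menu panel beside the restaurant's on-screen bounding box rather than on top of it, with an ON/OFF switch. A sketch, where the screen coordinates and panel width are assumed inputs:

```python
def menu_anchor(store_box, panel_w, screen_w, show=True):
    """Place the AR menu panel beside the storefront so the real
    restaurant stays visible; show=False models the menu-display
    ON/OFF toggle mentioned above. store_box is (x0, y0, x1, y1)."""
    if not show:
        return None
    x0, y0, x1, y1 = store_box
    if x1 + panel_w <= screen_w:       # room to the right of the store
        return (x1, y0)
    return (max(0, x0 - panel_w), y0)  # otherwise anchor on the left
```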
 The determination means determines whether the displayed menu has been viewed. Whether the menu has been viewed may be determined by acquiring an image of what is being viewed and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer, such as a gaze-detection sensor, a motion sensor, or an acceleration sensor.
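With a gaze sensor, the viewed/not-viewed decision can be reduced to dwell time inside the menu's screen rectangle. The sampling period and dwell threshold below are assumptions, not values from the original:

```python
VIEWED_DWELL_S = 1.0  # assumed threshold (seconds)

def was_viewed(gaze_samples, menu_box, dwell_s=VIEWED_DWELL_S, dt=0.1):
    """gaze_samples: sequence of (x, y) gaze points sampled every dt
    seconds. The menu counts as viewed once the cumulative time the
    gaze spends inside the menu's screen rectangle reaches dwell_s."""
    x0, y0, x1, y1 = menu_box
    inside = sum(1 for (x, y) in gaze_samples
                 if x0 <= x <= x1 and y0 <= y <= y1)
    return inside * dt >= dwell_s
```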
 The changing means marks the menu as viewed when it is determined to have been viewed, and changes its degree of prominence to encourage viewing when it is determined not to have been viewed. This makes it possible to grasp visually which menus have and have not been viewed. For example, a menu may be marked as viewed by ticking a checkbox on it, or by applying a stamp to it. The degree of prominence may be changed by altering the menu's color or size, or by applying a stamp that makes the menu stand out.
 The detecting means detects an action on the displayed menu, for example a gesture, a hand movement, or an eye movement. An action on the menu can be detected by acquiring an image of what is being viewed and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer, such as a gaze-detection sensor, a motion sensor, or an acceleration sensor.
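Once a gesture has been recognised, mapping it to a menu reaction is a small dispatch step. The gesture names and state fields here are illustrative; a real system would emit the gestures from the image analysis or motion/gaze sensors described above:

```python
def handle_action(gesture, menu_state):
    """Map a detected gesture to a menu reaction, returning a new state.
    'swipe_away' dismisses the AR menu; 'tap_link' opens an attached
    link (both gesture names are assumptions for this sketch)."""
    state = dict(menu_state)  # leave the caller's state untouched
    if gesture == "swipe_away":
        state["visible"] = False              # dismiss the AR menu
    elif gesture == "tap_link" and state.get("link"):
        state["opened_link"] = state["link"]  # open the attached link
    return state
```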
 The action result display means displays the result corresponding to the action as augmented reality on the display board of the wearable terminal, against the restaurant seen through the display board. For example, when an action to dismiss the menu is detected, the menu display may be removed; when an action to open a link attached to the menu is detected, the link may be opened. Of course, other actions are also possible.
 The position/direction acquisition means acquires the terminal position and the imaging direction of the wearable terminal. For example, the terminal position can be obtained from the GPS (Global Positioning System) of the wearable terminal, and when imaging is performed with the wearable terminal, the imaging direction can be obtained from its geomagnetic sensor or acceleration sensor. They may also be obtained from other sources.
 The estimating means estimates the target position of the restaurant based on the terminal position and the imaging direction. If the terminal position and the imaging direction are known, the location of the imaged restaurant can be estimated.
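A simple way to realise this estimate is to project from the terminal position along the imaging direction. The distance to the storefront is an assumed extra input (it is not specified above; it might come from a depth sensor or a typical street width), and the equirectangular approximation used here is adequate at street scale:

```python
import math

def estimate_store_position(lat, lon, bearing_deg, distance_m):
    """Project from the terminal position (degrees) along the imaging
    direction (compass bearing, degrees clockwise from north) by
    distance_m metres, returning the estimated store (lat, lon)."""
    R = 6_371_000.0  # mean Earth radius in metres
    brg = math.radians(bearing_deg)
    dlat = distance_m * math.cos(brg) / R
    dlon = distance_m * math.sin(brg) / (R * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)
```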
 The identifying means may also identify the restaurant from the target position together with the image analysis. Even among restaurants of the same chain, the menu sometimes differs from store to store, so a restaurant may not be uniquely identifiable without estimating the store location. For example, if a store can be identified as the Minato-ku, Tokyo branch, the limited menu of that branch can also be displayed.
 The guideline display means displays a guideline for imaging the restaurant, such as a frame or crosshairs, as augmented reality on the display board of the wearable terminal. Having the image captured along the guideline makes image analysis easier.
 The acquisition means may also acquire only images captured along the guideline. Acquiring and analyzing only such images allows the restaurant to be identified efficiently.
 The selection accepting means accepts selection of a selection target from among the restaurants seen through the display board of the wearable terminal. For example, a selection may be accepted by gazing at a restaurant seen through the display board for a fixed time, by touching it, or by placing a cursor on it, using, for example, a gaze-detection sensor, a motion sensor, or an acceleration sensor.
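The touch and cursor variants of selection both reduce to a hit test of a screen point against the on-screen boxes of the identified restaurants; gaze selection adds a dwell condition on top. A minimal sketch with hypothetical coordinates and an assumed dwell threshold:

```python
def hit_test(restaurants, point):
    """Return the restaurant whose on-screen box (x0, y0, x1, y1)
    contains the touch/cursor point, or None."""
    x, y = point
    for name, (x0, y0, x1, y1) in restaurants.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def select_by_gaze(restaurants, gaze_samples, dt=0.1, dwell_s=1.0):
    """Accept a selection once the gaze has dwelt on one restaurant
    for dwell_s seconds (dt and dwell_s are assumed values)."""
    counts = {}
    for pt in gaze_samples:
        name = hit_test(restaurants, pt)
        if name:
            counts[name] = counts.get(name, 0) + 1
            if counts[name] * dt >= dwell_s:
                return name
    return None
```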
 The menu display means may also display the menu as augmented reality on the display board of the wearable terminal, aligned only with the selected target seen through the display board. Because the menu is displayed only for the selected target, the menu can be grasped precisely; displaying menus for every identified restaurant could clutter the screen of the display board.
[Description of operation]
 Next, the wearable terminal display method will be described. The wearable terminal display method of the present invention is a method of displaying collected menus as augmented reality on the display board of a wearable terminal, against a restaurant seen through the display board.
 The wearable terminal display method includes an image acquisition step, an identification step, a collection step, and a menu display step. Although not illustrated, it may likewise include a determination step, a change step, a detection step, an action result display step, a position/direction acquisition step, an estimation step, a guideline display step, and a selection acceptance step.
 The image acquisition step acquires an image of a restaurant that has entered the field of view of the wearable terminal. The image may be captured by the camera of the wearable terminal, or by any other device capable of obtaining such an image. The image may be a moving image or a still image; to display the menu in real time, a real-time image is preferable.
 The identification step identifies the restaurant by analyzing the image. For example, it determines whether the restaurant is a McDonald's, a Yoshinoya, or a Gusto. The restaurant can be identified from its storefront, store name, signboard colors, logo mark, and so on. If identifying every restaurant that appears in the image would take too long, only the restaurant at the center of the wearable terminal's field of view may be identified; limiting identification in this way greatly shortens the time required. The accuracy of the image analysis may be improved by machine learning, for example by performing machine learning with past images of restaurants as teacher data.
 The collection step collects menus corresponding to the restaurant. Menus may be collected by referring to a database in which menus are registered in advance. Menus may also be collected by accessing Web content linked to the restaurant beforehand; for example, by assigning a URL that ties the restaurant to its menu, the menu can be retrieved from that Web content. Alternatively, menus may be collected from Web content found by searching the Internet for the restaurant: menus are often posted on a restaurant's homepage, and they can sometimes also be collected from a social networking service (SNS) or a review site.
 The menu display step displays the menu as augmented reality on the display board of the wearable terminal, against the restaurant seen through the display board. For example, as shown in FIG. 2, a menu drawn with broken lines is displayed as augmented reality against a restaurant drawn with solid lines that is seen through the display board; for ease of understanding, solid lines represent real objects and broken lines represent augmented reality. By displaying the menu in augmented reality against the real restaurant, it is possible to grasp visually what menu items the restaurant offers. The menu may be displayed so as to overlap the restaurant seen through the display board, but since this makes the restaurant harder to see, display of the menu may be made switchable ON/OFF.
 The determination step determines whether the displayed menu has been viewed. Whether the menu has been viewed may be determined by acquiring an image of what is being viewed and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer, such as a gaze-detection sensor, a motion sensor, or an acceleration sensor.
 The change step marks the menu as viewed when it is determined to have been viewed, and changes its degree of prominence to encourage viewing when it is determined not to have been viewed. This makes it possible to grasp visually which menus have and have not been viewed. For example, a menu may be marked as viewed by ticking a checkbox on it, or by applying a stamp to it. The degree of prominence may be changed by altering the menu's color or size, or by applying a stamp that makes the menu stand out.
 The detection step detects an action on the displayed menu, for example a gesture, a hand movement, or an eye movement. An action on the menu can be detected by acquiring an image of what is being viewed and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer, such as a gaze-detection sensor, a motion sensor, or an acceleration sensor.
 The action result display step displays the result corresponding to the action as augmented reality on the display board of the wearable terminal, against the restaurant seen through the display board. For example, when an action to dismiss the menu is detected, the menu display may be removed; when an action to open a link attached to the menu is detected, the link may be opened. Of course, other actions are also possible.
 The position/direction acquisition step acquires the terminal position and the imaging direction of the wearable terminal. For example, the terminal position can be obtained from the GPS (Global Positioning System) of the wearable terminal, and when imaging is performed with the wearable terminal, the imaging direction can be obtained from its geomagnetic sensor or acceleration sensor. They may also be obtained from other sources.
 The estimation step estimates the target position of the restaurant based on the terminal position and the imaging direction. If the terminal position and the imaging direction are known, the location of the imaged restaurant can be estimated.
 The identification step may also identify the restaurant from the target position together with the image analysis. Even among restaurants of the same chain, the menu sometimes differs from store to store, so a restaurant may not be uniquely identifiable without estimating the store location. For example, if a store can be identified as the Minato-ku, Tokyo branch, the limited menu of that branch can also be displayed.
 The guideline display step displays a guideline for imaging the restaurant, such as a frame or crosshairs, as augmented reality on the display board of the wearable terminal. Having the image captured along the guideline makes image analysis easier.
 The acquisition step may also acquire only images captured along the guideline. Acquiring and analyzing only such images allows the restaurant to be identified efficiently.
 The selection acceptance step accepts selection of a selection target from among the restaurants seen through the display board of the wearable terminal. For example, a selection may be accepted by gazing at a restaurant seen through the display board for a fixed time, by touching it, or by placing a cursor on it, using, for example, a gaze-detection sensor, a motion sensor, or an acceleration sensor.
 The menu display step may also display the menu as augmented reality on the display board of the wearable terminal, aligned only with the selected target seen through the display board. Because the menu is displayed only for the selected target, the menu can be grasped precisely; displaying menus for every identified restaurant could clutter the screen of the display board.
 The means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program. The program may be, for example, an application installed on a computer, may be provided in SaaS (Software as a Service) form from a computer via a network, or may be provided recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.). In that case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it. The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from that storage device to the computer via a communication line.
 As specific algorithms for the machine learning described above, the nearest-neighbor method, naive Bayes, decision trees, support vector machines, reinforcement learning, and the like may be used. Deep learning, in which a neural network generates by itself the features used for learning, may also be used.
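Of the algorithms listed, the nearest-neighbour method is the simplest to illustrate: classify a feature vector by majority vote among its k closest training examples. A self-contained sketch with toy feature vectors (real features would be extracted from the restaurant images):

```python
def knn_classify(train, query, k=3):
    """k-nearest-neighbour classification.
    train: list of (feature_vector, label) pairs.
    Returns the majority label among the k training examples closest
    to query under squared Euclidean distance."""
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    neighbors = sorted(train, key=lambda item: d2(item[0], query))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)
```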
 Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments. The effects described in the embodiments of the present invention merely enumerate the most preferable effects arising from the present invention, and the effects of the present invention are not limited to those described in the embodiments.

Claims (13)

  1.  A wearable terminal display system for displaying a restaurant menu on a display board of a wearable terminal, comprising:
     image acquisition means for acquiring an image of a restaurant that has entered the field of view of the wearable terminal;
     identifying means for analyzing the image to identify the restaurant;
     collecting means for collecting the menu of the restaurant; and
     menu display means for displaying, on the display board of the wearable terminal, the menu as augmented reality against the restaurant seen through the display board.
  2.  The wearable terminal display system according to claim 1, wherein the identifying means identifies only the restaurant at the center of the field of view of the wearable terminal.
  3.  The wearable terminal display system according to claim 1, wherein the collecting means collects the menu of the restaurant by referring to a database in which menus are registered in advance.
  4.  The wearable terminal display system according to claim 1, wherein the collecting means collects the menu by accessing Web content linked to the restaurant in advance.
  5.  The wearable terminal display system according to claim 1, wherein the collecting means searches the Internet for the restaurant and collects the menu from the retrieved Web content.
  6.  The wearable terminal display system according to claim 1, further comprising:
     determination means for determining whether the displayed menu has been viewed; and
     changing means for changing the menu to a viewed state when it is determined that the menu has been viewed.
  7.  The wearable terminal display system according to claim 1, further comprising:
     determination means for determining whether the displayed menu has been viewed; and
     changing means for changing the degree of prominence so that the menu will be viewed when it is determined that the menu has not been viewed.
  8.  The wearable terminal display system according to claim 1, further comprising:
     detecting means for detecting an action on the displayed menu; and
     action result display means for displaying, on the display board of the wearable terminal, a result corresponding to the action as augmented reality against the restaurant seen through the display board.
  9.  The wearable terminal display system according to claim 1, further comprising:
     position/direction acquisition means for acquiring a terminal position and an imaging direction of the wearable terminal; and
     estimating means for estimating a store position of the restaurant based on the terminal position and the imaging direction,
     wherein the identifying means identifies the restaurant from the store position and the image analysis.
  10.  The wearable terminal display system according to claim 1, further comprising guideline display means for displaying, on the display board of the wearable terminal, a guideline for imaging the restaurant as augmented reality,
     wherein the image acquisition means acquires the image captured along the guideline.
  11.  The wearable terminal display system according to claim 1, further comprising selection accepting means for accepting selection of a selection target from among restaurants seen through the display board of the wearable terminal,
     wherein the menu display means displays, on the display board of the wearable terminal, the menu as augmented reality aligned only with the selection target seen through the display board.
  12.  A wearable terminal display method for displaying a restaurant menu on a display board of a wearable terminal, comprising:
     an image acquisition step of acquiring an image of a restaurant that has entered the field of view of the wearable terminal;
     an identification step of analyzing the image to identify the restaurant;
     a collection step of collecting the menu of the restaurant; and
     a menu display step of displaying, on the display board of the wearable terminal, the menu as augmented reality against the restaurant seen through the display board.
  13.  A program for causing a computer to execute:
     an image acquisition step of acquiring an image of a restaurant that has entered the field of view of the wearable terminal;
     an identification step of analyzing the image to identify the restaurant;
     a collection step of collecting the menu of the restaurant; and
     a menu display step of displaying, on the display board of the wearable terminal, the menu as augmented reality against the restaurant seen through the display board.
PCT/JP2017/016942 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and program WO2018198320A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/016942 WO2018198320A1 (en) 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/016942 WO2018198320A1 (en) 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and program

Publications (1)

Publication Number Publication Date
WO2018198320A1 true WO2018198320A1 (en) 2018-11-01

Family

ID=63920176

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/016942 WO2018198320A1 (en) 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and program

Country Status (1)

Country Link
WO (1) WO2018198320A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021089458A (en) * 2019-12-02 2021-06-10 株式会社リクルート Order management system, user apparatus, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003132068A (en) * 2001-10-22 2003-05-09 Nec Corp Navigation system and navigation terminal
JP2012216135A (en) * 2011-04-01 2012-11-08 Olympus Corp Image generation system, program, and information storage medium
JP2014504413A (en) * 2010-12-16 2014-02-20 マイクロソフト コーポレーション Augmented reality display content based on understanding and intention
JP2015176516A (en) * 2014-03-18 2015-10-05 富士通株式会社 Display device, display control program, and display control method
JP2016039599A (en) * 2014-08-11 2016-03-22 セイコーエプソン株式会社 Head mounted display, information system, control method for head mounted display, and computer program


Similar Documents

Publication Publication Date Title
US11630861B2 (en) Method and apparatus for video searching, terminal and storage medium
TWI418763B (en) Mobile imaging device as navigator
US20180247361A1 (en) Information processing apparatus, information processing method, wearable terminal, and program
US9183583B2 (en) Augmented reality recommendations
US9024972B1 (en) Augmented reality computing with inertial sensors
US9842268B1 (en) Determining regions of interest based on user interaction
JP2010061218A (en) Web advertising effect measurement device, web advertising effect measurement method, and program
US20140330814A1 (en) Method, client of retrieving information and computer storage medium
WO2014176938A1 (en) Method and apparatus of retrieving information
JP6887198B2 (en) Wearable device display system, wearable device display method and program
JP2017204134A (en) Attribute estimation device, attribute estimation method, and program
WO2018198320A1 (en) Wearable terminal display system, wearable terminal display method and program
TWI661351B (en) System of digital content as in combination with map service and method for producing the digital content
WO2019021446A1 (en) Wearable terminal display system, wearable terminal display method and program
WO2018216220A1 (en) Wearable terminal display system, wearable terminal display method and program
WO2018216221A1 (en) Wearable terminal display system, wearable terminal display method and program
US9911237B1 (en) Image processing techniques for self-captured images
WO2019003359A1 (en) Wearable terminal display system, wearable terminal display method, and program
WO2019021447A1 (en) Wearable terminal display system, wearable terminal display method and program
JP6762470B2 (en) Wearable device display system, wearable device display method and program
JP5541868B2 (en) Image search command system and operation control method thereof
JP3985826B2 (en) Image search method and apparatus
CN103870601A (en) Method and device for marking and highlighting webpage content
JP6343412B1 (en) Map-linked sensor information display system, map-linked sensor information display method, and program
CN112035692B (en) Picture information searching method and device, computer system and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17906802

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17906802

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP