WO2013088819A1 - Augmented reality display method - Google Patents

Augmented reality display method

Info

Publication number
WO2013088819A1
WO2013088819A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
display
augmented reality
extended information
tag
Prior art date
Application number
PCT/JP2012/075700
Other languages
French (fr)
Japanese (ja)
Inventor
世鎬 徐
Original Assignee
ステラグリーン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ステラグリーン株式会社
Publication of WO2013088819A1 publication Critical patent/WO2013088819A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Abstract

[Problem] When a tag containing detailed information occupies a large portion of the display screen, the appeal of augmented reality created by the interplay of the real-space image and the augmented-information image is inevitably impaired. [Solution] Detailed information associated with an object of interest is set in the augmented information (a tag), and before that detailed information is displayed, each display area of the augmented information (tag) is offered to the user for selection, so that the user can choose the detailed information he or she needs.

Description

Augmented reality display method
The present invention relates to an augmented reality display method in a computer processing system that displays on a display a captured image of a real space being photographed by a camera and combines, with that captured image, extended information set in relation to an object appearing in the image. In particular, it relates to a technique in which detailed information associated with the object is set in the extended information that is combined with and displayed on the captured image, and this detailed information is provided to the user.
As is well known, application development that exploits augmented reality display technology in a wide range of situations is flourishing, chiefly for portable computers equipped with a camera and a display, such as smartphones and tablet computers. For example, the September 7, 2009 issue of Nikkei Electronics, in its special feature on augmented reality, describes in detail the state of application development of augmented reality display technology at that time. Numerous inventions related to augmented reality display technology have also been created and filed as patent applications, as seen in the patent documents listed below.
A typical, easy-to-understand application of augmented reality display technology is the following. At a large exhibition hall for automobiles or electronic devices, for example, when a visitor photographs an exhibit of interest with the camera of a portable computer (a smartphone, tablet computer, or the like) that the visitor has brought along, an image of the real space containing that exhibit is shown on the display, and extended information called a tag is additionally displayed (composited) in a form associated with the exhibit in the displayed image. The tag presents, in text and images, an explanation of the exhibit's functions, how to obtain it, and so on.
JP 2011-244058 A; JP 2011-242816 A; JP 2011-123807 A; JP 2011-22662 A; JP 2011-81556 A; JP 2011-204115 A; JP 2011-130025 A; JP 2011-123741 A
In the conventional augmented reality display method, when extended information expressed as a tag is additionally displayed on an object in the captured image, the richer the content of the extended information (for example, an explanation of functions or of how to obtain the product), the larger the area the tag occupies in the composite image and the larger the area of the real-space image hidden behind the tag, which causes the following problems.
As is well known, a tag (extended information) is displayed in a transparent color so as to overlap the captured image, so the captured image is not made completely invisible by the tag. However, once tags come to occupy most of the display screen, it cannot be denied that the appeal of augmented reality, which arises from the interplay of the real-space image and the extended-information image, is impaired. Moreover, for a user who is operating the camera and looking at the display, presenting rich extended information all at once in the form of large tags, at a stage when it is not yet known whether the user even wants it, can hardly be called a good method that meets the user's needs.
The method of the present invention, created in view of these circumstances, is an augmented reality display method in a computer processing system. A captured image of a real space being photographed by a camera is displayed on a display; one or more items of extended information set in relation to an object in the captured image, each of which has detailed information associated with the object set in it, are combined with the captured image and displayed on the display; and, before the detailed information on the object is displayed, the user is enabled to choose, for each display area of the extended information and based on the user's own selection, the detailed information that the user needs.
In a desirable form of the above method of the present invention, a minimum size is defined on the display screen for an extended-information designation receiving area that accepts user input specifying the displayed extended information, and the display size of the extended information may become equal to or smaller than this minimum size, in which case the extended information is contained within the extended-information designation receiving area.
In another desirable form of the above method of the present invention, the computer processing system is composed of a user computer equipped with the camera and the display and a server computer that communicates with the user computer, and the extended information and the detailed information edited on the server computer at any desired time are shared at any time via communication.
In the method of the present invention, the extended information to be combined with the captured image (corresponding to the tag described above) does not include detailed information such as the explanation of functions or of how to obtain the object in the example above. Such detailed information exists in the computer processing system as separate data associated with the extended information (tag).
The extended information (tag) additionally displayed on the captured image functions as a link element for accessing the detailed information, and consists of a small pictogram or design (something like an icon) designed so that the user can tell what kind (category) of detailed information it links to. For example, there may be a tag (extended information) whose design indicates a link to "product information" as detailed information, a tag whose design indicates a link to "bargain information" as detailed information, and a tag whose design indicates a link to "product video" as detailed information. Such category-based designs are only examples.
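As a minimal illustration of this tag-as-link idea (a sketch only, not the patent's implementation), the Python snippet below models a tag as a small record carrying just a category, an icon design, and a reference to its detailed information; all class names, field names, and file names are hypothetical.

    from dataclasses import dataclass

    # Hypothetical categories for the icon-like tag designs described above.
    TAG_CATEGORIES = ("product_info", "bargain_info", "product_video")

    @dataclass
    class DetailedInfo:
        """Separate data associated with a tag (a document, a video, etc.)."""
        kind: str          # e.g. "document" or "video"
        content_ref: str   # where the detail data lives (local path or URL)

    @dataclass
    class Tag:
        """Extended information composited onto the captured image."""
        tag_id: str
        category: str           # one of TAG_CATEGORIES, decides the icon design
        icon: str               # small pictogram/design shown on screen
        detail: DetailedInfo    # the detailed information this tag links to

    # Example: a "bargain information" tag linking to an explanatory document.
    bargain_tag = Tag(
        tag_id="5b",
        category="bargain_info",
        icon="bargain_icon.png",
        detail=DetailedInfo(kind="document", content_ref="details/product1_bargain.html"),
    )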
The user thus sees a display screen on which tags such as those described above are additionally displayed on the objects in the captured image. When the user becomes interested in, say, a "bargain information" type tag associated with an object of interest and designates that tag (on a typical touch-panel display, by touching the tag with a fingertip), the detailed information "bargain information" associated with the tag is displayed on the display.
As described above, according to the method of the present invention, the user looks at the augmented reality image shown on the display and, by selectively designating tags on the screen of his or her own volition according to his or her interest, can have detailed information about the object pointed to by a tag displayed. Therefore, even though the computer processing system can present rich detailed information about the object to the user, the tags (extended information) combined with and displayed on the captured image of the real space can remain small, and the conventional problems described above can be resolved.
FIG. 1 shows the appearance of a certain local real space in an exhibition hall. FIG. 2 shows the augmented reality image displayed on the display when the real space of FIG. 1 is photographed with a portable computer. FIG. 3 shows the relationship between the tags 5a, 5b, and 5c and the tag 6 and the detailed information associated with each of them.
=== Prerequisite technical matters ===
The embodiment is described on the assumption that it is applied at an exhibition of automobiles, electronic devices, or the like, as described above. The explanation proceeds on the basis that a user visiting the exhibition has brought a common smartphone or tablet computer equipped with a camera and a touch-panel display (referred to here as a portable computer) and uses it to take advantage of the augmented reality display.
A stand-alone system is well known in which all the software needed to perform augmented reality display on a portable computer is installed on that portable computer, so that the portable computer can complete the augmented reality display processing without communicating with any other computer. Also well known is a distributed processing system in which the portable computer performs distributed processing while communicating wirelessly with a server computer over the Internet or a LAN, realizing the augmented reality display processing on the portable computer while exchanging information with the server computer. In the latter case, the server computer can carry out processing concurrently while communicating with a large number of portable computers.
The basic technical matters described above are obvious to those skilled in the art and are not directly related to the core of the present invention, so redundant explanations of well-known technology are not given in this specification. The "computer processing system" recited in the claims of this application refers to the portable computer in the stand-alone system above, and includes both the portable computer and the server computer in the distributed processing system above. In the case of the distributed processing system, it is obvious to those skilled in the art that the processing can be distributed in various ways.
Next, the technique for positioning an extended image (tag) on the captured image is touched on. Two kinds of positioning techniques, a marker method and a markerless method, are well known to those skilled in the art. The following embodiment is explained using the marker method, but since the method of the present invention is not directly tied to the positioning method, it goes without saying that a markerless method may be used instead.
=== Embodiment ===
FIG. 1 shows the appearance of a certain local real space in the exhibition hall. There is a desk on which product 1 and product 2 are displayed. Product 1 is a notebook computer, and product 2 is the main unit of a desktop personal computer. Product 1 is provided with three markers 3a, 3b, and 3c. Product 2 is provided with one marker 4.
FIG. 2 shows the augmented reality image displayed on the display when the real space of FIG. 1 is photographed with a portable computer. The displayed image consists mainly of the captured images of product 1 and product 2 on the desk; three tags 5a, 5b, and 5c corresponding to the three markers 3a, 3b, and 3c of product 1 are additionally displayed (composited) in a transparent color, and a tag 6 corresponding to the single marker 4 of product 2 is additionally displayed (composited) in a transparent color. Here, a tag is extended information designed like an icon.
The mechanism by which the computer processing system displays an augmented reality image such as the one above is well known and needs no full explanation, but the essential part is outlined here. Each marker expresses its own unique information. The computer processing system detects each marker contained in the image data captured by the camera, reads its unique information, extracts from the setting information the tag information associated with the read marker-specific information, and displays the design based on that tag information at the position of the corresponding marker in the captured image.
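The outline above can be pictured as a simple pipeline, sketched below under stated assumptions: detect_markers() stands in for a marker-tracking library call, and TAG_SETTINGS is an in-memory stand-in for the setting information that maps marker-specific information to tag information; none of these names come from the patent.

    # A minimal sketch of the marker -> tag -> overlay pipeline outlined above.

    def detect_markers(frame):
        # Stand-in for a marker-tracking library call; a real implementation
        # would analyze the camera frame. Here a fixed demo result is returned.
        return [("marker_3a", 120, 80), ("marker_4", 340, 200)]

    # Setting information: which tag is associated with which marker's unique ID.
    TAG_SETTINGS = {
        "marker_3a": {"tag_id": "5a", "icon": "spec_icon.png"},
        "marker_3b": {"tag_id": "5b", "icon": "bargain_icon.png"},
        "marker_3c": {"tag_id": "5c", "icon": "video_icon.png"},
        "marker_4":  {"tag_id": "6",  "icon": "promo_icon.png"},
    }

    def build_overlays(frame):
        """Return a list of (icon, position) overlays to composite onto the frame."""
        overlays = []
        for marker_id, x, y in detect_markers(frame):
            tag = TAG_SETTINGS.get(marker_id)
            if tag is not None:
                # Draw the icon-like tag design at the marker's position in the image.
                overlays.append((tag["icon"], (x, y)))
        return overlays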
FIG. 3 shows the relationship between the tags 5a, 5b, and 5c and the tag 6 and the detailed information associated with each of them. Tag 5a is associated with detailed information 7a consisting of a document explaining the specifications and functions of product 1. Tag 5b is associated with detailed information 7b consisting of a document explaining how to obtain product 1 at a bargain price. Tag 5c is associated with detailed information 7c consisting of a product video showing scenes of product 1 in use. Tag 6 is associated with detailed information 8 consisting of a promotional video for product 2.
The detailed information data consisting of the document information, video information, and so on described above may be stored in any part of the computer processing system (the server computer or the portable computer), or may be stored on another computer on the Internet. In either case, the data of each tag includes link information that makes it possible to acquire the detailed information associated with it.
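As a sketch of how such link information might be resolved (an assumption-laden illustration, not the patent's mechanism), the function below returns the detail data either from a hypothetical local store held within the computer processing system or by fetching it from another computer over HTTP.

    import urllib.request

    # Hypothetical local store for detail data held inside the computer processing system.
    LOCAL_DETAIL_STORE = {
        "details/product1_spec.html": "<html>Specifications and functions of product 1</html>",
    }

    def resolve_detail(content_ref: str) -> bytes:
        """Fetch the detailed information a tag links to, wherever it is stored."""
        if content_ref in LOCAL_DETAIL_STORE:
            # Stored on the portable computer or the server computer itself.
            return LOCAL_DETAIL_STORE[content_ref].encode("utf-8")
        # Otherwise assume the link points at another computer on the Internet.
        with urllib.request.urlopen(content_ref) as response:
            return response.read()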
On the portable computer, when the user viewing the augmented reality image of FIG. 2 on the display gives a selection input by touching tag 5a on the screen with a fingertip, the detailed information 7a (document information) associated with tag 5a is displayed on the display. At this time, the display may show the document information in place of the augmented reality image that had been shown until then, or may open another window overlapping the augmented reality image and show the document information there. The same applies when another tag is selected, so the explanation is not repeated.
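One possible shape of that selection handling is sketched below, with the two presentation options just mentioned (replace the augmented reality view, or open an overlapping window) chosen by a flag; the handler and view function names are hypothetical and the UI toolkit is left abstract.

    def resolve_detail(content_ref: str) -> bytes:
        # Abbreviated stand-in for the earlier sketch, kept here so the example is self-contained.
        return f"detail data from {content_ref}".encode("utf-8")

    # show_in_new_window() and replace_current_view() are hypothetical stand-ins
    # for whatever UI toolkit the portable computer application is built with.
    def show_in_new_window(detail: bytes) -> None:
        print("opening another window over the augmented reality image")

    def replace_current_view(detail: bytes) -> None:
        print("replacing the augmented reality image with the detail view")

    def on_tag_selected(tag, use_overlay_window: bool = True) -> None:
        """Handle a fingertip selection of a tag and show its detailed information."""
        detail = resolve_detail(tag.detail.content_ref)
        if use_overlay_window:
            # Option 1: open another window overlapping the augmented reality image.
            show_in_new_window(detail)
        else:
            # Option 2: replace the augmented reality image with the detail view.
            replace_current_view(detail)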
As is well known, an augmented reality image such as that of FIG. 2 shown on the display of the portable computer can be enlarged or reduced according to the user's preference. If the augmented reality image is reduced to an extreme degree, the display size of the displayed tags (extended information) becomes too small, and it becomes difficult for the user to select a tag by touching it with a fingertip. To prevent this, it is desirable to adopt the configuration defined in claim 2, in which a minimum size is defined on the display screen for the extended-information designation receiving area that accepts user input specifying the displayed extended information, and the display size of the extended information may become equal to or smaller than this minimum size, in which case the extended information is contained within the extended-information designation receiving area.
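A minimal sketch of that configuration follows: the area that accepts a designation input never shrinks below an assumed minimum size, so a tag drawn smaller than the minimum remains contained in a touch area of usable size. The 48-pixel value and helper names are illustrative assumptions, not figures from the patent.

    MIN_RECEIVING_SIZE = 48  # assumed minimum side length of the designation receiving area, in pixels

    def designation_receiving_area(tag_x, tag_y, tag_w, tag_h):
        """Return (x, y, w, h) of the area that accepts input designating this tag.

        If the tag is drawn smaller than the minimum, the receiving area keeps
        the minimum size and the tag is contained inside it (centered on the tag).
        """
        w = max(tag_w, MIN_RECEIVING_SIZE)
        h = max(tag_h, MIN_RECEIVING_SIZE)
        cx, cy = tag_x + tag_w / 2, tag_y + tag_h / 2
        return (cx - w / 2, cy - h / 2, w, h)

    def hit_test(touch_x, touch_y, area):
        """True if a fingertip touch falls inside the designation receiving area."""
        x, y, w, h = area
        return x <= touch_x <= x + w and y <= touch_y <= y + h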
It is also desirable to update the extended information and detailed information presented to the user appropriately. In this sense, when the distributed processing system is adopted, it is desirable to adopt the scheme defined in claim 3, in which the computer processing system is composed of a user computer equipped with the camera and the display and a server computer that communicates with the user computer, and the extended information and the detailed information edited on the server computer at any desired time are shared at any time via communication.
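The sketch below shows one simple way such sharing could be realized: the user computer periodically asks the server computer for any extended or detailed information edited since its last check and merges it into its local settings. The endpoint URL, payload format, and polling interval are all assumptions.

    import json
    import time
    import urllib.request

    SERVER_URL = "https://example.com/ar/updates"  # hypothetical server endpoint
    local_settings = {"tags": {}, "details": {}, "last_sync": 0.0}

    def sync_with_server():
        """Fetch extended/detailed information edited on the server since the last sync."""
        url = f"{SERVER_URL}?since={local_settings['last_sync']}"
        with urllib.request.urlopen(url) as response:
            update = json.load(response)
        # Merge the server's edits into the settings used for tag display and linking.
        local_settings["tags"].update(update.get("tags", {}))
        local_settings["details"].update(update.get("details", {}))
        local_settings["last_sync"] = time.time()

    def run_sync_loop(interval_seconds: float = 60.0):
        """Share edits at any time by polling the server at a fixed interval."""
        while True:
            sync_with_server()
            time.sleep(interval_seconds)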
=== Incorporation by reference ===
This application claims priority to Japanese Patent Application No. 2011-272393, filed on December 13, 2011, the contents of which are incorporated herein by reference.
1 ... product; 2 ... product; 3a, 3b, 3c ... markers; 4 ... marker; 5a, 5b, 5c, 6 ... tags (extended information); 7a, 7b, 7c, 8 ... detailed information

Claims (3)

  1.  An augmented reality display method in a computer processing system, comprising:
      displaying on a display a captured image of a real space being photographed by a camera;
      combining with the captured image and displaying on the display one or more items of extended information set in relation to an object in the captured image, detailed information associated with the object being set in each item of the extended information; and
      enabling a user, before the detailed information on the object is displayed, to choose the detailed information that the user needs, based on the user's selection for each display area of the extended information.
  2.  The augmented reality display method according to claim 1, wherein a minimum size of an extended-information designation receiving area that accepts user input specifying the displayed extended information is defined on the screen of the display, and the display size of the extended information may become equal to or smaller than the minimum size of the extended-information designation receiving area so that the extended information is contained within the extended-information designation receiving area.
  3.  The augmented reality display method according to claim 1 or 2, wherein the computer processing system is composed of a user computer equipped with the camera and the display and a server computer that communicates with the user computer, and the extended information and the detailed information edited on the server computer at a desired time are shared at any time via communication.
PCT/JP2012/075700 2011-12-13 2012-10-03 Augmented reality display method WO2013088819A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-272393 2011-12-13
JP2011272393A JP2013125328A (en) 2011-12-13 2011-12-13 Augmented reality display method

Publications (1)

Publication Number Publication Date
WO2013088819A1 (en) 2013-06-20

Family

ID=48612276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/075700 WO2013088819A1 (en) 2011-12-13 2012-10-03 Augmented reality display method

Country Status (2)

Country Link
JP (1) JP2013125328A (en)
WO (1) WO2013088819A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105630366A (en) * 2014-10-31 2016-06-01 阿里巴巴集团控股有限公司 Method and apparatus for displaying object information in screen display device
JP6455193B2 (en) 2015-02-04 2019-01-23 株式会社デンソー Electronic mirror system and image display control program
JP6256430B2 (en) * 2015-08-17 2018-01-10 コニカミノルタ株式会社 Content providing server, content providing method, and computer program
US10382634B2 (en) 2016-05-06 2019-08-13 Fuji Xerox Co., Ltd. Information processing apparatus and non-transitory computer readable medium configured to generate and change a display menu
US9986113B2 (en) 2016-05-06 2018-05-29 Fuji Xerox Co., Ltd. Information processing apparatus and nontransitory computer readable medium
US10770035B2 (en) 2018-08-22 2020-09-08 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
US10698603B2 (en) 2018-08-24 2020-06-30 Google Llc Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
US10788880B2 (en) 2018-10-22 2020-09-29 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
JP7449672B2 (en) 2019-11-05 2024-03-14 ユニ・チャーム株式会社 Display control device, display control method, and display control program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002108873A (en) * 2000-09-25 2002-04-12 Internatl Business Mach Corp <Ibm> Space information utilizing system, information aquiring device and server system
JP2010519656A (en) * 2007-02-27 2010-06-03 アクセンチュア グローバル サービスィズ ゲーエムベーハー Remote object recognition
JP2011134076A (en) * 2009-12-24 2011-07-07 Mitsubishi Electric Corp Touch panel input device


Also Published As

Publication number Publication date
JP2013125328A (en) 2013-06-24

Similar Documents

Publication Publication Date Title
WO2013088819A1 (en) Augmented reality display method
US8601510B2 (en) User interface for interactive digital television
US20150062158A1 (en) Integration of head mounted displays with public display devices
US20140152869A1 (en) Methods and Systems for Social Overlay Visualization
US20140279025A1 (en) Methods and apparatus for display of mobile advertising content
JP6466347B2 (en) Personal information communicator
Li et al. Cognitive issues in mobile augmented reality: an embodied perspective
US10114799B2 (en) Method for arranging images in electronic documents on small devices
US9741062B2 (en) System for collaboratively interacting with content
Čopič Pucihar et al. ART for art: augmented reality taxonomy for art and cultural heritage
Grubert et al. Exploring the design of hybrid interfaces for augmented posters in public spaces
JP5083697B2 (en) Image display device, input device, and image display method
Crowther Ontology and aesthetics of digital art
JP7376191B2 (en) Guide device, guide system, guide method, program, and recording medium
Antoniac Augmented reality based user interface for mobile applications and services
KR20180071492A (en) Realistic contents service system using kinect sensor
US20150055869A1 (en) Method and apparatus for providing layout based on handwriting input
CN105183292A (en) Method and system for displaying picture
Hagen et al. Visual scoping and personal space on shared tabletop surfaces
JP7396326B2 (en) Information processing system, information processing device, information processing method and program
Claydon Alternative realities: from augmented reality to mobile mixed reality
Denzer Digital collections and exhibits
JP7472681B2 (en) Information processing device, program, and information processing method
US20230334792A1 (en) Interactive reality computing experience using optical lenticular multi-perspective simulation
Zhang et al. Towards Workplace Metaverse: A Human-Centered Approach for Designing and Evaluating XR Virtual Displays

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12857665

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12857665

Country of ref document: EP

Kind code of ref document: A1