WO2019021447A1 - Wearable terminal display system, wearable terminal display method, and program - Google Patents

Wearable terminal display system, wearable terminal display method, and program

Info

Publication number
WO2019021447A1
Authority
WO
WIPO (PCT)
Prior art keywords
prediction
wearable terminal
display
future
target
Prior art date
Application number
PCT/JP2017/027351
Other languages
English (en)
Japanese (ja)
Inventor
俊二 菅谷
Original Assignee
株式会社オプティム
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社オプティム
Priority to PCT/JP2017/027351
Publication of WO2019021447A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • The present invention relates to a wearable terminal display system, a wearable terminal display method, and a program for displaying a calculated future prediction as augmented reality on the display board of a wearable terminal, overlaid on a prediction target seen through the display board.
  • Patent Document 1 provides an apparatus for identifying and extracting documents that predict the future.
  • However, the apparatus of Patent Document 1 cannot provide a future prediction simply by looking at the prediction target.
  • The present invention therefore aims to provide a wearable terminal display system, a wearable terminal display method, and a program that identify a prediction target from an image of the field of view of a wearable terminal and display a future prediction, calculated according to the prediction target, as augmented reality on the display board of the wearable terminal.
  • The present invention provides the following solutions.
  • A first aspect of the invention is a wearable terminal display system for displaying a future prediction of a prediction target on the display board of a wearable terminal, the system comprising: image acquisition means for acquiring an image of a prediction target that has entered the field of view of the wearable terminal; identification means for analyzing the image to identify the prediction target; calculation means for calculating a future prediction for the prediction target; and future prediction display means for displaying, on the display board of the wearable terminal, the future prediction as augmented reality for the prediction target seen through the display board.
  • Another aspect of the invention is a wearable terminal display method for displaying a future prediction of a prediction target on the display board of a wearable terminal, the method comprising: an image acquisition step of acquiring an image of a prediction target that has entered the field of view of the wearable terminal; an identification step of analyzing the image to identify the prediction target; a calculation step of calculating a future prediction for the prediction target; and a future prediction display step of displaying, on the display board of the wearable terminal, the future prediction as augmented reality for the prediction target seen through the display board.
  • A further aspect of the invention is a program that causes a computer to execute: an image acquisition step of acquiring an image of a prediction target that has entered the field of view of the wearable terminal; an identification step of analyzing the image to identify the prediction target; a calculation step of calculating a future prediction for the prediction target; and a future prediction display step of displaying, on the display board of the wearable terminal, the future prediction as augmented reality for the prediction target seen through the display board.
  • According to the invention, a future prediction for a prediction target can be displayed on the display board of the wearable terminal simply by bringing the prediction target into the field of view of the wearable terminal.
  • FIG. 1 is a schematic view of a wearable terminal display system.
  • FIG. 2 shows an example in which a future prediction is calculated and displayed on the display board of a wearable terminal.
  • The wearable terminal display system is a system for displaying a calculated future prediction as augmented reality on the display board of a wearable terminal, overlaid on a prediction target seen through the display board.
  • A wearable terminal is a terminal with a field of view, such as smart glasses or a head-mounted display.
  • FIG. 1 is a schematic view of a wearable terminal display system according to a preferred embodiment of the present invention.
  • The wearable terminal display system includes image acquisition means, identification means, calculation means, and future prediction display means, which are realized by a control unit reading a predetermined program.
  • The system may further include determination means, change means, detection means, action result display means, position/direction acquisition means, estimation means, guideline display means, and selection acceptance means. These may be application-based, cloud-based, or otherwise.
  • Each means described above may be realized by a single computer, or by two or more computers (for example, a server and a terminal).
  • The image acquisition means acquires an image of a prediction target that has entered the field of view of the wearable terminal.
  • The image may be captured by the camera of the wearable terminal; any other device capable of capturing such an image may also be used.
  • The image may be a moving image or a still image. To display future predictions in real time, a real-time image is preferable.
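  • As a concrete illustration, the sketch below shows one way the image acquisition means could grab a real-time frame, assuming the wearable terminal's camera is exposed as an OpenCV-compatible video device; the device index is an illustrative assumption, not something the specification prescribes.

```python
# Hypothetical sketch of the image acquisition means. Assumes the wearable
# terminal's camera appears as an OpenCV video device; index 0 is assumed.
import cv2

def acquire_frame(device_index: int = 0):
    """Grab one real-time frame from the wearable terminal's camera."""
    capture = cv2.VideoCapture(device_index)
    try:
        ok, frame = capture.read()  # ok is False if no frame was available
        return frame if ok else None
    finally:
        capture.release()
```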
  • The identification means analyzes the image to identify the prediction target; for example, whether the prediction target is a Kagoshima black pig, a Yubari melon, or land in Minato-ku, Tokyo.
  • The prediction target can be identified from its color, shape, size, features, and the like, though identification is not limited to these cues.
  • Machine learning may improve the accuracy of the image analysis; for example, machine learning is performed using past images of prediction targets as teacher data.
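  • As a hedged sketch of such an identifier (the specification does not prescribe an algorithm), past images labeled with their targets could train a nearest-neighbour classifier over simple color-histogram features; the algorithm choice, labels, and features here are illustrative assumptions.

```python
# Hypothetical identification means: nearest-neighbour classification over
# HSV color histograms, trained on past images of prediction targets
# (the "teacher data"). Algorithm and labels are assumptions.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def histogram_features(image: np.ndarray) -> np.ndarray:
    """Flattened, normalized HSV histogram as a crude color feature."""
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [16, 16], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def train_identifier(images, labels) -> KNeighborsClassifier:
    # labels: e.g. "kagoshima_black_pig", "yubari_melon", "minato_ku_land"
    model = KNeighborsClassifier(n_neighbors=3)
    model.fit([histogram_features(img) for img in images], labels)
    return model

def identify(model: KNeighborsClassifier, frame: np.ndarray) -> str:
    return model.predict([histogram_features(frame)])[0]
```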
  • The calculation means calculates a future prediction according to the prediction target.
  • The future prediction is, for example, the size to which a Kagoshima black pig will grow, the sugar content a Yubari melon will reach, or the unit price at which a parcel of land in Minato-ku, Tokyo is likely to sell.
  • The future prediction is not limited to these examples.
  • The future prediction may be calculated by referring to a database in which future predictions are registered in advance.
  • Web content linked in advance to the prediction target may be accessed to calculate the future prediction; for example, by assigning a URL that links the prediction target to the future prediction.
  • The future prediction may also be calculated from Web content found by searching for the prediction target on the Internet. For example, past information posted on an information site can feed the calculation, and in some cases future predictions can be calculated from social networking services (SNS) or word-of-mouth sites.
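  • A minimal sketch of the calculation means under these options might consult a pre-registered database first and fall back to pre-linked Web content; every table entry and the URL below are hypothetical placeholders, not values from the specification.

```python
# Hypothetical calculation means: database first, pre-linked Web content as
# fallback. All entries and the URL below are illustrative assumptions.
import urllib.request

PREDICTION_DB = {  # future predictions registered in advance
    "kagoshima_black_pig": "predicted adult weight: about 110 kg",
    "yubari_melon": "predicted sugar content: about 12 degrees Brix",
}

LINKED_URLS = {  # URLs assigned to link a target to its future prediction
    "minato_ku_land": "https://example.com/minato-land-forecast",
}

def calculate_future_prediction(label: str) -> str:
    if label in PREDICTION_DB:
        return PREDICTION_DB[label]
    if label in LINKED_URLS:
        with urllib.request.urlopen(LINKED_URLS[label]) as response:
            return response.read().decode("utf-8")[:200]  # naive extraction
    return "no prediction available"
```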
  • The future prediction display means displays, on the display board of the wearable terminal, the future prediction as augmented reality for the prediction target seen through the display board. For example, as shown in FIG. 2, the future prediction drawn with broken lines is displayed as augmented reality over the prediction target drawn with solid lines seen through the display board.
  • Solid lines are real; broken lines are augmented reality.
  • If the future prediction displayed as augmented reality overlaps the prediction target seen through the display board, the prediction target becomes difficult to see, so the display of the future prediction may be switchable ON/OFF.
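  • The overlay behaviour, including the ON/OFF switch, could look like the following sketch; coordinates and styling are illustrative assumptions, and a real smart-glass renderer would use its own AR drawing API rather than OpenCV.

```python
# Hypothetical future prediction display means: draw the prediction next to
# the target's bounding box, with an ON/OFF toggle so it never permanently
# obscures the target. Geometry and styling are assumptions.
import cv2
import numpy as np

class PredictionOverlay:
    def __init__(self) -> None:
        self.visible = True

    def toggle(self) -> None:
        self.visible = not self.visible  # ON/OFF switch for the AR layer

    def render(self, frame: np.ndarray, box: tuple, text: str) -> np.ndarray:
        if not self.visible:
            return frame
        x, y, w, h = box
        out = frame.copy()
        cv2.rectangle(out, (x, y), (x + w, y + h), (255, 255, 255), 1)
        cv2.putText(out, text, (x, max(12, y - 8)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)
        return out
```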
  • The determination means determines whether the displayed future prediction has been viewed. This may be determined by acquiring an image of what the viewer is looking at and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer, for example a gaze sensor, a motion sensor, or an acceleration sensor.
  • The change means marks the future prediction as viewed when it is determined to have been viewed, and changes its degree of attention when it is determined not to have been viewed. This makes it possible to grasp at a glance which future predictions have and have not been viewed. For example, a future prediction may be marked as viewed by putting a check in its check box or by pressing a stamp on it. The change of attention level may alter the color or size of the future prediction, or add a stamp, so that the unviewed prediction stands out. A gaze-based sketch of these two means follows.
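```python
# Hypothetical determination and change means: a gaze sensor reports where
# the viewer is looking; a dwell of 1.5 s (an assumed threshold) marks the
# prediction as viewed, and unviewed predictions are flagged for emphasis.
import time
from dataclasses import dataclass
from typing import Optional, Tuple

VIEWED_DWELL_SECONDS = 1.5  # illustrative assumption

@dataclass
class DisplayedPrediction:
    text: str
    box: Tuple[int, int, int, int]   # (x, y, w, h) on the display board
    viewed: bool = False
    _gaze_since: Optional[float] = None

    def update_from_gaze(self, gaze_xy: Tuple[int, int]) -> None:
        x, y, w, h = self.box
        gx, gy = gaze_xy
        if not (x <= gx <= x + w and y <= gy <= y + h):
            self._gaze_since = None
            return
        if self._gaze_since is None:
            self._gaze_since = time.monotonic()
        elif time.monotonic() - self._gaze_since >= VIEWED_DWELL_SECONDS:
            self.viewed = True  # change means: mark as already viewed

    def emphasize(self) -> bool:
        """Unviewed predictions get a conspicuous color/size/stamp."""
        return not self.viewed
```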
  • The detection means detects an action on the displayed future prediction.
  • The action is, for example, a gesture, a hand movement, or a gaze movement.
  • The action on the future prediction may be detected from sensor information of the wearable terminal or of sensors worn by the viewer, for example a gaze sensor, a motion sensor, or an acceleration sensor.
  • The action result display means displays, on the display board of the wearable terminal, the result corresponding to the action as augmented reality for the prediction target seen through the display board. Several examples follow, with a sketch after the list.
  • The display of the future prediction may be turned off when an action to delete the future prediction is detected.
  • The link attached to the future prediction may be opened when an action to open it is detected.
  • The page may be turned when a page-turning action is detected.
  • Other actions may be handled similarly.
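  • A hedged sketch of how recognized actions could be mapped to these results is shown below; the action names and the prediction data model are illustrative assumptions.

```python
# Hypothetical detection and action result display means: map a recognized
# action to a result on the displayed prediction.
import webbrowser
from typing import Callable, Dict

def make_action_handlers(prediction: dict) -> Dict[str, Callable[[], None]]:
    # prediction: e.g. {"visible": True, "link": "https://example.com", "page": 0}
    def delete() -> None:
        prediction["visible"] = False        # turn the prediction display off

    def open_link() -> None:
        webbrowser.open(prediction["link"])  # open the attached link

    def turn_page() -> None:
        prediction["page"] += 1              # advance to the next page

    return {"delete": delete, "open_link": open_link, "turn_page": turn_page}

# Usage once a sensor reports an action:
#     handlers = make_action_handlers(prediction)
#     handlers.get(detected_action, lambda: None)()
```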
  • The position/direction acquisition means acquires the terminal position and the imaging direction of the wearable terminal.
  • The terminal position can be acquired from the GPS (Global Positioning System) receiver of the wearable terminal.
  • The imaging direction can be acquired from the geomagnetic sensor or the acceleration sensor of the wearable terminal at the time of imaging. These may also be acquired by other means.
  • The estimation means estimates the position of the prediction target based on the terminal position and the imaging direction: if both are known, the position of the imaged prediction target can be estimated.
  • The identification means may then identify the prediction target from the estimated position together with the image analysis. Using position information improves identification accuracy; for example, if the location can be identified as Sensoji with higher accuracy, the reliability of the displayed future prediction also improves.
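  • As a worked illustration of this estimation, the sketch below projects the GPS position along the compass bearing of the imaging direction; the fixed 10 m range is an illustrative assumption, since the specification does not state how distance to the target is obtained.

```python
# Hypothetical estimation means: great-circle destination point from the
# terminal's GPS fix and the imaging bearing. The 10 m range is an assumed
# stand-in for a real distance estimate (e.g. from depth or map data).
import math

EARTH_RADIUS_M = 6_371_000.0

def estimate_target_position(lat_deg: float, lon_deg: float,
                             bearing_deg: float, range_m: float = 10.0):
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    theta, delta = math.radians(bearing_deg), range_m / EARTH_RADIUS_M
    lat2 = math.asin(math.sin(lat1) * math.cos(delta)
                     + math.cos(lat1) * math.sin(delta) * math.cos(theta))
    lon2 = lon1 + math.atan2(
        math.sin(theta) * math.sin(delta) * math.cos(lat1),
        math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)  # estimated target position
```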
  • The guideline display means displays a guideline for imaging the prediction target as augmented reality on the display board of the wearable terminal.
  • A guideline such as a frame or crosshairs may be displayed; images captured according to the guideline are easier to analyze.
  • The acquisition means may acquire an image captured along the guideline. By acquiring and analyzing only such images, the prediction target can be identified efficiently.
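  • One way to realize this, sketched below under assumed geometry (a frame with a 20% margin), is to draw the guideline and then crop the captured image to that frame so only the guided region is analyzed.

```python
# Hypothetical guideline display means: draw a frame-shaped guideline, then
# crop the captured image to that frame. The 20% margin is an assumption.
import cv2
import numpy as np

def guideline_box(shape, margin: float = 0.2):
    h, w = shape[:2]
    x, y = int(w * margin), int(h * margin)
    return x, y, int(w * (1 - 2 * margin)), int(h * (1 - 2 * margin))

def draw_guideline(frame: np.ndarray) -> np.ndarray:
    x, y, w, h = guideline_box(frame.shape)
    out = frame.copy()
    cv2.rectangle(out, (x, y), (x + w, y + h), (0, 255, 0), 2)  # the frame
    return out

def crop_to_guideline(frame: np.ndarray) -> np.ndarray:
    x, y, w, h = guideline_box(frame.shape)
    return frame[y:y + h, x:x + w]  # analyze only what was framed
```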
  • The selection acceptance means accepts selection of a prediction target seen through the display board of the wearable terminal.
  • The selection may be accepted by gazing at the prediction target seen through the display board for a certain period of time.
  • The selection may be accepted by touching the prediction target seen through the display board.
  • The selection may be accepted by positioning a cursor on the prediction target seen through the display board.
  • Such selections may be detected with, for example, a gaze sensor, a motion sensor, or an acceleration sensor; a cursor-based sketch follows.
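```python
# Hypothetical selection acceptance means (cursor variant) plus the
# selection-only display rule: only the selected target's prediction is
# returned for rendering. The target data model is an assumption.
from typing import List, Optional, Tuple

Target = dict  # {"label": str, "box": (x, y, w, h), "prediction": str}

def accept_selection(targets: List[Target],
                     cursor_xy: Tuple[int, int]) -> Optional[Target]:
    cx, cy = cursor_xy
    for target in targets:
        x, y, w, h = target["box"]
        if x <= cx <= x + w and y <= cy <= y + h:
            return target  # the prediction target under the cursor
    return None

def predictions_to_display(selected: Optional[Target]) -> List[Target]:
    # Display the future prediction only for the selected target, keeping
    # the display board uncluttered.
    return [selected] if selected is not None else []
```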
  • The future prediction display means may display, on the display board of the wearable terminal, the future prediction as augmented reality only for the selected target seen through the display board. Displaying the future prediction only for the selected target makes it possible to pinpoint the prediction of interest; if future predictions were displayed for all identified prediction targets, the display screen could become cluttered.
  [Description of operation]
  • The wearable terminal display method according to the present invention is a method of displaying a calculated future prediction as augmented reality on the display board of a wearable terminal, overlaid on a prediction target seen through the display board.
  • The wearable terminal display method includes an image acquisition step, an identification step, a calculation step, and a future prediction display step. Although not shown in the drawings, it may likewise include a determination step, a change step, a detection step, an action result display step, a position/direction acquisition step, an estimation step, a guideline display step, and a selection acceptance step.
  • The image acquisition step acquires an image of a prediction target that has entered the field of view of the wearable terminal.
  • The image may be captured by the camera of the wearable terminal; any other device capable of capturing such an image may also be used.
  • The image may be a moving image or a still image. To display future predictions in real time, a real-time image is preferable.
  • The identification step analyzes the image to identify the prediction target; for example, whether the prediction target is a Kagoshima black pig, a Yubari melon, or land in Minato-ku, Tokyo.
  • The prediction target can be identified from its color, shape, size, features, and the like, though identification is not limited to these cues.
  • Machine learning may improve the accuracy of the image analysis; for example, machine learning is performed using past images of prediction targets as teacher data.
  • The calculation step calculates a future prediction according to the prediction target.
  • The future prediction is, for example, the size to which a Kagoshima black pig will grow, the sugar content a Yubari melon will reach, or the unit price at which a parcel of land in Minato-ku, Tokyo is likely to sell.
  • The future prediction is not limited to these examples.
  • The future prediction may be calculated by referring to a database in which future predictions are registered in advance.
  • Web content linked in advance to the prediction target may be accessed to calculate the future prediction; for example, by assigning a URL that links the prediction target to the future prediction.
  • The future prediction may also be calculated from Web content found by searching for the prediction target on the Internet. For example, past information posted on an information site can feed the calculation, and in some cases future predictions can be calculated from social networking services (SNS) or word-of-mouth sites.
  • The future prediction display step displays, on the display board of the wearable terminal, the future prediction as augmented reality for the prediction target seen through the display board.
  • For example, the future prediction drawn with broken lines is displayed as augmented reality over the prediction target drawn with solid lines seen through the display board.
  • Solid lines are real; broken lines are augmented reality.
  • If the future prediction displayed as augmented reality overlaps the prediction target seen through the display board, the prediction target becomes difficult to see, so the display of the future prediction may be switchable ON/OFF.
  • The determination step determines whether the displayed future prediction has been viewed. This may be determined by acquiring an image of what the viewer is looking at and analyzing it, or from sensor information of the wearable terminal or of sensors worn by the viewer, for example a gaze sensor, a motion sensor, or an acceleration sensor.
  • The change step marks the future prediction as viewed when it is determined to have been viewed, and changes its degree of attention when it is determined not to have been viewed.
  • For example, a future prediction may be marked as viewed by putting a check in its check box.
  • A stamp may also be pressed on the future prediction to mark it as viewed.
  • The change of attention level may alter the color or size of the future prediction, or add a stamp, so that the unviewed prediction stands out.
  • The detection step detects an action on the displayed future prediction.
  • The action is, for example, a gesture, a hand movement, or a gaze movement.
  • The action on the future prediction may be detected from sensor information of the wearable terminal or of sensors worn by the viewer, for example a gaze sensor, a motion sensor, or an acceleration sensor.
  • The action result display step displays, on the display board of the wearable terminal, the result corresponding to the action as augmented reality for the prediction target seen through the display board.
  • The display of the future prediction may be turned off when an action to delete the future prediction is detected.
  • The link attached to the future prediction may be opened when an action to open it is detected.
  • The page may be turned when a page-turning action is detected.
  • Other actions may be handled similarly.
  • The position/direction acquisition step acquires the terminal position and the imaging direction of the wearable terminal.
  • The terminal position can be acquired from the GPS (Global Positioning System) receiver of the wearable terminal.
  • The imaging direction can be acquired from the geomagnetic sensor or the acceleration sensor of the wearable terminal at the time of imaging. These may also be acquired by other means.
  • The estimation step estimates the position of the prediction target based on the terminal position and the imaging direction: if both are known, the position of the imaged prediction target can be estimated.
  • The identification step may identify the prediction target from the estimated position together with the image analysis. Using position information improves identification accuracy; for example, if the location can be identified as Sensoji with higher accuracy, the reliability of the displayed future prediction also improves.
  • The guideline display step displays a guideline for imaging the prediction target as augmented reality on the display board of the wearable terminal.
  • A guideline such as a frame or crosshairs may be displayed; images captured according to the guideline are easier to analyze.
  • The acquisition step may acquire an image captured along the guideline. By acquiring and analyzing only such images, the prediction target can be identified efficiently.
  • The selection acceptance step accepts selection of a prediction target seen through the display board of the wearable terminal.
  • The selection may be accepted by gazing at the prediction target seen through the display board for a certain period of time.
  • The selection may be accepted by touching the prediction target seen through the display board.
  • The selection may be accepted by positioning a cursor on the prediction target seen through the display board.
  • Such selections may be detected with, for example, a gaze sensor, a motion sensor, or an acceleration sensor.
  • The future prediction display step may display, on the display board of the wearable terminal, the future prediction as augmented reality only for the selected target seen through the display board. Displaying the future prediction only for the selected target makes it possible to pinpoint the prediction of interest; if future predictions were displayed for all identified prediction targets, the display screen could become cluttered.
  • The above-described means and functions are realized by a computer (including a CPU, an information processing device, and various terminals) reading and executing a predetermined program.
  • The program may be, for example, an application installed on a computer, or provided as SaaS (Software as a Service) from a computer via a network. It may also be provided recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM or the like), or a DVD (DVD-ROM, DVD-RAM, or the like).
  • In that case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it.
  • The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to the computer via a communication line.
  • As the machine learning technique, for example, the nearest neighbor method, the naive Bayes method, decision trees, support vector machines, or reinforcement learning may be used.
  • Deep learning, in which feature quantities for learning are generated by using a neural network, may also be used.
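  • As a hedged illustration of how the supervised techniques named above could sit behind one interface (reinforcement learning is omitted, since it does not fit a classifier API), a scikit-learn sketch follows; feature extraction is assumed to happen upstream.

```python
# Hypothetical selection among the learning techniques named above via
# scikit-learn. Inputs are assumed to be fixed-length feature vectors.
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

LEARNERS = {
    "nearest_neighbor": lambda: KNeighborsClassifier(n_neighbors=3),
    "naive_bayes": lambda: GaussianNB(),
    "decision_tree": lambda: DecisionTreeClassifier(),
    "support_vector_machine": lambda: SVC(),
}

def train(method: str, features, labels):
    model = LEARNERS[method]()   # fresh estimator per call
    model.fit(features, labels)  # teacher data from past images
    return model
```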

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The purpose of the present invention is to identify a prediction target from an image of the field of view of a wearable terminal, and to display a future prediction, calculated according to the prediction target, as augmented reality on the display board of the wearable terminal. To this end, the invention provides a wearable terminal display system that displays future predictions of a prediction target on the display board of a wearable terminal, the system comprising: image acquisition means for acquiring an image of the prediction target that has entered the field of view of the wearable terminal; identification means for identifying the prediction target by performing image analysis on the image; calculation means for calculating a future prediction for the prediction target; and future prediction display means for displaying, on the display board of the wearable terminal, as augmented reality, the future prediction for the prediction target that is visible through the display board.
PCT/JP2017/027351 2017-07-28 2017-07-28 Wearable terminal display system, wearable terminal display method, and program WO2019021447A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/027351 WO2019021447A1 (fr) 2017-07-28 2017-07-28 Wearable terminal display system, wearable terminal display method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/027351 WO2019021447A1 (fr) 2017-07-28 2017-07-28 Wearable terminal display system, wearable terminal display method, and program

Publications (1)

Publication Number Publication Date
WO2019021447A1 true WO2019021447A1 (fr) 2019-01-31

Family

ID=65041129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/027351 WO2019021447A1 (fr) 2017-07-28 2017-07-28 Wearable terminal display system, wearable terminal display method, and program

Country Status (1)

Country Link
WO (1) WO2019021447A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002149478A (ja) * 2000-08-29 2002-05-24 Fujitsu Ltd 更新情報の自動表示方法、装置、媒体およびプログラム
JP2005176345A (ja) * 2003-11-20 2005-06-30 Matsushita Electric Ind Co Ltd 連携制御装置、連携制御方法およびサービス連携システム
JP2007018456A (ja) * 2005-07-11 2007-01-25 Nikon Corp 情報表示装置及び情報表示方法
JP2007178124A (ja) * 2005-12-26 2007-07-12 Aisin Aw Co Ltd ナビゲーションシステム
WO2011126134A1 (fr) * 2010-04-09 2011-10-13 サイバーアイ・エンタテインメント株式会社 Système de serveur permettant une collecte, une reconnaissance, un classement, un traitement et une distribution d'images animées en temps réel
JP2014531662A (ja) * 2011-09-19 2014-11-27 アイサイト モバイル テクノロジーズ リミテッド 拡張現実システムのためのタッチフリーインターフェース


Similar Documents

Publication Publication Date Title
US20200193487A1 (en) System and method to measure effectiveness and consumption of editorial content
CN110704684B (zh) Video search method and apparatus, terminal, and storage medium
CN107690657B (zh) Discovering merchants from images
US20180247361A1 (en) Information processing apparatus, information processing method, wearable terminal, and program
JP6681342B2 (ja) Behavioral event measurement system and related method
US9761054B2 (en) Augmented reality computing with inertial sensors
US20200042516A1 (en) Automated sequential site navigation
JP6267841B1 (ja) Wearable terminal display system, wearable terminal display method, and program
US20190102952A1 (en) Identifying augmented reality visuals influencing user behavior in virtual-commerce environments
KR101925701B1 (ko) Determination of attention to stimuli based on gaze information
JP2010061218A (ja) Web advertisement effectiveness measurement device, web advertisement effectiveness measurement method, and program
WO2022007451A1 (fr) Target detection method and apparatus, computer-readable medium, and electronic device
US9619707B2 (en) Gaze position estimation system, control method for gaze position estimation system, gaze position estimation device, control method for gaze position estimation device, program, and information storage medium
WO2014176938A1 (fr) Method and apparatus for information retrieval
JP6887198B2 (ja) Wearable terminal display system, wearable terminal display method, and program
JP2017204134A (ja) Attribute estimation device, attribute estimation method, and program
WO2018198320A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2019021446A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2019021447A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2018216221A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2018216220A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2019003359A1 (fr) Wearable terminal display system, wearable terminal display method, and program
JP6762470B2 (ja) Wearable terminal display system, wearable terminal display method, and program
JP2016218822A (ja) Sales information utilization device, sales information utilization method, and program
JP6343412B1 (ja) Map-linked sensor information display system, map-linked sensor information display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17918995

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17918995

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP