WO2019003359A1 - Wearable terminal display system, wearable terminal display method, and program - Google Patents


Info

Publication number
WO2019003359A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable terminal
instruction manual
display
explanation
image
Prior art date
Application number
PCT/JP2017/023814
Other languages
English (en)
Japanese (ja)
Inventor
俊二 菅谷
Original Assignee
株式会社オプティム
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社オプティム filed Critical 株式会社オプティム
Priority to PCT/JP2017/023814 priority Critical patent/WO2019003359A1/fr
Publication of WO2019003359A1 publication Critical patent/WO2019003359A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present invention relates to a wearable terminal display system, a wearable terminal display method, and a program for displaying, on the display plate of a wearable terminal, a collected instruction manual as augmented reality over an explanation object seen through the display plate.
  • Patent Document 1 discloses an apparatus for realizing a display control apparatus capable of presenting an electronic instruction manual of its own apparatus, according to the state of that apparatus, in a more sophisticated form than before.
  • However, the apparatus of Patent Document 1 cannot display instruction manuals for objects other than the apparatus itself.
  • The present invention therefore identifies an explanation object from an image of the field of view of the wearable terminal, and displays an instruction manual collected for that object on the display plate of the wearable terminal as augmented reality.
  • An object of the present invention is to provide such a wearable terminal display system, wearable terminal display method, and program.
  • The present invention provides the following solutions.
  • The invention according to the first aspect is a wearable terminal display system for displaying an instruction manual of an explanation object on the display plate of a wearable terminal, comprising: image acquisition means for acquiring an image of an explanation object within the field of view of the wearable terminal; identification means for analyzing the image to identify the explanation object; collection means for collecting an instruction manual for the explanation object; and instruction manual display means for displaying, on the display plate of the wearable terminal, the instruction manual as augmented reality over the explanation object seen through the display plate.
  • The invention according to the second aspect is a wearable terminal display method for displaying an instruction manual of an explanation object on the display plate of a wearable terminal, comprising: an image acquisition step of acquiring an image of an explanation object within the field of view of the wearable terminal; an identification step of analyzing the image to identify the explanation object; a collection step of collecting an instruction manual for the explanation object; and an instruction manual display step of displaying, on the display plate of the wearable terminal, the instruction manual as augmented reality over the explanation object seen through the display plate.
  • The invention according to the third aspect provides a program for causing a computer to execute: an image acquisition step of acquiring an image of an explanation object within the field of view of a wearable terminal; an identification step of analyzing the image to identify the explanation object; a collection step of collecting an instruction manual for the explanation object; and an instruction manual display step of displaying, on the display plate of the wearable terminal, the instruction manual as augmented reality over the explanation object seen through the display plate.
  • FIG. 1 is a schematic view of a wearable terminal display system.
  • FIG. 2 is an example in which a collected instruction manual of an explanation object is displayed on the display plate of a wearable terminal.
  • The wearable terminal display system is a system for displaying, on the display plate of a wearable terminal, a collected instruction manual as augmented reality over an explanation object seen through the display plate.
  • A wearable terminal is a terminal with a field of view, such as smart glasses or a head mounted display.
  • FIG. 1 is a schematic view of a wearable terminal display system according to a preferred embodiment of the present invention.
  • The wearable terminal display system includes an image acquisition unit, an identification unit, a collection unit, and an instruction manual display unit, each realized by the control unit reading a predetermined program.
  • Determination means, change means, detection means, action result display means, position/direction acquisition means, estimation means, guideline display means, and selection acceptance means may also be provided. These may be application based, cloud based, or otherwise.
  • Each of the above means may be realized by a single computer, or by two or more computers (for example, a server and a terminal).
  • The image acquisition means acquires an image of an explanation object that has entered the field of view of the wearable terminal.
  • The image may be captured by the camera of the wearable terminal, or by any other device capable of acquiring such an image.
  • The image may be a moving image or a still image. To display the instruction manual in real time, a real-time image is preferable.
  • The identification means analyzes the image to identify the explanation object. For example, it identifies whether the explanation object is company A's dehumidifier X, company B's medical instrument Y, company C's game Z, and so on.
  • The explanation object is not limited to these.
  • The explanation object can be identified from its color, shape, size, characters, marks, and the like.
  • Only explanation objects in the center of the wearable terminal's field of view may be identified. By identifying only the explanation object at the center of the field of view, the time required for identification can be significantly reduced.
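The center-of-view restriction above amounts to cropping the middle of the frame before analysis. A minimal sketch, with the frame modelled as a 2D list and the function name invented for illustration:

```python
def crop_center(frame, fraction=0.5):
    """Return the central `fraction` of the frame in both dimensions,
    so that image analysis only runs on the center of the field of view."""
    h, w = len(frame), len(frame[0])
    ch, cw = max(1, int(h * fraction)), max(1, int(w * fraction))
    top, left = (h - ch) // 2, (w - cw) // 2
    return [row[left:left + cw] for row in frame[top:top + ch]]

# An 8x8 "frame" whose pixels record their own coordinates.
frame = [[(r, c) for c in range(8)] for r in range(8)]
center = crop_center(frame, fraction=0.5)  # 4x4 region starting at (2, 2)
```

Analyzing only this cropped region is what shortens the identification time: the classifier sees a quarter of the pixels.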
  • Machine learning may be used to improve the accuracy of image analysis. For example, machine learning is performed using past images of explanation objects as teacher data.
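As one hedged illustration of learning from past images as teacher data, the sketch below uses a nearest-centroid classifier over feature vectors. A real system would likely use a neural network on raw images; the feature vectors and product labels here are invented:

```python
def train_centroids(teacher_data):
    """teacher_data: {label: [feature_vector, ...]} -> {label: centroid}."""
    centroids = {}
    for label, vectors in teacher_data.items():
        n = len(vectors)
        centroids[label] = [sum(v[i] for v in vectors) / n
                            for i in range(len(vectors[0]))]
    return centroids

def classify(centroids, features):
    """Return the label whose centroid is nearest to `features`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl], features))

# Illustrative teacher data: two past "images" per product, as 2-D features.
teacher = {
    "dehumidifier X": [[0.9, 0.1], [0.8, 0.2]],
    "medical instrument Y": [[0.1, 0.9], [0.2, 0.8]],
}
model = train_centroids(teacher)
print(classify(model, [0.85, 0.15]))  # -> dehumidifier X
```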
  • The collection means collects an instruction manual corresponding to the explanation object. The instruction manual may be collected by referring to a database in which instruction manuals are registered in advance. It may also be collected by accessing Web content linked in advance to the explanation object; for example, by assigning a URL that links the explanation object to its instruction manual. The instruction manual may also be collected from Web content found by searching for the explanation object on the Internet; for example, the manual may be posted on a website related to the explanation object, so it can be found through an Internet search. Instruction manuals may also be collected from social networking services (SNS) or word-of-mouth sites.
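The collection strategy above can be sketched as a cascade: registered database first, then pre-linked Web content, then an Internet search as a fallback. The registry contents, URL, and search function below are placeholders, not part of the patent:

```python
MANUAL_DB = {"dehumidifier X": "Manual text for dehumidifier X"}
LINKED_URLS = {"medical instrument Y": "https://example.com/manual-y"}

def web_search(target):
    # Placeholder for an Internet / SNS search for the manual.
    return f"search results for '{target} instruction manual'"

def collect_manual(target):
    if target in MANUAL_DB:             # 1) database registered in advance
        return MANUAL_DB[target]
    if target in LINKED_URLS:           # 2) Web content linked in advance
        return f"fetched from {LINKED_URLS[target]}"
    return web_search(target)           # 3) Internet search fallback
```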
  • The instruction manual display means displays, on the display plate of the wearable terminal, the instruction manual as augmented reality over the explanation object seen through the display plate.
  • In FIG. 2, an instruction manual drawn with broken lines is displayed as augmented reality on the display plate of the wearable terminal, over an explanation object drawn with solid lines seen through the display plate.
  • Solid lines are real; broken lines are augmented reality.
  • The instruction manual displayed as augmented reality may be displayed so as to overlap the explanation object seen through the display plate; since this makes the explanation object difficult to see, the display of the manual may be switchable ON/OFF.
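The ON/OFF switching mentioned above could be modelled as a simple visibility toggle on the overlay state; the class and field names are illustrative:

```python
class ManualOverlay:
    """State of an instruction manual shown as AR over a real object."""
    def __init__(self, manual_text):
        self.manual_text = manual_text
        self.visible = True          # shown over the explanation object

    def toggle(self):
        """Switch the manual display ON/OFF and return the new state."""
        self.visible = not self.visible
        return self.visible

overlay = ManualOverlay("Step 1: remove the water tank ...")
overlay.toggle()   # hide the manual so the object is easier to see
overlay.toggle()   # show it again
```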
  • The determination means determines whether or not the displayed instruction manual has been viewed. Whether the manual has been viewed may be determined by acquiring an image of it being browsed and analyzing that image, or from sensor information of the wearable terminal or sensors worn by the viewer: for example, a sensor that detects the line of sight, a motion sensor, or an acceleration sensor.
  • The change means marks the instruction manual as viewed if it is determined to have been viewed, and raises its attention level so that it will be viewed if it is determined not to have been viewed. This makes it possible to grasp visually which instruction manuals have and have not been viewed.
  • A manual may be marked as viewed by putting a check in its check box.
  • A manual may be marked as viewed by pressing a stamp.
  • The attention level may be changed by changing the color or size of the instruction manual, or by pressing a stamp, so that the instruction manual stands out.
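The determination/change pair above can be sketched as one state update: mark the manual viewed once browsing is detected, otherwise raise its attention level (rendered, for example, as a larger or recolored overlay). The field names and values are illustrative:

```python
def update_manual(manual, was_viewed):
    """Apply the change means' rule to one manual's display state."""
    if was_viewed:
        manual["viewed"] = True      # e.g. render a check mark or stamp
    else:
        manual["highlight"] += 1     # e.g. enlarge or recolor the manual
    return manual

m = {"title": "dehumidifier X manual", "viewed": False, "highlight": 0}
update_manual(m, was_viewed=False)   # not yet read: make it conspicuous
update_manual(m, was_viewed=True)    # read: mark it as viewed
```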
  • The detection means detects an action on the displayed instruction manual.
  • The action is, for example, a gesture, a hand movement, or a gaze movement.
  • The action on the instruction manual may be detected from sensor information of the wearable terminal or sensors worn by the viewer: for example, a sensor that detects the line of sight, a motion sensor, or an acceleration sensor.
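As one hedged illustration of sensor-based action detection, the sketch below classifies a swipe gesture from accelerometer samples by the dominant axis of motion. The axis convention and threshold are invented for illustration:

```python
def detect_gesture(accel_samples, threshold=1.5):
    """accel_samples: list of (ax, ay) readings summed over a short window.
    Returns a gesture name, or "none" if motion stays under the threshold."""
    ax = sum(s[0] for s in accel_samples)
    ay = sum(s[1] for s in accel_samples)
    if abs(ax) < threshold and abs(ay) < threshold:
        return "none"
    if abs(ax) >= abs(ay):
        return "swipe_right" if ax > 0 else "swipe_left"
    return "swipe_down" if ay > 0 else "swipe_up"

print(detect_gesture([(0.8, 0.1), (0.9, 0.0)]))   # -> swipe_right
```

A detected gesture would then be passed to the action result display means (for example, "swipe_right" could turn a page of the manual).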
  • The action result display means displays, on the display plate of the wearable terminal, the result of the action as augmented reality over the explanation object seen through the display plate.
  • For example, the display of the instruction manual may be turned off.
  • A link attached to the instruction manual may be opened upon detecting an action for opening it.
  • A page may be turned.
  • Other actions may also be handled.
  • The position/direction acquisition means acquires the terminal position and the imaging direction of the wearable terminal.
  • The terminal position can be acquired from the GPS (Global Positioning System) of the wearable terminal.
  • When imaging is performed by the wearable terminal, the imaging direction can be acquired from its geomagnetic sensor or acceleration sensor. It may also be acquired from other sources.
  • The estimation means estimates the position of the explanation object based on the terminal position and the imaging direction. If the terminal position and the imaging direction are known, the position of the imaged explanation object can be estimated.
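The estimation above can be sketched in a flat local coordinate frame: project from the terminal position along the imaging direction by an assumed distance to the object (which a real system might obtain from depth sensing or apparent object size; the distance here is an assumption):

```python
import math

def estimate_object_position(terminal_xy, heading_deg, distance_m):
    """Project from the terminal along the imaging direction.
    heading_deg is compass-style: 0 = +y (north), 90 = +x (east)."""
    rad = math.radians(heading_deg)
    dx = distance_m * math.sin(rad)
    dy = distance_m * math.cos(rad)
    return (terminal_xy[0] + dx, terminal_xy[1] + dy)

pos = estimate_object_position((0.0, 0.0), heading_deg=90.0, distance_m=2.0)
# pos is approximately (2.0, 0.0): two metres due east of the terminal.
```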
  • The identification means may identify the explanation object from the estimated object position together with the image analysis. Identification accuracy can be improved by using the location information. For example, if the position information improves the accuracy of identifying the dehumidifier X manufactured by company A in 2016, the reliability of the correspondingly displayed instruction manual also improves.
  • The guideline display means displays, as augmented reality on the display plate of the wearable terminal, a guideline for imaging the explanation object.
  • Guidelines such as a frame or crosshairs may be displayed. Having the image captured according to the guideline makes the image easier to analyze.
  • The acquisition means may acquire an image captured along the guideline.
  • This allows the explanation object to be identified efficiently.
  • The selection acceptance means accepts selection of a selection target from among the explanation objects seen through the display plate of the wearable terminal.
  • The selection may be accepted by gazing at an explanation object seen through the display plate for a predetermined time.
  • The selection may be accepted by the user touching an explanation object seen through the display plate.
  • The selection may be accepted by placing a cursor on an explanation object seen through the display plate.
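The "gaze for a predetermined time" variant above is dwell-based selection: accumulate gaze time per target and fire a selection once a threshold is crossed. The event format and timings below are illustrative:

```python
def select_by_dwell(gaze_events, dwell_threshold=1.0):
    """gaze_events: list of (target, seconds) gaze samples in order.
    Returns the first target whose accumulated dwell time reaches the
    threshold, or None if no target is selected."""
    dwell = {}
    for target, seconds in gaze_events:
        dwell[target] = dwell.get(target, 0.0) + seconds
        if dwell[target] >= dwell_threshold:
            return target
    return None

events = [("dehumidifier X", 0.4), ("game Z", 0.3), ("dehumidifier X", 0.7)]
print(select_by_dwell(events))   # -> dehumidifier X (total dwell 1.1 s)
```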
  • The selection may be detected with, for example, a sensor that detects the line of sight, a motion sensor, or an acceleration sensor.
  • The instruction manual display means may display the instruction manual as augmented reality on the display plate of the wearable terminal only for the selected target seen through the display plate. Since the instruction manual is displayed as augmented reality only for the selected target, the relevant manual can be grasped precisely. If instruction manuals were displayed for all identified explanation objects, the display screen could become cluttered.

[Description of operation]
  • The wearable terminal display method of the present invention is a method of displaying, on the display plate of a wearable terminal, a collected instruction manual as augmented reality over an explanation object seen through the display plate.
  • The wearable terminal display method includes an image acquisition step, an identification step, a collection step, and an instruction manual display step. Although not shown in the drawings, a determination step, a change step, a detection step, an action result display step, a position/direction acquisition step, an estimation step, a guideline display step, and a selection acceptance step may similarly be provided.
  • The image acquisition step acquires an image of an explanation object that has entered the field of view of the wearable terminal.
  • The image may be captured by the camera of the wearable terminal, or by any other device capable of acquiring such an image.
  • The image may be a moving image or a still image. To display the instruction manual in real time, a real-time image is preferable.
  • The identification step analyzes the image to identify the explanation object. For example, it identifies whether the explanation object is company A's dehumidifier X, company B's medical instrument Y, company C's game Z, and so on.
  • The explanation object is not limited to these.
  • The explanation object can be identified from its color, shape, size, characters, marks, and the like.
  • Only explanation objects in the center of the wearable terminal's field of view may be identified. By identifying only the explanation object at the center of the field of view, the time required for identification can be significantly reduced.
  • Machine learning may be used to improve the accuracy of image analysis. For example, machine learning is performed using past images of explanation objects as teacher data.
  • The collection step collects an instruction manual corresponding to the explanation object.
  • The instruction manual may be collected by referring to a database in which instruction manuals are registered in advance.
  • It may also be collected by accessing Web content linked in advance to the explanation object; for example, by assigning a URL that links the explanation object to its instruction manual.
  • The instruction manual may also be collected from Web content found by searching for the explanation object on the Internet; for example, the manual may be posted on a website related to the explanation object, so it can be found through an Internet search. Instruction manuals may also be collected from social networking services (SNS) or word-of-mouth sites.
  • The instruction manual display step displays, on the display plate of the wearable terminal, the instruction manual as augmented reality over the explanation object seen through the display plate.
  • In FIG. 2, an instruction manual drawn with broken lines is displayed as augmented reality on the display plate of the wearable terminal, over an explanation object drawn with solid lines seen through the display plate.
  • Solid lines are real; broken lines are augmented reality.
  • The instruction manual displayed as augmented reality may be displayed so as to overlap the explanation object seen through the display plate; since this makes the explanation object difficult to see, the display of the manual may be switchable ON/OFF.
  • The determination step determines whether or not the displayed instruction manual has been viewed. Whether the manual has been viewed may be determined by acquiring an image of it being browsed and analyzing that image, or from sensor information of the wearable terminal or sensors worn by the viewer: for example, a sensor that detects the line of sight, a motion sensor, or an acceleration sensor.
  • The change step marks the instruction manual as viewed if it is determined to have been viewed, and raises its attention level so that it will be viewed if it is determined not to have been viewed. This makes it possible to grasp visually which instruction manuals have and have not been viewed.
  • A manual may be marked as viewed by putting a check in its check box.
  • A manual may be marked as viewed by pressing a stamp.
  • The attention level may be changed by changing the color or size of the instruction manual, or by pressing a stamp, so that the instruction manual stands out.
  • The detection step detects an action on the displayed instruction manual.
  • The action is, for example, a gesture, a hand movement, or a gaze movement.
  • The action on the instruction manual may be detected from sensor information of the wearable terminal or sensors worn by the viewer: for example, a sensor that detects the line of sight, a motion sensor, or an acceleration sensor.
  • The action result display step displays, on the display plate of the wearable terminal, the result of the action as augmented reality over the explanation object seen through the display plate.
  • For example, the display of the instruction manual may be turned off.
  • A link attached to the instruction manual may be opened upon detecting an action for opening it.
  • A page may be turned.
  • Other actions may also be handled.
  • The position/direction acquisition step acquires the terminal position and the imaging direction of the wearable terminal.
  • The terminal position can be acquired from the GPS (Global Positioning System) of the wearable terminal.
  • When imaging is performed by the wearable terminal, the imaging direction can be acquired from its geomagnetic sensor or acceleration sensor. It may also be acquired from other sources.
  • The estimation step estimates the position of the explanation object based on the terminal position and the imaging direction. If the terminal position and the imaging direction are known, the position of the imaged explanation object can be estimated.
  • The identification step may identify the explanation object from the estimated object position together with the image analysis. Identification accuracy can be improved by using the location information. For example, if the position information improves the accuracy of identifying the dehumidifier X manufactured by company A in 2016, the reliability of the correspondingly displayed instruction manual also improves.
  • The guideline display step displays, as augmented reality on the display plate of the wearable terminal, a guideline for imaging the explanation object.
  • Guidelines such as a frame or crosshairs may be displayed. Having the image captured according to the guideline makes the image easier to analyze.
  • The acquisition step may acquire an image captured along the guideline.
  • This allows the explanation object to be identified efficiently.
  • The selection acceptance step accepts selection of a selection target from among the explanation objects seen through the display plate of the wearable terminal.
  • The selection may be accepted by gazing at an explanation object seen through the display plate for a predetermined time.
  • The selection may be accepted by the user touching an explanation object seen through the display plate.
  • The selection may be accepted by placing a cursor on an explanation object seen through the display plate.
  • The selection may be detected with, for example, a sensor that detects the line of sight, a motion sensor, or an acceleration sensor.
  • The instruction manual may be displayed as augmented reality on the display plate of the wearable terminal only for the selected target seen through the display plate. Since the instruction manual is displayed as augmented reality only for the selected target, the relevant manual can be grasped precisely. If instruction manuals were displayed for all identified explanation objects, the display screen could become cluttered.
  • The means and functions described above are realized by a computer (including a CPU, an information processing device, and various terminals) reading and executing a predetermined program.
  • The program may be, for example, an application installed on a computer, or SaaS (software as a service) provided from a computer via a network, or it may be provided in the form of being recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.).
  • In that case, the computer reads the program from the recording medium, transfers and stores it in an internal or external storage device, and executes it.
  • The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to the computer via a communication line.
  • As the machine learning technique, for example, the nearest neighbor method, the naive Bayes method, decision trees, support vector machines, or reinforcement learning may be used.
  • Deep learning, in which feature quantities for learning are generated using a neural network, may also be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The objective of the invention is to identify an explanation object from an image of the field of view of a wearable terminal, and to display an instruction manual collected for that explanation object on the display plate of the wearable terminal as augmented reality. To that end, the invention provides a wearable terminal display system for displaying an instruction manual of an explanation object on the display plate of a wearable terminal, the system comprising: image acquisition means for acquiring an image of an explanation object that has entered the field of view of the wearable terminal; identification means for performing image analysis on the image to identify the explanation object; collection means for collecting an instruction manual for the explanation object; and instruction manual display means for displaying, on the display plate of the wearable terminal, the instruction manual as augmented reality over the explanation object seen through the display plate.
PCT/JP2017/023814 2017-06-28 2017-06-28 Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme WO2019003359A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/023814 WO2019003359A1 (fr) 2017-06-28 2017-06-28 Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/023814 WO2019003359A1 (fr) 2017-06-28 2017-06-28 Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme

Publications (1)

Publication Number Publication Date
WO2019003359A1 true WO2019003359A1 (fr) 2019-01-03

Family

ID=64741211

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/023814 WO2019003359A1 (fr) 2017-06-28 2017-06-28 Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme

Country Status (1)

Country Link
WO (1) WO2019003359A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002149478A (ja) * 2000-08-29 2002-05-24 Fujitsu Ltd 更新情報の自動表示方法、装置、媒体およびプログラム
JP2009008905A (ja) * 2007-06-28 2009-01-15 Ricoh Co Ltd 情報表示装置および情報表示システム
JP2011114781A (ja) * 2009-11-30 2011-06-09 Brother Industries Ltd ヘッドマウントディスプレイ装置、及びヘッドマウントディスプレイ装置を用いた画像共有システム
JP2015153157A (ja) * 2014-02-14 2015-08-24 Kddi株式会社 仮想情報管理システム
JP2017049763A (ja) * 2015-09-01 2017-03-09 株式会社東芝 電子機器、支援システムおよび支援方法
WO2017081920A1 (fr) * 2015-11-10 2017-05-18 日本電気株式会社 Dispositif de traitement d'informations, procédé de commande et programme


Similar Documents

Publication Publication Date Title
US20180247361A1 (en) Information processing apparatus, information processing method, wearable terminal, and program
US11630861B2 (en) Method and apparatus for video searching, terminal and storage medium
CN107690657B (zh) 根据影像发现商户
CN106164959B (zh) 行为事件测量系统和相关方法
TWI418763B (zh) 作為導航器之行動成像裝置
KR102093198B1 (ko) 시선 인식을 이용한 사용자 인터페이스 방법 및 장치
US10372958B2 (en) In-field data acquisition and formatting
US20130042171A1 (en) Method and system for generating and managing annotation in electronic book
CA2762662A1 (fr) Procede pour la concordance automatique de donnees de suivi du regard avec un contenu hypermedia
JP2010061218A (ja) ウェブ広告効果測定装置、ウェブ広告効果測定方法及びプログラム
CN105009034A (zh) 信息处理设备、信息处理方法和程序
KR20170028302A (ko) 시선 정보에 기초하는 자극들에 대한 주의의 결정
CN105210009A (zh) 显示控制装置、显示控制方法和记录介质
EP3640804B1 (fr) Procédé de préparation d'enregistrement d'écran pour évaluer l'utilisation d'un logiciel, système informatique, programme informatique et support d'informations lisible par ordinateur mettant en oeuvre ce procédé
JP6887198B2 (ja) ウェアラブル端末表示システム、ウェアラブル端末表示方法およびプログラム
EP2779087A1 (fr) Système et dispositif d'estimation de position du regard, procédés de commande de système et de dispositif d'estimation de position du regard, programme et support de stockage d'informations
WO2018198320A1 (fr) Système d'affichage de terminal portable, procédé et programme d'affichage de terminal portable
WO2019021446A1 (fr) Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme
WO2019003359A1 (fr) Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme
WO2018216221A1 (fr) Système d'affichage de terminal portable, procédé et programme d'affichage de terminal portable
WO2018216220A1 (fr) Système d'affichage de terminal vestimentaire, procédé et programme d'affichage de terminal vestimentaire
JP6762470B2 (ja) ウェアラブル端末表示システム、ウェアラブル端末表示方法およびプログラム
US9911237B1 (en) Image processing techniques for self-captured images
WO2019021447A1 (fr) Système d'affichage de terminal portable, procédé d'affichage de terminal portable et programme
CN114972500A (zh) 查验方法、标注方法、系统、装置、终端、设备及介质

Legal Events

Code Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17916148; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17916148; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)