WO2002057885A2 - Controlling haptic feedback for enhancing navigation in a graphical environment - Google Patents

Controlling haptic feedback for enhancing navigation in a graphical environment

Info

Publication number
WO2002057885A2
WO2002057885A2 (PCT/US2002/001457)
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
haptic effect
interface device
magnitude
graphical
Prior art date
Application number
PCT/US2002/001457
Other languages
English (en)
Other versions
WO2002057885A3 (fr)
Inventor
Louis B. Rosenberg
Mathew Mather
Danny Grant
Christophe Ramstein
Original Assignee
Immersion Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corporation filed Critical Immersion Corporation
Priority to AU2002249960A priority Critical patent/AU2002249960A1/en
Publication of WO2002057885A2 publication Critical patent/WO2002057885A2/fr
Publication of WO2002057885A3 publication Critical patent/WO2002057885A3/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI

Definitions

  • a cursor is often used to select graphical objects or manipulate graphical objects, such as resizing or moving the objects.
  • the user must navigate a cursor through the graphical objects in the environment to perform these tasks and to place the cursor in desired locations to perform other tasks.
  • one aspect of the present invention provides an interface device capable of communicating with a computer running an application program and generating a graphical environment.
  • the interface device includes a user manipulatable object capable of controlling the motion of a cursor displayed in the graphical environment and an actuator for outputting a haptic effect to a user of the interface device.
  • a modulator modulates the magnitude of the haptic effect in relation to a selected function.
  • the function can be selected based on a current navigational task of the user, such as selecting one or more graphical objects, positioning a cursor in relation to objects, or dragging, moving, resizing, or otherwise manipulating graphical objects.
  • FIGURE 1 is a block diagram of a haptic feedback system suitable for use with the present invention
  • the present invention relates to controlling haptic feedback during the navigation of a graphical image in a graphical environment, for example a graphical environment generated by a computer.
  • although the process is illustrated in the context of navigating a cursor or pointer on a computer display, the present invention can be used while navigating other graphical images and should not be limited to the examples provided herein.
  • the host computer 12 can be a personal computer which operates under the MS-DOS or Windows operating systems in conformance with an IBM PC AT standard.
  • host computer 12 can be one of a variety of home video game systems commonly connected to a television set, such as systems available from Nintendo, Sega, Sony, or Microsoft.
  • host computer system 12 can be a "set top box" which can be used, for example, to provide interactive television functions to users, or other devices or appliances providing electronic functions to users.
  • Display device 20 is coupled to host processor 16 by suitable display drivers and can be used to display images generated by host computer system 12 or other computer systems.
  • Display device 20 can be a display screen (LCD, plasma, CRT, etc.), 3-D goggles, projection device, or any other visual interface.
  • the display device 20 displays a graphical user interface and a graphical image, such as a cursor or pointer, for interaction therewith.
  • display device 20 displays images of a simulation or game environment.
  • other images such as images describing a point of view from a first-person perspective or images describing a third-person perspective of objects, backgrounds, etc, can be displayed.
  • host computer 12 may receive sensor data or a sensor signal via a bus 24 in communication with the interface device 14 and other information.
  • Processor 16 can receive data from bus 24 using I/O electronics, and can use I/O electronics to control other peripheral devices.
  • Host computer 12 can also generate and output a signal, or "command", to interface device 14 via bus 24.
  • the signal may be related to a haptic effect to be output by the interface device 14 to the user 22.
  • multiple interface devices 14 can be coupled to a single host computer system 12 through bus 24 (or multiple buses 24) so that multiple users can simultaneously interface with the host application program.
  • multiple players can interact in the host application program with multiple interface devices 14 using networked host computers 12.
  • Suitable microprocessors for use as local processor 26 include the MC68HC711E9 by Motorola and the PIC16C74 by Microchip, for example.
  • Local processor 26 can include one microprocessor chip, or multiple processors and/or co-processor chips. In other embodiments, processor 26 can include a digital signal processor (DSP) chip, or state machines, logic gates, an ASIC, etc.
  • Local memory 27, such as RAM and/or ROM, may be coupled to local processor 26 in interface device 14 to store instructions for processor 26 and store temporary and other data.
  • a local clock 29 may be provided in the interface device 14 and may be coupled to the local processor 26 to provide timing data, similar to system clock 18 of host computer 12.
  • Actuators 30 transmit forces to the user 22 manipulating the interface device 14, where the forces can be transmitted via object 34 and/or through another feature of the interface device, such as the housing, in response to signals received from local processor 26 and/or host computer 12. If forces are output to the user object, the actuators can output forces in one or more directions along one or more degrees of freedom of object 34; an actuator 30 can be provided for each degree of freedom along which forces are desired to be transmitted. In tactile embodiments, where forces are transmitted to the user but not kinesthetically via the user object, the actuator(s) 30 can cause forces in the housing, user object 34, or other contacted surface; such forces can include pulses, vibrations, textures, etc. Some embodiments can include drive transmissions (gears, capstans, belt drives, etc.) to amplify force output. Some embodiments can include an actuator assembly to convert actuator output to a force having a desired direction, magnitude, etc.
  • the user object (or manipulandum) 34 may be a device or article that may be grasped or otherwise physically contacted by a user 22 and which is in communication with interface device 14.
  • the user 22 can manipulate and move the object along provided degrees of freedom to interface with the graphical environment generated by the host application program the user is viewing on display device 20.
  • the user object 34 may be, for example, a joystick, mouse, trackball, stylus, steering wheel, medical instrument (laparoscope, catheter, etc.), pool cue, hand grip, knob, button, or other article.
  • Targets are defined regions in the graphical environment, such as the GUI 200, to which a graphical object, such as a cursor, may be moved by the user.
  • the "target" may be associated with one or more forces or haptic effects; for example, one or more forces or haptic effects may be associated with a graphical object of the GUI 200 graphical environment.
  • targets can be associated with, for example, graphical objects such as icons, pull-down menu items, and buttons.
  • the GUI 200 permits the user to access various functions implemented by an operating system or application program running on computer system 12. These functions typically include, but are not limited to, peripheral input/output functions (such as writing or reading data to disk or another peripheral), selecting and running application programs and other programs that are independent of the operating system, selecting or managing programs and data in memory, viewing/display functions (such as scrolling a document in a window, displaying and/or moving a cursor or icon across the screen, displaying or moving a window, displaying menu titles and selections, etc.), and other functions implemented by computer system 12.
  • the display screen 20 displays GUI 200, which can, for example, be implemented by a Microsoft Windows® operating system, a Macintosh operating system, X-
  • icons 202 may have an attractive force associated with them. This attractive force can originate from a desired point I within each icon 202, which may be located at the center position of the icon, or located at a different area of icon 202, such as near the perimeter of the icon.
  • window 201 may have an attractive force associated with it which originates from a point W within window 201, which may be at the center of the window. Points I and W are considered to be "field origin points."
  • force fields can originate from a point or region not shown on the screen.
  • the inertia force can be affected by the velocity and/or acceleration of cursor 206 in addition to or instead of the simulated mass.
  • an icon's mass can be related to how large in terms of storage space (e.g. in bytes) its associated program or file is.
  • force feedback can directly relate information about a target to the user.
  • damping and/or friction forces can be provided instead of or in addition to the inertia forces.
  • each graphical object can be assigned a simulated damping coefficient or a coefficient of friction (a simple force-field and drag sketch along these lines appears after this list).
  • user intention for cursor motion can be related to the velocity of motion of the cursor and/or user object of the interface device.
  • the haptic clutter associated with passing the cursor over untargeted graphical objects may be reduced by relating characteristics of the haptic effect to the velocity of the cursor. It has been discovered that a user typically causes the cursor to move at a high velocity when a target is not in immediate proximity to the cursor and then slows the movement of the cursor as it approaches nearer to the target.
  • when the cursor is moving at a high velocity, the haptic effects may be lessened in strength or muted, since it is unlikely that the cursor is near an object targeted by the user. As the cursor slows, the haptic effects are strengthened to enhance the user's ability to more accurately and quickly locate the target.
  • the velocity can be determined by the interface device 14 (e.g. local processor 26) or by host computer 12 by dividing the measured motion or displacement of the cursor (or user object) by a specific time interval.
  • the local processor 26 may determine the position of the cursor at fixed intervals of time. The change in displacement during the period between intervals may be divided by the amount of time that lapses during the period, and the value may be stored as the current velocity of the cursor. Also, several such velocity values can be stored over time and averaged to determine the velocity.
  • the change in displacement is directly related to an approximation of the velocity of the cursor during the period, and the change in displacement may be stored as the current velocity of the cursor. In other embodiments, the velocity can be determined in other ways; for example, velocity sensors can directly determine the velocity of the user object 34, or velocity can be determined from acceleration data from an accelerometer measuring motion of the user object 34 (a velocity-estimation sketch appears after this list).
  • the local processor 26 can perform this haptic effect magnitude change in some embodiments or modes, while the host computer can perform this function in other embodiments or modes.
  • By adjusting the strength of a haptic effect in accordance with cursor velocity, the use of beneficial haptic effects can be increased. For example, in a word processing application, a haptic effect associated with every character in a document can be overwhelming for the user, since the cursor can move over many characters in a short space of time. However, by setting a haptic effect strength so that the haptic effects associated with characters are only felt when the cursor is moving slowly over the characters (or are felt at much less magnitude at faster cursor velocities), the user will be given improved feedback on the positioning of the cursor within the document.
  • haptic effects associated with individual cells, or with information within cells may be output at or near full strength only when the cursor is moving at lower velocities, when the user wishes to manipulate data at the detail level of individual cells.
  • more than one velocity function may be used.
  • one function can be used at one range of velocities, while a different function can be used for a different range of velocities.
  • Another embodiment can, for example, cater magnitude strength to different motions in different directions or situations. For example, it has been determined that users may have different vertical cursor movement characteristics than horizontal movement characteristics. A first user may use, for example, a horizontal tool bar displayed at the upper portion of a display screen and accordingly would develop a higher velocity routine in navigating the cursor to targets at the top of the screen. Another user may have a vertical toolbar on the side of a screen and accordingly develop a routine of fast horizontal movements.
  • average user velocities, past user velocities, areas on the screen where the cursor often is positioned, the extent of the screen which the user moves the cursor to, the application program being used, etc. may each or all be useful in determining a function that is best suited to the user and/or to the program or environment being used.
  • different velocity functions may be associated with different haptic effects and/or with different targets.
  • a haptic effect of a vibration is associated with an icon and a haptic effect of a pop or pulse is associated with a character in a word processing application
  • different velocity thresholds may be desirable for modulating the strengths of the respective effects.
  • a user may be able to target an icon at a higher velocity than the user can target an area between two characters. Accordingly, the icon effect velocity threshold may desirably be greater than the character effect velocity threshold (a per-effect threshold sketch appears after this list).
  • the density of graphical objects encountered by the cursor may also be used to adjust haptic effect strength.
  • the cursor may encounter a large number of graphical objects when moving across the screen even if the cursor velocity is low, if there is a high density of graphical objects. For instance, small characters in a word or a large number of small icons on a crowded desktop screen may undesirably cause lots of haptic effects to be output if the user moves the cursor relatively slowly across all these objects.
  • Function 352 indicates that at those amounts of time below the time threshold t1, an exiting haptic effect is output at a first strength, here shown as a gain of "1" (normal strength), and at those times above the time threshold t1, the exiting haptic effect can be output at a second strength, here shown as a gain k1 that is less than "1", for example "0" to turn off the haptic effect entirely, or any fraction of 1 to reduce the strength of the effect.
  • the haptic effect does not interfere with the user moving the cursor away from the target, since time t1 is close enough to the time of engagement so that the haptic effect is turned off or reduced in strength typically before the user can move the cursor away.
  • time-based modulation is also useful for haptic effects that are applied within a target. For example, if a vibration is applied to indicate positioning over a target, modulating the effect strength over time will lessen or remove the haptic effect when it is not needed. This modulation also allows for the application of additional haptic effects within the target without overloading the user with different effects (a dwell-time modulation sketch appears after this list).
  • the time-dependent effects may have multiple time-dependent functions. For example, one target may have a first time threshold and a second target may have a second time threshold; or a single target may have different time thresholds and/or functions, dynamically determined based on past cursor movement by the user or other events or characteristics of the graphical environment. Or, different cursor navigation tasks can use different functions, e.g. for selecting, placement, dragging, etc.
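
The attractive-field and simulated-mass excerpts above do not prescribe a particular force law. The following is a minimal sketch in C of one plausible reading: a spring-like attraction toward a "field origin point" (point I or W), a simulated mass derived from the associated file's storage size, and a drag term for manipulation tasks. All identifiers, constants, and the specific force law are assumptions for illustration only, not details from the patent.

    /* Hypothetical sketch: attraction toward a field origin point and drag scaled
     * by a simulated mass.  Force law, names, and constants are assumptions. */
    #include <math.h>

    typedef struct { float x, y; } Vec2;

    /* Simulated mass grows with the storage size of the icon's file (bytes). */
    static float simulated_mass(long file_size_bytes)
    {
        return 1.0f + (float)file_size_bytes / (1024.0f * 1024.0f);  /* ~1 + size in MB */
    }

    /* Spring-like attraction from the cursor toward the field origin point,
     * capped at the maximum magnitude the actuator can output. */
    static Vec2 attractive_force(Vec2 cursor, Vec2 origin, float stiffness, float max_force)
    {
        Vec2 d = { origin.x - cursor.x, origin.y - cursor.y };
        float dist = sqrtf(d.x * d.x + d.y * d.y);
        if (dist < 1e-6f) { Vec2 zero = { 0.0f, 0.0f }; return zero; }
        float mag = stiffness * dist;
        if (mag > max_force) mag = max_force;
        Vec2 f = { d.x / dist * mag, d.y / dist * mag };
        return f;
    }

    /* Damping/inertia-style resistance while dragging an icon: a force opposing
     * cursor velocity, scaled by the icon's simulated mass and a damping gain. */
    static Vec2 drag_resistance(Vec2 cursor_velocity, long file_size_bytes, float damping)
    {
        float m = simulated_mass(file_size_bytes);
        Vec2 f = { -damping * m * cursor_velocity.x, -damping * m * cursor_velocity.y };
        return f;
    }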
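The velocity estimation described above (divide the measured cursor or user-object displacement by a fixed time interval, then average several values) can be sketched as follows; the window size and identifiers are illustrative assumptions.

    /* Hypothetical sketch of interval-based cursor velocity estimation. */
    #include <math.h>

    #define VEL_WINDOW 4              /* number of samples averaged; illustrative */

    typedef struct {
        float prev_x, prev_y;         /* position at the previous sampling interval */
        float samples[VEL_WINDOW];    /* recent per-interval speed estimates        */
        int   index;
    } VelocityEstimator;

    /* Called once per fixed interval dt (seconds) with the current cursor position. */
    static float update_velocity(VelocityEstimator *v, float x, float y, float dt)
    {
        float dx = x - v->prev_x;
        float dy = y - v->prev_y;
        v->prev_x = x;
        v->prev_y = y;

        v->samples[v->index] = sqrtf(dx * dx + dy * dy) / dt;   /* displacement / time */
        v->index = (v->index + 1) % VEL_WINDOW;

        float sum = 0.0f;
        for (int i = 0; i < VEL_WINDOW; i++)
            sum += v->samples[i];
        return sum / VEL_WINDOW;      /* averaged current cursor speed */
    }

A caller would zero-initialize the struct and invoke update_velocity once per sampling interval; as the excerpts note, either the local processor 26 or the host computer 12 could perform the same computation.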
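Building on such a velocity estimate, the per-effect thresholds discussed above might be realized as a piecewise gain function, with the icon threshold set higher than the character threshold as the excerpt suggests. The specific numbers and the linear fall-off are assumptions, not figures from the patent.

    /* Hypothetical sketch of velocity-dependent effect strength with per-effect thresholds. */

    typedef enum { EFFECT_ICON, EFFECT_CHARACTER } EffectType;

    /* Velocity above which the effect begins to fade, per effect type (units/s). */
    static float fade_threshold(EffectType t)
    {
        return (t == EFFECT_ICON) ? 400.0f : 120.0f;   /* icon threshold > character threshold */
    }

    /* Gain in [0,1] applied to the effect magnitude: full strength below the
     * threshold, then a linear fall-off to zero over one additional threshold width. */
    static float velocity_gain(EffectType t, float cursor_velocity)
    {
        float v0 = fade_threshold(t);
        if (cursor_velocity <= v0) return 1.0f;
        float g = 1.0f - (cursor_velocity - v0) / v0;
        return (g > 0.0f) ? g : 0.0f;
    }

    /* Scale a commanded effect magnitude before sending it to the actuator. */
    static float modulated_magnitude(EffectType t, float base_magnitude, float cursor_velocity)
    {
        return base_magnitude * velocity_gain(t, cursor_velocity);
    }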
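Finally, the dwell-time modulation around function 352 (gain "1" below threshold t1, reduced gain k1 afterwards) can be sketched as a simple step function. The field names are placeholders, and t1 and k1 would be tuned per target or per navigation task as described above.

    /* Hypothetical sketch of time-based (dwell) modulation of an exiting haptic effect. */

    typedef struct {
        float t1;      /* dwell-time threshold, seconds                 */
        float k1;      /* reduced gain used after t1 (0 disables effect) */
        float dwell;   /* time the cursor has spent over the target     */
    } DwellModulator;

    /* Call when the cursor first engages the target. */
    static void dwell_reset(DwellModulator *m) { m->dwell = 0.0f; }

    /* Call every control tick while the cursor remains over the target;
     * returns the gain to apply to the exiting haptic effect. */
    static float dwell_gain(DwellModulator *m, float dt)
    {
        m->dwell += dt;
        return (m->dwell < m->t1) ? 1.0f : m->k1;
    }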

Abstract

The invention relates to a method and device for controlling a haptic effect in order to improve navigation of a cursor or other controlled object displayed in a graphical environment. An interface device communicates with a computer that runs an application program and generates a graphical environment. The device includes an actuator that outputs a haptic effect to the user. A modulator modulates the magnitude of the haptic effect in relation to (depending on the embodiment) the velocity of the cursor or other user-manipulated object, the degree of interaction between the cursor and a graphical object, and/or the duration of interaction between the cursor and a graphical object.
PCT/US2002/001457 2001-01-16 2002-01-16 Commande d'effet de retour haptique pour l'amelioration de la navigation dans un environnement graphique WO2002057885A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002249960A AU2002249960A1 (en) 2001-01-16 2002-01-16 Controlling haptic feedback for enhancing navigation in a graphical environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26228601P 2001-01-16 2001-01-16
US60/262,286 2001-01-16

Publications (2)

Publication Number Publication Date
WO2002057885A2 (fr) 2002-07-25
WO2002057885A3 WO2002057885A3 (fr) 2002-12-05

Family

ID=22996912

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/001457 WO2002057885A2 (fr) 2001-01-16 2002-01-16 Commande d'effet de retour haptique pour l'amelioration de la navigation dans un environnement graphique

Country Status (2)

Country Link
AU (1) AU2002249960A1 (fr)
WO (1) WO2002057885A2 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6750877B2 (en) 1995-12-13 2004-06-15 Immersion Corporation Controlling haptic feedback for enhancing navigation in a graphical environment
ES2325976A1 (es) * 2008-12-18 2009-09-25 Universidad Carlos Iii De Madrid Dispositivo indicador.
EP2329339A1 (fr) * 2008-07-15 2011-06-08 Immersion Corporation Systèmes et procédés de passage d'une fonction de rétroaction haptique entre des modes passif et actif
FR2961610A1 (fr) * 2010-06-18 2011-12-23 Thales Sa Dispositif d'interaction haptique asservi a l'effort
EP2207084A3 (fr) * 2008-12-30 2012-03-07 Samsung Electronics Co., Ltd. Procédé de fourniture d'une interface utilisateur graphique utilisant un pointeur à effet sensoriel, le pointeur étant déplacé par gravité, et appareil électronique correspondant
EP2101246B1 (fr) * 2008-03-10 2018-09-05 LG Electronics Inc. Terminal et procédé pour son contrôle
US10343058B2 (en) 2007-10-09 2019-07-09 Nintendo Co., Ltd. Storage medium storing a load detecting program and load detecting apparatus
WO2021220816A1 (fr) * 2020-04-27 2021-11-04 株式会社Nttドコモ Dispositif de commande, dispositif d'exploitation et système d'exploitation

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5427343B2 (ja) 2007-04-20 2014-02-26 任天堂株式会社 ゲームコントローラ
JP5427346B2 (ja) 2007-10-05 2014-02-26 任天堂株式会社 荷重検出プログラム、荷重検出装置、荷重検出システムおよび荷重検出方法
JP4382844B2 (ja) 2007-10-31 2009-12-16 任天堂株式会社 調整用加重機、および調整用加重方法
JP5161182B2 (ja) 2009-09-28 2013-03-13 任天堂株式会社 情報処理プログラム及び情報処理装置

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6750877B2 (en) 1995-12-13 2004-06-15 Immersion Corporation Controlling haptic feedback for enhancing navigation in a graphical environment
US10343058B2 (en) 2007-10-09 2019-07-09 Nintendo Co., Ltd. Storage medium storing a load detecting program and load detecting apparatus
EP2101246B1 (fr) * 2008-03-10 2018-09-05 LG Electronics Inc. Terminal et procédé pour son contrôle
US9063571B2 (en) 2008-07-15 2015-06-23 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
EP3130983A1 (fr) * 2008-07-15 2017-02-15 Immersion Corporation Systèmes et procédés permettant de commuter une fonction de retour haptique entre des modes passif et actif
US10416775B2 (en) 2008-07-15 2019-09-17 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
US8638301B2 (en) 2008-07-15 2014-01-28 Immersion Corporation Systems and methods for transmitting haptic messages
US8866602B2 (en) 2008-07-15 2014-10-21 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US8976112B2 (en) 2008-07-15 2015-03-10 Immersion Corporation Systems and methods for transmitting haptic messages
CN104571336A (zh) * 2008-07-15 2015-04-29 意美森公司 用于在无源和有源模式之间变换触觉反馈功能的系统和方法
EP3496376A1 (fr) * 2008-07-15 2019-06-12 Immersion Corporation Systèmes et procédés permettant de commuter une fonction de retour haptique entre des modes passif et actif
US9134803B2 (en) 2008-07-15 2015-09-15 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US10248203B2 (en) 2008-07-15 2019-04-02 Immersion Corporation Systems and methods for physics-based tactile messaging
US9612662B2 (en) 2008-07-15 2017-04-04 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
US9785238B2 (en) 2008-07-15 2017-10-10 Immersion Corporation Systems and methods for transmitting haptic messages
EP2329339A1 (fr) * 2008-07-15 2011-06-08 Immersion Corporation Systèmes et procédés de passage d'une fonction de rétroaction haptique entre des modes passif et actif
US10198078B2 (en) 2008-07-15 2019-02-05 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US10203756B2 (en) 2008-07-15 2019-02-12 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
ES2325976A1 (es) * 2008-12-18 2009-09-25 Universidad Carlos Iii De Madrid Dispositivo indicador.
EP2207084A3 (fr) * 2008-12-30 2012-03-07 Samsung Electronics Co., Ltd. Procédé de fourniture d'une interface utilisateur graphique utilisant un pointeur à effet sensoriel, le pointeur étant déplacé par gravité, et appareil électronique correspondant
US8516369B2 (en) 2008-12-30 2013-08-20 Samsung Electronics Co., Ltd. Method for providing GUI using pointer with sensuous effect that pointer is moved by gravity and electronic apparatus thereof
FR2961610A1 (fr) * 2010-06-18 2011-12-23 Thales Sa Dispositif d'interaction haptique asservi a l'effort
WO2021220816A1 (fr) * 2020-04-27 2021-11-04 株式会社Nttドコモ Dispositif de commande, dispositif d'exploitation et système d'exploitation

Also Published As

Publication number Publication date
AU2002249960A1 (en) 2002-07-30
WO2002057885A3 (fr) 2002-12-05

Similar Documents

Publication Publication Date Title
US6750877B2 (en) Controlling haptic feedback for enhancing navigation in a graphical environment
EP0864144B1 (fr) Procede et appareil permettant d'obtenir un retour de force pour une interface graphique homme-machine
US6061004A (en) Providing force feedback using an interface device including an indexing function
CA2272627C (fr) Interface de retour de force a fonctionnalites isotonique et isometrique
US7131073B2 (en) Force feedback applications based on cursor engagement with graphical targets
US7843424B2 (en) Method and apparatus for designing force sensations in force feedback computer applications
US6697086B2 (en) Designing force sensations for force feedback computer applications
US7701438B2 (en) Design of force sensations for haptic feedback computer interfaces
WO2002057885A2 (fr) Commande d'effet de retour haptique pour l'amelioration de la navigation dans un environnement graphique

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP