KR101890836B1 - Multi-axis interface for a touch-screen enabled wearable device - Google Patents
- Publication number
- KR101890836B1 KR1020147029395A
- Authority
- KR
- South Korea
- Prior art keywords
- application
- touch screen
- displayed
- user interface
- screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G21/00—Input or output devices integrated in time-pieces
- G04G21/08—Touch switches specially adapted for time-pieces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A touch-screen-enabled wearable computer includes a multi-axis user interface provided by at least one software component running on a processor. The multi-axis user interface comprises at least two user interface areas, displayed on the touch screen one at a time, each displaying a series of one or more application screens; a vertical navigation axis; and a horizontal navigation axis. The vertical navigation axis allows a user to navigate between the user interface areas in response to vertical swipe gestures made on the touch screen, while the horizontal navigation axis allows the user to navigate the application screens of the currently displayed user interface area in response to horizontal swipe gestures across the touch screen.
Description
This application claims priority from U.S. Serial No. 13/425,355, filed March 20, 2012, which application is incorporated herein by reference.
Electronic data and communication devices are constantly becoming smaller even as their information processing capacity continues to increase. Currently, portable communication devices primarily use touch-screen-based user interfaces that allow these devices to be controlled by user finger gestures. Many such user interfaces are optimized for pocket-sized devices, such as mobile phones, which typically have screens of 3" or 4" diagonal or larger. Due to their relatively large form factors, one or more mechanical buttons are also typically provided to support operation of these devices.
For example, the user interface of the touch-screen iPhone™ is based on the concept of a home screen that displays an array of available application icons. Depending on the number of applications loaded on the iPhone, the home screen may contain multiple pages of icons, with the first page being the main home screen. The user can scroll from one home screen page to another by swiping a finger horizontally across the touch screen. A tap on one of the icons opens the corresponding application. The main home screen can be accessed from any open application or other home screen page by pressing a hardware button located below the touch screen, sometimes referred to as the home button. To quickly switch between applications, a user can double-click the home button to display a row of recently used applications, scroll through them with horizontal swipes, and reopen any selected application with a finger tap. Because of this use of horizontal swipes, the user interface of the iPhone can be described as having horizontal-based navigation. While touch-based user interfaces such as the iPhone's can provide many benefits, they require a mix of button presses, swipes, and taps to navigate applications and to enter and exit functions. This requires the user to visually focus on the device and to visually target the functions necessary to operate it.
As rapid advances in miniaturization occur, much smaller form factors are becoming available that make these devices wearable. The user interface for a much smaller, wearable touch-screen device, with a screen size of less than 2.5" diagonally, must provide a significantly different and intuitive way to operate such a small device in order to be easy to use.
Accordingly, it is desirable to provide an enhanced touch-screen-based user interface that is optimized for very small wearable electronic devices, one that reduces the user's need for visual focus during operation and allows the user to access and manipulate data and graphical objects in a manner that does not require the use of mechanical buttons.
The exemplary embodiments provide methods and systems for providing a multi-axis user interface for a touch-screen-enabled wearable computer. Aspects of the exemplary embodiments include: providing at least two user interface areas that are displayed on the touch screen one at a time, each area displaying a series of one or more application screens; and providing a vertical navigation axis and a horizontal navigation axis, wherein the vertical navigation axis allows the user to navigate between the user interface areas in response to vertical swipe gestures made on the touch screen, and the horizontal navigation axis allows the user to navigate the application screens of the currently displayed user interface area in response to horizontal swipe gestures across the touch screen.
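The two-axis model described above can be sketched as a small data structure: a list of user interface areas (the vertical axis), each holding its own series of application screens (the horizontal axis), with exactly one screen visible at a time. This is an illustrative sketch only, not the patented implementation; all class and method names are hypothetical.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of the multi-axis model: vertical swipes move
// between areas, horizontal swipes move between screens of the
// currently displayed area. Only one screen is visible at a time.
class MultiAxisUi {
    private final List<List<String>> areas; // areas.get(i) = screens of area i
    private int areaIdx = 0;                // vertical position
    private int screenIdx = 0;              // horizontal position in current area

    MultiAxisUi(List<List<String>> areas) { this.areas = areas; }

    // Vertical navigation: +1 = swipe up (next area), -1 = swipe down.
    void swipeVertical(int dir) {
        int next = areaIdx + dir;
        if (next >= 0 && next < areas.size()) { // axis terminates at each end
            areaIdx = next;
            screenIdx = 0;                      // each area starts at its first screen
        }
    }

    // Horizontal navigation within the current area's series of screens.
    void swipeHorizontal(int dir) {
        int next = screenIdx + dir;
        if (next >= 0 && next < areas.get(areaIdx).size()) {
            screenIdx = next;
        }
    }

    String visibleScreen() { return areas.get(areaIdx).get(screenIdx); }
}
```

For instance, starting on a watch face, one up swipe reaches the launcher area, and horizontal swipes then move among that area's screens.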
According to the methods and systems disclosed herein, the use of multi-axis navigation rather than single-axis navigation allows a user to invoke the necessary functions using a few vertical and horizontal finger swipes (gross gestures) rather than precisely targeted finger taps, and with minimal visual focus on the wearable computer.
FIG. 1 is a block diagram illustrating embodiments of a wearable computer.
FIG. 2 is a high-level block diagram illustrating computer components comprising a wearable computer according to one embodiment.
FIGS. 3A, 3B, and 3C illustrate one embodiment of a multi-axis user interface for a wearable device.
FIG. 4 is a flow diagram illustrating in more detail a process for providing a multi-axis user interface for a wearable computer.
FIG. 5 is a diagram illustrating an embodiment in which the start page application includes a watch face.
FIG. 6 is a diagram illustrating vertical transitions from a start page application on a top-level area to an application launcher screen on a middle-level area in response to a vertical swipe gesture.
FIG. 7 is a diagram illustrating horizontal scrolling of various application icons from an application launcher.
FIG. 8 is a diagram showing a vertical transition from the application launcher screen on the middle-level area to an application screen on the bottom-level area.
FIG. 9 is a diagram showing an exemplary application screen of a weather application.
FIG. 10 is a diagram illustrating a vertical transition from an exemplary weather application screen back to a start page application in response to a universal gesture such as a two-finger swipe.
Embodiments relate to multi-axis user interfaces for wearable computers. The following description is presented to enable one of ordinary skill in the art to make and use the invention, and is provided in the context of a patent application and its requirements. Various modifications to the embodiments and the general principles and features described herein will be readily apparent. The embodiments are described primarily in terms of particular methods and systems provided in particular implementations. However, the methods and systems will operate effectively in other implementations. Phrases such as "an embodiment," "one embodiment," and "another embodiment" may refer to the same or different embodiments. The embodiments will be described with respect to systems and/or devices having certain components. However, the systems and/or devices may include more or fewer components than those shown, and variations in the arrangement and type of the components may be made without departing from the scope of the invention. The embodiments will also be described in the context of particular methods having certain steps. However, the method and system operate effectively for other methods having different and/or additional steps, and steps in different orders, that are not inconsistent with the embodiments. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
Embodiments provide methods and systems for displaying a multi-axis user interface for a touch screen enabled wearable computer. The user interface includes two or more user interface areas, wherein only one of the user interface areas is displayed on the touch screen at any given time, and a combination of vertical and horizontal navigation axes. In one embodiment, the vertical navigation axis may allow a user to navigate between the user interface areas in response to vertical swipe gestures on the touch screen. The horizontal navigation axis may allow the user to navigate between one or more application screens of each of the user interface areas using horizontal swipe gestures.
The combination of vertical and horizontal navigation axes simplifies the user interface, allows the user to quickly access a required application or function, and eliminates the need for a hardware button for navigation. Thus, by using a series of finger swipes, the user can invoke a desired function with a minimal need to look at the wearable computer.
FIG. 1 is a block diagram illustrating embodiments of a wearable computer. According to these embodiments, the
In one embodiment, the
In one embodiment, the touch screen has a dimension less than 2.5 inches diagonally, and in some embodiments may be about 1.5 inches diagonally. For example, in one embodiment, the
In a further embodiment, the user interface may be provided with an auto configuration setting. When the
FIG. 2 is a high-level block diagram illustrating computer components comprising a wearable computer in accordance with one embodiment. In addition to the
The processor 202 may be configured to simultaneously execute a plurality of software components to control various processes of the
The memories 204 may include random access memory (RAM) and non-volatile memory (not shown). The RAM may be used as a main memory for a microprocessor that supports execution of software routines and other optional storage functions. The non-volatile memory may store instructions and data without power and may store software routines that control the
I/O
The power manager 208 may communicate with the processor 202 and may adjust power management for the
Communication interface 210 may include components that support unidirectional or two-way wireless communication. In one embodiment, communication interface 210 is for receiving data that is primarily displayed remotely on
The sensors 212 may include various sensors including a global positioning system (GPS) chip and an accelerometer (not shown). The accelerometer can be used to measure position, motion, tilt, shock, and vibration for use by the processors 202. The
The software components executed by the processor 202 may include a gesture interpreter 214, an application launcher 216, a number of software applications 218, and an operating system 220. The operating system 220 is preferably a multitasking operating system that manages computer hardware resources and provides common services for the applications 218. In one embodiment, the operating system 220 may comprise a Linux-based operating system for mobile devices, such as Android™. In one embodiment, the applications 218 may be written in Java and downloaded to the
The application launcher 216 may be invoked by the operating system 220 upon device startup and/or wake from a sleep mode. The application launcher 216 runs continuously during the awake mode and serves to launch the other applications 218. In one embodiment, the default application displayed by the application launcher is a
The user operates the
Although the operating system 220, the gesture interpreter 214, and the application launcher 216 are shown as separate components, each function may be integrated into fewer or greater numbers of modules / components.
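The division of labor among these components (the operating system detects raw gesture events on the touch screen and forwards them to the application launcher, which asks the gesture interpreter to classify the event before updating the user interface) can be sketched as below. This is an illustrative sketch under stated assumptions: the class names, the axis-dominance classification rule, and the action strings are hypothetical, not the actual Android components.

```java
// Hypothetical sketch of the dispatch flow described in the text.
// Classification here uses the dominant axis of finger travel; a short
// travel in both axes is treated as a tap.
class GestureInterpreter {
    enum Type { SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT, TAP }

    // dx/dy are finger travel in pixels; screen y grows downward, so an
    // upward swipe has dy < 0.
    Type interpret(float dx, float dy, float tapThreshold) {
        if (Math.abs(dx) < tapThreshold && Math.abs(dy) < tapThreshold) return Type.TAP;
        if (Math.abs(dy) >= Math.abs(dx)) return dy < 0 ? Type.SWIPE_UP : Type.SWIPE_DOWN;
        return dx < 0 ? Type.SWIPE_LEFT : Type.SWIPE_RIGHT;
    }
}

class AppLauncher {
    private final GestureInterpreter interpreter = new GestureInterpreter();
    String lastAction = "none";

    // Called by the operating system when a gesture event occurs.
    void onGestureEvent(float dx, float dy) {
        GestureInterpreter.Type t = interpreter.interpret(dx, dy, 10f);
        switch (t) {
            case SWIPE_UP:    lastAction = "next-area";   break; // vertical axis
            case SWIPE_DOWN:  lastAction = "prev-area";   break;
            case SWIPE_LEFT:  lastAction = "next-screen"; break; // horizontal axis
            case SWIPE_RIGHT: lastAction = "prev-screen"; break;
            case TAP:         lastAction = "open";        break;
        }
    }
}
```

The point of the sketch is the routing, not the thresholds: vertical gestures drive area-to-area navigation while horizontal gestures drive screen-to-screen navigation within the displayed area.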
According to one embodiment, application launcher 216 is configured to display a multi-axis user interface including a plurality of user interface areas in combination of both vertical and horizontal navigation axes. The user can navigate between the user interface areas using simple finger gestures done in accordance with the orientation of the vertical and horizontal navigation axes to reduce the amount of visual focus the user requires to operate the
FIGS. 3A, 3B, and 3C illustrate one embodiment of a multi-axis user interface for a touch screen enabled
The application launcher 216 is configured to provide a combination of
In contrast, the
In one embodiment, during vertical navigation between the user interface areas 300, when the user reaches the high-level area 300a or the low-level area 300c, the user interface may indicate that a vertical swipe in the opposite direction is required to return to the previous level.
In one embodiment, the user interface areas 300a, 300b, and 300c can be likened to regions of an electronic map. The user places his finger on the screen,
On a typical electronic map, the user navigates by "dragging" the map in a given direction, for example by moving a finger upward to drag the map up with a smooth scrolling motion, revealing previously hidden portions of the map. In the present embodiments, the user does not "drag" the user interface areas to reveal the next area, because doing so could require the user to watch the touch screen to guide the next area onto the screen. Instead, the user navigates between areas with simple vertical swipes, e.g., up swipes, that cause discrete transitions between user interface areas 300a, 300b, and 300c. That is, the immediately adjacent area "snaps" into place and replaces the previously displayed area.
FIG. 3A shows an embodiment in which the high-level region 300a includes a
FIG. 3B shows that the middle-level region 300b may include an
FIG. 3C illustrates that the lower-level region 300c may include a series of one or more application screens 308 for an open application. Each application displayed by the application launcher 216 may have its own set of application screens 308. The series of application screens 308 may be displayed in response to detecting horizontal swipe gestures by the user that move the currently displayed application screen left or right to reveal the previous or next application screen.
In the embodiment shown in FIGS. 3A, 3B, and 3C, rather than implementing the series of user interface areas and application screens as a circular queue, the series may be implemented as a linked list of screens or panels that terminates at each end, so that the user cannot scroll past the first or last panel. In this embodiment, if the user attempts to flip past the first or last panel with a swipe gesture (where there is no further panel to flip to), the currently displayed panel may begin to move as the user's finger moves, but returns to its original position when the user's finger is lifted from the touch screen. In one embodiment, the flip or return animation includes simulated deceleration. For example, as a panel approaches its final stopping point, it decelerates to a stop rather than stopping suddenly.
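The two layouts contrasted above, a circular queue of panels that wraps around versus a linked list that terminates at each end (where a flip past the last panel fails and the panel falls back), can be sketched as follows. The class and method names are illustrative.

```java
// Sketch of panel navigation in the two modes described in the text.
class PanelStrip {
    private final int count;        // number of panels in the series
    private final boolean circular; // true = circular queue, false = terminated list
    private int idx = 0;

    PanelStrip(int count, boolean circular) {
        this.count = count;
        this.circular = circular;
    }

    // Returns true if the flip succeeded; false means there was no panel
    // to flip to, so the panel should animate back to its original position.
    boolean flip(int dir) {
        int next = idx + dir;
        if (circular) {
            idx = ((next % count) + count) % count; // wrap around either end
            return true;
        }
        if (next < 0 || next >= count) return false; // terminated end: bounce back
        idx = next;
        return true;
    }

    int current() { return idx; }
}
```

The terminated variant matches the FIGS. 3A-3C embodiment; the circular variant matches the alternative circular-queue claims.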
In this embodiment, the user may switch from one application to another by first using an up swipe to return to the
Further, in another embodiment, the multi-axis user interface may be implemented with two user interface areas instead of three user interface areas. In this embodiment, the start page application may be implemented as part of the
FIG. 4 is a flow diagram illustrating in greater detail a process for providing a multi-axis user interface for a wearable computer. In one embodiment, the process is performed by at least one user interface component running on the processor 202, including any combination of the gesture interpreter 214, application launcher 216, and operating system 220.
When the
FIG. 5 is a diagram illustrating an embodiment in which the
Referring back to FIG. 4, in response to detecting a vertical swipe gesture in a first direction (e.g., the up direction) on the touch screen while the start page application is being displayed, the user interface is transitioned along the
FIG. 6 is a diagram illustrating vertical transitions from the
Referring back to FIG. 4, in response to detecting a horizontal swipe gesture across the touch screen while the application launcher screen is being displayed, application icons are scrolled horizontally across the touch screen for user selection (block 404).
FIG. 7 is a diagram illustrating horizontal scrolling of a
Referring back to FIG. 4, in response to detecting a vertical swipe gesture in a second direction (e.g., the down direction) while the
In response to detecting at least one of a tap on the touch screen or a vertical swipe gesture in the first direction while the application launcher screen is being displayed, the selected application is opened and the user interface is transitioned from the middle level area to the lower level area (block 408).
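The sequence of vertical transitions described for the FIG. 4 process can be summarized as a small state machine over the three levels. This is an illustrative sketch; the state and event names are hypothetical, not taken from the patent.

```java
// Illustrative state machine for the FIG. 4 flow: an up swipe from the
// start page reaches the launcher; a tap (or up swipe) on the launcher
// opens an application; down swipes walk back up the levels.
class NavStateMachine {
    enum State { START_PAGE, LAUNCHER, APP }
    enum Event { SWIPE_UP, SWIPE_DOWN, TAP }

    State state = State.START_PAGE;

    void handle(Event e) {
        switch (state) {
            case START_PAGE:
                if (e == Event.SWIPE_UP) state = State.LAUNCHER;
                break;
            case LAUNCHER:
                if (e == Event.SWIPE_UP || e == Event.TAP) state = State.APP;
                else if (e == Event.SWIPE_DOWN) state = State.START_PAGE;
                break;
            case APP:
                if (e == Event.SWIPE_DOWN) state = State.LAUNCHER;
                break;
        }
    }
}
```

Starting from the watch face, one up swipe reaches the launcher, a tap opens the selected application, and down swipes walk back toward the start page.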
FIG. 8 is a diagram illustrating vertical transitions from the
FIG. 9 is a diagram showing an
Referring again to FIG. 4, in response to detecting a vertical swipe gesture in a second direction (e.g., down) on the touch screen while the
In an alternative embodiment, in response to detecting a universal gesture on either the application launcher screen or the application screen of an open application, the home screen is displayed again. A universal gesture is a gesture mapped to the same function no matter what level or area of the user interface is displayed. One example of such a universal gesture is a two-finger vertical swipe. When detected by the application launcher or an application, the application launcher causes a redisplay of the start page application, e.g., the watch face.
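The universal gesture just described can be sketched as a filter that runs before any level-specific gesture handling: a two-finger vertical swipe maps to the same "return to start page" action at every level, and anything else falls through. The names are illustrative.

```java
// Sketch of the universal-gesture filter described in the text.
class UniversalGestureFilter {
    // Returns the universal action name, or null if the event is not
    // universal and should be handled by the current level's own logic.
    static String handle(int fingerCount, boolean verticalSwipe) {
        if (fingerCount == 2 && verticalSwipe) {
            return "show-start-page"; // same result at every level
        }
        return null;
    }
}
```

Routing the two-finger case before normal dispatch is what makes the gesture "universal": no level of the interface can shadow it.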
FIG. 10 illustrates a vertical transition from an exemplary
Referring again to FIGS. 3A-3C, the vertical scrolling between the screens of the user interface areas 300a-300c and the interaction between the
In one embodiment, when the gesture interpreter 214 (or equivalent code) detects that the user's finger has begun to slide vertically or horizontally, the application launcher 216 causes the screen to move up/down or left/right with the finger movements in a spring-loaded fashion. If the gesture interpreter determines that the finger has moved some minimum distance, e.g., 1 cm, before being lifted from the touch screen, the application launcher 216 may immediately display a quick animation of the screen flipping in the same direction, e.g., to the left or right. In one embodiment, the flipping animation may be implemented using a Hyperspace animation technique such as that shown in the Android "APIDemos". If the user has not moved the minimum distance before the finger is lifted, the gesture interpreter determines that the user has not attempted a "flick". In this case, the screen appears to fall back to its original position. A transition animation may be aesthetically pleasing, but a discrete transition may consume less battery power.
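The flick decision above reduces to a single check at finger lift-off: commit the flip only if the finger traveled at least the minimum distance, otherwise let the screen fall back. The sketch below assumes a hypothetical pixel threshold standing in for the ~1 cm distance; all names are illustrative.

```java
// Sketch of the minimum-travel flick decision described in the text.
class FlickDetector {
    private final float minTravelPx; // minimum finger travel, ~1 cm in pixels

    FlickDetector(float minTravelPx) { this.minTravelPx = minTravelPx; }

    // Called on finger lift. A sufficient travel commits a flip in the
    // direction of finger movement; otherwise the screen falls back to
    // its original position.
    String onFingerUp(float downX, float upX) {
        float travel = upX - downX;
        if (Math.abs(travel) < minTravelPx) return "fall-back";
        return travel < 0 ? "flip-left" : "flip-right";
    }
}
```

A distance threshold rather than a velocity threshold keeps the decision cheap, which matters on a battery-constrained wearable.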
According to another aspect of the exemplary embodiments, the edge portion of the
FIG. 11 is a block diagram illustrating fast-scroll portions on the
In another embodiment, when a user's finger remains on the acceleration scrolling zones, a
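The zone layout described above, a central swipe zone flanked by two narrow acceleration-scrolling zones along the side edges, with a tap-and-hold in an edge zone advancing rapidly through the screens, can be sketched as follows. The zone width and all names are assumptions for illustration.

```java
// Sketch of the swipe zone / acceleration-scrolling zone split
// described in the text.
class ScrollZones {
    private final int screenWidthPx;
    private final int edgeZonePx; // width of each acceleration zone

    ScrollZones(int screenWidthPx, int edgeZonePx) {
        this.screenWidthPx = screenWidthPx;
        this.edgeZonePx = edgeZonePx;
    }

    String zoneAt(int x) {
        if (x < edgeZonePx) return "accel-left";
        if (x >= screenWidthPx - edgeZonePx) return "accel-right";
        return "swipe";
    }

    // While the finger is held in an edge zone, advance one screen per
    // tick in that zone's direction; a hold in the swipe zone does nothing.
    int advance(int screenIdx, int screenCount, int x) {
        switch (zoneAt(x)) {
            case "accel-left":  return Math.max(0, screenIdx - 1);
            case "accel-right": return Math.min(screenCount - 1, screenIdx + 1);
            default:            return screenIdx;
        }
    }
}
```

Calling advance() on a timer while the finger stays in an edge zone yields the accelerated advancement through the series of application screens that the claims describe.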
A method and system for providing a multi-axis user interface for a wearable computer has been disclosed. The present invention has been described in accordance with the embodiments shown, and there may be variations to these embodiments; any such variations would be within the spirit and scope of the present invention. For example, in an alternative embodiment, the functions of the vertical and horizontal axes of the wearable computer may be interchanged, such that the horizontal axis is used to navigate between user interface areas in response to horizontal swipes, while the vertical navigation axis is used to navigate between application screens using vertical swipes. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims. Software written in accordance with the present invention may be stored in some form of computer-readable medium, such as a memory or hard disk, and executed by a processor.
Claims (19)
A touch screen having a size less than 2.5 inches diagonally; And
Comprising at least one software component executing on a processor configured to display a multi-axis user interface,
The multi-axis user interface comprises:
A plurality of user interface areas displayed on the touch screen, including: an upper level area displaying a first series of one or more application screens; an intermediate level area connectable to the upper level area by a swipe gesture detectable by the touch screen, the intermediate level area displaying a second series of one or more application screens; and a lower level area connectable to the intermediate level area by a selection gesture detectable by the touch screen, the lower level area displaying a third series of one or more application screens; and
A vertical navigation axis and a horizontal navigation axis, wherein the vertical navigation axis allows a user to navigate between the plurality of user interface areas in response to vertical swipe gestures made on the touch screen, and wherein the horizontal navigation axis allows the user to navigate the application screens of the currently displayed user interface area in response to horizontal swipe gestures across the touch screen,
Only one of the upper level area, the middle level area and the lower level area is displayed on the touch screen at a given time, and one application icon displayed on the only displayed level area includes an entire page of one of the application screens and fills the entire display of the touch screen,
The touch screen is divided into a swipe zone and two acceleration scrolling zones along first and second side edges of the touch screen,
Wherein the processor is further configured to, while a predetermined application screen of the series of one or more application screens is displayed,
Outputting a next application screen of the series of one or more application screens for display by the touch screen in response to detecting a gesture at a location within the swipe zone of the touch screen by the touch screen,
Responsive to detecting a tap-and-hold gesture within one of the two acceleration scrolling zones by the touch screen, outputting, by the at least one software component, an accelerated advancement through at least a portion of the series of one or more application screens for display by the touch screen.
In response to detecting a single vertical swipe gesture on the currently displayed user interface area, an immediately adjacent user interface area is displayed.
During vertical navigation between the user interface areas, when the user reaches the high level area or the low level area, the user interface indicates to the user that a vertical swipe in the opposite direction is required to return to the previous level.
Wherein successive scrolling through the user interface areas returns to a user interface area originally displayed, thereby creating a circular queue of user interface areas.
Wherein the user interface areas are implemented as a linked list of panels terminating at each end, and wherein scrolling past the first or last panel is not allowed.
Wherein in response to detecting a single horizontal swipe gesture on a currently displayed application screen of a particular user interface area, an immediately adjacent application screen of the user interface area is displayed.
Wherein the sequential scrolling through the application screens returns to the first displayed application screen, thereby creating a circular queue of application screens.
Wherein the application screens are implemented as a linked list of panels terminating at each end, and scrolling past the first or last panel is not allowed.
Wherein the middle level area includes an application launcher screen that displays a series of one or more application icons in response to the horizontal swipe gestures, so that the user can scroll through the application icons and select an application to open.
Wherein the lower level area comprises a series of one or more application screens for an opened application.
Wherein the upper level area includes a start page application that, in response to the horizontal swipe gestures, displays a series of one or more watch face screens so that the user can scroll through the watch face screens, and wherein a watch face screen can be selected as the default watch face, thereby changing the displayed watch face.
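The three vertical level areas described above — watch faces at the upper level, the application launcher at the middle level, and the opened application's screens at the lower level — can be modeled roughly as below. All names are illustrative assumptions; the only property taken from the claims is that a vertical swipe moves between levels and a single level's content is visible at a time.

```python
LEVELS = ["upper", "middle", "lower"]

class MultiAxisUI:
    """Sketch of the three-level vertical axis; names are hypothetical."""

    def __init__(self, watch_faces, app_icons):
        self.level = 0                    # start on the upper level (start page)
        self.watch_faces = watch_faces    # upper level content
        self.app_icons = app_icons        # middle level: application launcher
        self.open_app_screens = []        # lower level: screens of an opened app

    def swipe_down(self):
        """Vertical swipe in the first direction: move one level lower."""
        self.level = min(self.level + 1, len(LEVELS) - 1)

    def swipe_up(self):
        """Vertical swipe in the second direction: move one level higher."""
        self.level = max(self.level - 1, 0)

    def visible(self):
        """Return the content of the single level currently displayed."""
        return {
            "upper": self.watch_faces,
            "middle": self.app_icons,
            "lower": self.open_app_screens,
        }[LEVELS[self.level]]
```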
Further comprising an operating system and a gesture interpreter,
wherein the operating system detects gesture events occurring on the touch screen and sends the gesture events to an application launcher, and
wherein the application launcher requests the gesture interpreter to determine a gesture type and changes the user interface based on the gesture type.
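The dispatch path in the preceding claim — the operating system detects gesture events, forwards them to the application launcher, and the launcher asks a gesture interpreter for the gesture type before changing the user interface — might be sketched like this. The classification thresholds, event shape, and all names are invented for illustration; the patent does not specify them.

```python
class GestureInterpreter:
    """Classifies a raw gesture event into a gesture type (hypothetical logic)."""

    def classify(self, event):
        dx, dy = event["dx"], event["dy"]
        if abs(dx) > abs(dy):
            return "horizontal_swipe"
        if abs(dy) > 0:
            return "vertical_swipe"
        return "tap"

class ApplicationLauncher:
    """Receives gesture events from the OS and updates the UI by gesture type."""

    def __init__(self, interpreter):
        self.interpreter = interpreter
        self.last_action = None

    def on_gesture_event(self, event):
        gesture = self.interpreter.classify(event)
        # The launcher changes the user interface based on the gesture type.
        self.last_action = {
            "horizontal_swipe": "scroll_icons",
            "vertical_swipe": "change_level",
            "tap": "open_app",
        }[gesture]
        return self.last_action

launcher = ApplicationLauncher(GestureInterpreter())
```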
Displaying an upper level area including a start page application on a touch screen less than 2.5 inches diagonally, the touch screen being divided into a swipe zone and two accelerated scrolling zones along first and second side edges of the touch screen;
In response to detecting a vertical swipe gesture in a first direction on the touch screen while the start page application is being displayed, transitioning the user interface along a vertical axis from the upper level area to the middle level area to display an application launcher screen;
Scrolling application icons horizontally across the touch screen for user selection in response to detecting a horizontal swipe gesture across the touch screen while the application launcher screen is being displayed;
In response to detecting at least one of a tap on the touch screen or a vertical swipe gesture in the first direction while the application launcher screen is being displayed, transitioning the user interface along the vertical axis from the middle level area to the lower level area to display a series of one or more application screens, wherein only one of the upper level area, the middle level area, and the lower level area is displayed on the touch screen at a given time, and wherein one application screen displayed on the touch screen comprises an entire page and fills the entire display of the touch screen;
In response to the touch screen detecting a gesture at a location within the swipe zone of the touch screen while a predetermined application screen of the application screens is being displayed, outputting, by the at least one processor, the next of the application screens; and
In response to the touch screen detecting a tap-and-hold gesture within one of the two accelerated scrolling zones while the predetermined one of the application screens is being displayed, outputting, by the at least one processor, an accelerated advancement through at least a portion of the application screens.
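The zone layout used in the steps above — two accelerated scrolling zones along the side edges, with the swipe zone between them — can be illustrated with a simple hit-test. The edge width, hold-time step, and function names are assumptions for illustration, not values taken from the claims.

```python
def classify_zone(x, screen_width, edge_width=24):
    """Map a touch x-coordinate to one of the three zones (assumed edge width)."""
    if x < edge_width:
        return "accel_left"
    if x >= screen_width - edge_width:
        return "accel_right"
    return "swipe"

def advance(index, total, zone, held_ms=0):
    """One screen per swipe; tap-and-hold in an edge zone advances faster."""
    if zone == "swipe":
        return (index + 1) % total
    step = 1 + held_ms // 250          # accelerate the longer the hold lasts
    direction = 1 if zone == "accel_right" else -1
    return (index + direction * step) % total
```

A gesture in the central swipe zone advances exactly one application screen, while a tap-and-hold in either edge zone produces the claim's "accelerated advancement" through the series of screens.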
Further comprising, in response to detecting a vertical swipe gesture in a second direction on the touch screen while the application launcher screen is being displayed, transitioning the user interface from the middle level area back to the upper level area to display the start page application again.
Further comprising, in response to detecting a vertical swipe gesture in the second direction on the touch screen while an application screen is being displayed, transitioning the user interface back to the application launcher screen.
Configuring the start page application with a series of one or more watch faces; and
Further comprising scrolling the series of one or more watch faces horizontally across the touch screen for user selection in response to detecting a horizontal swipe across the currently displayed watch face.
Wherein the program instructions comprise:
Instructions for displaying an upper level area including a start page application on a touch screen less than 2.5 inches diagonally, the touch screen being divided into a swipe zone and two accelerated scrolling zones along first and second side edges of the touch screen;
Instructions for, in response to detecting a vertical swipe gesture in a first direction on the touch screen while the start page application is being displayed, transitioning the user interface along a vertical axis from the upper level area to the middle level area to display an application launcher screen;
Instructions for scrolling application icons horizontally across the touch screen for user selection in response to detecting a horizontal swipe gesture across the touch screen while the application launcher screen is being displayed;
Instructions for, in response to detecting at least one of a tap on the touch screen or a vertical swipe gesture in the first direction while the application launcher screen is being displayed, transitioning the user interface along the vertical axis from the middle level area to the lower level area to display a series of one or more application screens, wherein only one of the upper level area, the middle level area, and the lower level area is displayed on the touch screen at a given time, and wherein one application screen displayed on the touch screen comprises an entire page and fills the entire display of the touch screen;
Instructions for outputting, by the one or more processors, the next of the application screens in response to the touch screen detecting a gesture at a location within the swipe zone of the touch screen while a predetermined application screen of the application screens is being displayed; and
Instructions for outputting, by the one or more processors, an accelerated advancement through at least a portion of the application screens in response to the touch screen detecting a tap-and-hold gesture within one of the two accelerated scrolling zones while the predetermined one of the application screens is being displayed.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/425,355 US20130254705A1 (en) | 2012-03-20 | 2012-03-20 | Multi-axis user interface for a touch-screen enabled wearable device |
US13/425,355 | 2012-03-20 | ||
PCT/US2013/029269 WO2013142049A1 (en) | 2012-03-20 | 2013-03-06 | Multi-axis interface for a touch-screen enabled wearable device |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20150067086A KR20150067086A (en) | 2015-06-17 |
KR101890836B1 true KR101890836B1 (en) | 2018-08-22 |
Family
ID=48014287
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020147029395A KR101890836B1 (en) | 2012-03-20 | 2013-03-06 | Multi-axis interface for a touch-screen enabled wearable device |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130254705A1 (en) |
EP (1) | EP2828732A1 (en) |
KR (1) | KR101890836B1 (en) |
CN (1) | CN104737114B (en) |
WO (1) | WO2013142049A1 (en) |
Families Citing this family (112)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US9124712B2 (en) * | 2012-06-05 | 2015-09-01 | Apple Inc. | Options presented on a device other than accept and decline for an incoming call |
US9507486B1 (en) * | 2012-08-23 | 2016-11-29 | Allscripts Software, Llc | Context switching system and method |
US8954878B2 (en) * | 2012-09-04 | 2015-02-10 | Google Inc. | Information navigation on electronic devices |
US9898184B2 (en) * | 2012-09-14 | 2018-02-20 | Asustek Computer Inc. | Operation method of operating system |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US8994827B2 (en) | 2012-11-20 | 2015-03-31 | Samsung Electronics Co., Ltd | Wearable electronic device |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US11157436B2 (en) | 2012-11-20 | 2021-10-26 | Samsung Electronics Company, Ltd. | Services associated with wearable electronic device |
US10551928B2 (en) * | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US20140164907A1 (en) * | 2012-12-12 | 2014-06-12 | Lg Electronics Inc. | Mobile terminal and method of controlling the mobile terminal |
US20140189584A1 (en) * | 2012-12-27 | 2014-07-03 | Compal Communications, Inc. | Method for switching applications in user interface and electronic apparatus using the same |
US9323363B2 (en) * | 2013-02-28 | 2016-04-26 | Polar Electro Oy | Providing meta information in wrist device |
WO2014143776A2 (en) | 2013-03-15 | 2014-09-18 | Bodhi Technology Ventures Llc | Providing remote interactions with host device using a wireless device |
KR102045282B1 (en) * | 2013-06-03 | 2019-11-15 | 삼성전자주식회사 | Apparatas and method for detecting another part's impormation of busy in an electronic device |
CN109739412B (en) | 2013-06-18 | 2021-10-26 | 三星电子株式会社 | User terminal equipment and management method of home network thereof |
US10564813B2 (en) * | 2013-06-18 | 2020-02-18 | Samsung Electronics Co., Ltd. | User terminal apparatus and management method of home network thereof |
KR102318442B1 (en) | 2013-06-18 | 2021-10-28 | 삼성전자주식회사 | User terminal device and method of managing home network thereof |
US9568891B2 (en) | 2013-08-15 | 2017-02-14 | I.Am.Plus, Llc | Multi-media wireless watch |
US20150098309A1 (en) * | 2013-08-15 | 2015-04-09 | I.Am.Plus, Llc | Multi-media wireless watch |
GB2517419A (en) * | 2013-08-19 | 2015-02-25 | Arm Ip Ltd | Wrist worn device |
EP3067792A4 (en) * | 2013-12-13 | 2016-12-14 | Huawei Device Co Ltd | Icon display method of wearable intelligent device and related device |
US9513665B2 (en) | 2013-12-26 | 2016-12-06 | Intel Corporation | Wearable electronic device including a formable display unit |
CN104169856B (en) * | 2013-12-30 | 2017-10-17 | 华为技术有限公司 | Side menu display method, device and terminal |
USD760771S1 (en) * | 2014-02-10 | 2016-07-05 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with graphical user interface |
USD760770S1 (en) * | 2014-02-10 | 2016-07-05 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with animated graphical user interface |
US10209779B2 (en) * | 2014-02-21 | 2019-02-19 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device therefor |
CN106030490B (en) * | 2014-02-21 | 2019-12-31 | 索尼公司 | Wearable device, electronic device, image control device, and display control method |
JP2015158753A (en) * | 2014-02-21 | 2015-09-03 | ソニー株式会社 | Wearable device and control apparatus |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
US20150286391A1 (en) * | 2014-04-08 | 2015-10-08 | Olio Devices, Inc. | System and method for smart watch navigation |
US9589539B2 (en) * | 2014-04-24 | 2017-03-07 | Kabushiki Kaisha Toshiba | Electronic device, method, and computer program product |
KR102173110B1 (en) * | 2014-05-07 | 2020-11-02 | 삼성전자주식회사 | Wearable device and controlling method thereof |
US10313506B2 (en) | 2014-05-30 | 2019-06-04 | Apple Inc. | Wellness aggregator |
KR102190062B1 (en) * | 2014-06-02 | 2020-12-11 | 엘지전자 주식회사 | Wearable device and method for controlling the same |
AU2015279544B2 (en) | 2014-06-27 | 2018-03-15 | Apple Inc. | Electronic device with rotatable input mechanism for navigating calendar application |
US9081421B1 (en) * | 2014-06-30 | 2015-07-14 | Linkedin Corporation | User interface for presenting heterogeneous content |
US20160004393A1 (en) * | 2014-07-01 | 2016-01-07 | Google Inc. | Wearable device user interface control |
EP3195098A2 (en) | 2014-07-21 | 2017-07-26 | Apple Inc. | Remote user interface |
WO2016017956A1 (en) * | 2014-07-30 | 2016-02-04 | Samsung Electronics Co., Ltd. | Wearable device and method of operating the same |
KR102156223B1 (en) * | 2014-08-02 | 2020-09-15 | 애플 인크. | Context-specific user interfaces |
US10452253B2 (en) * | 2014-08-15 | 2019-10-22 | Apple Inc. | Weather user interface |
KR102418119B1 (en) * | 2014-08-25 | 2022-07-07 | 삼성전자 주식회사 | Method for organizing a clock frame and an wearable electronic device implementing the same |
EP4209872A1 (en) | 2014-09-02 | 2023-07-12 | Apple Inc. | Phone user interface |
USD762692S1 (en) * | 2014-09-02 | 2016-08-02 | Apple Inc. | Display screen or portion thereof with graphical user interface |
JP2017527033A (en) | 2014-09-02 | 2017-09-14 | アップル インコーポレイテッド | User interface for receiving user input |
US10254948B2 (en) * | 2014-09-02 | 2019-04-09 | Apple Inc. | Reduced-size user interfaces for dynamically updated application overviews |
US20160070380A1 (en) * | 2014-09-08 | 2016-03-10 | Aliphcom | Forming wearable pods and devices including metalized interfaces |
JP6191567B2 (en) * | 2014-09-19 | 2017-09-06 | コニカミノルタ株式会社 | Operation screen display device, image forming apparatus, and display program |
US9465788B2 (en) | 2014-10-09 | 2016-10-11 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9489684B2 (en) | 2014-10-09 | 2016-11-08 | Wrap Media, LLC | Delivering wrapped packages in response to the selection of advertisements |
US9600594B2 (en) | 2014-10-09 | 2017-03-21 | Wrap Media, LLC | Card based package for distributing electronic media and services |
US20160103820A1 (en) | 2014-10-09 | 2016-04-14 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9448972B2 (en) * | 2014-10-09 | 2016-09-20 | Wrap Media, LLC | Wrap package of cards supporting transactional advertising |
WO2016057188A1 (en) | 2014-10-09 | 2016-04-14 | Wrap Media, LLC | Active receipt wrapped packages accompanying the sale of products and/or services |
KR102283546B1 (en) | 2014-10-16 | 2021-07-29 | 삼성전자주식회사 | Method and Wearable Device for executing application |
US20160139628A1 (en) * | 2014-11-13 | 2016-05-19 | Li Bao | User Programable Touch and Motion Controller |
US20160162148A1 (en) * | 2014-12-04 | 2016-06-09 | Google Inc. | Application launching and switching interface |
US9961293B2 (en) * | 2014-12-05 | 2018-05-01 | Lg Electronics Inc. | Method for providing interface using mobile device and wearable device |
KR102230523B1 (en) * | 2014-12-08 | 2021-03-19 | 신상현 | Mobile terminal |
US11036386B2 (en) * | 2015-01-06 | 2021-06-15 | Lenovo (Singapore) Pte. Ltd. | Application switching on mobile devices |
US10317938B2 (en) * | 2015-01-23 | 2019-06-11 | Intel Corporation | Apparatus utilizing computer on package construction |
EP3484134B1 (en) | 2015-02-02 | 2022-03-23 | Apple Inc. | Device, method, and graphical user interface for establishing a relationship and connection between two devices |
CN105988701B (en) * | 2015-02-16 | 2019-06-21 | 阿里巴巴集团控股有限公司 | A kind of intelligent wearable device display control method and intelligent wearable device |
US20160259491A1 (en) * | 2015-03-03 | 2016-09-08 | Olio Devices, Inc. | System and method for automatic third party user interface adjustment |
US20160259523A1 (en) * | 2015-03-06 | 2016-09-08 | Greg Watkins | Web Comments with Animation |
US10055121B2 (en) | 2015-03-07 | 2018-08-21 | Apple Inc. | Activity based thresholds and feedbacks |
US10379497B2 (en) | 2015-03-07 | 2019-08-13 | Apple Inc. | Obtaining and displaying time-related data on an electronic watch |
WO2016144385A1 (en) * | 2015-03-08 | 2016-09-15 | Apple Inc. | Sharing user-configurable graphical constructs |
KR20170130391A (en) * | 2015-03-25 | 2017-11-28 | 엘지전자 주식회사 | Watch type mobile terminal and control method thereof |
US9582917B2 (en) * | 2015-03-26 | 2017-02-28 | Wrap Media, LLC | Authoring tool for the mixing of cards of wrap packages |
US9600803B2 (en) | 2015-03-26 | 2017-03-21 | Wrap Media, LLC | Mobile-first authoring tool for the authoring of wrap packages |
US20160282947A1 (en) * | 2015-03-26 | 2016-09-29 | Lenovo (Singapore) Pte. Ltd. | Controlling a wearable device using gestures |
CA164671S (en) * | 2015-04-03 | 2016-10-17 | Lucis Technologies Holdings Ltd | Smart switch panel |
US11327640B2 (en) | 2015-06-05 | 2022-05-10 | Apple Inc. | Providing complications on an electronic device |
US10572571B2 (en) * | 2015-06-05 | 2020-02-25 | Apple Inc. | API for specifying display of complication on an electronic watch |
US9916075B2 (en) | 2015-06-05 | 2018-03-13 | Apple Inc. | Formatting content for a reduced-size user interface |
US10175866B2 (en) | 2015-06-05 | 2019-01-08 | Apple Inc. | Providing complications on an electronic watch |
US10275116B2 (en) | 2015-06-07 | 2019-04-30 | Apple Inc. | Browser with docked tabs |
EP3337583B1 (en) | 2015-08-20 | 2024-01-17 | Apple Inc. | Exercise-based watch face |
WO2017111903A1 (en) | 2015-12-21 | 2017-06-29 | Intel Corporation | Integrating system in package (sip) with input/output (io) board for platform miniaturization |
KR102475337B1 (en) | 2015-12-29 | 2022-12-08 | 에스케이플래닛 주식회사 | User equipment, control method thereof and computer readable medium having computer program recorded thereon |
US10521101B2 (en) | 2016-02-09 | 2019-12-31 | Microsoft Technology Licensing, Llc | Scroll mode for touch/pointing control |
KR20170100951A (en) | 2016-02-26 | 2017-09-05 | 삼성전자주식회사 | A Display Device And Image Displaying Method |
US20170357427A1 (en) * | 2016-06-10 | 2017-12-14 | Apple Inc. | Context-specific user interfaces |
DK201770423A1 (en) | 2016-06-11 | 2018-01-15 | Apple Inc | Activity and workout updates |
US10873786B2 (en) | 2016-06-12 | 2020-12-22 | Apple Inc. | Recording and broadcasting application visual output |
US10709422B2 (en) * | 2016-10-27 | 2020-07-14 | Clarius Mobile Health Corp. | Systems and methods for controlling visualization of ultrasound image data |
USD818492S1 (en) * | 2017-01-31 | 2018-05-22 | Relativity Oda Llc | Portion of a computer screen with an animated icon |
DK179412B1 (en) | 2017-05-12 | 2018-06-06 | Apple Inc | Context-Specific User Interfaces |
CN110914787B (en) * | 2017-09-05 | 2022-07-05 | 三星电子株式会社 | Accessing data items on a computing device |
US11561679B2 (en) * | 2017-11-09 | 2023-01-24 | Rakuten Group Inc. | Display control system, display control method, and program for page arrangement of information items |
US11327650B2 (en) | 2018-05-07 | 2022-05-10 | Apple Inc. | User interfaces having a collection of complications |
DK180171B1 (en) | 2018-05-07 | 2020-07-14 | Apple Inc | USER INTERFACES FOR SHARING CONTEXTUALLY RELEVANT MEDIA CONTENT |
CA186536S (en) * | 2018-09-18 | 2020-09-15 | Sony Interactive Entertainment Inc | Display screen with transitional graphical user interface |
US11422692B2 (en) * | 2018-09-28 | 2022-08-23 | Apple Inc. | System and method of controlling devices using motion gestures |
CN109992340A (en) * | 2019-03-15 | 2019-07-09 | 努比亚技术有限公司 | A kind of desktop display method, wearable device and computer readable storage medium |
US11960701B2 (en) | 2019-05-06 | 2024-04-16 | Apple Inc. | Using an illustration to show the passing of time |
US11131967B2 (en) | 2019-05-06 | 2021-09-28 | Apple Inc. | Clock faces for an electronic device |
US11340778B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Restricted operation of an electronic device |
US10852905B1 (en) | 2019-09-09 | 2020-12-01 | Apple Inc. | Techniques for managing display usage |
DK202070624A1 (en) | 2020-05-11 | 2022-01-04 | Apple Inc | User interfaces related to time |
EP4133371A1 (en) | 2020-05-11 | 2023-02-15 | Apple Inc. | User interfaces for managing user interface sharing |
US11372659B2 (en) | 2020-05-11 | 2022-06-28 | Apple Inc. | User interfaces for managing user interface sharing |
US11694590B2 (en) | 2020-12-21 | 2023-07-04 | Apple Inc. | Dynamic user interface with time indicator |
US11720239B2 (en) | 2021-01-07 | 2023-08-08 | Apple Inc. | Techniques for user interfaces related to an event |
US11921992B2 (en) | 2021-05-14 | 2024-03-05 | Apple Inc. | User interfaces related to time |
US11938376B2 (en) | 2021-05-15 | 2024-03-26 | Apple Inc. | User interfaces for group workouts |
US11630559B2 (en) | 2021-06-06 | 2023-04-18 | Apple Inc. | User interfaces for managing weather information |
CN113434061A (en) * | 2021-06-07 | 2021-09-24 | 深圳市爱都科技有限公司 | Method and device for realizing application entry in dial plate, intelligent watch and storage medium |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6266098B1 (en) * | 1997-10-22 | 2001-07-24 | Matsushita Electric Corporation Of America | Function presentation and selection using a rotatable function menu |
US6556222B1 (en) * | 2000-06-30 | 2003-04-29 | International Business Machines Corporation | Bezel based input mechanism and user interface for a smart watch |
US7081905B1 (en) * | 2000-06-30 | 2006-07-25 | International Business Machines Corporation | Method and apparatus for dynamically controlling scroller speed employed for a user interface of a wearable appliance |
US20050278757A1 (en) * | 2004-05-28 | 2005-12-15 | Microsoft Corporation | Downloadable watch faces |
SG153805A1 (en) * | 2004-07-19 | 2009-07-29 | Creative Tech Ltd | Method and apparatus for touch scrolling |
US7593755B2 (en) * | 2004-09-15 | 2009-09-22 | Microsoft Corporation | Display of wireless data |
US20070067738A1 (en) * | 2005-09-16 | 2007-03-22 | Microsoft Corporation | Extensible, filtered lists for mobile device user interface |
CN1949161B (en) * | 2005-10-14 | 2010-05-26 | 鸿富锦精密工业(深圳)有限公司 | Multi gradation menu displaying device and display controlling method |
US7946758B2 (en) * | 2008-01-31 | 2011-05-24 | WIMM Labs | Modular movement that is fully functional standalone and interchangeable in other portable devices |
US8677285B2 (en) * | 2008-02-01 | 2014-03-18 | Wimm Labs, Inc. | User interface of a small touch sensitive display for an electronic data and communication device |
JP5643809B2 (en) * | 2009-04-26 | 2014-12-17 | ナイキ イノベイト セー. フェー. | GPS features and functionality of an athletic watch system |
CN102053826A (en) * | 2009-11-10 | 2011-05-11 | 北京普源精电科技有限公司 | Grading display method for menus |
US20130067392A1 (en) * | 2011-09-12 | 2013-03-14 | Microsoft Corporation | Multi-Input Rearrange |
2012
- 2012-03-20 US US13/425,355 patent/US20130254705A1/en not_active Abandoned

2013
- 2013-03-06 WO PCT/US2013/029269 patent/WO2013142049A1/en active Application Filing
- 2013-03-06 KR KR1020147029395A patent/KR101890836B1/en active IP Right Grant
- 2013-03-06 CN CN201380026490.6A patent/CN104737114B/en not_active Expired - Fee Related
- 2013-03-06 EP EP13712956.5A patent/EP2828732A1/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
Sony Smart Watch Demo in the CES show; https://www.youtube.com/watch?v=SYpvM4pS8Yg(2012.01.11)* |
Also Published As
Publication number | Publication date |
---|---|
US20130254705A1 (en) | 2013-09-26 |
CN104737114B (en) | 2018-12-18 |
WO2013142049A1 (en) | 2013-09-26 |
CN104737114A (en) | 2015-06-24 |
KR20150067086A (en) | 2015-06-17 |
EP2828732A1 (en) | 2015-01-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101890836B1 (en) | Multi-axis interface for a touch-screen enabled wearable device | |
AU2019267413B2 (en) | User interfaces for watches | |
AU2018271366B2 (en) | Continuity | |
US11567644B2 (en) | Cursor integration with a touch screen user interface | |
US10838586B2 (en) | Context-specific user interfaces | |
KR101720849B1 (en) | Touch screen hover input handling | |
RU2678482C2 (en) | Electronic device and method for controlling screen display using temperature and humidity | |
EP2690543B1 (en) | Display device for executing multiple applications and method for controlling the same | |
US20140362119A1 (en) | One-handed gestures for navigating ui using touch-screen hover events | |
US20180253205A1 (en) | Wearable device and execution of application in wearable device | |
US20130179840A1 (en) | User interface for mobile device | |
US20130326415A1 (en) | Mobile terminal and control method thereof | |
CA2865442A1 (en) | Method and apparatus for providing a user interface on a device that indicates content operators | |
EP3657311B1 (en) | Apparatus including a touch screen and screen change method thereof | |
AU2015100490A4 (en) | Continuity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
N231 | Notification of change of applicant | ||
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) | ||
GRNT | Written decision to grant |