KR101890836B1 - Multi-axis interface for a touch-screen enabled wearable device - Google Patents

Multi-axis interface for a touch-screen enabled wearable device

Info

Publication number
KR101890836B1
Authority
KR
South Korea
Prior art keywords
application
touch screen
displayed
user interface
screen
Prior art date
Application number
KR1020147029395A
Other languages
Korean (ko)
Other versions
KR20150067086A (en)
Inventor
David J. Mooring
Morgan Tucker
Timothy D. Twerdahl
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Publication of KR20150067086A
Application granted granted Critical
Publication of KR101890836B1

Classifications

    • G PHYSICS
        • G04 HOROLOGY
            • G04G ELECTRONIC TIME-PIECES
                • G04G 21/00 Input or output devices integrated in time-pieces
                    • G04G 21/08 Touch switches specially adapted for time-pieces
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
                    • G06F 1/16 Constructional details or arrangements
                        • G06F 1/1613 Constructional details or arrangements for portable computers
                            • G06F 1/163 Wearable computers, e.g. on a belt
                • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
                            • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
                                • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor
                            • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                                • G06F 3/0485 Scrolling or panning
                            • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                                    • G06F 3/04883 Interaction techniques for inputting data by handwriting, e.g. gesture or text
                                    • G06F 3/04886 Interaction techniques by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch screen-enabled wearable computer includes a multi-axis user interface provided by at least one software component running on a processor. The multi-axis user interface includes at least two user interface areas that are displayed on the touch screen one at a time, each displaying a series of one or more application screens, together with a combination of a vertical navigation axis and a horizontal navigation axis. The vertical navigation axis allows a user to navigate between the plurality of user interface areas in response to vertical swipe gestures made on the touch screen, and the horizontal navigation axis allows the user to navigate the application screens of the currently displayed user interface area in response to horizontal swipe gestures across the touch screen.


Description

[0001] MULTI-AXIS INTERFACE FOR A TOUCH-SCREEN ENABLED WEARABLE DEVICE

This application claims priority from U.S. Serial No. 13/425,355, filed March 20, 2012, which application is incorporated herein by reference.

Electronic data and communication devices are constantly becoming smaller even as their information processing capacity continues to increase. Currently, portable communication devices primarily have touch screen-based user interfaces that allow these devices to be controlled by user finger gestures. Many such user interfaces are optimized for pocket-sized devices, such as mobile phones, which typically have screens of 3" or 4" diagonal or larger. Due to their relatively large form factors, one or more mechanical buttons are typically provided to support the operation of these devices.

For example, the user interface of the touch screen-equipped iPhone™ is based on the concept of a home screen that displays an array of available application icons. Depending on the number of applications loaded on the iPhone, the home screen may contain multiple pages of icons, with the first page being the main home screen. The user can scroll from one home screen page to another by swiping a finger horizontally across the touch screen. A tap on one of the icons opens the corresponding application. The main home screen can be accessed from any open application or other home screen page by pressing a hardware button located below the touch screen, sometimes referred to as a home button. To quickly switch between applications, a user can double-click the home button to display a row of recently used applications, through which the user can scroll with horizontal swipes and reopen any selected application with a finger tap. Because of its reliance on horizontal swipes, the user interface of the iPhone can be described as having horizontal-based navigation. While touch-based user interfaces, such as the iPhone's, can provide many benefits, they require a combination of button presses, swipes, and taps to navigate between applications and to enter and exit applications. This requires the user to visually focus on the device and precisely target the functions necessary to operate it.

As rapid advances in miniaturization occur, much smaller form factors are becoming available that make these devices wearable. To be easy to use, the user interface for a much smaller, wearable touch screen device with a screen size of less than 2.5" diagonal must provide a significantly different and intuitive way to operate such a small device.

Accordingly, it is desirable to provide an enhanced touch screen-based user interface that is optimized for very small wearable electronic devices, one that reduces the need for the user to visually focus on the device during operation and that allows the user to access and manipulate data and graphical objects in a manner that does not require the use of mechanical buttons.

The exemplary embodiments provide methods and systems for providing a multi-axis user interface for a touch screen enabled wearable computer. Aspects of the exemplary embodiments include: providing at least two user interface areas that are displayed on the touch screen one at a time in a multi-axis user interface, each area displaying a series of one or more application screens; and providing a combination of a vertical navigation axis and a horizontal navigation axis, wherein the vertical navigation axis allows the user to navigate between the plurality of user interface areas in response to vertical swipe gestures made on the touch screen, and the horizontal navigation axis allows the user to navigate the application screens of the currently displayed user interface area in response to horizontal swipe gestures across the touch screen.

According to the methods and systems disclosed herein, the use of multi-axis navigation, rather than single-axis navigation, allows a user to invoke the necessary functions using gross vertical and horizontal finger swipes (gross gestures), rather than precisely targeted finger taps, with minimal visual focus on the wearable computer.

Figure 1 is a block diagram illustrating embodiments of a wearable computer.
Figure 2 is a high-level block diagram illustrating computer components comprising a wearable computer according to one embodiment.
Figures 3a, 3b, and 3c illustrate one embodiment of a multi-axis user interface for a wearable device.
Figure 4 is a flow diagram illustrating in more detail a process for providing a multi-axis user interface for a wearable computer.
Figure 5 is a diagram illustrating an embodiment in which the start page application includes a watch face.
Figure 6 is a diagram illustrating a vertical transition from the start page application on the top level area to the application launcher screen on the middle level area in response to a vertical swipe gesture.
Figure 7 is a diagram illustrating horizontal scrolling of various application icons from the application launcher.
Figure 8 is a diagram showing a vertical transition from the application launcher screen on the middle level area to an application screen on the bottom level area.
Figure 9 is a diagram showing an exemplary application screen of a weather application.
Figure 10 is a diagram illustrating a vertical transition from an exemplary weather application screen back to the start page application in response to a universal gesture such as a two-finger swipe.

Embodiments relate to multi-axis user interfaces for wearable computers. The following description is presented to enable one of ordinary skill in the art to make and use the invention, and is provided in the context of a patent application and its requirements. Various changes to the embodiments and the general principles and features described herein will be readily apparent. The embodiments are described primarily with reference to specific methods and systems provided in particular implementations. However, the methods and systems will work effectively in other implementations. Phrases such as "one embodiment", "an embodiment", and "other embodiments" may refer to the same or different embodiments. The embodiments are described with respect to systems and/or devices having certain components. However, the systems and/or devices may include more or fewer components than those shown, and variations in the arrangement and type of the components may be made without departing from the scope of the invention. The embodiments are also described in the context of particular methods having certain steps. However, the methods and systems operate effectively for other methods having different and/or additional steps, and steps in different orders, that are not inconsistent with the embodiments. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.

Embodiments provide methods and systems for displaying a multi-axis user interface for a touch screen enabled wearable computer. The user interface includes two or more user interface areas, only one of which is displayed on the touch screen at any given time, and a combination of vertical and horizontal navigation axes. In one embodiment, the vertical navigation axis may allow a user to navigate between the user interface areas in response to vertical swipe gestures on the touch screen. The horizontal navigation axis may allow the user to navigate between the one or more application screens of each user interface area using horizontal swipe gestures.

The combination of vertical and horizontal navigation axes simplifies the user interface, allows the user to quickly access the required application or function, and does not require a hardware button for navigation. Thus, by using a series of finger swipes, the user can minimize the need to look at the wearable computer when calling the desired function.

Figure 1 is a block diagram illustrating embodiments of a wearable computer. According to these embodiments, the wearable computer 12 is fully functional in a standalone state, but may be configured to be physically inserted into various accessory devices having different form factors, such as, for example, watch cases and lanyards. The example of Figure 1 shows two embodiments. In one embodiment, the wearable computer 12 may be inserted into the rear face of a watch case 10a. In another embodiment, the wearable computer 12 may be inserted into the rear surface of another watch case 10b whose rear surface is closed. The watch cases 10a and 10b will be collectively referred to as the watch case 10.

In one embodiment, the body 14 of the wearable computer 12 includes a high-resolution touch screen 16 and a subassembly 18 of electronics, such as Bluetooth and WiFi components for wireless communications and a motion sensor (not shown). The wearable computer 12 displays relevant information from embedded applications and web services in a timely manner. The wearable computer 12 may also serve as a companion device to a smartphone by relaying information such as text, email, and caller ID information from the smartphone, reducing the need to remove the phone from a pocket, purse, or briefcase for status checks.

In one embodiment, the touch screen has a dimension of less than 2.5 inches diagonal, and in some embodiments may be about 1.5 inches diagonal. For example, in one embodiment, the touch screen 16 is 25.4 x 25.4 mm, while the body 14 of the wearable computer 12 may be 34 x 30 mm. According to one embodiment, the wearable computer 12 does not have any buttons for controlling the user interface. Instead, the user interface of the wearable computer 12 is controlled entirely by the user interacting with the touch screen 16 via touch; omitting buttons or dials for controlling the user interface both simplifies the user interface and reduces manufacturing cost. In one embodiment, a button for turning the wearable computer 12 on and off, rather than for controlling the user interface, may be provided on the side of the wearable computer 12. In an alternative embodiment, the wearable computer 12 may be turned on automatically once it is plugged in for initial charging.

In a further embodiment, the user interface may be provided with an auto-configuration capability. When the wearable computer 12 is inserted into the case 10, the wearable computer 12 may automatically determine features of the case 10, such as the make and model of the case 10, through its contacts 20 and a corresponding set of contacts in the case 10. Using these features of the case 10, the wearable computer 12 can automatically configure its own user interface accordingly. For example, if the wearable computer 12 is inserted into the case 10 and the case 10 is determined to be an athletic accessory, the wearable computer 12 may automatically configure itself to work with an associated sensor, such as a heart rate monitor. By determining which of various manufacturers (e.g., Nike™, Under Armour™, etc.) provided the accessory, the wearable computer 12 can display the manufacturer's graphic theme and logo and automatically invoke manufacturer-specific applications designed for the accessory.
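The auto-configuration flow can be pictured roughly as follows. This is a minimal sketch assuming a hypothetical case identifier readable over the contacts and hypothetical theme and application names; none of these names come from the patent itself.

```java
// Hypothetical sketch of the auto-configuration flow described above.
// The identifiers, theme names, and readCaseId() stub are illustrative
// assumptions, not APIs from the patent or from Android.
import java.util.Map;

public class CaseAutoConfig {
    /** Maps a case identifier (read over the contacts) to a UI theme. */
    private static final Map<String, String> THEMES = Map.of(
            "NIKE-SPORT-01", "nike_theme",
            "UA-SPORT-02", "underarmour_theme");

    /** Reads the make/model over the electrical contacts; stubbed here. */
    static String readCaseId() {
        return "NIKE-SPORT-01"; // a real device would query the case via its contacts
    }

    public static void main(String[] args) {
        String caseId = readCaseId();
        String theme = THEMES.getOrDefault(caseId, "default_theme");
        System.out.println("Case " + caseId + " -> applying theme " + theme);
        if (caseId.contains("SPORT")) {
            System.out.println("Athletic accessory detected: launching fitness app");
        }
    }
}
```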

Figure 2 is a high-level block diagram illustrating computer components comprising a wearable computer in accordance with one embodiment. In addition to the touch screen 16, the electronics subassembly 18 of the wearable computer 12 includes a processor 202, memory 204, input/output (I/O) 206, a power manager 208, a communication interface 210, and sensors 212.

The processor 202 may be configured to simultaneously execute a plurality of software components to control various processes of the wearable computer 12. In one embodiment, the processor 202 may include a main application processor and an always-on processor that takes over timekeeping and touch screen input when the main application processor enters a sleep mode. In another embodiment, the processor 202 may comprise at least one processor having a plurality of cores.

The memory 204 may include random access memory (RAM) and non-volatile memory (not shown). The RAM may be used as main memory for the processor, supporting the execution of software routines and other selective storage functions. The non-volatile memory retains instructions and data without power and may store the software routines that control the wearable computer 12 in the form of computer-readable program instructions. In one embodiment, the non-volatile memory includes flash memory. In alternate embodiments, the non-volatile memory may include any type of read-only memory (ROM).

The I/O 206 may include components such as a touch screen controller, a display controller, and an optional audio chip (not shown). The touch screen controller may interface with the touch screen 16 to detect touches and touch locations and may send the information to the processor 202 for determination of user interactions. The display controller may access the RAM and transfer processed data, such as the time and date and/or a user interface, to the touch screen 16 for display. The audio chip may be coupled with optional speakers and microphones and may interface with the processor 202 to provide the audio capability of the wearable computer 12. Another example of I/O 206 is a USB controller.

The power manager 208 may communicate with the processor 202 and coordinate power management for the wearable computer 12 while the wearable computer 12 is drawing power from a battery (not shown). In one embodiment, the battery may include, for example, a rechargeable lithium ion battery or the like.

The communication interface 210 may include components that support unidirectional or bidirectional wireless communication. In one embodiment, the communication interface 210 is primarily for receiving data, including streamed data updates, that is displayed on the touch screen 16. However, in an alternative embodiment, in addition to transmitting data, the communication interface 210 may also support voice transmission. In one embodiment, the communication interface 210 supports low- and medium-power radio frequency (RF) communications. The communication interface 210 may include a Wi-Fi transceiver that supports communication with Wi-Fi networks, including wireless local area networks (WLAN) and WiMAX; a Bluetooth transceiver for low-power communication, such as with wireless personal area networks (WPANs) according to the Bluetooth protocol and the like; and passive radio-frequency identification (RFID). Other wireless options may include, for example, baseband and infrared. The communication interface 210 may also include communication devices other than wireless ones, such as, for example, serial communications via the contacts and/or USB communications.

The sensors 212 may include various sensors, including a global positioning system (GPS) chip and an accelerometer (not shown). The accelerometer can be used to measure position, motion, tilt, shock, and vibration for use by the processor 202. The wearable computer 12 may also include other types of sensors, such as environmental sensors (e.g., ambient light, temperature, humidity, pressure, altitude, etc.), biological sensors (e.g., heart rate, body fat, etc.), and proximity detectors that detect the approach of an object. The wearable computer 12 may analyze and display the measured information from the sensors 212, or transmit the raw or analyzed information via the communication interface 210.

The software components executed by the processor 202 may include a gesture interpreter 214, an application launcher 216, a number of software applications 218, and an operating system 220. The operating system 220 is preferably a multitasking operating system that manages computer hardware resources and provides common services for the applications 218. In one embodiment, the operating system 220 may comprise a Linux-based operating system for mobile devices, such as Android™. In one embodiment, the applications 218 may be written in Java and downloaded to the wearable computer 12 from a third-party Internet site or via an online application store. In one embodiment, the primary application controlling the user interface displayed on the wearable computer 12 is the application launcher 216.

The application launcher 216 may be invoked by the operating system 220 upon device startup and/or wake from a sleep mode. The application launcher 216 runs continuously during the awake mode and serves to launch the other applications 218. In one embodiment, the default application displayed by the application launcher is a start page application 222. In one embodiment, the start page application 222 comprises a dynamic watch face that displays at least the time but can display other information, such as the current location (e.g., city), local weather, and date. In one embodiment, all of the applications 218, including the start page application 222, may include a plurality of screens or pages, one of which may be displayed at any given time.

The user operates the wearable computer 12 by performing finger gestures using one or more fingers on the touch screen 16. A stylus may also be used instead of a finger. The operating system 220 may detect the finger/stylus gestures, i.e., gesture events, and send the gesture events to the application launcher 216. In response, the application launcher 216 may call the gesture interpreter 214, which determines the gesture type (e.g., vertical swipe, tap, tap and hold, etc.). The application launcher 216 may then change the user interface based on the gesture type.
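As a rough illustration of this pipeline, the sketch below classifies a raw gesture event into a type and hands the result to launcher-style handling code. The event representation, thresholds, and class names are illustrative assumptions, not the patent's or Android's actual APIs.

```java
// Minimal sketch of the gesture pipeline described above: the operating
// system delivers raw gesture events to the application launcher, which
// asks a gesture interpreter to classify them.
public class GesturePipeline {

    enum GestureType { TAP, TAP_AND_HOLD, SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT }

    /** Raw event as the OS might report it: start/end coordinates and duration. */
    record GestureEvent(float x0, float y0, float x1, float y1, long millis) {}

    /** Plays the role of the gesture interpreter 214. */
    static GestureType interpret(GestureEvent e) {
        float dx = e.x1() - e.x0(), dy = e.y1() - e.y0();
        float dist = (float) Math.hypot(dx, dy);
        if (dist < 10f) {                  // barely moved: a tap of some kind
            return e.millis() > 500 ? GestureType.TAP_AND_HOLD : GestureType.TAP;
        }
        if (Math.abs(dy) > Math.abs(dx)) { // predominantly vertical movement
            return dy < 0 ? GestureType.SWIPE_UP : GestureType.SWIPE_DOWN;
        }
        return dx < 0 ? GestureType.SWIPE_LEFT : GestureType.SWIPE_RIGHT;
    }

    /** Plays the role of the application launcher 216. */
    static void onGestureEvent(GestureEvent e) {
        GestureType type = interpret(e);
        System.out.println("Launcher updates UI for gesture: " + type);
    }

    public static void main(String[] args) {
        onGestureEvent(new GestureEvent(100, 200, 100, 40, 120)); // SWIPE_UP
        onGestureEvent(new GestureEvent(50, 50, 52, 51, 700));    // TAP_AND_HOLD
    }
}
```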

Although the operating system 220, the gesture interpreter 214, and the application launcher 216 are shown as separate components, their functions may be integrated into a fewer or greater number of modules/components.

According to one embodiment, the application launcher 216 is configured to display a multi-axis user interface comprising a plurality of user interface areas in combination with both vertical and horizontal navigation axes. The user can navigate between the user interface areas using simple finger gestures made along the orientations of the vertical and horizontal navigation axes, reducing the amount of visual focus the user requires to operate the wearable computer 12. In addition, the multi-axis user interface allows the user to operate the wearable computer 12 without the need for mechanical buttons.

Figures 3a, 3b, and 3c illustrate one embodiment of a multi-axis user interface for the touch screen enabled wearable device 12. According to one embodiment, the multi-axis user interface includes a plurality of user interface areas 300a, 300b, and 300c (collectively, user interface areas 300). The plurality of user interface areas 300 include a top level area 300a for displaying a first series of one or more application screens, a middle level area 300b for displaying a second series of application screens, and a bottom level area 300c for displaying a third series of application screens. In one embodiment, only one of the areas 300a, 300b, and 300c at a time is visible on the touch screen 16, except in embodiments in which transitions between the areas are animated.

The application launcher 216 is configured to provide a combination of a vertical navigation axis 310 and a horizontal navigation axis 312. In one embodiment, the vertical navigation axis 310 allows the user to navigate between the user interface areas 300a-300c in response to vertical swipe gestures 314 made on the touch screen 16. That is, in response to detecting a single vertical swipe gesture 314 on the currently displayed user interface level area 300, the immediately adjacent user interface level area 300 is displayed.

In contrast, each of the user interface areas 300 displays a series of one or more application screens, and the horizontal navigation axis 312 allows the user to navigate between the application screens of the currently displayed user interface area using horizontal swipe gestures 316 across the touch screen. In response to detecting a single horizontal swipe gesture 316 on the currently displayed application screen of a particular user interface level area 300, the immediately adjacent application screen of that user interface level area 300 is displayed.

In one embodiment, during vertical navigation between the user interface areas 300, once the user has reached the top level area 300a or the bottom level area 300c, the user performs a vertical swipe 314 in the opposite direction to return to the previous level. In an alternative embodiment, continuous vertical scrolling through the user interface areas 300a-300c is possible, such that the user interface creates a circular queue of the user interface areas 300a-300c.
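One way to picture the two navigation axes is as a pair of indices: a vertical index selecting the user interface area and a horizontal index selecting the screen within it, with the circular-queue variant wrapping at the ends. The following is a minimal sketch under that reading; the area contents and class names are placeholders, not the patent's.

```java
// Illustrative sketch of the two-axis navigation model. The circular-queue
// variant wraps both indices with modular arithmetic.
public class MultiAxisNavigator {
    private final String[][] areas = {
            {"watch face 1", "watch face 2"},       // top level area 300a
            {"weather icon", "calendar icon"},      // middle level area 300b
            {"weather: city 1", "weather: city 2"}, // bottom level area 300c
    };
    private int area = 0;    // vertical position
    private int screen = 0;  // horizontal position within the current area

    /** Vertical swipe: +1 for up, -1 for down; wraps as a circular queue. */
    void swipeVertical(int direction) {
        area = Math.floorMod(area + direction, areas.length);
        screen = 0; // each area starts on its first screen
    }

    /** Horizontal swipe: wraps within the current area's screens. */
    void swipeHorizontal(int direction) {
        screen = Math.floorMod(screen + direction, areas[area].length);
    }

    String current() { return areas[area][screen]; }

    public static void main(String[] args) {
        MultiAxisNavigator nav = new MultiAxisNavigator();
        nav.swipeVertical(1);   // watch face -> application launcher
        nav.swipeHorizontal(1); // next application icon
        System.out.println(nav.current()); // "calendar icon"
    }
}
```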

In one embodiment, the user interface areas 300a, 300b, and 300c can be analogized to areas of an electronic map. On a conventional map interface, the user places a finger on the screen and "drags" the map around, for example dragging the map upward with a smooth scrolling movement of the finger to reveal previously hidden portions of the map. In the current embodiments, however, the user does not "drag" the user interface areas to reveal the next user interface area, because dragging would require the user to watch the touch screen to guide the next area onto the screen. Instead, the user navigates between areas with simple vertical swipes, e.g., up swipes, that cause discrete transitions between the user interface areas 300a, 300b, and 300c. That is, the immediately adjacent area "snaps" into place, replacing the previously displayed area.

Figure 3A shows an embodiment in which the top level area 300a includes the start page application 222. The start page application 222 can display a series of one or more watch face screens 302 in response to horizontal swipe gestures, so that the user can scroll through the watch face screens 302 and select a watch face screen to become the default watch screen, thereby changing the appearance of the wearable computer 12. In one embodiment, the start page application 222 is the default application displayed. In one embodiment, a single horizontal swipe gesture may cause the currently displayed watch face screen to move left or right to reveal the previous or next watch face screen. Continuous scrolling returns to the originally displayed watch face screen, creating a circular queue of watch face screens 302. A selection-type gesture, such as a tap or a double tap, may select the currently displayed watch face as the default for the start page application 222. In alternate embodiments, the start page application 222 may include other information-type displays, showing a social network feed, weather, and so on.

Figure 3B shows an embodiment in which the middle level area 300b includes an application launcher screen 304 that displays a series of one or more application icons 306 in response to user swipes, so that the user can scroll through the application icons 306 and select an icon for opening. In one embodiment, each application icon 306 is displayed on its own screen. In response to detecting horizontal user swipe gestures made on the touch screen 16 while the middle level area 300b is displayed, the application icons 306 are sequentially displayed. In one embodiment, a single horizontal swipe gesture may cause the currently displayed application icon to be shifted left or right to reveal the previous or next application icon. Continuous scrolling returns to the originally displayed application icon screen, creating a circular queue of application icon screens. A selection-type gesture, such as a tap or a vertical swipe, may open the application corresponding to the currently displayed application icon 306.

Figure 3C illustrates that the bottom level area 300c may include a series of one or more application screens 308 for an open application. Each application displayed by the application launcher 216 may have its own series of application screens 308. The series of application screens 308 may be displayed in response to detecting the user performing horizontal swipe gestures that move the currently displayed application screen left or right to reveal the previous or next application screen 308.

In the embodiment shown in Figures 3a, 3b, and 3c, rather than being implemented as circular queues, the series of user interface areas and application screens may be implemented as linked lists of screens or panels that terminate at each end, so that scrolling past the first or last panel is not allowed. In this embodiment, if the user attempts to flip past the first or last panel with a swipe gesture (and thus there is no panel to flip to), the currently displayed panel may begin to move when the user's finger begins to move, but returns to its original position when the user's finger is lifted from the touch screen. In one embodiment, the flip or return animation includes a simulated deceleration. For example, as the panel approaches its final stopping point, the panel decelerates to a stop rather than stopping suddenly.
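A minimal sketch of this terminating-list variant follows: the panel index is clamped at both ends, and a swipe past either end is reported as a bounce so the UI layer can play the snap-back animation. The names are illustrative assumptions.

```java
// Sketch of the terminating (non-circular) variant: no wrap-around, and a
// swipe past the first or last panel results in a bounce back into place.
public class TerminatingPanelList {
    private final String[] panels;
    private int index = 0;

    TerminatingPanelList(String... panels) { this.panels = panels; }

    /** Returns true if the swipe moved to a new panel, false on a bounce. */
    boolean swipe(int direction) {
        int next = index + direction;
        if (next < 0 || next >= panels.length) {
            System.out.println(panels[index] + " bounces back with simulated deceleration");
            return false; // no panel to flip to: snap back to original position
        }
        index = next;
        return true;
    }

    public static void main(String[] args) {
        TerminatingPanelList list = new TerminatingPanelList("A", "B", "C");
        list.swipe(-1); // already at the first panel: bounce
        list.swipe(1);  // flips to "B"
    }
}
```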

In this embodiment, the user may switch from one application to another by first executing, for example, an up swipe to return to the application launcher screen 304, scrolling left or right to the icon of the desired application, and then executing, for example, a down swipe to enter an application screen 308 of the other application. In an alternative embodiment, instead of the user having to move up, left/right, and down to change applications, the user may continue to execute horizontal swipes in the bottom level area 300c until the screens for the desired application are shown.

Further, in another embodiment, the multi-axis user interface may be implemented with two user interface areas instead of three. In this embodiment, the start page application may be implemented as part of the application launcher screen 304, and the middle level area 300b becomes the higher-level area. The user may then scroll from the start page application to any other application in the application launcher screen 304 using horizontal swipes.

Figure 4 is a flow diagram illustrating in greater detail a process for providing a multi-axis user interface for a wearable computer. In one embodiment, the process is performed by at least one user interface component running on the processor 202, comprising any combination of the gesture interpreter 214, the application launcher 216, and the operating system 220.

When the wearable computer 12 starts up or wakes from sleep, the process may begin by displaying the start page application on the touch screen 16 (block 400). As described above, the start page application 222 may display a series of one or more watch faces. In one embodiment, the user may perform horizontal swipe gestures across the currently displayed watch face to scroll horizontally through the series of watch faces. In another embodiment, in order to prevent accidental scrolling, the user may first be required to perform an access-type gesture, for example a tap or a tap and hold gesture, before the watch faces can be scrolled.

Figure 5 is a diagram illustrating an embodiment in which the start page application 500 includes a watch face. According to one embodiment, the user may view different watch faces from the start page application 500 in response to left and right horizontal swipe gestures 502. In one embodiment, a horizontal swipe 502 (e.g., left or right) causes the previous or next watch face to replace the currently displayed watch face on the touch screen 16. In this embodiment, one watch face comprises an entire page and fills the display of the touch screen 16. However, the watch face may alternatively be configured to display partial views of adjacent watch faces.

Referring back to Figure 4, in response to detecting a vertical swipe gesture in a first direction (e.g., the up direction) on the touch screen while the start page application is being displayed, the user interface is transitioned along the vertical axis 310 from the top level area to the middle level area to display the application launcher screen (block 402).

Figure 6 is a diagram illustrating a vertical transition from the start page application 500 on the top level area to the application launcher screen 602 on the middle level area in response to a vertical swipe gesture 604. In this case, the application launcher screen 602 is shown displaying a single application icon for a weather application. In one embodiment, a single-finger up swipe (or down swipe) on the start page application 500 causes the application launcher screen 602 to simply replace the start page application 500 on the touch screen 16.

Referring back to FIG. 4, in response to detecting a horizontal swipe gesture across the touch screen while the application launcher screen is being displayed, application icons are scrolled horizontally across the touch screen for user selection (block 404).

Figure 7 is a diagram illustrating horizontal scrolling of different application icons 700 from the application launcher in response to left and right horizontal swipe gestures 702. In one embodiment, a horizontal swipe (e.g., left or right) may cause the application launcher 216 to replace the current application icon on the touch screen 16 with the previous or next application icon. In this embodiment, one application icon 700 comprises an entire page and fills the display of the touch screen 16. However, the icon 700 may alternatively be configured to display partial views of adjacent application icons.

Referring back to Figure 4, in response to detecting a vertical swipe gesture in a second direction (e.g., the down direction) while the application launcher screen 602 is being displayed, the user interface transitions from the middle level area 300b back to the top level area 300a and displays the start page application 500 again (block 406).

In response to detecting at least one of a tap on the touch screen or a vertical swipe gesture in the first direction while the application launcher screen is being displayed, the selected application is opened and the user interface transitions from the middle level area to the bottom level area (block 408).

Figure 8 is a diagram illustrating a vertical transition from the application launcher screen 602 on the middle level area to the application screen 800 on the bottom level area in response to a tap or vertical swipe gesture 802. In one embodiment, the tap or vertical swipe gesture 802 opens the application by displaying an application screen 800 that simply replaces the selected application icon 700. For example, while the application launcher screen 602 is being displayed, a single-finger tap or up swipe on the touch screen may cause the application screen 800 corresponding to the application icon 700 to be displayed.

Figure 9 is a diagram showing an exemplary application screen 800 of a weather application opened in response to the user selecting the weather application icon 700 on the application launcher screen 602. The weather application includes a number of pages, where each page can represent the current weather in a different city. The user may scroll from city to city using horizontal swipe gestures 802. In response to the user performing a vertical swipe 804, for example an up swipe, the page is pulled up to reveal the weather for each day of the week. In one embodiment, each day of the week may be viewed on its own "mini-panel" 806 (e.g., a rectangular subdivision of the page). The mini-panels 806 may occupy the lower portion of the application screen 800, or may be implemented as separate pages.

Referring again to Figure 4, in response to detecting a vertical swipe gesture in a second direction (e.g., down) on the touch screen while the application screen 800 is being displayed, the user interface transitions from the bottom level area 300c back to the middle level area 300b and displays the application launcher screen 602 again (block 410).

In an alternative embodiment, in response to detecting a universal gesture on either the application launcher screen or the application screen of an open application, the home screen is displayed again. A universal gesture is a gesture mapped to the same function no matter what level or area of the user interface is displayed. One example of such a universal gesture is a two-finger vertical swipe. When detected by the application launcher or an application, the application launcher causes a redisplay of the start page application, e.g., the watch face.

Figure 10 illustrates a vertical transition from the exemplary weather application screen 800 back to the start page application in response to a universal gesture 1000, such as a two-finger swipe. Here, the user causes the user interface to jump from the bottom level area 300c to the top level area 300a in one movement.
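The universal gesture can be thought of as a rule evaluated before normal per-area navigation. The sketch below, with assumed gesture and pointer-count representations, maps any two-finger vertical swipe to a jump to the top level area.

```java
// Sketch of the universal gesture: a two-finger vertical swipe maps to the
// same "go home" action regardless of which level is displayed. The
// gesture-string and pointer-count representations are illustrative.
public class UniversalGestureHandler {
    static final int HOME_AREA = 0; // the top level area (start page)

    /** Returns the new area index after handling a gesture. */
    static int handle(int currentArea, String gesture, int pointerCount) {
        if (pointerCount == 2 && gesture.startsWith("SWIPE_VERTICAL")) {
            return HOME_AREA; // jump straight to the start page from any level
        }
        return currentArea;   // other gestures fall through to normal navigation
    }

    public static void main(String[] args) {
        System.out.println(handle(2, "SWIPE_VERTICAL_UP", 2)); // 0: back to watch face
    }
}
```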

Referring again to Figures 3A-3C, the vertical scrolling between the user interface areas 300a-300c and the horizontal scrolling between the watch face screens 302, the application icons 306, and the application screens 308 have been described as discrete steps in which one screen replaces another during a scrolling transition. In an alternative embodiment, scrolling may be implemented with flick transition animations in which the transitions between screens are smoothly animated, such that the currently displayed screen dynamically scrolls off the display while the next screen dynamically scrolls onto the display.

In one embodiment, when the gesture interpreter 214 (or equivalent code) detects that the user's finger has begun to slide vertically or horizontally, the application launcher 216 causes the screen to move up/down or left/right with the finger movements in a spring-loaded fashion. If the gesture interpreter determines that the finger has moved some minimum distance, e.g., 1 cm, and is then lifted from the touch screen, the application launcher 216 may immediately display a quick animation of the screen flipping in the same direction as the finger movement, e.g., flipping left or right. In one embodiment, the flipping animation may be implemented using a Hyperspace animation technique such as that shown in the Android "APIDemos". If the user has not moved the minimum distance before the finger is lifted, the gesture interpreter determines that the user has not attempted a "flick". In this case, the screen appears to fall back to its original position. Transition animations may be aesthetically pleasing, but discrete transitions may consume less battery power.
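The flick decision reduces to a distance test at finger lift-off. The sketch below assumes a display density value for converting the 1 cm threshold to pixels; the constant and method names are illustrative, not from the patent.

```java
// Sketch of the flick test described above: a drag only counts as a flick
// if the finger travelled a minimum distance (here 1 cm, converted to
// pixels) before being lifted.
public class FlickDetector {
    private static final float PIXELS_PER_CM = 63f; // ~160 dpi display, assumed

    /**
     * Decides what happens when the finger is lifted after a drag of
     * dragPixels: flip to the adjacent screen, or fall back into place.
     */
    static String onFingerLifted(float dragPixels) {
        if (Math.abs(dragPixels) >= 1.0f * PIXELS_PER_CM) {
            return dragPixels < 0 ? "flip to next screen" : "flip to previous screen";
        }
        return "snap back to original position"; // not a flick
    }

    public static void main(String[] args) {
        System.out.println(onFingerLifted(-80f)); // flip to next screen
        System.out.println(onFingerLifted(20f));  // snap back
    }
}
```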

According to another aspect of the exemplary embodiments, edge portions of the touch screen 16 may be designated for fast horizontal scrolling. If the user begins to slide a finger along a designated edge of the touch screen 16, the system may consider it a "fast scroll" event and, in response, begin flipping quickly through the series of screens.

Figure 11 is a block diagram illustrating fast-scroll portions of the touch screen 16. The surface of the touch screen 16 may be divided into a general swipe zone 1100 and two accelerated scrolling zones 1102 along the side edges. The application launcher 216 may be configured to detect a finger sliding horizontally at any point within the general swipe zone 1100 and, in response, display the next screen in a series of screens. The detection of other gestures in the accelerated scrolling zones 1102 may cause a continuous, fast display of the screens in the series. For example, a tap and hold of a finger in one of the accelerated scrolling zones 1102 causes a continuous, increasingly accelerated progression through the list of screens, while a single tap advances the screens one at a time.

In another embodiment, when the user's finger remains on one of the accelerated scrolling zones, a progress indicator 1104 may be displayed on the touch screen 16 showing the current position 1106 within the series of screens. If the finger is fast-scrolling along one edge (e.g., bottom or top), the progress indicator 1104 may be displayed along the other edge.
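The zone layout of Figure 11 amounts to a hit test on the touch x coordinate. A minimal sketch follows, with an assumed zone width; touches in a side band are treated as accelerated scrolling, everything else as a normal swipe.

```java
// Sketch of the zone hit test for Figure 11: touches inside a narrow band
// along either side edge fall into an accelerated scrolling zone; all
// other touches fall into the general swipe zone. Zone width is assumed.
public class ScrollZones {
    private final int screenWidth;
    private final int zoneWidth; // width of each accelerated scrolling zone

    ScrollZones(int screenWidth, int zoneWidth) {
        this.screenWidth = screenWidth;
        this.zoneWidth = zoneWidth;
    }

    /** Classifies a touch by its x coordinate. */
    String classify(int x) {
        if (x < zoneWidth || x >= screenWidth - zoneWidth) {
            return "accelerated scrolling zone"; // tap-and-hold here fast-scrolls
        }
        return "general swipe zone";             // normal one-screen-per-swipe
    }

    public static void main(String[] args) {
        ScrollZones zones = new ScrollZones(320, 32); // e.g., a 320 px wide screen
        System.out.println(zones.classify(10));  // accelerated scrolling zone
        System.out.println(zones.classify(160)); // general swipe zone
    }
}
```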

A method and system for providing a multi-axis user interface for a wearable computer has been disclosed. The present invention has been described in accordance with the embodiments shown; there may be variations to these embodiments, and any variations are within the spirit and scope of the present invention. For example, in an alternative embodiment, the functions of the vertical and horizontal axes of the wearable computer may be interchanged, such that the horizontal axis is used to navigate between user interface areas in response to horizontal swipes, while the vertical navigation axis is used to navigate between application screens using vertical swipes. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims. Software written in accordance with the present invention may be stored in some form of computer-readable medium, such as memory or a hard disk, and executed by a processor.

Claims (19)

1. A wearable computer, comprising:
a touch screen having a size of less than 2.5 inches diagonal; and
at least one software component executing on a processor, the at least one software component configured to display a multi-axis user interface,
wherein the multi-axis user interface comprises:
a plurality of user interface areas displayed on the touch screen, including an upper level area for displaying a first series of one or more application screens, an intermediate level area connectable to the upper level area by a swipe gesture detectable by the touch screen, the intermediate level area displaying a second series of one or more screens, and a lower level area connectable to the intermediate level area by a selection gesture detectable by the touch screen, the lower level area displaying a third series of one or more application screens; and
a combination of a vertical navigation axis and a horizontal navigation axis, wherein the vertical navigation axis allows a user to navigate between the plurality of user interface areas in response to vertical swipe gestures made on the touch screen, and the horizontal navigation axis allows the user to navigate the application screens of the currently displayed user interface area in response to horizontal swipe gestures across the touch screen,
wherein only one of the upper level area, the intermediate level area, and the lower level area is displayed on the touch screen at a given time, and one application icon displayed on the one displayed level area comprises an entire page and fills the entire display of the touch screen,
wherein the touch screen is divided into a swipe zone and two accelerated scrolling zones along first and second side edges of the touch screen, and
wherein the processor is further configured to, while a given application screen of the series of one or more application screens is displayed:
in response to detecting, by the touch screen, a gesture at a location within the swipe zone of the touch screen, output the next application screen of the series of one or more application screens for display by the touch screen; and
in response to detecting, by the touch screen, a tap and hold gesture within one of the two accelerated scrolling zones, output, for display by the touch screen, an accelerated progression through at least a portion of the series of one or more application screens.
2. The wearable computer of claim 1, wherein in response to detecting a single vertical swipe gesture on the currently displayed user interface area, the immediately adjacent user interface area is displayed.
3. The wearable computer of claim 2, wherein during vertical navigation between the user interface areas, when the user reaches the upper level area or the lower level area, the user interface is configured to require the user to perform a vertical swipe in the opposite direction in order to return to the previous level.
4. The wearable computer of claim 2, wherein continuous scrolling through the user interface areas returns to the user interface area originally displayed, thereby creating a circular queue of the user interface areas.
5. The wearable computer of claim 3, wherein the user interface areas are implemented as a linked list of panels terminating at each end, and wherein scrolling past the first or last panel is not allowed.
6. The wearable computer of claim 1, wherein in response to detecting a single horizontal swipe gesture on the currently displayed application screen of a particular user interface area, the immediately adjacent application screen of the user interface area is displayed.
7. The wearable computer of claim 6, wherein continuous scrolling through the application screens returns to the application screen originally displayed, thereby creating a circular queue of the application screens.
8. The wearable computer of claim 6, wherein the application screens are implemented as a linked list of panels terminating at each end, and scrolling past the first or last panel is not allowed.
9. The wearable computer of claim 1, wherein the intermediate level area includes an application launcher screen that displays a series of one or more application icons in response to the horizontal swipe gestures, so that the user can scroll through the application icons and select an application for opening.
10. The wearable computer of claim 1, wherein the lower level area comprises a series of one or more application screens for an opened application.
11. The wearable computer of claim 1, wherein the upper level area includes a start page application that displays a series of one or more watch face screens in response to the horizontal swipe gestures, so that the user can scroll through the watch face screens and select a watch face screen to become a default watch screen, thereby changing the appearance of the wearable computer.
12. The wearable computer of claim 1, further comprising an operating system and a gesture interpreter,
wherein the operating system detects gesture events occurring on the touch screen and sends the gesture events to an application launcher, and
wherein the application launcher requests the gesture interpreter to determine a gesture type, and the application launcher changes the user interface based on the gesture type.
13. A method for providing a multi-axis user interface on a wearable computer, performed by software components running on at least one processor of the wearable computer, the method comprising:
displaying an upper level area including a start page application on a touch screen of less than 2.5 inches diagonal, wherein the touch screen is divided into a swipe zone and two accelerated scrolling zones along first and second side edges of the touch screen;
in response to detecting a vertical swipe gesture in a first direction on the touch screen while the start page application is being displayed, transitioning the user interface along a vertical axis from the upper level area to an intermediate level area to display an application launcher screen;
in response to detecting a horizontal swipe gesture across the touch screen while the application launcher screen is being displayed, scrolling application icons horizontally across the touch screen for user selection;
in response to detecting at least one of a tap on the touch screen or a vertical swipe gesture in the first direction while the application launcher screen is being displayed, opening a selected application and transitioning the user interface along the vertical axis from the intermediate level area to a lower level area, wherein only one of the upper level area, the intermediate level area, and the lower level area is displayed on the touch screen at a given time, and one application icon displayed on the one displayed level area comprises an entire page of one of the application screens and fills the entire display of the touch screen;
in response to detecting, by the touch screen, a gesture at a location within the swipe zone of the touch screen while a given one of the application screens is being displayed, outputting, by the at least one processor, the next one of the application screens for display by the touch screen; and
in response to detecting, by the touch screen, a tap and hold gesture within one of the two accelerated scrolling zones while the given one of the application screens is being displayed, outputting, by the at least one processor, an accelerated progression through at least a portion of the application screens for display by the touch screen.
14. The method of claim 13, further comprising: in response to detecting a vertical swipe gesture in a second direction on the touch screen while the application launcher screen is being displayed, transitioning the user interface from the intermediate level area back to the upper level area to display the start page application again.
15. The method of claim 13, further comprising: in response to detecting a vertical swipe gesture in a second direction on the touch screen while an application screen is being displayed, transitioning the user interface from the lower level area back to the intermediate level area to display the application launcher screen again.
16. The method of claim 13, further comprising:
configuring the start page application as a series of one or more watch faces; and
in response to detecting a horizontal swipe across the currently displayed watch face, scrolling the series of one or more watch faces horizontally across the touch screen for user selection.
A computer-readable storage medium having stored thereon instructions for causing one or more processors to perform operations when executed by one or more processors and to provide a multi-axis user interface on a wearable computer,
The program instructions,
Instructions for displaying a high level area including a start page application on a touch screen less than 2.5 inches diagonally, the touch screen being divided into two acceleration scrolling zones along the first and second side edges of the swipe zone and the touch screen -;
In response to detecting a vertical swipe gesture in a first direction on the touch screen while the start page application is being displayed, moving the user interface along a vertical axis from the upper level area to the middle level area to display an application launcher screen, Command;
In response to detecting a horizontal swipe gesture across the touch screen while the application launcher screen is being displayed, scrolling application icons horizontally across the touch screen for user selection; And
Instructions for transitioning the user interface along the vertical axis from the middle level area to a lower level area that displays a series of application screens of the application corresponding to a selected application icon, in response to detecting at least one of a tap on the touch screen or a vertical swipe gesture in the first direction while the application launcher screen is being displayed, wherein only one of the upper level area, the middle level area, and the lower level area is displayed on the touch screen at any given time, and wherein each of the application screens comprises an entire page that fills the entire display of the touch screen;
Instructions for outputting, by the one or more processors, the next one of the application screens in response to the touch screen detecting a gesture at a location within the swipe zone while a given one of the application screens is being displayed; and
Instructions for outputting, by the one or more processors, an accelerated advancement through at least a portion of the application screens in response to the touch screen detecting a tap-and-hold gesture within one of the two accelerated scrolling zones while the given one of the application screens is being displayed.
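
Both independent claims distinguish ordinary advancement (one screen per swipe-zone gesture) from accelerated advancement driven by the duration of a tap-and-hold in an edge zone. A Kotlin sketch under assumed timing values follows; the claims require only that advancement accelerate, not any particular rate curve.

// Accelerated advancement through application screens during a tap-and-hold
// in an edge zone. The base rate and its linear growth are assumptions.
const val BASE_SCREENS_PER_SECOND = 2.0   // assumed starting rate
const val ACCELERATION_PER_SECOND = 2.0   // assumed rate growth while held

// Integrates rate(t) = BASE + ACCELERATION * t over the hold duration,
// giving the number of screens traversed so far.
fun screensAdvanced(holdMillis: Long): Int {
    val t = holdMillis / 1000.0
    return (BASE_SCREENS_PER_SECOND * t + ACCELERATION_PER_SECOND * t * t / 2).toInt()
}

fun main() {
    println(screensAdvanced(500))    // 1  (short hold: near the base rate)
    println(screensAdvanced(3000))   // 15 (longer hold: noticeably faster)
}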
KR1020147029395A 2012-03-20 2013-03-06 Multi-axis interface for a touch-screen enabled wearable device KR101890836B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/425,355 US20130254705A1 (en) 2012-03-20 2012-03-20 Multi-axis user interface for a touch-screen enabled wearable device
US13/425,355 2012-03-20
PCT/US2013/029269 WO2013142049A1 (en) 2012-03-20 2013-03-06 Multi-axis interface for a touch-screen enabled wearable device

Publications (2)

Publication Number Publication Date
KR20150067086A KR20150067086A (en) 2015-06-17
KR101890836B1 true KR101890836B1 (en) 2018-08-22

Family

ID=48014287

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020147029395A KR101890836B1 (en) 2012-03-20 2013-03-06 Multi-axis interface for a touch-screen enabled wearable device

Country Status (5)

Country Link
US (1) US20130254705A1 (en)
EP (1) EP2828732A1 (en)
KR (1) KR101890836B1 (en)
CN (1) CN104737114B (en)
WO (1) WO2013142049A1 (en)

Families Citing this family (112)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US9124712B2 (en) * 2012-06-05 2015-09-01 Apple Inc. Options presented on a device other than accept and decline for an incoming call
US9507486B1 (en) * 2012-08-23 2016-11-29 Allscripts Software, Llc Context switching system and method
US8954878B2 (en) * 2012-09-04 2015-02-10 Google Inc. Information navigation on electronic devices
US9898184B2 (en) * 2012-09-14 2018-02-20 Asustek Computer Inc. Operation method of operating system
US11372536B2 (en) 2012-11-20 2022-06-28 Samsung Electronics Company, Ltd. Transition and interaction model for wearable electronic device
US10185416B2 (en) 2012-11-20 2019-01-22 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving movement of device
US8994827B2 (en) 2012-11-20 2015-03-31 Samsung Electronics Co., Ltd Wearable electronic device
US11237719B2 (en) 2012-11-20 2022-02-01 Samsung Electronics Company, Ltd. Controlling remote electronic device with wearable electronic device
US11157436B2 (en) 2012-11-20 2021-10-26 Samsung Electronics Company, Ltd. Services associated with wearable electronic device
US10551928B2 (en) * 2012-11-20 2020-02-04 Samsung Electronics Company, Ltd. GUI transitions on wearable electronic device
US20140164907A1 (en) * 2012-12-12 2014-06-12 Lg Electronics Inc. Mobile terminal and method of controlling the mobile terminal
US20140189584A1 (en) * 2012-12-27 2014-07-03 Compal Communications, Inc. Method for switching applications in user interface and electronic apparatus using the same
US9323363B2 (en) * 2013-02-28 2016-04-26 Polar Electro Oy Providing meta information in wrist device
WO2014143776A2 (en) 2013-03-15 2014-09-18 Bodhi Technology Ventures Llc Providing remote interactions with host device using a wireless device
KR102045282B1 (en) * 2013-06-03 2019-11-15 삼성전자주식회사 Apparatas and method for detecting another part's impormation of busy in an electronic device
CN109739412B (en) 2013-06-18 2021-10-26 三星电子株式会社 User terminal equipment and management method of home network thereof
US10564813B2 (en) * 2013-06-18 2020-02-18 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
KR102318442B1 (en) 2013-06-18 2021-10-28 삼성전자주식회사 User terminal device and method of managing home network thereof
US9568891B2 (en) 2013-08-15 2017-02-14 I.Am.Plus, Llc Multi-media wireless watch
US20150098309A1 (en) * 2013-08-15 2015-04-09 I.Am.Plus, Llc Multi-media wireless watch
GB2517419A (en) * 2013-08-19 2015-02-25 Arm Ip Ltd Wrist worn device
EP3067792A4 (en) * 2013-12-13 2016-12-14 Huawei Device Co Ltd Icon display method of wearable intelligent device and related device
US9513665B2 (en) 2013-12-26 2016-12-06 Intel Corporation Wearable electronic device including a formable display unit
CN104169856B (en) * 2013-12-30 2017-10-17 华为技术有限公司 Side menu display method, device and terminal
USD760771S1 (en) * 2014-02-10 2016-07-05 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with graphical user interface
USD760770S1 (en) * 2014-02-10 2016-07-05 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with animated graphical user interface
US10209779B2 (en) * 2014-02-21 2019-02-19 Samsung Electronics Co., Ltd. Method for displaying content and electronic device therefor
CN106030490B (en) * 2014-02-21 2019-12-31 索尼公司 Wearable device, electronic device, image control device, and display control method
JP2015158753A (en) * 2014-02-21 2015-09-03 ソニー株式会社 Wearable device and control apparatus
US10691332B2 (en) 2014-02-28 2020-06-23 Samsung Electronics Company, Ltd. Text input on an interactive display
US20150286391A1 (en) * 2014-04-08 2015-10-08 Olio Devices, Inc. System and method for smart watch navigation
US9589539B2 (en) * 2014-04-24 2017-03-07 Kabushiki Kaisha Toshiba Electronic device, method, and computer program product
KR102173110B1 (en) * 2014-05-07 2020-11-02 삼성전자주식회사 Wearable device and controlling method thereof
US10313506B2 (en) 2014-05-30 2019-06-04 Apple Inc. Wellness aggregator
KR102190062B1 (en) * 2014-06-02 2020-12-11 엘지전자 주식회사 Wearable device and method for controlling the same
AU2015279544B2 (en) 2014-06-27 2018-03-15 Apple Inc. Electronic device with rotatable input mechanism for navigating calendar application
US9081421B1 (en) * 2014-06-30 2015-07-14 Linkedin Corporation User interface for presenting heterogeneous content
US20160004393A1 (en) * 2014-07-01 2016-01-07 Google Inc. Wearable device user interface control
EP3195098A2 (en) 2014-07-21 2017-07-26 Apple Inc. Remote user interface
WO2016017956A1 (en) * 2014-07-30 2016-02-04 Samsung Electronics Co., Ltd. Wearable device and method of operating the same
KR102156223B1 (en) * 2014-08-02 2020-09-15 애플 인크. Context-specific user interfaces
US10452253B2 (en) * 2014-08-15 2019-10-22 Apple Inc. Weather user interface
KR102418119B1 (en) * 2014-08-25 2022-07-07 삼성전자 주식회사 Method for organizing a clock frame and an wearable electronic device implementing the same
EP4209872A1 (en) 2014-09-02 2023-07-12 Apple Inc. Phone user interface
USD762692S1 (en) * 2014-09-02 2016-08-02 Apple Inc. Display screen or portion thereof with graphical user interface
JP2017527033A (en) 2014-09-02 2017-09-14 アップル インコーポレイテッド User interface for receiving user input
US10254948B2 (en) * 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
US20160070380A1 (en) * 2014-09-08 2016-03-10 Aliphcom Forming wearable pods and devices including metalized interfaces
JP6191567B2 (en) * 2014-09-19 2017-09-06 コニカミノルタ株式会社 Operation screen display device, image forming apparatus, and display program
US9465788B2 (en) 2014-10-09 2016-10-11 Wrap Media, LLC Authoring tool for the authoring of wrap packages of cards
US9489684B2 (en) 2014-10-09 2016-11-08 Wrap Media, LLC Delivering wrapped packages in response to the selection of advertisements
US9600594B2 (en) 2014-10-09 2017-03-21 Wrap Media, LLC Card based package for distributing electronic media and services
US20160103820A1 (en) 2014-10-09 2016-04-14 Wrap Media, LLC Authoring tool for the authoring of wrap packages of cards
US9448972B2 (en) * 2014-10-09 2016-09-20 Wrap Media, LLC Wrap package of cards supporting transactional advertising
WO2016057188A1 (en) 2014-10-09 2016-04-14 Wrap Media, LLC Active receipt wrapped packages accompanying the sale of products and/or services
KR102283546B1 (en) 2014-10-16 2021-07-29 삼성전자주식회사 Method and Wearable Device for executing application
US20160139628A1 (en) * 2014-11-13 2016-05-19 Li Bao User Programable Touch and Motion Controller
US20160162148A1 (en) * 2014-12-04 2016-06-09 Google Inc. Application launching and switching interface
US9961293B2 (en) * 2014-12-05 2018-05-01 Lg Electronics Inc. Method for providing interface using mobile device and wearable device
KR102230523B1 (en) * 2014-12-08 2021-03-19 신상현 Mobile terminal
US11036386B2 (en) * 2015-01-06 2021-06-15 Lenovo (Singapore) Pte. Ltd. Application switching on mobile devices
US10317938B2 (en) * 2015-01-23 2019-06-11 Intel Corporation Apparatus utilizing computer on package construction
EP3484134B1 (en) 2015-02-02 2022-03-23 Apple Inc. Device, method, and graphical user interface for establishing a relationship and connection between two devices
CN105988701B (en) * 2015-02-16 2019-06-21 阿里巴巴集团控股有限公司 A kind of intelligent wearable device display control method and intelligent wearable device
US20160259491A1 (en) * 2015-03-03 2016-09-08 Olio Devices, Inc. System and method for automatic third party user interface adjustment
US20160259523A1 (en) * 2015-03-06 2016-09-08 Greg Watkins Web Comments with Animation
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
US10379497B2 (en) 2015-03-07 2019-08-13 Apple Inc. Obtaining and displaying time-related data on an electronic watch
WO2016144385A1 (en) * 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
KR20170130391A (en) * 2015-03-25 2017-11-28 엘지전자 주식회사 Watch type mobile terminal and control method thereof
US9582917B2 (en) * 2015-03-26 2017-02-28 Wrap Media, LLC Authoring tool for the mixing of cards of wrap packages
US9600803B2 (en) 2015-03-26 2017-03-21 Wrap Media, LLC Mobile-first authoring tool for the authoring of wrap packages
US20160282947A1 (en) * 2015-03-26 2016-09-29 Lenovo (Singapore) Pte. Ltd. Controlling a wearable device using gestures
CA164671S (en) * 2015-04-03 2016-10-17 Lucis Technologies Holdings Ltd Smart switch panel
US11327640B2 (en) 2015-06-05 2022-05-10 Apple Inc. Providing complications on an electronic device
US10572571B2 (en) * 2015-06-05 2020-02-25 Apple Inc. API for specifying display of complication on an electronic watch
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
US10175866B2 (en) 2015-06-05 2019-01-08 Apple Inc. Providing complications on an electronic watch
US10275116B2 (en) 2015-06-07 2019-04-30 Apple Inc. Browser with docked tabs
EP3337583B1 (en) 2015-08-20 2024-01-17 Apple Inc. Exercise-based watch face
WO2017111903A1 (en) 2015-12-21 2017-06-29 Intel Corporation Integrating system in package (sip) with input/output (io) board for platform miniaturization
KR102475337B1 (en) 2015-12-29 2022-12-08 에스케이플래닛 주식회사 User equipment, control method thereof and computer readable medium having computer program recorded thereon
US10521101B2 (en) 2016-02-09 2019-12-31 Microsoft Technology Licensing, Llc Scroll mode for touch/pointing control
KR20170100951A (en) 2016-02-26 2017-09-05 삼성전자주식회사 A Display Device And Image Displaying Method
US20170357427A1 (en) * 2016-06-10 2017-12-14 Apple Inc. Context-specific user interfaces
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
US10873786B2 (en) 2016-06-12 2020-12-22 Apple Inc. Recording and broadcasting application visual output
US10709422B2 (en) * 2016-10-27 2020-07-14 Clarius Mobile Health Corp. Systems and methods for controlling visualization of ultrasound image data
USD818492S1 (en) * 2017-01-31 2018-05-22 Relativity Oda Llc Portion of a computer screen with an animated icon
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
CN110914787B (en) * 2017-09-05 2022-07-05 三星电子株式会社 Accessing data items on a computing device
US11561679B2 (en) * 2017-11-09 2023-01-24 Rakuten Group Inc. Display control system, display control method, and program for page arrangement of information items
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
DK180171B1 (en) 2018-05-07 2020-07-14 Apple Inc USER INTERFACES FOR SHARING CONTEXTUALLY RELEVANT MEDIA CONTENT
CA186536S (en) * 2018-09-18 2020-09-15 Sony Interactive Entertainment Inc Display screen with transitional graphical user interface
US11422692B2 (en) * 2018-09-28 2022-08-23 Apple Inc. System and method of controlling devices using motion gestures
CN109992340A (en) * 2019-03-15 2019-07-09 努比亚技术有限公司 A kind of desktop display method, wearable device and computer readable storage medium
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
US11340778B2 (en) 2019-05-06 2022-05-24 Apple Inc. Restricted operation of an electronic device
US10852905B1 (en) 2019-09-09 2020-12-01 Apple Inc. Techniques for managing display usage
DK202070624A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
EP4133371A1 (en) 2020-05-11 2023-02-15 Apple Inc. User interfaces for managing user interface sharing
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
US11938376B2 (en) 2021-05-15 2024-03-26 Apple Inc. User interfaces for group workouts
US11630559B2 (en) 2021-06-06 2023-04-18 Apple Inc. User interfaces for managing weather information
CN113434061A (en) * 2021-06-07 2021-09-24 深圳市爱都科技有限公司 Method and device for realizing application entry in dial plate, intelligent watch and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6266098B1 (en) * 1997-10-22 2001-07-24 Matsushita Electric Corporation Of America Function presentation and selection using a rotatable function menu
US6556222B1 (en) * 2000-06-30 2003-04-29 International Business Machines Corporation Bezel based input mechanism and user interface for a smart watch
US7081905B1 (en) * 2000-06-30 2006-07-25 International Business Machines Corporation Method and apparatus for dynamically controlling scroller speed employed for a user interface of a wearable appliance
US20050278757A1 (en) * 2004-05-28 2005-12-15 Microsoft Corporation Downloadable watch faces
SG153805A1 (en) * 2004-07-19 2009-07-29 Creative Tech Ltd Method and apparatus for touch scrolling
US7593755B2 (en) * 2004-09-15 2009-09-22 Microsoft Corporation Display of wireless data
US20070067738A1 (en) * 2005-09-16 2007-03-22 Microsoft Corporation Extensible, filtered lists for mobile device user interface
CN1949161B (en) * 2005-10-14 2010-05-26 鸿富锦精密工业(深圳)有限公司 Multi gradation menu displaying device and display controlling method
US7946758B2 (en) * 2008-01-31 2011-05-24 WIMM Labs Modular movement that is fully functional standalone and interchangeable in other portable devices
US8677285B2 (en) * 2008-02-01 2014-03-18 Wimm Labs, Inc. User interface of a small touch sensitive display for an electronic data and communication device
JP5643809B2 (en) * 2009-04-26 2014-12-17 ナイキ イノベイト セー. フェー. GPS features and functionality of an athletic watch system
CN102053826A (en) * 2009-11-10 2011-05-11 北京普源精电科技有限公司 Grading display method for menus
US20130067392A1 (en) * 2011-09-12 2013-03-14 Microsoft Corporation Multi-Input Rearrange

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Sony Smart Watch Demo in the CES show; https://www.youtube.com/watch?v=SYpvM4pS8Yg (2012.01.11)*

Also Published As

Publication number Publication date
US20130254705A1 (en) 2013-09-26
CN104737114B (en) 2018-12-18
WO2013142049A1 (en) 2013-09-26
CN104737114A (en) 2015-06-24
KR20150067086A (en) 2015-06-17
EP2828732A1 (en) 2015-01-28

Similar Documents

Publication Publication Date Title
KR101890836B1 (en) Multi-axis interface for a touch-screen enabled wearable device
AU2019267413B2 (en) User interfaces for watches
AU2018271366B2 (en) Continuity
US11567644B2 (en) Cursor integration with a touch screen user interface
US10838586B2 (en) Context-specific user interfaces
KR101720849B1 (en) Touch screen hover input handling
RU2678482C2 (en) Electronic device and method for controlling screen display using temperature and humidity
EP2690543B1 (en) Display device for executing multiple applications and method for controlling the same
US20140362119A1 (en) One-handed gestures for navigating ui using touch-screen hover events
US20180253205A1 (en) Wearable device and execution of application in wearable device
US20130179840A1 (en) User interface for mobile device
US20130326415A1 (en) Mobile terminal and control method thereof
CA2865442A1 (en) Method and apparatus for providing a user interface on a device that indicates content operators
EP3657311B1 (en) Apparatus including a touch screen and screen change method thereof
AU2015100490A4 (en) Continuity

Legal Events

Date Code Title Description
N231 Notification of change of applicant
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant