WO2015034966A1 - User interface object manipulations in a user interface - Google Patents

User interface object manipulations in a user interface

Info

Publication number
WO2015034966A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
electronic device
crown
virtual object
response
Prior art date
Application number
PCT/US2014/053958
Other languages
English (en)
French (fr)
Inventor
Nicholas Zambetti
Imran Chaudhri
Jonathan R. DASCOLA
Alan C. DYE
Christopher Patrick FOSS
Aurelio GUZMAN
Chanaka G. KARUNAMUNI
Duncan Robert Kerr
Stephen O. Lemay
Natalia MARIC
Christopher Wilson
Eric Lance WILSON
Lawrence Y. YANG
Gary Ian BUTCHER
Original Assignee
Apple Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to CN201480060082.7A priority Critical patent/CN105683878B/zh
Priority to JP2016537947A priority patent/JP6170250B2/ja
Priority to KR1020187013265A priority patent/KR102131228B1/ko
Priority to KR1020207019035A priority patent/KR102305362B1/ko
Priority to EP14772002.3A priority patent/EP3042271B1/en
Priority to US14/913,350 priority patent/US10275117B2/en
Priority to EP19217240.1A priority patent/EP3677999A1/en
Priority to AU2014315325A priority patent/AU2014315325B2/en
Application filed by Apple Inc. filed Critical Apple Inc.
Priority to KR1020167008488A priority patent/KR102072614B1/ko
Priority to CN201910438645.6A priority patent/CN110262711B/zh
Publication of WO2015034966A1 publication Critical patent/WO2015034966A1/en
Priority to HK16111794.7A priority patent/HK1223699A1/zh
Priority to US16/358,483 priority patent/US11068128B2/en
Priority to US17/212,850 priority patent/US11829576B2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0362 Pointing devices with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G 21/00 Input or output devices integrated in time-pieces
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques using icons
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • This disclosure relates generally to user interfaces and, more specifically, to user interfaces using a crown input mechanism.
  • Advanced personal electronic devices can have small form factors. These personal electronic devices include, but are not limited to, tablets and smart phones. Use of such personal electronic devices involves manipulation of user interface objects on display screens which also have small form factors that complement the design of the personal electronic devices.
  • Exemplary manipulations that users can perform on personal electronic devices include navigating a hierarchy, selecting a user interface object, adjusting the position, size, and zoom of user interface objects, or otherwise manipulating user interfaces.
  • Exemplary user interface objects include digital images, video, text, icons, maps, control elements such as buttons, and other graphics.
  • A user can perform such manipulations in image management software, video editing software, word processing software, software execution platforms such as an operating system's desktop, website browsing software, and other environments.
  • One process can include receiving user input through a crown to rotate a virtual object.
  • the process includes selecting a surface of the object from among the multiple surfaces of the object in response to determining that the crown rotation exceeded a speed threshold.
  • FIG. 1 illustrates an exemplary wearable electronic device according to various examples.
  • FIG. 2 illustrates a block diagram of an exemplary wearable electronic device according to various examples.
  • FIGs. 3-12 illustrate an exemplary graphical user interface showing the selection of a surface of a two-sided object in response to a rotation of a crown.
  • FIG. 13 illustrates an exemplary process for selecting a surface of a two-sided object in response to a rotation of a crown.
  • FIGs. 14-23 illustrate an exemplary graphical user interface showing the selection of a surface of an object in response to a rotation of a crown.
  • FIG. 24 illustrates an exemplary process for selecting a surface of an object in response to a rotation of a crown.
  • FIG. 25 illustrates an exemplary multi-sided object in a graphical user interface.
  • FIG. 26 illustrates an exemplary computing system for manipulating a user interface in response to a rotation of a crown according to various examples.
  • Many personal electronic devices have graphical user interfaces with options that can be activated in response to user inputs.
  • A user can select and activate a particular option from among multiple options. For example, a user may select an option by placing a mouse cursor over the desired option using a pointing device. The user may activate the option by clicking a button of the pointing device while the option is selected.
  • A user may select and activate an option displayed on a touch-sensitive display (also known as a touch screen) by touching the touch-sensitive display at the location of the displayed option.
  • the examples below describe improved techniques for selecting a surface of a user interface object in a graphical user interface using user inputs. More specifically, these techniques use a physical crown as an input device to enable a user to select a desired option by selecting a surface of the user interface object. As a result, the examples described below allow a user to more efficiently and conveniently select a desired option.
  • FIG. 1 illustrates exemplary personal electronic device 100.
  • Device 100 is a watch that generally includes body 102 and strap 104 for affixing device 100 to the body of a user. That is, device 100 is wearable. Body 102 can be designed to couple with strap 104.
  • Device 100 can have touch-sensitive display screen (hereafter touchscreen) 106 and crown 108.
  • Device 100 can also have buttons 110, 112, and 114.
  • The term 'crown,' in the context of a watch, refers to the cap atop a stem for winding the watch.
  • the crown can be a physical component of the electronic device, rather than a virtual crown on a touch sensitive display.
  • Crown 108 can be mechanical, meaning that it can be connected to a sensor for converting physical movement of the crown into electrical signals. Crown 108 can rotate in two directions of rotation (e.g., forward and backward). Crown 108 can also be pushed in towards the body of device 100 and/or be pulled away from device 100.
  • Crown 108 can be touch-sensitive, for example, using capacitive touch technologies that can detect whether a user is touching the crown.
  • Crown 108 can further be rocked in one or more directions or translated along a track along an edge or at least partially around a perimeter of body 102. In some examples, more than one crown 108 can be used. The visual appearance of crown 108 can, but need not, resemble crowns of conventional watches. Buttons 110, 112, and 114, if included, can each be a physical or a touch-sensitive button. That is, the buttons may be, for example, physical buttons or capacitive buttons. Further, body 102, which can include a bezel, may have predetermined regions on the bezel that act as buttons.
  • Display 106 can include a display device, such as a liquid crystal display (LCD), light-emitting diode (LED) display, organic light-emitting diode (OLED) display, or the like, positioned partially or fully behind or in front of a touch sensor panel implemented using any desired touch sensing technology, such as mutual-capacitance touch sensing, self-capacitance touch sensing, resistive touch sensing, projection scan touch sensing, or the like. Display 106 can allow a user to perform various functions by touching or hovering near the touch sensor panel using one or more fingers or other objects.
  • device 100 can further include one or more pressure sensors (not shown) for detecting a force or pressure applied to the display.
  • the force or pressure applied to display 106 can be used as an input to device 100 to perform any desired operation, such as making a selection, entering or exiting a menu, causing the display of additional options/actions, or the like.
  • different operations can be performed based on the amount of force or pressure being applied to display 106.
  • the one or more pressure sensors can further be used to determine a position that the force is being applied to display 106.
  • FIG. 2 illustrates a block diagram of some of the components of device 100.
  • crown 108 can be coupled to encoder 204, which can be configured to monitor a physical state or change of state of crown 108 (e.g., the position of the crown), convert it to an electrical signal (e.g., convert it to an analog or digital signal representation of the position or change in position of crown 108), and provide the signal to processor 202.
  • encoder 204 can be configured to sense the absolute rotational position (e.g., an angle between 0- 360°) of crown 108 and output an analog or digital representation of this position to processor 202.
  • encoder 204 can be configured to sense a change in rotational position (e.g., a change in rotational angle) of crown 108 over some sampling period and to output an analog or digital representation of the sensed change to processor 202.
  • the crown position information can further indicate a direction of rotation of the crown (e.g., a positive value can correspond to one direction and a negative value can correspond to the other).
  • encoder 204 can be configured to detect a rotation of crown 108 in any desired manner (e.g., velocity, acceleration, or the like) and can provide the crown rotational information to processor 202.
  • This information can be provided to other components of device 100. While the examples described herein refer to the use of rotational position of crown 108 to control scrolling, scaling, or an object's position, it should be appreciated that any other physical state of crown 108 can be used.
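The delta-style encoder output described above could be reduced to a speed and a signed direction along these lines (an illustrative Python sketch; the function name, degree units, and sampling scheme are assumptions, not taken from the disclosure):

```python
def crown_velocity(delta_degrees, sample_period_s):
    """Convert a sampled change in crown angle (an encoder delta) into
    an angular velocity in degrees per second. The sign encodes the
    direction of rotation, mirroring the positive/negative position
    values described in the text."""
    if sample_period_s <= 0:
        raise ValueError("sample period must be positive")
    return delta_degrees / sample_period_s

# Example: the crown moved +6 degrees during a 10 ms sampling period.
velocity = crown_velocity(6.0, 0.010)  # forward rotation, 600 deg/s
direction = 1 if velocity > 0 else (-1 if velocity < 0 else 0)
```

A processor polling the encoder each sampling period could feed these values directly into the scrolling or rotation logic described later.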
  • the physical state of the crown can control physical attributes of display 106. For example, if crown 108 is in a particular position (e.g., rotated forward), display 106 can have limited z-axis traversal ability. In other words, the physical state of the crown can represent physical modal functionality of display 106. In some examples, a temporal attribute of the physical state of crown 108 can be used as an input to device 100. For example, a fast change in physical state can be interpreted differently than a slow change in physical state.
  • Processor 202 can be further coupled to receive input signals from buttons 110, 112, and 114, along with touch signals from touch-sensitive display 106.
  • the buttons may be, for example, physical buttons or capacitive buttons.
  • Body 102, which can include a bezel, may have predetermined regions on the bezel that act as buttons.
  • Processor 202 can be configured to interpret these input signals and output appropriate display signals to cause an image to be produced by touch-sensitive display 106. While a single processor 202 is shown, it should be appreciated that any number of processors or other computational devices can be used to perform the general functions discussed above.
  • FIGs. 3-12 illustrate an exemplary user interface 300 displaying a two-sided user interface object 302.
  • Object 302 has a first surface 304 and a second surface 306.
  • Each surface of object 302 is a selectable surface associated with corresponding data.
  • the data may be, for example, text, an image, an application icon, an instruction, a binary ON or OFF option, and the like.
  • a user can select a surface from among the multiple selectable surfaces of object 302 by using a physical crown of a wearable electronic device to rotate object 302 to align the desired selection surface such that the surface is parallel to the display 106 of the device 100 and is displayed on the display 106.
  • The system is designed to transition from one surface to another, rather than stopping in between surfaces.
  • Although examples are described with respect to object surfaces (or planes) being parallel to display 106, the examples can also be modified to instead be described with respect to object surfaces (or planes) facing the viewer of display 106. This modification may be particularly helpful when the object surfaces or display 106 are not planar.
  • Crown 108 of device 100 is a user rotatable user interface input.
  • the crown 108 can be turned in two distinct directions: clockwise and counterclockwise.
  • FIGs. 3-12 include rotation direction arrows illustrating the direction of crown rotation and movement direction arrows illustrating the direction of rotation of a user interface object, where applicable.
  • the rotation direction arrows and movement direction arrows are typically not part of the displayed user interface, but are provided to aid in the interpretation of the figures.
  • a clockwise direction rotation of crown 108 is illustrated by a rotation direction arrow pointing in the up direction.
  • a counterclockwise direction rotation of crown 108 is illustrated by a rotation direction arrow pointing in the down direction.
  • the characteristics of the rotation direction arrow are not indicative of the distance, speed, or acceleration with which crown 108 is rotated by a user. Instead, the rotation direction arrow is indicative of the direction of rotation of crown 108 by the user.
  • first surface 304 of object 302 is aligned parallel to display 106 and is displayed, indicating selection of first surface 304.
  • the selected first surface 304 can be activated through, for example, an additional user input.
  • device 100 determines a change in the position of crown 108 in the clockwise direction, as indicated by rotation direction arrow 308.
  • Device 100 determines a rotational speed and a direction based on the determined change in the position of crown 108.
  • the device rotates object 302, as indicated by movement direction arrow 310 and illustrated in FIG. 4.
  • the rotation of object 302 is based on the determined rotational speed and direction. Rotational speed may be expressed in numerous ways.
  • rotational speed may be expressed as hertz, as rotations per unit of time, as rotations per frame, as revolutions per unit of time, as revolutions per frame, as a change in angle per unit of time, and the like.
  • object 302 may be associated with a mass or may have a calculated rotational inertia.
  • device 100 continues to determine a change in the position of crown 108 in the clockwise direction, as indicated by rotation direction arrow 308.
  • Device 100 determines a rotational speed and a direction based on the determined change in the position of crown 108.
  • the device continues to rotate object 302, as indicated by movement direction arrow 310 and illustrated in FIG. 5-6.
  • the rotation of object 302 is based on the determined rotational speed and direction.
  • The degree of rotation of object 302 is based on the determined speed.
  • object 302 can be thought of as having some similar qualities as an analog tachometer. As the determined speed increases, the degree of rotation of object 302 increases. In this example, if the rotation of crown 108 is maintained at a constant speed, object 302 will stay at a static rotated position that is not parallel to display 106. If the speed of the rotation of crown 108 is increased, the determined speed will increase and object 302 will rotate an additional amount.
  • object 302 is configured to become perpendicular to display 106 in response to the determined speed being at a speed threshold.
  • object 302 exceeds a total rotation of 90 degrees, causing first surface 304 of object 302 to no longer be displayed and instead causing second surface 306 of object 302 to be displayed.
  • This transition between the display of first surface 304 and second surface 306 is illustrated as the transition between FIGs. 7 and 8.
  • the object 302 flips from one side to another side.
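The tachometer-like behavior described above, in which the tilt of the object grows with crown speed and the object flips once it passes perpendicular at the speed threshold, can be sketched as follows (hypothetical Python; the linear speed-to-angle mapping and the specific threshold value are assumptions for illustration):

```python
def tilt_angle(crown_speed, speed_threshold, max_angle=90.0):
    """Map the determined crown speed to a static tilt of the object.
    The tilt grows with speed and reaches the perpendicular position
    (90 degrees) at the speed threshold, at which point the object
    flips past the transition position to its other surface."""
    angle = max_angle * min(crown_speed / speed_threshold, 1.0)
    flipped = crown_speed >= speed_threshold
    return angle, flipped

# Below the threshold the object holds a partial tilt and, once the
# crown stops, settles back parallel to the display.
partial = tilt_angle(2.0, 4.0)
# At or above the threshold the object passes perpendicular and flips.
full = tilt_angle(4.0, 4.0)
```

Whether the mapping is linear or shaped by an inertia model (the text mentions a mass or calculated rotational inertia) is a design choice left open by the disclosure.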
  • device 100 determines that there is no further change in the position of crown 108. As a result of this determination, the rotation of object 302 is changed such that a surface of object 302 is parallel to display 106. This change may be animated, as illustrated in FIGs. 9-12.
  • When device 100 determines that there is no change in the position of crown 108, it rotates object 302 so that the surface then partially facing display 106 becomes the surface displayed parallel to display 106.
  • object 302 is in a steady state. An object is in a steady state when the object is not being translated, rotated, or scaled.
  • object 302 when object 302 is in a steady state, the displayed surface of object 302 that is parallel to display 106 can be activated with an additional input.
  • the displayed surface that is parallel to display 106 in a steady state is determined to be selected even prior to activation.
  • object 302 may be used as an ON/OFF switch or toggle.
  • First surface 304 is associated with an ON instruction, and second surface 306 is associated with an OFF instruction.
  • a user can transition between the ON and OFF states by rotating crown 108 at above a speed threshold, causing object 302 to flip and display a desired surface.
  • the desired surface is determined to be selected when the desired surface is displayed on display 106, is parallel to display 106, and no change in the position of crown 108 is detected.
  • The user can activate the selected surface by one or more of many techniques. For example, the user may press on touch-sensitive display 106, press on touch-sensitive display 106 with a force greater than a predetermined threshold, press button 112, or simply allow the surface to remain selected for a predetermined amount of time. In another example, when the displayed surface is parallel to display 106, the action can be interpreted as both a selection and an activation of the data associated with the displayed surface.
  • FIG. 13 illustrates an exemplary process for selecting a surface of a two-sided graphical user interface object in response to a rotation of a crown.
  • Process 1300 is performed at a wearable electronic device (e.g., device 100 in FIG. 1) having a physical crown.
  • the electronic device also includes a touch-sensitive display. The process provides an efficient technique for selecting a surface of a two-sided, two-dimensional object.
  • the device causes a display of a two-sided object on a touch-sensitive display of a wearable electronic device.
  • the object is two-dimensional.
  • the object is three dimensional but only two surfaces are selectable. Each selectable surface of the object is associated with a corresponding data value.
  • the data may be, for example, text, an image, an application icon, an instruction, a binary ON or OFF option, and the like.
  • the device receives crown position information.
  • the crown position information may be received as a series of pulse signals, real values, integer values, and the like.
  • the device determines whether a change has occurred in a crown distance value.
  • the crown distance value is based on an angular displacement of the physical crown of the wearable electronic device.
  • a change in the crown distance value is indicative of a user providing input to the wearable electronic device by, for example, turning the physical crown. If the device determines that a change in the crown distance value has not occurred, the system returns to block 1304 and continues receiving crown position information. If the device determines that a change in the crown distance value has occurred, the system continues to block 1308, though the system may continue to receive crown position information.
  • the device determines a direction and a crown speed.
  • the crown speed is based on the speed of rotation of the physical crown of the wearable electronic device.
  • the determined crown speed may be expressed as hertz, as rotations per unit of time, as rotations per frame, as revolutions per unit of time, as revolutions per frame, and the like.
  • the determined direction is based on a direction of rotation of the physical crown of the wearable electronic device.
  • an up direction can be determined based on a clockwise rotation of the physical crown.
  • a down direction can be determined based on a counterclockwise rotation of the physical crown.
  • a down direction can be determined based on a clockwise rotation of the physical crown and an up direction can be determined based on a counterclockwise rotation of the physical crown.
  • the device in response to determining the change in the crown distance value, causes an initial rotation of the two-sided object on the display.
  • the amount of the rotation is based on the determined crown speed.
  • the direction of rotation is based on the determined direction.
  • the rotation may be animated.
  • the device determines whether the determined crown speed exceeds a speed threshold. If the device determines that the determined crown speed exceeds the speed threshold, the device continues to block 1314.
  • the speed threshold may be thought of as an escape velocity (or escape speed).
  • An escape velocity is the speed at which the kinetic energy plus the gravitational potential energy of an object is zero. If the device determines that the determined crown speed does not exceed the speed threshold, the device transitions to block 1316.
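The decision flow of blocks 1304-1316 described above might be summarized in code as below (a hedged Python sketch; the function name, the return labels, and the sampling-based speed computation are illustrative, not part of the disclosure):

```python
def process_1300(prev_distance, new_distance, sample_period_s, speed_threshold):
    """One pass through blocks 1306-1316: detect a change in the crown
    distance value, derive a crown speed and direction, then either
    flip the object past the transition position (block 1314) or let
    it partially rotate and fall back (block 1316)."""
    delta = new_distance - prev_distance
    if delta == 0:
        return "keep_listening"            # block 1306 -> back to 1304
    speed = abs(delta) / sample_period_s   # block 1308: crown speed
    direction = "up" if delta > 0 else "down"
    if speed > speed_threshold:            # block 1312: escape velocity?
        return f"flip_{direction}"         # block 1314
    return f"partial_rotate_{direction}"   # block 1316
```

For example, a 10-unit change over a 10 ms sample against a threshold of 500 units/s yields a flip, while a 2-unit change in the same period only produces a partial rotation that settles back.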
  • the minimum angular velocity of crown rotation that is necessary to reach escape velocity corresponds directly to the instantaneous angular velocity of crown 108 (FIG. 1).
  • the minimum angular velocity of crown rotation necessary for reaching the escape velocity is a calculated velocity that is based on, but not directly equal to, the instantaneous ("current") angular velocity of crown 108.
  • device 100 can maintain a calculated crown (angular) velocity V in discrete moments in time T according to equation 1:
  • V_T = V_(T−1) + ΔV_CROWN − ΔV_DRAG (EQ. 1)
  • V_T represents a calculated crown velocity (speed and direction) at time T,
  • V_(T−1) represents the previous velocity (speed and direction) at time T−1,
  • ΔV_CROWN represents the change in velocity caused by the force being applied through the rotation of the crown at time T, and
  • ΔV_DRAG represents the change in velocity due to a drag force.
  • The force being applied, which is reflected through ΔV_CROWN, can depend on the current velocity of angular rotation of the crown.
  • ΔV_CROWN can also depend on the current angular velocity of the crown.
  • device 100 can provide user interface interactions based not only on instantaneous crown velocity but also on user input in the form of crown movement over multiple time intervals, even if those intervals are finely divided. Note that, typically, in the absence of user input in the form of ΔV_CROWN, V_T will approach (and become) zero based on ΔV_DRAG in accordance with EQ. 1, but V_T would not change signs without user input in the form of crown rotation.
  • The greater the velocity of angular rotation of the crown, the greater the value of ΔV_CROWN will be.
  • The actual mapping between the velocity of angular rotation of the crown and ΔV_CROWN can be varied depending on the desired user interface effect. For example, various linear or non-linear mappings between the velocity of angular rotation of the crown and ΔV_CROWN can be used.
  • ΔV_DRAG can take on various values.
  • ΔV_DRAG can depend on the velocity of crown rotation such that, at greater velocities, a greater opposing change in velocity (ΔV_DRAG) can be produced.
  • ΔV_DRAG can have a constant value. It should be appreciated that the above-described requirements of ΔV_CROWN and ΔV_DRAG can be changed to produce desirable user interface effects.
  • V_T, the maintained velocity, can continue to increase as long as ΔV_CROWN is greater than ΔV_DRAG.
  • V_T can have non-zero values even when no ΔV_CROWN input is being received, meaning that user interface objects can continue to change without the user rotating the crown.
  • Objects can stop changing based on the maintained velocity at the time the user stops rotating the crown and the ΔV_DRAG component.
  • The V_(T−1) component can be reset to a value of zero, allowing the user to quickly change the direction of the object without having to provide a force sufficient to offset V_T.
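EQ. 1 can be exercised with a small simulation (illustrative Python; modeling ΔV_DRAG as proportional to the current velocity is one of the options the text allows, not the only one, and the drag coefficient is an assumed value):

```python
def update_velocity(v_prev, dv_crown, drag_coeff=0.1):
    """One discrete step of EQ. 1: V_T = V_(T-1) + dV_CROWN - dV_DRAG.
    Here dV_DRAG is proportional to the current velocity and opposes
    it; the disclosure also allows a constant drag value instead."""
    dv_drag = drag_coeff * v_prev
    return v_prev + dv_crown - dv_drag

# With no further crown input (dV_CROWN = 0), the maintained velocity
# decays toward zero without ever changing sign, as the text notes.
v = 10.0
for _ in range(50):
    v = update_velocity(v, 0.0)
# v is now a small positive residue approaching zero.
```

Because the drag term scales with the velocity, a faster-spinning object sheds speed faster, which matches the example above where greater velocities can produce a greater opposing ΔV_DRAG.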
  • the device causes the object to flip past a transition position between a first surface that was last selected and a second surface.
  • the object has flipped past the transition position when the object will not return to having the first surface displayed parallel to the display without receiving additional user input.
  • the transition position may be when the surface is perpendicular to the display.
  • the displayed surface that is parallel to the display can be activated by a designated user input.
  • the displayed surface that is parallel to the display in a steady state is determined to be selected even prior to activation.
  • An object is in a steady state when the object is not being translated, rotated, or scaled. This may result in the first surface of the object no longer being displayed, in the case of a cube-shaped object.
  • the device causes the object to at least partially return to the object's initial position at the time of block 1302. For example, part of the initial rotation of the object caused at block 1310 can be negated. To achieve this, the device animates a rotation of the object that is in an opposite direction of the initial rotation at block 1310.
  • FIGs. 14-23 illustrate an exemplary graphical user interface showing the selection of a surface of a cube object in response to a rotation of a crown.
  • Object 1402 is a cube with six surfaces. In this example, four of the six surfaces are selectable. These four selectable surfaces include surface 1404 of object 1402, which is facing a viewer of display 106, the top surface of object 1402, the bottom surface of object 1402, and the back surface of object 1402. In this example, the left and right surfaces of object 1402 are not selectable. However, the left and right surfaces of object 1402 may be selectable in other examples.
  • examples are described with respect to object surfaces (or planes) being parallel to display 106, but the examples can also be modified to instead be described with respect to object surfaces (or planes) facing the viewer of display 106. This modification may be particularly helpful when the object surfaces or display 106 are not planar.
  • Each selectable surface of object 1402 is associated with corresponding data.
  • the data may be, for example, text, an image, an application icon, an instruction, a quad-state setting (such as Off/Low/Medium/High), and the like.
  • a user can select a surface from among the multiple selectable surfaces of the object 1402 by using a physical crown of a wearable electronic device to rotate object 1402 to align the desired selection surface such that it is parallel to the display 106 and displayed on display 106.
  • Crown 108 of device 100 is a user rotatable user interface input.
  • the crown 108 can be turned in two distinct directions: clockwise and counterclockwise.
  • FIGs. 14-23 include rotation direction arrows illustrating the direction of crown rotation and movement direction arrows illustrating the direction of rotation of a user interface object, where applicable.
  • the rotation direction arrows and movement direction arrows are typically not part of the displayed user interface, but are provided to aid in the interpretation of the figures.
  • a clockwise direction rotation of crown 108 is illustrated by a rotation direction arrow pointing in the up direction.
  • a counterclockwise direction rotation of crown 108 is illustrated by a rotation direction arrow pointing in the down direction.
  • the characteristics of the rotation direction arrow are not indicative of the distance, speed, or acceleration with which crown 108 is rotated by a user. Instead, the rotation direction arrow is indicative of the direction of rotation of crown 108 by the user.
  • first surface 1404 of object 1402 is aligned parallel to display 106 and is displayed, indicating selection of first surface 1404.
  • device 100 determines a change in the position of crown 108 in the counterclockwise direction, as indicated by rotation direction arrow 1502.
  • Device 100 determines a rotational speed and a direction based on the determined change in the position of crown 108.
  • the device rotates object 1402, as indicated by movement direction arrow 1504 and illustrated in FIG. 15.
  • the rotation of object 1402 is based on the determined rotational speed and direction. Rotational speed may be expressed in numerous ways.
  • rotational speed may be expressed as hertz, as rotations per unit of time, as rotations per frame, as revolutions per unit of time, as revolutions per frame, and the like.
  • object 1402 may be associated with a mass or may have a calculated rotational inertia.
  • device 100 continues to determine a change in the position of crown 108 in the counterclockwise direction, as indicated by rotation direction arrow 1502.
  • Device 100 determines a rotational speed and a direction based on the determined change in the position of crown 108.
  • the device continues to rotate object 1402, as indicated by movement direction arrow 1504 and illustrated in FIG. 16.
  • the rotation of object 1402 is based on the determined rotational speed and direction.
  • the degree of rotation of object 1402 is based on the determined speed. As the determined speed increases, the degree of rotation of object 1402 increases. In this example, if the rotation of crown 108 is maintained at a constant speed, object 1402 will stay at a static rotated position where no surface of object 1402 is parallel to display 106. If the speed of the rotation of crown 108 is increased, the determined speed will increase and object 1402 will rotate an additional amount.
  • object 1402 is configured to rotate to have a surface parallel to display 106 in response to the determined speed being above a speed threshold.
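The below-threshold tilt and above-threshold flip behavior described in the bullets above can be sketched as follows. This is an illustrative model only; the threshold value, the speed-to-tilt mapping, and all names are assumptions, not taken from the specification.

```python
SPEED_THRESHOLD = 1.5   # assumed threshold, in rotations per second
MAX_TILT_DEG = 45.0     # tilt at which the flip transition occurs

def object_rotation(crown_speed, tilt_per_speed=20.0):
    """Return (tilt_degrees, flips) for a given crown rotational speed.

    Below the threshold the object tilts in proportion to crown speed and
    holds a static rotated position; at or above the threshold it rotates
    past the 45-degree transition and commits to a flip.
    """
    if crown_speed >= SPEED_THRESHOLD:
        return MAX_TILT_DEG, True
    return min(crown_speed * tilt_per_speed, MAX_TILT_DEG), False
```

Under this sketch, a constant sub-threshold crown speed yields a constant tilt with no flip, matching the static rotated position described above.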
  • object 1402 exceeds a rotation of 45 degrees, causing first surface 1404 of object 1402 to rotate away from the display to no longer be displayed and instead causing second surface 1406 of object 1402 to rotate toward the display to be displayed.
  • This transition between the display of first surface 1404 and second surface 1406 is illustrated as the transition between FIGs. 16 and 17.
  • the object 1402 flips from one surface to another surface.
  • device 100 determines that there is no change in the position of crown 108. As a result of this determination, object 1402 is rotated such that a displayed surface of object 1402 is parallel to display 106. This rotation may be animated, as illustrated in FIGs. 17-18. Device 100 will rotate object 1402 such that the displayed surface of object 1402 that has the smallest angle with respect to the display is made parallel to the display 106. In other words, the object's surface that best faces the display 106 or is closest to parallel to display 106 is made parallel to the display 106. When a surface of object 1402 is parallel to display 106 and no change in the position of crown 108 is detected, object 1402 is in a steady state. An object is in a steady state when the object is not being translated, rotated, or scaled.
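The snap-back behavior described above, in which the displayed surface with the smallest angle to the display is made parallel to display 106, amounts to rounding the object's rotation to the nearest surface angle. A minimal sketch for a four-surface cube (90-degree spacing and the function name are assumptions for illustration):

```python
def snap_to_nearest_surface(angle_deg):
    """Snap a rotation angle (degrees) to the nearest multiple of 90,
    i.e. make the surface closest to parallel actually parallel to the
    display. The result is normalized to 0, 90, 180, or 270 degrees."""
    return round(angle_deg / 90.0) % 4 * 90.0
```

Note that Python's `round` uses banker's rounding, so an angle exactly at the 45-degree transition snaps toward the even multiple; a real implementation would resolve the transition case by the direction of travel.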
  • object 1402 when object 1402 is in a steady state, the surface of object 1402 that is parallel to display 106 and displayed on display 106 is determined to be selected.
  • object 1402 may be used as a four-phase selection switch.
  • First surface 1404 is associated with a LOW setting instruction and second surface 1406 is associated with a MEDIUM setting instruction. The remaining two selectable surfaces are associated with HIGH and OFF setting instructions.
  • a user can transition between the four settings by rotating crown 108 at above a speed threshold, causing object 1402 to flip and display a desired surface.
  • the desired surface is determined to be selected when the displayed surface is parallel to display 106 and no change in the position of crown 108 is detected.
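The four-phase switch described above can be modeled as a circular mapping from surface index to setting instruction, where each completed flip advances the selection. The ordering of the settings here is assumed for illustration:

```python
# Hypothetical mapping of the cube's four selectable surfaces to settings.
SURFACE_SETTINGS = ["LOW", "MEDIUM", "HIGH", "OFF"]

def setting_after_flips(start_index, flips):
    """Setting selected after a number of flips in one direction,
    wrapping around the four selectable surfaces."""
    return SURFACE_SETTINGS[(start_index + flips) % len(SURFACE_SETTINGS)]
```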
  • FIGs. 20-23 illustrate a second flip of object 1402 to select third surface 2002 of object 1402.
  • device 100 determines a change in the position of crown 108 in the counterclockwise direction, as indicated by rotation direction arrow 1502.
  • Device 100 determines a rotational speed and a direction based on the determined change in the position of crown 108. In response to determining the change in the position of crown 108, the device rotates object 1402, as indicated by movement direction arrow 1504 and illustrated in FIGs. 21-22. The rotation of object 1402 is based on the determined rotational speed and direction.
  • object 1402 flips to cause third surface 2002 to be parallel to display 106 and to be displayed on display 106, as illustrated in FIG. 23.
  • An object is in a steady state when the object is not being translated, rotated, or scaled.
  • the surface of object 1402 that is parallel to display 106 and displayed on display 106 is determined to be selected.
  • third surface 2002 is selected.
  • FIG. 24 illustrates an exemplary process for selecting a surface of a multi-sided graphical user interface object in response to a rotation of a crown.
  • Process 2400 is performed at a wearable electronic device (e.g., device 100 in FIG. 1) having a physical crown.
  • the electronic device also includes a touch-sensitive display. The process provides an efficient technique for selecting a surface of a multi-sided, three-dimensional object.
  • the device causes a display of a multi-sided object on a touch-sensitive display of a wearable electronic device.
  • Each selectable surface of the object is associated with a corresponding data value.
  • the data may be, for example, text, an image, an application icon, an instruction, and the like.
  • the device receives crown position information.
  • the crown position information may be received as a series of pulse signals, real values, integer values, and the like.
  • the device determines whether a change has occurred in a crown distance value.
  • the crown distance value is based on an angular displacement of the physical crown of the wearable electronic device.
  • a change in the crown distance value is indicative of a user providing input to the wearable electronic device by, for example, turning the physical crown. If the device determines that a change in the crown distance value has not occurred, the system returns to block 2404 and continues receiving crown position information. If the device determines that a change in the crown distance value has occurred, the system continues to block 2408, though the system may continue to receive crown position information.
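Blocks 2404-2408 amount to polling the crown position and reacting only to changes in the crown distance value. A hedged sketch, assuming the position arrives as encoder pulse counts and using an illustrative degrees-per-pulse resolution (real encoder resolution is device-specific):

```python
DEG_PER_PULSE = 2.0  # assumed angular resolution per encoder pulse

def crown_events(pulse_stream):
    """Yield a signed angular displacement (degrees) for each change in
    crown position; unchanged positions produce no event, mirroring the
    return to block 2404 when no change has occurred."""
    last = None
    for pulses in pulse_stream:
        if last is not None and pulses != last:
            yield (pulses - last) * DEG_PER_PULSE
        last = pulses
```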
  • the device determines a direction and a crown speed.
  • the crown speed is based on the speed of rotation of the physical crown of the wearable electronic device.
  • the determined crown speed may be expressed as hertz, as rotations per unit of time, as rotations per frame, as revolutions per unit of time, as revolutions per frame, and the like.
  • the determined direction is based on a direction of rotation of the physical crown of the wearable electronic device.
  • an up direction can be determined based on a clockwise rotation of the physical crown.
  • a down direction can be determined based on a counterclockwise rotation of the physical crown.
  • a down direction can be determined based on a clockwise rotation of the physical crown and an up direction can be determined based on a counterclockwise rotation of the physical crown.
  • the device in response to determining the change in the crown distance value, causes an initial rotation of the multi-sided object on the display.
  • the amount of the rotation is based on the determined crown speed.
  • the direction of rotation is based on the determined direction.
  • the rotation may be animated.
  • the device determines whether the determined crown speed exceeds a speed threshold. If the device determines that the determined crown speed exceeds the speed threshold, the device continues to block 2414.
  • the speed threshold may be thought of as an escape velocity (or escape speed).
  • An escape velocity is the speed at which the kinetic energy plus the gravitational potential energy of an object is zero. If the device determines that the determined speed does not exceed the speed threshold, the device continues to block 2416.
  • the minimum angular velocity of crown rotation that is necessary to reach escape velocity corresponds directly to the instantaneous angular velocity of crown 108 (FIG. 1), meaning that the user interface of device 100, in essence, responds when crown 108 reaches a sufficient angular velocity.
  • the minimum angular velocity of crown rotation necessary for reaching the escape velocity is a calculated velocity that is based on, but not directly equal to, the instantaneous ("current") angular velocity of crown 108.
  • device 100 can maintain a calculated crown (angular) velocity V in discrete moments in time T according to equation 1:
  • VT = V(T-1) + ΔVCROWN - ΔVDRAG (EQ. 1)
  • VT represents a calculated crown velocity (speed and direction) at time T
  • V(T-1) represents the previous velocity (speed and direction) at time T-1
  • ΔVCROWN represents the change in velocity caused by the force being applied through the rotation of the crown at time T
  • ΔVDRAG represents the change in velocity due to a drag force.
  • the force being applied, which is reflected through ΔVCROWN, can depend on the current velocity of angular rotation of the crown.
  • ΔVCROWN can also depend on the current angular velocity of the crown.
  • device 100 can provide user interface interactions based not only on instantaneous crown velocity but also based on user input in the form of crown movement over multiple time intervals, even if those intervals are finely divided. Note that, typically, in the absence of user input in the form of ΔVCROWN, VT will approach (and become) zero based on ΔVDRAG in accordance with EQ. 1, but VT would not change signs without user input in the form of crown rotation.
  • the greater the velocity of angular rotation of the crown, the greater the value of ΔVCROWN will be.
  • the actual mapping between the velocity of angular rotation of the crown and ΔVCROWN can be varied depending on the desired user interface effect. For example, various linear or non-linear mappings between the velocity of angular rotation of the crown and ΔVCROWN can be used.
  • ΔVDRAG can take on various values.
  • ΔVDRAG can depend on the velocity of crown rotation such that, at greater velocities, a greater opposing change in velocity (ΔVDRAG) can be produced.
  • ΔVDRAG can have a constant value. It should be appreciated that the above-described requirements of ΔVCROWN and ΔVDRAG can be changed to produce desirable user interface effects.
  • VT, the maintained velocity, can continue to increase as long as ΔVCROWN is greater than ΔVDRAG.
  • VT can have non-zero values even when no ΔVCROWN input is being received, meaning that user interface objects can continue to change without the user rotating the crown.
  • objects can stop changing based on the maintained velocity at the time the user stops rotating the crown and the ΔVDRAG component.
  • the V(T-1) component can be reset to a value of zero, allowing the user to quickly change the direction of the object without having to provide a force sufficient to offset VT.
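EQ. 1 can be exercised with a small update function. This sketch uses a constant ΔVDRAG (one of the options described above) so that VT decays to exactly zero in the absence of user input while never changing sign on its own; the drag value and names are illustrative, not from the specification. Resetting the V(T-1) component on a direction change corresponds to calling the update with a previous velocity of zero.

```python
def step_velocity(v_prev, dv_crown, drag=0.5):
    """One discrete update of the maintained crown velocity per EQ. 1:
    VT = V(T-1) + ΔVCROWN - ΔVDRAG, with constant-magnitude drag that
    opposes the current direction of motion."""
    v = v_prev + dv_crown
    if abs(v) <= drag:
        return 0.0                      # drag brings the object to rest
    return v - drag if v > 0 else v + drag
```

Iterating `step_velocity` with `dv_crown=0.0` walks VT toward zero in fixed steps, so objects keep moving briefly after the user stops rotating the crown and then stop.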
  • the device causes the object to flip past a transition position between a first surface that was last selected and a new surface. For example, the object has flipped past the transition position when the object will not return to having the first surface displayed parallel to the display without receiving additional user input.
  • the displayed surface that is parallel to the display can be activated through a designated user input.
  • the displayed surface parallel to the display in the steady state is determined to be selected even before activation.
  • An object is in a steady state when the object is not being translated, rotated, or scaled. This may result in the first surface of the object no longer being displayed, in the case of a cube-shaped object.
  • the device causes the object to at least partially return to the object's initial position at the time of block 2408. For example, part of the initial rotation of the object caused at block 2410 can be negated. To achieve this, the device animates a rotation of the object that is in an opposite direction of the initial rotation at block 2410.
  • FIG. 25 illustrates a graphical user interface 2500 showing the selection of a surface 2506 of a multi-sided object in response to a rotation of a crown.
  • Object 2502 is a 12-sided rotatable dial, shaped similar to a wheel.
  • Object 2502 is rotatable along a fixed axis.
  • all 12 surfaces of object 2502 are selectable. These 12 selectable surfaces include surface 2504, surface 2506, surface 2508, surface 2510, and surface 2512.
  • surface 2508 is selected because surface 2508 is parallel to display 106 and is displayed on display 106.
  • the selectable surfaces of object 2502 can be selected according to the processes and techniques described in other examples.
  • device 100 can provide haptic feedback based on the content displayed on the display 106.
  • the device can modify the appearance of the object based on a change in a crown distance value received at the device 100 based on a rotation of crown 108.
  • a tactile output is output at the device 100.
  • the object is a rotatable multi-sided object, such as is described above.
  • the criterion is satisfied when a surface of the multi-sided object is selected.
  • the criterion is satisfied each time a displayed surface of the multi-sided object passes through a plane parallel to the display.
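Counting how many times a displayed surface passes through the plane parallel to the display reduces to counting crossings of multiples of the inter-surface angle by the cumulative rotation. A sketch assuming 90-degree spacing for a four-surface object (the spacing and names are assumptions; a real device would emit one tactile output per crossing):

```python
import math

def haptic_count(start_deg, end_deg, spacing=90.0):
    """Number of parallel-plane crossings as the cumulative rotation
    moves between two angles, in either direction."""
    lo, hi = sorted((start_deg, end_deg))
    return math.floor(hi / spacing) - math.floor(lo / spacing)
```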
  • System 2600 can include instructions stored in a non-transitory computer readable storage medium, such as memory 2604 or storage device 2602, and executed by processor 2606.
  • the instructions can also be stored and/or transported within any non-transitory computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • a "non-transitory computer readable storage medium” can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the non-transitory computer readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secure digital cards, USB memory devices, memory sticks, and the like.
  • the instructions can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • a "transport medium" can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.
  • system 2600 can be included within device 100.
  • processor 2606 can be the same as or a different processor than processor 202.
  • Processor 2606 can be configured to receive the output from encoder 204, buttons 110, 112, and 114, and from touch-sensitive display 106.
  • Processor 2606 can process these inputs as described above with respect to the processes described and illustrated. It is to be understood that the system is not limited to the components and configuration of FIG. 26, but can include other or additional components in multiple configurations according to various examples.
PCT/US2014/053958 2012-12-29 2014-09-03 User interface object manipulations in a user interface WO2015034966A1 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201361959851P 2013-09-03 2013-09-03
US201361873356P 2013-09-03 2013-09-03
US201361873359P 2013-09-03 2013-09-03
US201361873360P 2013-09-03 2013-09-03
US61/873,356 2013-09-03
US61/873,360 2013-09-03
US61/959,851 2013-09-03
US61/873,359 2013-09-03
US201414476657A 2014-09-03 2014-09-03
US14/476,657 2014-09-03

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US201414476657A Continuation-In-Part 2012-12-29 2014-09-03

Related Child Applications (4)

Application Number Title Priority Date Filing Date
US201414476657A A-371-Of-International 2012-12-29 2014-09-03
US14/913,350 A-371-Of-International US10275117B2 (en) 2012-12-29 2014-09-03 User interface object manipulations in a user interface
US201614913350A Continuation-In-Part 2013-09-03 2016-02-19
US16/358,483 Continuation US11068128B2 (en) 2013-09-03 2019-03-19 User interface object manipulations in a user interface

Publications (1)

Publication Number Publication Date
WO2015034966A1 true WO2015034966A1 (en) 2015-03-12

Family

ID=51589515

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/US2014/053957 WO2015034965A1 (en) 2012-12-29 2014-09-03 User interface for manipulating user interface objects
PCT/US2014/053958 WO2015034966A1 (en) 2012-12-29 2014-09-03 User interface object manipulations in a user interface
PCT/US2014/053951 WO2015034960A1 (en) 2012-12-29 2014-09-03 Crown input for a wearable electronic device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2014/053957 WO2015034965A1 (en) 2012-12-29 2014-09-03 User interface for manipulating user interface objects

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2014/053951 WO2015034960A1 (en) 2012-12-29 2014-09-03 Crown input for a wearable electronic device

Country Status (5)

Country Link
JP (11) JP6397918B2 (ja)
KR (12) KR102305362B1 (ja)
AU (11) AU2014315325B2 (ja)
DK (1) DK179231B1 (ja)
WO (3) WO2015034965A1 (ja)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105068412A (zh) * 2015-08-26 2015-11-18 广东欧珀移动通信有限公司 一种智能手表及操作方法
CN105117119A (zh) * 2015-07-28 2015-12-02 广东欧珀移动通信有限公司 一种屏幕画面旋转的方法及智能手表
CN105117001A (zh) * 2015-07-28 2015-12-02 广东欧珀移动通信有限公司 一种智能手表的表冠及智能手表的操作方法
WO2016171467A1 (en) * 2015-04-23 2016-10-27 Samsung Electronics Co., Ltd. Electronic device including rotary member and display method thereof
US10324620B2 (en) 2016-09-06 2019-06-18 Apple Inc. Processing capacitive touch gestures implemented on an electronic device
US11281167B2 (en) 2016-01-14 2022-03-22 Huawei Technologies Co., Ltd. Electronic device and a method of operating such an electronic device

Families Citing this family (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9100493B1 (en) * 2011-07-18 2015-08-04 Andrew H B Zhou Wearable personal digital device for facilitating mobile device payments and personal use
TWI439960B (zh) 2010-04-07 2014-06-01 Apple Inc 虛擬使用者編輯環境
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US9753436B2 (en) 2013-06-11 2017-09-05 Apple Inc. Rotary input mechanism for an electronic device
EP3014400B1 (en) 2013-08-09 2020-06-03 Apple Inc. Tactile switch for an electronic device
US10394325B2 (en) 2013-12-10 2019-08-27 Apple Inc. Input friction mechanism for rotary inputs of electronic devices
US10048802B2 (en) 2014-02-12 2018-08-14 Apple Inc. Rejection of false turns of rotary inputs for electronic devices
KR102130259B1 (ko) 2014-09-02 2020-07-03 애플 인크. 웨어러블 전자 디바이스
US10145712B2 (en) 2014-09-09 2018-12-04 Apple Inc. Optical encoder including diffuser members
US9829350B2 (en) 2014-09-09 2017-11-28 Apple Inc. Magnetically coupled optical encoder
US9651405B1 (en) 2015-03-06 2017-05-16 Apple Inc. Dynamic adjustment of a sampling rate for an optical encoder
KR101993073B1 (ko) 2015-03-08 2019-06-25 애플 인크. 회전가능 및 병진가능한 입력 메커니즘을 위한 압축성 밀봉부
ES2890451T3 (es) * 2015-03-27 2022-01-19 Saronikos Trading & Services Unipessoal Lda Reloj electrónico de pulsera o de bolsillo que comprende una corona giratoria
KR102406102B1 (ko) * 2015-04-24 2022-06-10 삼성전자주식회사 전자 장치 및 그의 표시 방법
KR20160131275A (ko) * 2015-05-06 2016-11-16 엘지전자 주식회사 와치 타입 단말기
KR102356449B1 (ko) * 2015-05-13 2022-01-27 삼성전자주식회사 회전 입력에 따른 추가 정보 제공 방법 및 장치
CN105117143B (zh) * 2015-07-28 2020-07-03 Oppo广东移动通信有限公司 一种信息展示方法、智能手表、服务器以及系统
CN105138116B (zh) * 2015-07-28 2018-07-06 广东欧珀移动通信有限公司 一种信息展示方法、智能手表、终端设备以及系统
CN105141755A (zh) * 2015-07-28 2015-12-09 广东欧珀移动通信有限公司 一种信息回复方法、智能手表、终端设备以及系统
CN106708379B (zh) * 2015-07-28 2020-01-10 Oppo广东移动通信有限公司 一种界面操作方法、装置以及智能手表
CN105117120B (zh) * 2015-07-28 2017-07-11 广东欧珀移动通信有限公司 一种智能手表的表冠及智能手表的操作方法
CN105068738A (zh) * 2015-07-28 2015-11-18 广东欧珀移动通信有限公司 一种智能手表的控制方法及智能手表
CN105025629B (zh) * 2015-07-28 2019-11-29 Oppo广东移动通信有限公司 一种智能手表的控制方法及智能手表
CN105117002B (zh) * 2015-07-28 2017-07-11 广东欧珀移动通信有限公司 一种智能手表的表冠及智能手表的操作方法
CN105117121B (zh) * 2015-07-28 2019-04-02 Oppo广东移动通信有限公司 一种智能手表求助的方法及智能手表
CN105005479B (zh) * 2015-07-28 2018-06-29 广东欧珀移动通信有限公司 一种闹钟关闭方法及智能手表
CN105022947B (zh) * 2015-07-28 2019-02-22 Oppo广东移动通信有限公司 一种智能手表的指纹识别方法及智能手表
CN105116996A (zh) * 2015-07-28 2015-12-02 广东欧珀移动通信有限公司 一种智能手表的控制方法及智能手表
CN105025630A (zh) * 2015-07-28 2015-11-04 广东欧珀移动通信有限公司 一种亮度调节方法及智能手表
CN105116998B (zh) * 2015-07-28 2019-05-21 Oppo广东移动通信有限公司 一种快速打开文件的方法及智能手表
CN105116997B (zh) * 2015-07-28 2018-05-29 广东欧珀移动通信有限公司 一种数据加密、解密的方法及智能手表
CN105137746B (zh) * 2015-07-28 2018-03-27 广东欧珀移动通信有限公司 一种接收频率调节方法及智能手表
CN105117118B (zh) * 2015-07-28 2019-02-01 Oppo广东移动通信有限公司 一种控制视频播放的方法及智能手表
CN105137819B (zh) * 2015-07-28 2019-07-02 Oppo广东移动通信有限公司 一种音乐播放的方法及智能手表
CN105389074A (zh) * 2015-07-28 2016-03-09 广东欧珀移动通信有限公司 一种智能手表的控制方法及智能手表
CN105013175A (zh) * 2015-07-28 2015-11-04 广东欧珀移动通信有限公司 一种游戏运动控制方法及智能手表
CN105117150B (zh) * 2015-07-28 2021-06-04 Oppo广东移动通信有限公司 一种空调控制方法及智能手表
EP4327731A3 (en) 2015-08-20 2024-05-15 Apple Inc. Exercise-based watch face
CN105117129A (zh) * 2015-08-26 2015-12-02 广东欧珀移动通信有限公司 一种界面操作方法、装置以及智能手表
CN105117011B (zh) * 2015-08-26 2017-08-29 广东欧珀移动通信有限公司 一种应用程序操作方法、装置以及智能手表
CN105208675B (zh) * 2015-08-26 2018-09-04 广东欧珀移动通信有限公司 一种基于智能手表的无线连接方法及智能手表
CN105117014B (zh) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 一种交友管理方法及智能手表
CN105117013B (zh) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 一种智能手表的解锁方法及智能手表
CN105068742B (zh) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 一种智能手表的控制方法及智能手表
CN105224072B (zh) * 2015-08-26 2018-07-06 广东欧珀移动通信有限公司 一种音乐播放的控制方法及智能手表
CN105204893B (zh) * 2015-08-26 2018-07-06 广东欧珀移动通信有限公司 一种应用程序控制方法及智能手表
CN105068847B (zh) * 2015-08-26 2016-12-28 广东欧珀移动通信有限公司 一种应用程序启动方法及智能手表
CN105227201B (zh) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 一种通信信息回复方法及智能手表
CN105224193B (zh) * 2015-08-26 2018-05-29 广东欧珀移动通信有限公司 一种智能手表的控制方法及智能手表
CN105117012B (zh) * 2015-08-26 2018-06-29 广东欧珀移动通信有限公司 一种显示界面调整方法及智能手表
CN105224208B (zh) * 2015-08-26 2018-07-06 广东欧珀移动通信有限公司 一种页面显示的方法及智能手表
CN105117010B (zh) * 2015-08-26 2018-12-11 广东欧珀移动通信有限公司 一种启动应用程序的方法及智能手表
US9983029B2 (en) 2015-09-30 2018-05-29 Apple Inc. Integrated optical encoder for tilt able rotatable shaft
US10503271B2 (en) 2015-09-30 2019-12-10 Apple Inc. Proximity detection for an input mechanism of an electronic device
WO2017126727A1 (ko) * 2016-01-22 2017-07-27 엘지전자 주식회사 와치 타입 이동 단말기 및 그의 동작 방법
US10048837B2 (en) 2016-02-16 2018-08-14 Google Llc Target selection on a small form factor display
WO2017152139A1 (en) * 2016-03-04 2017-09-08 Apple Inc. Input with haptic feedback
US10025399B2 (en) * 2016-03-16 2018-07-17 Lg Electronics Inc. Watch type mobile terminal and method for controlling the same
CN107203261B (zh) * 2016-03-16 2022-05-24 Lg电子株式会社 手表型移动终端及其控制方法
JP6927670B2 (ja) * 2016-05-26 2021-09-01 株式会社アイ・オー・データ機器 操作受付装置、プログラム、および操作受付方法
US10061399B2 (en) 2016-07-15 2018-08-28 Apple Inc. Capacitive gap sensor ring for an input device
US10019097B2 (en) 2016-07-25 2018-07-10 Apple Inc. Force-detecting input structure
KR102607562B1 (ko) * 2016-08-30 2023-11-30 삼성전자주식회사 베젤 기반 인터랙션에 따른 비주얼 이펙트 제공 방법 및 이를 구현한 전자 장치
KR102507252B1 (ko) * 2016-09-23 2023-03-07 애플 인크. 워치 극장 모드
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
DK179555B1 (en) 2017-05-16 2019-02-13 Apple Inc. USER INTERFACE FOR A FLASHLIGHT MODE ON AN ELECTRONIC DEVICE
US10962935B1 (en) 2017-07-18 2021-03-30 Apple Inc. Tri-axis force sensor
US10203662B1 (en) 2017-09-25 2019-02-12 Apple Inc. Optical position sensor for a crown
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
DK180212B1 (en) 2018-05-07 2020-08-19 Apple Inc USER INTERFACE FOR CREATING AVATAR
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US11360440B2 (en) 2018-06-25 2022-06-14 Apple Inc. Crown for an electronic watch
US11561515B2 (en) 2018-08-02 2023-01-24 Apple Inc. Crown for an electronic watch
US11181863B2 (en) 2018-08-24 2021-11-23 Apple Inc. Conductive cap for watch crown
CN211293787U (zh) 2018-08-24 2020-08-18 苹果公司 电子表
CN209625187U (zh) 2018-08-30 2019-11-12 苹果公司 电子手表和电子设备
US11194299B1 (en) 2019-02-12 2021-12-07 Apple Inc. Variable frictional feedback device for a digital crown of an electronic watch
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
US10852905B1 (en) 2019-09-09 2020-12-01 Apple Inc. Techniques for managing display usage
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11526256B2 (en) 2020-05-11 2022-12-13 Apple Inc. User interfaces for managing user interface sharing
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
KR102503135B1 (ko) * 2020-05-11 2023-02-23 애플 인크. 시간과 관련된 사용자 인터페이스들
US11550268B2 (en) 2020-06-02 2023-01-10 Apple Inc. Switch module for electronic crown assembly
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1052566A1 (en) * 1999-05-14 2000-11-15 Alcatel Graphical user interface
US6266098B1 (en) * 1997-10-22 2001-07-24 Matsushita Electric Corporation Of America Function presentation and selection using a rotatable function menu
US6661438B1 (en) * 2000-01-18 2003-12-09 Seiko Epson Corporation Display apparatus and portable information processing apparatus

Family Cites Families (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5530455A (en) * 1994-08-10 1996-06-25 Mouse Systems Corporation Roller mouse for implementing scrolling in windows applications
US6047301A (en) * 1996-05-24 2000-04-04 International Business Machines Corporation Wearable computer
JP3673425B2 (ja) * 1999-04-16 2005-07-20 松下電器産業株式会社 プログラム選択実行装置,およびデータ選択実行装置
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US6522347B1 (en) 2000-01-18 2003-02-18 Seiko Epson Corporation Display apparatus, portable information processing apparatus, information recording medium, and electronic apparatus
US6809724B1 (en) * 2000-01-18 2004-10-26 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US7081905B1 (en) * 2000-06-30 2006-07-25 International Business Machines Corporation Method and apparatus for dynamically controlling scroller speed employed for a user interface of a wearable appliance
US6556222B1 (en) * 2000-06-30 2003-04-29 International Business Machines Corporation Bezel based input mechanism and user interface for a smart watch
JP2002175139A (ja) 2000-12-07 2002-06-21 Sony Corp Information processing device, menu display method, and program storage medium
JP3762243B2 (ja) * 2001-03-26 2006-04-05 陣山 俊一 Information processing method, information processing program, and portable information terminal device
US7312785B2 (en) * 2001-10-22 2007-12-25 Apple Inc. Method and apparatus for accelerated scrolling
JP2003248544A (ja) * 2002-02-25 2003-09-05 Sony Corp Graphical user interface, operation method of an information processing device, information processing device, and program
CN100359441C (zh) * 2002-03-05 2008-01-02 Sony Ericsson Mobile Communications Japan, Inc. Image processing device and image processing method
JP3761165B2 (ja) * 2002-05-13 2006-03-29 Mobile Computing Technologies Co., Ltd. Display control device, portable information terminal device, program, and display control method
JP2004021522A (ja) * 2002-06-14 2004-01-22 Sony Corp Information processing device and method, and program
JP2004070654A (ja) * 2002-08-06 2004-03-04 Matsushita Electric Industrial Co., Ltd. Portable electronic device
JP2004184396A (ja) * 2002-10-09 2004-07-02 Seiko Epson Corp Display device, timepiece, display device control method, control program, and recording medium
JP2004178584A (ja) 2002-11-26 2004-06-24 Asulab Sa Method of entering a security code via a touchscreen to access a function, a device, or a predetermined location, and device for executing the method
US20040130581A1 (en) * 2003-01-03 2004-07-08 Microsoft Corporation Interaction model
JP2004259063A (ja) * 2003-02-26 2004-09-16 Sony Corp Display processing device, display processing method, and computer program for three-dimensional objects
US8046705B2 (en) * 2003-05-08 2011-10-25 Hillcrest Laboratories, Inc. Systems and methods for resolution consistent semantic zooming
US20040264301A1 (en) * 2003-06-30 2004-12-30 Microsoft Corporation Calendar user interface
US8209634B2 (en) * 2003-12-01 2012-06-26 Research In Motion Limited Previewing a new event on a small screen device
US7454713B2 (en) * 2003-12-01 2008-11-18 Sony Ericsson Mobile Communications Ab Apparatus, methods and computer program products providing menu expansion and organization functions
US8082382B2 (en) * 2004-06-04 2011-12-20 Micron Technology, Inc. Memory device with user configurable density/performance
US7778671B2 (en) * 2004-10-08 2010-08-17 Nokia Corporation Mobile communications terminal having an improved user interface and method therefor
JP2006140990A (ja) * 2004-10-13 2006-06-01 Olympus Corp Image display device, camera, and display method for an image display device and a camera
KR100630154B1 (ko) * 2005-08-31 2006-10-02 Samsung Electronics Co., Ltd. Method for controlling a display according to the degree of tilt using a geomagnetic sensor, and mobile terminal therefor
US20070063995A1 (en) * 2005-09-22 2007-03-22 Bailey Eric A Graphical user interface for use with a multi-media system
JP2007170995A (ja) * 2005-12-22 2007-07-05 Casio Computer Co., Ltd. Electronic device and electronic timepiece
KR100678963B1 (ko) * 2005-12-28 2007-02-06 Samsung Electronics Co., Ltd. Portable device having a rotatable input button and method of operating the same
KR100754674B1 (ko) * 2006-03-10 2007-09-03 Samsung Electronics Co., Ltd. Method and apparatus for menu selection on a portable terminal
KR100896055B1 (ko) * 2007-01-15 2009-05-07 LG Electronics Inc. Mobile terminal having a rotary input device and display method thereof
KR20080073868A (ko) * 2007-02-07 2008-08-12 LG Electronics Inc. Terminal and menu display method
TW200734916A (en) * 2007-05-03 2007-09-16 Ying-Chu Lee Method of using mouse wheel to operate picture
CN101821702A (zh) * 2007-10-12 2010-09-01 France Telecom Device for displaying a plurality of multimedia documents
JP4462338B2 (ja) * 2007-11-27 2010-05-12 Seiko Epson Corp Electronic timepiece, time correction method for an electronic timepiece, and control program for an electronic timepiece
JP5356713B2 (ja) * 2008-03-28 2013-12-04 Kyocera Corp Mobile telephone
JP2009265793A (ja) 2008-04-23 2009-11-12 Sony Ericsson Mobile Communications Japan, Inc. Display operation device, operation device, and program
KR101512041B1 (ko) * 2008-07-01 2015-04-14 LG Electronics Inc. Mobile terminal and control method thereof
KR101546774B1 (ko) * 2008-07-29 2015-08-24 LG Electronics Inc. Mobile terminal and operation control method thereof
KR101555055B1 (ko) * 2008-10-10 2015-09-22 LG Electronics Inc. Mobile terminal and display method thereof
US20110055752A1 (en) * 2009-06-04 2011-03-03 Rubinstein Jonathan J Method and Apparatus for Displaying and Auto-Correcting an Over-Scroll State on a Computing Device
JP5513071B2 (ja) * 2009-10-26 2014-06-04 Profield Co., Ltd. Information processing device, information processing method, and program
CH701440A2 (fr) 2009-07-03 2011-01-14 Comme Le Temps Sa Touchscreen wristwatch and method of display on a touchscreen watch
KR101608764B1 (ko) * 2009-07-14 2016-04-04 LG Electronics Inc. Mobile terminal and display control method thereof
KR101595384B1 (ko) * 2009-07-20 2016-02-18 LG Electronics Inc. Watch-type mobile terminal
KR101649646B1 (ko) * 2010-02-11 2016-08-19 LG Electronics Inc. Mobile terminal
US8930841B2 (en) * 2010-02-15 2015-01-06 Motorola Mobility Llc Methods and apparatus for a user interface configured to display event information
CH702862A1 (fr) * 2010-03-30 2011-09-30 Comme Le Temps Sa Wristwatch with electronic display
US20110316888A1 (en) * 2010-06-28 2011-12-29 Invensense, Inc. Mobile device user interface combining input from motion sensors and other controls
JP5676952B2 (ja) * 2010-07-26 2015-02-25 Canon Inc. Display control device, display control method, program, and storage medium
JP5745241B2 (ja) * 2010-09-08 2015-07-08 Nintendo Co., Ltd. Information processing program, information processing device, information processing system, and information processing method
US9104211B2 (en) * 2010-11-19 2015-08-11 Google Inc. Temperature controller with model-based time to target calculation and display
US9244606B2 (en) * 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
KR101740439B1 (ko) * 2010-12-23 2017-05-26 LG Electronics Inc. Mobile terminal and control method thereof
US9423951B2 (en) * 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
KR101785323B1 (ko) * 2011-01-05 2017-10-17 Samsung Electronics Co., Ltd. Method and apparatus for providing a user interface on a portable terminal
TWI441051B (zh) * 2011-01-25 2014-06-11 Compal Electronics Inc Electronic device and information presentation method thereof
US20120246678A1 (en) * 2011-03-24 2012-09-27 Tobe Barksdale Distance Dependent Scalable User Interface
JP2013003718A (ja) * 2011-06-14 2013-01-07 Mitsubishi Electric Information Systems Corp Information processing device, scroll display method for an information processing device, and scroll display program
EP2551784A1 (en) * 2011-07-28 2013-01-30 Roche Diagnostics GmbH Method of controlling the display of a dataset
US20130097566A1 (en) * 2011-10-17 2013-04-18 Carl Fredrik Alexander BERGLUND System and method for displaying items on electronic devices
JP6159078B2 (ja) * 2011-11-28 2017-07-05 Kyocera Corp Device, method, and program
JP2013152693A (ja) * 2011-12-27 2013-08-08 Nintendo Co., Ltd. Information processing program, information processing device, image display method, and image display system
CN103460164B (zh) * 2012-02-03 2017-02-08 Panasonic Intellectual Property Management Co., Ltd. Tactile presentation device and method for driving a tactile presentation device
JP2013164700A (ja) * 2012-02-10 2013-08-22 Samsung Electronics Co Ltd Scrolling method and scrolling device for a portable terminal
KR20130094054A (ko) * 2012-02-15 2013-08-23 Samsung Electronics Co., Ltd. Apparatus and method for managing objects in a portable electronic device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6266098B1 (en) * 1997-10-22 2001-07-24 Matsushita Electric Corporation Of America Function presentation and selection using a rotatable function menu
EP1052566A1 (en) * 1999-05-14 2000-11-15 Alcatel Graphical user interface
US6661438B1 (en) * 2000-01-18 2003-12-09 Seiko Epson Corporation Display apparatus and portable information processing apparatus

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016171467A1 (en) * 2015-04-23 2016-10-27 Samsung Electronics Co., Ltd. Electronic device including rotary member and display method thereof
US10386942B2 (en) 2015-04-23 2019-08-20 Samsung Electronics Co., Ltd. Electronic device including rotary member and display method thereof
CN105117119A (zh) * 2015-07-28 2015-12-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Screen picture rotation method and smart watch
CN105117001A (zh) * 2015-07-28 2015-12-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Crown for a smart watch and operating method for the smart watch
WO2017016278A1 (zh) * 2015-07-28 2017-02-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Crown for a smart watch and operating method for the smart watch
CN105117001B (zh) * 2015-07-28 2017-07-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Crown for a smart watch and operating method for the smart watch
CN105068412A (zh) * 2015-08-26 2015-11-18 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Smart watch and operating method
CN105068412B (zh) * 2015-08-26 2017-10-17 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Smart watch and operating method
US11281167B2 (en) 2016-01-14 2022-03-22 Huawei Technologies Co., Ltd. Electronic device and a method of operating such an electronic device
US10324620B2 (en) 2016-09-06 2019-06-18 Apple Inc. Processing capacitive touch gestures implemented on an electronic device
US10949082B2 (en) 2016-09-06 2021-03-16 Apple Inc. Processing capacitive touch gestures implemented on an electronic device

Also Published As

Publication number Publication date
KR20210010661A (ko) 2021-01-27
JP7471262B2 (ja) 2024-04-19
KR20160048955A (ko) 2016-05-04
JP2016534462A (ja) 2016-11-04
JP6170250B2 (ja) 2017-07-26
JP2019215891A (ja) 2019-12-19
JP6333387B2 (ja) 2018-05-30
KR20190114034A (ko) 2019-10-08
KR20200096999A (ko) 2020-08-14
AU2014315319A1 (en) 2016-04-21
KR20160048967A (ko) 2016-05-04
AU2019206101B2 (en) 2020-12-24
AU2018200289A1 (en) 2018-02-01
JP2023126783A (ja) 2023-09-12
AU2021201748C1 (en) 2023-03-16
DK179231B1 (en) 2018-02-19
AU2021212114B2 (en) 2023-07-20
KR20180041779A (ko) 2018-04-24
WO2015034965A1 (en) 2015-03-12
JP2018142361A (ja) 2018-09-13
DK201670117A1 (en) 2016-03-21
AU2017276285B2 (en) 2019-04-18
JP7128153B2 (ja) 2022-08-30
AU2019257521A1 (en) 2019-11-28
AU2014315319B2 (en) 2017-10-26
WO2015034960A1 (en) 2015-03-12
JP6547039B2 (ja) 2019-07-17
AU2019206101A1 (en) 2019-08-08
AU2018200289B2 (en) 2019-08-01
KR20180122752A (ko) 2018-11-13
AU2022235585A1 (en) 2022-10-13
JP6397918B2 (ja) 2018-09-26
AU2023237127A1 (en) 2023-10-19
AU2014315325B2 (en) 2017-05-04
KR102045111B1 (ko) 2019-11-14
KR102143895B1 (ko) 2020-08-12
JP2016532973A (ja) 2016-10-20
JP6924802B2 (ja) 2021-08-25
KR20160048972A (ko) 2016-05-04
JP6564493B2 (ja) 2019-08-21
AU2021201748A1 (en) 2021-04-15
KR20180054897A (ko) 2018-05-24
AU2017276285A1 (en) 2018-01-18
AU2021212114A1 (en) 2021-08-26
JP2019194892A (ja) 2019-11-07
KR102263620B1 (ko) 2021-06-11
JP2021182426A (ja) 2021-11-25
AU2021212114B9 (en) 2023-11-23
AU2014315325A1 (en) 2016-04-21
KR20190032627A (ko) 2019-03-27
JP2021177397A (ja) 2021-11-11
AU2014315324A1 (en) 2016-04-28
JP2018136983A (ja) 2018-08-30
AU2014315324B2 (en) 2017-10-12
KR102305362B1 (ko) 2021-09-24
KR20200084906A (ko) 2020-07-13
KR102111452B1 (ko) 2020-05-15
KR102029303B1 (ko) 2019-10-07
JP2016532212A (ja) 2016-10-13
KR102131228B1 (ko) 2020-07-07
JP7223081B2 (ja) 2023-02-15
JP2023065397A (ja) 2023-05-12
KR20210070395A (ko) 2021-06-14
AU2021201748B2 (en) 2022-07-07
KR102072614B1 (ko) 2020-02-03

Similar Documents

Publication Publication Date Title
US11829576B2 (en) User interface object manipulations in a user interface
AU2014315325B2 (en) User interface object manipulations in a user interface
EP3042271B1 (en) User interface object manipulations in a user interface
US10275117B2 (en) User interface object manipulations in a user interface
US20200110522A1 (en) Crown input for a wearable electronic device
US10691230B2 (en) Crown input for a wearable electronic device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14772002

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14913350

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2016537947

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20167008488

Country of ref document: KR

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2014772002

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014772002

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2014315325

Country of ref document: AU

Date of ref document: 20140903

Kind code of ref document: A