EP4332725A3 - Context sensitive actions in response to touch input - Google Patents

Context sensitive actions in response to touch input

Info

Publication number
EP4332725A3
Authority
EP
European Patent Office
Prior art keywords
context
touch input
response
context sensitive
sensitive actions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP24151711.9A
Other languages
German (de)
English (en)
Other versions
EP4332725A2 (fr)
Inventor
Christopher D. Moore
Marcel Van Os
Bradford A. Moore
Patrick S. Piemonte
Eleanor C. Wachsman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/964,961 (US9423946B2)
Application filed by Apple Inc
Publication of EP4332725A2
Publication of EP4332725A3
Legal status: Pending (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3664Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3667Display of a road map
    • G01C21/367Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72457User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/10Details of telephonic subscriber devices including a GPS signal receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Hardware Design (AREA)
  • Automation & Control Theory (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
EP24151711.9A 2013-08-12 2014-07-25 Context sensitive actions in response to touch input Pending EP4332725A3 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US13/964,961 US9423946B2 (en) 2013-08-12 2013-08-12 Context sensitive actions in response to touch input
US13/964,967 US9110561B2 (en) 2013-08-12 2013-08-12 Context sensitive actions
PCT/US2014/048250 WO2015023419A1 (fr) 2013-08-12 2014-07-25 Context sensitive actions in response to touch input
EP14755461.2A EP3033669B1 (fr) 2013-08-12 2014-07-25 Context sensitive actions in response to touch input

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
EP14755461.2A Division EP3033669B1 (fr) 2013-08-12 2014-07-25 Context sensitive actions in response to touch input

Publications (2)

Publication Number Publication Date
EP4332725A2 (fr) 2024-03-06
EP4332725A3 (fr) 2024-06-05

Family

ID=51398868

Family Applications (2)

Application Number Title Priority Date Filing Date
EP14755461.2A Active EP3033669B1 (fr) 2013-08-12 2014-07-25 Context sensitive actions in response to touch input
EP24151711.9A Pending EP4332725A3 (fr) 2013-08-12 2014-07-25 Context sensitive actions in response to touch input

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP14755461.2A Active EP3033669B1 (fr) 2013-08-12 2014-07-25 Context sensitive actions in response to touch input

Country Status (3)

Country Link
EP (2) EP3033669B1 (fr)
CN (2) CN110413054B (zh)
WO (1) WO2015023419A1 (fr)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016022204A1 (fr) 2014-08-02 2016-02-11 Apple Inc. Context-specific user interfaces
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
WO2016144385A1 (fr) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
EP4321088A3 (fr) 2015-08-20 2024-04-24 Apple Inc. Exercise-based watch face and complications
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
US11036387B2 (en) 2017-05-16 2021-06-15 Apple Inc. Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects
US10203866B2 (en) 2017-05-16 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects
CN108319423A (zh) * 2017-12-22 2018-07-24 石化盈科信息技术有限责任公司 One-handed operation method and device for displaying a map on a touch screen
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
KR102393717B1 (ko) 2019-05-06 2022-05-03 Apple Inc. Restricted operation of an electronic device
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
DK181103B1 (en) 2020-05-11 2022-12-15 Apple Inc User interfaces related to time
CN115904596B (zh) 2020-05-11 2024-02-02 Apple Inc. User interfaces for managing user interface sharing
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
US20230236547A1 (en) 2022-01-24 2023-07-27 Apple Inc. User interfaces for indicating time

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080168396A1 (en) * 2007-01-07 2008-07-10 Michael Matas Portable Multifunction Device, Method, and Graphical User Interface for Providing Maps and Directions
US20090303204A1 (en) * 2007-01-05 2009-12-10 Invensense Inc. Controlling and accessing content using motion processing on mobile devices

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4148159B2 (ja) * 2004-03-01 2008-09-10 Hitachi, Ltd. Navigation system
US7301529B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Context dependent gesture response
US7934156B2 (en) * 2006-09-06 2011-04-26 Apple Inc. Deletion gestures on a portable multifunction device
US8896529B2 (en) * 2007-08-01 2014-11-25 Nokia Corporation Apparatus, methods, and computer program products providing context-dependent gesture recognition
KR101565768B1 (ko) * 2008-12-23 2015-11-06 Samsung Electronics Co., Ltd. Method and apparatus for releasing a lock mode of a portable terminal
US9465532B2 (en) * 2009-12-18 2016-10-11 Synaptics Incorporated Method and apparatus for operating in pointing and enhanced gesturing modes
KR101915615B1 (ko) * 2010-10-14 2019-01-07 Samsung Electronics Co., Ltd. Apparatus and method for controlling a motion-based user interface
US20120313847A1 (en) * 2011-06-09 2012-12-13 Nokia Corporation Method and apparatus for contextual gesture recognition
CN102520858B (zh) * 2011-12-08 2013-12-18 深圳万兴信息科技股份有限公司 Application control method and device for a mobile terminal
US8970519B2 (en) * 2012-02-01 2015-03-03 Logitech Europe S.A. System and method for spurious signal detection and compensation on an input device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090303204A1 (en) * 2007-01-05 2009-12-10 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
US20080168396A1 (en) * 2007-01-07 2008-07-10 Michael Matas Portable Multifunction Device, Method, and Graphical User Interface for Providing Maps and Directions

Also Published As

Publication number Publication date
EP3033669B1 (fr) 2024-01-17
CN110413054A (zh) 2019-11-05
CN105453016B (zh) 2019-08-02
EP4332725A2 (fr) 2024-03-06
EP3033669A1 (fr) 2016-06-22
CN105453016A (zh) 2016-03-30
WO2015023419A1 (fr) 2015-02-19
CN110413054B (zh) 2023-04-28

Similar Documents

Publication Publication Date Title
EP4332725A3 (fr) Context sensitive actions in response to touch input
EP2755112A3 (fr) Method and portable terminal for providing a haptic effect
EP2796982A3 (fr) Method and device for changing an icon for a status
WO2012051209A3 (fr) Gesture controlled user interface
AU2016219716A1 (en) Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
EP2648087A3 (fr) Method and apparatus for executing an application
TWD165578S (zh) Image for a display screen
TWD169225S (zh) Image for a display screen
TWD169050S (zh) Image for a display screen
EP2728457A3 (fr) Display device and electronic device using the same
EP2733597A3 (fr) Interaction and transition model for a portable electronic device
MX2015016067A (es) User interface elements for multiple displays.
NZ618264A (en) Edge gesture
EP2565770A3 (fr) Portable apparatus and input method of a portable apparatus
EP2784657A3 (fr) Method and device for switching tasks
EP2669786A3 (fr) Method for displaying items in a terminal and terminal using the same
EP2784646A3 (fr) Method and apparatus for executing an application
WO2012068542A3 (fr) Orthogonal dragging on scroll bars
EP2703980A3 (fr) Text recognition apparatus and method for a terminal
EP2672481A3 (fr) Method of providing a voice recognition service and electronic device therefor
EP2437153A3 (fr) Apparatus and method for turning e-book pages in a portable terminal
EP2696260A3 (fr) System and method for reducing the effects of inadvertent touch on a touch screen controller
EP2565752A3 (fr) Method of providing a user interface in a portable terminal and apparatus therefor
EP2474899A3 (fr) Defining and managing user input events in a web browser
WO2011156161A3 (fr) Content gestures

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240112

AC Divisional application: reference to earlier application

Ref document number: 3033669

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: G06F0001160000

Ipc: G06F0003048800

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 1/16 20060101ALN20240430BHEP

Ipc: G01C 21/34 20060101ALI20240430BHEP

Ipc: H04M 1/725 20210101ALI20240430BHEP

Ipc: G06F 3/0488 20220101AFI20240430BHEP