IN2014MN00973A - Gesture controlled audio user interface - Google Patents

Info

Publication number
IN2014MN00973A
IN2014MN00973A, IN973MUN2014A
Authority
IN
India
Prior art keywords
user
audio
space
audio cues
cues
Prior art date
2011-12-19
Application number
IN973MUN2014
Inventor
Pei Xiang
Hui Ya Liao Nelson
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2011-12-19
Filing date
2012-11-01
Publication date
2015-04-24
Application filed by Qualcomm Inc
Publication of IN2014MN00973A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/1688 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being integrated loudspeakers
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/0354 Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03547 Touch pads, in which fingers can move on a surface
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/16 Sound input; Sound output
    • G06F 3/162 Interface to dedicated audio devices, e.g. audio drivers, interface to CODECs
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Abstract

A user interface, a method, and an article of manufacture, each for selecting an audio cue presented in three-dimensional (3D) space, are disclosed. The audio cues are audibly perceivable in a space about the user, where each audio cue may be perceived as a directional sound at a location distinct from those of the other audio cues. A specific audio cue is selected based on one or more user gestures. A portable electronic device may be configured to present the audio cues to the user and to detect the user gestures that select among them. The audio cue selection can be used to control operation of the portable device and/or other associated devices.
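The abstract describes audio cues spatialized at distinct directions around the user and selected by gesture. The sketch below illustrates that general idea in Python; the names (AudioCue, place_cues, select_cue), the even angular spacing, and the nearest-azimuth matching rule are assumptions made for illustration only, not the implementation claimed in the patent.

```python
from dataclasses import dataclass

# Illustrative sketch only: spatialize cues at distinct azimuths around the
# user, then select the cue whose direction best matches a gesture direction.
# All names and the selection rule are hypothetical, not from the patent.

@dataclass
class AudioCue:
    label: str          # e.g. the menu item or action the cue represents
    azimuth_deg: float  # perceived direction of the cue; 0 = straight ahead

def place_cues(labels):
    """Spread the cues evenly on a circle around the user's head."""
    step = 360.0 / len(labels)
    return [AudioCue(label, i * step) for i, label in enumerate(labels)]

def select_cue(cues, gesture_azimuth_deg, tolerance_deg=30.0):
    """Return the cue closest to the gesture direction, or None if no cue
    lies within the angular tolerance."""
    def angular_distance(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    best = min(cues, key=lambda c: angular_distance(c.azimuth_deg, gesture_azimuth_deg))
    if angular_distance(best.azimuth_deg, gesture_azimuth_deg) <= tolerance_deg:
        return best
    return None

if __name__ == "__main__":
    cues = place_cues(["play", "next", "previous", "volume"])
    chosen = select_cue(cues, gesture_azimuth_deg=95.0)
    print(chosen)  # AudioCue(label='next', azimuth_deg=90.0)
```

In a real device the cue directions would be rendered with spatial-audio techniques and the gesture azimuth would come from touch, inertial, or other motion sensing, as suggested by the classifications above; the angular-tolerance check simply rejects gestures that do not point clearly at any cue.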

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161577489P 2011-12-19 2011-12-19
US13/664,281 US9563278B2 (en) 2011-12-19 2012-10-30 Gesture controlled audio user interface
PCT/US2012/063077 WO2013095783A2 (en) 2011-12-19 2012-11-01 Gesture controlled audio user interface

Publications (1)

Publication Number Publication Date
IN2014MN00973A (en) 2015-04-24

Family

ID=48609617

Family Applications (1)

Application Number Title Priority Date Filing Date
IN973MUN2014 IN2014MN00973A (en) 2011-12-19 2012-11-01

Country Status (7)

Country Link
US (1) US9563278B2 (en)
EP (1) EP2795432A2 (en)
JP (1) JP6195843B2 (en)
KR (1) KR101708101B1 (en)
CN (1) CN103999021B (en)
IN (1) IN2014MN00973A (en)
WO (1) WO2013095783A2 (en)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8326951B1 (en) 2004-06-05 2012-12-04 Sonos, Inc. Establishing a secure wireless network with minimum human intervention
US9779750B2 (en) 2004-07-30 2017-10-03 Invention Science Fund I, Llc Cue-aware privacy filter for participants in persistent communications
US9704502B2 (en) * 2004-07-30 2017-07-11 Invention Science Fund I, Llc Cue-aware privacy filter for participants in persistent communications
WO2013101469A1 (en) * 2011-12-29 2013-07-04 Intel Corporation Audio pipeline for audio distribution on system on a chip platforms
WO2013117806A2 (en) * 2012-02-07 2013-08-15 Nokia Corporation Visual spatial audio
US9632683B2 (en) * 2012-11-08 2017-04-25 Nokia Technologies Oy Methods, apparatuses and computer program products for manipulating characteristics of audio objects by using directional gestures
US9866964B1 (en) * 2013-02-27 2018-01-09 Amazon Technologies, Inc. Synchronizing audio outputs
CN103402156B (en) * 2013-07-25 2016-05-25 瑞声科技(南京)有限公司 Sound system
US10219094B2 (en) 2013-07-30 2019-02-26 Thomas Alan Donaldson Acoustic detection of audio sources to facilitate reproduction of spatial audio spaces
US10225680B2 (en) * 2013-07-30 2019-03-05 Thomas Alan Donaldson Motion detection of audio sources to facilitate reproduction of spatial audio spaces
EP2866182A1 (en) * 2013-10-25 2015-04-29 Nokia Technologies OY Providing contextual information
JP6553052B2 (en) * 2014-01-03 2019-07-31 ハーマン インターナショナル インダストリーズ インコーポレイテッド Gesture-interactive wearable spatial audio system
US9618618B2 (en) 2014-03-10 2017-04-11 Elwha Llc Systems and methods for ultrasonic position and motion detection
US9739883B2 (en) 2014-05-16 2017-08-22 Elwha Llc Systems and methods for ultrasonic velocity and acceleration detection
US9437002B2 (en) 2014-09-25 2016-09-06 Elwha Llc Systems and methods for a dual modality sensor system
US9026914B1 (en) 2014-05-28 2015-05-05 Google Inc. Multi-sound audio interface system
US9886236B2 (en) * 2014-05-28 2018-02-06 Google Llc Multi-dimensional audio interface system
US9392368B2 (en) * 2014-08-25 2016-07-12 Comcast Cable Communications, Llc Dynamic positional audio
KR102329193B1 (en) * 2014-09-16 2021-11-22 삼성전자주식회사 Method for Outputting the Screen Information to Sound And Electronic Device for Supporting the Same
US10048835B2 (en) 2014-10-31 2018-08-14 Microsoft Technology Licensing, Llc User interface functionality for facilitating interaction between users and their environments
JP6642989B2 (en) * 2015-07-06 2020-02-12 キヤノン株式会社 Control device, control method, and program
US9995823B2 (en) 2015-07-31 2018-06-12 Elwha Llc Systems and methods for utilizing compressed sensing in an entertainment system
US10206040B2 (en) * 2015-10-30 2019-02-12 Essential Products, Inc. Microphone array for generating virtual sound field
US9483693B1 (en) * 2015-11-25 2016-11-01 Clover Network, Inc. Free-hand character recognition on a touch screen POS terminal
US10134422B2 (en) * 2015-12-01 2018-11-20 Qualcomm Incorporated Determining audio event based on location information
CN105607738B (en) * 2015-12-22 2018-09-25 小米科技有限责任公司 Determine the method and device of one hand pattern
US10303422B1 (en) 2016-01-05 2019-05-28 Sonos, Inc. Multiple-device setup
EP3458872B1 (en) 2016-05-19 2021-04-07 Harman International Industries, Incorporated Gesture-enabled audio device with visible feedback
KR20180020517A (en) 2016-08-18 2018-02-28 엘지전자 주식회사 Mobile terminal
US11076261B1 (en) * 2016-09-16 2021-07-27 Apple Inc. Location systems for electronic device communications
JP2018092012A (en) * 2016-12-05 2018-06-14 ソニー株式会社 Information processing device, information processing method, and program
KR20180084550A (en) 2017-01-17 2018-07-25 삼성전자주식회사 Electronic apparatus and controlling method thereof
GB2562036A (en) * 2017-04-24 2018-11-07 Nokia Technologies Oy Spatial audio processing
WO2019199359A1 (en) * 2018-04-08 2019-10-17 Dts, Inc. Ambisonic depth extraction
US20210239831A1 (en) 2018-06-05 2021-08-05 Google Llc Systems and methods of ultrasonic sensing in smart devices
US11113092B2 (en) * 2019-02-08 2021-09-07 Sony Corporation Global HRTF repository
US11451907B2 (en) 2019-05-29 2022-09-20 Sony Corporation Techniques combining plural head-related transfer function (HRTF) spheres to place audio objects
US11347832B2 (en) 2019-06-13 2022-05-31 Sony Corporation Head related transfer function (HRTF) as biometric authentication
US11036464B2 (en) 2019-09-13 2021-06-15 Bose Corporation Spatialized augmented reality (AR) audio menu
US11146908B2 (en) 2019-10-24 2021-10-12 Sony Corporation Generating personalized end user head-related transfer function (HRTF) from generic HRTF
US11070930B2 (en) 2019-11-12 2021-07-20 Sony Corporation Generating personalized end user room-related transfer function (RRTF)
EP3879702A1 (en) * 2020-03-09 2021-09-15 Nokia Technologies Oy Adjusting a volume level
US11563783B2 (en) * 2020-08-14 2023-01-24 Cisco Technology, Inc. Distance-based framing for an online conference session
US11392250B1 (en) 2020-12-31 2022-07-19 Apple Inc. Ultrasonic touch sensing parasitic wave rejection
CN117499850A (en) * 2023-12-26 2024-02-02 荣耀终端有限公司 Audio data playing method and electronic equipment

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3834848B2 (en) 1995-09-20 2006-10-18 株式会社日立製作所 Sound information providing apparatus and sound information selecting method
JPH09114543A (en) 1995-10-02 1997-05-02 Xybernaut Corp Handfree computer system
US20070177804A1 (en) 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
IL127569A0 (en) * 1998-09-16 1999-10-28 Comsense Technologies Ltd Interactive toys
JP3285835B2 (en) 1998-12-25 2002-05-27 三菱電機株式会社 Menu selection device
GB2374772B (en) 2001-01-29 2004-12-29 Hewlett Packard Co Audio user interface
JP4624577B2 (en) 2001-02-23 2011-02-02 富士通株式会社 Human interface system with multiple sensors
US6798429B2 (en) * 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
FI20010958A0 (en) 2001-05-08 2001-05-08 Nokia Corp Procedure and Arrangements for Designing an Extended User Interface
GB0311177D0 (en) * 2003-05-15 2003-06-18 Qinetiq Ltd Non contact human-computer interface
JP3898673B2 (en) 2003-07-18 2007-03-28 株式会社タムラ製作所 Audio communication system, method and program, and audio reproduction apparatus
JP2006287878A (en) 2005-04-05 2006-10-19 Matsushita Electric Ind Co Ltd Portable telephone terminal
US7953236B2 (en) 2005-05-06 2011-05-31 Microsoft Corporation Audio user interface (UI) for previewing and selecting audio streams using 3D positional audio techniques
US8210942B2 (en) * 2006-03-31 2012-07-03 Wms Gaming Inc. Portable wagering game with vibrational cues and feedback mechanism
US7596765B2 (en) 2006-05-23 2009-09-29 Sony Ericsson Mobile Communications Ab Sound feedback on menu navigation
CN101449236A (en) * 2006-05-23 2009-06-03 索尼爱立信移动通讯股份有限公司 Sound feedback on menu navigation
US8421642B1 (en) * 2006-08-24 2013-04-16 Navisense System and method for sensorized user interface
US8942764B2 (en) * 2007-10-01 2015-01-27 Apple Inc. Personal media device controlled via user initiated movements utilizing movement based interfaces
US20090166098A1 (en) 2007-12-31 2009-07-02 Apple Inc. Non-visual control of multi-touch device
TW200934212A (en) 2008-01-16 2009-08-01 Asustek Comp Inc Mobile digital device with intuitive browsing and operating method thereof
US9454256B2 (en) * 2008-03-14 2016-09-27 Apple Inc. Sensor configurations of an input device that are switchable based on mode
US8625846B2 (en) * 2008-03-18 2014-01-07 Elliptic Laboratories As Object and movement detection
GB0810179D0 (en) * 2008-06-04 2008-07-09 Elliptic Laboratories As Object location
JP5219205B2 (en) 2008-10-24 2013-06-26 清水建設株式会社 Moving body position detection system
US9037468B2 (en) * 2008-10-27 2015-05-19 Sony Computer Entertainment Inc. Sound localization for user in motion
JP5245808B2 (en) 2008-12-25 2013-07-24 ヤマハ株式会社 Pointing system
US9389829B2 (en) * 2009-04-09 2016-07-12 Aliphcom Spatial user interface for audio system
US8923995B2 (en) 2009-12-22 2014-12-30 Apple Inc. Directional audio interface for portable media device
JP5488011B2 (en) 2010-02-04 2014-05-14 ソニー株式会社 COMMUNICATION CONTROL DEVICE, COMMUNICATION CONTROL METHOD, AND PROGRAM
JP2011211312A (en) 2010-03-29 2011-10-20 Panasonic Corp Sound image localization processing apparatus and sound image localization processing method
US8935438B1 (en) * 2011-06-28 2015-01-13 Amazon Technologies, Inc. Skin-dependent device components

Also Published As

Publication number Publication date
US20130154930A1 (en) 2013-06-20
JP2015506035A (en) 2015-02-26
KR101708101B1 (en) 2017-02-27
CN103999021B (en) 2017-12-08
KR20140107484A (en) 2014-09-04
EP2795432A2 (en) 2014-10-29
WO2013095783A3 (en) 2013-08-22
JP6195843B2 (en) 2017-09-13
WO2013095783A2 (en) 2013-06-27
CN103999021A (en) 2014-08-20
US9563278B2 (en) 2017-02-07

Similar Documents

Publication Publication Date Title
IN2014MN00973A (en)
IN2015DN02386A (en)
IN2014DN07500A (en)
IN2014MN02283A (en)
WO2016072823A3 (en) Loop-shaped tactile multi-touch input device and gestures, and method therefor
AU2014278636A8 (en) Device, method, and graphical user interface for moving user interface objects
GB201111911D0 (en) Displays
MX2011009186A (en) Electroactive polymer transducers for tactile feedback devices.
MX2011007670A (en) Electroactive polymer transducers for tactile feedback devices.
GB2512549A (en) Simulating touch texture on the display of a mobile device using vibration
MX2013008888A (en) Method for controlling electronic apparatus based on motion recognition, and electronic apparatus applying the same.
TWD160871S (en) Controller for electronic device
MX2014002802A (en) Semantic zoom gestures.
MX2014002779A (en) Semantic zoom.
TWD159698S (en) Controller for electronic device
MX2015010598A (en) Method for controlling display of multiple objects depending on input related to operation of mobile terminal, and mobile terminal therefor.
MX2014003188A (en) Establishing content navigation direction based on directional user gestures.
TWD159700S (en) Controller for electronic device
TWD167277S (en) Case for an electronic device
WO2012138917A3 (en) Gesture-activated input using audio recognition
IN2014CN02517A (en)
MX338139B (en) Method, computer program, reception apparatus, and information providing apparatus for accessing content from a plurality of content sources.
MX339341B (en) Electronic text manipulation and display.
AU2012354743A8 (en) Electronic device and program for controlling electronic device
WO2012109452A3 (en) Portable electronic device and method of controlling same