IN2012DE01672A - Google Patents


Info

Publication number
IN2012DE01672A
Authority
IN
India
Prior art keywords
user
pinch
pinch operation
stereoscopic object
detects
Prior art date
2011-06-07
Application number
IN1672DE2012
Inventor
Noda Takuro
Yamamoto Kazuyuki
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2011-06-07
Filing date
2012-06-01
Publication date
2015-09-25
Application filed by Sony Corp
Publication of IN2012DE01672A


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Abstract

There is provided an information processing apparatus including a detecting unit that detects a pinch operation performed by a user, and a control unit that determines a stereoscopic object to be the selected object when the pinch position of the detected pinch operation corresponds to the position at which the user perceives the stereoscopic object.
IN1672DE2012 2011-06-07 2012-06-01 IN2012DE01672A (en)
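
The abstract describes a two-unit architecture: a detecting unit reports where the user pinches, and a control unit selects the stereoscopic object whose perceived position corresponds to that pinch position. The following is a minimal sketch of that selection step, assuming the pinch position arrives as 3-D coordinates and "corresponds to" is modelled as Euclidean distance within a tolerance; the class names, the fixed sample pinch position, and the 5 cm tolerance are illustrative choices for this sketch, not details taken from the patent.

```python
# Minimal sketch of the selection logic summarised in the abstract.
# Assumptions: the detecting unit yields a 3-D pinch position (e.g. from a
# camera or depth sensor), and "corresponds to" is modelled as Euclidean
# distance within a tolerance. Names and values are illustrative only.
from __future__ import annotations

import math
from dataclasses import dataclass


@dataclass
class StereoscopicObject:
    name: str
    # Position (in metres) at which the user perceives the object to float.
    perceived_position: tuple[float, float, float]


class DetectingUnit:
    """Stand-in for the detecting unit: reports the 3-D position of a pinch."""

    def detect_pinch(self) -> tuple[float, float, float] | None:
        # A real implementation would track the user's fingers with a sensor;
        # here we simply return a fixed sample position.
        return (0.10, 0.05, 0.30)


class ControlUnit:
    """Selects the object whose perceived position matches the pinch position."""

    def __init__(self, objects: list[StereoscopicObject], tolerance: float = 0.05):
        self.objects = objects
        self.tolerance = tolerance  # maximum pinch-to-object distance (metres)

    def select_pinched_object(
        self, pinch_position: tuple[float, float, float]
    ) -> StereoscopicObject | None:
        # Rank objects by distance from the pinch and accept the nearest one
        # only if it lies within the tolerance; otherwise nothing is selected.
        ranked = sorted(
            self.objects,
            key=lambda obj: math.dist(pinch_position, obj.perceived_position),
        )
        if ranked and math.dist(pinch_position, ranked[0].perceived_position) <= self.tolerance:
            return ranked[0]
        return None


if __name__ == "__main__":
    scene = [
        StereoscopicObject("cube", (0.10, 0.05, 0.31)),
        StereoscopicObject("sphere", (-0.20, 0.00, 0.25)),
    ]
    pinch = DetectingUnit().detect_pinch()
    if pinch is not None:
        selected = ControlUnit(scene).select_pinched_object(pinch)
        print("Selected:", selected.name if selected else "nothing")
```

In the actual apparatus the perceived position of each object would be derived from the stereoscopic display (for example, from its parallax settings) and the pinch would be tracked by a sensor, but the comparison between pinch position and perceived position would follow the same pattern.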

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011127447A JP2012256110A (en) 2011-06-07 2011-06-07 Information processing apparatus, information processing method, and program

Publications (1)

Publication Number Publication Date
IN2012DE01672A (en) 2015-09-25

Family

ID=46353997

Family Applications (1)

Application Number Title Priority Date Filing Date
IN1672DE2012 IN2012DE01672A (en) 2011-06-07 2012-06-01

Country Status (6)

Country Link
US (2) US20120317510A1 (en)
EP (1) EP2533143A2 (en)
JP (1) JP2012256110A (en)
CN (1) CN102981606A (en)
BR (1) BR102012013210A2 (en)
IN (1) IN2012DE01672A (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9225810B2 (en) * 2012-07-03 2015-12-29 Sony Corporation Terminal device, information processing method, program, and storage medium
US20140267142A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Extending interactive inputs via sensor fusion
JP6266229B2 * 2013-05-14 2018-01-24 Toshiba Medical Systems Corporation Image processing apparatus, method, and program
WO2014207971A1 2013-06-26 2014-12-31 Panasonic Intellectual Property Corporation of America User interface apparatus and display object operation method
US9996797B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Interactions with virtual objects for machine control
JP2015090593A * 2013-11-06 2015-05-11 Sony Corporation Information processing device, information processing method, and information processing system
US10416834B1 (en) * 2013-11-15 2019-09-17 Leap Motion, Inc. Interaction strength using virtual objects for machine control
US20150346981A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Slider controlling visibility of objects in a 3d space
WO2016052172A1 * 2014-09-29 2016-04-07 Sharp Corporation Portable terminal, method for controlling portable terminal, and control program
JP6573101B2 * 2015-04-02 2019-09-11 Koto Co., Ltd. Interaction execution method, device using the method, and program
JP6551363B2 * 2016-10-28 2019-07-31 Kyocera Document Solutions Inc. Information processing device
JP6470356B2 * 2017-07-21 2019-02-13 Colopl, Inc. Program and method executed by computer for providing virtual space, and information processing apparatus for executing the program
JP6568331B1 * 2019-04-17 2019-08-28 Kyocera Corporation Electronic device, control method, and program
JP7314622B2 2019-05-29 2023-07-26 FUJIFILM Business Innovation Corp. Image display device and image display program

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6466205B2 (en) * 1998-11-19 2002-10-15 Push Entertainment, Inc. System and method for creating 3D models from 2D sequential image data
GB2378497B (en) * 2000-09-26 2003-12-31 Eugenio Bustamante Providing input signals
US6753847B2 (en) * 2002-01-25 2004-06-22 Silicon Graphics, Inc. Three dimensional volumetric display input and output configurations
US7532196B2 (en) * 2003-10-30 2009-05-12 Microsoft Corporation Distributed sensing techniques for mobile devices
US20050275942A1 (en) * 2004-04-02 2005-12-15 David Hartkop Method and apparatus to retrofit a display device for autostereoscopic display of interactive computer graphics
WO2006035816A1 (en) * 2004-09-30 2006-04-06 Pioneer Corporation Pseudo-3d two-dimensional image display device
US20060250391A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Three dimensional horizontal perspective workstation
US8743109B2 (en) * 2006-08-31 2014-06-03 Kent State University System and methods for multi-dimensional rendering and display of full volumetric data sets
US20100053151A1 (en) * 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device
JP5574523B2 * 2009-04-22 2014-08-20 Protec Design Co., Ltd. Rotary input device and electronic device
JP5343773B2 2009-09-04 2013-11-13 Sony Corporation Information processing apparatus, display control method, and display control program
RU2524834C2 * 2009-10-14 2014-08-10 Nokia Corporation Autostereoscopic rendering and display apparatus
US8819591B2 (en) * 2009-10-30 2014-08-26 Accuray Incorporated Treatment planning in a virtual environment
US8593398B2 (en) * 2010-06-25 2013-11-26 Nokia Corporation Apparatus and method for proximity based input
KR20120000663A * 2010-06-28 2012-01-04 Pantech Co., Ltd. Apparatus for processing 3D object
US20120005624A1 (en) * 2010-07-02 2012-01-05 Vesely Michael A User Interface Elements for Use within a Three Dimensional Scene
US20120007819A1 (en) * 2010-07-08 2012-01-12 Gregory Robert Hewes Automatic Convergence Based on Touchscreen Input for Stereoscopic Imaging
JP6049990B2 * 2010-09-15 2016-12-21 Kyocera Corporation Portable electronic device, screen control method, and screen control program
JP5703703B2 * 2010-11-11 2015-04-22 Sony Corporation Information processing apparatus, stereoscopic display method, and program
KR101727899B1 * 2010-11-26 2017-04-18 LG Electronics Inc. Mobile terminal and operation control method thereof
CN106774880B * 2010-12-22 2020-02-21 zSpace, Inc. Three-dimensional tracking of user control devices in space
CN102096511A (en) * 2011-02-10 2011-06-15 林胜军 Three-dimensional image touch device
US9632677B2 (en) * 2011-03-02 2017-04-25 The Boeing Company System and method for navigating a 3-D environment using a multi-input interface
US20120287065A1 (en) * 2011-05-10 2012-11-15 Kyocera Corporation Electronic device

Also Published As

Publication number Publication date
US20120317510A1 (en) 2012-12-13
EP2533143A2 (en) 2012-12-12
US20160328115A1 (en) 2016-11-10
BR102012013210A2 (en) 2014-12-09
JP2012256110A (en) 2012-12-27
CN102981606A (en) 2013-03-20

Similar Documents

Publication Publication Date Title
IN2012DE01672A (en)
IN2014DN07500A (en)
AU2016101481A4 (en) Device, method, and graphical user interface for moving and dropping a user interface object
EP2795434A4 (en) User control gesture detection
IN2013MU01308A (en)
IN2015DN02046A (en)
EP2825272A4 (en) System, method, and graphical user interface for controlling an application on a tablet
IN2014MN01835A (en)
EP3074923A4 (en) Eye tracking and user reaction detection
GB201305379D0 (en) Methods and systems for enrolling biometric data
EP3090353A4 (en) Systems and methods for guided user actions
GB2494520B (en) Graphical user interface, computing device, and method for operating the same
EP2954110A4 (en) User control interface for an appliance, and associated method
GB201106271D0 (en) A method, apparatus and computer program for user control of a state of an apparatus
IN2015DN02386A (en)
MX352448B (en) Auto-adjusting content size rendered on a display.
EP2676234A4 (en) Providing contextual content based on another user
EP2954389A4 (en) Varying user interface based on location or speed
MX370772B (en) Display apparatus and control method thereof.
GB2523509A8 (en) Temperature based on touching portable computing device
EP2989528A4 (en) Detecting an attentive user for providing personalized content on a display
AU2012354743A8 (en) Electronic device and program for controlling electronic device
EP2965195A4 (en) User authorization and presence detection in isolation from interference from and control by host central processing unit and operating system
GB201413419D0 (en) Control Signal Based On a Command Tapped By A User
WO2013176065A3 (en) Display controlling apparatus, display controlling method, program and control apparatus