IN2012DE01672A - Google Patents


Publication number
IN2012DE01672A
Authority
IN
India
Prior art keywords
user
pinch
pinch operation
stereoscopic object
detects
Prior art date
Application number
Other languages
English (en)
Inventor
Noda Takuro
Yamamoto Kazuyuki
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Publication of IN2012DE01672A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
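
The classification leaves above (G06F3/04815, G06F3/04842, G06F3/04883) together with the prior art keywords (user, pinch, pinch operation, stereoscopic object, detects) point to an interaction technique in which an apparatus detects a user's pinch operation on a stereoscopically displayed object. The following is a minimal hypothetical sketch of that idea only, not the patented method; every name, type, and threshold here is an invented illustration. It reduces detection to two checks: whether the tracked fingertips are close enough together to count as a pinch, and whether the pinch midpoint falls inside the object's display volume.

```python
# Hypothetical sketch of pinch-based selection of a stereoscopic object.
# Not the patented method; names and thresholds are illustrative assumptions.
from dataclasses import dataclass
import math


@dataclass
class Vec3:
    """A point in the display's 3D coordinate space (metres)."""
    x: float
    y: float
    z: float


def distance(a: Vec3, b: Vec3) -> float:
    """Euclidean distance between two 3D points."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)


def detect_pinch(thumb: Vec3, index: Vec3, pinch_threshold: float = 0.02) -> bool:
    """Assume a pinch when the two tracked fingertips come within ~2 cm."""
    return distance(thumb, index) <= pinch_threshold


def object_pinched(thumb: Vec3, index: Vec3,
                   obj_center: Vec3, obj_radius: float) -> bool:
    """Treat the object as pinched when a pinch is detected and its midpoint
    lies inside the object's (here spherical) perceived display volume."""
    if not detect_pinch(thumb, index):
        return False
    mid = Vec3((thumb.x + index.x) / 2,
               (thumb.y + index.y) / 2,
               (thumb.z + index.z) / 2)
    return distance(mid, obj_center) <= obj_radius
```

In a real system the fingertip positions would come from a camera-based hand tracker and the object's volume from the stereoscopic renderer, and the pinch threshold would be tuned to the display geometry.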

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Processing Or Creating Images (AREA)
IN1672DE2012 2011-06-07 2012-06-01 IN2012DE01672A

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011127447A JP2012256110A (ja) 2011-06-07 2011-06-07 Information processing apparatus, information processing method, and program

Publications (1)

Publication Number Publication Date
IN2012DE01672A 2015-09-25

Family

ID=46353997

Family Applications (1)

Application Number Title Priority Date Filing Date
IN1672DE2012 IN2012DE01672A 2011-06-07 2012-06-01

Country Status (6)

Country Link
US (2) US20120317510A1
EP (1) EP2533143A2
JP (1) JP2012256110A
CN (1) CN102981606A
BR (1) BR102012013210A2
IN (1) IN2012DE01672A

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9225810B2 (en) 2012-07-03 2015-12-29 Sony Corporation Terminal device, information processing method, program, and storage medium
US20140267142A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Extending interactive inputs via sensor fusion
JP6266229B2 (ja) * 2013-05-14 2018-01-24 東芝メディカルシステムズ株式会社 Image processing apparatus, method, and program
US9836199B2 (en) 2013-06-26 2017-12-05 Panasonic Intellectual Property Corporation Of America User interface device and display object operating method
US9996797B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Interactions with virtual objects for machine control
JP2015090593A (ja) * 2013-11-06 2015-05-11 ソニー株式会社 Information processing apparatus, information processing method, and information processing system
US10416834B1 (en) * 2013-11-15 2019-09-17 Leap Motion, Inc. Interaction strength using virtual objects for machine control
US20150346981A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Slider controlling visibility of objects in a 3d space
CN106716340B (zh) * 2014-09-29 2019-10-25 夏普株式会社 Portable terminal, control method of portable terminal, and control program
JP6573101B2 (ja) * 2015-04-02 2019-09-11 株式会社コト Interaction execution method, apparatus employing the method, and program
JP6551363B2 (ja) * 2016-10-28 2019-07-31 京セラドキュメントソリューションズ株式会社 Information processing apparatus
JP6470356B2 (ja) * 2017-07-21 2019-02-13 株式会社コロプラ Program executed by a computer providing a virtual space, method, and information processing apparatus executing the program
JP6568331B1 (ja) * 2019-04-17 2019-08-28 京セラ株式会社 Electronic device, control method, and program
JP7314622B2 (ja) 2019-05-29 2023-07-26 富士フイルムビジネスイノベーション株式会社 Image display device and image display program

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6466205B2 (en) * 1998-11-19 2002-10-15 Push Entertainment, Inc. System and method for creating 3D models from 2D sequential image data
EP1323019A2 * 2000-09-26 2003-07-02 Eugenio Bustamante Providing input signals
US6753847B2 (en) * 2002-01-25 2004-06-22 Silicon Graphics, Inc. Three dimensional volumetric display input and output configurations
US7532196B2 (en) * 2003-10-30 2009-05-12 Microsoft Corporation Distributed sensing techniques for mobile devices
US20050275942A1 (en) * 2004-04-02 2005-12-15 David Hartkop Method and apparatus to retrofit a display device for autostereoscopic display of interactive computer graphics
JP4864713B2 (ja) * 2004-09-30 2012-02-01 パイオニア株式会社 Stereoscopic two-dimensional image display device
WO2006121957A2 * 2005-05-09 2006-11-16 Michael Vesely Horizontal perspective simulator
US8743109B2 (en) * 2006-08-31 2014-06-03 Kent State University System and methods for multi-dimensional rendering and display of full volumetric data sets
US20100053151A1 (en) * 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device
JP5574523B2 (ja) * 2009-04-22 2014-08-20 株式会社プロテックデザイン Rotary input device and electronic apparatus
JP5343773B2 (ja) 2009-09-04 2013-11-13 ソニー株式会社 Information processing apparatus, display control method, and display control program
RU2524834C2 * 2009-10-14 2014-08-10 Nokia Corporation Device for autostereoscopic rendering and display
US8819591B2 (en) * 2009-10-30 2014-08-26 Accuray Incorporated Treatment planning in a virtual environment
US8593398B2 (en) * 2010-06-25 2013-11-26 Nokia Corporation Apparatus and method for proximity based input
KR20120000663A (ko) * 2010-06-28 2012-01-04 주식회사 팬택 3D object processing device
US20120005624A1 (en) * 2010-07-02 2012-01-05 Vesely Michael A User Interface Elements for Use within a Three Dimensional Scene
US20120007819A1 (en) * 2010-07-08 2012-01-12 Gregory Robert Hewes Automatic Convergence Based on Touchscreen Input for Stereoscopic Imaging
JP6049990B2 (ja) * 2010-09-15 2016-12-21 京セラ株式会社 Portable electronic device, screen control method, and screen control program
JP5703703B2 (ja) * 2010-11-11 2015-04-22 ソニー株式会社 Information processing apparatus, stereoscopic display method, and program
KR101727899B1 (ko) * 2010-11-26 2017-04-18 엘지전자 주식회사 Mobile terminal and operation control method thereof
EP2656181B1 * 2010-12-22 2019-10-30 zSpace, Inc. Three-dimensional tracking of a user control device in a volume
CN102096511A * 2011-02-10 2011-06-15 林胜军 Stereoscopic image touch device
US9632677B2 (en) * 2011-03-02 2017-04-25 The Boeing Company System and method for navigating a 3-D environment using a multi-input interface
US20120287065A1 (en) * 2011-05-10 2012-11-15 Kyocera Corporation Electronic device

Also Published As

Publication number Publication date
BR102012013210A2 (pt) 2014-12-09
EP2533143A2 2012-12-12
JP2012256110A (ja) 2012-12-27
US20120317510A1 (en) 2012-12-13
US20160328115A1 (en) 2016-11-10
CN102981606A (zh) 2013-03-20

Similar Documents

Publication Publication Date Title
IN2012DE01672A
IN2014DN07500A
AU2018253539A1 (en) Device, method, and graphical user interface for moving and dropping a user interface object
EP2795434A4 Detection of user control gestures
IN2013MU01308A
IN2015DN02046A
EP3387628A4 Apparatus, system and methods for interfacing with a user and/or external apparatus by stationary-state detection
EP3074923A4 Eye tracking and user reaction detection
GB201305379D0 (en) Methods and systems for enrolling biometric data
EP3090353A4 Systems and methods for guided user actions
GB2494520B (en) Graphical user interface, computing device, and method for operating the same
EP2954110A4 Control user interface for an apparatus, and associated method
WO2015033152A3 Portable device
IN2015DN02386A
MX352448B (es) Automatic adjustment of the size of content presented on a display
GB201009249D0 (en) Entertainment device and entertainment methods
EP2676234A4 Providing contextual content based on another user
MX2017002065A (es) Display apparatus and control method thereof
GB2523509A (en) Temperature based on touching portable computing device
EP2989528A4 Attentive user detection for presenting personalized content on a display
GB201209638D0 (en) Perception loss detection
AU2012354743A8 (en) Electronic device and program for controlling electronic device
EP2965195A4 User authorization and presence detection in isolation from interference from, and control by, a host central processing unit and operating system
EP2939085A4 Variable touch-screen scanning rate based on user presence detection
EP3014401A4 Parallel detection of touch points using a graphics processor