WO2012093779A2 - User terminal supporting a multimodal interface using a user's touch and breath, and control method therefor - Google Patents

User terminal supporting a multimodal interface using a user's touch and breath, and control method therefor

Info

Publication number
WO2012093779A2
WO2012093779A2 (PCT/KR2011/009607)
Authority
WO
WIPO (PCT)
Prior art keywords
user
touch
breathing
command
input
Prior art date
Application number
PCT/KR2011/009607
Other languages
English (en)
Korean (ko)
Other versions
WO2012093779A3 (fr)
Inventor
최종명
Original Assignee
목포대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 목포대학교산학협력단 filed Critical 목포대학교산학협력단
Publication of WO2012093779A2 publication Critical patent/WO2012093779A2/fr
Publication of WO2012093779A3 publication Critical patent/WO2012093779A3/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0381Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • the present invention relates to a user terminal supporting a multi-modal interface and a control method thereof, and more particularly, to a user terminal that supports convenient user command input, similar to multi-touch, by using a user's touch and breath through a plurality of input devices (a touch screen and a microphone) without using multi-touch on a small terminal (cell phone, PDA, PMP, pad, etc.), and a control method thereof.
  • a user interface using multi-touch gestures is very convenient, but has the drawback that the user must always use both hands.
  • smartphones and pads (iPad, Galaxy, etc.) that support touch screens and multi-touch input accept commands such as zooming in, zooming out, and moving screen contents through multi-touch when controlling applications. For example, on Apple's iPhone, placing two fingers on the touch screen and spreading them apart enlarges the screen.
  • Virtual thumb is a method in which placing a finger on the touch screen causes a virtual finger to appear at the corresponding position. That is, the user may use the multi-touch gesture function with the virtual finger and the real finger together.
  • the virtual thumb has the advantage of providing multi-touch gesture functionality with one finger, but the user must press the touch screen once to create the virtual finger and then press it once again to perform the gesture.
  • an aspect of the present invention is to provide a user terminal supporting a multi-modal interface using a user's touch and breath, and a control method thereof.
  • a method of controlling a user terminal through a multi-modal interface using a user's touch and breath comprises: receiving a user touch; receiving the user's breath; and performing a user command corresponding to the user touch and the user's breath.
  • the method may further include detecting the user's blowing form, and the user command may correspond to the user's blowing form.
  • the user's blowing form may be detected based on at least one of the intensity and the number of the user's breaths.
  • the user command may correspond to a combination of the user's blowing form and the user touch position.
  • the user command may be a command to rotate the screen counterclockwise.
  • the user command may be a command to rotate the screen clockwise.
  • the rotation angle of the screen may increase according to the intensity or number of the user's breaths.
  • the performing of the user command in the method may be carried out when the user's breath is input within a predetermined time after the user touch input.
  • a computer readable medium records a program for causing a computer to execute any one of the above methods.
  • a user terminal supporting a multi-modal interface using a user's touch and breath includes a touch input unit for receiving a user touch, a breath input unit for receiving the user's breath, and a control unit for performing a user command corresponding to the user touch and the user's breath.
  • the control unit may detect the user's blowing form, and the user command may correspond to the user's blowing form.
  • FIG. 1 is a conceptual diagram illustrating a user command input through a multi-modal interface of a user terminal according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of a user terminal according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating user blowing forms input to a user terminal according to an embodiment of the present invention.
  • FIG. 4 is a flowchart provided to explain a control method of a user terminal according to an exemplary embodiment of the present invention.
  • 5A and 5B are views illustrating a location of a user touch input to a user terminal according to an embodiment of the present invention.
  • FIG. 1 is a conceptual diagram illustrating a user command input through a multi-modal interface of a user terminal according to an embodiment of the present invention.
  • the multi-modal interface refers to an input method using multiple input devices (touch screen, microphone, etc.) at the same time.
  • after the user touch is input, the user terminal 20 may wait for the user's breath 13 input for a predetermined time.
  • when the user's breath is input, the user terminal 20 detects the user's blowing form and performs an operation accordingly.
  • when no breath is input within the waiting time, the user terminal 20 performs an operation according to the touch input alone.
  • the user's blowing form may be detected, and the corresponding operation performed, only while the user's finger 11 is in contact with the touch input unit 21.
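The flow described above (touch first, then a bounded wait for the breath, with a touch-only fallback) can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function names and the waiting time are assumptions.

```python
import time

BREATH_WAIT_SECONDS = 1.0  # hypothetical default waiting time (S420)

def handle_touch(touch_pos, read_breath, classify_breath, run_command, run_touch_action):
    """On a touch, wait briefly for a breath; fall back to a plain touch action."""
    deadline = time.monotonic() + BREATH_WAIT_SECONDS
    while time.monotonic() < deadline:
        breath = read_breath()              # returns a breath sample or None
        if breath is not None:
            form = classify_breath(breath)  # detect the blowing form (S430)
            run_command(form, touch_pos)    # command from breath form + touch position (S440)
            return "breath_command"
        time.sleep(0.01)
    run_touch_action(touch_pos)             # no breath arrived: ordinary touch handling
    return "touch_only"
```

In this sketch the callbacks stand in for the touch input unit, breath input unit, and control unit of FIG. 2.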
  • FIG. 2 is a block diagram showing the configuration of a user terminal according to an embodiment of the present invention.
  • the user terminal 20 may include a touch input unit 21, a breath input unit 22, a display unit 23, a storage unit 24, and a controller 25.
  • the touch input unit 21 receives a user input by contact.
  • the touch input unit 21 may be disposed on the display unit 23 to identify a position where the user's finger is in contact with and transmit the touch input to the control unit 25.
  • the touch input unit 21 may be implemented in any manner, such as capacitive or resistive.
  • the breath input unit 22 receives a user input produced by the user's blowing.
  • the breath input unit 22 may transmit information on the user's blowing form to the controller 25.
  • the breath input unit 22 may be implemented as a microphone.
  • the microphone vibrates by a sound caused by a user's breath, and can output an electrical signal according to the vibration.
  • the electrical signal output from the microphone may have a different shape according to the intensity and number of the user's breaths, and may thus correspond to the user's breath.
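As an illustration only (not part of the patent), the breath intensity and the number of puffs could be estimated from the microphone samples roughly as follows; the threshold, sample rate, and quiet-gap values are assumed.

```python
def breath_features(samples, threshold=0.3, rate=8000, min_gap=0.2):
    """Estimate breath intensity (peak amplitude) and puff count from mic samples.

    A puff is counted each time the absolute amplitude crosses `threshold`
    after at least `min_gap` seconds of quiet. All parameter values are
    illustrative assumptions, not values from the patent.
    """
    peak, count = 0.0, 0
    gap = int(min_gap * rate)   # quiet samples required between puffs
    quiet = gap                 # start "quiet" so the first puff counts
    for s in samples:
        a = abs(s)
        peak = max(peak, a)
        if a >= threshold:
            if quiet >= gap:
                count += 1      # new puff after a quiet period
            quiet = 0
        else:
            quiet += 1
    return peak, count
```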
  • the display unit 23 performs a function of displaying various information related to the operation of the user terminal 20 on the screen.
  • the display unit 23 may be implemented as a liquid crystal display (LCD) panel, organic light emitting diodes (OLED) panel, or the like.
  • the storage unit 24 stores various data and programs related to the operation of the user terminal 20 and performs a function of providing the data according to a request of the controller 25.
  • the storage unit 24 may be implemented as a storage medium such as a RAM, a ROM, a hard disk, or the like.
  • the control unit 25 controls the overall operation of the user terminal 20. In particular, the control unit 25 according to the present invention executes a user command corresponding to the user touch input through the touch input unit 21 and the user's breath input through the breath input unit 22.
  • the control unit 25 may detect the user's blowing form and perform a user command accordingly.
  • the user's blowing form may be detected by analyzing the intensity of the user's breath or the number of the user's breaths.
  • FIG. 3 is a diagram illustrating a user blowing form input to a user terminal according to an embodiment of the present invention.
  • the user's blowing forms may be divided according to the intensity of the user's breath or the number of breaths, for example: blowing hard once (A), blowing long (B), blowing hard twice (C), or sustaining a blow of medium intensity (D). FIG. 3 is merely an example, and various other forms may be used.
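The four example forms of FIG. 3 could be separated by intensity, count, and duration along these lines. This is a hedged sketch: the thresholds (`strong`, `long_s`) and the exact decision order are assumptions, not taken from the patent.

```python
def classify_blow(intensity, count, duration, strong=0.6, long_s=1.0):
    """Map breath measurements to the four illustrative forms of FIG. 3.

    A: one short hard puff       B: one long blow
    C: two hard puffs            D: sustained blow of medium intensity
    Threshold values are assumptions.
    """
    if count >= 2:
        return "C"                                  # two puffs dominate
    if duration >= long_s:
        return "B" if intensity >= strong else "D"  # long: hard vs. medium
    return "A" if intensity >= strong else None     # short: hard puff or unrecognized
```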
  • FIG. 4 is a flowchart provided to explain a control method of a user terminal according to an exemplary embodiment of the present invention.
  • the controller 25 waits for a user input for a predetermined time (S420).
  • the waiting time may be set by default by the user terminal manufacturer or by the producer of a multi-modal interface application according to the present invention, or may be set by the user.
  • the user terminal 20 may be implemented to perform the operations of steps S420 to S440 when the user's breath is input while the user touch input is maintained.
  • the control unit 25 detects the form of the input user's breath (S430).
  • the user's blowing form may be classified and detected as different forms according to the intensity of the user's breath or the number of breaths.
  • the controller 25 performs a user command corresponding to the detected blowing form (S440). For example, among the blowing forms illustrated in FIG. 3, when form A is detected the screen is enlarged, when form B is detected the screen is moved, when form C is detected the screen is reduced, and when form D is detected the screen is rotated.
  • the user commands performed according to the blowing forms are merely examples; other user commands may be performed instead, and blowing forms other than those illustrated may also be used.
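The example mapping of step S440 amounts to a small dispatch table. The action names below are placeholders for illustration; the text stresses that the mapping itself is configurable.

```python
# Example mapping from detected blowing form to screen operation (S440).
# Both the forms and the action names are illustrative, per the text.
COMMANDS = {
    "A": "zoom_in",        # one short hard puff  -> enlarge screen
    "B": "move_screen",    # one long blow        -> move screen
    "C": "zoom_out",       # two hard puffs       -> reduce screen
    "D": "rotate_screen",  # sustained medium blow -> rotate screen
}

def perform_command(form):
    """Return the screen operation for a detected blowing form."""
    return COMMANDS.get(form, "ignore")  # unknown forms are ignored
```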
  • control unit 25 may perform a user command according to the user touch input (S450).
  • the user command may be implemented to correspond to a combination of the user's blowing form and the user's touch position.
  • a user command corresponding to a combination of the user's blowing form and the user's touch position will be described with reference to FIG. 5.
  • FIGS. 5(a) and 5(b) are diagrams provided to explain the rotation of the screen according to the user touch position when the user's breath is input, according to an embodiment of the present invention.
  • when the user's breath is input while the user touch position is on the left side of the screen, the user terminal 20 may rotate the screen counterclockwise (L).
  • when the user's breath is input while the user touch position is on the right side of the screen, the user terminal 20 may rotate the screen clockwise.
  • the rotation angle of the screen may increase according to the intensity or number of the user's breaths. For example, when the user's breath is input while the user touch position (Ptouch) is to the left (or right) of the center of the screen, the screen may be rotated further counterclockwise (or clockwise) in proportion to the intensity of the user's breath.
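The touch-position/intensity rule above could be expressed as follows. This is only a sketch: the sign convention (counterclockwise positive) and the `gain` scale factor are assumptions, not values from the patent.

```python
def rotation_degrees(touch_x, screen_width, intensity, gain=90.0):
    """Rotation for a breath input while touching left/right of screen centre.

    Touch left of centre -> counterclockwise (positive angle); right of
    centre -> clockwise (negative angle). Magnitude grows in proportion to
    breath intensity. `gain` (degrees at full intensity) is an assumption.
    """
    centre = screen_width / 2.0
    if touch_x == centre:
        return 0.0                                  # centred touch: no rotation
    direction = 1.0 if touch_x < centre else -1.0   # left: CCW, right: CW
    return direction * gain * intensity
```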
  • Embodiments of the invention include a computer readable medium containing program instructions for performing various computer-implemented operations.
  • This medium records a program for executing the control method, described above, of a user terminal supporting a multi-modal interface using a user's touch and breath.
  • the media may include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of such media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CDs and DVDs; magneto-optical media such as floptical disks; and hardware devices configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • the medium may be a transmission medium such as an optical or metal wire, a waveguide, or the like including a carrier wave for transmitting a signal specifying a program command, a data structure, and the like.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • according to the present invention, in a small computer system such as a mobile phone, PDA, or tablet pad, the user can hold the user terminal with one hand and still use a function similar to a multi-touch gesture through the user's breath.
  • the user can thus control the user terminal through a function similar to a multi-touch gesture with only one hand, while the other hand is free to do other work.
  • a user who cannot use one hand also has the advantage of being able to control the user terminal in a manner similar to a multi-touch gesture.

Abstract

Disclosed are a user terminal supporting a multimodal interface using a user's touch and breath, and a control method therefor. The user terminal according to the invention comprises a touch input unit that receives the user's touch input, a breath input unit that receives the user's breath input, and a control unit that executes a user command corresponding to the user's touch and breath. The invention thus enables the user to control the user terminal with a single hand, through a function similar to a multi-touch gesture, while using the other hand for another task.
PCT/KR2011/009607 2011-01-04 2011-12-14 User terminal supporting a multimodal interface using a user's touch and breath, and control method therefor WO2012093779A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2011-0000496 2011-01-04
KR1020110000496A KR101242481B1 (ko) 2011-01-04 2011-01-04 User terminal supporting a multimodal interface using a user's touch and breath, and control method therefor

Publications (2)

Publication Number Publication Date
WO2012093779A2 true WO2012093779A2 (fr) 2012-07-12
WO2012093779A3 WO2012093779A3 (fr) 2012-09-07

Family

ID=46457787

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/009607 WO2012093779A2 (fr) 2011-01-04 2011-12-14 User terminal supporting a multimodal interface using a user's touch and breath, and control method therefor

Country Status (2)

Country Link
KR (1) KR101242481B1 (fr)
WO (1) WO2012093779A2 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9304583B2 (en) 2008-11-20 2016-04-05 Amazon Technologies, Inc. Movement recognition as input mechanism
CN105812506A (zh) * 2014-12-27 2016-07-27 深圳富泰宏精密工业有限公司 操作方式控制系统与方法
US9483113B1 (en) 2013-03-08 2016-11-01 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
US9832452B1 (en) 2013-08-12 2017-11-28 Amazon Technologies, Inc. Robust user detection and tracking
US11199906B1 (en) * 2013-09-04 2021-12-14 Amazon Technologies, Inc. Global user input management

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19832729B4 * 1998-07-21 2007-09-06 Schaeffler Kg Synchronizer body with integrated thrust piece or detent element
KR101984583B1 * 2012-08-22 2019-05-31 엘지전자 주식회사 Mobile terminal and control method thereof
CN109981887B * 2019-02-19 2021-09-07 合肥京东方光电科技有限公司 Method, apparatus, medium, and electronic device for controlling the power-on state of a terminal screen

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006204877A * 2005-04-13 2006-08-10 Nintendo Co Ltd Game program and game device
US20090244003A1 (en) * 2008-03-26 2009-10-01 Pierre Bonnat Method and system for interfacing with an electronic device via respiratory and/or tactual input
KR20100077982A * 2008-12-29 2010-07-08 엘지전자 주식회사 Terminal and control method thereof
KR20100128265A * 2010-11-17 2010-12-07 표현숙 User interface using a multimodal interface utilizing finger touch and breath on a small terminal



Also Published As

Publication number Publication date
WO2012093779A3 (fr) 2012-09-07
KR20120079285A (ko) 2012-07-12
KR101242481B1 (ko) 2013-03-13

Similar Documents

Publication Publication Date Title
WO2012093779A2 (fr) User terminal supporting a multimodal interface using a user's touch and breath, and control method therefor
KR101995278B1 (ko) Method and apparatus for displaying the UI of a touch device
WO2014065499A1 (fr) Editing method based on defining a text block with multiple touches
KR102545602B1 (ko) Electronic device and operating method thereof
WO2014084633A1 (fr) Method for displaying applications and electronic device therefor
WO2016017972A1 (fr) Electronic device and method for displaying its user interface
WO2012060589A2 (fr) Touch control method and portable terminal supporting the same
WO2013100720A1 (fr) Multitasking apparatus and method for a user device
WO2013032234A1 (fr) Method for implementing a user interface in a portable terminal and apparatus therefor
WO2011083962A2 (fr) Method and apparatus for setting a section of a multimedia file in a mobile device
WO2014107005A1 (fr) Method for providing a mouse function and terminal implementing the same
WO2012108620A2 (fr) Method for controlling a terminal based on a plurality of inputs, and portable terminal supporting the same
WO2015030526A1 (fr) Method and apparatus for providing multiple applications
CN106371900B (zh) Data processing method and apparatus for implementing asynchronous calls
WO2014025131A1 (fr) Method and system for displaying a graphical user interface
CN110007996B (zh) Application management method and terminal
WO2014129828A1 (fr) Method for providing feedback in response to a user input, and terminal implementing the same
KR102521192B1 (ko) Electronic device and operating method thereof
KR20150069801A (ko) Screen control method and electronic device thereof
WO2021104163A1 (fr) Icon arrangement method and electronic device
WO2014129787A1 (fr) Electronic device having a touch user interface and operating method thereof
WO2011090302A2 (fr) Method for operating a personal portable device having a touch screen
CN110928461A (zh) Icon moving method and electronic device
WO2018084684A1 (fr) Method for controlling execution of an application on an electronic device using a touch screen, and electronic device therefor
WO2020238357A1 (fr) Icon display method and terminal device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11854563

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11854563

Country of ref document: EP

Kind code of ref document: A2