WO2015194918A1 - Method and system for providing a user interface, and non-transitory computer-readable recording medium - Google Patents

Method and system for providing a user interface, and non-transitory computer-readable recording medium

Info

Publication number
WO2015194918A1
Authority
WO
WIPO (PCT)
Prior art keywords
biometric information
gesture
input
user
command
Prior art date
Application number
PCT/KR2015/006297
Other languages
English (en)
Korean (ko)
Inventor
황성재
Original Assignee
주식회사 퓨처플레이
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020140153954A (publication KR20150145677A)
Application filed by 주식회사 퓨처플레이
Priority to KR1020187026863A (publication KR20180107288A)
Priority to KR1020167007891A (publication KR101901735B1)
Publication of WO2015194918A1
Priority to US15/092,791 (publication US20160370866A1)


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Definitions

  • the present invention relates to a method, system and non-transitory computer readable recording medium for providing a user interface.
  • as an example of such a technology, Apple Inc. introduced a technology called Touch ID, which recognizes a user's fingerprint using RF (Radio Frequency) within a predetermined area.
  • however, the conventional technology introduced so far pays attention only to recognizing biometric information such as a fingerprint on the mobile terminal device; it makes no suggestion of a user interface that helps the user easily control the mobile terminal device based on the biometric information recognized on the mobile terminal device.
  • accordingly, the present inventors have developed a technology that recognizes a gesture together with biometric information and utilizes both of them to provide a user with a more convenient and expanded user interface.
  • the object of the present invention is to solve all the above-mentioned problems.
  • another object of the present invention is to recognize a gesture and biometric information obtained from a gesture input means and a biometric information input means, respectively; to specify, with reference to a predetermined correspondence between gestures or biometric information and objects or commands, the object and the command corresponding to the recognized gesture or the recognized biometric information; to specify the user with reference to the recognized biometric information; and to determine that the specified command is to be performed on the specified object within the scope of authority granted to the specified user, thereby recognizing both the gesture and the biometric information and providing the user with a more convenient and expanded user interface based on the recognition result.
  • according to one aspect of the present invention, there is provided a method for providing a user interface, comprising the steps of: (a) recognizing a gesture and biometric information obtained from a gesture input means and a biometric information input means, respectively; (b) specifying an object and a command corresponding to the recognized gesture or the recognized biometric information by referring to a predetermined correspondence between gestures or biometric information and objects or commands, and specifying a user by referring to the recognized biometric information; and (c) determining that the specified command is to be performed on the specified object within the range of privileges granted to the specified user.
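As a purely illustrative reading of steps (a) to (c), the flow can be sketched in a few lines of Python. The correspondence tables, user names, and privilege sets below are invented for the example and are not part of the disclosure.

```python
# (Preset correspondences; all names are illustrative, not from the patent.)
GESTURE_TO_COMMAND = {"swipe_to_sensor": "save", "swipe_from_sensor": "load"}
FINGERPRINT_TO_USER = {"fp_index_A": "user_A", "fp_thumb_B": "user_B"}
USER_PRIVILEGES = {"user_A": {"save", "load"}, "user_B": {"load"}}

def provide_user_interface(gesture, fingerprint, touched_object):
    # (a) the gesture and biometric information are assumed already captured
    command = GESTURE_TO_COMMAND.get(gesture)      # (b) specify the command
    user = FINGERPRINT_TO_USER.get(fingerprint)    # (b) specify the user
    if command is None or user is None:
        return None
    # (c) perform the command only within the user's range of privileges
    if command not in USER_PRIVILEGES.get(user, set()):
        return f"denied: {user} may not {command}"
    return f"{user}: {command} {touched_object}"
```

Note that one recognized input pair yields both an identity and an action, which is the core of the claimed interface.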
  • according to another aspect of the present invention, there is provided a system for providing a user interface, comprising: a gesture recognition unit for recognizing a gesture obtained from the gesture input means; a biometric information recognition unit for recognizing biometric information obtained from the biometric information input means; and a command execution unit for specifying an object and a command corresponding to the recognized gesture or the recognized biometric information with reference to a predetermined correspondence between gestures or biometric information and objects or commands, specifying the user with reference to the recognized biometric information, and determining that the specified command is to be performed on the specified object within the range of privileges granted to the specified user.
  • there are further provided other methods and systems for implementing the present invention, and a non-transitory computer readable recording medium for recording a computer program for executing the method.
  • according to the present invention, the effect of enabling various kinds of control of the mobile terminal device is achieved by using the gesture and the biometric information together.
  • according to the present invention, by associating an object or a command with each inputtable gesture or piece of biometric information, the effect is achieved that the corresponding command can be performed on the corresponding object when the user simply inputs the gesture and the biometric information.
  • FIG. 1 is a diagram illustrating an internal configuration of a user interface providing system according to an embodiment of the present invention by way of example.
  • FIGS. 2 to 4 are diagrams exemplarily illustrating how a user interface operates according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a process of providing visual feedback according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a state in which a user interface operates according to an embodiment of the present invention.
  • FIG. 7 is a diagram exemplarily illustrating a configuration for providing a user interface using an integrated technical means capable of receiving both a gesture and biometric information according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating a configuration in which various commands are performed in response to various gestures according to an embodiment of the present invention.
  • the user terminal device according to an embodiment of the present invention may be any digital device equipped with a memory means and a microprocessor, such as a personal computer (for example, a desktop computer or a notebook computer), a server, a workstation, a PDA, a web pad, a mobile phone, a smartphone, or a tablet.
  • in some cases, the user terminal device to be described later may itself be the user interface providing system.
  • the user interface providing system according to an embodiment of the present invention may include: a gesture input means capable of receiving various gestures from a user by using a touch screen, an infrared sensor, an acceleration sensor, a camera, or the like; a biometric information input means for receiving biometric information from a user by using a fingerprint recognition sensor, an iris recognition sensor, a voice recognition sensor, a pulse recognition sensor, an EEG sensor, a temperature recognition sensor, or the like; and a display means for displaying visual information accompanying the user interface.
  • FIG. 1 is a diagram illustrating an internal configuration of a user interface providing system according to an embodiment of the present invention by way of example.
  • the user interface providing system 100 may include a gesture recognition unit 110, a biometric information recognition unit 120, a command execution unit 130, a communication unit 140, and a controller 150. According to an embodiment of the present invention, at least some of the gesture recognition unit 110, the biometric information recognition unit 120, the command execution unit 130, the communication unit 140, and the controller 150 may be program modules that communicate with an external system (not shown). Such program modules may be included in the user interface providing system 100 in the form of operating systems, application program modules, and other program modules, and may be physically stored on various known storage devices. Further, such program modules may be stored in a remote storage device that can communicate with the user interface providing system 100. Meanwhile, such program modules include, but are not limited to, routines, subroutines, programs, objects, components, and data structures that perform particular tasks or execute particular abstract data types as described below in accordance with the present invention.
  • the gesture recognition unit 110 may perform a function of recognizing a gesture input from a user with reference to a user operation input from a predetermined gesture input means.
  • the gesture according to the present invention may correspond to at least one object or command.
  • a gesture input from a user may include a touch input on an arbitrary point of the touch screen, a touch input extending from a first point to a second point on the touch screen (that is, a swipe or drag), a release after touch, a flicking, a pinch, and the like.
  • a gesture of shaking or rotating the user terminal device, sensed by an acceleration sensor or a gyro sensor included in the device, may also be included in the gesture of the present invention.
  • the gesture according to the present invention is not necessarily limited to those listed above, and may be varied as much as possible within the scope of achieving the object of the present invention.
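For illustration, a minimal classifier over a touch trace might distinguish a tap from a directional swipe as follows. The threshold and gesture names are assumptions; a real recognizer would also use timing, pressure, and multi-touch data (for example, to detect a flick or pinch).

```python
import math

def classify_touch_gesture(points, tap_radius=10.0):
    """Classify a touch trace (list of (x, y) samples) as 'tap' or a swipe."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    if distance <= tap_radius:
        return "tap"
    # Report the dominant direction of the swipe/drag.
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"
```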
  • the biometric information recognizing unit 120 may perform a function of recognizing biometric information input from a predetermined biometric information input means.
  • the biometric information according to the present invention may correspond to at least one object or command.
  • the biometric information input from the user may include fingerprint information obtained from a fingerprint recognition sensor, iris information obtained from an iris recognition sensor, vein information obtained from a vein recognition sensor, voice information obtained from a voice recognition sensor, pulse information obtained from a pulse recognition sensor, brain wave information obtained from an EEG sensor, temperature information obtained from a temperature sensor, and the like.
  • the biometric information according to the present invention is not necessarily limited to those listed above, and may be varied as much as possible within the scope of achieving the object of the present invention.
  • the gesture and the biometric information may be input sequentially or simultaneously.
  • the biometric information may be recognized after the gesture is first recognized, or the gesture may be recognized after the biometric information is first recognized.
  • when the gesture input means and the biometric information input means are integrally formed, the gesture and the biometric information may be input at the same time.
  • for example, the gesture may be input through a touch screen, while the biometric information may be input through a fingerprint recognition sensor provided on a home button separate from the touch screen.
  • both the gesture and the biometric information may be input through an integrated technology means such as a fingerprint integrated display (FOD) or a trace fingerprint sensor.
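The sequential-or-simultaneous input described above can be sketched as pairing a gesture event with a biometric event that arrives within a short window, in either order. The event shape and window length are illustrative assumptions.

```python
def pair_inputs(events, window=1.0):
    """Pair a gesture event with a biometric event, in either order.

    `events` is a list of (timestamp, kind, value) tuples where kind is
    'gesture' or 'biometric'. Two events are combined when they arrive
    within `window` seconds of each other; with an integrated input means
    both may even share the same timestamp.
    """
    pairs = []
    pending = None  # last unmatched event
    for t, kind, value in sorted(events):
        if pending and pending[1] != kind and t - pending[0] <= window:
            g = value if kind == "gesture" else pending[2]
            b = value if kind == "biometric" else pending[2]
            pairs.append((g, b))
            pending = None
        else:
            pending = (t, kind, value)
    return pairs
```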
  • the command execution unit 130 may perform a function of specifying an object and a command corresponding to the recognized gesture or the recognized biometric information with reference to a predetermined correspondence between gestures or biometric information and objects or commands, and of specifying the user with reference to the recognized biometric information.
  • further, the command execution unit 130 may perform a function of determining that the specified command is performed on the specified object within the authority granted to the specified user.
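One simple way to model "within the authority granted to the specified user" is a per-user, per-object set of permitted commands. The structure and all names below are assumptions for illustration, not the patented mechanism.

```python
# Illustrative authority model: each user is granted commands per object.
AUTHORITY = {
    "user_A": {"content_A": {"save", "load"}, "content_B": {"load"}},
}

def authorize(user, command, obj):
    """Return True only if `command` on `obj` is within `user`'s authority."""
    return command in AUTHORITY.get(user, {}).get(obj, set())
```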
  • each of the various gestures that may be input from the user may be preset to correspond to an object or a command.
  • by referring to the preset correspondence, the object or command corresponding to the gesture input from the user can be specified.
  • the gesture of touching one of the plurality of icons displayed on the touch screen may be preset to correspond to selecting an object indicated by the icon.
  • as another example, a gesture of moving a touch from a first point to a second point (i.e., a swipe or drag) may be preset to correspond to a save or load command.
  • the biometric information that may be input from the user may be used as a criterion for identifying the user and specifying the user.
  • each of the various pieces of biometric information that may be input from the user may be preset to correspond to an object or a command, and by referring to the preset correspondence, the object or command corresponding to the biometric information input from the user can be specified.
  • the fingerprint information corresponding to the index finger may be preset to correspond to the object A
  • the fingerprint information corresponding to the thumb may be preset to correspond to the object B.
  • as another example, the fingerprint information corresponding to the index finger may be preset to correspond to a command to perform a payment using credit card A, and the fingerprint information corresponding to the thumb may be preset to correspond to a command to perform a payment using credit card B.
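The finger-to-card correspondence in this example can be expressed as a lookup table. The card names and return strings are hypothetical placeholders.

```python
# Hypothetical preset correspondence: the index finger pays with credit
# card A, the thumb with credit card B (names invented for the sketch).
FINGER_TO_CARD = {"index": "credit_card_A", "thumb": "credit_card_B"}

def pay(finger, amount):
    card = FINGER_TO_CARD.get(finger)
    if card is None:
        return "unrecognized finger: no payment performed"
    return f"paid {amount} with {card}"
```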
  • FIGS. 2 to 4 are diagrams exemplarily illustrating how a user interface operates according to an embodiment of the present invention.
  • the user interface providing system 100 recognizes the above gesture and biometric information, whereby user A, content A, and a save command can be specified, and it can be determined that a command for storing content A in the storage space given to user A on the cloud server is performed.
  • for example, the index finger may be placed on the fingerprint recognition sensor 220 provided alongside the touch screen 210.
  • the user interface providing system 100 recognizes the above gesture and biometric information, specifies user B, content B, and a load command, and may determine that a command is performed for loading content B, stored in the storage space given to user B on the cloud server in correspondence with the fingerprint information of user B's index finger, into the user terminal device and displaying content B on the touch screen 210 of the user terminal device.
  • the user interface providing system 100 may specify user A, content A, and a save command by recognizing the above gesture and biometric information, and may determine that a command is performed for storing content A, in correspondence with the finger of user A, in the storage space given to user A on the cloud server.
  • the user interface providing system 100 may recognize the above gesture and biometric information to specify user B, content B, and a load command, and may determine that a command is performed for loading content B, stored in the storage space given to user B on the cloud server in correspondence with the finger of user B, into the user terminal device.
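The save/load scenario of FIGS. 2 to 4 can be mimicked with a toy per-user store keyed by (user, finger). All names are illustrative, and a real system would of course persist to an actual cloud service.

```python
class CloudStore:
    """Toy per-user cloud storage keyed by (user, finger): a gesture plus
    index-finger fingerprint saves content, the reverse gesture loads it."""

    def __init__(self):
        self._spaces = {}  # user -> {finger: content}

    def save(self, user, finger, content):
        self._spaces.setdefault(user, {})[finger] = content

    def load(self, user, finger):
        return self._spaces.get(user, {}).get(finger)

# Demo mirroring the described scenario (names invented).
store = CloudStore()
store.save("user_A", "index", "content_A")
```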
  • when user A touches content A displayed on the touch screen 410 with the index finger for more than a predetermined time and touches content B with the thumb for more than a predetermined time, content A and content B may come to correspond to user A's index finger and thumb, respectively. Thereafter, when user A inputs a predetermined gesture using the index finger, a predetermined command may be performed on content A corresponding to the index finger; and when user A inputs a predetermined gesture using the thumb, a predetermined command may be performed on content B corresponding to the thumb.
  • FIG. 5 is a diagram illustrating a process of providing visual feedback according to an embodiment of the present invention.
  • for example, visual feedback in the shape of an arrow, which induces the user to input a gesture corresponding to the above-described drag, may be displayed on the touch screen 510.
  • FIG. 6 is a diagram illustrating a state in which a user interface operates according to an embodiment of the present invention.
  • when a user touches an arbitrary point on the touch screen 610 of the user terminal device with an index finger, performs a gesture of dragging toward the fingerprint recognition sensor 620 of the user terminal device while passing through a specific area 631 or 632 among a plurality of areas displayed on the touch screen 610, and then places the index finger on the fingerprint recognition sensor 620, the user interface providing system 100 according to an embodiment of the present invention may recognize the gesture and the biometric information and determine that a command corresponding to the specific area is performed. For example, when the above drag gesture passes through the block 631 marked with the number "7", payment may be made in a seven-month installment plan (see FIG. 6(a)); when the above drag gesture passes through the block 632 marked with the number "3", payment may be made in a three-month installment plan (see FIG. 6(b)).
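The installment selection of FIG. 6 amounts to hit-testing the drag path against numbered screen regions. The coordinates below are invented for the sketch.

```python
# Blocks on screen are numbered regions; the installment plan is taken
# from the block the drag passes through (coordinates are illustrative).
BLOCKS = {7: (0, 0, 100, 50), 3: (0, 60, 100, 110)}  # months -> (x0, y0, x1, y1)

def installment_months(drag_path):
    """Return the installment months for the first block the path crosses."""
    for x, y in drag_path:
        for months, (x0, y0, x1, y1) in BLOCKS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return months
    return None  # no block crossed: a default plan would apply
```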
  • FIG. 7 is a diagram exemplarily illustrating a configuration for providing a user interface using an integrated technical means capable of receiving both a gesture and biometric information according to an embodiment of the present invention.
  • fingerprint information may be obtained only when the user touches a specific area (hereinafter referred to as a fingerprint recognition area) on the fingerprint recognition integrated touch screen 720. That is, whether the fingerprint recognition function is activated may be determined according to the area in which the touch manipulation is input on the fingerprint recognition integrated touch screen 720.
  • for example, when a user performs an operation of touching a first graphic element on the fingerprint recognition integrated touch screen and, without releasing the touch (i.e., while the user keeps touching the fingerprint recognition integrated touch screen), moves the touch to the fingerprint recognition area that performs the fingerprint recognition function, the user's fingerprint information may be obtained through the fingerprint recognition area.
  • the fingerprint recognition area performing the fingerprint recognition function on the fingerprint recognition integrated touch screen may be displayed only temporarily based on a user's touch manipulation or the like.
  • the fingerprint recognition area may be displayed on the fingerprint recognition integrated touch screen only when the user inputs a touch operation corresponding to a preset gesture on the fingerprint recognition integrated touch screen.
  • the fingerprint recognition area may be displayed on the fingerprint recognition integrated touch screen only when the user touches a predetermined graphic element on the fingerprint recognition integrated touch screen.
  • certain graphic elements may be displayed on the fingerprint recognition integrated touch screen only when the user touches the fingerprint recognition area on the fingerprint recognition integrated touch screen.
  • the fingerprint recognition area may be displayed on the fingerprint recognition integrated touch screen only while the user maintains a touch on a predetermined graphic element on the fingerprint recognition integrated touch screen, and when the user releases the touch, the fingerprint recognition area displayed on the fingerprint recognition integrated touch screen may disappear.
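The show-while-touched behavior just described can be sketched as a tiny state holder. The triggering element name is hypothetical.

```python
class FingerprintArea:
    """Sketch: the fingerprint recognition area appears while a
    predetermined graphic element is touched and disappears on release."""

    def __init__(self):
        self.visible = False

    def on_touch_down(self, element):
        if element == "pay_button":  # hypothetical triggering element
            self.visible = True

    def on_touch_up(self):
        self.visible = False

# Demo of the lifecycle described in the text.
area = FingerprintArea()
area.on_touch_down("pay_button")
shown_while_touched = area.visible
area.on_touch_up()
visible_after_release = area.visible
area.on_touch_down("file_icon")  # a non-triggering element
visible_for_other_element = area.visible
```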
  • the display size and display form of the fingerprint recognition area may vary according to the type or function of the graphic element touched by the user on the fingerprint recognition integrated touch screen, or of the object corresponding to the graphic element.
  • auditory feedback or tactile feedback may be provided together in response to the user touching a predetermined graphic element displayed on the fingerprint recognition integrated touch screen.
  • the intensity, period (frequency), pattern, and provision method of the auditory feedback or the tactile feedback may vary according to the type or function of the touched graphic element or of the object corresponding to the graphic element.
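Varying feedback by element type is naturally a lookup table. The parameter values below are invented placeholders, not values from the disclosure.

```python
# Illustrative table: tactile feedback parameters per graphic-element type.
FEEDBACK = {
    "payment_icon": {"intensity": 0.9, "frequency_hz": 250, "pattern": "double"},
    "file_icon": {"intensity": 0.4, "frequency_hz": 150, "pattern": "single"},
}

def feedback_for(element_type):
    # Fall back to a gentle default for unknown element types.
    return FEEDBACK.get(
        element_type, {"intensity": 0.2, "frequency_hz": 100, "pattern": "single"}
    )
```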
  • FIG. 8 is a diagram illustrating a configuration in which various commands are performed in response to various gestures according to an embodiment of the present invention.
  • a user may input biometric information (i.e., fingerprint information) after performing various gestures, or may perform various gestures after inputting biometric information.
  • for example, it may be assumed that the user touches the fingerprint sensor to input fingerprint information immediately, without performing any gesture (see FIG. 8(a)); that the user touches the fingerprint sensor after shaking a hand in the air (see FIG. 8(b)); or that the user touches the fingerprint sensor after performing a gesture of inverting the hand (see FIG. 8(c)). In these cases, different commands may be performed corresponding to each of the above three cases.
  • likewise, it may be assumed that the user directly inputs fingerprint information by touching the fingerprint sensor without any subsequent gesture (see FIG. 8(d)); that the user performs a gesture of shaking a hand in the air after touching the fingerprint sensor to input fingerprint information (see FIG. 8(e)); or that the user performs a gesture of inverting the hand after touching the fingerprint sensor to input fingerprint information (see FIG. 8(f)). In these cases as well, different commands may be performed corresponding to each of the above three cases.
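The six cases of FIG. 8 reduce to dispatching on the order of gesture and fingerprint input. The command names below are hypothetical; the point is only that each (before, after) pair can map to a distinct command.

```python
# FIG. 8 sketch: the command depends on the gesture performed before or
# after the fingerprint input (command names invented for illustration).
SEQUENCE_TO_COMMAND = {
    (None, "fingerprint"): "unlock",        # (a)/(d) fingerprint alone
    ("shake", "fingerprint"): "share",      # (b) shake, then fingerprint
    ("invert", "fingerprint"): "mute",      # (c) invert hand, then fingerprint
    ("fingerprint", "shake"): "undo",       # (e) fingerprint, then shake
    ("fingerprint", "invert"): "lock",      # (f) fingerprint, then invert
}

def dispatch(first, second):
    return SEQUENCE_TO_COMMAND.get((first, second), "no-op")
```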
  • the communication unit 140 performs a function to enable the user interface providing system 100 to communicate with an external device.
  • the controller 150 performs a function of controlling the flow of data among the gesture recognition unit 110, the biometric information recognition unit 120, the command execution unit 130, and the communication unit 140. That is, the controller 150 controls the flow of data from the outside or between the components of the user interface providing system 100, so that the gesture recognition unit 110, the biometric information recognition unit 120, the command execution unit 130, and the communication unit 140 each perform their unique functions.
  • Embodiments according to the present invention described above may be implemented in the form of program instructions that may be executed by various computer components, and may be recorded on a non-transitory computer readable recording medium.
  • the non-transitory computer readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the non-transitory computer readable recording medium may be those specially designed and configured for the present invention, or may be known and available to those skilled in the computer software arts.
  • examples of the non-transitory computer readable recording medium include magnetic media such as hard disks, floppy disks and magnetic tape, optical recording media such as CD-ROM and DVD, magneto-optical media such as floptical disks, and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, flash memory, and the like.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device may be configured to operate as one or more software modules to perform the process according to the invention, and vice versa.
  • 110: gesture recognition unit

Abstract

According to one aspect of the present invention, a method for providing a user interface comprises the steps of: (a) recognizing a gesture and biometric information obtained from a gesture input means and a biometric information input means, respectively; (b) specifying an object and a command corresponding to the recognized gesture or the recognized biometric information by referring to a correspondence relationship between a predefined gesture or biometric information and a predefined object or command, and specifying a user by referring to the recognized biometric information; and (c) determining that the specified command is to be performed on the specified object within the scope of a right granted to the specified user.
PCT/KR2015/006297 2014-06-20 2015-06-22 Method and system for providing a user interface, and non-transitory computer-readable recording medium WO2015194918A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020187026863A KR20180107288A (ko) 2014-06-20 2015-06-22 Method, system, and non-transitory computer-readable recording medium for providing a user interface
KR1020167007891A KR101901735B1 (ko) 2014-06-20 2015-06-22 Method, system, and non-transitory computer-readable recording medium for providing a user interface
US15/092,791 US20160370866A1 (en) 2014-06-20 2016-04-07 Method, System and Non-Transitory Computer-Readable Recording Medium for Automatically Performing an Action

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2014-0075331 2014-06-20
KR20140075331 2014-06-20
KR1020140153954A KR20150145677A (ko) 2014-06-20 2014-11-06 Method, system, and non-transitory computer-readable recording medium for providing a user interface
KR10-2014-0153954 2014-11-06

Publications (1)

Publication Number Publication Date
WO2015194918A1 true WO2015194918A1 (fr) 2015-12-23

Family

ID=54935820

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/006297 WO2015194918A1 (fr) 2014-06-20 2015-06-22 Method and system for providing a user interface, and non-transitory computer-readable recording medium

Country Status (1)

Country Link
WO (1) WO2015194918A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113056718A (zh) * 2018-11-14 2021-06-29 华为技术有限公司 Method for operating and controlling a handheld mobile terminal, and related apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020087665A (ko) * 2001-05-15 2002-11-23 엘지전자 주식회사 Operating method according to the lost mode of a personal portable communication terminal
US20060215883A1 (en) * 2005-03-25 2006-09-28 Samsung Electronics Co., Ltd. Biometric identification apparatus and method using bio signals and artificial neural network
US20070150842A1 (en) * 2005-12-23 2007-06-28 Imran Chaudhri Unlocking a device by performing gestures on an unlock image
KR20120048359A (ko) * 2010-11-05 2012-05-15 목포대학교산학협력단 Three-dimensional image cube password interface system for ship crew access using an infrared camera
KR20140037280A (ko) * 2007-09-24 2014-03-26 애플 인크. Embedded authentication systems in an electronic device



Similar Documents

Publication Publication Date Title
WO2016129938A1 (fr) Procédé et appareil destinés à l'exécution d'une fonction de paiement dans un état limité
US8863042B2 (en) Handheld device with touch controls that reconfigure in response to the way a user operates the device
WO2012108723A2 (fr) Appareil d'affichage d'informations comportant au moins deux écrans tactiles et procédé associé d'affichage d'informations
JP6054892B2 (ja) 複数のディスプレイに対するアプリケーション画像の表示方法、電子機器およびコンピュータ・プログラム
WO2013141464A1 (fr) Procédé de commande d'entrée tactile
WO2014065499A1 (fr) Procédé d'édition basé sur la définition d'un bloc de texte grâce à plusieurs touchers
WO2014025131A1 (fr) Procédé et système pour afficher une interface utilisateur graphique
AU2012214924A1 (en) Information display apparatus having at least two touch screens and information display method thereof
KR20180107288A (ko) 사용자 인터페이스를 제공하기 위한 방법, 시스템 및 비일시성의 컴퓨터 판독 가능한 기록 매체
KR102521192B1 (ko) 전자 장치 및 그의 동작 방법
WO2015108224A1 (fr) Système permettant d'entraîner un dispositif par une entrée tactile en mode de basse puissance dans lequel l'écran est éteint
WO2018151449A1 (fr) Dispositif électronique et procédés permettant de déterminer une orientation du dispositif
CN104243749B (zh) 图像形成装置及图像形成装置的控制方法
WO2017209568A1 (fr) Dispositif électronique et procédé de fonctionnement associé
CN104536563A (zh) 一种电子设备控制方法及系统
WO2016080596A1 (fr) Procédé et système de fourniture d'outil de prototypage, et support d'enregistrement lisible par ordinateur non transitoire
CN105074644A (zh) 信息处理终端、屏幕控制方法以及屏幕控制程序
WO2017095123A1 (fr) Procédé, dispositif et système pour fournir une interface utilisateur, et support d'enregistrement lisible par ordinateur non temporaire
WO2012093779A2 (fr) Terminal utilisateur prenant en charge une interface multimodale utilisant l'effleurement et le souffle d'un utilisateur et procédé de commande de ce terminal
WO2018084684A1 (fr) Procédé destiné à commander l'exécution d'une application sur un dispositif électronique à l'aide d'un écran tactile et dispositif électronique destiné à ce dernier
US10732719B2 (en) Performing actions responsive to hovering over an input surface
WO2013005901A1 (fr) Appareil et procédé d'entrée de caractère sur un écran tactile
KR20160098752A (ko) 디스플레이 장치 및 디스플레이 방법 및 컴퓨터 판독가능 기록매체
WO2013100727A1 (fr) Appareil d'affichage et procédé de représentation d'image utilisant celui-ci
WO2014185753A1 (fr) Procédé pour apparier de multiples dispositifs, et dispositif et système de serveur pour permettre l'appariement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15808890

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20167007891

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15808890

Country of ref document: EP

Kind code of ref document: A1