WO2016199967A1 - Pattern- and sensor-based user intention input system - Google Patents

Pattern- and sensor-based user intention input system

Info

Publication number
WO2016199967A1
WO2016199967A1 (application PCT/KR2015/005968, KR2015005968W)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
user
unit
matching image
matching
Prior art date
Application number
PCT/KR2015/005968
Other languages
English (en)
Korean (ko)
Inventor
박순주
남기헌
Original Assignee
(주)블루와이즈
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)블루와이즈 filed Critical (주)블루와이즈
Priority to PCT/KR2015/005968 priority Critical patent/WO2016199967A1/fr
Publication of WO2016199967A1 publication Critical patent/WO2016199967A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • The present invention relates to a user intention input system and, in detail, to a pattern- and sensor-based system that analyzes a simple sketch and a motion input from a user and supports a new intention input method with increased degrees of freedom through matching images and motion information retrieved by search.
  • The present invention was created to solve the above-described problems, and an object of the present invention is to provide a pattern- and sensor-based user input system that helps a user input intentions using dynamic image objects (KVC, Kinetic Visual Contents), which allow the user to freely enter and edit his or her intentions through sketches and motions.
  • KVC: Kinetic Visual Contents
  • The present invention provides a user input system comprising: a database storing matching images for communication and motion information corresponding to various types of sensing data; an interface unit for receiving or modifying a sketch for communication from a user and outputting processed graphic information; a sensor unit for detecting motions applied by the user; a pattern search unit for searching the database for a matching image of the sketch, calculating a matching rate, and outputting the matching images with high matching rates to the interface unit; a motion search unit for searching the database for the sensing data of the sensor unit and outputting the corresponding motion information to the interface unit; and an intention information generator for generating intention information combining the matching image and the motion information.
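The claimed composition can be illustrated with a minimal data-flow sketch. Everything below (class and method names, the toy set-overlap similarity standing in for real image matching) is an illustrative assumption, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Database:
    # matching images keyed by object name; motion information keyed by sensing event
    matching_images: dict = field(default_factory=dict)
    motion_info: dict = field(default_factory=dict)

class PatternSearchUnit:
    def __init__(self, db):
        self.db = db

    def search(self, sketch_features):
        # score every stored matching image, highest matching rate first
        scored = [(name, self._match_rate(sketch_features, feats))
                  for name, feats in self.db.matching_images.items()]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)

    @staticmethod
    def _match_rate(a, b):
        # toy similarity: Jaccard overlap of feature sets (stand-in for real matching)
        return len(set(a) & set(b)) / max(len(set(a) | set(b)), 1)

class MotionSearchUnit:
    def __init__(self, db):
        self.db = db

    def search(self, sensing_event):
        # map a recognized sensing event to its stored motion information
        return self.db.motion_info.get(sensing_event)

class IntentionInfoGenerator:
    @staticmethod
    def combine(image_name, motion):
        # intention information = matching image + motion information
        return {"image": image_name, "motion": motion}

db = Database(
    matching_images={"cup": ["arc", "handle"], "ball": ["circle"]},
    motion_info={"flip": "turn object upside down"},
)
best, rate = PatternSearchUnit(db).search(["circle"])[0]
motion = MotionSearchUnit(db).search("flip")
intention = IntentionInfoGenerator.combine(best, motion)
```

The point of the sketch is the pipeline shape: the pattern search and motion search units both consult the same database, and the generator merely joins their outputs.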
  • The sensor unit preferably comprises an acceleration sensor for detecting shaking and tilting applied by the user, an acoustic sensor for detecting sound, an illuminance sensor for detecting brightness, and a proximity sensor for detecting the approach and withdrawal of the user's body.
  • the sensor unit may include a wearing unit worn on the user's body, an acceleration sensor installed in the wearing unit, and a short range communication module configured to transmit sensing data of the acceleration sensor to the motion search unit.
  • A matching rate setting unit outputs the matching rate alongside each matching image shown through the interface unit, receives a reference level from the user, and outputs only the matching images that satisfy the reference level.
  • The system may further include a DB update unit for receiving the user's verification of the matching images and motion information output through the interface unit and reconstructing the database based on the verified matching images and motion information.
  • The system may further include a user setting unit for configuring the database.
  • The method of expressing intention through dynamic image objects created from sketches and simple motions is improved, so that intention input can be made with greater freedom and expressiveness than the conventional static-image-based expression method.
  • FIG. 2 is a block diagram showing a configuration and a connection relationship according to a preferred embodiment of the present invention
  • FIG. 3 is a block diagram showing a state according to another embodiment of the present invention.
  • FIG. 1 is a conceptual diagram of the present invention
  • Figure 2 is a block diagram showing the configuration and connection relationships according to a preferred embodiment of the present invention.
  • The present invention basically provides a service through a server configured to transmit and receive data via wireless communication with a terminal carried by the user, such as a smartphone or smart pad.
  • the terminal may recognize a user's motion through various sensors including an acceleration sensor 131 while receiving a sketch from the user through a touch screen.
  • The system comprises the database 110, the interface unit 120, the sensor unit 130, the pattern search unit 140, the motion search unit 150, and the intention information generating unit 160; additionally, the matching rate setting unit 170, the DB update unit 180, and the user setting unit 190 may be added.
  • The database 110 is a configuration in which the matching images for communication and the motion information corresponding to various sensing data are stored in advance.
  • The basic matching images and motion information may be stored on the terminal carried by the user, but the database 110 may also be provided on a large-capacity server, enabling richer expression through more data, with updates then delivered to the terminal.
  • The matching image means any of various sketch images of a specific object required for intention expression, and the motion information means information about the user's motion recognized through the sensing data applied from the various sensors.
  • the interface unit 120 has a configuration corresponding to a touch screen formed on a terminal possessed by a user, and can basically receive a signal for controlling a function of the terminal and output processed graphic information.
  • The interface unit 120 provides an environment for entering or modifying a sketch for communication: the user draws or edits a sketch in a designated area using a pen or a body part such as a finger, and a matching image or name corresponding to the input sketch may be output together in real time.
  • The sensor unit 130 is composed of various sensors for sensing the motions applied by the user; since the present invention targets smart terminals, it basically utilizes the various sensors built into the smart terminal.
  • The sensor unit 130 includes an acceleration sensor 131 for detecting shaking and tilting applied by the user, an acoustic sensor 132 for detecting sound and changes in its loudness, an illuminance sensor 133 for detecting brightness and changes in brightness, and a proximity sensor 134 for detecting the approach and withdrawal of the user's body relative to the terminal.
  • The pattern search unit 140 is provided in the terminal; it searches the database 110 for matching images for the sketch entered through the interface unit 120, calculates a matching rate, and outputs the matching images with high matching rates to the interface unit 120. That is, since matching images corresponding to the various sketches a user may enter are registered in the database 110, the search returns the matching images with high matching rates, i.e. high similarity to the sketch.
  • The matching images with high matching rates are updated and output in real time, and the matching rate threshold can be set so that the user can control the search level.
  • The matching rate setting unit 170 displays the matching rate alongside each matching image output through the interface unit 120, receives a reference level from the user, and outputs only the matching images meeting that reference level. That is, a tool such as a slider in part of the interface unit 120 lets the user set the reference matching rate while viewing the matching rates of the retrieved images, helping the user check the matching images that qualify. The higher the reference matching rate, the smaller the number of matching images returned.
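The reference-level filtering described above reduces to a simple threshold over scored results; the function and sample data below are illustrative, not from the patent:

```python
def filter_by_reference_level(scored_images, reference_level):
    """Keep only matching images whose matching rate meets the user's
    reference level (0.0-1.0); a higher level yields fewer, closer matches."""
    return [(name, rate) for name, rate in scored_images
            if rate >= reference_level]

# scores as they might come back from the pattern search, best first
scored = [("cup", 0.92), ("mug", 0.81), ("bowl", 0.55)]
loose = filter_by_reference_level(scored, 0.8)   # "cup" and "mug" qualify
strict = filter_by_reference_level(scored, 0.9)  # raising the level shrinks the set
```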
  • The present invention classifies the elements constituting the sketch, i.e., lines and points, and applies an ORB (Oriented FAST and Rotated BRIEF) algorithm for the retrieval and matching of pre-stored matching images.
  • ORB: Oriented FAST and Rotated BRIEF
  • FAST: Features from Accelerated Segment Test
  • BRIEF: Binary Robust Independent Elementary Features
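ORB produces binary descriptors that are compared by Hamming distance (in practice via a library such as OpenCV's `ORB_create` and a Hamming-norm matcher). As a stdlib-only illustration of that comparison step, the toy descriptors and the `max_dist` threshold below are assumptions, not values from the patent:

```python
def hamming(d1: int, d2: int) -> int:
    # number of differing bits between two binary descriptors
    return bin(d1 ^ d2).count("1")

def match_rate(sketch_desc, stored_desc, max_dist=2):
    """Fraction of sketch descriptors that find a close counterpart
    (Hamming distance <= max_dist) among the stored image's descriptors."""
    if not sketch_desc:
        return 0.0
    hits = sum(1 for d in sketch_desc
               if any(hamming(d, e) <= max_dist for e in stored_desc))
    return hits / len(sketch_desc)

sketch = [0b10110010, 0b01100001]
stored = [0b10110011, 0b11111111]  # first is 1 bit away, second is far off
rate = match_rate(sketch, stored)  # one of two descriptors matched: 0.5
```

Ranking stored images by this rate and emitting the top results corresponds to the pattern search unit's output of matching images with high matching rates.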
  • the motion search unit 150 searches for sensing data of the sensor unit 130 in the database 110 and outputs corresponding motion information to the interface unit 120.
  • the sensor unit 130 includes an acceleration sensor 131, an acoustic sensor 132, an illuminance sensor 133, and a proximity sensor 134.
  • The acceleration sensor 131 can detect the start, end, and changes of shaking of the terminal by the user, as well as flipping and flip-and-shake motions; the acoustic sensor 132 can detect the start, end, and changes of sound; the illuminance sensor 133 can detect the surroundings becoming lighter or darker; and the proximity sensor 134 can detect the user's body approaching or moving away.
  • The corresponding motion information is retrieved and applied to the output matching image, or output on its own, and various motion information can be set for each sensor type and motion. For example, after the user sketches a specific object through the interface unit 120 and flips the terminal, the flip is detected through the acceleration sensor 131 and, as an example of corresponding motion information, a dynamic change such as the sketched object being flipped, struck, or dropped can be applied.
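The flip and shake events mentioned above can be recognized from accelerometer readings with simple heuristics. The sketch below assumes gravity reads about +1 g on the z-axis when the terminal lies face-up; the thresholds and function names are illustrative choices, not from the patent:

```python
def detect_flip(z_samples, threshold=-0.8):
    """Report a flip when the z-axis reading (about +1 g face-up) ends up
    below the threshold, i.e. the terminal has settled face-down."""
    return bool(z_samples) and z_samples[-1] < threshold

def detect_shake(magnitudes, g=1.0, delta=0.5, min_peaks=3):
    """Report shaking when enough acceleration magnitudes deviate
    strongly from the resting value of 1 g."""
    peaks = sum(1 for m in magnitudes if abs(m - g) > delta)
    return peaks >= min_peaks

flip = detect_flip([0.98, 0.2, -0.95])          # face-up, then settled face-down
shake = detect_shake([1.0, 1.9, 0.2, 1.8, 1.0])  # three strong deviations from 1 g
```

The recognized event name (e.g. a flip) is then what the motion search unit looks up in the database to obtain the corresponding motion information.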
  • The intention information generating unit 160 is a configuration for generating intention information combining the matching image and the motion information.
  • Because the intention information generating unit 160 sets various motion information for the matching image in advance, it achieves richer expression than the conventional static-image-based expression of intention. That is, dynamic content is added through the user's motion on top of the static matching image from the sketch, enabling dynamic input such as moving or transforming the matching image according to the set motion.
  • The generated intention information may be output through the interface unit 120 so that the user, or others present, can confirm it, or it may be transmitted to another terminal or a web server through a communication network.
  • The DB update unit 180 lets the user verify the matching images and motion information output through the interface unit 120 and performs reconstruction such as excluding incorrectly matched entries or deleting, at the user's request, matching images and motion information from the database 110.
  • The verification results accumulate, so a user-specific intention input environment can be built.
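The reconstruction step can be pictured as filtering the image store by the user's verdicts; the function, the verdict format, and the sample entries are all illustrative assumptions:

```python
def apply_verification(matching_images, verdicts):
    """Reconstruct the image store from user verdicts: entries the user
    marked False (mismatched) are excluded; unreviewed entries are kept."""
    return {name: img for name, img in matching_images.items()
            if verdicts.get(name, True)}

images = {"cup": "cup.svg", "dog": "cat.svg", "ball": "ball.svg"}
# the user confirms "cup" and rejects the mismatched "dog" entry
verified = apply_verification(images, {"dog": False, "cup": True})
```

Persisting `verified` back to the database is what accumulates into the user-specific input environment described above.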
  • The user setting unit 190 is provided as a configuration for building a user-customized environment.
  • The user setting unit 190 provides the matching images or motion information stored in the database 110 through the interface unit 120 at the user's request. Subsequently, the user reconfigures the database 110 by setting which sketch corresponds to a matching image output from the interface unit 120, or which motion corresponds to given motion information.
  • FIG. 3 is a block diagram illustrating another embodiment of the present invention, in which a wearable sensor linked to the terminal through short-range wireless communication is used to make broader use of the motions input by the user.
  • the sensor unit 130 is provided with a wearing unit 135 that can be worn on the user's body.
  • The wearing unit 135 may be provided in various forms, but considering that a large part of expression is made through hand gestures, it is preferably formed to be worn on the wrist or arm.
  • An ordinary acceleration sensor 131 is basically installed in it to detect hand or arm movement.
  • Since the wearing unit 135 is formed separately from the terminal and configured to be detachable, the sensing data of the acceleration sensor 131 installed in the wearing unit 135 is delivered to the terminal through a short-range communication module and applied to the motion search unit 150.
  • The wearing unit 135 includes a power supply, the acceleration sensor 131, and a first communication unit 136 composed of a short-range communication module such as a Bluetooth module for transmitting the detection results of the acceleration sensor 131, and the motion search unit includes a second communication unit 151 made of the same kind of communication module to exchange data with the first communication unit 136.
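Transmitting the wearable's accelerometer readings to the terminal implies some serialization over the short-range link. The packet layout below (a `b"KV"` magic, a sample count, then x/y/z floats) is entirely invented for illustration; a real Bluetooth implementation would use its own characteristic format:

```python
import struct

def encode_samples(samples):
    """Serialize (x, y, z) accelerometer samples into a hypothetical
    little-endian packet: 2-byte magic, 1-byte count, then 3 floats each."""
    packet = struct.pack("<2sB", b"KV", len(samples))
    for x, y, z in samples:
        packet += struct.pack("<fff", x, y, z)
    return packet

def decode_samples(packet):
    """Parse the packet back into a list of (x, y, z) tuples."""
    magic, count = struct.unpack_from("<2sB", packet, 0)
    if magic != b"KV":
        raise ValueError("unexpected packet magic")
    samples, offset = [], struct.calcsize("<2sB")
    for _ in range(count):
        samples.append(struct.unpack_from("<fff", packet, offset))
        offset += struct.calcsize("<fff")
    return samples

readings = [(0.0, 0.1, 0.98), (0.5, -0.2, 0.7)]
roundtrip = decode_samples(encode_samples(readings))
```

On the terminal side, the decoded samples would be fed to the motion search unit 150 just like readings from the built-in acceleration sensor.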

Abstract

The present invention relates to a pattern- and sensor-based user intention input system capable of supporting a new intention input method that analyzes a brief sketch and a motion input by a user and increases the degree of freedom by means of motion information and a matching image retrieved by search.
PCT/KR2015/005968 2015-06-12 2015-06-12 Pattern- and sensor-based user intention input system WO2016199967A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/005968 WO2016199967A1 (fr) 2015-06-12 2015-06-12 Pattern- and sensor-based user intention input system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/005968 WO2016199967A1 (fr) 2015-06-12 2015-06-12 Pattern- and sensor-based user intention input system

Publications (1)

Publication Number Publication Date
WO2016199967A1 true WO2016199967A1 (fr) 2016-12-15

Family

ID=57504166

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/005968 WO2016199967A1 (fr) 2015-06-12 2015-06-12 Pattern- and sensor-based user intention input system

Country Status (1)

Country Link
WO (1) WO2016199967A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014084622A1 * 2012-11-28 2014-06-05 (주)미디어인터랙티브 Motion recognition method through motion prediction
WO2014104612A2 * 2012-12-27 2014-07-03 주식회사 무크 Digital device for product design using an image with specific coordinates
US20140205188A1 * 2010-12-03 2014-07-24 Massachusetts Institute Of Technology Sketch Recognition System
WO2014171734A2 * 2013-04-17 2014-10-23 엘지전자 주식회사 Mobile terminal and method for controlling same
WO2015034177A1 * 2013-09-04 2015-03-12 에스케이텔레콤 주식회사 Method and device for executing a command based on context awareness



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15895035

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15895035

Country of ref document: EP

Kind code of ref document: A1