WO2016199967A1 - User intention input system on basis of pattern and sensor

User intention input system on basis of pattern and sensor

Info

Publication number
WO2016199967A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
sensor
matching
information
image
Prior art date
Application number
PCT/KR2015/005968
Other languages
French (fr)
Korean (ko)
Inventor
박순주
남기헌
Original Assignee
(주)블루와이즈
Priority date
Filing date
Publication date
Application filed by (주)블루와이즈
Priority to PCT/KR2015/005968
Publication of WO2016199967A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor

Abstract

The present invention relates to a user intention input system based on a pattern and a sensor, which supports a new intention input method: a motion and a brief sketch input by a user are analyzed, and the freedom of expression is increased through motion information and an image matched according to a search.

Description

Pattern and sensor-based user intention input system

The present invention relates to a user intention input system, and more particularly to a pattern and sensor-based user intention input system that supports a new intention input method in which a brief sketch and a motion input by the user are analyzed, and the freedom of expression is increased through a matching image and motion information retrieved by search.

People affected by conditions such as autism, developmental language disorders, or intellectual disabilities, and even the general public coping with increasingly complex modern life, often run into linguistic limitations when trying to input their intentions quickly and accurately in an IT environment.

Various methods that supplement or replace conventional language-based intention input and make intention expression easier are being studied, and in particular there has been much development of methods for expressing intention as a picture. Expressing intention through gestures and figures is known to be very effective, especially for children with developmental language disabilities, and in practice such non-verbal expressions account for a very high proportion of effective communication.

Conventional techniques for supporting non-verbal expression generally improve it using emoticons, and visual expression techniques that convey the user's intention with more freedom are being developed, but these remain stuck in the framework of non-verbal representation within character-based, asynchronous communication; a visual expression technology that lets users express their intentions more freely is therefore needed.

The present invention was created to solve the above problems, and its object is to provide a pattern and sensor-based user intention input system that helps the user input intention freely through a sketch and a motion, using editable kinetic visual content (KVC, Kinetic Visual Contents) objects.

To achieve the above object, the present invention provides a user intention input system comprising: a database in which matching images for communication and motion information corresponding to various sensed data are stored; an interface unit through which a sketch for communication is input or modified by the user and which outputs processed graphic information; a sensor unit that senses motions applied by the user; a pattern search unit that retrieves matching images for the sketch from the database, calculates matching rates, and outputs the matching images with the highest matching rates to the interface unit; a motion search unit that retrieves from the database the motion information corresponding to the sensed data of the sensor unit and outputs it to the interface unit; and an intention information generating unit that generates intention information by combining the matching image and the motion information.

At this time, the sensor unit preferably comprises an acceleration sensor that detects shaking and tilting applied by the user, a sound sensor that detects sound, a proximity sensor that detects the body approaching or moving away, and an ambient light sensor that detects brightness.

Further, the sensor unit may include a wearable part worn on a body part of the user, with an acceleration sensor provided in the wearable part and a short-range communication module that transmits the sensed data of the acceleration sensor to the motion search unit.

In addition, the system may further include: a matching rate setting unit that outputs each matching image through the interface unit together with its matching rate and receives a reference level from the user so that only matching images meeting the reference level are output; and a DB updating unit that receives the user's verification of the matching images and motion information output through the interface unit and reconstructs the database based on the verified matching images and motion information.

In addition, the system may further include a user setting unit that provides the matching images or motion information stored in the database through the interface unit at the user's request, lets the user match a sketch input through the interface unit to a matching image or a motion to motion information, and configures the database accordingly.

Through the present invention, intention expression is made from a simple sketch and motion using kinetic visual content objects, dramatically improving the freedom of intention input compared with conventional static-image-based intention expression systems.

In addition, the present invention makes communication easily accessible to communication-vulnerable groups, helping solve the various problems caused by social factors and enabling people with verbal communication disabilities to engage and communicate.

Figure 1 is a conceptual diagram of the present invention,

Figure 2 is a block diagram showing the configuration and connection relations according to a preferred embodiment of the invention,

Figure 3 is a block diagram showing the configuration according to another embodiment of the present invention.

The configuration of the pattern and sensor-based user intention input system of the present invention will now be described in detail with reference to the accompanying drawings.

Figure 1 is a conceptual diagram of the present invention, and Figure 2 is a block diagram showing the configuration and connection relations according to a preferred embodiment of the present invention. The service of the present invention basically takes place via a server, with users sending and receiving data over wireless communication with a terminal they carry, such as a smartphone or smart pad. The terminal receives sketch input from the user through a touch screen and at the same time recognizes the user's motions through a variety of sensors, including an acceleration sensor 131.

As illustrated in the accompanying drawings, a preferred embodiment of the present invention basically comprises a database 110, an interface unit 120, a sensor unit 130, a pattern search unit 140, a motion search unit 150, and an intention information generating unit 160, and additionally comprises a matching rate setting unit 170, a DB updating unit 180, and a user setting unit 190.

The database 110 is a component in which matching images for communication and motion information corresponding to various sensed data are stored in advance. Basic matching images and motion information are stored on the terminal the user carries, but to support a wider variety of intention expressions, a large-capacity server establishes the database 110 with more data, provides it to the terminal, and updates it periodically.

In the present invention, intention input basically occurs through the sketch and the motion input by the user. Here, a matching image means one of the various sketched images of objects needed for intention expression, and motion information means information about a user motion recognized from the sensed data supplied by the various sensors.
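
The patent does not specify a storage schema for the database 110; the following sketch illustrates one possible way to represent the two record types, with every field name being an assumption for illustration.

```python
# Hypothetical record types for the database 110. The patent says only
# what is stored (matching images; motion information keyed to sensed
# data), so the fields below are assumptions.
from dataclasses import dataclass

@dataclass
class MatchingImage:
    label: str          # concept the sketch expresses, e.g. "dog"
    image_path: str     # stored image rendered through the interface unit 120
    descriptors: bytes  # precomputed feature descriptors for the pattern search unit 140

@dataclass
class MotionInfo:
    sensor: str  # source sensor, e.g. "acceleration"
    event: str   # recognized motion, e.g. "shake" or "flip"
    effect: str  # dynamic change applied to a matching image
```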

The interface unit 120 may correspond to the touch screen provided on the terminal the user carries; it basically handles the signals that control the terminal's functions and outputs processed graphic information. In the present invention in particular, the interface unit 120 provides an environment in which the user inputs or modifies a sketch for communication using a pen or a body part such as a finger; for this purpose it may provide a dedicated region and correction tools, and it may also output, in real time, a matching image or name corresponding to the sketch being input.

The sensor unit 130 consists of a variety of sensors that detect motions applied by the user; since the present invention basically uses a smart terminal, it takes advantage of the variety of sensors embedded in the smart device.

More specifically, the sensor unit 130 uses an acceleration sensor 131 that detects shaking and tilting applied by the user, a sound sensor 132 that detects changes in the loudness of sounds and voice, an ambient light sensor 133 that senses changes in brightness, and a proximity sensor 134 that detects the user's body approaching or moving away from the terminal.

The pattern search unit 140 searches the database 110 for matching images for the sketch input to the terminal through the interface unit 120, calculates matching rates, and outputs the matching images with the highest matching rates to the interface unit 120. In other words, matching images corresponding to the various sketches a user may enter are registered in the database 110, and the search finds the matching images whose matching rate, that is, degree of similarity to the input sketch, is highest.

At this time, the more specific the input sketch becomes, the more accurately matching images can be found; as the sketch progresses, the matching images with the highest matching rates are updated in the output in real time, and the user can also set a matching rate as a detection level.

For this purpose, the matching rate setting unit 170 outputs each matching image through the interface unit 120 together with its matching rate, and receives a reference level from the user so that only matching images meeting the reference level are output. That is, the retrieved matching images carry matching rates, a tool such as a slider in a portion of the interface unit 120 lets the user set a matching rate, and the matching images consistent with that rate are determined. The higher the matching rate is set, the fewer matching images are retrieved.

For the pattern search unit 140 to discern the elements and feature points that make up the sketch and to search and match against the stored matching images, the present invention uses the ORB (Oriented FAST and Rotated BRIEF) algorithm. ORB combines FAST (Features from Accelerated Segment Test), which rapidly detects feature points from brightness differences between neighboring pixels, with BRIEF (Binary Robust Independent Elementary Features), which generates a binary representation from gray-level differences between pixel pairs in a given pattern, and matches the main feature points of the sketch input by the user against those of the matching images stored in the database.
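
OpenCV ships an ORB implementation, so the matching described above can be sketched concretely. The matching-rate formula and distance threshold below are assumptions; the patent names ORB but does not define how the rate is computed.

```python
# Sketch-to-image matching for the pattern search unit 140 using OpenCV's ORB.
import cv2

orb = cv2.ORB_create(nfeatures=500)  # FAST keypoints + rotated BRIEF descriptors
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # Hamming distance suits binary descriptors

def matching_rate(sketch_path: str, stored_path: str) -> float:
    """Similarity in [0, 1] between a user sketch and a stored matching image (formula assumed)."""
    sketch = cv2.imread(sketch_path, cv2.IMREAD_GRAYSCALE)
    stored = cv2.imread(stored_path, cv2.IMREAD_GRAYSCALE)
    _, des_sketch = orb.detectAndCompute(sketch, None)
    _, des_stored = orb.detectAndCompute(stored, None)
    if des_sketch is None or des_stored is None:
        return 0.0
    matches = matcher.match(des_sketch, des_stored)
    good = [m for m in matches if m.distance < 40]  # distance cutoff assumed
    return len(good) / len(des_sketch)

def search(sketch_path, stored_paths, reference_level=0.3):
    """Rank stored images by matching rate, keeping those above the user-set level."""
    scored = [(p, matching_rate(sketch_path, p)) for p in stored_paths]
    return sorted((s for s in scored if s[1] >= reference_level),
                  key=lambda s: s[1], reverse=True)
```

The `reference_level` argument plays the role of the slider in the matching rate setting unit 170: raising it returns fewer, better-matching images.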

The motion search unit 150 retrieves from the database 110 the motion information corresponding to the sensed data of the sensor unit 130 and outputs it to the interface unit 120. As noted above, in the preferred embodiment of the present invention the sensor unit 130 consists of the acceleration sensor 131, the sound sensor 132, the ambient light sensor 133, and the proximity sensor 134. Accordingly, the acceleration sensor 131 can detect the start and end of shaking, changes in its waveform, and motions such as turning the terminal over or flipping it; the sound sensor 132 can detect the start, end, and changes of a sound; the ambient light sensor 133 can detect the surroundings becoming darker or brighter; and the proximity sensor 134 can detect the user's body approaching or moving away.
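
The patent lists the detectable motions but not the detection logic; a minimal sketch of classifying raw accelerometer samples into such events might look as follows, with all thresholds and event names assumed.

```python
# Toy motion-event classifier for the motion search unit 150 (thresholds assumed).
import math
from typing import Optional

GRAVITY = 9.81          # m/s^2
SHAKE_THRESHOLD = 15.0  # total acceleration above this counts as a shake

def classify_sample(prev_z: float, x: float, y: float, z: float) -> Optional[str]:
    """Map one accelerometer sample (plus the previous z reading) to a motion event."""
    if math.sqrt(x * x + y * y + z * z) > SHAKE_THRESHOLD:
        return "shake"
    # A z-axis sign flip near +/-g suggests the terminal was turned over.
    if prev_z > 0.8 * GRAVITY and z < -0.8 * GRAVITY:
        return "flip"
    return None
```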

In the database 110, motion information is stored corresponding to the sensed data of these sensors, so that on a search the matching motion information is output on its own or applied to the image being output; various pieces of motion information can be set per sensor type and motion. For example, if the user sketches a particular object through the interface unit 120 and then makes a motion of turning the terminal over, the acceleration sensor 131 detects this, and a dynamic change, such as flipping the sketched object, can be set to occur.

The intention information generating unit 160 generates intention information by combining the matching image and the motion information, so that the various pieces of motion information set in advance for a matching image produce a more dynamic intention expression than conventional expression through static images. That is, besides the static matching image obtained from the sketch, dynamic content is added through the user's motion, enabling the input of dynamic objects in which the set matching image moves or operates. The generated intention information can be output through the interface unit 120 for the user to check, or transmitted over a communication network to another user's terminal or a web server.
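
The patent does not define a format for the generated intention information; one minimal way to sketch the combination of a matching image with motion-derived effects is shown below, where the KVC structure and the event-to-effect mapping are assumptions.

```python
# Hypothetical kinetic visual content (KVC) object produced by the
# intention information generating unit 160.
from dataclasses import dataclass, field

@dataclass
class KineticVisualContent:
    image_path: str                              # best matching image from the pattern search
    effects: list = field(default_factory=list)  # ordered dynamic effects to apply

EFFECT_FOR_EVENT = {"shake": "vibrate", "flip": "turn_over"}  # mapping assumed

def generate_intention(best_match: str, motion_events: list) -> KineticVisualContent:
    kvc = KineticVisualContent(image_path=best_match)
    for event in motion_events:
        if event in EFFECT_FOR_EVENT:
            kvc.effects.append(EFFECT_FOR_EVENT[event])
    return kvc

# e.g. a sketch matched to "dog.png", then the terminal is flipped:
kvc = generate_intention("dog.png", ["flip"])  # -> effects == ["turn_over"]
```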

The DB updating unit 180 lets the user verify the matching images and motion information output through the interface unit 120 and, based on that verification, reconstructs the database 110, for example by deleting matching images and motion information that were matched incorrectly or excluded by the user.

That is, by reconstructing the database 110 based on the user-verified matching images and motion information and accumulating the verification results, the matching of images and motion information comes to reflect the user's own usage, so that a user-customized intention input environment can be created.

In addition, the user setting unit 190 is a component for building such a customized environment. The user setting unit 190 provides the matching images or motion information stored in the database 110 through the interface unit 120 at the user's request, and lets the user match a sketch input through the interface unit 120 to a matching image, or a motion to motion information, reconfiguring the database 110 accordingly.
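
As a sketch of this customization step, the hypothetical helpers below register user-chosen pairings; the in-memory dictionary merely stands in for the database 110.

```python
# User-defined pairings for the user setting unit 190 (storage assumed).
user_db = {"images": {}, "motions": {}}

def register_image(sketch_label: str, image_path: str) -> None:
    """Bind a sketch concept to a matching image chosen by the user."""
    user_db["images"][sketch_label] = image_path

def register_motion(sensor_event: str, effect: str) -> None:
    """Bind a detected sensor event to a user-chosen dynamic effect."""
    user_db["motions"][sensor_event] = effect

register_image("dog", "images/custom_dog.png")
register_motion("shake", "wag_tail")
```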

Figure 3 is a block diagram showing the configuration according to another embodiment of the present invention, in which a wearable sensor linked to the terminal by short-range communication is used so that motion input from the user can be exploited more widely.

To this end, the sensor unit 130 is provided with a wearable part 135 that can be worn on the user's body. The wearable part 135 can be provided in various forms, but given that much intention expression is made through gestures involving ordinary hand movements, a band form that can be worn on the wrist or arm is preferred. By default, the wearable part 135 is equipped with an acceleration sensor 131, the most common of the various sensors constituting the sensor unit 130, installed to sense the movement of the hand or arm.

In addition, since the wearable part 135 is formed separately from the terminal, it includes a short-range communication module so that the sensed data of the acceleration sensor 131 provided in the wearable part 135 is delivered to the motion search unit 150 included in the terminal.

To this end, the wearable part 135 is equipped with the acceleration sensor 131, a power supply unit, and a first communication unit 136 composed of a short-range communication module such as a Bluetooth module that transmits the detection results of the acceleration sensor 131; the motion search unit 150 is provided with a second communication unit 151 made of the same kind of communication module, allowing data transmission and reception with the first communication unit 136.
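
The patent requires only that the sensed data of the acceleration sensor 131 reach the motion search unit 150 over a short-range link such as Bluetooth; the byte layout below (a little-endian timestamp plus three floats) is purely an assumed illustration of such a frame.

```python
# Hypothetical packet framing between the first communication unit 136
# (wearable side) and the second communication unit 151 (terminal side).
import struct

PACKET_FMT = "<Ifff"  # uint32 timestamp in ms, then x, y, z in m/s^2

def pack_sample(t_ms: int, x: float, y: float, z: float) -> bytes:
    return struct.pack(PACKET_FMT, t_ms, x, y, z)

def unpack_sample(payload: bytes):
    return struct.unpack(PACKET_FMT, payload)

# Wearable side packs; terminal side unpacks and hands off to the motion search unit.
frame = pack_sample(1024, 0.1, -0.2, 9.7)
t_ms, x, y, z = unpack_sample(frame)
```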

The scope of rights of the present invention is not limited to the embodiment described above but is defined by the appended claims, and it is self-evident that a person of ordinary skill in the art can make various modifications and adaptations within the scope set forth in the claims.

Claims (5)

  1. A user intention input system, comprising:
    a database in which matching images for communication and motion information corresponding to various sensed data are stored;
    an interface unit through which a sketch for communication is input or modified by the user and which outputs processed graphic information;
    a sensor unit that senses motions applied by the user;
    a pattern search unit that retrieves matching images for the sketch from the database, calculates matching rates, and outputs the matching images with the highest matching rates to the interface unit;
    a motion search unit that retrieves from the database the motion information corresponding to the sensed data of the sensor unit and outputs it to the interface unit; and
    an intention information generating unit that generates intention information by combining the matching image and the motion information.
  2. The user intention input system according to claim 1,
    wherein the sensor unit comprises an acceleration sensor that detects shaking and tilting applied by the user, a sound sensor that detects sound, a proximity sensor that detects the body approaching or moving away, and an ambient light sensor that detects brightness.
  3. The user intention input system according to claim 2,
    wherein the sensor unit comprises a wearable part worn on the user's body, an acceleration sensor provided in the wearable part, and a short-range communication module that transmits the sensed data of the acceleration sensor to the motion search unit.
  4. The user intention input system according to claim 1, further comprising:
    a matching rate setting unit that outputs each matching image through the interface unit together with its matching rate and receives a reference level from the user so that only matching images meeting the reference level are output; and
    a DB updating unit that receives the user's verification of the matching images and motion information output through the interface unit and reconstructs the database based on the verified matching images and motion information.
  5. The user intention input system according to claim 4, further comprising:
    a user setting unit that provides the matching images or motion information stored in the database through the interface unit at the user's request, lets the user match a sketch input through the interface unit to a matching image or a motion to motion information output from the interface unit, and configures the database accordingly.
PCT/KR2015/005968 2015-06-12 2015-06-12 User intention input system on basis of pattern and sensor WO2016199967A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/005968 WO2016199967A1 (en) 2015-06-12 2015-06-12 User intention input system on basis of pattern and sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/005968 WO2016199967A1 (en) 2015-06-12 2015-06-12 User intention input system on basis of pattern and sensor

Publications (1)

Publication Number Publication Date
WO2016199967A1 (en)

Family

ID=57504166

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/005968 WO2016199967A1 (en) 2015-06-12 2015-06-12 User intention input system on basis of pattern and sensor

Country Status (1)

Country Link
WO (1) WO2016199967A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014084622A1 (en) * 2012-11-28 2014-06-05 (주)미디어인터랙티브 Motion recognizing method through motion prediction
WO2014104612A2 (en) * 2012-12-27 2014-07-03 주식회사 무크 Digital device for product design using image having particular coordinates
US20140205188A1 (en) * 2010-12-03 2014-07-24 Massachusetts Institute Of Technology Sketch Recognition System
WO2014171734A2 (en) * 2013-04-17 2014-10-23 엘지전자 주식회사 Mobile terminal and control method therefor
WO2015034177A1 (en) * 2013-09-04 2015-03-12 에스케이텔레콤 주식회사 Method and device for executing command on basis of context awareness



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15895035

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15895035

Country of ref document: EP

Kind code of ref document: A1