KR20150118377A - Information inputting system and method by movements of finger - Google Patents

Information inputting system and method by movements of finger

Info

Publication number
KR20150118377A
Authority
KR
South Korea
Prior art keywords
information
motion
finger
mode
motion information
Prior art date
Application number
KR1020140044139A
Other languages
Korean (ko)
Inventor
최린
김민지
배한준
Original Assignee
고려대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 고려대학교 산학협력단 filed Critical 고려대학교 산학협력단
Priority to KR1020140044139A priority Critical patent/KR20150118377A/en
Publication of KR20150118377A publication Critical patent/KR20150118377A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0384 Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to an information input system using finger movements and an input method thereof. More particularly, the system comprises a motion input unit 100 that can be worn on a user's finger, senses the user's finger motion information, and transmits it to an integrated management unit 200, and the integrated management unit 200, which analyzes the finger motion information received from the motion input unit 100, determines the corresponding mode among a mouse mode, an operation mode, and a handwriting mode, and remotely controls a smart device 300 accordingly. Before performing remote control of the smart device 300, the integrated management unit 200 analyzes the finger motion information input from the motion input unit 100, calculates the maximum finger movement distance and the average finger movement distance to set an initial sensitivity reference value, and determines subsequent finger motion information based on that initial sensitivity reference value.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an information input system using finger movement and an input method thereof, and more particularly, to an information input system and input method that use a motion input unit integrating a mouse function and a keyboard function and that can remotely control various smart devices in a mouse mode, an operation mode, and a handwriting mode according to the user's finger movements.

In recent years, the types and penetration rates of smart devices have increased rapidly throughout the world, and home appliances such as refrigerators and washing machines have also joined the ranks of smart devices.

In such a smart/wearable environment, conventional input devices such as keyboards, mice, and touch pads can be inconvenient to use.

Because the keyboard and mouse are optimized for input in a PC environment, it is difficult to carry them along with a portable smart device. Conventional keyboards and mice are therefore not well suited to such devices, and touch interfaces using position sensors are used instead.

Such a touch interface has the advantage that a user can intuitively command and control the interlocked device, and in recent years pen-shaped motion input devices that can be recognized by touch have also been developed.

However, while the touch interface is widely used in small portable smart devices, it is inefficient for medium and large smart devices such as smart TVs and smart appliances.

To address these problems, image-recognition-based control technology using a camera, speech-recognition-based control technology, and sensor-based control technology using an acceleration sensor, a geomagnetic sensor, a gyro sensor, or the like have been developed for controlling smart devices.

However, image-recognition-based control requires the user's face, hands, and gestures to be captured through a camera and a specific motion to be detected, so it can be used only when there is sufficient light, and control is possible only within a limited range.

In the case of speech-recognition-based control, the user transmits commands to the smart device through natural language, so its use is limited by circumstances such as noisy environments and public places.

In the case of sensor-based control, products that replace the mouse function by sensing the movement of the user's finger have been introduced, but their sensing accuracy is low and handwriting input is not possible.

In addition, because recent smart devices use different input methods, the incompatibility of interfaces between smart devices increases the burden of learning each user interface (UI).

To solve these problems, the information input system and input method using finger movement according to the present invention provide a motion input unit that integrates a mouse function and a keyboard function, and an integrated management unit that recognizes the finger movement input through the motion input unit and remotely controls the various smart devices connected to it in a mouse mode, an operation mode, and a handwriting mode.

Korean Patent Laid-Open Publication No. 10-2005-0047329 ("Information Input Device and Method Using Finger Movement", hereinafter referred to as Prior Art 1) discloses a signal sensing unit that is attached to a user's fingertip and, as the fingertip moves on an arbitrary contact surface, senses axis information signals and pressure changes caused by pressing, and a signal processing unit, installed separately from the signal sensing unit, that receives and analyzes the signals provided by the signal sensing unit, transmits the input data to a computing device, and provides the corresponding input device events to the computing device, so that control and input for the computing device can be performed more conveniently in a virtual space or a wearable computing environment.

Korean Patent Publication No. 10-2005-0047329 (published on May 20, 2005).

SUMMARY OF THE INVENTION The present invention has been made to solve the problems of the related art described above, and it is an object of the present invention to provide an information input system and an information input method using finger movement in which a motion input unit recognizes the motion of a finger and various smart devices connected to an integrated management unit can be remotely controlled in a mouse mode, an operation mode, and a handwriting mode.

The information input system using finger movement according to an exemplary embodiment of the present invention includes a motion input unit 100 that is wearable on a user's finger, senses the user's finger movement information, and transmits it to an integrated management unit 200, and the integrated management unit 200, which analyzes the finger movement information received from the motion input unit 100 to determine the corresponding mode among the mouse mode, the operation mode, and the handwriting mode and remotely controls the smart device 300 accordingly. Before performing remote control of the smart device 300, the integrated management unit 200 analyzes the finger movement information to calculate the maximum finger movement distance and the average finger movement distance, sets an initial sensitivity reference value, and determines the finger motion information based on the initial sensitivity reference value.

In this case, the motion input unit 100 includes a network management unit 110 for managing a wireless network with the integrated management unit 200, a motion sensor unit 120 that includes a three-axis acceleration sensor and a three-axis gyro sensor and senses the user's finger movement through those sensors, and a power supply unit 130.

The integrated management unit 200 includes a signal preprocessing unit 210 that receives the first finger motion information from the motion input unit 100 and recognizes the motion input unit 100 worn by the user; a sensitivity setting unit 220 that sets the initial sensitivity reference value when the signal preprocessing unit 210 recognizes the motion input unit 100; a valid motion determination unit 230 that calculates a finger motion velocity and a finger motion distance using the finger motion information received from the motion input unit 100 and determines valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value of the sensitivity setting unit 220; a central control unit 240 that uses the valid motion information determined by the valid motion determination unit 230 to determine whether the mode is the mouse mode, the operation mode, or the handwriting mode and generates a control signal suitable for each mode; a mouse mode control unit 250 that transmits a control signal corresponding to the valid motion information to the smart device 300 in the mouse mode under the control of the central control unit 240; an operation mode control unit 260 that transmits a control signal corresponding to the valid motion information to the smart device 300 in the operation mode under the control of the central control unit 240; and a handwriting mode control unit 270 that transmits a control signal corresponding to the valid motion information to the smart device 300 in the handwriting mode under the control of the central control unit 240.

Here, the mouse mode control unit 250 receives the valid motion information of the mouse mode, analyzes the finger movement direction to remotely control the mouse cursor information of the smart device 300, or analyzes the finger operation to remotely control the mouse click information of the smart device 300.

The operation mode control unit 260 receives the valid motion information of the operation mode, compares previously stored motion information with the valid motion information, and remotely controls the smart device 300 with the control information corresponding to the matching motion information.

The handwriting mode control unit 270 receives the valid motion information of the handwriting mode, analyzes it, and performs a predetermined decision tree algorithm to transmit the calculated character input information to the smart device 300 for output. Alternatively, the character input information may be compared with previously stored character information, and the smart device 300 may be controlled with the control information corresponding to the matching character information.

In addition, the central control unit 240 compares the finger motion information received from the motion input unit 100 with pre-stored mode-switching motion information and, when mode switching is to be performed according to the comparison result, controls the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to perform the mode switching.

A method of inputting information using finger movement according to an embodiment of the present invention, in which the integrated management unit remotely controls heterogeneous smart devices connected over a wireless network using the user's finger motion information input through a motion input unit wearable on the finger, includes a signal preprocessing step (S100) of receiving the first finger motion information from the motion input unit in the integrated management unit and recognizing the motion input unit worn by the user; an initial sensitivity setting step (S200) of analyzing the finger motion information received from the motion input unit in the integrated management unit and setting an initial sensitivity reference value by calculating the maximum finger movement distance and the average finger movement distance; a valid motion information determination step (S300) of analyzing the finger motion information received from the motion input unit to calculate a finger motion velocity and a finger motion distance and determining the valid motion information among the finger motion information by comparing the calculated values with the initial sensitivity reference value; a mode determination step (S400) of determining whether the valid motion information determined in the valid motion information determination step (S300) corresponds to the mouse mode, the operation mode, or the handwriting mode; and a remote control step (S500) of remotely controlling the smart device by transmitting, in the integrated management unit, a control signal corresponding to the valid motion information to the smart device according to the mode determined in the mode determination step (S400).

In this case, when the valid motion information is determined to be the mouse mode in the mode determination step (S400), the remote control step (S500) analyzes the finger movement direction of the valid motion information to remotely control the mouse cursor information of the smart device, or analyzes the finger operation to remotely control the mouse click information of the smart device.

When the valid motion information is determined to be the operation mode in the mode determination step (S400), previously stored motion information is compared with the valid motion information, and the smart device is remotely controlled with the control information corresponding to the matching motion information.

When the valid motion information is determined to be the handwriting mode in the mode determination step (S400), the valid motion information is analyzed and a predetermined decision tree algorithm is performed to transmit the calculated character input information to the smart device for output, or previously stored character information is compared with the character input information and the smart device is controlled with the control information corresponding to the matching character information.

In this case, in the mode determination step (S400), the finger movement information received from the motion input unit is compared with mode-switching motion information stored in the integrated management unit, and mode switching is performed according to whether they match.

According to an embodiment of the present invention, the information input system using finger movement and its input method provide a motion input unit that integrates a mouse function and a keyboard function, so that the various smart devices connected to the integrated management unit can easily be remotely controlled, and information can be input, in a mouse mode, an operation mode, and a handwriting mode according to the user's finger movements.

In addition, before remote control of the smart device, the integrated management unit requests initial input on each of the xy, yz, and zx planes through the motion input unit, thereby acquiring the user's range of finger movement in three-dimensional space and enabling the sensitivity of the 3D motion input unit to be customized for each user.

In addition, when the motion input unit is used in the handwriting mode, the input motion information is applied to a decision tree algorithm so that characters can be recognized quickly.

FIG. 1 is a block diagram illustrating an information input system using finger movement according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating an information input method using finger movement according to an exemplary embodiment of the present invention.
FIG. 3 is a diagram illustrating the planes used for setting the sensitivity of the motion input unit in the information input system using finger movement and the input method thereof according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of a motion signal in the handwriting mode through the motion input unit in the information input system using finger movement and the input method thereof according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The following drawings are provided by way of example so that those skilled in the art can fully understand the spirit of the present invention. Therefore, the present invention is not limited to the following drawings, but may be embodied in other forms. In addition, like reference numerals designate like elements throughout the specification.

Unless otherwise defined, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. In the following description and the accompanying drawings, descriptions of known functions and configurations that may unnecessarily obscure the gist of the present invention are omitted.

In addition, a system refers to a collection of components, including devices, mechanisms, and means that are organized and regularly interact to perform the required function.

FIG. 1 is a diagram illustrating an information input system using finger movement according to an embodiment of the present invention. The information input system using finger movement of the present invention will be described in detail with reference to FIG. 1.

The information input system using finger movement according to an embodiment of the present invention uses a motion input unit that integrates a mouse function and a keyboard function and controls various kinds of smart devices according to the user's finger motion information. Rather than requiring a different input interface for each device, it can easily switch among a mouse mode, an operation mode, and a handwriting mode using the finger motion information input through the motion input unit, and, as shown in FIG. 1, is composed of a motion input unit 100 and an integrated management unit 200 connected through a wireless network.

Describing each component in more detail:

As shown in FIG. 1, the motion input unit 100 may include a network management unit 110, a motion sensor unit 120, and a power supply unit 130. The motion input unit 100 can be worn on the user's finger, for example in the form of a ring, and, while worn, senses the user's finger motion information and transmits it to the integrated management unit 200 connected through a wireless network.

The network management unit 110 may establish and manage the wireless network with the integrated management unit 200 using technologies such as WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), WiBro (Wireless Broadband Internet), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), Zigbee, Bluetooth, UWB (Ultra-WideBand), IrDA (Infrared Data Association), SWAP (Shared Wireless Access Protocol), or LTE (Long Term Evolution).

The power supply unit 130 may be supplied with external power, or supply power through a battery, so that the motion input unit 100 can perform the operations for sensing the user's finger motion information.

The motion sensor unit 120 includes a three-axis acceleration sensor and a three-axis gyro sensor, and senses the movement of the user's finger through the three-axis acceleration sensor and the three-axis gyro sensor.

Specifically, the three-axis acceleration sensor can measure the acceleration along the X, Y, and Z axes at regular intervals according to the movement of the finger of the user to whom the motion sensor unit 120 is attached, and output the values numerically.

The three-axis gyro sensor can measure the angular velocities about the X, Y, and Z axes at regular intervals according to the movement of the user's finger to which the motion sensor unit 120 is attached, and output the values numerically.

In other words, the motion sensor unit 120 senses the user's movement in three-dimensional space using the three-axis acceleration sensor and the three-axis gyro sensor, and thereby senses the finger motion information.
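As a concrete illustration of the six-axis finger motion information described above, the sketch below models a single sensor sample in Python. The class and field names are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class FingerMotionSample:
    """One 6-axis reading from the motion sensor unit (names are illustrative)."""
    t: float    # timestamp in seconds
    ax: float   # acceleration along X (m/s^2)
    ay: float   # acceleration along Y (m/s^2)
    az: float   # acceleration along Z (m/s^2)
    gx: float   # angular rate about X (deg/s)
    gy: float   # angular rate about Y (deg/s)
    gz: float   # angular rate about Z (deg/s)

# Example: a short burst of samples as the integrated management unit might receive them.
samples = [
    FingerMotionSample(0.00, 0.1, 0.0, 9.8, 0.5, -0.2, 0.0),
    FingerMotionSample(0.02, 0.4, 0.1, 9.7, 1.1, -0.1, 0.3),
]
```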

The integrated management unit 200 analyzes the finger motion information received from the motion sensor unit 120 of the motion input unit 100 to determine whether the user's movement in three-dimensional space corresponds to the mouse mode, the operation mode, or the handwriting mode, and can remotely control the smart device 300 connected through the wireless network according to the determined mode.

The smart device 300 may be any device that can be connected over a wireless network such as WLAN, Wi-Fi, WiBro, WiMAX, HSDPA, Zigbee, Bluetooth, UWB, IrDA, SWAP, or LTE, for example a smart TV, a smart home appliance, or smart glasses.

As shown in FIG. 1, the integrated management unit 200 includes a signal preprocessing unit 210, a sensitivity setting unit 220, a valid motion determination unit 230, a central control unit 240, a mouse mode control unit 250, an operation mode control unit 260, and a handwriting mode control unit 270.

The signal preprocessing unit 210 can recognize the motion input unit 100 worn by the user through the finger motion information received from the motion input unit 100, i.e., sensor data of six axes in total. In other words, when the signal preprocessing unit 210 receives the finger motion information from the motion input unit 100, it can determine whether the motion input unit 100 is worn by the user.

The sensitivity setting unit 220 may set the initial sensitivity reference value after the signal preprocessing unit 210 has recognized the motion input unit 100.

In detail, the finger movement information input from the motion input unit 100 is analyzed to calculate the maximum finger movement distance and the average finger movement distance, the initial sensitivity reference value is set from these values, and the finger motion information received thereafter is analyzed and determined based on the initial sensitivity reference value.

As shown in FIG. 3, the sensitivity setting unit 220 receives the six-axis sensor data from the user wearing the motion input unit 100 for each of the xy, yz, and zx planes, calculates the length of the longest side parallel to the x axis, the length of the longest side parallel to the y axis, and the length of the longest side parallel to the z axis, as well as the average movement length of the sides parallel to the x axis, the y axis, and the z axis, and sets the initial sensitivity reference value by scaling these maximum and average movement distances by a ratio appropriate to the screen size of the smart device 300.
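The following is a minimal sketch, not taken from the patent, of how such an initial sensitivity reference value could be derived from calibration data. It assumes the finger positions have already been reconstructed from the sensor data; the function name, parameters, and the pixels-per-metre gain are illustrative assumptions.

```python
def initial_sensitivity(calibration_positions, screen_px, workspace_scale=1.0):
    """
    Derive per-axis sensitivity reference values from calibration strokes.

    calibration_positions: list of (x, y, z) finger positions (metres) gathered
        while the user traces the xy, yz, and zx planes during calibration.
    screen_px: (width, height) of the target smart-device screen in pixels.
    Returns the maximum and average movement distance per axis plus a
    pixels-per-metre gain so cursor motion fits the screen size.
    """
    xs = [p[0] for p in calibration_positions]
    ys = [p[1] for p in calibration_positions]
    zs = [p[2] for p in calibration_positions]

    # Longest extent of the calibration movement along each axis.
    max_dist = {a: max(v) - min(v) for a, v in (("x", xs), ("y", ys), ("z", zs))}

    # Average per-step movement along each axis (a stand-in for the
    # "average moving length of the sides" in the description).
    def avg_step(v):
        steps = [abs(b - a) for a, b in zip(v, v[1:])]
        return sum(steps) / len(steps) if steps else 0.0

    avg_dist = {"x": avg_step(xs), "y": avg_step(ys), "z": avg_step(zs)}

    # Scale so the largest comfortable hand movement spans the screen.
    gain_x = screen_px[0] / max(max_dist["x"], 1e-6) * workspace_scale
    gain_y = screen_px[1] / max(max_dist["y"], 1e-6) * workspace_scale

    return {"max": max_dist, "avg": avg_dist, "gain": (gain_x, gain_y)}
```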

The valid motion determination unit 230 calculates the finger motion velocity and the finger motion distance using the finger motion information received from the motion input unit 100, and determines the valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value of the sensitivity setting unit 220.

That is, the valid motion determination unit 230 may integrate the acceleration data sensed through the motion sensor unit 120 of the motion input unit 100 to calculate the speed of the user's finger motion, and integrate the speed again to calculate the finger movement distance.

If the finger motion information is within the initial sensitivity reference value, it can be determined that the finger motion information is a valid motion.
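A minimal sketch of this double integration and threshold check is shown below. The rectangular integration, the use of net displacement as the movement distance, and the function names are simplifying assumptions rather than the patent's exact method.

```python
def integrate(values, dt):
    """Rectangular integration of a uniformly sampled signal (a simplifying assumption)."""
    out, acc = [], 0.0
    for v in values:
        acc += v * dt
        out.append(acc)
    return out

def is_valid_motion(accel, dt, sensitivity_ref):
    """
    Decide whether a burst of one-axis acceleration samples is valid motion.
    Following the description's wording, motion whose travelled distance stays
    within the initial sensitivity reference value is accepted as valid.
    """
    velocity = integrate(accel, dt)               # first integration: finger speed
    distance = abs(integrate(velocity, dt)[-1])   # second integration: movement distance
    return distance <= sensitivity_ref
```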

The central control unit 240 may determine whether the finger motion information determined as valid motion information by the valid motion determination unit 230 corresponds to the mouse mode, the operation mode, or the handwriting mode, and may control the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to generate the control signal for each mode.

In this case, the determination of the mouse mode, the operation mode, or the handwriting mode may be performed using mode-determination motion information stored in advance in the central control unit 240: if the finger motion information determined as valid motion information matches the stored mode-determination motion information, the corresponding mode is selected.
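A toy sketch of this lookup of stored mode-determination motion information follows. Representing gestures as pre-classified labels, and the labels themselves, are assumptions made for illustration only.

```python
# Pre-stored mode-determination motion information; the gesture labels are placeholders.
MODE_GESTURES = {
    "double_tap": "mouse",
    "circle":     "operation",
    "zigzag":     "handwriting",
}

def determine_mode(gesture_label, current_mode):
    """Return the mode whose stored gesture matches the valid motion; otherwise keep the current mode."""
    return MODE_GESTURES.get(gesture_label, current_mode)
```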

Accordingly, the central control unit 240 may control the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270, respectively.

To be more specific,

The mouse mode control unit 250 may transmit a control signal corresponding to the valid motion information to the smart device 300 in the mouse mode under the control of the central control unit 240.

That is, the mouse mode control unit 250 receives and analyzes the finger motion information determined to belong to the mouse mode, and remotely controls the mouse cursor information so that the mouse cursor of the smart device 300 moves in the direction indicated by the finger movement.

It can also remotely control the mouse click information of the smart device 300 according to the finger operation, so that the user can remotely perform mouse clicks such as a left click on the smart device 300.

Here, according to the initial sensitivity reference value set by the sensitivity setting unit 220, the movement rate of the mouse cursor can be scaled to suit the screen size of the smart device 300.
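For instance, the cursor scaling could look like the sketch below, where the per-axis gain is the screen-size ratio derived from the initial sensitivity reference value. The function name and the numeric gains are illustrative assumptions.

```python
def cursor_delta(dx_m, dy_m, gain):
    """
    Convert a finger displacement in metres into a cursor displacement in pixels,
    using the per-axis gain derived from the initial sensitivity reference value
    so that the same hand movement scales to the target screen size.
    """
    gain_x, gain_y = gain
    return round(dx_m * gain_x), round(dy_m * gain_y)

# Example: a 2 cm finger movement with a 40 000 px/m horizontal gain moves the cursor 800 px.
print(cursor_delta(0.02, 0.0, (40_000, 22_500)))   # -> (800, 0)
```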

The operation mode control unit 260 may transmit a control signal corresponding to the valid motion information to the smart device 300 in the operation mode under the control of the central control unit 240.

That is, the operation mode control unit 260 receives the finger motion information determined to belong to the operation mode, compares the previously stored motion information with the finger motion information determined as valid motion information, and remotely controls the smart device 300 with the control information corresponding to the matching motion information.

In other words, when the user performs one of the stored operations through the motion input unit 100, the operation mode control unit 260 recognizes the operation information and transmits the corresponding control signal to the smart device 300.
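A minimal sketch of such a stored-gesture-to-command mapping is given below; the specific gestures and commands are invented for illustration and are not defined by the patent.

```python
# Stored operation-mode motion information mapped to smart-device control commands.
OPERATION_COMMANDS = {
    "swipe_left":  "channel_down",
    "swipe_right": "channel_up",
    "swipe_up":    "volume_up",
    "swipe_down":  "volume_down",
}

def operation_control(gesture_label):
    """Return the control signal to transmit to the smart device, or None if nothing matches."""
    return OPERATION_COMMANDS.get(gesture_label)
```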

The handwriting mode control unit 270 may transmit a control signal corresponding to the valid motion information to the smart device 300 in the handwriting mode under the control of the central control unit 240.

In other words, the handwriting mode control unit 270 receives and analyzes the finger motion information determined to belong to the handwriting mode, performs a predetermined decision tree algorithm, and transmits the calculated character input information to the smart device 300 so that it is output through the smart device 300.

Alternatively, the character input information calculated through the decision tree algorithm can be compared with previously stored character information, and the smart device 300 can be remotely controlled with the control information corresponding to the matching character information.

Specifically, the handwriting mode control unit 270 can obtain finger movement distance information by integrating the acceleration sensor data in the finger motion information determined to belong to the handwriting mode, and can use this to calculate relative coordinates in three-dimensional space. By calculating the three-dimensional coordinates continuously, the slope of the finger movement can be obtained, and if the change in the calculated slope exceeds a certain constant, a new stroke is recognized.

In addition, the direction of each stroke (clockwise or counterclockwise) is stored for consecutive strokes, and when a stroke progresses in a consistent direction and the distance between its start point and end point is not greater than a predetermined constant, the stroke is treated as forming a loop.
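The sketch below illustrates stroke segmentation and loop detection along these lines, after the three-dimensional coordinates have been projected to a plane. The slope-change and closing-distance constants, and the signed-area test for rotation direction, are assumptions chosen for the example.

```python
import math

def split_strokes(points, slope_jump=1.0):
    """
    Split a sequence of (x, y) finger coordinates into strokes: a new stroke
    starts when the trajectory direction changes by more than `slope_jump`
    radians (angle wrap-around is ignored for brevity).
    """
    strokes, current = [], [points[0]]
    prev_angle = None
    for p0, p1 in zip(points, points[1:]):
        angle = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
        if prev_angle is not None and abs(angle - prev_angle) > slope_jump:
            strokes.append(current)
            current = [p0]
        current.append(p1)
        prev_angle = angle
    strokes.append(current)
    return strokes

def turn_direction(stroke):
    """Signed area of the stroke: positive for counter-clockwise, negative for clockwise."""
    return sum(x0 * y1 - x1 * y0 for (x0, y0), (x1, y1) in zip(stroke, stroke[1:]))

def is_loop(stroke, close_dist=0.01):
    """A stroke progressing in one rotation direction whose start and end nearly coincide is a loop."""
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    return abs(turn_direction(stroke)) > 0 and math.hypot(x1 - x0, y1 - y0) < close_dist
```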

At this time, the handwriting mode control unit 270 may classify the characters that the user can input into patterns according to the order in which loops appear in the handwriting, and store them in advance. As shown in Table 1 below, the patterns include 'character whose loop is written first', 'character ending in a loop', 'character containing multiple loops', 'character that consists entirely of a loop', and 'character without a loop'.

In addition, the characters listed in Table 1 below are illustrated in FIG. 4.

[Table 1]
Pattern: Corresponding characters
Character whose loop is written first: a, d1, e, q1, t1
Character ending in a loop: b, d2, f, p1, t3, y2, z2
Character containing multiple loops: g, q2
Character that consists entirely of a loop: e, o, l2, p2, t2, t4, x1, x3
Character without a loop: u1, i2, j, k, l1, m, n, r, s, u, v, w, x2

The handwriting mode control unit 270 determines that the handwriting input for one character is complete when no valid motion information is generated through the motion input unit 100, that is, when no finger movement information exceeding the initial sensitivity reference value is produced, and the motion information from the first finger motion to the last finger motion is then taken as the input information for one character.
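As an illustration, character boundaries could be detected by looking for idle gaps between bursts of valid motion, as in the sketch below; the burst representation and the idle threshold are assumptions.

```python
def segment_characters(motion_bursts, idle_threshold=0.5):
    """
    Group bursts of valid motion into characters: a pause longer than
    `idle_threshold` seconds without motion above the sensitivity reference
    ends the current character. Each burst is (start_time, end_time, data).
    """
    characters, current = [], []
    for burst in motion_bursts:
        if current and burst[0] - current[-1][1] > idle_threshold:
            characters.append(current)
            current = []
        current.append(burst)
    if current:
        characters.append(current)
    return characters
```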

The handwriting mode control unit 270 defines each of the five character patterns shown in Table 1 as a character group and maintains a decision tree for the characters belonging to each group, so that the valid motion information input through the motion input unit 100 can be recognized as belonging to several character groups at the same time. This is called group recognition.

In this case, the handwriting mode control unit 270 can output the final input character information by executing the decision tree algorithm in each character group. In other words, it is possible to independently perform the decision tree algorithm in each character group for the valid motion information simultaneously recognized by a plurality of character groups, and to calculate the character information corresponding to the valid motion information.
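The sketch below restates Table 1 as data and shows one way group recognition could propose candidate groups from simple loop features. The feature tests are illustrative guesses, and the per-group decision trees themselves are not shown.

```python
# Table 1 restated as data: the five character groups and their member characters
# (numbered labels such as d1/d2 follow the table's stroke-order variants).
CHARACTER_GROUPS = {
    "loop_first": ["a", "d1", "e", "q1", "t1"],
    "loop_last":  ["b", "d2", "f", "p1", "t3", "y2", "z2"],
    "multi_loop": ["g", "q2"],
    "whole_loop": ["e", "o", "l2", "p2", "t2", "t4", "x1", "x3"],
    "no_loop":    ["u1", "i2", "j", "k", "l1", "m", "n", "r", "s", "u", "v", "w", "x2"],
}

def candidate_groups(strokes, loops):
    """
    Group recognition: propose every character group the input might belong to.
    `loops` is a list of booleans marking which strokes were detected as loops;
    the feature tests below are illustrative guesses, not the patent's rules.
    """
    groups = []
    if loops and loops[0]:
        groups.append("loop_first")
    if loops and loops[-1]:
        groups.append("loop_last")
    if sum(loops) > 1:
        groups.append("multi_loop")
    if len(strokes) == 1 and all(loops):
        groups.append("whole_loop")
    if not any(loops):
        groups.append("no_loop")
    return groups
```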

In addition, the handwriting mode control unit 270 may analyze the valid motion information on a virtual plane before performing the decision tree algorithm. To do this, a least-squares fit of the three-dimensional data of the input valid motion information is calculated, and the three-dimensional data of subsequently input valid motion information is continuously added so that the handwriting information projected on the plane can be updated.

The virtual plane consists of two-dimensional X and Y axes. Vertical and horizontal histograms of the motion coordinates are obtained with respect to the X and Y axes, and a radial histogram is obtained by rotating from the midpoint of the motion coordinates toward their edges. Since the sensor data of the motion input unit are discrete, the coordinates between two points are interpolated when the histograms are computed.
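A minimal sketch of these virtual-plane histograms follows; the bin count, the interpolation step, and the use of simple linear interpolation are assumptions made for the example.

```python
import math

def interpolate(p0, p1, step=0.002):
    """Linearly fill in coordinates between two discrete sensor points."""
    (x0, y0), (x1, y1) = p0, p1
    n = max(1, int(math.hypot(x1 - x0, y1 - y0) / step))
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n) for i in range(n + 1)]

def plane_histograms(points, bins=8):
    """
    Horizontal, vertical and radial histograms of the motion coordinates on the
    virtual X-Y plane. `points` is a list of at least two (x, y) coordinates.
    """
    dense = []
    for p0, p1 in zip(points, points[1:]):
        dense.extend(interpolate(p0, p1))

    def hist(values):
        lo, hi = min(values), max(values)
        width = (hi - lo) / bins or 1.0          # avoid a zero-width bin
        h = [0] * bins
        for v in values:
            h[min(bins - 1, int((v - lo) / width))] += 1
        return h

    xs = [p[0] for p in dense]
    ys = [p[1] for p in dense]
    cx, cy = (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
    angles = [math.atan2(y - cy, x - cx) % (2 * math.pi) for x, y in dense]

    return {"horizontal": hist(xs), "vertical": hist(ys), "radial": hist(angles)}
```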

Using these features, one final piece of character input information is calculated within a character group.

However, when the input is recognized as belonging to two or more character groups, character input information is calculated in each of those character groups.

In this case, the decision tree algorithm is performed again.

In other words, when the character input information calculated by analyzing the valid motion information is not confined to a single character group but belongs to two or more character groups, a second decision tree algorithm is performed after the first decision tree algorithm, and the final character input information is calculated by the second decision tree algorithm.

In detail, the first decision tree algorithm first examines the progression direction of the consecutive strokes of the input valid motion information, whereas the second decision tree algorithm uses the histogram characteristics first.

That is, the second decision tree algorithm calculates the final character input information by using the character group recognition result, which is the result of the first decision tree algorithm, and the histogram characteristic of the inputted valid motion information.

For example, when the user inputs the character 'e' through finger motion information, 'e' may be recognized as belonging to two character groups: 'character whose loop is written first' and 'character that consists entirely of a loop'.

Accordingly, the decision tree algorithm is performed independently in each character group (the first decision tree algorithm), and the second decision tree algorithm, which uses the histogram characteristics, is then applied to the results of the per-group decision trees to recognize the final character input information, that is, the character 'e'.
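One way this second-stage disambiguation could be arranged is sketched below. Comparing radial histograms with an L1 distance against stored reference profiles is an assumption for illustration, since the patent only states that the second decision tree uses the histogram characteristics together with the group recognition result.

```python
def resolve_character(group_results, observed_radial, stored_radial):
    """
    Second-stage decision: when the per-group (first-stage) decision trees return
    different characters, pick the final character using histogram features.

    group_results:   {group_name: candidate_character} from the first stage
    observed_radial: radial histogram of the input motion
    stored_radial:   {character: reference radial histogram}, e.g. from training data
    """
    candidates = set(group_results.values())
    if len(candidates) == 1:            # all groups agree, nothing to disambiguate
        return candidates.pop()

    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))

    return min(candidates, key=lambda ch: l1(observed_radial, stored_radial[ch]))
```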

In addition, when the user inputs the character 's' through finger motion information, 's' is generally a 'character without a loop' as shown in Table 1, but depending on the handwriting it may be recognized as a character with a loop.

In order to correct such recognition errors, the information input system using finger movement of the present invention performs the decision tree algorithm in the handwriting mode control unit 270 using not only the progression direction of the strokes but also the histogram characteristics, so that final character input information in which the recognition error has been corrected can be calculated.

Since the information input system using finger movement of the present invention can remotely control various smart devices 300 through a single motion input unit 100, the mode can be changed at any time from the currently determined mode and the smart device 300 can then be remotely controlled in the new mode.

To this end, the central control unit 240 compares the finger motion information received from the motion input unit 100 with pre-stored mode-switching motion information and determines whether to switch modes.

If the finger motion information matches the mode-switching motion information, the central control unit 240 controls the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to perform the mode switch, after which remote control of the smart device 300 is performed according to the mode corresponding to the finger motion information.
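A toy dispatcher along these lines is sketched below; the class structure, gesture labels, and default mode are illustrative assumptions rather than the patent's design.

```python
class CentralController:
    """
    Toy dispatcher mirroring the central control unit 240: it checks for a
    mode-switching gesture first and otherwise forwards valid motion information
    to the controller for the active mode. Controllers are plain callables here.
    """
    def __init__(self, controllers, switch_gestures, initial_mode="mouse"):
        self.controllers = controllers          # e.g. {"mouse": fn, "operation": fn, "handwriting": fn}
        self.switch_gestures = switch_gestures  # e.g. {"hold_then_swipe": "handwriting"}
        self.mode = initial_mode

    def handle(self, gesture_label, motion_info):
        if gesture_label in self.switch_gestures:
            self.mode = self.switch_gestures[gesture_label]   # perform the mode switch
            return None
        return self.controllers[self.mode](motion_info)       # forward to the active mode
```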

FIG. 2 is a flowchart illustrating an information input method using finger movement according to an embodiment of the present invention. Referring to FIG. 2, the information input method using finger movement according to the present invention will be described in detail.

The information input method using finger movement of the present invention uses a motion input unit 100 that integrates a mouse function and a keyboard function, and allows various kinds of smart devices 300 to be controlled easily in a mouse mode, an operation mode, and a handwriting mode by means of the finger motion information input through the motion input unit, rather than through different input interfaces for each device. The method consists of a signal preprocessing step (S100), an initial sensitivity setting step (S200), a valid motion information determination step (S300), a mode determination step (S400), and a remote control step (S500).

In the information input method using finger movement of the present invention, the integrated management unit 200 can remotely control heterogeneous smart devices 300 connected to the wireless network using the user's finger motion information input through the motion input unit 100.

Describing each step in more detail:

In the signal preprocessing step (S100), the signal preprocessing unit 210 of the integrated management unit 200 can recognize the motion input unit 100 worn by the user using the first finger motion information received from the motion input unit 100.

In this case, the finger motion information refers to the sensor data of all six axes sensed by the three-axis acceleration sensor and the three-axis gyro sensor included in the motion sensor unit 120 of the motion input unit 100.

In the initial sensitivity setting step (S200), the sensitivity setting unit 220 of the integrated management unit 200 analyzes the finger motion information received from the motion input unit 100 and can set the initial sensitivity reference value by calculating the maximum finger movement distance and the average finger movement distance.

In detail, the finger movement information input from the motion input unit 100 is analyzed to calculate the maximum finger movement distance and the average finger movement distance, the initial sensitivity reference value is set from these values, and the finger motion information received thereafter is analyzed and determined based on the initial sensitivity reference value.

In the valid motion information determination step (S300), the valid motion determination unit 230 of the integrated management unit 200 analyzes the finger motion information received from the motion input unit 100 to calculate the finger motion velocity and the finger motion distance, and determines the valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value set in the initial sensitivity setting step (S200).

That is, the valid motion determination unit 230 may integrate the acceleration data sensed through the motion sensor unit 120 of the motion input unit 100 to calculate the speed of the user's finger motion, and integrate the speed again to calculate the finger movement distance.

If the finger motion information is within the initial sensitivity reference value, it can be determined that the finger motion information is a valid motion.

In the mode determination step (S400), the central control unit 240 of the integrated management unit 200 determines whether the finger motion information determined as valid motion information in the valid motion information determination step (S300) corresponds to the mouse mode, the operation mode, or the handwriting mode.

In this case, the determination of the mouse mode, the operation mode, or the handwriting mode may be performed using mode-determination motion information stored in advance in the central control unit 240: if the finger motion information determined as valid motion information matches the stored mode-determination motion information, the corresponding mode is selected.

Accordingly, the central control unit 240 may control the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 according to each mode.

In addition, in the mode determination step (S400), the finger movement information received from the motion input unit 100 is compared with the mode-switching motion information stored in advance in the integrated management unit 200, and whether to switch modes is determined according to whether they match.

In the remote control step (S500), the central control unit 240 of the integrated management unit 200 transmits a control signal corresponding to the valid motion information to the smart device 300 according to the mode determined in the mode determination step (S400), so that the smart device 300 can be remotely controlled.

The control signal transmission to the smart device 300 may be performed by the mouse mode control unit 250, the operation mode control unit 260 and the handwriting mode control unit 270 under the control of the central control unit 240.

More specifically, when the valid motion information is determined to be the mouse mode in the mode determination step (S400), the remote control step (S500) controls the mouse mode control unit 250 so that the mouse cursor information of the smart device 300 is remotely controlled by analyzing the finger movement direction of the valid motion information, or the mouse click information of the smart device 300 is remotely controlled by analyzing the finger operation.

When it is determined in the mode determination step (S400) that the valid motion information corresponds to the operation mode, the remote control step (S500) controls the operation mode control unit 260 to compare the valid motion information with the previously stored motion information and to remotely control the smart device 300 with the control information corresponding to the matching motion information.

When it is determined in the mode determination step (S400) that the valid motion information corresponds to the handwriting mode, the remote control step (S500) controls the handwriting mode control unit 270 to analyze the valid motion information, perform the predetermined decision tree algorithm, and transmit the calculated character input information to the smart device 300 for output, or to compare the character information stored in advance in the handwriting mode control unit 270 with the character input information calculated through the decision tree algorithm and remotely control the smart device 300 with the control information corresponding to the matching character information.

In other words, the information input system and input method using finger movement according to an embodiment of the present invention provide a motion input unit 100 that integrates a mouse function and a keyboard function, and the integrated management unit 200 recognizes the finger movement input through the motion input unit 100 and can easily remotely control the various smart devices 300 connected to it in the mouse mode, the operation mode, and the handwriting mode.

As described above, the present invention has been described with reference to specific embodiments including specific components and exemplary drawings. However, the present invention is not limited to the above-described embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains.

Accordingly, the spirit of the present invention should not be construed as being limited to the embodiments described, and not only the following claims but also all equivalents and modifications of those claims fall within the scope of the present invention.

100: motion input unit
110: network management unit 120: motion sensor unit
130: power supply unit
200: integrated management unit
210: signal preprocessing unit 220: sensitivity setting unit
230: valid motion determination unit 240: central control unit
250: mouse mode control unit 260: operation mode control unit
270: handwriting mode control unit
300: smart device

Claims (12)

An information input system using finger movement, comprising:
a motion input unit 100 that is wearable on a user's finger, senses the user's finger motion information, and transmits the information to an integrated management unit 200 connected via a wireless network; and
an integrated management unit 200 that analyzes the finger motion information received from the motion input unit 100, determines a corresponding mode among a mouse mode, an operation mode, and a handwriting mode, and remotely controls a smart device 300 accordingly,
wherein the integrated management unit 200, before performing remote control of the smart device 300, analyzes the finger motion information input from the motion input unit 100, calculates a maximum finger movement distance and an average finger movement distance to set an initial sensitivity reference value, and determines the finger motion information based on the initial sensitivity reference value.
The information input system according to claim 1, wherein the motion input unit 100 comprises:
a network management unit 110 for managing a wireless network with the integrated management unit 200;
a motion sensor unit 120 that includes a three-axis acceleration sensor and a three-axis gyro sensor and senses the user's finger movement through the three-axis acceleration sensor and the three-axis gyro sensor; and
a power supply unit 130.
The information input system according to claim 1, wherein the integrated management unit 200 comprises:
a signal preprocessing unit 210 that receives the first finger motion information from the motion input unit 100 and recognizes the motion input unit 100 worn by the user;
a sensitivity setting unit 220 that sets the initial sensitivity reference value when the signal preprocessing unit 210 recognizes the motion input unit 100;
a valid motion determination unit 230 that calculates a finger motion velocity and a finger motion distance using the finger motion information received from the motion input unit 100 and determines valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value of the sensitivity setting unit 220;
a central control unit 240 that determines whether the finger motion information corresponds to the mouse mode, the operation mode, or the handwriting mode using the valid motion information determined by the valid motion determination unit 230 and generates a control signal for each mode;
a mouse mode control unit 250 that transmits a control signal corresponding to the valid motion information to the smart device 300 in the mouse mode under the control of the central control unit 240;
an operation mode control unit 260 that transmits a control signal corresponding to the valid motion information to the smart device 300 in the operation mode under the control of the central control unit 240; and
a handwriting mode control unit 270 that transmits a control signal corresponding to the valid motion information to the smart device 300 in the handwriting mode under the control of the central control unit 240.
The information input system according to claim 3, wherein the mouse mode control unit 250 receives the valid motion information of the mouse mode, and remotely controls the mouse cursor information of the smart device 300 by analyzing the finger movement direction, or remotely controls the mouse click information of the smart device 300 by analyzing the finger operation.
The information input system according to claim 3, wherein the operation mode control unit 260 receives the valid motion information of the operation mode, compares previously stored motion information with the valid motion information, and remotely controls the smart device 300 with the control information corresponding to the matching motion information.
The information input system according to claim 3, wherein the handwriting mode control unit 270 receives the valid motion information of the handwriting mode, analyzes the valid motion information and performs a predetermined decision tree algorithm to transmit the calculated character input information to the smart device 300 for output, or compares previously stored character information with the character input information and remotely controls the smart device 300 with the control information corresponding to the matching character information.
The information input system according to claim 3, wherein the central control unit 240 compares the finger motion information received from the motion input unit 100 with pre-stored mode-switching motion information and, when mode switching is to be performed according to the comparison result, controls the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to perform the mode switching.
A method of inputting information using finger movement, in which an integrated management unit remotely controls heterogeneous smart devices connected to a wireless network using a user's finger motion information input through a motion input unit wearable on the user's finger, the method comprising:
a signal preprocessing step (S100) of receiving the first finger motion information from the motion input unit in the integrated management unit and recognizing the motion input unit worn by the user;
an initial sensitivity setting step (S200) of analyzing the finger motion information received from the motion input unit in the integrated management unit and setting an initial sensitivity reference value by calculating a maximum finger movement distance and an average finger movement distance;
a valid motion information determination step (S300) of analyzing the finger motion information received from the motion input unit in the integrated management unit to calculate a finger motion velocity and a finger motion distance, and determining valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value set in the initial sensitivity setting step (S200);
a mode determination step (S400) of determining whether the valid motion information determined in the valid motion information determination step (S300) corresponds to a mouse mode, an operation mode, or a handwriting mode; and
a remote control step (S500) of remotely controlling the smart device by transmitting, in the integrated management unit, a control signal corresponding to the valid motion information to the smart device according to the mode determined in the mode determination step (S400).
The method of claim 8, wherein, when the valid motion information is determined to be the mouse mode in the mode determination step (S400), the remote control step (S500) remotely controls the mouse cursor information of the smart device by analyzing the finger movement direction of the valid motion information, or remotely controls the mouse click information of the smart device by analyzing the finger operation.
The method of claim 8, wherein, when the valid motion information is determined to be the operation mode in the mode determination step (S400), the remote control step (S500) compares previously stored motion information with the valid motion information and remotely controls the smart device with the control information corresponding to the matching motion information.
The method of claim 8, wherein, when the valid motion information is determined to be the handwriting mode in the mode determination step (S400), the remote control step (S500) analyzes the valid motion information and performs a predetermined decision tree algorithm to transmit the calculated character input information to the smart device for output, or compares previously stored character information with the character input information and remotely controls the smart device with the control information corresponding to the matching character information.
The method of claim 8, wherein, in the mode determination step (S400), the finger motion information received from the motion input unit is compared with mode-switching motion information stored in the integrated management unit, and mode switching is performed according to whether the finger motion information matches the mode-switching motion information.
KR1020140044139A 2014-04-14 2014-04-14 Information inputting system and method by movements of finger KR20150118377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140044139A KR20150118377A (en) 2014-04-14 2014-04-14 Information inputting system and method by movements of finger

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140044139A KR20150118377A (en) 2014-04-14 2014-04-14 Information inputting system and method by movements of finger

Publications (1)

Publication Number Publication Date
KR20150118377A (en) 2015-10-22

Family

ID=54426841

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140044139A KR20150118377A (en) 2014-04-14 2014-04-14 Information inputting system and method by movements of finger

Country Status (1)

Country Link
KR (1) KR20150118377A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102207510B1 (en) * 2020-04-30 2021-01-27 (주)콕스스페이스 Electronic device for controlling host device using motion signals and mouse signals
WO2022035027A1 (en) * 2020-04-30 2022-02-17 주식회사 콕스스페이스 Electronic device for controlling host device by using motion signal and mouse signal
US11797112B1 (en) 2020-04-30 2023-10-24 Cox Space Co., Ltd. Electronic device for controlling host device by using motion signal and mouse signal
KR102286018B1 (en) * 2020-09-09 2021-08-05 주식회사 피앤씨솔루션 Wearable augmented reality device that inputs mouse events using hand gesture and method of mouse event input for wearable augmented reality device using hand gesture
KR20230007109A (en) * 2021-07-05 2023-01-12 주식회사 피앤씨솔루션 Wearable augmented reality device that inputs operation signal using a two-handed gesture and method of operating a wearable augmented reality device using a two-handed gesture

Similar Documents

Publication Publication Date Title
US11914792B2 (en) Systems and methods of tracking moving hands and recognizing gestural interactions
US11561519B2 (en) Systems and methods of gestural interaction in a pervasive computing environment
KR102181588B1 (en) Method and apparatus for optimal control based on motion-voice multi-modal command
US9367138B2 (en) Remote manipulation device and method using a virtual touch of a three-dimensionally modeled electronic device
US9600078B2 (en) Method and system enabling natural user interface gestures with an electronic system
KR100630806B1 (en) Command input method using motion recognition device
KR101533319B1 (en) Remote control apparatus and method using camera centric virtual touch
US20150177836A1 (en) Wearable information input device, information input system, and information input method
CN105190483A (en) Detection of a gesture performed with at least two control objects
US20150145830A1 (en) Remote control apparatus and method for performing virtual touch by using information displayed by a projector
US9965041B2 (en) Input device, apparatus, input method, and recording medium
KR101928971B1 (en) A wearable device for controlling an electronic device based on hand motion and method for controlling the wearable device thereof
US10725550B2 (en) Methods and apparatus for recognition of a plurality of gestures using roll pitch yaw data
US20160334880A1 (en) Gesture recognition method, computing device, and control device
KR20150118377A (en) Information inputting system and method by movements of finger
US9529446B2 (en) Re-anchorable virtual panel in three-dimensional space
CN113569635B (en) Gesture recognition method and system
US9525906B2 (en) Display device and method of controlling the display device
KR101233793B1 (en) Virtual mouse driving method using hand motion recognition
KR20160011451A (en) Character input apparatus using virtual keyboard and hand gesture recognition and method thereof
KR102346294B1 (en) Method, system and non-transitory computer-readable recording medium for estimating user's gesture from 2d images
WO2009116079A2 (en) Character based input using pre-defined human body gestures
US20150123893A1 (en) Remote controller for motion recognition
KR20150124009A (en) Coaching System Of Robot Using Hand Movement
CN104461524A (en) Song requesting method based on Kinect

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application